Don’t let the title ‘computing’ put you off. This has turned out to be more about teaching and learning as a whole than I realised when I wrote part one.
Computer use has gone through a few distinct phases since the home computing revolution of the 1980s:
Programming > gaming > admin > multimedia > communications
Initially microcomputers were novelties where users could understand how the machine worked and learn how to do things with it from scratch. Over 60 books were published for the Commodore VIC-20 between 1982 and 1984 (and just as many for every other platform). There was even a term, now long forgotten, for someone who used computers in this way: a computerist. But just as true motorists have given way to mere drivers, the computerists have by and large gone. Today we just have computer drivers: users of software that only a few experts know how to make.
As computerists we were promised that we could do our accounts, control our central heating and manage our entertainment – but we couldn’t, not yet anyway. What we could do was play games, and as the machines gained more memory and higher-resolution graphics, that’s what everyone did.
Then in the 1990s we saw the rise of the IBM PC clone and Microsoft productivity software. Suddenly you could be your own personal assistant, your own typist and data manager. We entered the age of Admin.
The late 1990s brought Steve Jobs back to Apple where he launched the iMac and iPod, devices that could manage your multimedia. With Photoshop and desktop publishing, the computer became a creative tool to manage photos, music and later, movies.
As the new millennium began, the internet became fast enough to be useful, and computers could be used for shopping, banking and research. The shift to what was then called Web 2.0 made communication with websites two-way, and the era of mass social networking began. This became far more powerful in 2008 with the launch of the first proper mobile smart internet devices.
Of course, people still do all five types of computer use today, but it’s the first one that fascinates me the most. In the rush to get better games, better graphics, more useful apps and entertainment, I believe we have lost touch with what a computer is and how it works. I think this is detrimental to society, and especially to the next generation, who were born into this closed magic-box world. With the next stage of the internet well underway (the so-called ‘internet of things’), where computers upload and download data to the internet themselves, we need a new generation of computerists to design and shape this technology for the greater good, for themselves and for us.
It was with all this in mind that I set up my computer club at Fyling Hall School. It’s been running for three terms now (since September 2018) so what have we learnt?
A few students came once and didn’t come back, but others have been returning every week: girls and boys aged 11 to 17.
The initial idea was to create a text adventure game (see my previous blog). But I found I’d have to back-track even further than that. I wrongly assumed that kids would know what a simple text adventure was. It was only by explaining it that I actually realised what it was: a simple, primordial artificial intelligence. Think about it: the user enters a two-word phrase (a verb and a noun) and the computer has to take that entry apart, work out whether it makes sense compared to its database, work out what the verb and the noun are, and respond accordingly. It is Siri. It is Alexa. It is Cortana (or whatever that thing is called).
We’re not just making a pointless old-fashioned game here. We’re looking at the basis of AI, with a direct link to present systems and the future.
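To make the idea concrete, here is a minimal sketch of such a two-word parser – written in Python rather than the club’s VIC-20 BASIC, and with a vocabulary and responses invented purely for illustration:

```python
# A hypothetical two-word (verb + noun) parser, the kernel of a text adventure.
# The word lists and messages here are illustrative, not from the club's game.

VERBS = {"take", "drop", "open", "go"}
NOUNS = {"lamp", "door", "north"}

def parse(command: str):
    """Split the player's input into a (verb, noun) pair, or report the problem."""
    words = command.lower().split()
    if len(words) != 2:
        return None, "I need a verb and a noun."
    verb, noun = words
    if verb not in VERBS:
        return None, f"I don't know how to '{verb}'."
    if noun not in NOUNS:
        return None, f"I don't see any '{noun}' here."
    return (verb, noun), None

action, error = parse("Take Lamp")
print(action)  # ('take', 'lamp')
```

The same three steps – split the entry, check it against the database, decide on a response – are exactly what the kids’ BASIC version has to do, one IF statement at a time.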
Some people might frown that I’m using a suite of Commodore VIC-20s, 39-year-old technology, to do this. They might think it quaint, amusingly retro, or crudely out of date. But no one teaches music by handing the kids an orchestra and expecting them to write a symphony: you give them a recorder to learn the basics – even though a recorder isn’t in the professional orchestra. You don’t teach electricity by showing them the controls for the National Grid or expecting them to safely wire a house: you give them batteries, bulbs and wires. You start simple, you start basic and you start from first principles. Otherwise you’re just giving instructions for preparing the simplest type of microwave meal, producing a user of food, when we need them to be a chef, a designer and a creator.
You can’t deny the facts: kids do use technology. This is why there is a big debate on whether phones and such like should be allowed in schools or not (we’ll save that for another day). But whatever they are using their devices for, for good or bad, it’s not programming that they’re doing on their phones, iPads and laptops. They’re spending all their time being really good users of someone else’s app.
In our lust for the latest tech we’ve lost sight of the idea of having a device that does one thing well. Your laptop is an unlimited multitrack recording studio. (The Beatles recorded Sgt Pepper on tape with just four tracks. What would they have done with a MacBook?) It is a high-resolution graphic design and photo studio, greater than Fleet Street in its heyday. It has more movie-making power than Hollywood had in its first 80 years. It has access to almost the entire sum of human knowledge. (And yet what do we do with it all? Look at videos of skateboarding cats and play Candy Crush. Clearly having a great tool doesn’t make you an engineer. Give a paintbrush to an elephant and it’ll paint a picture. It might be an amusing novelty but it won’t be a match for the Sistine Chapel.)
It’s no surprise that when you have a decent ICT suite in a school, in which you can research any fact or image, watch the latest TV show or movie and listen to any record, it’s hard to say to a student, ‘OK, we’re going to forget all that useful, exciting stuff and learn how to move a single character around the screen.’ Teaching coding on an all-singing, all-dancing PC is horrendous: it’s dull, and we all know it.
Most schools in the UK are using Scratch to animate characters, and that apparently ticks the box for ‘coding’. It’s a bit like the Logo system of old: a high-level language that has done all the hard work for you. All the students are doing is clipping bits of pre-made code together, presented to them as simple graphical building blocks. This is fine for showing the flow of logic, and it’s a bit of fun for younger ones. But is this really the best we can do?
Give these kids access to the isolated sandbox that is a microcomputer – one that won’t do anything at all unless you program it – and it’s a different story: they get excited by it. That’s what I’ve found.
Here are a few recent comments from people on a VIC-20 users’ forum about what I’ve been doing:
“Is one of the best educational computers ever made. The weak BASIC that requires PEEKS and POKES to do graphics and sound is even an advantage.”
“I used to truck around a pair of Vic-20 systems to homeschoolers’ homes to give private computer lessons to kids whose families couldn’t afford a computer. Though the current systems ranged from 386s to Pentium 4s during that time, the kids learned more with the “old” Vics. Several of my former students have crossed paths with me over the years, telling me how much that prepared them for college compared to what their friends who only used PCs had going for them in the same situation.”
“I have programmed all my life, including Vic-20. It is alien to me that I cannot do the same things I could do then, on modern platforms. I just want a language and an output. It all seems so complicated and inaccessible these days.”
“To learn electronics, you need to know what a resistor does, and how a transistor works. You learn the building block first, not from a chip that has millions of interconnected transistors and resistors in it.”
“The principles of programming are the same. Variables, loops etc. Also the discipline is the same. I was self taught programming on the Vic20 and I now programme c# .net. Most importantly 8 bit computers fire the imagination.”
Elon Musk learned to program on his VIC-20 and went on to do quite well. Linus Torvalds, who invented Linux, the open-source operating system used worldwide, started out on one too. The BASIC we’re using on the VIC was actually written by none other than Bill Gates.
But first we had to get back to basics, I mean real basics. When my generation first used a microcomputer, we brought with us knowledge of the keyboard from the typewriter. That’s how we knew what ‘carriage return’ was: it returned the carriage typing head to the next line. So when faced with a terminal text window, we knew what the ‘Return’ key was going to do. Kids today think it’s all a Word document. They’ve no idea that pressing ‘Return’ means something. They think they’re just arranging characters on a screen, not entering lines of code into the chip.
They’ve heard of megabytes and gigabytes but don’t know what they are. Some think it’s a power rating. This sort of misconception is up there with ‘the Moon only comes out at night’, ‘the Earth is closer to the Sun in summer’ and ‘plants get their food from the soil’, which I have to deal with as clearly and as quickly as I can when teaching Year 7 science. The misconceptions of computer lore have gone unchecked.
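For the record, a byte is a unit of storage – enough for roughly one character of text – and the prefixes just scale the count. A quick back-of-the-envelope illustration (the novel length is my own rough assumption):

```python
# Bytes measure storage, not power. Using the decimal prefixes:
KILO = 10**3   # 1 kilobyte = 1,000 bytes
MEGA = 10**6   # 1 megabyte = 1,000,000 bytes
GIGA = 10**9   # 1 gigabyte = 1,000,000,000 bytes

# Assume a paperback novel is very roughly half a million characters.
novel_chars = 500_000

print(novel_chars / MEGA)   # 0.5  -- about half a megabyte of plain text
print(GIGA // novel_chars)  # 2000 -- a gigabyte holds roughly 2,000 such novels
```

Set against that, the VIC-20’s few kilobytes of RAM make the scale of modern machines – and the scale of the misconception – rather vivid.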
The pace of any school curriculum, in any subject, is super fast. We have hardly any time to reflect, to teach patience or to teach self-drive. All of these issues have come up in the club. I’ve had kids type in their relatively short lines of code, try to RUN them, and find they don’t work.
Their first port of call is to tell me. I ask, ‘Have you checked it?’ ‘No,’ they reply. I can scan the screen and immediately see they’ve missed off quote marks or used a semi-colon instead of a colon or whatever. ‘How can you see that so fast, sir?’ they might ask. ‘Because continued effort in directed practice has enabled me to spot patterns of syntax more quickly,’ I reply. (That, by the way, is the most profound definition of learning you’ll come across today.)
Our kids (and to a large degree all of us) are an instant-gratification generation. If we have to wait, it’s boring and we get distracted. If we have to do something again to correct it, that’s too much like hard work. If we get something wrong, the immediate reaction is to give up.
So, yes, I may be using 39-year-old technology. I may be teaching them a hard programming language that is only one step up from machine-code assembler, but what I’m really showing them is far more important than I realised.
After all, what is a computer programmer if not someone who is skilled, logical and creative, who has immense patience and accuracy, and who is willing to keep going until the job is done? Hang on, isn’t that a description of any expert in the workplace?
It’s no wonder industry doesn’t have enough of them.
Like the article Ayd and support your approach. It’s a national scandal that we abandoned our world-leading position in computer education in schools and for a decade or two stopped teaching programming, substituting ICT – meaning USING computers, not programming them. A few (understandable as you’re a youngster yourself) historical inaccuracies in your article though. For example, BASIC pre-dated Gates and Microsoft by a decade or two; I think Dartmouth BASIC was the original. But I won’t go on being a pedant as in principle it’s a good article. Lol.
Thanks Bob. It was Bill Gates who wrote the actual version of BASIC for Commodore, for the PET, in 1977. Commodore got a good deal, as they bought it outright rather than as a licence, so they rebranded it Commodore BASIC and it remained unchanged for a decade.