Imagine that some politician said “Cars are an important part of our modern economy. Many jobs involve cars. Therefore, all students must learn how to build cars; we will be adding it to the curriculum.” This seems (to me, anyway) pretty ridiculous on its own. Now imagine that the students are actually being taught to make small car models out of card, whilst being told that this is how cars have always been built (the analogy sort of falls apart here, because unlike with programming, people understand cars well enough to spot the difference). This is what the whole “learn to code” buzz seems like to me. In fact, it’s even worse, with hyperbolic claims of coding being “the new literacy”, often made by people who have never actually programmed.
The average person is obviously going to interact with lots of things which have been programmed (citation: smartphones exist). This does not, however, mean that they must know every detail of how they work - and in any case, they would not be taught this. Most “learn to code” resources, especially those used in schools, start with very simple, visual, 2D-graphical environments. There is nothing intrinsically wrong with this. The problem is that these environments typically fail to map nicely onto standard (sensible) ones (for example, Scratch, widely used for this, lacked functions until recently, and still does not really have proper variables). And anything beyond these basic environments is rarely taught; people decry it as “too hard”, ignoring the fact that anything much simpler is useless. This computer science GCSE doesn’t seem to properly cover much of what is actually useful in computer usage or programming. Based on the lesson names and a brief skim of BBC content related to the course, it seems that it:
- covers quite a lot of stuff, ranging from high-level to low-level, in little detail.
- was written by people who don’t know much about it, given the mention of making a webpage in the “Networking” section, and “Operation Systems”.
- is not actually relevant to most programming/computer usage: “Small Basic, Python, C#, SQL and VB.Net” are mentioned as programming languages to be used - a very narrow view of the huge range of available languages, and SQL seems a very odd choice, given that it is specifically for querying (some) databases and not much else. It also seems to have an entire section on binary, hexadecimal, etc., which are not hugely relevant to most programming, even though binary is used internally in digital electronics.
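(For context on how little that binary/hexadecimal material amounts to in practice: most languages handle base conversion for you. A minimal sketch in Python, one of the languages the course itself lists:)

```python
# Base conversion is built into Python; there is rarely a need
# to do it by hand outside of exercises.
n = 202

print(bin(n))              # binary string: "0b11001010"
print(hex(n))              # hexadecimal string: "0xca"
print(int("11001010", 2))  # parse a binary string back: 202
```

Knowing that positional number systems exist is useful background; drilling manual conversion much less so.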
This is obviously not what’s covered in “Hour of Code” lessons - it’s a multi-year course. Lessons focused on “learning to code” typically involve dragging a few blocks around by following tutorials to make simple pictures.
I believe it is important that people know some things beyond “you can download apps and stuff and use the web”, but this is, in my opinion, not a good way to teach the broad field of “computers and their uses”. Instead of “coding” programmes built around non-scalable graphical environments - skills which will never actually be used - I think the following should be covered:
- a quick introduction to hardware (for troubleshooting, etc) and what all those cables are
- basics of networking (what routers do, ISPs and their job, different connections, HTTP(S), DNS)
- privacy in the digital age (i.e. maybe stop giving Facebook/Google/Amazon all your private information)
- operating systems, what various programs are for, and the fact that ones which aren’t Windows exist
- what programming involves
Such a course may not sound as catchy as “coding: the new literacy” but would, I believe, generally help people to at least use modern platforms reasonably easily.