
Year Of Code

Last week, in the UK, the "Year Of Code" was launched. This somewhat directionless "initiative" is... well... what can you say and still be polite?

How about this: the initiative to get kids coding is probably going to do more damage than any good it could ever have hoped to achieve. The official site is hysterical - a dead boring intro video, and giant text that reads:

CODE IS THE LANGUAGE WE USE
TO INSTRUCT COMPUTERS.
Then it goes on to say:
It is really simple to learn and
anyone can do it - not just rocket scientists.

Yes, it really says that. You might want to direct abuse at lottie.dexter@yearofcode.org but before you do, be aware that she is the hapless 24-year-old eye-candy who was somehow chosen to front this PR disaster. She has admitted that she can't code, which sounds like a kick in the balls when the project is trying to say how eeeeasy-peeeeasy this coding lark is. It is more likely that she was too inexperienced to run away, that she wasn't properly briefed on what exactly the project is and what it is aiming for (and judging by the website...), and, perhaps worst of all, that she's completely the wrong front for the project. I have no issue with girl-geeks (in fact, I warmly encourage them - not just because I'm a bloke, but because a really clever girl tends to run rings around most of us non-girls; we need clever girls to be interested in this sort of stuff), but Lottie isn't a girl-geek. Where's her passion for coding? How can she be believable when she isn't an example of what the project is trying to promote? [how can she be believable in a video featuring George Osborne?]

 

Here at HeyRick, I will present something closer to reality.

First up, if you are here wondering "is programming for me?", please allow me to direct you to the interesting world of Ikebana.
You should have arrived here because a search engine brought you, because a friend recommended this place, or because something else did - in other words, because you already have an interest. If you don't have an interest, there isn't much I can do for you.

Now, look around you. You are using a computer or a tablet to read this. You may have a mobile phone or media (MP3?) player. You watch TV and these days it is all digital. DVDs and Blu-ray. Skype. VoIP telephony. Programmable microwaves. Bread makers. Printers. Cameras.
Back when I was a child, I got myself a BBC Micro. An ancient machine by today's standards. It was the only processor (CPU) in the entire house. Quite possibly the only one on the street until one of the other kids got a Spectrum.
Your computer has several processors (the main one, the graphics one, one for sound, one to do the input/output; the hard disc has a processor, as does the WiFi, and ditto the Bluetooth). Your display device will have a processor. Your phone will have a few. Your set top boxes and players, your Internet Livebox or Home Hub. Do you use a radio walkie-talkie telephone (cordless phone)? If it is modern, it will have a processor. There's a little one inside every bread maker. If your microwave is programmable, there is one in there too. Surprised? Well here's a better one - there's a little processor inside each and every SD card. Even the impossibly small µSD cards.
In short, the average home of today has so many processors that it is really difficult to count them all. You, sitting in front of your computer reading this, the connection between your eyeballs and the phone line is handled efficiently by what may - with no hint of irony - be a dozen processors.
Why am I banging on about processors?
Because that is what is at the heart of a computer. A processor is what does stuff, and software (also called "firmware" when it is supplied built-in) is what tells the processor what to do. And programming is the art of creating software to tell processors what to do. Every aspect of our lives these days - cars, communications, medical treatment, entertainment... - makes extensive use of computers in some way, and all these computers need to be told what to do.
Now d'you see why programming is important?

 

Let me present five nuggets of information that I hope will prove useful:

How to really be a programmer - #1 [ 10 PRINT "Hello!" ]

You may, if you have looked elsewhere on my site, have noticed I am rather partial to a little operating system known as RISC OS.
I will not be recommending this to you.
Nor Windows.
Or Linux.
Not even Plan9.

Instead, I will recommend, simply, whatever your friends use. You may think that programming is a solitary activity, and I'm certainly living the stereotype. Programming, maybe. But learning is better when it is a shared activity.
You could spend a year working alone on something that a group effort, with friends to bounce ideas off each other, could get sorted in a week.

In your group, you don't need to strive to be the best. Try not to be the worst. But pay close attention to both. Look at what the lame duck keeps getting wrong to know the pitfalls to avoid, and learn from the best of the group.

How to really be a programmer - #2 [ int main(void) { printf("Hello!\n"); return 0; } ]

Did you buy yourself a book, like "Teach yourself C++##! in 24 hours!"? If you did, please stand up and carry the book over to the dustbin, drop it in, and come back.

You cannot learn a language in 24 hours. You can get a notion of what it looks like and how it works, but actually learn it in an hour a day for a month (not including weekends)? Either you'd have to be impossibly bright, or you're just plain delusional.
Sorry to be harsh, but being a mediocre programmer will take you a year.
To be a good programmer? Count on a decade.
Obviously these times will be less if you spend every bit of free time you have in the process of learning. But 24 hours? Just... No.

How to really be a programmer - #3 [ <?php echo "Hello!\n"; ?> ]

Now step away from the computer. Don't even look to see what your compiler calls itself. There are two sides to programming. The awesome sitting-in-front-of-the-computer-bashing-out-code that you see in the movies? That's a sexy glamourisation of writing stuff. Seriously, I am writing this but I could just as easily be writing code. It's a little less exciting when said like that, isn't it?

The bigger, and more important, part of the puzzle has nothing to do with programming. Yes, I did just say that. I'll say it again. The more important part of programming has nothing to do with programming.
Confused?

A program is, essentially, a recipe. It is a sequence of instructions that will cause something to happen. With this, we have two clearly defined parameters. We have "what we start with" and we have "what we want".
When we start to think like a programmer, we will realise that there is a third part. The all-important "how do we get from this to that?".

You do not need to know syntax for this. You don't need to know what keys build your project. This can be done on a piece of paper. It is basic problem solving.
I will give you something to ponder: You have a list of names and addresses (don't worry about how they are arranged; as far as you are concerned, each name and address is an individual item). You want them to be sorted. To make life easier for you, the names are given in Lastname,Firstname style, so you can simply compare each one directly.
How do you sort them?

When you know what you start with, the steps you need to take, and what the end result will look like, then and only then is it time to start thinking of code.

But, do not fall into the trap of pride. When you have come up with a good idea, a good process, that is the time to ask yourself: Is there a better way I could be doing this?
Think about that, too. Some of my first ideas that I thought were great were actually varying degrees of suck. After thinking about it awhile, it is often possible to come up with a better way. Don't be afraid to reconsider. Better to throw away pieces of paper than screenfuls of code, right?
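If you have had a go on paper, here is one possible answer to the names puzzle above, as a little sketch in C. The choice of language and the array-of-strings layout are just assumptions for illustration. It uses an insertion sort - a classic "first good idea" - and the comment points at the better way you might find once you ask that "is there a better way?" question.

/* One possible answer to the exercise: a minimal sketch, assuming
   the names are held as an array of C strings in the
   "Lastname,Firstname" form described above. Insertion sort is a
   reasonable first idea; ask "is there a better way?" and you will
   eventually find things like the C library's qsort(). */

#include <stdio.h>
#include <string.h>

static void sort_names(const char *names[], int count)
{
    /* Take each name in turn and slide it left past any
       name that should come after it alphabetically. */
    for (int i = 1; i < count; i++) {
        const char *current = names[i];
        int j = i - 1;
        while (j >= 0 && strcmp(names[j], current) > 0) {
            names[j + 1] = names[j];
            j--;
        }
        names[j + 1] = current;
    }
}

int main(void)
{
    const char *names[] = { "Smith,John", "Brown,Alice", "Jones,Bob" };
    int count = sizeof(names) / sizeof(names[0]);

    sort_names(names, count);
    for (int i = 0; i < count; i++)
        printf("%s\n", names[i]);
    return 0;
}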

How to really be a programmer - #4 [ SWI OS_WriteS : ="Hello!\0\0" : SWI OS_NewLine : MOV PC, R14 ]

Here is the best piece of advice I can ever give you.

Buy an old '80s computer off of eBay. A Spectrum, a BBC Micro, an Apple II, a VIC-20, an Oric-1, a C64... It does not matter which. Look around and find schematics (official or made by third parties). Dig up datasheets. Can you find a disassembly and explanation of how the little operating system inside works? It is probably 8 or 16K of code, tiny by today's standards.
My personal recommendation is for a BBC Micro - this is based on the fact that there is a rich and complicated circuit board inside. Some other machines, such as the Oric-1, have a fairly low chip count, so there is less to fiddle with internally.

Now, open the thing up and study the circuit board. You will find a processor. Memory chips. ROMs (like memory, but read-only, so they don't lose their contents when switched off). I/O chips for handling stuff like keyboards and disc drives. Probably some sort of tape interface. As in audio tape, like you used to get in Walkmans!
But, wait, lurking underneath that is so much more. There is memory decoding, to take certain "locations in memory" (we call these "addresses") and direct the information from the processor to specific bits of hardware depending on what the address is. You can look at the reset circuitry, how it is wired up, and what it does. You can look at how the clock signal is passed through the system. If you are really really adventurous, many of the old machines ran at between 1 and 4MHz, which is within the range of a basic analogue oscilloscope, so you can actually jack in and see the data moving around inside the machine. Other options are to wire up LEDs to important signals and then fiddle the clock circuit to slow the machine right down.
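If "memory decoding" sounds abstract, here is roughly the same idea expressed as code - the sort of routine a software emulator of one of these machines would contain. The address ranges below are loosely modelled on the BBC Micro's memory map (RAM low, OS ROM high, hardware registers around &FE00), but treat them as illustrative guesses and check a real schematic for the genuine article. On the real machine this job is done by logic chips, not software.

/* A sketch of address decoding, emulator-style. The memory map is
   illustrative, loosely based on the BBC Micro; real machines
   differ, so consult the schematics. */

#include <stdint.h>
#include <stdio.h>

static uint8_t ram[0x8000];   /* 32K of RAM at &0000-&7FFF    */
static uint8_t rom[0x4000];   /* 16K of OS ROM at &C000-&FFFF */

/* Stand-in for reading a real hardware register. */
static uint8_t read_io(uint16_t address)
{
    printf("I/O read from &%04X\n", address);
    return 0x00;
}

/* The decode itself: look at the address, decide which chip responds. */
static uint8_t read_byte(uint16_t address)
{
    if (address >= 0xFE00 && address <= 0xFEFF)
        return read_io(address);        /* keyboard, video, timers... */
    if (address >= 0xC000)
        return rom[address - 0xC000];   /* the operating system       */
    if (address < 0x8000)
        return ram[address];            /* plain memory               */
    return 0xFF;                        /* nothing mapped: bus floats */
}

int main(void)
{
    ram[0x1000] = 42;
    printf("RAM read: %d\n", read_byte(0x1000));
    read_byte(0xFE00);
    return 0;
}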

You are probably wondering why I'm being crazy and telling you to poke around inside an ancient computer.
It is quite simple.
The computers of today are just like the computers of yesterday, only a heck of a lot more complicated and with everything squished together. Your average PC probably has three big lumps of silicon - a processor, a GPU, and a combined NorthBridge/SouthBridge (this talks to other hardware and/or to memory). The little ARM boards like the Beagle, the Raspberry Pi, and such? Most of the good stuff is inside a single chip that is about the size of a thumbnail. There is practically zero possibility of looking to see how modern computers work.
And there is practically zero possibility of you ever becoming a good programmer without having an understanding of how the machine itself reacts to the commands you give it.
For all of our illusions, all of our object oriented thingummies and Ruby and such, down underneath it all it is just a processor reading bits of data, and doing stuff with those bits of data. Understand a little of how this bare-metal process works, and you'll understand programming in a richer way.
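To make that concrete, here is the heart of every processor reduced to a toy in C: fetch a byte, decode it, do something, repeat. The three-instruction "machine" is invented purely for illustration - no real chip has this instruction set - but the shape of the loop is exactly what the BBC Micro's 6502, or the ARM in your phone, is doing underneath all the layers.

/* A toy fetch-decode-execute loop. The instruction set is made up
   for illustration; real processors have many more instructions,
   but the loop is the same shape. */

#include <stdint.h>
#include <stdio.h>

enum { OP_HALT = 0, OP_LOAD = 1, OP_ADD = 2, OP_PRINT = 3 };

int main(void)
{
    /* A tiny program in "machine code": load 40, add 2, print, halt. */
    uint8_t memory[] = { OP_LOAD, 40, OP_ADD, 2, OP_PRINT, OP_HALT };
    uint16_t pc = 0;    /* program counter: where we are in memory    */
    uint8_t  acc = 0;   /* accumulator: the one piece of working data */

    for (;;) {
        uint8_t opcode = memory[pc++];      /* fetch              */
        switch (opcode) {                   /* decode and execute */
        case OP_LOAD:  acc = memory[pc++];  break;
        case OP_ADD:   acc += memory[pc++]; break;
        case OP_PRINT: printf("%d\n", acc); break;
        case OP_HALT:  return 0;
        default:       return 1;            /* unknown opcode */
        }
    }
}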

How to really be a programmer - #5 [ ." Hello! " ]

Finally - keep at it.
An hour or two a week will introduce you to the piano. That's all.
To progress, you will need to take the initiative and practise on your own. Challenge yourself. Learn. And do enjoyable things.

Piano. A foreign language. Programming. They're all alike. No mystery. No secret shortcut.

It depends upon you.

Now answer me this: Do you want to be a programmer?

 

 

Your comments:

Gavin Wraith, 16th February 2014, 18:44
Dear Rick, why have I not found your blog earlier? I always enjoy your opinions and the way you express them. May I suggest a trawl through Dijkstra's works (http://www.cs.utexas.edu/users/EWD/) for entertaining wisdom about programming, education and the folly of politicians.
I like the sidebar. I will be back.

