The other day I mentioned to someone that, even though I am only a middle-aged man, I wrote my first computer program more than three decades ago. But don't worry, this blog entry is not going to be about nostalgia for a lost past. Quite the contrary: computing technology today is much better and more powerful, and it has matured to the point where it is a genuinely useful tool in everyday life.
Like so many others, I started my journey in programming with Basic, on a variety of machines, but with my Commodore home computer as the most intensively used environment. I wasn't really interested in games. Instead my projects (on which I worked after school and at weekends; I was still a teenager) were about databases (address books in particular), word processing (one of my programs let you quickly construct letters by picking common building blocks) and other such "useful" applications.
The machines we were using in those days were very basic compared to what we have today. The 6502-derived CPUs in my Commodores ran at clock speeds of roughly 1 MHz to 1.7 MHz, had 16 KB or 64 KB of memory (Commodore C16 and C64 respectively), and I mostly used cassette tapes for data storage. Within such tight constraints on memory and processing capacity, creating useful applications demanded that you, as a programmer, exploit every last bit of storage and every processing cycle to the fullest. That in turn required an intimate knowledge of the underlying hardware and, inevitably, learning machine language.
Certainly, the interpreted language that was Basic in those days was just not good enough if you needed speed and could not afford to waste memory on overheads. So, I devoured computer magazines that shared some of the Hidden Knowledge about chips, internal registers and protocols. I sought everywhere for information about the 6502 family, its instruction set and machine language principles. These days, the stuff is literally at your fingertips, on the Internet. It's hard to imagine again, and certainly impossible to explain to people who have grown up in the Internet age, how hard good information was to come by even just a few decades ago. Even trips to large libraries yielded only a meagre harvest compared to what I can Google now in microseconds.
But it did mean that I developed a very clear and detailed understanding of what happened inside the machine, and why a given piece of code behaved the way it did. Critically, understanding what went on at the lowest level helped immensely with writing better code in higher-level languages. As I moved from home computers to larger and newer platforms, I kept up this habit of understanding in detail what went on under the bonnet.

And this is precisely the skill that we no longer see in younger generations. They were brought up on higher-level languages only, or even just black-box frameworks, and nine out of ten programmers I meet have little or no knowledge of why things are the way they are. They frequently cannot distinguish between a sensible and a foolish architecture or algorithmic design, simply because the inner workings and their consequences are entirely opaque to them. There is a whole generation of coders out there severely handicapped by this lack of understanding, and I cannot but think that this is why there is so much badly designed, terribly bloated and unstable software about today. Anyone involved in enterprise-level platforms, CRM solutions and the like will know exactly what I am talking about.
As such, I very much welcome the launch of the Raspberry Pi and the recent clamour here in the UK for going back to basics: teaching children again what a computer actually is, and helping them understand what goes on inside the hardware and software. In a world where technology pervades ever more aspects of life, this is more important than ever - not least to ensure that our future is one in which we remain firmly in charge.