Like many developers, I have shelves of books that I consider important to keep within reach should I need them. While there are tons of resources online, a number of classics are worth having on the shelf. These include some of the books I wrote myself, if only so I can remind myself what I knew at some point in the past.
One of the books I found on my shelf caught my attention. It is a book I picked up from my kids’ school several years ago; the library was tossing it out, so I grabbed it. The book is called Microcomputers at Work by James Hargrove. It’s a kids’ book of about 45 pages, so old that it has a pocket in the back for the checkout card. It was written in the Orwellian year of 1984, well before the internet and many of today’s modern thrills.
The book covers a lot of basic concepts that are still relevant today: chips, disk drives, integrated circuits, processors, and even things like interpreters for “turning BASIC words into computer numbers.” It also includes a bit of history, with a discussion of ENIAC and vacuum tubes.
The Predicted Future of Computers
What inspired this short article was the ending of the book, where it looks at microcomputers in the future. There is a comment about how far computers had come from ENIAC in 1946 to modern times, which in this case meant 1984. That was roughly 38 years. Fittingly, we are now almost 38 years from the time the book was written.
So, what were the predictions for computers?
It was predicted that in the future, computers could have such large memories that “tapes and disks” would not be needed to store information. This prediction proved very accurate. While people use thumb drives and other removable media, those tend to be for backing up or sharing data as much as anything. Even then, more people simply use internet connectivity and cloud storage to save, back up, or share information.
The book also predicted that computer languages would become more and more like everyday speech. While there are natural-language-like programming languages, I’d say the world has still fallen a little short of being able to simply describe a problem in plain English and get a working program. We are, however, closer than we were 37 years ago. Languages like Scratch use drag-and-drop widgets to create programs nearly as easily as using natural language, an approach related to what is now known as low-code and no-code software development.
If you look online, you’ll find languages referred to as natural language programming languages. In many cases, they use English-like syntax but still require a lot of structure to get a working program. These include languages like AppleScript, COBOL, HyperTalk, SenseTalk, and many more. While natural-language programming has evolved, we are not fully there yet. Having said that, with the advances in artificial intelligence (AI) and machine learning (ML), it is easy to predict that this area will change dramatically over the next 37 to 38 years. In that time frame, I imagine you’ll be able to describe verbally to a computer what you want a program to do, and it will be able to generate the app.
The last prediction made in the book from 1984 will make most people chuckle because of how true it is today:
“Soon, even small computers will have ‘ears.’ One day, you might be able to say: ‘Computer! Do you think we’ll have snow tonight or an earthquake?’ and get an answer.”
Simply replace “Computer” with “Siri,” “Okay Google,” or “Cortana,” and you’ll get the answer along with many more details. I doubt anyone in 1984 fully comprehended how “small” a small computer would be, nor how good weather prediction would become thanks to computers and computer modeling.
The Orwellian Predictions
Of course, these predictions were written in the year 1984. It might be that the predictions made back in 1949, when Orwell wrote his novel about that year, were closer to the mark. With history being rewritten and technology able to track people’s every move, it could easily be said that, three or four decades before 1984, Orwell had a better idea of what the future of computing would be able to do.
Predicting the Future of Computing
Of course, it could be silly to try to predict where computing will be 37 to 38 years from today. I’m not sure how many people in 1984 would have predicted the level of connectivity we have in today’s world, where even minor things are connected as a result of computers. For example, the tires in some of our cars tell the car when they are low, and the cars tell the driver (and others) when they need service. There are shoes that track the number of steps taken by the wearer, and devices that let you pay for services by simply tapping them against the register. The power and reach of the internet was likely unfathomable to most people in 1984, much less the rise of the Internet of Things (IoT).
In the late 1980s, I took a C programming course at Microsoft University on the Microsoft campus in Redmond, Washington. One prediction the instructor made was that programming would change, splitting in two directions. Most people would use very high-level languages to create programs, just as the book predicted. These programmers could be non-technical end users; in today’s terms, they would be citizen developers. However, there would also be a small group of programmers who would go in the opposite direction. They would need to be highly skilled and technical because they would be writing the underlying widgets and code used by the citizen developers.
So, where are computers and software development headed? It is hard to say. With the advent of machine learning (ML), artificial intelligence (AI), connectivity (including wireless communications), improved storage, and hardware optimization, it is clear that things will continue to evolve and improve. Add in the changes and improvements in areas such as quantum computing, and the potential for change in the next three or four decades could blow away what was done in the previous three or four.
It’s going to be an interesting ride!