What is Funny about Moore’s Law?
A fully updated Chapter 33 of Our Long Walk to Economic Freedom
The fully revised, updated and expanded edition of Our Long Walk to Economic Freedom is now available on Kindle or in any good South African bookshop. Over the last few months, paid subscribers have received early versions of seven chapters. Find them here: Chapter 3, Chapter 5, Chapter 11, Chapter 16, Chapter 20, Chapter 27 and Chapter 36. Consider a paid subscription to read them – and this one – in full.
You wouldn’t call it a classic joke. It’s more of a quip, to be honest; something you might hear at a computer-science convention. It is said that the number of people predicting the end of Moore’s law doubles every two years. Lol.
For the uninitiated, Moore’s law refers to Gordon Moore’s prediction, in 1965, that the number of transistors on a computer microchip would double roughly every two years while the cost of computing would halve. It was a brave prediction to make when microprocessors and home computers were still just a distant dream. But despite the countless experts predicting the demise of Moore’s law, as the quip insinuates, it has held true for nearly six decades.
Today’s microchips are millions of times more powerful than the chips that guided Apollo 11 to the moon in 1969. It gets even more remarkable: the smartphone in your pocket is now faster than IBM’s Deep Blue, perhaps the most famous supercomputer ever built, which beat Garry Kasparov in their historic 1997 chess showdown. That is the power of exponential growth.
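A rough back-of-envelope sketch in Python shows how a doubling every two years compounds into a gain of that magnitude. The 1969 starting point and 2023 endpoint are illustrative assumptions, not figures from the chapter:

```python
# Back-of-envelope: how a doubling every two years compounds over decades.
# The start and end years below are assumptions chosen for illustration.
start_year = 1969   # Apollo 11's guidance computer
end_year = 2023     # roughly today

doublings = (end_year - start_year) / 2   # one doubling every two years
growth_factor = 2 ** doublings            # total multiplicative gain

print(f"Doublings since {start_year}: {doublings:.0f}")
print(f"Implied growth factor: {growth_factor:,.0f}x")
# 27 doublings give a factor of about 134 million,
# i.e. 'millions of times' more computing capacity.
```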
This exponential growth of the computer chip has transformed our lives in ways that no one at the dawn of the ICT revolution could have imagined. We now open our smartphones – an invention of the 1990s – to read our daily news, order a lift or food delivery, make payments, watch videos and listen to music, have meetings, attend lectures and interact with friends via social media sites such as TikTok and X. Sometimes we may even call someone to chat. But, surprising as it may be to us now, the benefits of microchips and home computers were not immediately obvious to everyone. In fact, by the end of the 1980s economists were puzzled: why had productivity growth been so sluggish despite huge improvements in computing technology? The Nobel Prize-winning economist Robert Solow summarised the paradox at the time: ‘You can see the computer age everywhere but in the productivity statistics.’1