My previous post, Simultaneous Ingress, Runkeeper, last.fm, Spotify – let’s just call it MORES law, has resulted in some Google+ chatter. Specifically, the chatter focuses upon this statement that I made:
Things have changed significantly in the computer world between 1982 and 2012, and they’ll change a lot more between 2012 and 2042.
Upon seeing this statement, Tad Donaghe said:
They’re going to change a lot more in the next 10 years than in the last 50…
Michael Cohen, who reshared my original post on Google+, agreed with Donaghe’s assessment.
And they’re right. Well, they’re right within the constraints previously specified in this blog – despite the changes going on around us, we’ll still have hopes and fears just like the people from thousands of years ago.
You can see that Cohen and Donaghe are right by looking at the past. There was technological change between 1970 and 1980, and there was change between 1980 and 1990 – and a valid argument can be made that the change in the 1980s exceeded the change in the 1970s.
But why, when the underlying hardware improvements arrived at a roughly constant pace, did the overall technological change look more exponential? Linas Vepstas examined this question.
Every year, there is a 5% or 10% or 20% increase in the size of a silicon die that a computer CPU is made out of. The change is not exponential, it is linear, more or less.
Yup, that’s what we said before. But Vepstas goes on:
There are also small, incremental improvements in all aspects of die-making technology.
After listing examples of these incremental die-making technology improvements, Vepstas goes on to discuss other advances, such as improvements in the machinery used to make the chips, in machine components, and the like.
And Vepstas doesn’t even talk about the things that use the chip – the hardware, the software, and the infrastructure such as the networks. The improvements in the chip allow all sorts of improvements in other areas. You’re not going to be able to host a graphical multi-font word processing application on a windowed operating system if you only have an Intel 4004 chip.
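The compounding argument can be sketched numerically: several small, roughly constant-percentage improvements, multiplied together year after year, produce exponential growth overall. Here's a minimal illustration in Python – the improvement rates are made up for the sake of the example, not real industry figures:

```python
# Sketch: a few independent yearly improvements, each small and roughly
# constant, compound multiplicatively into exponential overall growth.
# The rates below are illustrative only.
yearly_gains = [1.05, 1.10, 1.20]  # e.g. die size, process, machinery

capability = 1.0
for year in range(10):
    for gain in yearly_gains:
        capability *= gain

# Each year is a constant ~39% combined gain, yet after a decade
# capability is roughly 26x, not the ~4x a linear reading would suggest.
print(round(capability, 1))
```

The point is that none of the individual improvements needs to accelerate; steady multiplication alone is enough to make the overall curve look exponential.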
Ray Kurzweil also discusses this:
Most long range forecasts of technical feasibility in future time periods dramatically underestimate the power of future technology because they are based on what I call the “intuitive linear” view of technological progress rather than the “historical exponential view.” To express this another way, it is not the case that we will experience a hundred years of progress in the twenty-first century; rather we will witness on the order of twenty thousand years of progress (at today’s rate of progress, that is).
Later in the essay, Kurzweil goes on to state:
Indeed, we find not just simple exponential growth, but “double” exponential growth, meaning that the rate of exponential growth is itself growing exponentially.
Kurzweil then uses the term “paradigm shift,” but unlike some people, he uses the term correctly.
The first technological steps (sharp edges, fire, the wheel) took tens of thousands of years. For people living in this era, there was little noticeable technological change in even a thousand years. By 1000 A.D., progress was much faster and a paradigm shift required only a century or two. In the nineteenth century, we saw more technological change than in the nine centuries preceding it. Then in the first twenty years of the twentieth century, we saw more advancement than in all of the nineteenth century. Now, paradigm shifts occur in only a few years time.
Or, when you work through the math,
So the twenty-first century will see almost a thousand times greater technological change than its predecessor.
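That arithmetic can be roughly reproduced. Kurzweil's stated assumption is that the rate of progress doubles every decade; if progress is measured in "years of progress at the year-2000 rate," the twenty-first century piles up on the order of ten thousand such years, while the twentieth century, working the same doubling backward, accumulated only about ten. A sketch under that assumption (Kurzweil's own figure of ~20,000 comes out higher because he compounds within decades as well):

```python
# Kurzweil's assumption: the rate of progress doubles every decade.
# Progress is measured in "years of progress at the year-2000 rate."
def century_progress(first_decade_rate):
    """Total progress over ten decades, with the rate doubling each decade."""
    return sum(10 * first_decade_rate * 2**k for k in range(10))

c21 = century_progress(1.0)      # 2000-2100, starting at today's rate
c20 = century_progress(2**-10)   # 1900-2000, ten doublings earlier

print(round(c21))       # ~10,230 "years" with this coarse decade-step model
print(round(c21 / c20)) # ratio is 2**10 = 1024: "almost a thousand times"
```

However you tune the details, the ratio between consecutive centuries works out to the number of doublings per century – here 2^10 ≈ 1000, which is where the "almost a thousand times" comes from.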
Then Kurzweil starts talking about the Singularity…but I think I’ll stop there.