# Empirical View
# Moore’s Law – Exponential Increase of the Number of Transistors on Integrated Circuits
Moore’s Law is the observation that the number of transistors on integrated circuits doubles approximately every two years. This accelerating progress is important as the capabilities of many digital electronic devices are strongly linked to Moore’s Law and are improving exponentially. I will show how aspects as diverse as processing speed, product price, memory capacity, and even the number and size of pixels in digital cameras have been progressing with similar speed.
The law was described as early as 1965 by the Intel co-founder Gordon E. Moore after whom it is named.1
Before we look at the development so far, I have reprinted the famous little graph that Moore published in 1965.
As you can see, Moore had seven observations from 1959 until 1965, and he predicted continuing growth, saying, “There is no reason to believe it will not remain nearly constant for at least 10 years”.3 As the next graph shows, he was not only right about the next ten years; astonishingly, the regularity he found has held for more than half a century now.
Note the logarithmic vertical axis chosen to show the linearity of the growth rate. The line corresponds to exponential growth with transistor count doubling every two years.
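A doubling every two years corresponds to the simple rule N(t) = N₀ · 2^((t − t₀)/2). As a rough illustration of what that rule implies (the anchor year and transistor count below are illustrative, loosely based on the Intel 4004, and not Moore's own data points):

```python
def transistors(year, base_year=1971, base_count=2300, doubling_years=2):
    """Project transistor count under a strict two-year doubling rule.

    base_year/base_count roughly match the Intel 4004 (1971, ~2,300
    transistors); they are illustrative anchors, not fitted values.
    """
    return base_count * 2 ** ((year - base_year) / doubling_years)

# Doubling every two years implies 2**5 = 32x growth per decade.
print(transistors(1981) / transistors(1971))  # → 32.0
```

On a logarithmic vertical axis this exponential appears as a straight line, which is exactly why the graph above uses one.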
# Other Laws of Exponential Technological Progress
In the following, I show that technological developments in many different respects are growing exponentially. Moore’s early observation is important as it showed that technological advances do not progress linearly but exponentially. But in and of itself, the doubling of transistors every two years does not directly matter in our lives. Therefore I ask in which ways the exponential growth of technology matters and give an overview of how exponential technological advancement drives technological and social change that very much matters for our lives now.
More important for us is that the power and speed of computers have increased exponentially; the doubling time of computational capacity for personal computers was 1.5 years between 1975 and 2009. The increasing power of a wider range of computers – starting with the first general-purpose computer (ENIAC) in 1946 – is shown in the following graph.
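A doubling time of 1.5 years compounds dramatically over the 1975–2009 window mentioned above. A quick back-of-the-envelope calculation of the implied growth factor (a sketch of the stated rate, not the fitted result from the underlying study):

```python
def growth_factor(years, doubling_time):
    # Total growth implied by a fixed doubling time over an interval.
    return 2 ** (years / doubling_time)

# 34 years at a 1.5-year doubling time: 2**(34/1.5),
# roughly a 6.7-million-fold increase in capacity.
factor = growth_factor(2009 - 1975, doubling_time=1.5)
print(f"{factor:.3g}")
```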
# Exponentially increasing computational capacity over time (computations per second) – Koomey, Berard, Sanchez, and Wong (2011)4
# Increasing Product Quality for a Decreasing Price
Increasing computational power – and increasing product quality – indeed matters more than a mere doubling of transistors. But if technologically advanced products are prohibitively expensive, they can only have a limited impact on society as a whole. For this reason, it is interesting to look at both product quality and price. The author and inventor Ray Kurzweil analyzed the change in price and quality of computing machines since 1900. He not only analyzed the improvements of integrated circuits but also looked at their predecessors – earlier transistors, vacuum tubes, relays, and electromechanical computers. What he found is that Moore not only made a valid prediction about the future; his description also holds for the past! The exponential growth rate that Moore picked up in the 1960s had been driving technological progress since the beginning of the century.
The following graph shows the computer power that consumers could purchase for a price of $1000. It is especially insightful if one wants to understand how technological progress mattered as a driver of social change.
The extension of the time frame also makes clear how our modern computers evolved. It is an insightful way of understanding that the computer age really is the successor to the Industrial Revolution.
One could also view the previous graph as a function of price instead of calculations per second; in this view you would find an exponentially decreasing price for a given product quality over 110 years.
The implication of this rapid simultaneous improvement in quality and decrease in price is that, according to a detailed discussion on Reddit, a current laptop (May 2013) has about the same computing power as the most powerful computer on Earth in the mid-1990s.
# Exponential Increase of the Electrical Efficiency (computing per kWh) of Computing
The cost to keep the machine running also matters. Electrical efficiency measures the computational capacity per unit of energy, and it is also important with respect to the environmental impact that energy production has.
The progress in this respect has been tremendous: researchers found that over the last six decades the energy demand for a fixed computational load halved every 18 months.6
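Halving the energy demand every 18 months compounds to an enormous efficiency gain over six decades. The implied factor can be sketched directly from the stated rate (this is an illustration of the rate, not the researchers' underlying data):

```python
def efficiency_gain(years, halving_time=1.5):
    # Energy per fixed computational load halves every `halving_time`
    # years, so electrical efficiency (computations per kWh) doubles
    # at the same rate.
    return 2 ** (years / halving_time)

# Six decades at an 18-month halving time: 2**40, about a
# trillion-fold (~1.1e12) improvement in computations per kWh.
print(efficiency_gain(60))
```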
# Exponential Increase of Computer Memory – Exponentially Increasing Storage Capacity and Decreasing Storage Costs
Looking at the two pictures below, it becomes immediately clear how fast technological progress has increased storage capacity. The left image shows a hard disk from 1956 with a capacity of 5 MB; on the right is a cheap modern MicroSD card that stores as much as 12,800 of the old IBM drives.
Considering the time since the introduction of the IBM 350 in 1956, the growth rate of storage capacity has not been as constant as for the other measures discussed before. Early on, technological revolutions boosted capacity stepwise rather than continuously. Yet since 1980, progress has been very steady, and at an even higher rate than the increase in computer speed.
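The 12,800× figure from the photo comparison can be checked with simple arithmetic: scaling the 5 MB drive by that factor gives 64 GB, a common MicroSD capacity (the exact capacity of the pictured card is my assumption for illustration):

```python
old_capacity_mb = 5    # IBM 350 hard disk, 1956
ratio = 12_800         # factor quoted in the text

# Convert to gigabytes using decimal units (1 GB = 1000 MB),
# as storage vendors do.
modern_capacity_gb = old_capacity_mb * ratio / 1000
print(modern_capacity_gb)  # → 64.0
```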
# Exponential Increase of the Quality & Exponential Decrease of the Price of Digital Cameras
Exponentially advancing technological progress is not only found in computing machines. Cameras are a different example: for a given price, consumers can buy cameras with more and more pixels. The number of pixels has again increased exponentially, as the following graph shows.
# The Future of Exponential Technological Growth
The exponential growth rates that we have observed over the last decades seem to promise more exciting technological advances in the future.
Many other types of technology have seen exponential growth rates beyond the ones discussed above. Two exceptionally promising examples are Butters’ Law of Photonics and Rose’s Law. Butters’ Law says that the amount of data one can transmit using optical fiber doubles every nine months; equivalently, the cost of transmission by optical fiber halves every nine months. Rose’s Law describes the exponential growth of the number of qubits in quantum computers. If this growth rate remains constant, it leads to some mind-bending opportunities.11
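The two phrasings of Butters’ Law are just reciprocals of each other: if transmission capacity per dollar doubles every nine months, the cost per transmitted bit halves on the same schedule. A minimal sketch of that equivalence (the starting cost of 1.0 is an arbitrary normalization, not a real price):

```python
def cost_per_bit(months, initial_cost=1.0, doubling_months=9):
    # Capacity per dollar doubles every `doubling_months` months,
    # so the cost of sending one bit halves at the same rate.
    return initial_cost / 2 ** (months / doubling_months)

# After 18 months (two halvings) the cost has fallen to a quarter.
print(cost_per_bit(18))  # → 0.25
```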
# Number of digits in the largest known prime since computers started looking for them, 1952-2008 – Wikipedia12