Technological Progress

OWID presents work from many different people and organizations. When citing this entry, please also cite the original data source. This entry can be cited as:

Max Roser (2016) – ‘Technological Progress’. Published online at OurWorldInData.org. Retrieved from: https://ourworldindata.org/technological-progress/ [Online Resource]

# Empirical View

# Moore’s Law – Exponential Increase of the Number of Transistors on Integrated Circuits

Moore’s Law is the observation that the number of transistors on integrated circuits doubles approximately every two years. This exponential progress is important because the capabilities of many digital electronic devices are strongly linked to Moore’s Law and are improving with it. I will show how aspects as diverse as processing speed, product price, memory capacity, and even the number and size of pixels in digital cameras have been progressing at a similar pace.

The law was described as early as 1965 by the Intel co-founder Gordon E. Moore after whom it is named.1

Before we look at the development so far, I have reprinted the famous little graph that Moore published in 1965.

# Moore’s Original Graph: ‘The Number of Components per Integrated Function’ – Intel2


As you can see, Moore had seven observations from 1959 until 1965, and he predicted continued growth, saying, “There is no reason to believe it will not remain nearly constant for at least 10 years”.3 As the next graph shows, he was not only right about the next ten years; astonishingly, the regularity he found has now held for more than half a century.

Note the logarithmic vertical axis chosen to show the linearity of the growth rate. The line corresponds to exponential growth with transistor count doubling every two years.
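To see why a constant doubling time appears as a straight line on a chart with a logarithmic vertical axis, here is a minimal sketch; the starting point of 2,300 transistors in 1971 is an illustrative assumption, not data taken from the chart.

```python
# Doubling every two years means N(t) = N0 * 2**((t - t0) / 2),
# so log2(N) grows linearly with time -- a straight line on a log-scale axis.
import math

N0, t0 = 2_300, 1971  # assumed starting point, for illustration only
for year in range(1971, 2021, 10):
    n = N0 * 2 ** ((year - t0) / 2)
    print(f"{year}: ~{n:,.0f} transistors, log2(N) = {math.log2(n):.1f}")
```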

# Transistor count over time

# Other Laws of Exponential Technological Progress

In the following, I show that technological developments in many different respects are growing exponentially. Moore’s early observation is important because it showed that technological advances progress not linearly but exponentially. But in and of itself, the doubling of transistors every two years does not directly matter in our lives. Therefore I ask in which ways the exponential growth of technology matters and give an overview of how exponential technological advancement drives technological and social change that very much matters for our lives now.

More important for us is that the power and speed of computers have increased exponentially; the doubling time of computational capacity for personal computers was 1.5 years between 1975 and 2009. The increasing power of a wider range of computers – starting with the first general-purpose computer (ENIAC) in 1946 – is shown in the following graph.
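As a rough back-of-the-envelope check of what a 1.5-year doubling time implies over that period:

```python
# With a doubling time of 1.5 years, how much did personal-computer
# capacity grow between 1975 and 2009?
years = 2009 - 1975            # 34 years
doubling_time = 1.5            # years per doubling, as cited in the text
doublings = years / doubling_time
growth_factor = 2 ** doublings
print(f"{doublings:.1f} doublings -> roughly a {growth_factor:,.0f}-fold increase")
```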

# Exponentially increasing computational capacity over time (computations per second) – Koomey, Berard, Sanchez, and Wong (2011)4


# Increasing Product Quality for a Decreasing Price

Increasing computational power – and increasing product quality – indeed matters more than a mere doubling of transistors. But if technologically advanced products are prohibitively expensive, they can only have a limited impact on society as a whole. For this reason, it is interesting to look at both product quality and price. The author and inventor Ray Kurzweil analyzed the change in price and quality of computing machines since 1900. He not only analyzed the improvements of integrated circuits but also looked at their predecessors – earlier transistors, vacuum tubes, relays, and electromechanical computers. What he found is that Moore not only made a valid prediction of the future; his description is also valid for the past. The exponential growth rate that Moore picked up in the 1960s had been driving technological progress since the beginning of the century.

The following graph shows the computer power that consumers could purchase for a price of $1000. It is especially insightful if one wants to understand how technological progress mattered as a driver of social change.

# Calculations per second per $1000 – exponential growth of computing for 110 years – Kurzweil5


The extension of the time frame also makes clear how our modern computers evolved. It is an insightful way of understanding that the computer age really is the successor to the Industrial Revolution.

One could also view the previous graph as a function of price instead of calculations per second; in this view you would find an exponentially decreasing price for a given product quality over 110 years.
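A small sketch of this reciprocal relationship; the numbers below are made up purely for illustration and are not Kurzweil’s data.

```python
# If calculations per second per dollar double every period, then the price
# of a fixed amount of computing power halves every period.
budget = 1_000                                     # dollars
calcs_per_dollar = [1e6 * 2**k for k in range(5)]  # doubling each period (illustrative)
for k, cpd in enumerate(calcs_per_dollar):
    price_per_gigacalc = 1e9 / cpd                 # dollars per 10^9 calcs/s
    print(f"period {k}: {cpd * budget:.2e} calcs/s per ${budget}, "
          f"${price_per_gigacalc:,.2f} per 10^9 calcs/s")
```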

The implication of this rapid simultaneous improvement in quality and decrease in price is that, according to a detailed discussion on reddit (here), a current laptop (May 2013) has about the same computing power as the most powerful computer on Earth in the mid-1990s.

# Exponential Increase of the Electrical Efficiency (computing per kWh) of Computing

The cost of keeping the machine running also matters. Electrical efficiency measures the computational capacity per unit of energy, and it is also important with respect to the environmental impact of energy production.

The progress in this respect has been tremendous: researchers found that over the last six decades the energy demand for a fixed computational load halved every 18 months.6
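A quick calculation shows what halving every 18 months implies cumulatively over six decades:

```python
# If the energy needed for a fixed computational load halves every 18 months,
# the cumulative gain in computations per kWh over 60 years is enormous.
years = 60
halving_time = 1.5                # years, as stated in the text
halvings = years / halving_time   # 40 halvings
improvement = 2 ** halvings
print(f"{halvings:.0f} halvings -> about a {improvement:.1e}-fold gain in computations per kWh")
```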

# Computations per kilowatt-hour over time – Koomey, Berard, Sanchez, and Wong (2011)7


# Exponential Increase of Computer Memory – Exponentially Increasing Storage Capacity and Decreasing Storage Costs

Looking at the two pictures below, it becomes immediately clear how fast technological progress has increased storage capacity. The left image shows a hard disk from 1956 with a storage capacity of 5 MB; on the right is a cheap modern Micro SD card that stores as much as 12,800 of the old IBM drives.

# The IBM Model 350 disk file with a storage space of 5MB & an 8GB Micro SD Card8

Considering the time since the introduction of the IBM 350 in 1956, the growth rate of storage capacity has not been as constant as for the other measures discussed before. Early on, technological revolutions boosted capacity in discrete steps rather than at a steady rate. Yet since 1980, progress has been very steady, and at an even higher rate than the increase in computer speed.

# Increasing hard drive capacity, 1980-2011 – Wikipedia9


# Exponential Increase of the Quality & Exponential Decrease of the Price of Digital Cameras

Exponential technological progress is not only found in computing machines. Cameras are another example: for a given price, consumers can buy cameras with more and more pixels, and the number of pixels available per dollar has again increased exponentially. This is shown in the following graph.

# Exponential growth of pixels per Australian dollar, 1994-2005 – Wikipedia10


# The Future of Exponential Technological Growth

The exponential growth rates that we have observed over the last decades seem to promise more exciting technological advances in the future.

Many other types of technology have seen exponential growth rates beyond the ones discussed above. Two exceptionally promising examples are Butters’ Law of Photonics and Rose’s Law. Butters’ Law says that the amount of data one can transmit through an optical fiber doubles every nine months; equivalently, the cost of transmitting a given amount of data by optical fiber halves every nine months. Rose’s Law describes the exponential growth of the number of qubits in quantum computers. If this growth rate remains constant, it leads to some mind-bending opportunities.11
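A small sketch of this conversion; the starting values are arbitrary and only illustrate how a doubling of capacity per fiber corresponds to a halving of transmission cost, assuming the cost of operating the fiber link stays roughly constant.

```python
# Each period below is 9 months: capacity per fiber doubles,
# so the cost of sending a fixed amount of data is cut in half.
capacity = 1.0       # data per second per fiber (arbitrary units)
cost_per_unit = 1.0  # cost of sending one unit of data (arbitrary units)
for period in range(5):
    print(f"after {period * 9:2d} months: capacity x{capacity:g}, cost per unit x{cost_per_unit:g}")
    capacity *= 2
    cost_per_unit /= 2
```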

# Number of digits in the largest known prime since computers started looking for them, 1952-2008 – Wikipedia12
