
Since 2010, the training computation of notable AI systems has doubled every six months

[Chart: The computation used to train notable AI systems, measured in total floating-point operations (FLOP), across two distinct eras. In the first era, from 1950 to 2010, training computation doubled approximately every 21 months; with the rise of deep learning since 2010, it has doubled approximately every 6 months. The y-axis ranges from 100 FLOP to 100 septillion FLOP. Highlighted systems range from early ones such as Theseus and the Perceptron Mark I to recent ones such as GPT-4 and Gemini 1.0 Ultra.]

Artificial intelligence has advanced rapidly over the past 15 years, fueled by the success of deep learning.

A key reason for the success of deep learning systems has been their ability to keep improving with a staggering increase in the inputs used to train them — especially computation.

Before deep learning took off around 2010, the amount of computation used to train notable AI systems doubled about every 21 months. But as the chart shows, this has accelerated significantly with the rise of deep learning: training computation now doubles roughly every six months.
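To see what these doubling times imply, here is a minimal sketch, assuming clean exponential growth (the real trend is noisier): a 21-month doubling time works out to roughly 1.5× growth per year, while a six-month doubling time means 4× per year, a gap that compounds enormously over a decade.

```python
def growth_over(months: float, doubling_time_months: float) -> float:
    """Multiplicative growth over a period, given a constant doubling time."""
    return 2 ** (months / doubling_time_months)

print(f"Pre-2010 era, growth per year:        {growth_over(12, 21):.1f}x")   # ~1.5x
print(f"Deep-learning era, growth per year:   {growth_over(12, 6):.1f}x")    # 4.0x
print(f"Pre-2010 era, growth per decade:      {growth_over(120, 21):,.0f}x") # ~53x
print(f"Deep-learning era, growth per decade: {growth_over(120, 6):,.0f}x")  # ~1,000,000x
```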

As one example of this pace: Google's Gemini 1.0 Ultra, released in 2023, used 100 million times more training computation than AlexNet, the system that represented a breakthrough in computer vision just 11 years earlier.
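That ratio is easy to sanity-check against rough, commonly cited order-of-magnitude estimates of each system's training compute; the specific FLOP counts below are assumptions for illustration, not exact values.

```python
# Assumed order-of-magnitude training compute estimates, for illustration only.
alexnet_flop = 4.7e17   # assumed training compute for AlexNet (2012)
gemini_flop = 5e25      # assumed training compute for Gemini 1.0 Ultra (2023)

print(f"Ratio: {gemini_flop / alexnet_flop:.0e}")  # ~1e+08, i.e. 100 million times
```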

To put this in perspective, training Gemini 1.0 Ultra required roughly the same amount of computation as 50,000 high-end graphics cards running nonstop for an entire year.
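A back-of-the-envelope calculation shows how a comparison like this falls out. Both constants below are assumptions: an estimated total training compute of ~5×10²⁵ FLOP, and a plausible sustained throughput of ~3×10¹³ FLOP per second for one high-end 2023-era graphics card.

```python
SECONDS_PER_YEAR = 365 * 24 * 3600   # ~3.15e7 seconds
TRAINING_FLOP = 5e25                 # assumed total training compute
GPU_FLOPS = 3e13                     # assumed sustained FLOP/s per GPU

gpus_needed = TRAINING_FLOP / (GPU_FLOPS * SECONDS_PER_YEAR)
print(f"GPUs running nonstop for a year: {gpus_needed:,.0f}")  # ~53,000
```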

Read more about how scaling up inputs has made AI more capable in our new article by Veronika Samborska.
