Technology
Charted: The Exponential Growth in AI Computation
Electronic computers emerged in the 1940s and had barely been around for a decade before experiments with AI began. Now we have AI models that can write poetry and generate images from text prompts. But what has driven such exponential growth in so short a time?
This chart from Our World in Data tracks the history of AI through the amount of computation power used to train an AI model, using data from Epoch AI.
The Three Eras of AI Computation
In the 1950s, American mathematician Claude Shannon trained a robotic mouse called Theseus to navigate a maze and remember its course—the first apparent artificial learning of any kind.
Theseus was built on just 40 floating point operations (FLOPs). A FLOP counts a single basic arithmetic operation (addition, subtraction, multiplication, or division); note that this total count is distinct from FLOPS, which measures how many such operations a processor performs per second.
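To make the unit concrete, here is a minimal sketch of how FLOPs are counted for a common workload, a dense matrix multiply. The 2mkn formula is a standard convention, not something from the source article.

```python
def matmul_flops(m: int, k: int, n: int) -> int:
    """FLOPs for an (m x k) @ (k x n) matrix multiply.

    Each of the m*n output elements needs k multiplications and
    k - 1 additions, conventionally rounded to ~2*m*k*n FLOPs total.
    """
    return 2 * m * k * n

# A single 1000 x 1000 matrix multiply already costs ~2 billion FLOPs,
# dwarfing the 40 FLOPs used to train Theseus.
print(matmul_flops(1000, 1000, 1000))  # 2000000000
```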
Computation power, availability of training data, and algorithms are the three main ingredients of AI progress. And for the first few decades of AI advances, compute, the computational power used to train an AI model, grew according to Moore's Law.
Period | Era | Compute Doubling |
---|---|---|
1950–2010 | Pre-Deep Learning | 18–24 months |
2010–2016 | Deep Learning | 5–7 months |
2016–2022 | Large-scale models | 11 months |
Source: “Compute Trends Across Three Eras of Machine Learning” by Sevilla et al., 2022.
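The difference between these doubling times compounds dramatically. A small sketch, using the approximate doubling periods from the table above, shows how much more compute each era accumulates over a decade:

```python
def growth_factor(months_elapsed: float, doubling_months: float) -> float:
    """Total compute growth after a period, given a doubling time."""
    return 2 ** (months_elapsed / doubling_months)

# Over 120 months (a decade):
# Pre-Deep Learning pace (~20-month doubling): 2^6 = 64x growth.
print(round(growth_factor(120, 20)))  # 64
# Deep Learning Era pace (~6-month doubling): 2^20, over a million-fold.
print(round(growth_factor(120, 6)))   # 1048576
```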
However, at the start of the Deep Learning Era, heralded by AlexNet (an image recognition AI) in 2012, that doubling timeframe shortened considerably to roughly six months, as researchers invested more in computation and processors.
With the emergence of AlphaGo in 2015—a computer program that beat a human professional Go player—researchers have identified a third era: that of the large-scale AI models whose computation needs dwarf all previous AI systems.
Predicting AI Computation Progress
Looking back at only the last decade, compute has grown so tremendously that it's difficult to comprehend.
For example, the compute used to train Minerva, an AI that can solve complex math problems, is nearly six million times that used to train AlexNet a decade earlier.
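That ratio is easy to verify from the training-compute figures in the table below:

```python
# Training compute in petaFLOPs, from the table of notable AI models.
alexnet = 470      # AlexNet, 2012
minerva = 2.7e9    # Minerva, 2022

# The ratio works out to roughly 5.7 million, i.e. "nearly 6 million times".
print(minerva / alexnet)
```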
Here’s a list of important AI models through history and the amount of compute used to train them.
AI | Year | FLOPs |
---|---|---|
Theseus | 1950 | 40 |
Perceptron Mark I | 1957–58 | 695,000 |
Neocognitron | 1980 | 228 million |
NetTalk | 1987 | 81 billion |
TD-Gammon | 1992 | 18 trillion |
NPLM | 2003 | 1.1 petaFLOPs |
AlexNet | 2012 | 470 petaFLOPs |
AlphaGo | 2016 | 1.9 million petaFLOPs |
GPT-3 | 2020 | 314 million petaFLOPs |
Minerva | 2022 | 2.7 billion petaFLOPs |
Note: One petaFLOP = one quadrillion FLOPs. Source: “Compute Trends Across Three Eras of Machine Learning” by Sevilla et al., 2022.
The result of this growth in computation, along with the availability of massive data sets and better algorithms, has yielded a lot of AI progress in seemingly very little time. Now AI doesn’t just match, but also beats human performance in many areas.
It’s difficult to say if the same pace of computation growth will be maintained. Large-scale models require increasingly more compute power to train, and if computation doesn’t continue to ramp up it could slow down progress. Exhausting all the data currently available for training AI models could also impede the development and implementation of new models.
However, with all the funding poured into AI recently, perhaps more breakthroughs are around the corner, like matching the computation power of the human brain.
Where Does This Data Come From?
Source: “Compute Trends Across Three Eras of Machine Learning” by Sevilla et al., 2022.
Note: The estimated time for computation to double can vary across research attempts, including Amodei and Hernandez (2018) and Lyzhov (2021). This article is based on our source’s findings. Please see their full paper for further details. Furthermore, the authors are cognizant of the framing concerns with deeming an AI model “regular-sized” or “large-sized” and said further research is needed in the area.
Methodology: The authors of the paper used two methods to determine the amount of compute used to train AI models: counting the number of operations and tracking GPU time. Both approaches have drawbacks, namely a lack of transparency around training processes and growing complexity as ML models scale.
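Both methods can be sketched as back-of-the-envelope formulas. The 6ND approximation for operation counting and the GPU-time formula below are widely used heuristics for transformer-style models, not the paper's exact procedures; the GPT-3 figures are illustrative.

```python
def counted_flops(n_params: float, n_tokens: float) -> float:
    """Method 1, counting operations: a common heuristic puts
    training compute at ~6 FLOPs per parameter per training token."""
    return 6 * n_params * n_tokens

def gpu_time_flops(n_gpus: int, peak_flops: float,
                   utilization: float, seconds: float) -> float:
    """Method 2, tracking GPU time: hardware peak throughput,
    discounted by realized utilization, times wall-clock seconds."""
    return n_gpus * peak_flops * utilization * seconds

# GPT-3 (~175B parameters, ~300B tokens) gives ~3.15e23 FLOPs,
# close to the 314 million petaFLOPs (3.14e23) in the table above.
print(counted_flops(175e9, 300e9))
```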

This article was published as a part of Visual Capitalist's Creator Program, which features data-driven visuals from some of our favorite Creators around the world.
AI Week Wrap Report: See All the Visuals in One Place
AI Week, sponsored by Terzo, is Visual Capitalist’s in-depth exploration of the latest insights in the world of artificial intelligence.

AI Week was our first-of-its-kind editorial event. In it, we examined the latest AI insights, including the amount each nation is spending on AI, how people utilize AI today, the skills required for a career in AI, and much more.
All these insights were drawn from the 2025 Stanford AI Index and other cutting-edge sources, and all are available at Visual Capitalist’s AI content hub, brought to you by our partners at Terzo.
Tap Into AI Week
This week, we examined several key areas of the AI landscape in 2025.
First, we examined the state of global AI investment, discovering that the U.S. has raised nearly half a trillion dollars in private AI investment since 2013—the most of any nation.
Following the U.S. were China and the U.K., which invested $120 billion and $30 billion, respectively, during the same period.
Patent filings serve as a means to measure innovation and leadership in the technology sector while also providing legal protection for novel ideas and inventions.
In our second post, we examined AI leadership by analyzing the number of AI patents filed by major nations. We found that China has filed 70% of all global AI patents in recent years, the most of any nation.
However, evidence does suggest that many of these patents were applied for and protected within China alone.
AI can be found in nearly every digital product today. So, in the third post, we explored how people utilize AI today.
Here, we found that the primary reason people use AI today is for professional and personal support, showing that AI can assist humans in managing both their emotions and their daily lives.
AI also continues to find many uses in content creation, learning, and creativity.
A significant aspect of the conversation surrounding AI models is the substantial amount of money tech giants are investing in their training.
We examined corporate investment in various AI models, finding that in recent years, Google has spent the most. Although data is limited, it’s believed that the company spent $192 million on training Gemini 1.0 Ultra, the highest amount across all leading models.
The conversation around AI has also raised the question of whether humans or machines are faster at technical tasks.
While AI systems have historically fallen short compared to humans, the gap has narrowed considerably over the past year. Now, AI surpasses humans in specific technical skills, including advanced mathematics and visual reasoning.
The advent of AI has also created the need for AI-based jobs. Stanford University’s 2025 AI Index examined AI job postings throughout the U.S. and found that the most sought-after skill is the programming language Python.
Computer science, data analysis, and an understanding of the Agile working methodology were also identified as valuable AI skills.
For our final AI Week graphic, we pitted American AIs against their Chinese counterparts in a test of performance.
The graphic charts the performance of the top U.S. and Chinese AI models on LMSYS’s Chatbot Arena. It shows that while U.S. models have consistently outperformed Chinese models, the performance gap has closed dramatically in recent years.
Helping the World to Discover Your Data
At Visual Capitalist, we craft campaigns like AI Week that tackle our clients’ key challenges.
Whether by making your data more discoverable, leveraging our brand and audience of 12 million people per month, or consulting and educating around data discovery, our goal is to help you isolate the signal from the noise.
If you want to learn how companies like Terzo, BlackRock, MSCI, and Morningstar grew their brands by partnering with Visual Capitalist, contact us today.