Charted: The Exponential Growth in AI Computation
The first electronic computers appeared in the 1940s, and experiments with AI began barely a decade later. Now we have AI models that can write poetry and generate images from text prompts. But what’s led to such exponential growth in so short a time?
This chart from Our World in Data tracks the history of AI through the amount of computation power used to train an AI model, using data from Epoch AI.
The Three Eras of AI Computation
In the 1950s, American mathematician Claude Shannon trained a robotic mouse called Theseus to navigate a maze and remember its course—the first apparent artificial learning of any kind.
Theseus was built on just 40 floating point operations (FLOPs), a measure that counts the basic arithmetic operations (addition, subtraction, multiplication, or division) a computer performs. In this context, FLOPs tally the total operations used to train a model; the related term FLOPS measures how many such operations a processor can perform per second.
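For a sense of how quickly these counts add up in modern models, here is a minimal Python sketch (our illustration, not part of the source data) that counts the approximate FLOPs in a single dense neural-network layer:

```python
# Illustrative sketch: counting FLOPs for one forward pass through a
# single dense layer computing y = W @ x + b, with W of shape (m, n).
# Each of the m outputs needs n multiplications and n additions
# (n - 1 for the dot-product sum, plus 1 for the bias), i.e. ~2n FLOPs.

def dense_layer_flops(m: int, n: int) -> int:
    """Approximate FLOPs for one forward pass of y = W @ x + b."""
    return 2 * m * n

# A modest 1,000 x 1,000 layer costs ~2 million FLOPs per input;
# that is already 50,000 times the 40 FLOPs behind all of Theseus.
print(dense_layer_flops(1_000, 1_000))  # 2000000
```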
Computing power, the availability of training data, and algorithms are the three main ingredients of AI progress. And for the first few decades of AI advances, compute, the computational power needed to train an AI model, grew in line with Moore’s Law, doubling roughly every 18 to 24 months.
| Period | Era | Compute Doubling |
| --- | --- | --- |
| 1950–2010 | Pre-Deep Learning | 18–24 months |
| 2010–2016 | Deep Learning | 5–7 months |
| 2016–2022 | Large-scale models | 11 months |
Source: “Compute Trends Across Three Eras of Machine Learning” by Sevilla et al., 2022.
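To put those doubling times in perspective, here is a quick back-of-the-envelope sketch (our arithmetic, not a figure from the paper) of how much compute multiplies over a single decade at each pace:

```python
# How much does compute multiply over a decade at a given doubling rate?
# growth = 2 ** (elapsed_months / doubling_months)

def growth_over(months: float, doubling_months: float) -> float:
    return 2 ** (months / doubling_months)

DECADE = 120  # months
print(f"{growth_over(DECADE, 24):,.0f}x")  # Moore's Law pace, slow end: 32x
print(f"{growth_over(DECADE, 18):,.0f}x")  # Moore's Law pace, fast end: ~102x
print(f"{growth_over(DECADE, 6):,.0f}x")   # Deep Learning pace: 1,048,576x
```

A decade at a six-month doubling pace yields roughly a million-fold increase, versus a mere 32x to 100x at the Moore’s Law pace of the pre-deep-learning era.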
However, at the start of the Deep Learning Era, heralded by AlexNet (an image recognition AI) in 2012, that doubling timeframe shortened considerably to six months, as researchers invested more in computation and processors.
With the emergence of AlphaGo in 2015—a computer program that beat a human professional Go player—researchers identified a third era: that of large-scale AI models, whose computation needs dwarf those of all previous AI systems.
Predicting AI Computation Progress
Looking back at just the last decade, compute has grown so tremendously that it’s difficult to comprehend.

For example, the compute used to train Minerva, an AI that can solve complex math problems, is nearly 6 million times that used to train AlexNet 10 years earlier.
Here’s a list of important AI models through history and the amount of compute used to train them.
| AI | Year | FLOPs |
| --- | --- | --- |
| Theseus | 1950 | 40 |
| Perceptron Mark I | 1957–58 | 695,000 |
| Neocognitron | 1980 | 228 million |
| NetTalk | 1987 | 81 billion |
| TD-Gammon | 1992 | 18 trillion |
| NPLM | 2003 | 1.1 petaFLOPs |
| AlexNet | 2012 | 470 petaFLOPs |
| AlphaGo | 2016 | 1.9 million petaFLOPs |
| GPT-3 | 2020 | 314 million petaFLOPs |
| Minerva | 2022 | 2.7 billion petaFLOPs |
Note: One petaFLOP = one quadrillion FLOPs. Source: “Compute Trends Across Three Eras of Machine Learning” by Sevilla et al., 2022.
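The table also lets us sanity-check the “nearly 6 million times” comparison above, and the doubling rate it implies, with a bit of arithmetic (ours, derived from the table, not a figure quoted by the paper):

```python
import math

# AlexNet (2012) vs. Minerva (2022), compute in petaFLOPs from the table.
alexnet_pflop, minerva_pflop = 470, 2.7e9
years_apart = 2022 - 2012

ratio = minerva_pflop / alexnet_pflop  # ~5.7 million times more compute
doublings = math.log2(ratio)           # ~22.5 doublings
months_per_doubling = years_apart * 12 / doublings

print(f"{ratio:,.0f}x growth over {doublings:.1f} doublings, "
      f"~{months_per_doubling:.1f} months per doubling")
# -> 5,744,681x growth over 22.5 doublings, ~5.3 months per doubling
```

The implied ~5.3-month doubling interval sits squarely inside the 5–7 month range the paper reports for the Deep Learning Era.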
The result of this growth in computation, along with the availability of massive data sets and better algorithms, has been rapid AI progress in seemingly very little time. AI now doesn’t just match human performance in many areas, it beats it.

It’s difficult to say whether the same pace of computation growth will be maintained. Large-scale models require increasingly more compute power to train, and if computation doesn’t continue to ramp up, progress could slow. Exhausting all the data currently available for training AI models could also impede the development and implementation of new models.

However, with all the funding poured into AI recently, perhaps more breakthroughs are around the corner—like matching the computation power of the human brain.
Where Does This Data Come From?
Source: “Compute Trends Across Three Eras of Machine Learning” by Sevilla et al., 2022.
Note: Estimates of the time it takes computation to double vary across research attempts, including Amodei and Hernandez (2018) and Lyzhov (2021). This article is based on our source’s findings; please see their full paper for further details. Furthermore, the authors are cognizant of the framing concerns with deeming an AI model “regular-sized” or “large-sized” and say further research is needed in the area.
Methodology: The authors of the paper used two methods to determine the amount of compute used to train AI models: counting the number of operations and tracking GPU time. Both approaches have drawbacks, namely a lack of transparency around training processes and growing complexity as ML models scale.
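Roughly, the two approaches can be sketched as follows (a simplified illustration using common approximations from the literature; the function names and default constants are ours, not the paper’s exact procedure):

```python
def flops_by_operation_count(parameters: float, training_examples: float,
                             flops_per_param_per_example: float = 6.0) -> float:
    """Method 1: count arithmetic operations directly.
    A common rule of thumb for dense networks is ~6 FLOPs per
    parameter per training example (forward plus backward pass)."""
    return flops_per_param_per_example * parameters * training_examples


def flops_by_gpu_time(num_gpus: int, peak_flops_per_second: float,
                      training_seconds: float,
                      utilization: float = 0.3) -> float:
    """Method 2: track hardware time.
    Multiply GPU count by peak throughput, training duration, and an
    assumed utilization rate, since hardware rarely runs at peak."""
    return num_gpus * peak_flops_per_second * training_seconds * utilization
```

Both methods depend on details that labs don’t always publish, which is the transparency drawback the authors note.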

This article was published as a part of Visual Capitalist's Creator Program, which features data-driven visuals from some of our favorite Creators around the world.
Charted: Retail Investors’ Top Picks for 2023
U.S. retail investors, enticed by a brief pause in the interest rate cycle, came roaring back in the early summer. But what are their investment priorities for the second half of 2023?
We visualized the data from Public’s 2023 Retail Investor Report, which surveyed 1,005 retail investors on their platform, asking “which investment strategy or themes are you interested in as part of your overall investment strategy?”
Respondents could tick all the options that applied to them, so the response percentages do not sum to 100%.
Where Are Retail Investors Putting Their Money?
By far the most popular strategy for retail investors is dividend investing, with 50% of respondents selecting it as something they’re interested in.

Dividends can help supplement incomes and come with tax benefits (especially for lower-income investors, or when the dividend is paid into a tax-deferred account), and they can be a popular choice during more inflationary times.
| Investment Strategy | Percent of Respondents |
| --- | --- |
| Dividend Investing | 50% |
| Artificial Intelligence | 36% |
| Total Stock Market Index | 36% |
| Renewable Energy | 33% |
| Big Tech | 31% |
| Treasuries (T-Bills) | 31% |
| Electric Vehicles | 27% |
| Large Cap | 26% |
| Small Cap | 24% |
| Emerging Markets | 23% |
| Real Estate | 23% |
| Gold & Precious Metals | 23% |
| Mid Cap | 19% |
| Inflation Protection | 13% |
| Commodities | 12% |
Meanwhile, the hype around AI hasn’t faded, with 36% of the respondents saying they’d be interested in investing in the theme—including juggernaut chipmaker Nvidia. This is tied for second place with Total Stock Market Index investing.
Treasuries (31%) represent the safety anchor of the portfolio, but the ongoing climate crisis is also on investors’ minds, with Renewable Energy (33%) and EVs (27%) scoring fairly high on the interest list.

Commodities and Inflation Protection stocks, on the other hand, have fallen out of favor.
Come on Barbie, Let’s Go Party…
Another interesting takeaway from the survey is how conversations about prevailing companies—or the buzz around them—influence trades. The platform found that the number of investors in Mattel on Public increased 6.6 times after the success of the ‘Barbie’ movie.

Bud Light also saw a 1.5x increase in retail investors, despite the negative attention the company received after a beer promotion campaign with trans influencer Dylan Mulvaney.

Given that the origin story of a large chunk of American retail investors revolves around GameStop and AMC, these insights aren’t new, but they do reveal a persisting trend.