Charted: The Exponential Growth in AI Computation

A chart plotting machine learning systems by date of creation (x-axis) against the amount of AI computation used to train them, measured in FLOPs (y-axis).

Electronic computers had barely been around for a decade when experiments with AI began in the 1950s. Now we have AI models that can write poetry and generate images from text prompts. But what’s led to such exponential growth in such a short time?

This chart from Our World in Data tracks the history of AI through the amount of computational power used to train AI models, using data from Epoch AI.

The Three Eras of AI Computation

In the 1950s, American mathematician Claude Shannon trained a robotic mouse called Theseus to navigate a maze and remember its course—the first apparent artificial learning of any kind.

Theseus was built on just 40 floating point operations (FLOPs), a unit of measurement that counts the number of basic arithmetic operations (addition, subtraction, multiplication, or division) performed; the total FLOPs used in training measures how much computation a model required.

ℹ️ The closely related rate, FLOPS (floating point operations per second), is often used to measure the computational performance of computer hardware: the higher the FLOPS, the more powerful the system.
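
To make the distinction concrete, here is a minimal Python sketch separating FLOPs (a total count) from FLOPS (a rate). It uses only training-compute figures from the table further below, plus a hypothetical machine sustaining 1 petaFLOPS:

```python
# FLOPs: total operations used to train a model (figures from the table below).
theseus_flops = 40            # Theseus, 1950
minerva_flops = 2.7e9 * 1e15  # Minerva, 2022: 2.7 billion petaFLOPs

# FLOPS: a rate. A hypothetical machine sustaining 1 petaFLOPS tells us how
# long each training run would take at full utilization.
machine_flops_per_sec = 1e15

print(f"Theseus: {theseus_flops / machine_flops_per_sec:.1e} seconds")  # effectively instant
years = minerva_flops / machine_flops_per_sec / (86400 * 365)
print(f"Minerva: ~{years:.0f} years")                                   # ~86 years
```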

Computational power, the availability of training data, and algorithms are the three main ingredients of AI progress. And for the first few decades of AI advances, compute, the computational power needed to train an AI model, grew in line with Moore’s Law, doubling roughly every two years.

Period       Era                  Compute Doubling
1950–2010    Pre-Deep Learning    18–24 months
2010–2016    Deep Learning        5–7 months
2016–2022    Large-scale models   11 months

Source: “Compute Trends Across Three Eras of Machine Learning” by Sevilla et al., 2022.
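
As a back-of-the-envelope check, the sketch below converts each era’s doubling period into an implied year-over-year growth factor. It uses only the doubling periods from the table above, taking midpoints where a range is given:

```python
# Convert each era's compute-doubling period into an implied annual growth factor.
eras = {
    "Pre-Deep Learning (1950-2010)": 21,  # months, midpoint of 18-24
    "Deep Learning (2010-2016)":      6,  # midpoint of 5-7
    "Large-scale models (2016-2022)": 11,
}

for era, doubling_months in eras.items():
    growth_per_year = 2 ** (12 / doubling_months)
    print(f"{era}: compute grows ~{growth_per_year:.1f}x per year")
# Pre-Deep Learning: ~1.5x, Deep Learning: ~4.0x, Large-scale: ~2.1x per year
```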

However, at the start of the Deep Learning Era, heralded by AlexNet (an image recognition AI) in 2012, that doubling timeframe shortened considerably to six months, as researchers invested more in computation and processors.

With the emergence of AlphaGo in 2015—a computer program that beat a human professional Go player—researchers have identified a third era: that of the large-scale AI models whose computation needs dwarf all previous AI systems.

Predicting AI Computation Progress

Looking back at only the last decade, compute has grown so tremendously that it’s difficult to comprehend.

For example, the compute used to train Minerva, an AI which can solve complex math problems, is nearly 6 million times the amount used to train AlexNet a decade earlier.

Here’s a list of important AI models through history and the amount of compute used to train them.

AI                  Year       FLOPs
Theseus             1950       40
Perceptron Mark I   1957–58    695,000
Neocognitron        1980       228 million
NetTalk             1987       81 billion
TD-Gammon           1992       18 trillion
NPLM                2003       1.1 petaFLOPs
AlexNet             2012       470 petaFLOPs
AlphaGo             2016       1.9 million petaFLOPs
GPT-3               2020       314 million petaFLOPs
Minerva             2022       2.7 billion petaFLOPs

Note: One petaFLOP = one quadrillion FLOPs. Source: “Compute Trends Across Three Eras of Machine Learning” by Sevilla et al., 2022.
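
The “nearly 6 million times” figure is easy to verify from the table, and the same two data points give the average doubling time over that decade. A short Python check:

```python
import math

# Training compute from the table, in petaFLOPs.
alexnet = 470    # 2012
minerva = 2.7e9  # 2022

ratio = minerva / alexnet
print(f"Minerva used ~{ratio:,.0f}x the compute of AlexNet")  # ~5,700,000x

# Average doubling time implied by those two points:
months = 12 * (2022 - 2012) / math.log2(ratio)
print(f"Implied doubling time: ~{months:.1f} months")         # ~5.3 months
```

That implied five-month doubling sits right in the 5–7 month range reported for the Deep Learning Era above.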

The result of this growth in computation, along with the availability of massive data sets and better algorithms, has been a lot of AI progress in seemingly very little time. Now AI doesn’t just match but beats human performance in many areas.

It’s difficult to say if the same pace of computation growth will be maintained. Large-scale models require increasingly more compute power to train, and if computation doesn’t continue to ramp up, progress could slow. Exhausting all the data currently available for training AI models could also impede the development and implementation of new models.

However, with all the funding poured into AI recently, perhaps more breakthroughs are around the corner, such as matching the computation power of the human brain.

Where Does This Data Come From?

Source: “Compute Trends Across Three Eras of Machine Learning” by Sevilla et. al, 2022.

Note: The estimated time for computation to double varies across research attempts, including Amodei and Hernandez (2018) and Lyzhov (2021). This article is based on our source’s findings; please see their full paper for further details. Furthermore, the authors are cognizant of the framing concerns with deeming an AI model “regular-sized” or “large-sized,” and say further research is needed in the area.

Methodology: The authors of the paper used two methods to determine the amount of compute used to train AI models: counting the number of operations and tracking GPU time. Both approaches have drawbacks, namely a lack of transparency around training processes and growing complexity as ML models scale.
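
Here is a minimal sketch of what those two estimation methods can look like in practice, assuming a transformer-style model. The constant 6 in the first method is a widely used rule of thumb (roughly 6 FLOPs per parameter per training token), and all the example numbers are invented for illustration, not taken from the paper:

```python
# Method 1: count operations from the architecture and training data.
# Assumes the common ~6 FLOPs per parameter per training token rule of thumb.
def flops_from_ops(parameters: float, training_tokens: float) -> float:
    return 6 * parameters * training_tokens

# Method 2: track hardware time and multiply by delivered throughput.
def flops_from_gpu_time(num_gpus: int, peak_flops_per_gpu: float,
                        utilization: float, seconds: float) -> float:
    return num_gpus * peak_flops_per_gpu * utilization * seconds

# Hypothetical example: a 1B-parameter model trained on 20B tokens...
print(f"Method 1: {flops_from_ops(1e9, 20e9):.1e} FLOPs")  # ~1.2e20

# ...versus 64 GPUs at 100 teraFLOPS peak, 30% utilization, for one week.
week = 7 * 86400
print(f"Method 2: {flops_from_gpu_time(64, 1e14, 0.3, week):.1e} FLOPs")  # ~1.2e21
```

The two outputs differ here only because the inputs are unrelated hypotheticals; applied to the same real training run, the two estimates should roughly agree.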

This article was published as a part of Visual Capitalist's Creator Program, which features data-driven visuals from some of our favorite Creators around the world.

What Types of Apps Do People Actually Pay For?

Nearly all the apps available are free, so which ones are people actually willing to drop a few dollars on?

Ranked: Types of Mobile Apps, by Revenue in 2023

This was originally posted on our Voronoi app. Download the app for free on iOS or Android and discover incredible data-driven charts from a variety of trusted sources.

A lot of the digital sphere is technically free if one has an internet connection. After all, many useful apps—email, messaging, social media—can be accessed for the grand sum of zero dollars and zero cents.

But many people choose to pay for apps anyway, either to get rid of ads, access content locked behind paywalls, or to opt in for a more exclusive form of service.

So what are these apps? And how much money are we talking about?

This chart tracks the amount of money spent on apps by mobile users around the world using data from SensorTower’s State of Mobile 2024 report.

Importantly, it does not include spending on mobile games.

Entertainment and Dating Are Big Wins

The top category of apps people pay for is Digital Entertainment, earning more than $8 billion in revenue for various providers (Netflix, Disney+, HBO).

And media access is a big theme in the app economy: from live sports and music to comics and books.

Rank   Category                Consumer Spending 2023   Examples
1      Digital Entertainment   $8.2B                    Disney+
2      Dating                  $5.7B                    Tinder
3      Short Videos            $4.3B                    TikTok
4      Video Sharing           $2.4B                    YouTube
5      Comics                  $2.3B                    Piccoma
6      Music/Podcasts          $2.1B                    Spotify
7      File Management         $2.0B                    Google One
8      Live Sports             $1.2B                    ESPN
9      Live Streaming          $1.2B                    BIGO LIVE
10     Communication           $1.1B                    LINE
11     Photo Editing           $1.0B                    Picsart
12     Language Learning       $0.9B                    Duolingo
13     Business Software       $0.9B                    Adobe Reader
14     Fiction                 $0.9B                    GoodNovel
15     Fitness                 $0.8B                    Peloton

Aside from that, mobile users around the world are also dropping big bucks on dating apps, which pulled in close to $6 billion in revenue in 2023.

In fact, according to SensorTower, Tinder was the first non-game mobile app to reach $1 billion in user spending in 2020. It also remains the only dating app to do so.

By employing the paywall model, dating apps can either restrict access to new profiles, usually by capping the number of “likes” or “matches,” or offer to boost a user’s profile, both for a small fee.

Meanwhile, some users theorize (currently without evidence) that dating apps deliberately cripple prospects for non-paying members, which may create a feedback loop that pushes them toward paying for matches.

As of 2024, about 11% of Tinder’s user base are paying members, accounting for about 60% of its revenue.
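
Those two percentages imply a large gap in per-user value. A quick back-of-the-envelope calculation (relative units only, and assuming the remaining 40% of revenue, e.g. from ads, is spread across non-paying members):

```python
# Revenue share vs. user share for Tinder, from the figures quoted above.
paying_user_share = 0.11
paying_revenue_share = 0.60

# Relative revenue per user in each group (arbitrary units).
per_paying_user = paying_revenue_share / paying_user_share            # ~5.5
per_free_user = (1 - paying_revenue_share) / (1 - paying_user_share)  # ~0.45

print(f"A paying member is worth ~{per_paying_user / per_free_user:.0f}x a free one")  # ~12x
```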

With the World Health Organization warning that loneliness could soon become a global public health concern, one imagines dating apps are only going to see revenue gains.

Learn More on the Voronoi App

Our app is in fact free, and a veritable treasure trove of great data visualizations. Check out The Daily Scroll of a Social Media User, by creator MadeVisual.
