A High Level Look at Satellites
Satellites rarely get much attention, but they’re the hubs that keep our modern world connected. Just how many satellites are orbiting Earth? Who’s launching them? And what exactly are they doing up there anyway? These are good questions. Let’s dig in.
Today’s visualization comes to us from Carey Spies, and while it is based on older data, it provides a useful breakdown of the types of satellites that orbit the Earth.
As of 2017, there are nearly 1,500 operating satellites in orbit, and if SpaceX’s plans for a 4,425-satellite communications network come to fruition, our planet’s exosphere will become even more crowded.
What do satellites actually do?
Satellites are launched into space for a number of reasons.
They do everything from military reconnaissance to keeping our GPS systems working properly. The truly global scope of telecommunications wouldn’t be possible without our expansive network of orbiting satellites. For example, O3b Networks’ 12 satellites provide broadband internet service to emerging markets.
Who’s launching satellites?
The United States, with nearly 600 operating satellites, has clearly won the space race in this sense. That said, everyone from Azerbaijan to Vietnam now has equipment in orbit, and the list keeps growing.
The change over time shows that practically everyone is now in the game.
Launching rockets used to be the sole domain of nations, but the privatization of spaceflight has dramatically increased the number of commercial satellites in orbit. Iridium Communications, for example, has a constellation of 70+ operational satellites.
Anxiety in the Exosphere
Operating satellites are only one part of the equation. Sputnik I was launched into space nearly 60 years ago, and as one might guess, a lot of obsolete and dead equipment has built up over that time. The United States Space Surveillance Network estimates that there are 21,000 objects larger than 10 cm orbiting the Earth. An increase in “space junk” could have major implications, as even tiny objects can cause severe damage to equipment.
We must cooperate now to guarantee economically vital spaceflight.
– Brigitte Zypries, German Federal Minister for Economic Affairs and Energy
Another looming issue is the potential weaponization of space. Until now, nations have operated under the “gentlemen’s agreement” that nothing launched into space should be weaponized, but the U.S., China, and Russia have all been accused of taking steps towards putting destructive objects into orbit. Beyond the obvious implications of conflict in space, damaged satellites would also exacerbate the aforementioned “space junk” problem.
What’s on the Horizon
While companies like SpaceX are looking for ways to reduce the overall cost of launching rockets into space, other innovations may also make it easier than ever to put structures into orbit. The Archinaut Program – which received $20 million in funding from NASA – is looking at ways to manufacture and assemble structures in space.
One thing is for certain: space is about to get a whole lot more crowded.
Charted: The Exponential Growth in AI Computation
In eight decades, artificial intelligence has moved from the purview of science fiction to reality. Here’s a quick history of AI computation.
Electronic computers emerged in the 1940s, and they had barely been around for a decade before experiments with AI began. Now we have AI models that can write poetry and generate images from text prompts. But what’s led to such exponential growth in such a short time?
This chart from Our World in Data tracks the history of AI through the amount of computation power used to train an AI model, using data from Epoch AI.
The Three Eras of AI Computation
In the 1950s, American mathematician Claude Shannon trained a robotic mouse called Theseus to navigate a maze and remember its course—the first apparent artificial learning of any kind.
Theseus was built on just 40 floating point operations (FLOPs). A FLOP counts one basic arithmetic operation (addition, subtraction, multiplication, or division), so the total number of FLOPs serves as a measure of how much computation went into training a model.
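For a sense of how wide this range of scales is, here’s a minimal Python sketch (the variable names are our own; the figures come from the table later in this article) comparing Theseus to a modern model:

```python
# Back-of-the-envelope scale comparison using figures cited in this article.
# Variable names are illustrative, not from the source.

theseus_flops = 40                 # Theseus (1950): ~40 FLOPs total
minerva_flops = 2.7e9 * 1e15       # Minerva (2022): 2.7 billion petaFLOPs

# How many times more compute went into training Minerva than Theseus?
ratio = minerva_flops / theseus_flops
print(f"Minerva used ~{ratio:.1e}x the training compute of Theseus")
# -> ~6.8e+22x
```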
Computation power, availability of training data, and algorithms are the three main ingredients of AI progress. And for the first few decades of AI advances, compute (the computational power needed to train an AI model) grew according to Moore’s Law.
| Period | Era | Compute Doubling |
| --- | --- | --- |
| 1950–2010 | Pre-Deep Learning | 18–24 months |
| 2010–2016 | Deep Learning | 5–7 months |
| 2016–2022 | Large-scale models | 11 months |
Source: “Compute Trends Across Three Eras of Machine Learning” by Sevilla et al., 2022.
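To see what these doubling times imply, here’s a minimal Python sketch (our own illustration, using midpoints of the ranges in the table above) of how compute compounds over a decade in each era:

```python
# How training compute compounds over a decade under each era's doubling time.
# Doubling periods are midpoints of the ranges in the table above.

def growth_factor(years: float, doubling_months: float) -> float:
    """Multiplicative growth after `years` if compute doubles every
    `doubling_months` months."""
    return 2.0 ** (years * 12 / doubling_months)

eras = [
    ("Pre-Deep Learning", 21),   # 18-24 months
    ("Deep Learning", 6),        # 5-7 months
    ("Large-scale models", 11),  # 11 months
]

for name, months in eras:
    print(f"{name}: ~{growth_factor(10, months):,.0f}x per decade")
# Pre-Deep Learning: ~52x, Deep Learning: ~1,048,576x, Large-scale: ~1,923x
```

Even the “slower” large-scale era still implies training compute growing by more than three orders of magnitude per decade.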
However, at the start of the Deep Learning Era, heralded by AlexNet (an image recognition AI) in 2012, that doubling timeframe shortened considerably to around six months, as researchers invested more in computation and processors.
With the emergence of AlphaGo in 2015—a computer program that beat a human professional Go player—researchers have identified a third era: that of the large-scale AI models whose computation needs dwarf all previous AI systems.
Predicting AI Computation Progress
Looking back at just the last decade, compute has grown so tremendously that it’s difficult to comprehend.
For example, the compute used to train Minerva, an AI that can solve complex math problems, is nearly 6 million times that used to train AlexNet 10 years ago.
Here’s a list of important AI models through history and the amount of compute used to train them.
| AI | Year | FLOPs |
| --- | --- | --- |
| Theseus | 1950 | 40 |
| Perceptron Mark I | 1957–58 | 695,000 |
| Neocognitron | 1980 | 228 million |
| NetTalk | 1987 | 81 billion |
| TD-Gammon | 1992 | 18 trillion |
| NPLM | 2003 | 1.1 petaFLOPs |
| AlexNet | 2012 | 470 petaFLOPs |
| AlphaGo | 2016 | 1.9 million petaFLOPs |
| GPT-3 | 2020 | 314 million petaFLOPs |
| Minerva | 2022 | 2.7 billion petaFLOPs |
Note: One petaFLOP = one quadrillion FLOPs. Source: “Compute Trends Across Three Eras of Machine Learning” by Sevilla et al., 2022.
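As a quick sanity check on the “nearly 6 million times” comparison above, here’s a minimal Python sketch using values from this table (the dictionary and names are our own):

```python
# Verify the Minerva-vs-AlexNet comparison using the table's figures.

PETA = 1e15  # one petaFLOP = one quadrillion (10^15) FLOPs

training_flops = {
    "AlexNet (2012)": 470 * PETA,      # 470 petaFLOPs
    "AlphaGo (2016)": 1.9e6 * PETA,    # 1.9 million petaFLOPs
    "GPT-3 (2020)": 314e6 * PETA,      # 314 million petaFLOPs
    "Minerva (2022)": 2.7e9 * PETA,    # 2.7 billion petaFLOPs
}

ratio = training_flops["Minerva (2022)"] / training_flops["AlexNet (2012)"]
print(f"Minerva / AlexNet: ~{ratio:,.0f}x")  # -> ~5,744,681x, nearly 6 million
```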
The result of this growth in computation, along with the availability of massive data sets and better algorithms, has yielded a lot of AI progress in seemingly very little time. Now AI doesn’t just match, but also beats human performance in many areas.
It’s difficult to say whether the same pace of computation growth will be maintained. Large-scale models require increasingly more compute power to train, and if computation doesn’t continue to ramp up, progress could slow. Exhausting all the data currently available for training AI models could also impede the development and implementation of new models.
However, with all the funding poured into AI recently, perhaps more breakthroughs are around the corner, like matching the computation power of the human brain.
Where Does This Data Come From?
Source: “Compute Trends Across Three Eras of Machine Learning” by Sevilla et al., 2022.
Note: The time estimated for computation to double can vary depending on different research attempts, including Amodei and Hernandez (2018) and Lyzhov (2021). This article is based on our source’s findings. Please see their full paper for further details. Furthermore, the authors are cognizant of the framing concerns with deeming an AI model “regular-sized” or “large-sized” and said further research is needed in the area.
Methodology: The authors of the paper used two methods to determine the amount of compute used to train AI models: counting the number of operations and tracking GPU time. Both approaches have drawbacks, namely a lack of transparency with training processes and severe complexity as ML models grow.