Visualizing the Unicorn Landscape in 2019
It was only six years ago that venture capitalist Aileen Lee coined the term “unicorn” to describe any privately held startup worth $1 billion or more.
At the time, such valuations were so rare that they deserved a special name – but since then, it’s fair to say that the landscape has shifted dramatically. The startup boom intensified, and capital flowed into private companies at an unprecedented pace.
In recent times, unicorns have multiplied more like rabbits, and investors have propped up the combined value of the world’s 326 unicorns to the tune of $1.1 trillion.
Breaking down the World’s 326 Unicorns
Today’s chart uses data from the Unicorn Tracker created by CB Insights, and it breaks down the unicorn landscape by sector, valuation, and country.
Let’s start by looking at the biggest unicorns currently in existence:
Rank | Company | Valuation ($B) | Country | Sector |
---|---|---|---|---|
#1 | Toutiao (ByteDance) | $75 | China | Media |
#2 | Uber | $72 | United States | On-Demand |
#3 | Didi Chuxing | $56 | China | On-Demand |
#4 | WeWork | $47 | United States | Other |
#5 | JUUL Labs | $38 | United States | Other |
#6 | Airbnb | $29 | United States | E-commerce |
#7 | Stripe | $23 | United States | Fintech |
#8 | SpaceX | $19 | United States | Other |
#9 | Epic Games | $15 | United States | Other |
#10 | GrabTaxi | $14 | Singapore | On-Demand |
ByteDance is the world’s largest unicorn at a $75 billion valuation. The company owns Toutiao, a popular machine-learning-enabled content platform in China that customizes feeds based on a user’s reading preferences. It also owns the video-sharing platform TikTok.
Experts estimate that over 100 unicorns could IPO in 2019, including Uber and Airbnb from the above list.
So far this year, Lyft and Pinterest have already hit the public market – and another recent unicorn to IPO was conferencing platform Zoom Video, which has seen its shares rise 120% since its impressive mid-April debut.
Unicorns by Sector
The two most common sectors for unicorns are Internet Software Services and E-commerce.
Sector | # of Unicorns | Valuation ($B) |
---|---|---|
Internet Software Services | 82 | $153
E-commerce | 44 | $129
Fintech | 32 | $94
Healthcare | 30 | $76
On-Demand | 23 | $200
Hardware | 14 | $56
Data Analytics | 12 | $27
Social | 11 | $27
Autotech | 11 | $23
Media | 8 | $89
Travel Tech | 7 | $11
Cybersecurity | 7 | $15
Other | 45 | $186
Total | 326 | $1,086
However, as you can see, the segment most valued by investors is On-Demand, which includes companies like Uber, Didi Chuxing, and DoorDash.
Unicorns by Geography
Nearly half of the world’s unicorns come from the U.S., but China also has an impressive roster of highly valued startups.
Country | # of Unicorns | % |
---|---|---|
USA | 156 | 47.9% |
China | 94 | 28.8% |
UK | 17 | 5.2% |
India | 13 | 4.0% |
Germany | 8 | 2.5% |
South Korea | 6 | 1.8% |
Rest of World | 32 | 9.8% |
Total | 326 | 100.0%
Strangely, outside of the six major countries listed above, the rest of the world combines for a measly 32 unicorns – less than 10% of the global total.
Unicorns by Valuation
Seven unicorns – including Uber, WeWork, Airbnb, and ByteDance – account for almost 30% of the total value of the entire landscape.
Valuation Range | # of Unicorns | Value ($B) | % of Value |
---|---|---|---|
$20+ billion | 7 | $321 | 29.5% |
$10-20 billion | 13 | $151 | 13.9% |
$5-10 billion | 26 | $153 | 14.1% |
$1-5 billion | 280 | $461 | 42.5% |
Total | 326 | $1,086 | 100.0%
The bottom of the pyramid ($1-5 billion in valuation) holds 280 companies. Added together, they are worth $461 billion, equal to 42.5% of the total unicorn valuation.
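As a quick sanity check, these shares can be reproduced directly from the valuation table above. Below is a minimal Python sketch; the figures come from the table, and the variable names are purely illustrative.

```python
# Reproduce the "% of Value" column from the valuation table above.
# Valuations are in $ billions; figures are taken from the table.
brackets = {
    "$20+ billion": 321,
    "$10-20 billion": 151,
    "$5-10 billion": 153,
    "$1-5 billion": 461,
}

total = sum(brackets.values())  # $1,086B across all 326 unicorns

for label, value in brackets.items():
    share = value / total * 100
    print(f"{label}: ${value}B ({share:.1f}% of value)")

# The $1-5 billion tier comes out to roughly 42% of total value,
# matching the 42.5% quoted above (small differences are due to rounding).
```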
Charted: The Exponential Growth in AI Computation
In eight decades, artificial intelligence has moved from the purview of science fiction to reality. Here’s a quick history of AI computation.

Electronic computers first emerged in the 1940s, and within barely a decade, experiments with AI had begun. Now we have AI models that can write poetry and generate images from textual prompts. But what’s led to such exponential growth in such a short time?
This chart from Our World in Data tracks the history of AI through the amount of computation power used to train an AI model, using data from Epoch AI.
The Three Eras of AI Computation
In the 1950s, American mathematician Claude Shannon trained a robotic mouse called Theseus to navigate a maze and remember its course—the first apparent artificial learning of any kind.
Theseus was built on 40 floating point operations (FLOPs), a unit of measurement that counts the number of basic arithmetic operations (addition, subtraction, multiplication, or division) performed by a computer or processor. FLOPs are the standard yardstick used here for the amount of compute that goes into training an AI model.
Computation power, availability of training data, and algorithms are the three main ingredients of AI progress. And for the first few decades of AI advances, compute, which is the computational power needed to train an AI model, grew according to Moore’s Law.
Period | Era | Compute Doubling |
---|---|---|
1950–2010 | Pre-Deep Learning | 18–24 months |
2010–2016 | Deep Learning | 5–7 months |
2016–2022 | Large-scale models | 11 months |
Source: “Compute Trends Across Three Eras of Machine Learning” by Sevilla et al., 2022.
However, at the start of the Deep Learning Era, heralded by AlexNet (an image recognition AI) in 2012, that doubling timeframe shortened considerably to six months, as researchers invested more in computation and processors.
With the emergence of AlphaGo in 2015—a computer program that beat a human professional Go player—researchers have identified a third era: that of the large-scale AI models whose computation needs dwarf all previous AI systems.
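To put these doubling times in perspective, the short Python sketch below compounds each era’s rate over a ten-year window. It is a rough illustration only; the doubling times are approximate midpoints of the ranges in the table above, and the labels are illustrative.

```python
# Compound each era's compute doubling time over a decade (120 months).
# Doubling times are rough midpoints of the ranges in the table above.
eras = {
    "Pre-Deep Learning (~21 months)": 21,
    "Deep Learning (~6 months)": 6,
    "Large-scale models (~11 months)": 11,
}

for era, doubling_months in eras.items():
    # 2 raised to the number of doublings that fit into 10 years
    growth = 2 ** (120 / doubling_months)
    print(f"{era}: ~{growth:,.0f}x more compute per decade")

# At a 6-month doubling time, training compute grows roughly a
# million-fold per decade, versus ~50x under the Moore's Law pace.
```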
Predicting AI Computation Progress
Looking back at just the last decade, compute has grown so tremendously that it’s difficult to comprehend.
For example, the compute used to train Minerva, an AI which can solve complex math problems, is nearly 6 million times that which was used to train AlexNet 10 years ago.
Here’s a list of important AI models through history and the amount of compute used to train them.
AI | Year | FLOPs |
---|---|---|
Theseus | 1950 | 40 |
Perceptron Mark I | 1957–58 | 695,000 |
Neocognitron | 1980 | 228 million |
NetTalk | 1987 | 81 billion |
TD-Gammon | 1992 | 18 trillion |
NPLM | 2003 | 1.1 petaFLOPs |
AlexNet | 2012 | 470 petaFLOPs |
AlphaGo | 2016 | 1.9 million petaFLOPs |
GPT-3 | 2020 | 314 million petaFLOPs |
Minerva | 2022 | 2.7 billion petaFLOPs |
Note: One petaFLOP = one quadrillion FLOPs. Source: “Compute Trends Across Three Eras of Machine Learning” by Sevilla et al., 2022.
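Because the table mixes raw FLOPs with petaFLOPs, comparisons are easiest after converting everything to a common unit. The minimal sketch below does that for a few entries and reproduces the “nearly 6 million times” AlexNet-to-Minerva gap mentioned earlier; the dictionary keys are illustrative.

```python
# Convert the table's mixed units to raw FLOPs (1 petaFLOP = 1e15 FLOPs)
# and compare a few models. Figures are taken from the table above.
PETAFLOP = 1e15

training_compute = {
    "Theseus (1950)": 40,
    "AlexNet (2012)": 470 * PETAFLOP,
    "GPT-3 (2020)": 314e6 * PETAFLOP,
    "Minerva (2022)": 2.7e9 * PETAFLOP,
}

ratio = training_compute["Minerva (2022)"] / training_compute["AlexNet (2012)"]
print(f"Minerva used ~{ratio:,.0f}x the training compute of AlexNet")
# Prints roughly 5.7 million, i.e. the "nearly 6 million times" figure above.
```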
The result of this growth in computation, along with the availability of massive data sets and better algorithms, has yielded a lot of AI progress in a seemingly very short time. AI now doesn’t just match, but also beats, human performance in many areas.
It’s difficult to say if the same pace of computation growth will be maintained. Large-scale models require increasingly more compute power to train, and if computation doesn’t continue to ramp up, progress could slow down. Exhausting all the data currently available for training AI models could also impede the development and implementation of new models.
However, with all the funding poured into AI recently, perhaps more breakthroughs are around the corner—like matching the computation power of the human brain.
Where Does This Data Come From?
Source: “Compute Trends Across Three Eras of Machine Learning” by Sevilla et al., 2022.
Note: The estimated time for computation to double can vary depending on the research approach, including Amodei and Hernandez (2018) and Lyzhov (2021). This article is based on our source’s findings. Please see their full paper for further details. Furthermore, the authors are cognizant of the framing concerns with deeming an AI model “regular-sized” or “large-sized” and said further research is needed in the area.
Methodology: The authors of the paper used two methods to determine the amount of compute used to train AI models: counting the number of operations and tracking GPU time. Both approaches have drawbacks, namely a lack of transparency with training processes and severe complexity as ML models grow.
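As an illustration of the operation-counting approach, a widely used back-of-the-envelope rule (an assumption here, not a figure from the source paper) estimates the training compute of a dense neural network as roughly 6 × parameters × training tokens. A minimal sketch:

```python
# Back-of-the-envelope operation counting: training FLOPs ~ 6 * params * tokens.
# This rule of thumb is a common approximation for dense transformer-style
# models; it is an illustration, not the source paper's exact methodology.
def estimate_training_flops(n_params: float, n_tokens: float) -> float:
    return 6 * n_params * n_tokens

# Illustrative input: a 175-billion-parameter model trained on 300 billion
# tokens lands around 3.1e23 FLOPs (~315 million petaFLOPs), the same order
# of magnitude as the GPT-3 entry in the table above.
flops = estimate_training_flops(175e9, 300e9)
print(f"{flops:.2e} FLOPs  (~{flops / 1e15:,.0f} petaFLOPs)")
```

Counting operations this way only works when parameter counts and training data sizes are disclosed, which is part of the transparency problem the authors note.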