Zoom Is Now Worth More Than the World’s 7 Biggest Airlines
Amid the COVID-19 pandemic, many people have transitioned to working—and socializing—from home. If these trends become the new normal, certain companies may be in for a big payoff.
The popular video conferencing company Zoom Video Communications is a prime example of an organization benefiting from this transition. Today’s graphic, inspired by Lennart Dobravsky at Lufthansa Innovation Hub, is a dramatic look at how much Zoom’s valuation has shot up during this unusual period in history.
The Zoom Boom, in Perspective
As of May 15, 2020, Zoom’s market capitalization has skyrocketed to $48.8 billion, despite posting revenues of only $623 million over the past year.
What separates Zoom from its competition, and what’s led to the app’s massive surge in mainstream business culture?
Industry analysts say that business users have been drawn to the app because of its easy-to-use interface and user experience, as well as the ability to support up to 100 participants at a time. The app has also blown up among educators for use in online learning, after CEO Eric Yuan took extra steps to ensure K-12 schools could use the platform for free.
The number of Zoom meeting participants has surged in recent months, going from 10 million in December 2019 to a whopping 300 million as of April 2020.
The Airline Decline
The airline industry has met the opposite fortune, suffering an unprecedented plummet in demand as international restrictions have shuttered airports:
The world’s top airlines by revenue have fallen in total value by 62% since the end of January:
Airline | Market Cap Jan 31, 2020 | Market Cap May 15, 2020 |
---|---|---|
Southwest Airlines | $28.440B | $14.04B |
Delta | $35.680B | $12.30B |
United | $18.790B | $5.867B |
International Airlines Group | $14.760B | $4.111B |
Lufthansa | $7.460B | $3.873B |
American | $11.490B | $3.886B |
Air France | $4.681B | $2.137B |
Total Market Cap | $121.301B | $46.214B |
Source: YCharts. All market capitalizations listed as of May 15, 2020.
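As a quick sanity check on the headline figure, here is a minimal Python sketch (our own illustration, with values copied from the table above) that recomputes the aggregate decline:

```python
# Market capitalizations in billions of USD, copied from the table above
# (YCharts, as of Jan 31 and May 15, 2020).
caps = {
    "Southwest Airlines":           (28.440, 14.04),
    "Delta":                        (35.680, 12.30),
    "United":                       (18.790, 5.867),
    "International Airlines Group": (14.760, 4.111),
    "Lufthansa":                    (7.460, 3.873),
    "American":                     (11.490, 3.886),
    "Air France":                   (4.681, 2.137),
}

total_jan = sum(jan for jan, _ in caps.values())
total_may = sum(may for _, may in caps.values())
decline = (total_jan - total_may) / total_jan

print(f"Jan 31 total: ${total_jan:.3f}B")  # $121.301B
print(f"May 15 total: ${total_may:.3f}B")  # $46.214B
print(f"Decline: {decline:.1%}")           # 61.9%, i.e. roughly 62%
```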
With countries scrambling to contain the spread of COVID-19, many airlines have cut travel capacity, laid off workers, and chopped executive pay to try and stay afloat.
Whether and when regular air travel will return remains a major question mark, and even patient investors such as Warren Buffett have pulled out of airline stocks.
Airline | % Change in Total Returns (Jan 31-May 15, 2020) |
---|---|
United | -72.91% |
International Airlines Group | -72.16% |
American | -65.76% |
Delta | -65.39% |
Southwest Airlines | -56.35% |
Air France | -54.34% |
Lufthansa | -48.08% |
Source: YCharts, as of May 15, 2020.
“The world has changed for the airlines. The future is much less clear to me about how the business will turn out.”
—Warren Buffett
What Does the Future Hold?
Zoom’s recent success is a product of its circumstances, but will it last? That’s the question on the minds of many investors and pundits ahead of the company’s Q1 results, due to be released in June.
It hasn’t been all smooth sailing for the company: a spate of “Zoom Bombing” incidents, in which uninvited attendees hijacked meetings, brought the app’s security measures under scrutiny. The company proved resilient, however, moving swiftly to address the problem.
Meanwhile, as many parts of the world begin taking measures to restart economic activity, airlines could see a cautious return to the skies—although any such recovery will surely be a “slow, long ascent”.
Correction: Changed the graphics to reflect 300 million daily active “meeting participants” as opposed to daily active users.
Charted: The Exponential Growth in AI Computation
In eight decades, artificial intelligence has moved from the purview of science fiction to reality. Here’s a quick history of AI computation.
Electronic computers had barely been around for a decade before the first experiments with AI began in the 1950s. Now we have AI models that can write poetry and generate images from text prompts. But what has led to such exponential growth in so short a time?
This chart from Our World in Data tracks the history of AI through the amount of computation power used to train an AI model, using data from Epoch AI.
The Three Eras of AI Computation
In the 1950s, American mathematician Claude Shannon trained a robotic mouse called Theseus to navigate a maze and remember its course—the first apparent artificial learning of any kind.
Theseus was built on just 40 floating point operations (FLOPs). A FLOP is a basic arithmetic operation (an addition, subtraction, multiplication, or division), and total FLOPs are the standard unit for measuring how much computation goes into training an AI model. (The related rate, FLOPS, counts how many such operations a processor can perform in one second.)
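To make the unit concrete, here is a small Python sketch (our own illustration, not from the source) showing how operations are counted for one of the most common building blocks of modern AI models, a matrix multiplication:

```python
import numpy as np

def matmul_flops(m: int, n: int, p: int) -> int:
    """Approximate FLOPs to multiply an (m, n) matrix by an (n, p) matrix:
    m*p output entries, each requiring n multiplications and ~n additions."""
    return 2 * m * n * p

a = np.random.rand(64, 128)
b = np.random.rand(128, 32)
c = a @ b  # the actual computation being counted

print(matmul_flops(64, 128, 32))  # 524288 FLOPs for this single multiply
```

By this kind of accounting, Theseus’s entire training run amounted to just 40 such elementary operations.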
Computation power, availability of training data, and algorithms are the three main ingredients of AI progress. And for the first few decades of AI advances, compute, the computational power needed to train an AI model, grew in line with Moore’s Law.
Period | Era | Compute Doubling |
---|---|---|
1950–2010 | Pre-Deep Learning | 18–24 months |
2010–2016 | Deep Learning | 5–7 months |
2016–2022 | Large-scale models | 11 months |
Source: “Compute Trends Across Three Eras of Machine Learning” by Sevilla et al., 2022.
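The doubling times in the table follow from standard exponential-growth arithmetic: given two compute measurements C1 and C2 taken t months apart, the implied doubling time is t / log2(C2 / C1). A minimal Python sketch (the example data points come from the model table further below):

```python
import math

def doubling_time_months(c1: float, c2: float, months_elapsed: float) -> float:
    """Doubling time implied by exponential growth from compute c1 to c2."""
    return months_elapsed / math.log2(c2 / c1)

# Example: AlexNet (2012, 470 petaFLOPs) to Minerva (2022, 2.7 billion petaFLOPs),
# roughly 120 months apart. The result is consistent with the deep learning
# era's 5-7 month pace shown in the table above.
print(doubling_time_months(470, 2.7e9, 120))  # ≈ 5.3 months
```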
However, at the start of the Deep Learning Era, heralded by AlexNet (an image recognition AI) in 2012, that doubling timeframe shortened considerably to six months, as researchers invested more in computation and processors.
With the emergence of AlphaGo in 2015, a computer program that beat a professional human Go player, researchers identified a third era: that of large-scale AI models, whose computation needs dwarf those of all previous AI systems.
Predicting AI Computation Progress
Looking back at just the last decade, compute has grown so tremendously that it’s difficult to comprehend.
For example, the compute used to train Minerva, an AI that can solve complex math problems, is nearly 6 million times that used to train AlexNet 10 years earlier.
Here’s a list of important AI models through history and the amount of compute used to train them.
AI | Year | FLOPs |
---|---|---|
Theseus | 1950 | 40 |
Perceptron Mark I | 1957–58 | 695,000 |
Neocognitron | 1980 | 228 million |
NetTalk | 1987 | 81 billion |
TD-Gammon | 1992 | 18 trillion |
NPLM | 2003 | 1.1 petaFLOPs |
AlexNet | 2012 | 470 petaFLOPs |
AlphaGo | 2016 | 1.9 million petaFLOPs |
GPT-3 | 2020 | 314 million petaFLOPs |
Minerva | 2022 | 2.7 billion petaFLOPs |
Note: One petaFLOP = one quadrillion FLOPs. Source: “Compute Trends Across Three Eras of Machine Learning” by Sevilla et al., 2022.
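To get a feel for these magnitudes, here is a short Python sketch (our own illustration; values copied from the table above) that converts each entry to raw FLOPs and reproduces the Minerva-to-AlexNet comparison:

```python
PETA = 1e15  # one petaFLOP = one quadrillion FLOPs

# Training compute in raw FLOPs, from the table above (Sevilla et al., 2022).
models = {
    "Theseus":           40,
    "Perceptron Mark I": 695_000,
    "Neocognitron":      228e6,
    "NetTalk":           81e9,
    "TD-Gammon":         18e12,
    "NPLM":              1.1 * PETA,
    "AlexNet":           470 * PETA,
    "AlphaGo":           1.9e6 * PETA,
    "GPT-3":             314e6 * PETA,
    "Minerva":           2.7e9 * PETA,
}

for name, flops in models.items():
    print(f"{name:18s} {flops:10.3g} FLOPs")

print(models["Minerva"] / models["AlexNet"])  # ≈ 5.7 million
```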
The result of this growth in computation, along with the availability of massive data sets and better algorithms, has been rapid AI progress in a seemingly short time. Now AI doesn’t just match human performance in many areas, it beats it.
It’s difficult to say if the same pace of computation growth will be maintained. Large-scale models require increasingly more compute power to train, and if computation doesn’t continue to ramp up it could slow down progress. Exhausting all the data currently available for training AI models could also impede the development and implementation of new models.
However, with all the funding poured into AI recently, perhaps more breakthroughs are around the corner, like matching the computation power of the human brain.
Where Does This Data Come From?
Source: “Compute Trends Across Three Eras of Machine Learning” by Sevilla et al., 2022.
Note: The estimated time for computation to double can vary depending on the research effort, including Amodei and Hernandez (2018) and Lyzhov (2021). This article is based on our source’s findings. Please see their full paper for further details. Furthermore, the authors are cognizant of the framing concerns in deeming an AI model “regular-sized” or “large-sized” and said further research is needed in the area.
Methodology: The authors of the paper used two methods to determine the amount of compute used to train AI models: counting the number of operations and tracking GPU time. Both approaches have drawbacks, namely a lack of transparency with training processes and severe complexity as ML models grow.
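As a rough illustration of the operation-counting approach, a widely used rule of thumb for dense transformer models (from scaling-law research generally, not necessarily the exact method of the paper above) is about 6 FLOPs per parameter per training token: roughly 2 for the forward pass and 4 for the backward pass. A hedged sketch using GPT-3’s publicly reported figures:

```python
def training_flops(parameters: float, tokens: float) -> float:
    """Approximate training compute for a dense transformer:
    ~6 FLOPs per parameter per training token (~2 forward + ~4 backward)."""
    return 6 * parameters * tokens

# GPT-3: ~175 billion parameters trained on ~300 billion tokens.
flops = training_flops(175e9, 300e9)
print(f"{flops:.2e} FLOPs")             # ≈ 3.15e+23
print(f"{flops / 1e15:.3g} petaFLOPs")  # ≈ 3.15e8, close to the ~314 million petaFLOPs above
```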