Technology
Helium: A Valuable Gas Not To Be Taken Lightly
Helium accounts for roughly a quarter of the ordinary matter in the known universe by mass, so one might guess that the inert gas would be quite plentiful on Earth.
Unfortunately, a familiar property of helium prevents this. Helium is lighter than air, so it rises through the atmosphere and eventually escapes into space, steadily depleting the Earth of this valuable resource over time.
Where do we get helium?
So how do we actually obtain new helium gas, which is necessary for important technological applications such as MRI machines, superconductors, and even the Large Hadron Collider?
Today’s infographic from Helium One shows everything you need to know about helium, including where it can be found on Earth and the most important uses of the gas.
Although helium is plentiful in the universe, on Earth it is quite rare and difficult to obtain.
Why Do We Need Helium?
Helium has several properties that make it invaluable to modern humans, particularly for technological uses:
Helium Property | Benefits |
---|---|
Inert | Doesn’t react with other elements, and doesn’t explode like hydrogen |
Non-toxic | Can be used by humans in a variety of applications |
Lighter than air | Ability to lift and/or float |
Boiling point -269˚C | Remains liquid at ultra-cold temperatures, enabling superconducting applications |
Small atomic size | Can be used to find the smallest of leaks |
Helium Demand
Helium demand has risen consistently since 2009, with the market growing at a compound annual growth rate (CAGR) of 10.1% since 2010 (a quick sketch of what that rate implies follows the table below). With that in mind, here are the specific constituents of helium demand today:
Helium Use | Global Share | Description |
---|---|---|
Cryogenics | 23% | Superconductors are cooled with ultra-cold liquid helium |
Lifting | 15% | Used in airships and balloons |
Electronics | 14% | Used to manufacture silicon wafers |
Optical Fiber | 11% | Necessary to make optical fiber cables |
Welding | 9% | Used as a shielding gas for welding |
Leak Detection | 6% | Helium atoms are tiny and can find the smallest leaks |
Analytics | 6% | Used in chromatography and other applications |
Pressure & Purging | 5% | Used in rocket systems |
Diving | 3% | Mixed into commercial diving gas to prevent nitrogen narcosis at depth |
Other | 8% | Helium's diverse properties give it many other minor uses |
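To put the 10.1% CAGR figure above in perspective: a compound annual growth rate multiplies the market by (1 + rate) every year. Here is a minimal Python sketch of the arithmetic (illustrative only, not from the source):

```python
# Compound annual growth: value_n = value_0 * (1 + rate) ** years

def compound_growth(rate: float, years: int) -> float:
    """Growth multiple after compounding at a fixed annual rate."""
    return (1 + rate) ** years

# At a 10.1% CAGR, the market roughly doubles in a little over 7 years
for years in (5, 7, 10):
    print(f"After {years} years: {compound_growth(0.101, years):.2f}x")
# After 5 years: 1.62x / After 7 years: 1.96x / After 10 years: 2.62x
```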
Helium’s boiling point of -269˚C is the lowest of any element, allowing it to remain liquid at temperatures approaching absolute zero. This makes helium ideal for use in superconductors, including MRI machines – one of the fastest-growing components of helium demand.
Helium Supply
But where do we obtain this elusive gas?
It turns out that new helium is created every day in tiny amounts within the Earth’s crust as a by-product of radioactive decay. And like other gases below the Earth’s surface (e.g. natural gas), helium can become trapped in geological formations in commercially viable quantities.
Today, most helium is produced either as a by-product of natural gas extraction or from helium-primary gas deposits with concentrations of up to 7% helium.
Here’s helium production by country:
Country | 2016 production (in billion cubic feet) | Share |
---|---|---|
USA | 2.2 | 41% |
USA (from Cliffside Field) | 0.8 | 14% |
Algeria | 0.4 | 6% |
Australia | 0.1 | 3% |
Poland | 0.1 | 1% |
Qatar | 1.8 | 32% |
Russia | 0.1 | 2% |
Total | 5.4 | 100% |
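Note that the production figures above are rounded to 0.1 billion cubic feet, so shares derived from them do not sum exactly to 100%. A hypothetical Python sketch (not from the source) showing the share calculation and the rounding effect:

```python
# Deriving each country's share from the rounded production figures above.
# Because inputs are rounded, derived shares differ slightly from the table.

production_bcf = {
    "USA": 2.2,
    "USA (Cliffside Field)": 0.8,
    "Algeria": 0.4,
    "Australia": 0.1,
    "Poland": 0.1,
    "Qatar": 1.8,
    "Russia": 0.1,
}

total = sum(production_bcf.values())  # 5.5 here vs. the 5.4 reported, due to rounding
for country, bcf in production_bcf.items():
    print(f"{country}: {bcf / total:.0%}")
```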
USA (from Cliffside Field)
The U.S. government maintains a helium stockpile at the Cliffside Field in Texas, a strategic reserve with roots in the years following WWI. The reserve is being phased out and is expected to stop contributing to supply by 2021.
Qatar
In December 2013, the Qatar Helium 2 project opened. Combined with the country’s first helium plant, the new facility makes Qatar the second-largest source of helium globally.
Russia
Russia is looking to become a player in helium as well. Gazprom’s Amur gas processing project will be one of the biggest gas facilities in the world, and it will include a helium processing plant. However, it won’t come online until 2024.
Tanzania
Though Tanzania is not yet a helium producer, scientists recently uncovered a major helium find in the country’s Rift Valley, with an estimated 99 billion cubic feet of gas.
The Future of the Helium Market?
Because of surging demand, especially for cryogenics in MRI machines, helium prices have risen significantly over the years.
And with these market dynamics in mind, it’s clear that the future of helium is not full of hot air.
AI
Charted: The Exponential Growth in AI Computation
In eight decades, artificial intelligence has moved from the purview of science fiction to reality. Here’s a quick history of AI computation.

Electronic computers had barely been around for a decade when experiments with AI began in the 1950s. Now we have AI models that can write poetry and generate images from textual prompts. But what has led to such exponential growth in so short a time?
This chart from Our World in Data tracks the history of AI through the amount of computation power used to train an AI model, using data from Epoch AI.
The Three Eras of AI Computation
In the 1950s, American mathematician Claude Shannon trained a robotic mouse called Theseus to navigate a maze and remember its course—the first apparent artificial learning of any kind.
Theseus was built on 40 floating point operations (FLOPs), a unit of measurement that counts the number of basic arithmetic operations (addition, subtraction, multiplication, or division) performed. Here the figure refers to the total operations used in training, not operations per second (which is written FLOPS, with a capital S).
Computation power, the availability of training data, and algorithms are the three main ingredients of AI progress. And for the first few decades of AI advances, compute, which is the computational power needed to train an AI model, grew according to Moore’s Law.
Period | Era | Compute Doubling |
---|---|---|
1950–2010 | Pre-Deep Learning | 18–24 months |
2010–2016 | Deep Learning | 5–7 months |
2016–2022 | Large-scale models | 11 months |
Source: “Compute Trends Across Three Eras of Machine Learning” by Sevilla et al., 2022.
However, at the start of the Deep Learning Era, heralded by AlexNet (an image recognition AI) in 2012, that doubling timeframe shortened considerably to six months, as researchers invested more in computation and processors.
With the emergence of AlphaGo in 2015—a computer program that beat a human professional Go player—researchers identified a third era: that of large-scale AI models, whose computation needs dwarf those of all previous AI systems.
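To make the doubling times in the table above concrete, here is a minimal Python sketch (not from the source paper) that converts each era's doubling time into a total growth multiple, using growth = 2^(era months / doubling months). The 21-month figure is an assumed midpoint of the 18–24 month range:

```python
# Illustrative only: convert a compute doubling time into total growth over an era.

def growth_multiple(era_months: float, doubling_months: float) -> float:
    """Total compute growth over an era with a fixed doubling time."""
    return 2 ** (era_months / doubling_months)

# Era lengths and doubling times taken from the table above
print(f"Pre-Deep Learning (1950-2010, ~21-month doubling): {growth_multiple(60 * 12, 21):.1e}x")
print(f"Deep Learning (2010-2016, ~6-month doubling): {growth_multiple(6 * 12, 6):,.0f}x")
print(f"Large-scale (2016-2022, ~11-month doubling): {growth_multiple(6 * 12, 11):,.0f}x")
# ~2.1e+10x, ~4,096x, and ~93x respectively
```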
Predicting AI Computation Progress
Looking back at just the last decade, compute has grown so tremendously that it’s difficult to comprehend.
For example, the compute used to train Minerva, an AI which can solve complex math problems, is nearly 6 million times that which was used to train AlexNet 10 years ago.
Here’s a list of important AI models through history and the amount of compute used to train them.
AI | Year | FLOPs |
---|---|---|
Theseus | 1950 | 40 |
Perceptron Mark I | 1957–58 | 695,000 |
Neocognitron | 1980 | 228 million |
NetTalk | 1987 | 81 billion |
TD-Gammon | 1992 | 18 trillion |
NPLM | 2003 | 1.1 petaFLOPs |
AlexNet | 2012 | 470 petaFLOPs |
AlphaGo | 2016 | 1.9 million petaFLOPs |
GPT-3 | 2020 | 314 million petaFLOPs |
Minerva | 2022 | 2.7 billion petaFLOPs |
Note: One petaFLOP = one quadrillion FLOPs. Source: “Compute Trends Across Three Eras of Machine Learning” by Sevilla et al., 2022.
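As a sanity check on the scale of these numbers, here is a small Python sketch (not from the source) converting the table entries to raw FLOPs and reproducing the Minerva-to-AlexNet comparison above:

```python
# Values are total training compute from the table above.
# 1 petaFLOP = 1e15 FLOPs.

PFLOP = 1e15

compute_flops = {
    "Theseus (1950)": 40.0,
    "AlexNet (2012)": 470 * PFLOP,
    "GPT-3 (2020)": 314e6 * PFLOP,
    "Minerva (2022)": 2.7e9 * PFLOP,
}

# Minerva used nearly 6 million times the compute of AlexNet
ratio = compute_flops["Minerva (2022)"] / compute_flops["AlexNet (2012)"]
print(f"Minerva / AlexNet: {ratio:,.0f}x")  # ~5,744,681x
```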
This growth in computation, along with the availability of massive data sets and better algorithms, has yielded remarkable AI progress in what seems like very little time. AI now doesn’t just match, but beats, human performance in many areas.
It’s difficult to say if the same pace of computation growth will be maintained. Large-scale models require increasingly more compute power to train, and if computation doesn’t continue to ramp up it could slow down progress. Exhausting all the data currently available for training AI models could also impede the development and implementation of new models.
However, with all the funding recently poured into AI, perhaps more breakthroughs are around the corner, such as matching the computation power of the human brain.
Where Does This Data Come From?
Source: “Compute Trends Across Three Eras of Machine Learning” by Sevilla et al., 2022.
Note: The estimated time for computation to double varies depending on the research approach, with differing estimates from Amodei and Hernandez (2018) and Lyzhov (2021). This article is based on our source’s findings; please see their full paper for further details. Furthermore, the authors are cognizant of the framing concerns with deeming an AI model “regular-sized” or “large-sized” and note that further research is needed in this area.
Methodology: The authors of the paper used two methods to determine the amount of compute used to train AI models: counting the number of operations and tracking GPU time. Both approaches have drawbacks, namely a lack of transparency around training processes and the growing complexity of ML models.
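For a sense of how the operation-counting method can work in practice, here is a hedged Python sketch using a common rule of thumb for dense transformer models (an assumption on my part, not necessarily the paper's exact procedure): total training compute ≈ 6 × parameters × training tokens.

```python
# Rule-of-thumb estimate for dense transformer training compute:
#   total FLOPs ~= 6 * N * D
# where N = parameter count and D = number of training tokens
# (~2 FLOPs per parameter per token forward, ~4 backward).
# A common approximation, not necessarily the paper's exact method.

def training_flops(params: float, tokens: float) -> float:
    """Approximate total training compute in FLOPs."""
    return 6 * params * tokens

# GPT-3: ~175B parameters trained on ~300B tokens
flops = training_flops(175e9, 300e9)
print(f"GPT-3 estimate: {flops:.2e} FLOPs "
      f"(~{flops / 1e15 / 1e6:.0f} million petaFLOPs)")
# ~3.15e+23 FLOPs, i.e. ~315 million petaFLOPs,
# close to the 314 million petaFLOPs in the table above
```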