
Timeline: The Most Important Science Headlines of 2022

[Infographic: The Most Important Science Headlines of 2022]


Scientific discoveries and technological innovation play a vital role in addressing many of the challenges and crises that we face every year.

The last year may have come and gone quickly, but scientists and researchers have worked painstakingly hard to advance our knowledge within a number of disciplines, industries, and projects around the world.

Over the course of a busy year like 2022, it's easy to lose track of all the amazing stories in science and technology.

At a Glance: Major Scientific Headlines of 2022

Below we dive a little deeper into some of the most interesting headlines, while providing links in case you want to explore these developments further.

January 2022

The James Webb Space Telescope Arrives at its Destination

What happened: A new space telescope brings promise of exciting findings and beautiful images from the final frontier. This telescope builds on the legacy of its predecessor, the Hubble Space Telescope, which launched over 30 years ago.

Why it matters: The James Webb Space Telescope is our latest state-of-the-art “window” into deep space. With more access to the infrared spectrum, new images, measurements, and observations of outer space will become available.

» To learn more, read this article from The Planetary Society, or watch this video from the Wall Street Journal.

April 2022

Complete: The Human Genome

What happened: Scientists publish the first truly complete, gap-free sequence of the human genome, filling in the roughly 8% that earlier drafts left unresolved.

Why it matters: A complete human genome allows researchers to better understand the genetic basis of human traits and diseases. New therapies and treatments are likely to arise from this development.

» To learn more, watch this video by Two Minute Papers, or read this article from NIH.

May 2022

Monkeypox Breaks Out

What happened: An unusually high number of monkeypox cases is reported in non-endemic countries.

Why it matters: In the shadow of a global pandemic, researchers are keeping a closer eye on how diseases spread. The sudden, multinational spike in monkeypox cases raises questions about disease evolution and prevention.

» To learn more, read this article by the New York Times.

June 2022

A Perfectly Preserved Woolly Mammoth

What happened: Gold miners unearth a 35,000-year-old, well-preserved baby woolly mammoth in the Yukon permafrost.

Why it matters: The mammoth, named Nun cho ga by the Tr’ondëk Hwëch’in First Nation, is the most complete specimen discovered in North America to date. Each new discovery allows paleontologists to broaden our knowledge of biodiversity and how life changes over time.

» To learn more, read this article from Smithsonian Magazine.

July 2022

The Rise of AI Art

What happened: Access to new computer programs, such as DALL-E and Midjourney, gives members of the general public the ability to create images from text prompts.

Why it matters: Widespread access to generative AI tools fuels inspiration—and controversy. Concerns over artists' rights and copyright violations grow as these programs threaten to devalue creative labor.

» To learn more, read this article by MyModernMet, or watch this video by Cleo Abram.

August 2022

Dead Organs Get a Second Chance

What happened: Researchers create a perfusion system that can revitalize organs after cellular death. Using a special mixture of blood and nutrients, the organs of a dead pig can be sustained after death, and in some cases even show signs of cellular repair.

Why it matters: This discovery could potentially lead to a greater shelf-life and supply of organs for transplant.

» To learn more, read this article by Scientific American, or this article from the New York Times.

September 2022

DART Delivers A Cosmic Nudge

What happened: NASA crashes a spacecraft into an asteroid just to see how much it would move. Dimorphos, a moonlet orbiting a larger asteroid called Didymos 6.8 million miles (11 million km) from Earth, is struck by the DART (Double Asteroid Redirection Test) spacecraft. NASA estimates that as much as 22 million pounds (10 million kg) was ejected after the impact.

Why it matters: Earth is constantly at risk of being struck by stray asteroids. Developing reliable methods of deflecting near-Earth objects could save us from meeting the same fate as the dinosaurs.

» To learn more, watch this video by Real Engineering, or read this article from Space.com.

November 2022

Falling Sperm Counts

What happened: A scientific review suggests human sperm counts have fallen by as much as 62% over the past 50 years.

Why it matters: A lower sperm count makes it more difficult to conceive naturally. Because sperm count is also a marker for overall health, the trend raises broader concerns about declining male health globally. Researchers point to external stressors that may be driving the decline, such as diet, environment, and lifestyle.

» To learn more, check out this article from the Guardian.

December 2022

Finding Ancient DNA

What happened: Two million-year-old DNA is found in Greenland.

Why it matters: DNA is a record of biodiversity. Apart from showing that a now-desolate Arctic landscape was once teeming with life, ancient DNA offers hints about how ecosystems evolved into those we see today.

» To learn more, read this article from National Geographic.

December 2022

Fusing Energy

What happened: The U.S. Department of Energy reports achieving net energy gain for the first time in the development of nuclear fusion.

Why it matters: Fusion is often seen as the Holy Grail of safe clean energy, and this latest milestone brings researchers one step closer to harnessing nuclear fusion to power the world.

» To learn more, view our infographic on fusion, or read this article from BBC.

Science in the New Year

The future of scientific research looks bright. Researchers and scientists are continuing to push the boundaries of what we know and understand about the world around us.

For 2023, some disciplines are likely to continue to dominate headlines:

  • Advancement in space continues with projects like the James Webb Space Telescope and SETI COSMIC’s hunt for life beyond Earth
  • Climate action may become more demanding as recovery and prevention from extreme weather events continue into the new year
  • Generative AI tools such as DALL-E and ChatGPT were opened to public use in 2022, igniting widespread interest in the potential of artificial intelligence
  • Even amidst the lingering shadow of COVID-19, new therapeutics should advance medicine into new territories

Where science is going remains to be seen, but this past year instills faith that 2023 will be filled with even more progress.


Charted: The Exponential Growth in AI Computation

In eight decades, artificial intelligence has moved from the purview of science fiction to reality. Here's a quick history of AI computation.


[Chart: machine learning systems over time and the amount of AI computation used to train them, measured in FLOPs]

Electronic computers had barely been around for a decade before the first experiments with AI began in the 1950s. Now we have AI models that can write poetry and generate images from text prompts. But what has led to such exponential growth in so short a time?

This chart from Our World in Data tracks the history of AI through the amount of computation power used to train an AI model, using data from Epoch AI.

The Three Eras of AI Computation

In the 1950s, American mathematician Claude Shannon trained a robotic mouse called Theseus to navigate a maze and remember its course—the first apparent artificial learning of any kind.

Theseus was built on just 40 floating point operations (FLOPs). A FLOP is one basic arithmetic operation (addition, subtraction, multiplication, or division), so a model's total FLOPs measure how much computation went into training it.

ℹ️ The related rate metric, FLOPS (operations per second), is often used to measure the computational performance of computer hardware: the higher the FLOPS, the more powerful the system. In this article, FLOPs count the total operations used to train each model.
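To make the unit concrete, here is a minimal sketch of how total training FLOPs are often estimated for modern transformer-style models, using the common "6 × parameters × tokens" rule of thumb from the scaling-laws literature (the heuristic and the GPT-3-scale figures below are illustrative assumptions, not from this article):

```python
# Rough training-compute estimate via the common 6 * N * D heuristic:
# ~6 FLOPs per parameter per training token for dense transformer models.

def training_flops(num_params: float, num_tokens: float) -> float:
    """Approximate total FLOPs used to train a dense model."""
    return 6 * num_params * num_tokens

# Illustrative GPT-3-scale inputs: ~175B parameters, ~300B training tokens.
total = training_flops(175e9, 300e9)
peta = total / 1e15  # one petaFLOP = 1e15 FLOPs

print(f"~{peta / 1e6:.0f} million petaFLOPs")  # ~315 million petaFLOPs
```

The result lands close to the 314 million petaFLOPs reported for GPT-3 later in this article, which is why this heuristic is popular for back-of-the-envelope comparisons.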

Computation power, the availability of training data, and algorithms are the three main ingredients of AI progress. For the first few decades of AI advances, compute, the computational power needed to train an AI model, grew according to Moore's Law.

Period      Era                  Compute Doubling
1950–2010   Pre-Deep Learning    18–24 months
2010–2016   Deep Learning        5–7 months
2016–2022   Large-scale models   11 months

Source: “Compute Trends Across Three Eras of Machine Learning” by Sevilla et al. (2022).
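Those doubling times translate into very different growth rates. A small sketch, using the approximate midpoints of the doubling times above, shows how much compute grows per decade in each regime:

```python
# How much compute grows over a period, given a doubling time.

def growth_factor(period_months: float, doubling_months: float) -> float:
    """Multiplicative growth over period_months if compute doubles every doubling_months."""
    return 2 ** (period_months / doubling_months)

DECADE = 120  # months

# Pre-Deep Learning era: ~21-month doubling (midpoint of 18-24 months)
print(f"Moore's Law pace: ~{growth_factor(DECADE, 21):,.0f}x per decade")
# Deep Learning era: ~6-month doubling (midpoint of 5-7 months)
print(f"Deep Learning pace: ~{growth_factor(DECADE, 6):,.0f}x per decade")
```

A decade at Moore's Law pace yields roughly a 50x increase in compute; a decade at the Deep Learning era's pace yields over a million-fold increase.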

However, at the start of the Deep Learning Era, heralded by AlexNet (an image recognition AI) in 2012, that doubling timeframe shortened considerably to six months, as researchers invested more in computation and processors.

With the emergence of AlphaGo in 2015—a computer program that beat a human professional Go player—researchers have identified a third era: that of the large-scale AI models whose computation needs dwarf all previous AI systems.

Predicting AI Computation Progress

Looking at just the last decade, compute has grown so tremendously that it's difficult to comprehend.

For example, the compute used to train Minerva, an AI which can solve complex math problems, is nearly 6 million times that which was used to train AlexNet 10 years ago.

Here’s a list of important AI models through history and the amount of compute used to train them.

AI                   Year      FLOPs
Theseus              1950      40
Perceptron Mark I    1957–58   695,000
Neocognitron         1980      228 million
NetTalk              1987      81 billion
TD-Gammon            1992      18 trillion
NPLM                 2003      1.1 petaFLOPs
AlexNet              2012      470 petaFLOPs
AlphaGo              2016      1.9 million petaFLOPs
GPT-3                2020      314 million petaFLOPs
Minerva              2022      2.7 billion petaFLOPs

Note: One petaFLOP = one quadrillion FLOPs. Source: “Compute Trends Across Three Eras of Machine Learning” by Sevilla et al. (2022).
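As a quick sanity check on the Minerva-versus-AlexNet comparison, the ratio falls out directly from the table's figures:

```python
# Compare training compute for two models from the table (in petaFLOPs).
alexnet_pflops = 470        # AlexNet, 2012
minerva_pflops = 2.7e9      # Minerva, 2022

ratio = minerva_pflops / alexnet_pflops
print(f"Minerva used ~{ratio / 1e6:.1f} million times AlexNet's training compute")
```

That works out to about 5.7 million times, consistent with the "nearly 6 million times" figure cited above.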

The result of this growth in computation, along with the availability of massive data sets and better algorithms, has yielded a lot of AI progress in seemingly very little time. Now AI doesn’t just match, but also beats human performance in many areas.

It’s difficult to say if the same pace of computation growth will be maintained. Large-scale models require increasingly more compute power to train, and if computation doesn’t continue to ramp up it could slow down progress. Exhausting all the data currently available for training AI models could also impede the development and implementation of new models.

However, with all the funding poured into AI recently, perhaps more breakthroughs are around the corner—like matching the computation power of the human brain.

Where Does This Data Come From?

Source: “Compute Trends Across Three Eras of Machine Learning” by Sevilla et. al, 2022.

Note: Estimates of the time it takes computation to double vary across research efforts, including Amodei and Hernandez (2018) and Lyzhov (2021). This article is based on our source's findings; please see the full paper for further details. The authors also acknowledge the framing concerns of deeming an AI model “regular-sized” or “large-sized” and note that further research is needed in this area.

Methodology: The authors of the paper used two methods to determine the amount of compute used to train AI models: counting the number of operations and tracking GPU time. Both approaches have drawbacks, namely a lack of transparency in training processes and severe complexity as ML models grow.
