How AI and the Metaverse Will Impact the Datasphere

The following content is sponsored by HIVE Digital

The datasphere—the infrastructure that stores and processes our data—is critical to many of the advanced technologies on which we rely.

So we partnered with HIVE Digital on this infographic to take a deep dive into how it could evolve to meet the twin challenges of AI and the metaverse.

The Rise and Rise of Large Language Models

If the second decade of the 21st century is remembered for anything, it will probably be the leaps and bounds made in the field of AI. Large language models (LLMs) have pushed AI performance to near-human levels, and in some cases beyond. But getting there takes ever more computational resources to train and operate these systems.

The Large-Scale Era is often considered to have started in late 2015 with the release of DeepMind’s AlphaGo Fan, the first computer program to defeat a professional Go player.

| AI System | Organization | Publication Date | Training Compute (FLOP) |
| --- | --- | --- | --- |
| AlphaGo Fan | DeepMind | 2015-Oct | 3.80E+20 |
| AlphaGo Lee | DeepMind | 2016-01-27 | 1.90E+21 |
| GNMT | Google | 2016-09-26 | 6.90E+21 |
| AlphaGo Master | DeepMind | 2017-01-01 | 1.50E+23 |
| AlphaGo Zero | DeepMind | 2017-10-18 | 3.41E+23 |
| Alpha Zero | DeepMind | 2017-12-05 | 3.67E+22 |
| BigGan-deep 512x512 | Heriot-Watt University, DeepMind | 2018-09-28 | 3.00E+21 |
| Megatron-BERT | NVIDIA | 2019-09-17 | 6.90E+22 |
| Megatron-LM (Original, 8.3B) | NVIDIA | 2019-Sep | 9.10E+21 |
| T5-11B | Google | 2019-Oct | 4.05E+22 |
| AlphaStar | DeepMind | 2019-10-30 | 2.02E+23 |
| OpenAI Five | OpenAI | 2019-12-13 | 6.70E+22 |
| OpenAI Five Rerun | OpenAI | 2019-12-13 | 1.30E+22 |
| GPT-3 175B (davinci) | OpenAI | 2020-May | 3.14E+23 |
| Switch | Google | 2021-01-11 | 8.22E+22 |
| ALIGN | Google | 2021-06-11 | 2.15E+23 |
| ERNIE 3.0 | Baidu Inc. | 2021-Jul | 2.35E+18 |
| Jurassic-1-Jumbo | AI21 Labs | 2021-Aug | 3.70E+23 |
| Megatron-Turing NLG 530B | Microsoft, NVIDIA | 2021-Oct | 1.17E+24 |
| Yuan 1.0 | Inspur | 2021-10-12 | 4.10E+23 |
| Source 1.0 | Inspur | 2021-11-10 | 3.54E+23 |
| Gopher | DeepMind | 2021-Dec | 6.31E+23 |
| AlphaCode | DeepMind | 2022-Feb | 4.05E+23 |
| LaMDA | Google | 2022-02-10 | 3.55E+23 |
| Chinchilla | DeepMind | 2022-Mar | 5.76E+23 |
| PaLM (540B) | Google | 2022-Apr | 2.53E+24 |
| OPT-175B | Meta AI | 2022-May | 4.30E+23 |
| Minerva (540B) | Google | 2022-Jun | 2.74E+24 |
| GPT-4 | OpenAI | 2023-Mar | 2.10E+25 |

That system required a training compute of 380 quintillion FLOP (floating-point operations), a measure of the total computation used during training. In 2023, OpenAI’s GPT-4 had a training compute roughly 55 thousand times greater, at 21 septillion FLOP.

At this rate of growth, with training compute essentially doubling every 9.9 months, future AI systems will need exponentially larger computers to train and operate them.
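
To make that growth rate concrete, here is a minimal Python sketch that extrapolates training compute forward from GPT-4’s estimated 2.10E+25 FLOP, assuming the 9.9-month doubling time above simply continues. The projection horizons are illustrative assumptions, not forecasts.

```python
# Rough extrapolation of training compute, assuming the ~9.9-month
# doubling time cited above continues to hold (an assumption, not a forecast).

GPT4_COMPUTE_FLOP = 2.10e25   # GPT-4 training compute from the table (FLOP)
DOUBLING_MONTHS = 9.9         # doubling time cited in the article

def projected_compute(months_ahead: float) -> float:
    """Training compute implied after `months_ahead` months of growth."""
    return GPT4_COMPUTE_FLOP * 2 ** (months_ahead / DOUBLING_MONTHS)

for years in (1, 3, 5, 10):
    flop = projected_compute(12 * years)
    print(f"{years:>2} years out: ~{flop:.2e} FLOP "
          f"({flop / GPT4_COMPUTE_FLOP:,.0f}x GPT-4)")
```

At that pace, the compute requirement grows by roughly 4,500x over a decade, which captures the “exponentially larger computers” problem in a single number.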

Building the Metaverse

The metaverse, an immersive and frictionless web accessed through augmented and virtual reality (AR and VR), will only add to these demands. One way to quantify this demand is to compare bitrates across applications, i.e., the amount of data (bits) an application transmits per second.

On the low end: music streaming, web browsing, and gaming all have relatively low bitrate requirements, and only streaming gaming breaks the one Mbps (megabits per second) threshold. Things go up from there, and fast. AR, VR, and holograms, all technologies that will be integral to the metaverse, top out at 300 Mbps.

Consider also that VR and AR require incredibly low latency, less than five milliseconds, to avoid motion sickness. So not only will the metaverse increase the amount of data that needs to be moved, an estimated 644 GB per household per day, but it will also need to move that data very quickly.
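
For a rough sense of scale, the sketch below converts sustained bitrates into daily data volumes and checks what average bitrate the 644 GB-per-household figure implies. The hours of use and the 0.32 Mbps music-streaming rate are illustrative assumptions, not numbers from the infographic.

```python
# Convert sustained bitrates into daily data volumes, and back.
# The hours of daily use and the music-streaming bitrate below are
# illustrative assumptions, not figures from the infographic.

BITS_PER_GB = 8e9  # using decimal gigabytes (1 GB = 10^9 bytes = 8 * 10^9 bits)

def gb_per_day(bitrate_mbps: float, hours_per_day: float) -> float:
    """Data moved per day (GB) at a sustained bitrate in Mbps."""
    return bitrate_mbps * 1e6 * hours_per_day * 3600 / BITS_PER_GB

# Two hours of a 300 Mbps VR/hologram stream vs. music streaming at 0.32 Mbps.
print(f"VR at 300 Mbps, 2 h/day:     {gb_per_day(300, 2):.0f} GB/day")
print(f"Music at 0.32 Mbps, 2 h/day: {gb_per_day(0.32, 2):.2f} GB/day")

# Average bitrate implied by 644 GB per household per day, spread over 24 h.
avg_mbps = 644 * BITS_PER_GB / (24 * 3600) / 1e6
print(f"644 GB/day averages out to ~{avg_mbps:.0f} Mbps around the clock")
```

Spread evenly over 24 hours, 644 GB works out to roughly 60 Mbps of continuous traffic per household.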

The Global Datasphere

At the time of writing, there are 5,065 data centers worldwide, with 39.0% located in the U.S. The next largest national player is the UK, with only 5.5%. These facilities not only store the data we produce, but also run the applications we rely on. And they are evolving.

There are two broad approaches that data centers are taking to get ahead of the demand curve. The first and probably most obvious option is going BIG. The world’s three largest hyperscale data centers are: 

  1. 🇨🇳 China Telecom’s Inner Mongolia Information Park (10.8 million square feet)
  2. 🇨🇳 China Mobile’s Hohhot Data Center (7.8 million square feet)
  3. 🇺🇸 Switch’s Citadel Campus (7.2 million square feet)

The other route is to go small, but closer to where the action is. This is what edge computing does: it decentralizes the data center in order to improve latency. This approach will likely play a big part in the rollout of self-driving vehicles, where safety depends on speed.
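
To see why proximity helps, consider the physical floor on latency: light in optical fiber covers roughly 200 km per millisecond, so distance alone sets a minimum round-trip time before any processing happens. The sketch below estimates that floor for a few distances; the fiber speed is a standard approximation rather than a figure from the infographic.

```python
# Lower bound on network round-trip time from propagation delay alone.
# Light in optical fiber travels at roughly 2/3 the speed of light in a
# vacuum (~200,000 km/s); real latency is higher once routing, queuing,
# and processing are added.

FIBER_KM_PER_MS = 200.0  # ~200,000 km/s => ~200 km per millisecond

def min_round_trip_ms(distance_km: float) -> float:
    """Propagation-only round-trip time in milliseconds."""
    return 2 * distance_km / FIBER_KM_PER_MS

for km in (10, 100, 500, 2000):
    print(f"{km:>5} km away: >= {min_round_trip_ms(km):.1f} ms round trip")
```

At about 500 km, propagation alone already consumes the entire five-millisecond budget cited earlier, which is exactly why pushing compute to the edge matters.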

And investors are putting their money behind the idea. Global spending on edge data centers is expected to hit $208 billion in 2023, up 13.1% from 2022.

Welcome to the Zettabyte Era

The International Data Corporation projects that the amount of data produced annually will grow to 221 zettabytes by 2026, at a compound annual growth rate of 21.2%. With the zettabyte era nearly upon us, data centers will have a critical role to play.
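
As a quick sanity check on that projection, the sketch below compounds annual data volume at the stated 21.2% growth rate, anchored to the 221 ZB figure for 2026; the intermediate years are implied by those two numbers rather than separately reported.

```python
# Compound annual growth: anchor on the 221 ZB projection for 2026 and a
# 21.2% CAGR, then derive implied annual volumes. Intermediate years are
# back-calculated from those two figures, not separately reported values.

ANCHOR_YEAR, ANCHOR_ZB = 2026, 221.0
CAGR = 0.212

def datasphere_zb(year: int) -> float:
    """Implied global data produced per year, in zettabytes."""
    return ANCHOR_ZB * (1 + CAGR) ** (year - ANCHOR_YEAR)

for year in range(2022, 2027):
    print(f"{year}: ~{datasphere_zb(year):.0f} ZB")
```

Working backward, this implies roughly 100 ZB of data produced in 2022.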

Learn more about how HIVE Digital exports renewably sourced computing power to customers all over the world, helping to meet the demands of emerging technologies like AI.
