
AIoT: When Artificial Intelligence Meets the Internet of Things


The Internet of Things (IoT) is a technology helping us to reimagine daily life, but artificial intelligence (AI) is the real driving force behind the IoT’s full potential.

From its most basic applications of tracking our fitness levels, to its wide-reaching potential across industries and urban planning, the growing partnership between AI and the IoT means that a smarter future could occur sooner than we think.

This infographic by TSMC highlights the breakthrough technologies and trends making that shift possible, and how we’re continuing to push the boundaries.

AI + IoT = Superpowers of Innovation

IoT devices use the internet to communicate, collect, and exchange information about our online activities. Every day, they generate 1 billion GB of data.

By 2025, there are projected to be 42 billion IoT-connected devices globally. It’s only natural that as these device numbers grow, the swaths of data will too. That’s where AI steps in, lending its learning capabilities to the connectivity of the IoT.

The IoT is empowered by three key emerging technologies:

  • Artificial Intelligence (AI)
    Programmable functions and systems that enable devices to learn, reason, and process information like humans.
  • 5G Networks
    Fifth-generation mobile networks with high-speed, near-zero lag for real-time data processing.
  • Big Data
    Enormous volumes of data processed from numerous internet-connected sources.

Together, these technologies are transforming the way we interact with our devices at home and at work, creating the AIoT (“Artificial Intelligence of Things”) in the process.

The Major AIoT Segments

So where are AI and the IoT headed together?

There are four major segments in which the AIoT is making an impact: wearables, smart home, smart city, and smart industry.

1. Wearables

Wearable devices such as smartwatches continuously monitor and track user preferences and habits. Not only has this led to impactful applications in the healthtech sector, it also works well for sports and fitness. According to leading tech research firm Gartner, the global wearable device market is estimated to see more than $87 billion in revenue by 2023.

2. Smart Home

Houses that respond to your every request are no longer restricted to science fiction. Smart homes are able to leverage appliances, lighting, electronic devices and more, learning a homeowner’s habits and developing automated “support.”

This seamless access also brings the added perk of improved energy efficiency. As a result, the smart home market could see a compound annual growth rate (CAGR) of 25% between 2020 and 2025, reaching $246 billion.

3. Smart City

As more and more people flock from rural to urban areas, cities are evolving into safer, more convenient places to live. Smart city innovations are keeping pace, with investments going towards improving public safety, transport, and energy efficiency.

The practical applications of AI in traffic control are already becoming clear. In New Delhi, home to some of the world’s most traffic-congested roads, an Intelligent Transport Management System (ITMS) is in use to make ‘real time dynamic decisions on traffic flows’.

4. Smart Industry

Last but not least, industries from manufacturing to mining rely on digital transformation to become more efficient and reduce human error.

From real-time data analytics to supply-chain sensors, smart devices help prevent costly errors in industry. In fact, Gartner also estimates that over 80% of enterprise IoT projects will incorporate AI by 2022.

The Untapped Potential of AI & IoT

AIoT innovation is only accelerating, and promises to lead us into a more connected future.

Category | Today | Tomorrow
Edge computing | Smart thermostats, smart appliances | Home robots, autonomous vehicles
Voice AI | Smart speakers | Natural language processing (NLP), ePayment voice authentication
Vision AI | Massive object detection | Video analytics on the edge, super 8K resolution

The AIoT fusion is becoming increasingly mainstream, and it will continue to push the boundaries of data processing and intelligent learning for years to come.

Just like any company that blissfully ignored the Internet at the turn of the century, the ones that dismiss the Internet of Things risk getting left behind.

Jared Newman, Technology Analyst


Charted: The Exponential Growth in AI Computation

In eight decades, artificial intelligence has moved from the purview of science fiction to reality. Here’s a quick history of AI computation.


[Chart: machine learning systems over time vs. the amount of AI computation used to train them, measured in FLOPs]


Electronic computers had barely been around for a decade when experiments with AI began in the 1950s. Now we have AI models that can write poetry and generate images from textual prompts. But what’s led to such exponential growth in such a short time?

This chart from Our World in Data tracks the history of AI through the amount of computation power used to train an AI model, using data from Epoch AI.

The Three Eras of AI Computation

In the 1950s, American mathematician Claude Shannon trained a robotic mouse called Theseus to navigate a maze and remember its course—the first apparent artificial learning of any kind.

Theseus was built on 40 floating point operations (FLOPs), a unit of measurement that counts the number of basic arithmetic operations (addition, subtraction, multiplication, or division) a computer performs.

ℹ️ Expressed as a rate (FLOPS, or operations per second), the same unit is often used to measure the computational performance of computer hardware: the higher the count, the more powerful the system. In this article, FLOPs refer to the total number of operations used to train a model.
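To make the counting concrete, here is a minimal Python sketch (not from the source) that estimates the FLOPs in a single matrix multiplication, using the common convention of roughly one multiply and one add per inner-dimension step for each output element:

```python
def matmul_flops(m: int, k: int, n: int) -> int:
    """Approximate FLOPs to multiply an (m x k) matrix by a (k x n) matrix.

    Each of the m*n output elements needs k multiplications and k-1 additions,
    conventionally rounded to ~2k operations per element.
    """
    return 2 * m * k * n

# A tiny 8x8 by 8x8 multiplication already costs ~1,024 FLOPs,
# more than 25 times the 40 FLOPs used to build Theseus in 1950.
print(matmul_flops(8, 8, 8))
```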

Computation power, availability of training data, and algorithms are the three main ingredients to AI progress. And for the first few decades of AI advances, compute, which is the computational power needed to train an AI model, grew according to Moore’s Law.

Period | Era | Compute Doubling
1950–2010 | Pre-Deep Learning | 18–24 months
2010–2016 | Deep Learning | 5–7 months
2016–2022 | Large-scale models | 11 months

Source: “Compute Trends Across Three Eras of Machine Learning” by Sevilla et al., 2022.
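As a rough illustration of what these doubling times imply, the hypothetical Python sketch below converts each era’s approximate length and doubling time into an overall growth factor in training compute (the midpoint doubling values are assumptions taken from the table’s ranges, not figures from the source):

```python
# Approximate era lengths and compute-doubling times (in months), per the table above.
eras = {
    "Pre-Deep Learning (1950-2010)":  (60 * 12, 21),  # ~21-month doubling (midpoint of 18-24)
    "Deep Learning (2010-2016)":      (6 * 12, 6),    # ~6-month doubling (midpoint of 5-7)
    "Large-scale models (2016-2022)": (6 * 12, 11),   # ~11-month doubling
}

for era, (length_months, doubling_months) in eras.items():
    doublings = length_months / doubling_months
    growth = 2 ** doublings
    print(f"{era}: ~{doublings:.0f} doublings, ~{growth:.1e}x growth in training compute")
```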

However, at the start of the Deep Learning Era, heralded by AlexNet (an image recognition AI) in 2012, that doubling timeframe shortened considerably to six months, as researchers invested more in computation and processors.

With the emergence of AlphaGo in 2015—a computer program that beat a human professional Go player—researchers have identified a third era: that of the large-scale AI models whose computation needs dwarf all previous AI systems.

Predicting AI Computation Progress

Looking back at only the last decade, compute has grown so tremendously that it’s difficult to comprehend.

For example, the compute used to train Minerva, an AI which can solve complex math problems, is nearly 6 million times that which was used to train AlexNet 10 years ago.

Here’s a list of important AI models through history and the amount of compute used to train them.

AI | Year | FLOPs
Theseus | 1950 | 40
Perceptron Mark I | 1957–58 | 695,000
Neocognitron | 1980 | 228 million
NetTalk | 1987 | 81 billion
TD-Gammon | 1992 | 18 trillion
NPLM | 2003 | 1.1 petaFLOPs
AlexNet | 2012 | 470 petaFLOPs
AlphaGo | 2016 | 1.9 million petaFLOPs
GPT-3 | 2020 | 314 million petaFLOPs
Minerva | 2022 | 2.7 billion petaFLOPs

Note: One petaFLOP = one quadrillion FLOPs. Source: “Compute Trends Across Three Eras of Machine Learning” by Sevilla et al., 2022.
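As a quick sanity check on the “nearly 6 million times” comparison above, the figures in this table can be divided directly (a back-of-the-envelope calculation, not part of the source):

```python
# Training compute from the table, in petaFLOPs
alexnet_pflops = 470      # AlexNet, 2012
minerva_pflops = 2.7e9    # Minerva, 2022 (2.7 billion petaFLOPs)

ratio = minerva_pflops / alexnet_pflops
print(f"Minerva used ~{ratio:,.0f} times the compute of AlexNet")  # ~5.7 million
```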

The result of this growth in computation, along with the availability of massive data sets and better algorithms, has yielded a lot of AI progress in seemingly very little time. Now AI doesn’t just match, but also beats human performance in many areas.

It’s difficult to say if the same pace of computation growth will be maintained. Large-scale models require increasingly more compute power to train, and if computation doesn’t continue to ramp up it could slow down progress. Exhausting all the data currently available for training AI models could also impede the development and implementation of new models.

However, with all the funding poured into AI recently, perhaps more breakthroughs are around the corner, like matching the computation power of the human brain.

Where Does This Data Come From?

Source: “Compute Trends Across Three Eras of Machine Learning” by Sevilla et al., 2022.

Note: The estimated time for computation to double can vary depending on different research attempts, including Amodei and Hernandez (2018) and Lyzhov (2021). This article is based on our source’s findings. Please see their full paper for further details. Furthermore, the authors are cognizant of the framing concerns with deeming an AI model “regular-sized” or “large-sized” and said further research is needed in the area.

Methodology: The authors of the paper used two methods to determine the amount of compute used to train AI Models: counting the number of operations and tracking GPU time. Both approaches have drawbacks, namely: a lack of transparency with training processes and severe complexity as ML models grow.
