The Future of Food: How Tech Is Changing Our Food Systems
The urban population is exploding around the globe, and yesterday’s food systems will soon struggle to serve the megacities swelling with tens of millions of people.
Further, issues like wasted food, poor working conditions, polluted ecosystems, mistreated animals, and greenhouse gases are just some of the concerns that people have about our current supply chains.
Today’s infographic from Futurism shows how food systems are evolving – and that the future of food depends on technologies that enable us to get more food out of fewer resources.
The Next Gen of Food Systems
Here are four technologies that may have a profound effect on how we eat in the future:
1. Automated Vertical Farms
It’s already clear that vertical farming is incredibly effective. By stacking growing layers on top of one another and using automation, vertical farms can produce up to 100 times more food per acre than conventional agricultural techniques.
They grow crops at twice the usual speed, while using 40% less power, generating 80% less food waste, and consuming 99% less water than outdoor fields. However, the problem for vertical farms is still cost – and it is not clear when they will be viable on a commercial basis.
2. Aquaponics
Another technology that holds promise for the future of food is aquaponics, a combination of fish farming (aquaculture) with hydroponics.
In short, fish convert their food into nutrients that plants can absorb, while the plants clean the water for the fish. Compared to conventional farming, this technology uses about half of the water, while increasing the yield of the crops grown. As a bonus, it also can raise a significant amount of fish.
3. In Vitro Meats
Meat is costly and extremely resource-intensive to produce. To take just one example, a single pound of beef requires roughly 1,847 gallons of water.
In vitro meats are one way to solve this. These cultures of muscle tissue are grown and fed nutrients in a broth, bypassing the need for living animals altogether. Interestingly enough, market demand seems to be there: one recent study found that 70.6% of consumers are interested in trying lab-grown beef.
4. Artificial Animal Products
One other route to artificial animal products is to use machine learning to model the complex chemistry and textures behind these products, and to find ways to replicate them. This has already been done for mayonnaise – and it’s in the works for eggs, milk, and cheese as well.
Tasting the Future of Food
As these new technologies scale and hit markets, the future of food could change drastically. Many products will flop, but others will take a firm hold in our supply chains and become culturally acceptable and commercially viable. Perhaps food will be grown locally in massive skyscrapers, and decent alternatives to both meat and animal products will be found in the market.
With the global population rising by more than a million people each week, finding and testing new solutions around food will be essential to make the most out of limited resources.
Charted: The Exponential Growth in AI Computation
In eight decades, artificial intelligence has moved from the purview of science fiction to reality. Here’s a quick history of AI computation.
Electronic computers had barely been around for a decade when experiments with AI began in the 1950s. Now we have AI models that can write poetry and generate images from textual prompts. But what has led to such exponential growth in such a short time?
The Three Eras of AI Computation
In the 1950s, American mathematician Claude Shannon trained a robotic mouse called Theseus to navigate a maze and remember its course—the first apparent artificial learning of any kind.
Theseus was trained on just 40 floating point operations (FLOPs), a unit of measurement that counts the number of basic arithmetic operations (addition, subtraction, multiplication, or division) performed by a computer or processor.
Computation power, availability of training data, and algorithms are the three main ingredients of AI progress. For the first few decades of AI advances, compute (the computational power needed to train an AI model) grew according to Moore’s Law, doubling roughly every two years.
| Period | Era | Compute doubling time |
| --- | --- | --- |
| 1950–2010 | Pre-Deep Learning | 18–24 months |
| 2010–2016 | Deep Learning | 5–7 months |
| 2016–2022 | Large-scale models | 11 months |

Source: “Compute Trends Across Three Eras of Machine Learning” by Sevilla et al., 2022.
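These doubling times compound dramatically. A quick back-of-the-envelope sketch of what they imply (the 10-year horizon and the mid-range doubling values are illustrative assumptions, not figures from the source):

```python
def growth_per_decade(doubling_months: float) -> float:
    """Growth factor over a 10-year span for a given compute doubling time."""
    return 2 ** (120 / doubling_months)

# Pre-Deep Learning era: ~21-month doubling (mid-range of 18-24 months)
pre_deep_learning = growth_per_decade(21)   # roughly 50x per decade

# Deep Learning era: 6-month doubling (mid-range of 5-7 months)
deep_learning = growth_per_decade(6)        # 2**20, over a million-fold per decade
```

Shortening the doubling time from about two years to about six months turns a tens-of-times increase per decade into a more-than-million-fold one, which is why the eras matter so much.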
However, at the start of the Deep Learning Era, heralded by AlexNet (an image recognition AI) in 2012, that doubling timeframe shortened considerably to six months, as researchers invested more in computation and processors.
With the emergence of AlphaGo in 2015—a computer program that beat a human professional Go player—researchers have identified a third era: that of the large-scale AI models whose computation needs dwarf all previous AI systems.
Predicting AI Computation Progress
Looking back at just the last decade, compute has grown so tremendously that it’s difficult to comprehend.
For example, the compute used to train Minerva, an AI which can solve complex math problems, is nearly 6 million times the compute used to train AlexNet 10 years earlier.
Here’s a list of important AI models through history and the amount of compute used to train them.
| Model | Year | Training compute |
| --- | --- | --- |
| Perceptron Mark I | 1957–58 | 695,000 FLOPs |
| AlphaGo | 2016 | 1.9 million petaFLOPs |
| GPT-3 | 2020 | 314 million petaFLOPs |
| Minerva | 2022 | 2.7 billion petaFLOPs |
Note: One petaFLOP = one quadrillion FLOPs. Source: “Compute Trends Across Three Eras of Machine Learning” by Sevilla et al., 2022.
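Converting the table’s figures to a common unit makes the scale easier to compare. A small sketch (the ratios below are simple arithmetic on the values listed above; the dictionary is just an illustrative way to hold them):

```python
# Training compute from the table above, converted to raw FLOPs.
# One petaFLOP = 1e15 FLOPs, as in the note.
PETA = 1e15

compute_flops = {
    "Perceptron Mark I": 695_000,       # 1957-58, given in plain FLOPs
    "AlphaGo": 1.9e6 * PETA,            # 1.9 million petaFLOPs
    "GPT-3": 3.14e8 * PETA,             # 314 million petaFLOPs
    "Minerva": 2.7e9 * PETA,            # 2.7 billion petaFLOPs
}

# Minerva used roughly 8.6x the compute of GPT-3, trained only two years earlier...
ratio_minerva_gpt3 = compute_flops["Minerva"] / compute_flops["GPT-3"]

# ...and nearly four quintillion (4e18) times that of the 1950s Perceptron.
ratio_minerva_perceptron = compute_flops["Minerva"] / compute_flops["Perceptron Mark I"]
```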
The result of this growth in computation, along with the availability of massive data sets and better algorithms, has yielded a lot of AI progress in seemingly very little time. Now AI doesn’t just match, but also beats human performance in many areas.
It’s difficult to say if the same pace of computation growth will be maintained. Large-scale models require increasingly more compute power to train, and if computation doesn’t continue to ramp up it could slow down progress. Exhausting all the data currently available for training AI models could also impede the development and implementation of new models.
However, with all the funding poured into AI recently, perhaps more breakthroughs are around the corner, like matching the computation power of the human brain.