Technology
Tap Into the Mobile Payments Revolution
The trends and contenders that are shaping mobile payments.
Thanks to Purefunds Mobile Payments ETF (IPAY) for helping us put this together.
Yesterday, user-friendly payment processor Stripe announced a strategic investment and partnership from Visa that values the company at $5 billion. The other investors that participated are not unknowns either: Kleiner Perkins Caufield & Byers, American Express, and Sequoia. As recently as December, Stripe was valued at $3.5 billion, and in the financing round before that, it was valued at just half of that. Stripe will use Visa’s international connections to help it expand beyond the 20 countries it currently serves.
This type of story is not unusual in the payments space. Companies are scrambling to scale or adopt new technologies that integrate mobile and electronic features to make it easier, cheaper, and faster for customers to pay. The reason is that the payments ecosystem has always been more cumbersome and more expensive than it needs to be. In the United States alone, retail merchants that accept card-based payments were charged about $67 billion in fees. Add the rest of the world to that pie, and it becomes clear that the payments space is as ripe for disruption as any other.
Mobile and electronic payments allow customers to pay for goods with a tap of a phone or the press of a button. Two of every three Americans have a smartphone, and mobile payments are typically faster and carry lower fees. The earliest adopters of mobile payments skew young and affluent: they average just over 30 years old, have higher annual incomes, and spend more than twice as much on retail as non-users who are unwilling to adopt mobile payments.
Big Data and the Developing World
One of the most attractive benefits of mobile payments is the integration of big data and predictive analytics. Retailers will be able to link purchases directly with location (GPS), consumer behavior, purchase history, demographics, and social influence. Analyzing this information will allow companies to reach out to consumers with tailored offerings, loyalty programs, and rewards. Customers will be able to take action right from their mobile device.
The opportunities in payments are not limited to the United States, or even the developed world. Perhaps one of the most interesting opportunities for mobile payments is in Africa, where bank penetration is extremely low at only about 25%, while mobile phone penetration is far higher at roughly 60%. Kenya is a good example of a market where digitization has reached a large portion of the population, giving mobile payments 86% household penetration.
McKinsey did an analysis of the size of revenue pools for mobile payments if each market in Africa had the same penetration as Kenya, and it sees the pools more than doubling in places like Ethiopia and Nigeria. With the population of sub-Saharan Africa expected to balloon from 926 million to 2.2 billion by 2050, there appears to be even greater opportunity.
Tapping In
The earliest potential in the mobile and electronic payments market appears to be in areas such as micropayments, incidental payments, recurring bills, peer-to-peer money transfers, and cryptocurrency. However, in the long term, the concept can be applied to many different facets of commerce.
Mobile payments may continue to disrupt the broader payments market because of several factors, including a young and growing user base, ease of use, faster transactions, lower costs, and increasing adoption. As Smittipon Srethapramote, who covers the North American payments industry for Morgan Stanley, concludes in a summary on the subject: “Mobile Payments can expand the global revenue pie from $175 billion to $250 billion, including $45 billion in developed markets and $30 billion in emerging markets.”
AI
Charted: The Exponential Growth in AI Computation
In eight decades, artificial intelligence has moved from the purview of science fiction to reality. Here’s a quick history of AI computation.

Electronic computers, first built in the 1940s, had barely been around for a decade before experiments with AI began. Now we have AI models that can write poetry and generate images from text prompts. But what has led to such exponential growth in such a short time?
This chart from Our World in Data tracks the history of AI through the amount of computation power used to train an AI model, using data from Epoch AI.
The Three Eras of AI Computation
In the 1950s, American mathematician Claude Shannon trained a robotic mouse called Theseus to navigate a maze and remember its course—the first apparent artificial learning of any kind.
Theseus was built on just 40 floating point operations (FLOPs), a unit of measurement that counts the number of basic arithmetic operations (addition, subtraction, multiplication, or division) a computer performs. Used this way, FLOPs measure the total amount of computation involved in training an AI model, not operations per second.
Computation power, availability of training data, and algorithms are the three main ingredients of AI progress. For the first few decades of AI advances, compute, the computational power needed to train an AI model, grew according to Moore’s Law, doubling roughly every 18 to 24 months.
| Period | Era | Compute Doubling |
|---|---|---|
| 1950–2010 | Pre-Deep Learning | 18–24 months |
| 2010–2016 | Deep Learning | 5–7 months |
| 2016–2022 | Large-scale models | 11 months |
Source: “Compute Trends Across Three Eras of Machine Learning” by Sevilla et al., 2022.
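As a rough back-of-envelope illustration (not part of the source analysis), these doubling times can be turned into growth factors over a fixed period. The short Python sketch below compounds each era's approximate doubling time over a decade:

```python
# Back-of-envelope sketch: how much compute grows over a decade
# for each era's approximate doubling time. Illustrative only; the
# doubling times are the ranges reported in the table above.

def growth_over(months_elapsed: float, doubling_months: float) -> float:
    """Growth factor after `months_elapsed` given a fixed doubling time."""
    return 2 ** (months_elapsed / doubling_months)

DECADE = 120  # months
for era, doubling in [("Pre-Deep Learning (~24 mo)", 24),
                      ("Deep Learning (~6 mo)", 6),
                      ("Large-scale models (~11 mo)", 11)]:
    print(f"{era}: ~{growth_over(DECADE, doubling):,.0f}x per decade")

# Prints approximately:
#   Pre-Deep Learning (~24 mo): ~32x per decade
#   Deep Learning (~6 mo): ~1,048,576x per decade
#   Large-scale models (~11 mo): ~1,923x per decade
```

A six-month doubling time compounds to roughly a million-fold increase in training compute over ten years, which is why the post-2010 eras look so steep on a logarithmic chart.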
However, at the start of the Deep Learning Era, heralded by AlexNet (an image recognition AI) in 2012, that doubling timeframe shortened considerably to six months, as researchers invested more in computation and processors.
With the emergence of AlphaGo in 2015—a computer program that beat a human professional Go player—researchers have identified a third era: that of the large-scale AI models whose computation needs dwarf all previous AI systems.
Predicting AI Computation Progress
Looking at just the last decade, compute has grown so tremendously that it is difficult to comprehend.
For example, the compute used to train Minerva, an AI that can solve complex math problems, is nearly 6 million times the amount used to train AlexNet a decade earlier (see the quick calculation after the table below).
Here’s a list of important AI models through history and the amount of compute used to train them.
| AI | Year | FLOPs |
|---|---|---|
| Theseus | 1950 | 40 |
| Perceptron Mark I | 1957–58 | 695,000 |
| Neocognitron | 1980 | 228 million |
| NetTalk | 1987 | 81 billion |
| TD-Gammon | 1992 | 18 trillion |
| NPLM | 2003 | 1.1 petaFLOPs |
| AlexNet | 2012 | 470 petaFLOPs |
| AlphaGo | 2016 | 1.9 million petaFLOPs |
| GPT-3 | 2020 | 314 million petaFLOPs |
| Minerva | 2022 | 2.7 billion petaFLOPs |
Note: One petaFLOP = one quadrillion FLOPs. Source: “Compute Trends Across Three Eras of Machine Learning” by Sevilla et al., 2022.
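Using the figures in the table above (and the petaFLOP conversion from the note), a quick illustrative sketch verifies the “nearly 6 million times” comparison and backs out the compute doubling time it implies:

```python
# Sketch using the table's figures (1 petaFLOP = 1e15 FLOPs).
# Illustrative arithmetic only, not the source paper's methodology.
from math import log2

PETA = 1e15
alexnet_flops = 470 * PETA      # AlexNet, 2012
minerva_flops = 2.7e9 * PETA    # Minerva, 2022 (2.7 billion petaFLOPs)

ratio = minerva_flops / alexnet_flops
months = 10 * 12                # roughly a decade between the two models
implied_doubling = months / log2(ratio)

print(f"Minerva used ~{ratio / 1e6:.1f} million times more compute than AlexNet")
print(f"Implied compute doubling time: ~{implied_doubling:.1f} months")
# Prints roughly: ~5.7 million times, doubling every ~5.3 months
```

That implied average doubling time of roughly five months is broadly consistent with the fast post-2010 doubling times reported in the earlier table.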
The result of this growth in computation, along with the availability of massive data sets and better algorithms, has yielded a lot of AI progress in seemingly very little time. Now AI doesn’t just match, but also beats human performance in many areas.
It’s difficult to say whether the same pace of computation growth will be maintained. Large-scale models require increasingly large amounts of compute to train, and if computation doesn’t continue to ramp up, progress could slow. Exhausting all the data currently available for training AI models could also impede the development and implementation of new models.
However, with all the funding poured into AI recently, perhaps more breakthroughs are around the corner, like matching the computation power of the human brain.
Where Does This Data Come From?
Source: “Compute Trends Across Three Eras of Machine Learning” by Sevilla et al., 2022.
Note: The estimated time for computation to double can vary depending on different research attempts, including Amodei and Hernandez (2018) and Lyzhov (2021). This article is based on our source’s findings. Please see their full paper for further details. Furthermore, the authors are cognizant of the framing concerns with deeming an AI model “regular-sized” or “large-sized” and said further research is needed in the area.
Methodology: The authors of the paper used two methods to determine the amount of compute used to train AI models: counting the number of operations and tracking GPU time. Both approaches have drawbacks, namely a lack of transparency around training processes and the growing complexity of ML models.
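As a purely illustrative example of the operation-counting approach, the sketch below uses a common rule of thumb for dense transformer models, roughly 6 FLOPs per parameter per training token, together with GPT-3’s widely reported parameter and token counts; neither the rule of thumb nor those counts are taken from the source paper.

```python
# Illustrative sketch of the operation-counting approach (not necessarily
# the source paper's exact accounting). A common rule of thumb for dense
# transformer training is ~6 FLOPs per parameter per training token.

def estimate_training_flops(num_parameters: float, num_tokens: float) -> float:
    """Rough training compute: ~6 FLOPs per parameter per token."""
    return 6 * num_parameters * num_tokens

# GPT-3's widely reported figures (assumed here): ~175B parameters,
# trained on ~300B tokens.
flops = estimate_training_flops(175e9, 300e9)
print(f"~{flops:.2e} FLOPs (~{flops / 1e15 / 1e6:.0f} million petaFLOPs)")
# Prints roughly: ~3.15e+23 FLOPs (~315 million petaFLOPs), close to the
# 314 million petaFLOPs listed for GPT-3 in the table above.
```

The GPU-time approach works from the other direction: multiplying the number of accelerators by their effective throughput and the total training time yields an estimate of the same quantity.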