Technology
How Many Music Streams Does it Take to Earn a Dollar?

A decade ago, the music industry was headed for a protracted fade-out.
The disruptive effects of peer-to-peer file sharing had slashed music revenues in half, casting serious doubts over the future of the industry.
Ringtones provided a brief earnings bump, but it was the growing popularity of premium streaming services that proved to be the savior of record labels and artists. For the first time since the mid-90s, the music industry saw back-to-back years of growth, and revenues grew a brisk 12% in 2018 – nearly reaching $10 billion. In short, people showed they were still willing to pay for music.
Although most forecasts show streaming services like Spotify and Apple Music contributing an increasingly large share of revenue going forward, recent data from The Trichordist reveals that these services pay out wildly different rates per stream.
Note: Due to the lack of publicly available data, calculating payouts from streaming services is not an exact science. This data set is based on revenue from an indie label with a ~150 album catalogue generating over 115 million streams.
Full Stream Ahead
One would expect streaming services to have fairly similar payout rates every time a track is played, but this is not the case. In reality, the streaming rates of major players in the market – which have very similar catalogs – are all over the map. Below is a full breakdown of how many streams it takes to earn a dollar on various platforms:
| Streaming service | Avg. payout per stream | # of streams to earn one dollar | # of streams to earn minimum wage* |
|---|---|---|---|
| Napster | $0.019 | 53 | 77,474 |
| Tidal | $0.0125 | 80 | 117,760 |
| Apple Music | $0.00735 | 136 | 200,272 |
| Google Play Music | $0.00676 | 147 | 217,751 |
| Deezer | $0.0064 | 156 | 230,000 |
| Spotify | $0.00437 | 229 | 336,842 |
| Amazon | $0.00402 | 249 | 366,169 |
| Pandora** | $0.00133 | 752 | 1,106,767 |
| YouTube | $0.00069 | 1,449 | 2,133,333 |
*Based on the U.S. monthly minimum wage of $1,472
**Premium tier
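The last two columns are simple division on the average per-stream rates. Below is a minimal Python sketch of that arithmetic; because the published rates are themselves rounded, the computed counts can differ from the table by a stream or two (Google Play Music, for example).

```python
# Illustrative recalculation of the table above. Rates are the published
# averages; since those are rounded, results may differ slightly from the table.

US_MONTHLY_MINIMUM_WAGE = 1472  # USD, per the table footnote

avg_payout_per_stream = {
    "Napster": 0.019,
    "Tidal": 0.0125,
    "Apple Music": 0.00735,
    "Google Play Music": 0.00676,
    "Deezer": 0.0064,
    "Spotify": 0.00437,
    "Amazon": 0.00402,
    "Pandora (Premium)": 0.00133,
    "YouTube": 0.00069,
}

for service, rate in avg_payout_per_stream.items():
    streams_per_dollar = round(1 / rate)
    streams_for_min_wage = round(US_MONTHLY_MINIMUM_WAGE / rate)
    print(f"{service}: {streams_per_dollar:,} streams per dollar, "
          f"{streams_for_min_wage:,} streams for minimum wage")
```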
Napster, once public enemy number one in the music business, has some of the most generous streaming rates in the industry. On the downside, the brand currently has a market share of less than 1%, so getting a high volume of plays on an album isn’t likely to happen for most artists.
On the flip side of the equation, YouTube sees the highest volume of plays, but the lowest payout per stream by far. It takes almost 1,500 plays to earn a single dollar on the Google-owned video platform.
Spotify, which is now the biggest player in the streaming market, is on the mid-to-low end of the compensation spectrum.
The Payment Pipeline
How do companies like Spotify calculate the amount paid out to license holders? Here’s a look at their payout process:

As this chart reveals, dollars earned from streaming still don’t tell the full story of how much artists receive at the end of the line. This amount is influenced by whether or not the performer has a record deal, and if other contributors have a stake in the recorded work.
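The exact formulas are not public, but most major services, Spotify included, use some form of pro-rata model: subscription and ad revenue goes into a pool, rights holders are paid in proportion to their share of total streams, and contractual splits are applied after that. The sketch below is a hypothetical illustration of that flow; every rate and split in it is an assumption for demonstration, not any service's actual terms.

```python
# Hypothetical pro-rata payout sketch. Every rate and split below is an
# illustrative assumption, not any service's actual contractual terms.

def estimate_artist_payout(
    revenue_pool: float,          # total subscription + ad revenue for the period
    artist_streams: int,          # streams attributed to the artist's catalog
    total_streams: int,           # all streams on the platform in the period
    rights_holder_share: float = 0.70,      # portion paid out to rights holders (assumed)
    label_cut: float = 0.50,                # label's share under a hypothetical record deal
    other_contributors_cut: float = 0.10,   # producers, co-writers, etc. (assumed)
) -> float:
    """Estimate what reaches the performer under a simplified pro-rata model."""
    # 1. The service keeps its share; the remainder funds the rights-holder pool.
    rights_holder_pool = revenue_pool * rights_holder_share
    # 2. The pool is divided in proportion to the artist's share of total streams.
    gross_royalties = rights_holder_pool * (artist_streams / total_streams)
    # 3. Contractual cuts for the label and other contributors come off the top.
    return gross_royalties * (1 - label_cut) * (1 - other_contributors_cut)

# Example: 1 million streams out of 10 billion, against a $100M revenue pool.
print(f"${estimate_artist_payout(100_000_000, 1_000_000, 10_000_000_000):,.2f}")  # -> $3,150.00
```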
The Pressure is Heating Up
When Spotify was a scrappy startup providing a much needed revenue stream to the music industry, labels were temporarily willing to accept lower streaming rates.
But now that Spotify is a public company, and tech giants like Apple and Amazon are in the picture, a growing chorus of industry players will likely dial up the pressure to increase compensation rates.
AI
AI vs. Humans: Which Performs Certain Skills Better?
Progress in computing power, data availability, and algorithmic efficiency has led to rapid gains in AI performance relative to humans.
With ChatGPT’s explosive rise, AI has been making its presence felt among the masses, especially in traditional bastions of human capability—reading comprehension, speech recognition, and image identification.
In fact, in the chart above it’s clear that AI has surpassed human performance in quite a few areas, and looks set to overtake humans elsewhere.
How Performance Gets Tested
Using data from Contextual AI, we visualize how quickly AI models have started to beat database benchmarks, as well as whether or not they’ve yet reached human levels of skill.
Each database is devised around a certain skill, like handwriting recognition, language understanding, or reading comprehension, and each percentage score is measured against two benchmarks:
- 0%, or the “maximally performing baseline”: equal to the best-known performance by AI at the time of dataset creation.
- 100%: equal to human performance on the dataset.
By creating a scale between these two points, the progress of AI models on each dataset can be tracked. Each point on a line signifies a new best result, and as a line trends upward, AI models get closer and closer to matching human performance.
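In other words, a raw benchmark score is rescaled so that the baseline sits at 0% and human performance at 100%. Here is a minimal sketch of that normalization; the function name and example numbers are illustrative, not taken from Contextual AI's data.

```python
# Sketch of the 0-100% scale described above. Example numbers are hypothetical.

def normalized_score(model_score: float, baseline: float, human: float) -> float:
    """Rescale a raw benchmark score so that the baseline at dataset creation
    maps to 0% and human performance maps to 100%."""
    return 100 * (model_score - baseline) / (human - baseline)

# Hypothetical benchmark: humans score 89.8, the AI baseline at creation was 70.0,
# and a new model scores 88.5 -> roughly 93% of the way to human performance.
print(f"{normalized_score(88.5, 70.0, 89.8):.1f}%")
```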
Below is a table of when AI started matching human performance across all eight skills:
| Skill | Year AI Matched Human Performance | Database Used |
|---|---|---|
| Handwriting Recognition | 2018 | MNIST |
| Speech Recognition | 2017 | Switchboard |
| Image Recognition | 2015 | ImageNet |
| Reading Comprehension | 2018 | SQuAD 1.1, 2.0 |
| Language Understanding | 2020 | GLUE |
| Common Sense Completion | 2023 | HellaSwag |
| Grade School Math | N/A | GSM8K |
| Code Generation | N/A | HumanEval |
A key observation from the chart is how much progress has been made since 2010. In fact, many of these databases—like SQuAD, GLUE, and HellaSwag—didn’t exist before 2015.
In response to benchmarks being rendered obsolete, some of the newer databases are constantly being updated with new and relevant data points. This is why AI models technically haven’t matched human performance in some areas (grade school math and code generation) yet—though they are well on their way.
What’s Led to AI Outperforming Humans?
But what has led to such speedy growth in AI’s abilities in the last few years?
Thanks to revolutions in computing power, data availability, and better algorithms, AI models are faster, have bigger datasets to learn from, and are optimized for efficiency compared to even a decade ago.
This is why headlines routinely talk about AI language models matching or beating human performance on standardized tests. In fact, a key problem for AI developers is that their models keep beating the benchmark databases devised to test them, but still somehow fail real-world tests.
Since further computing and algorithmic gains are expected in the next few years, this rapid progress is likely to continue. However, the next potential bottleneck to AI’s progress might not be AI itself, but a lack of data for models to train on.