What The Data Says About Wealth Inequality
Wealth inequality has gone through peaks and troughs throughout history.
Most recently, in the decade between 2010 and 2020, the top 1% of U.S. households’ share of wealth rose from 28.6% to 31.2%.
However, when expressed in raw dollars, things begin to look different. Over the same period, the wealth of the top 1% went from approximately $17.5 trillion to $35 trillion, while the total wealth pool rose from $60 trillion to $112 trillion.
In other words, households in every wealth bracket amassed wealth during the same period, albeit at different rates.
| Household Wealth Percentile | Annual Growth in Wealth (CAGR) |
| --- | --- |
Source: The Federal Reserve
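The growth rates in the table can be reproduced from the dollar figures cited above. Here is a minimal sketch of the compound annual growth rate calculation; the function name is ours, and the inputs are the article’s figures:

```python
def cagr(start, end, years):
    """Compound annual growth rate: (end / start) ** (1 / years) - 1."""
    return (end / start) ** (1 / years) - 1

# Dollar figures cited in the article, in trillions (2010 -> 2020)
top_1_pct = cagr(17.5, 35.0, 10)        # wealth of the top 1% doubled
all_households = cagr(60.0, 112.0, 10)  # total wealth pool

print(f"Top 1%: {top_1_pct:.1%} | All households: {all_households:.1%}")
```

Both groups grew, but the top 1% compounded roughly a percentage point faster per year, which is enough to move its share of the total from 28.6% to 31.2% over a decade.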
Drivers Of Wealth Inequality
The longest bull market in history, which ran from March 2009 to February 2020, was a big driver of the recent divergence. The composition of wealth for the top 1% of U.S. households skews toward corporate equities and mutual funds, of which they collectively own $14 trillion. By contrast, the bottom 50% of households own just $0.16 trillion.
It’s often said that a stock market correction is long overdue. Since the top 1% of households clearly have the most skin in the game, wealth inequality would likely narrow if one were to transpire.
A Longer Term Look
Although wealth inequality is heavily discussed in today’s climate, the numbers have been higher in the past.
Wealth inequality, measured by the top 1% of U.S. households’ share of wealth, was at its peak at the start of the 20th century. Back then, a harsher, more rigid class divide and lower rates of upward mobility were common themes.
At its peak in 1910, the top 1% of U.S. households owned well over 40% of all wealth. The two World Wars and the Great Depression acted as catalysts against this concentration, and the years after WWII brought about some of the lowest levels of inequality seen in the modern era.
Wealth inequality has ebbed and flowed throughout history, but it has steadily crept back up in the last few decades. Today, its adverse effects continue to garner the attention of more people—including policy makers who are facing immense pressure to find a solution.
Chart: 30 Years of Wildfires in America
This summer, record-breaking droughts and relentless heat waves have fueled disastrous wildfires across the United States. It’s gotten so bad that all national forests in California were shut down for two weeks to stop the spread.
But how disastrous has this year been compared to previous ones? This graphic takes a historical look at the number of wildfires in America that have occurred each year since 1990, and the acres of forest land scorched during that period.
Total Wildland Fires and Acres from 1990 to 2020
In the U.S., an average of 70,000 wildfires burn through 5.8 million acres of land each year. But some years have been worse than others.
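The two long-run averages above imply a typical footprint per fire. A quick back-of-the-envelope check, using only the figures cited in the article:

```python
# Long-run U.S. averages cited above
fires_per_year = 70_000
acres_per_year = 5_800_000

# Average footprint of a single wildfire, in acres
avg_acres_per_fire = acres_per_year / fires_per_year

print(f"~{avg_acres_per_fire:.0f} acres burned per fire on average")
```

The average works out to roughly 83 acres per fire, though as the table shows, the distribution is highly skewed: a handful of large fires account for most of the acreage in any given year.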
| Year | # of Fires | # of Acres Burned |
| --- | --- | --- |
*note: 2021 figures as of September 3, 2021
One particularly bad year was 2006, which saw over 96,000 fires destroy 9.9 million acres of land across the country. It was the year of the Esperanza Fire in California, which burned 40,000 acres and cost $9 million in damages.
2015 was also a devastating year, with over 10.1 million acres destroyed across the country, making it the worst year on record in terms of acres burned.
Climate Change’s Role in Wildfires
Wildfires are only expected to worsen in the near future since warmer temperatures and drier climates allow the fires to grow quickly and intensely.
We’re already starting to see climate change impact the wildfire season. For instance, autumn is usually peak wildfire season for California, but this year, one of the largest fires on record started in mid-July, and is still burning as of the date of publication.
Editor’s note, September 20, 2021: In the post above, we said that California closed down all national parks for two weeks, starting August 31st. In fact, they closed down all national forests.
Visualizing the Typical Atlantic Hurricane Season
While the Atlantic hurricane season runs from June to late November, about 85% of activity happens during August, September, and October.
Explained: The Typical Atlantic Hurricane Season
On August 29, 2021, Hurricane Ida slammed into the state of Louisiana. With sustained winds of 150 mph, preliminary reports suggest it’s the fifth-strongest hurricane ever to hit the U.S. mainland.
Hurricane Ida made landfall right at the peak of the Atlantic hurricane season. Here’s a brief explainer on the basics of hurricanes, how storms are classified, and what a typical storm season looks like in the Atlantic Basin.
Let’s dive in.
Classifying a Storm
Hurricanes are intense tropical storms classified by their wind speed. What’s the difference between a hurricane, a typhoon, and a cyclone? They’re essentially the same phenomenon, named differently based on location:
- Hurricane is used for storms that formed in the North Atlantic, central North Pacific, and eastern North Pacific (impacting countries like the U.S.)
- Typhoon is used for storms in the Northwest Pacific (impacting countries like Japan)
- Tropical Cyclone is used for storms in the South Pacific and Indian Ocean (impacting countries like Fiji and India)
Since we’re focusing on the Atlantic, we’ll be using the term hurricane and/or storm throughout the rest of this article.
A storm needs to reach a certain wind speed before it is classified as a hurricane. Storms with sustained winds of:
- 39-73 mph are considered Tropical Storms
- 74-110 mph are considered Hurricanes
- 111+ mph are considered Major Hurricanes
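The thresholds above can be expressed as a small classifier. This is an illustrative sketch (the function name is ours), and it assumes the standard 39 mph floor below which a system is called a tropical depression:

```python
def classify_storm(wind_mph: float) -> str:
    """Classify a tropical system by sustained wind speed (mph),
    using the thresholds listed above."""
    if wind_mph >= 111:
        return "Major Hurricane"
    elif wind_mph >= 74:
        return "Hurricane"
    elif wind_mph >= 39:
        return "Tropical Storm"
    else:
        return "Tropical Depression"  # below tropical-storm strength

# Hurricane Ida's 150 mph winds put it well into Major Hurricane territory
print(classify_storm(150))
```

Note that "Major Hurricane" corresponds to Category 3 and above on the Saffir-Simpson scale, which subdivides the hurricane range further by wind speed.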
Breaking Down the Atlantic Hurricane Season
Generally, hurricanes form in the warm ocean waters of the central Atlantic and Gulf of Mexico, following westward trade winds and curving up toward the North American mainland. Hurricanes form when these specific elements come into play:
- A pre-existing weather disturbance such as a tropical wave
- Ocean water of at least 80°F (27°C) to a depth of at least 50 meters
- Thunderstorm activity
- Low wind shear (too much wind can remove the heat and moisture hurricanes use for fuel)
The Atlantic hurricane season technically lasts six months, beginning on June 1st and ending in late November. However, about 85% of activity happens during August, September, and October.
Each subregion in the Atlantic has its own unique climatology, which means peak seasons can vary from place to place—for example, south Florida sees the most hurricanes in October, while the entire Atlantic Basin’s peak season is early-to-mid September.
Climate Change and Hurricanes
According to the Center for Climate and Energy Solutions, it’s unclear whether climate change will increase the number of hurricanes per year.
However, research indicates that warmer weather and higher ocean temperatures will most likely lead to more intense storms, ultimately causing more damage and devastation.
» Want to learn more about climate change? Here’s an article on The Paris Agreement: Is The World’s Climate Action Plan on Track?