Mapped: The World’s Rocket Launch Sites
From Sputnik 1 to today’s massive satellite constellations, every object in space was launched from just a handful of locations.
The map above, from BryceTech, is a comprehensive look at the world’s spaceports (both orbital and sub-orbital) as well as ballistic missile test sites.
The World’s Major Spaceports
Though the graphic above details many types of rocket launch sites, we’ll focus on the major sites sending satellites and passengers into suborbital flight, orbit, and beyond.
Launch Facility | Location | Country |
---|---|---|
Cape Canaveral Space Force Station | Florida | 🇺🇸 U.S. |
Cape Canaveral Spaceport | Florida | 🇺🇸 U.S. |
Kennedy Space Center | Florida | 🇺🇸 U.S. |
Cecil Field Spaceport | Florida | 🇺🇸 U.S. |
Colorado Air & Space Port | Colorado | 🇺🇸 U.S. |
Vandenberg Air Force Base | California | 🇺🇸 U.S. |
Mojave Air and Space Port | California | 🇺🇸 U.S. |
Oklahoma Air & Space Port | Oklahoma | 🇺🇸 U.S. |
Poker Flat Research Range | Alaska | 🇺🇸 U.S. |
Pacific Spaceport Complex | Alaska | 🇺🇸 U.S. |
Spaceport America | New Mexico | 🇺🇸 U.S. |
Launch Site One (Corn Ranch) | Texas | 🇺🇸 U.S. |
Houston Spaceport | Texas | 🇺🇸 U.S. |
Midland Air & Space Port | Texas | 🇺🇸 U.S. |
SpaceX Development and Test Facility | Texas | 🇺🇸 U.S. |
SpaceX Starbase | Texas | 🇺🇸 U.S. |
Spaceport Camden | Georgia | 🇺🇸 U.S. |
Mid-Atlantic Regional Spaceport | Virginia | 🇺🇸 U.S. |
Wallops Flight Facility | Virginia | 🇺🇸 U.S. |
Reagan Test Site | Kwajalein Atoll | 🇲🇭 Marshall Islands |
Naro Space Center | Outer Naro Island | 🇰🇷 South Korea |
Sohae Satellite Launching Station | North Pyongan Province | 🇰🇵 North Korea |
Kapustin Yar | Astrakhan Oblast | 🇷🇺 Russia |
Plesetsk Cosmodrome | Arkhangelsk Oblast | 🇷🇺 Russia |
Vostochny Cosmodrome | Amur Oblast | 🇷🇺 Russia |
Yasny Launch Base | Orenburg Oblast | 🇷🇺 Russia |
Arnhem Space Centre | Northern Territory | 🇦🇺 Australia |
Whalers Way Orbital Launch Complex | South Australia | 🇦🇺 Australia |
Koonibba Test Range | South Australia | 🇦🇺 Australia |
Bowen Orbital Spaceport | Queensland | 🇦🇺 Australia |
Rocket Lab Launch Complex 1 | Wairoa District | 🇳🇿 New Zealand |
Baikonur Cosmodrome | Baikonur | 🇰🇿 Kazakhstan |
Space Port Oita | Ōita | 🇯🇵 Japan |
Tanegashima Space Center | Kagoshima | 🇯🇵 Japan |
Uchinoura Space Center | Kagoshima | 🇯🇵 Japan |
Taiki Aerospace Research Field | Hokkaido | 🇯🇵 Japan |
Hokkaido Spaceport | Hokkaido | 🇯🇵 Japan |
Ryori Launch Site | Iwate | 🇯🇵 Japan |
Sonmiani Satellite Launch Center | Balochistan | 🇵🇰 Pakistan |
Integrated Test Range | Odisha | 🇮🇳 India |
Thumba Equatorial Rocket Launching Station | Kerala | 🇮🇳 India |
Satish Dhawan Space Centre | Sriharikota | 🇮🇳 India |
Guiana Space Centre | Kourou | 🇬🇫 French Guiana |
Barreira do Inferno Launch Center | Rio Grande do Norte | 🇧🇷 Brazil |
Alcântara Space Center | Maranhão | 🇧🇷 Brazil |
Stasiun Peluncuran Roket | West Java | 🇮🇩 Indonesia |
Jiuquan Satellite Launch Center | Gansu Province | 🇨🇳 China |
Taiyuan Satellite Launch Center | Shanxi Province | 🇨🇳 China |
Wenchang Spacecraft Launch Site | Hainan Province | 🇨🇳 China |
Xichang Satellite Launch Center | Sichuan Province | 🇨🇳 China |
Palmachim Airbase | Central District | 🇮🇱 Israel |
Imam Khomeini Space Launch Terminal | Semnan | 🇮🇷 Iran |
Qom Launch Facility | Qom | 🇮🇷 Iran |
El Arenosillo Test Centre | Huelva | 🇪🇸 Spain |
Spaceport Sweden | Lapland | 🇸🇪 Sweden |
Esrange Space Center | Lapland | 🇸🇪 Sweden |
Andøya Space | Nordland | 🇳🇴 Norway |
SaxaVord Spaceport | Shetland Islands | 🇬🇧 UK |
Sutherland Spaceport | Sutherland | 🇬🇧 UK |
Western Isles Spaceport | Outer Hebrides | 🇬🇧 UK |
Spaceport Machrihanish | Campbeltown | 🇬🇧 UK |
Prestwick Spaceport | Glasgow | 🇬🇧 UK |
Snowdonia Spaceport | North West Wales | 🇬🇧 UK |
Spaceport Cornwall | Cornwall | 🇬🇧 UK |
Orbex LP1 | Moray | 🇬🇧 UK |
Spaceport Nova Scotia | Nova Scotia | 🇨🇦 Canada |
Editor’s note: The above table includes all sites that are operational or under construction as of the publishing date.
The list above covers fixed locations, and does not include SpaceX’s autonomous spaceport drone ships. There are currently three active drone ships—one based near Los Angeles, and the other two based at Port Canaveral, Florida.
Two of the most famous launch sites on the list are the Baikonur Cosmodrome (Kazakhstan) and Cape Canaveral (United States). The former was constructed as the base of operations for the Soviet space program and was the launch point for Earth’s first artificial satellite, Sputnik 1. The latter was NASA’s primary base of operations, and the first crewed lunar-landing mission launched from the adjacent Kennedy Space Center in 1969.
The global roster of spaceports has grown immensely since Baikonur and Cape Canaveral were the only game in town. Now numerous countries have the ability to launch satellites, and many more are getting in on the action.
Wenchang Spacecraft Launch Site, on the island of Hainan, is China’s newest launch location. The site recorded its first successful launch in 2016.
Location, Location
One interesting quirk of the map above is the lack of spaceports in Europe. Europe’s space missions are actually launched from the Guiana Space Centre in South America. Known as Europe’s Spaceport, it has been operating in French Guiana since 1968.
Low-altitude launch locations near the equator are the most desirable, as far less energy is required to take a spacecraft from the surface to an equatorial, geostationary orbit. Earth’s rotation gives an eastward launch a head start in speed that is greatest at the equator, and launching from near-zero latitude also avoids the costly plane-change maneuver otherwise needed to reach an equatorial orbit.
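To put a rough number on that head start, the eastward surface speed from Earth’s rotation scales with the cosine of latitude. Here is a minimal Python sketch, with approximate latitudes assumed for a few sites from the table above:

```python
import math

SIDEREAL_DAY_S = 86_164          # one full Earth rotation, in seconds
EQUATORIAL_RADIUS_M = 6_378_137  # Earth's equatorial radius, in meters

def rotation_boost_m_s(latitude_deg: float) -> float:
    """Eastward surface speed contributed by Earth's rotation at a latitude."""
    circumference = 2 * math.pi * EQUATORIAL_RADIUS_M
    return circumference / SIDEREAL_DAY_S * math.cos(math.radians(latitude_deg))

# Approximate latitudes, assumed here for illustration
for site, lat in [("Guiana Space Centre", 5.2),
                  ("Cape Canaveral", 28.5),
                  ("Baikonur Cosmodrome", 45.9),
                  ("Plesetsk Cosmodrome", 62.9)]:
    print(f"{site}: ~{rotation_boost_m_s(lat):.0f} m/s")
```

A few hundred meters per second is modest next to the roughly 9 km/s of total velocity a rocket must supply to reach orbit, but it translates into real propellant savings, which is why near-equatorial sites like Kourou are so prized.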
Islands and coastal areas are also common locations for launch sites. Since the surrounding open water is uninhabited, there is minimal risk of harm from falling debris in the event of a launch failure.
As demand for satellites and space exploration grows, the number of launch locations will continue to grow as well.
Charted: The Exponential Growth in AI Computation
In eight decades, artificial intelligence has moved from purview of science fiction to reality. Here’s a quick history of AI computation.

Electronic computers first appeared in the 1940s and had barely been around for a decade before experiments with AI began. Now we have AI models that can write poetry and generate images from text prompts. But what led to such exponential growth in such a short time?
This chart from Our World in Data tracks the history of AI through the amount of computation power used to train an AI model, using data from Epoch AI.
The Three Eras of AI Computation
In the 1950s, American mathematician Claude Shannon trained a robotic mouse called Theseus to navigate a maze and remember its course—the first apparent artificial learning of any kind.
Theseus was built on just 40 floating point operations (FLOPs), a unit of measurement that counts the number of basic arithmetic operations (addition, subtraction, multiplication, or division) a computer performs. Training compute is measured in total FLOPs, while FLOPs per second (FLOPS) measures a processor’s speed.
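For a concrete sense of the unit, consider a toy example: computing the dot product of two length-n vectors takes n multiplications and n − 1 additions, or 2n − 1 FLOPs in total. A quick sketch:

```python
def dot_product_flops(n: int) -> int:
    """Total FLOPs for a dot product of two length-n vectors:
    n multiplications plus n - 1 additions."""
    return 2 * n - 1

print(dot_product_flops(3))   # 5 FLOPs for a 3-element dot product
print(dot_product_flops(10))  # 19 FLOPs
```

Training a modern model repeats operations like these trillions upon trillions of times, which is why the table further below runs from 40 FLOPs all the way to billions of petaFLOPs.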
Computation power, availability of training data, and algorithms are the three main ingredients of AI progress. And for the first few decades of AI advances, compute, the computational power needed to train an AI model, grew according to Moore’s Law, doubling roughly every 18 to 24 months.
Period | Era | Compute Doubling |
---|---|---|
1950–2010 | Pre-Deep Learning | 18–24 months |
2010–2016 | Deep Learning | 5–7 months |
2016–2022 | Large-scale models | 11 months |
Source: “Compute Trends Across Three Eras of Machine Learning” by Sevilla et al., 2022.
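To see what those doubling periods imply over time, here is a small illustrative sketch; the formula is simply compounded doubling, not anything specific to the paper:

```python
def growth_factor(years: float, doubling_months: float) -> float:
    """How many times compute multiplies over a span, given a doubling period."""
    return 2 ** (years * 12 / doubling_months)

# A decade at Moore's-Law-style doubling vs. the deep learning era's pace
print(f"10 years @ 24-month doubling: ~{growth_factor(10, 24):,.0f}x")  # ~32x
print(f"10 years @ 6-month doubling:  ~{growth_factor(10, 6):,.0f}x")   # ~1,048,576x
```

Shrinking the doubling period from two years to six months turns a 32-fold decade into a million-fold one, which is the scale of jump discussed below.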
However, at the start of the Deep Learning Era, heralded by AlexNet (an image recognition AI) in 2012, that doubling timeframe shortened considerably to six months, as researchers invested more in computation and processors.
With the emergence of AlphaGo in 2015—a computer program that beat a human professional Go player—researchers identified a third era: that of the large-scale AI models, whose computation needs dwarf those of all previous AI systems.
Predicting AI Computation Progress
Looking back at just the last decade, compute has grown so tremendously that it’s difficult to comprehend.
For example, the compute used to train Minerva, an AI that can solve complex math problems, is nearly 6 million times that used to train AlexNet 10 years earlier.
Here’s a list of important AI models through history and the amount of compute used to train them.
AI | Year | FLOPs |
---|---|---|
Theseus | 1950 | 40 |
Perceptron Mark I | 1957–58 | 695,000 |
Neocognitron | 1980 | 228 million |
NetTalk | 1987 | 81 billion |
TD-Gammon | 1992 | 18 trillion |
NPLM | 2003 | 1.1 petaFLOPs |
AlexNet | 2012 | 470 petaFLOPs |
AlphaGo | 2016 | 1.9 million petaFLOPs |
GPT-3 | 2020 | 314 million petaFLOPs |
Minerva | 2022 | 2.7 billion petaFLOPs |
Note: One petaFLOP = one quadrillion FLOPs. Source: “Compute Trends Across Three Eras of Machine Learning” by Sevilla et al., 2022.
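As a quick back-of-the-envelope check on the Minerva comparison above, using the table’s own estimates:

```python
PETA = 1e15  # one petaFLOP = one quadrillion FLOPs

# Training compute from the table, converted to raw FLOPs
alexnet_flops = 470 * PETA    # AlexNet, 2012
minerva_flops = 2.7e9 * PETA  # Minerva, 2022

print(f"Minerva / AlexNet: ~{minerva_flops / alexnet_flops:,.0f}x")  # ~5,744,681x
```

The ratio works out to roughly 5.7 million, hence “nearly 6 million times” over a single decade.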
The result of this growth in computation, along with the availability of massive data sets and better algorithms, has yielded a lot of AI progress in seemingly very little time. Now AI doesn’t just match human performance in many areas; it beats it.
It’s difficult to say whether the same pace of computation growth will be maintained. Large-scale models require increasingly more compute to train, and if computation doesn’t continue to ramp up, progress could slow. Exhausting the data currently available for training AI models could also impede the development and implementation of new models.
However, with all the funding poured into AI recently, perhaps more breakthroughs are around the corner, like matching the computation power of the human brain.
Where Does This Data Come From?
Source: “Compute Trends Across Three Eras of Machine Learning” by Sevilla et al., 2022.
Note: Estimates of how long computation takes to double vary across research efforts, including Amodei and Hernandez (2018) and Lyzhov (2021). This article is based on our source’s findings; please see their full paper for further details. Furthermore, the authors are cognizant of the framing concerns in deeming an AI model “regular-sized” or “large-sized” and note that further research is needed in the area.
Methodology: The authors of the paper used two methods to determine the amount of compute used to train AI models: counting the number of operations and tracking GPU time. Both approaches have drawbacks, namely a lack of transparency in training processes and growing complexity as ML models scale.
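To illustrate the flavor of the two approaches, here is a hedged sketch using common heuristics rather than the paper’s exact procedures: the 6 × parameters × tokens rule of thumb for transformer-style training on the operation-counting side, and an assumed utilization factor on the GPU-time side. All cluster figures below are hypothetical:

```python
def compute_by_op_count(n_params: float, n_tokens: float) -> float:
    """Heuristic operation count for transformer-style training:
    ~6 FLOPs per parameter per training token (forward + backward pass)."""
    return 6 * n_params * n_tokens

def compute_by_gpu_time(n_gpus: int, days: float, peak_flops: float,
                        utilization: float = 0.3) -> float:
    """Hardware-side estimate: GPU count x wall-clock seconds x peak
    throughput x an assumed utilization factor (0.3 here is a guess)."""
    return n_gpus * days * 86_400 * peak_flops * utilization

# GPT-3-scale example: ~175B parameters trained on ~300B tokens
print(f"Op-count estimate: ~{compute_by_op_count(175e9, 300e9):.2e} FLOPs")

# Hypothetical cluster: 1,000 GPUs for 30 days at 312 TFLOPS peak
print(f"GPU-time estimate: ~{compute_by_gpu_time(1_000, 30, 312e12):.2e} FLOPs")
```

For GPT-3’s reported 175 billion parameters and roughly 300 billion training tokens, the operation-count heuristic gives about 3.15 × 10²³ FLOPs, consistent with the 314 million petaFLOPs listed in the table above.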