
This January, Elon Musk’s SpaceX asked the US Federal Communications Commission for permission to put up to a million data-center satellites into Earth orbit. The goal? To harness AI’s full potential without triggering an environmental catastrophe on Earth. But is it actually feasible?
SpaceX joins a growing list of high-tech firms touting the promise of orbital computing. Last year, Amazon founder Jeff Bezos said the tech industry will shift toward large-scale computing in space. Google plans to deploy data-processing satellites, aiming to launch a test constellation of 80 as early as next year. And last November Starcloud, a startup based in Washington State, launched a satellite carrying a high-powered Nvidia H100 GPU, the first orbital test of an advanced AI chip. The company envisions orbiting data centers matching the size of those on Earth by 2030.
Supporters argue that relocating data centers to space is logical. The present AI surge is overwhelming power grids and escalating the demand for water, essential for cooling the machinery. Local communities near extensive data centers are concerned about soaring costs for these resources due to rising demand, among other challenges.
In space, proponents say, the water and energy problems would ease. In permanently sunlit sun-synchronous orbits, space-based data centers would have constant access to solar power, while their waste heat would be radiated into the cold vacuum of space. Launch costs are falling, and mega-rockets like SpaceX’s Starship promise to push prices lower still, so there could come a point when moving the world’s data centers to space makes financial sense. Critics tell a different story, citing a range of technological obstacles, though some say these could be overcome in the near future. Here are four things that would need to happen to make space-based data centers a reality.
A method to dissipate heat
AI data centers generate a lot of heat, and space might seem like the perfect place to dump it without consuming vast quantities of water. The reality is more complicated. To run around the clock, a space-based data center would need to sit in a continuously sunlit orbit, circling the planet from pole to pole and never passing through Earth’s shadow. In that orbit, the hardware’s temperature would never drop below about 80 °C, which is too hot for electronics to operate safely over long periods.
Removing heat from such a system is surprisingly difficult. “Thermal management and cooling in space is generally a significant issue,” states Lilly Eichinger, CEO of the Austrian space technology startup Satellives.
On Earth, heat primarily dissipates through the natural process of convection, dependent on the movement of gases and liquids like air and water. In the vacuum of space, heat must be eliminated through the considerably less efficient method of radiation. Safely expelling the heat generated by the computers, and that absorbed from the sun, requires large radiative surfaces. The larger the satellite, the more difficult it becomes to effectively radiate heat into space.
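To see why those radiative surfaces get so big, here is a rough back-of-envelope sketch using the Stefan-Boltzmann law. The emissivity and radiator temperature below are illustrative assumptions, not figures from any of the companies mentioned:

```python
# Back-of-envelope radiator sizing via the Stefan-Boltzmann law:
# P = emissivity * sigma * A * (T^4 - T_env^4).
# All inputs are illustrative assumptions for a one-sided, idealized radiator.
SIGMA = 5.670e-8      # Stefan-Boltzmann constant, W/(m^2*K^4)
EMISSIVITY = 0.9      # typical for a dedicated spacecraft radiator coating
T_RADIATOR = 353.0    # radiator temperature, K (~80 deg C, per the text)
T_SPACE = 3.0         # effective deep-space background temperature, K

def radiator_area_m2(power_w: float) -> float:
    """Idealized radiator area needed to reject `power_w` of waste heat."""
    flux = EMISSIVITY * SIGMA * (T_RADIATOR**4 - T_SPACE**4)  # W per m^2
    return power_w / flux

# One megawatt of waste heat -- a small fraction of a gigawatt-scale facility
print(f"{radiator_area_m2(1e6):,.0f} m^2 of radiator per megawatt")
```

Under these assumptions, each megawatt of waste heat needs on the order of a thousand square meters of radiator, which is why gigawatt-scale designs end up with structures larger than the International Space Station.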
However, Yves Durand, former technology director at the European aerospace firm Thales Alenia Space, asserts that technology is already available to address this issue.
The company previously developed a system for large telecommunications satellites that uses a mechanical pump to circulate refrigerant through a network of tubing, carrying heat from inside a spacecraft to external radiators. Durand led a 2024 feasibility study on space-based data centers, which concluded that despite the remaining challenges, Europe could put gigawatt-scale data centers (comparable to the largest terrestrial facilities) into orbit before 2050. These would be far larger than anything SpaceX has proposed, with solar arrays hundreds of meters across—bigger than the International Space Station.
Computer chips that resist radiation exposure
The space around Earth is constantly pelted by cosmic particles and solar radiation. At Earth’s surface, humans and their electronics are shielded from this hazardous soup of charged particles by the planet’s atmosphere and magnetosphere, but that protection fades with altitude. Research indicates that flight crews face an increased risk of developing cancer because of their regular exposure to higher radiation levels at cruising altitudes, where the atmosphere is thinner and less protective.
Electronics in space face three forms of challenges stemming from elevated radiation levels, according to Ken Mai, a principal systems scientist in electrical and computer engineering at Carnegie Mellon University. Events termed single-event upsets can lead to bit flips and corrupt stored information when charged particles impact chips and memory devices. Over time, space electronics incur damage from ionizing radiation that diminishes their functionality. Additionally, a charged particle can strike a component in such a manner that it physically displaces atoms on the chip, inflicting lasting damage, Mai explains.
Traditionally, computers sent to space have required extensive testing and were specifically designed to withstand the intense radiation in Earth orbit. These radiation-hardened electronics cost far more, and their performance lags well behind that of cutting-edge devices used on Earth. Launching conventional chips is a gamble. Still, Durand notes that modern computer chips are built with technologies that are inherently more radiation-resistant than older systems. And in mid-March, Nvidia promoted hardware, including a new GPU, that it says is “bringing AI compute to orbital data centers.”
Nvidia’s head of edge AI marketing, Chen Su, informed MIT Technology Review that “Nvidia systems are fundamentally commercial off-the-shelf products, with radiation resilience achieved at the system level rather than solely through radiation-hardened silicon.” He added that satellite manufacturers enhance the chips’ durability through shielding, advanced software for error detection, and system architectures that blend consumer-grade devices with bespoke, hardened technologies.
Nonetheless, Mai points out that the data-processing chips are merely one concern. The data centers would also require memory and storage solutions, both of which are susceptible to damage from excessive radiation. Furthermore, operators would need the capacity to replace components or adapt when challenges arise. The practicality and cost-effectiveness of employing robots or astronaut missions for maintenance remains a significant uncertainty looming over the concept of large-scale orbiting data centers.
“You not only need to establish a data center in space that fulfills your immediate requirements; you need redundancy, extra components, and reconfigurability, so when something malfunctions, you can simply adjust your configuration and continue operating,” remarks Mai. “It’s a highly challenging situation because on one hand, you have abundant energy and power in space, but there are numerous drawbacks. It’s quite possible that these challenges may overshadow any benefits gained from placing a data center in space.”
Aside from the necessity for continuous upkeep, there exists the risk of catastrophic failure. During severe space weather events, satellites can experience an influx of radiation potent enough to disable all their electronics. The sun has just completed its most active phase of an 11-year cycle with relatively little impact on satellites. Nonetheless, experts caution that since the dawn of the space age, the Earth has yet to encounter the worst effects the sun has to offer. Many experts are skeptical that the cost-effective new space systems currently orbiting Earth are equipped to handle such scenarios.
A strategy to evade space debris
Both expansive orbiting data centers like those proposed by Thales Alenia Space and the mega-constellations of smaller satellites suggested by SpaceX present challenges for space sustainability experts. The region surrounding Earth is already quite cluttered with satellites. Starlink satellites alone execute hundreds of thousands of collision-avoidance maneuvers every year to evade debris and other spacecraft. An increasing number of objects in space raises the likelihood of a catastrophic collision that could scatter thousands of perilous fragments throughout orbit.
Large structures with expansive solar arrays would quickly accumulate damage from tiny pieces of space debris and meteoroids, which would gradually degrade their solar panels and generate still more debris in orbit. And operating one million satellites in low Earth orbit—the region of space at altitudes up to 2,000 kilometers—may be unfeasible unless every satellite there belongs to a unified network that lets them coordinate maneuvers around one another, Greg Vialle, founder of the orbital recycling company Lunexus Space, told MIT Technology Review.
“Approximately four to five thousand satellites can fit in a single orbital shell,” Vialle explains. “When accounting for all the shells in low Earth orbit, you arrive at a maximum close to 240,000 satellites.”
Moreover, spacecraft must have the capability to pass each other at safe distances to prevent collisions, he emphasizes.
“You also need to ensure that items can be transported to higher orbits and brought back down to deorbit,” he continues. “Thus, you require distances of at least 10 kilometers between the satellites to do so safely. Mega-constellations like Starlink can occupy tighter spaces due to inter-satellite communication. However, housing one million satellites around Earth would likely lead to a monopoly.”
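Vialle’s ceiling follows from simple arithmetic. The usable altitude band below is an illustrative assumption (not a figure he gives) chosen to show how roughly 10-kilometer shell spacing and four to five thousand satellites per shell land near his quoted maximum:

```python
# Rough reconstruction of Vialle's capacity estimate. The per-shell count
# is the midpoint of his "four to five thousand"; the usable altitude band
# is an illustrative assumption, not his stated input.
SATS_PER_SHELL = 4_500     # midpoint of "four to five thousand" per shell
SHELL_SPACING_KM = 10      # minimum safe separation between shells (per Vialle)
USABLE_BAND_KM = 530       # assumed usable slice of low Earth orbit

shells = USABLE_BAND_KM // SHELL_SPACING_KM   # number of distinct shells
capacity = shells * SATS_PER_SHELL            # total satellites across all shells
print(f"{shells} shells x {SATS_PER_SHELL:,} satellites = {capacity:,}")
```

With these inputs the total comes out just under 240,000—well short of the million satellites SpaceX has proposed.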
SpaceX would also probably want to routinely upgrade its orbiting data centers with newer technology. Replacing a million satellites roughly every five years would mean even heavier orbital traffic—and it could raise the rate at which debris reenters Earth’s atmosphere from around three or four pieces of junk per day to roughly one every three minutes, according to a group of astronomers who filed objections to SpaceX’s FCC application. Some scientists worry that falling debris could harm the ozone layer and disrupt Earth’s thermal balance.
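The astronomers’ reentry figure can be checked with quick steady-state arithmetic, assuming one reentry for every retired satellite:

```python
# Steady-state reentry rate for a fleet on a fixed replacement cycle.
# Assumes every retired satellite deorbits and burns up (one reentry each).
FLEET_SIZE = 1_000_000   # proposed constellation size
LIFETIME_YEARS = 5       # assumed replacement cycle

reentries_per_day = FLEET_SIZE / (LIFETIME_YEARS * 365)
minutes_between = 24 * 60 / reentries_per_day

print(f"~{reentries_per_day:.0f} reentries per day, "
      f"one every {minutes_between:.1f} minutes")
```

That works out to roughly 550 reentries a day, or one every two to three minutes—consistent with the figure cited in the objections.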
Cost-effective launch and assembly
The longer hardware endures in orbit, the greater the return on investment. However, for orbital data centers to be economically viable, companies must identify a relatively affordable method to place that hardware into orbit. SpaceX is banking on its forthcoming Starship mega-rocket, which will have the capacity to carry up to six times more payload than the current workhorse, Falcon 9. The Thales Alenia Space analysis concluded that if Europe were to construct its orbital data centers, it must develop a similarly powerful launcher.
Yet launch is only part of the equation. A large-scale orbital data center cannot fit inside a rocket—even a mega-rocket—so it will have to be assembled in orbit. That will likely require advanced robotic systems that do not yet exist. Several companies have run ground-based experiments with prototypes of such systems, but they remain far from practical use.
Durand argues that in the near term, smaller-scale data centers are likely to become an essential part of orbital infrastructure by processing imagery from Earth-observing satellites directly in space, without the need to transmit it to the ground first. That would be a substantial benefit to companies selling insights from space: many of these datasets are enormous, and competition for opportunities to downlink them to ground stations for processing is intensifying.
“The advantage of orbital data centers is that you can initiate with small servers and progressively expand to develop larger data centers,” Durand states. “You can leverage modularity. You can acquire knowledge progressively and systematically build industrial capacity in space. We possess all the technology, and the demand for space-based data processing infrastructure is tremendous, so it’s reasonable to consider it.”
However, smaller facilities are unlikely to alleviate the strain that terrestrial data centers impose on the planet’s water and energy resources. Critics believe that this vision of the future may take decades to come to fruition, if it ever materializes at all.