Starcloud aims to build a 4 km x 4 km data center satellite in orbit
Can AI's voracious demand for massive data centers be met off-planet? Tech firms are eyeing low-Earth orbit as an option, but experts warn that substantial engineering challenges remain unsolved.
Explosive demand for, and investment in, generative AI platforms such as ChatGPT has created an unprecedented need for computing power, requiring vast tracts of land and as much electricity as millions of households consume. As a result, many data centers increasingly run on unsustainable energy sources such as natural gas, with tech companies arguing that renewables cannot meet their soaring power needs or provide the stability required for reliable operation.
In response, executives such as Elon Musk and Jeff Bezos are advocating launching data centers into orbit, where they would receive near-continuous sunlight, far more than terrestrial solar panels can collect. Bezos, the founder of Amazon and owner of Blue Origin, said earlier this year that he expects gigawatt-class data centers to be operating in space within 10 to 20 years.
Google is pursuing its own vision of a space data center through a pilot initiative, Project Suncatcher, which plans to launch two prototype satellites equipped with its TPU AI chips by 2027 to test how they perform in orbit. But one of the most notable advances in space-based data processing this year was the launch of a single H100 graphics processing unit by Starcloud, an Nvidia-backed company. That is far less computing power than modern AI systems require: OpenAI is estimated to use around a million such chips.
For data centers to function effectively in orbit, many unresolved issues must be tackled. “From an academic research standpoint, [space data centers] are still far from being production-ready,” remarks Benjamin Lee from the University of Pennsylvania, USA.
According to Lee, one of the biggest hurdles is the sheer scale required to meet AI's computational needs. That means not only generating enough power from solar panels, which demand an enormous surface area, but also radiating away the heat the chips produce, the only feasible way to shed heat in a vacuum. "We can't use cold air and evaporative cooling like we do on Earth," Lee says.
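The scale of the cooling problem can be sketched with the Stefan-Boltzmann law, which sets how much heat a surface can radiate into space. The numbers below (radiator temperature, emissivity, a 1-gigawatt heat load) are illustrative assumptions, not figures from Starcloud or the researchers quoted here:

```python
# Rough sketch: radiator area needed to reject data-center waste heat in vacuum.
# All values are illustrative assumptions, not design figures.

SIGMA = 5.67e-8        # Stefan-Boltzmann constant, W / (m^2 K^4)
EMISSIVITY = 0.9       # assumed emissivity of the radiator surface
T_RADIATOR = 300.0     # assumed radiator temperature, kelvin (about 27 C)
T_SPACE = 3.0          # effective background temperature of deep space, kelvin

HEAT_LOAD_W = 1e9      # 1 gigawatt of waste heat, roughly the chips' power draw

# Net power radiated per square metre of single-sided radiator surface
flux = EMISSIVITY * SIGMA * (T_RADIATOR**4 - T_SPACE**4)
area_m2 = HEAT_LOAD_W / flux

print(f"Radiating flux: {flux:.0f} W/m^2")
print(f"Area needed for 1 GW of heat: {area_m2 / 1e6:.1f} km^2")
# Roughly 400 W/m^2, so on the order of a couple of square kilometers
# of radiator per gigawatt of heat at these assumed temperatures.
```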
"You need square kilometers just for power generation and square kilometers more for cooling," he adds. "These structures grow very quickly. Once you are talking about capacity on the order of 1,000 megawatts, that translates into a considerable area in orbit." Indeed, Starcloud plans to build a 5,000-megawatt data center spread over 16 square kilometers, roughly 400 times the area of the solar panels on the International Space Station.
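Those figures are roughly consistent with a simple solar-collection estimate. The sketch below assumes the typical solar flux above the atmosphere and an assumed cell efficiency; neither number comes from Starcloud:

```python
# Rough sketch: how much power 16 km^2 of solar panels could produce in orbit.
# Assumed values, illustrative only.

SOLAR_FLUX_W_M2 = 1360.0   # solar irradiance above the atmosphere, W/m^2
CELL_EFFICIENCY = 0.25     # assumed solar-cell efficiency
AREA_KM2 = 16.0            # a 4 km x 4 km array

area_m2 = AREA_KM2 * 1e6
power_w = area_m2 * SOLAR_FLUX_W_M2 * CELL_EFFICIENCY

print(f"Estimated output: {power_w / 1e9:.1f} GW")
# About 5.4 GW, in the same range as the planned 5,000 megawatts.
```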
Lee believes several promising technologies could help reduce these requirements. Krishna Muralidharan at the University of Arizona is investigating thermoelectric devices that convert waste heat back into electricity, which would improve the efficiency of chips operating in space. "It's not a matter of feasibility, it's a challenge," says Muralidharan. "For now, we can rely on large thermal radiator panels, but ultimately we will need more sophisticated solutions."
Space also poses hazards with no real equivalent on Earth. High-energy radiation, for instance, can strike computer chips, causing errors and corrupting calculations. "Everything will slow down," Lee cautions. "A chip in space might perform worse than the same chip on Earth because of the recalibration and error correction required."
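One standard way spacecraft electronics cope with radiation-induced bit flips is redundancy, which illustrates the performance cost Lee describes. The toy sketch below shows triple modular redundancy, a generic technique rather than anything Starcloud or Nvidia have announced: the same value is kept in three copies and a majority vote masks a single corrupted copy, at the price of extra work.

```python
# Toy sketch of triple modular redundancy (TMR), a common technique in
# radiation-tolerant electronics. Keeping everything in triplicate and voting
# masks a single bit flip, but costs extra computation and storage, which is
# the kind of slowdown Lee describes. Illustrative only.

def majority_vote(a: int, b: int, c: int) -> int:
    """Bitwise majority of three redundant copies of a value."""
    return (a & b) | (a & c) | (b & c)

original = 0b1011_0010                    # the value the chip intended to store
copies = [original, original, original]   # three redundant copies
copies[1] ^= 0b0000_1000                  # radiation flips one bit in one copy

recovered = majority_vote(*copies)
print(f"corrupted copy: {copies[1]:#010b}")
print(f"recovered:      {recovered:#010b} (matches original: {recovered == original})")
```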
To operate at this scale, thousands of satellites would need to work in tandem, Muralidharan notes, which requires extremely precise laser links both between the orbiting data centers and down to Earth, where the atmosphere can distort the signals. Even so, he is optimistic that these challenges can be overcome. "The real question is not if, but when," he says.
Another open question is whether AI will still need such vast computing resources by the time orbital data centers could be built, particularly if hoped-for advances in AI do not keep pace with the computing power now being poured into it. "Training requirements may well peak or stabilize, and demand for large-scale data centers would likely follow suit," Lee says.
Even in that scenario, Muralidharan sees potential uses for space-based data centers, such as supporting exploration beyond Earth and monitoring the planet from orbit.
Source: www.newscientist.com
