Deep Tech

The Future of Cloud Computing Is in Space — PowerBank and Orbit AI Show How

A breakdown of the mission aiming to turn space into the next layer of digital infrastructure.

Updated

January 8, 2026 6:32 PM

The Hubble Space Telescope, one of the first pieces of space infrastructure. PHOTO: UNSPLASH

PowerBank Corporation and Smartlink AI, the company behind Orbit AI, are preparing to send a very different kind of satellite into space. Their upcoming mission, scheduled for December 2025, aims to test what they call the world’s first “Orbital Cloud” — a system that moves parts of today’s digital infrastructure off the ground and into orbit. While satellites already handle GPS, TV signals and weather data, this project tries to do something bigger: turn space itself into a platform for computing, artificial intelligence (AI) and secure blockchain-based digital transactions. In essence, it marks the beginning of space-based cloud computing.

To understand why this matters, it is helpful to examine the limitations of our current systems. As AI tools grow more advanced, they require massive data centers that consume enormous amounts of electricity, especially for cooling. These facilities depend on national power grids, face regulatory constraints and are concentrated in just a few regions. Meanwhile, global connectivity still struggles with inequalities, censorship, congestion and geopolitical bottlenecks. The Orbital Cloud is meant to plug these gaps by building a computing and communication layer above Earth — a solar-powered, space-cooled network in Low Earth Orbit (LEO) that no single nation or company fully controls.

Orbit AI’s approach brings together two new systems. The first, called DeStarlink, is a decentralized satellite network designed for global internet-style connectivity and resilient communication. The second, DeStarAI, is a set of AI-focused in-orbit data centers placed directly on satellites, using space’s naturally cold environment instead of the energy-hungry cooling towers used on Earth. When these two ideas merge, the result is a floating digital layer where information can be transmitted, processed and verified without touching terrestrial infrastructure — a key shift in how AI workloads and cloud computing may be handled in the future.

PowerBank enters the picture by supplying the electricity and temperature-control technology needed to keep these satellites running. In orbit, sunlight is far more dependable than on the ground: no clouds, no storms and, in suitably chosen orbits, few or no periods where panels lie idle. PowerBank plans to provide high-efficiency solar arrays and adaptive thermal systems that help the satellites manage heat in orbit. This collaboration marks a shift for PowerBank, which is expanding from traditional solar and battery projects into digital infrastructure, AI energy systems and next-generation satellite technology.

Describing the ambition behind this move, Dr. Richard Lu, CEO of PowerBank, said: “The next frontier of human innovation isn't just in space exploration, it's in building the infrastructure of tomorrow above the Earth.” He pointed to a future market that could surpass US$700 billion, driven by orbital satellites, AI computing in space, blockchain verification and solar-powered data systems. Integrating solar energy with orbital computing, he said, could help create “a globally sovereign, AI-enabled digital layer in space, which is a system that can help power finance, communications and critical infrastructure.”

Orbit AI’s Co-Founder and CEO, Gus Liu, describes their satellites as deliberately autonomous and intelligent. “Orbit AI is creating the first truly intelligent layer in orbit — satellites that compute, verify and optimize themselves autonomously,” he said. “The Orbital Cloud turns space into a platform for AI, blockchain and global connectivity. By leveraging solar-powered compute payloads and decentralized verification nodes, we are opening an entirely new, potentially US$700+ billion-dollar market opportunity — one that combines energy, data and sovereignty to reshape industries from finance to government and Web3. PowerBank's expertise in advanced solar energy systems will be significant in supporting this initiative.”

This vision is not isolated. Earlier this year, Jeff Bezos echoed a similar idea at Italian Tech Week, saying: “We will be able to beat the cost of terrestrial data centres in space in the next couple of decades. These giant training clusters will be better built in space, because we have solar power there, 24/7 — no clouds, no rain, no weather. The next step is going to be data centres and then other kinds of manufacturing.” His comments reflect a growing industry belief that space-based data centers will eventually outperform those on Earth.

The idea gains traction because the advantages are practical. Space offers free, constant solar power. It provides natural cooling, which is one of the costliest parts of running data centers on Earth. And above all, satellites in low-Earth orbit operate beyond national firewalls and political boundaries, making them more resilient to outages, censorship and conflict. For industries that rely heavily on secure connectivity and real-time data — finance, defense, AI, blockchain networks and global cloud providers — this could become an important alternative layer of infrastructure.

The upcoming Genesis-1 satellite is designed as a demonstration mission. It will test an Ethereum wallet, run a blockchain verification node and perform simple AI tasks in orbit. If the technology works as expected, Orbit AI plans to add several more satellites in 2026, expand into larger networks by 2027 and 2028 and begin full commercial operations by the decade’s end.
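Technical details of Genesis-1’s verification node have not been published. As a rough illustration of what “running a blockchain verification node” means, the sketch below checks the core invariant of any hash-chained ledger: each block must reference the cryptographic hash of its parent. This is a deliberate simplification (Ethereum uses RLP encoding, Keccak hashing and full consensus rules, none of which are modeled here), and the function and field names are invented for the example.

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    """Deterministically hash a block's contents (a simplified stand-in
    for Ethereum's RLP + Keccak encoding)."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def verify_chain(chain: list[dict]) -> bool:
    """Check that each block's 'parent' field matches the hash of the
    previous block -- the basic invariant a verification node enforces."""
    for prev, curr in zip(chain, chain[1:]):
        if curr["parent"] != block_hash(prev):
            return False
    return True

# Toy three-block chain.
genesis = {"height": 0, "parent": None, "txs": []}
b1 = {"height": 1, "parent": block_hash(genesis), "txs": ["tx-a"]}
b2 = {"height": 2, "parent": block_hash(b1), "txs": ["tx-b"]}

print(verify_chain([genesis, b1, b2]))  # True
b1["txs"].append("forged-tx")           # tamper with history
print(verify_chain([genesis, b1, b2]))  # False: b2 no longer matches b1
```

The point of the sketch is that verification is cheap relative to computation: a node only recomputes hashes and compares them, which is the kind of lightweight workload a small demonstration satellite could plausibly host.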

To build this system, Orbit AI plans to source technologies from some of the world’s most influential players: NVIDIA for AI processors, the Ethereum Foundation for blockchain tools, Galaxy Space and SparkX Satellite for satellite components, Galactic Energy for launch systems and AscendX Aerospace for advanced materials.

If successful, the Orbital Cloud could become the first step toward a world where part of humanity’s data, computing power and digital services run not in massive buildings on Earth, but in clusters of autonomous satellites illuminated by constant sunlight. For now, the journey begins with a single launch — a test satellite aiming to show that space can do far more than connect us. It may soon help power the systems that run our economies, technologies and global communication networks.


Artificial Intelligence

The Real Cost of Scaling AI: How Supermicro and NVIDIA Are Rebuilding Data Center Infrastructure

The hidden cost of scaling AI: infrastructure, energy, and the push for liquid cooling.

Updated

January 8, 2026 6:31 PM

The inside of a data center, with rows of server racks. PHOTO: FREEPIK

As artificial intelligence models grow larger and more demanding, the quiet pressure point isn’t the algorithms themselves—it’s the AI infrastructure that has to run them. Training and deploying modern AI models now requires enormous amounts of computing power, which creates a different kind of challenge: heat, energy use and space inside data centers. This is the context in which Supermicro and NVIDIA’s collaboration on AI infrastructure begins to matter.

Supermicro designs and builds large-scale computing systems for data centers. It has now expanded its support for NVIDIA’s Blackwell generation of AI chips with new liquid-cooled server platforms built around the NVIDIA HGX B300. The announcement isn’t just about faster hardware. It reflects a broader effort to rethink how AI data center infrastructure is built as facilities strain under rising power and cooling demands.

At a basic level, the systems are designed to pack more AI chips into less space while using less energy to keep them running. Instead of relying mainly on air cooling (fans, chillers and large amounts of electricity), these liquid-cooled AI servers circulate liquid directly across critical components. That approach removes heat more efficiently, allowing servers to run denser AI workloads without overheating or wasting energy.
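The announcement does not publish thermal figures, but the physics behind the shift is straightforward: per unit volume, liquid coolants carry far more heat than air. The back-of-the-envelope comparison below uses standard room-temperature property values for air and water, not vendor data, via the relation Q = ρ · c_p · ΔT.

```python
# Heat carried by a unit volume of coolant per degree of temperature
# rise: Q = rho * c_p * dT. Property values are standard
# room-temperature figures, not numbers from Supermicro or NVIDIA.

RHO_AIR, CP_AIR = 1.2, 1005        # kg/m^3, J/(kg*K)
RHO_WATER, CP_WATER = 997, 4186    # kg/m^3, J/(kg*K)

vol_heat_air = RHO_AIR * CP_AIR        # J/(m^3*K)
vol_heat_water = RHO_WATER * CP_WATER  # J/(m^3*K)

print(f"air:   {vol_heat_air:,.0f} J/(m^3*K)")
print(f"water: {vol_heat_water:,.0f} J/(m^3*K)")
print(f"ratio: {vol_heat_water / vol_heat_air:,.0f}x")
```

The ratio works out to several thousand to one, which is why piping liquid directly across hot components displaces so much fan and chiller capacity.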

Why does that matter outside a data center? Because AI doesn’t scale in isolation. As models become more complex, the cost of running them rises quickly, not just in hardware budgets, but in electricity use, water consumption and physical footprint. Traditional air-cooling methods are increasingly becoming a bottleneck, limiting how far AI systems can grow before energy and infrastructure costs spiral.

This is where the Supermicro–NVIDIA partnership fits in. NVIDIA supplies the computing engines—the Blackwell-based GPUs designed to handle massive AI workloads. Supermicro focuses on how those chips are deployed in the real world: how many GPUs can fit in a rack, how they are cooled, how quickly systems can be assembled and how reliably they can operate at scale in modern data centers. Together, the goal is to make high-density AI computing more practical, not just more powerful.

The new liquid-cooled designs are aimed at hyperscale data centers and so-called AI factories—facilities built specifically to train and run large AI models continuously. By increasing GPU density per rack and removing most of the heat through liquid cooling, these systems aim to ease a growing tension in the AI boom: the need for more computing power without an equally dramatic rise in energy waste.

Just as important is speed. Large organizations don’t want to spend months stitching together custom AI infrastructure. Supermicro’s approach packages compute, networking and cooling into pre-validated data center building blocks that can be deployed faster. In a world where AI capabilities are advancing rapidly, time to deployment can matter as much as raw performance.

Stepping back, this development says less about one product launch and more about a shift in priorities across the AI industry. The next phase of AI growth isn’t only about smarter models—it’s about whether the physical infrastructure powering AI can scale responsibly. Efficiency, power use and sustainability are becoming as critical as speed.