AI in Space: Start at the Edge, Build for the Mission

Artificial intelligence is rapidly reshaping the future of space exploration and operations, bringing computation closer to where data is generated. Drawing on decades of experience in edge computing, Mark Papermaster outlines how mission-driven design, efficiency, and adaptability are enabling AI to transform satellites, spacecraft, and even the prospect of data centres beyond Earth.

I started my career working on the space shuttle program at IBM and thought my life endeavours would centre on space. Instead, my interest turned to compute devices and the technology that can bring computation to the masses. Those interests are now aligning with the realities of AI in space, both for edge computation in satellites and spacecraft today, and for future plans for massive data centres in space.

For years, AMD has built for “edge reality” – where power is constrained, connectivity isn’t guaranteed, and success is measured in real-time decisions, not theoretical peak performance. We’ve helped bring AI into PCs, industrial systems and embedded deployments by combining heterogeneous compute (CPUs, GPUs and adaptive compute), along with a strong software foundation. This “edge playbook” centres on a relentless focus on performance-per-watt and mission-critical reliability, allowing our partners to right-size performance for their specific needs.


We see space as the next and most demanding frontier for edge computing. The same fundamentals apply; they’re just amplified: strict power and thermal budgets, intermittent communications, expected long service lives, and a premium on reliability and autonomy. We are taking what we’ve learned enabling AI at the edge and extending it to space workloads with holistic co-design across hardware, software and systems so that on-board intelligence can be deployed efficiently, updated responsibly, and scaled across missions and form factors.

Orbiting data centres are emerging. As they do, AMD’s focus on adaptive, scalable platforms and an open ecosystem will help partners build robust, efficient end-to-end systems.

Space is the Ultimate Edge Environment

The immediate opportunity is on-board intelligence that senses, decides and acts as the mission happens. Space makes edge processing not just beneficial but often necessary: local AI becomes the backbone of operations in which every downlink is constrained, every millisecond of latency matters and connectivity can't be assumed.

Downlink is limited by bandwidth, power and communication windows, so sending everything to a terrestrial data centre is inefficient and slow. On-board AI can discard low-value data (like cloudy frames in Earth observation), can surface urgent events (like early wildfire signatures) and can enable resilient autonomy when connectivity is intermittent.
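The triage logic described above can be sketched in a few lines. This is an illustrative example only, not flight software: the `Frame` structure, field names and thresholds are all hypothetical, standing in for whatever an on-board classifier would actually produce.

```python
# Illustrative sketch of on-board frame triage: downlink urgent events first,
# keep routine frames, discard low-value cloudy ones. All names and thresholds
# here are assumptions for the example, not part of any real flight API.
from dataclasses import dataclass

@dataclass
class Frame:
    frame_id: int
    cloud_fraction: float   # 0.0 (clear) .. 1.0 (fully overcast)
    hotspot_score: float    # 0.0 .. 1.0, e.g. from a small on-board model

def triage(frames, cloud_limit=0.8, hotspot_limit=0.9):
    """Split frames into urgent (downlink first), routine, and discarded."""
    urgent, routine, discarded = [], [], []
    for f in frames:
        if f.hotspot_score >= hotspot_limit:
            urgent.append(f)        # possible early wildfire signature
        elif f.cloud_fraction >= cloud_limit:
            discarded.append(f)     # low-value cloudy frame, never downlinked
        else:
            routine.append(f)
    return urgent, routine, discarded

frames = [Frame(1, 0.95, 0.1), Frame(2, 0.2, 0.95), Frame(3, 0.3, 0.2)]
urgent, routine, discarded = triage(frames)
print([f.frame_id for f in urgent], [f.frame_id for f in discarded])  # [2] [1]
```

The point of the sketch is the ordering of the checks: urgency wins over cloud cover, so a hotspot seen through partial cloud is still downlinked first.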


Edge processing helps spacecraft and satellites interpret data locally and act on it. Instead of treating the platform as a sensor that just collects raw data for Earth, AI in space turns it into a system that prioritises, compresses and decides at the point of capture with agentic AI workflows.

And this AI can be adjusted across use cases and workflows, whether for a planetary rover navigating hazards, or a spacecraft flagging telemetry anomalies before they cascade and create failure.
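As a minimal sketch of the telemetry case, anomaly flagging can be as simple as a rolling z-score over a housekeeping channel. The channel name, window size and threshold below are assumptions for illustration, not drawn from any real spacecraft bus.

```python
# Illustrative sketch: flag telemetry samples that deviate sharply from the
# recent rolling window, so an anomaly is surfaced before it cascades.
import statistics

def flag_anomalies(samples, window=10, z_limit=3.0):
    """Return indices of samples far outside the recent window's distribution."""
    flagged = []
    for i in range(window, len(samples)):
        recent = samples[i - window:i]
        mean = statistics.fmean(recent)
        std = statistics.pstdev(recent)
        if std > 0 and abs(samples[i] - mean) / std > z_limit:
            flagged.append(i)
    return flagged

# A steady ~28 V bus voltage with a sudden drop in the final sample.
bus_voltage = [28.0, 28.1, 27.9, 28.0, 28.2, 28.0, 27.9, 28.1, 28.0, 28.1, 23.5]
print(flag_anomalies(bus_voltage))  # [10] -- the drop is flagged immediately
```

Real anomaly detectors are far more sophisticated, but the structure is the same: a local model of "normal" maintained on board, with only the exceptions escalated over the constrained link.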


The Intrigue of Data Centres in Space

Looking further out, the goal is making orbital compute a reality. With insatiable demand for more AI computing in data centres, several efforts are under way to deliver mass-scale computation in space, tapping into abundant solar power and the cold of deep space for heat rejection.

Large-scale orbital compute will ultimately be limited by power, thermal dissipation, radiation resilience and communications. Many concepts assume sun-synchronous "dawn-dusk" orbits to maximise solar availability and reduce thermal cycling, with low Earth orbit helping limit latency and radiation exposure. One of the most difficult problems to solve is how to reject heat from large-scale compute deployments. Space is a vacuum, so there is no convective cooling; excess heat must be conducted to radiators and radiated away.
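A back-of-envelope Stefan-Boltzmann estimate shows why radiators dominate the design. The emissivity, radiator temperature and heat load below are assumed, illustrative values, and the calculation ignores absorbed sunlight and Earth albedo, which make the real problem harder.

```python
# Back-of-envelope sketch: radiating area needed to reject compute heat in
# vacuum, via the Stefan-Boltzmann law. All input values are illustrative.
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W / (m^2 K^4)

def radiator_area_m2(heat_load_w, temp_k=300.0, emissivity=0.9):
    """Radiating surface area needed to reject heat_load_w by radiation alone."""
    flux = emissivity * SIGMA * temp_k ** 4   # W per m^2 of radiating surface
    return heat_load_w / flux

# Rejecting 1 MW of compute heat at a 300 K radiator needs roughly 2,400 m^2
# of radiating surface -- about a third of a football pitch per megawatt.
print(round(radiator_area_m2(1e6)))
```

That scaling, thousands of square metres of radiator per megawatt at practical temperatures, is what pushes designs toward the modular, distributed architectures described below.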

At meaningful scale, that reality drives architectural thinking toward modular, serviceable systems rather than the monolithic “data centre in a box.” It will be many elements operating together, each managing its own power generation and thermal dissipation while communicating through high-throughput links.

At large scale, that likely implies:

  • Modular deployments that can reach multi-megawatt-class capabilities over time.
  • High-speed, low-latency interconnect between elements (including optical links at substantially higher data rates and lower energy consumption than what’s commonly deployed today).
  • Reliability and replacement models that assume modules may have limited lifetimes and can be de-orbited and replaced, more like fleet operations than traditional one-off spacecraft.

AMD Offers the Building Blocks for What’s Next

AMD’s adaptive computing has supported space exploration for decades, including image processing and navigation acceleration for NASA’s Mars rovers and the Artemis II mission.

AMD’s approach is to make space AI buildable – not as a one-off engineering project, but as a repeatable platform journey. That starts with adaptive, scalable compute building blocks that can be right-sized to the mission: CPUs, GPUs, FPGAs and accelerator options where they make sense, paired with modular design philosophies.

This approach extends our established edge playbook to the stars. By providing the same platform consistency we’ve delivered for terrestrial deployments, we enable a repeatable journey where partners can evolve capabilities over time without re-architecting from scratch.

Just as important is openness. Space missions are assembled from many specialised suppliers, and no single vendor can (or should) dictate the full solution.

AMD is investing in open software and open standards so partners can integrate, tune and validate end-to-end systems with more choice and less friction. On the software side, AMD ROCm™ software is part of the open software stack for AI and HPC, designed to help developers move from kernels to applications on AMD accelerators. On the systems side, AMD is helping drive standards for open security, interconnect and infrastructure to ensure high-performance AI systems can scale without lock-in.

New Frontier: Scaling AI from Earth to Orbit

The most exciting part of this conversation is that AI is expanding where compute can create impact, including environments that are remote, constrained and mission critical. By putting intelligence closer to where data is generated, we reduce latency, save bandwidth, and improve mission outcomes. That’s true in factories, hospitals, and vehicles – and it’s true in space.

At AMD, we’ll keep doing what we do best: engineering for reality, co-optimising the full system and building technologies that scale efficiently – from Earth to orbit and beyond.

—–

The writer is Chief Technology Officer and Executive Vice President at AMD. He is responsible for driving the company’s end-to-end technology vision, strategy and product roadmap. The views expressed are personal and do not necessarily reflect the views of Raksha Anirveda.
