Artificial intelligence is now operating beyond Earth’s atmosphere. In a development that sounds a bit like science fiction, a satellite recently captured an image of Earth and then analyzed that image using AI while still in orbit. No waiting for data to be sent back to the ground first. The analysis happened right there in space.
The test took place on March 25, roughly 500 km above Alice Springs, Australia. An Earth-imaging satellite called Pelican-4 captured an image of an airport and then ran an AI model onboard to detect airplanes visible in the scene. Instead of behaving like a traditional satellite that simply collects photos and beams them down to Earth, the spacecraft performed its own analysis before the data ever left orbit.

Inside the satellite is an NVIDIA Jetson Orin computing module, a small but powerful piece of hardware designed for edge AI workloads. That module ran the object detection model immediately after the image was captured. The result was simple but important: the system successfully identified aircraft within moments of the photo being taken.
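The article doesn't describe the onboard model itself, but the basic shape of the task is familiar: take a captured frame, run a detector over it, and return bounding boxes for anything of interest. The sketch below is a deliberately naive stand-in, finding connected clusters of bright pixels in a toy image, where the real system would run a trained neural network on the Jetson; the function name and thresholds are illustrative assumptions, not Planet's code.

```python
from collections import deque

def detect_bright_regions(image, threshold=200, min_pixels=4):
    """Naive stand-in for an onboard detector: find connected clusters of
    bright pixels and return bounding boxes (row0, col0, row1, col1).
    A real system would run a trained neural network here instead."""
    rows, cols = len(image), len(image[0])
    seen = [[False] * cols for _ in range(rows)]
    boxes = []
    for r in range(rows):
        for c in range(cols):
            if image[r][c] < threshold or seen[r][c]:
                continue
            # Flood-fill (BFS) over the connected bright region.
            queue = deque([(r, c)])
            seen[r][c] = True
            pixels = []
            while queue:
                y, x = queue.popleft()
                pixels.append((y, x))
                for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ny, nx = y + dy, x + dx
                    if (0 <= ny < rows and 0 <= nx < cols
                            and not seen[ny][nx]
                            and image[ny][nx] >= threshold):
                        seen[ny][nx] = True
                        queue.append((ny, nx))
            if len(pixels) >= min_pixels:
                ys = [p[0] for p in pixels]
                xs = [p[1] for p in pixels]
                boxes.append((min(ys), min(xs), max(ys), max(xs)))
    return boxes

# Toy 8x8 "image" with two bright blobs against a dark background.
image = [[0] * 8 for _ in range(8)]
for r, c in [(1, 1), (1, 2), (2, 1), (2, 2),   # blob 1
             (5, 5), (5, 6), (6, 5), (6, 6)]:  # blob 2
    image[r][c] = 255

print(detect_bright_regions(image))  # prints two bounding boxes
```

The point of the sketch is the workflow, not the algorithm: everything from pixels to structured detections happens in one pass on the spacecraft's own compute, with no ground station in the loop.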
“This success is a glimpse into the future of what we call Planetary Intelligence at scale,” said Kiruthika Devaraj, VP of Avionics & Spacecraft Technology at Planet. “By running AI at the edge on the NVIDIA Jetson platform, we can help reduce the time between ‘seeing’ a change on Earth and a customer ‘acting’ on it, while simultaneously minimizing downlink latency and cost. This shift toward integrated AI at the edge is a technological leap that can help differentiate solutions like Planet’s Global Monitoring Service (GMS), providing valuable insights for our customers and enabling rapid response times when it matters most.”
Normally, satellites send massive amounts of raw imagery down to ground stations where computers process it later. That workflow can introduce delays measured in hours. Running AI directly onboard the spacecraft changes the equation. Instead of transmitting everything, the satellite can process imagery in orbit and send back insights instead of raw data.
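The downlink savings are easy to see with rough numbers. Both figures below are illustrative assumptions (the article gives no file sizes): a raw high-resolution frame might run to hundreds of megabytes, while a detection summary is a few hundred bytes of JSON.

```python
import json

# Assumed raw frame size for illustration only; actual Pelican imagery
# sizes are not given in the article.
RAW_IMAGE_BYTES = 300 * 1024 * 1024

# A hypothetical "insight" payload: just the detections, not the pixels.
insight = {
    "scene_id": "example-scene-001",  # made-up identifier
    "detections": [
        {"class": "aircraft", "confidence": 0.94, "bbox": [812, 450, 871, 502]},
        {"class": "aircraft", "confidence": 0.88, "bbox": [1204, 988, 1260, 1041]},
    ],
}
insight_bytes = len(json.dumps(insight).encode("utf-8"))

print(f"raw downlink:     {RAW_IMAGE_BYTES:,} bytes")
print(f"insight downlink: {insight_bytes:,} bytes")
print(f"reduction factor: ~{RAW_IMAGE_BYTES // insight_bytes:,}x")
```

Under these assumptions the insight payload is several orders of magnitude smaller than the raw image, which is what makes transmitting answers instead of pixels attractive over a constrained downlink.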

“This step with NVIDIA can help speed the pace of insight, reducing the time to potential answers from hours to minutes. This can be the critical difference-maker for our customers from disaster response to security and beyond,” said Will Marshall, CEO and co-founder of Planet. “Bigger picture: this is an exciting milestone towards delivering Planetary Intelligence. We’re moving AI from the internet into the physical realm, effectively connecting the ‘eyes’ of our satellites with an onboard ‘brain’ to create a nervous system for the planet.”
The company says the system runs inside isolated Docker containers directly on the satellite. That means the entire pipeline, from capturing the image to detecting objects and preparing geographic data, can happen in orbit. The spacecraft can even generate output formats like GeoTIFF and GeoJSON before sending results back to Earth.
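GeoJSON is a standard, human-readable format for georeferenced features, so the detect-and-package step is straightforward to sketch. The helper below is an assumption about what such a step might look like, not Planet's pipeline; it takes detections whose bounding boxes have already been geocoded to longitude/latitude (as an onboard georeferencing stage would do) and emits a valid GeoJSON FeatureCollection.

```python
import json

def detections_to_geojson(detections):
    """Package georeferenced detections as a GeoJSON FeatureCollection.
    Each detection carries (west, south, east, north) bounds in degrees."""
    features = []
    for det in detections:
        lon0, lat0, lon1, lat1 = det["bounds"]
        features.append({
            "type": "Feature",
            "geometry": {
                "type": "Polygon",
                # GeoJSON polygons are closed rings of [lon, lat] positions:
                # the first and last coordinate must be identical.
                "coordinates": [[
                    [lon0, lat0], [lon1, lat0], [lon1, lat1],
                    [lon0, lat1], [lon0, lat0],
                ]],
            },
            "properties": {
                "class": det["class"],
                "confidence": det["confidence"],
            },
        })
    return {"type": "FeatureCollection", "features": features}

# Hypothetical detection near Alice Springs; coordinates are illustrative.
detections = [
    {"class": "aircraft", "confidence": 0.94,
     "bounds": [133.900, -23.807, 133.901, -23.806]},
]
print(json.dumps(detections_to_geojson(detections), indent=2))
```

A payload like this can be downlinked and dropped directly into standard GIS tooling, which is the practical appeal of emitting GeoJSON (and GeoTIFF for raster products) in orbit rather than on the ground.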
The AI models involved are still early and will continue improving, but the idea behind the experiment is clear. Satellites may be evolving from passive cameras in space into intelligent observers capable of analyzing what they see in real time.
If that approach scales across future satellite constellations, it could shrink the gap between observation and action dramatically. A wildfire, military buildup, or infrastructure change could potentially be detected within minutes instead of hours. For Earth-monitoring systems, that kind of speed could make a real difference.
Source: NERDS.xyz