The sky’s the limit: AIML works with local company to clear Earth’s orbit

Space debris in Earth’s orbit (image created using AI)
Space debris presents a growing risk to the sustainability of Earth’s orbital environment. Defunct satellites, spent rocket components, and other debris now number in the tens of thousands, and because these objects travel at speeds exceeding 28,000 km/h, even millimetre-sized fragments can cause significant damage to operational satellites or pose a threat to crewed missions.
Adelaide-based Paladin Space is working with AIML to address this issue by developing technology to safely remove space debris. The company has built an object characterisation system that combines sensor data with machine learning (ML) techniques to determine whether an object is safe to intercept.
“The outcome will involve a specialised imaging tool that will be able to classify space debris for identification, calculate the size of the debris to ensure it will fit inside our payload, and determine whether it is safe for capture by analysing the approximate spin-rate of the object,” says Paladin Space CEO Harrison Box.
“[The AIML team] are clearly skilled in generating ML pipelines and training for AI. I was impressed by their work…in problem solving a solution.”
Training models to recognise categories of debris
AIML began collaborating with Paladin Space to develop the system in November 2024. Using event cameras and ML models, AIML team members developed a custom dataset and trained models to recognise four categories of debris: printed circuit boards (PCBs); solar panels; metal shards; and CubeSats, a class of small satellites often launched as secondary payloads alongside larger spacecraft.
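To give a rough sense of how such a classifier might be structured (the actual AIML pipeline has not been published), the sketch below defines a small convolutional network over event-camera frames with four output classes matching the debris categories above. The class names, network size, and input format are assumptions for illustration only.

```python
# Hedged sketch: a minimal 4-class debris classifier over accumulated event
# frames. Class names, architecture, and input shapes are illustrative
# assumptions, not the actual AIML/Paladin Space pipeline.
import torch
import torch.nn as nn

DEBRIS_CLASSES = ["pcb", "solar_panel", "metal_shard", "cubesat"]

class DebrisClassifier(nn.Module):
    def __init__(self, num_classes: int = len(DEBRIS_CLASSES)):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, num_classes)

    def forward(self, x):
        # x: (batch, 1, H, W) single-channel frames built from event data
        return self.head(self.features(x).flatten(1))

if __name__ == "__main__":
    model = DebrisClassifier()
    dummy_frames = torch.rand(4, 1, 128, 128)  # stand-in event frames
    print(model(dummy_frames).shape)           # (4, 4): one score per class
```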
Thomas Wolinski, Mechatronics Engineer with Paladin Space, supported the team, with Jonathon Read, AIML’s Engineering Manager, serving as Project Manager. The proof of concept (POC) was built by AIML Machine Learning Engineers Aaron Poruthoor and Alec Arthur.
“In my work [on estimating the speed and size of space debris], we utilised a classical computer vision technique called ORB (oriented FAST and rotated BRIEF) for feature extraction,” said Arthur. “This is used to estimate the spin rate and reconstruct the object's shape by generating point clouds, which in turn allows us to estimate the object's volume.”
“The work is very novel and uses simple computer vision techniques.”
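For readers curious how ORB features can feed a spin-rate estimate, the sketch below matches ORB keypoints between two consecutive frames and recovers the in-plane rotation between them; dividing that angle by the frame interval gives an approximate spin rate. This is an assumed, simplified reconstruction of the general technique Arthur describes, not the project’s actual code, and it omits the point-cloud and volume-estimation steps.

```python
# Hedged sketch: estimating in-plane spin rate from two consecutive frames
# using ORB (oriented FAST and rotated BRIEF) features. Not the Paladin
# Space/AIML code; ignores out-of-plane rotation and volume estimation.
import math
import cv2
import numpy as np

def spin_rate_deg_per_s(frame_a, frame_b, dt_seconds):
    orb = cv2.ORB_create(nfeatures=500)
    kp_a, des_a = orb.detectAndCompute(frame_a, None)
    kp_b, des_b = orb.detectAndCompute(frame_b, None)
    if des_a is None or des_b is None:
        return None

    # Brute-force Hamming matching suits ORB's binary descriptors.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des_a, des_b)
    if len(matches) < 4:
        return None

    pts_a = np.float32([kp_a[m.queryIdx].pt for m in matches])
    pts_b = np.float32([kp_b[m.trainIdx].pt for m in matches])

    # Fit rotation + scale + translation; the rotation component
    # approximates how far the object turned between the two frames.
    M, _ = cv2.estimateAffinePartial2D(pts_a, pts_b, method=cv2.RANSAC)
    if M is None:
        return None
    angle_deg = math.degrees(math.atan2(M[1, 0], M[0, 0]))
    return angle_deg / dt_seconds
```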

The object characterisation system accurately detects a CubeSat during a demonstration.
“My work involved developing the ML to detect and classify space junk using event cameras,” said Poruthoor. “This work is primarily aimed at advancing the use of event cameras in space applications and beyond.”
Instead of capturing a full image at a fixed rate, event cameras report brightness changes as they occur. Each ‘event’ encodes information about the time, location, and direction (increase or decrease) of the brightness change at a specific pixel.
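One minimal way to picture this data, assuming a simple (t, x, y, polarity) event stream, is to accumulate events into a frame that conventional vision code can consume. The array layout below is an illustrative assumption; real event-camera SDKs define their own formats.

```python
# Hedged sketch: turning a stream of events (t, x, y, polarity) into a
# single accumulated frame. The (N, 4) layout is an assumption for
# illustration, not a specific camera vendor's format.
import numpy as np

def accumulate_events(events, height, width):
    """events: array of shape (N, 4) with columns [t, x, y, polarity],
    where polarity is +1 (brightness increase) or -1 (decrease)."""
    frame = np.zeros((height, width), dtype=np.float32)
    xs = events[:, 1].astype(int)
    ys = events[:, 2].astype(int)
    np.add.at(frame, (ys, xs), events[:, 3])  # signed event count per pixel
    return frame

# Example: three synthetic events on a 4x4 sensor.
demo = np.array([[0.001, 1, 2, +1],
                 [0.002, 1, 2, +1],
                 [0.003, 3, 0, -1]])
print(accumulate_events(demo, 4, 4))
```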
“Event cameras are an emerging technology with immense potential across various industries due to their low latency, high dynamic range (HDR), excellent low-light performance, and energy efficiency,” said Poruthoor. “However, because the technology is still relatively new, there is limited research and few practical implementations available today.”
“Our goal is to help bridge that gap, both by demonstrating real-world use cases (like space debris characterisation) and by encouraging further research and development in this space,” Poruthoor continued. “We believe this work could pave the way for more widespread adoption of event cameras across industries such as aerospace, robotics, autonomous vehicles, and surveillance.”
Paladin Demo Day held 15 May
Paladin Space held a Demo Day on Thursday, 15 May, where they showcased this groundbreaking technology before a full house at the UniSA Enterprise Hub. Representatives from South Australia's space and research communities as well as government were on hand to watch a live demonstration of the tool in action.
AIML Professor Tat-jun Chin was at the Demo Day event to cheer on both the AIML and Paladin Space teams.
"I'm here because I was part of the team at AIML that helped develop the perception system. [I also] provided advice on algorithm development, evaluation, and training data sets," said Professor Chin. "This event is a great way to display the effort [AIML] has put into the project and we hope it will lead to even bigger opportunities for Paladin Space."

Members of the AIML/Paladin Space team (l-r): Thomas Wolinski, Paladin Space Mechatronics Engineer; Harrison Box, Paladin Space CEO; and AIML members Aaron Poruthoor, Alec Arthur, Tat-jun Chin, and Jonathon Read
Other AIML members involved include Research Engineer Lachlan Mares and PhD student Ethan Elms. Mares even used a home-based 3D printer to help construct key components of the technology.
The group’s combined work is both innovative and precise, employing lightweight vision algorithms and smart engineering to achieve spin-rate estimates accurate to the second and size estimates accurate to the centimetre, all while maintaining low computational overhead. And the team gives much of the credit for this outcome to the event cameras at the heart of the object characterisation system.
“If there's one key outcome we’d like to see, it’s the broader recognition and adoption of event cameras as a viable and valuable sensing tool, both in research and in industrial applications,” said Read. “We hope this project can inspire further innovation and contribute to unlocking the full potential of this underutilised technology."