800g Drone Achieves Stable Flight Without IMU Using Event Camera and AI Network

Image Credit: Jordan Cormack | Unsplash

Researchers have demonstrated a quadrotor drone capable of stable flight using only visual data from a downward-facing event camera, removing the need for an inertial measurement unit in attitude control.

The system, detailed in a paper published on arXiv on July 15, 2025, employs a compact convolutional recurrent neural network that processes event streams and estimates the drone's orientation and rotation rates in real time.

The Research Team and Publication

The work was led by Jesse J. Hagenaars and Stein Stroobants, both from the Micro Air Vehicle Laboratory at Delft University of Technology in the Netherlands, with contributions from Sander M. Bohté of the Machine Learning group at Centrum Wiskunde & Informatica in Amsterdam, and Guido C. H. E. de Croon, also from Delft. The preprint, titled "All Eyes, no IMU: Learning Flight Attitude from Vision Alone", builds on the lab's prior explorations in bio-inspired robotics.

This marks a step forward in AI-driven drone navigation, in which machine learning models interpret visual cues to handle tasks traditionally reliant on physical sensors.

Background on Drone Control Systems

Drones have long depended on inertial measurement units, which combine accelerometers and gyroscopes, to maintain a stable orientation relative to gravity during flight. These sensors provide direct readings of acceleration and angular velocity, essential for the quick adjustments made by attitude control loops.

However, such hardware adds weight and power demands, posing challenges for smaller, more efficient designs like those mimicking insect flight. Past efforts in vision-based methods often focused on specific environments, such as detecting horizons in open skies or vanishing points in structured spaces, but typically still incorporated gyroscopes for precision.

Event cameras, which capture changes in brightness at high speed and low latency, have emerged in recent years as alternatives to standard cameras, drawing on neuromorphic principles to reduce data redundancy. Some earlier vision-only demonstrations, such as a 2024 paper on fully neuromorphic drone control, were constrained to specific settings; others combined vision with gyroscopes. The motivation here stems from observing flying insects, such as honeybees, which achieve agile manoeuvres using vision without dedicated gravity-sensing organs.

How the System Works

The setup involves a custom 5-inch quadrotor weighing approximately 800 grams, equipped with a downward-angled DVXplorer Micro event camera fitted with a 140-degree field-of-view lens. Events are accumulated into 5-millisecond frames at a resolution of 320 by 240 pixels and processed on an NVIDIA Jetson Orin NX embedded computer.
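To illustrate the accumulation step, the sketch below bins a raw event stream into 5-millisecond count frames at the camera's 320 by 240 resolution. The (timestamp, x, y, polarity) tuple format and the two-channel ON/OFF layout are common event-camera conventions assumed for this example, not details confirmed by the paper.

```python
import numpy as np

def accumulate_event_frames(events, duration_s, dt=0.005,
                            width=320, height=240):
    """Bin an event stream into fixed-interval count frames.

    `events` is assumed to be an iterable of (t, x, y, polarity)
    tuples with timestamps in seconds; the paper's exact event
    encoding is not specified here.
    """
    n_frames = int(np.ceil(duration_s / dt))
    # One channel per polarity (ON/OFF), a typical event-frame layout.
    frames = np.zeros((n_frames, 2, height, width), dtype=np.float32)
    for t, x, y, p in events:
        idx = min(int(t / dt), n_frames - 1)
        channel = 1 if p > 0 else 0
        frames[idx, channel, int(y), int(x)] += 1.0  # per-pixel event count
    return frames
```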

At the core is a convolutional recurrent neural network with about 425,000 parameters, comprising an encoder that downsamples the input by a factor of eight, a gated recurrent unit for memory, and a decoder that outputs roll, pitch, and rotation rates. Trained via supervised learning on datasets recorded in indoor arenas, the model uses a mean squared error loss, with data augmentation such as frame flipping to enhance robustness.
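As a concrete reading of that description, here is a minimal PyTorch sketch of a convolutional-recurrent estimator in the same spirit: three stride-2 convolutions give the eightfold downsampling, a gated recurrent unit carries memory between frames, and a linear decoder regresses the outputs. The channel widths, the pooled feature vector, and the five-element output (roll, pitch, and three body rates) are illustrative assumptions rather than the paper's exact design.

```python
import torch
import torch.nn as nn

class AttitudeNet(nn.Module):
    """Minimal convolutional-recurrent attitude estimator (sketch only)."""

    def __init__(self, in_channels=2, hidden=128, n_outputs=5):
        super().__init__()
        # Three stride-2 convolutions halve the resolution three times,
        # giving the eightfold downsampling described in the paper.
        self.encoder = nn.Sequential(
            nn.Conv2d(in_channels, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),  # pool spatial dims to a feature vector
        )
        self.gru = nn.GRUCell(64, hidden)   # memory across successive frames
        self.decoder = nn.Linear(hidden, n_outputs)

    def forward(self, frame, h):
        z = self.encoder(frame).flatten(1)  # (batch, 64)
        h = self.gru(z, h)
        return self.decoder(h), h

# Training step sketch: supervised regression with a mean squared error
# loss, as described in the paper; shapes and data here are synthetic.
model = AttitudeNet()
criterion = nn.MSELoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

frames = torch.randn(4, 2, 240, 320)  # a batch of 5 ms event frames
targets = torch.randn(4, 5)           # ground-truth attitude and rates
h = torch.zeros(4, 128)
pred, h = model(frames, h)
loss = criterion(pred, targets)
loss.backward()
optimizer.step()
```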

Real-world tests involved flights in which the network replaced the inertial unit in the control loop, achieving stable hovers and demonstrating low-latency inference. Ablation studies revealed that including memory improved accuracy, while narrower fields of view aided generalisation across varied settings.
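A rough sketch of how such an estimator could slot into the control loop in place of an IMU, reusing the AttitudeNet above: a simple PD law acts on the estimated attitude and rates. The gains and the PD structure are placeholders for illustration; the paper's actual flight controller is not reproduced here.

```python
import torch

def attitude_control_step(model, h, event_frame, roll_sp, pitch_sp,
                          kp=6.0, kd=0.8):
    """One control-loop iteration fed by the network instead of an IMU.

    `model` is the AttitudeNet sketched above; `event_frame` is a single
    (1, 2, 240, 320) event frame; gains are illustrative placeholders.
    """
    with torch.no_grad():
        est, h = model(event_frame, h)        # network replaces IMU readout
    roll, pitch = est[0, 0].item(), est[0, 1].item()
    roll_rate, pitch_rate = est[0, 2].item(), est[0, 3].item()
    # Proportional term on attitude error, derivative term on estimated rate.
    roll_torque = kp * (roll_sp - roll) - kd * roll_rate
    pitch_torque = kp * (pitch_sp - pitch) - kd * pitch_rate
    return roll_torque, pitch_torque, h
```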

Implications for Civilian Applications

This advance could reduce reliance on inertial measurement units in some scenarios, potentially simplifying drone designs for tasks like package delivery or aerial mapping. Lighter payloads might extend battery life and reduce costs, though performance still depends on visual texture and contrast.

In civilian sectors, it aligns with trends toward autonomous operations, potentially paving the way for swarms of small drones for surveying without bulky electronics. However, performance may vary in low contrast or complex visuals, highlighting needs for further validation in diverse scenarios.

Future Trends in AI-Driven Drone Technology

Looking ahead, this research points to broader adoption of neural networks in robotics, evolving from hybrid sensor setups to purely learning-based paradigms. Developments may integrate more advanced models, such as spiking neural networks running on neuromorphic hardware, to further cut power use.

As artificial intelligence refines visual processing, insect-scale robots could become viable for applications in search and rescue or environmental monitoring, though scaling down requires addressing computational demands on tiny platforms. Overall, the field appears set for innovations that prioritise efficiency through machine learning, informed by biological models.
