Event Cameras Help Autonomous Drones Duck and Dive

Dynamic vision sensors have already shown they can enable high-speed maneuvers with robots. Now drones equipped with event cameras, which respond to changes in a scene within microseconds, are being developed to nimbly avoid obstacles.

Davide Scaramuzza’s Robotics and Perception Group at the University of Zurich pioneered the use of event cameras on drones. These sensors might not be as good as a regular camera at interpreting a scene visually, but they’re extremely sensitive to motion, responding to changes in a scene on a per-pixel basis in microseconds. For a fast-moving drone, this could easily be the difference between crashing into an obstacle and avoiding it.
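To make that difference concrete, here is a minimal sketch of what working with an event stream looks like. It assumes events arrive as (timestamp, x, y, polarity) tuples, which is the standard dynamic-vision-sensor output; the resolution and the sample events are made up for illustration.

```python
# Minimal sketch of how an event camera's output differs from frames.
# Assumes each event is a (timestamp_us, x, y, polarity) tuple; the
# resolution and example events below are illustrative only.

import numpy as np

WIDTH, HEIGHT = 240, 180  # resolution of a typical dynamic vision sensor

def accumulate_events(events, window_us):
    """Accumulate events from a short time window into a binary motion mask.

    Only pixels whose brightness changed during the window fire events, so
    the mask highlights moving objects while static background stays empty.
    """
    mask = np.zeros((HEIGHT, WIDTH), dtype=bool)
    if not events:
        return mask
    t_end = events[-1][0]
    for t, x, y, polarity in events:
        if t_end - t <= window_us:
            mask[y, x] = True
        # polarity (+1/-1) says whether the pixel got brighter or darker;
        # it is ignored here for simplicity
    return mask

# Hypothetical event stream: a small object drifting across the sensor.
events = [(t, 50 + t // 100, 90, 1) for t in range(0, 5000, 100)]
mask = accumulate_events(events, window_us=1000)  # last 1 ms of events
print(mask.sum(), "pixels saw brightness changes in the last millisecond")
```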

IEEE Spectrum’s award-winning robotics blog, Automaton, reports on a paper recently accepted to IEEE Robotics and Automation Letters, led by Davide Falanga and Suseong Kim from Scaramuzza’s group, that explores the use of an event camera on drones moving at high speeds. To validate their research, they hurl soccer balls at a drone as hard as they can and see if it can dodge them.

The time it takes a robot (of any kind) to avoid an obstacle is constrained primarily by its perception latency: the time needed to perceive the environment, process the data, and then generate control commands. Depending on the sensor, the algorithm, and the computer being used, typical perception latency is anywhere from tens of milliseconds to hundreds of milliseconds. The sensor itself is usually the biggest contributor to this latency, which is what makes event cameras so appealing: they can spit out data with a theoretical latency measured in nanoseconds.
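A quick back-of-envelope calculation shows why that latency matters: at drone speeds, tens of milliseconds translate directly into meters traveled before the vehicle can even begin to react. The speeds and latencies below are illustrative, not taken from the paper.

```python
# Back-of-envelope: how far a drone travels "blind" during its perception
# latency. All numbers are illustrative, not from the paper.

latencies_ms = {
    "event-camera pipeline": 4.0,          # assumed millisecond-scale pipeline
    "conventional camera pipeline": 50.0,  # assumed frame-based pipeline
}

for speed_mps in (5.0, 10.0, 20.0):
    for sensor, latency_ms in latencies_ms.items():
        blind_distance = speed_mps * latency_ms / 1000.0
        print(f"{speed_mps:4.1f} m/s, {sensor}: "
              f"{blind_distance:.2f} m traveled before the drone can react")
```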

The question that the University of Zurich researchers want to answer is how much the perception latency actually affects the maximum speed at which a drone can move while still being able to successfully dodge obstacles.

Comparing conventional vision sensors (both mono and stereo cameras) with an event camera on a research-grade quadrotor shows that the difference isn’t all that significant, as long as the quadrotor isn’t moving too quickly. As the speed of the quadrotor increases, though, event cameras start to make a difference: a quadrotor with a thrust-to-weight ratio of 20, for example, could achieve maximum safe obstacle-avoidance speeds about 12 percent higher than if it were using a traditional camera. Quadrotors this powerful don’t exist yet (current maximum thrust-to-weight ratios are closer to 10).
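To get a feel for why latency and thrust-to-weight ratio interact at all, here is a crude toy model, emphatically not the one from the paper: the drone spots an obstacle at a fixed sensing range, loses the latency period, and must then shift sideways by a fixed clearance under a lateral acceleration limited by its thrust-to-weight ratio. The sensing range, clearance, and latencies are all made up, and with these numbers the gap comes out much larger than the roughly 12 percent reported above; the paper’s analysis models the problem in far more detail.

```python
# Toy model (not the paper's) of how latency and thrust-to-weight ratio
# together limit the maximum speed at which a dodge is still possible.
# Assumptions: obstacle first seen at SENSING_RANGE, drone must move
# sideways by CLEARANCE, lateral acceleration ~ thrust_to_weight * g.
# All numbers are illustrative; they overstate the gap relative to the
# ~12 percent figure from the paper's more detailed analysis.

import math

G = 9.81             # m/s^2
SENSING_RANGE = 3.0  # m, assumed distance at which the obstacle is detected
CLEARANCE = 0.3      # m, assumed sideways displacement needed to dodge

def max_safe_speed(latency_s, thrust_to_weight):
    """Largest approach speed at which the clearance is still reachable.

    After the latency has elapsed, the drone has (range/speed - latency)
    seconds to displace itself by CLEARANCE under constant lateral
    acceleration, giving speed <= range / (latency + sqrt(2*clearance/a)).
    """
    a_lat = thrust_to_weight * G
    return SENSING_RANGE / (latency_s + math.sqrt(2 * CLEARANCE / a_lat))

for t_w in (5, 10, 20):
    v_event = max_safe_speed(0.004, t_w)  # assumed ~4 ms event-camera pipeline
    v_frame = max_safe_speed(0.050, t_w)  # assumed ~50 ms frame-camera pipeline
    print(f"T/W={t_w:2d}: {v_event:5.1f} m/s with an event camera, "
          f"{v_frame:5.1f} m/s with a conventional camera")
```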

Event cameras are pretty cool for other reasons as well: They don’t suffer from motion blur, and they’re much more resilient to lighting conditions, able to work just fine in the dark as well as in high-dynamic-range scenes, like looking into the sun. As the speed and agility of drones increase, and especially if we want to start using them in unstructured environments for practical purposes, it sure seems like event cameras will be the way to go.
