Miscellaneous


Hybrid, frame and event based VIO for robust, autonomous navigation of quadrotors

Published on Sep 19, 2017

Event cameras are bio-inspired vision sensors that output pixel-level brightness changes instead of standard intensity frames. They do not suffer from motion blur and have a very high dynamic range, which enables them to provide reliable visual information during high-speed motion or in scenes with high dynamic range. However, event cameras output little information when motion is limited, for example when the camera is nearly still. Conversely, standard cameras provide instant and rich information about the environment most of the time (at low speed and in good lighting), but they fail severely during fast motion or in difficult lighting, such as high-dynamic-range or low-light scenes. In this paper, we present the first state estimation pipeline that leverages the complementary advantages of these two sensors by fusing events, standard frames, and inertial measurements in a tightly-coupled manner. We show on the publicly available Event Camera Dataset that our hybrid pipeline leads to an accuracy improvement of 130% over event-only pipelines and 85% over standard frames-only visual-inertial systems, while remaining computationally tractable. Furthermore, we use our pipeline to demonstrate, to the best of our knowledge, the first autonomous quadrotor flight using an event camera for state estimation, unlocking flight scenarios that were not reachable with traditional visual-inertial odometry, such as low-light environments and high-dynamic-range scenes.
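To make the fusion concrete, the Python sketch below shows how an asynchronous event stream can be accumulated into an image-like "event frame" over a short time window, so that standard frame-based feature tracking can also run on event data. This is a minimal illustration under stated assumptions, not the authors' implementation: in the paper, events are additionally motion-compensated using the inertial measurements before accumulation, and the names `Event` and `accumulate_event_frame` are illustrative.

```python
from dataclasses import dataclass

import numpy as np


@dataclass
class Event:
    """One asynchronous event from the sensor."""
    x: int          # pixel column
    y: int          # pixel row
    t: float        # timestamp in seconds
    polarity: int   # +1 for a brightness increase, -1 for a decrease


def accumulate_event_frame(events, height, width, t_start, t_end):
    """Accumulate event polarities over a time window into an image-like
    array, so frame-style feature trackers can be applied to event data."""
    frame = np.zeros((height, width), dtype=np.float32)
    for e in events:
        if t_start <= e.t < t_end:
            frame[e.y, e.x] += e.polarity
    return frame
```

Features tracked on such event frames and on the standard frames can then be fed, together with the inertial measurements, into a single tightly-coupled estimator, which is the complementarity the abstract describes.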

Reference:
"Hybrid, Frame and Event based Visual Inertial Odometry for Robust, Autonomous Navigation of Quadrotors"

by Antoni Rosinol Vidal, Henri Rebecq, Timo Horstschaefer, and Davide Scaramuzza
 

WeRobotics Webinar - Drone journalism in conflict zones

Published on Apr 3, 2018

Speaker: Gail Orenstein. Gail has been a photographer for 23 years. Her drone work from Iraq, Bangladesh, and Nepal has been syndicated around the world. In this webinar presentation, she shares first-hand insights on balancing the powers and dangers of using small drones in high-risk areas; her lessons are instructive regardless of whether you work in disaster response, sustainable development, public health, environmental protection, or climate change resilience.
 

Dynamic obstacle avoidance for quadrotors with event cameras (Science Robotics 2020)

Published on Mar 18, 2020

Today's autonomous drones have reaction times of tens of milliseconds, which is not enough for navigating fast in complex dynamic environments. To safely avoid fast-moving objects, drones need low-latency sensors and algorithms. We depart from state-of-the-art approaches by using event cameras, novel bio-inspired sensors with reaction times on the order of microseconds. Standard vision algorithms cannot be applied to event cameras because these sensors do not output images but a stream of asynchronous events that encode per-pixel intensity changes. Our approach exploits the temporal information contained in the event stream to distinguish between static and dynamic objects, and leverages a fast strategy to generate the motor commands necessary to avoid approaching obstacles. The resulting algorithm has an overall latency of only 3.5 milliseconds; at a relative speed of 10 meters/second, an obstacle travels just 3.5 centimeters during that processing window, which is sufficient for reliable detection and avoidance of fast-moving obstacles. We demonstrate the effectiveness of our approach on an autonomous quadrotor using only onboard sensing and computation. Our drone was capable of avoiding multiple obstacles of different sizes and shapes at relative speeds of up to 10 meters/second, both indoors and outdoors.
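As a rough illustration of how temporal information in the event stream separates moving objects from the static scene, here is a minimal Python sketch. It assumes the events have already been ego-motion compensated using the IMU (omitted here): after compensation, events caused by the static background collapse onto scene edges early in the window, while an approaching object keeps firing events, so pixels whose mean event timestamp falls late in the window are flagged as dynamic. The function name and the threshold `rho` are illustrative assumptions, not taken from the paper's code.

```python
import numpy as np


def detect_dynamic_pixels(xs, ys, ts, height, width, rho=0.3):
    """Flag pixels likely caused by a moving object.

    xs, ys : integer pixel coordinates of ego-motion-compensated events
    ts     : event timestamps in seconds
    rho    : fraction of the normalized time window treated as "recent"
    """
    count = np.zeros((height, width))
    t_sum = np.zeros((height, width))
    np.add.at(count, (ys, xs), 1.0)            # events per pixel
    np.add.at(t_sum, (ys, xs), ts - ts.min())  # summed relative timestamps
    # Mean timestamp per pixel, normalized to [0, 1] over the window.
    mean_t = np.divide(t_sum, count, out=np.zeros_like(t_sum), where=count > 0)
    window = ts.max() - ts.min()
    norm = mean_t / window if window > 0 else mean_t
    # Pixels still generating events near the end of the window are
    # attributed to dynamic objects.
    return norm > (1.0 - rho)
```

Because this scoring needs only a few array passes per event window, it is the kind of computation that can plausibly run within a millisecond-scale budget, consistent with the low overall latency the abstract reports.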

Reference:

"Dynamic Obstacle Avoidance for Quadrotors with Event Cameras"

by Davide Falanga, Kevin Kleber and Davide Scaramuzza
Science Robotics, March 18, 2020.
 