Continuous-Time Visual-Inertial Trajectory Estimation with Event Cameras

02/23/2017
by Elias Mueggler, et al.

In contrast to traditional cameras, which output images at a fixed rate, event cameras have independent pixels that output asynchronous pixel-level brightness changes with microsecond resolution. In this paper, we leverage a continuous-time framework to perform trajectory estimation by fusing visual data from a moving event camera with inertial data from an IMU. This framework allows direct integration of the asynchronous events, with microsecond accuracy, as well as of the high-rate inertial measurements. The pose trajectory is approximated by a smooth curve in the space of rigid-body motions using cubic splines. This formulation significantly reduces the number of variables in trajectory estimation problems. We evaluate our method on real data from several scenes and compare the results against ground truth from a motion-capture system. We show superior performance of the proposed technique compared to non-batch event-based algorithms. We also show that both the map orientation and scale can be recovered accurately by fusing events and inertial data. To the best of our knowledge, this is the first work on visual-inertial fusion with event cameras using a continuous-time framework.
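The abstract only names the cubic-spline parameterization; the sketch below shows one common way such a continuous-time pose spline is realized, a cumulative cubic B-spline on SE(3) with uniform knots. It is an illustration under stated assumptions, not the authors' implementation: the function name spline_pose, the control-pose list T_ctrl, and the knot spacing dt are hypothetical, and generic matrix exp/log routines stand in for closed-form SE(3) operations.

```python
# Illustrative sketch (assumption, not the paper's code): a cumulative
# cubic B-spline on SE(3) with uniform knots. Because the spline can be
# evaluated at any timestamp, asynchronous events (microsecond stamps)
# and high-rate IMU samples can both be tied to the same trajectory.
import numpy as np
from scipy.linalg import expm, logm

# Cumulative cubic B-spline basis matrix (uniform knots).
C = (1.0 / 6.0) * np.array([
    [6.0, 0.0,  0.0,  0.0],
    [5.0, 3.0, -3.0,  1.0],
    [1.0, 3.0,  3.0, -2.0],
    [0.0, 0.0,  0.0,  1.0],
])

def spline_pose(T_ctrl, t, dt):
    """Evaluate the pose spline at time t.

    T_ctrl: list of 4x4 SE(3) control poses, one per knot, spaced dt apart.
    Needs control poses T_ctrl[i] .. T_ctrl[i+3] for the segment containing t.
    """
    i = int(t / dt)                          # spline segment index
    u = t / dt - i                           # normalized time in [0, 1)
    B = C @ np.array([1.0, u, u * u, u**3])  # cumulative basis weights
    T = T_ctrl[i].copy()
    for j in range(1, 4):
        # Relative twist (se(3)) between consecutive control poses.
        Omega = np.real(logm(np.linalg.inv(T_ctrl[i + j - 1]) @ T_ctrl[i + j]))
        T = T @ expm(B[j] * Omega)           # accumulate weighted motions
    return T

# Example: four control poses translating 1 m per knot along x; query a
# timestamp inside the first segment (e.g., an event at t = 0.25 s).
def trans_x(x):
    T = np.eye(4)
    T[0, 3] = x
    return T

T_ctrl = [trans_x(0.0), trans_x(1.0), trans_x(2.0), trans_x(3.0)]
print(spline_pose(T_ctrl, t=0.25, dt=1.0))
```

Because every event and IMU sample only queries this smooth curve, the estimator optimizes a handful of control poses per knot interval rather than one pose per measurement, which is the reduction in variables the abstract refers to.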
