V2E: From video frames to realistic DVS event camera streams

06/13/2020
by   Tobi Delbruck, et al.

To help meet the increasing need for dynamic vision sensor (DVS) event camera data, we developed the v2e toolbox, which generates synthetic DVS event streams from intensity frame videos. Videos can be of any type, either real or synthetic. v2e optionally uses synthetic slow motion to upsample the video frame rate and then generates DVS events from these frames using a realistic pixel model that includes event threshold mismatch, finite illumination-dependent bandwidth, and several types of noise. v2e includes an algorithm that determines the DVS thresholds and bandwidth so that the synthetic event stream statistics match a given reference DVS recording. v2e is the first toolbox that can synthesize realistic low light DVS data. This paper also clarifies misleading claims about DVS characteristics in some of the computer vision literature. The v2e website is https://sites.google.com/view/video2events/home and code is hosted at https://github.com/SensorsINI/v2e.
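The idealized DVS pixel model the abstract describes (events emitted when the per-pixel change in log intensity crosses an ON or OFF threshold) can be illustrated with a short sketch. The code below is a minimal, simplified simulation, not v2e's actual API: the function name `frames_to_events` and its parameters are hypothetical, and it omits the threshold mismatch, illumination-dependent bandwidth, noise, and slow-motion frame upsampling that the abstract says v2e models.

```python
import numpy as np

def frames_to_events(frames, timestamps, pos_thres=0.2, neg_thres=0.2, eps=1e-6):
    """Idealized DVS simulation sketch (not v2e itself).

    frames:     (N, H, W) float array of intensity frames in [0, 1]
    timestamps: (N,) array of frame times in seconds
    Returns an array of (t, x, y, polarity) event rows.
    """
    # DVS pixels respond to changes in log intensity; eps avoids log(0)
    log_frames = np.log(frames + eps)
    mem = log_frames[0].copy()  # per-pixel log level at the last emitted event
    events = []
    for t, logf in zip(timestamps[1:], log_frames[1:]):
        diff = logf - mem
        # integer number of threshold crossings per pixel, per polarity
        n_on = np.floor(np.maximum(diff, 0) / pos_thres).astype(int)
        n_off = np.floor(np.maximum(-diff, 0) / neg_thres).astype(int)
        for n, pol in ((n_on, 1), (n_off, -1)):
            ys, xs = np.nonzero(n)
            for y, x in zip(ys, xs):
                events.extend((t, x, y, pol) for _ in range(n[y, x]))
        # advance each pixel's memory by the thresholds it crossed
        mem += n_on * pos_thres - n_off * neg_thres
    return np.array(events)
```

Note that this sketch stamps every event with the frame time; a realistic simulator like v2e instead interpolates timestamps between frames (helped by the synthetic slow-motion upsampling) so that events are spread plausibly in time.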
