Event-based Stereo Vision for Scene Flow Understanding in Autonomous Driving

Event-based cameras promise a step change in perception robustness, offering resilience to motion blur and a high dynamic range. At the same time, autonomous driving has become a prominent application of cutting-edge mobile robotics, with safety and robustness as its core metrics. Naturally, interest in event cameras is growing within the autonomous driving community.

During a semester project at the Robotics and Perception Group (UZH & ETH Zurich), I tackled the lack of ground-truth data for motion segmentation with this new type of vision sensor in urban driving scenarios. I then approached the motion segmentation problem with two new baselines that incorporate geometric knowledge. A preliminary evaluation on the new ground truth serves as a successful proof of concept.
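The thesis's baselines themselves are not reproduced here, but a common geometry-aware idea in motion segmentation is to compare the flow observed for each event against the flow that the camera's own motion (together with scene depth, e.g. from stereo) would induce, and to flag events with a large residual as independently moving. The minimal sketch below illustrates only that generic idea; the function name `segment_moving_events`, the threshold value, and the toy data are hypothetical and not taken from the project.

```python
import numpy as np

def segment_moving_events(flow_measured, flow_egomotion, threshold=1.5):
    """Label events as independently moving when their measured optical
    flow deviates from the flow predicted by camera ego-motion.

    flow_measured:  (N, 2) array, per-event flow estimates [px/s]
    flow_egomotion: (N, 2) array, flow induced by ego-motion and depth [px/s]
    threshold:      residual magnitude [px/s] above which an event is
                    treated as belonging to an independently moving object
    """
    residual = np.linalg.norm(flow_measured - flow_egomotion, axis=1)
    return residual > threshold  # boolean mask: True = moving object

# Toy usage: a mostly static scene plus one independently moving cluster.
rng = np.random.default_rng(0)
ego = np.tile([3.0, 0.0], (100, 1))               # ego-motion-induced flow
measured = ego + rng.normal(0.0, 0.2, (100, 2))   # static background + noise
measured[80:] += [4.0, 1.0]                       # independently moving part
mask = segment_moving_events(measured, ego)
print(mask.sum(), "of", len(mask), "events flagged as moving")
```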

Synthetic autonomous driving dataset with many sensor modalities

The full report and the final presentation will be made available in the future.

Chapter 4 from the semester thesis (full version will be published in the future)
