Obstacle Detection for Self-Driving Cars Using Only Monocular Cameras And Wheel Odometry

October 26, 2015 in ETHZ-CVG, Publications by Ulrich Schwesinger

Christian Häne, Torsten Sattler and Marc Pollefeys

IROS, 2015

Mapping the environment is crucial to enable path planning and obstacle avoidance for self-driving vehicles and other robots. In this paper, we concentrate on ground-based vehicles and present an approach which extracts static obstacles from depth maps computed from multiple consecutive images. In contrast to existing approaches, our system does not require accurate visual-inertial odometry estimation but relies solely on the readily available wheel odometry. To handle the resulting higher pose uncertainty, our system fuses obstacle detections over time and between cameras to estimate the free and occupied space around the vehicle. Using monocular fisheye cameras, we are able to cover a wider field of view and detect obstacles closer to the car, which are often not within the standard field of view of a classical binocular stereo camera setup. Our quantitative analysis shows that our system is accurate enough for the navigation of self-driving cars and runs in real-time.
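As a rough illustration of the fusion idea described in the abstract, the sketch below accumulates per-frame obstacle detections into a 2D occupancy grid using wheel-odometry poses. This is not the authors' implementation: the grid resolution, the (x, y, yaw) pose format, the log-odds increments and all function names are illustrative assumptions.

import numpy as np

# Minimal sketch (not the paper's implementation): fuse per-frame obstacle
# detections into a 2D log-odds occupancy grid using wheel-odometry poses.
# Grid size, cell resolution and log-odds increments are assumed values.

GRID_SIZE = 200              # cells per side
CELL_RES = 0.1               # metres per cell
L_OCC = 0.85                 # log-odds increment for an observed obstacle cell

log_odds = np.zeros((GRID_SIZE, GRID_SIZE))

def to_world(points_vehicle, pose):
    """Transform Nx2 obstacle points from the vehicle frame to the world
    frame using a wheel-odometry pose (x, y, yaw)."""
    x, y, yaw = pose
    c, s = np.cos(yaw), np.sin(yaw)
    R = np.array([[c, -s], [s, c]])
    return points_vehicle @ R.T + np.array([x, y])

def world_to_cell(points_world):
    """Map world coordinates to integer grid indices (grid centred at the origin),
    discarding points that fall outside the grid."""
    idx = np.floor(points_world / CELL_RES).astype(int) + GRID_SIZE // 2
    valid = (idx >= 0).all(axis=1) & (idx < GRID_SIZE).all(axis=1)
    return idx[valid]

def fuse_detection(points_vehicle, pose):
    """Accumulate one frame's obstacle detections into the occupancy grid."""
    cells = world_to_cell(to_world(points_vehicle, pose))
    log_odds[cells[:, 1], cells[:, 0]] += L_OCC
    # A full system would also trace the free space along each camera ray
    # and decrement the traversed cells, as well as handle per-cell evidence
    # from several cameras.

# Example: a single obstacle detected 3 m ahead of the vehicle at the origin.
fuse_detection(np.array([[3.0, 0.0]]), pose=(0.0, 0.0, 0.0))
occupied = 1.0 / (1.0 + np.exp(-log_odds)) > 0.65   # probability threshold

Thresholding the fused log-odds yields the occupied cells that a planner can treat as static obstacles; accumulating evidence over several frames is what makes the estimate robust to the higher pose uncertainty of wheel odometry.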

@inproceedings{haene2015obstacle,
  title={Obstacle Detection for Self-Driving Cars Using Only Monocular Cameras and Wheel Odometry},
  author={H{\"a}ne, Christian and Sattler, Torsten and Pollefeys, Marc},
  booktitle={IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)},
  year={2015}
}