Unified Temporal and Spatial Calibration for Multi-Sensor Systems


Paul Furgale, Joern Rehder and Roland Siegwart

IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)

In order to increase accuracy and robustness in state estimation for robotics, a growing number of applications rely on data from multiple complementary sensors. For the best performance in sensor fusion, these different sensors must be spatially and temporally registered with respect to each other. To this end, a number of approaches have been developed to estimate these system parameters in a two-stage process, first estimating the time offset and subsequently solving for the spatial transformation between sensors. In this work, we present a novel framework for jointly estimating the temporal offset between measurements of different sensors and their spatial displacements with respect to each other. The approach is enabled by continuous-time batch estimation and extends previous work by seamlessly incorporating time offsets within the rigorous theoretical framework of maximum likelihood estimation. Experimental results for a camera-to-inertial-measurement-unit (IMU) calibration demonstrate the ability of this framework to accurately estimate time offsets to within a fraction of the smallest measurement period.
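To give a flavor of the joint temporal-spatial estimation the abstract describes, here is a minimal, hedged sketch in Python. It is not the paper's method (which uses B-spline continuous-time trajectories and full maximum-likelihood estimation over SE(3) extrinsics); instead, it shows the core idea on a toy 1-D problem: a continuous-time model of one sensor stream lets a time offset `d` and a spatial parameter (here a constant bias `b`) be estimated jointly from a single batch cost, rather than in two separate stages. All signals, names, and values are illustrative assumptions.

```python
import numpy as np

# Toy joint temporal/spatial calibration sketch (illustrative only).
# Sensor A provides a densely sampled reference signal; sensor B observes
# the same signal delayed by a time offset d and shifted by a bias b.

rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 1001)                 # reference timeline (s)
signal = np.sin(1.3 * t) + 0.5 * np.sin(0.4 * t)  # "continuous-time" trajectory

true_d, true_b = 0.07, 0.25                      # ground-truth offset and bias
t_b = np.linspace(0.5, 9.5, 901)                 # sensor B's timestamps
meas_b = np.interp(t_b + true_d, t, signal) + true_b
meas_b += 0.01 * rng.standard_normal(t_b.size)   # measurement noise

def cost(d):
    # Continuous-time model: interpolate sensor A at time-shifted stamps,
    # then solve for the bias b in closed form (the residual is linear in b).
    pred = np.interp(t_b + d, t, signal)
    b = np.mean(meas_b - pred)
    return np.sum((meas_b - pred - b) ** 2), b

# Joint estimation: scan candidate offsets, keeping the best (d, b) pair.
# (The paper instead embeds d as a parameter of a smooth batch ML problem.)
ds = np.linspace(-0.2, 0.2, 4001)
costs = [cost(d)[0] for d in ds]
d_hat = ds[int(np.argmin(costs))]
b_hat = cost(d_hat)[1]
print(f"offset estimate: {d_hat:.4f} s, bias estimate: {b_hat:.3f}")
```

Because the trajectory model is continuous in time, the recovered offset is not limited to multiples of either sensor's sampling period, which mirrors the paper's claim of sub-measurement-period accuracy.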


@inproceedings{furgale_iros13,
doi = { 10.1109/IROS.2013.6696514 },
year = { 2013 },
url = { bib/furgale_iros13.pdf },
title = { Unified Temporal and Spatial Calibration for Multi-Sensor Systems },
pages = { 1280--1286 },
month = { 3--7 November },
booktitle = { Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) },
author = { Paul Furgale and Joern Rehder and Roland Siegwart },
address = { Tokyo, Japan },
}