Friday, 21 August 2015

Relative continuous-time SLAM

Appearance-based techniques for simultaneous localization and mapping (SLAM) have been highly successful in assisting robot-motion estimation; however, these vision-based techniques have long assumed imaging sensors with a global shutter, which are well suited to the traditional, discrete-time formulation of visual problems. In order to adapt these techniques to scanning sensors, we propose novel methods for both outlier rejection and batch nonlinear estimation.

Traditionally, the SLAM problem has been formulated in a single privileged coordinate frame, which can become computationally expensive over long distances, particularly when a loop closure requires the adjustment of many pose variables. Recent discrete-time estimators have shown that a completely relative coordinate framework can be used to incrementally find a close approximation of the full maximum-likelihood solution in constant time. To accommodate scanning sensors, we propose moving the relative coordinate formulation of SLAM into continuous time by estimating the velocity profile of the robot. We derive the relative formulation of the continuous-time robot trajectory and formulate an estimator using temporal basis functions. A motion-compensated outlier-rejection scheme is proposed by using a constant-velocity model in the random sample consensus (RANSAC) algorithm.

Our experimental results use intensity imagery from a two-axis scanning lidar; due to the sensor's scanning nature, it behaves similarly to a slow rolling-shutter camera. Both algorithms are validated on a sequence of 6880 lidar frames acquired over a 1.1 km traversal.
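To get a feel for the relative coordinate idea, here is a minimal toy sketch (our own illustration, not the paper's code, and in SE(2) rather than SE(3)): the map stores only relative transforms between consecutive frames, and a global pose is recovered on demand by chaining them, so a loop-closure correction touches one edge rather than every pose variable.

```python
import numpy as np

def compose(T_ab, T_bc):
    """Compose two SE(2) relative poses, each given as (x, y, theta)."""
    x, y, th = T_ab
    dx, dy, dth = T_bc
    return (x + np.cos(th) * dx - np.sin(th) * dy,
            y + np.sin(th) * dx + np.cos(th) * dy,
            th + dth)

# A relative map stores only edges between consecutive frames;
# no single privileged coordinate frame is ever maintained.
edges = [(1.0, 0.0, np.pi / 2)] * 4   # drive a 1 m square: forward, turn left

# A pose expressed in the first frame is recovered only when needed,
# by chaining the relative edges.
T = (0.0, 0.0, 0.0)
for e in edges:
    T = compose(T, e)
# After four edges the robot is back at its starting position.
```

The point of the relative representation is that correcting a loop closure amounts to adjusting a handful of nearby edges, which is what makes the constant-time incremental approximation possible.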
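The continuous-time trajectory can also be sketched simply. The paper parameterizes the velocity profile with temporal basis functions (B-splines); the 1-D toy below substitutes piecewise-linear "hat" basis functions to keep the code short, but the idea is the same: velocity is a weighted sum of basis functions, and because the result is continuous in time, the motion can be queried at the exact timestamp of every lidar point, which is what a scanning, rolling-shutter-like sensor requires. All names here are our own.

```python
import numpy as np

def hat_basis(t, knots):
    """Evaluate piecewise-linear ('hat') temporal basis functions at times t.

    Returns a (len(t), len(knots)) matrix Phi whose column i is the hat
    function centered on knots[i].  (A stand-in for the B-splines used in
    the actual estimator.)  Assumes uniformly spaced knots.
    """
    t = np.atleast_1d(t)
    dt = knots[1] - knots[0]
    return np.clip(1.0 - np.abs(t[:, None] - knots[None, :]) / dt, 0.0, 1.0)

# Velocity profile v(t) = Phi(t) @ c, where c holds the basis coefficients
# that a batch estimator would solve for.  Here we just pick some values.
knots = np.linspace(0.0, 10.0, 11)                       # one knot per second
c = np.array([0, 1, 2, 2, 2, 1.5, 1, 1, 0.5, 0.2, 0.0])  # 1-D velocity coeffs

# Query the velocity at arbitrary timestamps (e.g. per-point lidar stamps),
# then integrate (trapezoidal rule) to recover position over time.
stamps = np.linspace(0.0, 10.0, 1001)
v = hat_basis(stamps, knots) @ c
x = np.concatenate(([0.0],
                    np.cumsum(0.5 * (v[1:] + v[:-1]) * np.diff(stamps))))
```

Estimating a small set of coefficients `c` instead of a dense sequence of poses is what keeps the batch nonlinear estimation tractable while still modeling motion during each scan.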
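Finally, a hedged sketch of the motion-compensated outlier rejection. The real method hypothesizes a constant-velocity motion inside RANSAC so that matches made at different point timestamps are compared fairly; the toy below simplifies to a translation-only velocity in 2-D (function names and parameters are our own invention).

```python
import numpy as np

def ransac_const_velocity(p_prev, p_curr, dt, iters=200, tol=0.05, rng=None):
    """Constant-velocity RANSAC over timestamped feature matches (a sketch).

    p_prev, p_curr : (N, 2) matched feature positions in two lidar scans
    dt             : (N,) per-match time offsets -- a scanning sensor stamps
                     every point at a different time
    Hypothesis: p_curr[i] ~= p_prev[i] + v * dt[i] for one shared velocity v.
    """
    rng = rng or np.random.default_rng(0)
    best_v, best_in = np.zeros(2), np.zeros(len(dt), dtype=bool)
    for _ in range(iters):
        i = rng.integers(len(dt))              # minimal sample: one match
        v = (p_curr[i] - p_prev[i]) / dt[i]    # hypothesized velocity
        resid = np.linalg.norm(p_curr - (p_prev + v * dt[:, None]), axis=1)
        inliers = resid < tol
        if inliers.sum() > best_in.sum():
            best_v, best_in = v, inliers
    return best_v, best_in

# Synthetic usage: true motion plus a handful of gross outliers.
rng = np.random.default_rng(1)
p_prev = rng.uniform(-5, 5, (100, 2))
dt = rng.uniform(0.01, 0.1, 100)
v_true = np.array([1.0, -0.5])
p_curr = p_prev + v_true * dt[:, None]
p_curr[:10] += rng.uniform(1, 2, (10, 2))      # inject 10 bad matches
v_est, inliers = ransac_const_velocity(p_prev, p_curr, dt)
```

Because each residual is evaluated at that match's own timestamp, genuinely consistent matches are not rejected merely because the sensor moved while the scan was being acquired.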

from robot theory

