
Extrinsic Calibration and Ego-Motion Estimation for Mobile Multi-Sensor Systems

dc.contributor.authorHuang, Kaihong
dc.date.accessioned2020-08-31T13:58:56Z
dc.date.available2020-08-31T13:58:56Z
dc.date.issued2020
dc.identifier.urihttps://hdl.handle.net/20.500.11811/8564
dc.description.abstractAutonomous robots and vehicles are often equipped with multiple sensors to perform vital tasks such as localization or mapping. A joint system of sensors with different sensing modalities can often provide better localization or mapping results than individual sensors alone, in terms of accuracy or completeness. However, to enable this improved performance, two important challenges have to be addressed when dealing with multi-sensor systems. Firstly, how to accurately determine the spatial relationships between the individual sensors on the robot? This vital task is known as extrinsic calibration. Without this calibration information, measurements from different sensors cannot be fused. Secondly, how to combine data from multiple sensors so that the deficiencies of each sensor are corrected and better estimates are obtained? This second task is known as data fusion.
The core of this thesis is to provide answers to these two questions. We cover, in the first part of the thesis, aspects related to improving the extrinsic calibration accuracy, and present, in the second part, novel data fusion algorithms designed to address the ego-motion estimation problem using data from a laser scanner and a monocular camera.
In the extrinsic calibration part, we contribute by revealing and quantifying the relative accuracies of three common types of calibration methods, offering insight into choosing the best calibration method when multiple options are available. Following that, we propose an optimization approach for solving common motion-based calibration problems. By exploiting the Gauss-Helmert model, our approach is more accurate and robust than the classical least-squares model.
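Motion-based extrinsic calibration of this kind is commonly posed as a hand-eye problem; the sketch below uses the standard textbook formulation (the symbols A, B, X and the constraint function g are conventional notation, not taken from the thesis itself):

```latex
% Hand-eye formulation of motion-based extrinsic calibration:
% A_i and B_i are the i-th ego-motions of the two sensors,
% X is the unknown extrinsic transform between them.
\mathbf{A}_i \mathbf{X} = \mathbf{X} \mathbf{B}_i , \qquad i = 1, \dots, n .
% A classical least-squares (Gauss-Markov) adjustment treats the
% measured motions as error-free inputs. The Gauss-Helmert model
% instead imposes an implicit constraint
g(\mathbf{x}, \boldsymbol{\ell} + \mathbf{v}) = \mathbf{0}
% jointly on the parameters x and the noisy observations l,
% estimating corrections v to the measurements as well, which is
% why it tends to be more accurate when both motion estimates are noisy.
```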
In the data fusion part, we focus on camera-laser data fusion and contribute two new ego-motion estimation algorithms that combine complementary information from a laser scanner and a monocular camera. The first algorithm utilizes camera image information to guide the laser scan-matching. It provides accurate motion estimates and yet works in general conditions, requiring neither a field-of-view overlap between the camera and laser scanner nor an initial guess of the motion parameters.
The second algorithm combines the camera and the laser scanner information in a direct way, assuming the field-of-view overlap between the sensors is substantial. By maximizing the information usage of both the sparse laser point cloud and the dense image, the second algorithm is able to achieve state-of-the-art estimation accuracy. Experimental results confirm that both algorithms offer excellent alternatives to state-of-the-art camera-laser ego-motion estimation algorithms.
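Direct camera-laser alignment of the kind described above is often formulated as a photometric error minimization over the motion; the following is a generic sketch under that assumption, not the thesis's exact objective:

```latex
% Direct alignment: laser points p_k are projected into the image
% via the camera projection pi; the rigid-body motion T in SE(3)
% is estimated by minimizing robust photometric residuals between
% the reference image I_1 and the current image I_2:
\hat{\mathbf{T}} = \arg\min_{\mathbf{T} \in SE(3)}
  \sum_{k} \rho\!\left(
    I_2\!\big(\pi(\mathbf{T}\,\mathbf{p}_k)\big)
    - I_1\!\big(\pi(\mathbf{p}_k)\big)
  \right),
% where rho is a robust loss (e.g. Huber) that downweights outliers.
```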
en
dc.format.extent128
dc.language.isoeng
dc.relation.ispartofseriesSchriftenreihe / Institut für Geodäsie und Geoinformation ; 64
dc.rightsIn Copyright
dc.rights.urihttp://rightsstatements.org/vocab/InC/1.0/
dc.subjectMobile Multi-Sensor Systems
dc.subjectCalibration
dc.subject.ddc526.982 Fotogrammetrie
dc.titleExtrinsic Calibration and Ego-Motion Estimation for Mobile Multi-Sensor Systems
dc.typeDissertation oder Habilitation
dc.publisher.nameRheinische Friedrich-Wilhelms-Universität Bonn, Landwirtschaftliche Fakultät, IGG - Institut für Geodäsie und Geoinformation
dc.publisher.locationBonn
dc.rights.accessRightsopenAccess
dc.relation.eissn2699-6685
dc.relation.urnhttps://nbn-resolving.org/nbn:de:hbz:5n-53094
ulbbn.pubtypeZweitveröffentlichung

