Quenzel, Jan: Efficient Real-Time Calibration and Odometry for Dense Multi-Modal Mapping with UAVs. - Bonn, 2025. - Dissertation, Rheinische Friedrich-Wilhelms-Universität Bonn.
Online edition in bonndoc: https://nbn-resolving.org/urn:nbn:de:hbz:5-86531
@phdthesis{handle:20.500.11811/13706,
urn = {https://nbn-resolving.org/urn:nbn:de:hbz:5-86531},
author = {Jan Quenzel},
title = {Efficient Real-Time Calibration and Odometry for Dense Multi-Modal Mapping with UAVs},
school = {Rheinische Friedrich-Wilhelms-Universität Bonn},
year = 2025,
month = nov,
note = {Autonomous robotic systems rely heavily on knowledge about their environment to navigate safely, interact with it, and perform search and rescue (SAR) and inspection tasks in real time. To understand its surroundings, a flying robot requires fast and robust perception, enabled by complementary sensors.
However, improper sensor calibration degrades localization accuracy and reconstruction quality, which may lead to failure of the overall system. The common photometric error assumes constant brightness, an assumption that is regularly violated in the real world and impairs the system's robustness. To restore photometric consistency, we extract small oriented patches at tracked ORB features and jointly estimate the photometric parameters on keyframes, including the exposure change. Our approach densely models the radial intensity fall-off due to vignetting and the camera response function with thin-plate splines (TPS) from sparse measurements. To further improve runtime, we establish correspondences via direct gradient-based metrics and propose a novel robust combination of gradient orientation and magnitude, applicable to visual SLAM as well as disparity and depth estimation.
Independent of ambient illumination, LiDARs provide accurate distance measurements around the robot even in texture-less environments. Our LiDAR-inertial odometry MARS therefore jointly aligns multi-resolution surfel maps with a Gaussian Mixture Model (GMM) formulation using a continuous-time B-spline trajectory. We accelerate covariance and GMM computation with Kronecker sums and products. An unscented transform (UT) de-skews surfels at runtime, while a temporal split into intra-scan segments facilitates motion compensation during spline optimization. Complementary soft constraints on relative poses from robot odometry and preintegrated IMU pseudo-measurements further improve the system's robustness and accuracy.
For high-level planning in dynamic environments, a signum occupancy function improves the reactivity of our mapping by maintaining a short temporal occupancy window in real time. In addition, we enrich our dense map with color, thermal signatures, and semantic information, using the spline trajectory for accurate, motion-compensated projection. Our semantic fusion further adapts a Bayesian update in logarithmic form for greater numerical stability.
The methods presented throughout this thesis achieve state-of-the-art results on various datasets. The resulting maps facilitate inspection and SAR while improving decision-making in downstream tasks. Moreover, our methods are applicable to general dense 3D mapping and localization with, e.g., car-mounted, robot-mounted, or handheld sensor suites.},
url = {https://hdl.handle.net/20.500.11811/13706}
}
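
For reference, here is a minimal sketch of how the entry can be cited from LaTeX, assuming the BibTeX record above is saved as references.bib (the filename is illustrative):

\documentclass{article}
% biblatex with the biber backend accepts the ':', '.', and '/' characters in the citation key.
\usepackage[backend=biber]{biblatex}
\addbibresource{references.bib} % assumed filename for the record above
\begin{document}
Dense multi-modal mapping with UAVs is treated in depth in~\cite{handle:20.500.11811/13706}.
\printbibliography
\end{document}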