Robot Perception
Degradation Resilient LiDAR-Radar-Inertial Odometry
The proposed approach combines the modalities in a factor graph-based windowed smoother, with sensor-information-specific factor formulations that enable, in the case of degeneracy, partial information to be conveyed to the graph along the non-degenerate axes only.
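To make the degeneracy handling concrete, below is a minimal numpy sketch (illustrative only; the names, thresholds, and update rule are assumptions, not the paper's exact formulation). Eigendirections of a factor's approximate Hessian with near-zero eigenvalues are treated as degenerate, and the update is projected onto the well-constrained complement.

```python
import numpy as np

def nondegenerate_projector(J, eps=1e-3):
    """Projector onto the well-constrained state directions of one factor.

    J: stacked measurement Jacobian (m x n). Eigendirections of H = J^T J
    with eigenvalues below eps are treated as degenerate (unobservable),
    so only partial information is conveyed along the remaining axes.
    """
    w, V = np.linalg.eigh(J.T @ J)       # eigenvalues in ascending order
    V_good = V[:, w > eps]               # well-constrained eigenvectors
    return V_good @ V_good.T             # n x n projector

def gauss_newton_step(J, r, eps=1e-3, damping=1e-9):
    """One damped Gauss-Newton step restricted to non-degenerate axes."""
    H = J.T @ J + damping * np.eye(J.shape[1])
    dx = np.linalg.solve(H, -J.T @ r)    # naive full-space step
    return nondegenerate_projector(J, eps) @ dx

# Toy example: a 2-D position constrained only along x (y is degenerate).
J = np.array([[1.0, 0.0]])               # the measurement observes x only
r = np.array([0.5])                      # residual along x
print(gauss_newton_step(J, r))           # -> approx [-0.5, 0.]; no y update
```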
|
An Online Self-calibrating Refractive Camera Model with Application to Underwater Odometry
This work presents a camera model for refractive media such as water and its application in underwater visual-inertial odometry. The model self-calibrates in real time and requires neither known correspondences nor calibration targets.
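For intuition, here is a minimal numpy sketch of the underlying physics: back-projecting a pixel through a flat refractive interface using Snell's law. This is a simplified single-interface model with illustrative parameters (a real housing adds a glass layer, and the paper's model and its online calibration are more involved).

```python
import numpy as np

def refract(d, n, n1, n2):
    """Refract unit direction d at an interface with unit normal n (Snell's law)."""
    cos_i = -np.dot(n, d)
    eta = n1 / n2
    k = 1.0 - eta**2 * (1.0 - cos_i**2)
    if k < 0.0:
        return None                        # total internal reflection
    return eta * d + (eta * cos_i - np.sqrt(k)) * n

def backproject(pixel, K, interface_dist=0.01, n_air=1.0, n_water=1.33):
    """Ray from a pinhole camera behind a flat port into water.

    pixel: (u, v); K: 3x3 intrinsics. The flat interface is assumed at
    z = interface_dist, perpendicular to the optical axis.
    """
    uv1 = np.array([pixel[0], pixel[1], 1.0])
    d = np.linalg.inv(K) @ uv1
    d /= np.linalg.norm(d)                 # in-air ray direction
    p = d * (interface_dist / d[2])        # intersection with the port plane
    n = np.array([0.0, 0.0, -1.0])         # interface normal toward the camera
    d_w = refract(d, n, n_air, n_water)    # in-water ray direction
    return p, d_w / np.linalg.norm(d_w)

K = np.array([[400.0, 0, 320], [0, 400.0, 240], [0, 0, 1]])
origin, direction = backproject((100, 80), K)
print(origin, direction)                   # ray origin on the port, bent direction
```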
|
MIMOSA: A Multi-Modal SLAM Framework for Resilient Autonomy against Sensor Degradation
This work presents a framework for Multi-Modal SLAM (MIMOSA) that utilizes a nonlinear factor graph as the underlying representation to provide loosely-coupled fusion of any number of sensing modalities.
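As an illustration of loosely-coupled factor-graph fusion, below is a minimal sketch using GTSAM's Python bindings. The choice of GTSAM, the noise sigmas, and the odometry increments are assumptions made for the example, not necessarily MIMOSA's actual implementation: each modality contributes its own relative-pose estimate as a between factor, weighted by a per-modality noise model.

```python
import gtsam
import numpy as np

# Per-modality noise: inflate sigmas for a degraded/less trusted modality.
# Sigma order for Pose3 factors: (roll, pitch, yaw, x, y, z).
lidar_noise  = gtsam.noiseModel.Diagonal.Sigmas(np.array([0.01]*3 + [0.05]*3))
visual_noise = gtsam.noiseModel.Diagonal.Sigmas(np.array([0.02]*3 + [0.10]*3))

graph = gtsam.NonlinearFactorGraph()
graph.add(gtsam.PriorFactorPose3(
    0, gtsam.Pose3(),
    gtsam.noiseModel.Diagonal.Sigmas(np.array([1e-4]*6))))

# Loosely-coupled fusion: each modality adds its own odometry increment
# between the same pair of consecutive states.
lidar_delta  = gtsam.Pose3(gtsam.Rot3(), gtsam.Point3(1.00, 0.0, 0.0))
visual_delta = gtsam.Pose3(gtsam.Rot3(), gtsam.Point3(0.95, 0.0, 0.0))
graph.add(gtsam.BetweenFactorPose3(0, 1, lidar_delta, lidar_noise))
graph.add(gtsam.BetweenFactorPose3(0, 1, visual_delta, visual_noise))

initial = gtsam.Values()
initial.insert(0, gtsam.Pose3())
initial.insert(1, gtsam.Pose3(gtsam.Rot3(), gtsam.Point3(0.9, 0.0, 0.0)))

result = gtsam.LevenbergMarquardtOptimizer(graph, initial).optimize()
print(result.atPose3(1).translation())  # fused estimate, weighted toward LiDAR
```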
|
Keyframe-based Thermal-Inertial Odometry
This work presents a keyframe-based thermal-inertial odometry estimation framework tailored to the data characteristics and concepts of operation of thermal cameras, which offer a potential means to penetrate and overcome conditions of darkness and a strong presence of obscurants. The framework was verified with respect to its resilience, performance, and ability to enable autonomous navigation in an extensive set of experimental studies, including multiple field deployments in severely degraded, dark, and obscurant-filled underground mines.
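A typical keyframe-selection rule can be sketched as follows (a minimal sketch under assumptions; the thresholds and the tracked-feature criterion are illustrative, not the paper's exact policy):

```python
import numpy as np

def is_new_keyframe(dp, dtheta, tracked_ratio,
                    trans_thresh=0.3, rot_thresh=np.deg2rad(15),
                    track_thresh=0.6):
    """Decide whether the current frame should become a keyframe.

    dp: translation (m) since the last keyframe; dtheta: rotation (rad);
    tracked_ratio: fraction of the last keyframe's features still tracked.
    A new keyframe is spawned when motion is large or tracking decays,
    e.g., after a flat-field correction interrupts the thermal stream.
    """
    return (np.linalg.norm(dp) > trans_thresh
            or abs(dtheta) > rot_thresh
            or tracked_ratio < track_thresh)

print(is_new_keyframe(np.array([0.1, 0.0, 0.0]), 0.05, 0.4))  # True: tracking decayed
```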
|
Robust Thermal-Inertial Localization for Aerial Robots: A Case for Direct Methods
This video presents a comparative study of state-of-the-art GPS-denied odometry frameworks. The comparison pits visual-inertial frameworks applied to rescaled thermal camera data against methods that exploit the full radiometric information: a modified filter-based estimator and a recently proposed keyframe-based direct technique specifically designed for thermal-inertial localization.
|
Keyframe-based Direct Thermal-Inertial Odometry
In this work we present a method for the fusion of direct radiometric data from a thermal camera with inertial measurements to enable pose estimation of aerial robots. LWIR thermal vision is not affected by the lack of illumination or the presence of obscurants such as fog and dust, rendering it suitable for GPS-denied, dark, and obscurant-filled environments. In contrast to previous approaches, which use 8-bit rescaled thermal imagery as a complementary sensing modality to visual image data, our approach exploits the full 14-bit radiometric data, making it generalizable to a variety of environments. Furthermore, our approach implements a keyframe-based joint optimization scheme, making odometry estimates robust against image data interruptions, which are common during the operation of thermal cameras due to the application of flat-field corrections.
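A direct photometric residual on full radiometric data can be sketched as follows (a minimal numpy sketch; the per-frame normalization and the Huber weighting are illustrative assumptions, not the paper's exact formulation):

```python
import numpy as np

def radiometric_residuals(I_ref, I_cur, pts_ref, pts_cur):
    """Direct photometric residuals between 14-bit radiometric frames.

    Raw counts (0..16383) are normalized per frame so the residual is
    insensitive to global temperature drift; no lossy 8-bit rescaling.
    pts_* are (N, 2) integer pixel coordinates (x, y) of correspondences.
    """
    def normalize(I):
        I = I.astype(np.float64)
        return (I - I.mean()) / (I.std() + 1e-9)
    A, B = normalize(I_ref), normalize(I_cur)
    r = A[pts_ref[:, 1], pts_ref[:, 0]] - B[pts_cur[:, 1], pts_cur[:, 0]]
    # Huber weights keep outliers (hot spots, FFC artifacts) from dominating.
    k = 1.345
    w = np.where(np.abs(r) <= k, 1.0, k / np.abs(r))
    return r, w

I0 = np.random.randint(0, 2**14, (240, 320))
I1 = I0 + 200                                   # global offset: cancelled by normalization
pts = np.array([[10, 20], [100, 50]])
print(radiometric_residuals(I0, I1, pts, pts))  # residuals near zero
```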
|
Visual-Thermal Landmarks and Inertial Fusion for Navigation in Degraded Visual Environments
In this work we present a multi-sensor fusion algorithm for reliable odometry estimation in GPS-denied and degraded visual environments. The proposed method utilizes information from both the visible and thermal spectra for landmark selection and prioritizes feature extraction from informative image regions based on a metric over spatial entropy. Furthermore, inertial sensing cues are integrated to improve the robustness of the odometry estimation process.
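Entropy-based region prioritization can be sketched in a few lines (a minimal numpy sketch; the grid size and bin count are illustrative choices, not the paper's exact parameters):

```python
import numpy as np

def patch_entropy(patch, bins=32):
    """Shannon entropy (bits) of a grayscale patch's intensity histogram."""
    hist, _ = np.histogram(patch, bins=bins, range=(0, 256))
    p = hist / max(hist.sum(), 1)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def rank_regions(image, grid=8):
    """Score a grid of image regions by entropy; features are extracted
    from the highest-entropy (most informative) regions first."""
    h, w = image.shape
    cells = []
    for i in range(grid):
        for j in range(grid):
            patch = image[i*h//grid:(i+1)*h//grid, j*w//grid:(j+1)*w//grid]
            cells.append(((i, j), patch_entropy(patch)))
    return sorted(cells, key=lambda c: -c[1])

img = np.zeros((256, 256), dtype=np.uint8)
img[:128, :128] = np.random.randint(0, 256, (128, 128))  # textured quadrant
print(rank_regions(img)[0])  # a cell from the textured (high-entropy) quadrant
```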
|
Thermal-Inertial Localization for Autonomous Navigation of Aerial Robots through Obscurants
In this video, the problem of GPS-denied aerial robot navigation through obscurants is considered. Through the fusion of thermal camera and IMU data, estimation of the robot trajectory in such degraded visual environments is achieved. The system is demonstrated in a heavily smoke-filled machine shop environment. Enabling the navigation of aerial robots through obscurants can prove critical in applications such as search and rescue in underground mines, as well as in many real-life surveillance scenarios.
|
Vision-Depth Landmarks and Inertial Fusion for Navigation in Degraded Visual Environments
In this work, a method for the tight fusion of visual, depth, and inertial data for autonomous navigation in GPS-denied, poorly illuminated, and textureless environments is proposed. Visual and depth information are fused at the feature detection and descriptor extraction levels to augment one sensing modality with the other. These multimodal features are then further integrated with inertial sensor cues using an extended Kalman filter to estimate the robot pose, sensor bias terms, extrinsic calibration parameters, and landmark positions simultaneously as part of the filter state.
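The filter-state composition and a standard EKF measurement update can be sketched as follows (an illustrative layout under assumptions; the exact state ordering, dimensions, and measurement model of the paper may differ):

```python
import numpy as np

# Illustrative state layout: robot pose, velocity, IMU biases,
# camera-depth extrinsics, and L landmark positions, estimated jointly.
L = 3
DIMS = {"pose": 6, "vel": 3, "bias_gyro": 3, "bias_acc": 3,
        "extrinsics": 6, "landmarks": 3 * L}
N = sum(DIMS.values())

def ekf_update(x, P, z, h, H, R):
    """Standard EKF measurement update: z ~ h(x) with Jacobian H."""
    S = H @ P @ H.T + R                      # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)           # Kalman gain
    x = x + K @ (z - h(x))                   # state correction
    P = (np.eye(len(x)) - K @ H) @ P         # covariance update
    return x, P

# Toy: observe the first landmark's position directly.
x, P = np.zeros(N), np.eye(N)
off = N - 3 * L                              # index of the first landmark
H = np.zeros((3, N)); H[:, off:off+3] = np.eye(3)
z = np.array([1.0, 2.0, 0.5])
x, P = ekf_update(x, P, z, lambda s: s[off:off+3], H, 0.01 * np.eye(3))
print(x[off:off+3])                          # moved toward the measurement
```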
|
Visual-Inertial Odometry-enhanced Geometrically Stable ICP
This video presents our work on the autonomous exploration and mapping of dark, visually-degraded environments. The system employs a NIR visual-inertial localization system augmented with a 3D time-of-flight depth sensor. Exploiting an uncertainty-aware receding horizon exploration and mapping planner, the robot operates autonomously in environments for which no prior knowledge is available.
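One common way to enforce geometric stability in ICP is to analyze the conditioning of the point-to-plane normal equations and update only the well-constrained directions, leaving degenerate ones to the visual-inertial prior. The sketch below illustrates that idea under assumptions (it is not necessarily the exact method in the video):

```python
import numpy as np

def point_to_plane_system(src, dst, normals):
    """Build the 6x6 normal system of linearized point-to-plane ICP."""
    A = np.hstack([np.cross(src, normals), normals])   # (N, 6) Jacobian rows
    b = np.einsum('ij,ij->i', dst - src, normals)      # signed distances
    return A.T @ A, A.T @ b

def stable_icp_step(src, dst, normals, cond_thresh=1e-5):
    """One ICP step restricted to well-constrained directions; degenerate
    directions (e.g., along a featureless corridor) are left untouched."""
    H, g = point_to_plane_system(src, dst, normals)
    w, V = np.linalg.eigh(H)
    keep = w > cond_thresh * w.max()                   # drop weak eigenmodes
    dx = V[:, keep] @ ((V[:, keep].T @ g) / w[keep])   # pseudo-inverse solve
    return dx                                          # (rx, ry, rz, tx, ty, tz)

# Toy: points on the single plane z=0 constrain only (rx, ry, tz).
src = np.random.rand(200, 3); src[:, 2] = 0.0
normals = np.tile([0.0, 0.0, 1.0], (200, 1))
dst = src + [0.0, 0.0, 0.1]                            # true offset in z
print(np.round(stable_icp_step(src, dst, normals), 3)) # tz ~ 0.1, rest ~ 0
```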
|
LiDAR-Visual-Inertial Fused Mapping for Cars - Preliminary Result - UNR Garage
This is a preliminary result from a system consisting of a Velodyne PuckLITE, a stereo camera, and an Inertial Measurement Unit. The data are processed through two pipelines, namely LiDAR odometry and visual-inertial odometry, whose results are fused through an EKF together with direct IMU feeds. The map presents the depth data from the combination of the LiDAR and the stereo camera system; for the camera system, a pruning distance of 2 m is set.
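The depth-combination step with range pruning can be sketched as follows (a minimal sketch; only the 2 m pruning distance comes from the experiment, the rest is illustrative):

```python
import numpy as np

def merge_depth_points(stereo_pts, lidar_pts, prune_dist=2.0):
    """Combine stereo and LiDAR points for mapping.

    Stereo depth degrades quadratically with range, so stereo points
    beyond `prune_dist` (2 m here, as in the experiment) are discarded
    and the far field is covered by the LiDAR alone.
    """
    ranges = np.linalg.norm(stereo_pts, axis=1)
    near_stereo = stereo_pts[ranges <= prune_dist]
    return np.vstack([near_stereo, lidar_pts])

stereo = np.array([[0.5, 0.0, 1.0], [1.0, 2.0, 5.0]])  # second point is too far
lidar = np.array([[3.0, 0.0, 10.0]])
print(merge_depth_points(stereo, lidar))                # far stereo point pruned
```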
|
Relevant publications:
- C. Papachristos, D. Tzoumanikas, K. Alexis, A. Tzes, "Autonomous Robotic Aerial Tracking, Avoidance, and Seeking of a Mobile Human Subject", International Symposium on Visual Computing (ISVC), Las Vegas, NV, USA, 2015
- C. Papachristos, S. Khattak, K. Alexis, "Autonomous Exploration of Visually-Degraded Environments using Aerial Robots", International Conference on Unmanned Aircraft Systems (ICUAS), 2017
- F. Mascarich, S. Khattak, C. Papachristos, K. Alexis, "A Multi-Modal Mapping Unit for Autonomous Exploration and Mapping of Underground Tunnels", IEEE Aerospace Conference (AeroConf) 2018, Yellowstone Conference, Big Sky, Montana, March 3-10, 2018
- C. Papachristos, K. Alexis, "Thermal-Inertial Localization for Autonomous Navigation of Aerial Robots through Obscurants", International Conference on Unmanned Aircraft Systems (ICUAS), Dallas, TX, USA, 2018
- S. Khattak, C. Papachristos, K. Alexis, "Marker based Thermal-Inertial Localization for Aerial Robots in Obscurant Filled Environments", International Symposium on Visual Computing (ISVC), Las Vegas, November 19-21, 2018
- S. Khattak, C. Papachristos, K. Alexis, "Vision-Depth Landmarks and Inertial Fusion for Navigation in Degraded Visual Environments", International Symposium on Visual Computing (ISVC), Las Vegas, November 19-21, 2018
- C. Papachristos, S. Khattak, F. Mascarich, K. Alexis, "Autonomous Navigation and Mapping in Underground Mines Using Aerial Robots", IEEE Aerospace Conference (AeroConf) 2019, Yellowstone Conference, Big Sky, Montana, March 2-9, 2019
- S. Khattak, C. Papachristos, K. Alexis, "Visual-Thermal Landmarks and Inertial Fusion for Navigation in Degraded Visual Environments", IEEE Aerospace Conference (AeroConf) 2019, Yellowstone Conference, Big Sky, Montana, March 2-9, 2019 (Best Paper Award in the "Air Vehicle Systems and Technologies" Track)
- S. Khattak, C. Papachristos, K. Alexis, "Keyframe-based Direct Thermal-Inertial Odometry", IEEE International Conference on Robotics and Automation (ICRA), May 20-24, 2019, Montreal, Canada
- S. Khattak, F. Mascarich, T. Dang, C. Papachristos, K. Alexis, "Robust Thermal-Inertial Localization for Aerial Robots: A Case for Direct Methods", International Conference on Unmanned Aircraft Systems (ICUAS), June 11-14, 2019, Atlanta, GA, USA