Localization and 3D Reconstruction

Keyframe-based Thermal-Inertial Odometry

This work presents a keyframe-based thermal-inertial odometry estimation framework tailored to the data characteristics and concepts of operation of thermal cameras, which offer a potential means of penetrating darkness and dense obscurants. The developed framework was verified with respect to its resilience, performance, and ability to enable autonomous navigation in an extensive set of experimental studies, including multiple field deployments in severely degraded, dark, obscurant-filled underground mines.

Robust Thermal-Inertial Localization for Aerial Robots: A case for Direct Methods

This video presents a comparative study of state-of-the-art GPS-denied visual-inertial odometry frameworks. The comparison evaluates odometry frameworks applied to rescaled thermal camera data against methods that exploit the full radiometric information, namely a modified filter-based estimator and a recently proposed keyframe-based direct technique specifically designed for thermal-inertial localization.

Keyframe-based Direct Thermal-Inertial Odometry

In this work we present a method for the fusion of direct radiometric data from a thermal camera with inertial measurements to enable pose estimation of aerial robots. LWIR thermal vision is not affected by the lack of illumination or the presence of obscurants such as fog and dust, rendering it suitable for GPS-denied, dark, and obscurant-filled environments. In contrast to previous approaches, which use 8-bit rescaled thermal imagery as a complementary sensing modality to visual image data, our approach makes use of the full 14-bit radiometric data, making it generalizable to a variety of environments. Furthermore, our approach implements a keyframe-based joint optimization scheme, making odometry estimates robust against image data interruptions, which are common during the operation of thermal cameras due to the application of flat field corrections.
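
To illustrate the distinction between the two data representations, the minimal sketch below contrasts a conventional 8-bit min-max rescaling of a thermal frame with a direct photometric residual computed on the full radiometric counts. The normalization used here is an assumed example for illustration, not the exact formulation of the paper.

```python
import numpy as np

def rescale_to_8bit(radiometric):
    """Conventional pipeline: min-max rescale raw 14-bit counts to 8 bits,
    discarding absolute radiometric information (illustrative only)."""
    lo, hi = float(radiometric.min()), float(radiometric.max())
    return ((radiometric - lo) / max(hi - lo, 1.0) * 255.0).astype(np.uint8)

def radiometric_residual(keyframe_patch, current_patch):
    """Direct-method-style residual on full radiometric values: difference of
    mean/std-normalized patches between a keyframe and the current frame
    (an assumed normalization scheme, purely for illustration)."""
    ref = (keyframe_patch - keyframe_patch.mean()) / (keyframe_patch.std() + 1e-6)
    cur = (current_patch - current_patch.mean()) / (current_patch.std() + 1e-6)
    return (ref - cur).ravel()
```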

Visual-Thermal Landmarks and Inertial Fusion for Navigation in Degraded Visual Environments

In this work we present a multi-sensor fusion algorithm for reliable odometry estimation in GPS-denied and degraded visual environments. The proposed method utilizes information from both the visible and thermal spectra for landmark selection and prioritizes feature extraction from informative image regions based on a metric over spatial entropy. Furthermore, inertial sensing cues are integrated to improve the robustness of the odometry estimation process.
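
As a rough illustration of entropy-based region prioritization (the exact metric and grid layout used in the paper are not specified here, so the choices below are assumptions), one can score regions of a single-channel image by the Shannon entropy of their intensity histograms and extract features from the highest-scoring regions first:

```python
import numpy as np

def patch_entropy(patch, bins=32):
    """Shannon entropy of the intensity histogram of an image patch."""
    hist, _ = np.histogram(patch, bins=bins)
    p = hist[hist > 0].astype(float)
    p /= p.sum()
    return float(-(p * np.log2(p)).sum())

def rank_regions_by_entropy(image, grid=8):
    """Divide a single-channel image into a grid of regions and rank them by
    entropy, so feature extraction can be prioritized in the most informative ones."""
    h, w = image.shape
    scores = []
    for i in range(grid):
        for j in range(grid):
            patch = image[i * h // grid:(i + 1) * h // grid,
                          j * w // grid:(j + 1) * w // grid]
            scores.append(((i, j), patch_entropy(patch)))
    return sorted(scores, key=lambda s: s[1], reverse=True)
```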

Thermal-Inertial Localization for Autonomous Navigation of Aerial Robots through Obscurants

In this video, the problem of GPS-denied aerial robot navigation through obscurants is considered. Through the fusion of thermal camera and IMU data, estimation of the robot trajectory in such degraded visual environments is achieved. The system is demonstrated in a heavily smoke-filled machine shop environment. Enabling the navigation of aerial robots through obscurants can prove critical in applications such as search and rescue in underground mines, as well as in a variety of real-life surveillance scenarios.

Vision-Depth Landmarks and Inertial Fusion for Navigation in Degraded Visual Environments

In this work, a method for tight fusion of visual, depth, and inertial data for autonomous navigation in GPS-denied, poorly illuminated, and textureless environments is proposed. Visual and depth information are fused at the feature detection and descriptor extraction levels to augment one sensing modality with the other. These multimodal features are then further integrated with inertial sensor cues using an extended Kalman filter to estimate the robot pose, sensor bias terms, extrinsic calibration parameters, and landmark positions simultaneously as part of the filter state.
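
A minimal sketch of such a filter state and its IMU-driven prediction step is given below; the state ordering, dimensions, and simplified kinematics are assumptions for illustration and do not reproduce the paper's exact formulation (gravity compensation, orientation propagation, and bias random walks are omitted for brevity).

```python
import numpy as np

# Assumed state layout (illustrative): position (3), velocity (3),
# orientation error (3), gyro bias (3), accelerometer bias (3),
# camera-IMU extrinsics (6), plus 3 entries per tracked landmark.
CORE_DIM = 21

def make_state(num_landmarks):
    """Allocate the filter state vector and its covariance."""
    dim = CORE_DIM + 3 * num_landmarks
    return np.zeros(dim), np.eye(dim) * 1e-3

def propagate(x, P, accel, dt, Q):
    """Simplified IMU prediction: integrate position/velocity with the
    bias-corrected acceleration and propagate the covariance linearly."""
    F = np.eye(P.shape[0])
    F[0:3, 3:6] = np.eye(3) * dt          # position is driven by velocity
    x[0:3] += x[3:6] * dt
    x[3:6] += (accel - x[12:15]) * dt      # subtract accelerometer bias
    P = F @ P @ F.T + Q
    return x, P
```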

Visual-inertial odometry-enhanced geometrically stable ICP

This video presents our work on autonomous exploration and mapping of dark, visually-degraded environments. The system employs a NIR visual-inertial localization system augmented with a 3D time-of-flight depth sensor. Exploiting an uncertainty-aware receding horizon exploration and mapping planner, the robot operates autonomously in environments for which no prior knowledge is available.
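
The sketch below is a generic illustration of how a visual-inertial pose estimate can seed a point-to-point ICP alignment of successive depth scans; the actual system's registration pipeline is not detailed on this page, and all names and tolerances here are assumptions.

```python
import numpy as np

def icp_with_vio_prior(src, dst, T_init, iters=20):
    """Point-to-point ICP seeded with a visual-inertial odometry pose prior.
    src, dst: (N,3) and (M,3) point clouds; T_init: 4x4 prior transform.
    Nearest-neighbor search is brute force, purely for illustration."""
    T = T_init.copy()
    for _ in range(iters):
        src_t = (T[:3, :3] @ src.T).T + T[:3, 3]
        # brute-force nearest neighbors in the target cloud
        d2 = ((src_t[:, None, :] - dst[None, :, :]) ** 2).sum(-1)
        nn = dst[d2.argmin(axis=1)]
        # closed-form rigid alignment (Kabsch / SVD)
        mu_s, mu_d = src_t.mean(0), nn.mean(0)
        H = (src_t - mu_s).T @ (nn - mu_d)
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:           # guard against reflections
            Vt[-1] *= -1
            R = Vt.T @ U.T
        t = mu_d - R @ mu_s
        dT = np.eye(4)
        dT[:3, :3], dT[:3, 3] = R, t
        T = dT @ T
    return T
```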

A Multi-Modal Mapping Unit for Autonomous Exploration and Mapping of Underground Tunnels

This video presents results on autonomous exploration and mapping of tunnel environments using the developed Multi-Modal Mapping Unit (M3U). The M3U tightly synchronizes a stereo camera pair with an Inertial Measurement Unit and super-bright flashing LEDs, while further fusing time-of-flight 3D depth sensors. It allows for GPS-denied localization and mapping in visually-degraded environments.

A Multi-Modal Mapping Unit for Autonomous Robotic Exploration in Visually-degraded Environments

This work presents the results of the development of a multi-modal mapping unit combined with an uncertainty-aware planner for exploration and mapping in dark, visually-degraded environments.
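
The M3U performs sensor synchronization in hardware; purely to illustrate the downstream bookkeeping of matching camera frames to trigger/IMU timestamps, the sketch below (with an assumed tolerance and NumPy timestamp arrays in seconds) pairs each frame with its nearest timestamp. It is not the M3U's actual mechanism.

```python
import numpy as np

def associate_frames(cam_stamps, trig_stamps, tol=0.005):
    """Pair each camera frame with the closest trigger/IMU timestamp;
    pairs farther apart than `tol` seconds are treated as unsynchronized."""
    idx = np.searchsorted(trig_stamps, cam_stamps)
    idx = np.clip(idx, 1, len(trig_stamps) - 1)
    left, right = trig_stamps[idx - 1], trig_stamps[idx]
    nearest = np.where(cam_stamps - left < right - cam_stamps, idx - 1, idx)
    dt = np.abs(trig_stamps[nearest] - cam_stamps)
    return [(i, int(j)) for i, (j, d) in enumerate(zip(nearest, dt)) if d <= tol]
```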

LiDAR-Visual-Inertial Fused Mapping for Cars - Preliminary Result - UNR Garage

This is a preliminary result with a system consisting of a Velodyne PuckLITE, a stereo camera, and an Inertial Measurement Unit. The data are processed through two pipelines, namely LiDAR odometry and visual-inertial odometry. The final estimate is derived by fusing both through an EKF together with direct IMU feeds. This map presents the depth data from the combination of the LiDAR and the stereo camera system. For the camera system, a pruning distance of 2 m is set.
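
A bare-bones sketch of this kind of fusion is given below: an IMU-driven prediction followed by one EKF measurement update per odometry source. The matrices, noise values, and symbol names are placeholders, not those of the actual system.

```python
import numpy as np

def ekf_update(x, P, z, H, R):
    """Standard EKF measurement update, applied once per odometry source."""
    y = z - H @ x                          # innovation
    S = H @ P @ H.T + R                    # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
    x = x + K @ y
    P = (np.eye(P.shape[0]) - K @ H) @ P
    return x, P

# Illustrative fusion loop (placeholder symbols): the IMU drives the
# prediction, while the LiDAR odometry and visual-inertial odometry
# pipelines each contribute a pose measurement with its own covariance.
#   x, P = propagate(x, P, imu_accel, dt, Q)               # IMU prediction
#   x, P = ekf_update(x, P, z_lidar_pose, H_pose, R_lidar)  # LiDAR odometry
#   x, P = ekf_update(x, P, z_vio_pose,   H_pose, R_vio)    # visual-inertial
```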

LiDAR-Visual-Inertial Localization & Mapping for Cars - Preliminary Result - Reno@Night

This is a preliminary result with a system consisting of a Velodyne PuckLITE, a stereo camera, and an Inertial Measurement Unit. The data are processed through two pipelines, namely LiDAR odometry and visual-inertial odometry. The final estimate is derived by fusing both through an EKF together with direct IMU feeds. This map presents only the LiDAR data.

Preliminary work on mapping in Visually-degraded Environments

These are preliminary results on exploration and mapping inside a tunnel with approximate dimensions (width x height x length) of 3 x 4 x 30 m. The robot navigated along approximately half of its length, and our team evaluated the capability of localization and mapping using a NIR camera/IMU, as well as laser time-of-flight sensors.
Relevant publications:
  • C. Papachristos, D. Tzoumanikas, K. Alexis, A. Tzes, "Autonomous Robotic Aerial Tracking, Avoidance, and Seeking of a Mobile Human Subject", International Symposium on Visual Computing (ISVC), Las Vegas, NV, USA, 2015
  • C. Papachristos, S. Khattak, K. Alexis, "Autonomous Exploration of Visually-Degraded Environments using Aerial Robots", International Conference on Unmanned Aircraft Systems (ICUAS), 2017
  • F. Mascarich, S. Khattak, C. Papachristos, K. Alexis, "A Multi-Modal Mapping Unit for Autonomous Exploration and Mapping of Underground Tunnels", IEEE Aerospace Conference (AeroConf) 2018, Yellowstone Conference Center, Big Sky, Montana, March 3-10, 2018
  • C. Papachristos, K. Alexis, "Thermal-Inertial Localization for Autonomous Navigation of Aerial Robots through Obscurants", International Conference on Unmanned Aircraft Systems (ICUAS), Dallas, TX, USA, 2018
  • S. Khattak, C. Papachristos, K. Alexis, "Marker based Thermal-Inertial Localization for Aerial Robots in Obscurant Filled Environments", International Symposium on Visual Computing (ISVC), Las Vegas, NV, USA, November 19-21, 2018
  • S. Khattak, C. Papachristos, K. Alexis, "Vision-Depth Landmarks and Inertial Fusion for Navigation in Degraded Visual Environments", International Symposium on Visual Computing (ISVC), Las Vegas, NV, USA, November 19-21, 2018
  • C. Papachristos, S. Khattak, F. Mascarich, K. Alexis, "Autonomous Navigation and Mapping in Underground Mines Using Aerial Robots", IEEE Aerospace Conference (AeroConf) 2019, Yellowstone Conference Center, Big Sky, Montana, March 2-9, 2019
  • S. Khattak, C. Papachristos, K. Alexis, "Visual-Thermal Landmarks and Inertial Fusion for Navigation in Degraded Visual Environments", IEEE Aerospace Conference (AeroConf) 2019, Yellowstone Conference Center, Big Sky, Montana, March 2-9, 2019 (Best Paper Award in the "Air Vehicle Systems and Technologies" Track)
  • S. Khattak, C. Papachristos, K. Alexis, "Keyframe-based Direct Thermal-Inertial Odometry", IEEE International Conference on Robotics and Automation (ICRA), Montreal, Canada, May 20-24, 2019
  • S. Khattak, F. Mascarich, T. Dang, C. Papachristos, K. Alexis, "Robust Thermal-Inertial Localization for Aerial Robots: A Case for Direct Methods", International Conference on Unmanned Aircraft Systems (ICUAS), Atlanta, GA, USA, June 11-14, 2019