Our paper on thermal-inertial navigation:
- C. Papachristos, K. Alexis, "Thermal-Inertial Localization for Autonomous Navigation of Aerial Robots through Obscurants", International Conference on Unmanned Aircraft Systems (ICUAS), Dallas, TX, USA, 2018
We are organizing a workshop on "Autonomous Navigation for Aerial Robots in Extreme Environments: From Subterranean Environments to the Arctic" at ICUAS 2018 (http://www.uasconferences.com/).
Tutorial Summary: Progress in autonomous aerial robots has enabled their wide utilization in a variety of important applications such as infrastructure monitoring and precision agriculture. At the same time, however, their truly ubiquitous use and their integration into the most challenging environments and important use cases depend on their ability to navigate in extreme conditions. In this tutorial we consider two main examples, namely a) subterranean navigation for rotorcraft aerial robots (operating individually or in teams), and b) long-endurance fixed-wing flight over the Arctic. In that context, this tutorial overviews the required advances in robotic perception, state estimation, planning, control, and vehicle design that enable autonomous systems to seamlessly navigate, explore, and map such challenging environments.
From a technological standpoint the focus is on three major challenges: a) degraded sensing, b) austere navigation, and c) long-term autonomy and endurance. In terms of degraded sensing, emphasis is on environments in which exteroceptive or proprioceptive sensing (or both) may provide weak and ill-conditioned information. Characteristic examples include dark tunnels and caves, dust- and smoke-filled mines, and long-term autonomous flight above the Arctic subject to weak magnetometer readings and poor GNSS satellite geometry. By austere navigation, we particularly refer to complex underground environments such as mines with narrow ore passes, tunnels, and more. Finally, in terms of long-term autonomy we refer to systems that, on the one hand, possess extended endurance capabilities and, on the other hand, have robust state estimation, control, and planning that allow them to operate reliably for extended periods of time.
The tutorial begins its presentation and discussion from the research experiences of its organizers, which include: a) extensive autonomous exploration and mapping missions inside visually degraded (darkness, haze) mines and tunnels using aerial robots, b) agile navigation using micro aerial robots within cluttered environments, and c) multi-hour solar-powered UAV flight in the Arctic region for environmental research purposes. Additional experience in the following areas is also considered: a) multi-modal Simultaneous Localization and Mapping through camera, LiDAR, and IR vision fusion, b) belief- and saliency-aware autonomous exploration, c) agile flight control, d) multi-robot teaming with simultaneous ultra-wideband-based localization, e) specialized aerial robot design for aggressive flight, f) long-endurance solar-powered unmanned aircraft design, and g) robust state estimation over the Arctic zone. Beyond the presentation of current and previous results, the tutorial contributes to organizing and defining the core research directions that can allow flying robots to offer advanced levels of robustness, resiliency, and multi-agent reconfigurability when operating in the most extreme conditions and environments, for example those found in subterranean settings.
The work of our lab and of Shehryar Khattak, PhD candidate working on multi-modal perception, is featured in the recent video on graduate research at the Computer Science & Engineering Department of the University of Nevada, Reno.
In this work, a method for the tight fusion of visual, depth, and inertial data for autonomous navigation in GPS-denied, poorly illuminated, and textureless environments is proposed. Visual and depth information are fused at the feature detection and descriptor extraction levels to augment one sensing modality with the other. These multi-modal features are then further integrated with inertial sensor cues using an extended Kalman filter that simultaneously estimates the robot pose, sensor bias terms, extrinsic calibration parameters, and landmark positions as part of the filter state. The proposed algorithm is shown to enable reliable navigation of a Micro Aerial Vehicle in challenging visually degraded environments using RGB-D information from a RealSense D435 depth camera and an IMU.
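To make the filter structure concrete, the following is a minimal Python sketch of an EKF whose state stacks pose, IMU biases, extrinsic parameters, and landmark positions, as described above. It is an illustrative assumption, not the paper's implementation: the class name, the simplified linear dynamics (gravity and full attitude handling omitted), and the direct landmark observation model are all hypothetical.

```python
import numpy as np

# Hypothetical, simplified state layout inspired by the description above:
# [position(3), velocity(3), orientation as rotation vector(3),
#  gyro bias(3), accel bias(3), extrinsic translation(3), landmarks(3*N)]
N_LANDMARKS = 4
STATE_DIM = 18 + 3 * N_LANDMARKS

class SimpleVioEkf:
    def __init__(self):
        self.x = np.zeros(STATE_DIM)          # state mean
        self.P = np.eye(STATE_DIM) * 1e-2     # state covariance

    def propagate(self, accel, gyro, dt):
        """IMU-driven prediction: integrate bias-corrected accel/gyro."""
        p, v = self.x[0:3], self.x[3:6]
        bg, ba = self.x[9:12], self.x[12:15]
        a = accel - ba                        # bias-corrected acceleration
        w = gyro - bg                         # bias-corrected angular rate
        self.x[0:3] = p + v * dt + 0.5 * a * dt**2
        self.x[3:6] = v + a * dt
        self.x[6:9] = self.x[6:9] + w * dt    # small-angle orientation update
        # Jacobian of the (simplified) motion model w.r.t. the state.
        F = np.eye(STATE_DIM)
        F[0:3, 3:6] = np.eye(3) * dt
        F[3:6, 12:15] = -np.eye(3) * dt
        Q = np.eye(STATE_DIM) * 1e-4          # process noise (tuned in practice)
        self.P = F @ self.P @ F.T + Q

    def update_landmark(self, idx, z, R):
        """Correction from a 3D landmark observation (world frame)."""
        s = 18 + 3 * idx
        H = np.zeros((3, STATE_DIM))
        H[:, s:s + 3] = np.eye(3)             # observe landmark position directly
        y = z - self.x[s:s + 3]               # innovation
        S = H @ self.P @ H.T + R
        K = self.P @ H.T @ np.linalg.inv(S)   # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(STATE_DIM) - K @ H) @ self.P
```

Because the landmarks, biases, and extrinsic terms live in one joint state, each landmark update also tightens the estimates of the correlated pose and bias entries, which is the essential benefit of the tightly coupled formulation.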
New work from our lab addresses the problem of localization through obscurants via thermal-inertial fusion.
In this work we present a method for optical-flow-based background subtraction from a single moving camera, with application to autonomous driving. Without the use of any other sensor or prior processing, the method detects the vast majority of moving entities in the environment. The electric bus is a product of Proterra Inc (https://www.proterra.com/) and is operated by RTC (http://www.rtcwashoe.com/). The work is part of the Intelligent Mobility project coordinated by the University of Nevada, Reno and the Nevada Center for Applied Research (NCAR).
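As a rough sketch of the general idea (a plain dense-optical-flow baseline, not the specific method of the paper): estimate dense flow between consecutive frames, approximate the dominant ego-motion flow, and flag pixels whose flow deviates strongly from it as independent movers. The function name, threshold, and input file below are illustrative assumptions.

```python
import cv2
import numpy as np

def moving_object_mask(prev_gray, curr_gray, residual_thresh=3.0):
    """Flag pixels whose optical flow deviates from the dominant flow.

    Crude baseline: the camera's own motion induces a roughly global flow
    pattern, so pixels disagreeing with the median flow are likely moving
    objects. The published method is more sophisticated than this.
    """
    flow = cv2.calcOpticalFlowFarneback(
        prev_gray, curr_gray, None,
        pyr_scale=0.5, levels=3, winsize=15,
        iterations=3, poly_n=5, poly_sigma=1.2, flags=0)
    # Approximate the background (ego-motion) flow by the per-component median.
    bg_flow = np.median(flow.reshape(-1, 2), axis=0)
    residual = np.linalg.norm(flow - bg_flow, axis=2)
    mask = (residual > residual_thresh).astype(np.uint8) * 255
    # Suppress speckle noise with a morphological opening.
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    return cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)

# Example usage on a video stream (file name is hypothetical):
cap = cv2.VideoCapture("bus_camera.mp4")
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    cv2.imshow("moving objects", moving_object_mask(prev_gray, gray))
    prev_gray = gray
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
```

A single global median flow ignores depth-dependent parallax, so this baseline degrades in scenes with large depth variation; it is only meant to convey the flow-residual intuition behind moving-object detection from a moving platform.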
The work of the Autonomous Robots Lab is featured on BBC Click. The video features:
Our paper on multi-modal 3D/radiation mapping:
Our IEEE ICRA 2018 papers: