Resilient Autonomous Navigation and Exploration
Autonomous navigation and exploration is a main research direction of our lab.
Reinforcement Learning for Collision-free Flight Exploiting Deep Collision Encoding
This work contributes a novel deep navigation policy that enables collision-free flight of aerial robots through a modular approach combining a deep collision encoder with reinforcement learning.
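Below is a minimal sketch, in PyTorch, of what such a modular policy can look like: a convolutional encoder compresses a depth image into a compact collision latent, and a small action head maps the latent plus the robot/goal state to a velocity command. The layer sizes, state dimension, and action parameterization are illustrative assumptions, not the architecture used in the paper.

```python
# Hedged sketch of a modular navigation policy: depth-image collision encoder
# feeding a compact latent into an action head. All dimensions are assumptions.
import torch
import torch.nn as nn

class CollisionEncoder(nn.Module):
    """Compresses a depth image into a low-dimensional collision latent."""
    def __init__(self, latent_dim: int = 32):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(1, 16, 5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.fc = nn.Linear(32, latent_dim)

    def forward(self, depth):                 # depth: (B, 1, H, W)
        return self.fc(self.conv(depth).flatten(1))

class NavigationPolicy(nn.Module):
    """Maps the collision latent and robot/goal state to a velocity command."""
    def __init__(self, latent_dim: int = 32, state_dim: int = 6, action_dim: int = 3):
        super().__init__()
        self.encoder = CollisionEncoder(latent_dim)
        self.head = nn.Sequential(
            nn.Linear(latent_dim + state_dim, 128), nn.ReLU(),
            nn.Linear(128, action_dim), nn.Tanh(),   # normalized velocity command
        )

    def forward(self, depth, state):
        return self.head(torch.cat([self.encoder(depth), state], dim=-1))

# Example forward pass with a dummy depth image and state vector.
policy = NavigationPolicy()
action = policy(torch.zeros(1, 1, 64, 96), torch.zeros(1, 6))
print(action.shape)
```

In such a design the encoder can be trained separately (e.g., on collision labels) and the reinforcement-learned head only consumes the compact latent, which is what makes the approach modular.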
|
ORACLE Library of Deep Learning-based Safe Navigation Methods: Indicative Results
We open-source the ORACLE library of methods for deep-learned collision-free navigation of aerial robots. The methods assume no access to a map of the environment or an estimate of the robot’s position, and demonstrate robust sim2real transfer.
|
Autonomous Under Canopy Navigation and Mapping in Dense Forests
In this video we present results from the recent field-testing campaign of the DigiForest project at Evo, Finland.
|
Autonomous Exploration and Visual Inspection of Ballast Water Tank in an FPSO
This video presents results on autonomous exploration and visual inspection of a ballast tank inside an FPSO vessel.
|
Autonomous Exploration of Ballast Water Tank with Navigation through Manholes
This video presents results on autonomous exploration of multiple ballast water tank compartments inside an FPSO vessel.
|
Semantically-enhanced Deep Collision Prediction for Autonomous Navigation
This work contributes a novel and modularized learning-based method for aerial robots navigating cluttered environments.
|
Team CERBERUS DARPA Subterranean Challenge Winning Prize Run
This video presents the winning deployment of Team CERBERUS in the Final Event of the DARPA Subterranean Challenge. We explored a significant part of the environment and scored 23 points, securing 1st place.
|
Autonomous Teamed Exploration of Subterranean Environments using Legged and Aerial Robots
This paper presents a novel strategy for autonomous teamed exploration of subterranean environments using legged and aerial robots.
|
Motion Primitives-based Navigation Planning using Deep Collision Prediction
This work contributes a novel navigation planner that exploits a learning-based collision prediction network.
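As an illustration of the general idea (not the paper's implementation), the sketch below scores a small library of constant-velocity motion primitives with a learned collision predictor, rejects unsafe candidates, and picks the one that best progresses toward the goal; `collision_probability` is a placeholder for the learned network.

```python
# Illustrative sketch: rank motion primitives by a learned collision probability
# and goal progress. The collision predictor is a placeholder here.
import numpy as np

def rollout_primitive(v_forward, yaw_rate, horizon=2.0, dt=0.1):
    """Forward-simulate a unicycle-like primitive in the robot frame."""
    x = y = yaw = 0.0
    points = []
    for _ in range(int(horizon / dt)):
        x += v_forward * np.cos(yaw) * dt
        y += v_forward * np.sin(yaw) * dt
        yaw += yaw_rate * dt
        points.append((x, y))
    return np.array(points)

def collision_probability(depth_image, trajectory):
    """Placeholder for the learned collision prediction network."""
    return 0.0  # assume free space in this toy example

def select_primitive(depth_image, goal_xy, p_max=0.1):
    candidates = [(1.0, w) for w in np.linspace(-0.6, 0.6, 9)]
    best, best_cost = None, np.inf
    for v, w in candidates:
        traj = rollout_primitive(v, w)
        if collision_probability(depth_image, traj) > p_max:
            continue  # reject primitives predicted to be unsafe
        cost = np.linalg.norm(traj[-1] - np.asarray(goal_xy))
        if cost < best_cost:
            best, best_cost = (v, w), cost
    return best  # None if no primitive is predicted safe

print(select_primitive(depth_image=None, goal_xy=(3.0, 1.0)))
```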
|
CERBERUS: Deployment at the DARPA Subterranean Challenge Urban Circuit
This paper reports the technological progress and performance of team “CERBERUS” after participating in the Tunnel and Urban Circuits of the DARPA Subterranean Challenge. The SubT Challenge is an international robotics competition organized by the Defense Advanced Research Projects Agency to inspire advances in resilient robotic autonomy in subterranean settings.
|
Hypergame-based Adaptive Behavior Path Planning for Combined Exploration and Visual Search
In this work we present an adaptive behavior path planning method for autonomous exploration and visual search of unknown environments. As volumetric exploration and visual coverage of unknown environments, possibly with different sensors, are non-identical objectives, a principled combination of the two is proposed. In particular, the method involves three distinct planning policies, namely exploration, sparse visual coverage, and dense visual coverage.
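As a simplified illustration of the switching idea only (the paper formulates the behavior adaptation as a hypergame, which is not reproduced here), the snippet below selects among the three planning policies based on estimated gains; the gain values are hypothetical placeholders.

```python
# Simplified illustration: pick one of the three planning policies from
# (hypothetical) estimated gains. Not the hypergame formulation of the paper.
def select_behavior(exploration_gain, sparse_coverage_gain, dense_coverage_gain):
    gains = {"exploration": exploration_gain,
             "sparse_coverage": sparse_coverage_gain,
             "dense_coverage": dense_coverage_gain}
    return max(gains, key=gains.get)

print(select_behavior(exploration_gain=12.0, sparse_coverage_gain=4.0,
                      dense_coverage_gain=7.5))   # -> "exploration"
```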
|
Autonomous Aerial Robotic Subterranean Exploration inside the Moaning Caverns
In this video we present the autonomous exploration and mapping of a section of the Moaning Caverns subterranean environment. The Charlie aerial robotic scout, integrating and fusing LiDAR, visual, thermal and inertial data, localizes itself and maps the environment in a multi-modal fashion. Given the volumetric representation of the environment built in real time, a graph-based exploration path planning method enables autonomous planning of paths optimized for exploration.
|
DARPA SubT Urban Circuit: Autonomous Exploration in the Satsop Abandoned Power Plant
In this video, a mission of the Alpha Aerial Scout of Team CERBERUS during the DARPA Subterranean Challenge Urban Circuit event is presented. The Alpha Robot operates inside the Satsop Abandoned Power Plant and performs autonomous exploration. The robot autonomy relies on a resilient multi-modal localization and mapping solution fusing LiDAR, vision and inertial sensing, combined with a graph-based exploration path planner.
|
DARPA SubT Urban Circuit: Collision-tolerant Exploration of Staircases using Aerial Robots
In this video we present the autonomous exploration of a staircase with four sub-levels and the transition between two floors of the Satsop Nuclear Power Plant during the DARPA Subterranean Challenge Urban Circuit. The utilized system is a collision-tolerant flying robot capable of multi-modal localization and mapping fusing LiDAR, vision and inertial sensing. Autonomous exploration and navigation through the staircase is enabled through a Graph-based Exploration Planner.
|
Graph-based Exploration Path Planning - Aerial Robot inside an Underground Mine
In this video we present results on autonomous subterranean exploration inside an abandoned underground mine using an aerial robot. The aerial robot utilizes the proposed Graph-based Exploration Path Planner, which ensures the efficient exploration of the complex underground environment while simultaneously avoiding obstacles.
|
Graph-based Path Planner: ANYmal Quadruped Robot Exploring Gonzen Mine
In this video we present results on autonomous subterranean exploration inside an abandoned underground mine using the ANYmal legged robot. ANYmal utilizes the proposed Graph-based Exploration Path Planner, which ensures the efficient exploration of the complex underground environment while simultaneously avoiding obstacles and respecting traversability constraints.
|
Aerial Robotic Graph-based Exploration Path Planning during the DARPA SubT Challenge Tunnel Circuit
In this video we present an autonomous exploration deployment of our aerial robotic scouts inside an underground mine in Pittsburgh. The robot utilizes the proposed Graph-based Exploration Planner to guide its motion so as to explore a first long segment of this subterranean environment. The presented result took place in the framework of the Tunnel Circuit competition phase of the DARPA SubT Challenge.
|
Motion Primitives-based Path Planning for Fast and Agile Exploration using Aerial Robots
This work presents a novel path planning strategy for fast and agile exploration using aerial robots. Tailored to the combined need for large-scale exploration of challenging and confined environments, despite the limited endurance of micro aerial vehicles, the proposed planner employs motion primitives to identify admissible paths that search the configuration space, while exploiting the dynamic flight properties of small aerial robots.
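A hedged sketch of the underlying idea: sample forward-arc motion primitives and rank them by how much unknown space their end poses would reveal in a coarse occupancy grid. The grid, sensor model, and gain definition below are simplified assumptions for illustration, not the planner's actual objective.

```python
# Hedged sketch: score constant-twist primitives by unknown cells observable
# from their end pose in a toy 2D occupancy grid (-1 unknown, 0 free).
import numpy as np

UNKNOWN = -1

def end_pose(v, yaw_rate, horizon=2.0):
    """Closed-form end pose of a constant-twist primitive starting at the origin."""
    if abs(yaw_rate) < 1e-6:
        return v * horizon, 0.0, 0.0
    r = v / yaw_rate
    dyaw = yaw_rate * horizon
    return r * np.sin(dyaw), r * (1.0 - np.cos(dyaw)), dyaw

def exploration_gain(grid, origin, pose, sensor_range=5.0, resolution=0.2):
    """Count unknown cells within sensor range of the primitive's end position."""
    gx = int((origin[0] + pose[0]) / resolution)
    gy = int((origin[1] + pose[1]) / resolution)
    r = int(sensor_range / resolution)
    window = grid[max(gy - r, 0):gy + r, max(gx - r, 0):gx + r]
    return int(np.count_nonzero(window == UNKNOWN))

grid = np.full((100, 100), UNKNOWN)
grid[:, :40] = 0                                  # toy map: left half known-free
primitives = [(1.5, w) for w in np.linspace(-0.5, 0.5, 7)]
best = max(primitives, key=lambda p: exploration_gain(grid, (4.0, 10.0), end_pose(*p)))
print("selected primitive (v, yaw_rate):", best)
```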
|
Learning-based Path Planning for Autonomous Exploration of Subterranean Environments
In this work we present a new methodology on learning-based path planning for autonomous exploration of subterranean environments using aerial robots. Utilizing a recently proposed graph-based path planner as a "training expert" and following an approach relying on the concepts of imitation learning, we derive a trained policy capable of guiding the robot to autonomously explore underground mine drifts and tunnels.
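The sketch below illustrates the imitation-learning loop in a DAgger-like form, with the graph-based planner standing in as the labeling expert. `DummyEnv`, `GraphPlannerExpert`, and the network are placeholders rather than the authors' implementation.

```python
# Hedged DAgger-style sketch: the learner acts, the expert labels the visited
# states, and the policy is refit on the aggregated dataset. All components
# other than the loop structure are placeholders.
import torch
import torch.nn as nn

class DummyEnv:
    """Stands in for the exploration simulator (placeholder)."""
    def reset(self):
        return torch.zeros(64)
    def step(self, action):
        return torch.randn(64)

class GraphPlannerExpert:
    """Stands in for the graph-based planner used as the training expert."""
    def query(self, obs):
        return torch.zeros(3)          # expert direction command (placeholder)

policy = nn.Sequential(nn.Linear(64, 128), nn.ReLU(), nn.Linear(128, 3))
optimizer = torch.optim.Adam(policy.parameters(), lr=1e-3)
dataset = []

def dagger_round(expert, env, rollout_len=50):
    obs = env.reset()
    for _ in range(rollout_len):
        with torch.no_grad():
            action = policy(obs)                    # learner acts in the environment
        dataset.append((obs, expert.query(obs)))    # expert labels the visited state
        obs = env.step(action)
    for obs_sample, expert_action in dataset:       # behavior-cloning update
        loss = nn.functional.mse_loss(policy(obs_sample), expert_action)
        optimizer.zero_grad(); loss.backward(); optimizer.step()

for _ in range(3):
    dagger_round(GraphPlannerExpert(), DummyEnv())
```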
|
Keyframe-based Thermal-Inertial Odometry
This work presents a keyframe-based thermal-inertial odometry estimation framework tailored to the exact data and concepts of operation of thermal cameras, which provide a potential solution to penetrate and overcome conditions of darkness and the strong presence of obscurants. The developed framework was verified with respect to its resilience, performance and ability to enable autonomous navigation in an extensive set of experimental studies, including multiple field deployments in severely degraded, dark and obscurant-filled underground mines.
|
Subterranean Exploration: When the going gets tough... the robots get going!
The presented results are from a series of deployments in underground environments conducted by Team CERBERUS in the framework of the DARPA Subterranean Challenge.
|
Robust Thermal-Inertial Localization for Aerial Robots: A case for Direct Methods
This video presents a comparative study between state-of-the-art GPS-denied visual-inertial odometry frameworks. The comparison pits frameworks applied to rescaled thermal camera data against methods that exploit the full radiometric information, namely a modified filter-based estimator and a recently proposed keyframe-based direct technique specifically designed for thermal-inertial localization.
|
Graph-based Path Planning for Autonomous Robotic Exploration in Subterranean Environments
This work presents a new strategy for autonomous graph-based exploration path planning in subterranean environments. Tailored to the fact that subterranean settings such as underground mines are often large-scale networks of narrow tunnel-like and multi-branched topologies, the proposed planner is structured around a bifurcated local- and global-planner architecture.
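The snippet below sketches the local-planner side of such a bifurcated architecture under simplifying assumptions: sample a small random graph around the robot, compute shortest paths, and select the vertex whose path maximizes an exploration gain discounted by path length. The gain model, sampling bounds, and penalty weight are illustrative placeholders.

```python
# Hedged local-planner sketch: random graph, Dijkstra, gain-over-distance selection.
import heapq
import math
import random

def build_local_graph(n_vertices=80, radius=3.0, bound=10.0):
    vertices = [(0.0, 0.0)] + [(random.uniform(-bound, bound),
                                random.uniform(-bound, bound)) for _ in range(n_vertices)]
    edges = {i: [] for i in range(len(vertices))}
    for i, vi in enumerate(vertices):
        for j, vj in enumerate(vertices):
            d = math.dist(vi, vj)
            if i != j and d < radius:      # nearby pairs assumed collision-free here
                edges[i].append((j, d))
    return vertices, edges

def shortest_paths(edges, source=0):
    """Dijkstra from the robot's current vertex."""
    dist = {source: 0.0}
    queue = [(0.0, source)]
    while queue:
        d, u = heapq.heappop(queue)
        if d > dist.get(u, math.inf):
            continue
        for v, w in edges[u]:
            if d + w < dist.get(v, math.inf):
                dist[v] = d + w
                heapq.heappush(queue, (d + w, v))
    return dist

def exploration_gain(vertex):
    """Placeholder for volumetric gain (e.g., unknown voxels visible from the vertex)."""
    return vertex[0] + vertex[1] + 20.0    # toy gain, larger away from the origin

vertices, edges = build_local_graph()
dist = shortest_paths(edges)
lam = 0.25                                 # path-length penalty weight
best = max((i for i in dist if i != 0),
           key=lambda i: exploration_gain(vertices[i]) * math.exp(-lam * dist[i]),
           default=0)
print("selected local vertex:", vertices[best])
```

A global layer would maintain a sparse graph over the whole explored space and take over (e.g., for relocation or homing) when the local planner finds no informative vertices.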
|
CERBERUS Field Testing at Gonzen Mine - Aerial Scouts Exploration & Homing
In this video we present results on evaluating our subterranean aerial scouts on autonomous exploration and homing inside an underground mine. The test took place as part of the CERBERUS integration week at the Gonzen Mine in Switzerland.
|
Graph-based Path Planning for Autonomous Subterranean Exploration
In this work we present new results on autonomous exploration and mapping of underground mines using aerial robots. A flying robot capable of sensing-degraded localization and mapping utilizes a new graph search-based planning algorithm to ensure efficient and smooth exploration even in narrow subterranean settings.
|
Autonomous Exploration and Mapping in Underground Mines using Aerial Robots
In this work we present a comprehensive solution towards autonomous exploration and mapping in underground mine environments. The work relies on a set of contributions in sensing-degraded localization and mapping, as well as exploration path planning.
|
Contact-based Navigation Path Planning for Aerial Robots
In this work, we present a path planning method that exploits contact to enable aerial robots to traverse highly anomalous surfaces. Apart from sliding in contact, the proposed strategy introduces a new locomotion modality of azimuth rotations perpendicular to the surface, dubbed the flying cartwheel mode.
|
Keyframe-based Direct Thermal-Inertial Odometry
In this work we present a method for fusion of direct radiometric data from a thermal camera with inertial measurements to enable pose estimation of aerial robots. LWIR thermal vision is not affected by the lack of illumination or the presence of obscurants such as fog and dust, rendering it suitable for GPS-denied, dark and obscurant-filled environments. In contrast to previous approaches, which use 8-bit re-scaled thermal imagery as a complementary sensing modality to visual image data, our approach makes use of the full 14-bit radiometric data, making it generalizable to a variety of environments. Furthermore, our approach implements a keyframe-based joint optimization scheme, making odometry estimates robust against image data interruption, which is common during the operation of thermal cameras due to the application of flat field corrections.
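A minimal sketch of a direct radiometric photometric residual, assuming a pinhole camera and synthetic data: keyframe pixels are back-projected with depth, transformed by a candidate relative pose, and compared against the current frame using the full radiometric intensities rather than rescaled 8-bit values. Intrinsics, pose, and images are placeholders, not the paper's pipeline.

```python
# Hedged sketch of a direct photometric residual on 14-bit radiometric data.
import numpy as np

fx = fy = 400.0
cx, cy = 320.0, 240.0                            # assumed pinhole intrinsics

def photometric_residual(I_key, D_key, I_cur, R, t, pixels):
    res = []
    for (u, v) in pixels:
        z = D_key[v, u]
        p_key = z * np.array([(u - cx) / fx, (v - cy) / fy, 1.0])   # back-project
        p_cur = R @ p_key + t                                        # transform to current frame
        if p_cur[2] <= 0:
            continue
        u2 = int(fx * p_cur[0] / p_cur[2] + cx)
        v2 = int(fy * p_cur[1] / p_cur[2] + cy)
        if 0 <= u2 < I_cur.shape[1] and 0 <= v2 < I_cur.shape[0]:
            res.append(float(I_key[v, u]) - float(I_cur[v2, u2]))    # radiometric difference
    return np.asarray(res)

# Synthetic 14-bit frames, unit depth, identity motion: residuals vanish by construction.
I_key = np.random.randint(0, 2**14, size=(480, 640), dtype=np.uint16)
I_cur = I_key.copy()
D_key = np.ones((480, 640))
pixels = [(100, 120), (320, 240), (500, 400)]
print(photometric_residual(I_key, D_key, I_cur, np.eye(3), np.zeros(3), pixels))
```

In a full direct method this residual would be minimized jointly with inertial terms over the keyframe poses.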
|
Visual-Thermal Landmarks and Inertial Fusion for Navigation in Degraded Visual Environments
In this work we present a multi-sensor fusion algorithm for reliable odometry estimation in GPS-denied and degraded visual environments. The proposed method utilizes information from both the visible and thermal spectra for landmark selection and prioritizes feature extraction from informative image regions based on a metric over spatial entropy. Furthermore, inertial sensing cues are integrated to improve the robustness of the odometry estimation process.
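The sketch below illustrates one way such an entropy-based prioritization can look: split the image into blocks, compute the Shannon entropy of each block's intensity histogram, and keep the most informative blocks for feature extraction. Block size and the number of retained regions are illustrative choices, not the paper's parameters.

```python
# Hedged sketch: rank image blocks by intensity-histogram entropy and keep the
# most informative ones for feature extraction.
import numpy as np

def block_entropy(block, bins=32):
    hist, _ = np.histogram(block, bins=bins, range=(0, 256))
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def informative_blocks(image, block=64, keep=6):
    scores = []
    for y in range(0, image.shape[0] - block + 1, block):
        for x in range(0, image.shape[1] - block + 1, block):
            scores.append(((x, y), block_entropy(image[y:y + block, x:x + block])))
    scores.sort(key=lambda s: s[1], reverse=True)
    return [xy for xy, _ in scores[:keep]]      # top-left corners of the selected regions

img = np.random.randint(0, 256, size=(480, 640), dtype=np.uint8)
print(informative_blocks(img))
```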
|
Field Deployment of Autonomous Aerial Robots in Underground Mines
In this video we present results from a field deployment of robotic systems inside an underground mine. The deployment involved the autonomous operation, exploration and mapping of underground mine drifts and headings using two aerial robots. The first robot based its operation on the fusion of visible-light and thermal camera data alongside IMU cues, while the second robot employed LiDAR as a prime sensing modality. Both robotic systems demonstrated the ability to operate in the challenging underground environment. In addition, we evaluated the ability of a comprehensive LiDAR, visual- and thermal-inertial sensor system to provide persistent localization when ferried onboard a truck navigating across the mine drifts.
|
Anomaly Detection and Cognizant Path Planning for Surveillance Operations using Aerial Robots
In this video we present results for the task of unsupervised anomaly detection for aerial robotic surveillance. For environments in which anomaly data are sparse or absent altogether, this work proposes the merging of deep-learned visual features and one-class support vector machines to efficiently detect anomalies in camera data in real time. Results are shown for area surveillance using a camera-equipped aerial robot conducting a coverage path over an area in which a few man-made structures are introduced and serve as anomalies against their environment.
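A minimal sketch of the pipeline shape, using scikit-learn: fit a one-class SVM on deep visual features of nominal (anomaly-free) frames and flag outliers at test time. Random vectors stand in for the deep-learned features here; in practice they would come from a pretrained CNN applied to camera frames.

```python
# Hedged sketch: one-class SVM over placeholder "deep" feature vectors.
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)
nominal_features = rng.normal(size=(500, 128))           # placeholder deep features
test_features = np.vstack([rng.normal(size=(5, 128)),
                           rng.normal(loc=4.0, size=(5, 128))])  # shifted samples act as anomalies

detector = OneClassSVM(kernel="rbf", nu=0.05, gamma="scale").fit(nominal_features)
print(detector.predict(test_features))                   # +1 nominal, -1 anomaly
```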
|
Vision-Depth Landmarks and Inertial Fusion for Navigation in Degraded Visual Environments
In this work, a method for tight fusion of visual, depth and inertial data for autonomous navigation in GPS-denied, poorly illuminated, and textureless environments is proposed. Visual and depth information are fused at the feature detection and descriptor extraction levels to augment one sensing modality with the other. These multimodal features are then further integrated with inertial sensor cues using an extended Kalman filter to estimate the robot pose, sensor bias terms, extrinsic calibration parameters, and landmark positions simultaneously as part of the filter state.
|
Thermal-Inertial Localization for Autonomous Navigation of Aerial Robots through Obscurants
In this video, the problem of GPS-denied aerial robot navigation through obscurants is considered. Through the fusion of thermal camera and IMU data, estimation of the robot trajectory in such degraded visual environments is achieved. The system is demonstrated in a heavily smoke-filled machine shop environment. Enabling the navigation of aerial robots through obscurants can prove critical in important applications such as search and rescue in underground mines, or in many real-life surveillance scenarios.
|
Visual Saliency-aware Receding Horizon Autonomous Exploration with Application to Aerial Robotics
In this video we present autonomous visual saliency-aware receding horizon exploration using aerial robots. The presented results refer to the exploration of two environments with different salient objects, namely a room with paintings and a mannequin, as well as a machine shop that includes salient objects such as warning signs and a fire extinguisher.
|
Radiation Source Localization in GPS-denied Environments using Aerial Robots
In this paper we present the system and methods that enable autonomous nuclear radiation source localization using aerial robots in GPS-denied environments. The presented results refer to the localization of a Cesium-137 source using a small aerial robot, both with and without prior knowledge of its environment.
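For illustration, one classical formulation (not necessarily the estimator used in the paper) fits a point-source position and strength to count-rate measurements under an inverse-square model via nonlinear least squares:

```python
# Hedged sketch: inverse-square point-source fit with synthetic measurements.
import numpy as np
from scipy.optimize import least_squares

def predicted_rate(params, positions):
    x, y, z, strength = params
    d2 = np.sum((positions - np.array([x, y, z]))**2, axis=1)
    return strength / np.maximum(d2, 1e-6)

def residuals(params, positions, measurements):
    return predicted_rate(params, positions) - measurements

# Synthetic measurements from a source at (2, 1, 0.5) with strength 100.
rng = np.random.default_rng(1)
positions = rng.uniform(-5, 5, size=(30, 3))
true_params = np.array([2.0, 1.0, 0.5, 100.0])
measurements = predicted_rate(true_params, positions) + rng.normal(0, 0.05, 30)

estimate = least_squares(residuals, x0=[0, 0, 0, 10.0],
                         args=(positions, measurements)).x
print("estimated source position:", estimate[:3])
```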
|
Autonomous 3D and Radiation Mapping in Tunnel Environments using Aerial Robots
This video presents results on autonomous radiation mapping and source localization in tunnel environments using aerial robots.
|
Autonomous Exploration and Simultaneous Object Search using Aerial Robots
This video presents results on autonomous exploration and simultaneous object search through semantically-enhanced path planning for aerial robots.
|
A Multi-Modal Mapping Unit for Autonomous Exploration and Mapping of Underground Tunnels
This video presents results on autonomous exploration and mapping of tunnel environments using the developed Multi-Modal Mapping Unit (M3U). The M3U tightly synchronizes a stereo camera pair with an Inertial Measurement Unit and super-bright flashing LEDs, while further fusing time-of-flight 3D depth sensors. It allows for GPS-denied localization and mapping in visually-degraded environments.
|
Autonomous Aerial Robotic Exploration and Mapping of a Railroad Tunnel in Degraded Visual Conditions
This video presents indicative results of a sequence of field experiments conducted to verify and evaluate new algorithms and systems for autonomous exploration and mapping of tunnel environments using aerial robots.
|
A Multi-Modal Mapping Unit for Autonomous Robotic Exploration in Visually-degraded Environments
This work presents the results of the development of a multi-modal mapping unit combined with an uncertainty-aware planner for exploration and mapping in dark, visually-degraded environments.
|
Visual-inertial odometry-enhanced geometrically stable ICP
This video presents results on the topic of visual-inertial odometry-enhanced geometrically stable ICP.
|
Autonomous Exploration in Darkness using NIR Visual-Inertial-Depth Localization and Mapping
This video presents our work on autonomous exploration and mapping of dark, visually-degraded environments. The system employs a NIR visual-inertial localization system augmented with a 3D time-of-flight depth sensor. Exploiting an uncertainty-aware receding horizon exploration and mapping planner, the robot operates autonomously in environments for which no prior knowledge is available.
|
Research results of Dr. Kostas Alexis before 2016.
|
Augmented Reality-enhanced Structural Inspection
In this work we present a methodology for augmented reality-enhanced structural inspection using aerial robots. The robot is capable of autonomously computing a full-coverage inspection path for a prior model of the environment, while the augmented reality-enabled user seamlessly defines new viewpoints to improve the inspection.
|
3D Outdoors Autonomous Coverage
This video accompanies our recent paper in Autonomous Robots. The experiment took place at the ETH Zurich Polyterrasse, and the aerial robot performed the mission autonomously both in terms of path planning as well as vision and control.
|
Structural Inspection Path Planning
This video accompanies the paper entitled "An Incremental Sampling-based Approach to Inspection Planning: The Rapidly-exploring Random Tree of Trees", which was accepted at the Robotica journal.
|