Autonomous Robots Lab
To Master's students: if you are interested in the proposed projects, contact us at konstantinos.alexis@ntnu.no

Master thesis projects - open for the academic period 2021-2022.

Covariance Estimation for Multi-Modal SLAM
Supervisors: Kostas Alexis | Keywords: SLAM, GPS-denied, Visually-degraded
Given - Not Available. 
Multi-Modal SLAM consists of fusing information generated by different sensing modalities (visual cameras, LiDARs, IMUs, etc.) to provide an accurate estimate of the location of a robot as well as a map of its environment. This fusion has typically been modeled as a non-linear least squares optimization with the residuals weighted by their respective uncertainties. For good results, and especially in degenerate scenarios, it is imperative that the uncertainty of the information being fused is well estimated. However, as this information is obtained from sensors measuring different physical properties/phenomena, a direct comparison of uncertainties is semantically not meaningful, and past approaches have dealt with this by using fixed or heuristically scaled covariances. This thesis will explore the problem of covariance estimation specifically for such fusion of multi-modal information, taking a data-driven approach.
  • Details available following this link.  
  • Main Contact: Kostas Alexis (konstantinos.alexis@ntnu.no)
  • Additional contact: Nikhil Khedekar (nikhil.v.khedekar@ntnu.no) 
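As a minimal illustration of why the covariance weights matter in such a least-squares fusion, the sketch below (all names and numbers are hypothetical, not from the project) fuses two measurements of the same 1D position with different assumed variances; the estimate is pulled toward the more certain modality:

```python
import numpy as np

# Hypothetical illustration: fuse measurements of the same 1D position from
# two modalities (e.g., a "visual" fix and a "lidar" fix) via covariance-
# weighted least squares. The result is the inverse-variance-weighted mean.
def fuse(measurements, variances):
    w = 1.0 / np.asarray(variances)      # information (inverse covariance)
    z = np.asarray(measurements)
    x_hat = np.sum(w * z) / np.sum(w)    # minimizes sum_i (z_i - x)^2 / var_i
    var_hat = 1.0 / np.sum(w)            # fused uncertainty
    return x_hat, var_hat

# A high-uncertainty "visual" fix at 2.0 m and a low-uncertainty "lidar" fix at 1.0 m:
x_hat, var_hat = fuse([2.0, 1.0], [4.0, 1.0])
# x_hat = (2/4 + 1/1) / (1/4 + 1) = 1.2, var_hat = 0.8
```

With equal covariances the fused estimate would simply be the mean (1.5); the inverse-variance weighting moves it to 1.2, closer to the low-uncertainty measurement, which is exactly the behavior that mis-estimated covariances would corrupt.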
Relevant video: https://youtu.be/Tv6FoQI8h_I
Navigation Policy for a Tiny Drone using Deep Reinforcement Learning
Supervisors: Kostas Alexis | Keywords: Reinforcement Learning, Autonomy
Given - Not Available.
Lightweight quadrotors are an ideal platform for fast exploration in confined environments thanks to their agility and small body size. However, due to the limited payload that such tiny drones can carry, appropriate sensors and navigation software accounting for the severely constrained resources onboard the robot need to be developed. Deep reinforcement learning offers a promising approach to designing an efficient navigation policy by directly inferring the robot's actions from noisy sensor observations. This project and thesis aim to address this challenge by equipping a tiny quadrotor with lightweight range sensing and using deep reinforcement learning to find a near-optimal navigation policy for such a robot in confined settings.
  • Details available following this link.  
  • Main Contact: Kostas Alexis (konstantinos.alexis@ntnu.no)
  • Additional contact: Huan Nguyen (dinh.h.nguyen@ntnu.no) 
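As a toy sketch of how a navigation policy can be learned from reward alone, the snippet below (a hypothetical illustration, not the project's method) runs tabular Q-learning on a 1D corridor; a real tiny-drone policy would instead map noisy range observations to actions with a small neural network:

```python
import random

# Hypothetical toy example: tabular Q-learning on a 1D corridor of n cells.
# The "drone" starts at cell 0 and must reach cell n-1; actions are
# move left (0) or move right (1). A small negative step reward encourages
# short paths; reaching the goal gives reward 1.
def train(n=6, episodes=500, alpha=0.5, gamma=0.9, eps=0.1, seed=0):
    rng = random.Random(seed)
    q = [[0.0, 0.0] for _ in range(n)]
    for _ in range(episodes):
        s = 0
        while s != n - 1:
            # epsilon-greedy action selection
            a = rng.randrange(2) if rng.random() < eps else max((0, 1), key=lambda i: q[s][i])
            s2 = max(0, min(n - 1, s + (1 if a == 1 else -1)))
            r = 1.0 if s2 == n - 1 else -0.01
            # standard Q-learning update
            q[s][a] += alpha * (r + gamma * max(q[s2]) - q[s][a])
            s = s2
    return q

q = train()
# After training, the greedy action in every interior cell should be "right" (1):
policy = [max((0, 1), key=lambda a: q[s][a]) for s in range(5)]
```

The same structure carries over to the thesis setting: replace the table with a network, the cell index with a window of range measurements, and the corridor with a simulated confined environment.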
Relevant video: https://youtu.be/6oRlmdy7tw4
Design, Modeling and Control of a Collision-tolerant Bicopter
Supervisors: Kostas Alexis | Keywords: Aerial Robots, Control
Given - Not Available.
When Micro Aerial Vehicles (MAVs) are commanded to operate in environments with complex obstacles that are hard to detect and/or difficult to avoid, collisions with these obstacles can disrupt the mission and damage, sometimes fatally, the systems. Traditional collision-tolerant designs can mainly be distinguished into a) the addition of propeller guards, b) protection of the system with an external impact-resilient shell, and c) the use of novel materials to decrease the stiffness of the vehicle during collisions. These designs are still bulky and heavy due to the added hardware components and, in most cases, they increase the robot's size. Although such methods can work well for large to medium-sized environments, they often fail in extremely confined environments and narrow passages, such as the manholes present in ship ballast tanks, collapsed buildings, caves, mines and more. Responding to this fact, in this project and thesis we seek to design and develop a miniaturized robotic embodiment which will a) enhance the collision-tolerance of the system via a compliant mechanism, b) reduce the mass and dimensions while increasing the safety and efficiency of equally capable systems, and c) develop modeling and control strategies for active collision interaction.
  • Details available following this link.  
  • Main Contact: Kostas Alexis (konstantinos.alexis@ntnu.no)
  • Additional contact: Paolo De Petris (paolo.de.petris@ntnu.no) 
Image from: Qin, Y., Xu, W., Lee, A. and Zhang, F., 2020. Gemini: A compact yet efficient bi-copter UAV for indoor applications. IEEE Robotics and Automation Letters, 5(2), pp. 3213-3220.
Modeling, Simulation, and Control of a Jumping and Walking Quadruped Robot for Martian Lava Tube Exploration
Supervisors: Kostas Alexis | Keywords: Space & Subterranean Robots, Control
Given - Not Available.
Over the last decades, satellites, telescopes, landers, and wheeled rovers have been the main form of space exploration. As the field of legged robotics has developed and matured significantly in recent years, we now see the opportunity to explore more diverse and interesting terrain in space using specialized quadruped robots optimized for challenging off-world planetary environments, such as craters, caves and lava tubes. Legged robots, such as the Boston Dynamics Spot and the ANYbotics ANYmal, present a set of advantages in mobility and versatility in complex environments over traditional wheeled robots and rovers. Jumping legged robots may be able to traverse the geometrically complex subterranean voids of lava tubes on planets such as Mars. A jumping legged robot for Martian lava tube exploration will retain the key advantages of quadruped systems in overcoming rough terrain, while also being able to coordinate its actuators and exploit the low-gravity environment of Mars and compliant leg designs to jump to significant heights and thus overcome large obstacles. In this project your task is to contribute to the modeling, control and simulation of such a jumping legged robot, which is currently being designed by our team at NTNU.
  • Details available following this link.  
  • Main Contact: Kostas Alexis (konstantinos.alexis@ntnu.no)
  • Additional contact: Jørgen Anker Olsen (jorgen.a.olsen@ntnu.no) 
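The low-gravity advantage mentioned above can be quantified with the ballistic apex height h = v²/(2g); the take-off speed below is an illustrative assumption, not a specification of the NTNU robot:

```python
# Hypothetical back-of-the-envelope: ballistic jump apex height h = v^2 / (2 g).
# The same vertical take-off speed yields a much higher jump on Mars
# (g ~ 3.71 m/s^2) than on Earth (g ~ 9.81 m/s^2).
def jump_height(v_takeoff, g):
    return v_takeoff ** 2 / (2.0 * g)

v = 3.0  # m/s, assumed vertical take-off speed (illustrative only)
h_earth = jump_height(v, 9.81)  # ~0.46 m
h_mars = jump_height(v, 3.71)   # ~1.21 m
```

The same take-off speed yields roughly 2.6 times the jump height on Mars, which is the effect a jumping quadruped would exploit to clear large obstacles in lava tubes.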
Image from the Robotic Systems Lab, ETH Zurich
Selective Inspection of Ship Ballast/Cargo Tanks by Identifying Inspection-important Geometries 
Supervisors: Kostas Alexis | Keywords: Semantic Mapping, Path Planning
Given - Not Available.
Ballast/cargo tanks in ships can experience damage due to prolonged exposure to stress and water. If not inspected and repaired in time, this can lead to permanent damage, causing large losses to the shipping industry. Currently, these inspection tasks are performed by humans, often require scaffolding or roping, and can thus last for several days. Longer inspection times require the ship to remain in the dock longer, resulting in a significant loss to the owner of the ship. Micro Aerial Vehicles capable of autonomous inspection can speed up this process significantly. However, a uniform inspection of these large tanks is not practical, as the robot must inspect the surfaces at a very close distance (0.6-1.0 m). To overcome this challenge, an intelligent inspection strategy is needed. Human inspectors know from experience which areas have a high likelihood of damage and focus their attention on these areas. This thesis aims to develop a method that can identify such high-importance areas in a 3D LiDAR depth image through supervised learning or by detecting changes in geometry, annotate them in a volumetric map (this representation is not final), and utilize them for intelligent inspection.
  • Details available following this link.  
  • Main Contact: Kostas Alexis (konstantinos.alexis@ntnu.no)
  • Additional contact: Mihir Dharmadhikari (mihir.dharmadhikari@ntnu.no) 
Reinforcement Learning-based Trajectory Generation for Collision-free Fast Flight
Supervisors: Kostas Alexis | Keywords: Reinforcement Learning
Given - Not Available.
Fast flight of an aerial robot in confined environments depends on several modules that independently sense, plan, map and control the robot. While this approach allows parallel development, each module adds processing delay and takes longer to complete, hindering fast flight. For collision avoidance, classical methods build a volumetric map that is prone to inconsistencies as the estimate of the robot's position drifts over time. A neural network can instead directly take in a sequence of sensor measurements from a depth camera or a 3D LiDAR sensor, without using a map, to determine a set of parameters that are used to generate a smooth, collision-free trajectory for the robot. This process takes place iteratively, with the robot updating the trajectory at a given frequency as it receives new sensor information. This thesis aims to design a neural network architecture and develop a method to train it to generate parameters for trajectories that reach a goal position in the environment. These trajectories must be smooth, collision-free and respect the dynamic constraints of the robot.
  • Details available following this link.  
  • Main Contact: Kostas Alexis (konstantinos.alexis@ntnu.no)
  • Additional contact: Mihir Kulkarni (mihir.kulkarni@ntnu.no) 
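One common way to turn a small set of parameters into a smooth trajectory, sketched below as a hypothetical example (the thesis' parameterization may differ), is a quintic time-scaling with zero velocity and acceleration at both ends; a network would output, e.g., the goal offset and segment duration, and the polynomial defines the motion in between:

```python
# Hypothetical sketch of the "parameters -> smooth trajectory" step.
# s(tau) = 10 tau^3 - 15 tau^4 + 6 tau^5 rises from 0 to 1 with zero
# velocity and acceleration at both endpoints (minimum-jerk-style profile).
def quintic(t, T, x0, xg):
    tau = min(max(t / T, 0.0), 1.0)  # normalized, clamped time
    s = 10 * tau**3 - 15 * tau**4 + 6 * tau**5
    return x0 + s * (xg - x0)

# Position along one axis for a 2 s segment from 0 m to 4 m:
mid = quintic(1.0, 2.0, 0.0, 4.0)  # 2.0 m at the midpoint, by symmetry
end = quintic(2.0, 2.0, 0.0, 4.0)  # 4.0 m at the end
```

Re-evaluating such a segment from the current state each time new sensor data arrives gives exactly the iterative replanning loop described above.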
Relevant video: https://youtu.be/LcsuX1bwI_8
Visual-Inertial Odometry Estimation in Degraded Underwater Environments
Supervisors: Kostas Alexis (NTNU), Eleni Kelasidi (SINTEF) | Keywords: Underwater robotics, Odometry, SLAM
Available
Underwater environments can often present severe conditions of visual degradation. This may take the form of low-light and low-texture scenes, the presence of obscurants and more. Traditional utilization of robotic systems in the underwater domain rarely relates to autonomous missions close to structures (man-made or natural) and typically exploits sensing solutions tailored to open-water navigation (e.g., sonars, DVLs fused with high-quality Inertial Measurement Units). Although such methods are established by decades of research and are reliable, they do not apply in complex autonomous operations that relate to underwater navigation within confined structures such as natural caves, chambers within shipwrecks, fish farms and more. In such environments, relying on visual, and particularly visual-inertial, sensing is the meaningful choice. However, such methods have to deal with low-light, low-texture and often obscurant-filled conditions. Responding to this fact, in this project and thesis we seek to develop resilient underwater visual-inertial odometry estimation through combined contributions in a) enhanced imaging, from hardware features to image pre-processing, b) low-texture visual data association, and c) robust optimization.
  • Details available following this link.  
  • Main Contact: Kostas Alexis (konstantinos.alexis@ntnu.no)
  • Additional contact: Eleni Kelasidi (Eleni.Kelasidi@sintef.no) 
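As a minimal example of the image pre-processing direction in point a), the sketch below applies simple gamma correction to brighten a dark frame; this is only an illustrative stand-in, since a real pipeline might use CLAHE or learned enhancement instead:

```python
import numpy as np

# Hypothetical pre-processing step for low-light underwater frames: gamma
# correction with gamma < 1 lifts dark pixel values while keeping the
# extremes (0 and 255) fixed.
def gamma_correct(img, gamma=0.5):
    norm = img.astype(np.float64) / 255.0     # map to [0, 1]
    return (norm ** gamma * 255.0).astype(np.uint8)

dark = np.array([[16, 64], [144, 255]], dtype=np.uint8)
bright = gamma_correct(dark)  # mid-tones lifted; 255 stays 255
```

Brightening alone does not add texture, which is why the project pairs such enhancement with dedicated low-texture data association and robust optimization.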
Smart Paws for Robot Dogs
Supervisors: Kostas Alexis | Keywords: Space & Subterranean Robots, Mechatronics
Given - Not Available.
In this project we aim to develop a new generation of "smart paws" that will enable legged robots, such as the Boston Dynamics Spot and ANYmal C systems, to reliably a) sense the interaction forces with the terrain (direction and magnitude), and b) classify the type of terrain (e.g., asphalt, soil, rubble, snow, ice). To achieve this goal, we look at the intelligent combination of i) "coded vision" (exploiting a micro camera inside the paw and a specialized pattern on the inner side of the paw), ii) audio analysis (inside the paw), and iii) inertial measurements (accelerometer, gyroscope), also from within the paw. The figure presents an outline of the sensing modalities to be exploited by this robotic paw. To correlate the diverse sensor data (coded vision, audio, inertial cues) we will seek to exploit both classical techniques in computer vision and audio analysis, as well as modern deep learning. Supervisory signals for the force magnitude and direction will be provided using a testbed incorporating a force sensor.
  • Details available following this link. 
  • Main Contact: Kostas Alexis (konstantinos.alexis@ntnu.no)
Autonomous Underwater Cave Explorer
Supervisors: Kostas Alexis (NTNU), Eleni Kelasidi (SINTEF), Marios Xanthidis (SINTEF) | Keywords: Underwater Robots
Available
Underwater robots have long been utilized for the remote exploration of complex natural environments. However, such systems are yet to achieve the level of autonomy and ability to undertake complex missions that is observed in their flying or ground counterparts. This is despite the fact that complex underwater environments, such as networks of underwater caves, are particularly important both for scientific exploration and industry applications. Motivated by the above observation, in this project we seek to develop a mechatronically simple and highly autonomous underwater robot that presents agile control, visual-inertial odometry estimation capabilities, dense mapping, and autonomous exploration path planning skills. To that end we aim to exploit the extensive prior experience in subterranean exploration gained through our winning participation in the DARPA Subterranean Challenge.
  • Details available following this link. 
  • Main Contact: Kostas Alexis (konstantinos.alexis@ntnu.no)
  • Additional contact: Eleni Kelasidi (Eleni.Kelasidi@sintef.no) 
Semantic Mapping in Ship Ballast and Cargo Tanks
Supervisors: Kostas Alexis (NTNU), Geir Hamre (DNV) | Keywords: Semantic SLAM
Given - Not Available.
Traditional methods for localization and mapping for robotic systems rely on low-level geometric features such as corners, lines, or surface patches to reconstruct the metric 3D structure of a scene. In essence, semantic information about the scene is completely disregarded during this process. However, this contradicts the fact that industrial scenes often involve specific types of semantics that are known a priori and should thus be exploited. In this project, we aim to exploit semantic information in order both to allow the better detection of such semantics in a new industrial environment, and to improve the robot's localization and mapping capabilities, as demonstrated by a small niche of recent contributions. We focus explicitly on ship ballast and cargo tanks, and the specific semantics of interest relate to structural components that represent the areas of the tanks on which an inspection mission must focus. Such inspection missions are frequently conducted to assess the integrity and health of the ship.
  • Details available following this link. 
  • Main Contact: Kostas Alexis (konstantinos.alexis@ntnu.no)
  • Additional contact: Mihir Dharmadhikari (mihir.dharmadhikari@ntnu.no) 
Design and Prototype an Unmanned Ground Effect Vehicle
Supervisors: Kostas Alexis | Keywords: VTOL/Fixed-wing UAV, Visual-inertial Odometry
Given - Not Available.
Ground-Effect Vehicles (GEVs), also called Wing-In-Ground effect (WIG) craft or, historically, Ekranoplans, are a class of systems able to move over a surface by gaining support from the reaction of the air against the surface of the terrain or water. Most commonly, such systems are designed to glide over the surface of the sea by exploiting the ground effect to improve their aerodynamic efficiency. Motivated by both old GEV designs and recent concepts for GEV-based transportation, in this project you are tasked with a) designing a GEV and considering its scaling properties (from a small unmanned system to one that could be used for transportation purposes) and b) prototyping this design as a small Unmanned GEV (UGEV). Eventually, albeit evaluated at a smaller scale, the work should assess whether such designs represent a competitive, efficient, low-maintenance and "green" alternative for fast transportation in coastal or archipelagic areas.
  • Details available following this link. 
  • Main Contact: Kostas Alexis (konstantinos.alexis@ntnu.no)
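The aerodynamic benefit that motivates GEVs can be sketched with McCormick's classical approximation for the induced-drag factor in ground effect; the numbers below are illustrative, not design values for this project:

```python
# McCormick's approximation for the induced-drag reduction in ground effect:
# phi = (16 h / b)^2 / (1 + (16 h / b)^2), where h is the height above the
# surface and b the wingspan. Induced drag in ground effect is roughly
# phi times its out-of-ground-effect value, so flying low relative to the
# span cuts induced drag substantially. Numbers below are illustrative.
def ground_effect_factor(h, b):
    r = (16.0 * h / b) ** 2
    return r / (1.0 + r)

low = ground_effect_factor(0.1, 3.0)   # h/b ~ 0.03: strong drag reduction
high = ground_effect_factor(3.0, 3.0)  # h/b = 1: factor near 1, little benefit
```

This is the kind of first-order relation a scaling study could start from before moving to higher-fidelity aerodynamic analysis.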
Videos from existing, professional or hobby-level, GEVs: Airfish 8 | WIG crafts | RC GEV model
Tiny Visual-Inertial Odometry System
Supervisors: Kostas Alexis | Keywords: Visual-Inertial Odometry
Given - Not Available.
Visual-inertial odometry systems have long been researched with significant success. Exploiting relevant advances in vision systems and processors, as well as breakthroughs in the knowledge domains of state estimation and computer vision, the community has developed robust and high-performance visual-inertial estimation systems presenting drift that grows as slowly as 1% of the robot's path. However, such systems are often computationally demanding and rely on high-quality camera sensors. This prohibits their ubiquitous utilization, particularly in miniaturized systems. In this project you are asked to develop a miniaturized visual-inertial system exploiting an integrated board that offers a monocular camera, an Inertial Measurement Unit, and a 1D time-of-flight distance sensor, alongside a processing solution integrating one Arm® Cortex®-M7 core @ 480 MHz and one Arm® Cortex®-M4 core @ 240 MHz. Your solution should fit within the M7 core of this system and perform visual-inertial odometry at 20 FPS.
  • Details available following this link. 
  • Main Contact: Kostas Alexis (konstantinos.alexis@ntnu.no)
Path Planning for Fixed Wing Aerial Delivery 
Supervisors: Kostas Alexis | Keywords: VTOL/Fixed-wing UAV Control
Given - Not Available.
In this project you are asked to develop a path planning method that exploits prior knowledge of the terrain geometry, the population density over each area in the map, models and predictions of the atmospheric conditions, as well as a model of the fixed-wing airplane's dynamics (and its constraints), such that safe and efficient paths are derived for fixed-wing UAV-based aerial delivery. The project and master thesis shall be conducted in collaboration with Aviant [https://www.aviant.no/], a start-up company that aims to deliver a full-stack drone service for on-demand transport of cargo.
  • Details available following this link. 
  • This thesis is in collaboration with Aviant, a company specializing in medical drone transportation
  • Main Contact: Kostas Alexis (konstantinos.alexis@ntnu.no)
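A minimal sketch of the core idea, risk-aware planning over a population-density map, is shown below using Dijkstra's algorithm on a toy grid (all values hypothetical; the actual thesis would also model terrain, wind and fixed-wing kinematic constraints such as Dubins paths):

```python
import heapq

# Hypothetical sketch: shortest path on a coarse grid where entering a cell
# costs 1 + w * population_density, so the planner trades path length
# against overflight risk.
def plan(pop, start, goal, w=5.0):
    rows, cols = len(pop), len(pop[0])
    dist = {start: 0.0}
    pq = [(0.0, start)]
    while pq:
        d, (r, c) = heapq.heappop(pq)
        if (r, c) == goal:
            return d
        if d > dist.get((r, c), float("inf")):
            continue  # stale queue entry
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + 1.0 + w * pop[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    heapq.heappush(pq, (nd, (nr, nc)))
    return float("inf")

# A densely populated cell in the middle pushes the path around it:
pop = [[0.0, 0.0, 0.0],
       [0.0, 0.9, 0.0],
       [0.0, 0.0, 0.0]]
cost = plan(pop, (0, 0), (2, 2))  # 4 unit-cost moves around the center
```

Replacing the grid with terrain-aware cells and the 4-connected moves with dynamically feasible motion primitives turns this toy into the kind of planner the project targets.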
Investigating the effect of time synchronization in SLAM
Supervisors: Kostas Alexis | Keywords: SLAM
Given - Not Available.
Multi-sensor fusion algorithms, like visual-inertial SLAM, fuse measurements from different sensors under the underlying assumption that these measurements are aligned with respect to some global clock. This alignment can be estimated either offline (during calibration) or online while the system is being used. Further, the misalignment may be a fixed offset (due to propagation delays) or variable (due to additional processing delays depending on the CPU load) with respect to the global clock, leading to underperformance of the SLAM algorithm when the assumption is violated. In this project the student will derive a model and evaluate how this synchronization affects the overall accuracy of the algorithm.
  • Details available following this link. 
  • Main Contact: Kostas Alexis (konstantinos.alexis@ntnu.no)
  • Additional contact: Nikhil Khedekar (nikhil.v.khedekar@ntnu.no) 
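A first-order version of the effect to be modeled is simple to state: the snippet below (an illustrative assumption, not the thesis model) shows the spatial association error that an unmodeled time offset induces for a moving robot:

```python
# Hypothetical illustration of why unmodeled time offsets hurt fusion: for a
# robot moving at speed v (m/s), a measurement stamped dt seconds late is
# associated with a pose that is v * dt metres away from where the
# measurement was actually taken.
def association_error(v, dt):
    return v * dt

# A 20 ms offset at 2 m/s already shifts the measurement by 4 cm,
# comparable to the accuracy of a good visual-inertial system:
err = association_error(2.0, 0.020)  # 0.04 m
```

The thesis would extend this to rotation, variable offsets, and their propagation through the full SLAM estimator.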
Image of the VersaVIS system developed by ETH Zurich.
mmWave Radar-based Deep Collision Prediction
Supervisors: Kostas Alexis | Keywords: SLAM

Available.
In this project and thesis we seek to understand the complexities of mmWave radar imaging and develop a deep learning-based collision predictor that correlates a window of radar data, extracts appropriate features, and can predict whether a future vehicle trajectory collides with the environment, without assuming access to a consistent 3D map or any other depth/visual cue. When successful, this technology can break new ground, allowing autonomous flying or ground robots to reliably access some of the most extreme environments on Earth, and can enhance the safety of autonomous driving in foggy conditions.
  • Details available following this link. 
  • Main Contact: Kostas Alexis (konstantinos.alexis@ntnu.no)
Understanding the Vision System of Underground Species
Supervisors: Kostas Alexis
Available - Not posted but available after discussion. E-mail us if desired. 
In this project we question what is special and specific about the vision systems of species (from animals to insects) living underground. Good examples include the wolf spider or beavers. The emphasis is on a literature study of this specific domain and the derivation of conclusions on how certain principles may apply to robotic vision systems, both in the sense of hardware and algorithms. This project will lay the groundwork for many subsequent investigations to follow.
  • Main Contact: Kostas Alexis (konstantinos.alexis@ntnu.no)

Own ideas - high risk projects!

Do you have your own idea about a robotics project? Are you willing to discuss a high-risk project with the understanding that things might not always work? Contact me and schedule a meeting!