Degerminator

Overview

Degerminator is a robotic system rapidly assembled on the basis of a previous platform of our lab in order to examine the potential of area disinfection using UV germicidal light onboard an autonomous system. On this page we outline the basic design of the system with the goal of supporting its rapid adoption worldwide. As our group does not consist of medical professionals, we naturally leave all decision-making regarding the possible use of such systems to the experts.

UV Germicidal Lamp

At the moment the Degerminator robot integrates a YIGSM 60W LED UVC germicidal lamp rated at 7200 lm, emitting at a wavelength of 254 nm. We welcome advice on better UV lamp options; please see the contact information below.

Robotic Platform

The developed robotic system - dubbed Degerminator - is built around a SuperDroid IG52-DB4 4WD roving platform. The integrated low-level electronics drive the system in skid-steer mode, while a NUC5i7RYH provides high-level computation resources and is interfaced with all sensing systems and the low-level electronics responsible for vehicle actuation. A simple differential-drive controller runs at 20 Hz and drives the system such that the heading difference to the goal configuration is corrected with priority.
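
To illustrate the idea of heading-priority differential-drive control described above, here is a minimal Python sketch. The gains, tolerance, and track width are illustrative assumptions, not the values or structure used on the actual robot.

```python
import math

# Minimal sketch of a heading-priority differential-drive controller,
# in the spirit of the 20 Hz loop described above. Gains and thresholds
# are illustrative assumptions, not the robot's actual parameters.

K_HEADING = 2.0      # proportional gain on heading error (assumed)
K_DISTANCE = 0.5     # proportional gain on distance to goal (assumed)
HEADING_TOL = 0.2    # [rad] only drive forward once roughly aligned (assumed)

def wrap_angle(a):
    """Wrap an angle to (-pi, pi]."""
    return math.atan2(math.sin(a), math.cos(a))

def control_step(x, y, yaw, x_goal, y_goal):
    """Return (v, omega): forward and angular velocity commands."""
    dx, dy = x_goal - x, y_goal - y
    distance = math.hypot(dx, dy)
    heading_error = wrap_angle(math.atan2(dy, dx) - yaw)

    # Correct heading with priority: turn in place until roughly aligned,
    # then blend in forward motion toward the goal.
    omega = K_HEADING * heading_error
    v = K_DISTANCE * distance if abs(heading_error) < HEADING_TOL else 0.0
    return v, omega

def to_wheel_speeds(v, omega, track_width=0.5):
    """Convert (v, omega) to left/right wheel speeds for skid steering."""
    v_left = v - omega * track_width / 2.0
    v_right = v + omega * track_width / 2.0
    return v_left, v_right
```

The wheel-speed conversion is the standard differential-drive kinematic relation; on a skid-steer platform the same commands are simply applied to the left and right wheel pairs.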

Fun fact: The robot platform was previously used to map radiation (video), as well as in the role of a communications extender during the Urban Circuit of the DARPA Subterranean Challenge.

Localization & Terrain Mapping

To enable GPS-denied localization and mapping, Degerminator integrates a Velodyne Puck LITE 16-channel LiDAR with vertical and horizontal fields of view of 30 degrees and 360 degrees respectively, and a nominal range of 100 m. Utilizing an implementation of the LiDAR Odometry And Mapping (LOAM) method [1], the robot estimates its full pose (position and orientation) alongside a point cloud representation of the environment. Given the pose and the registered point cloud map at every iteration, the robot further fuses the data from an obliquely-mounted RealSense D435 RGB-D sensor to derive a higher resolution representation of the terrain, thereby enhancing its online-built map. Subsequently, terrain traversability analysis between any two candidate robot configurations is enabled using the Elevation Mapping framework in [2]. These functionalities are essential for the autonomous navigation of the robot, especially in indoor, cluttered environments and on rough terrain.
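
As a rough illustration of how registered depth points can feed a 2.5D elevation grid and a simple traversability check between two configurations, consider the sketch below. The grid resolution, slope threshold, and function names are assumptions for illustration only; they are not the API of the Elevation Mapping framework [2].

```python
import numpy as np

# Minimal sketch: accumulate registered 3D points into a 2.5D elevation
# grid and check height continuity along a straight segment between two
# robot configurations. Resolution and thresholds are assumed values.

RESOLUTION = 0.1   # [m] grid cell size (assumed)
MAX_STEP = 0.3     # [m] max height change between adjacent samples (assumed)

def update_elevation_map(elev, points):
    """Insert registered (x, y, z) points, keeping the max height per cell."""
    for x, y, z in points:
        key = (int(round(x / RESOLUTION)), int(round(y / RESOLUTION)))
        elev[key] = max(elev.get(key, z), z)
    return elev

def is_traversable(elev, start, goal, step=RESOLUTION):
    """Check height continuity along the straight segment start -> goal."""
    start, goal = np.asarray(start, float), np.asarray(goal, float)
    n = max(int(np.linalg.norm(goal - start) / step), 1)
    prev_z = None
    for i in range(n + 1):
        x, y = start + (goal - start) * i / n
        key = (int(round(x / RESOLUTION)), int(round(y / RESOLUTION)))
        if key not in elev:
            return False   # unknown terrain is treated as non-traversable
        z = elev[key]
        if prev_z is not None and abs(z - prev_z) > MAX_STEP:
            return False   # step or slope too large between adjacent samples
        prev_z = z
    return True
```

In practice the elevation layer also carries per-cell uncertainty from the localization estimate, which is precisely what the probabilistic formulation in [2] provides.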

Autonomous Navigation

Given the online, real-time reconstructed map of the environment, Degerminator uses a differential-drive controller [3] and a collision-free motion planner based on random sampling [4] to guide its way towards desired goal destinations. An alternative functionality for autonomous area exploration and coverage based on the work in [5] is also available. The latter addresses cases where not even a rough map of the environment is known a priori and no human supervision of the robot is possible.
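
The sketch below shows the core loop of a sampling-based planner in the spirit of [4]: grow a tree toward random samples, keep only collision-free extensions, and trace the path back once the goal is reached. The step size, goal bias, and the sample_free and collision_free hooks are assumptions for illustration; the onboard planner would plug the traversability check above into collision_free.

```python
import math
import random

# Compact sketch of sampling-based planning in the spirit of RRT/RRT* [4].
# Step size, goal bias, and the sample_free / collision_free hooks are
# illustrative assumptions, not the planner actually deployed on the robot.

STEP = 0.5       # [m] tree extension step (assumed)
GOAL_TOL = 0.5   # [m] distance at which the goal counts as reached (assumed)

def plan(start, goal, sample_free, collision_free, max_iters=2000):
    """Grow a tree from start toward goal; return a path as a list of (x, y)."""
    nodes = [start]
    parent = {0: None}
    for _ in range(max_iters):
        # Occasionally bias sampling toward the goal.
        sample = goal if random.random() < 0.1 else sample_free()
        # Find the nearest tree node and steer a bounded step toward the sample.
        i_near = min(range(len(nodes)), key=lambda i: math.dist(nodes[i], sample))
        near = nodes[i_near]
        d = math.dist(near, sample)
        if d < 1e-9:
            continue
        t = min(STEP / d, 1.0)
        new = (near[0] + t * (sample[0] - near[0]),
               near[1] + t * (sample[1] - near[1]))
        if not collision_free(near, new):
            continue
        nodes.append(new)
        parent[len(nodes) - 1] = i_near
        if math.dist(new, goal) < GOAL_TOL:
            # Reconstruct the path by walking back to the root.
            path, i = [], len(nodes) - 1
            while i is not None:
                path.append(nodes[i])
                i = parent[i]
            return list(reversed(path))
    return None   # no path found within the iteration budget
```

The optimal variants in [4] additionally rewire the tree so that path cost converges toward the optimum as the number of samples grows.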

Interested? 

If you are interested, would like to know more, or wish to contribute, feel free to contact us at:
  • Prof. Dr. Kostas Alexis (kalexis@unr.edu)
  • Frank Mascarich (fmascarich@nevada.unr.edu)
  • Lab general e-mail (autonomous.robots.lab@gmail.com) 

References

[1] Zhang, J. and Singh, S., 2014. LOAM: Lidar Odometry and Mapping in Real-time. In Robotics: Science and Systems (Vol. 2, No. 9).
[2] Fankhauser, P., Bloesch, M. and Hutter, M., 2018. Probabilistic Terrain Mapping for Mobile Robots with Uncertain Localization. IEEE Robotics and Automation Letters, 3(4), pp. 3019-3026.
[3] Siegwart, R., Nourbakhsh, I.R. and Scaramuzza, D., 2011. Introduction to Autonomous Mobile Robots. MIT Press.
[4] Karaman, S. and Frazzoli, E., 2011. Sampling-based Algorithms for Optimal Motion Planning. The International Journal of Robotics Research, 30(7), pp. 846-894.
[5] Dang, T., Mascarich, F., Khattak, S., Papachristos, C. and Alexis, K., 2019. Graph-based Path Planning for Autonomous Robotic Exploration in Subterranean Environments. In 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 3105-3112.
