Degerminator
Overview
Degerminator is a robotic system rapidly assembled on top of a previous platform of our lab to examine the potential of area disinfection using UV germicidal light carried onboard an autonomous system. On this page we outline the basic design of the system in the hope of supporting the rapid deployment of similar systems worldwide. As our group does not consist of medical professionals, we naturally leave all decision-making on the possible use of such systems to the experts.
UV Germicidal Lamp
At the moment the Degerminator robot integrates a YIGSM 60W LED UVC germicidal lamp with a 7200lm rating and a wavelength of 254nm. We welcome advice on better UV lamp options; please see the contact information below.
Robotic Platform
The developed robotic system - dubbed Degerminator - is built around a SuperDroid IG52-DB4 4WD roving platform. The integrated low-level electronics drive the system in skid-steer mode, while a NUC5i7RYH provides high-level computational resources and interfaces with all sensing systems as well as the low-level electronics responsible for vehicle actuation. A simple differential-drive controller, running at 20Hz, drives the system such that the heading difference to the goal configuration is corrected with priority, as sketched below.
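The following is a minimal Python sketch of such a heading-prioritized differential-drive step. The gains, tolerances, and velocity limits are illustrative assumptions and not the parameters used on the actual robot.

```python
import math

# Illustrative parameters (assumed, not the Degerminator's actual tuning).
K_HEADING = 1.5          # proportional gain on heading error [1/s]
K_DISTANCE = 0.5         # proportional gain on distance to goal [1/s]
HEADING_TOLERANCE = 0.2  # rad: below this, forward motion is allowed
MAX_LINEAR = 0.6         # m/s
MAX_ANGULAR = 1.0        # rad/s


def differential_drive_step(x, y, yaw, goal_x, goal_y):
    """One 20Hz control step: returns (linear, angular) velocity commands.

    Heading error toward the goal is corrected with priority: the robot
    only drives forward once it is roughly facing the goal.
    """
    dx, dy = goal_x - x, goal_y - y
    distance = math.hypot(dx, dy)
    heading_error = math.atan2(dy, dx) - yaw
    # Wrap the heading error to [-pi, pi].
    heading_error = math.atan2(math.sin(heading_error), math.cos(heading_error))

    angular = max(-MAX_ANGULAR, min(MAX_ANGULAR, K_HEADING * heading_error))
    if abs(heading_error) > HEADING_TOLERANCE:
        linear = 0.0  # turn in place until roughly aligned with the goal
    else:
        linear = min(MAX_LINEAR, K_DISTANCE * distance)
    return linear, angular
```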
Fun fact: The robot platform was previously used to map radiation (video) as well as in the role of communications extender during the Urban Circuit of the DARPA Subterranean Challenge.
Localization & Terrain Mapping
To enable GPS-denied localization and mapping, Degerminator integrates a Velodyne Puck LITE 16-channel LiDAR with vertical and horizontal fields of view of 30 degrees and 360 degrees respectively and a nominal range of 100m. Utilizing an implementation of the LiDAR Odometry And Mapping (LOAM) [1] method, the robot's full pose (position and orientation), alongside a point cloud representation of the environment, is estimated. Given the pose and the registered point cloud map at every iteration, the robot further fuses data from an oblique-facing RealSense D435 RGBD sensor to derive a higher-resolution representation of the terrain, thereby enhancing its online-built map. Subsequently, terrain traversability analysis between any two candidate robot configurations is enabled using the Elevation Mapping framework of [2]; a simplified sketch of this idea is given below. These functionalities are essential for the autonomous navigation of the robot, especially in indoor, cluttered environments and on rough terrain.
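The following is a minimal Python sketch of the underlying idea, not the actual Elevation Mapping framework of [2]: registered points, expressed in the map frame via the LOAM pose estimate, are rasterized into a 2D elevation grid, and traversability between two cells is checked via the elevation step along the connecting line. The grid resolution and step threshold are illustrative assumptions.

```python
import numpy as np

RESOLUTION = 0.05  # m per grid cell (assumed)
MAX_STEP = 0.15    # max elevation change per cell considered traversable (assumed)


def update_elevation(grid, points, origin=(0.0, 0.0)):
    """Insert registered 3D points (N x 3, map frame) into an elevation grid.

    Unobserved cells are expected to hold np.nan.
    """
    for x, y, z in points:
        i = int((x - origin[0]) / RESOLUTION)
        j = int((y - origin[1]) / RESOLUTION)
        if 0 <= i < grid.shape[0] and 0 <= j < grid.shape[1]:
            grid[i, j] = z if np.isnan(grid[i, j]) else max(grid[i, j], z)
    return grid


def traversable(grid, cell_a, cell_b):
    """Check the elevation steps along the straight line between two cells."""
    n = max(abs(cell_b[0] - cell_a[0]), abs(cell_b[1] - cell_a[1])) + 1
    rows = np.linspace(cell_a[0], cell_b[0], n).astype(int)
    cols = np.linspace(cell_a[1], cell_b[1], n).astype(int)
    heights = grid[rows, cols]
    if np.isnan(heights).any():
        return False  # unobserved terrain is treated as non-traversable
    return bool(np.all(np.abs(np.diff(heights)) <= MAX_STEP))
```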
Autonomous Navigation
Given the online and real-time reconstructed map of the environment, Degerminator uses a differential-drive controller [3] and a collision-free motion planner based on random sampling [4] to guide its way toward desired goal destinations; a simplified planner sketch follows below. An alternative functionality for autonomous area exploration and coverage, based on the work in [5], is also available. The latter addresses cases where not even a rough map of the environment is known a priori and no human supervision of the robot is considered possible.
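As an illustration of the sampling-based planning idea, the sketch below grows a simple RRT-style tree over the 2D terrain map. The method in [4] is RRT*, which additionally rewires the tree toward asymptotic optimality, so this is a simplified stand-in with assumed step size, goal bias, and iteration limit; collision checking is delegated to a traversability test such as the one sketched above.

```python
import math
import random

STEP = 0.5            # m per tree extension (assumed)
GOAL_TOLERANCE = 0.3  # m (assumed)
GOAL_BIAS = 0.1       # probability of sampling the goal directly (assumed)
MAX_ITERATIONS = 2000


def plan(start, goal, bounds, is_traversable):
    """Grow a tree from start toward goal; return a list of 2D waypoints or None.

    bounds = (x_min, x_max, y_min, y_max); is_traversable(a, b) checks the
    straight-line segment between two 2D points against the terrain map.
    """
    nodes = [start]
    parents = {0: None}
    for _ in range(MAX_ITERATIONS):
        sample = goal if random.random() < GOAL_BIAS else (
            random.uniform(bounds[0], bounds[1]),
            random.uniform(bounds[2], bounds[3]))
        # Nearest node in the tree to the sampled point.
        idx = min(range(len(nodes)), key=lambda k: math.dist(nodes[k], sample))
        near = nodes[idx]
        d = math.dist(near, sample)
        new = (near[0] + STEP * (sample[0] - near[0]) / d,
               near[1] + STEP * (sample[1] - near[1]) / d) if d > STEP else sample
        if not is_traversable(near, new):
            continue
        nodes.append(new)
        parents[len(nodes) - 1] = idx
        if math.dist(new, goal) < GOAL_TOLERANCE:
            # Reconstruct the path by walking back through the parent links.
            path, k = [], len(nodes) - 1
            while k is not None:
                path.append(nodes[k])
                k = parents[k]
            return list(reversed(path))
    return None
```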
Interested?
If you are interested, would like to know more, or wish to contribute, feel free to contact us at:
- Prof. Dr. Kostas Alexis ([email protected])
- Frank Mascarich ([email protected])
- Lab general e-mail ([email protected])
References
[1] Zhang, J. and Singh, S., 2014, July. LOAM: Lidar Odometry and Mapping in Real-time. In Robotics: Science and Systems (Vol. 2, No. 9).
[2] Fankhauser, P., Bloesch, M. and Hutter, M., 2018. Probabilistic terrain mapping for mobile robots with uncertain localization. IEEE Robotics and Automation Letters, 3(4), pp.3019-3026.
[3] Siegwart, R., Nourbakhsh, I.R. and Scaramuzza, D., 2011. Introduction to autonomous mobile robots. MIT press.
[4] Karaman, S. and Frazzoli, E., 2011. Sampling-based algorithms for optimal motion planning. The international journal of robotics research, 30(7), pp.846-894.
[5] Dang, T., Mascarich, F., Khattak, S., Papachristos, C. and Alexis, K., 2019, November. Graph-based path planning for autonomous robotic exploration in subterranean environments. In 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) (pp. 3105-3112). IEEE.