Our NSF REU Site student Ryan Fite from the Colorado School of Mines, who worked at our lab during summer 2018, was selected by NSF for the 2018 Council on Undergraduate Research's REU Symposium in Alexandria, VA. This is a nationally competitive conference of REU participants, so congratulations to Ryan, and also to his mentor, Shehryar Khattak, PhD candidate at our lab.
Ryan's work is featured in our GitHub repositories, specifically at: https://github.com/unr-arl/hfsd
NSF REU Site Student working at ARL selected by NSF for 2018 Council on Undergraduate Research’s REU Symposium
Inside Unmanned Systems honors us with a detailed article on our work on nuclearized flying robots: http://insideunmannedsystems.com/charting-cleanup-detecting-nuclear-waste/
Furthermore, our through-smoke-flying robot is featured on the cover of the journal. Thank you!
Our ISVC 2018 papers:
In this video we present results from a field deployment of robotic systems inside an underground mine.
The deployment involved the autonomous operation, exploration, and mapping of underground mine drifts and headings using two aerial robots. The first robot based its operation on the fusion of visible-light and thermal camera data alongside IMU cues, while the second robot employed LiDAR as its primary sensing modality.
Both robotic systems demonstrated the ability to operate in the challenging underground environment. In addition, we evaluated the ability of a comprehensive LiDAR, visual- and thermal-inertial sensor system to provide persistent localization when ferried onboard a truck navigating across the mine drifts.
Such field deployments inside underground mines enable a specific analysis of the improvements and technological breakthroughs required for fully autonomous, long-term subterranean robots. It is expected that, among other applications, these robots will be able to greatly support the needs and goals of the mining industry.
We just released a dataset with labeled data for vehicle classification during nighttime. You can access the dataset following this link: https://github.com/unr-arl/vehicles-nighttime
In this video we present results for the task of unsupervised anomaly detection for aerial robotic surveillance. For environments in which anomaly data are sparse or absent altogether, this work proposes merging deep-learned visual features with one-class support vector machines to efficiently detect anomalies in camera data in real time. Results are shown for area surveillance using a camera-equipped aerial robot conducting a coverage path over an area into which a few man-made structures have been introduced to serve as anomalies against their environment.
Training Data: Camera frames from similar environments but lacking any man-made structure or humans.
Test Data: Camera frames collected by the aerial robot over an area similar to that of the training data, but into which a few man-made structures and humans have also been introduced. These man-made structures and humans should be detected as anomalies in the data.
This preliminary work is presented as a Late Breaking Result at IEEE ICRA 2018.
Our paper on thermal-inertial navigation: