Autonomous Robots Lab

Semester Projects

The course is developed around challenging, research-oriented semester projects. Those available for Fall 2017 are listed below:
Download updated instructions for semester projects. 
Project #1: GPS-denied Autonomous Car Localization in Visually-degraded Conditions
Description: GPS-denied localization employing vision and LiDAR sensors is an extensively studied field, with methods that currently work reliably in simple, feature-rich, or geometrically structured environments. In visually-degraded conditions and with ill-conditioned geometry, however, the problem remains particularly challenging. The goal of this project is to enable GPS-denied Simultaneous Localization And Mapping (SLAM) for autonomous cars navigating in rain, snow, or ice.

Research Tasks:
  • Task 1: Vision, NIR, LiDAR System Integration
  • Task 2: Vision, NIR, LiDAR Sensor Fusion
  • Task 3: Dataset collection and groundtruth stamping using GPS
  • Task 4: Field experiments and evaluation
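As an illustration of the groundtruth-stamping step in Task 3, the sketch below linearly interpolates GPS fixes at each sensor frame's timestamp. The `interpolate_gps` and `stamp_frames` helpers are hypothetical names for illustration, not project code:

```python
from bisect import bisect_left

def interpolate_gps(gps_fixes, t):
    """Linearly interpolate a (lat, lon) groundtruth fix at time t.

    gps_fixes: time-sorted list of (timestamp, lat, lon) tuples.
    Clamps to the first/last fix outside the recorded interval.
    """
    times = [f[0] for f in gps_fixes]
    i = bisect_left(times, t)
    if i == 0:
        return gps_fixes[0][1:]
    if i == len(times):
        return gps_fixes[-1][1:]
    (t0, la0, lo0), (t1, la1, lo1) = gps_fixes[i - 1], gps_fixes[i]
    a = (t - t0) / (t1 - t0)  # interpolation weight in [0, 1]
    return (la0 + a * (la1 - la0), lo0 + a * (lo1 - lo0))

def stamp_frames(frame_times, gps_fixes):
    """Attach an interpolated groundtruth fix to every sensor frame time."""
    return [(t, *interpolate_gps(gps_fixes, t)) for t in frame_times]
```

In practice the sensor and GPS clocks would also need to be synchronized before this alignment is meaningful.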
Project #2: Change Detection for Autonomous Driving
Description: When a vehicle navigates continuously within a certain area, exploiting its previous map to localize robustly within it and pre-plan its actions leads to optimized performance. For this process to be reliable, spatio-temporal change detection has to take place. However, change detection is challenging both in terms of correlating the input data and maps, as well as in terms of map scalability.

Research Tasks:
  • Task 1: Change detection in images
  • Task 2: Volumetric mapping
  • Task 3: Change detection in volumetric maps
  • Task 4: Semantic change classification using convolutional neural nets
  • Task 5: Dataset collection and groundtruthing
  • Task 6: Field experiments and evaluation
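The map-to-map change detection of Task 3 can be sketched minimally, assuming both driving sessions produce sparse voxel maps of occupancy probabilities. The `detect_changes` helper and its simple thresholding rule are illustrative assumptions, not the project's method:

```python
def detect_changes(map_old, map_new, threshold=0.3):
    """Flag voxels whose occupancy probability changed by more than threshold.

    Maps are dicts keyed by integer voxel index (i, j, k) with occupancy
    probabilities in [0, 1]; unobserved voxels are simply absent.
    Returns {voxel: (old_p, new_p)} for voxels observed in both maps;
    changes in space observed only once are ambiguous and skipped.
    """
    changed = {}
    for voxel in map_old.keys() & map_new.keys():
        p_old, p_new = map_old[voxel], map_new[voxel]
        if abs(p_new - p_old) > threshold:
            changed[voxel] = (p_old, p_new)
    return changed
```

A sparse dict representation keeps the comparison linear in the number of observed voxels, which matters for the map-scalability concern raised above.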
Project #3: Robotic Inspection of Mines
Description: Mine inspection presents a major challenge due to the difficult environments and the often visually-degraded conditions involved. This project concerns the development of an aerial and ground robotic system that aims to enable systematic 3D mapping and semantic classification.

Research Tasks:
  • Task 1: Platform development (ideally based on existing robots at the lab)
  • Task 2: Volumetric and surface mapping
  • Task 3: Aerial-to-ground robot collaboration
  • Task 4: Dataset collection and groundtruthing
  • Task 5: Field experiments and evaluation
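In its simplest form, the volumetric mapping of Task 2 amounts to binning range-sensor returns into a sparse voxel grid. This is a toy sketch under that assumption; the `voxelize` helper and the default resolution are illustrative, and a real pipeline would also track free space along each sensor ray:

```python
import math

def voxelize(points, resolution=0.2):
    """Bin 3D points into a sparse voxel grid of hit counts.

    points: iterable of (x, y, z) coordinates in metres;
    resolution: voxel edge length in metres.
    Returns {(i, j, k): hit_count} for occupied voxels only.
    """
    grid = {}
    for x, y, z in points:
        key = (math.floor(x / resolution),
               math.floor(y / resolution),
               math.floor(z / resolution))
        grid[key] = grid.get(key, 0) + 1
    return grid
```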
2016 Semester Projects
Project #5: Aerial Robotics for Nuclear Site Characterization
Description: A century of nuclear research, war and accidents created a worldwide legacy of contaminated sites. Massive cleanup of that nuclear complex is underway. Our broad research goal is to address means to explore and rad-map nuclear sites by deploying unprecedented, tightly integrated sensing, modeling and planning on small flying robots. Within this project in particular, the goal is to develop multi-modal sensing and mapping capabilities by fusing visual cues with thermal and radiation camera data along with inertial sensor readings. Ultimately, the aerial robot should be able to derive 3D maps of its environment that are further annotated with the spatial thermal and radiation distribution. Technically, this will be achieved via the development of a multi-modal localization and mapping pipeline that exploits the different sensing modalities (inertial, visible-light, thermal and radiation camera) in a synchronized and complementary fashion. Finally, within the project you are expected to demonstrate the autonomous multi-modal mapping capabilities via relevant experiments using a multirotor aerial robot.

Research Tasks:
  • Task 1: Thermal, LiDAR, Radiation Sensing modules integration
  • Task 2: Thermal camera-SLAM
  • Task 3: Multi-modal 3D maps
  • Task 4: Estimation of spatial distribution of heat and radiation
  • Task 5: Heat/Radiation source seek planning
  • Task 6: Robot Evaluation and Demonstration
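For Task 4, one simple baseline for estimating a spatial heat or radiation distribution from point measurements taken along the robot's trajectory is inverse-distance weighting. The sketch below is illustrative only, not the project's actual estimator:

```python
def idw_estimate(samples, query, power=2.0, eps=1e-9):
    """Inverse-distance-weighted estimate of a scalar field at `query`.

    samples: list of ((x, y, z), value) measurement pairs collected
    along the trajectory; query: (x, y, z) point to estimate at.
    """
    num = den = 0.0
    for (sx, sy, sz), value in samples:
        d2 = (query[0] - sx) ** 2 + (query[1] - sy) ** 2 + (query[2] - sz) ** 2
        if d2 < eps:  # query coincides with a sample point
            return value
        w = 1.0 / d2 ** (power / 2.0)  # weight ~ 1 / distance^power
        num += w * value
        den += w
    return num / den
```

Evaluating this on a regular grid over the mapped volume yields the annotated distribution described above; a source-seeking planner (Task 5) could then steer toward the rising gradient of the estimate.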
Team:
  • Shehryar Khattak
  • Tung Dang
  • Tuan Le
  • Nhan Pham
  • Tim Kwist
  • Daniel Mendez

Collaborators: Nevada Advanced Autonomous Systems Innovation Center - https://www.unr.edu/naasic
Budget: $2,000
Project #4: Aerial Robotics for Climate Monitoring and Control
Description: Within this project you are requested to develop an aerial robot capable of environmental monitoring. In particular, you will build an "environmental sensing pod" that integrates visible-light and multispectral cameras, a GPS receiver, and inertial, atmospheric-quality, and temperature sensors. Through appropriate sensor fusion, the aerial robot should be able to estimate a consistent 3D terrain/atmospheric map of its environment in which every spatial point is annotated with atmospheric measurements and the altitude at which they were taken (or ideally their spatial distribution). To enable advanced operational capacity, a fixed-wing aerial robot should be employed and GPS-based navigation should be automated. Ideally, the aerial robot should also be able to autonomously derive paths that ensure sufficient coverage of environmental sensing data.

Research Tasks:
  • Task 1: Autopilot integration and verification
  • Task 2: Sensing modules and Processing unit integration
  • Task 3: Integration of Visual-Inertial SLAM solution
  • Task 4: Environmental-data trajectory annotation and estimation of spatial distributions
  • Task 5: Real-time plane extraction for landing
  • Task 6: Robot Evaluation and Demonstration
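Task 4's annotation step can be illustrated by reducing an annotated trajectory to an altitude profile of a measured quantity. This is a toy sketch; the band-averaging scheme and the `altitude_profile` helper are assumptions for illustration:

```python
def altitude_profile(annotated_points, band=10.0):
    """Average an atmospheric measurement over altitude bands.

    annotated_points: list of (altitude_m, value) pairs from the
    annotated trajectory; band: altitude bin height in metres.
    Returns {band_index: mean_value}, where band_index * band is the
    bottom of each altitude band.
    """
    sums, counts = {}, {}
    for alt, value in annotated_points:
        k = int(alt // band)
        sums[k] = sums.get(k, 0.0) + value
        counts[k] = counts.get(k, 0) + 1
    return {k: sums[k] / counts[k] for k in sums}
```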

Team:
  • Jason Rush
  • Mat Boggs
  • Devaul Tyler Timothy
  • Frank Mascarich

Collaborators: Desert Research Institute - https://www.dri.edu/
Budget: $2,000
Project #3: Robots to Study Lake Tahoe!
Description: Water is a nexus of global struggle, and increasing pressure on water resources is driven by large-scale perturbations such as climate change, invasive species, dam development and diversions, pathogen occurrence, nutrient deposition, pollution, toxic chemicals, and increasing and competing human demands. These problems are multidimensional and require integrative, data-driven solutions enabled by environmental data collection at various scales in space and time. Currently, most ecological research that quantifies impacts from perturbations in aquatic ecosystems is based on (i) the collection of single snapshot data in space, or (ii) multiple collections from a single part of an ecosystem over time. Ecosystems are inherently complex; therefore, having access to only these relatively coarse and incomplete collections in space and time can result in less-than-optimal data-based solutions. The goal of this project is to design and develop a platform that can be used on the surface of a lake to quantify water quality changes in the nearshore environment (1-10 m deep). The platform would autonomously monitor water quality (temperature, turbidity, oxygen, chl-a) at a given depth.

Research Tasks:
  • Task 1: Autopilot integration and verification
  • Task 2: Sensing modules and Processing unit integration
  • Task 3: Robot Localization and Mapping using Visual-Inertial solution
  • Task 4: Visible-light/thermal fusion for unified 3D reconstruction
  • Task 5: Robot boat autonomous navigation for shoreline tracking
  • Task 6: Robot Evaluation and Demonstration
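The shoreline tracking of Task 5 can be sketched, at its simplest, as a proportional heading controller that steers the boat toward the next shoreline waypoint. This is illustrative only; the gain, the yaw-rate command interface, and the `heading_command` helper are assumptions:

```python
import math

def heading_command(pose, waypoint, gain=1.0, max_rate=0.5):
    """P-controller yaw-rate command steering a surface vehicle to a waypoint.

    pose: (x, y, yaw) with yaw in radians; waypoint: (x, y).
    Returns a yaw-rate command clamped to +/- max_rate rad/s.
    """
    x, y, yaw = pose
    desired = math.atan2(waypoint[1] - y, waypoint[0] - x)
    # wrap the heading error to [-pi, pi) so the boat turns the short way
    err = (desired - yaw + math.pi) % (2 * math.pi) - math.pi
    return max(-max_rate, min(max_rate, gain * err))
```

Iterating this against a list of waypoints sampled along the shoreline, and advancing to the next waypoint once within a capture radius, gives a basic tracking behavior.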
Team:
  • Camille Bourquin
  • Stephen Williams
  • Tyler Schmidt
  • Steven King
  • Ke Xu

Collaborators: Aquatic Ecosystems Analysis Lab - http://aquaticecosystemslab.org/, NAASIC
Budget: $2,000
Project #2: Autonomous Car Navigation
Description: Autonomous transportation systems are not only an ongoing research trend but also a key factor in the progress of our societies, the safety of transportation, greener technologies, growth, and a better quality of life. The goal of this project will be to develop a miniaturized autonomous car able to navigate while mapping its environment, detecting objects in it (other cars), and performing collision-avoidance maneuvers. To achieve this goal, the robot will integrate controlled steering and a perception system that fuses data from cameras, an inertial measurement unit and depth sensors, making it able to robustly perform the simultaneous localization and mapping task. Finally, a local path planner will guide the steering control towards collision-free paths.

Research Tasks:
  • Task 1: Sensing modules and Processing Unit Integration
  • Task 2: Autopilot integration and verification
  • Task 3: Robot Localization using LiDAR/RGBD/Visual-SLAM
  • Task 4: Static/Dynamic Obstacle Detection
  • Task 5: Robot Motion Collision-free Planning
  • Task 6: Robot Evaluation and Demonstration
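The collision-free planning of Task 5 can be illustrated with breadth-first search on a 2D occupancy grid, which returns a shortest obstacle-free path in grid steps. This is a toy baseline for intuition, not the local planner the project would ultimately use:

```python
from collections import deque

def plan_path(grid, start, goal):
    """Breadth-first search for a collision-free path on a 2D occupancy grid.

    grid: list of rows, 0 = free, 1 = obstacle; start/goal: (row, col).
    Returns the list of cells from start to goal (inclusive), or None
    if no obstacle-free path exists.
    """
    rows, cols = len(grid), len(grid[0])
    parent = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:
            path = []  # walk parents back to the start
            while cell is not None:
                path.append(cell)
                cell = parent[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in parent):
                parent[(nr, nc)] = cell
                queue.append((nr, nc))
    return None
```

On the real car the grid would come from the SLAM map, with dynamic obstacles (Task 4) inflated and re-planned against at each control cycle.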

Team:
  • Niki Silveria
  • Monique Dingle
  • Phoebe Argon
  • Jason Worsnop Cody 
  • Brett Knadle
  • Phillip Vong
Collaborators: Nevada Center for Applied Research, NAASIC
Budget: $2,000
Project #1: Smartphone-assisted Delivery Drone Landing
Description: This project will run in collaboration with Flirtey - the first parcel delivery company conducting real-life operations in the US. The goal is to develop a system that exploits direct/indirect communication between a smartphone and the aerial robot such that delivery landing "on top" of the smartphone becomes possible. Such an approach will enable commercial parcel delivery within challenging and cluttered urban environments. Within the framework of the project, we seek the most reliable, novel, but also technologically feasible solution to the problem at hand. The aerial robot will be capable of visual processing and may implement different communication protocols, while the smartphone should be considered "as available" on the market.

Research Tasks:
  • Task 1: Autopilot integration
  • Task 2: Camera systems integration
  • Task 3: Robot-to-Phone and Phone-to-Robot cooperative localization
  • Task 4: Visual-servoing phone tracking
  • Task 5: Autonomous Landing on phone
  • Task 6: Robot Evaluation and Demonstration
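The visual servoing of Task 4 can be sketched as a proportional controller mapping the pixel error of the detected phone marker to lateral velocity commands. This is illustrative only; the down-facing camera orientation, gain, velocity interface, and `servo_command` helper are all assumptions:

```python
def servo_command(marker_px, image_size, gain=0.002, max_v=0.5):
    """Image-based visual servo: velocity command centring a detected marker.

    marker_px: (u, v) pixel of the detected phone marker;
    image_size: (width, height) in pixels.
    Returns (vx, vy) body-frame velocities in m/s that drive the marker
    toward the image centre, assuming a down-facing camera whose axes
    are aligned with the body frame.
    """
    cu, cv = image_size[0] / 2.0, image_size[1] / 2.0
    ex, ey = marker_px[0] - cu, marker_px[1] - cv  # pixel error from centre
    clamp = lambda v: max(-max_v, min(max_v, v))
    return clamp(gain * ex), clamp(gain * ey)
```

Once the marker is centred and the altitude estimate is low enough, the controller would hand over to the landing phase of Task 5.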
Team: 
  • Sajid Zeeshan
  • Lopez Austin
  • Golden Erik
  • Kevin Green

Collaborators: Flirtey - http://flirtey.com/
Budget: $2,000
