Resources for the Semester Projects
All course projects are supported by hardware systems that enable the autonomous navigation of a robotic system, possibly extended with further sensing capabilities for a particular application. Below you will find general resources for some of the most fundamental tasks within your work.
Development Resources
1. The Robot Operating System
2. Autopilot system
3. ROS-to-Pixhawk communication via MAVLink
4. Camera Calibration
5. Robot Localization & Image Processing
6. Robot Path Planning
7. Robot Simulation
8. Resources from the Code Repository
Basic Project Workflow
Sensor Datasets
Study Resources
Development Resources
1. The Robot Operating System
- Website: http://wiki.ros.org/
- Tutorials: http://wiki.ros.org/ROS/Tutorials
2. Autopilot system
- The Pixhawk autopilot: https://pixhawk.org/
- The PX4 Flight Stack for Pixhawk: http://px4.io/
- PX4 stack user-guide: http://px4.io/user-guide/
- PX4 stack dev-guide: http://dev.px4.io/
3. ROS-to-Pixhawk communication via MAVLink
- MAVROS: http://wiki.ros.org/mavros
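Once MAVROS is running, a quick way to verify the link to the autopilot is a small rospy node that watches the vehicle state and can request a mode change or arming. The sketch below assumes the default MAVROS names (/mavros/state, /mavros/set_mode, /mavros/cmd/arming); adapt the namespaces to your launch configuration.

    #!/usr/bin/env python
    # Minimal MAVROS connectivity check (sketch; assumes default mavros namespaces).
    import rospy
    from mavros_msgs.msg import State
    from mavros_msgs.srv import SetMode, CommandBool

    current_state = State()

    def state_cb(msg):
        global current_state
        current_state = msg

    if __name__ == "__main__":
        rospy.init_node("mavros_check")
        rospy.Subscriber("/mavros/state", State, state_cb)
        rospy.wait_for_service("/mavros/set_mode")
        rospy.wait_for_service("/mavros/cmd/arming")
        set_mode = rospy.ServiceProxy("/mavros/set_mode", SetMode)
        arming = rospy.ServiceProxy("/mavros/cmd/arming", CommandBool)

        rate = rospy.Rate(1.0)
        while not rospy.is_shutdown() and not current_state.connected:
            rospy.loginfo("Waiting for FCU connection...")
            rate.sleep()
        rospy.loginfo("Connected. Mode: %s, armed: %s",
                      current_state.mode, current_state.armed)

        # Example requests, to be issued once setpoints are streaming:
        # set_mode(custom_mode="OFFBOARD")
        # arming(value=True)

Keep in mind that PX4 only accepts the OFFBOARD mode while position setpoints are already being streamed; see also the workflow sketch further down this page.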
4. Camera Calibration
- Monocular-MATLAB: https://www.mathworks.com/help/vision/ug/single-camera-calibrator-app.html
- Monocular-ROS: http://wiki.ros.org/camera_calibration
- Stereo-MATLAB: http://www.mathworks.com/help/vision/ug/stereo-camera-calibrator-app.html
- Stereo-ROS: http://wiki.ros.org/camera_calibration/Tutorials/StereoCalibration
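If you prefer scripting the intrinsic calibration over the apps listed above, the standard OpenCV checkerboard pipeline is sketched below. The 9x6 corner pattern, the square size, and the calib_*.png file names are placeholders for your own calibration data.

    # Checkerboard intrinsic calibration with OpenCV (sketch; board size and
    # image paths are placeholders for your own data).
    import glob
    import cv2
    import numpy as np

    pattern = (9, 6)   # inner corners per checkerboard row/column
    square = 0.025     # square size in meters (only scales the extrinsics)

    # 3D corner coordinates in the board frame, reused for every image
    objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2) * square

    obj_points, img_points = [], []
    for fname in glob.glob("calib_*.png"):
        gray = cv2.imread(fname, cv2.IMREAD_GRAYSCALE)
        found, corners = cv2.findChessboardCorners(gray, pattern)
        if not found:
            continue
        corners = cv2.cornerSubPix(
            gray, corners, (11, 11), (-1, -1),
            (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
        obj_points.append(objp)
        img_points.append(corners)

    rms, K, dist, _, _ = cv2.calibrateCamera(
        obj_points, img_points, gray.shape[::-1], None, None)
    print("RMS reprojection error:", rms)
    print("Camera matrix:\n", K)
    print("Distortion coefficients:", dist.ravel())

For ROS-based systems, the camera_calibration package linked above provides an interactive tool and stores the result directly in the camera_info format expected by the rest of the image pipeline.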
5. Robot Localization & Image Processing
- USB Camera in ROS: http://wiki.ros.org/usb_cam
- View image frames in ROS: http://wiki.ros.org/image_view
- Intel Realsense with ROS: http://wiki.ros.org/RealSense
- Visual odometry in ROS: http://wiki.ros.org/fovis_ros
- Stereo visual odometry in ROS: http://wiki.ros.org/viso2_ros
- RGB-D SLAM (e.g., with sensors such as the Kinect or RealSense): http://wiki.ros.org/rgbdslam
- OpenCV Apps (e.g., edge detection, optical flow): http://wiki.ros.org/opencv_apps
- Visual Servoing in ROS: http://wiki.ros.org/visp
- Create an octomap to support robot navigation: http://wiki.ros.org/octomap
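Most of the packages above consume or produce sensor_msgs/Image; the sketch below shows the usual cv_bridge pattern for pulling such frames into OpenCV and publishing a processed result back to ROS. The topic /usb_cam/image_raw is the usb_cam default and may differ on your system, and the Canny edge detector is only a placeholder operation.

    #!/usr/bin/env python
    # Subscribe to a ROS image stream and process it with OpenCV (sketch).
    # /usb_cam/image_raw is the usb_cam default topic; remap it to your camera.
    import rospy
    import cv2
    from sensor_msgs.msg import Image
    from cv_bridge import CvBridge

    bridge = CvBridge()
    pub = None

    def image_cb(msg):
        frame = bridge.imgmsg_to_cv2(msg, desired_encoding="bgr8")
        edges = cv2.Canny(frame, 50, 150)   # placeholder processing step
        pub.publish(bridge.cv2_to_imgmsg(edges, encoding="mono8"))

    if __name__ == "__main__":
        rospy.init_node("edge_example")
        pub = rospy.Publisher("edges", Image, queue_size=1)
        rospy.Subscriber("/usb_cam/image_raw", Image, image_cb, queue_size=1)
        rospy.spin()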
6. Robot Path Planning
- The Open Motion Planning Library: http://ompl.kavrakilab.org/
- MoveIt! (using OMPL within ROS directly): http://wiki.ros.org/moveit
- Exploration of unknown environments: https://github.com/ethz-asl/nbvplanner
- Structural Inspection Planner: https://github.com/ethz-asl/StructuralInspectionPlanner
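OMPL also provides Python bindings, which are handy for testing a planning setup before integrating it through MoveIt!. The sketch below plans in a plain 3D position space with a trivially-true validity check; in a real project the validity check would query your map (e.g., an octomap), and the bounds, start, and goal values are placeholders.

    # Minimal OMPL example in a 3D position space (sketch; the validity check,
    # bounds, start, and goal are placeholders for your own environment).
    from ompl import base as ob
    from ompl import geometric as og

    def is_state_valid(state):
        # Replace with a collision check against your map representation.
        return True

    space = ob.RealVectorStateSpace(3)
    bounds = ob.RealVectorBounds(3)
    bounds.setLow(-10.0)
    bounds.setHigh(10.0)
    space.setBounds(bounds)

    ss = og.SimpleSetup(space)
    ss.setStateValidityChecker(ob.StateValidityCheckerFn(is_state_valid))

    start = ob.State(space)
    start[0], start[1], start[2] = 0.0, 0.0, 1.0
    goal = ob.State(space)
    goal[0], goal[1], goal[2] = 5.0, 5.0, 1.5
    ss.setStartAndGoalStates(start, goal)

    if ss.solve(1.0):   # plan for at most one second
        ss.simplifySolution()
        print(ss.getSolutionPath())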
7. Robot Simulation
- RotorS Simulator: https://github.com/ethz-asl/rotors_simulator
- PX4/Gazebo Simulation (related to the above): http://dev.px4.io/simulation-gazebo.html
- MATLAB Quadcopter Simulation: http://www.mathworks.com/help/aeroblks/examples/quadcopter-project.html
- Robot Differential Drive Control with MATLAB: https://www.mathworks.com/help/robotics/examples/path-following-for-a-differential-drive-robot.html
8. Resources from the Code Repository
Basic Project Workflow
The basic process to follow in order to accomplish the goals of all projects is the following:
- Get the autopilot to work for your robot.
- Will you operate based on GPS guidance only? If yes, rely on the position control provided at the level of the autopilot. If not, the autopilot will have to be interfaced with a high-level processing system, which should guide the low-level robot control on how to execute the appropriate trajectories.
- Get your high-level processor to communicate with the autopilot. Since we consider Pixhawk-based systems, use MAVROS, which relies on the MAVLink protocol.
- Get your sensors interfaced using ROS.
- For your cameras, perform camera calibration.
- If you are using multiple sensors together, consider also performing an extrinsic calibration between them.
- If you are to rely on localization using on-board sensors, consider a relevant framework such as fovis or viso2 (these are standard, solid choices for visual odometry; other approaches exist). For more specific options, contact [email protected]
- What path planning are you doing? For collision-free waypoint navigation, use OMPL/MoveIt!. For exploration of unknown environments, use nbvplanner. For inspection of facilities for which you have a prior mesh model, consider using the Structural Inspection Planner.
- Get the topics of the path planner and of the localization system to serve as the reference and the current state, respectively, for your high-level controller (if you are not operating based on GPS navigation only); a minimal example of this wiring is sketched right after this list.
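To make the last two steps concrete, the sketch below forwards the planner reference to the autopilot as a MAVROS position setpoint while keeping track of the localization estimate. The topics /planner/reference and /odometry are placeholders for whatever your planner and localization packages actually publish; /mavros/setpoint_position/local is the standard MAVROS setpoint topic. Remember that PX4 accepts the OFFBOARD mode only while setpoints are being streamed continuously.

    #!/usr/bin/env python
    # Forward the planner reference to the autopilot while tracking the state
    # estimate (sketch; /planner/reference and /odometry are placeholder topics).
    import rospy
    from geometry_msgs.msg import PoseStamped
    from nav_msgs.msg import Odometry

    class ReferenceForwarder(object):
        def __init__(self):
            self.current_odom = None
            self.reference = None
            self.setpoint_pub = rospy.Publisher(
                "/mavros/setpoint_position/local", PoseStamped, queue_size=1)
            rospy.Subscriber("/odometry", Odometry, self.odom_cb, queue_size=1)
            rospy.Subscriber("/planner/reference", PoseStamped, self.ref_cb,
                             queue_size=1)

        def odom_cb(self, msg):
            self.current_odom = msg   # current state for your controller

        def ref_cb(self, msg):
            self.reference = msg      # reference from the path planner

        def spin(self):
            rate = rospy.Rate(20.0)   # stream setpoints continuously
            while not rospy.is_shutdown():
                if self.reference is not None:
                    self.reference.header.stamp = rospy.Time.now()
                    self.setpoint_pub.publish(self.reference)
                rate.sleep()

    if __name__ == "__main__":
        rospy.init_node("reference_forwarder")
        ReferenceForwarder().spin()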
Sensor Datasets
- Intel Realsense - Download ROSBAG
- VI-Sensor Stereo Visual-Inertial system - Download ROSBAG
- Monocular camera observing a smartphone screen displaying a QR code - Download ROSBAG
- Monocular camera observing a smartphone flashing its light - Download ROSBAG
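A convenient way to inspect these datasets without the corresponding hardware is the rosbag Python API; the sketch below simply lists the topics contained in a downloaded bag (dataset.bag is a placeholder filename). You can then play the bag back with rosbag play and work with the topics as if the sensor were connected.

    # List the topics and message counts in a downloaded bag (sketch;
    # 'dataset.bag' is a placeholder filename).
    import rosbag

    with rosbag.Bag("dataset.bag") as bag:
        info = bag.get_type_and_topic_info()
        for topic, topic_info in info.topics.items():
            print("%s  [%s]  %d messages" %
                  (topic, topic_info.msg_type, topic_info.message_count))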
Study Resources
- A set of papers on EKF, SLAM, and path planning relevant to the project topics. Download