Navigation

Our rovers' navigation relies on an RGB-D graph-based approach to simultaneous localization and mapping (rtabmap). For this we use image and point-cloud data from a commercial ASUS Xtion sensor. The algorithm employs feature detection to compute visual odometry for our rovers, and an appearance-based loop-closure detector to refine the generated cloud map.

Localization and map generated with rtabmap in our simulation

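The loop-closure idea can be illustrated with a toy sketch: each keyframe is summarized by a global appearance descriptor (e.g. a bag-of-words histogram), and a loop closure is declared when the current frame's descriptor is sufficiently similar to that of an earlier, non-recent keyframe. This is a simplified illustration of the concept, not rtabmap's actual implementation; the descriptor format and thresholds here are hypothetical.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two appearance descriptor vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def detect_loop_closure(current, keyframes, threshold=0.9, min_gap=2):
    """Return the index of the best-matching earlier keyframe, or None.

    `current` is the current frame's global descriptor; the most recent
    `min_gap` keyframes are skipped so the detector does not trivially
    match its immediate predecessors.
    """
    candidates = keyframes[:-min_gap] if min_gap else keyframes
    best_idx, best_sim = None, threshold
    for idx, descriptor in enumerate(candidates):
        sim = cosine_similarity(current, descriptor)
        if sim >= best_sim:
            best_idx, best_sim = idx, sim
    return best_idx
```

When a match is found, a real SLAM back end would add a loop-closure constraint between the two poses and re-optimize the pose graph, pulling accumulated odometry drift back into alignment.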
We integrate the data of several sensors to generate an obstacle map. Our rovers then navigate this map using an online optimal trajectory planner based on the Timed-Elastic-Band method (teb_local_planner).

Our path planning in the simulation

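The core elastic-band intuition behind such planners can be sketched in a few lines: each interior waypoint of a path is iteratively pulled toward its neighbours (contraction) and pushed away from nearby obstacles (repulsion). This toy 2-D relaxation step is only a sketch of the classic elastic-band deformation; the actual teb_local_planner additionally optimizes timing, kinematic, and dynamic constraints, and all gains below are made-up values.

```python
import math

def elastic_band_step(path, obstacles, contraction=0.5, repulsion=0.2, influence=1.0):
    """One relaxation step of a simplified elastic band in 2-D.

    `path` is a list of (x, y) waypoints; endpoints stay fixed. Each
    interior waypoint moves toward the midpoint of its neighbours and
    away from any obstacle closer than `influence`.
    """
    new_path = [path[0]]
    for i in range(1, len(path) - 1):
        x, y = path[i]
        (px, py), (nx, ny) = path[i - 1], path[i + 1]
        # internal force: contract toward the neighbours' midpoint
        fx = contraction * ((px + nx) / 2 - x)
        fy = contraction * ((py + ny) / 2 - y)
        # external force: repel from obstacles inside the influence radius
        for ox, oy in obstacles:
            dx, dy = x - ox, y - oy
            d = math.hypot(dx, dy)
            if 0 < d < influence:
                fx += repulsion * (influence - d) * dx / d
                fy += repulsion * (influence - d) * dy / d
        new_path.append((x + fx, y + fy))
    new_path.append(path[-1])
    return new_path
```

Iterating this step smooths the band while keeping it clear of obstacles; the timed variant used by teb_local_planner also attaches time intervals to the waypoints so that velocity and acceleration limits can be enforced.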
For autonomous exploration we adopt a frontier-based approach: our rovers repeatedly navigate to the boundaries between explored free space and unexplored space (frontiers) until the entire environment has been explored.
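Frontier detection itself is straightforward on an occupancy grid: a frontier cell is a known-free cell with at least one unknown neighbour. A minimal sketch, assuming a simple 2-D grid with hypothetical cell values (the real map representation and connectivity may differ):

```python
FREE, OCCUPIED, UNKNOWN = 0, 1, -1

def find_frontiers(grid):
    """Return (row, col) cells that are free and border unknown space.

    `grid` is a 2-D occupancy grid; a frontier cell is a FREE cell with
    at least one 4-connected UNKNOWN neighbour.
    """
    rows, cols = len(grid), len(grid[0])
    frontiers = []
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] != FREE:
                continue
            for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == UNKNOWN:
                    frontiers.append((r, c))
                    break
    return frontiers
```

An exploration loop would then pick a frontier (e.g. the nearest or largest cluster), send it to the planner as a goal, and repeat until `find_frontiers` returns an empty list, i.e. the map is complete.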