Graph-SLAM

Our motivation for this project was to develop a robot that can perform SLAM with reasonable accuracy while keeping development costs as low as possible. The project was done in collaboration with IEEE NITK and was presented at the NITK Project Expo 2018.

Highlights of My Contributions:

The first stage of the project was map generation using Gmapping on the Gazebo platform with a model of the Kobuki TurtleBot. We created a simple maze-like arena, placed the robot inside it, and ran SLAM; the generated map was a reasonably close match to the actual arena.
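As a rough illustration of how the simulated map can be inspected, the occupancy grid that gmapping publishes on the standard /map topic can be captured with a short rospy node and compared against the arena. The topic name and message type below follow the usual ROS 1 gmapping setup; the file name and the idea of saving the grid for comparison are assumptions for this sketch, not the project's exact workflow.

```python
#!/usr/bin/env python
# Minimal sketch (ROS 1 / rospy): capture the occupancy grid published by
# gmapping on /map and store it as a numpy array for visual comparison
# with the Gazebo arena. The saving/comparison step is hypothetical.
import numpy as np
import rospy
from nav_msgs.msg import OccupancyGrid

def map_callback(msg):
    # OccupancyGrid data is a flat row-major list; reshape it using the metadata.
    grid = np.array(msg.data, dtype=np.int8).reshape(msg.info.height, msg.info.width)
    # Cell values: -1 = unknown, 0 = free, 100 = occupied.
    np.save("gmapping_map.npy", grid)
    rospy.loginfo("Saved %dx%d map (resolution %.3f m/cell)",
                  msg.info.width, msg.info.height, msg.info.resolution)

if __name__ == "__main__":
    rospy.init_node("map_saver_sketch")
    rospy.Subscriber("/map", OccupancyGrid, map_callback)
    rospy.spin()
```

In practice the same result can be obtained with the standard map_saver node from the map_server package; the snippet above is simply a programmatic equivalent.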
In the second stage of the project, the team built a physical robot with an ultrasonic sensor and a camera. Each landmark had a unique colour: the ultrasonic sensor measured the distance between the robot and a landmark, while the camera identified that landmark by its colour. The distance measured by the ultrasonic sensor and the angle of the servo motor were used as the range and bearing inputs to the GraphSLAM function.
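The GraphSLAM step can be sketched as follows: each range r and servo bearing θ is converted into a relative landmark offset (r cos θ, r sin θ), and these offsets, together with the odometry between consecutive poses, become linear constraints on an information matrix Ω and information vector ξ; solving μ = Ω⁻¹ξ recovers the pose and landmark estimates. The code below is the standard linear GraphSLAM formulation rather than the project's exact function, and the names, tuple layouts, and noise weights are illustrative assumptions.

```python
# Sketch of a linear GraphSLAM solver (information form), assuming each
# measurement is (pose_index, landmark_index, range_m, bearing_rad) with the
# bearing taken from the servo angle, and each motion is (dx, dy) odometry.
import numpy as np

def graph_slam(motions, measurements, num_poses, num_landmarks,
               motion_noise=1.0, measurement_noise=1.0):
    n = num_poses + num_landmarks          # one variable per node and per axis
    mu = np.zeros((n, 2))                  # column 0 = x estimates, column 1 = y

    for dim in range(2):                   # x and y are solved independently
        omega = np.zeros((n, n))           # information matrix
        xi = np.zeros(n)                   # information vector

        omega[0, 0] += 1.0                 # anchor the first pose at the origin

        # Motion constraints link consecutive poses.
        for t, motion in enumerate(motions):
            w = 1.0 / motion_noise
            d = motion[dim] * w
            omega[t, t] += w
            omega[t + 1, t + 1] += w
            omega[t, t + 1] -= w
            omega[t + 1, t] -= w
            xi[t] -= d
            xi[t + 1] += d

        # Measurement constraints link a pose to a landmark via range/bearing.
        for t, lm, rng, bearing in measurements:
            offset = rng * (np.cos(bearing) if dim == 0 else np.sin(bearing))
            j = num_poses + lm
            w = 1.0 / measurement_noise
            omega[t, t] += w
            omega[j, j] += w
            omega[t, j] -= w
            omega[j, t] -= w
            xi[t] -= offset * w
            xi[j] += offset * w

        mu[:, dim] = np.linalg.solve(omega, xi)

    return mu[:num_poses], mu[num_poses:]   # pose estimates, landmark estimates
```

For example, with a single motion of (1.0, 0.0) and one landmark observed at range 2.0 m from pose 0 and 1.0 m from pose 1 (both at bearing 0), graph_slam([(1.0, 0.0)], [(0, 0, 2.0, 0.0), (1, 0, 1.0, 0.0)], 2, 1) recovers poses at x = 0 and x = 1 and the landmark at x = 2, which is consistent with both measurements.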