Our motivation for this project was to develop a robot that can perform SLAM with decent accuracy while keeping the development costs as low as possible. The project was done in collaboration with IEEE NITK and was presented at the NITK Project Expo 2018.
Highlights of My Contributions:
Wrote the GraphSLAM program as a function that takes in sensor data and outputs the coordinates of the map (see the sketch after this list)
Developed and integrated the sensors to mimic the function of a depth camera such as the Kinect, and wrote the sensor-fusion program that cleans the raw data (illustrated in the second sketch below)
Led a team of 5 members to implement and test the different sub-sections of the overall system, including the computer program, electrical circuitry, and mechanical assembly
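The GraphSLAM function referenced above could look something like the following minimal sketch. This is a one-dimensional linear version (the classic information-matrix formulation), assuming unit information weights, one list of measurements per motion step, and a known number of landmarks; the function name and data layout are illustrative, not the project's actual code.

```python
import numpy as np

def graph_slam(motions, measurements, num_landmarks, world_size=100.0):
    """Minimal 1-D linear GraphSLAM.

    motions      -- robot displacement dx at each time step
    measurements -- per step, a list of (landmark_id, relative_distance)
    Returns estimated robot positions followed by landmark coordinates.
    """
    n = len(motions) + 1 + num_landmarks  # poses + landmarks
    omega = np.zeros((n, n))              # information matrix
    xi = np.zeros(n)                      # information vector

    # Anchor the first pose at the centre of the world.
    omega[0, 0] += 1.0
    xi[0] += world_size / 2.0

    for t, dx in enumerate(motions):
        # Motion constraint: pose[t+1] - pose[t] = dx
        omega[t, t] += 1.0
        omega[t + 1, t + 1] += 1.0
        omega[t, t + 1] -= 1.0
        omega[t + 1, t] -= 1.0
        xi[t] -= dx
        xi[t + 1] += dx

        # Measurement constraints: landmark - pose[t] = distance
        for lm_id, dist in measurements[t]:
            lm = len(motions) + 1 + lm_id
            omega[t, t] += 1.0
            omega[lm, lm] += 1.0
            omega[t, lm] -= 1.0
            omega[lm, t] -= 1.0
            xi[t] -= dist
            xi[lm] += dist

    # Solving the linear system gives the most likely poses and map.
    return np.linalg.solve(omega, xi)
```

Running in two dimensions amounts to applying the same construction independently to the x and y axes.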
The first stage of the project was map generation using Gmapping on the Gazebo platform with a model of the Kobuki TurtleBot. We created a simple maze-like arena with the robot placed inside it. The map generated through SLAM was reasonably similar to the actual arena.
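For reference, once Gmapping is running it publishes the occupancy grid on the /map topic, so a minimal ROS 1 node like the sketch below can inspect the map as it is built. The node name and log message are illustrative.

```python
#!/usr/bin/env python
import rospy
from nav_msgs.msg import OccupancyGrid

def on_map(msg):
    # Gmapping publishes nav_msgs/OccupancyGrid on /map; each cell is
    # -1 (unknown), 0 (free), or 100 (occupied).
    occupied = sum(1 for cell in msg.data if cell == 100)
    rospy.loginfo("map %dx%d, %d occupied cells",
                  msg.info.width, msg.info.height, occupied)

rospy.init_node('map_listener')
rospy.Subscriber('/map', OccupancyGrid, on_map)
rospy.spin()
```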
In the second stage of the project, the team built a physical robot with an ultrasonic sensor and a camera. Each landmark had a unique colour: the ultrasonic sensor measured the distance between the robot and a landmark, and the camera identified that landmark by its colour. The distance measured by the ultrasonic sensor and the angle from the servo motor were used as inputs to the GraphSLAM function.
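A sketch of how one such observation might be turned into landmark coordinates for GraphSLAM, assuming the robot's current pose estimate is available; the median filter for the noisy ultrasonic readings and the function name are assumptions, not the project's exact code.

```python
import math
import statistics

def landmark_position(robot_x, robot_y, robot_heading, servo_angle, ranges):
    """Convert one ultrasonic/servo observation into world coordinates.

    robot_heading, servo_angle -- radians; servo_angle is measured
    relative to the robot's heading.
    ranges -- repeated ultrasonic readings of the same landmark (metres).
    """
    # Ultrasonic sensors occasionally return spurious echoes; a median
    # over a few readings discards those outliers.
    distance = statistics.median(ranges)

    # Bearing of the landmark in the world frame.
    bearing = robot_heading + servo_angle

    # Polar-to-Cartesian conversion yields the landmark coordinates,
    # which then enter the GraphSLAM function as a constraint.
    return (robot_x + distance * math.cos(bearing),
            robot_y + distance * math.sin(bearing))
```

For example, `landmark_position(0.0, 0.0, 0.0, math.radians(30), [1.02, 0.98, 1.00])` places a landmark one metre away at a 30° bearing from the robot.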