Self-driving Robot

Building a computer vision and navigation system for a robot that picks up boxes.

Integrated Design Project

The largest piece of coursework in the 2nd year of engineering at CUED is the Integrated Design Project, an intensely competitive full-time four-week project integrating robotics, electronics and software, in teams of six. I was both the team leader and the sole software team member, developing from scratch the computer vision and navigation system for our robot. The only input for navigation is the pixel stream from the camera, and control is achieved by varying the power and direction of each of the two wheels, and via the servo motors driving the scoop and front scanner. The magnetic sensor for scanning blocks is a separate subsystem built by the electronics team.
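Steering by varying the power to two independently driven wheels is a standard differential-drive scheme. As a minimal sketch (not our actual control code; the function name and the [-1, 1] command convention are assumptions), forward and turn commands can be mixed into per-wheel powers like this:

```python
def wheel_powers(forward, turn, max_power=1.0):
    """Mix a forward command and a turn command into (left, right) wheel powers.

    forward and turn are in [-1, 1]; positive turn steers right,
    so the left wheel speeds up and the right wheel slows down.
    """
    left = forward + turn
    right = forward - turn
    # If either wheel would exceed max_power, scale both down together
    # so the turn-to-forward ratio (and hence the steering curve) is preserved.
    biggest = max(abs(left), abs(right))
    if biggest > max_power:
        left *= max_power / biggest
        right *= max_power / biggest
    return left, right
```

Driving straight gives equal powers, while a hard turn pivots mostly on one wheel, e.g. `wheel_powers(0.5, 0.5)` yields full power on the left and none on the right.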

The navigation system used image processing and computer vision to extract the locations of the targets and to detect the position of the robot, in real time at 30 fps. This was done in pure numpy with no other image-processing or recognition libraries, as I found this was the fastest and easiest way to get the level of control I wanted. Our fully autonomous robot won the final competition, scoring 94% of the available marks against a mean score of under 50%. You can view a video of the final run, including the overhead perspective from the image-recognition navigation system, below.
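The core of a pure-numpy target detector of this kind is a vectorised colour threshold followed by a centroid over the matching pixels. The sketch below illustrates the idea under assumed details (the function name, RGB frame layout, and colour bounds are mine, not taken from the project):

```python
import numpy as np

def find_target(frame, lo, hi):
    """Locate a coloured target in an H x W x 3 RGB frame.

    lo and hi are per-channel bounds; returns the (row, col) centroid
    of all in-range pixels, or None if nothing matches.
    """
    lo = np.asarray(lo)
    hi = np.asarray(hi)
    # Boolean mask of pixels whose every channel lies within [lo, hi].
    mask = np.all((frame >= lo) & (frame <= hi), axis=-1)
    if not mask.any():
        return None
    rows, cols = np.nonzero(mask)
    return rows.mean(), cols.mean()
```

Because the thresholding and reduction are single vectorised numpy operations, this style of detector can comfortably run at camera frame rate without any dedicated vision library.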