A simple example of self-driving using HSV filtering, findContours from OpenCV, and grab_contours from imutils. The code for this example can be found here: https://github.com/garethnisbet/T-BOTS/blob/master/Python/Development/T-Bot_Tracking/DrawAndTrack.py
Image A was an early attempt. Image B shows the results of improved PID tuning. Maybe C will be more like an elephant than a woolly mammoth!
The first half of the video shows the T-Bot being controlled using angular and positional PID loops. The second half uses the same controller, but the coloured discs on top of the T-Bot are rotated by about 40 degrees, so the T-Bot can no longer travel in the direction perpendicular to the line defined by the two discs. While the PID controller is not the most elegant or efficient way of doing this, it certainly proves to be remarkably robust.
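Each of the two loops follows the standard discrete PID form. The sketch below is illustrative only: the gains, setpoint, and plant are placeholders, not the values or interfaces used in the video.

```python
class PID:
    """Minimal discrete PID controller: u = kp*e + ki*integral(e) + kd*de/dt."""

    def __init__(self, kp, ki, kd, setpoint=0.0):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.setpoint = setpoint
        self._integral = 0.0
        self._prev_error = None

    def update(self, measurement, dt):
        """Compute the control output for one timestep of length dt."""
        error = self.setpoint - measurement
        self._integral += error * dt
        derivative = 0.0 if self._prev_error is None else (error - self._prev_error) / dt
        self._prev_error = error
        return self.kp * error + self.ki * self._integral + self.kd * derivative
```

In a two-loop arrangement like the one described, one instance would regulate heading (angular loop) and a second would regulate distance to the target (positional loop), with each output mapped to wheel commands.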
The challenge is to set the best lap time. You will need a Raspberry Pi or PC with Bluetooth capability, a tape measure, a webcam, and a T-Bot. Adjust your camera so the ends of the superimposed sine function are 1 m apart. Then develop a control strategy that gives the best time. The code used for the video can be found here.
A simple keyboard controller for the T-Bot.
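The heart of any such controller is a map from keypresses to drive commands. The bindings and command names below are hypothetical placeholders, not the T-Bot's actual protocol; in the real script each command would be sent to the robot over the Bluetooth link.

```python
# Hypothetical key bindings -- the actual T-Bot commands differ.
KEYMAP = {
    'w': 'forward',
    's': 'back',
    'a': 'left',
    'd': 'right',
    ' ': 'stop',
}

def key_to_command(key):
    """Translate a keypress into a drive command, defaulting to 'stop'
    so an unrecognised key never leaves the robot moving."""
    return KEYMAP.get(key.lower(), 'stop')
```

Defaulting to 'stop' on unknown input is a simple safety choice: releasing or mistyping a key halts the robot rather than continuing the last command.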
I have used deque to create a rolling plot. This is very helpful for tuning your T-Bot.
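The trick is deque's maxlen argument: once the buffer is full, each append silently discards the oldest sample, so the buffer always holds the most recent window. A minimal sketch (the window length and data source here are arbitrary):

```python
from collections import deque

WINDOW = 100  # number of samples kept on screen

window = deque(maxlen=WINDOW)  # old samples fall off the left automatically

# Feed in a stream of samples; in practice these would be live telemetry
# (e.g. tilt angle) arriving from the T-Bot.
for sample in range(250):
    window.append(sample)

# Inside a matplotlib animation callback, the rolling plot is then just:
#   line.set_data(range(len(window)), window)
```

After 250 appends the deque holds only the last 100 samples, so the plot scrolls without any manual slicing or reallocation.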
You can use CombinationFilter.py to experiment with the two filters.
The Joystick Bridge has been updated. More instructions have been added to the instructions page for PS3 and PS4 controllers.
Here is a prototype of a magnetic FPV camera hat for the T-Bot. The T-Bot is being controlled using the Python Joystick bridge. OpenCV is being used to pull in the video stream and transform the frames to a bird's-eye view. All of this is being done on the Raspberry Pi 4 at 30 FPS.