This Python code allows you to use your generic joystick or PS3/PS4 controller to control your T-Bot while streaming video in real time from the T-Bot’s helmet camera. This has been benchmarked on the Raspberry Pi 4 at 30 fps. You can find it here:
Create your own themes for your T-Bot controller. The SVG files are available for you to play with.
The controller uses PyGame, PyBluez (or standard sockets), OpenCV, and NumPy. You can find the code here.
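As a rough illustration of how control values might travel over a socket link, here is a minimal sketch of packing and unpacking a command packet. The packet layout (`speed`, `turn`, button flags) and the function names are assumptions for the example, not the project's actual protocol.

```python
import struct

# Hypothetical packet format for one joystick command sent over a socket:
# little-endian, two 32-bit floats (speed, turn) and one byte of button flags.
PACKET_FMT = "<ffB"

def pack_command(speed, turn, buttons=0):
    """Encode a control command as bytes ready for sock.send()."""
    return struct.pack(PACKET_FMT, speed, turn, buttons)

def unpack_command(data):
    """Decode bytes received on the robot side back into control values."""
    return struct.unpack(PACKET_FMT, data)

# Round-trip example
payload = pack_command(0.5, -0.25, 0b00000001)
speed, turn, buttons = unpack_command(payload)
```

A fixed binary layout like this keeps each command to a handful of bytes, which matters over a Bluetooth serial link.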
A simple example of self-driving using HSV filtering with findContours from OpenCV and grab_contours from imutils. The code for this example can be found here: https://github.com/garethnisbet/T-BOTS/blob/master/Python/Development/T-Bot_Tracking/DrawAndTrack.py
Image A was an early attempt. Image B shows the results of improved PID tuning. Maybe C will be more like an elephant than a woolly mammoth!
The first half of the video shows the T-Bot being controlled using angular and positional PID loops. The second half uses the same controller, but the coloured discs on top of the T-Bot are rotated by about 40 degrees, so the T-Bot can no longer travel perpendicular to the line defined by the two discs. While the PID controller is not the most elegant or efficient way of doing this, it certainly proves to be remarkably robust.
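For readers unfamiliar with the technique, a minimal discrete PID loop looks something like the sketch below; the gains and the toy first-order plant are made up for illustration and are not the T-Bot's actual tuning.

```python
class PID:
    """Minimal discrete PID controller (illustrative gains, not T-Bot's)."""

    def __init__(self, kp, ki, kd, setpoint=0.0):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.setpoint = setpoint
        self.integral = 0.0
        self.prev_error = None

    def update(self, measurement, dt):
        error = self.setpoint - measurement
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Drive a simple first-order plant (x' = u) toward the setpoint.
pid = PID(kp=2.0, ki=0.5, kd=0.1, setpoint=1.0)
x, dt = 0.0, 0.02
for _ in range(1000):
    x += pid.update(x, dt) * dt
```

On the real robot the same `update` call runs twice per cycle: once on the measured tilt angle and once on the wheel position, each feeding a motor command.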
The challenge is to set the best lap time. You will need a Raspberry Pi or a PC with Bluetooth capability, a tape measure, a webcam, and a T-Bot. Adjust your camera so the ends of the superimposed sine function are 1 m apart, then develop a control strategy that gives the fastest time. The code used for the video can be found here.
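One way to represent a track like this in code is as a list of waypoints sampled along a sine curve. The sketch below generates such waypoints for a 1 m span; the amplitude, number of cycles, and point count are illustrative assumptions, not values taken from the project.

```python
import math

def sine_track(length_m=1.0, amplitude_m=0.15, cycles=1.5, n=50):
    """Waypoints (x, y) in metres along a sine-shaped track.
    All parameter defaults here are illustrative, not the project's."""
    pts = []
    for i in range(n):
        x = length_m * i / (n - 1)
        y = amplitude_m * math.sin(2 * math.pi * cycles * x / length_m)
        pts.append((x, y))
    return pts

track = sine_track()
```

A lap-timing script could then score the run by how closely, and how quickly, the tracked marker passes each waypoint in order.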
A simple keyboard controller for the T-Bot.
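The core of a keyboard controller is just a mapping from keys to motion commands. This sketch exercises that mapping directly so it runs anywhere; in the real controller a library such as PyGame would supply the key events, and the specific keys and command values here are assumptions for the example.

```python
# Hypothetical key-to-command table: each key contributes (speed, turn).
KEYMAP = {
    "w": (1.0, 0.0),   # forward
    "s": (-1.0, 0.0),  # reverse
    "a": (0.0, -1.0),  # turn left
    "d": (0.0, 1.0),   # turn right
}

def command_for(keys_down):
    """Sum the contributions of all held keys into one (speed, turn) pair."""
    speed = sum(KEYMAP[k][0] for k in keys_down if k in KEYMAP)
    turn = sum(KEYMAP[k][1] for k in keys_down if k in KEYMAP)
    # Clamp so combined key presses stay within the motor range [-1, 1].
    clamp = lambda v: max(-1.0, min(1.0, v))
    return clamp(speed), clamp(turn)
```

Summing the held keys means diagonal motion (e.g. `w` + `d`) falls out naturally, and opposing keys cancel.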
I have used a deque to create a rolling plot, which is very helpful for tuning your T-Bot.
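The idea can be sketched as follows: a `collections.deque` with a `maxlen` silently discards old samples as new ones arrive, so the buffer always holds the latest window. The window size and the stand-in sensor readings are illustrative.

```python
from collections import deque

# A deque with maxlen keeps only the most recent N samples, which is
# exactly what a rolling plot needs: append new readings and redraw.
WINDOW = 100
angles = deque(maxlen=WINDOW)

for t in range(250):   # stand-in for a stream of live sensor readings
    angles.append(t)

# Once full, the deque drops the oldest value on every append, so the
# buffer now holds only the latest WINDOW samples (150..249).
```

With matplotlib, each frame of the rolling plot would simply redraw the line from the current contents of the deque (e.g. `line.set_ydata(angles)`), with no manual slicing or shifting required.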