Macro recording has now been added.

Here is a keyboard and mouse controller for those of you who don’t have a physical joystick. It is simpler than the one found here and is easier to customize. You can find it in our GitHub Repository. Note that this version does not have macro recording.
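The repository has the full controller, but a minimal sketch of the idea looks like the following: read the keyboard state with PyGame each frame and turn it into a drive command. The key bindings and the command string format here are placeholders, not the repository's actual protocol.

```python
# Minimal sketch of a PyGame keyboard controller (placeholder command format).
import pygame

pygame.init()
screen = pygame.display.set_mode((200, 200))  # a window is needed to capture key events
clock = pygame.time.Clock()

running = True
while running:
    for event in pygame.event.get():
        if event.type == pygame.QUIT:
            running = False

    keys = pygame.key.get_pressed()
    forward = keys[pygame.K_UP] - keys[pygame.K_DOWN]    # +1, 0 or -1
    turn = keys[pygame.K_RIGHT] - keys[pygame.K_LEFT]
    command = f"{forward},{turn}"  # hypothetical "forward,turn" string sent to the T-Bot
    print(command)

    clock.tick(30)  # limit the loop to ~30 updates per second

pygame.quit()
```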
A convolution function is used to identify each of the tile types. A difference function is used to determine the grid shape. The zebra tile is taken as the starting position, and some logic then steps through the neighbouring tiles. The code can be found here.
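As a rough illustration of the convolution step (not the repository code), each tile template can be convolved with the board image and the strongest response taken as that tile's location. The zero-mean trick and the commented usage lines are assumptions for illustration only.

```python
# Sketch of convolution-based tile identification.
import numpy as np
from scipy.signal import fftconvolve

def find_tile(board, template):
    """Return the (row, col) of the strongest convolution response."""
    kernel = template - template.mean()          # zero-mean so flat regions don't dominate
    response = fftconvolve(board, kernel[::-1, ::-1], mode='same')  # flipped kernel = correlation
    return np.unravel_index(np.argmax(response), response.shape)

# Usage (board and templates are 2-D greyscale arrays loaded elsewhere):
# zebra_pos = find_tile(board, zebra_template)   # starting position
# grid_pitch = np.diff(sorted_tile_columns)      # differences give the grid spacing/shape
```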
This Python code allows you to use your generic joystick or PS3/PS4 controller to control your T-Bot while streaming video in real time from the T-Bot’s helmet camera. This has been benchmarked on the Raspberry Pi 4 at 30 fps. You can find it here.
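A stripped-down sketch of the two main loops, not the actual repository code, is shown below: PyGame reads the joystick axes while OpenCV displays the incoming frames. The video source and axis numbering are placeholders.

```python
# Sketch: joystick input via PyGame plus live video display via OpenCV.
import cv2
import pygame

pygame.init()
pygame.joystick.init()
joystick = pygame.joystick.Joystick(0)
joystick.init()

cap = cv2.VideoCapture(0)  # placeholder: replace with the helmet-camera stream

while True:
    pygame.event.pump()                 # refresh joystick state
    steer = joystick.get_axis(0)        # left/right, -1.0 to 1.0
    throttle = -joystick.get_axis(1)    # forward/back (axis is inverted)

    ok, frame = cap.read()
    if not ok:
        break
    cv2.imshow('T-Bot helmet camera', frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

cap.release()
cv2.destroyAllWindows()
pygame.quit()
```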
Create your own themes for your T-Bot controller. The SVG files are available for you to play with.
The controller uses PyGame, PyBluez or Socket, OpenCV and NumPy. You can find the code here.
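To give a sense of how the PyBluez/socket part fits in, here is a minimal sketch of sending a command to the T-Bot over Bluetooth RFCOMM, falling back to a plain TCP socket when PyBluez is not installed. The address, port and command format are placeholders, not the project's actual protocol.

```python
# Sketch: send one command over Bluetooth (PyBluez) or a TCP socket.
import socket

try:
    import bluetooth  # PyBluez
    sock = bluetooth.BluetoothSocket(bluetooth.RFCOMM)
    sock.connect(('00:00:00:00:00:00', 1))    # placeholder T-Bot Bluetooth address, RFCOMM channel 1
except ImportError:
    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    sock.connect(('192.168.0.10', 5000))      # placeholder T-Bot IP address and port

sock.send(b'0.5,0.0\n')  # hypothetical "throttle,steer" command
sock.close()
```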
A simple example of self-driving using HSV filtering, findContours from OpenCV, and grab_contours from imutils. The code for this example can be found here: https://github.com/garethnisbet/T-BOTS/blob/master/Python/Development/T-Bot_Tracking/DrawAndTrack.py
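The linked DrawAndTrack.py has the full example; the sketch below only illustrates the technique named above: mask a colour range in HSV space, find contours, and track the largest one. The HSV bounds are placeholders for whatever colour is being tracked.

```python
# Sketch: HSV filtering + findContours + grab_contours colour tracking.
import cv2
import imutils
import numpy as np

cap = cv2.VideoCapture(0)
lower = np.array([100, 120, 70])    # placeholder HSV lower bound
upper = np.array([130, 255, 255])   # placeholder HSV upper bound

while True:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, lower, upper)

    cnts = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    cnts = imutils.grab_contours(cnts)          # handles OpenCV 3/4 return differences
    if cnts:
        c = max(cnts, key=cv2.contourArea)      # largest blob is the target
        x, y, w, h = cv2.boundingRect(c)
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)

    cv2.imshow('tracking', frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

cap.release()
cv2.destroyAllWindows()
```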
Image A was an early attempt. Image B shows the results of improved PID tuning. Maybe C will be more like an elephant than a woolly mammoth!
The first half of the video shows the T-Bot being controlled using angular and positional PID loops. The second half uses the same controller, but the coloured discs on top of the T-Bot are rotated by about 40 degrees. Now the T-Bot cannot travel in the direction perpendicular to the line defined by the two discs. While the PID controller is not the most elegant or efficient way of doing this, it certainly proves to be remarkably robust.
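For reference, a single PID loop of the kind described above can be sketched as follows. The gains and the way the two loops are mixed into wheel commands are illustrative assumptions, not the values or structure used in the video.

```python
# Sketch of one PID loop; the controller in the video runs two of these
# (angular and positional) with its own gains.
class PID:
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, error, dt):
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

angle_pid = PID(kp=1.2, ki=0.0, kd=0.05)      # placeholder gains
position_pid = PID(kp=0.8, ki=0.1, kd=0.02)   # placeholder gains

# Example mixing of the two loops into wheel commands:
# turn  = angle_pid.update(target_angle - measured_angle, dt)
# drive = position_pid.update(target_distance - measured_distance, dt)
# left_wheel, right_wheel = drive - turn, drive + turn
```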