[PIWARS] Overview

Well, here we are, very close to the competition itself! What I want to know is: where did all those months go? It seems like only yesterday that I got the email saying I had been selected to compete! But on the other hand, after all this hard work over such a long period, Lobsang is getting quite sophisticated, though I don't know if it is enough to win any prizes at Pi Wars.

I have a communication system set up between the Pi and the Duino. They talk over serial, and the commands need to be parsed. I include error handling and can read numerous commands, e.g. "LMI14" means set the left motor instantly to speed (14 - 16) = -2 (negative, so backwards), whereas "RMR32" means set the right motor to (32 - 16) = +16 (maximum speed, forwards) but ramp the speed up from the current speed. Ramping is the default setting as it puts less strain on the motors and reduces the voltage drop when the motors start. The Pi can also read responses from the Duino, so I have the Duino check the ultrasonic sensor and calculate the distance (the theory being that it is more accurate), then send the distance to the Pi, e.g. "US24" means the distance sensed is 24 cm.
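To make the encoding concrete, here is a minimal sketch of how the Pi side might parse these messages. The function names and the exact error handling are my assumptions, not Lobsang's actual code; only the command format and the offset-16 speed encoding come from the description above.

```python
# Hypothetical parser for the Pi<->Duino serial protocol described above.
# Speeds are two digits offset by 16: "00" = -16 (full reverse),
# "16" = 0 (stop), "32" = +16 (full forward).

def parse_command(cmd):
    """Parse a motor command like 'LMI14' or 'RMR32' into its parts."""
    motor = {"L": "left", "R": "right"}[cmd[0]]   # which motor
    if cmd[1] != "M":
        raise ValueError("not a motor command: %r" % cmd)
    mode = {"I": "instant", "R": "ramp"}[cmd[2]]  # instant set, or ramp from current speed
    speed = int(cmd[3:5]) - 16                    # offset-16 encoding
    return motor, mode, speed

def parse_response(resp):
    """Parse a Duino response like 'US24' (ultrasonic distance in cm)."""
    if resp.startswith("US"):
        return ("distance_cm", int(resp[2:]))
    raise ValueError("unknown response: %r" % resp)

print(parse_command("LMI14"))   # ('left', 'instant', -2)
print(parse_command("RMR32"))   # ('right', 'ramp', 16)
print(parse_response("US24"))   # ('distance_cm', 24)
```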

Lobsang can now:

  • Follow a line of almost any shape, so long as two parts of the line do not come close together (then it may jump from one track to the other). I use a logical way of looking at the line. There are three line sensors, which gives six detectable places for the line to be: centred on the middle sensor, between the middle sensor and the left or right sensor (but not under any sensor), under the left or right sensor, and off to the left or right of the whole sensor head (under no sensor). I use this information to alter the way the robot follows the line. If the line is centred, the robot drives forward. If the line is under the left or right sensor, the robot turns the correct way; but if the line was under the left or right sensor and has now moved out from under the sensor head (the robot is understeering around a corner), Lobsang turns tighter. So it has two speeds for turning corners. After a lot of tweaking, the loop that checks for line position changes now runs at around 50 Hz (50 cycles a second), which is fast enough to react quickly and register fast-changing line positions. The previous code was a lot slower and would sometimes miss the line or react too slowly.
  • Be controlled by a keyboard like a radio-controlled toy. The W, A, S and D keys control motor speed and direction, and in some instances other keys control Pi Wars specific appendages (Skittles challenge). I am using the PyGame library to get key presses from the keyboard. Multiple keys can be pressed at once, e.g. W and A together make the robot turn left while driving forward. For the straight-line speed test this concept is altered slightly: left and right turning is greatly reduced so I can fine-tune the robot's direction without slewing around lots and losing time. The video proof is here on YouTube if anyone doubts what I say!
  • Drive up to a wall or other pale vertical obstacle and stop very close to it without actually touching. I am still working on the code for this: it is more a case of perfecting than creating, as I already have some completely usable code, but there are ways I can make it even better. I currently use a somewhat inaccurate ultrasonic sensor for the main approach, and then a Ryanteck line following sensor (three individual sensors on one head), which for this challenge is re-mounted so the sensors point vertically and work as obstacle sensors. Because they are designed for detecting a line at very close proximity, they only detect the wall when they get very close to it. This is perfect for getting the robot as close as possible: I simply drive forward slowly until any one of the sensors detects the wall. They give a digital readout, so handling their output is very simple. One improvement I want to make is to centre the robot after it gets to the wall. It may drift slightly off course and reach the wall at an angle, which means that one sensor will detect the wall before the other two. From this I can work out which way the robot has drifted and correct it, and during this process I should also be able to get a bit closer to the wall.
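The two-speed cornering logic from the line-following bullet can be sketched roughly like this. This is not Lobsang's actual code: the speed values are illustrative assumptions, and I collapse the "between sensors" and "off the head" cases into one by remembering which side the line was last seen on, since digital sensors cannot distinguish those directly.

```python
# Sketch of the three-sensor line-following decision logic described above.
# Each sensor reads True when it sees the line. last_seen remembers which
# side the line left the head on, so the robot can apply its second,
# tighter turning speed when it is understeering around a corner.

CRUISE, GENTLE, TIGHT = 16, 8, -4  # hypothetical wheel speeds

def follow_step(left, mid, right, last_seen):
    """One ~50 Hz cycle: return (left_speed, right_speed, last_seen)."""
    if mid and not (left or right):
        return CRUISE, CRUISE, "centre"   # line centred: drive straight
    if left:
        return GENTLE, CRUISE, "left"     # line under left sensor: gentle left turn
    if right:
        return CRUISE, GENTLE, "right"    # line under right sensor: gentle right turn
    # No sensor sees the line: it has slipped out from under the head on
    # the side we last saw it, so turn tighter (the second turning speed).
    if last_seen == "left":
        return TIGHT, CRUISE, "left"
    if last_seen == "right":
        return CRUISE, TIGHT, "right"
    return 0, 0, last_seen                # line lost entirely: stop
```

In use, the loop would read the three sensors, call `follow_step`, send the two speeds to the Duino, and repeat, carrying `last_seen` between cycles.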

Capabilities still in progress include:

  • Guiding and launching a ball to knock over the skittles. I am almost ready with this challenge, but not quite there. I am making a Meccano appendage with a ball guide (roughly a square shape that keeps the ball in the right place while it is moved), an arm at the front that can open or close the guide, and an elastic-band-powered flipper. It is a little hard to explain accurately (images are coming soon, when I complete it!). The flipper is released by a servo, but the energy that propels the ball comes from the elastic band. The robot will be controlled manually using the process described above, with additional controls for the appendage. The robot will drive to the ball and collect it, closing the guide. When the launcher is positioned correctly and the robot is in the right place, pressing a single button will open the guide and almost simultaneously release the flipper that propels the ball, with a slight delay to let the guide open fully before the ball shoots through the gap.
  • Driving in a three-point turn. The robot does describe a rough T-shape, but I still need to calibrate it a lot. I have a theory that I could use the still-to-be-fitted laser module to point to the place where the robot will end up after driving part of the T-shape (the tip of an upside-down L), and use this point to position the robot accurately in the challenge starting box, though I don't know if I will have time.

So still quite a bit to do! But I think I'm making good progress. I was set back a bit by being away this week without access to my tools, but now I'm back I'll be hurrying along again. As always, my code is in a repository on GitHub.