[PIWARS] GitHub repository set up, OLED scares, new mounts, line follower update

I've finally created a GitHub repository for my code that runs on Lobsang (it was pretty hard to get the local repository program working, but I managed in the end!). It is somewhat patchily commented, so I can't guarantee that it's all completely understandable, but it's there if anyone wants to take a look. The file that may be of interest for people to use themselves is Oled.py. It handles Pi to SH1106 OLED communication; it's customized to my robot, but can still be used without the hardware I have. It is similar to Adafruit's SSD1306 OLED library, but that library doesn't work with the SH1106 chip, which is why I had to create my own. Oled.py has no functions for drawing lines or shapes, but it can display images, and text in the Minecraftia font. It's still pretty basic, but the underlying I2C commands are there and can be built on if you want to use the library.
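If you want to talk to an SH1106 over I2C yourself, the low-level traffic is quite simple. Here's a minimal sketch of the kind of commands a library like Oled.py builds on, using the standard smbus library; the address and control bytes follow the SH1106 datasheet, but this is an illustration, not the actual code from the repository:

```python
import smbus

ADDRESS = 0x3C        # common I2C address for SH1106 modules (check yours)
CMD     = 0x00        # control byte: the next byte is a command
DATA    = 0x40        # control byte: the next byte is display RAM data

bus = smbus.SMBus(1)  # I2C bus 1 on a Raspberry Pi

def command(byte):
    bus.write_byte_data(ADDRESS, CMD, byte)

def data(byte):
    bus.write_byte_data(ADDRESS, DATA, byte)

command(0xAF)         # display on
command(0xB0)         # select page 0 (the SH1106 RAM is page-addressed)
for _ in range(8):
    data(0xFF)        # light an 8x8 block of pixels at the current column
```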

I've created a line following sensor arm, which looks better (or worse? I'm not entirely sure...) than the previous makeshift design, and is sturdier and simpler. For the Proximity Alert challenge I will flip the sensor so it points forwards, turning it into an obstacle detector for very close range. That way I can get really close to the wall, more accurately than with just the ultrasonic sensor I am using for the main approach.

[Photos: IMG_0001_1__0.jpg, IMG_0002_1__2.jpg, IMG_0003_1__2.jpg]

The ultrasonic sensor is working again. A while back I snapped the two wires coming out of the voltage divider I made (the ultrasonic sensor runs at 5V, the RasPiO Duino at 3.3V, so the output logic needs to be stepped down), but I have re-soldered them more securely and have had no trouble since. This has enabled me to write a Proximity Alert program, with pretty successful results. The ultrasonic sensor is used for the main approach; when the robot gets close to the wall, the line following sensor, flipped so it points forwards, takes over as a very close range obstacle sensor and detects the wall, at which point the robot stops.
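The logic is just a two-stage approach. Here's a sketch of the idea rather than the actual program; all four callables are hypothetical stand-ins for Lobsang's real sensor and motor interfaces, and the hand-over distance is an assumed value:

```python
import time

def approach_wall(read_distance_cm, wall_close, drive, stop,
                  handover_cm=20.0, fast=0.5, slow=0.2):
    # read_distance_cm(): ultrasonic reading in cm (hypothetical helper)
    # wall_close(): True when the flipped line sensor sees the wall (hypothetical)
    # drive(speed) / stop(): hypothetical motor helpers
    drive(fast)                              # stage 1: main approach on the ultrasonic
    while read_distance_cm() > handover_cm:  # handover_cm is an assumed threshold
        time.sleep(0.02)
    drive(slow)                              # stage 2: creep up on the line sensor
    while not wall_close():
        time.sleep(0.02)
    stop()                                   # wall detected at very close range
```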

[Photos: IMG_0001_1__1.jpg, IMG_0002_1__3.jpg, IMG_0003_1__3.jpg]

I have created a master control program that runs automatically when the Pi boots. It displays a scrollable menu on the OLED on the back of Lobsang, runs each separate challenge script, and carries on running the menu when that script exits. All user keyboard controls are gathered using pygame in every script. For instance, although some programs are autonomous, they wait for a SPACE press before the robot moves, and you can stop the robot at any time by pressing SPACE again, to avoid possible accidents on the day.
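The SPACE start/stop pattern is easy to do with pygame. This is a minimal sketch of the pattern, not the actual menu code; note that pygame needs a display surface (even a tiny one) before it will deliver keyboard events:

```python
import pygame

pygame.init()
pygame.display.set_mode((100, 100))   # pygame needs a window to receive key events

def wait_for_space():
    # Block until SPACE is pressed: autonomous scripts wait here before moving.
    while True:
        for event in pygame.event.get():
            if event.type == pygame.KEYDOWN and event.key == pygame.K_SPACE:
                return

def space_pressed():
    # Non-blocking check, called inside the drive loop to abort a run.
    for event in pygame.event.get():
        if event.type == pygame.KEYDOWN and event.key == pygame.K_SPACE:
            return True
    return False

wait_for_space()              # wait for the first SPACE press
while not space_pressed():
    pass                      # ...drive the robot; a second SPACE stops it
```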

I had a scare the other day when I was replacing the very temporary OLED mount (Blu-Tack!) with a 'proper' mount. While attaching the OLED I snapped a corner of the glass! It is very fragile. There was a very tense moment while Lobsang booted; then the OLED quite happily displayed 'Lobsang booted successfully', and it still works fine now, fortunately. I don't know what I would have done if the OLED had broken: I don't have a spare, and my code relies on it quite heavily!

[Photos: IMG_0001_1_.jpg, IMG_0002_1__1.jpg, IMG_0003_1__1.jpg]

I have been tweaking the code for line following for quite a while now. My first attempts followed a line erratically, taking ~19 seconds to go around my custom track. Now, after numerous revisions, and having started from scratch a few times too, I have settled on some simple, efficient code (not PID, just logical) that can complete a circuit of my track in just over 12 seconds! That's roughly a 35% reduction in lap time, which works out to over 50% faster on average.
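For the curious, this kind of simple, logical line following usually boils down to plain if/else steering. This sketch shows the general shape of that approach with hypothetical two-sensor helpers; it is not my actual code, and Lobsang's real sensor layout may differ:

```python
def follow_line(line_left, line_right, set_motors):
    # line_left() / line_right(): True when that side's sensor sees the line
    # set_motors(left, right): wheel speeds from 0.0 to 1.0
    # (all three are hypothetical stand-ins, not Lobsang's real interfaces)
    while True:
        left, right = line_left(), line_right()
        if left and right:
            set_motors(1.0, 1.0)   # centred on the line: full speed ahead
        elif left:
            set_motors(0.3, 1.0)   # line under the left sensor: steer left
        elif right:
            set_motors(1.0, 0.3)   # line under the right sensor: steer right
        else:
            set_motors(0.5, 0.5)   # line lost: slow down and carry straight on
```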

The latest line following code running on Lobsang is here on YouTube.