First independent walk after brain surgery (from Arduino to PiZero) and porting to Python. A bit wobbly, and set to cautious mode so pretty slow, but it shows promise. And cute :) It doesn't use any pre-set gait; it calculates weight shifting and leg movement completely at runtime and dynamically, so it can walk in any direction: forward/backward, strafe and turn, any combination of those, and it can blend or change them at any time. The legs are a bit too short to be practical, but I wanted to make him more puppy-like than the typical "spider"-legged quadrupeds. More optimisation of the gait algorithm is to come to make him more efficient and smoother, plus I plan to add some sensors and a webcam so he can work as a "telepresence puppy" :) (See a few previous versions on my YouTube channel.)
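A runtime-calculated gait like the one described above can be sketched roughly like this: instead of replaying a canned step table, each foot's landing target is computed on the fly from a blended body-motion command (forward, strafe, turn). The stance positions and step time here are made-up illustrative values, not the author's actual code.

```python
# Hypothetical sketch of a dynamically calculated gait target.
# Stance: foot positions relative to the body centre, in metres
# (x = forward, y = left). Values are invented for illustration.
STANCE = {
    "front_left":  ( 0.06,  0.04),
    "front_right": ( 0.06, -0.04),
    "rear_left":   (-0.06,  0.04),
    "rear_right":  (-0.06, -0.04),
}

def foot_step_targets(vx, vy, omega, step_time=0.25):
    """Compute where each foot should land for one step.

    vx, vy -- commanded body velocity in m/s (forward, strafe)
    omega  -- commanded yaw rate in rad/s
    Returns {leg: (x, y)} foot targets in the body frame.
    """
    targets = {}
    for leg, (x, y) in STANCE.items():
        # Linear part: every foot shifts with the body velocity.
        dx = vx * step_time
        dy = vy * step_time
        # Rotational part: a point at (x, y) under yaw rate omega moves
        # with velocity (-omega*y, omega*x), so add that contribution.
        dx += -omega * y * step_time
        dy += omega * x * step_time
        targets[leg] = (x + dx, y + dy)
    return targets
```

Because the linear and rotational contributions are simply summed, forward/strafe/turn commands blend naturally and can change at any time, which matches the "any direction, any combination" behaviour described.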
Hi, Denes. We interacted briefly on the ShoutBox. Please do add more detail, and link to your YouTube videos.
Do you have any issues with calculating the leg movement in real time on the PiZero? I’ve read that some Raspberry Pi makers use an Arduino in conjunction with a Pi, since Linux is not a real-time operating system.
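The split being suggested here is usually: the Pi does the (non-real-time) gait math and ships target pulse widths to an Arduino, which handles the timing-critical servo PWM. A minimal sketch of the Pi side, assuming a made-up little frame format (header, count, pulse widths, checksum) rather than any established protocol:

```python
import struct

HEADER = b"\xAA\x55"  # arbitrary 2-byte sync marker (invented for this sketch)

def encode_servo_frame(pulses_us):
    """Pack a list of servo pulse widths (microseconds, one per servo)
    into a single frame: header, count byte, little-endian uint16
    payload, and a 1-byte additive checksum."""
    payload = struct.pack("<%dH" % len(pulses_us), *pulses_us)
    checksum = sum(payload) & 0xFF
    return HEADER + bytes([len(pulses_us)]) + payload + bytes([checksum])

# On the Pi, the frame would then go out over a serial link, e.g. with
# pyserial (port name depends on the wiring):
#   import serial
#   port = serial.Serial("/dev/ttyAMA0", 115200)
#   port.write(encode_servo_frame([1500, 1500, 1700, 1300]))
```

The Arduino side would just parse these frames and call its Servo library with the received pulse widths, so the jitter-sensitive part never touches Linux scheduling.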
BTW, you can actually add your YouTube videos right to the Robot page. When you are editing, just scroll down to the Video section and put your video URL in the form.
Looking forward to seeing your telepresence puppy!
Interesting, I did post both a picture and a YouTube link…
Hi guys,
Thanks, although that’s a bit odd. The form even refused to post it at first because it needed a primary image, so I added one. Let’s see if it works this time.
In other news, I managed to spend some time on the software and completely re-engineered how the gait is calculated. Much smoother and more efficient — have a look at the videos. (Oh, and it seems I can finally add YouTube links!) Sorry for the low res; I just captured it from my screen.
As you probably noticed, it’s running in a simulation (Unity3D) for now, not on the actual body, because I can develop much faster this way. I think the progress so far speaks for itself.
If everything goes well I will port it to the Pi; it seems I’ve found a way to do it in C#, so I can save a huge amount of time by not having to port the whole thing to Python!
Curious to know how you set the gait in place — was there sample mammal-based walking code available in Unity? The video “Virtual StormPi” seems incredibly fluid, though it has degrees of freedom which the robot would not have.
My little robot KITtyBot has a similar general layout for the limbs, and I also use calculated gaits and leg movements. I’ve done some more work on it, and on a slightly smaller IR-controlled version using only a Pro Mini and the servo.h library, but I am a little behind with publishing to LMR. In the meantime I’ve collected your robot and will follow your progress.