StomPi, my to-be telepresence puppy ^_^

First independent walk after brain surgery (from Arduino to Pi Zero) and porting to Python. A bit wobbly, and set to cautious mode so pretty slow, but it shows promise. And cute :) It doesn't use any pre-set gait; weight shifting and leg movement are calculated entirely at runtime, so it can walk in any direction: forward-backwards, strafe and turn, and any combination of those, blending or changing at any time. The legs are a bit too short to be practical, but I wanted to make him more puppy-like than the typical "spider"-legged quadrupeds. More optimisation of the gait algorithm is to come, to make him more efficient and smoother, plus I plan to add some sensors and a webcam so he can work as a "telepresence puppy" :) (see a few previous versions on my youtube channel)
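To illustrate the idea, here is a minimal sketch (the names, leg offsets, and overall layout are my assumptions, not StomPi's actual code) of how forward/strafe/turn commands can be blended into per-leg movement at runtime: each grounded foot must sweep opposite to the body's motion, v_foot = -(v_body + omega × r), where r is that leg's mounting offset in the body frame.

```python
# Hypothetical sketch of runtime omnidirectional gait blending,
# not StomPi's actual implementation.

LEG_OFFSETS = {          # assumed leg mount points in the body frame (m)
    "front_left":  ( 0.06,  0.04),
    "front_right": ( 0.06, -0.04),
    "rear_left":   (-0.06,  0.04),
    "rear_right":  (-0.06, -0.04),
}

def stance_foot_velocity(vx, vy, omega, leg):
    """Body-frame velocity a grounded foot must sweep so the body moves
    with linear velocity (vx, vy) m/s and yaw rate omega rad/s.
    In 2D, omega x r = omega * (-ry, rx)."""
    rx, ry = LEG_OFFSETS[leg]
    return (-(vx - omega * ry), -(vy + omega * rx))

# Pure forward walk: every foot sweeps straight backwards at the same speed.
print(stance_foot_velocity(0.1, 0.0, 0.0, "rear_right"))

# Forward blended with a turn: left and right feet sweep at different
# speeds, which is what lets the gait curve without any pre-set pattern.
print(stance_foot_velocity(0.1, 0.0, 0.5, "front_left"))
print(stance_foot_velocity(0.1, 0.0, 0.5, "front_right"))
```

Because the command (vx, vy, omega) can change on any control tick, the same formula covers forward, strafe, turn, and any blend of the three.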

(youtube link: https://www.youtube.com/watch?v=3e4ZW9gtczE)


This is a companion discussion topic for the original entry at https://community.robotshop.com/robots/show/stompi-my-to-be-telepresence-puppy

Hi, Denes. We interacted briefly on the ShoutBox. Please do add more detail, and link to your YouTube videos. 

Do you have any issues calculating the leg movement in real time on the Pi Zero? I've read that some Raspberry Pi makers use an Arduino in conjunction with a Pi, since Linux is not a real-time operating system.
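One common way around the real-time concern without a separate Arduino is to let dedicated hardware generate the servo pulses and keep only the slow planning in Python. A hedged sketch (not StomPi's code; the GPIO pin and pulse range are assumptions) using the widely used pigpio daemon, which times pulses via DMA:

```python
# Sketch only: offloading servo pulse timing from a non-real-time OS.
# The pigpio daemon generates hardware-timed pulses, so the Python
# process only updates target pulse widths at a relaxed rate.

def angle_to_pulsewidth(angle_deg, min_us=500, max_us=2500):
    """Map a servo angle (0-180 deg) to a pulse width in microseconds,
    clamping out-of-range angles. Pulse range is servo-dependent."""
    angle_deg = max(0.0, min(180.0, angle_deg))
    return int(min_us + (max_us - min_us) * angle_deg / 180.0)

# With the pigpio daemon running (sudo pigpiod), the update loop would be:
#
#   import pigpio
#   pi = pigpio.pi()
#   pi.set_servo_pulsewidth(18, angle_to_pulsewidth(90))  # GPIO 18 assumed
#
# Scheduler jitter in the Python process then cannot distort the pulses,
# because the daemon, not the script, times each one.

print(angle_to_pulsewidth(90))   # mid-range angle -> mid-range pulse
```

The same split (slow planner, hardware-timed output) is what an Arduino side-kick provides, just without the second board.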

BTW, you can actually add your YouTube videos right to the Robot page. When you are editing, just scroll down to the Video section and put your video URL in the form.

Looking forward to seeing your telepresence puppy!

Good to see!

That sounds very interesting Denes!

I look forward to seeing some videos and pictures of your creature :wink: Thanks and keep going!

Interesting, I did post both a picture and a youtube link…

Hi guys,

Thanks, although that's a bit odd. The form even refused to post at first because it needed a primary image, so I added one. Let's see if it works this time.

So cute!

Yeah, the form is a bit weird. I think after the first time you save it and go back in, you get more options (like to post video).

I like how you’ve wired three servos together to make each leg. Simple, but elegant.


Nice Project!

Given the degrees of freedom, I’m really looking forward to seeing it walk more smoothly and seeing you add a cover.

Wow!!!

It is amazing, very nice and very complete!!!

Update: smooth gait (work in progress)

Hi guys,

Thanks for the kind words, glad you like it! :slight_smile:

In other news, I managed to spend some time on the software and completely re-engineered how the gait is calculated. It's much smoother and more efficient; have a look at the videos. (Oh, and it seems I can finally add youtube links!) Sorry for the low res, I just captured it from my screen.

As you probably noticed, it's running in a simulation (Unity3D) for now, not on the actual body, because I can develop much faster this way. I think the progress so far speaks for itself :slight_smile:

If everything goes well I will port it to the Pi; it seems I've found a way to do it in C#, so I can save a huge amount of time by not having to port the whole thing to Python!


Gait generation

Curious to know how you set the gait up - was there sample mammal-based walking code available in Unity? The video "Virtual StormPi" seems incredibly fluid, though it has degrees of freedom which the robot would not have.

Very interesting!

My little robot KITtyBot has a similar general layout on the limbs, but I use calculated gaits and leg movements. I've done some more work on it, and on a slightly smaller IR-controlled version using only a Pro Mini and the servo.h library, but I'm a little behind with publishing to LMR. In the meantime I'll collect your robot and follow your progress. :slight_smile: