Quadruped slow walking gait video

A customer sent me this video of his SES-based quad. It’s a slow static gait. Thanks Shawn! Enjoy…

youtube.com/watch?v=fEN4VPILnjI

Very nice gait. After all, what’s the rush in life for a quad?

-robodude666

Pretty cool, especially since it looks like he articulated a head on it too. :slight_smile:
I can’t help but wonder if having an ankle axis, even with only one DOF, would make a static parallel gait for a quad much more achievable. Most folks seem to use the 3 DOF legs from a hex, but then the point of ground contact is totally uncontrolled. I’m not going to tear apart my Scout to mess with this, but if someone were already working on a quad, or had a bunch of parts lying about, I think this would be a very interesting test to run. :bulb:

Can’t one just “feather” the center pair of legs on a hexapod, and then basically have a quad?

The hexapod sequences (tripod and ripple gaits) I’ve seen look very realistic, and can be “traced” to biological models (i.e., insects). I watch the bipeds and quads walk, and they’re nowhere near similar to the real thing. No doubt it is because the hexapod (and octopod!) gaits are statically stable, while the others are not.

Has anyone tried to make a biped or quadruped “push off” a little when it takes a step? Maybe try trotting first? “BigDog” seems to be able to pull off a reasonable-looking gait, although the “reversed knees” throw me a little.

Alan KM6VV

It kinda reminds me of a turtle, which is cool. I like turtles.

That turtle looking thing is mine. I don’t know much but I’ll share anything I have figured out so far. Thanks for asking about it Jim, and posting the video.

The trick to the crawl is to swing the center of mass over the triangle formed by the three “on-ground” legs while moving the other leg.
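That stability trick boils down to a point-in-triangle test: before lifting a leg, make sure the center of mass, projected onto the ground, is inside the triangle of the three planted feet. Here is a rough sketch in Python; the leg names and coordinates are invented for illustration and aren’t from the actual robot’s code.

```python
# Check whether the projected center of mass (com) lies inside the
# support triangle formed by the three planted feet. Uses the standard
# "same side" sign test on the three triangle edges.

def _edge_sign(p, a, b):
    # Cross product sign: which side of edge a->b is point p on?
    return (p[0] - a[0]) * (b[1] - a[1]) - (p[1] - a[1]) * (b[0] - a[0])

def com_inside_triangle(com, tri):
    d1 = _edge_sign(com, tri[0], tri[1])
    d2 = _edge_sign(com, tri[1], tri[2])
    d3 = _edge_sign(com, tri[2], tri[0])
    has_neg = d1 < 0 or d2 < 0 or d3 < 0
    has_pos = d1 > 0 or d2 > 0 or d3 > 0
    return not (has_neg and has_pos)  # all same side -> inside

# Feet at the corners of a 10x10 stance; lift the front-right leg, so
# the support triangle is the remaining three feet.
feet = {"FL": (-5, 5), "FR": (5, 5), "RL": (-5, -5), "RR": (5, -5)}
support = [feet["FL"], feet["RL"], feet["RR"]]

print(com_inside_triangle((-1.0, -1.0), support))  # shifted over support: True
print(com_inside_triangle((4.0, 4.0), support))    # still over lifted corner: False
```

In practice the body would be shifted until this test passes (with some margin), then the free leg swung forward, and the cycle repeated for the next leg.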

The rate is variable, but if I speed it up too much it gets sloppy. I think the way I send commands over RS-232 to the servo controller doesn’t provide tight enough control at high speeds (I just send destination positions for every servo every tick), or maybe it just needs more juice to the servos. Not sure what would tighten it up at faster crawls.
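One thing that might help, if the controller supports it (the SSC-32 does): batch all the destinations into a single group-move command with a time argument, and let the controller interpolate between ticks instead of relying on the serial link to keep up. A sketch of building such a command string, with made-up channel numbers and pulse widths:

```python
# Build one SSC-32-style group-move command: "#<ch>P<pw>...T<time>\r".
# All servos in the dict start and finish the move together over
# time_ms milliseconds, with the controller doing the interpolation.

def group_move(positions, time_ms):
    """positions: {channel: pulse_width_us}; returns one command string."""
    cmd = "".join(f"#{ch}P{pw}" for ch, pw in sorted(positions.items()))
    return f"{cmd}T{time_ms}\r"

print(repr(group_move({0: 1500, 1: 1600, 4: 1350}, 500)))
# '#0P1500#1P1600#4P1350T500\r'
```

The payoff is that one short command per gait phase replaces a burst of per-servo updates every tick, which is much kinder to a slow serial link.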

Also, I had worked out how to spin it at any forward speed, but I lost that version of the software due to a lightning strike. I thought it was backed up but it wasn’t.

The head is a CMUcam on a pan/tilt system.

The shiny thing glommed on top is a voice synth so it can tell me things (ready, walking, etc.) to help debug it, etc.

I haven’t worked on it at all since Dec 2005, though. Busy with another project.

The CMU cam does all processing locally? Also, what’s the new project you have up your sleeve?

Yes, the CMUcam grabs a frame of video, then runs analysis on it on a microcontroller, and you can then get “stats” about the image over a low-bandwidth RS-232 or TTL connection. Its main thing is that it can track a blob of color, I guess.

That didn’t please me, as I wanted to work on vision myself, so I was learning how to program the microcontroller they used, and also the camera chip. I was successfully able to control registers in the camera chip, but hadn’t yet gotten everything working to stream the pixels out into my own memory for processing. I did learn a lot about it, though (FIFO memory, etc., which is perfect for grabbing/reading a frame of video). Basically I was reverse engineering the CMUcam (or CMUcam2, or whatever it is).

The other project is just work - a motorsports simulation.

Very impressive! If I had the smarts to design my own onboard video processor, I would love to do it. This is something many people would love to have: a solution that is small and able to give their bot vision control without the use of a laptop computer.

It almost does have a laptop on it… not quite, though: just an i386ex at 25 MHz.

However, my friend learned how to write software and run it on his Sony PSP (you have to have an old firmware version on the PSP to do so, AFAIK). We ported the software to it and ran the bot using the PSP! Now that has enough power to run vision algorithms. Unfortunately, I think we only had RS-232 (out the proprietary PSP headphone jack) for I/O, so there was no good way to connect a camera (or does the PSP have a camera?).

Someday I’ll have enough money (or the cost will be low enough) that it will be easy to put vision algorithms onto it.

You have my attention on this topic.

I was thinking of a design that is the size of the SSC-32 or the Mini-ABB, or smaller. It would be great to have an embedded design with near the power of a laptop PC.

The reason I have such a great interest in this is that I would love to add vision to my bot head project, but I would want it to actually provide useful navigation, object avoidance, and low-level object recognition. I know this is far-fetched, but it’s the vision I have.

For now, RoboRealm is the best solution out there, but a wireless transceiver needs to be used.

Amazing! Hacking a PSP to do vision! I think a smaller design could be made if all it has to do is process video, but I guess the power comes from having the design function like a general-purpose computer rather than a single-function device.

I bet my small Pocket PC phone can do some video processing and it has a built in camera.