My Hexapod Simulation

Hello everyone

A few weeks ago I ordered the CH3-R hexapod. My goal (right now) is to control the hexapod from a PC using Java and to add basic capabilities over time (walking, basic object detection, sensors on the feet etc.).
At the moment the software can only make the bot walk using different gait types and move the body of the hexapod around.

I also added a little simulation panel to the software. This lets me debug the gait, and I can also see when a servo is moved to a position that’s mechanically not possible on the real bot.
The simulation is developed in Java using Java3D (OpenGL).

I can now visualize the hexapod and control the real hexapod at the same time. The servo positions are sent to the bot by serial cable.

http://img486.imageshack.us/img486/7338/ch3rgg5.png

The white panels with the red dot in the middle are used to control the most important functions (a rough sketch of how such a panel could map to parameters follows below):
(1, Leftmost): centerpoint of the body relative to the legs
(2): controls pitch and yaw of the body
(3, 4): control the rotation point, which sets the direction of the hexapod (middle -> turn in place, rightmost -> walk straight ahead, up -> walk sideways straight ahead, etc.)
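
To give a rough idea of what such a panel does, here is a minimal sketch (not the actual implementation; the class, ranges and scale factors are made up) of how a red-dot position, normalized to [-1, 1], could be mapped to body parameters:

```java
/** Illustrative only: maps a normalized panel position to body parameters. */
public class PanelMappingSketch {

    /** Panel 1: centerpoint of the body relative to the legs (x/y offset in mm). */
    static double[] bodyOffset(double px, double py) {
        double maxOffsetMm = 40.0;                  // hypothetical range
        return new double[] { px * maxOffsetMm, py * maxOffsetMm };
    }

    /** Panel 2: pitch and yaw of the body (radians). */
    static double[] pitchYaw(double px, double py) {
        double maxAngleRad = Math.toRadians(15.0);  // hypothetical range
        return new double[] { py * maxAngleRad, px * maxAngleRad };
    }
}
```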

The gait control panel (top right) controls the type of gait.
The gait algorithm uses an input value between 0.0 and 1.0 to control the phase of the step.
Leg Ground Time: controls the amount of time the foot stays on the floor.
1, 2, 3, 4, 5, 6: control the phase offset of each leg

This is quite a general way to specify gaits. I can change between tripod, ripple and wave gait by changing these values. (see video 02:12-02:35)
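
The core idea looks roughly like the sketch below (simplified, not my actual code; the numbers are just examples): each leg gets the global phase plus its own offset, and the ground-time value decides which part of the cycle is stance and which is swing.

```java
/** Simplified gait-phase sketch (illustrative, not the actual implementation). */
public class GaitSketch {
    double groundTime = 0.5;                              // "Leg Ground Time": fraction of the cycle on the floor
    double[] legOffset = {0.0, 0.5, 0.0, 0.5, 0.0, 0.5};  // per-leg phase offsets (these values give a tripod gait)

    /** True if leg i should be on the floor at the given global phase (0.0..1.0). */
    boolean onGround(int i, double phase) {
        double p = (phase + legOffset[i]) % 1.0;          // leg-local phase
        return p < groundTime;                            // stance while below the ground time
    }

    /** Swing progress in [0,1] while the leg is in the air, for lifting and placing the foot. */
    double swingProgress(int i, double phase) {
        double p = (phase + legOffset[i]) % 1.0;
        return p < groundTime ? 0.0 : (p - groundTime) / (1.0 - groundTime);
    }
}
```

Changing these offsets and the ground time is all it takes to switch between tripod, ripple and wave (e.g. offsets of k/6 together with a large ground time give a wave-like gait).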

Here is the video of the simulation software in action:
youtube.com/watch?v=QkrTDEijRIM

I was only able to record the 3D part of the UI, since the recording software captures the rendered frames directly from the OpenGL pipeline.

Martin

Hi Martin,

Terrific!

Are you saying you can develop scripts for gaits, and interactively view them with your program? Visual Java?

Is the “movie” then generated afterwards?

Perhaps you could tell us what software products are needed to do all of this? I know I don’t have ANY of it!

I’ve mentioned on other threads that I want to be able to watch the leg movements resulting from the IK calculations. Sounds like you’ve gone WAY past that?

Please give us more information about what you’re doing! I’m sure others (myself included) would LOVE to hear more!

Alan KM6VV
Do I sound EXCITED?

The gait algorithm, user interface and visualization are written in plain Java. Using a Java IDE like Eclipse or IDEA, you can run the program in debug mode, change the gait algorithm, and as soon as you save, directly see how the modified gait works.
The visualization runs in real time, like a 3D “game”. All parameters can be changed in the user interface and the effect is directly visible on the 3D hexapod.

The video is a screen recording. That’s also why the viewport zooms in and out somewhat strangely as I drag the mouse around (I always pick the wrong zoom direction first) :slight_smile:
The biggest problem in creating the movie was finding a tool that can record the window contents at 20 frames per second.
Let me know if anyone knows a tool that can record an arbitrary area of the screen at 20 fps. Perhaps I could create a movie by recording the screen with a regular video camera; then I could show you how I change the parameters of the hexapod.

Java (GUI, controls, gait): java.sun.com
Java3D (Visualization): java.sun.com/products/java-media/3D/
Taksi (Screen recording): taksi.sourceforge.net

All listed tools are free.
You will also need an IDE like Eclipse (eclipse.org).

Martin

Thanks Martin,

Again, that’s quite a project! Thanks for the URLs and the run-down. Do you have a website?

Alan KM6VV

No, I don’t have a website. Perhaps I will create one at a later stage of my project.

Martin

Hi!

Great job!
This was really stunning. Very impressive! 8)
I’m fiddling with an Excel program… but your program rules.

I take my hat off.

How are you interfacing with the real hexapod?
Can’t wait to see more of this project!

I’m sending the commands to the bot over a serial cable. I will probably change to a wireless connection, or mount some kind of industrial PC or something else with enough CPU power directly on the bot.

Thanks
Martin

Ok, so you are using a servo controller on the bot? SSC32?
Any live videos of the hexapod on YouTube?

That’s fantastic! I’ve never seen something so cool 8)

Exactly, only the SSC32 is on the bot.

Not yet, but I will try to create one this weekend.
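
For anyone wondering what actually goes over the wire: the SSC-32 takes plain-text group-move commands of the form “#<channel>P<pulse width in µs> … T<time in ms>” followed by a carriage return. A minimal sketch (illustrative only; the OutputStream would come from whatever serial library is used, e.g. RXTX):

```java
import java.io.IOException;
import java.io.OutputStream;

/** Illustrative sketch of sending an SSC-32 group move over a serial connection. */
public class Ssc32Sketch {
    private final OutputStream serialOut;

    public Ssc32Sketch(OutputStream serialOut) {
        this.serialOut = serialOut;
    }

    /** Move the given channels to the given pulse widths within timeMs milliseconds. */
    public void groupMove(int[] channels, int[] pulseWidths, int timeMs) throws IOException {
        StringBuilder cmd = new StringBuilder();
        for (int i = 0; i < channels.length; i++) {
            cmd.append('#').append(channels[i]).append('P').append(pulseWidths[i]).append(' ');
        }
        cmd.append('T').append(timeMs).append('\r');   // e.g. "#0P1500 #1P1600 T500\r"
        serialOut.write(cmd.toString().getBytes("US-ASCII"));
        serialOut.flush();
    }
}
```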

Nice work. I personally don’t like Java. Think it’s bleh. I prefer C++ to Java. You can use DirectX and OpenGL :slight_smile:

I like the simulation though, looks very realistic (motion wise).

Do you first create a sequence with the program, and then send the commands to the hex? Or do you control the hex on the fly?

-robodude666

I’m using Java as the programming language of choice because I know the language best and the software should run on Windows and Linux without porting. Java3D is an abstraction layer that can run on top of either OpenGL or Direct3D; it’s just a matter of setting a flag to switch between OGL and D3D.
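
If I remember correctly, the flag is the j3d.rend system property, set before the first Canvas3D is created (treat this as a hint rather than gospel):

```java
// Select the Java3D rendering pipeline before any Canvas3D is created;
// "ogl" (the default) uses OpenGL, "d3d" uses Direct3D on Windows.
System.setProperty("j3d.rend", "d3d");
```

The same thing can be done from the command line with -Dj3d.rend=d3d.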

The commands are generated on the fly. There’s no need to create a sequence first. In the end the bot should act autonomously, based on object detection/recognition. That’s the reason I want to get the locomotion working first and then start adding behaviour.

Martin

Hello everyone,

I added two new videos.
The first video shows how servos are highlighted when they are driven to positions that are mechanically not possible on the real bot.
In this video I also use the new capability to control the angle between the legs.
youtube.com/watch?v=lP64iNDrKZQ

The second video shows the hexapod in the waypoint mode.
The locomotion algorithm controls the steering of the bot to reach the red waypoint. The waypoint can be changed with the mouse.
The algorithm is very basic: it only calculates the rotation point of the bot from the angle between the bot and the waypoint and from the distance to the waypoint.
I can still manually change between gait types and other parameters like body height, pitch, yaw and the angle between the legs.
youtube.com/watch?v=p9FrDbRz56Y
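
In rough Java it boils down to something like the sketch below (simplified, not the real code; the mapping from heading error to rotation-point distance is just an example):

```java
/** Simplified sketch of the waypoint steering idea (illustrative only). */
public class WaypointSteeringSketch {

    /** Signed angle (radians, normalized to [-PI, PI]) from the bot's heading to the waypoint. */
    static double headingError(double botX, double botY, double botHeading,
                               double wpX, double wpY) {
        double target = Math.atan2(wpY - botY, wpX - botX);
        double err = target - botHeading;
        while (err >  Math.PI) err -= 2 * Math.PI;
        while (err < -Math.PI) err += 2 * Math.PI;
        return err;
    }

    /** Rotation-point distance: small heading error -> point far away (walk straight),
     *  large error -> point close to the body (turn almost on the spot). */
    static double rotationPointDistance(double headingError, double maxRadius) {
        double e = Math.abs(headingError) / Math.PI;   // 0..1
        return maxRadius * (1.0 - e);                  // example mapping, not the real one
    }

    /** Slow down as the waypoint gets close so the bot does not overshoot it. */
    static double speedFactor(double distanceToWaypoint, double slowDownRadius) {
        return Math.min(1.0, distanceToWaypoint / slowDownRadius);
    }
}
```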

Martin

HEHE! :laughing:

Awesome! The best simulation I have ever seen.
Is it possible to walk around the waypoint? Meaning that the waypoint is the centerpoint of rotation.

Your program is just excellent!

It is already possible to walk around a point while looking at it when controlling the direction/rotation of the bot manually :smiley:
youtube.com/watch?v=mYj_9QjOvOQ

I would have to slightly adjust the waypoint mode for the bot to do this automatically.

Thanks
Martin

Impressive! 8)

I have managed to do the same with my Excel program. But your solution is so extremely elegant.

Keep up the good work!

Looking forward to seeing your live walking hexapod!

I uploaded a new video showing the simulation with the real hexapod side by side.
youtube.com/watch?v=zIJSidNh0yo

Martin

OK! I don’t have any words left…

VERY IMPRESSIVE!!!

So cool! Have to admit, I’m a bit envious…
:astonished:

Very nice work :slight_smile: How big is the whole program, all of the files together?

I’m envious of both of you =P

It depends on which files we count. The runnable compiled program with all data files (3D objects, textures, etc.), without the required libraries, is 1.6 MB.
The classes alone are 300 kB, but that already contains video processing, steering behaviour and environment classes not used in the demos.
The source code required to run the demos is currently around 150 kB.

Thanks,
Martin