A few weeks ago I ordered the CH3-R hexapod. My goal (right now) is to control the hexapod from a PC using Java and to add basic capabilities over time (walking, basic object detection, sensors on the feet etc.).
At the moment the software can make the bot walk using different gait types and move the body of the hexapod around.
I also added a little simulation panel to the software. This lets me debug the gait, and I can also see when a servo is moved to a position that’s mechanically not possible on the real bot.
The simulation is developed in Java using Java3D (OpenGL).
I can now visualize the hexapod and control the real hexapod at the same time. The servo positions are sent to the bot by serial cable.
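Just to give an idea of what goes over the wire: moving a servo basically boils down to writing a short command string to the serial port. This is only a rough sketch assuming an SSC-32-style servo controller and an already opened port; the class name and the surrounding details are simplifications, not necessarily what I run.

import java.io.IOException;
import java.io.OutputStream;

/** Minimal sketch: format one servo move as an SSC-32-style command and write it to the port. */
public class ServoLink {

    private final OutputStream serialOut; // output stream of an already opened serial port

    public ServoLink(OutputStream serialOut) {
        this.serialOut = serialOut;
    }

    /**
     * Move one servo.
     * @param channel servo channel on the controller (0..31)
     * @param pulseUs target pulse width in microseconds (roughly 500..2500)
     * @param timeMs  time the move should take in milliseconds
     */
    public void moveServo(int channel, int pulseUs, int timeMs) throws IOException {
        String cmd = String.format("#%d P%d T%d\r", channel, pulseUs, timeMs);
        serialOut.write(cmd.getBytes("US-ASCII"));
        serialOut.flush();
    }
}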
The white panels with the red dot in the middle are used to control the most important functions:
(1, leftmost): controls the centerpoint of the body relative to the legs
(2): controls the pitch and yaw of the body
(3, 4): control the rotation point, which sets the direction of the hexapod (middle -> turns in place, rightmost -> walks straight ahead, up -> walks sideways straight ahead, etc.); see the sketch after this list
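For illustration, here is one plausible way the (3, 4) pad can be mapped to a rotation point: the direction of the deflection gives the walking direction and the amount of deflection decides how straight the path is. This is a simplified sketch of the idea only; the names and the exact mapping are made up, not my actual code.

/**
 * Simplified sketch of mapping the 2D pad position to a rotation point.
 * padX/padY are in [-1, 1]; (0, 0) is the red dot in the middle of the panel.
 */
public class RotationPointMapper {

    /** Returns the rotation point {x, y} in body coordinates (millimetres). */
    public static double[] rotationPoint(double padX, double padY) {
        double deflection = Math.hypot(padX, padY);
        if (deflection < 1e-6) {
            return new double[] {0.0, 0.0};   // pad centered -> rotate on the spot
        }
        // The turning radius grows towards "infinity" as the pad reaches its edge,
        // which approximates walking straight ahead.
        double radius = 100.0 * deflection / (1.0 - Math.min(deflection, 0.999));
        // The rotation point lies perpendicular to the walking direction.
        double walkDir = Math.atan2(padY, padX);
        double x = Math.cos(walkDir + Math.PI / 2) * radius;
        double y = Math.sin(walkDir + Math.PI / 2) * radius;
        return new double[] {x, y};
    }
}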
The gait control panel (top right) controls the type of the gait.
The gait algorithm uses an input value between 0.0 and 1.0 to control the phase of the step.
Leg Ground Time: controls the amount of time the foot stays on the floor.
1, 2, 3, 4, 5, 6: control the phase offset of each leg
This is quite a general way to specify gaits: I can switch between tripod, ripple and wave gaits just by changing these values. (see video 02:12-02:35)
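To make that more concrete, here is a stripped-down sketch of the idea (a simplified reconstruction, not the code I actually run): each leg takes the global phase shifted by its own offset, and the ground-time value decides which part of that phase is stance and which part is swing. Tripod then means alternating offsets of 0 and 0.5 with a ground time of 0.5; a wave gait spreads the offsets over 0..5/6 with a ground time around 5/6.

/**
 * Minimal sketch of a phase-based gait (simplified reconstruction).
 * globalPhase is the step phase in [0, 1); each leg adds its own offset.
 */
public class GaitSketch {

    private final double[] legPhaseOffsets;  // one offset in [0, 1) per leg
    private final double groundTime;         // fraction of the cycle the foot stays on the floor

    public GaitSketch(double[] legPhaseOffsets, double groundTime) {
        this.legPhaseOffsets = legPhaseOffsets;
        this.groundTime = groundTime;
    }

    /** Phase of one leg, wrapped back into [0, 1). */
    public double legPhase(int leg, double globalPhase) {
        double p = globalPhase + legPhaseOffsets[leg];
        return p - Math.floor(p);
    }

    /** True while the foot of this leg should stay on the floor (stance). */
    public boolean onGround(int leg, double globalPhase) {
        return legPhase(leg, globalPhase) < groundTime;
    }

    /** Progress (0..1) through the current stance or swing part of the step. */
    public double progress(int leg, double globalPhase) {
        double p = legPhase(leg, globalPhase);
        return onGround(leg, globalPhase)
                ? p / groundTime
                : (p - groundTime) / (1.0 - groundTime);
    }

    /** Example parameter sets (leg numbering here is arbitrary). */
    public static GaitSketch tripod() {
        return new GaitSketch(new double[] {0.0, 0.5, 0.0, 0.5, 0.0, 0.5}, 0.5);
    }

    public static GaitSketch wave() {
        return new GaitSketch(new double[] {0.0, 1 / 6.0, 2 / 6.0, 3 / 6.0, 4 / 6.0, 5 / 6.0}, 5 / 6.0);
    }
}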
Are you saying you can develop scripts for gaits, and interactively view them with your program? Visual Java?
Is the “movie” then generated afterwards?
Perhaps you could tell us what software products are needed to do all of this? I know I don’t have ANY of it!
I’ve mentioned on other threads that I want to be able to watch the leg movements resulting from the IK calculations. Sounds like you’ve gone WAY past that?
Please give us more information about what you’re doing! I’m sure others (myself included) would LOVE to hear more!
The gait algorithm/user interface/visualization is written in plain Java. Using a Java IDE like Eclipse or IDEA you can run the program in debug mode, change the gait algorithm, and directly see how the modified gait works as soon as you save.
The visualization is realtime, like a 3D “game”. All parameters can be changed in the user interface and are directly visible in the 3D hexapod.
The video is a screen recording. That's also why the viewport zooms in and out somewhat strangely while I drag the mouse around (I always pick the wrong zooming direction first).
The biggest problem in creating the movie was finding a tool able to record the window contents at 20 frames per second.
Let me know if anyone knows a tool that can record an arbitrary area of the screen at 20 fps. Otherwise I could create a movie by filming the screen with a regular video camera; then I could show you how I change the parameters of the hexapod.
I'm sending the commands to the bot by serial cable. I will probably change to a wireless connection, or mount some kind of industrial PC or something else with enough CPU power directly on the bot.
I'm using Java as the programming language of choice because I know the language best and the software should run on Windows and Linux without porting. Java3D is an abstraction library that can run on top of OpenGL or Direct3D; switching between OGL and D3D is just a matter of setting a flag.
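If I remember correctly, that flag is the j3d.rend system property:

// Select the Java3D rendering pipeline before any Canvas3D is created
// (can also be passed on the command line as -Dj3d.rend=d3d).
System.setProperty("j3d.rend", "d3d");   // "ogl" for OpenGL, "d3d" for Direct3D (Windows only)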
The commands are generated on the fly; there's no need to create a sequence first. In the end the bot should act autonomously based on object detection/recognition. That's the reason I want to get the locomotion working first and then start to add behaviour.
I added two new videos.
The first video shows servos being highlighted when they are driven to positions that are mechanically not possible on the real bot.
In this video I also use the new capability to control the angle between the legs. youtube.com/watch?v=lP64iNDrKZQ
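The check behind the highlighting is conceptually just a per-servo range test, roughly like this (the class name and the limits are made up for illustration):

/** Rough sketch of the out-of-range check behind the highlighting. */
public class ServoLimitCheck {

    private final double minAngleDeg;
    private final double maxAngleDeg;

    public ServoLimitCheck(double minAngleDeg, double maxAngleDeg) {
        this.minAngleDeg = minAngleDeg;
        this.maxAngleDeg = maxAngleDeg;
    }

    /** True if the IK result asks for an angle the real servo cannot reach. */
    public boolean isImpossible(double angleDeg) {
        return angleDeg < minAngleDeg || angleDeg > maxAngleDeg;
    }

    public static void main(String[] args) {
        ServoLimitCheck coxa = new ServoLimitCheck(-90, 90); // hypothetical limits
        double ikResult = 103.5;                             // angle coming out of the IK
        if (coxa.isImpossible(ikResult)) {
            System.out.println("highlight this servo in red in the 3D view");
        }
    }
}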
The second video shows the hexapod in the waypoint mode.
The locomotion algorithm controls the steering of the bot to reach the red waypoint. The waypoint can be changed using the mouse.
The algorithm is very basic and only calculates the rotation point of the bot from the angle between the bot's heading and the waypoint and from the distance to the waypoint.
I can still manually change between gait types and other parameters like body height, pitch, yaw and angle between legs. youtube.com/watch?v=p9FrDbRz56Y
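For those interested, the steering idea can be sketched like this (a simplified reconstruction, not the exact code): the larger the heading error towards the waypoint and the closer the waypoint is, the closer the rotation point moves to the body, i.e. the tighter the turn.

/**
 * Simplified sketch of the waypoint steering.
 * All positions are in world coordinates in millimetres; the returned radius is
 * the distance of the rotation point from the body.
 */
public class WaypointSteering {

    /**
     * Small radius when the heading error is large or the waypoint is close (tight turn),
     * large radius when the bot already heads roughly towards a distant waypoint (almost straight).
     */
    public static double turningRadius(double botX, double botY, double botHeadingRad,
                                       double wpX, double wpY) {
        double dx = wpX - botX;
        double dy = wpY - botY;
        double distance = Math.hypot(dx, dy);

        double desiredHeading = Math.atan2(dy, dx);
        double headingError = normalize(desiredHeading - botHeadingRad); // in [-pi, pi]

        // Hypothetical mapping: radius shrinks with |headingError| and grows with distance.
        double error = Math.max(Math.abs(headingError), 1e-3);
        return Math.min(distance, 500.0) / error;
    }

    /** Wraps an angle into [-pi, pi]. */
    private static double normalize(double angle) {
        while (angle > Math.PI)  angle -= 2 * Math.PI;
        while (angle < -Math.PI) angle += 2 * Math.PI;
        return angle;
    }
}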
It is already possible to walk around a point while looking at that point when controlling the direction/rotation of the bot manually: youtube.com/watch?v=mYj_9QjOvOQ
I would have to slightly adjust the waypoint mode for the bot to do this automatically.
It depends on which files we count. The runnable compiled program with all data files (3D objects, textures etc.), but without the required libraries, is 1.6 MB.
The class files alone are 300 KB, but that already includes video processing, steering behaviour and environment classes not used in the demos.
The source code required to run the demos is currently around 150 KB.