IK GUI for the SEQ

Hello,

This may be a ridiculous question, as I am software/code impaired, but…

Is it realistic to think that someday we might have a GUI that will allow us to move limbs/assemblies using IK instead of having to move each individual servo? For example: you would move a set of triangles that represent the tips of the limbs on your hexapod, and the program would create the code for the controller from what the incorporated IK engine came up with. Or it could be a simple line drawing of the limb(s) with a grip point at the end(s) that can be moved in 3D space virtually to effect movements in the 'bot.

Whuddya think?

:slight_smile:

What???

Hmm…
Is it realistic?
Yes and no.

Creating 3D environments is tricky (and expensive) enough on its own, even with stick figures.
Doing other things in the background (sending and possibly receiving large amounts of serial servo data) at the same time is even more so.
Then add on all of the necessary IK calculations, and you’ve got a very compute-intensive system going on.

Can it be done?
Sure.
But definitely not in an afternoon!

A while ago, Mike started a project that would more-or-less do exactly this:
www.mike-winters.com/
He took a break from it, though, during his summer semester of college.
Now that the summer is over, maybe he’ll turn up again…
:slight_smile:

Nick,

You seem to have grasped what I was imagining. What if, instead of actually moving (sending/receiving/puppeting) servos, the interface worked like a simulator? You would draw your stick-figure bot and move the limbs about in the sequences you desire, and the program would record said sequences and translate them into code. You would only need to move the end points on the limbs because of the IK engine. You would have to set up the constraints on the joints (parent, child, angular, and spatial) like the IK bones in a 3D animation program. Once a sequence was recorded, the program could then translate the movements into graphed charts that you could tweak in order to smooth them out.
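The "move only the end point" idea boils down to inverse kinematics for each limb. As a minimal sketch (not any particular product's engine), here is the classic closed-form IK for a planar two-link leg: given a foot position (x, y) and the two segment lengths, it returns the hip and knee angles. A real hexapod leg adds a third joint and angular limits, but the core math looks like this:

```python
import math

def two_link_ik(x, y, l1, l2):
    """Planar 2-link IK: return (hip, knee) angles in radians that
    place the foot at (x, y), for segment lengths l1 and l2.
    Uses the law of cosines for the knee, then corrects the hip
    angle for the knee bend. Assumes the target is reachable."""
    d2 = x * x + y * y  # squared distance from hip to foot
    cos_knee = (d2 - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)
    # Clamp to [-1, 1] so rounding error can't crash acos at full extension.
    knee = math.acos(max(-1.0, min(1.0, cos_knee)))
    hip = math.atan2(y, x) - math.atan2(l2 * math.sin(knee),
                                        l1 + l2 * math.cos(knee))
    return hip, knee
```

With a function like this behind it, the GUI only has to hand the engine the dragged end-point position; the joint angles (and from them the servo pulse widths) fall out of the math.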

I can picture it. I’m sure there are some here who know people that could make such a program. I wonder if one exists already. I’ve found some high-end ones that are for commercial and military apps, but since no one is taking my ideas of bionic man-machine interfaces seriously enough to give me the funding needed, I just have to wait for the technology to surface in the private sector…

:slight_smile:

The cool thing about the SSC-32 is you can send all of the servos a new destination every so often (every 30-50 ms) and do the IK calculations in this dead space. So every loop it gets updated positions from the IK and instructs the servos to go to the new updated position. The moves are coordinated (group moves), and they are updated as they move, before they reach their last commanded destination. This method works incredibly well. The math involved is not a problem for a powerful PC. This is how our arms are controlled with RIOS, and how the H3-R robot’s legs are controlled as well.
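The update scheme described above can be sketched in a few lines. The SSC-32 group-move syntax is real (`#<ch>P<pw> … T<ms>` followed by a carriage return: all listed servos start and finish together over the given time), but the `solve_ik` and `send` callables below are stand-ins for the IK engine and serial port:

```python
UPDATE_MS = 40  # new destinations every 30-50 ms, as described above

def group_move_command(pulses, move_time_ms):
    """Format an SSC-32 group move from {channel: pulse_width_us}.
    Every servo in the group arrives at its target simultaneously."""
    parts = ["#%dP%d" % (ch, pw) for ch, pw in sorted(pulses.items())]
    return " ".join(parts) + " T%d\r" % move_time_ms

def control_loop(solve_ik, send, ticks):
    """Each tick: run the IK in the 'dead space', then issue the next
    coordinated group move before the previous one has finished, so the
    servos are continuously re-targeted mid-travel."""
    for t in range(ticks):
        pulses = solve_ik(t * UPDATE_MS / 1000.0)  # stand-in IK step
        send(group_move_command(pulses, UPDATE_MS))
        # a real loop would pace itself here, e.g. time.sleep(UPDATE_MS / 1000.0)
```

Because each command carries its own move time equal to the update interval, the servos glide smoothly between successive IK solutions instead of stepping.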

The Sequencer program would require a way to exchange data with another program (DDE) to allow a 3D engine to provide positional input to the program. I think it’s on the wish list, but I’m not sure when or if it will be incorporated.

Jim,

I’m really excited about using the sequencer you sent me. I’ve had a few ideas about how to program my leg assemblies. It sounds like I can directly move a servo-powered joint to a desired position and get the program to report to me where that is. Since my 'bot has such sensitive limbs, and since the servos’ power is transferred indirectly, I can’t move the actual assembly manually. I thought I would build a dummy leg with the same angles and lengths, with actual servos at the joints, and use that to find the positions. Then I could plug in my ‘hero’ leg and run the sequence through it to see how it goes…

The 3D model SEQ would take the place of the need for a ‘waldo’. In my mind the interface looks a lot like the one in Martin and Hash’s ‘Animation Master’. I could watch the machine’s movements graphically before having to implement them physically. It would be really awesome if the code crunchers that did the SEQ and the ones that did ‘Animation Master’ could collaborate on something.

:slight_smile: