POWERPOD ATOM PROGRAM - CH3R HEXAPOD -- Details?

Yes, what I meant to say is that there is a +/- 30 degree offset from leg to leg, and this extra 30 degrees is perhaps the offset from the “center.” By “center” I mean the nose of the H3-R, the midpoint between the two “front” legs if the three holes for the switches are considered to be the “rear.”

I’m just theorizing on this portion. I still haven’t fully wrapped my head around the IK calculation from a logic-flow standpoint… :stuck_out_tongue:

For the moment, I am using my own code, which is just a gaiting code to make the hexapod walk. It is sequence-based, so it’s inferior to the one that Laurent wrote, but it serves as a morale booster (to watch my hexapod do something other than sit on the bucket/stand with a DB9 umbilical all the time) when I get frustrated trying to understand his awesome code :wink:

Let’s keep this thread going so we can all have a better understanding of the IK calculations that are going on. I am still very confused about the “movement” point of view of things. There seem to be two SETS of reference systems. One set is the individual reference systems, each fixing its origin at a leg’s Hip Horizontal servo. The other reference system seems to be the body of the H3-R? Am I viewing this correctly?

In order to determine where to move the hexapod, do you assume that you ultimately want to move the body/chassis to coordinate (X,Z), and then use the IK calculation to move the legs into the positions that accomplish this?

I’m really hoping that Laurent can shed some light if he has any free time to explain his code… Meanwhile, I am searching for books and literature that explain IK calculations at a more general level (theory), to gain a better understanding of what perspective/angle I should be viewing this from before I even start throwing trigonometry at it… I believe that, ultimately, this general sense of IK can and should be applied to robots of any kind (except for wheeled/tracked ones, I suppose)…

:smiley:

Hi Tom,

Also recall that “0 degrees” on the 'bot is aft (to the rear).

Yeah, it’s great to see some moves for the first time! I ran for quite a while “up on blocks” with a friend’s DIY 'bot, trying to get code to run for it.

True, but you don’t have to give much thought to the body of the 'bot. Just revolve the input (commanded vector) direction around to the orientation of each leg, and then work in the leg “frame” of reference.

You could say that. I just get into the “leg frame”, and the IK moves the foot of the leg in response to the input vector.

Do the research! There are plenty of books on robots, and papers on the internet. Just a little trig will get you the IK.

Alan KM6VV

Ok, so I went through several articles explaining IK, and it looks basic enough: just use the knowns and solve for the unknowns, which in this case are the angles the joints need to be at, so that we can send the corresponding angles to the servos after converting them to pulse widths.
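That last step (joint angle to pulse width) is just a linear scaling. A minimal sketch, assuming a hypothetical servo with a 1500 µs center and 10 µs per degree; the real constants depend on the servo and are not taken from the PowerPod code:

```c
/* Hypothetical angle-to-pulse-width scaling: 1500 us at the 90-degree
   neutral, 10 us per degree of travel. These constants are illustrative
   only, not from the PowerPod code or any particular servo. */
#define PW_CENTER_US  1500
#define US_PER_DEG    10

static int angle_to_pulse(int angle_deg)   /* angle in 0..180 degrees */
{
    return PW_CENTER_US + (angle_deg - 90) * US_PER_DEG;
}
```

The SSC-32 accepts the pulse width directly in microseconds, so no further conversion is needed on that side.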

What I don’t get is: even after the tips are commanded to move to a location once the IK calculations are done, don’t the legs still have to be commanded to move again in order for the chassis to move?

For instance, wouldn’t you need to do two IK calculations in order for the body to move?

  1. Do the IK calculations and command three legs in a tripod to move to Location X,Z

  2. Do another set of IK calculations to this same planted tripod in order to push/pull from the location set in 1) in order to move the body?

I don’t get this higher-level flow of the diagram. I know that the arrays Xpos(), Ypos(), and Zpos() hold the current position of the hexapod, while Xpos2(), Ypos2(), and Zpos2() hold the position to move to.

Does that mean that after the leg tips of the planted tripod are commanded to move to Xpos2() and Zpos2() (with the tips above the ground while transitioning to this position, of course), the tips then have to be moved back to Xpos() and Zpos() (the original position), but this time touching/contacting the ground?

I’m no programming guru, but I get the flow of it. The IK for the destination positions is calculated on the fly, in real time. (Is that redundant?) So the code Laurent wrote sends these destination positions to the SSC-32 using the group move function. As the legs are moving, the IK is calculated again and the new updated destination positions are sent before the legs reach the last commanded positions. This ensures a continuous movement without stuttering or going off track. This is also how straight-line interpolation is accomplished with the end effector of an arm. Hope this helps.
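For reference, the SSC-32 group move is a single command line: each servo’s channel and target pulse width are concatenated, with one shared move time, so every servo arrives at the same moment. A sketch of building one (the channel numbers and pulse widths here are made up):

```c
#include <stdio.h>

/* Build an SSC-32 group-move command of the form
   "#<ch>P<pw>#<ch>P<pw>...T<ms>\r". Every servo named in the command
   reaches its target pulse width after <ms> milliseconds, which is
   what coordinates the legs of a tripod. */
static int build_group_move(char *buf, size_t len, const int *ch,
                            const int *pw, int n, int time_ms)
{
    int used = 0;
    for (int i = 0; i < n; i++)
        used += snprintf(buf + used, len - used, "#%dP%d", ch[i], pw[i]);
    used += snprintf(buf + used, len - used, "T%d\r", time_ms);
    return used;   /* number of characters written */
}
```

Sending a fresh group move before the previous one finishes, as described above, simply retargets the servos mid-flight.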

Ahhh, I see what you mean, but ultimately the servos will have to swing back to some position in order for it to move again?

Does anyone have an edited version of the code for the Atom Pro to use in “serial” mode? I know that the code is originally intended for the BS2, but if someone can post or send me the code generated by PowerPod for the Atom Pro, I would appreciate it.

I’ve begun editing it, but it seems there are many syntax differences between the two…

:open_mouth: gulp… :open_mouth: Why would they need to do that?

BS2? PowerPod generates Basic Atom code, not BS2 code. The BS2 can not… um, do the math. :wink:

Kurt has the best Atom Pro hack of the Atom 28 code created by PowerPod, but it’s not for serial. We are working on it…

I can easily send/post my version, which had defines to work for both round and inline.

My guess is that most of the code is the same for the serial and the PS2 versions, with just some sections different for input. So it would not be hard for someone to have PowerPod generate both versions (serial vs PS2), see where the differences are, then take the version I modified, cut out the PS2 functions, and insert the serial functions. You would then have to do some conversions on those sections (serin baud constants or the like, maybe a few nap instructions), but it should not be overly difficult.

There are some threads in this forum on what most of the code differences were.

That’s about it. Just a couple of “cosine law” calcs, and a right triangle or two to solve. Once you have the servo angles, they are scaled to the working range of the servos.

There are two tripods, and they work in alternating fashion. One tripod “flies”, the other moves the body over the ground.

You do an IK calc for each leg in each tripod, the difference is basically if you “lift” the leg, which requires different angles, of course. You’ve got it.

That’s it. And the “pos2” values are added to the “pos” set (well, most of 'em). You’ll also see that the pos2 set is added “incrementally” to the pos set, resulting in a smooth move. The number of “steps” in this incremental move determines the speed of the gait, since the servo speeds and the loop delays are left constant (due to the very coarse delay resolution in BASIC).
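That incremental addition can be sketched as an integer interpolation. Computing each intermediate point from the start position (rather than accumulating pos2/steps) avoids integer-division drift; the names echo the thread’s pos/pos2 arrays, not the actual source:

```c
/* Move from `start` toward `start + offset` in `total_steps` even
   slices; `step` runs 0..total_steps inclusive. More steps means a
   slower, smoother gait, since the loop delay stays constant. */
static int interp_step(int start, int offset, int step, int total_steps)
{
    return start + offset * step / total_steps;
}
```

With total_steps = 8, stepping 0 through 8 walks the foot through nine evenly spaced positions.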

The “flying” tripod moves its feet forward while the “ground” tripod moves its feet aft at the same time; thus the feet make a loop in the vertical plane, first going forward (fly) and then backward (ground). I guess you could say the ground triangle flips around.

Alan KM6VV

Yes, that is exactly the flow I was thinking of. After I study Kurte’s code (since it has the right syntax for an Atom Pro), I’m going to try to incrementally build up my own IK code from scratch, so that I can fully understand what needs to happen in order to move my bot…

Kurte,

can you please send me the code either through PM or in the next reply to this thread? I think posting it in a new thread might land you a sticky post, since it’ll be quite helpful for all H3-R owners with an Atom Pro.

Perhaps in the Atom Pro section?

I have been following this thread with great interest, because I wanted to understand the code as well. And I must say that it was really helpful up to a point.

My main reason for wanting to understand the code is the fact that I have built a somewhat modified version of the BH3 (normally inline). I have used the BH3 body and the C-legs. I also added a VIA EPIA Pico-ITX (the main reason for choosing the BH3 instead of the BH3R) and of course some extra battery.

I changed the position of the front and rear legs because it makes the robot a whole lot more stable. The front legs are 30 degrees off to the front and the rear legs 30 degrees off to the rear.

This results in a body that needs the code for the round body, but with some modifications. And to be honest, I cannot get all the details in the code right… Walking correctly is not the problem, but the turning gives me a lot of problems.

Could somebody help me on this?

Yes, I hope we can post more insights into IK on this thread!

Are you saying that the pico fits on the BH3 bot? That could be interesting!

The round body code runs a loop to calculate the values of the 6 (or 8!) legs. The in-line body does a simpler calc, 3 legs left, 3 legs right.

You might consider using a table of the angles of your legs. Mine are in FP, but integers could be used. A further refinement (I’m still developing and want to see angles) would be to put the SIN/COS values into the table. Speeds up the calcs.

[code]/* C code example */

#include <math.h>

#define MAX_LEGS 6
#define DegToRad(deg) ((deg) * M_PI / 180.0)

double LegAngles[] = /* THESE ARE MY NEW ANGLES Shelob */
{ 36.87, 90.0, 143.13, 216.87, 270.0, 323.13};

/* Index, Range, Heading, XPos2[], ZPos2[] are declared elsewhere */
for (Index = 0; Index < MAX_LEGS; Index++)
{
    LegPos = LegAngles[Index];

    XPos2Tmp = -Range * cos(DegToRad(Heading + LegPos));
    ZPos2Tmp = -Range * sin(DegToRad(Heading + LegPos));

    /* old 127 "BASIC" scaling + 300 in mm */

    XPos2[Index] = (signed int)(XPos2Tmp * 127.0 / 300.0);
    ZPos2[Index] = (signed int)(ZPos2Tmp * 127.0 / 300.0);
}
[/code]

I’ve still got the “BASIC” scaling here, that can go away as well.

Remember that “0” is aft on the 'bot.

But WHAT IS this 300 parm?

Alan KM6VV

Yes, what I did is I built the VIA EPIA Pico-ITX board inside of the BH3 body. It’s only a little longer and wider than the SSC-32 or Bot Board (10 cm x 7 cm x 3 cm). Actually everything fits like a glove inside the body:

Bot Board
SSC-32 board
VIA EPIA Pico-ITX
1 USB-to-serial converter (both the Bot Board and the SSC-32 board are connected to the EPIA, COM1 and COM2)
(oh yeah, and two batteries, 1 for the servo board and 1 for the PC)

The EPIA has 4 USB ports (2 webcams, WiFi, USB to serial), a 1 GHz processor, 1 GB memory and a 4 GB IDE flash disk, and I am running Windows XP and all the software from that (PowerPod, programming the Atom board, sequencer, RoboRealm (twice)).

By the way, the VIA EPIA board with processor was under 200 dollars… you then only need memory and a hard disk.

That’s perhaps why it is a little heavy, and I noticed that inline walking is not the best way to go…

Meanwhile I was experimenting with the software for the round body, and this seems to work really well when I put the legs in the same configuration as the round body (60-60-60-60-60-60). So now the front and rear legs are actually 45 degrees off instead of 30 degrees. And this works really well for the walking part.

But the turning is still the problem: the two middle legs are of course not as far from the center as the front and rear legs, so they are off now.

I don’t know if this is the right way to go, but I was thinking of using the code for the round body with the 45-degrees-off setting, and making some correction to the code for the 2 middle legs. More specifically, a way to correct for the incorrect body size for the middle legs.

I don’t understand it yet though…

That’s especially one of the parameters I am wondering about as well…

That should be corrected by the rotation angles table.

The legs being further away from the center of the 'bot (as I think you’re indicating) shouldn’t matter, only the input vector that’s rotated around to be applied to the individual legs.

Like I’ve been saying: that’s exactly what you’ll be doing using the vector rotation, instead of just two pairs of mirrored vectors (for an in-line 'bot).

Alan KM6VV

Eureka, thank you… I get the picture… will try…

Ok, I got the modified code from Kurte, and I’ve truncated most of the PS2 input section but have left the rest intact. I’m trying to “inject” a direction for the Hexapod to walk in, so I’m at this portion of the code:

[code]DCoord = 20

for Index = 0 to 5
	XPos2(Index) = -(DCoord * COS(DAngle + (Index * 43 + 21)) / 300) ; 43 => 60 degrees
	ZPos2(Index) = -(DCoord * SIN(DAngle + (Index * 43 + 21)) / 300) ; 43 => 60 degrees
next
.
.
.
.[/code]

As you can see, I’m trying to inject a “20” into DCoord, but nothing seems to happen. Am I injecting into the wrong variable to get the hexapod walking?

It has been a long time since I tried to understand this code. When I made it work on the Pro, mostly I ran a version on a generic Atom and in some cases had it dump out calculations, which I then checked against the calculations that I was getting from the Pro. Obviously Laurent would be the best one to answer this.

However, from a quick look through the code, I don’t think starting off by changing DCoord will change the direction. It appears to be more of a modifier of the calculated angles. DAngle may be more of an angle.

I am also not sure what you mean by injecting an angle. My guess is that if you want the Hex to simply move in a direction (not rotate), I would try experimenting with the two variables XSpeed and YSpeed. To go forward at full speed you might try XSpeed=127, YSpeed=0. To go back, change XSpeed to -127. To go 90 degrees try XSpeed=0, YSpeed=127. To go 45 degrees try XSpeed=127, YSpeed=127. Obviously you may want to start out not going at the maximum speeds…

To have the Hex rotate, try changing the variable Steering. 0 will do a normal walk, -127 will rotate one direction at max speed, and +127 will rotate the opposite direction.

Good Luck

Thanks Kurte, I’ll try that when I get home tonight. What I meant by “injecting” a direction/angle (or rather a position relative to the body/feet) was this: since the code accepts commands from the PS2 controller, and this is how XSpeed and YSpeed are determined, I wanted to “inject” software (autonomous) control of the movement of the hexapod. I still have a ping))) and gpird sensor to tack on, so I needed a way to control the movement via software and not by an external peripheral such as the PS2 wireless (although the way the code receives and interprets the PS2 commands is quite elegant :wink: )

I have a walking gait that I developed myself, but it is a gait nevertheless. It’s more of a test program to test the function of the servos. But I would really love to experiment with Laurent’s/Powerpod’s variant of the code which uses the IK calculation for a fluid motion.

One more question, is the resolution of this fluid motion controlled by the variable “Nbstep?”

I believe the fluid motion may be controlled by the variable GaitSpeed.
The higher the number, the slower the motions happen, and probably the more fluid they are.

Good synopsis Kurte!

Some comments: DCoord is the magnitude and DAngle is the angle of the “command vector”. This vector gives a “translate” motion, and “Steering” will be seen to rotate the 'bot.

It can be seen that DCoord and DAngle come from XSpeed and YSpeed:

[code] DCoord = SQR(XSpeed * XSpeed + YSpeed * YSpeed)

TmpCos = XSpeed * 127 / DCoord
	
gosub ACos
DAngle = TmpAngle

[/code]

Alan KM6VV

Ahh, so XSpeed/YSpeed can be seen as a Cartesian way to move things, and the coordinates in Cartesian space are then converted to DCoord/DAngle, which is a polar representation of the movement…

Perhaps “injecting” DCoord/DAngle might suit my needs better, but I’ll experiment with both coordinate systems to see the advantages/disadvantages of each when interfacing with higher-level functions, such as decision making after getting some telemetry input from sensors…

thanks guys, I really appreciate the help!

:smiley: :smiley: :smiley: