Hi @kurte,
Servo Control
Also, it will be interesting to see how overlapped your Serial RX and TX will be, and how much data you will typically transfer.
I’ve had some success here. My LSS library will send multiple commands together, which gets rid of some latency and also fills any FIFOs along the wire. If you are sending action commands you can send them all regardless of servo ID, but when querying you can only pipeline multiple packets to the same servo: a single servo will respond sequentially, whereas queries sent to multiple servos simultaneously could produce conflicting replies. A servo responds deterministically within 800 microseconds, so my library assumes the response is waiting in a FIFO after 1200 microseconds and sends the next request anyway. The LSS lib compiles on Linux or Arduino devices.
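The batching rule can be sketched as a tiny burst planner. The names below are hypothetical, not the actual LSS library API: action commands pipeline freely, while queries only pipeline to a single servo.

```cpp
#include <cassert>
#include <cstdint>
#include <string>
#include <vector>

// Hypothetical request record; "is_query" marks commands that expect a reply.
struct Request {
    uint8_t servo_id;
    std::string cmd;
    bool is_query;
};

// After a query is sent, the servo replies deterministically within ~800 us,
// so a follow-up to the same servo can be written after this timeout even if
// the reply has not been read yet (it will be waiting in the RX FIFO).
constexpr unsigned kResponseTimeoutUs = 1200;

// Split requests into bursts that can be written back-to-back on the wire.
// Action commands join any burst; a query joins only if every other query in
// the burst targets the same servo, since that servo answers sequentially
// while two servos answering at once could collide on the bus.
std::vector<std::vector<Request>> plan_bursts(const std::vector<Request>& reqs) {
    std::vector<std::vector<Request>> bursts;
    int burst_query_id = -1;  // servo id owning queries in the current burst
    for (const auto& r : reqs) {
        bool fits = !bursts.empty() &&
                    (!r.is_query || burst_query_id < 0 ||
                     burst_query_id == r.servo_id);
        if (!fits) {
            bursts.push_back({});
            burst_query_id = -1;
        }
        bursts.back().push_back(r);
        if (r.is_query) burst_query_id = r.servo_id;
    }
    return bursts;
}
```

So a run of position commands to every servo goes out as one burst, while queries to two different servos get split into two bursts with a read (or the 1200 µs timeout) in between.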
One (of many) things I am unsure of is how much of the smarts of the motion will typically be done in the main processor (RPi4), how much will be in the servo controller (ATmega328), and how much of it will be handled directly by the servos?
I went with a layered approach. I wanted Compliant Joints, and I needed a lot of wiggle room to play with the compliance algorithm controller-side, so the only feature requests I made were for simpler building blocks. @scharette was very helpful in this regard, both talking it out together and adding a few things: he added some additional current-control features for me, on top of what he was already working on regarding current control, smooth motion, response dampening, etc., all of which I took advantage of. I then wired these up in a state-machine algorithm in my LSS library, and the mirror video above shows the result. I then wrapped that lib in a ROS2 node called the “joint controller”. You can send either Cartesian coordinates or joint coordinates to a ROS2 topic to move the joints, and also subscribe to joint angles or Cartesian joint locations. I continue to build on this…
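To give a feel for the state-machine part, here is a heavily simplified sketch of the idea (the state names and current thresholds are illustrative, not the real LSS library code): hold position until the measured servo current spikes, yield while the external force persists, then ramp back to holding.

```cpp
#include <cassert>

// Illustrative compliance state machine driven by measured servo current.
enum class ComplianceState { Holding, Yielding, Recovering };

struct ComplianceFsm {
    ComplianceState state;
    double push_threshold_ma;    // current above this implies an external push
    double settle_threshold_ma;  // current below this implies the push ended

    ComplianceState update(double measured_current_ma) {
        switch (state) {
        case ComplianceState::Holding:
            // A current spike means an external force is acting on the joint.
            if (measured_current_ma > push_threshold_ma)
                state = ComplianceState::Yielding;  // drop holding torque
            break;
        case ComplianceState::Yielding:
            // The joint follows the external force; when it is gone, recover.
            if (measured_current_ma < settle_threshold_ma)
                state = ComplianceState::Recovering;
            break;
        case ComplianceState::Recovering:
            // Ramp torque back up, then resume the position hold.
            state = ComplianceState::Holding;
            break;
        }
        return state;
    }
};
```

The real controller layers on the smooth-motion and dampening features mentioned above; this only shows the skeleton of the detect/yield/recover loop.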
Inverse Kinematics/Dynamics
[snip] it had a fixed sinusoidal walking gait that it broke down into N parts. For each part of this, it then used the kinematics to convert this into servo positions. The Driver code then converted the standard ROS coordinate units into servo units
So ROS2 already uses the Orocos/KDL kinematics and dynamics library to translate from joint-angle space to Cartesian coordinate space in a built-in “Robot State Controller” node. My compliant joints work well for human-robot interaction when gravity is not pushing on the joint, but to get really human-like compliance I would have to add Gravity Compensation. This is where I add a higher layer of control to the stack. Since Orocos is already linked into ROS, I just created another node called Humanoid Dynamics Model, included the Orocos header, and loaded the robot description via the robot_description URDF topic; Orocos does most of the setup work. I pass in the force of gravity, any other force wrenches, and some orientation vectors from the ROS TF topic (robot pose, IMU, etc.), and Orocos computes the torque on each joint due to gravity. I can then push this back to the joint/compliance controller to configure the joints and Bob’s your uncle…the joint is able to detect human interaction even with its leg held straight out, already under heavy tension due to gravity. Before gravity compensation the leg would fall, thinking a human was pushing it down.
This Gravity Comp node is not specific to humanoids; it reads the robot configuration from the URDF, so you could build on this node with more Inverse Kinematics or Inverse Dynamics models. Bonus: there is very little math to do! No Jacobians, at least. haha.
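For intuition on what the solver computes per joint: for a single link, the torque gravity applies about the joint axis is m·g·r·sin(θ). A toy version (not the KDL API, just the underlying physics):

```cpp
#include <cassert>
#include <cmath>

// Torque gravity exerts about a joint for one link: m * g * r * sin(theta),
// where r is the distance from the joint axis to the link's center of mass
// and theta is the link's angle from vertical. KDL's inverse-dynamics
// solvers compute the equivalent for every joint of the URDF-described
// chain or tree at once.
double gravity_torque_nm(double mass_kg, double com_dist_m,
                         double angle_from_vertical_rad) {
    constexpr double g = 9.81;  // m/s^2
    return mass_kg * g * com_dist_m * std::sin(angle_from_vertical_rad);
}
```

A leg hanging straight down (θ = 0) needs no compensation; held straight out (θ = 90°) it sees the maximum gravity torque, which is exactly the bias the compliance controller needs so it doesn’t mistake gravity load for a human push.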
The Driver code then converted the standard ROS coordinate units into servo units. Then the whole timing of this was controlled by the max servo delta from the last position. That is, it iterated moving each servo + or - one servo unit, doing a complete servo group-move output, until the last servo was at the new position… Which worked fine for the owner, but I always wanted to change it
Yeah, this seems really convoluted. I think we are on the same page here. I remember the Phoenix IK was able to translate and rotate the robot’s center body while the legs adapted as they should, without slippage. I can’t imagine this not being possible with Orocos. I have to admit I am still working on this problem at the moment, though. The problem comes down to affixing the foot joints that are in contact with the ground: those ground contacts must be the fixed points, and thus the root of the IK node. Orocos has ChainIKSolver, ChainIDSolver and TreeSolver…I think the TreeSolver does this sort of multi-contact solving. At the moment, I am looking into some other more advanced humanoid-optimized controllers from Crocoddyl/Pinocchio and Robot Control Toolbox. Both of these libs solve multi-contact rigid-body problems using multiple-shooting algorithms and Model Predictive Control (MPC). Are you familiar with these techniques? I could use some tips if you are. I am up to my neck in white papers and my math is rusty too. lol. Both libraries have some interesting videos of humanoids, quadrupeds and even quadcopters doing some crazy tricks. With MPC and Stack-of-Tasks (SoT, OpenSOT) we can issue rough walking patterns and the MPC and SoT will execute them while maintaining balance and other constraints.
Happy Valentine’s Day…MoveIt!2 Tomorrow!!
There is also MoveIt!2 being released tomorrow for ROS2! I’ve been looking forward to checking this out! It was highly regarded in ROS1 for motion planning. I don’t think it is as advanced as the MPC techniques I mentioned above (it uses the OMPL solver internally), but it is also extensible, so it could probably use MPC, and as a bonus it has existing (and nice) RViz interfaces for joint manipulation. It is also intended for walking bots, not just rovers.
It will be very interesting to see which special features of a Hexapod integrate well with ROS2. At least my impression of ROS was that it was very much geared toward something like a ROVER.
Yeah, you might need to throw out what you know of ROS for ROS2. I wasn’t impressed with ROS1, and it did seem geared heavily towards fixed bases. It wasn’t really ROS specifically that did this; rather, all the ecosystem development that went with ROS was geared that way. Things are more agnostic now, and there is a growing number of people pushing legged ROS robots forward.
Things like if you come to a door and your default configuration is such that you don’t fit, can you change the angle of the legs such that you are narrower and make it through… Or if there is a stick in your path, can you change your leg height to step over it. [snip] Hopefully ROS2 is set up to allow lots of newer things to happen.
Yes, if you build/configure your robot this way. It’s not a ROS2 restriction; it’s a function of the IK/ID solver and motion planning, which you can write yourself or take from something existing such as MoveIt!2. There is not a lot of choice here yet for humanoids in ROS2 (more choice for hexapods, I see), but this is why I thought that if you could port your existing Phoenix motion-planning code to ROS2, you’d have your advanced motion planning plus access to much more: 3D point-cloud sensors, Octomap mapping and avoidance, etc. Definitely, ROS2 is much more open to legged robots!
REP Documents
FYI, there are REP-xxx documents, much like Internet RFCs, that describe joints, naming conventions, message policies, etc. for robots. There are REPs for wheeled robots, humanoids, world-robot reference frames, odometry and much more. This is how the ROS ecosystem interoperates across vendors: by defining standardized message-passing interfaces. For example, I am all over REP-120, 103 and 105 for my humanoid.
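As an example of what REP-103 buys you in practice: it mandates SI units (meters, radians, seconds) in all messages, so a driver node converts at the boundary to whatever the firmware speaks. If I recall the LSS protocol correctly it uses tenths of a degree for positions, but treat that as an assumption and check the protocol docs:

```cpp
#include <cassert>
#include <cmath>

// REP-103: ROS messages carry radians; the servo protocol carries its own
// integer units. Assumed here: one servo unit == 0.1 degree.
constexpr double kDegPerServoUnit = 0.1;
constexpr double kPi = 3.14159265358979323846;

// Convert a raw servo position reading into the radians a ROS message expects.
double servo_units_to_radians(int units) {
    return units * kDegPerServoUnit * kPi / 180.0;
}

// Convert a commanded joint angle in radians back into servo units.
int radians_to_servo_units(double rad) {
    return static_cast<int>(std::lround(rad * 180.0 / kPi / kDegPerServoUnit));
}
```

Keeping this conversion in one driver node means every other node in the graph can stay unit-clean per REP-103 and remain reusable across servo brands.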
Finally…
I would really like us to work together. I am still heavy into the humanoid for now, but a lot of what I have done is easily usable in other legged robots. The LSS library for the servo control loop, the Compliant Joint Controller and Gravity Compensation are all agnostic to humanoid versus multi-legged robots, so these are available to you with just a change to some ROS YAML files and a different URDF file. I would really welcome any improvements as well, as there is lots to do yet. If you want to give ROS2 a try I’d be happy to WebEx or Hangouts chat and get you and @zenta up to speed. I won’t be offended if you decide to go your own way either, but don’t be afraid to use me as a resource. I look forward to working with the two of you!
Colin