Lynxmotion SES V2 Hexapod Robot

If they do stall, they'll most likely turn off within a second. If the stall is very minor, then maybe a few seconds at the most.
At that point the LSS enters a safe mode and will stop responding to all motion commands (but can still be queried and reset).

Short answer: full duty for up to 1.6 s.

Longer answer: Without any limitation on maximum duty cycle (see MMD) or a CL/CH limit (based on average current with a "D" command), the servos will try to go up to maximum duty cycle. As the current ramps up without motion, the LSS detects this internally and eventually enters safe mode.

The safe mode can be entered from three parameters: temperature, voltage or current. That last one is the most important for stalls. The higher the current draw during a stall, the faster the safety triggers. For the ST1 & HS1, anything above 1250 mA sustained for more than 1.6 s would cut off. For the HT1, that upper limit is 2800 mA. For lower currents (but still above normal use), the duration required for cut-off / safety is longer, but still only a handful of seconds.

To prevent false positives (and false negatives!), the limit is not triggered by a single measurement but is "accumulated over time" (and also reduced over time during normal operation). Therefore, if you stall temporarily, get close to the limit without reaching it (not entering safe mode), move away, and then stall again shortly after, you may (and probably will) still trigger safe mode, since the internal accumulation decreases gradually rather than resetting instantly.
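
Purely to illustrate that "accumulate and bleed off" behaviour, here is a minimal leaky-bucket model in C++. To be clear, this is not the actual LSS firmware logic: the 1250 mA / 1.6 s numbers come from above, but the decay rate is a made-up value chosen only to show the idea.

```cpp
// Illustrative leaky-bucket model of a current-based safety cut-off.
// NOT the actual LSS firmware; the decay rate below is an assumption.
struct CurrentSafety {
  float accumulator = 0.0f;                    // accumulated "excess" current, in mA*s
  const float limit_mA = 1250.0f;              // sustained-current threshold (ST1/HS1)
  const float trip_mAs = 1250.0f * 1.6f;       // ~1.6 s at the threshold trips safe mode
  const float decay_mAs_per_s = 400.0f;        // assumed bleed-off rate in normal operation

  // Call periodically with the measured current and elapsed time.
  // Returns true once the (modelled) safe mode would trigger.
  bool update(float current_mA, float dt_s) {
    float excess = current_mA - limit_mA;
    if (excess > 0.0f)
      accumulator += excess * dt_s;            // stalling: build up
    else
      accumulator -= decay_mAs_per_s * dt_s;   // normal operation: bleed off slowly
    if (accumulator < 0.0f) accumulator = 0.0f;
    return accumulator >= trip_mAs;
  }
};
```

A second stall shortly after the first starts from whatever is left in the accumulator, which is why it can trip sooner than the first one did.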

I hope this helps you understand the safe mode a bit better (as it relates to current / motion).


@scharette and … - Thanks for the update,

I forgot about the CL and CH modifiers.

Sounds like you have some good safeguards built into the servos to hopefully keep them from frying themselves.

Some of the others I know did, for a couple of different reasons. For example, I know a few people who built hexapods using some pretty powerful Dynamixels like the MX-64 or MX-106, and I can imagine some of them may have moved up to the newer equivalent X-series servos, where the MX-64 puts out over 60 kg-cm and the 106s maybe 100… Which, if your code is not working right, might break something like fingers or the frame…

Also, I think @zenta experimented with something like this in at least some of his earlier Terrain Adaptation, where I think when the leg is going down he set it to a very low punch, detected when it stopped moving, and then I believe told the servo to logically go to where it already was and turned the punch back up… Are you still doing something like that?

Again sorry, I keep getting distracted trying to debug why people are having issues with nRF radios talking between AVR-based boards and a Teensy 4.x. I always meant to play with some of these to update the DIY remote controls and remove the XBee, as the XBee randomly deciding not to send out packets with consistent timing was irritating.

Plus people asking how to do half duplex and… Too easy to get distracted :wink:

Hope everyone is doing well! I am still staying in my Cave


Hi Kurt,
For Terrain Adaptation using the AX-18, they moved at full torque all the time. I just read the load from the femur servo to determine if it had ground contact or not. If the readings were too high, it simply moved the leg a little upwards again. At power-up the servos are set to low torque, preventing them from causing damage when trying to "snap" into position. It then just waits until all servos have reached their goal (init positions) and then turns on full torque. Today I'm using mechanical switches on each foot.
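
In rough pseudo-code form the idea was something like the sketch below (not my actual code); lowerLegStep(), raiseLegStep() and readFemurLoad() are just placeholders for the real servo library / IK calls (e.g. reading the AX-18's Present Load), and the threshold has to be tuned per robot.

```cpp
// Rough sketch of load-based ground contact while lowering a leg.
// The three helpers are hypothetical placeholders, to be implemented
// with your own servo library / IK; the threshold must be tuned per robot.
#include <Arduino.h>

void lowerLegStep(int leg);    // move the foot down one small increment
void raiseLegStep(int leg);    // move the foot back up one small increment
int  readFemurLoad(int leg);   // read that leg's femur servo load (e.g. AX-18 Present Load)

const int LOAD_CONTACT_THRESHOLD = 300;  // assumed value, in the servo's load units

// Lower one leg until the femur load spikes (= ground contact) or we run out of travel.
bool lowerLegUntilContact(int leg, int maxSteps) {
  for (int i = 0; i < maxSteps; i++) {
    lowerLegStep(leg);
    delay(20);                                 // give the servo time to move / update load
    if (readFemurLoad(leg) > LOAD_CONTACT_THRESHOLD) {
      raiseLegStep(leg);                       // back off slightly, as described above
      return true;
    }
  }
  return false;                                // never touched down within range
}
```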


That is exactly what the CH modifier is for! :slight_smile:

Great idea!

Also not a bad idea. I figure I’d prefer a capacitive load sensing tab in that case, since you get not just floor detection but also know how the load is spread on each leg…
Something like these ones.


Oh also, on that point:

You can do the same with the LSS using:

MMD allows either symmetric torque limits (both directions) or asymmetric ones (use a comma to separate the two values). QMMD will let you know what the current limits are. You can read more here.
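
From a Teensy, a minimal test might look something like this, assuming the usual LSS ASCII format of #<id><cmd><value> terminated by a carriage return; the servo ID (5), baud rate and duty values are arbitrary examples, so double-check the exact syntax and valid ranges on the page linked above.

```cpp
// Minimal Teensy/Arduino sketch sending MMD / QMMD to an LSS servo on Serial1.
// Servo ID (5), duty values and baud rate are assumptions/examples only;
// see the LSS Communication Protocol documentation for exact syntax and ranges.
#include <Arduino.h>

void setup() {
  Serial.begin(115200);        // USB serial monitor
  Serial1.begin(115200);       // LSS bus (115200 is the usual default)

  Serial1.print("#5MMD50\r");     // symmetric limit: same max duty in both directions
  Serial1.print("#5MMD30,60\r");  // asymmetric limit: comma-separated, one value per direction
  Serial1.print("#5QMMD\r");      // ask the servo what its current limits are
}

void loop() {
  // Forward any reply (e.g. "*5MMD...") to the serial monitor.
  while (Serial1.available()) Serial.write(Serial1.read());
}
```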


@zenta - Glad to hear from you. Have you had a chance to experiment with adapting your code to these servos?

@scharette - Sounds good.

For startup, on some other hexapods, I did something like move all of the servos to their startup positions, taking a second or so to get there… On some arms, I ran into issues where they could get into trouble depending on what positions the servos were in at power-off. That was especially true with RC servos with no feedback, since you could not query where they were… So in those cases I had it first move to some neutral safe position and then to the start position…
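
As a sketch of that two-stage startup (the poses, timings and moveAllServos() helper here are all hypothetical stand-ins for whatever group-move call your servo library provides):

```cpp
// Two-stage startup sketch: ease into a known-safe neutral pose first,
// then move to the actual start pose. Poses, timings and moveAllServos()
// are hypothetical placeholders for your own IK / servo library.
#include <Arduino.h>

struct Pose { float coxa, femur, tibia; };   // joint angles in degrees (same pose on every leg)

const Pose NEUTRAL_POSE = { 0, 45, -90 };    // assumed "legs tucked" safe pose
const Pose START_POSE   = { 0, 20, -60 };    // assumed standing / init pose

void moveAllServos(const Pose& p, unsigned long travelTimeMs);   // implemented elsewhere

void startupSequence() {
  moveAllServos(NEUTRAL_POSE, 2000);   // slow move that is safe from any power-off position
  delay(2200);                         // wait for the move (plus a little margin)
  moveAllServos(START_POSE, 1000);     // then go to the real start position
  delay(1200);
}
```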

It will be interesting to try out some of the different ways to detect the ground and any other obstructions.

It is good that the servos have the ability to set these limits, so that the main processor does not necessarily have to query them lots and lots of times to detect contact.

Back with the HROS1, I know a few of us experimented with using FSR sensors to detect when the two feet hit the ground. If I remember correctly we had 4 sensors per foot (each corner).

Renee and I played around with a simple board (either a Teensy 3.x or an Adafruit Trinket Pro), where we set up the board to connect into the DXL servo chain, each with its own ID, and set up logical registers to read the sensors… (They also had a NeoPixel whose colors we could set…) Obviously the Dynamixel is a little different from the LSS servos, but we could probably do a similar thing.

I was also thinking about experimenting with some Time of Flight (ToF) sensors like the VL53 or VL62… where maybe you could detect how far the foot was off the ground before you started to lower it. Not sure how well they would work, but it could be a fun experiment.
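
If I do try it, something like the stock Adafruit VL53L0X example is probably where I'd start, just reading the range each loop; whether the readings hold up on a moving foot is exactly the open question.

```cpp
// Basic VL53L0X range read (Adafruit_VL53L0X library), roughly the stock example.
// Mounting it on a foot/tibia and whether the readings hold up there is untested here.
#include <Wire.h>
#include "Adafruit_VL53L0X.h"

Adafruit_VL53L0X lox;

void setup() {
  Serial.begin(115200);
  if (!lox.begin()) {                      // init over I2C (default address 0x29)
    Serial.println("VL53L0X not found");
    while (true) {}
  }
}

void loop() {
  VL53L0X_RangingMeasurementData_t measure;
  lox.rangingTest(&measure, false);        // take one measurement
  if (measure.RangeStatus != 4) {          // 4 = out of range / invalid
    Serial.print("Foot height approx (mm): ");
    Serial.println(measure.RangeMilliMeter);
  }
  delay(50);
}
```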

Now back to playing around.


@kurte @xan @zenta @cmackenzie In case you were not aware, all three servos will be moving to a three-part aluminum case. The first production run we have received is the ST1, to be followed by the HT1 and then the HS1: https://www.robotshop.com/en/lynxmotion-smart-servo-lss---standard-st1.html

Back to the topic at hand, we're hoping the CL / CH might be sufficient for contact detection, but the LSS-2IO would allow additional sensors to be connected per foot. Just brainstorming, but I wonder if adding sensors to the knee (facing the foot to get an idea of distance) would help keep the sensors safe and be easier to mount.

As you said (Kurte), first step is simply getting it to walk.


Current detection certainly works for contact detection, but there is a lot of math involved to get it right, since you have to determine whether the current is due to a normal servo command or an obstacle. If the robot isn't moving the math would be much easier, but that is probably not the case. I use a predictive model of robot and joint state, then compute predicted torques, and those torques relate directly to what current I should be reading from the servos. I'm getting really good agreement between my model and measurements, except in some cases where my model diverges due to IMU issues (still resolving that).
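
The core comparison boils down to something like the snippet below (a simplified sketch, not my actual pipeline); the torque-to-current gain, idle offset and threshold are made-up numbers that would come from characterising your own servos.

```cpp
// Sketch of the residual test: expected current from predicted torque vs measured current.
// Kt_mA_per_Nm, idle_mA and contact_mA are made-up values; the real mapping depends on
// the servo's motor constant, gearing and friction model.
#include <math.h>

const float Kt_mA_per_Nm = 900.0f;   // assumed torque -> current gain
const float idle_mA      = 120.0f;   // assumed no-load current offset
const float contact_mA   = 250.0f;   // residual above which we call it contact

bool unexpectedContact(float predictedTorque_Nm, float measuredCurrent_mA) {
  float expected_mA = idle_mA + Kt_mA_per_Nm * fabsf(predictedTorque_Nm);
  float residual_mA = measuredCurrent_mA - expected_mA;
  return residual_mA > contact_mA;   // large surplus current => something is pushing back
}
```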

If you imagine walking in the dark, it's basically the same… and we want the robot to walk at a nice pace? lol. Even humans bumble around with this limited level of sensor input. So if you want to increase trajectory speed, you had better integrate trajectory planning with some kind of confidence value and probably branch your trajectory planning.

Sooo… tl;dr: if you can add more force/touch sensors, like capacitive ones… I highly recommend it. haha :slight_smile:


:heart: :heart: :heart:

Sounds good!

As I was saying, math is hard :lol: These days I would probably do it by trial and error. That is, start walking, see what ranges I get for normal motion, and then experiment with different things: if it hits an obstacle moving forward (or whichever direction) with the foot off the ground, maybe I'd detect that in the femur joint? If the leg is coming down, detect it in the tibia joint…

Good question. For something like a ToF sensor, it would be interesting to see where a good position might be, and likewise where the supporting electronics would go. It could be something like the LSS-2IO, or could be something smaller.

Something like the NeoPixel one some of us played with for the HROS1.

Although in this case you may not need NeoPixels, as you have color LEDs in each servo… Although these were sort of neat inside the 3D-printed hands, which then glowed the different colors…

Yep, I should probably get back to this… I have part of it in place using my older fixed-point math version; it would be better to go with a floating-point version and the like. The only problem is my math skills these days are really, really rusty… I had a math minor with my BS in CS, but that was several decades ago and I have not used much of it since… Sort of hoping that someone who remembers what an angle is :wink: might get some of this stuff working :smiley:

But now back to playing. Hopefully soon!


… can relate.

So can I! I have struggled with the math in robot dynamics too. I am a glutton for punishment though, and I just pushed through it. Just kept shining a light in the darkness until it mentally clicked. The whitepapers are loaded with symbols that everyone else seemed to know since birth (no legend)… so that makes it magnitudes harder. Eventually you learn what the hieroglyphs are and it's no big deal, but geez, you could have just said that lol.

I feel like I have all the data in good order, and I've been using a lot of trial and error to optimize the output based on the data, but it's hard to converge on a good solution with so many process variables to play with… so many levers and buttons to push, so to speak lol. So I'm looking into using a fancy new AI Lambda server I have available to me and throwing some machine learning at it. I am setting up a Gazebo sim, which requires the URDF and SRDF config to be in proper order. I am also seeing if OpenAI Gym might help. I played with a humanoid walking example in OpenAI Gym a few years ago and it worked well. That required URDF/SRDF too, so I'm going to see where this road takes me. I think it will take some time for initial setup, but once done it will be like rocket fuel for adding robot behaviors.


Quite a bit of a delay, but RobotShop should soon have stock of the Teensy 4.1
@xan @zenta Any update? Hope all is going well despite the pandemic.
@kurte breakout board’s ordered?


Hi Coleman and others,

Although it looks like the Netherlands is moving towards a second lockdown, all is well here. I’m working mostly from my home. I hope everybody is well and safe!

Unfortunately, I did not have the amount of 'robotics time' I wish I had. I've finished calibrating the servos. I discarded the limits for now. @cbenson, I will reply to your questions later (I did not forget them). I have the electronics set up to communicate with the Teensy 4.0. I'm using some SparkFun level shifters to convert between 5 V and 3.3 V. The next step is to get the two boards actually talking.

I hope to have an update soon!

Xan


Hi. Sorry for my absence. Several reasons: I've been away on a long business trip. Got sick for a week (not C-19). My workshop PC, which I replaced 3 months ago, suddenly started blue-screening and I had to send it in for repair (hardware failure). Also a bit distracted by other hobbies with my youngest son…

Hello All!

I am also doing OK. We are still behaving like everyone has C-19 and avoiding all contact… Hopefully one of these days we will have it under control and can loosen up some.

As I mentioned in the previous post, I have been sort of lazy about getting things done, as my math skills are not as good as they used to be. After all, isn't 1+1=3? :wink:

Yes - I have my own T4.1 carrier board. I need to get back to playing with it to remember which things I wanted to fix on it.

I have yet to fully populate it; for example, I have not hooked up the speaker. It did work talking to servos and USB host, and I hooked up one of the Adafruit ST77xx displays… But unfortunately I keep getting distracted.

Often to other projects with the Teensy… Or maybe ROS2 stuff (Turtlebot 3) or …

Hopefully soon we can get some baseline set of code working, that I can then have fun tweaking.


I have been busy with ROS2 and Gazebo simulation. I have the robot loaded in Gazebo. I am working on getting my joint controllers wired into the Gazebo simulation. Once this is done I can move on to optimizing the walking/motion parameters using a Q-learning algorithm.
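
For anyone who hasn't used it, the Q-learning part itself is just the standard tabular update rule; a toy version looks like the snippet below. Applying it to walking/motion parameters means defining the states, actions and reward first, which is the harder part.

```cpp
// Textbook tabular Q-learning update: Q(s,a) += alpha * (r + gamma * max_a' Q(s',a') - Q(s,a)).
// N_STATES / N_ACTIONS and the hyperparameters are arbitrary placeholders; the walking
// parameters would first have to be discretised into states and actions.
#include <algorithm>

const int   N_STATES  = 64;
const int   N_ACTIONS = 8;
const float ALPHA = 0.1f;    // learning rate
const float GAMMA = 0.95f;   // discount factor

float Q[N_STATES][N_ACTIONS] = {};

void qUpdate(int s, int a, float reward, int sNext) {
  float bestNext = Q[sNext][0];
  for (int i = 1; i < N_ACTIONS; i++) bestNext = std::max(bestNext, Q[sNext][i]);
  Q[s][a] += ALPHA * (reward + GAMMA * bestNext - Q[s][a]);
}
```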


@cmackenzie
Impressive stuff you got going there! :smiley:

@cmackenzie A complete presentation as an update - impressive. I had to look up a Lambda server's specs. Only around $1/4 million.
