Lynxmotion SES V2 Hexapod Robot

Welcome one and all! With the launch of the Lynxmotion Smart Servo (LSS) motors, it was only a matter of time before legged robots would be developed around them. This thread is intended to allow all those interested in the project to follow the design progress, offer insights and help in the development of these robots. The main objective is to develop and ultimately release both an 18 degree of freedom (DoF) “insect” style hexapod and a 12 DoF “insect” style quadruped using the LSS actuators and much of the SES V2 bracket and electronics system. Using a Raspberry Pi 4 as the “brain”, running ROS 2.0, we hope participants here will help create the most advanced, adaptive and responsive inverse kinematics system for this type of robot.


  1. GENERAL - Why only develop one of each robot?

It is difficult to support many different robots in terms of documentation, hardware and software (configurations) for what amounts to only minor differences in design. Other types of robots will be developed as well, but the focus of this thread is legged robots with three-degree-of-freedom “insect style” legs. Should the code allow for an octopod (eight-legged) design and allow for configurable parameters (body dimensions, leg dimensions, etc.), that would be great.

  2. GENERAL - License?

Yes, the software is being developed under the GPL 3.0 license.

  3. MECHANICAL - Why not create open hardware?

The official Lynxmotion designs and kits will make use of the Lynxmotion Servo Erector Set (SES) system to help ensure the robots can be assembled uniformly, and work as expected. Customers are free to create their own hardware, but custom systems are always more difficult to support. CAD files for more and more Lynxmotion parts, including the SES V2 brackets are available on the Lynxmotion Wiki.

  4. MECHANICAL - Can I use other RC or smart servos?

Using the Lynxmotion Smart Servos for the designs ensures everyone is “on the same page”, and development is not split between many different systems or communication protocols / methods. Unlike RC servos, the smart servos provide various forms of output / feedback, most notably position.

  5. FIRMWARE - Why isn’t the firmware used in the servos open source?

Considerable development and resources went into the creation of the Lynxmotion Smart Servos. If there are suggestions or questions regarding the inner workings of the servos and/or the firmware, we will do our best to answer them.

  6. SOFTWARE - Am I restricted to using ROS 2.0?

We anticipate that the most complex calculations for advanced motion and behavior will need to run on a Raspberry Pi which is running ROS 2.0, which is the focus here. However, there may be interest in creating a smart phone app, a wireless controller or other useful devices which might need to be programmed in other languages.

  7. GENERAL - I don’t have the hardware, but I would like to participate, what can I do?

There are many ways you can participate and contribute without having the physical hardware.
All of the software being used is free to download, and there is even a 3D simulation of the robot. Feel free to post and indicate what you would be interested in and passionate about working on, and we’ll see if it can be included. Ultimately, contributions which are well documented can be easily included, while only describing what you accomplished (but not how) does not really help anyone.

  8. GENERAL - Why not allow / use / implement ____ instead / alongside?

As with all projects, there are advantages and disadvantages to every choice, and even decisions which seem to offer only advantages might take longer to implement. We do not want this project to be closed to ideas, but without some focus, it is often harder to reach results. We’ll try to note any good suggestions for future consideration.

  9. SOFTWARE - How can I install and use the latest code?

The thread below provides the latest information on how to get up and running.



Reserved post for useful information / getting up to speed.


Custom Teensy 4 breakout board (designed by kurte). This board is not commercially available at this time, but if there are those who are interested in pursuing that project, please send a private message.

Reserved post for notifications / successes / suggestions / news etc.

Looks like this will be a lot of fun… Just bookmarked…


Hi All,

Here is the initial design I came up with but nothing is fixed and feedback is greatly appreciated…!! (I would add, required). This was cut with cheap / glossy 1.5mm G10 (fiberglass) for fitment testing.

I went for a simple leg design (but many options there) using the standard Lynxmotion parts.

I’ve also put another version on the Lynxmotion Wiki - Sample Leg A.

While I was assembling:

Once assembled:

Let’s start the fun :slight_smile:


Hey Guys!

The hex is looking Good! Do we already have a name in mind? :wink:

Clear description of the scope, Coleman. I think the new servos with feedback, together with the powerful Pi, will open more possibilities and will push the capabilities further than we could reach before. Let’s see what this puppy can do!

Hey @kurte, long time no see! I’ll send you a PM to keep this thread on topic.



No names in mind at all yet, same as the physical designs. Figured we’d compile a list of all naming suggestions and vote at the end?

Yes this should be a lot of fun. Will be interesting to see where all we take it.

Again, sorry if some of this is maybe a bit off topic, but it might help to look at starting points.

Over the last many years, most of the robotics stuff I have done has been using smart servos by Robotis, mostly with Trossen Robotics hexapods. (Actually, the last couple of years I have spent most of my play time working with Teensy micro-controllers by PJRC.)

Earlier I played around and ported the Phoenix code to run on Linux controllers such as RPI, Odroid, UP, BeagleBone black… My github project was/is always in a state of disarray but is up at:

Also during that time frame, another Trossen Robotics member created a monster hexapod and a ROS stack for it. His project for the ROS stuff is up at:

More on this hexapod at:
Note: a few of us worked with KevinOchs and made the code work on the PhantomX (Trossen standard hexapod) with either RPi or Odroid. There is another thread up on their forum if anyone is interested.

Now back on topic. This looks like it will be a lot of fun. I do need to become more familiar with these servos! Things like: what type of feedback can I get from them? I assume things like current position, maybe voltage, maybe temp? What about how much force or torque they are under…

Sounds like old times




  • Position in pulses or tenths of degrees
  • Voltage in millivolts
  • Temperature in tenths of degrees Celsius
  • Current in milliamps
    … and a lot of other queries (speed, origin offset, angular range, etc. etc.)

We’re working on calculated torque since there’s no “torque sensor” onboard. We’ve had success implementing CH and CL command modifiers, which cause the servo to halt or go limp if the current ever goes beyond the set value (a possible easy solution for terrain adaptation).
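To make the telemetry list above concrete, here is a minimal sketch of the LSS-style ASCII serial framing in Python. The `#<id><cmd>\r` request and `*<id><cmd><value>\r` reply shapes follow the Lynxmotion LSS documentation; the servo ID, port name and baud rate in the comments are placeholders, not project decisions.

```python
# Sketch of LSS ASCII framing: queries like QD (position, tenths of a
# degree), QV (millivolts), QT (tenths of deg C), QC (milliamps).

def build_query(servo_id: int, cmd: str) -> bytes:
    """Frame a query for one servo, e.g. build_query(5, "QD") -> b'#5QD\r'."""
    return f"#{servo_id}{cmd}\r".encode("ascii")

def parse_reply(raw: bytes) -> tuple:
    """Split a reply like b'*5QD900\r' into (servo_id, command, value)."""
    text = raw.decode("ascii").strip("*\r")
    i = 0
    while text[i].isdigit():          # leading digits are the servo ID
        i += 1
    j = i
    while j < len(text) and text[j].isalpha():  # letters are the command
        j += 1
    return int(text[:i]), text[i:j], int(text[j:])  # rest is the value

# With real hardware you would wrap this around pyserial, for example:
#   import serial
#   port = serial.Serial("/dev/ttyS0", 115200, timeout=0.05)  # placeholder port
#   port.write(build_query(5, "QD"))
#   servo, cmd, tenths = parse_reply(port.read_until(b"\r"))
```

A reply of `*5QD900` would decode to servo 5 at 90.0 degrees, matching the tenths-of-a-degree scaling listed above.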

We’re waiting for Matt Denton to come back from vacation and participate too (he was part of the LSS BETA)

Hi everyone!

LSS Servos
I have been working on the Lynxmotion Humanoid robot for about 2 months now and I can say they have come a long way since the initial beta. They are performing great in my humanoid robot so I don’t think you will have any issues getting good performance for the hexapod. They’ve put a lot of work into new commands for smooth motion and torque control. I especially like turning off the typical RC-like servo motion control and enabling the IIR motion filter. I can blast position updates and get better motion control for humanoid dynamic control. The motion is so much smoother! To illustrate, I coded one arm to follow the other (in limp mode) and there was no latency, it even mirrored the acceleration of gravity smoothly when I drop the arm at the end.

Joint Compliance
I believe there is no better user interface for a robot than direct human-robot interaction with compliance. Thanks to Sebastien for implementing some feature requests for torque control, I was able to create a Compliant Joint controller that even my 6-year-old naturally understood. :slight_smile: This also worked great for auto-detecting obstacles or itself! So even before any differential equations or Jacobians this robot is safe to operate without hurting anyone or itself. These features greatly improved compliance response (LSS commands in bold):

  • Disabled RC-like motion control (EM0), enabled IIR filter.
  • Using IIR filter pole count (FPC) to control smoothing of position updates. Makes a big difference! I also change this in realtime to adapt to conditions.
  • Controlling response stiffness (AS,AH) in real-time (ex. less jerk transitioning in/out of compliance modes)
  • Controlling max torque using MMD commands allows active compliance where the joint resists or supports itself while still being manipulated by human.
  • Reading back position and current. Current seems accurate and directly proportional to joint torque.
    Servo LED Legend: blue=holding, red=limp (passive compliance), pink=active compliance (powered but still compliant).
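As a rough illustration of the compliance scheme in the bullets above: the command mnemonics (EM0, FPC, AS/AH, plus L for limp and H for hold) come from the post, but the current threshold, filter/stiffness values and servo ID below are invented for the example, and the bus I/O is injected as callables so the logic is hardware-independent.

```python
# Illustrative only: mnemonics from the post; numeric values are made up.

def frame(servo_id: int, cmd: str) -> bytes:
    """LSS-style ASCII framing: '#<id><cmd>\r'."""
    return f"#{servo_id}{cmd}\r".encode("ascii")

def configure(send, servo_id: int) -> None:
    """One-time setup described above: disable RC-like motion control
    (EM0), set an IIR filter pole count (FPC) and stiffness (AS/AH).
    The numeric arguments here are placeholders, not tuned values."""
    for cmd in ("EM0", "FPC5", "AS-2", "AH2"):
        send(frame(servo_id, cmd))

def compliance_step(send, query_current_ma, servo_id: int,
                    limit_ma: int = 800) -> str:
    """One pass of a naive current-based compliance loop:
    go limp (passive compliance) when current spikes, else hold."""
    if query_current_ma(servo_id) > limit_ma:
        send(frame(servo_id, "L"))   # red LED in the legend above: limp
        return "limp"
    send(frame(servo_id, "H"))       # blue LED: actively holding
    return "holding"
```

In practice `send` and `query_current_ma` would wrap the serial bus; swapping them for stubs makes the loop testable on a desk.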

Humanoid Dynamic Control
I then added Gravity Compensation with an IMU, Kalman filter and Inverse Dynamics to better determine external forces for compliance by subtracting gravity forces. Not perfect yet, but it is able to hold its leg straight out, which shows the inverse dynamics is increasing torque to those joints, and it is still able to detect the force of my hand. It’s not detecting ground contact forces yet so it won’t stand on its own. I am now in the process of implementing the balance and walking humanoid dynamics.
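For a single joint, the gravity-subtraction idea above reduces to a one-liner: infer torque from motor current, then subtract the modelled torque gravity exerts on the link. This is a generic textbook sketch, not the humanoid's actual inverse-dynamics code, and the torque constant, mass and centre-of-mass distance are invented numbers.

```python
import math

def external_torque(current_a: float, angle_rad: float,
                    kt_nm_per_a: float, mass_kg: float,
                    com_dist_m: float, g: float = 9.81) -> float:
    """Estimate external torque on one joint (illustrative model).

    tau_measured = Kt * I        (torque inferred from motor current)
    tau_gravity  = m * g * r * cos(angle)   (link's own weight)
    angle_rad = 0 means the link is horizontal (worst case for gravity).
    """
    tau_measured = kt_nm_per_a * current_a
    tau_gravity = mass_kg * g * com_dist_m * math.cos(angle_rad)
    return tau_measured - tau_gravity
```

A horizontal link drawing exactly its gravity-holding current reports roughly zero external torque; a hand pushing on it shows up as the residual, which is what makes the "holds its leg out yet still feels my hand" behaviour possible.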


@kurte @xan @zenta @cmackenzie Regarding the hardware, is the design above good to go to get started? We want to ensure we’re all working on the same hardware. We’ll pack them up and ship them out when we have everyone’s “go”.

Good Morning,

My guess is the underlying hardware is probably good. Obviously, others here who have experience with these servos can probably give a much more definitive “that looks like it will work”…

I guess one of the more interesting questions will be what direction we take with this hardware in the software, what electronics and sensors and the like you are expecting to use, and whether this part will be standardized.

For example, I know you mentioned ROS2, which I played with a little… But as usual with ROS, I forgot everything…

A while ago I was trying to get myself back into ROS, and at the time felt like trying to understand ROS through KevinOchs’ ROS stack was probably not the best way to understand ROS, as it felt like he was trying to fit a square peg in a round hole… So I purchased a Turtlebot3 Waffle and later cobbled together most of a Burger to try things out… Unfortunately I got distracted back to trying to get the underlying electronics to work better and play with the underlying controller software, and did not spend enough time playing with ROS.

During that time I was working back and forth with some of the ROS people at Robotis, who worked on their TurtleBots plus wrote the ROS book you can download from:

So back to this subject. Questions include: are you using some other processor along with the RPi4 to control the servos and the like? Will this be a ROS node…

What IMU?

Is there any 3d mapping hardware like a LIDAR and/or 3d Camera, to use for mapping and the like?

And then hopefully someone there who understands enough about some of these things to hopefully help guide some of the ROS configuration files…

Also wondering which version of ROS2? What Host are you using? Windows? Linux?

Sorry, I know too many questions!


Questions are good!

Are you using some other processor along with the RPI4 to control the servos and the like. Will this be a ROS node…


What IMU?

Nothing in particular yet. T.B.D.

Is there any 3d mapping hardware like a LIDAR and/or 3d Camera, to use for mapping and the like?

Nothing planned yet, but open to ideas. We certainly have some preferred partners. It will depend on who is ready to put in the effort with this.

And then hopefully someone there who understands enough about some of these things to hopefully help guide some of the ROS configuration files…

Colin has learned a lot about ROS 2 and will hopefully be able to get everyone up to speed. @cmackenzie

Also wondering which version of ROS2? What Host are you using? Windows? Linux?

@cmackenzie - You’re now the “go to” for ROS 2 :slight_smile:


Hi @kurte! At the moment I am using the RPi4 but I also use my desktop Linux Ubuntu station to do a lot of ROS node development as well. This works great. I run the hardware related nodes like the joint controller and IMU on the RPi, then develop the humanoid dynamics nodes on my desktop. There are some advancements in ROS2 that make this smoother than ROS1. I can debug-cycle without touching the setup on the RPi. When desired I can push the node to run on the RPi and move on to the next node. I use CLion as my IDE. If I need to debug hardware related nodes then it’s easy to setup CLion to remote dev over ssh…this is not cross-compiling or remote gdb, just CLion running gcc/gdb/cmake/etc over ssh instead of local terminal. I get full debugging capability and the node is actually running on the RPi then.

IMU is Bosch BNO-055

I wasn’t a fan of ROS1. IMO it was too bloated for compact robotics. ROS2, on the other hand, I’ve fallen in love with. haha. I am using the latest ROS2 Eloquent. I have Docker images, but managed to install on Ubuntu19 with some work, so I am not using Docker anymore.

  • Simpler API. Pretty easy to code a new node.
  • With a little extra work you can create LifecycleNodes which give better control over initialization, configure, activate/deactivate, teardown…my joint controller has this so I can bring it up or down or reconfigure without restarting the ROS nodes on RPi (as mentioned above).
  • Efficient IPC and network communication. Doesn’t use XML-RPC over HTTP like ROS1 did.
  • Supports real-time
  • Has microROS for uCs like arduino or ESP

There are many existing ROS nodes for 3D cameras and mapping (octomap). We need to set up three things in ROS: a joint controller, a URDF file of the hexapod, and a motion control algorithm. I have the joint controller and IMU node from the humanoid already. URDF generation takes some Solidworks and some manual work. I am working on Motion/Balance Control in the humanoid now and it can also be used for the hex. There are also many other existing motion controllers for hexapods, and your Phoenix code may work here too - just wrap it in a ROS2 node. Once this is done, the rest should be off-the-shelf ROS stuff like octomap, vision sensors, IR, whatever.

Still there is a big learning curve to get used to ROS2. I took the Udemy ROS1 and 2 courses which helped, then a lot of youtube. For sure I am open to helping out getting up to speed.


Hi @kurte, I am thinking some more. If you have existing hexapod code, specifically motion control, that I can wrap in a ROS node I would like to try. Your code can output Cartesian coordinates or joint angles. My existing joint controller from the humanoid can interact with the servos given your controllers output.
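As a starting point for a motion controller that outputs joint angles, the classic 3-DoF insect-leg IK (coxa / femur / tibia) is only a few lines. This is the generic textbook two-link law-of-cosines solution, not the Phoenix code or anyone's production controller; link lengths are parameters and the sign conventions would need adapting to the actual servo mounting directions.

```python
import math

def leg_ik(x, y, z, l_coxa, l_femur, l_tibia):
    """3-DoF insect leg IK. Foot target (x, y, z) is in the leg frame:
    x out from the body, y sideways, z up (negative below the coxa).
    Returns (coxa, femur, tibia) angles in radians, elbow-up convention."""
    gamma = math.atan2(y, x)              # coxa rotation in the ground plane
    r = math.hypot(x, y) - l_coxa         # horizontal reach past the coxa
    d = math.hypot(r, z)                  # femur-joint-to-foot distance
    # Law of cosines in the vertical (r, z) plane:
    a2 = math.acos((l_femur**2 + d**2 - l_tibia**2) / (2 * l_femur * d))
    alpha = math.atan2(z, r) + a2         # femur angle from horizontal
    beta = math.acos((l_femur**2 + l_tibia**2 - d**2) /
                     (2 * l_femur * l_tibia)) - math.pi  # tibia, relative
    return gamma, alpha, beta

def leg_fk(gamma, alpha, beta, l_coxa, l_femur, l_tibia):
    """Forward kinematics with the same conventions, for sanity-checking."""
    r = l_coxa + l_femur * math.cos(alpha) + l_tibia * math.cos(alpha + beta)
    z = l_femur * math.sin(alpha) + l_tibia * math.sin(alpha + beta)
    return r * math.cos(gamma), r * math.sin(gamma), z
```

Round-tripping a reachable foot target through `leg_fk(leg_ik(...))` is a cheap regression test before any of this is wrapped in a ROS node.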


@xan @kurte happy to see you both here!
First, I feel very sorry for not being able to participate much during the Beta session. I’m really impressed by @cmackenzie’s work, love the smooth motions! How fast are you able to update the servo positions? Going the ROS2 + RPi4 route is probably the best way. Personally I have little to no experience with ROS, so it will be a very steep learning curve. Especially when free time is not my friend at the moment.

The hardware looks ok. Personally I prefer slightly more curved parts… My Beta hex with some 3D printed femur sections are still standing idle doing nothing. Last time I did anything I updated the firmware on the servos. If I recall correctly the pins have changed on the final version too?

Anyway, it would be fun trying to learn new stuff.


@zenta Happy to see you here too! Since the servos and electronics from that BETA test don’t play well with the production batch (everything changed enough that they’re not even worth using), perhaps you can leave that as a cool looking robot on a shelf? Your leg design reduces the number of brackets per leg from five to three, which would lower the MSRP given the cost of aluminum brackets. Very open to the design since the 3D printed part could be made easily out of G10 (potentially the same material as the body). Your base looks almost identical to the design at top. What we could do is proceed with the suggested design to start (again, not set in stone at all, just so everyone can get started), then play with the design, refine it, and send out any changes in hardware. This gives us time for you to design and for us to create the parts in G10. The servos make up the lion’s share of the price.

Think you’ll have time to participate? We won’t have a fixed deadline for this one, but we’ll eventually need to release a few robots :wink:


Hi @zenta. Thanks for the kudos! I’ve followed your designs for years…I love the hex wrapped up in the globe! Awesome!

Mirror video was running about a 30ms loop. Right now my control loop is 15ms, which was my target, so I stopped there. This is with the bus split into three…each leg on a bus, then the upper body. This is at 250k baud and I am pretty much saturating the bus with all the packets I am sending and receiving. The LSS servos are not a cause of latency, but Linux and hardware FIFOs certainly can be; most especially USB-serial devices, so stay away from these. The RPi4’s hardware serials are low-latency though and work great! Using Arduino or ESP devices is also good since they don’t have bulk FIFOs and give the same results as the RPi. Seb says I could run at 500k baud or higher and I will give this a try eventually. I did a bunch of tests and I have more write-up on this if interested, but this is the TLDR version.
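The saturation figure above is easy to sanity-check with back-of-the-envelope arithmetic. Assuming standard 8N1 serial framing (8 data bits plus start and stop, so 10 bits per byte on the wire):

```python
def bytes_per_cycle(baud: int, cycle_s: float, bits_per_byte: int = 10) -> float:
    """Usable bytes per control cycle on one serial bus.

    Assumes 8N1 framing: 8 data bits + start + stop = 10 bits per byte,
    and that the bus can be driven back-to-back with no idle gaps."""
    return baud / bits_per_byte * cycle_s

# At the post's numbers (250k baud, 15ms loop), one bus moves at most:
budget = bytes_per_cycle(250_000, 0.015)   # 375 bytes per cycle
```

With short ASCII packets in each direction per servo, commanding and querying every joint on a shared bus each cycle eats through 375 bytes quickly, which is consistent with splitting the servos across three buses and considering a higher baud rate.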

Building ROS2 from source has some issues in the docs…so before attempting, hit me up and I will send you my build notes/errata. Though, if you have Ubuntu18 you can simply install from the repo…but that is not available on Ubuntu19.


Hi @zenta,

Good to hear from you again! I like what you have done with the 3D printed parts. This makes the femur look more solid, and the cables can be routed more cleanly since they are on the body side of the joint. Additionally, I’m not familiar with the new SES brackets; do you think it might be possible to bring the coxa and femur joints closer together without running into the body or the servos themselves? This would best mimic a ball joint. (Just an idea)

@cmackenzie Impressive result to get the mirroring with such a small lag. I have no experience with ROS. I hope I can shoot you some questions when I’m stuck :slight_smile:


One of the problems I foresee is the lack of space for the Raspberry Pi & LSS-Adapter.
We made the mounting holes align with the Pi so both can be stacked but it makes for a really high “tower” of electronics.

Ideally I would like to see the electronics hidden / clean as well.
Maybe the only way is to make it bigger but let me know what you think.

On a side note, some of the features @cmackenzie is talking about are not yet in the official firmware release.