Lynxmotion SES V2 Hexapod Robot

And possibly less weight…? :slight_smile:

We ran many tests on a single, saturated bus with many servos at all the baud rates. For the most part, it all went well. Where we hit issues was with cable lengths, servo counts and higher baud rates all contributing to a worse overall signal and therefore packet loss.

We’ve certified the LSS to be able to run 36 servos on a single bus at 500 kbps using one LSS Adapter Board and up to 2 cables (with an LSS Power Hub in between) chained on each port to connect all the servos. A similar setup to the one below, but with more LSS!
image

You can most likely have more servos than this or use a faster baud rate, but probably not both. Past 500 kbps we started noticing packet loss with large numbers of servos. As a side note, we chose 36 as a target since it should be enough for a full humanoid with lots of DoF per limb. That being said, as @cmackenzie mentioned above, tight control loops require multiple buses anyway for adaptive control, so 36 LSS on one bus probably will not happen in most use cases.
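For a rough sense of scale: assuming ~10 bytes per ASCII position command and 10 bits per byte on the wire, 500 kbps works out to roughly 5,000 commands per second, i.e. on the order of 140 position updates per second for each of 36 servos, ignoring replies and inter-command gaps.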

1 Like

Indeed, the new release will come soon. That being said, two extra points to take note of:

  1. You can access the newest dev firmware versions in the “experimental” section (checkbox Exp. in LSS Config).
  2. As @cbenson said, the previous electronics/LSS (i.e.: beta versions) will be abandoned for now and therefore the new firmware is not available for them. If I remember correctly, version 367 is available for “type 1” (the beta hardware), but that’s probably the end for those servos.
1 Like

Good morning all,

@zenta - Great to hear from you. As always I like your designs and the curved setup has a lot of appeal! Hope you will have some time to participate in this.

@cmackenzie and others - I have zero experience with these servos so there will for sure be a learning curve. Most everything I have done with servos for the last many years is using Dynamixel servos, which run on half-duplex serial; typically I run at a 1 Mbps baud rate, but have experimented with 2 Mbps… Some of the controllers/servos go higher than this. So it will be interesting. It will also be interesting to see how overlapped your serial RX and TX will be, and how much data you will typically transfer.

Sorry if some of these ramblings don’t make sense. One (of many) things I am unsure of is how much of the smarts of the motion will typically be done in the main processor (RPi4), how much will be in the servo controller (ATmega328), and how much of it will be handled directly by the servos. As an example, take the ROS Hexapod Stack I mentioned. If my memory is correct, it had a fixed sinusoidal walking gait that it broke down into N parts. For each part, it then used the kinematics to convert this into servo positions. The driver code then converted the standard ROS coordinate units into servo units. Then the whole timing of this was controlled by the max servo delta from the last position: that is, it iterated moving each servo + or - one servo unit, doing a complete servo group move output, until the last servo was at the new position… Which worked fine for the owner, but I always wanted to change it, as you could not change anything about the walking gait (like step height or length) without throwing their timings out the window… All of the odometry stuff for ROS was done by actually timing how long it took the hexapod to walk some fixed distance… Note this did not take advantage of any of the smarts that newer servos now have built into them.

At different times I was in the process of trying to change the servo driver to be more deterministic, and work more like the SSC-32 in that you pass in the new positions and how long you want the move to take, and the servo driver code does the proper interpolation for all of the servos, reducing the issues involved in changing the walking gait. But I was pretty sure that these changes would not make it back into the official sources, so I sort of punted. Some of this is still up at: https://github.com/KurtE/hexapod_ros/tree/Servo_Driver_experiments
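Something like the following is what I had in mind — just a sketch, with setServo() standing in for whatever actual bus write is used:

// SSC-32-style group move: given target positions and a move duration,
// interpolate all servos linearly so they all arrive at the same time.
#include <stdint.h>

const int NUM_SERVOS = 18;
float current[NUM_SERVOS];   // last commanded positions (degrees)
float start[NUM_SERVOS];
float target[NUM_SERVOS];
uint32_t moveStartMs, moveTimeMs;

void setServo(int id, float degrees);  // placeholder for the real bus write

void beginGroupMove(const float newTargets[], uint32_t timeMs, uint32_t nowMs) {
  for (int i = 0; i < NUM_SERVOS; i++) {
    start[i] = current[i];
    target[i] = newTargets[i];
  }
  moveStartMs = nowMs;
  moveTimeMs = timeMs;
}

// Call at a fixed rate (e.g., every 20 ms); returns true while still moving.
bool updateGroupMove(uint32_t nowMs) {
  float t = (moveTimeMs == 0) ? 1.0f
            : (float)(nowMs - moveStartMs) / (float)moveTimeMs;
  if (t > 1.0f) t = 1.0f;
  for (int i = 0; i < NUM_SERVOS; i++) {
    current[i] = start[i] + (target[i] - start[i]) * t;
    setServo(i, current[i]);  // one position write per servo per tick
  }
  return t < 1.0f;
}

The nice property is that changes to step height or length fall out for free: any new targets and time just get interpolated the same way.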

Again, other code: as I mentioned, I do/did have a working Phoenix code base running on the RPi, actually a few different versions. However, I was always hoping that @zenta would have some more time and we could come up with a better version. My floating point version still had issues, as my math skills are beyond rusty. I had a math minor when I received my BS in Computer Science, but nowadays I don’t remember anything; I had to google just to do some simple things, like how to rotate a clock hand around a circle (translate points by an angle)…
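(For the record, the clock-hand thing is just the standard 2D rotation — something like:

#include <math.h>

// Rotate point (x, y) about the origin by angle theta (radians).
void rotate2d(float &x, float &y, float theta) {
  float c = cosf(theta), s = sinf(theta);
  float xr = x * c - y * s;
  y = x * s + y * c;
  x = xr;
}

…which is exactly the kind of thing I had to look up.)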

So again, a question of how much of the simple interpolation stuff will be done on the RPi, the servos, or the ATmega?

Note: The Linux (RPi) code base I have for Phoenix is mainly geared around the PhantomX using the Dynamixel servos. As it looks like I have not touched most of this in at least 4 years, I am pretty sure I have not updated this code yet to work with the Protocol 2 servos.

That is/was on my list of things to try for a newer PhantomX… But I dropped the ball on this after I had some minor health issues last year (Hernia surgery)…

Probably one of the first things I would like to do when I get a set of your servos is a quick and dirty version of the Phoenix code that works with them. Probably first would be to run it on some Arduino board (like a nice and shiny Teensy 4.1) to see how everything works.
Then try to migrate it toward the RPi… then ROS2… It will be very interesting to see which special features of a hexapod integrate well with ROS2.
At least my impression of ROS was that it was very much geared toward something like a ROVER, which drove straight, then turned, and had no idea of walking sideways or changing its configuration. Things like: if you come to a door and your default configuration is such that you don’t fit, can you change the angle of the legs such that you are narrower and make it through? Or if there is a stick in your path, can you change your leg height to step over it?

Again, maybe lots of stuff to experiment with. Hopefully ROS2 is set up to allow lots of newer things to happen.

Sorry again about this rambling

2 Likes

{…} I have zero experience with these servos so there will for sure be a learning curve

We’ll get the setup out to you ASAP!

I’ll just drop this here… :crazy_face:

[videos: compliant-joint “mirror” demo]

1 Like

Hi @kurte,

Servo Control

It will also be interesting to see how overlapped your serial RX and TX will be, and how much data you will typically transfer.

I’ve had some success here. My LSS library will send multiple commands together. This gets rid of some latency and also fills any FIFOs on the wire. If you are sending action commands you can just send them all regardless of servo ID, but when querying you can only send multiple packets to the same servo: a single servo will respond sequentially, but if queries are sent to multiple servos simultaneously the replies could conflict. The servo will respond deterministically within 800 microseconds, so my library assumes the response is waiting in a FIFO after 1200 microseconds and sends the next request anyway. The LSS lib compiles on Linux or Arduino devices.
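Roughly, the query loop looks like this — a simplified sketch, not the actual library code; only the 800/1200 µs timings and the “#<id>QD\r” position query come from the protocol:

#include <Arduino.h>

const uint32_t RESPONSE_DEADLINE_US = 1200;  // servo replies within ~800 us

void queryPositions(HardwareSerial &bus, const uint8_t ids[], int count) {
  for (int i = 0; i < count; i++) {
    bus.print('#');
    bus.print(ids[i]);         // servo ID, printed as decimal
    bus.print("QD\r");         // query position in degrees
    delayMicroseconds(RESPONSE_DEADLINE_US);  // reply is now in the RX FIFO
  }
  while (bus.available()) {    // drain replies ("*<id>QD<value>\r"), in order
    char c = bus.read();
    (void)c;                   // parsing elided
  }
}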

One (of many) things I am unsure of is how much of the smarts of the motion will typically be done in the main processor (RPi4), how much will be in the servo controller (ATmega328), and how much of it will be handled directly by the servos?

I went with a layered approach. I wanted compliant joints, and I needed a lot of wiggle room to play with the compliance algorithm controller-side, so the only feature requests I made were simpler building blocks. @scharette was very helpful in this regard, both talking it out together and adding a few things. He added some additional current control features for me, on top of what he was already working on in regards to current control, smooth motion, response dampening, etc., which I took advantage of. I then wired these up in a state machine algo in my LSS library, and the mirror video above shows the result. I then wrapped that lib in a ROS2 node called the “joint controller”. You can then send either Cartesian coordinates or joint coordinates to a ROS2 topic to move the joints, and also subscribe to joint angles or Cartesian joint locations. I continue to build on this…
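In rough strokes, the node shape is something like this (a bare-bones sketch, not the actual code; topic names and the servo write are placeholders):

#include "rclcpp/rclcpp.hpp"
#include "sensor_msgs/msg/joint_state.hpp"

class JointController : public rclcpp::Node {
public:
  JointController() : Node("joint_controller") {
    cmd_sub_ = create_subscription<sensor_msgs::msg::JointState>(
        "joint_command", 10,
        [this](sensor_msgs::msg::JointState::SharedPtr msg) {
          for (size_t i = 0; i < msg->name.size(); i++) {
            (void)i;  // map joint name -> servo ID and write over the LSS bus (elided)
          }
        });
    state_pub_ = create_publisher<sensor_msgs::msg::JointState>("joint_states", 10);
  }
private:
  rclcpp::Subscription<sensor_msgs::msg::JointState>::SharedPtr cmd_sub_;
  rclcpp::Publisher<sensor_msgs::msg::JointState>::SharedPtr state_pub_;
};

int main(int argc, char **argv) {
  rclcpp::init(argc, argv);
  rclcpp::spin(std::make_shared<JointController>());
  rclcpp::shutdown();
  return 0;
}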

Inverse Kinematics/Dynamics

[snip] it had a fixed sinusoidal walking gait that it broke down into N parts. For each part, it then used the kinematics to convert this into servo positions. The driver code then converted the standard ROS coordinate units into servo units

So ROS2 already uses the Orocos/KDL kinematics and dynamics library to translate from joint angle space to Cartesian coordinate space in a built-in “Robot State Controller” node. My compliant joints work well for human-robot interaction when gravity is not pushing on the joint, but to really get human-like compliance I would have to add gravity compensation. This is where I add a higher layer of control to the stack. Since Orocos is already linked into ROS, I just created another node called Humanoid Dynamics Model, included the Orocos header, and loaded the robot description via the robot_description URDF topic; Orocos does most of the setup work. I pass in the force of gravity, any other force wrenches, and some orientation vectors from the ROS TF topic (robot pose, IMU, etc.) and Orocos will compute the torque forces on the joints due to gravity. I can then push this back to the joint/compliance controller to configure the joints and Bob’s your uncle… the joint is able to detect human interaction even with its leg held straight out, already under heavy tension due to gravity. Before gravity compensation the leg would fall, thinking a human was pushing it down.

This gravity comp node is not specific to humanoids; it reads the robot configuration from the URDF, so you could add on to this node with more inverse kinematics or inverse dynamics models. Bonus: there is very little math to do! :slight_smile: No Jacobians at least. haha.
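For the curious, the core of the gravity-torque computation with KDL boils down to something like this (a trimmed-down sketch; the link names are placeholders and the ROS topic plumbing is omitted):

#include <kdl/chainidsolver_recursive_newton_euler.hpp>
#include <kdl/tree.hpp>
#include <kdl_parser/kdl_parser.hpp>

// Joint torques due to gravity alone, at configuration q (zero vel/accel).
KDL::JntArray gravityTorques(const std::string &urdf_xml, const KDL::JntArray &q) {
  KDL::Tree tree;
  kdl_parser::treeFromString(urdf_xml, tree);      // model from robot_description
  KDL::Chain chain;
  tree.getChain("base_link", "foot_link", chain);  // placeholder link names

  KDL::ChainIdSolver_RNE solver(chain, KDL::Vector(0.0, 0.0, -9.81));

  unsigned n = chain.getNrOfJoints();
  KDL::JntArray qdot(n), qdotdot(n), torques(n);   // velocities/accels stay zero
  KDL::Wrenches f_ext(chain.getNrOfSegments());    // no external wrenches
  solver.CartToJnt(q, qdot, qdotdot, f_ext, torques);
  return torques;
}

Feed those torques to the compliance controller as the “expected” load, and anything beyond it gets treated as human interaction.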

The driver code then converted the standard ROS coordinate units into servo units. Then the whole timing of this was controlled by the max servo delta from the last position. That is, it iterated moving each servo + or - one servo unit, doing a complete servo group move output, until the last servo was at the new position… Which worked fine for the owner, but I always wanted to change it

Yeah, this seems really convoluted. I think we are on the same page here. I remember the Phoenix IK: it was able to translate and rotate the robot body’s center, and the legs adapted as they should without slippage. I can’t imagine this not being possible with Orocos, though I have to admit I am working on this problem at the moment. The problem comes down to affixing the foot joints that are in contact with the ground: those ground contacts must be the fixed points and thus the root of the IK node. Orocos has ChainIKSolver, ChainIDSolver and TreeSolver… I think the TreeSolver does this sort of multi-contact solving. At the moment, I am looking into some other more advanced humanoid-optimized controllers from Crocoddyl/Pinocchio and the Robot Control Toolbox. Both of these libs solve for multi-contact rigid bodies using multiple-shooting algorithms and Model Predictive Control (MPC). Are you familiar with these techniques? I could use some tips if you are. I am up to my neck in white papers and my math is rusty too. lol. Both libraries have some interesting videos of humanoids, quadrupeds and even quadcopters doing some crazy tricks. With MPC and Stack-Of-Tasks (SoT, OpenSOT) we can issue rough walking patterns and the MPC and SoT will execute them while maintaining balance or other constraints.

Happy Valentine’s Day…MoveIt!2 Tomorrow!!
There is also MoveIt!2 being released tomorrow for ROS2! I’ve been looking forward to checking this out! It was highly regarded in ROS1 for motion planning. I don’t think it is as advanced as the MPC techniques I mentioned above (it uses the OMPL solver internally), but it is also extensible so it probably could use MPC, and as a bonus it has existing (and nice) RViz interfaces for joint manipulation. It is also intended for walking bots, not just rovers.

It will be very interesting to see what special features of a Hexapod integrate well with ROS2? At least my impression of ROS was it was very much geared toward something like a ROVER.

Yeah, you might need to throw out what you know of ROS for ROS2. :slight_smile: I wasn’t impressed with ROS1 and it did seem geared heavily towards fixed bases. It’s not really ROS specifically that did this, but all the ecosystem development that went with ROS was geared towards this. They are more agnostic now and there is a greater number of people pushing legged ROS robots forward.

Things like if you come to a door and your default configuration is such that you don’t fit, can you change the angle of the legs such that you are narrower and make it through… Or if there is a stick in your path, can you change your leg height to step over it. [snip] Hopefully ROS2 is setup to allow lots of newer things to happen.

Yes, if you build/configure your robot this way. It’s not a ROS2 restriction; it’s a function of the IK/ID solver and motion planning, which you can write yourself or take from something existing, such as MoveIt!2. There is not a lot of choice here yet for humanoids in ROS2 (more choice for hexapods, I see), but this is where I thought that if you could port your existing Phoenix motion planning code to ROS2 you’d have your advanced motion planning, but also access to much more, like 3D point cloud sensors, Octomap mapping and avoidance, etc. Definitely, ROS2 is much more open to legged robots!

REP Documents
FYI there are REP-xxx documents, much like Internet RFCs, that describe joints, naming conventions, message policies, etc. for robots. There are REPs for wheeled robots, humanoids, world-robot reference frames, odometry and much more. This is how the ROS ecosystem inter-operates across vendors: by defining standardized message passing interfaces. For example, I am all over REP-120, 103 and 105 for my humanoid.

Finally… :slight_smile:
I would really like us to work together. I am still heavy into the humanoid for now, but a lot of what I have done is easily usable in other legged robots. The LSS library for the servo control loop, the Compliant Joint Controller and the Gravity Compensation are agnostic to humanoid or multi-legged robots, so these are available to you with just a change to some ROS yaml files and a different URDF file. I would really welcome any improvements as well, as there is lots to do yet. If you want to give ROS2 a try I’d be happy to webex or hangout chat and get you and @zenta up to speed. I won’t be offended if you decide to go your own way either, but don’t be afraid to use me as a resource. I look forward to working with the two of you! :smile:

Colin

3 Likes

Building ROS2 on Ubuntu 19.10 (includes RPi4)

Using these instructions as a base:
ROS2 Eloquent Installation Guide

Preamble
I just noticed the installation guide for the ROS2 Eloquent release is up. I had to wrangle with an old ROS2 install doc before and there were issues. In both install scripts they first install some packages via the apt tool, then follow up with some Python packages via pip install. The two main issues I had were with “lark-parser” and “colcon-common-extensions” not being found in the apt repo, and the rosdep tool later not finding them. These packages are also in python-pip though. You can first follow the new install manual, but if you are getting missing-package errors or whatnot, you can get these packages via pip instead and tell rosdep to ignore references to them as dependencies (you’ll install via pip, so it’s safe to ignore them). These package issues are likely to still occur on Ubuntu 19 but not 18.

My Install on Ubuntu 19
Ubuntu 19 doesn’t have the python3-lark-parser, python3-ifcfg or python3-colcon-common-extensions packages, so we install via pip instead. At the “Install Development Tools and ROS tools” section, remove the python3 packages “colcon-common-extensions, lark-parser” from the apt install command and instead install them using pip, as they do with many other packages immediately following the apt install.

Install via PIP
$ sudo python3 -m pip install lark-parser ifcfg
$ sudo python3 -m pip install -U colcon-common-extensions

RosDep errors
Rosdep expects lark-parser to be installed via apt, but we installed it via pip; we can tell rosdep to skip it as a dependency:

Modified rosdep command
$ rosdep install --from-paths src --ignore-src --rosdistro eloquent -y --skip-keys "console_bridge fastcdr fastrtps libopensplice67 libopensplice69 rti-connext-dds-5.3.1 urdfdom_headers python3-ifcfg python3-lark-parser"

After Installation

After you get ROS2 installed (probably in /opt/ros2/eloquent) you need to set up the default ROS workspace:

$ source /opt/ros2/eloquent/install/setup.bash

To run your first test, there are a bunch of examples in /opt/ros2/eloquent/src/ros2/examples. You can run an example using the ros2 command.

$ ros2 run <package-name> <binary-program-name>

I get the package name from the examples package.xml file, and the binary program name from the CMakeLists.txt file in the add_executable() macro. So for example, in terminal 1 run:

$ ros2 run examples_rclcpp_minimal_subscriber subscriber_member_function

In a second terminal, run:

$ ros2 run examples_rclcpp_minimal_publisher publisher_member_function

In a third terminal, query the ROS2 system for some info:

$ ros2 topic list
$ ros2 node list
$ ros2 param dump --print /minimal_subscriber

Now go run some more examples, maybe run the TurtleSim, or create your own node using the ros2 pkg create command.

2 Likes

Thanks @cmackenzie - There is a lot of good stuff here, which I (and I suspect most of the others as well) need to look over and start to get set up.

Also may need/want to come up with some standard setup to recommend for doing the ROS2 stuff.

For example, my main machine is a Windows 10 machine. It is my understanding that ROS2 (unlike ROS classic) works with Windows, but probably there are still lots of things that work better with Linux. So my old dev machine is set up with Ubuntu 18.04 (the current LTS). I know the next LTS will be coming out in a couple of months (20.04), but I am not sure how long it would be before ROS2 is supported on it. So again it will be interesting.

But I personally wonder if this should be the first step with the hexapod. That is, personally I would rather start off with some Arduino setup, like we did for the Phoenix or T-Hex, and have an updated Arduino code base that works with this.

That is, I wonder if there will be a reasonable percentage of people who will resist the ROS2 learning curve and simply want to do their own thing… Yes, I do think there will be many who may want to move up to something like ROS2, but for me, I would like to see different stepping stones to get there.

For example, I think it would be great if we could start off with some form of the Phoenix code base (hopefully something like @zenta’s version that uses floating point and the new walking stuff…), then transition to hosting a lot of this code on the RPi4, and then move up to ROS2. But I don’t know if others feel the same way.

But assuming yes - what hardware setup makes sense for the Arduino route? I personally believe the processor needs to be more powerful than the AVR-based Arduinos, preferably one that supports floating-point math… I personally like working with PJRC boards, like the T4, but I know there are others.

Thoughts? Am I barking up the wrong tree?

Quick question: are the LSS servos compatible with 3.3V TTL?

Two-part question: Will the servos run OK if the TX pin outputs 3.3V, or does it need to be converted up to +5V?

Second part: does the servo actively drive its signal line to +5V, or is it open drain (or some other similar name), where it is assumed that the host has pulled the signal up high and the servos only pull it down for zeros? Or do I need to make sure that the RX pin on the host handles +5V?
Again, if I try with a T3.2/T3.5 these are 5V tolerant, but the T4 or T3.6 are not… so I would maybe need a level shifter…

  1. The LSS are intended to work with 5V voltage levels and might not work as they should with 3.3V. We actually tested them with 3.3V and we experienced some issues.
  2. The Rx pin of the servo doesn’t drive the signal. The Rx pin of the servo’s MCU is driven through a tri-state single bus buffer that uses CMOS technology powered at 5V. It is rated for a minimum high-level input voltage of 0.7 x Vcc (5V) = 3.5V, so a 3.3V high is below the guaranteed threshold. Therefore, using a 3.3V/5V level shifter is needed.

Also may need/want to come up with some standard setup to recommend for doing the ROS2 stuff. For example my main machine I use is a Windows 10 machine. It is my understanding that ROS2 (unlike ROS classic) works with Windows,

There seem to be many people running it on Windows on YouTube, and it is supposed to work, but I have never tried. Other possibilities, if you have difficulty, are Docker or Windows Subsystem for Linux (WSL). Other members of my team (not related to robotics) use WSL and it works really well. WSL runs an Ubuntu distribution, so it should be a simple setup for ROS2.

For Docker, I have some prepared images that have the ROS2 core, RViz and Gazebo, and include the NVidia OpenGL/CUDA Docker extensions. The only issue there (small) is that you have to properly set up bridge networking if you want ROS2 in your Docker container to see ROS2 on the RPi. Also, I was using remote X Windows to open X apps from the Docker container… you’d need to install X Windows tools on Windows to do this.

But I personally wonder if this should be the first step with the Hexapod?

Your call. :slight_smile: If you can keep Arduino-specific code isolated out of your core, then perhaps we can later wrap your code in a ROS node with minimal work. I use lambdas, classes, templates, plain C, etc. in my Arduino- and Linux-compatible LSS lib, so there are no restrictions on language aspects; just stuff like Arduino Serial and print calls needs to be isolated in platform-specific source files.
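For example, something along these lines (illustrative only, not the actual LSS lib layout):

// Core code talks only to this tiny transport interface...
struct LssTransport {
  virtual void write(const char *data, int len) = 0;
  virtual int read(char *buf, int maxLen) = 0;
  virtual ~LssTransport() {}
};

// ...and each platform supplies its own implementation:
#ifdef ARDUINO
// arduino_transport.cpp wraps HardwareSerial here
#else
// linux_transport.cpp wraps a termios file descriptor here
#endif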

What hardware setup makes sense for Arduino setup? I personally believe the processor needs to be more powerful than the AVR based Arduinos, preferably ones who support floating point math… I personally like working with PJRC boards, like the T4, but I know there are others.

I like the PJRC boards; I have a few Teensy boards here as well. I use a lot of ESP8266-based boards (NodeMCU firmware based) and I love the ease of use and on-board WiFi, but they don’t have as much GPIO as the Teensy. I also have some of the new Sipeed MAiX boards, which are RISC-V and have an on-board neural net fabric that is TensorFlow compatible. I think this would be great for trained walking algorithms. Setup for MAiX is not as easy as for Arduino-compatible PJRC, ESP and similar boards though, so I shelved my plans on MAiX for down the road. MAiX does have Arduino-based firmware… the firmware and docs were sketchy last I checked, but maybe they’ve worked out the kinks by now.

1 Like

Further to Brahim’s comments, you should use the discrete transistor level shifters. I had the same issue and was going to use some TI TXB0108-type shifters until Brahim burst my bubble: those TI buffers have too much capacitance. :frowning: Booo TI. His tests show the discrete transistor level shifters are much better.

2 Likes

Thanks @cmackenzie and @bdaouas - I had a feeling that 5V was maybe required for the servo TTL levels. I thought I would ask, as again the Dynamixel servos also spec 5V but work reasonably well at 3.3V…

@cbenson - As I mentioned earlier, sometimes I get stuck in the weeds :wink:

Again, sorry in advance if I am asking questions that have been asked before, or making suggestions on product differences that were discussed and dismissed for technical or business reasons.

I may be wrong, but I don’t see any products yet that allow quickly and easily connecting these servos to a 3.3V processor such as a Teensy or RPi?
Obviously I can easily use the Sparkfun level shifter that was linked, or the more expensive Adafruit version.

But I wonder if you have considered making something like your power hub with the level converter built in? Not sure if you would have the host provide the +5V (high-level VIN) or have a small voltage regulator built onto the hub that converts the servo voltage to +5V.
(I don’t think the LSS Adapter Board does this?)

Controlling servos from a host such as the RPi: I believe you are using the Arduino board?

If I am understanding this product correctly, you either use it as an ATmega328 or you use it as a USB-to-serial adapter, but not both? That is, for example, you cannot have the RPi do the main stuff and send packets through this adapter to the servos while the processor on the board does some of the smarts?

Again, sorry to keep bringing up Dynamixels, but they are what I am most familiar with. I wish, at a minimum, you had some controllers that could do similar things to the USB2Dynamixel, or the simpler 3rd-party USB2AX, which I think is now retired.

The USB2AX is/was an ATmega32U2, which has a hardware UART AND USB built in, so it allowed the controller to do additional things with the packets passed to it. Example: suppose you wish to get the current position of the 18 servos of your hexapod. How do you do it? Currently, I believe you have to send out individual packets to each servo over USB; when the USB-to-serial adapter receives one, it outputs the data over the TTL serial port to the servo, waits for the servo to send back the data, and then sends the data back to the RPi, again with whatever USB delays added in. The USB2AX had the ability to say “give me the position (set of registers) for the following servos”, where it would do the communications with each of the servos and then package up the responses to send back to the host, thus reducing the number of USB packets being sent back and forth. Also, the advantage of having the USB built into the processor, instead of an external USB-to-serial adapter chip, is that you don’t have the delay associated with the adapter having to send the data to and from the processor over a serial port.
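In rough pseudo-C, the adapter-side aggregation looked something like this (queryServo() and usbWrite() are just placeholders for the TTL round trip and the USB reply):

#include <stdint.h>

int queryServo(uint8_t id, uint8_t *reply, int maxLen);  // TTL bus round trip
void usbWrite(const uint8_t *data, int len);             // single USB reply

void handleBulkQuery(const uint8_t ids[], int count) {
  uint8_t out[256];
  int pos = 0;
  for (int i = 0; i < count; i++) {
    // The per-servo round trips happen here on the MCU, not over USB,
    // so the host pays for one USB transaction instead of `count` of them.
    pos += queryServo(ids[i], out + pos, (int)sizeof(out) - pos);
  }
  usbWrite(out, pos);  // one combined reply back to the host
}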

Another feature I wanted, and at one point had in a firmware version for the USB2AX, was the ability to tell the servos their next position without the timing of the command being influenced by the USB communications. That is, if you are thinking about an SSC-32 group move, the ability to send down the next group move that the SSC-32 will start at the specified time, or when the current one has completed… Why? With my earlier experiments running some hexapods using different Linux boards (RPi, BBB, Odroid, UP…), I never felt like the motions of the legs were as smooth as when the timings were controlled by a microcontroller. The USB timing delays were not consistent, especially if other things were happening on the other USB devices or the like. I did have some better success when I used the hardware TTL serial of the RPi…

It would be easy to build a quick and dirty adapter to do something similar and be fully Arduino compatible, with something like the Arduino Pro Micro.

But personally I would prefer a simple ARM-based setup such as a Teensy, or some equivalent of the Robotis OpenCM9.04.

Again sorry, I know I am probably just rambling here (sort of off the deep end), especially since I have not tried any of this out yet, and hopefully you already have better solutions.

Kurt

Hi @Kurte. On the humanoid, I struggled getting the 18 servos to update fast enough on Linux over USB. Again, not an LSS servo issue; it's just the nature of USB, and the isochronous transfer protocol is a real killer for low-latency comms! FTDI did a really good doc on this. If you want low-latency control I would shy away from having any USB in the mix, period. :slight_smile:

I saw no difference between TTL serial on the RPi or on an Arduino (I used a Teensy for testing, actually); both had best-case latency. A possible downside on Arduino is that if the transmission is blocking IO, it could leave little time left for algorithms. I didn’t pursue this further though.

The LSS Adapter does have one TTL shifter on it; it is on the Bluetooth socket. So you can directly wire a Teensy in there and set the LSS Adapter switch to Bluetooth. I use this, plus two other shifters like the Sparkfun ones, for the extra LSS buses. I desoldered the LSS PCB headers on the old beta LSS Adapter (hehe) and soldered those onto the TTL shifters. With a bit of pin tinkering I was able to solder them directly, then added some 0.1" header pigtails to the other side. It would be nice to have this as an LSS add-on; like the 2IO board, it could screw onto the servo bracket mounting points.

I don’t control my servo loop update timing. I generally have about 4 ms of jitter, which would be high under normal circumstances. Visually, I don’t notice any jitter on the servos. This is 100% because of Seb’s recent work on the new motion mode and IIR filter (the EM0 and FPC commands). Before that, there was considerable jitter visible. Take a look at the difference (no other changes besides the EM and FPC modes):

Default mode - EM1 and no IIR

Here it’s smooth as butter, but by their very nature IIR filters add some command latency; I turned the IIR count up high (12) to test.

The porridge is just right: EM0 + FPC3… smooth but no discernible latency
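For reference, these are plain ASCII commands on the bus; a hypothetical session for servo ID 5 (values are just examples — check the LSS command docs for specifics):

#5EM0\r   (disable the stepped motion profile)
#5FPC3\r  (set the position filter count to 3)
#5D900\r  (then stream positions; D takes tenths of a degree, so 900 = 90.0°)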

1 Like

Actually, the LSS Adapter Board does have a 3.3 V DC on-board regulator and level shifters. They are used for the Bee socket. So, I guess you can use those as an “off-label” use? :stuck_out_tongue: Just don’t forget your common ground! :wink:

Edit: it seems I was beaten to it :stuck_out_tongue:

@xan @zenta Do you have a preference for how to get started? The team is discussing internally and there seem to be two main options:

  1. Teensy 4.0 & Arduino
  • Advantages: The 4.0 is Arduino compatible and pretty powerful, so others might have an easier time getting started. Might be the fastest way to get the robot walking. If KurtE is up to modifying his original Teensy breakout for the 4.0 and include logic level shifters, we are in a position to produce a few in-house, on a single board.
  • Disadvantages: Current Phoenix code doesn’t use equations (coordinate based), so creating adaptive walking and more advanced functionality might be harder. The Teensy 4.0 is only available from one manufacturer and has a pretty specific form factor.
  1. ROS 2.0 & Raspberry Pi.
  • Advantages: With ROS 2.0 it’s about as good as it gets for advanced functionality.
  • Disadvantages: About as complex as it gets in terms of setup and understanding.

Neither solution has a clean electronics setup, and both require 3.3V. The Teensy only needs one logic level converter, whereas Colin has found three are ideal for the Raspberry Pi 4. We’re brainstorming a few potential prototype boards, like KurtE’s Teensy breakout (but including level shifting), or a small LSS-compatible module which includes one or two level shifters and would be more general purpose. We feel as though an LSS-compatible, custom “HAT” will need to be in the works either way, which handles logic level conversion and power and replaces the LSS Adapter for more complex robots. Either way, we would like to get you the hardware sooner rather than later.

2 Likes

@cbenson - I personally think in the end you are going to want both!

That is, for simple stuff and for someone getting started in robotics, I personally believe you are going to want a reasonably easy setup that they can understand, whereas advanced users are maybe going to want ROS…

Also, it may just be my take, but ROS seemed all geared toward being a rover, so again, for many things, if a rover could not do it, you did not do it… Now maybe only the developer of the previous hex ROS stack believed that :wink: And hopefully that has changed or is changing.

Again, for my own playing, I will probably adapt a Teensy 4 to try out some of the stuff. But one of the nice things about an Arduino setup is that in theory one could go with a Teensy, or one could try something like the Arduino Portenta when it comes out:
https://store.arduino.cc/usa/portenta-h7

Again, it will be interesting to see how well the RPi4s work. As I mentioned in a previous post, I ran into issues with robots not moving as smoothly on the RPi as with microcontrollers… @cmackenzie’s work on smoothing out the humanoid looks promising.

Again, in some previous Linux-based robots, I (we) ran into these problems with USB latency. This was especially true when you then added other USB (or camera) activity, as I believe at least on the RPi2 all the USB (and camera) stuff funneled into one place (channel?), so you were pretty limited on bandwidth… Maybe the new RPi4 has improved this? During that time a few of us moved over to using ODroid (XU3 or XU4) boards as they had better USB handling. I also later went to using UP boards, as they are x86, again with better USB, and you could run normal 64-bit x86 Linux installs… This was also partially because I had an external IMU that connected over USB, and at the time I believe its node was running Python (using some real simple Python serial setup), which at least at the time ate most of one of the CPU cores…

When I was using the RPi2, and some with the RPi3, I did get slightly better, more consistent timings when I went through the hardware serial port. However, I did notice that on the RPi3 the hardware serial port by default was not the one on the expansion connector; it was instead used by Bluetooth. But you could reconfigure it such that the expansion connector was the hardware serial port… I have not played enough yet with the RPi4 to know if it is like the RPi3 in this regard or not.
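If I remember correctly, the reconfiguration on the RPi3 was done with a device-tree overlay, something like this (the overlay is named disable-bt on recent firmware, pi3-disable-bt on older releases):

$ sudo sh -c 'echo dtoverlay=disable-bt >> /boot/config.txt'
$ sudo systemctl disable hciuart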

Again should be fun.

@cbenson and all - Sorry, I don’t mean to hog the conversation, especially since there are many of you who have done great stuff and are more up to date on your setups and the industry… So again I hope you don’t mind if I throw out a few more.

There is a third option - which is a combination of both.

That is, if you look at the TurtleBot3 by Robotis, they originally started off with the Intel Joule processor connected up to their OpenCR board. When Intel dropped the Joule they went with the RPi3… Likewise, Trossen Robotics was working on a version of the TurtleBot2, again with a Joule and their ArbotiX Pro; when the Joule was dropped they went to an Intel NUC, and I am not sure what they are using now for their secondary…

Now back to the TB3 - as I mentioned, I purchased an RPi TurtleBot and built up most of a Burger. The Burger was mainly set up to play along with a few of the ROS2 developers at Robotis.

They have their OpenCR board (32-bit ARM Cortex®-M7 with FPU, 216 MHz, 462 DMIPS).
With this setup, they have the OpenCR board set up as a ROS node; I believe it still does all of its communications through the RPi. This node both subscribes to several ROS topics and publishes several as well, including things like IMU (it has one on board), plus odometry, joint state, …

I know that they were (are?) working on another setup, I believe they were calling it XEL, and they were setting up the nodes to be more independent, where each of the major pieces had a LAN-type connector and could work as a normal node; they also had a few other nodes which worked over their servo-like connections. Again, I do not know the current state of these, or if it makes sense to create a secondary board, for example with a Teensy 4.1 (which will probably release later this spring/summer: same processor as the T4, lots more flash memory, set up so you can have secondary memory, same form factor as the T3.6)… Maybe add an IMU, maybe set it up to have some standard controller, so that, like the TB3, you can drive it around… But again, it would be set up for the RPi to send it ROS2 instructions, like where to go…

Again, you could choose many different options for which processor to use… But it would be nice if it had things like floating point…

@cbenson, I have a background in industrial automation. There, systems use dedicated hardware depending on the type of response times needed. For the last robot I was working on (years ago already :roll_eyes:) I planned the same setup: the SSC driving the servos in fast control loops; an Arduino doing the IK as quickly as possible and updating the SSC with new positions; and additionally a Raspberry Pi (2?) doing the more powerful, but less time-critical, stuff like processing camera data and AI. The Arduino could be fed by a controller or by the Pi, both giving the same controls, like walking directions.

Price-wise, this is of course not the way for this hex. And, with the new servos, it might be a good idea to have a more advanced controller take care of the “walking engine”. Since we get feedback, we can do more complex things which help with terrain adaptation, and maybe we can take dynamics into consideration as well. It all depends on how advanced we want to make the walking platform.

Back to topic:
To just do IK and let the hex walk with a controller, the Arduino could do the job, and would be quickest up and running. Doing more advanced things can work as well, but might run into limits.

To have a more future-proof platform, I would go with the Pi. Having the Pi in place will open more possibilities, but also more complexity.

I would personally go with the Pi to see if we can push the limits and bring the platform to a higher level. With some good tutorials, this could be made into a “medium/high” entry-level platform as well.

I agree with KurtE and think that both solutions might serve their unique purpose for customers.

Ok, I hope that made sense. :sweat_smile:

1 Like