For startup, on some other hexapods, I did something like move all of the servos to their startup positions, taking a second or so to get there… On some arms, I ran into issues where they could get into trouble depending on what positions the servos were in at power off. That was especially true with RC servos with no feedback, as you could not query where they were… So in those cases I had it first move to some neutral safe position and then to the start position…
It will be interesting to try out some of the different ways to detect the ground and any other obstructions.
It is good that the servos have the ability to set these limits, such that the main processor does not have to query them over and over just to detect a problem.
Back with the HROS1, I know a few of us experimented with using FSR sensors to detect when the two feet hit the ground. If I remember correctly we had four sensors per foot (one at each corner).
Renee and I played around with a simple board (either a Teensy 3.x or an Adafruit Trinket Pro), where we set up the board to connect into the DXL servo chain, each one having its own ID, and we set up logical registers to read the sensors… (They also had a NeoPixel whose colors we could set…) Obviously the Dynamixels are a little different than the LSS servos, but one could probably do a similar thing.
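If anyone wants to play with the FSR idea, the sensor-reading side is trivial; here is a minimal sketch of just that part (not our original code; the pins and threshold are placeholders you would tune per sensor and voltage divider):

```cpp
// Minimal FSR foot-contact sketch (not the original HROS1 code): four
// FSRs, one per foot corner, each through a voltage divider to an analog
// pin. Pin choices and threshold are hypothetical placeholders.
const uint8_t FSR_PINS[4] = {A0, A1, A2, A3};
const int CONTACT_THRESHOLD = 200;  // raw ADC counts; tune per sensor

void setup() {
  Serial.begin(115200);
}

void loop() {
  int total = 0;
  for (uint8_t i = 0; i < 4; i++) {
    total += analogRead(FSR_PINS[i]);
  }
  bool footDown = (total > CONTACT_THRESHOLD);
  Serial.printf("sum=%d contact=%d\n", total, footDown);
  delay(20);
}
```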
I was also thinking about experimenting with some Time of Flight (TOF) sensors like the VL53 or VL62… where maybe you could detect how far the foot was off the ground before you started to lower it. Not sure how well they would work, but it could be a fun experiment.
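For a quick bench test of that idea, something like the Adafruit_VL53L0X library would make it pretty painless (this assumes a VL53L0X specifically; how you would actually use the readings in the gait is the experiment part):

```cpp
#include <Adafruit_VL53L0X.h>

// Bench test for a VL53L0X ToF sensor over I2C. What counts as "close
// enough to the ground" would be determined by experiment.
Adafruit_VL53L0X lox;

void setup() {
  Serial.begin(115200);
  if (!lox.begin()) {
    Serial.println("VL53L0X not found");
    while (1) {}
  }
}

void loop() {
  VL53L0X_RangingMeasurementData_t measure;
  lox.rangingTest(&measure, false);     // pass true for debug output
  if (measure.RangeStatus != 4) {       // 4 == phase failure / out of range
    Serial.printf("distance: %d mm\n", measure.RangeMilliMeter);
  }
  delay(100);
}
```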
Back to the topic at hand: hoping the CL / CH might be sufficient for contact detection, but the LSS-2IO would allow additional sensors to be connected per foot. Just brainstorming, but I wonder if adding sensors to the knee (facing the foot, to get an idea of distance) would help keep the sensors safe and be easier to mount.
As you said (Kurte), first step is simply getting it to walk.
Current detection certainly works for contact detection, but there is a lot of math involved to get it right, since you have to determine whether the current is due to a normal servo command or an obstacle. If the robot isn't moving the math would be much easier, but that is probably not the case. I use a predictive model of robot and joint state, then compute predicted torques, and the torques relate directly to what current I should be reading from the servos. I'm getting really good agreement between my model and measurements, except in some cases where my model diverges due to IMU issues (still resolving that).
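In code, the core residual test is actually tiny once the predicted torques exist; a sketch of the idea (the torque-to-current constant Kt and the threshold are stand-ins you would identify per servo):

```cpp
// Residual-based contact test: compare measured servo current against the
// current predicted from the dynamics model. Kt (current per unit torque)
// and the threshold are placeholders, identified per servo.
struct Joint {
  float predictedTorque;   // N*m, from the predictive dynamics model
  float measuredCurrent;   // A, read back from the servo
};

bool contactDetected(const Joint& j, float Kt /* A per N*m */,
                     float thresholdAmps) {
  float predictedCurrent = j.predictedTorque * Kt;
  float residual = j.measuredCurrent - predictedCurrent;
  // Only an unexpected *excess* of current suggests an obstacle.
  return residual > thresholdAmps;
}
```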
If you imagine walking in the dark, it's basically the same… and we want the robot to walk at a nice pace? lol. Even humans bumble around with this limited level of sensor input. So if you want to increase trajectory speed, you had better integrate trajectory planning with some kind of confidence value, and probably branching trajectory planning.
Sooo… tl;dr if you can add more force/touch sensors, like capacitive ones… I highly recommend it. haha
As I was saying, math is hard :lol: These days I would probably do it by trial and error. That is, start walking, see what ranges I get for normal movement, and then experiment with different situations: if the foot is off the ground and it hits an obstacle moving forward (or whichever direction), maybe I will detect this in the femur joint? If the leg is coming down, detect it in the tibia joint…
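Something like this would be enough to collect the "normal" ranges while it walks. A sketch assuming the Lynxmotion LSS Arduino library's getCurrent() query; the servo IDs and serial port are placeholders for whatever the real leg wiring is:

```cpp
#include <LSS.h>

// Log min/max current per joint while the hexapod walks normally, to pick
// contact thresholds empirically. Assumes the Lynxmotion LSS Arduino
// library; servo IDs and serial port are placeholders for one leg.
const uint8_t NUM_JOINTS = 3;
LSS joints[NUM_JOINTS] = {LSS(1), LSS(2), LSS(3)};  // coxa, femur, tibia
int minCurrent[NUM_JOINTS], maxCurrent[NUM_JOINTS];

void setup() {
  Serial.begin(115200);
  LSS::initBus(Serial1, LSS_DefaultBaud);
  for (uint8_t i = 0; i < NUM_JOINTS; i++) {
    minCurrent[i] = 32767;
    maxCurrent[i] = -32768;
  }
}

void loop() {
  for (uint8_t i = 0; i < NUM_JOINTS; i++) {
    int c = joints[i].getCurrent();   // mA
    if (c < minCurrent[i]) minCurrent[i] = c;
    if (c > maxCurrent[i]) maxCurrent[i] = c;
    Serial.printf("J%u: %d mA (range %d..%d)\n", i + 1, c,
                  minCurrent[i], maxCurrent[i]);
  }
  delay(50);
}
```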
Good question. For something like a TOF sensor, it would be interesting to see where a good position might be, and likewise where the supporting electronics would go. Could be something like the LSS-2IO. Could be something smaller.
Something like the NeoPixel board some of us played with for the HROS1.
Although in this case you may not need NeoPixels, as you have color LEDs in each servo… These were sort of neat inside the 3D-printed hands, though, which then glowed in the different colors…
Yep, I should probably get back to this… I have part of it in place, using my older fixed-point math version; it would be better to go with a floating-point version and the like. Only problem is my math skills these days are really, really rusty… I had a math minor with my BS in CS, but that was several decades ago and I have not used much of it since… Sort of hoping that someone who remembers what an angle is might get some of this stuff working.
So can I! I have struggled with the math in the robot dynamics too. I am a glutton for punishment though, and I just pushed through it. Just kept shining a light in the darkness until it mentally clicked. The whitepapers are loaded with symbols that everyone else seemed to know since birth (no legend)… so that makes it magnitudes harder. Eventually you learn what the hieroglyphs are and it's no big deal, but geez, you could have just said that lol.
I feel like I have all the data in good order, and I've been using a lot of trial and error to optimize the output based on the data, but it's hard to converge on a good solution with so many process variables to play with… so many levers and buttons to push, so to speak lol. So I'm looking into using a fancy new AI lambda server I have available to me and throwing some machine learning at it. I am setting up for a Gazebo sim, which requires the URDF and SRDF config to be in proper order. I am also seeing if OpenAI Gym might help. I played with a humanoid walking example in OpenAI Gym a few years ago and it worked well. That required URDF/SRDF too, so I am going to see where this road takes me. I think it will take some time for initial setup, but once done it will be like rocket fuel for adding robot behaviors.
Quite a bit of a delay, but RobotShop should soon have stock of the Teensy 4.1. @xan @zenta Any update? Hope all is going well despite the pandemic. @kurte breakout boards ordered?
Although it looks like the Netherlands is moving towards a second lockdown, all is well here. I’m working mostly from my home. I hope everybody is well and safe!
Unfortunately, I did not have the amount of 'robotics time' I wish I had. I've finished calibrating the servos. I discarded the limits for now. @cbenson, I will reply to your questions later (I did not forget those). I have the electronics set up to communicate with the Teensy 4.0. I'm using some SparkFun level shifters to convert between 5V and 3.3V. Next step is to get the two boards actually talking.
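For that last step, I will probably start with a dumb ping/echo test through the level shifters before putting real traffic on the link. A minimal sketch, assuming the link is a plain UART on Serial1 of each board:

```cpp
// Ping/echo sanity test for the inter-board UART link (through the
// 5V <-> 3.3V level shifters). Flash this on one board and a simple echo
// loop on the other; the port and baud are assumptions for this setup.
void setup() {
  Serial.begin(115200);    // USB debug
  Serial1.begin(115200);   // link to the other board
}

void loop() {
  Serial1.write('P');      // ping
  delay(10);
  while (Serial1.available()) {
    Serial.printf("echo: %c\n", Serial1.read());
  }
  delay(500);
}
```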
Hi. Sorry for my absence. Several reasons: I've been away on a long business trip. Got sick for a week (not C-19). My workshop PC, which I replaced 3 months ago, suddenly started blue-screening and I've had to send it in for repair (hardware failure). Also been a bit distracted by other hobbies with my youngest son…
I am also doing OK. We are still behaving like everyone has C-19 and avoiding all contact… Hopefully one of these days we will have it under control and can loosen up some.
As I mentioned in the previous post, I have been sort of lazy about getting things done, as my math skills are not what they used to be. After all, isn't 1+1=3?
Yes - I have my own T4.1 carrier board. I need to get back to playing with it to remember which things I wanted to fix on it.
I have yet to fully populate it; for example, I have not hooked up the speaker. I did get it talking to servos and USB host, and hooked up one of the Adafruit ST77xx displays… But unfortunately I keep getting distracted.
Often by other projects with the Teensy… Or maybe ROS2 stuff (Turtlebot 3) or…
Hopefully soon we can get some baseline set of code working that I can then have fun tweaking.
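Speaking of the ST77xx display: for anyone wanting to try the same hookup, bring-up with the Adafruit library is only a few lines (the pin assignments here are placeholders, not my carrier board's actual wiring):

```cpp
#include <Adafruit_GFX.h>
#include <Adafruit_ST7735.h>
#include <SPI.h>

// Quick bring-up of an ST7735 display on the Teensy. CS/DC/RST pin
// choices are placeholders; adjust to your wiring and panel variant.
#define TFT_CS   10
#define TFT_DC    9
#define TFT_RST   8

Adafruit_ST7735 tft = Adafruit_ST7735(TFT_CS, TFT_DC, TFT_RST);

void setup() {
  tft.initR(INITR_BLACKTAB);     // ST7735 variant; match to your panel
  tft.fillScreen(ST77XX_BLACK);
  tft.setTextColor(ST77XX_GREEN);
  tft.setCursor(0, 0);
  tft.print("Hexapod online");
}

void loop() {}
```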
I have been busy with ROS2 and Gazebo simulation. I have the robot loaded in Gazebo and am working on getting my joint controllers wired into the Gazebo simulation. Once this is done I can move on to optimizing the walking/motion parameters using a Q-learning algorithm.
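For reference, the tabular Q-learning update itself is small; the real work is discretizing (or function-approximating) the walking parameters into states and actions. A sketch, with the table sizes and hyperparameters as placeholders:

```cpp
#include <algorithm>
#include <array>

// Tabular Q-learning update, for reference. NUM_STATES/NUM_ACTIONS and
// the hyperparameters are placeholders; a real gait optimizer would map
// the walking parameters into this state/action form first.
constexpr int NUM_STATES = 64;
constexpr int NUM_ACTIONS = 8;
std::array<std::array<float, NUM_ACTIONS>, NUM_STATES> Q{};

void qUpdate(int s, int a, float reward, int sNext,
             float alpha = 0.1f, float gamma = 0.95f) {
  float bestNext = *std::max_element(Q[sNext].begin(), Q[sNext].end());
  // Q(s,a) <- Q(s,a) + alpha * (r + gamma * max_a' Q(s',a') - Q(s,a))
  Q[s][a] += alpha * (reward + gamma * bestNext - Q[s][a]);
}
```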
@cmackenzie Any news to report from testing the humanoid on the server?
@xan Hope the Netherlands (and the rest of the planet) takes this seriously and doesn’t put all hope in a vaccine. Last time you thought you might have an update soon - any progress? Keep at it!
@zenta Hopefully all better now? Did you take your son trick or treating? Looking forward to an update!
The first big milestone seems like it would be successfully porting the Phoenix code to make this new platform walk (no sensor feedback, terrain adaptation etc.). Sounds like a plan?
Hi @cbenson. Yes, I'm editing a new video as we speak. Was hoping to post today. I helped out the ROS2 Controls community with two PRs which got approved this week, and a third I assisted on in the Gazebo-ros community is pending but required the first two to complete first. In my own branches, though, I have a working ROS2 + ros2_controls + Gazebo + lss_humanoid setup.
I feel like Sylvester the cat though. It's been a take-a-step, rake-to-the-face, take-another-step, hit-a-pole, get-new-teeth, hit-a-wall kind of thing. I have a comical video to show where I am now. Nevertheless, it's progress towards the goal.
Lots of progress here. The ROS2 Controls + Gazebo stuff is working. I identified an issue in my collision models that needs fixing. At the end I review the overall architecture. Most of this map is already done; what remains is configuring and wiring this stuff together properly, plus integrating an ML reinforcement-learning training algorithm.
Good to know you suspect where the issue is coming from. Servos take up physical space, so having three degrees of rotation at a point results in… breakdancing!
Thanks. Finally got my PC back. Currently working at home since my youngest son is in quarantine due to a local C-19 case in his class. I think we are all getting tired of this…
Need to dig into everything again to make the hex walk. Can't have a dust collector sitting on my desk. Might start looking into what @kurte has on his GitHub again.
@zenta - Hope your son, and all of you as well, are doing well and did not catch it. Looks like the fall surge is starting in all of our areas. While for a long time it looked like most other countries had a better handle on it, it now looks like we are starting a major surge. Unfortunately in the US we never got our levels down to a reasonable point, so I suspect it is really going to spike!
Personally we are still hunkered down here. As I mentioned earlier to you, we have not really been anywhere since March, i.e. we have not been in any stores, restaurants, etc. since then. Hopefully at some point we will get our act together and get this under control.
@cbenson - I have had different boards working, from a simple one that just provides Arduino Uno headers (such that you can plug in your LSS adapter board) and has TTL level converters for both the servos and the XBee. It's been awhile since I played with it.
I also have another board that has servo connections. Of course, anyone can do a real simple quick-and-dirty setup with the T4 or T4.1 and something like SparkFun or Adafruit TTL level converters, with wires to hook up to your servos. You can then use a couple of your servo hubs to wire up to all of the servos.
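With the Lynxmotion LSS Arduino library, that quick-and-dirty setup boils down to a few lines to sanity-check a servo (Serial1, the baud, and servo ID 1 are assumptions for however you wire it):

```cpp
#include <LSS.h>

// Quick-and-dirty servo check for a Teensy wired to the LSS bus through
// a TTL level converter. Serial port and servo ID are assumptions.
LSS myLSS = LSS(1);

void setup() {
  Serial.begin(115200);
  LSS::initBus(Serial1, LSS_DefaultBaud);
  myLSS.move(0);   // center the servo (position in tenths of a degree)
}

void loop() {
  Serial.printf("position: %d\n", (int)myLSS.getPosition());
  delay(500);
}
```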
Sorry I have not gotten much done lately, as it is more fun when others are playing along… And Math is Hard.
So I've been having more fun doing things like working with the T4.x and TOF (Time of Flight) sensors, and playing with a simple cheap camera to see how to use the Teensy 4.1's capabilities to read in images, using the CSI subsystem, which does almost all of the work in the background using DMA. That requires hooking up extra RAM to the T4.1 so the camera can buffer two 640x480 16-bit images (they won't fit in normal memory).
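The extra-RAM part is the easy bit: with the PSRAM soldered onto the T4.1's pads, Teensyduino's EXTMEM attribute places big buffers in external memory. For example (sizes matching the two 640x480 16-bit frames mentioned above):

```cpp
#include <Arduino.h>

// Two 640x480 16-bit frame buffers (~600 KB each): too big for the
// Teensy 4.1's internal RAM together, but fine in soldered-on PSRAM.
// Teensyduino's EXTMEM attribute places them in external memory.
EXTMEM uint16_t frameA[640 * 480];
EXTMEM uint16_t frameB[640 * 480];

void setup() {
  Serial.begin(115200);
  Serial.printf("frameA at %p, frameB at %p\n", (void*)frameA,
                (void*)frameB);
}

void loop() {}
```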
Again not very Hexapod specific, but having fun.
Also, again, I personally am totally open to other hardware setups, like ones to run ROS (probably ROS2). I have an RPi4 8GB with SSD sitting here that would be a nice platform for setting up an interesting hexapod.
I am ready and eager to get started again, especially once there are others to bounce ideas and code off of.