Autonomous Rover

This is a robot I'm working on. It has a Zed depth camera for SLAM, plus a LIDAR-Lite, a RoboClaw motor controller, two NVIDIA Jetson boards, GPS, and an IMU. All the tubing is carbon fiber; the rest is aluminum Actobotics parts. The Jetsons are running Ubuntu and ROS. Just fired up the motors tonight for the first time. Had one wired backwards of course :) But outside of that all is well. I'm just learning ROS so it's slow going right now, and I'm having to hack a driver I found for the RoboClaw to work with my setup.

 

Griz


This is a companion discussion topic for the original entry at https://community.robotshop.com/robots/show/autonomous-rover

Nice design. The carbon

Nice design.  The carbon fiber with the aluminum actobotics parts looks really cool.

I hadn’t heard of “Zed depth camera” so I looked it up and came up with https://www.stereolabs.com/.  Is that what you are using?  I am curious as to how that plays out with what you are doing.  

I am also playing with ROS on Linux right now.  I am using a Kinect to do the point cloud and trying to get the navigation and mapping modules to work with it.  Nice thing about a Kinect is that a refurbished one is $25 from gamestop.com, although it requires a 12V power supply.  There is a learning curve with ROS, no doubt.  I had Linux running in VMware Player (virtualized on my Windows box) and it took several weeks of scratching my head every night after work to finally find someone who said that VMware and the Kinect don't mix.  I just found an old Windows laptop and last night I wiped it and installed Ubuntu, so I'm starting over.  It won't take long to get everything installed and working again.  A little frustrating.

That said, I am surprised more hobbyists aren’t biting the bullet and going to ROS.  To go beyond the simple robot that can wander around your living room bumping into things, it seems such a logical step.  Please let us know how your project goes.  If you have any questions on ROS, I might be able to answer although certainly no expert.  

Regards,


Bill

 


Impressive Sensor and Processor Setup

This is a VERY impressive sensor and processing setup, but you know that.  I only write this to say “Awesome Job” and so others might stop and pay more attention.  This is probably the most powerful setup I have seen on a hobby robot.  You had me at Jetson…without the 3D camera and the Lidar.

I would love to hear more about how you are using and processing the video streams and your experiences with the Jetsons, Lidar, and ROS.  I hope you choose to post more and often.  This is a fascinating project.  I will be following.

I’m hoping to stick a toe in the water of 3D by using 2 Pixy cameras on movable ears that pre-process the video and output blob locations at 50FPS.  I realize that is laughable compared to this, but it fits in a small bot and lets me walk before I run.

Keep up the inspiring work.  I’ll be following this project.

Regards,

Martin

FLIR Camera

Just a thought, but putting a FLIR camera on this bot would seem like a nice addition to distinguish people/animals from the rest of the 3D image.

https://www.sparkfun.com/products/13233

 

FLIR pixel count

I suppose it is the germanium that makes IR cameras so expensive, hence why even a 60x80 sensor is $260.

But you can do a lot with 4800 pixels. A house fly has around 4000 and absolutely no learning capacity, yet it is able to dodge most anything and has a basic set of escape skills based on its vision.

A good idea, I think.

There are some things you can do with a standard PIR and some (actually a lot of) HDPE, but the FLIR is a better idea.

Since I am prone to throw out ideas, whether they are wanted or not, might I suggest that the UV end of the spectrum is just as interesting. Many things that normal vision misses are highlighted in UV, which is why so many animals have vision that extends there. The goal is largely to pick things out of the clutter. Also note that many sensors will see somewhat into IR: not far enough to pick out low temps, but enough to really highlight skies, trees, and water. A red filter, and removing the IR filter from the sensor if there is one, will get you there. IR pics:

https://www.google.com/search?site=&tbm=isch&source=hp&biw=1280&bih=667&q=infrared+pictures&oq=infrared

re: Pixel Count

I use one now that is only 16x4 = 64 pixels.  It's not enough to make photos, but it's enough to find targets of interest and aim at them or track their movement.
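The find-the-hot-spot trick on an array that tiny is simple enough to sketch. This assumes a row-major flat frame of temperatures (sensor layout and frame shape are assumptions, not any particular part's API):

```python
def hottest_pixel(frame, width=16, height=4):
    """Find the hottest pixel in a flat 16x4 thermal frame.

    Returns (col, row) so a pan/tilt mount can aim at the target.
    The frame is assumed row-major; adjust for the real sensor layout.
    """
    assert len(frame) == width * height
    # Index of the maximum temperature in the flat array.
    index = max(range(len(frame)), key=lambda i: frame[i])
    return index % width, index // width
```

Feed it successive frames and the (col, row) drift gives you the target's motion, which is all you need for tracking even without enough pixels for a picture.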

I’ve never looked into UV (other than looking at the sun).   Any good sensors for UV you recommend?

Jetson

  I had no idea that the Jetsons were built on the Tegra K1, an impressive and interesting bit of hardware. You seem to have the hardware the rest of us dream about and are putting it to good use.

  Hadn’t heard of the Zed either, but it is a sensor whose time has come. I’m just toying around the edges with a Kinect…

Yep pretty cool chip

The K1’s GPU makes this all possible; you can't run the Zed software without it. There is a hitch that NVIDIA is working on right now: some problem with the USB port, so it won't work in the 2K or 1080 modes. I don't use either one for SLAM, so I'm not worried about it, and I figure that since this is the only chipset that will work with the Zed they are going to be pretty interested in solving it. They are sweet little boards. A little pricey, but you get good bang for the buck. I'm into photography too, so eventually the top of the rover will have a standard set of camera rods on it to mount DSLRs and such. I have the parts to do it from a shoulder rig I put together a few years back. I switched over to stills a few years back, so it's just gathering dust.

Finished hooking up everything and hiding all the wires today. It's quite crowded now; good thing the final base will be 6 inches longer than the prototype. The base needs to be much thicker. I just picked up whatever plastic I could find at Home Depot for the prototype. I plan on doing the next one with a 3D modeling program and having it laser cut. I am not a very good fabricator: nothing is square if I build it, lots of "extra" holes, etc. Integrating parts or repurposing things I do well, but not sawing and drilling. So in order for it to look decent it needs to be done with a laser cutter. On the final version the base will be black and the top a slight tint. Haven't decided if I'm going to put sides on the computer area yet or not.

Figured out how to do away with the suspension as well. If I fix one side of the outrigger to the main shaft and keep the bearings in the other side, I can put a gear on the center shaft and use a motor and the IMU to keep it level. So it looks like I'll end up with two motor controllers: one for the wheels and one for the lidar turntable and the leveling device.
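The IMU-plus-motor leveling loop described above is a classic closed-loop control problem. A minimal PD sketch (the gains, command range, and update rate are placeholders to tune against the real hardware):

```python
class LevelController:
    """PD loop: drive the leveling motor to hold base pitch at zero.

    Gains and max_cmd are made-up starting points; tune on the bot.
    """

    def __init__(self, kp=2.0, kd=0.5, max_cmd=1.0):
        self.kp, self.kd, self.max_cmd = kp, kd, max_cmd
        self.prev_error = 0.0

    def update(self, pitch_deg, dt):
        # Error is how far the base is from level (0 degrees).
        error = 0.0 - pitch_deg
        derivative = (error - self.prev_error) / dt if dt > 0 else 0.0
        self.prev_error = error
        cmd = self.kp * error + self.kd * derivative
        # Clamp to the motor controller's command range.
        return max(-self.max_cmd, min(self.max_cmd, cmd))
```

Each IMU update, feed the measured pitch and the elapsed time in, and send the clamped output to the leveling motor; because one side of the outrigger is fixed to the shaft, the motor has a solid reference to push against.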

All the wiring except the two wires going to the on-off switch is in split tubing. The wires going to the two motors on the opposite side from the motor controller snake down through the base support tubes, across the underside of the base, and back into the outrigger on the other side. So no bare wires anywhere on the bot now.

I have one of the RasPi cameras that does IR, and an old Canon with the filters removed that I play with from time to time. But since this bot has the Zed and plenty of compute power, I'm going to use the cuDNN library to implement a deep neural network to recognize faces and objects from the Zed's feed. Planning on putting some small LED headlights on the front outrigger elbows; nice flat face and pre-drilled holes that will work great, I'm thinking. I have a light sensor onboard to switch over from the Zed to the Lidar exclusively when it gets too dark for the Zed, so it would be no problem to turn on an IR cam at that point as well. I have the two webcams I started with, so maybe I can remove the filter from one of those and use the pair as regular and IR sensors. Also have a CMUcam Pixy color-following camera on a 2-axis setup that will probably end up on the bot as well.
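One gotcha with a light-sensor switchover like that is dusk, when the reading hovers right at the threshold and the bot flaps between sensors. A bit of hysteresis fixes it; a sketch, with placeholder thresholds to calibrate against the actual light sensor:

```python
class SensorSelector:
    """Pick the active ranging sensor (Zed camera vs. lidar) from ambient
    light, with hysteresis so the choice doesn't flap at dusk.

    The threshold values are invented; calibrate them on the real sensor.
    """

    DARK_THRESHOLD = 80     # below this raw reading, too dark for the Zed
    BRIGHT_THRESHOLD = 120  # must climb back above this to re-enable the Zed

    def __init__(self):
        self.active = "zed"

    def update(self, light_level):
        if self.active == "zed" and light_level < self.DARK_THRESHOLD:
            self.active = "lidar"
        elif self.active == "lidar" and light_level > self.BRIGHT_THRESHOLD:
            self.active = "zed"
        return self.active
```

The same transition event could also toggle the IR cam on, since the dark branch fires exactly when it becomes useful.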

Finished compiling the RoboClaw driver I found today as well, so I have all the software compiled for ROS for the bot now. Probably missed something, but rosdep says everything is a-ok. We'll see; I'm not expecting too much initially. ROS is complicated, with a steep learning curve in the beginning and so many packages. It takes a long time just to become familiar with them and pick the stuff you need. But I love the way it works and all the tools it provides. Robots can be an absolute bear to debug without proper tools.

This is the first scratch-built robot for me; I built a Heathkit Hero 2000 back in the 80's. I was at my brother's house in Houston while he was here on leave and picked up an Arduino just to play with. Less than a year later I'm up to my eyeballs in this embedded stuff. I used to play with it years ago, but it was so difficult having to wire-wrap everything, and once the speed of the processors came up, the noise made it practically impossible unless you made boards. Never had much luck etching and drilling boards, so I quit tinkering.

It's been a ton of fun building it. I'm sure it will be an even bigger kick when it's wandering on its own. Can't wait.

 

Griz

NVidia and Bus

I’ve been a fan of NVidia since the Tegra 4. I think the competition with Qualcomm, and when Google dropped them, really caused them to kick mobile development up a notch. I had signed on as developer for the K1 but never got one, more power to you. 

The serial bus is never as easy as it would seem. The Raspberry Pi 1 never got its serial working right. The Pi 2 is much better but can't handle both streams off a Kinect, and the DUE clone I have here needed help with pullups.

The key to everything is the depth map, which you well recognize, and I think we all appreciate your efforts to move the art along.

Back in the day I used to develop circuit boards on the kitchen floor. I remember a friend kicking over the tray of Xylene and everything, floor, shoes and all started melting big time. (I note that NASA used to use Xylene to temporarily waterproof the shuttle, until they figured out that was why the tiles were falling off!) Just as well circuit boards aren’t made that way anymore… 

All in all, with the technology available and much of it cheap, it is the right time for robotics. Go run with it.

64 px

What sensor is that (16 x 4)? Might be in my rather low price range…

Sorry that I don't have a source for UV sensors, although as I think about it now, the low end (IR) is more useful for much of what we are trying to do.

Wikipedia:

https://en.wikipedia.org/wiki/Ultraviolet_photography

There are probably a lot of D40s and D70s on eBay, but those are full-size (the body, not the sensor) 35mm-type cameras, a load to carry on a small robot.

Working on a display for the Pi2:

http://www.ebay.com/itm/181719162102

My intention there is to serve media up on the robot (which the Pi 2 is good at), so that it could show pics and videos of its friends. I remain a huge fan of Anna and AVA.

No problem getting them now

You don't have to sign up as a developer to get one now. I bought them at Amazon, about 200 bucks. I'd like to get ahold of the one they sell for automobiles that you need a 10k-unit order to get: two TK1's, twice the memory, CAN bus and all that.

What I like about them, besides the excellent hardware, is the software support they offer for the board: a customized version of Ubuntu 14, the CUDA libs, cuDNN, and OpenCV with the proprietary stuff included (actually there are two versions, one with the proprietary stuff and one without). Yeah, the Pi and Jetson basically use the onboard USB port for everything. One of the guys from Italy who also has the Zed told me an NVIDIA rep said they were close to solving the problem. The Zed hammers one board at over 70% utilization. Glad I decided to go with a pair, and I have enough power available to add a third if necessary. They make an awesome cluster; being able to share CUDA tasks across the cluster is pretty awesome. They are a supercomputer that fits in your hand.

This bot will carry a DSLR easy. The motors are sized for a 25 lb payload to move it at 20 mph, and I doubled the torque from the required amount: 400 oz-in per wheel. I wanted it to be fast so it will do well in the Autonomous Vehicle Contest next summer. The reasoning behind the adjustable ride height is for that contest as well. For the contest runs I'll flatten the beast out and lower the payload base to its lowest level, so the CG is low and it will turn quickly without torque-rolling and crashing out. I've been watching the videos from the past few years to see what I need to do to take the prize.

The limiting factor is how quick the Zed can do its thing. So far I'm seeing about 13 frames a second at 720p when it's doing the full depth-map thing. A little low (it should be 15 for real-time operations), but close enough, I think. The SDK and drivers aren't even at v1.0 yet, so I suspect some more performance is at hand as soon as they optimize everything. Another thing I want to try is to split the processing that happens on the CUDA cores over both Jetsons. CUDA allows that and has made it easier in the last couple of versions of the libs. With a little custom code I can take advantage of the Zed's range: on long straights, when the bot determines it's on one, it can kick up the motors to full and really take off.

I've wanted computer vision since I bought my first computer back in the late 70's. There is another stereo camera on Kickstarter. It doesn't have the CUDA support or software like the Zed, but if you wanted to put that stuff together it would be a good way to go, and the price is under 200 bucks. The problem with trying to do it with a pair of cameras is getting the calibration right. The Zed is calibrated at the factory and downloads its config the first time you plug it in; after that it can maintain that calibration. Hours of error-prone work you don't have to fool with. It's a lot of fun to play with.

And hopefully, by posting it up in as many places as I can find, I can generate some of the old "if that old geezer can do it, so can I" feeling amongst those that think it might be over their heads. Just wish my health was better so I could do a few 2-day hacks and get it done. Then again, my health is probably the way it is because of those 2-day hacks :slight_smile: Should be getting ready to go to the F1 race in Austin about now, but the weather here is horrible today. Monsoon rain. But the cold front came through early, so it should be nice and clear tomorrow for the race.

Griz

Speed

20 mph is awesome fast. 13 fps at that speed gives you a couple of feet or so of travel per frame of reaction time. I'm thinking the limit will be in the inertia more than in the electronics.  What does the course look like? I see it as an interesting software problem.
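That reaction-time figure is easy to sanity-check: distance per frame is just speed divided by frame rate. A quick back-of-envelope helper:

```python
def feet_per_frame(speed_mph, fps):
    """How far the rover travels between depth-map updates."""
    feet_per_second = speed_mph * 5280.0 / 3600.0  # mph -> ft/s
    return feet_per_second / fps

# At 20 mph and 13 fps the bot moves roughly 2.3 ft per frame,
# so an obstacle needs to be spotted several frames before braking range.
```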

I’m working on a cat like quadruped so I am in the sub 1 mph range! I found out though, quite by accident, that it will jump a fair height.

My thinking is that double the torque is exactly what you should shoot for; you want to stay far away from stall torque if you want speed. For PM motors, half torque is where the maximum power is. I imagine you are using some of the brushless ones, in which case the calcs are a bit different. The stuff they are driving helos with is incredible; you need a mass-market item to bring along the technology for those of us more on the edge.

I saw the Pikes Peak electric (stock) car races on cable a while back, impressed me!

Back in the 70’s I wanted to do visual things, not necessarily computer vision. I thought I had a hot setup when I had 16MB of RAM and a 486DX. I used to pull stills from the video camera as there really was no such thing as a digital camera. It’s all so easy now…

Have fun at the races. We are pulling for you when your time comes.

Oops

My lack of math ability shines through again. The speed is a shade over 7 mph, not 20. I'd have to go to 1000 rpm motors to hit 20. Oh well, just goes to show even the math-challenged can build robots these days :slight_smile:
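For anyone checking the arithmetic, the rpm-to-mph conversion is just wheel circumference times wheel rpm. A sketch (direct drive assumed, and the 6.7 in wheel diameter is purely my guess, chosen because it makes both quoted speeds line up; substitute the real wheel size):

```python
import math

def top_speed_mph(motor_rpm, wheel_diameter_in):
    """Ground speed for a direct-drive wheel: circumference * rpm,
    converted from inches per minute to miles per hour."""
    inches_per_minute = motor_rpm * math.pi * wheel_diameter_in
    return inches_per_minute * 60.0 / (12.0 * 5280.0)  # in/min -> mph
```

With ~6.7 in wheels this gives about 7 mph at 350 rpm and about 20 mph at 1000 rpm, matching the correction above.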

I use a similar display on a Pi to provide the video for my 10-channel analyzer. Need to modify the program though; too much extra stuff you don't need on a small display. Take out all the menus etc. and put them on a right-click menu, so the whole screen can be used for just the traces if you want. It's a one-of-these-days project though :slight_smile: I'd have to learn Qt to do it and I'm just not that interested. Since it works over a network with the Eltima USB-over-WiFi driver, I can use the tablet or main computer to see the data. My boards came in yesterday. Picked up a new tip for the iron but couldn't get the old one out, so I'm just sitting here looking at my new boards instead of installing them. This hasn't been a stellar weekend for sure.

 

Griz

Old irons and math

Maybe, just maybe a few drops of PB Blaster or equivalent will help, then tolerate the burning oil if it twists apart. I had to buy a new iron as my collection of them had gone to pot.

Technology has hit the soldering station (just looking, don’t have):

http://www.banggood.com/TS100-Digital-OLED-Programable-Interface-DC-5525-Soldering-Iron-Station-Built-in-STM32-Chip-p-984214.html

I’ve ordered some things from them, some times they come the same month…

I’m stuck with nothing on the display, SPI will not tell you what is on the bus, and I haven’t found exactly what I have, so it lights up and nothing else.

The problem with robotics (particularly for a noob like myself), as I see it, is that you don't know what you need, so you research it and order it. Then the part comes in and you find that you need something else, which you need to order… I've got a pile of bags and boxes from all parts of the globe and also the local hardware store. I see why complete kits are so popular, but it is just not my style. So you spend time waiting… and hacking.

7 mph is still fast for autonomous.

Same here

I started out with one Arduino and a RasPi. Now I have a box of them, plus a cluster and a couple more Jetsons. The Arduino-like boards are so cheap it's hard to resist. Wish you were close by; I'd bring the analyzer, it's perfect for debugging SPI. It has a template for it, so all the messages show up in English. That alone saves a boatload of time. You can only do so much putting print statements in your code; even debuggers have their limitations because they don't look on the hardware side. I looked at a lot of analyzers, and for the money the Embedded Artists LabTool came out on top. It has pre-made signals on a header to learn how to use it with. The CPU that powers it is one of NXP's most powerful, and you can pop that off and use it as a JTAG probe; it has the software and all that. Then you have some debugging power, with the ability to set a breakpoint anywhere in the memory space, and as many as you want, not just the two the chip provides. Heck of a deal for 100 bucks. Then I found the Eltima driver that does USB over WiFi, so I can put the analyzer on the droid and monitor it in real time, and have it save the data to a bag as well.

One of these days I'm going to learn 3D CAD and start using that to verify my ideas. It sure would be a lot cheaper, although on this build I'm not too far in the hole on extra parts: only a couple of 30-dollar motors I haven't already found a use for. I miss all the surplus places that used to be around. Every Saturday we used to head up to the Xerox surplus in Dallas and pick over the goodies, then head to the sidewalk sale. I had a lot more stuff back then. Trying to keep it to a minimum these days, but not doing a very good job at it.

Yeah, 7 mph is plenty fast. My scooter will do 6 mph, and this is the test bed for making that a self-driver, so having them matched is a good thing. Although I can already tell I'll be keeping this bot intact if I do manage to get the scooter running; I get very attached to stuff I build.

I tried everything. I have some killer oil that usually busts everything loose; tried heating it, then ended up trying an EZ-Out and messed it all up. It's a cheap Weller, so no big deal. I ordered one last night; it takes 2 days from Mouser. I'll put some anti-seize on the new one. I started to buy a better iron that would work with the station yesterday but didn't. If I lived in town I'd just go get one, but it's 70 miles to Austin from here, and the RadioShack here closed, so there is nothing close by. No biggie, I have lots of software to work on and lots of stuff to read. I really like that new O'Reilly book, and the PDF A Gentle Introduction to ROS by Jason O'Kane.

Griz

Misc

Thanks for the heads-up on the ROS PDF. Looked at the LabTool and it looks like the real deal.

I was thinking of slumming it and getting this:

http://www.banggood.com/LHT00SU1-Virtual-Oscilloscope-Logic-Analyzer-I2C-SPI-CAN-Uart-p-988565.html

I had written a much longer post, and then it vanished. Oh well… time to make dinner…

Back in the day when surplus stores ruled, the surplus was real made in America, overbuilt durable stuff that originally cost a small fortune to manufacture. Now surplus is mostly cheap Chinese stuff that you can just about buy new from China for the same price.

 

More stuff on the bot

Well, I had no luck at all with the RoboClaw board and ROS. They changed the firmware over to CRC16s from checksums, so none of the drivers I found will work. So I'm using it for the lidar turntable (and eventually the leveling motor), and I picked up a pair of OR controllers from Italy. Found out my idea of running two motors with encoders and Y-cording the other two isn't all that good an idea, as they will run at different rpm unless controlled with an encoder. So more money out for another pair of motors with encoders. However, the OR stuff comes with a complete ROS setup; I'll just have to change the URDF robot definition file to match the dimensions of my bot and it should be ready to test. Worth the price of the boards to get the complete software package. So the bot has 360-degree scanning lidar using the LIDAR-Lite v2 now, and also voice recognition and a speaker so it can talk. The voice stuff works over a Bluetooth headset using the PocketSphinx voice-processing software.
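For anyone else bitten by the same firmware change: as I understand the newer RoboClaw packet-serial protocol, it appends a CRC16 (CCITT polynomial 0x1021, zero seed) to each packet instead of the old checksum. A sketch of the calculation, so an old driver can be patched (treat the packet layout here as an assumption and verify against the RoboClaw manual):

```python
def roboclaw_crc16(data):
    """CRC16 as used by newer RoboClaw packet-serial firmware
    (CCITT polynomial 0x1021, zero initial value)."""
    crc = 0
    for byte in data:
        crc ^= byte << 8
        for _ in range(8):
            if crc & 0x8000:
                crc = ((crc << 1) ^ 0x1021) & 0xFFFF
            else:
                crc = (crc << 1) & 0xFFFF
    return crc

def make_packet(address, command, payload=b""):
    """Build a command packet: address, command, payload, CRC high byte first."""
    body = bytes([address, command]) + bytes(payload)
    crc = roboclaw_crc16(body)
    return body + bytes([crc >> 8, crc & 0xFF])
```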

Working my way through the O'Reilly ROS book. Finally understanding how to use it and how it works. Some excellent tools to test and diagnose problems. Still at the infant stage, but at least it's not a total unknown now. I'm sure by the time this bot is operational I'll be very familiar with it.

Thinking about writing a quick-and-dirty program on an Arduino to control the motors via the voice stuff, just to get it on the ground and get it rolling. It's going to be a couple more weeks before the other controllers get here. I finally took the time to align everything and get all the angles on the outriggers right. Picked up a little angle tool at Lowe's and a square, and used those to get everything square and aligned.

Working on the motor-based leveling as well. I finally figured out a combination of parts that will allow me to fix one side of the outrigger to the main shaft; that way the motor has something to work against to level the base platform. The part is actually a pillow block for a 1/4" bearing, but it also has the hub pattern drilled into it. So, using a clamp collar with the same hub pattern on the shaft and bolting that up to the bearing carrier sans bearing, it will be fixed.

The turntable for the lidar is nylon-chain driven. The slip ring's large end is 22mm, so I used a clamp instead of the flange to mount it, with an 8mm clamping hub on the other end that holds the sprocket and a right-angle channel bracket to hold the lidar. Since the slip ring only has rudimentary bushings, I used the chain so I could run it very loose and still have accuracy. Very little load; it's not moving much weight, so no need to have things tight. The chain is plastic/nylon, comes in links, is easy to assemble, and lubes itself. I'm using a 22mm motor with an encoder to drive it. Right now it's clamped onto the mast, but soon I'm going to drill out one of the 1/2 in. holes in the plate it's all mounted to so the motor can go under the plate, much closer to the slip ring.
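Since the turntable is driven through the chain by the encoder motor, mapping encoder counts back to turntable angle is just the sprocket ratio. A sketch (the counts-per-rev and ratio below are placeholders; plug in the real encoder CPR and tooth counts):

```python
def turntable_angle_deg(encoder_counts, counts_per_motor_rev=1200.0,
                        sprocket_ratio=3.0):
    """Lidar turntable angle derived from the drive motor's encoder.

    counts_per_motor_rev and sprocket_ratio (turntable teeth / motor
    sprocket teeth) are placeholder values for illustration.
    """
    motor_revs = encoder_counts / counts_per_motor_rev
    turntable_revs = motor_revs / sprocket_ratio
    return (turntable_revs * 360.0) % 360.0
```

Tagging each LIDAR-Lite range reading with this angle is what turns the single-point sensor into a 360-degree scan.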

So things are moving along slowly but with steady progress.

I looked at some of those inexpensive analyzers, but they all had some flaw. I sent back two different ones before I found the one I have now. The Embedded Artists one has a JTAG probe and software as well: unlimited hardware breakpoints in any part of the memory space. Looks like I'm going to be able to use that part to program the uNav when the firmware changes as well. And you get all the source code, so you can do whatever you want with it.

Jetsons are 100 bucks on sale right now. https://developer.nvidia.com/embedded/makejtk1

 

Power subsystem

Can you discuss in detail the power subsystem you have in place? I understand you are using a high-amp DC-DC converter/boost circuit to feed the twin Jetsons.  What about the other components: motors, ZED camera, other sensors?  Do these components draw from the same source?  Was there design consideration for recharging the battery (assuming LiPo)?  And how do you have it all wired up?  Diagrams or pictures helpful.  Thanks…

Electrical

Just installed an NVIDIA TX1 last night, so the power system might change a lot. The TX1 doesn't require 12V; it can take anywhere from 9 to 19V, so I can hook it straight up to the 4S LiPo battery that powers the bot. That battery powers everything. The leads from it go to a strip where I fork them out to the motor controller and the DC-DC converter. There is an additional Drok converter onboard that is adjustable, for Arduinos etc. The DC-DC converter is a unit from micro-box, made for putting computers into autos. It has four 12V pairs on the output side: two go to barrel plugs for the TK1's, one pair goes to the Drok, and one is not connected. I'll probably leave the DC-DC in there for the filtering and battery-management functions; you can program it to signal and then delay before shutting down when the battery gets low, giving the computers enough time to shut down gracefully.

It will have three motor controllers eventually: a pair of OR uNav uBridge controllers for the wheels, and the RoboClaw will do the lidar turntable and eventually the leveling. One other thing: if you are using a motor controller that allows regen on braking, be sure to provide a pathway via a diode across the power switch in case it rolls while turned off. Not sure if I'm going to have to get any more elaborate than that; it seems to function just fine on the bench anyway. The only other thing I want to do with the power system is rig up a way to read the voltages from the balance jack and provide a "dead man's switch" type cutoff, in case either the DC-DC converter or the motor controllers fail to see that the voltage is low. Don't want a LiPo catching fire while attached to my bot. To me LiPos are a pain, but there's no other way to get that kind of energy density, and with the amount of electronics on this bot I need it if I want any kind of runtime.
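The balance-jack cutoff logic is worth sketching, because the jack's taps are cumulative (cell 1, cells 1+2, and so on), so per-cell voltages come from differencing adjacent taps. The cutoff value and the idea of wiring the result to a relay are my assumptions, not a description of the actual build:

```python
LOW_CELL_CUTOFF = 3.3  # volts; a conservative floor for a LiPo cell under load

def cells_from_balance_taps(tap_voltages):
    """Balance-jack taps are cumulative; difference adjacent taps
    to recover the individual cell voltages."""
    cells = []
    previous = 0.0
    for tap in tap_voltages:
        cells.append(tap - previous)
        previous = tap
    return cells

def battery_ok(tap_voltages):
    """True while every cell is above the cutoff; the False case would
    drive a relay/MOSFET that kills the main power feed."""
    return all(v > LOW_CELL_CUTOFF for v in cells_from_balance_taps(tap_voltages))
```

Checking individual cells rather than total pack voltage matters: a 4S pack can read a healthy total while one cell is dangerously low.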

So far the TX1 has been a dream machine. It arrived yesterday. Flashing it is much easier than with the old JetPack and the TK1 (although I think the new JetPack also flashes the TK1). It does multiple downloads now, instead of one at a time for the required packages, which speeds things up a lot. No need to know the IP of the flashed TX1 or TK1 now for the final load. It has VisionWorks and cuDNN in the load as well. I built ROS on it last night: no errors, and it seems to function just fine. Twice the RAM, which is nice, and it's nice and quick. No need for the dual boards now; this one will handle everything nicely.

 

 

A few new pics

 

 

GRZ_1954.jpg

 

Haven't been able to work on the bot for the last month; I've been moving to San Antonio. All finished up now, so I can get back to it. Loving San Antonio so far. After 10 years out in the boonies, it was starting to feel like jail.

 

Griz