Our crab arm rover

Here are some pics of our “crab arm” / “goose neck” rover, called Regis. It uses a 4WD3 base extended a bit with some Lexan plates, and two SES arms: one with a webcam, one with a gripper. The electronics include an SSC-32 to run the 10 servos, a Sabertooth 2x10 R/C to run the motors, a Logitech webcam, and a Gumstix verdex board (600 MHz, 128 MB) as the brains. The rover is designed to run autonomously using our Tekkotsu “cognitive robotics” software framework on the verdex, although at the moment it’s being remotely controlled by Tekkotsu running on a Pentium.

http://www.cs.cmu.edu/~dst/Tekkotsu/Gallery/AAAI-07/shoulder.jpg

Two interesting points about the design. People who design robots almost always put the camera in the wrong place. It’s either too low to the ground, which limits the field of view, or positioned in such a way that the robot can see landmarks but can’t see its own body. This makes it very hard for the robot to manipulate objects. We solved this problem by putting the webcam on the end of an arm – what we call a “goose neck webcam”. This robot can easily watch its gripper from overhead, or look around the room, or look at its own wheels.

http://www.cs.cmu.edu/~dst/Tekkotsu/Gallery/AAAI-07/look-high.jpg

http://www.cs.cmu.edu/~dst/Tekkotsu/Gallery/AAAI-07/grip-side.jpg

The second point is the “crab arm”: an arm that lies in the plane of the workspace instead of perpendicular to it. This has some drawbacks, but it makes visual servoing much easier, especially when combined with the goose neck webcam. The shoulder of this arm has a big ServoCity/RobotZone gearbox (thanks to robodude666 for suggesting this solution), so it can rotate into vertical mode if necessary. In vertical mode it can’t rotate in the ground plane, so we’d have to use the wheels for that degree of freedom.
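One nice consequence of keeping the arm in the workspace plane is that positioning the gripper reduces to two-link planar inverse kinematics. Here is a rough sketch of that math (this is just the textbook elbow-down solution, not the actual solver used on Regis; link lengths and function names are made up):

```python
import math

def planar_ik(x, y, l1, l2):
    """Two-link planar IK, elbow-down solution.

    Returns (shoulder, elbow) joint angles in radians that place
    the wrist at (x, y), given link lengths l1 and l2.
    """
    d2 = x * x + y * y
    cos_elbow = (d2 - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)
    if abs(cos_elbow) > 1.0:
        raise ValueError("target out of reach")
    elbow = math.acos(cos_elbow)
    shoulder = math.atan2(y, x) - math.atan2(
        l2 * math.sin(elbow), l1 + l2 * math.cos(elbow))
    return shoulder, elbow

# Sanity check: forward kinematics should land back on the target.
s, e = planar_ik(4.0, 3.0, 3.5, 3.5)
fx = 3.5 * math.cos(s) + 3.5 * math.cos(s + e)
fy = 3.5 * math.sin(s) + 3.5 * math.sin(s + e)
```

With everything in one plane, the camera overhead sees the same 2D coordinates the IK works in, which is what makes the visual servoing straightforward.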

http://www.cs.cmu.edu/~dst/Tekkotsu/Gallery/AAAI-07/grip-vert.jpg

We still need to get some kinks out of the design. We originally used a 6V battery to power the 10 servos and a 7.2V battery to power the motors, Gumstix, SSC-32, and webcam (via a voltage converter). Unfortunately, the batteries can’t provide enough current to drive everything at once, so for now we’re using power supplies and running the robot in tethered mode. It was already tethered by an Ethernet cable; we’ll be replacing that with an 802.11 board as soon as Gumstix begins shipping them. Also, the current wheels are too spongy for the weight of this robot; we’d like to find something a bit stiffer. (Suggestions appreciated.)

http://www.cs.cmu.edu/~dst/Tekkotsu/Gallery/AAAI-07/side-view.jpg

We demonstrated the robot at the Association for the Advancement of Artificial Intelligence meeting in Vancouver this week, where it received a Technical Innovation Award for hardware/software integration. Once we’ve refined the design a bit, and adapted it to the new Lynxmotion 4WD base that Jim is about to release, we plan to publish construction plans so other folks can build these.

Here’s a picture of the robot looking at Glenn Nickens, the student who built it and programmed the inverse kinematics solution:

http://www.cs.cmu.edu/~dst/Tekkotsu/Gallery/AAAI-07/look-glenn.jpg

My other student, Ethan Tira-Thompson, is the principal architect of the Tekkotsu framework and did most of the software support for the rover. For more information about Tekkotsu, visit Tekkotsu.org.

Very nice. Thanks for sharing.

This is a very cool, and I think exciting, design! I really like what you have done with the webcam on the “goose neck” arm. I have wanted to do something similar, but my idea is to put the gripper at the end of the arm the camera is on and provide the camera with full pan/tilt mounted just behind where the gripper is. That might be too much weight at the end of an arm though, so it may not work, possibly depending on how big the camera is.

That never did make sense to me either, which is why I have been thinking about alternatives too. Using two arms may be the most sensible way to do this, even though it likely adds much more weight than using a single arm. Having the camera on an arm also makes more sense for a spy bot type of robot. My idea is to add a thermal sensor along with the camera to allow detection of warm bodies such as my favorite fuzzball.

I can see where this would be much easier. You only have to track X/Y coordinates rather than X/Y/Z. That durn Z axis can really throw some kinks into the works. :slight_smile:

Do you have encoders on one of the left and one of the right wheels? This might help with positioning when the platform has to serve as a DOF for the gripper arm.

The only idea I can add that might make things better is to split the servo power between two batteries, using separate VS1 and VS2 supplies with a 6V, 1600 mAh pack for each side. Then use the 7.2V pack (1600 mAh or 2800 mAh) for the motors and electronics (which may not be a good combination). I hesitate to suggest adding yet a fourth battery just for the electronics, but it may be an alternative.

I think you are doing some very cool and progressive stuff here! Is the software framework available for experimentation, or will it be made available in some form? I am all in favor of Linux, but am shying away from the Gumstix. If the software could be built for a different platform, I would be interested, and it would definitely point me at getting an NGW100 (uses the Atmel AVR32 AP7000 series chips) network board ($69.00, runs Linux).

You’ve also definitely finished convincing me that I need to move to a four wheeled platform. :smiley: I was already pretty convinced, but this settles things for me. :smiley:

8-Dale

A good example of a different type of design thinking.

When you power down, do you have a pre-programmed routine, does the operator safely position the camera neck when it is extended, or do you let it fall over as-is?

Do you get any camera shake or picture instability as a result of the cam being at the far end of two shafts and servos?

Good job !

Very good design. For power I would use the minimum number of batteries. Try one 5000mAh li-poly and use regulators to power everything else…


The software will run on any Linux box, or Mac OS, and is available for download at Tekkotsu.org. It’s open source and LGPLed. It was originally written for the AIBO, but Ethan has been working on a new hardware abstraction layer which allows us to support other platforms, including the Qwerkbot, the Lynx arm, and our new rover, Regis. This feature will be included in Tekkotsu 4.0, which will be officially released later this summer, but if you want to muck around with it now, the “bleeding edge” version is available at tekkotsu.no-ip.org.

After Regis, our next project will be Kathie Lee: a hexapod.

Since we finished the robot just hours before hopping on the plane to Vancouver, we don’t have a safe power-down sequence yet. The operator just grabs hold of the goose neck and hits the power switch. Sometimes the computers crash due to brownout and everything just goes limp, but we haven’t suffered any damage yet. A bigger problem is power-up, when the servos want to move very quickly to their starting position. We’ve learned to pre-position the crab arm and goose neck before turning the power back on.

Oh yeah. If we move the goose neck to the straight-up position, or move it very rapidly to any position, there is quite a bit of sway, but it damps down within a few seconds. So one of the things we’ll need to do as we fine-tune the vision primitives is put in enough delay so that the camera stabilizes before we try to do any serious image processing. We had to do this for the AIBO too, but on the AIBO the stabilization time was well under a second; with Regis it will be longer.

Have you tried using the T or S parameter commands in your SSC-32 servo moves? Lengthening the time a move takes might create less sway.
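For anyone following along, the SSC-32’s move command accepts an optional T field (total move time in ms) or S field (speed in µs/sec), so one command line can produce a slow, smooth move. A small sketch of how such a command string is built (the channel and pulse values here are arbitrary examples, not Regis’s actual configuration):

```python
def ssc32_move(channel, pulse_us, time_ms=None, speed=None):
    """Build an SSC-32 servo move command.

    channel: servo channel (0-31); pulse_us: target pulse width in µs;
    time_ms: total time for the move (T); speed: µs per second (S).
    """
    cmd = "#%d P%d" % (channel, pulse_us)
    if speed is not None:
        cmd += " S%d" % speed
    if time_ms is not None:
        cmd += " T%d" % time_ms
    return cmd + "\r"  # SSC-32 commands are terminated by <cr>

# Move channel 3 to 1600 µs, taking 2 seconds to get there:
cmd = ssc32_move(3, 1600, time_ms=2000)
```

The resulting string would then be written to the SSC-32’s serial port.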

8-Dale

Good suggestion linuxguy, slowing the servo trip should help.

I am curious about your… vision primitives

Is this identifying colors of objects in your path, or possibly shapes such as a square or triangle?

Actually, Tekkotsu does its own speed control, so we don’t currently use the T or S commands; we just send servo commands at some high frame rate to move the servo step-by-step along its trajectory. Certainly we should be moving the joints at slower speeds to reduce sway. We’ll get around to that very shortly.
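Framework-side speed control of that sort amounts to linearly interpolating the pulse width at a fixed frame rate and sending each intermediate value as an immediate position command. A toy sketch of the idea (frame rate and pulse values are purely illustrative, not what Tekkotsu actually uses):

```python
def interpolate_pulses(start_us, end_us, duration_s, frame_hz=32):
    """Generate one pulse-width target per frame along a linear
    trajectory from start_us to end_us over duration_s seconds.

    Each value would be sent as an immediate servo command, so the
    joint tracks the trajectory step by step.
    """
    n = max(1, int(round(duration_s * frame_hz)))
    return [round(start_us + (end_us - start_us) * i / n)
            for i in range(1, n + 1)]

# 1.0 s move at 8 frames/sec: eight evenly spaced targets.
steps = interpolate_pulses(1500, 1900, duration_s=1.0, frame_hz=8)
```

Stretching `duration_s` is the framework’s equivalent of lengthening the T parameter: the same distance spread over more frames means a gentler move and less sway.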

It’s all that and more. We do color segmentation using the CMVision package, plus simple shape recognition (lines, ellipses, blobs, spheres, some other things on a more experimental basis), object tracking, and automated map construction. A good introduction is our paper in Robotics and Autonomous Systems, which you can find on the Tekkotsu.org web site. Here’s a direct link: cs.cmu.edu/~dst/pubs/touretzky-autrob-2007.pdf
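In spirit, color segmentation of the CMVision variety boils down to classifying pixels by per-channel color thresholds and then grouping the marked pixels into blobs. A much-simplified pure-Python sketch of that pipeline (not the actual CMVision algorithm, which uses YUV thresholds and run-length encoding for speed):

```python
def segment(image, lo, hi):
    """Mark pixels whose (r, g, b) values fall inside the
    per-channel [lo, hi] thresholds for one color class."""
    return [[all(lo[c] <= px[c] <= hi[c] for c in range(3))
             for px in row] for row in image]

def blob_bbox(mask):
    """Bounding box (xmin, ymin, xmax, ymax) of all marked
    pixels, or None if nothing matched."""
    pts = [(x, y) for y, row in enumerate(mask)
           for x, v in enumerate(row) if v]
    if not pts:
        return None
    xs, ys = [p[0] for p in pts], [p[1] for p in pts]
    return min(xs), min(ys), max(xs), max(ys)

# A tiny 3x3 test image with a patch of orange-ish pixels.
img = [[(0, 0, 0), (0, 0, 0), (250, 120, 30)],
       [(0, 0, 0), (250, 130, 40), (255, 125, 35)],
       [(0, 0, 0), (0, 0, 0), (0, 0, 0)]]
box = blob_bbox(segment(img, lo=(200, 100, 0), hi=(255, 160, 60)))
```

Shape recognition then works on the segmented regions rather than the raw pixels, fitting lines, ellipses, and so on to the blobs.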

For a “big picture” description of our work, see these slides: tekkotsu.org/media/sigcse-2007-workshop.pdf

Note: all this stuff was developed for the AIBO. It will take some time before we have re-tuned everything to work well on Regis and other Lynxmotion-based platforms. And we’ve just started thinking about manipulation primitives to make use of the arm and gripper. Lots of work to be done.

I have started a Tekkotsu thread in the Linux section for all those who may be interested in building the software.

8-Dale

Here are some views of the underside of Regis. The most prominent board is the Gumstix stack (verdex, console-vx, and netCF-vx). In the background you can see the Sabertooth motor controller, and on the top edge, the voltage converter. The SSC-32 is behind the Gumstix, so mostly out of sight. The USB gender changer is for plugging the webcam into the Gumstix USB port.

http://www.cs.cmu.edu/~dst/Tekkotsu/Gallery/AAAI-07/underside1.jpg

Here’s a closer shot that also gives a nice view of the powered gearbox that serves as the shoulder for the crab arm:

http://www.cs.cmu.edu/~dst/Tekkotsu/Gallery/AAAI-07/underside2.jpg

We found that when people looked at the robot they did not recognize the goose neck webcam as the robot’s “eye”, so we added a parrot head to make the robot’s appearance more intuitive:
http://www.cs.cmu.edu/~dst/Tekkotsu/Gallery/AAAI-07/parrot-head.jpg

http://www.cs.cmu.edu/~dst/Tekkotsu/Gallery/AAAI-07/parrot-closeup.jpg

http://www.cs.cmu.edu/~dst/Tekkotsu/Gallery/AAAI-07/parrot-glenn.jpg

Finally, a team photo of the robot’s creators: Ethan Tira-Thompson (Tekkotsu software architecture), Glenn Nickens (robot assembly and kinematics), and Dave Touretzky (pays the bills).

http://www.cs.cmu.edu/~dst/Tekkotsu/Gallery/AAAI-07/ethan-glenn-dave.jpg

What manufacturer and model of camera are you using?

8-Dale

It’s a Logitech Quickcam Communicate STX, with the plastic shell removed. The naked circuit board/lens assembly is attached to an SES C-bracket using two green plastic bottlecaps with slots cut in them.

We’re not wedded to this particular camera. We chose it because that’s what the Qwerkbot uses, and we already had a working Qwerkbot in our lab, so it was familiar. We’re using the gspca webcam driver for Linux, which supports over 200 camera models.

The replacement “stiffer” wheels are on order. I should have them in a week or so. They look the same as the ones you are using, but are so stiff they don’t even use foam inserts. Help is on the way.

PS nice robot. :wink:

Hi Dave,
Nice 'bot.
I’d been looking at gumstix as well and I’m interested to know what interface you are using to the wheel drivers.

Thanks,
Lewis.

Standard Lynxmotion solution: the wheels are driven by a Sabertooth 2x10 R/C motor controller, which gets commands from an SSC-32 (it looks like a pair of servos to the SSC-32). The SSC-32 gets commands from the Gumstix over a serial line.
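Since the Sabertooth in R/C mode reads its two SSC-32 channels as servo signals, driving the wheels reduces to mapping a desired speed in [-1, 1] onto a pulse width around the 1500 µs neutral (stop) point. A hedged sketch of that mapping (the channel numbers and pulse span are illustrative guesses, not the values Regis actually uses):

```python
def speed_to_pulse(speed, neutral_us=1500, span_us=500):
    """Map a normalized speed in [-1, 1] to an R/C pulse width in µs.

    The neutral pulse is stop; longer/shorter pulses drive the
    motor channel forward/reverse proportionally.
    """
    speed = max(-1.0, min(1.0, speed))
    return int(round(neutral_us + speed * span_us))

def drive_command(left, right, left_ch=16, right_ch=17):
    """One SSC-32 command line updating both Sabertooth inputs
    at once (the SSC-32 accepts group moves on a single line)."""
    return "#%d P%d #%d P%d\r" % (
        left_ch, speed_to_pulse(left), right_ch, speed_to_pulse(right))

cmd = drive_command(0.5, -0.5)  # opposite wheel speeds: turn in place
```

The Gumstix just writes strings like this to the serial port, and the SSC-32 and Sabertooth take care of the rest.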

A short video demo of Regis, showing simultaneous operation of the crab arm and goose neck webcam, is now available on YouTube.

Very nicely done! Impressive! Thanks for sending the link.