Here are some pics of our “crab arm” / “goose neck” rover, called Regis. It uses a 4WD3 base extended a bit with some Lexan plates, and two SES arms: one with a webcam, one with a gripper. The electronics include an SSC-32 to run the 10 servos, a Sabertooth 2x10 R/C to drive the motors, a Logitech webcam, and a Gumstix verdex board (600 MHz, 128 MB) as the brains. The rover is designed to run autonomously using our Tekkotsu “cognitive robotics” software framework on the verdex, although at the moment it’s being remotely controlled by Tekkotsu running on a Pentium.
http://www.cs.cmu.edu/~dst/Tekkotsu/Gallery/AAAI-07/shoulder.jpg
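For anyone curious about the servo side: the SSC-32 takes simple ASCII commands over its serial port, e.g. “#5 P1500 T1000” followed by a carriage return to move channel 5 to a 1500 µs pulse over one second. The sketch below is just a standalone illustration of that protocol, not our Tekkotsu driver code; the port name, baud rate, channel numbers, and pulse widths are made up for the example.

```cpp
// ssc32_demo.cpp -- minimal sketch of talking to an SSC-32 over a serial port.
// Build: g++ -o ssc32_demo ssc32_demo.cpp
// Port name, baud rate, channels, and pulse widths are assumptions for
// illustration; adjust them to match your own wiring and jumper settings.

#include <cstdio>
#include <cstring>
#include <fcntl.h>
#include <termios.h>
#include <unistd.h>

// Send one ASCII command line to the SSC-32.  Commands look like
// "#<ch> P<pulse_us> T<ms>" and are terminated by a carriage return.
static bool sendCommand(int fd, const char* cmd) {
    char line[128];
    int n = snprintf(line, sizeof(line), "%s\r", cmd);
    return write(fd, line, n) == n;
}

int main() {
    const char* port = "/dev/ttyUSB0";      // assumed port name
    int fd = open(port, O_RDWR | O_NOCTTY);
    if (fd < 0) { perror("open"); return 1; }

    termios tio;
    memset(&tio, 0, sizeof(tio));
    tio.c_cflag = CS8 | CLOCAL | CREAD;     // raw 8N1, no flow control
    cfsetispeed(&tio, B115200);             // assumes the 115.2k baud jumpers
    cfsetospeed(&tio, B115200);
    tcsetattr(fd, TCSANOW, &tio);

    // Center a hypothetical shoulder servo on channel 0 (1500 us pulse),
    // taking 1.5 seconds to get there.
    sendCommand(fd, "#0 P1500 T1500");

    // Group move: pan (ch 8) and tilt (ch 9) the camera together over 2 s.
    sendCommand(fd, "#8 P1250 #9 P1750 T2000");

    close(fd);
    return 0;
}
```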
Two interesting points about the design. People who design robots almost always put the camera in the wrong place. It’s either too low to the ground, which limits the field of view, or positioned in such a way that the robot can see landmarks but can’t see its own body. This makes it very hard for the robot to manipulate objects. We solved this problem by putting the webcam on the end of an arm – what we call a “goose neck webcam.” This robot can easily watch its gripper from overhead, or look around the room, or look at its own wheels.
http://www.cs.cmu.edu/~dst/Tekkotsu/Gallery/AAAI-07/look-high.jpg
http://www.cs.cmu.edu/~dst/Tekkotsu/Gallery/AAAI-07/grip-side.jpg
The second point is the “crab arm”: an arm that lies in the plane of the workspace instead of perpendicular to it. This has some drawbacks, but it makes visual servoing easier, especially when combined with the goose neck webcam. The shoulder of this arm has a big ServoCity/RobotZone gearbox (thanks to robodude666 for suggesting this solution), so it can rotate into vertical mode if necessary. In vertical mode the arm can’t rotate in the ground plane, so we’d have to use the wheels for that degree of freedom.
http://www.cs.cmu.edu/~dst/Tekkotsu/Gallery/AAAI-07/grip-vert.jpg
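For those wondering what the kinematics look like when the arm is lying in the ground plane: if you treat the shoulder and elbow as a two-link planar chain, the inverse kinematics reduce to the textbook law-of-cosines solution. The sketch below is only a simplified illustration of that idea, not the code running on the robot; the two-link model, the solveIK helper, and the link lengths are all placeholders rather than Regis’s real geometry.

```cpp
// planar_ik.cpp -- simplified 2-link planar inverse kinematics, the kind of
// computation involved when the crab arm works in the ground plane.
// Link lengths below are placeholder values, not measurements from Regis.

#include <algorithm>
#include <cmath>
#include <cstdio>

struct JointAngles { double shoulder, elbow; };  // radians

// Solve for shoulder/elbow angles that place the gripper at (x, y) in the
// plane of the workspace, using the law of cosines.  This picks one of the
// two mirror-image elbow configurations.  Returns false if out of reach.
static bool solveIK(double x, double y, double l1, double l2, JointAngles& out) {
    double r2 = x * x + y * y;
    double r = std::sqrt(r2);
    if (r > l1 + l2 || r < std::fabs(l1 - l2))
        return false;                            // outside the annular workspace

    // Law of cosines: r^2 = l1^2 + l2^2 + 2*l1*l2*cos(elbow)
    double cosElbow = (r2 - l1 * l1 - l2 * l2) / (2.0 * l1 * l2);
    cosElbow = std::max(-1.0, std::min(1.0, cosElbow));  // guard rounding error
    out.elbow = std::acos(cosElbow);

    // Shoulder = angle to the target minus the offset caused by the elbow bend.
    out.shoulder = std::atan2(y, x) - std::atan2(l2 * std::sin(out.elbow),
                                                 l1 + l2 * std::cos(out.elbow));
    return true;
}

int main() {
    const double l1 = 15.0, l2 = 12.0;           // placeholder link lengths, cm
    JointAngles q;
    if (solveIK(20.0, 10.0, l1, l2, q))
        std::printf("shoulder = %.1f deg, elbow = %.1f deg\n",
                    q.shoulder * 180.0 / M_PI, q.elbow * 180.0 / M_PI);
    else
        std::printf("target out of reach\n");
    return 0;
}
```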
We still need to get some kinks out of the design. We originally used a 6V battery to power the 10 servos and a 7.2V battery to power the motors, Gumstix, SSC-32, and webcam (via a voltage converter). Unfortunately, the batteries can’t provide enough current to drive everything at once, so for now we’re using power supplies and running the robot in tethered mode. It was already tethered by an Ethernet cable; we’ll be replacing that with an 802.11 board as soon as Gumstix begins shipping those boards. Also, the current wheels are too spongy for the weight of this robot; we’d like to find something a bit stiffer. (Suggestions appreciated.)
http://www.cs.cmu.edu/~dst/Tekkotsu/Gallery/AAAI-07/side-view.jpg
We demonstrated the robot at the Association for the Advancement of Artificial Intelligence meeting in Vancouver this week, where it received a Technical Innovation Award for hardware/software integration. Once we’ve refined the design a bit, and adapted it to the new Lynxmotion 4WD base that Jim is about to release, we plan to publish construction plans so other folks can build these.
Here’s a picture of the robot looking at Glenn Nickens, the student who built it and programmed the inverse kinematics solution:
http://www.cs.cmu.edu/~dst/Tekkotsu/Gallery/AAAI-07/look-glenn.jpg
My other student, Ethan Tira-Thompson, is the principal architect of the Tekkotsu framework and did most of the software support for the rover. For more information about Tekkotsu, visit Tekkotsu.org.