MiniEric robot

This is my latest robot, still a work in progress. It will take a while to finish, as I keep making it more complex: I intend to add every piece of functionality I can to this robot. Here is the description so far:

MiniEric was born because I needed a multipurpose test platform to develop code for my big butler robot (Eric). I wanted a small replica of the big robot, incorporating almost all of its features: object retrieval, interaction with humans, mapping, object recognition, text-to-speech, self-charging, eventually voice recognition (simple commands), and the ability to compete in several types of robotic competitions (line, maze, fire...). Some of the features are implemented, some are on the way. I am a weak coder, so I am slowly testing out bits of code and getting ideas from people on the net (I already got some from you guys, thanks a lot!).

I am using the Arduino platform, the brain being a Roboduino board. I am using all the pins on it, so I'll have to hook up a Duemilanove over I2C and downgrade the Roboduino to a servo controller. For the moment I am using a self-made dual DC motor controller over the UART (I haven't gotten the I2C slave working yet), built around a Tiny2313 and an SN754410, to drive a couple of Faulhaber motors with built-in encoders.

The robot has 8 servos: one for the waist, 2 for the shoulders, 2 for the arms, 2 for the pan/tilt head and one for a scanning sensor. On the head it has a Ping))) sensor, a thermopile array, and it will have an AVRcam. The sensor mounted on the scanner is a GP2D120 and is used for wall following or object retrieval. The arms can move independently (to point or wave) or together as a claw (to pick up objects). On the tips of the arms there are suction cups that I want to fit with FSRs to sense when an object is grabbed (but I could steal the ASF idea...).

The robot has a 2x16 serial LCD (custom made) that I should upgrade to a graphic LCD for mapping purposes. It has some programmed moves (stored in the EEPROM) and is able to play a small tune or beeps. Did I mention that I hooked up an IR sensor and I can teach the robot new moves with the TV remote? The process is not so easy, but it beats the PC control method.
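
To give an idea of how the taught-moves part works, here is a stripped-down sketch of the record-and-replay idea. It is only an illustration: the two-servo setup, the pin numbers and the hard-coded poses are placeholders, and the real robot fills the poses in from the TV remote instead.

```
#include <Servo.h>
#include <EEPROM.h>

const byte NUM_SERVOS = 2;                   // MiniEric has 8; two are enough to show the idea
const byte servoPins[NUM_SERVOS] = {9, 10};  // placeholder pins
Servo joints[NUM_SERVOS];

const int MAX_STEPS = 50;                    // each step stores one angle per servo
int stepsStored = 0;

// Append one pose (one angle per servo) to the EEPROM
void recordStep(const byte angles[]) {
  if (stepsStored >= MAX_STEPS) return;
  for (byte i = 0; i < NUM_SERVOS; i++) {
    EEPROM.write(stepsStored * NUM_SERVOS + i, angles[i]);
  }
  stepsStored++;
}

// Replay all stored poses in order
void playMoves() {
  for (int s = 0; s < stepsStored; s++) {
    for (byte i = 0; i < NUM_SERVOS; i++) {
      joints[i].write(EEPROM.read(s * NUM_SERVOS + i));
    }
    delay(500);                              // pause between taught poses
  }
}

void setup() {
  for (byte i = 0; i < NUM_SERVOS; i++) joints[i].attach(servoPins[i]);
  byte pose1[NUM_SERVOS] = {90, 90};         // hard-coded here; the robot gets these from the remote
  byte pose2[NUM_SERVOS] = {45, 135};
  recordStep(pose1);
  recordStep(pose2);
  playMoves();
}

void loop() {}
```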

I have decided to add a few pictures to highlight the building process of this robot. I started by CAD-ing it in Google Sketchup, then cut the parts from a poplar board I bought from Home Depot (it took me one afternoon to make and mount most of the parts) and used a piece of a big automotive hose clamp to fabricate the servo brackets. I used small wood screws (you can find them only at the hardware store, in packs with tiny hinges for small jewelry boxes), but I had to drill small pilot holes first so the wood would not crack. At first I used servos for driving, but they are too noisy for my ears; I hated the whine when it was running all over avoiding objects. So instead of installing a quadrature encoder, a small H-bridge and an ATtiny microcontroller inside each servo's case, I decided to get geared motors with built-in encoders: the Faulhaber motors from Electronic Goldmine. Over time the robot has gone through many small mods, and I guess it will happen again with the arms, as I am not happy with the current design. I need to re-shape them, and perhaps add one more micro servo per arm for an elbow bend or for a hand... Here are some early pictures:


I hope you'll like it!

UPDATE: (Nov. 14th)

I have redesigned the robot's head and added 2 long-range Sharp IR sensors, mounted at 90 degrees from each other and 45 degrees from the head axis. I also added the AVRcam and a LED bar to act as a mouth when the robot speaks. I will use an ATtiny to drive the LEDs, using an ADC pin to measure the voltage on the speaker; I've seen it done somewhere some time ago.

I have added a color Nokia LCD to my motor controller board. I wanted to make the robot scan using the head pan servo and send the LCD commands over I2C, but it didn't work. So I had to move the servo and the Ping sensor over to the motor controller board for testing purposes, and I finally got proper results. The colors on the LCD are still crap (for some reason this LCD is hard to set up properly), but I can display the distance and draw the pixels on the screen. Another weird thing: it seems the Ping sensor's max distance is 102-103 cm, but I didn't have time to see why.

After I got the scan properly displayed I eliminated all the delays in my code and, to my surprise, it scans madly fast! Then I made it scan left to right and right to left, with a one-second delay between directions, to be able to see the map on the screen. You can see the result in the video. I also attached the code and the NokiaLCD.lib in the zip file (change the extension from txt to zip). I had to use the SoftwareServo lib because the original Servo lib causes problems with the display. Enjoy!
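
For anyone curious what the sweep looks like in code, here is a bare-bones version of the pan-and-ping loop. It is only a sketch: the pin numbers and the 5-degree step are assumptions, and it prints to Serial instead of drawing on the Nokia LCD (the actual code is in the attached zip).

```
#include <SoftwareServo.h>   // same library the robot uses; Servo.h clashed with the display

const int PAN_PIN  = 5;      // assumed pins
const int PING_PIN = 7;
SoftwareServo pan;

// Standard Ping))) trigger/echo cycle on a single pin, result in centimetres
long readPingCm() {
  pinMode(PING_PIN, OUTPUT);
  digitalWrite(PING_PIN, LOW);  delayMicroseconds(2);
  digitalWrite(PING_PIN, HIGH); delayMicroseconds(5);
  digitalWrite(PING_PIN, LOW);
  pinMode(PING_PIN, INPUT);
  long us = pulseIn(PING_PIN, HIGH, 30000UL);
  return us / 29 / 2;
}

void setup() {
  Serial.begin(9600);
  pan.attach(PAN_PIN);
}

void loop() {
  for (int angle = 0; angle <= 180; angle += 5) {    // one sweep
    pan.write(angle);
    for (int i = 0; i < 3; i++) { SoftwareServo::refresh(); delay(20); }  // let the servo get there
    Serial.print(angle); Serial.print(" deg: ");
    Serial.print(readPingCm()); Serial.println(" cm");
  }
  delay(1000);               // pause between sweeps so the "map" can be looked at
}
```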

Nov. 17th. Another update:

I have finally received the new R-Dev-Ino boards I designed for the robot, which will split all the functions over 4 or 5 modules. I'm using I2C for communications and I have to say I'm pleased with how well it works. At the end of this week the robot will be ready for Fire Fighting, and all that will remain is to complete the mapping and vision code. Hmm, actually there are more things to do afterwards...
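
To give an idea of the module-to-module traffic, here is a minimal Wire example. The slave address and the two-byte drive command are made up for illustration; they are not necessarily the protocol the R-Dev-Ino boards use.

```
#include <Wire.h>

// On pre-1.0 Arduino cores the calls are Wire.send()/Wire.receive() instead of write()/read().
const byte MOTOR_MODULE = 0x10;    // illustrative slave address

// Master side (the "brain" module): send left/right speed bytes to the motor module
void sendDriveCommand(byte leftSpeed, byte rightSpeed) {
  Wire.beginTransmission(MOTOR_MODULE);
  Wire.write(leftSpeed);
  Wire.write(rightSpeed);
  Wire.endTransmission();
}

void setup() {
  Wire.begin();                    // master joins the bus with no address
  sendDriveCommand(128, 128);      // e.g. 128 = stopped, above/below = forward/reverse
}

void loop() {}

/* Slave side (separate sketch on the motor module):
#include <Wire.h>
volatile byte leftSpeed = 128, rightSpeed = 128;
void receiveCmd(int n) {
  if (n >= 2) { leftSpeed = Wire.read(); rightSpeed = Wire.read(); }
}
void setup() {
  Wire.begin(0x10);                // join the bus as slave 0x10
  Wire.onReceive(receiveCmd);
}
void loop() {}                     // drive the H-bridge from leftSpeed/rightSpeed here
*/
```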

Nov. 27th.

After the Fire Fighting competition my next challenge is mapping. But until then, I want to get speech working. So far it's not intelligible, but I think I can tweak it a bit. It doesn't have to be perfect, just a bit better. Then I'm going to make the LED mouth work for a much more realistic feel. When the new Nokia color LCD shield gets here, I'll continue my mapping efforts.
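
The LED mouth idea from the Nov. 14th update (watching the speaker line on an ADC pin) could look roughly like this; the pins, the sampling burst and the scaling are all guesses:

```
const int SPEAKER_SENSE = A0;                  // speaker signal fed to an analog pin (assumed wiring)
const byte MOUTH_LEDS[] = {2, 3, 4, 5, 6};     // the LED bar, placeholder pins
const byte NUM_LEDS = sizeof(MOUTH_LEDS);

void setup() {
  for (byte i = 0; i < NUM_LEDS; i++) pinMode(MOUTH_LEDS[i], OUTPUT);
}

void loop() {
  // Grab a short burst of samples and keep the peak: a crude envelope of the speech
  int peak = 0;
  for (int i = 0; i < 50; i++) {
    int v = analogRead(SPEAKER_SENSE);
    if (v > peak) peak = v;
  }
  // Louder speech lights more of the bar
  int lit = map(peak, 0, 1023, 0, NUM_LEDS);
  for (byte i = 0; i < NUM_LEDS; i++) {
    digitalWrite(MOUTH_LEDS[i], i < lit ? HIGH : LOW);
  }
  delay(30);
}
```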

So here's a new video demonstrating the Voice and Speech. 

Update: Dec. 31st 2009.

I have attached the code for the speech controller, which runs on a Mega328 with the Arduino bootloader installed, on one of my R-Dev-Ino modules. You need to download the SpeechControllerzip.txt file and the TTSzip.txt library, rename the files to change the extension from .txt to .zip, unzip them, copy the library into the proper place and put the SpeechController code wherever you keep Arduino sketches.

Update: Jan. 30th 2010.

I've decided to change the two HXT900 servos from my robot's neck and I got some Turnigy TGY-S3101S mini servos, a bit bigger and a bit stronger. After taking the robot apart half way to remove a body part that I needed to cut to fit the new servo, bend in weird ways the servo bracket to fit the new lenght I managed to get it all back together and it was ready for the test. I loaded the ScanMap code on the micro and my jaw dropped in awe!!! The head moved perfectly, jitter free, 180 perfect degrees and SILENT like it was some sort of a stealth robot... I poked the head to tilt it, came back smoothly, with no extra oscillations... OMG!

I also took sensor measurements every 5 cm from 20 cm to 250 cm and had Excel come up with a new equation, so my Sharp sensor readings now fall right on top of the Ping sensor readings, and perfectly on top of each other (I have 2 Sharp sensors at 90 degrees to each other, so the measurements overlap for the middle portion). Great Gods of Robots! I am now ready to start the mapping stuff!
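
The equation Excel produced isn't written out above, so the sketch below only shows the general shape of such a conversion: a power-law fit from ADC reading to centimetres, with made-up coefficients.

```
const int SHARP_LEFT  = A1;   // assumed analog pins for the two long-range Sharp sensors
const int SHARP_RIGHT = A2;

// distance[cm] = A * reading^B: the usual shape for these sensors.
// A and B below are examples only; use the coefficients from your own calibration.
float sharpToCm(int reading) {
  if (reading < 10) return 250.0;            // below the usable range, clamp to "far"
  return 10650.0 * pow(reading, -0.94);
}

void setup() {
  Serial.begin(9600);
}

void loop() {
  Serial.print(sharpToCm(analogRead(SHARP_LEFT)));
  Serial.print(" cm   ");
  Serial.print(sharpToCm(analogRead(SHARP_RIGHT)));
  Serial.println(" cm");
  delay(200);
}
```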

Play Fetch, Firefighting, navigate by map using different sensors for localisation.

  • Actuators / output devices: 8 servos, 2 DC geared motors
  • Control method: IR remote, voice command, and autonomous
  • CPU: 1 ATtiny861 on a perf board; 1 ATmega328 and 3 ATmega168s, each on its own R-Dev-Ino board
  • Operating system: Arduino
  • Power source: 9.6 V NiCd, 1000 mAh
  • Programming language: Arduino
  • Sensors / input devices: motor encoders, compass, TPA81 Devantech 8 Pixel Thermal Array Sensor, Ping))), Sharp IR sensors
  • Target environment: indoors

This is a companion discussion topic for the original entry at https://community.robotshop.com/robots/show/minieric-robot

Weak coder…???

Are you crazy!!! This is amazing!


So many questions...where do I start?

That is really well made and well thought out! Cool programming, and with a teach mode too. Awesome!

Finally a Robot that acts like one
Very well done. I can picture the evolution. The assisted remote along with the programming makes it more than the typical stuff you normally see here (run and avoid). I like arms that can actually do something besides look pretty. lol You give me ideas.

Looking good! Anybody else see a miniature Johnny 5? :smiley: Just need some tracks, and some uber lasers, and you’ll be on your way to controlling the world!

I’m anxious to see some of your other designs. :slight_smile:

Cool
That thing is very cool! I have always wanted to try making a robot with arms that can pick things up.

This Bot is really nice! I like the way it captures objects with its hands! Waiting for more of it! :smiley:

Wow! I’m glad you like it! :smiley:

I joined LMR a long time ago, but because of some glitch in the matrix I never got the confirmation email and could not log in. I got pissed and forgot about it, but yesterday I saw a post on the Arduino Forum about LoMoR, and this time when I clicked the Forgot Password button I finally got the link and reset my password. Good to be here at last…

So, I have written about this robot and others (mostly Lego robots) on my blog here: http://seriousrobotics.wordpress.com - there you'll find the remote control version of the code in Arduino, all in one sketch. Since then I have split the code into libraries and tabs. I will attach the current version in a zip file, together with the necessary libraries.

Right now I am working on the motor controller, based on a Tiny2313. I wanted to be able to program it in Arduino, but the flash is too small for the code to fit, so I have redesigned it around a Tiny861. This brings another Arduino incompatibility problem: it has different timer settings and I haven't figured out how to make the proper adjustments yet. I hope I won't have to make another board with a Mega8… (that would have been so simple from the start… but no, I'm too stubborn to give up…) So for now I have the motor controller programmed in BASCOM-AVR, but working over the UART instead of the desired I2C (to make an I2C slave device you have to buy the commercial library for BASCOM, dohh). I should get over my stubbornness, program it in AVR Studio like everyone does, get the damn USI working, and throw a PID function in there with variable target, setpoint, error and k settings, so it can regulate speed, direction and distance in response to the proper I2C command. Oh boy, lots of work, and I am still not comfortable writing it down. Sigh…
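
For what it's worth, the PID part itself is only a few lines. Here is a rough Arduino-style sketch with placeholder gains and pins, and without the encoder, direction-pin and I2C plumbing:

```
// Placeholder gains and period; tune on the real motor
float kP = 0.8, kI = 0.1, kD = 0.05;
long  setpoint = 0;              // desired encoder counts per control period (set over I2C/UART)
long  lastError = 0;
float integral  = 0;

// One PID update: returns a signed drive value for an 8-bit PWM output
int pidStep(long measured) {
  long error = setpoint - measured;
  integral += error;
  float output = kP * error + kI * integral + kD * (error - lastError);
  lastError = error;
  return constrain((int)output, -255, 255);
}

void setup() {
  setpoint = 40;                 // example target: 40 counts per period
}

void loop() {
  long counts = 0;               // read the encoder delta here on the real board
  int drive = pidStep(counts);
  analogWrite(9, abs(drive));    // placeholder PWM pin; set the direction pin from the sign of drive
  delay(20);                     // fixed control period
}
```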

On the “brain” I need to focus on the FireFighting code so I can compete at CNRG in November. That's why I got the thermopile array, and it works nicely for detecting the candle flame and people (btw, I want to program the robot to play fetch with my 3-year-old daughter after I get this working). So now I have to find a way to navigate the model house by following a node map and the walls. And a way to put out the candle - oh, no blowing fan; it has to use a fire extinguisher of some sort since it has hands, right? I'll either use a mist spray or a small bicycle CO2 tire inflator, not decided yet which one will be cooler… and easier to set up.

A problem that I have with this robot is the jerkiness of the servos. They use the SoftwareServo lib and I have to make sure I call the refresh command as often as possible, which is a pain. I tried the new Servo lib based on Timer1 interrupts, but it is very sensitive to other commands that use interrupts or are time consuming (serial, I2C, delay, pulseIn, analog reads), so I had to go back to the software version. I liked that the servos were rock solid with the Timer1 lib, though, so I guess I can use it to make a nice dedicated servo controller.
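
The workaround boils down to never blocking for long and calling refresh() on every pass through loop(). A minimal example of that pattern (the pin and the timing are arbitrary):

```
#include <SoftwareServo.h>

SoftwareServo head;
unsigned long lastMove = 0;
int angle = 90;

void setup() {
  head.attach(5);                      // assumed pin
}

void loop() {
  if (millis() - lastMove > 500) {     // do the "slow" work on a millis() timer, not with delay()
    lastMove = millis();
    angle = (angle == 90) ? 120 : 90;
    head.write(angle);
  }
  SoftwareServo::refresh();            // must run at least every ~50 ms or the servos jitter
}
```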

Oh well, I should get back to work, it’s a nice Saturday morning and I hope I’ll get a chance to finish something this weekend!

Cheers!

Great work! I agree with e-square12… so many questions!

Here’s a couple to start.

  1. In the first video, how does the robot know when to drop the ball? What sensor triggers that response?
  2. Can you tell us a bit about the suction cup hands? I see wires so I imagine you have some sort of pressure sensors there?

Really tremendous work. Keep it up!

OK, I used the Ping))) sensor to trigger the object release and the Sharp sensor to trigger the object pick-up. The Ping))) sensor is also used to trigger the wave and the melody. I want to make it play a song and do dance moves at the same time, but I need to split the tasks across 2 microcontrollers.
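
In code, the pick-up trigger is basically a distance threshold on the GP2D120. The sketch below is only an illustration: the pin, the conversion constants, the threshold and the claw angles are assumed, not taken from the attached code.

```
#include <Servo.h>

const int SHARP_PIN  = A0;     // GP2D120 on the scanning arm, assumed pin
const int PICKUP_CM  = 8;      // "object between the arms" threshold, assumed
Servo leftArm, rightArm;       // placeholder pins/angles below

// Common rough conversion from ADC counts to centimetres for the GP2D120
int gp2d120Cm() {
  int v = analogRead(SHARP_PIN);
  return v > 3 ? 2914 / (v + 5) - 1 : 80;
}

void setup() {
  leftArm.attach(9);
  rightArm.attach(10);
}

void loop() {
  if (gp2d120Cm() < PICKUP_CM) {   // something within reach: close the "claw"
    leftArm.write(60);
    rightArm.write(120);
  }
  delay(50);
}
```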

The suction cups have a hole in the center and a tiny push button inside to detect the object, but they are too stiff and the pressure is not strong enough to trigger them. I chose suction cups so I can pick up the ball more easily. I want to try a force-sensing resistor or a piece of anti-static foam I saw on this site somewhere. But right now that is on the sidelines.

Anything else?

Nice! I like your idea about teaching the robot using your remote control. I have considered updating the software, or at the very least certain variables, in my robots using infrared. Thanks for posting the code. I haven't looked at it yet, but I'm sure it will give me some good ideas.

Piezoelectric Force Gauge
Google it; there are little flexible piezo pads whose resistance drops as force is applied. Perfect for object feedback…

waist servos?

Those second arm servos, the ones near the base: I assume they are strong enough to lift the arms? Are they high-torque?

The waist servo is a standard size GWS S03T 2BB (8 kg/cm), the shoulder servos are mini size Vigor VS-3 (2.5 kg/cm), and all the other servos are micro size HXT900 (1.6 kg/cm). I haven't tested the max weight the arms can lift, but they don't have much strength to grip an object (the arms are a bit too long).

Very nice robot you have here!! :slight_smile: Keep it going :wink:
What do you mean when you say you are a weak programmer?? :stuck_out_tongue:

Thanks for the encouragement!

I've always built nice and mechanically functional robots, but I was never happy with the programs I managed to write. I've seen robots that didn't look great or didn't have much functionality, yet performed so well, which tells me good programming skills went into them. I am trying to get there, but for me it's a long process, as I don't have the time I would like to learn programming properly. I'm mostly improvising, building on bits of other people's code that I put together and adapt to my needs. I guess I'll get there eventually; that is one of the main purposes of this robot, to learn the programming part. That's why I want to incorporate every function possible into this robot, pushing my limits further and further…

Being a weak programmer is one of my flaws too. As an artist/designer, programming complex behaviors is sometimes frustrating, but I never give up :wink: and it is always fun to see what we get, and to feel that each time we reach a higher level :slight_smile: (sorry for the crappy English)

The robot can now scan for objects and display the map as all of you saw in the LoMoR project. And it’s fast! Can’t wait to see how the Sharp sensors will work… Watch the new video! Enjoy!

This little guy is so well designed. I love the solution you came up with for the manipulator. I'm very impressed and inspired!

How much did this cost to fix it???