Low dropout voltage regulator LM2940 for microcontroller, EMIC 2 text-to-speech module, color sensor, LED eyes/mouth, Sharp IR distance sensor, LCD and PS/2 keyboard
6V voltage regulation for servos via a voltage-drop diode array
Pin header connectors for 4 servos
Pin header connector for Sharp IR distance sensor
EMIC 2 text-to-speech module
Reset push button
Head:
2 DOF (pan/tilt)
RGB LED eyes
LED mouth
Small custom made LED driver PCB
Sensors/Interfaces:
Color sensor based on TCS3200 with collimating lens
Sharp IR distance sensor
Connector for PS/2 keyboard
Output devices:
Loudspeaker 8Ω, 0.5W/EMIC 2 text-to-speech module
16x2 serial LCD blue
Propulsion:
2 modified HITEC HS645MG servos
2 Lynxmotion servo wheels - 2.63" x 0.35"
2 Caster wheels
Battery:
7.2V/3200 mAh/NiMH
M.I.A. is completely open source. I will publish the Gerber files for manufacturing the shield and the LED driver PCB later; the SketchUp 3D drawing is already attached. Currently I am building a prototype of M.I.A. and developing example code.
After development is finished, I might try to sell M.I.A. as a kit and support a public-welfare foundation with the proceeds (for instance, cancer research).
The assembly instructions are listed at the end of this post.
Hi MarkusB. I like your style. This design is very nice and this project looks promising. I think “social robot”, as you described it, is the way to evolve our robots. I’ll keep an eye on it.
I love the idea of more open-source robot models. The only one I think I’ve seen here so far was K12’s Bob model. Forgive me please if I’ve forgotten anyone. If I could suggest something, though: a head would be a good addition for a social robot. Just sticking the Sharp sensor on the top leaves it looking like a box of electronics. A social robot needs to create a feeling of attachment to whoever it interacts with. A head could take any form, and it doesn’t even have to have a neck or swivel, but the main ingredients are eyes and a mouth. I’m looking forward to seeing how this one develops, and any copies or improved versions inspired by it.
The dev kit that Adafruit has (http://www.adafruit.com/products/1316) is a rather large board; the kit is $10 more expensive and probably saves very little power.
It gets a humanoid feeling even though it is based on a basic two-wheel drive. What do you think is the main purpose of having the color sensor on the chest? I couldn’t figure out much use for it, because objects must go to the sensor before their color can be read?
I think it would be nice to have an easy mounting option for some common obstacle-avoidance sensors around the body. Obstacle avoidance seems to be something almost everybody wants to try sooner or later.
I have no experience with omniwheels and I don’t know how well they perform on different surfaces. I think a lot of the robot’s weight rests on this one wheel; could that be a problem? Just something that came to my mind.
Thanks for your input. I have some ideas regarding the color sensor. One idea is to use a perceptron algorithm to teach the robot color recognition. Another idea is for swarm bot use: the robots hold colored plates in their ‘hands’. If one robot encounters another robot, they move the color plates up to the color sensor position. Depending on each other’s colors, they start different behaviors.
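To illustrate the perceptron idea mentioned above, here is a minimal sketch in Python of teaching a single-layer perceptron to separate two color classes from normalized RGB readings. The training data, thresholds, and function names are all illustrative assumptions, not taken from the actual robot code.

```python
# Minimal perceptron sketch: classify RGB readings (e.g. normalized output of
# a TCS3200 color sensor) into two color classes.
# All training samples and names here are illustrative.

def train_perceptron(samples, labels, epochs=20, lr=0.1):
    """samples: list of (r, g, b) tuples in 0..1; labels: 0 or 1."""
    w = [0.0, 0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            activation = sum(wi * xi for wi, xi in zip(w, x)) + b
            pred = 1 if activation > 0 else 0
            err = y - pred
            # Classic perceptron update: shift weights toward misclassified sample
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

def predict(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

# Toy training set: class 1 = "red-ish", class 0 = "green-ish"
samples = [(0.9, 0.1, 0.1), (0.8, 0.2, 0.1), (0.1, 0.9, 0.2), (0.2, 0.8, 0.1)]
labels = [1, 1, 0, 0]
w, b = train_perceptron(samples, labels)
print(predict(w, b, (0.95, 0.05, 0.1)))  # prints 1 (classified as red-ish)
```

On the robot itself this would run on the microcontroller in C, but the update rule is the same few lines.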
The omni-wheel can withstand a payload of 5 kg or more - no problem. It also performs well on different smooth surfaces.
A mounting option for different sensors is a good idea. I have thought about adding a mounting plate on the front for IR and ultrasonic rangers, bumper sensors, line-following sensors, etc.
Besides my comment below I will give some thought to that project.
I like the idea of open source. Are you considering having a set of nice pictures taken? If so, then please send your robot to Shanghai. Ok, now to your robot: I am missing a cam. For a social robot you could have a cam with face recognition…something like chickenparmi did. Besides that, I am again impressed by your ideas. But what about your other projects? Pending? Canceled?
Actually, I just realised I didn’t ask: what does M.I.A. stand for? The shield looks good. How’s the EMIC 2 output quality? Are there any examples of its sound anywhere to listen to?
I hope you did not disassemble one of your daughters toys to get those arms.
The robot looks just awesome. Most of the time the initial sketch looks different from the final robot; however, it’s looking great. Those arms give him a nice and friendly appearance. When will you show him in action?
Regarding the LCD, is it necessary to type data in a specific format, or does the brain of the robot actually understand English? The bot looks awesome. I was thinking of making an insect bot with an LCD; yours, which is of course well ahead of my idea, reminded me of that. Looking forward to the finished stuff.
I bought the toy at Carrefour for little money. My daughter never saw the toy assembled.
I think I need a few weeks more. Currently I am working on two new ideas:
- Face recognition by color sensor (skin color, hair color, eye color)
- Remote control by PS2 controller with a memory function, so you can teach the robot some movements.
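The memory-function idea above can be sketched as recording snapshots of servo positions while the robot is driven by remote, then replaying them. This is only an illustrative Python sketch with servo I/O stubbed out; the class and method names are hypothetical and not from the robot’s firmware.

```python
# Sketch of the "memory function": record a sequence of servo positions
# while the robot is driven by remote, then replay the sequence.
# Servo I/O is stubbed out via a callback; all names are hypothetical.

class MoveRecorder:
    def __init__(self):
        self.frames = []  # each frame: a tuple of servo angles

    def record(self, angles):
        """Store one snapshot of servo angles (e.g. pan, tilt, left, right)."""
        self.frames.append(tuple(angles))

    def replay(self, set_servos):
        """Play back the stored frames through a servo-writing callback."""
        for frame in self.frames:
            set_servos(frame)

rec = MoveRecorder()
rec.record([90, 45, 0, 0])     # head centered, tilt 45 deg, wheels stopped
rec.record([120, 45, 90, 90])  # pan right while driving forward
played = []
rec.replay(played.append)      # stub "servo writer" just collects the frames
print(played)  # [(90, 45, 0, 0), (120, 45, 90, 90)]
```

On the real robot, the callback would write to the servo outputs (with a delay between frames) instead of appending to a list.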
Later, after development is finished, I might try to sell the robot as a kit. I don’t want to earn any money; the little margin I might make, I want to donate to cancer research for some private reasons.
The data format the LCD displays is chars; the text processing is done via Strings. The robot does not understand English. It just searches for keywords in the text you type and prepares a random answer from a given set of responses. Currently every keyword has 3 possible responses. If no keyword is found, the algorithm outputs a kind of response that the user will most likely answer with something in which a keyword is found again — the robot sort of tries to manipulate you. If you repeat yourself, the robot recognizes this as well and outputs an answer like “Please don’t repeat yourself”.
I have already explained the math-solving algorithm in my post. I have attached draft code for the chat section of the robot if you want to take a look.