AVA

She keeps looking better and better! 

Do you have something written for the arms? Or will you have to start from scratch with that part of the code?

I like the ears; they are a lot more functional than what I’m planning for Andar.

I wanted to ask what motors you used to replace the default ones in your Lynxmotion chassis. I think I’ll need to do the same, so I can get motors with encoders and more power, since Andar can’t get over a simple flip-flop…

I also started working on my sonar ring! I hope you like it (comments and critiques are welcome): https://m.facebook.com/nahueltaibo/albums/10153692633464931/

re: looking better

Thank you!  I like my robots to look good!  When she awakens later this year, I’ll tell her you said so.

I don’t have any code yet for the arms. I’m using a Lynxmotion servo controller board that I will send strings to through a serial connection. A single string can contain commands to move multiple servos and time them all to complete at the same time.
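
For reference, here’s a minimal sketch of what one of those strings can look like from an Arduino. I’m assuming the board is Lynxmotion’s SSC-32 (the exact model isn’t named above); in its protocol, each “#&lt;ch&gt; P&lt;µs&gt;” pair sets a servo target, and a trailing “T&lt;ms&gt;” makes every servo in the group finish at the same time.

```cpp
// Minimal sketch, assuming the controller is Lynxmotion's SSC-32.
#include <SoftwareSerial.h>

SoftwareSerial ssc32(10, 11);  // RX, TX -- pin choices are arbitrary here

void setup() {
  ssc32.begin(9600);  // match the baud rate set by the SSC-32 jumpers
  // Move servo 0 and servo 1 to new positions, both arriving in 1500 ms.
  ssc32.print("#0 P1750 #1 P1250 T1500\r");
}

void loop() {}
```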

On the server side, I plan to create new gesture memories that store a series of associated serial commands for a given robot, as the server is set up to support multiple robots. I also plan to associate existing memories with the new gesture memories and send these gestures down to the robot when applicable. The Android will then route these gestures to the servo controller (through an Arduino, which will simply hand them off). This approach means I will be able to create and modify gestures and associate them with other behavior without touching any code, simply by going to the Lucy website and filling in some forms. I want to have a lot of gestures (arms, ears, head, and face together) coordinated with speech, as one of the things this robot will do is standup comedy…hopefully at SXSW in March.
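
To make that concrete, a gesture record might be shaped something like this. This is a hypothetical sketch: the real store is C#/SQL Server on the Lucy server, and every field name here is invented for illustration, not taken from the actual schema.

```cpp
// Hypothetical gesture record -- names invented for illustration.
struct GestureMemory {
  int         gestureId;      // unique id for this gesture
  int         robotId;        // the server supports multiple robots
  const char* name;           // e.g. "shrug" or "ear_perk"
  const char* servoCommands;  // servo controller string(s) to replay
  int         durationMs;     // total time the gesture takes
};
```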

Gestures are really just a start; in the long run, I want to be able to use the arms to grab things. This will require the ear cameras to give the 3D location of an object to grab. I hope to accomplish that in 2016, but it will be a huge challenge. I have a friend who works on learning routines that I think could be suitable for learning to use the arms to grab things.

For the motors, I’m using the Lynxmotion GHM-13 motors. They have the same 50:1 reduction as the stock motors, but much higher stall torque and somewhat higher RPM. As a result, they are heavier. I wanted the torque.

Your sonar ring is looking really good.  You got your ring in tighter by putting it in the neck area.  I can’t do that because of the articulated neck.  I look forward to seeing it in action.

Nice ears you have there, Martin. I have a question: what is the ears’ actual functionality?

----------------------------------------------------------------------------

Is Project Lucy in the cloud? Can I use it as an API?

---------------------------------------------------------------------------

Quote: "I plan to utilize her for entertainment and to voice control my TV and household lights"

Generally, smart homes can be controlled using frequencies from around 315 MHz to 434 MHz. Are you planning to use an RF transmitter, or something else?


---------------------------------------------------------------------------

Speaking of frequencies, I have an idea: what if your robot could log your car remote’s signal (usually around 433 MHz) and then decode it? The next time, you could instruct Ava to lock/unlock your car or turn it on or off (using a 433 MHz transmitter).

By the way, I guess you have heard of the device called an “RTL-SDR”; a robot could use one to capture those frequencies.

 

re: sw0rdm4n

Thanks.  My plan for the ears is to put two sensors in each ear…

Video Camera:   https://www.robotshop.com/en/pixy-cmucam5-image-sensor.html

Sound Sensor:  https://www.robotshop.com/en/sound-sensor.html

I will have to scale the ears up a bit and reprint them, as these sensors won’t fit now. The camera sensor pre-processes video at 50 frames per second and passes on the width, height, and x and y coordinates of the color blobs it detects. It can scan for seven different colors and track many objects at the same time.
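
Reading those blobs from an Arduino with the standard Pixy library looks roughly like this (a sketch of the library usage, not code from Ava):

```cpp
#include <SPI.h>
#include <Pixy.h>

Pixy pixy;

void setup() {
  Serial.begin(9600);
  pixy.init();
}

void loop() {
  uint16_t count = pixy.getBlocks();  // fetch the latest detected blobs
  for (uint16_t i = 0; i < count; i++) {
    // Each block carries the color signature (1-7), center, and size.
    Serial.print("sig=");  Serial.print(pixy.blocks[i].signature);
    Serial.print(" x=");   Serial.print(pixy.blocks[i].x);
    Serial.print(" y=");   Serial.print(pixy.blocks[i].y);
    Serial.print(" w=");   Serial.print(pixy.blocks[i].width);
    Serial.print(" h=");   Serial.println(pixy.blocks[i].height);
  }
}
```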

Each ear will be mounted on a servo to let it sweep from directly forward to directly aft. For now, I’m cancelling my plans to put an elevation servo in each ear; however, the entire head can elevate. Together, the two ears will cover 360 degrees. I have a lot of plans for how to use this information to improve situational awareness, localization, and obstacle avoidance, just to name a few. Ultimately, I want to use the two cameras to give the robot depth perception when looking forward at specific colored objects. I hope to eventually use this information to allow the arms to grab things.
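
For the depth-perception part, the usual parallel-camera formula applies: distance = focal length (in pixels) × baseline ÷ disparity. Here’s a toy sketch of that idea (my illustration; the focal-length estimate comes from Pixy’s 320-pixel width and roughly 75° field of view):

```cpp
// Toy stereo-depth sketch: the same colored object appears at slightly
// different x positions in the two (forward-facing, parallel) ear cameras.
// focal_px ~= (320 / 2) / tan(75deg / 2) ~= 208 for the Pixy.
float depthFromDisparity(int xLeft, int xRight, float baselineMeters) {
  const float FOCAL_PX = 208.0;     // rough estimate, see above
  int disparity = xLeft - xRight;   // pixels
  if (disparity <= 0) return -1.0;  // bad match or object too far away
  return FOCAL_PX * baselineMeters / disparity;
}
```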

I plan to allow the ears to scan around while the robot’s face is focused on the person it is talking to.  If the sonar detects something closer in a particular vector, the ear can rotate to that vector to get a more detailed picture.
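
A minimal sketch of that ear-pointing behavior (pin number, ranges, and the bearing convention are my assumptions, not from the actual build):

```cpp
#include <Servo.h>

Servo earServo;  // one ear; sweeps from straight ahead to straight aft

void setup() {
  earServo.attach(9);
}

// bearingDeg: 0 = directly forward, 180 = directly aft, on this ear's side.
void pointEarAt(int bearingDeg) {
  bearingDeg = constrain(bearingDeg, 0, 180);
  earServo.write(bearingDeg);  // servo range happens to match the bearing
}

void loop() {
  pointEarAt(120);  // e.g. the sonar ring reported a close echo at 120 deg
  delay(500);
}
```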

While all this is going on, I believe I can use a memory of where objects were seen in the past to localize to within a few inches in a room, probably on the Android. I did this with Anna and OCR, so it seems even more feasible with the Pixy camera.
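
As a toy sketch of the underlying math (my illustration, assuming the robot has absolute bearings to two recognized landmarks at known positions, e.g. from the ear angle plus a compass heading): the two bearing rays intersect at the robot’s position.

```cpp
#include <math.h>

struct Point { double x, y; };

// lmA/lmB: known landmark positions. bearingToA/B: world-frame bearings
// (radians) from the robot to each landmark. Caller must reject the
// near-parallel case (det close to zero).
Point localize(Point lmA, double bearingToA,
               Point lmB, double bearingToB) {
  double ax = cos(bearingToA), ay = sin(bearingToA);
  double bx = cos(bearingToB), by = sin(bearingToB);
  // Robot P satisfies P + t*(ax,ay) = lmA and P + s*(bx,by) = lmB.
  double dx = lmA.x - lmB.x, dy = lmA.y - lmB.y;
  double det = ay * bx - ax * by;
  double s = (ax * dy - ay * dx) / det;
  return { lmB.x - s * bx, lmB.y - s * by };
}
```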

Because the bottom half of the Pixy sensor is flat, I believe I may be able to cram the sound sensor into each ear as well, directly in front of the Pixy without blocking its camera. I have never worked with sound sensors, so this area is totally new for me.

I think these ears will turn out to be the most useful sensors on the bot. I’ll still have the Android camera and thermal cameras facing forward with OpenCV (processing at a slower frame rate), and with OpenCV I’ll have a lot more flexibility to program whatever I want.

I should have my Pixy sensors by next week.  I hope to have the ears redesigned around them in the next few weeks.

Lucy runs on my home server as an API and supporting web site.  You can access them both and use them from your robot as well.  Send me an email to [email protected] and I’ll send you instructions.

For TV control, I was going to start with IR. I have done a lot of work with IR before, so it is nothing particularly new. I don’t really have a smart home yet, just some Philips Hue lighting and Wink stuff, which I have managed to control through an API. I don’t know anything about using the frequencies or devices you mentioned yet, but I would love to learn if you have time to share anything. I have some XBee Pro stuff lying around which I might cram into the rear of the body or head, but I have never used it.
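
For what it’s worth, the IR side really is only a few lines from an Arduino with the common IRremote library. This is just an illustration, not code from this build; the value shown is a widely published NEC power-toggle code, used purely as an example.

```cpp
#include <IRremote.h>

IRsend irsend;  // on an Uno, IRremote transmits on pin 3

void setup() {
  irsend.sendNEC(0x20DF10EF, 32);  // send one 32-bit NEC code
}

void loop() {}
```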

awesome idea

At first I thought the ears were just for decoration. Now I get it; your idea is brilliant. I never imagined you would use the ears for something like that.

-----------------

Thank you very much. I will send the email soon; I am willing to try the API you provide with Julia (one of my robots).

-----------------

Oh, so you will use it for TV control and the Philips lamps only; I thought you would use it for a complete smart home system. I don’t have much experience with RF yet, as I’m still in the learning phase, but when I have spare time I will post about it on LMR. (Actually, I don’t think I’ve ever made any useful contributions to LMR, which is regrettable; I plan to contribute some later.)

 

Best regards

Antonius

Hi Nahuel, I have sent a friend request to your Facebook account.

Accepted!

Hey, it would be great if you posted your robots here on LMR, so we know what you are up to. It’s also really motivating when people give you feedback on them; at least it was for me :slight_smile:

Cheers!

Nahuel

Thank you. Of course, I will post a robot when I think the time is right.

quote: "One of the

quote: "One of the arms…might have to remove one joint if too heavy."

No, don’t remove it. Keep it at 4 DOF at least; if it’s too heavy, you can add a pulley (powered by a DC motor) or swap in higher-power / higher-torque servos.

 

Reasons:

- the human arm has 7 DOF

- humanoid robots with arms usually have at least 5 DOF

- a robot arm can type on a keyboard with 3 DOF, but it works more smoothly with 4 DOF

- a lack of DOF means stiff arm movement

re: DOF in Arms

I agree with you.  In the pictures, the arms have 4 DOF each.  I definitely won’t reduce that.  I’ll probably add an additional joint to each to take it to 5 DOF, as long as the servos are strong enough to support full extension of the arms and the overall bot doesn’t have a weight issue.  It’s easy to add joints as I’m using the EZ-Robot servos and brackets…I can just slide them on.  Coordinating all those servos will be the difficult part for me.

Amazing as always!

How can you go so fast! I envy you! (In a healthy way :slight_smile: )

What did you use to connect the Arduino and Android? USB host, USB accessory, or some other option?

Re: Amazing as always…

Thanks Nahueltaibo!

The software upgrades took me a couple of days straight, as I had most of the software written in Eclipse on prior versions of Android and OpenCV. OpenCV changed quite a bit, so that was a speed bump, as was learning Android Studio.

The 3D printing side is taking forever though; I spend hours or sometimes days getting one part right. The head is 10 parts so far, without the ears. My strategy is that if I design the parts to the overall dimensions I want and keep them modular, I can later redesign them, add detail, or make parts curved, and reprint just those pieces. We’ll see.

I hope I’m answering your question right: the Arduino acts as the host, and the Android acts as an accessory. In the Arduino code, I use Usb.h and AndroidAccessory.h. I pretty much haven’t touched that code in 3 years. I send 3 different types of messages over USB that pass different-sized data packets.
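
For anyone wanting to try the same setup, a skeleton of that host/accessory arrangement looks roughly like this. The identification strings and the one-byte message-type framing are illustrative, not Ava’s actual code.

```cpp
#include <Max3421e.h>
#include <Usb.h>
#include <AndroidAccessory.h>

AndroidAccessory acc("MyManufacturer", "MyModel",
                     "Robot link", "1.0",
                     "http://example.com", "0000000012345678");

void setup() {
  acc.powerOn();  // bring up the USB host shield
}

void loop() {
  byte msg[16];
  if (acc.isConnected()) {
    // One simple way to frame several message types of different sizes:
    // first byte is the type, the rest is the payload.
    int len = acc.read(msg, sizeof(msg), 1);  // third arg is the NAK limit
    if (len > 0) {
      byte reply[2] = { msg[0], 0x01 };  // ack the received type
      acc.write(reply, sizeof(reply));
    }
  }
}
```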

Thanks again,

Martin

I’m glad to see AVA growing so fast; you are pretty efficient! I spend a lot of time on my project, but it doesn’t go as fast as expected :wink:

I hope you will share videos soon!

Hello, Martin!

I’m having difficulty finding good DC motors for tank treads for a project I’m planning. Actually, good motors are easy to find… but ones with good speed and torque that are also cheap, not so much.

So I would like to know the stall torque and RPM of your DC motors (or a link to them, if you have one…). Sorry if you have mentioned this already.

Thank you!

re: Dickel

Hi Dickel,

Let me preface this with the fact that I am at best a novice with motors, and I am worse at finding anything cheap.  I can tell you the specs on what I have been using.

All my published bots use 12 V motors. Anna used the default motors that come with the Lynxmotion Tri-Track, model GHM-02. They have 8.8 kg-cm of stall torque, and you can buy them separately for $22 apiece. At times she had some problems doing good skid turns on carpet. I thought it was the motors and my power supply system, but after digging through some code today, I think I may have had a software problem. There are so many situations where more delicate throttle control is needed.

For Ava, one of my bigger concerns was the additional weight from all the added servos and 3D-printed body parts. I wanted more power…don’t we always want more? …so I decided to get the Lynxmotion GHM-13s instead, with 16.7 kg-cm of stall torque. They are $30 each. I hope the additional torque isn’t going to break the sprockets should they jam. Both motors are 50:1, but Ava’s also have higher RPM (152 vs. 120), at the cost of higher weight.

With Ava’s additional torque, speed, and ground clearance, I think driving outdoors in grass a few inches high will now be a lot easier (Anna could get slowed down by grass dragging on her low undercarriage).

I just swapped out the motor controller with a new one tonight, took her off her stand, and got her moving freely around the room for the first time ever just a few minutes ago. I should learn a lot more this weekend as I finally get to test her driving around at something close to her full expected body weight…which I still need to measure. My first impressions are that the new motors are going to have more than enough power, and that I’m going to need to rewrite my throttle code so it accelerates and decelerates gradually instead of stopping or going to full throttle all at once…to reduce stress on the neck!
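
A minimal sketch of that kind of throttle ramp (pin number and step size are made-up values for illustration): instead of jumping straight to the commanded speed, the output moves a small step toward it on every tick.

```cpp
const int MOTOR_PWM_PIN = 5;
int currentSpeed = 0;     // 0..255 PWM duty
int targetSpeed  = 0;     // set by the drive logic elsewhere
const int RAMP_STEP = 4;  // larger = snappier, smaller = gentler

void setup() {
  pinMode(MOTOR_PWM_PIN, OUTPUT);
  targetSpeed = 200;  // example command: accelerate to ~80% throttle
}

void loop() {
  if (currentSpeed < targetSpeed) {
    currentSpeed = min(currentSpeed + RAMP_STEP, targetSpeed);
  } else if (currentSpeed > targetSpeed) {
    currentSpeed = max(currentSpeed - RAMP_STEP, targetSpeed);
  }
  analogWrite(MOTOR_PWM_PIN, currentSpeed);
  delay(20);  // ramp tick: zero to full throttle in about 1.3 seconds
}
```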

If I learn anything interesting, I’ll let you know.

re: LordGG

Thanks. It felt so slow in the beginning, all those hours designing and printing parts. I’m spending several hours every day lately, sometimes 10-12. The reason is, I have a deadline: Ava is going to participate in a panel discussion with humans on robot comedy at SXSW in March, so I have no time to spare.

I hope to do some videos soon.

Hello Martin

You are the bot master!! How you can design, print, and program so fast is beyond me. I have picked up several kids’ Power Wheels vehicles; my 30+ lb grandson rides the large one around for hours at a time, and it goes for weeks before needing a recharge. So I am thinking of designing a bot using some of those motors and gearboxes. Two 12 V batteries out of discarded UPSes (and I have LOTS of those) should last a good while, and torque should not be a problem. I am not happy with my current printed track base, so it’s back to the drawing board.

Cheers.

re: rz90208

Thanks! There has been at least one member here who has built a bot using Power Wheels, probably several. It’s a nice way to go, I think, if you are building a big bot and still want it not to hurt somebody if it runs over their foot or bumps into them. I thought about building one too, but thus far I have stayed in the 10 lb bot size.

I’d love to see your printed track base ideas.

 

Storage of data collected

First, I’d like to say that what you’ve done with Anna and Ava is out-of-this-world amazing. I was wondering how much storage capacity you have for these robots: how much total storage (cloud and hard drive space on your server) are they using to be this smart? Basically, in a full day of learning, how much data does one store to remember what it learns? And how much storage is built into the robots? Again, I commend you on your work.

re: storage

Thanks for the kind words.  Building robots and brains for them is a labor of love.

Currently, the DB is 132 megabytes. The database for OpenNLP is separate…I don’t have a size for that, but I could look it up.

It’s hard to quantify a full day of learning; she could easily add many megabytes in a single day if I asked her to learn things from the web that she didn’t already know. I don’t do that very much, because she already knows too much for her own good…think of a 5-year-old trying to learn about something suitable for medical doctors and then talk about it…not really good or fun. I would think a day of casual interaction or verbal learning would only add a small fraction of a MB…we’re talking double-digit KB maybe, if that.

She does remember each verbal input/output exchange as a unique memory. Each memory has up to 18 integers and 4 strings, most of which are unused most of the time.
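
For a sense of scale, one of those records might be shaped something like this. Only the counts (up to 18 integers and 4 strings) come from the paragraph above; the names and layout are my invention, and the real store is C#/SQL Server.

```cpp
struct VerbalMemory {
  long        id;
  int         ints[18];    // ids, counters, links to other memories
  const char* strings[4];  // e.g. the raw input and the response text
};
```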

I don’t do any permanent storage of all the incoming sensor data and how it changes over time, as the storage required would grow huge very quickly. Right now, the latest values are cached in memory for currently open “sessions”. I may have to change this in the future to explore applying some machine-learning techniques to the sensor data.

There is some file storage for pictures that are taken on request or when the lasers are fired; the latest video images are stored only in memory.

The robots have the Droid phones, which have SQLite installed. I did a lot of exploratory work with it in the beginning, but am not using it much right now. My storage and memory footprint on the phone is very small (other than what OpenCV uses). Ava has a much better phone installed, so I am re-evaluating the question “Where is it best to run the code?” right now. Once I get all her new sensor capabilities running, if there is CPU power left over (and I think there will be a lot), I have been toying with the idea of trying to pull down and run the key portions of the server brain on the phone instead. The problem is…another huge code rewrite, from C# and SQL Server to Java and SQLite, and a lot of data to push down. We’ll see.