AVA

re: droidbuilder

Thanks DB,

The Panda works pretty well when Windows isn’t doing something else in the background.  I have that problem with both Android and Windows; I wish each would provide a way to control and limit all the background sideshows.

Debugging is where I currently still have challenges with the Panda, but in fairness, debugging pushes my regular PCs close to the edge, and over, at times.

 

re: Frannie

Thanks Wayne,

The ears seem to be a hit at events.  People love their pets.  The ears seem to give people a lot of the same initial emotional responses as seeing a pet would.  I still need to hook up the sensors and make them functional though, after all, the reason I put them on was to give Ava some more sensors on swivels to increase situational awareness to either side.  I originally wanted to put a pixycam in each ear but that didn’t work out due to the size.

Cheers,

Martin

LattePanda looks good, isn’t

LattePanda looks good, isn’t it designed to run Windows?  Can it boot Linux?

re: LattePanda running Windows

Yes, it is designed to run Windows.  I don’t know if it can run anything else.  I was using it to run the software I had written on a Windows server (.NET, SQL Server).  It worked, but I still prefer to work using a server until I can refine the software enough to incorporate the new sensors for Ava and get the Panda to run onboard.  I may end up ditching the Panda, as I need a way to remote-control everything, and a touchscreen laptop (running the server software) works really well for that.

I finally got all the new boards and sensors communicating, mostly through I2C.  I hope to close up the head soon and for a very long time.

You have been a busy guy!

You have been a busy guy!  She has really turned into something very special and unique.

Tell me more about Ava the web and database developer.

This has been done many times before, but it never really seems to take off.  I am curious as to how you are doing it and what makes this better.  Always looking for a better mousetrap.

Regards,

Bill

re: nhbill

She has a system database with, among other things, a few dozen tables where she remembers data about data structures and user interfaces.  When she syncs, she reads the system tables, primary keys, foreign keys, etc., and “remembers” them in her system database.  From this, she is able to create web applications on the fly that can have search, forms, child forms, etc.  An administrator can then use the app and modify any of her metadata…changing labels, widget characteristics, security, and so on.  Within an hour or two, one can change all the settings to put the polish on all the “virtual” pages in the “virtual” app on top of a typical DB of around 10-20 tables.  One can apply icons, create relations in the data where they do not exist explicitly in the DB, etc.  There are about 50 meta properties on each entity and around 65 on each field.  You can do search, add, edit, view, copy, delete, and reorder operations on anything with zero coding…and, my favorite part, deal with lots of one-to-one and many-to-many relationships, in collapsible/expandable sections with a grid for each relation.
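I don’t have Martin’s .NET/SQL Server code, but the sync step he describes (reading the system catalog and “remembering” it as metadata with default UI properties) can be sketched roughly like this. This uses Python’s built-in sqlite3 catalog rather than SQL Server’s, and every name here is my invention, not his:

```python
import sqlite3

def sync_schema(conn):
    """Read the system catalog and 'remember' tables, columns, and keys
    as metadata that a generic UI can render and an admin can override."""
    meta = {}
    tables = [r[0] for r in conn.execute(
        "SELECT name FROM sqlite_master WHERE type='table'")]
    for t in tables:
        cols = conn.execute(f"PRAGMA table_info({t})").fetchall()
        fks = conn.execute(f"PRAGMA foreign_key_list({t})").fetchall()
        meta[t] = {
            # every column starts with default UI properties (label, widget,
            # visibility, ...) that an administrator can later change
            "fields": {c[1]: {"label": c[1].title(), "widget": "textbox",
                              "visible": True} for c in cols},
            "primary_key": [c[1] for c in cols if c[5]],  # pk flag column
            # (from_column, referenced_table, referenced_column)
            "foreign_keys": [(fk[3], fk[2], fk[4]) for fk in fks],
        }
    return meta

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("""CREATE TABLE orders (id INTEGER PRIMARY KEY,
                customer_id INTEGER REFERENCES customer(id))""")
meta = sync_schema(conn)
```

A generic search/add/edit page then renders itself from `meta` instead of from hand-written forms, which is what makes the “zero coding” part possible.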

I needed all this because I have so many memory types to maintain.  Now that I have it, it makes for a really easy way to build web business apps on top of dbs and apis…so I’m getting a bit distracted from robotics with that.

There is also a memory-driven rules engine, so one can create complex rules in the UI, like enabling a field based on other fields, and complex database update rules…once again, without coding.  Fun stuff.  I spent my career building metadata-driven frameworks for Fortune 500 companies…this is the latest in a long line of ancestors, and I am getting a lot of interest in licensing/consulting opportunities with it.
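The rules engine itself isn’t shown here, but “enable a field based on other fields” stored as data rather than code might look something like the sketch below. The rule format and field names are my guesses; in a real metadata system the condition would be a serialized expression in the database rather than a Python lambda:

```python
# Each rule is data ("memory"), not code: a target field, a condition on
# other fields, and the UI property the rule controls.
RULES = [
    {"field": "discount_pct", "property": "enabled",
     "when": lambda rec: rec["customer_type"] == "wholesale"},
    {"field": "tax_id", "property": "required",
     "when": lambda rec: rec["country"] == "US"},
]

def apply_rules(record, rules=RULES):
    """Evaluate every rule against the current record and return the
    resulting UI state, e.g. {'discount_pct': {'enabled': True}}."""
    state = {}
    for rule in rules:
        state.setdefault(rule["field"], {})[rule["property"]] = \
            bool(rule["when"](record))
    return state

ui = apply_rules({"customer_type": "wholesale", "country": "CZ"})
```

Because the rules live alongside the rest of the metadata, editing them at runtime changes the app’s behavior without a compile step.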

Everything is modifiable in the app, while using the app at runtime…no compile step…just a constantly living, usable app that gets better with human input.  When an underlying table changes, or new tables are added, Ava adapts to the new data, adding it into whatever pages are relevant.
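A rough sketch of the “adapts when the underlying schema changes” part: on each sync, compare the metadata from the last sync with the freshly read catalog, then fold anything new into the relevant pages. The names below are invented for illustration:

```python
def diff_schema(old_meta, new_meta):
    """Return which tables appeared, disappeared, or changed since the
    last sync, so new entities can be folded into the relevant pages."""
    added   = sorted(set(new_meta) - set(old_meta))
    removed = sorted(set(old_meta) - set(new_meta))
    changed = sorted(t for t in set(old_meta) & set(new_meta)
                     if old_meta[t] != new_meta[t])
    return added, removed, changed

# e.g. a column was added to customer, and an invoice table appeared
old = {"customer": {"fields": ["id", "name"]}}
new = {"customer": {"fields": ["id", "name", "email"]},
       "invoice":  {"fields": ["id", "total"]}}
added, removed, changed = diff_schema(old, new)
```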

Love it!
Amazing project, thanks for being so thorough! Beautiful work! Thank you!

re: GrimSkippy

Thanks!  …I have lost a lot of sleep trying to design something functional and beautiful to the extent that is possible…a lot of competing design constraints in a small space.

Stl Files?

Hello mtriplett,

I recently got a 3D printer and was wondering about Ava’s STL files. I would really like to use a sonar ring like hers. Did you ever get them uploaded?

Thanks

Guardian

I like this project

Wow. I appreciate it, good job on what you have done.
I’ve been making a functionally very similar robot, so I understand how much time you’ve spent on it.
The difference is that I focus more on autonomous navigation in my flat and recognizing my position
than on AI speech. In detail, I have continuous speech recognition, a continuously rotating laser rangefinder (like a lidar),
a compass with vibration compensation by gyro (works fine, my own invention),
object recognition (via a web API), face recognition, etc.

Formerly, I also had about 8 ultrasonic sensors, but I had problems with signal reflections in the flat.
Because of my robot’s speed (max about 5 mph), I couldn’t use serial ultrasonic measuring (too slow,
I need about 10 loops per second), and parallel measuring was not accurate (because of reflections).

The whole project is in progress (about 2 years in, not much time).
If you want to exchange some know-how, please feel free to contact me at [email protected]
I’m curious how you manage reading so many ultrasonics together.

Pavel

re: stl files.

I’ll see what I can dig up.

re: multi

On managing a sonar array,

My bot moves slower than yours (1 mph?), so I have more time.  Even so, when the bot is at full speed, perhaps only about 3 readings per sonar (if driving directly toward a wall) have time to execute before she could hit.

1.  She drives proportionally slower as the risk level increases (i.e., as she detects objects closer to her desired path).  This buys time to get more readings.

2.  When at rest or rotating in place, she cycles through all the sonars equally.

3.  When moving forward (either straight or in curves), she triggers the forward sonars most often, followed by the side ones, and lastly the rear ones.  This means the most forward ones get kicked off 3 times more often than the others, because the sonars in the very front have the least amount of time to react.
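For what it’s worth, the three behaviors above amount to a speed-scaled, weighted round-robin. A minimal sketch, where the zone names, weights, and distance thresholds are mine rather than Ava’s actual values:

```python
# Fire-rate weights per zone while moving forward: front sonars get 3x the
# slots of rear ones, since they have the least time to react.
FORWARD_WEIGHTS = {"front": 3, "side": 2, "rear": 1}

def firing_order(sonars_by_zone, weights):
    """Build one scheduling round: each sonar appears weight-many times.
    Passing an empty weights dict gives the at-rest case (equal cycling)."""
    order = []
    for zone, sonars in sonars_by_zone.items():
        order += sonars * weights.get(zone, 1)
    return order

def speed_for_risk(nearest_cm, max_speed=1.0, stop_cm=20, slow_cm=150):
    """Drive proportionally slower as obstacles near the desired path get
    closer (behavior 1), buying time for extra sonar readings."""
    if nearest_cm <= stop_cm:
        return 0.0
    if nearest_cm >= slow_cm:
        return max_speed
    return max_speed * (nearest_cm - stop_cm) / (slow_cm - stop_cm)

zones = {"front": ["F1", "F2"], "side": ["L", "R"], "rear": ["B"]}
order = firing_order(zones, FORWARD_WEIGHTS)
```

A round-robin like this also sidesteps the parallel-measurement crosstalk Pavel mentions, since only one sonar fires per slot.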

Thank You

Thank you sir, I would appreciate it.

This is pretty amazing!

Looking forward to seeing the latest modifications/improvements.

robot face on phone

Hello, first let me say I think what you’ve done is awesome!  Can you tell me how you built the robot face on the phone, or direct me to a good tutorial on how to do it?  I’m a newbie, and any help or direction would be greatly appreciated.

robot face on phone

How did you create the face for the robot on the phone?

re: twright

I used Android Studio; the code is written in Java on a PC and tested/debugged on the phone while connected with a USB cable.

Android has graphics capabilities to do everything you need…draw lines, circles, rectangles, etc.  The eyelids are black rectangles with rounded corners and are drawn over the eye spheres to give a blinking effect.  The irises depend on the light level.  It’s all actually pretty simple.

I overlaid a mirror image of the video feed from OpenCV (which I draw partially transparent) so I could see what the robot sees.  I built overlays for the sonar as well.

The whole face took the better part of a weekend to develop, most of it was spent getting the curves for the moving lips.  I was not new to Android, but was new to graphics programming.

I have figured out how to do just about anything I have wanted on Android…except building a web server.  I used to do well getting Android to talk to Arduinos through USB, but I have not been able to get that to work reliably after some library update on either Android or Arduino.  If you figure out how to do either of those two things, let me know.

Cheers,

Martin

Ava is amazing!

I am glad to see Ava being as awesome as ever. Keep up the good work and good things will come your way. I wish I could build a robot like Ava, but we’ll see about that, lol.

Thanks

Thanks.  As far as building one yourself, I am sure you could.  A lot of it is just committing to a project long term…which I haven’t been good at lately.  On the software side, you have to get some kind of platform that is powerful enough (like a PC), on or off the bot, that gives you enough room for growth.  Arduinos are fun but very constrained.  Try skimming Design Patterns by the Gang of Four or The Society of Mind for some programming inspiration.

Good luck on your projects.  I always enjoy seeing what you are up to.

Martin

You too!

Thanks! You too!