Demo of a Conversational Robot that Learns by Listening

I have been working on this for the past few months; basically, it's a conversational learning AI. I've tried to figure out how to explain it...it's best to just watch the video. The logic-based stuff starts a few minutes in.

At the core of it, the bot learns concepts by listening to people and remembering what they say...

The robot learns from humans saying things like "People are mammals...mammals are animals...mammals have two eyes...fish can swim...London is in England...France is next to Germany...rocks are heavier than feathers...steel is stronger than iron...cheetahs are faster than humans...Superman is faster than a bullet...penguins can't fly...Beijing is the capital of China...the Battle of Midway was in 1942"...on and on. From this it can deduce answers to a lot of logic-based questions by traversing the relationships it has learned. It understands concepts like "is a", "has a", "location", "faster", "heavier", "smarter", "more famous", "near", "expensive", and much more. It can answer who, what, where, when, why, and "how many" type questions.
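To give a feel for the kind of relation traversal involved, here is a rough sketch in Python. The triple store, relation names, and helper functions are illustrative only, not the bot's actual code.

```python
# Illustrative triple store and relation traversal -- not the bot's actual code.
from collections import defaultdict

facts = defaultdict(set)            # (subject, relation) -> set of objects

def learn(subject, relation, obj):
    """Store a fact heard in conversation, e.g. learn("people", "is a", "mammals")."""
    facts[(subject, relation)].add(obj)

def is_a(subject, category):
    """Follow "is a" links transitively: people -> mammals -> animals."""
    seen, frontier = set(), {subject}
    while frontier:
        node = frontier.pop()
        if node == category:
            return True
        if node in seen:
            continue
        seen.add(node)
        frontier |= facts[(node, "is a")]
    return False

def has_a(subject, part):
    """A subject inherits "has a" facts from every category it belongs to."""
    seen, frontier = set(), {subject}
    while frontier:
        node = frontier.pop()
        if node in seen:
            continue
        seen.add(node)
        if part in facts[(node, "has a")]:
            return True
        frontier |= facts[(node, "is a")]
    return False

# Teach it a few of the sentences from the post.
learn("people", "is a", "mammals")
learn("mammals", "is a", "animals")
learn("mammals", "has a", "two eyes")

print(is_a("people", "animals"))    # True, via people -> mammals -> animals
print(has_a("people", "two eyes"))  # True, inherited from mammals
```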

The robot has hundreds of questions organized into topics that it can talk about. It evaluates each question beforehand to decide whether it is appropriate for the person it is talking to. It remembers everything and revisits different questions on different timeframes..."How is your evening going?" might come up often, while "Do you have children?" might only come up once every 5-10 years. Some are time-based..."Who is playing on Monday Night Football tonight?" only happens if you like football, it is fall, and it is Monday. "Are you retired?" would only come up if you are older.
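Just to illustrate the scheduling idea (this is not the actual code; the field names and intervals are made up), something like this captures the per-question recurrence and appropriateness checks:

```python
# Made-up sketch of a question pool with recurrence intervals and filters.
from dataclasses import dataclass, field
from datetime import datetime, timedelta
from typing import Callable

@dataclass
class Question:
    text: str
    min_interval: timedelta                                 # how long before it can repeat
    appropriate: Callable[[dict], bool] = lambda person: True
    last_asked: dict = field(default_factory=dict)          # person name -> datetime

    def due(self, person: dict, now: datetime) -> bool:
        if not self.appropriate(person):
            return False
        last = self.last_asked.get(person["name"])
        return last is None or now - last >= self.min_interval

questions = [
    Question("How is your evening going?", timedelta(hours=6)),
    Question("Do you have children?", timedelta(days=365 * 5)),
    Question("Who is playing on Monday Night Football tonight?", timedelta(days=7),
             appropriate=lambda p: p.get("likes_football")
                 and datetime.now().month in (9, 10, 11, 12)   # fall
                 and datetime.now().weekday() == 0),            # Monday
    Question("Are you retired?", timedelta(days=365),
             appropriate=lambda p: p.get("age", 0) >= 60),
]

def next_question(person: dict, now: datetime = None):
    """Return the first question that is due and appropriate for this person."""
    now = now or datetime.now()
    for q in questions:
        if q.due(person, now):
            q.last_asked[person["name"]] = now
            return q.text
    return None

print(next_question({"name": "Sam", "age": 67, "likes_football": False}))
```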

Thus far, the bot has learned 200,000 words, 4,000 pieces of knowledge, and about 1,200 commands and questions.

The robot has its own opinions on some things (like football) and has emotional reactions based on the similarities and differences between its own opinions and those of the people it is talking to. It now has some ability to empathize by recognizing bad events happening to people who are closely related to the person talking to it.
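As a toy illustration of the opinion-matching idea (the topic and scoring scale are invented, not how the bot actually stores opinions):

```python
# Invented illustration of opinion matching -- the scale and topics are made up.
bot_opinions = {"football": 0.9}        # -1.0 dislike .. +1.0 like

def react(topic, person_opinion):
    """Pick a reaction based on how close the speaker's opinion is to the bot's."""
    mine = bot_opinions.get(topic)
    if mine is None:
        return "I don't have an opinion on that yet."
    if abs(mine - person_opinion) < 0.5:
        return f"I feel the same way about {topic}!"
    return f"Interesting...I see {topic} quite differently."

print(react("football", 0.8))           # close opinions -> warm reaction
```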

The robot keeps separate records about each person it talks to. It can answer questions about itself (1st person), you (2nd person), or other people (3rd person) known to both of you, when they are referred to by name.
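Here is a toy sketch of per-person records and how "I", "you", or a name might resolve to them; again, the structure is hypothetical, not the bot's real code:

```python
# Hypothetical per-person records and pronoun resolution -- not the real structure.
people = {
    "robot": {"name": "robot"},
    "Alice": {"name": "Alice", "hometown": "London"},
    "Bob":   {"name": "Bob",   "hometown": "Beijing"},
}

def resolve(reference, speaker):
    """Map a word the speaker used ("I", "you", or a name) to a memory record."""
    word = reference.lower()
    if word in ("i", "me", "my"):        # the speaker talking about themselves
        return people[speaker]
    if word in ("you", "your"):          # the speaker talking about the robot
        return people["robot"]
    return people.get(reference, {"name": reference})   # someone else, by name

# Alice asks about Bob: "Where does Bob live?"
print(resolve("Bob", speaker="Alice").get("hometown"))   # Beijing
```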

The bot has thermal vision, which it uses to keep its head tracking the person it is talking to. It's off in this video...I hope to do a demo of that soon.

This video only scratches the surface of what it can do...hope some of you folks like it! There were a few glitches in the vid...since fixed!

https://www.youtube.com/watch?v=NUCqE8f9uNk

Conversational robot

This is good. I mean really good.

Are you using Google's speech recognition engine with an AliceBot layered on top, or ???

I’ll admit, I’m stumped on the architecture you would use to pull off this masterpiece.

Way better than any of the Turing test videos I have seen. If I had to pinpoint the difference that pushes this learning AI implementation over the top, it would be the "just right" inclusion of empathetic responses and the blazing-fast response times.