Integration with Lucy: a cloud-based AI service


No matter how many hours we spend hacking code in our basements, our robots remain, shall we say, not the brightest lights in the harbor. Time to offload some of that heavy lifting into the cloud? ‘Project Lucy’ is the codename for a cloud-based Artificial Intelligence service by LMR's own Martin Triplett, due for release in April. Martin graciously let me be a guinea pig on the beta version, and in this post I’d like to share my experience of integrating Lucy into CCSR, my own Linux-based robot platform. In this setup, CCSR runs only lower-level functions such as sensory input, computer vision and actuation, and relies on Lucy as a ‘remote brain’ for all higher-level functions such as Natural Language Processing, learning and remote control through a web interface. I will share a simple Python module that provides this integration on any wifi-enabled robot platform capable of running Python, such as the Intel Edison, Minnowboard, Raspberry Pi and many others. Integrating with Lucy allows your robot to be controlled by a cloud-based Artificial Intelligence, much like ‘Anna’, which you have seen on LMR.

The figure below shows the basic concept. You can clone the robotics_web Python module from GitHub and instantiate roboticsWebClass in your robot application. The module relies on the Python packages requests, csv and ElementTree (xml.etree). It communicates with the AI web service over HTTP and parses the XML-based responses coming back; a rough sketch of that request/response flow follows the diagram.

lucydiag.png
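
To make the flow concrete, here is a rough sketch of the kind of request/response handling the module performs. The endpoint path, query-parameter names and XML element names are my assumptions for illustration only; the actual robotics_web implementation may differ.

import requests
import xml.etree.ElementTree as ET

# Rough sketch only: the URL path, parameter names and XML element names
# below are assumptions, not the actual robotics_web implementation.
def poll_brain(robotKey, text=None):
    params = {'robotKey': robotKey}
    if text:
        params['text'] = text                        # optional text pushed to the AI
    reply = requests.get('http://droids.homeip.net/RoboticsWeb', params=params)
    root = ET.fromstring(reply.text)                 # parse the XML response
    commands = []
    for cmd in root.findall('command'):              # assumed element name
        commands.append((cmd.get('serviceID'), cmd.get('commandID'), cmd.text))
    return commands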

In order to integrate with Lucy, you need to sign up with Martin Triplett for a ‘robot key’, so contact him directly. This gives you a unique instance of the AI dedicated to your robot, which will learn and develop based on your robot’s experiences.

The robotics_web Python module can be cloned from here. The following Python code can get you started with Lucy:

from robotics_web import roboticsWebClass
 
robotKey    = 'xxxxx'                               # Your robot Key at droids.homeip.net/RoboticsWeb
csvFile     = 'roboticsWebCmdMap.csv'               # Your robot platform specific command map
roboticsWeb = roboticsWebClass(robotKey, csvFile)
 
while True:
   for localRobotCommand in roboticsWeb.brainAPI():
      execute(localRobotCommand)

 

In this example, it is assumed your robot platform provides a ‘command interface’, i.e. a way to control the robot through user input such as a keypad, PS3 controller or a serial interface. My example assumes the execute() function interprets a string as a command and calls the appropriate functions to actuate servos, motors, etc. You can replace execute() with however your platform parses user commands; a minimal sketch of such a dispatcher is shown below.
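
If your platform does not provide such an interface yet, execute() can be as simple as a string dispatcher. The sketch below uses the same command strings as the CSV example further down; speak(), move_forward() and pan_head() are hypothetical placeholders for whatever actuation calls your robot exposes.

# Placeholder actuation functions; replace with your platform's own
# speech, drive and servo calls.
def speak(text):
    print('SAY:', text)

def move_forward():
    print('MOVE FORWARD')

def pan_head(degrees):
    print('PAN', degrees)

# Minimal sketch of an execute() dispatcher for the command strings used
# in the CSV mapping below.
def execute(localRobotCommand):
    tokens = localRobotCommand.split()
    cmd, args = tokens[0], tokens[1:]
    if cmd == 'say':
        speak(' '.join(args))            # vocalize the remaining words
    elif cmd == 'move_fwd':
        move_forward()
    elif cmd == 'pan':
        pan_head(int(args[0]))           # pan by N degrees
    else:
        print('unknown command:', localRobotCommand)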

The robotics_web module uses a simple comma-separated file, 'roboticsWebCmdMap.csv', to map Lucy’s command/service codes onto commands that can be executed on your local robot platform. The file provided on GitHub does a limited mapping onto CCSR commands, so you’ll have to edit it to match your local robot platform. The command map CSV file has the following format:

ID, AtomName, serviceID, commandID, <localCommand $data1 $data2>

 

You only have to change or fill in the <localCommand> value for features that your robot platform supports. <localCommand> is a string, and the optional keywords $data1 and $data2 can be used to substitute any data values passed by Lucy. For example, if your command to vocalize a string is 'say', the command to move your robot forward is ‘move_fwd’, and the command to pan the head 90 degrees is 'pan 90', you’d map these onto Lucy’s SpeechAgent, Drive.Forward and Servo.PanRelative commands/service IDs respectively, and edit the CSV file as follows (a sketch of how this mapping could be implemented in code follows the example):

22267, SpeechAgent, 0, 0, say
16897, Drive.Forward, 2, 2, move_fwd
16907, Servo.PanRelative, 5, 1, pan $data1
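
For reference, here is a sketch of how such a mapping and the $data substitution could be implemented. This is my own illustration of the idea, not the actual robotics_web code; the field order simply follows the CSV format shown above.

import csv

# Sketch: load the command map keyed by (serviceID, commandID) and
# substitute any data values Lucy passes into the local command string.
def load_command_map(csvFile):
    cmdMap = {}
    with open(csvFile) as f:
        for row in csv.reader(f, skipinitialspace=True):
            if len(row) < 5:
                continue                                  # skip blank or partial lines
            atomId, atomName, serviceID, commandID, localCommand = row[:5]
            cmdMap[(serviceID, commandID)] = localCommand
    return cmdMap

def map_command(cmdMap, serviceID, commandID, data1='', data2=''):
    localCommand = cmdMap.get((serviceID, commandID))
    if localCommand is None:
        return None                                       # feature not supported locally
    return localCommand.replace('$data1', str(data1)).replace('$data2', str(data2))

With the example CSV above, map_command(cmdMap, '5', '1', 90) would return 'pan 90'.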

 

In the example above, the roboticsWebClass is used in a polling mode: your robot loops a call to the brainAPI method until Lucy initiates a command that you can execute, such as saying a sentence or actuating a servo. This allows fully autonomous remote control by Lucy’s AI, or by a user through Lucy’s web interface. Below is a different example where the robot initiates the activity. A speech-to-text operation produces text after capturing audio, and robotics_web pushes this text to the remote brain. Here, speech2Text() is assumed to be your function producing a string from captured audio; in CCSR, this is done using Google’s speech-to-text web service. A sketch of one possible speech2Text() implementation follows the example.


from robotics_web import roboticsWebClass
 
robotKey    = 'xxxxx'                               # Your robot Key at droids.homeip.net/RoboticsWeb
csvFile     = 'roboticsWebCmdMap.csv'               # Your robot platform specific command map
roboticsWeb = roboticsWebClass(robotKey, csvFile)
 
while True:
   text=speech2Text()
   for localRobotCommand in roboticsWeb.brainAPI(text):
      execute(localRobotCommand)
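
speech2Text() here stands for whatever your platform provides. As one possible implementation, the sketch below uses the off-the-shelf speech_recognition package, which wraps Google's speech service; this is my own example, not the CCSR code, and it needs a working microphone (plus the pyaudio package) on your robot.

import speech_recognition as sr

# Sketch of a speech2Text() helper using the speech_recognition package.
# Returns an empty string when nothing intelligible was captured.
def speech2Text():
    recognizer = sr.Recognizer()
    with sr.Microphone() as source:
        audio = recognizer.listen(source)       # block until a phrase is captured
    try:
        return recognizer.recognize_google(audio)
    except sr.UnknownValueError:                # speech was unintelligible
        return ''
    except sr.RequestError:                     # speech service unreachable
        return ''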


https://www.youtube.com/watch?v=0aLkqcL1GoU

Made my day…

Thanks for posting!  I’d love to see more video when you integrate more commands…especially movement.

When I build my Pi-based bot, I’ll be relying on a lot of the work you have done with these modules.  Thanks for sharing!

Martin

HTML Link error

The link in “The robotics_web Python module can be cloned from here.” is broken when you include the period and will take you to a 404 page. I’m looking forward to eventually being able to try all this out. Waiting on backordered parts.

Ah! Thanks for catching

Ah! Thanks for catching this, got the period out of the link now, so should be fixed. I’d love to hear how it works out for you, I left many cool features out in my experiments, specifically movement.

looks like droids.homeip.net

looks like droids.homeip.net is down

re: droids

Sadly, we get power outages here from time to time, and my DNS service goes down as well.  Sometimes, rarely, I shut it down on weekends, but generally it is up.

On the bright side, it is free to use!

Droids.homeip.net

Any idea when Lucy will be back online?

Has been down over a week now.