Emotion Engine for Phoenix

Recently I have been working on an approach to adding basic emotion to my Phoenix robot.

I started off putting together an emotion engine; afterward I put together a secondary MCU to poll and scale the sensor data, and the two MCUs communicate over I2C. I chose to split the tasks across multiple MCUs because it scales better; adding more sensors later on will not be a problem. The emotion engine grabs data from the poll/scale MCU when it needs it for computation.
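
To give an idea of how the two boards talk, here is a rough sketch of the emotion-engine side of the I2C exchange. The address and the two-byte reply format are placeholders, not my actual protocol, and I am showing the current Wire API (IDE version 21 used Wire.receive() instead of Wire.read()):

```cpp
#include <Wire.h>

const uint8_t SENSOR_MCU_ADDR = 0x08;  // placeholder I2C address of the poll/scale MCU

// Request the latest scaled sensor readings from the secondary MCU.
// Assumes it replies with two bytes: scaled ping distance and scaled shake level.
bool readSensorData(uint8_t &pingScaled, uint8_t &shakeScaled) {
  Wire.requestFrom(SENSOR_MCU_ADDR, (uint8_t)2);
  if (Wire.available() < 2) return false;  // secondary MCU not responding
  pingScaled  = Wire.read();
  shakeScaled = Wire.read();
  return true;
}

void setup() {
  Wire.begin();  // join the I2C bus as master
}
```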

Two sensors feed into the emotion engine as of now: the first is a 3-axis accelerometer and the second is a ping sensor. As you get closer to the ping sensor, the fear metric increases; if you poke or shake the accelerometer, the anger metric increases.
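
Roughly, the mapping from raw readings to metrics looks something like this; the range limit and scaling below are illustrative, not the exact values in my code:

```cpp
// Illustrative mapping of raw readings to emotion metrics.
const float PING_MAX_CM = 100.0;  // beyond this distance, no fear contribution

// Closer objects produce more fear (0..1).
float fearMetric(float pingDistanceCm) {
  if (pingDistanceCm >= PING_MAX_CM) return 0.0;
  return 1.0 - (pingDistanceCm / PING_MAX_CM);
}

// Sudden acceleration changes (pokes/shakes) produce more anger (0..1).
float angerMetric(float accelMagnitudeG) {
  float shake = fabs(accelMagnitudeG - 1.0);  // deviation from the 1 g resting reading
  return constrain(shake, 0.0, 1.0);
}
```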

These metrics are then fed into the emotion engine and an overall mood is generated. There are seven mood types (love, joy, surprise, anger, fear, envy, sad) and three degrees of emotion (mild, considerable, and extreme). The next step is to have the emotional state trigger pre-defined GP sequences. For instance, when the emotional state is “extreme anger” I would like it to rear back and kick its front legs like Zenta’s phoenix does. For the “fear” response I was thinking maybe body shaking; any thoughts on this are welcome.
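
To make that concrete, here is one way the mood space and the sequence lookup could be represented; the identifiers and sequence numbers are placeholders, not what the library actually uses:

```cpp
// One possible representation of the mood space.
enum Mood   { LOVE, JOY, SURPRISE, ANGER, FEAR, ENVY, SAD };
enum Degree { MILD, CONSIDERABLE, EXTREME };

struct EmotionalState {
  Mood   mood;
  Degree degree;
};

// Map a state to a GP sequence number stored on the SSC-32
// (sequence numbers here are placeholders).
int sequenceFor(const EmotionalState &s) {
  if (s.mood == ANGER && s.degree == EXTREME) return 1;  // rear back and kick
  if (s.mood == FEAR)                         return 2;  // body shake
  return 0;  // default idle sequence
}
```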

I have not implemented any GP sequences yet; I plan on looking into this next. Currently the emotional state of the robot is displayed over the serial port.

The emotional states are smoothed over time using a “fast exponential moving average”. Each mood ratio is then compared to a baseline, a “slow exponential moving average”, that I call the “robot temperament”. The mood that has deviated furthest from its baseline temperament value is considered to be the current robot mood.
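
For anyone unfamiliar, an exponential moving average updates as ema += alpha * (sample - ema); a small alpha reacts slowly and a large alpha reacts quickly. A sketch of the dual-EMA mood selection, with illustrative alpha values rather than the constants in my code:

```cpp
// Dual-EMA smoothing sketch; alpha values are illustrative.
const float FAST_ALPHA = 0.30;   // fast EMA tracks the current mood
const float SLOW_ALPHA = 0.01;   // slow EMA drifts toward long-term temperament

float fastEma[7];  // one per mood type
float slowEma[7];

// Feed in the latest mood ratios and return the index of the
// mood that has deviated furthest from its temperament baseline.
int updateMood(const float ratios[7]) {
  int   current = 0;
  float maxDev  = 0.0;
  for (int i = 0; i < 7; i++) {
    fastEma[i] += FAST_ALPHA * (ratios[i] - fastEma[i]);
    slowEma[i] += SLOW_ALPHA * (ratios[i] - slowEma[i]);
    float dev = fabs(fastEma[i] - slowEma[i]);
    if (dev > maxDev) { maxDev = dev; current = i; }
  }
  return current;
}
```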

In the Phoenix code I replaced the functionality of the triangle button on the PS2 remote; it now turns emotion on/off. The emotional state is read over Serial3, which I labeled EmotionEngine and which communicates at 9600 baud.
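
The Phoenix-side hookup is roughly like this, assuming the engine sends one state byte per update (my actual message format may differ):

```cpp
// Minimal Serial3 hookup sketch for the Arduino MEGA.
#define EmotionEngine Serial3  // label used in my Phoenix code

bool emotionEnabled = false;   // toggled by the triangle button

void setup() {
  EmotionEngine.begin(9600);
}

void loop() {
  if (emotionEnabled && EmotionEngine.available()) {
    int state = EmotionEngine.read();
    // ...hand the state off to trigger the matching GP sequence...
  }
}
```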

I attached photos of my test rig, which consists of an Arduino MEGA 2560, SSC-32, and PS2 receiver. It is running Kurt’s ported Phoenix code. I have not attached the “emotion hardware” to my phoenix just yet; I wanted to get all my ducks in a row first.

The “emotion hardware” consists of two Arduino Minis, an ADXL335 accelerometer, and a Parallax Ping sensor.

The code is still a little rough and might need some polishing, but it is available here: github.com/davidhend/Phoenix
Everything was compiled using version 21 of the Arduino IDE.

I need to give some credit to RandomMatrix; I repurposed and modified his world mood library from his Twitter world mood project found here:

instructables.com/id/Twitter … -in-a-Box/



Really want to see a video.
I encourage you to incorporate this sensor (on a pan/tilt) to make it even more realistic and also track motion:
robotshop.com/productinfo.as … lang=en-US
robotshop.com/dagu-mini-pan- … t-kit.html

Wow, nice work implementing this concept. I like doing cartoonish animatronics and have left “hooks” in my code for external inputs. An emotional input would be so cool.

I have upgraded the firmware on the SSC-32 to version 2.03GP. I have also placed an order for the SEQ-01 software for the sequences; it should arrive sometime this week. I had to power the extra electronics from the servo voltage lines on the SSC-32; the 5V Li-ion power source powering the other electronics didn’t have enough current to spare.

CBenson, I like the IR sensor and pan/tilt solution you posted and will definitely consider incorporating something like that. My thought would be to combine a ping sensor with that IR board; that way you can do basic tracking and also get the distance to the object. These changes would definitely make it more interactive and make it seem more “alive”. After I get the response sequences worked out I will begin to entertain these ideas.

Hopefully my next post will include a video…