It gives you a temperature reading for 64 zones (16 x 4). Each zone represents a square of a few degrees from the robot's point of view. I orient mine horizontally, so I get 16 zones across and 4 in the up/down direction. Alternatively, you could mount it on a pan/tilt and move it around to build a more detailed thermal image. I have seen another sensor out there now that gives a lot more resolution (320x240 or something) but costs at least three times as much.
I connect it to an Arduino Uno R3. Once I assemble the data as ints, I output it to the Mega via serial as a comma-delimited series of numbers. That data gets sent to the Android and out to the Lucy service so I can build a bitmap of the readings to view on the website.
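For reference, the Uno-side forwarding step could look something like the sketch below. This is only an illustration of the comma-delimited format described above, not Anna's actual code: readThermalZones() is a placeholder for whatever driver talks to the 16x4 array, and the baud rate and update interval are assumptions.

// Minimal sketch of the Uno-side forwarding step (illustrative only).
const int NUM_ZONES = 64;            // 16 columns x 4 rows
int zones[NUM_ZONES];

void readThermalZones(int *out) {
  // Placeholder: fill 'out' with 64 temperature readings from the thermal array.
  // The real implementation depends on the specific sensor and its library.
}

void setup() {
  Serial.begin(115200);              // serial link to the Mega (baud rate assumed)
}

void loop() {
  readThermalZones(zones);
  for (int i = 0; i < NUM_ZONES; i++) {
    Serial.print(zones[i]);
    if (i < NUM_ZONES - 1) Serial.print(',');
  }
  Serial.println();                  // one comma-delimited frame per line
  delay(100);                        // update rate is a guess
}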
There is supposed to be an initial calibration step to account for differences in each unit and pixel. I skipped that. For my purposes, I didn’t need more precise temperatures.
For now, I only use it to track faces, cats, hot objects like coffee, and lights. If I remember correctly, I calculate the hottest column and tell the neck servo to move half the distance to that column. The vertical works the same way. This prevents a lot of overshooting and oscillation. I have successfully used it to play laser tag with fast-moving cats. The robot can generally put the lasers on a thermal target in the same room if it is at least a few inches wide.
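If it helps to picture the half-distance move, here is a rough sketch of the idea. The degrees-per-column figure, center column, and function names are my assumptions, and it expects the 64-value zones array from the sketch above, with the neck servo attached in setup().

#include <Servo.h>

const float DEG_PER_COLUMN = 3.75;   // assumed: horizontal field of view divided by 16 columns
const int   CENTER_COLUMN  = 8;      // middle column of the 16

// Sum each of the 16 columns (4 rows apiece) and return the warmest one.
int hottestColumn(const int zones[64]) {
  int best = 0;
  long bestSum = -100000;
  for (int col = 0; col < 16; col++) {
    long sum = 0;
    for (int row = 0; row < 4; row++) sum += zones[row * 16 + col];
    if (sum > bestSum) { bestSum = sum; best = col; }
  }
  return best;
}

// Move the neck servo half the distance toward the hottest column each pass,
// which damps out overshoot and oscillation.
void trackHotSpot(Servo &neckPan, const int zones[64]) {
  int col = hottestColumn(zones);
  float errorDeg = (col - CENTER_COLUMN) * DEG_PER_COLUMN;
  int target = constrain(neckPan.read() + (int)(errorDeg / 2.0), 0, 180);
  neckPan.write(target);
}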
First I will implement the autonomous moving and mapping. After that I'll buy a thermal sensor.
I don’t remember if Anna maps her environment (I’m still going through all her capabilities while on the subway to work). If she does, what did you use for that? I’m reading some info I found in this post; if you know it, what do you think about it?
She needs help charging. I mount the batteries underneath and outside the enclosures to keep the center of gravity low and to make swapping them out easy. I used to run her for hours a day, so being able to swap batteries instead of waiting for her to recharge was important. Besides, I doubt I have the skills to figure out autonomous base-station charging. That would be something.
I’m going to be switching her battery system out for a single 14.4V LiPo soon, along with some UBECs for the other voltages I need. She might lose a couple of pounds in the process, which I need since I have some arms I plan to mount.
I guess for a robot, she does have an awesome butt… I’ll tell her you said so.
Butt seriously… good point. I’ll add in your suggestion. I remember reading something on 10 ways to make a robot seem more human, and that was one of them. Most of the others I had already done. I’d love to hear any and all suggestions, ideas, criticisms, whatever, by email or otherwise. I’m in design-phase thinking for the next major iteration, so, like Johnny 5, I love input.
Jibo looks cool, looking forward to getting one at some point.
You said “we”. Are you with a robotics company of some kind? Always glad to meet new folks in the field.
Sometimes Robots Evolve Based on How We Perceive Them
I suppose that with a light sensor and a thermal sensor, finding a sunny spot should not be difficult, if the design could carry enough panels to be useful. A lot of my early goals with her had to do with surveillance… navigation… driving to a series of waypoints, taking pictures, and sending video back. When I added a face and talking, she took on a completely different focus.
…thus my title…Sometimes Robots Evolve Based on How We Perceive Them