ASAR object tracking bot

Overview:

This is ASAR, a robot built with the intention of using a Raspberry Pi as its brain. Unfortunately the Raspberry Pi will not be available until the end of May 2012, so I am having to tether the robot to my laptop whilst I develop the software for its behaviour.

Physical components:

Physically the robot consists of a webcam mounted on a pan-and-tilt bracket with two servos, and it uses an H-bridge IC to drive the motors in a hacked Big Trak Jr chassis (the cheapest way to get two DC motors in a chassis ready to go!). I'm not so hot on the electronics, but I managed to cobble this together with some help from an expert friend!

Platform:

First I used OpenCV with Haar face detection (see the video). Currently I am working on OpenCV HSV thresholding to make the bot follow a green tennis ball and nudge it once it is close enough. This is working in its most basic form; the next step is to develop a more sophisticated algorithm for the robot to hunt for the ball based upon the ball's last known exit point from its field of vision. This, however, is proving more difficult than anticipated.

Future work:

Traditional object detection

Update:

Have got a Raspberry Pi on backorder; the downside is that it is not going to arrive until the end of April, now May!

Have updated the code and got basic body adjustment based upon the servo pan position. Its adjustment accuracy depends on how close the ball is, but it's a start.
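The body-adjustment idea can be sketched roughly as follows: when the pan servo has swung far from centre to keep the ball in frame, turn the chassis in that direction so the servo can come back towards centre. The angles, dead band, and left/right mapping below are assumptions for illustration, not the robot's actual values:

```python
PAN_CENTRE = 95   # assumed servo midpoint in degrees (20-170 travel)
DEAD_BAND = 15    # ignore small offsets; accuracy varies with ball distance

def body_turn(pan_angle):
    """Return 'left', 'right' or None depending on how far the pan servo
    sits from its centre position. Which sign means left vs right depends
    on how the servo is mounted, so this mapping is a guess."""
    error = pan_angle - PAN_CENTRE
    if abs(error) <= DEAD_BAND:
        return None   # ball roughly ahead, no chassis turn needed
    return "left" if error > 0 else "right"
```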

Also started coding a search function for when the ball is not detected, but hit problems with the camera freezing whilst running the pan servo through 20-170 degrees.
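One likely cause of the freeze is sweeping the servo in a blocking loop, so no frames get grabbed until the whole 20-170 degree pass finishes. A possible fix, sketched here, is to step the servo a small amount once per camera frame instead:

```python
def sweep_steps(start=20, end=170, step=5):
    """Generator yielding one small servo position per camera frame,
    bouncing between `start` and `end`. Stepping once per frame keeps the
    capture loop running instead of blocking for the whole sweep."""
    pos, direction = start, 1
    while True:
        yield pos
        pos += direction * step
        if pos >= end or pos <= start:
            direction = -direction
            pos = max(start, min(end, pos))
```

In the main loop, when the ball is not found, the next frame is grabbed as usual and the servo is simply written one step further (e.g. `pan_servo.write(next(sweep))`, where `pan_servo` is whatever servo interface the robot uses).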

Have been investigating reading values from the Arduino and found python-firmata (https://github.com/lupeke/python-firmata/tree/98260401b92ce82399f8e795c87cb70cfd71d171). I hope to use this to combine another source of object detection, yet to be decided on. This will ensure the bot doesn't crash into anything whilst it's searching for the ball.
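However the Arduino readings end up being fetched, the behaviour logic can be kept separate and testable by injecting the sensor read as a callable. This is only a sketch; `read_distance` stands in for whatever Firmata call is eventually used, and the 15 cm threshold is an arbitrary placeholder:

```python
def safe_to_advance(read_distance, min_cm=15.0):
    """Return True if the obstacle sensor says the path ahead is clear.
    A failed reading (None) is treated as 'not clear', to be safe."""
    d = read_distance()
    return d is not None and d > min_cm

def drive_step(ball_visible, read_distance):
    """One iteration of the behaviour: chase the ball only when the path
    is clear, stop if blocked, search when the ball is not visible."""
    if not safe_to_advance(read_distance):
        return "stop"
    return "chase" if ball_visible else "search"
```

Calling `drive_step` once per camera frame interleaves the obstacle reading with the vision loop, which is the effect described above.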

Update 25/3/2012

The Raspberry Pi delivery has now been put back until the end of May due to the much-publicised non-magnetic jack issue. I have done no further development on the robot code base, as I am working 7000 miles away from home. I thought about taking the robot with me, but I was worried about customs confiscating it.

Video archive

Backwards forwards http://www.youtube.com/watch?v=g0jcZnh-fMw

Haar face detection http://www.youtube.com/watch?v=oSC3BwQiu1w

OpenCV, HSV thresholding for object detection, bot moves towards target, adjusts position when it gets close enough

  • Actuators / output devices: DC Motors, Servos, piezo speaker
  • Control method: semi-controlled/auto
  • CPU: Aurdino
  • Operating system: Linux on the PC
  • Power source: 5V USB, 3 x AA rechargeable batteries
  • Programming language: Python with OpenCV, Arduino sketches
  • Programming language: Python w OpenCV, Aurdino sketchbook
  • Sensors / input devices: Webcam
  • Target environment: indoor on smooth surfaces

This is a companion discussion topic for the original entry at https://community.robotshop.com/robots/show/asar-object-tracking-bot

Looks like this robot has the potential to have some interesting behaviour.

Should be a lot more versatile once fitted with the Raspberry Pi instead of the tether =)

Yes, I was hoping for one of the first Raspberry Pis; however, I am happy I got into the second 10,000 batch, which won't arrive until the end of April.

Hopefully by then I will have refined my code and perhaps added an additional source of environment detection. I am not sure what to start with, a bumper or an IR sensor; do you have any suggestions?

I would need to read the input from any sensor with Python via the Arduino. I believe I can do this, and have pulled in a serial stream from the Arduino, but it seems a bit flaky. I am hoping Firmata will provide more reliable readings directly from the Arduino pins; this way I can interleave the readings with the OpenCV object detection.
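One way to tame a flaky stream, whatever the transport, is to majority-vote over the last few samples of a digital pin (say, a bumper microswitch) instead of trusting each raw reading. A minimal sketch, with the window size a guess to tune:

```python
from collections import deque

class DebouncedPin:
    """Majority-vote filter over recent readings of one digital pin."""

    def __init__(self, window=5):
        self.samples = deque(maxlen=window)

    def update(self, raw):
        """Feed one raw reading (True/False/None); None, standing for a
        dropped or garbled reading, is simply ignored."""
        if raw is not None:
            self.samples.append(bool(raw))

    @property
    def pressed(self):
        """Majority vote over the recent window; False until any data arrives."""
        if not self.samples:
            return False
        return sum(self.samples) > len(self.samples) / 2
```

Calling `update()` once per loop iteration alongside the OpenCV frame grab gives the interleaving described above, while `pressed` smooths over occasional bad readings.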

So much to do and so little time.

Adding a bumper sensor is probably the best place to start, since you can debug the mechanical and electrical aspects very easily. Once you've got the bumper readings coming back to your laptop/Raspberry Pi reliably, you can add IR/ultrasonic/etc. sensors and only have to think about configuring the things that are specific to those sensors.

I'll add a left/right bumper and see if I can read the state of the microswitch from the Arduino in Python; it will be an interesting and fairly cheap/quick experiment to try for the first additional environmental sensor. Thanks for the pointer.