2 Servos, 2 Sensors and a Mac!

I wanted a little project that would test a Mac application I’ve developed. The “robot” is just a Pan/Tilt kit and an arm base kit combined. I made a few extra brackets so I could mount the IRPD above the Sharp GP2D12 sensor.

I’m using an SSC-32 board to control the servos and read the sensors, and Bluetooth to communicate with a nearby Apple 12" PowerBook.
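For anyone curious how the Mac talks to the board: the Bluetooth adapter just shows up as a serial port, and the SSC-32 speaks a plain ASCII protocol. Here’s a minimal Java sketch of the two commands involved — the class and the stream wiring are my own illustration, and I’m assuming the GP2D12 is wired to one of the analog inputs:

```java
import java.io.*;

public class Ssc32 {
    private final OutputStream out;
    private final InputStream in;

    public Ssc32(OutputStream out, InputStream in) { this.out = out; this.in = in; }

    // "#<ch>P<pulse>T<ms>" moves servo <ch> to <pulse> microseconds,
    // taking <ms> milliseconds to get there. Commands end with a carriage return.
    public void move(int ch, int pulseUs, int timeMs) throws IOException {
        out.write(("#" + ch + "P" + pulseUs + "T" + timeMs + "\r").getBytes("US-ASCII"));
        out.flush();
    }

    // "VA".."VD" ask for the voltage on analog inputs A..D; the board answers
    // with one raw byte (0..255). I'm assuming the GP2D12 hangs off one of these.
    public int readAnalog(char input) throws IOException {
        out.write(("V" + input + "\r").getBytes("US-ASCII"));
        out.flush();
        return in.read();
    }
}
```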

I made some cutouts from poster board to try two different methods for detecting objects:

In the first video I’m using a function called readIRPD(). It returns “None”, “Left”, “Right”, or “Center.” If the response is “None” or “Right” then I move a bit more to the right. If it reads “Center” then I stop and use the GP2D12 to read the distance. I also calculate the angle based on the servo position.
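The sweep loop is roughly this (a sketch, not my actual code — the helper names, sweep range, and pulse-width-to-degrees conversion are all placeholders):

```java
public class IrpdSweep {
    // Pulse widths in microseconds; the sweep range and step are placeholders.
    static final int PAN_MIN = 750, PAN_MAX = 2250, PAN_STEP = 25;

    public static void main(String[] args) {
        int pan = PAN_MIN;
        while (pan >= PAN_MIN && pan <= PAN_MAX) {
            panTo(pan);
            String dir = readIRPD();  // "None", "Left", "Right" or "Center"
            if (dir.equals("Center")) {
                double range = readGP2D12();       // distance to the target
                double deg = (pan - 1500) / 10.0;  // rough: ~10 us per degree, calibrate
                System.out.printf("Target at %.1f deg, %.1f in%n", deg, range);
                break;
            }
            // "None" or "Right": keep stepping right. "Left" means we overshot,
            // so back up -- the correction move visible in the video.
            pan += dir.equals("Left") ? -PAN_STEP : PAN_STEP;
        }
    }

    static void panTo(int pulseUs) { /* e.g. "#0P" + pulseUs + "T100\r" to the SSC-32 */ }
    static String readIRPD()       { return "None"; /* from the IRPD's two signals */ }
    static double readGP2D12()     { return 99.0;   /* analog read -> inches */ }
}
```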

In the video you can also see an overshoot-and-correction move. Watch the LEDs flash when locating the second target. The Mac’s built-in speech narrates the action.

http://homepage.mac.com/darrenlott/.Movies/MacSSC_IRPD.png
IRPD Demo

For the second test I added two more targets, wanting to see how quickly I could acquire their positions. This uses the “Speed” move from the SSC-32 and loops through about 50 reads of the GP2D12 distance sensor.

If an object is closer than 12" it goes into a list. At the end of the sweep I break the list into clusters (based on servo position) and pick the middle read from each cluster. To see how it did, we shoot a laser at each of the identified “Targets.”
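In sketch form (Java, with made-up readings; the 50 µs gap threshold is just an assumption about the sweep step size):

```java
import java.util.*;

public class ClusterSweep {
    public static void main(String[] args) {
        // (servo pulse width, inches) pairs that came in under 12" during the
        // sweep. These numbers are made up for illustration.
        int[][] hits = { {1200, 9}, {1225, 9}, {1250, 10}, {1600, 8}, {1625, 8} };

        // Break the list into clusters: a jump in servo position bigger than
        // one sweep step (50 us here, an assumption) starts a new cluster.
        List<List<int[]>> clusters = new ArrayList<>();
        for (int[] h : hits) {
            List<int[]> cur = clusters.isEmpty() ? null : clusters.get(clusters.size() - 1);
            if (cur == null || h[0] - cur.get(cur.size() - 1)[0] > 50) {
                cur = new ArrayList<>();
                clusters.add(cur);
            }
            cur.add(h);
        }

        // Take the middle read of each cluster as that target's position.
        for (List<int[]> c : clusters) {
            int[] mid = c.get(c.size() / 2);
            System.out.println("Target at pulse " + mid[0] + ", " + mid[1] + " in");
        }
    }
}
```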

Coming up with a way to integrate the laser was fun. I even considered one of the 100mW burning lasers, but since it was Christmas all I could think of was “You’ll shoot your eye out!”

http://homepage.mac.com/darrenlott/.Movies/MacSSC_GP2D12.png
GP2D12 Demo w/laser

wow, that’s impressive

nah, we are all professionals here. just make sure you catch your first attempt on video. and why not bump it to 300mW? I want to see those cardboard guys cut in half.

but seriously, that’s a great little project you put together. You know, with the height of all the men being the same, you could calculate the angle from the distance so as to point the beam at their heads.

nick_A

That’s a rather nice way to target a large mass!
I’ve never been really interested in rangefinders, as they tend to be quite nonlinear when the ranger isn’t pointed directly at the object being viewed.
Scanning for an average cluster sounds like a novel way to get rid of messy noise AND get a much better idea of how things are laid out!

Oooh, neat idea, Nick_A.

A = arctan(H/D)

A is the servo angle necessary to hit the head.
H is the distance between where the dot would normally be pointing (on the chest) and where you’d like the dot to point (on the head).
D is the distance to the figure.

You could use the same method to point to any place on the figure.
Let’s say, the foot:

A1 = arctan(H/D)
A2 = arctan(W/D)

A1 is the tilt servo angle.
A2 is the pan servo angle.
H is the vertical distance between chest and foot.
W is the horizontal distance between chest and foot.
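A quick worked example with made-up numbers, just for a sense of scale:

```java
public class AimMath {
    public static void main(String[] args) {
        double H = 2.0;   // dot-to-head offset, inches (made up)
        double D = 24.0;  // range to the figure, inches (made up)
        double A = Math.toDegrees(Math.atan(H / D));
        System.out.printf("Tilt up %.1f degrees%n", A);  // ~4.8 degrees
    }
}
```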

And, since you’ve got a PC as a processor…
You could do really complicated things, like having the robot trace the outline of the figure!

Great project!
Good luck.

Thanks Nick(s)!

Compensating for target distance is an excellent idea. I’ll be sure to try it out soon.

Currently I’m looking for a way to do very rudimentary vision. Maybe detect a colored shape or something, to distinguish between “Good Guys” and “Bad Guys” (e.g. an orange star would designate a Good Guy). So far I can control the iSight camera and grab an image for processing. I’ve read a bit about “ImageJ” for the Mac, so I’m hoping I can build a module from there.

Here is an update to my machine vision pursuit:

There are open-source Java classes to do various grunt work, but I didn’t find anything that fits directly into my application. I didn’t even find compelling demos.

One night I discovered an Apple app called “Quartz Composer.” It’s a gem hidden in the 10.4 Developer tools. If you have a Mac you definitely want to look into this.

Quartz Composer (QC) is a graphics-based programming environment: you drag in an image source and a filter, connect them graphically, and then drag that output to something else. No coding. Ridiculously simple.

I was quickly able to take the input from my iSight camera, run it through a decolorizing filter, and then add color back to the object I wanted to track (a blue ball). Changing the parameters of some filters is pretty straightforward.

What you save is a .qtz file.

If qtz files ran only under “Quartz Composer” they would be a novelty at best. However, these qtz files are used for QuickTime movies, screen savers, iMovie filters, web-page motion graphics, etc. The Mac OS uses them with the interchangeability you’d expect from a jpeg.

I was also able to use the qtz file as a resource in my own custom application (like you would any picture resource) and also direct numbers from the qtz processing into fields. Those numbers could then be used to position servos, and they constitute the basis for robotic vision tracking.

Below is a simulation demonstrating where I want to go with this.

http://www.gravitypublishing.com/movies/Blob_Trac_Sim.png
Blob Tracking Simulation

This simulation actually uses my iSight and a Quartz Composer filter to change everything to B&W except the blue ball. I then overlaid the text, tracking brackets, and numerics in Motion.

It turns out that the QC “Filters” themselves can’t return numerics, since they render on the GPU. QC “Patches” are the components that return numbers. The trick is to make a custom patch for Quartz Composer that will return an X,Y value based on the center of the colored blob.

I have found a tutorial on how to create custom QC patches in Apple’s Xcode, which can use modules programmed in Java. So now I’m boning up on Java, so I can try to utilize some of the open-source code and hopefully get my robot to track a visual object.
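The centroid math itself is the easy part. Here’s a rough Java sketch of what such a patch would need to compute — the ARGB pixel layout and the “blue” test are assumptions, and none of QC’s actual patch plumbing is shown:

```java
public class BlobCenter {
    // Returns {x, y} of the matching blob's center, or null if nothing matched.
    static double[] center(int[] argb, int width, int height) {
        long sx = 0, sy = 0, n = 0;
        for (int y = 0; y < height; y++) {
            for (int x = 0; x < width; x++) {
                int p = argb[y * width + x];
                int r = (p >> 16) & 0xFF, g = (p >> 8) & 0xFF, b = p & 0xFF;
                if (b > 128 && b > r + 40 && b > g + 40) { // crude "blue" test
                    sx += x;
                    sy += y;
                    n++;
                }
            }
        }
        // Average the coordinates of every matching pixel: the blob's center.
        return n == 0 ? null : new double[] { (double) sx / n, (double) sy / n };
    }
}
```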

:open_mouth: :astonished: :laughing: Good job!!! it’s really nice what you can do with only 2 servos and a range detector! very nice

Now there is a leap forward. I can’t wait to see the end result.

nick_a

Good job. I’ve always been interested in computer vision, tracking, etc., but have never found the time. And I’m only 13.