Mike's Biped Torso Kit w/ pics

One more photo:

This shows the “neck” and how it looks from a head-on view.

http://img77.imageshack.us/img77/1982/torsofront2cn0.jpg

THAT IS ABSOLUTELY INSANE!! IT EVEN TALKS!!! So sik MIKE!!! :open_mouth:

(ya, it's one of those screaming posts I so often give). :smiley:

Pretty sick ■■■■ man. I think your bot is better than ASIMO :stuck_out_tongue: I like the future camera addition a lot. But isn't that footage from the CMUcam2? Which you don't have on the bot… and it's footage I've seen somewhere else :stuck_out_tongue:

I'm sure he just incorporated those graphics into the video to give us an understanding of what the bot's future vision will be like… don't take everything you see as what will really happen. I'm sure it's just a future idea… :wink:

erm, the cam is able to track objects, colors, people and hand movements…
Intel even released OpenCV, a library that can be used to turn your webcam into a camera that can track stuff, etc.
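For anyone curious what "tracking" boils down to: the simplest version is to threshold each frame for a target color and then report the center of the matching pixels. Here's a minimal sketch in plain Python (no OpenCV needed) — the tiny hand-made "frame" is just a stand-in for real webcam pixels:

```python
# Minimal sketch of color tracking: threshold a frame for a target
# color, then report the centroid of the matching pixels.  A real
# setup would grab frames from a webcam (e.g. via OpenCV); the frame
# here is a tiny made-up stand-in so the idea is easy to follow.

def track_color(frame, is_target):
    """Return (x, y) centroid of pixels where is_target(pixel) is True."""
    xs, ys = [], []
    for y, row in enumerate(frame):
        for x, pixel in enumerate(row):
            if is_target(pixel):
                xs.append(x)
                ys.append(y)
    if not xs:
        return None  # target color nowhere in view
    return (sum(xs) / len(xs), sum(ys) / len(ys))

# A mostly-black frame with a small red "object" near the top right.
BLACK, RED = (0, 0, 0), (255, 0, 0)
frame = [[BLACK] * 8 for _ in range(8)]
frame[1][5] = frame[1][6] = frame[2][5] = frame[2][6] = RED

is_red = lambda p: p[0] > 200 and p[1] < 50 and p[2] < 50
print(track_color(frame, is_red))  # -> (5.5, 1.5)
```

Run the same thing on every frame and the centroid moving between frames is your "tracking" — that's the core of what the hand-waving demo shows, everything else is refinement.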

i c

No, the video you see is the software that will be used with the CMOS camera, which will fit the new head design.

This is actual footage of the software in action. I do not have the CMOS camera yet, so instead I used my webcam as the video input source. In the video, I briefly showed my hand, and my daughter waving her arms, to demonstrate the tracking.

I was experimenting with the software to prepare for the CMOS camera. Pretty neat, huh?

Here is a rendering of the new design:

http://img50.imageshack.us/img50/5969/ccdheadso3.jpg

http://img124.imageshack.us/img124/2897/ccdhead2nn9.jpg

how will you judge distances with the new head design? or will it not need to do that?

and could you post where I could find that CMOS camera software? it looks quite interesting :laughing: 8)

Well, Steve, the author of the software (called RoboRealm), is working on a distance algorithm that will allow the bot to judge distance. How this works: an object in a video frame has a certain apparent size, and the closer you move to the object, the larger it becomes in the frame. Just like when you zoom in on an object with your camera, that object fills more of the frame.
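The size-in-frame idea above is just the pinhole camera relation: pixel width scales with real width divided by distance, so one calibration shot at a known distance lets you invert it. A sketch with made-up illustration numbers (not real camera data):

```python
# Distance from apparent size, as described above: an object of known
# real-world width appears smaller (fewer pixels wide) the farther
# away it is.  Calibrate once at a known distance, then invert the
# relation.  All numbers here are invented for illustration.

def calibrate(known_distance_cm, known_width_cm, measured_px):
    """Camera constant f = distance * pixel width / real width."""
    return known_distance_cm * measured_px / known_width_cm

def estimate_distance(f, known_width_cm, measured_px):
    """Distance grows as the object's pixel width shrinks."""
    return f * known_width_cm / measured_px

# Suppose a 10 cm wide ball measures 100 px across at 50 cm away.
f = calibrate(50.0, 10.0, 100.0)          # f = 500.0
print(estimate_distance(f, 10.0, 50.0))   # half the pixels  -> 100.0 cm
print(estimate_distance(f, 10.0, 200.0))  # double the pixels -> 25.0 cm
```

The catch, and why it's only "basic" distance sensing: you have to know (or assume) the real width of whatever you're looking at, which IR or sonar never needs.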

wow, that's pretty sick stuff :smiley:, so it uses, like, depth perception? with only one eye? :open_mouth:

ahhh, so that was actual footage with the camera…I thought you incorporated some kind of demo video into the vid to give us a look at what you will be doing…

that software is nifty…I WANNA I WANNA!!!

Sorry, I did not finish my last post. My wife came home and I was supposed to be fixing the shower. She's gone now, so I'll finish what I was going to say. :laughing:

I have been working closely with Steve, the author of RoboRealm, on practical ideas to improve the software, and the distance idea was one of my suggestions. It is not implemented in the latest release yet, but it should provide some basic distance sensing. It's not going to be as good as IR or sonar, but it should keep the bot from running into walls and other large objects. :laughing:

It's been available on my website under “Vision Control” for a long time. My website link is at the bottom of every reply I post.

You can get it direct by going to www.roborealm.com

My bad Mike :laughing: Sorry about that hehe ^^’’ I remember seeing something similar to that hand thing done with the CMUcam. This RoboRealm program looks interesting. Does it work with any camera? Webcams too? How does this “RoboRealm” software work when you have the camera on the robot itself? Wouldn't you need to link the cam to the computer somehow? Or are you gonna have a Mini-ITX or a Gumstix on your bot?

Oh yea, one more thing. Not sure if you've ever worked with IR sensors, but I have for over a year or so, and I can easily say “f them.” (sorry for my language) I would suggest going with something better, like the ultrasonic sensor Nick found at SparkFun.

LEAVE HIS DAUGHTER ALONE!!! TOUCH HER AND MIKE WILL KILL YOU WITH HIS BIPED!!!

there, I said it for her protection :smiley:

Yes the cam needs to be connected to the computer. I plan to send the video feed wirelessly to the pc, and then the results from RoboRealm wirelessly back to the bot.
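The round trip described here — frames out to the PC, vision results back to the bot — can be sketched with a pair of UDP sockets on loopback. Real hardware would use the WiPort or a video transmitter instead; the port numbers and message format below are invented purely for illustration:

```python
# Sketch of the round trip described above: the bot sends a video
# frame to the PC, the PC runs vision on it, and a motor command goes
# back.  Loopback sockets stand in for the wireless link; the ports
# and message format are made up for illustration.
import socket

PC_PORT = 9100   # hypothetical port the "PC" side listens on
BOT_PORT = 9101  # hypothetical port the "bot" side listens on

pc = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
pc.bind(("127.0.0.1", PC_PORT))
pc.settimeout(5)

bot = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
bot.bind(("127.0.0.1", BOT_PORT))
bot.settimeout(5)

# Bot side: send a (fake) frame to the PC.
bot.sendto(b"FRAME:0001", ("127.0.0.1", PC_PORT))

# PC side: receive the frame, "process" it, reply with a command.
frame, _ = pc.recvfrom(1024)
command = b"TURN_LEFT" if frame.startswith(b"FRAME") else b"STOP"
pc.sendto(command, ("127.0.0.1", BOT_PORT))

# Bot side: receive the command it should act on.
reply, _ = bot.recvfrom(1024)
print(reply.decode())  # -> TURN_LEFT

pc.close()
bot.close()
```

The nice property of this split is that all the heavy vision work stays on the PC, and the bot only ever has to parse short commands.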

IR sensors have their advantages. Sonar is reportedly more accurate than IR, and my bot has the sonar elements installed in the eye sockets of the current head design.

that's cool, sounds like you've got everything figured out then.

That's pretty cool. Lots of work, though. What advantages does IR have? As far as I've seen in my 14 months of using it, it totally sucks. It gets different results based on the color and material it is facing, and the values you get flicker a lot. However, it's really cheap. I found an IR LED w/ transmitter and emitter for 99 cents; ultrasound is like $30 =/
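For what it's worth, one common fix for flickery IR readings is a running median filter: the median throws away one-off spikes that an average would smear into your distance estimate. A small sketch with invented sensor values:

```python
# Median filtering to tame flickery IR readings: keep the last few
# raw samples and report their median, which ignores one-off spikes.
# The sample values below are invented for illustration.
from statistics import median

def smooth(readings, window=5):
    """Yield the running median over the last `window` raw readings."""
    buf = []
    for r in readings:
        buf.append(r)
        if len(buf) > window:
            buf.pop(0)
        yield median(buf)

raw = [30, 31, 30, 95, 30, 29, 31, 30]   # 95 is a flicker spike
print(list(smooth(raw)))                  # the 95 never shows up
```

It won't fix the color/material sensitivity (that's physics), but it does make the readings stable enough to threshold against.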

Mind if I ask how you plan on talking to your computer? Are you going to use a WiPort or something? Or one of those $80 wireless cameras with a receiver, and then plug the receiver into your computer's TV tuner (or other AV-to-computer device)?

I played with the RR program for an hour or so. It is pretty simple, but hard to get right, and without having a robot to try it on, it's sort of difficult. Question though: how did Steve get the PC to control his robot in the tutorials? He used a VB script to do it, but how did the variables from the RR program get to the VB script, and how did the VB script control the motors?

I plan to use two types of wireless devices. The WiPort is one of them, and the other is a small video transmitter. If I can use the WiPort for both, that would be best.

you have to show us this setup :laughing:

if you get it working, could you show everyone how to do it in the future? because I, along with everyone else, would like to know how to do it, so we can implement it in our own robots :laughing:

Sure, no problem.

My main concern is weight. I hope my bot does not get too heavy with so much stuff on it. :laughing:

We shall see.