I am looking at using the Phoenix with Xan/Zenta's code as a chassis, with another controller for the 'mind'.
Using the Phoenix code to control all the movements etc. makes for less processor overhead, as I will be using a CMUcam for some tracking work and an IMU for some adaptive work. I have asked over in the Phoenix code thread whether anyone has any serial command data, rather than me trying to work out the DualShock commands, so I can command it from my module.
Once I have this working (going to have a very powerful PIC32 'brain' at 80 MHz) I will have it doing some other trick stuff like chasing a light around, mapping a room out, and some other advanced terrain stuff, all very exciting! Just waiting to finish building my Phoenix chassis and I'll be on the way!
My background is embedded engine management systems, so I have loads of development stuff kicking around, and would love to have a bit of a robotics hobby on the go!
My Phoenix 'parts spec' so far is the Phoenix chassis with 13 kg/cm servos (3 DOF/leg), BB2, SSC-32.
I plan to add a PIC32MX controller for the intelligence section, and I have also ordered a CMUcam today to do some visual tracking.
Ultimately my dream move from this point would be to build a larger-scale version (perhaps 2 or 3 times the size), use the small, readily available pneumatic cylinders you can buy, and run a small four-stroke IC engine for air compression and power generation. I may even create a small nitro-based electric generator for this Phoenix (weight dependent), just for fun, which I guess is the best motivation one could have!!
Pics to follow as I progress, and of course I will post all of my code, findings and CAD work as I go for the bigger version!
I always set my sights high, makes for more interesting mistakes!!
Just been looking over the OpenCV stuff, and some tracking code written for a guy's PhD.
After looking further into it, I intend to have the robot operate in 3 modes:
-Fully Auto - All 'thinking' done on board: basic shape tracking, room learning, return-to-base stuff (for a charging station), hiding from noise/light (acting like a cockroach, in essence)
-Semi Auto - All 'thinking' done by a remote PC over XBee, with a wireless 802.11 camera for more advanced facial recognition, chasing stuff down, higher processes etc.
-Dumb - Remote control: FPV control via a host PC, USB joystick and a VB console app for live data streaming etc. The only autonomous functions will be the gaits and movements as per the Phoenix code…
Playing with an Arduino, some servos, a webcam and OpenCV right now.
You were able to get a CMUcam? I tried to order one about two years back and finally got a Blackfin camera. Now they're on their last legs! I will add the Blackfin camera to a RoboMagellan 'bot: autonomous, seeks out several waypoints over park-type topography, avoids obstacles, "blob tracking" of an orange cone for goals. Multi-processor, custom six-wheel-drive chassis with suspension. GPS, 9DOF IMU, sonar/IR sensors. Telemetry. Secondary R/C control.
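At its simplest, the "blob tracking" of an orange cone is a colour threshold plus a centroid over the matching pixels. A sketch in C on a raw RGB888 frame (the threshold values here are made up; a real tracker would tune them, or work in HSV, as the CMUcam and Blackfin firmware do):

```c
#include <stdint.h>

/* Find the centroid of "orange-ish" pixels in a w x h RGB888 frame.
 * Returns the number of matching pixels; writes centroid to *cx, *cy.
 * The colour test below is a crude illustrative threshold only. */
long blob_centroid(const uint8_t *rgb, int w, int h, int *cx, int *cy)
{
    long n = 0, sx = 0, sy = 0;
    for (int y = 0; y < h; y++)
        for (int x = 0; x < w; x++) {
            const uint8_t *p = rgb + 3 * (y * w + x);
            /* crude "orange": strong red, medium green, little blue */
            if (p[0] > 180 && p[1] > 60 && p[1] < 160 && p[2] < 80) {
                n++;
                sx += x;
                sy += y;
            }
        }
    if (n) {
        *cx = (int)(sx / n);
        *cy = (int)(sy / n);
    }
    return n;
}
```

The centroid's horizontal offset from the frame centre then becomes a steering error for the drive controller.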
This sounds very interesting. I had a look at those camera modules. They don’t mention face recognition/tracking at all. Is it still possible?
Looking forward to more posts.
The Blackfin does tracking; I don't know about face recognition. That sounds like an algorithm that would have to run on a PC as opposed to just running on the Blackfin CPU.
If the CMU cam can do face recognition, then I’m sure with the algorithms, the Blackfin could do it as well.