Review: Pixy CMUcam5 Image Sensor

     First of all, I would like to thank Let's Make Robots for selecting me to review this great product!

     Second, I have to say I'm so sorry for taking soooo long to post the review. Spare time is scarce, and when I have some, I work hard on MDi #4.

     I will not go into many technical details, since that is already well covered on the product's official pages (some links at the end of this post). This post is a quick overview of Pixy; mainly I intend to share my fun, experiences, difficulties and impressions of this nice electronic module.

     I would like to show you more good content right now... but I also don't want to delay this post any longer. So... here we go!

 

 

     What is Pixy?

     Have you ever thought about connecting a camera to your Arduino? (I guess many people have already thought about adding vision to their robots.) Actually, connecting a camera directly to an Arduino is not so simple (or not even possible), and even if it were, the image processing would overwhelm the Arduino.

     Pixy was created to meet this need. Thanks to its powerful onboard processor, it does the heavy image processing itself, so the work left for the microcontroller is easy and fast.

     Pixy processes the image (detecting, for example, a blue object at a given position) and sends the object's coordinates to the microcontroller. With this data you can, for example, make a robotic arm grab the desired object, or drive DC motors to follow it; there are many possibilities.

     A great feature is that you can "teach" objects to Pixy. Actually, it is the color of the object that gets stored (as a "signature"). To teach Pixy an object you can use the "teaching button" or the PixyMon application.

 

 

     PixyMon

     PixyMon is a great application "to see what your robot sees". You can also set the signatures, change some settings, run the Pan/tilt Demo, and perform a few other tasks.

 

 

     Pixy + Arduino

     Pixy can work with several microcontrollers. I have tested Pixy only with Arduino (since that is the only microcontroller I have, and it's my main focus).

     And it couldn't be easier! A cable comes with Pixy to connect it directly to Arduino boards through the ICSP pins.

 

     The picture below shows the Arduino IDE Serial Monitor printing the coordinates of the detected object/signature while the Arduino runs the hello_world example.
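     For reference, here is roughly what that sketch does, paraphrased from memory (the exact code ships with the Pixy Arduino library as the hello_world example, so check that for the authoritative version):

// Condensed sketch in the spirit of the library's hello_world example
// (paraphrased from memory; names follow the Pixy Arduino library).
#include <SPI.h>
#include <Pixy.h>

Pixy pixy;                        // talks to Pixy over SPI via the ICSP pins

void setup()
{
  Serial.begin(9600);
  pixy.init();                    // start communication with Pixy
}

void loop()
{
  // getBlocks() returns the number of detected objects ("blocks") and
  // fills pixy.blocks[] with their signature, x, y, width and height.
  uint16_t blocks = pixy.getBlocks();

  for (uint16_t j = 0; j < blocks; j++)
  {
    Serial.print("sig ");
    Serial.print(pixy.blocks[j].signature);
    Serial.print("  x=");
    Serial.print(pixy.blocks[j].x);
    Serial.print("  y=");
    Serial.println(pixy.blocks[j].y);
  }
  delay(50);                      // don't flood the Serial Monitor
}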


 

 

     MDi #4 + Pixy

     As I proposed in my entry for the “Call for Reviewers”, I’ll be using Pixy in my robot MDi #4. In the second video (from top to bottom) you can see my robot with Pixy running the Pan/tilt Demo. I was running the demo directly from PixyMon (without using the Arduino), with the servo motors connected directly to Pixy. Since Pixy was powered by the USB cable, and I was using MG995 servos, I used an external power supply for the servo motors, because the USB port has limited current capability.

 

     Tip: while running the Pan/tilt Demo, start with low values for the "gain" and increase it until you find the ideal value.

 

 

     Makeblock mBot + Pixy

     I saw some videos of robots chasing objects, and it looked like a lot of fun. I thought mBot would be ideal to try something similar. Since its brain (the mCore board) uses an ATmega328 (like the Arduino UNO) and also has the ICSP pins, I just needed to make a bracket (from high-impact polystyrene) to attach Pixy to mBot, and I could immediately start working on the code.

 

     The last two videos show some tests with mBot chasing a green LEGO bucket. The code is working well (it just needs some fine adjustments), and when it's done I'll share it here.
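     In the meantime, the general idea is simple: read the biggest block reported by Pixy and steer the motors in proportion to how far the object is from the center of the image. Below is a rough sketch of that approach (not my final code; the left/right motor helpers are just placeholders for whatever motor driver your board uses, and the gain and speed values need tuning):

#include <SPI.h>
#include <Pixy.h>

Pixy pixy;

const int   CENTER_X   = 160;    // Pixy's image is about 320 px wide, so ~160 is center
const int   BASE_SPEED = 120;    // forward speed (0-255), tune for your robot
const float KP         = 0.6;    // proportional turning gain, tune this

// Placeholder motor helpers: replace the bodies with the calls of your
// motor driver/library (e.g. the Makeblock library on the mCore board).
void driveLeft(int speed)  { /* set left motor PWM here  */ }
void driveRight(int speed) { /* set right motor PWM here */ }

void setup()
{
  pixy.init();
}

void loop()
{
  uint16_t blocks = pixy.getBlocks();

  if (blocks)
  {
    // blocks[0] is the largest detected object
    int error = (int)pixy.blocks[0].x - CENTER_X;  // < 0: object is to the left
    int turn  = (int)(KP * error);

    driveLeft(constrain(BASE_SPEED + turn, -255, 255));
    driveRight(constrain(BASE_SPEED - turn, -255, 255));
  }
  else
  {
    // nothing detected: stop (or spin slowly to search)
    driveLeft(0);
    driveRight(0);
  }
}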

     Issues: mCore has built-in RGB LEDs that flicker randomly while running the Pixy code. It would be nice if the color of the LEDs matched the color of the target object... but the way it is now is very annoying. I simply covered the LEDs with electrical tape to shoot the videos. I'm not sure what the cause is (I guess some conflict between the mBot libraries and the way mCore drives its ports).

 

 

     Conclusion and final considerations

     Pixy may seem limited to working only with colors. But this limitation can be overcome by working with "color codes", which are combinations of two or more color signatures placed next to each other to form a code. It is a must-have sensor if you want to make smarter robots.
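     For example, as I understand the documentation, a color code made of two signatures shows up as a single block whose signature combines them in octal notation, so signatures 1 and 2 together would be reported as octal 12. A small snippet of how you might check for it (please double-check this against the CMUcam docs):

uint16_t blocks = pixy.getBlocks();
for (uint16_t j = 0; j < blocks; j++)
{
  // Assuming color-code signatures are reported in octal, as I read the docs:
  // a code built from signatures 1 and 2 -> 012 (octal) == 10 (decimal).
  if (pixy.blocks[j].signature == 012)
  {
    // found the two-color tag; blocks[j].x and blocks[j].y give its position
  }
}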

     I intend to keep exploring this sensor in some future projects and share them here.

 

     Something to look forward to is the face detection capability. Can't wait for it!

Image: face detection with Pixy (face_detection_pixy.png)

     From: http://www.cmucam.org/projects/cmucam5

 

     Comments, questions and contributions are appreciated.

 

 

     References and links

     http://www.cmucam.org/projects/cmucam5 - Overview - CMUcam5 Pixy - CMUcam: Open Source Programmable Embedded Color Vision Sensors

     http://tutorial.cytron.com.my/2015/08/28/colour-tracking-mobile-robot-pixy/ - Colour Tracking Mobile Robot (PIXY) - Tutorial by Cytron - Very good code for an object-follower robot.

https://www.youtube.com/watch?v=YLRgmmcyUqg


Nice review.  This reminds me that I need to complete mine! 

I drew some different conclusions about the device.  I will be interested in your response.  Like you, I am short on time and keep wanting to try new things to make sure I am giving it its due.

Regards,

Bill