How to synchronise two CMOS Camera Modules for Stereo Vision

It has been a while since my last post on this topic. In the meantime I have made significant progress, as you can see here: there is now a PCB with all the necessary components on it, which, for example, completely eliminates the I2C chip configuration problems I had in the past with my first wire-connected prototype.


This is a companion discussion topic for the original entry at https://community.robotshop.com/blog/show/how-to-synchronise-two-cmos-camera-modules-for-stereo-vision

Nice work!

Will the general idea be to generate depth through horizontal disparity? If so, what are your plans regarding matching? Will you be using OpenCV or working on your own algorithms?

This is my masterplan:

  1. Generate a side-by-side live standard-definition video signal from the separate signals of two camera modules mounted at an appropriate pupillary distance (see the sketch after this list).
  2. Transmit this signal with standard components from a robot/RC car/quadrocopter/etc. to my remote control.
  3. Feed the signal into a Zeiss Cinemizer Plus or maybe its successor, the Cinemizer OLED.
  4. Enjoy stereo vision from a remote position.
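
To make step 1 concrete, here is a minimal offline sketch in Python/numpy of the side-by-side packing, assuming PAL-sized frames of 720×576 (the function name and the crude column-dropping decimation are mine for illustration, not the actual FPGA design): each camera image is squeezed to half width so the pair fits into a single SD frame, which the goggles then stretch back out per eye.

```python
import numpy as np

PAL_W, PAL_H = 720, 576  # standard-definition PAL active resolution

def pack_side_by_side(left, right):
    """Squeeze two PAL frames to half width and place them side by side.

    Dropping every other column is the crudest horizontal decimation;
    a real design would low-pass filter first to avoid aliasing.
    """
    assert left.shape == right.shape == (PAL_H, PAL_W, 3)
    return np.hstack((left[:, ::2], right[:, ::2]))  # -> (576, 720, 3)

# Quick check with dummy frames:
left = np.zeros((PAL_H, PAL_W, 3), dtype=np.uint8)
right = np.full((PAL_H, PAL_W, 3), 255, dtype=np.uint8)
assert pack_side_by_side(left, right).shape == (PAL_H, PAL_W, 3)
```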

My focus at the moment is not an autonomous robot with 3D-vision algorithms but rather remotely controlling air or ground vehicles with stereo vision. For RC airplanes or RC helicopters this is also known as First Person View (FPV), a quite common topic thanks to the availability of video goggles like this one. But to my knowledge none of them offers stereo vision, so I tried to build my own.

As the digital video signals (BT.601 and BT.656 standards) arrive at 27 MHz, I am forced to use an FPGA instead of, e.g., a Parallax Propeller. The FPGA synchronizes the two video streams and rearranges them into the side-by-side image layout that the Cinemizer expects. It basically acts as a video-stream filter: read two streams and make one out of them.
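
To illustrate what that filter has to do, here is a rough, non-real-time Python model, assuming BT.656-style framing where each video line is delimited by SAV/EAV timing reference codes (FF 00 00 XY, with the XY byte carrying the F/V/H flags). The function names and the decimation scheme are my own sketch, not the actual FPGA logic:

```python
def split_active_lines(stream):
    """Yield the active-video bytes between each SAV and the next EAV.

    In BT.656, bit 4 of the XY byte is H (0 = SAV, 1 = EAV) and
    bit 5 is V (1 = vertical blanking), so we only accept SAV codes
    that lie outside the blanking interval.
    """
    i, n = 0, len(stream)
    while i + 3 < n:
        if stream[i] == 0xFF and stream[i+1] == 0x00 and stream[i+2] == 0x00:
            xy = stream[i + 3]
            if (xy >> 4) & 1 == 0 and (xy >> 5) & 1 == 0:  # active-line SAV
                j = i + 4
                # scan forward to the closing EAV code
                while j + 3 < n and not (
                    stream[j] == 0xFF and stream[j+1] == 0x00
                    and stream[j+2] == 0x00 and (stream[j+3] >> 4) & 1
                ):
                    j += 1
                yield stream[i+4:j]
                i = j
                continue
        i += 1

def side_by_side(left_line, right_line):
    """Pack two 4:2:2 lines (Cb Y Cr Y byte order) into one line.

    Keeping every other Cb-Y-Cr-Y group halves each eye's horizontal
    resolution, so both halves fit into one standard-definition line.
    """
    half = lambda line: b"".join(line[k:k+4] for k in range(0, len(line), 8))
    return half(left_line) + half(right_line)

# Two dummy 8-pixel lines (16 bytes each in 4:2:2) pack into one 16-byte line:
assert len(side_by_side(bytes(16), bytes(16))) == 16
```

On the real hardware the same data flow would be implemented with line buffers and a pixel-clock state machine rather than byte scanning, and the two camera streams must be genlocked (the point of this project) so that matching lines arrive at nearly the same time.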