“Are there LEDs that can project farther?”
Yes: laser diodes. Below are some examples of laser gizmos. Notice that a very cheap laser pointer is used. I’ve never done any tinkering with these, but it would make an interesting school project.
You probably need to fully map out what you plan to do first. The main issue is that with the cheap laser pointers you are dealing with visible light, not IR. Cheap IR detectors that are made to respond to 38 kHz modulated IR may not see the visible light from the laser. The detector below might be hacked to remove the IR filtering so it will respond to 38 kHz modulated visible light. For testing, the laser beam could be modulated at 38 kHz by using an IR remote control to switch the power to the laser. I don’t know what type of laser you have, so I can’t comment on how you need to modify it. For the el-cheapo types, remove the batteries, insert a small dowel with a + power wire on the end, and use the housing as the - power connection. Use a tie wrap to hold the button down.
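The IR remote is one way to get the 38 kHz switching; another option (just a sketch, not something I’ve tested) is to let a microcontroller generate the carrier itself. Assuming an Arduino with the laser’s + lead switched by a transistor on pin 9 and a de-filtered 38 kHz receiver module on pin 2 (both pin choices are made up for the example):

```cpp
// Rough Arduino-style sketch: pulse the laser at 38 kHz so a standard
// IR receiver module (with its IR filter removed) can pick it up.
// Assumes the laser's + lead is switched by an NPN transistor driven
// from pin 9 -- the pins and wiring here are only an example.

const int LASER_PIN    = 9;   // drives the transistor that powers the laser
const int DETECTOR_PIN = 2;   // output of the 38 kHz receiver module

void setup() {
  pinMode(LASER_PIN, OUTPUT);
  pinMode(DETECTOR_PIN, INPUT);
  Serial.begin(9600);
}

void loop() {
  // Most 38 kHz receivers ignore a continuous carrier, so send it in
  // short bursts, the way IR remotes do.
  tone(LASER_PIN, 38000);                          // 38 kHz square wave
  delay(1);                                        // ~1 ms burst
  bool seen = (digitalRead(DETECTOR_PIN) == LOW);  // receiver output is active-low
  noTone(LASER_PIN);

  if (seen) {
    Serial.println("beam detected");
  }
  delay(10);                                       // gap between bursts
}
```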
Calculating the distance of an object with two cameras isn’t very difficult. To calculate distance, you need the distance between the image sensors (how far apart the cameras are on the bot) and the angle at which each camera is looking at the object. From there you get a triangle (isosceles if the angles are the same), and you can calculate the distance from the midpoint between the sensors to the object (rough sketch of the math below).
Don’t know if I explained that very well… if you don’t understand, I will happily clarify it for you.
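Here’s a rough code sketch of that triangle math, assuming each camera can report the angle between the baseline and its line of sight to the object. The 10 cm baseline and 80-degree angles are just example numbers.

```cpp
#include <cmath>
#include <cstdio>

// Distance from the midpoint of the camera baseline to the object.
// baseline   : distance between the two cameras
// angleLeft  : angle at the left camera between the baseline and the object
// angleRight : same for the right camera (equal angles -> isosceles case)
double stereoDistance(double baseline, double angleLeft, double angleRight)
{
    // Put the left camera at x = 0 and the right camera at x = baseline,
    // then intersect the two lines of sight:
    //   y = x * tan(angleLeft)
    //   y = (baseline - x) * tan(angleRight)
    double tl = std::tan(angleLeft);
    double tr = std::tan(angleRight);
    double x  = baseline * tr / (tl + tr);
    double y  = x * tl;

    double dx = x - baseline / 2.0;   // offset from the midpoint
    return std::sqrt(dx * dx + y * y);
}

int main()
{
    const double deg = 3.14159265358979 / 180.0;
    // Example: cameras 10 cm apart, both seeing the object at 80 degrees
    // from the baseline -> roughly 28 cm away.
    std::printf("distance = %.1f cm\n",
                stereoDistance(10.0, 80.0 * deg, 80.0 * deg));
    return 0;
}
```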
I think I understand the theory and the math, but what software is capturing the images and processing them so that the distance is calculated and the cams are positioned? Sometimes the difficulty is moving from theory to actual implementation.
If you have one year to develop this technology, you would probably need a sonar- or IR-based distance sensor to “train” your stereo vision system. The way humans and other animals with stereo vision do it was developed biologically over millions of years…
Before stereo vision can determine distance, another piece must exist: spatial awareness. I don’t know if this can be accomplished with “precision” type code; you would probably need to develop some sort of fuzzy logic to interpret the things the camera sees into distance. This is apparent in humans and animals, since both bump into things (failed collision avoidance) from time to time because of misinterpreting distance information from sight.
There’s nothing wrong with sonar detection; many animals use it to traverse obstacles where light is scarce (such as bats in a cave full of stalactites and stalagmites).
I was thinking more that you can calculate the distance just by seeing how many pixels the object is offset between one image and the other. If the offset is large, the object is close; if it’s very small, the object is far. From the pixel offset of an object between both images you could (if you have already worked out the math code) make a pretty good (I think) distance measurement (sketch below). Of course IR and sonar are much more precise, but that wasn’t the initial goal.
If you have made your code, you don’t need a year to make it work. If you’re talking about it developing itself through trial and error… I don’t think we’re there yet.
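A minimal sketch of that pixel-offset (disparity) idea, assuming two identical, parallel cameras and a focal length known in pixels from calibration; all the numbers here are made up:

```cpp
// Minimal sketch: with two identical, parallel cameras, distance is
// inversely proportional to how many pixels the object shifts between
// the left and right images.
#include <cstdio>

// focalPx   : focal length expressed in pixels (from camera calibration)
// baseline  : distance between the two cameras (same unit as the result)
// disparity : horizontal pixel offset of the object between the two images
double distanceFromDisparity(double focalPx, double baseline, double disparity)
{
    if (disparity <= 0.0)
        return -1.0;   // object too far away, or the match failed
    return focalPx * baseline / disparity;
}

int main()
{
    // Example numbers: 600 px focal length, cameras 10 cm apart.
    // A 40 px shift puts the object at 150 cm; an 80 px shift puts it
    // at 75 cm -- a bigger shift means a closer object, as described above.
    std::printf("%.0f cm\n", distanceFromDisparity(600.0, 10.0, 40.0));
    std::printf("%.0f cm\n", distanceFromDisparity(600.0, 10.0, 80.0));
    return 0;
}
```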