I want to take my Bliss Robot to the next level of "object avoidance". Please note that I just want the "next level", not the "ultimate level" (yet) ;-)
I got some advice from ignoblegome, but I would like to ask openly what that "next level" might be. I have seen several things out there:
2 x IR sensors in a single servo
3 x fixed IR sensors (left-front / front / right-front)
IR / Ultrasound combination (etc.)
It would be great if someone could point me to an example which is more advanced than my robot (but well explained)...
More things I am considering (related):
I read some interesting things about 2D mapping (so I can guess whether the robot would fit through a "hole" instead of just changing direction...)
As I am already using a MaxBotix EZ1, I was thinking of adding 1 or 2 more and using the daisy chain mode (which could be a fast way of getting readings?)
I’ve been thinking of something like this for a while too (multiple sensors and clever math for detection); perhaps you could do some sort of PID-based avoidance with all the sensors?
The Wikipedia article on PID control is a bit technical, but I got the explanation after a few reads.
So, what is your goal? Just find the largest opening or distance and go that way? Wander around avoiding objects? Go to a certain place, do something, then come back? Depending on your goal (or I should say your robot’s goal) you will need a different approach.

But you also need to take speed into consideration. Say you want to follow a line. If your robot is slow, you can do it with just one sensor (I’ve done it with Lego). But as you increase the speed, you will soon need 3 sensors, then 5 sensors, or your robot will miss the line. More sensors will allow you to increase your traveling speed. That’s why people have used 2 or 3 IR sensors on one panning servo. This setup allows the robot to gather 180 degrees of distance data by panning only 90 degrees (or 60 degrees for a 3-sensor setup). An even faster setup has a rotating laser and sensor that spins fast and takes 360 degrees of distance data.

That should be enough to do mapping, even SLAM, but the problem is that the map takes a lot of memory. So far the best mapping results I’ve seen involved a computer: the robot would take measurements and send them to the computer for display and mapping.
If a computer is involved for mapping, I would suggest using a webcam installed up high that can see the whole room. Put an IR beacon on the robot so the webcam can see it, then use a grid system and tell the robot its position in the grid. This is how soccer robots find their way on the field and also find the ball.
If you need video data from the robot, then it gets complicated. You either need to install a computer (just a micro motherboard or SBC or a Chumby…) directly on the robot, or use a wireless camera setup. The cheapest wireless setup has a tiny camera, a receiver and a USB video capture box. This way, you can set the wireless cam to act like a regular webcam and use MRL, ROS, RoboRealm, MSRS and others to do the mapping and more.
I guess the most obvious step is to replace my current sensor (I always read 3 positions) with 3 fixed sensors and use the same program, only "faster" since I do not need to wait for servo moves… but I would like an improvement in the wandering algorithm, not just speed.
I think it would be better to have some kind of "2D map", which is really just an array with several measurements (say 9). But if I just turn to the angle where the distance is longest, I am still basically doing the same thing, only with more resolution. I need to read more about mapping or SLAM (googling this now, suggestions welcome).
I would like to wander around avoiding obstacles… I will leave learning the space (rooms) for a later stage, so it can go from one known place to another.
I haven’t found many Arduino code examples with more than one sensor… just to confirm whether I am doing things right by triggering one while the other is off…
Lastly, I am still unsure whether to buy 2 (or 3) Sharp IR sensors or a couple of additional MaxBotix EZ1s… the daisy chain mode sounds promising, but it is much less used out there…