Force Field Algorithm
I feel like I had really good luck with sonars and the “Force Field Algorithm”. I’ll donate some Arduino code if it’s anything you’re interested in. Once I smoothed out the sonar data a bit, it gave pretty good obstacle avoidance while moving around and looking elsewhere. At a few bucks per sonar, it’s hard to beat.
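The smoothing I did was basically a median-of-last-N filter per sonar. This isn’t my actual Arduino code, just a quick Python sketch of the idea (window size is whatever worked for me, tune it yourself):

```python
from collections import deque
from statistics import median

class SonarSmoother:
    """Median-of-last-N filter, one per sonar. Knocks out the single-ping
    glitches that cheap HC-SR04-style sensors are prone to."""

    def __init__(self, window=3):
        self.readings = deque(maxlen=window)  # oldest reading falls off

    def update(self, distance_cm):
        self.readings.append(distance_cm)
        return median(self.readings)  # one outlier can't move the median
```

A single bogus short echo gets ignored until it repeats, at the cost of a couple of readings of lag.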
I think around 12 are needed to be effective indoors, cover 360 degrees, and handle wall bouncing. My bot only covered 270ish, so it would turn back towards a wall (if its goal was behind the wall) after it had turned directly away from it, as the force field was blind in the back. This can be solved in software with a short-term memory of what is behind, or some other technique… still wish I had 12 though. Despite the blind spot, it would work its way around a wall while attempting to reach its goal.
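The core of the force field idea is just vector math: a pull toward the goal plus a push away from each sonar hit, stronger the closer the obstacle. Here’s a minimal Python sketch of that (the gain and max-range numbers are made up, not from my bot):

```python
import math

def force_field_heading(sonar_cm, sonar_angles_deg, goal_angle_deg,
                        repulse_gain=2000.0, max_range=200.0):
    """Blend an attractive unit vector toward the goal with a repulsive
    push from each sonar reading; return the heading to steer toward."""
    # Attractive force: unit vector pointing at the goal
    fx = math.cos(math.radians(goal_angle_deg))
    fy = math.sin(math.radians(goal_angle_deg))
    for dist, ang in zip(sonar_cm, sonar_angles_deg):
        if dist <= 0 or dist >= max_range:
            continue  # no echo / out of range: contributes no force
        mag = repulse_gain / (dist * dist)  # 1/d^2 falloff
        # Push points *away* from the obstacle (opposite the sonar bearing)
        fx -= mag * math.cos(math.radians(ang))
        fy -= mag * math.sin(math.radians(ang))
    return math.degrees(math.atan2(fy, fx))
```

With no obstacles in range it just heads straight for the goal; a close obstacle dead ahead flips the resultant and the bot turns away. The blind-in-the-back problem shows up here too — sonars you don’t have simply contribute nothing.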
Did you do anything with the laser line level yet?
Nice work, I really hope you stay interested in this project, I think we all can learn a lot from it. I think I’ll hold off on LIDAR until I see what happens with yours.
I read about the Force Field Algorithm in a University of Waterloo paper from a couple of years ago.
Would be VERY much interested in seeing how you implemented it. It will be a bit before I get back to the laser line level. I’m re-doing my code around command and sensor processing in Python, moving a lot of my “git-er-dun” style inline coding into appropriate classes, and threading where I can. I’ve removed the mySQL command-queueing nonsense that I had between the webserver and the bot and replaced it with a websocket client/server. Much more responsive (but you all knew that!)
I’m still teeing the commands to a mySQL table for logging and potential replay, but that may even go away in the future…
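The tee itself is simple: send the command down the low-latency path first, then log a copy to SQL. Rough sketch of the shape of it in Python — sqlite3 standing in for mySQL here, and `send_to_bot` standing in for the actual websocket send:

```python
import json
import sqlite3
import time

class CommandTee:
    """Forward each command to the bot immediately, and tee a copy into a
    SQL table for logging / later replay (sqlite3 stands in for MySQL)."""

    def __init__(self, send_to_bot):
        self.send_to_bot = send_to_bot  # e.g. a websocket send function
        self.db = sqlite3.connect(":memory:")
        self.db.execute("CREATE TABLE cmd_log (ts REAL, cmd TEXT)")

    def send(self, cmd):
        self.send_to_bot(cmd)  # low-latency path first, logging second
        self.db.execute("INSERT INTO cmd_log VALUES (?, ?)",
                        (time.time(), json.dumps(cmd)))

    def replay(self):
        """Yield logged commands in the order they were sent."""
        for (raw,) in self.db.execute("SELECT cmd FROM cmd_log ORDER BY rowid"):
            yield json.loads(raw)
```

Replay is then just feeding `replay()` back through the same send path.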
I’m also trying to understand how to set up a publish and subscribe system to support multiple “bots” as they come online. Or more appropriately… to support multiple sensors as they get added to a bot.
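For the in-process case, a publish/subscribe hub can be tiny — topic strings mapped to handler lists. Something like this Python sketch is where I’d start (topic names are just examples I made up):

```python
from collections import defaultdict

class SensorBus:
    """Minimal in-process publish/subscribe hub: each sensor publishes on
    a topic string, any number of handlers can subscribe per topic."""

    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, topic, handler):
        self._subs[topic].append(handler)

    def publish(self, topic, message):
        for handler in self._subs[topic]:
            handler(message)  # deliver to every subscriber of this topic
```

New sensors (or new bots) just publish on their own topic, and nothing else has to change. Going across machines you’d swap this for a real broker, but the subscribe/publish shape stays the same.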
By the looks of your videos, I’ve got a couple years of catchup to do. Hopefully I can lean on you from time to time to guide me in the right direction.
I’m very interested in where you are going with this as its an area I intend to spend more time on when life/kids and work allow me to!
I spend a lot of time watching insects, which are essentially pretty dumb but seem to get around OK even with very little brain power, and they don’t get stuck in corners either.
I’ve been thinking about having two levels of navigation: one being low-level, basic object avoidance, similar to an insect. Your sonars and IR sensors seem ideal for that. Then a higher-level intelligence that can perform the localization and mapping/search path navigation, which the laser range finder seems ideal for.
So you can use the laser to make a plan and head for your goal, but the sonars can override it to get around obstacles, and then when the danger is passed, the higher-level cortex makes a new plan.
Love to see where you go with this.
This robot is great! You seem to have the same kind of goals as I do (I suppose most people here would want the autonomous, “goto location”, and manual drive modes, really). I’ve got my Pi and various other parts, and I’m just trying to work out what to use as a frame.
I look forward to seeing how you go with it!