Arduino and SLAM

Hello,

I am developing a robot platform for children for educational purposes. My aim is to build a sub-$100 bot with Arduino brains. I am a good programmer myself, and if I were building this just for me I would use an ARM or a more capable platform, but I have to constrain this project to Arduino.

I have already designed some parts, and I have access to a 3D printer, so it is all going well. I already have a servo/Sharp IR sensor scanner, and currently I am using a Tamiya tracked vehicle as the platform.

Now here is my question: I don't want to build just another ROV, or just another bug with object avoidance. I would like to implement at least SLAM. Do any of you think it would be possible to build a SLAM algorithm with an Arduino and simple sensors such as Sharp IR sensors? I have found http://openslam.org/tinyslam.html - a SLAM implementation written in about 200 lines of C code. Could it be ported to the Arduino?

Have any of you done similar work?

Best Regards,

C.B.

I’ve looked at things like

I’ve looked at things like that too, and it is quite possible (at least it seems so to me) processing-power-wise. The only problem I can see is the limited RAM on an ATmega328P: a map can grow big quite fast, even if you keep the accuracy “low”.


Adding an EEPROM or maybe storing the map on an SD card might be an idea.

I’ve tried using an IR

I’ve tried using an IR sensor/servo scanner to do scan matching before, but ended up quitting because the sensor doesn’t provide enough data and the data isn’t accurate enough.

The problem is that scan matching and SLAM applications require that you can detect features in the data. Since the IR setup only produces sparse data, you don’t get fine enough detail to reliably detect features.

Looking at the tinySLAM webpage, they mention that they used a Hokuyo URG-04LX, which has a range of about 4 meters and records about 670 measurements over 240 degrees in a tenth of a second. A servo/IR sensor setup isn’t going to get anywhere near this speed and measurement density (not to mention the range).

I’m not trying to say it’s impossible; I tried for about a month to get my application to work, and I would love to see what you’re able to come up with if you give it a go.

My two cents for what it is worth

You might have to lower your expectations if you want to succeed at this. There is a lot of inaccuracy in sensors and actuators, and being a good programmer isn’t the only qualification you need to minimise it. What are your plans once you can map an area? To output the resultant map to a computer screen to see its success?

Having a map, what kind of tasks would you give it to exercise the localisation part? Perhaps you could make an arena for it: an area that can change. Then you could output each changed configuration to see if it updates correctly? But that is still the mapping part.

I mean, most people want a robot that can fetch a beer from the fridge, but that takes infinitely more power and cost. You could instead make an environment suited to a small bot and have it fetch blocks, or whatever, once it has mapped out its environment and knows what is a block and what is a wall. I would keep the area small, from a tabletop up to maybe a 3m x 3m area. The smaller it is, the more interesting it will be, because the task can be achieved in a short enough time to maintain the attention span of the group you are trying to teach.


As far as sensing goes,

ChristheCarpenter’s Wii camera breakout board, a laser, and a servo should give some pretty good data. Or you could try to replicate the $25 Kinect clone that was covered on hackaday.com.

Do you have a link for that

Do you have a link for that $25 kinect clone? I couldn’t find it…

Thanks!