Autonomous Robot Navigation

Since October 2007 I have been developing a new object recognition algorithm called “Associative Video Memory” (AVM).

The AVM algorithm uses a principle of multilevel decomposition of recognition matrices; it is robust against camera noise, scales well, and is simple and quick to train.

Now I want to introduce my experiments with robot navigation based on visual landmark beacons: “Follow me” and “Walking by gates”.


Follow Me


Walking from p2 to p1 and back

I implemented both algorithms in the Navigator plugin for use within the RoboRealm software,
so you can now review my experiments with the AVM Navigator.

The Navigator module has two basic algorithms:

-= Follow me =-
The navigation algorithm attempts to align the camera turret and the robot body
with the center of the first recognized object in the tracking list; if the object
is far away, the robot approaches it, and if it is too close, the robot backs away.
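
For illustration, here is a minimal Python sketch of such a follow-me control loop. This is not the module’s internal code: the `object_x`/`object_size` inputs and the returned commands are placeholders for whatever your own vision and drive layers provide.

```python
# Illustrative sketch of a "Follow me" control loop (not the actual AVM internals).
# Assumptions: object_x is the horizontal position of the recognized object in the
# frame (0.0 = left edge, 1.0 = right edge) and object_size is its apparent size
# relative to a desired reference size.

def follow_me_step(object_x, object_size,
                   target_size=1.0, center=0.5, deadband=0.05):
    """Return (turn, forward) commands in the range -1.0 .. 1.0."""
    # Turn toward the object so it stays centered in the image.
    error_x = object_x - center
    turn = 0.0 if abs(error_x) < deadband else max(-1.0, min(1.0, 2.0 * error_x))

    # Approach if the object looks small (far away), back off if it looks too big (too close).
    error_size = target_size - object_size
    forward = 0.0 if abs(error_size) < deadband else max(-1.0, min(1.0, error_size))

    return turn, forward

# Example: object slightly to the right and far away -> turn right and drive forward.
print(follow_me_step(object_x=0.7, object_size=0.4))
```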

-= Walking by gates =-
The gate data contains weights for the seven routes; they indicate the importance of the gateway for each route. A “horizon” indicator was added at the bottom of the screen, which shows the direction in which to adjust the robot’s motion for further movement along the route. A gate field is painted blue if the gate does not participate in the current route (weight 0), and warmer colors (up to yellow) show the gradation of the gate’s “importance” in the current route.

  • The route training procedure (a small sketch of the gate data follows this list)
    To train a route, you have to select the actual route (button “Walking by way”)
    in “Nova gate” mode and then drive the robot manually along the route (the gates will be installed automatically). At the end of the route, click the “Set checkpoint” button; the robot will then turn several times on the spot and mark its current location as a checkpoint.
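
As a rough illustration of the gate weights described above, here is a small hypothetical Python sketch. The field names and the blue-to-yellow color ramp are my assumptions for illustration, not the module’s actual data format.

```python
# Hypothetical representation of gate data: one weight per route (seven routes),
# plus a simple blue-to-yellow color ramp for the "importance" display.

NUM_ROUTES = 7

class Gate:
    def __init__(self, weights):
        assert len(weights) == NUM_ROUTES
        self.weights = weights  # importance of this gate for each route, 0.0 .. 1.0

def gate_color(gate, route, blue=(0, 0, 255), yellow=(255, 255, 0)):
    """Blue for weight 0 (gate not on this route), warmer toward yellow as weight grows."""
    w = max(0.0, min(1.0, gate.weights[route]))
    return tuple(int(b + (y - b) * w) for b, y in zip(blue, yellow))

gate = Gate([0.0, 0.8, 0.2, 0.0, 0.0, 1.0, 0.0])
print(gate_color(gate, route=1))  # warm color: this gate matters for route 1
print(gate_color(gate, route=0))  # pure blue: gate does not participate in route 0
```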

So, if the robot is walking by gates and suddenly sees an object that it can recognize, it will switch to navigating by the “Follow me” algorithm.

If the robot cannot recognize anything (gate or object), it will turn around on the spot
to search (it may twitch from time to time in a random way).

AVM Navigator v0.7 is now released, and you can download it from the RoboRealm website.
Two new modes were added in this version: “Marker mode” and “Navigate by map”.

Marker mode

Marker mode builds the navigation map automatically by marking the space as the robot moves. You just have to lead the robot manually along some path and repeat it several times for good map detail.

Navigation by map

In this mode you point to the target position on the navigation map; the robot then plans a path (maze solving) from its current location to the target position (big green circle) and begins walking to the target automatically.

New module variables were added for external control of the “Navigate by map” mode:

NV_LOCATION_X - current location X coordinate;
NV_LOCATION_Y - current location Y coordinate;
NV_LOCATION_ANGLE - horizontal angle of robot in current location (in radians);

Target position on the navigation map:
NV_IN_TRG_POS_X - target position X coordinate;
NV_IN_TRG_POS_Y - target position Y coordinate;

NV_IN_SUBMIT_POS - submits the target position (the value should be switched 0 -> 1 to trigger the action).
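
For example, an external script could drive this mode roughly like the sketch below. The `rr.set_variable`/`rr.get_variable` helpers are placeholders for whatever RoboRealm API binding you use (COM, socket, etc.); only the variable names and the 0 -> 1 submit behavior come from the module.

```python
import time

def send_target(rr, x, y):
    """Submit a new target position to the "Navigate by map" mode.

    `rr` is assumed to expose set_variable(name, value) / get_variable(name)
    backed by the RoboRealm API; replace these with your actual binding.
    """
    rr.set_variable("NV_IN_TRG_POS_X", x)
    rr.set_variable("NV_IN_TRG_POS_Y", y)
    # The submit flag must transition 0 -> 1 for the action to be taken.
    rr.set_variable("NV_IN_SUBMIT_POS", 0)
    rr.set_variable("NV_IN_SUBMIT_POS", 1)

def wait_until_arrived(rr, x, y, tolerance=5, poll_s=0.5):
    """Poll the robot's map location until it is close to the target."""
    while True:
        cur_x = float(rr.get_variable("NV_LOCATION_X"))
        cur_y = float(rr.get_variable("NV_LOCATION_Y"))
        if abs(cur_x - x) <= tolerance and abs(cur_y - y) <= tolerance:
            return
        time.sleep(poll_s)
```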

Examples


Quake 3 Odometry Test


Navigation by map


Visual Landmark Navigation

Thank you!

It would be nice if somebody here could try the AVM Navigator in action.

It is easy :slight_smile:

You just have to use the variables described below to connect your robot to the AVM Navigator:

Use the NV_TURRET_BALANCE variable for camera turning: it indicates the turn degree amount. The value ranges from -100 to 100, with zero being straight forward.

Use the NV_L_MOTOR and NV_R_MOTOR variables for motor control; they range from -100 to 100 (“-100” - full power backwards, “100” - full power forwards, “0” - motor off).
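
A minimal connection loop might look like the sketch below. The `rr.get_variable` call and the `drive`/`aim_camera` functions are placeholders for your RoboRealm API binding and your motor/servo controller; only the variable names and their -100..100 ranges come from the module.

```python
import time

def clamp(v, lo=-100, hi=100):
    return max(lo, min(hi, v))

def control_loop(rr, drive, aim_camera, period_s=0.05):
    """Forward AVM Navigator outputs to the robot hardware.

    rr.get_variable(name)  - placeholder for your RoboRealm API binding
    drive(left, right)     - placeholder: set track power, -100..100 each
    aim_camera(balance)    - placeholder: turn the camera servo, -100..100
    """
    while True:
        left = clamp(int(float(rr.get_variable("NV_L_MOTOR"))))
        right = clamp(int(float(rr.get_variable("NV_R_MOTOR"))))
        balance = clamp(int(float(rr.get_variable("NV_TURRET_BALANCE"))))

        drive(left, right)    # -100 = full backwards, 0 = stop, 100 = full forwards
        aim_camera(balance)   # 0 = camera facing straight forward

        time.sleep(period_s)
```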

In my experiments I used:

Software: Windows 7, RoboRealm\AVM_Navigator

Hardware:

  • tracked platform;
  • motor/servo controller (OR-AVR-M128-DS + OR-USB-UART);
  • HXT12k (servo for camera rotation);
  • Logitech HD Webcam C270;
  • 3Q NetTop Qoo (Intel Atom 230, 1600 MHz).

http://www.youtube.com/watch?v=0jXW83az20E

You can find out more in the topic “Using of AVM plugin in RoboRealm”.

**Quake 3 Mod**

Don’t have a robot just yet? Then click here to view the manual that explains how to set up RoboRealm
with the AVM module to control the movement of, and process images from, the Quake first-person video game.
This allows you to work with visual odometry techniques without needing a robot!

The additional software needed for this integration can be downloaded here.

Is it possible to play with a virtual robot in “Navigation by map” mode?

Yes!

Just look into the documentation and download the “AVM Quake 3 mod” installation.

The next modification, AVM Navigator v0.7.2.1, is released.

Changes:
The visual odometry algorithm was updated:


Visual Odometry

I have made a new plugin for RoboRealm:

roborealm.com/help/EDV_DVR_thumb.jpg

Digital Video Recording system (DVR)

You can use the “DVR Client-server” package as a Video Surveillance System in which parametric data
(such as VR_VIDEO_ACTIVITY) from different video cameras will help you search for the video fragment
that you are looking for.

You can also use the “DVR Client-server” package as a powerful instrument for debugging your video processing
and control algorithms; it provides access to the values of your algorithm variables that were archived
during recording.

Technical Details

  • ring video/parametric archive with a duration of 1 to 12 months;

  • configurable database record (for parametric data) with a maximal length of 190 bytes;

  • writing of parameters to the database with a discretization of 250 ms;

  • the DVR Client can work simultaneously with four databases that can be located on remote computers.
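
Purely as an illustration of those numbers (not the actual DVR storage format), a parametric ring archive could be sketched like this; the record layout and helper names are my assumptions:

```python
import struct
import time

RECORD_LEN = 190        # maximal configured record length, bytes
SAMPLE_PERIOD = 0.25    # parameters are written with 250 ms discretization

class RingArchive:
    """Toy fixed-size ring buffer of fixed-length parametric records."""
    def __init__(self, capacity):
        self.records = [None] * capacity
        self.index = 0

    def write(self, payload: bytes):
        record = payload[:RECORD_LEN].ljust(RECORD_LEN, b"\0")
        self.records[self.index] = record
        self.index = (self.index + 1) % len(self.records)  # oldest data is overwritten

# Example: log one float parameter (e.g. VR_VIDEO_ACTIVITY) every 250 ms for one second.
archive = RingArchive(capacity=4 * 3600 * 24)  # ~1 day of 250 ms samples

def log_activity(read_activity, duration_s=1.0):
    end = time.time() + duration_s
    while time.time() < end:
        archive.write(struct.pack("<d", read_activity()))
        time.sleep(SAMPLE_PERIOD)
```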

I prepared a simple video tutorial, “Route training and navigation by map”:

http://www.youtube.com/watch?v=qVz9iBazqug

See more details about tuning the “Marker mode” and “Navigation by map” modes.

forums.trossenrobotics.com/attachment.php?attachmentid=3507&d=1317583050&thumb=1

forums.trossenrobotics.com/attachment.php?attachmentid=3508&d=1314812632&thumb=1

This is a test of the new algorithm for AVM Navigator v0.7.3. The video demonstrates how the robot tries to go back to a checkpoint from different positions along the learned route.

http://www.youtube.com/watch?v=wj-FKhdaU5A

Back to checkpoint!

It would be nice if you could provide some sketches (images) of your indication variant :wink:

Thank you for your support!

Do you use color matching or just grayscale matching?

The AVM algorithm uses only grayscale matching, so AVM cannot see the difference between a green ball and a red one. A good object for recognition with AVM should therefore have an appreciable texture. Also, during training you should try to make the object occupy as much of the interest area (red rectangle) as possible, because anything placed inside the interest area becomes part of the object for AVM (background included).

Now I want to draw your attention to the following problem: “What does your robot see, and how does it affect navigation?”

In our case the robot uses only a limited sequence of images from the camera for navigation. The AVM Navigator application just tries to recognize images in order to understand the robot’s location. So if you show it some sequence of images during route training (just lead your robot in “Marker mode”), then in automatic navigation mode the robot must be able to see the same sequence of images without any changes. Otherwise the AVM Navigator will not be able to recognize the images, and localization and navigation will fail.

So, this kind of navigation does not need specially prepared markers on the walls. However, the walls should not be bare (without any texture).

You can find out more in this topic:
roborealm.com/forum/index.php?thread_id=4246#

Now I am working on a new “Watching mode” that allows the robot to respond to motion in the input video.

http://www.youtube.com/watch?v=c1aAcOS6cAg

This mode will be included in the next version, AVM Navigator v0.7.3.

AVM Navigator v0.7.3 is released!

**Changes:**

  • The new “Back to checkpoint” algorithm was added in “Navigation by map” mode.

http://www.youtube.com/watch?v=wj-FKhdaU5A

  • Also, a new “Watching mode” was developed.
    In this mode the robot can move toward the direction where motion was noticed.

http://www.youtube.com/watch?v=c1aAcOS6cAg

Overall usability was also improved.

By the way, I received a new video from a user who succeeded with “Navigation by map”:

http://www.youtube.com/watch?v=214MwcHMsTQ

His robot video and photo:

http://www.youtube.com/watch?v=S7QRDSfQRps

roboforum.ru/download/file.php?id=22484&mode=view
roboforum.ru/download/file.php?id=22280&mode=view
roboforum.ru/download/file.php?id=22281&mode=view

I believe that you will also have success with visual navigation using the AVM Navigator module :slight_smile:

>> could this be integrated into ROS or a similar framework?

The AVM Navigator module works in the RoboRealm environment, and you can use the RoboRealm API to connect the AVM Navigator (with the help of the “NV_” variables) to any external application.

See for more details:
Extending/Interfacing/Programming RoboRealm
AVM Navigator Module
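
For instance, a small bridge node (ROS 1 is shown here only as an example) could poll the “NV_” localization variables and republish them as a pose topic. The `rr.get_variable` helper is again a placeholder for your RoboRealm API binding, and the topic/node names are made up for illustration.

```python
#!/usr/bin/env python
# Sketch of a bridge that republishes AVM Navigator localization as a ROS topic.
# rr.get_variable(name) is a placeholder for your RoboRealm API binding.
import rospy
from geometry_msgs.msg import Pose2D

def run_bridge(rr):
    rospy.init_node("avm_navigator_bridge")
    pub = rospy.Publisher("avm_pose", Pose2D, queue_size=10)
    rate = rospy.Rate(10)  # poll at 10 Hz
    while not rospy.is_shutdown():
        msg = Pose2D(
            x=float(rr.get_variable("NV_LOCATION_X")),
            y=float(rr.get_variable("NV_LOCATION_Y")),
            theta=float(rr.get_variable("NV_LOCATION_ANGLE")),  # radians
        )
        pub.publish(msg)
        rate.sleep()
```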


Yet another video from a user whose robot has an extremely high turn speed, yet the AVM Navigator module could control the robot’s navigation even in this difficult condition!

http://www.youtube.com/watch?v=G7SB_jKAcyE

His robot video:

http://www.youtube.com/watch?v=FJCrLz08DaQ

This is a test of a new robot for the AVM Navigator project:

http://www.youtube.com/watch?v=F3u0rTNBCuA

If you have one, then we could test it with the AVM Navigator.

See this topic for more details.

It seems that I have not understood you.

Have you considered selling it as an APP on MyRobots?

What did you mean?

Did you mean selling the application “RoboRealm + AVM Navigator” together with some of the robots presented on the MyRobots site?

Or did you mean selling our finished product (“Twinky rover” + “RoboRealm + AVM Navigator”) from the MyRobots site?

Twinky rover:

edv-detail.narod.ru/winky_s.jpg

Playing with the Twinky rover controlled by the AVM Navigator:

http://www.youtube.com/watch?v=4tpwAvcmZf8

http://www.youtube.com/watch?v=pt2y7xkiTXo

Twinky rover presentation:

http://www.youtube.com/watch?v=Fkjpma1oWAg