I built an autonomous navigation robot using the move_base and amcl packages with ROS.
It navigates on a map created with gmapping (https://youtu.be/9EaXi4bh2s0).
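For reference, sending a navigation goal to move_base from Python looks roughly like the sketch below. This is a minimal sketch assuming the standard actionlib interface and a goal expressed in the map frame; the target coordinates are placeholders, not points from this robot's map.

```python
#!/usr/bin/env python
# Minimal sketch: send one navigation goal to move_base (ROS Kinetic, Python 2).
# The goal coordinates below are placeholders, not values from this robot's map.
import rospy
import actionlib
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

def send_goal(x, y, yaw_w=1.0):
    client = actionlib.SimpleActionClient('move_base', MoveBaseAction)
    client.wait_for_server()

    goal = MoveBaseGoal()
    goal.target_pose.header.frame_id = 'map'      # frame provided by map_server / amcl
    goal.target_pose.header.stamp = rospy.Time.now()
    goal.target_pose.pose.position.x = x
    goal.target_pose.pose.position.y = y
    goal.target_pose.pose.orientation.w = yaw_w   # identity orientation by default

    client.send_goal(goal)
    client.wait_for_result()
    return client.get_state()

if __name__ == '__main__':
    rospy.init_node('send_nav_goal')
    send_goal(1.0, 0.5)
```

The same kind of goal can also be set interactively from RViz with the 2D Nav Goal tool.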
The first video shows autonomous navigation before tuning.
The second video shows color tracking (not using ROS); a rough sketch of this approach is shown after these video descriptions.
The third video shows attitude control with the IMU.
The fourth video shows autonomous navigation after tuning.
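For the color tracking in the second video (which does not use ROS), a minimal HSV-threshold approach with OpenCV might look like the sketch below. The color range, camera index, and capture method here are assumptions, not the exact code running on the robot.

```python
# Minimal sketch of HSV color tracking with OpenCV (not using ROS).
# The HSV range below (roughly red) and the camera index are assumptions.
import cv2
import numpy as np

cap = cv2.VideoCapture(0)  # Pi camera via the V4L2 driver, or any webcam

while True:
    ok, frame = cap.read()
    if not ok:
        break

    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array([0, 120, 70]), np.array([10, 255, 255]))

    # Centroid of the detected blob -> could be turned into a steering command
    m = cv2.moments(mask)
    if m['m00'] > 0:
        cx, cy = int(m['m10'] / m['m00']), int(m['m01'] / m['m00'])
        cv2.circle(frame, (cx, cy), 5, (0, 255, 0), -1)

    cv2.imshow('tracking', frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

cap.release()
cv2.destroyAllWindows()
```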
The move_base parameters after tuning have been uploaded to GitHub:
https://github.com/ranbo0311/Mecanum_robot.git
The parameters are based on this repository: https://github.com/zaki0929/raspimouse_navigation_2.git
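While tuning, it can be convenient to adjust local planner parameters at runtime with dynamic_reconfigure instead of editing YAML files and restarting move_base. The sketch below assumes the DWA local planner (the /move_base/DWAPlannerROS namespace) and uses example values only; the actual planner and tuned values are in the repositories above.

```python
#!/usr/bin/env python
# Sketch: tweak local planner parameters at runtime while tuning move_base.
# Assumes the DWA local planner; with the trajectory rollout planner the
# namespace would be /move_base/TrajectoryPlannerROS instead.
import rospy
import dynamic_reconfigure.client

rospy.init_node('tune_local_planner')
client = dynamic_reconfigure.client.Client('/move_base/DWAPlannerROS', timeout=5)

# Example values only -- the tuned values for this robot live in the GitHub repo.
client.update_configuration({
    'max_vel_x': 0.25,   # slow down for a small indoor robot
    'sim_time': 2.0,     # look further ahead when simulating trajectories
})
```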
Robot
Size/Weight
-W x L x H : 200 x 240 x 170 mm
-1.5 kg
Controller
-Raspberry Pi 3 B+ (Raspbian Stretch + ROS Kinetic)
-4 x Arduino Pro Mini (motor driver, analog sensor + ultrasonic sensors, 2 x rotary encoders)
-PC (Windows 10 + VirtualBox [Ubuntu 16.04 + ROS Kinetic])
Hardware
-4 x Mecanum wheel, 60 mm (see the kinematics sketch after this spec list)
Materials
-Polycarbonate plate (Body)
-ABS (3D printer : https://www.tiertime.com/up-mini-2/)
Actuators
-4 x DC motor (RE-260RA + 74:1 gearbox)
-2 x servo motor (SG-90)
Sensors/Other devices
-8 x Ultrasonic sensor (HC-SR04)
-1 x Temperature sensor (LM61CIZ)
-1 x IMU (MPU-6050)
-1 x camera (picamera v2)
-1 x Lidar (YDLIDAR X4)
-4 x Rotary encoder (https://www.pololu.com/product/3499)
-4 x Current sensor (ACS712ELCTR-05B-T)
-1 x Ring LED (Neo Pixel Ring)
-2 x Motor driver IC (TB6612FNG)
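Because move_base publishes a single body velocity (cmd_vel) while the base has four independently driven mecanum wheels, that velocity has to be mapped to four wheel speeds. Below is a sketch of the standard mecanum inverse kinematics; the 30 mm wheel radius comes from the 60 mm wheels, but LX and LY are placeholders rather than measurements of this robot.

```python
# Sketch: map a body velocity (vx, vy, wz) to four mecanum wheel speeds.
# Standard X-configuration mecanum kinematics; LX/LY below are placeholders.
WHEEL_RADIUS = 0.03  # 60 mm wheels -> 0.03 m radius
LX = 0.08            # half the wheelbase   (assumed, not measured)
LY = 0.09            # half the track width (assumed, not measured)

def mecanum_wheel_speeds(vx, vy, wz):
    """Return wheel angular velocities [rad/s] for
    (front-left, front-right, rear-left, rear-right)."""
    k = LX + LY
    fl = (vx - vy - k * wz) / WHEEL_RADIUS
    fr = (vx + vy + k * wz) / WHEEL_RADIUS
    rl = (vx + vy - k * wz) / WHEEL_RADIUS
    rr = (vx - vy + k * wz) / WHEEL_RADIUS
    return fl, fr, rl, rr

if __name__ == '__main__':
    # Pure sideways motion: left and right wheels spin in opposite senses.
    print(mecanum_wheel_speeds(0.0, 0.2, 0.0))
```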
YouTube (my channel): https://www.youtube.com/channel/UCgEETDZfbyr0_9rYclqnPsw
Autonomous navigation
- Actuators / output devices: DC Motors, Servo motors
- Control method: autonomous
- CPU: Raspberry Pi, PC, Arduino
- Programming language: Python, C++
- Sensors / input devices: YDLIDAR X4, Rotary encoders, IMU sensor, camera
This is a companion discussion topic for the original entry at https://community.robotshop.com/robots/show/autonomous-navigation-mecanum-wheel-robot