Mecanum Wheel robot pet

Posted on 29/09/2018 by ranbo0311
Introduction
I am making a robot pet. (Not a pet robot!)
 

Robot

Size/Weight

  • W × L × H: 200 × 240 × 170 mm
  • Weight: 1.5 kg


The goal of my project is to create a robot pet that works indoors. (Not a pet robot!)

I think of pet robots as robots that resemble animals (such as dogs and cats).

However, what I'm trying to make is a pet in the form of a robot.

The following functions are necessary to achieve this goal:

  • Moving mechanism (mapping, autonomous navigation, etc.)
  • Cute appearance
  • Recognition functions (audio, camera, etc.)
  • Reaction functions (movement, voice, etc.)

I am developing the robot to implement these functions.

I keep a development record on YouTube (https://www.youtube.com/channel/UCgEETDZfbyr0_9rYclqnPsw).

System configuration (I cannot upload images for some reason, so I will link to the image instead)

 

I adopted Mecanum wheels. A legged walking mechanism would make the robot the same as a pet robot, so instead I use Mecanum wheels, which let the robot move in any direction.

Mecanum wheel

https://www.robotshop.com/en/60mm-mecanum-wheel-set-2x-left-2x-right.html

I implemented the control by referring to the following page:

https://seeeddoc.github.io/4WD_Mecanum_Wheel_Robot_Kit_Series/

I used an Arduino Pro Mini and a TB6612FNG motor driver to control the robot.

Arduino programs: https://drive.google.com/open?id=1puwvaQ0M6EIU0JviEIyd0kBxR8JDYrvE
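The math behind Mecanum control is compact enough to sketch. The following Python snippet (my own illustration, not the author's Arduino code; the wheel order, sign conventions, and geometry values are assumptions) shows the standard inverse kinematics that turn a desired body velocity into four wheel speeds:

```python
# Standard Mecanum inverse kinematics (illustrative sketch, not the
# author's Arduino code). Wheel order and signs are assumptions; they
# depend on roller orientation and motor wiring.

def mecanum_wheel_speeds(vx, vy, wz, half_length=0.12, half_width=0.10):
    """Return (front_left, front_right, rear_left, rear_right) speeds.

    vx: forward velocity, vy: leftward (strafe) velocity,
    wz: counter-clockwise rotation rate; geometry in metres.
    """
    k = half_length + half_width          # lever arm for rotation
    front_left  = vx - vy - k * wz
    front_right = vx + vy + k * wz
    rear_left   = vx + vy - k * wz
    rear_right  = vx - vy + k * wz
    return front_left, front_right, rear_left, rear_right
```

Driving straight uses equal speeds on all four wheels; a pure strafe spins the two diagonals in opposite directions, which is what lets the robot slide sideways without turning.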

At first I was using a TA7291P as the motor driver, but I changed to the TB6612FNG because of its large voltage loss.

Power-supply voltage: 5.08 V

PWM frequency (output from the Arduino Pro Mini): 31.25 kHz

TA7291P output voltage:

3.90 V (100%), 3.16 V (80%), 2.43 V (60%), 1.69 V (40%), 0.95 V (20%)

TB6612FNG output voltage:

4.44 V (100%), 3.71 V (80%), 2.96 V (60%), 2.21 V (40%), 1.49 V (20%)

On average, the TA7291P's output voltage is 0.54 V lower than the TB6612FNG's.
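The 0.54 V figure follows directly from the measurements above; a few lines of Python reproduce the average:

```python
# Average the per-duty-cycle differences between the two motor drivers
# (voltage values taken from the measurements above).
ta7291p   = (3.90, 3.16, 2.43, 1.69, 0.95)   # V at 100/80/60/40/20 % duty
tb6612fng = (4.44, 3.71, 2.96, 2.21, 1.49)

diffs = [b - a for a, b in zip(ta7291p, tb6612fng)]
avg_diff = sum(diffs) / len(diffs)
print(f"average difference: {avg_diff:.2f} V")   # -> 0.54 V
```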

 

I am planning to use the IMU to compensate the heading so the robot goes straight.

I used the Madgwick filter to calculate the orientation quaternion.

http://x-io.co.uk/open-source-imu-and-ahrs-algorithms/
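A minimal sketch of the compensation idea (my own illustration, not the author's code; the gain is an assumed, untuned value): extract yaw from the quaternion the Madgwick filter outputs, then feed the heading error through a proportional term that biases the left/right wheel speeds:

```python
import math

# Illustrative sketch of heading compensation, not the author's code.
# The proportional gain is an assumed value, not a tuned one.

def yaw_from_quaternion(w, x, y, z):
    """Yaw (rotation about the vertical axis) in radians."""
    return math.atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z))

def heading_correction(target_yaw, current_yaw, kp=0.8):
    """P-controller output; add to one side's speed, subtract from the other."""
    # atan2 of sin/cos wraps the error into [-pi, pi]
    error = math.atan2(math.sin(target_yaw - current_yaw),
                       math.cos(target_yaw - current_yaw))
    return kp * error
```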

 

Then, for autonomous navigation inside the room, a Raspberry Pi 3B+ (Raspbian Stretch + ROS Kinetic) was used to control the entire robot. Furthermore, to run the navigation packages and visualize the robot's state, I prepared an external PC (VirtualBox, Ubuntu 16.04 + ROS Kinetic) that communicates with the robot via ROS.

This video shows autonomous navigation with the parameters before tuning. I used gmapping to create the map.

At this stage, the robot moves with non-holonomic behavior.

The next video shows autonomous navigation after tuning the parameters. The map used is the same.

This time the robot moves with holonomic behavior.

The move_base parameters are here: https://github.com/ranbo0311/Mecanum_robot.git
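The full parameter files are in the repository above. For illustration only (this is not the author's actual file, and the velocity values are assumptions), the non-holonomic/holonomic switch is a single flag in the default base_local_planner configuration:

```yaml
# Illustrative fragment, not the author's actual configuration.
TrajectoryPlannerROS:
  holonomic_robot: true            # false gives non-holonomic behavior
  max_vel_x: 0.3                   # assumed values, not tuned ones
  max_vel_theta: 1.0
  y_vels: [-0.3, -0.1, 0.1, 0.3]   # lateral speeds sampled when holonomic
```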

ROS configuration

Although some parts are still unfinished, the movement mechanism is mostly done.

 

The next step is recognition.

This step is still in progress.

First, color tracking was done without ROS, using the Pi Camera v2 (picamera) and OpenCV.

In the video, I am operating the tracked robot with a controller.

 

What I am doing now is simple color recognition, but I would like to add face recognition, object recognition, and so on in the future.

 

Next, AR marker tracking was done using ROS.

I used OpenCV's aruco module in Python to detect the AR markers.

 

 

 

For voice recognition, I plan to use a USB microphone and Julius.
