Leveraging The Community - I can't do it all myself

Forgive me a little "Make My Robot Great Again" - I am in the middle of upgrading the brain and sensors of my 17-year-old robot.  The only things I was able to reuse were:

  • Chassis
  • 7.2v 5000mAh Power source (six 1.2v NiMH C-cells)
  • Two 6v Metal Gear AutoTrol motors
  • Full Skirt with 6-direction bump detection (F, LF, LR, R, RR, RF)
  • "RC Airplane" soft rubber wheels
  • Encoders
  • "Third Wheel" skid

I have managed to add:

  • Pololu 5v Step-Up/Step-Down Voltage Regulator
  • Raspberry Pi 3
  • Micronauts Pi Droid Alpha (12-bit ADC, Motor Driver, Digital I/O, Level-Shift)
  • Tilt-Pan Sensor Platform
  • HC-SR04 Ultrasonic Distance Sensor
  • GP2Y0A6 IR Distance Sensor
  • PiCam
  • Rechargeable Amplified Audio Speaker
  • Voltage-Current Sensor

I originally set out to reproduce the Rug Warrior Pro firmware and the examples from "Mobile Robots: Inspiration to Implementation" by Jones, Flynn, and Seiger.  17 years ago I was the moderator of the Rug Warrior Yahoo Group, with 300 enthusiasts sharing how-to and software on the common Rug Warrior robot platform.  The platform was designed by PhDs and had passed through myriad college-student beta testers, so it felt like you were not alone and success was probable.

The benefit of a community that shares a common platform is tremendous.  At this point there seem to be only two choices for having this again: 1) build a ROS interface layer for my bot, or 2) build a MyGoPiGo.py interface that exposes the motors, encoders, ultrasonic sensor, and pan servo.

ROS seems like a really long pull, so I'm starting on the second path - the MyGoPiGo.py interface.

I need to figure out how to implement PID control on my platform (a minimal sketch follows this list):

  • Initialization
  • An interrupt callback/thread to read the encoders
  • A speed thread to control the motors for fwd/bwd
  • A distance thread to control the motors for fwd, bwd, spins, and turns
  • An interface to the robot thread

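As a starting point, here is a minimal sketch of the speed-thread piece in Python.  The read_encoder_ticks() and set_motor_power() helpers are placeholders I am assuming for the Pi Droid Alpha wrappers, and the gains are untuned guesses - a sketch, not a finished implementation:

```python
import threading
import time

def read_encoder_ticks(side):
    """Placeholder: cumulative encoder tick count for 'left' or 'right'."""
    return 0    # replace with a Pi Droid Alpha encoder read

def set_motor_power(side, power):
    """Placeholder: drive one motor, power in -100..100."""
    pass        # replace with a Pi Droid Alpha motor call

class SpeedPID(threading.Thread):
    """One PID loop per wheel, holding a commanded rate in ticks/second."""

    LOOP_HZ = 20    # control loop rate

    def __init__(self, side, kp=0.5, ki=0.1, kd=0.05):
        super().__init__(daemon=True)
        self.side = side
        self.kp, self.ki, self.kd = kp, ki, kd    # gains will need tuning
        self.target = 0.0                         # desired ticks/second
        self._integral = 0.0
        self._prev_error = 0.0

    def run(self):
        dt = 1.0 / self.LOOP_HZ
        last_ticks = read_encoder_ticks(self.side)
        while True:
            time.sleep(dt)
            ticks = read_encoder_ticks(self.side)
            rate = (ticks - last_ticks) / dt      # measured ticks/second
            last_ticks = ticks

            error = self.target - rate
            self._integral += error * dt
            derivative = (error - self._prev_error) / dt
            self._prev_error = error

            power = (self.kp * error + self.ki * self._integral
                     + self.kd * derivative)
            set_motor_power(self.side, max(-100.0, min(100.0, power)))

# fwd: spin both loops up and command matching wheel rates
left, right = SpeedPID("left"), SpeedPID("right")
left.start(); right.start()
left.target = right.target = 200.0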
No small feat to do it all myself, but it seems that if you choose to build your own robot platform, these are the hurdles every maker must clear first.

Alan


Solo Projects

As you indicated, embarking on an entirely custom project means that those who wish to get involved and help tend to be able to provide input on only certain parts of the project. Using a Pi normally implies some fairly high-level programming (you mention a camera - is that for image recognition or telepresence?). It would be great to see some photos.

Consider a Compass

If I understand where you are at and what your goals are, I might advocate that you try to incorporate a compass next, and a GPS if possible.

Imagine a robot that needs to get from Point A to Point B, where Point B is 100 ft away and the surface is not entirely flat - it might have pebbles, bumps, grass, whatever.

A 2-motor differential-drive bot with a compass can easily accomplish this, maintaining a nearly straight line all the way to the destination.  With GPS, it can easily make minor corrections along the way to arrive “near” Point B.  Nearness is still a problem, but at least the bot can drive a straight line and self-correct.
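
To make the idea concrete, here is a minimal sketch of a compass heading-hold loop; read_compass_heading() and set_motor_speeds() are hypothetical placeholders for whatever compass and motor drivers the bot actually has:

```python
import time

def read_compass_heading():
    """Placeholder: current heading in degrees, 0-359."""
    raise NotImplementedError   # replace with a real compass driver

def set_motor_speeds(left, right):
    """Placeholder: wheel speeds in -100..100."""
    raise NotImplementedError   # replace with a real motor driver

def heading_error(target, current):
    # Wrap into -180..180 so the bot always turns the short way around.
    return (target - current + 180) % 360 - 180

def drive_heading(target_deg, base_speed=50, kp=0.8):
    """Hold a compass heading with a proportional steering correction."""
    while True:
        err = heading_error(target_deg, read_compass_heading())
        trim = kp * err
        # Sign convention assumes headings increase clockwise;
        # swap the trim if the bot turns the wrong way.
        set_motor_speeds(base_speed + trim, base_speed - trim)
        time.sleep(0.05)
```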

I am skeptical that any bot with encoders alone could do this reliably, even on almost perfectly flat terrain.  Is it worth trying?  Even a parking lot has bumps, pebbles, etc. that would throw the encoders into massive error, either all at once or through small accumulations.  How then to self-correct and get back on track?

Self-correction becomes all the more necessary when you introduce obstacles where the bot has to drive around them and still figure out how to get close to destination with a new heading.

A lot of people go down the encoder path under the assumption that if you know how much each wheel has moved, you can calculate where your bot is headed and where it is.  I believe these assumptions are quickly invalidated in the real world.  Just one man’s opinion - I wish you luck and learning with whatever you do.

Hardware vs Software

I agree a software PID will be much weaker than adding a hardware gyro/compass.  I mentioned PID as one of the lost capabilities I am recovering after giving my bot a frontal lobotomy.  At the height of my former bot’s life, it made me very happy without precision location; the PID just made it seem a little less random when it was wandering with random avoid and escape turns.

It implemented a Rodney Brooks “subsumption architecture”; had a few goals (“keep alive” was the only meaningful one); expressed rudimentary awareness of temperature, light, sound, and humans; and expressed synthetic emotions (based on “estimated remaining life”, time since the last required escape, and whether a human was present) along with truly non-functional mood states.  It did all this in less than 32KB of code and with a very “in the moment” strategy.
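
Not the original code, just an illustration of the arbitration idea: each behavior proposes a motor command, and the highest-priority active behavior subsumes the rest.  Everything below is a placeholder sketch:

```python
import random

# Illustrative behaviors: each returns a (left, right) motor command,
# or None when it has nothing to say about the current moment.

def escape(bumpers):
    """Highest priority: back up or spin away when a bumper is pressed."""
    if bumpers:
        return (-50, -50) if "F" in bumpers else (50, -50)
    return None

def avoid(obstacle_near):
    """Middle priority: turn away from a close obstacle."""
    return (40, -40) if obstacle_near else None

def cruise():
    """Lowest priority: wander, with an occasional random veer."""
    return (50, 50) if random.random() > 0.1 else (50, 20)

def arbitrate(bumpers, obstacle_near):
    """Fixed-priority subsumption: the first active behavior wins."""
    for command in (escape(bumpers), avoid(obstacle_near), cruise()):
        if command is not None:
            return command

print(arbitrate(bumpers={"LF"}, obstacle_near=False))   # -> (50, -50)
```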

Having four processing cores, a gigabyte of memory, nearly unlimited storage, and wireless connectivity to unlimited knowledge sources gives me huge pause over “What Do You Want To Do Today?”.  I am torn between recovering my basic robot functions before I start exploring the new capabilities, leveraging the things others have done, and embarking on adding something to the “hobby robot” toolbox.

I have already tested on-board speech recognition with the CMU Sphinx engine (element14 gave me a free Pi3! - see my RoadTest Review of Pocket Sphinx on Pi2 vs Pi3), and I have tested each item of my hardware individually, but I don’t have any experience computing with the ultrasonic sensor or the camera.  I want to create a few basic “software sensors” from the PiCam and the ultrasonic distance sensor to replace the hardware sensors my bot lost in the lobotomy: left and right light intensity, left/center/right close-obstacle detection, and the human present/moving sensor.
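
A sketch of one such software sensor, assuming the standard picamera library: grab a frame, split it down the middle, and report the mean brightness of each half as the “left” and “right” photocells:

```python
import time
from picamera import PiCamera
from picamera.array import PiRGBArray

def light_left_right():
    """Software 'photocell' pair: mean brightness (0-255) of the left
    and right halves of one PiCam frame."""
    with PiCamera(resolution=(320, 240)) as camera:
        time.sleep(2)                    # let the camera's exposure settle
        raw = PiRGBArray(camera)
        camera.capture(raw, format='rgb')
        gray = raw.array.mean(axis=2)    # collapse RGB to brightness
        mid = gray.shape[1] // 2
        return gray[:, :mid].mean(), gray[:, mid:].mean()

left, right = light_left_right()
print("light left %.0f  right %.0f" % (left, right))
```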

Additionally, I really want to add “wall and door recognition” and “corner recognition and park out of the way” behaviors.  I have seen people post track-an-object and follow-an-object examples, but those capabilities are not high on my “wish list behaviors to integrate”.  Finding and recharging when “estimated life” gets low is high on my wish list and will probably be the next hardware upgrade I give my robot.  A standardized robot recharging dock and dock contactor - one that could supply USB-standard 5v at 2 amps, or that just solved the physical connection problem and let me wire my RC battery charger to the dock - would be very helpful.

By simulating the GoPiGo fwd(), us_dist(), and [pan]servo(angle) APIs, I was able to re-use someone’s “ultrasonic distance, servo scan, display map” software.  That brought a real high - both that I could re-use someone else’s code, and feeling a part of a community.  I lived through digital component standardization, then standard language libraries, then published web services.  The hobby robot community has suffered greatly from the lack of standardized platform and interface definitions.  ROS addresses this at the professional and academic level, but hobbyists and schools in grades 1-12 are still faced with recreating everything from the ground up.
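
For what it’s worth, the shim can be tiny - a MyGoPiGo.py that exposes the GoPiGo call names over my own hardware.  In this sketch only us_dist() is fleshed out (HC-SR04 timed with RPi.GPIO on placeholder pins); fwd() and servo() are stubs for the Pi Droid Alpha calls:

```python
# MyGoPiGo.py - sketch of a GoPiGo-compatible facade over my own hardware.
# Only us_dist() is fleshed out; the pins and the motor/servo internals
# are placeholders for the Pi Droid Alpha wrappers.
import time
import RPi.GPIO as GPIO

TRIG, ECHO = 23, 24    # placeholder BCM pins for the HC-SR04

GPIO.setmode(GPIO.BCM)
GPIO.setup(TRIG, GPIO.OUT)
GPIO.setup(ECHO, GPIO.IN)

def fwd():
    """Drive both motors forward (placeholder for the motor driver call)."""
    pass

def servo(angle):
    """Point the pan servo to angle degrees (placeholder)."""
    pass

def us_dist():
    """HC-SR04 distance in cm, timed in software."""
    GPIO.output(TRIG, True)
    time.sleep(0.00001)             # 10-microsecond trigger pulse
    GPIO.output(TRIG, False)
    start = stop = time.time()
    while GPIO.input(ECHO) == 0:    # wait for the echo pulse to begin
        start = time.time()
    while GPIO.input(ECHO) == 1:    # ...and time how long it stays high
        stop = time.time()
    return (stop - start) * 34300 / 2   # speed of sound, out and back
```

With those three names defined, the borrowed scan-and-map script runs against my hardware unmodified.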

When I started programming professionally (Intel 8080, Motorola 6800), I didn’t even want an operating system.  If it worked, I wanted to know how and what it did, and if it didn’t work, I wanted to know the problem was my code.  One of my first jobs was to write the runtime code for a new compiler.  The woman writing the compiler didn’t even want to know what was actually running on the processor.  I could not understand where she was coming from.  Now I want to be more like her.  (The desire to know/understand something about everything can be a curse.)

Dreaming a bit, I really want an RDF/OWL brain in my bot, with natural-language query/response generation tied to text-to-speech and speech recognition, and with visual object recognition integrated into the RDF.  I want my bot to recognize me, and to use me to increase its functional capabilities in areas it discovers it is deficient, learning only what it can use, and using its hardware, knowledge, and software to the fullest.

Oh yeah - and I don’t want to have to write all that, and I don’t want it to cost more than $500.

Here’s my robot today:

[Image: RugWarriorPro_GoPiGo.jpg]