Pogo - RugWarriorPro With New Raspberry Pi 3 Brain

Pogo first sprang and sang “To Life” in my home in 2000, built from a Rug Warrior Pro robot kit offered by AK Peters, Ltd.  His original brain was a 2 MHz Motorola 68HC11 with 32K bytes of “non-volatile” RAM.

 Pogo - Rug Warrior Pro Robot With Raspberry Pi Brain Transplant


In 2015, at the age of 15, Pogo underwent a total brain transplant, receiving a Raspberry Pi B+. A year later he underwent a follow-up surgical upgrade to the 1.2 GHz, four-core Raspberry Pi 3 with 1 GB of RAM: 60-100 times the processing power, and roughly 30 thousand times the RAM, of the original Rug Warrior Pro 68HC11 brain.

Pogo from the ground up:

  •  RugWarriorPro "Brawns": 
    • Du-Bro 2.5" dia. captive-air soft rubber tires
    • Autotrol 6 V DC geared-down motors
    • 4 C-cell and 2 C-cell battery holders
    • 6" round non-conducting chassis
    • Delrin rolling-ball “third wheel”/skid
    • Full Skirt with six-direction bump detection
    • Added: Pololu 5V Step-Up/Step-Down Voltage Regulator S18V20F5
    • Six 5000 mAh C-cells
  • Raspberry Pi 3 - 1.2 GHz Four-Core Processor w/1 GB RAM
  • Brawns to PiDA Connection Card
  • Mikronauts Pi Droid Alpha Robotics Digital and Analog Interface
    • MCP3208 12-bit ADC
    • MCP23S17 16-channel Digital I/O Expander (with Interrupt Line tied to RPi GPIO pin 19)
    • L293D Dual H-Bridge Motor Driver
    • Separate Motor Power Path (unregulated 7.2 V)
    • Separate Servo Power Path (4.8 V four-cell tap off the battery)
    • SPI interface to RPi
    • 2:1 voltage divider from the 7.2 V battery to the ADC for 2 mV precision (see the read sketch after this list)
  • Top Layer Card (Mikronauts PiJumper)
    • Pololu SP Power Switch
    • ACS712 Current Sensor
    • Tilt-Pan Sensor Platform
      • Twin SG90 Servos
      • HC-SR04 Ultrasonic Ranging Sensor
    • Rechargeable, wired, amplified speaker
  • Sharp GP2Y0A60 Infrared Distance Sensor
    • 10-150 cm / 4 in-5 ft ranging
    • Facing 90° left for wall following
    • Non-linear analog voltage output (converted in the sketch after this list)
  • PiCam 
    • Mounted at top of skirt facing forward
  • Software:
    • Hardware Interface Library in Python
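
The ADC path is simple enough to sketch. Below is a minimal example of reading the MCP3208 over SPI with the spidev library, for both the battery divider and the Sharp IR sensor. The channel numbers, the 4.096 V reference (which is what would give exactly 2 mV per count through the 2:1 divider), and the IR curve-fit constants are all illustrative assumptions, not Pogo's confirmed wiring or calibration.

```python
# Minimal sketch: read the MCP3208 12-bit ADC over SPI.
# ASSUMPTIONS: battery divider on CH0, Sharp GP2Y0A60 on CH1,
# 4.096 V reference; none of these are Pogo's confirmed wiring.
import spidev

VREF = 4.096      # assumed ADC reference -> 1 mV per count
DIVIDER = 2.0     # 2:1 divider from the 7.2 V battery -> 2 mV per count

spi = spidev.SpiDev()
spi.open(0, 0)              # SPI bus 0, chip-select 0
spi.max_speed_hz = 1000000

def read_adc(channel):
    """Return one 12-bit sample (0-4095) from an MCP3208 channel."""
    # Command: start bit + single-ended mode + 3-bit channel number
    reply = spi.xfer2([0x06 | (channel >> 2), (channel & 0x03) << 6, 0x00])
    return ((reply[1] & 0x0F) << 8) | reply[2]

def battery_volts():
    """Battery voltage measured through the 2:1 divider."""
    return read_adc(0) * VREF / 4096.0 * DIVIDER

def sharp_ir_cm(samples=5):
    """Rough GP2Y0A60 distance from its non-linear output.

    The power-law constants below are placeholders; calibrate
    against a tape measure before trusting the numbers.
    """
    volts = sum(read_adc(1) for _ in range(samples)) / samples * VREF / 4096.0
    return 30.0 / (volts ** 1.1)

if __name__ == "__main__":
    print("Battery: %.2f V" % battery_volts())
    print("Wall at: %.0f cm" % sharp_ir_cm())
```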

Demonstrations and Test Code:

  • Sphinx Speech Recognition
  • Festival Text-To-Speech (one-line sketch below)
  • Class (with test main) for each sensor
  • Class (with test main) for motors
  • GoPiGo Python Function API (Partial)
  • RugWarriorPro Function API (Partial)
  • Battery_Life Measurement and Estimation
  • 180-degree Ultrasonic Distance Map printed to the console window (sketched below)
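
The Festival side needs nothing more than a pipe from Python; a minimal sketch:

```python
# Minimal sketch: speak a phrase through the Festival TTS engine.
import subprocess

def say(text):
    subprocess.run(["festival", "--tts"], input=text.encode(), check=True)

say("I die in 1 hour, recharge me please")
```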
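
The 180-degree distance map can be sketched as a pan-servo sweep with one HC-SR04 ping per bearing, printed as a crude bar chart. The GPIO pin numbers below are illustrative assumptions, and on Pogo the servo and sensor actually run through the Pi Droid Alpha card, so treat this as the bare-GPIO version of the idea.

```python
# Hypothetical sketch of a 180-degree ultrasonic sweep printed to the
# console. Pin numbers are illustrative; Pogo drives its servos and
# sensors through the Pi Droid Alpha board, not directly from GPIO.
import time
import RPi.GPIO as GPIO

TRIG, ECHO, PAN = 23, 24, 18          # assumed BCM pin numbers

GPIO.setmode(GPIO.BCM)
GPIO.setup(TRIG, GPIO.OUT)
GPIO.setup(ECHO, GPIO.IN)
GPIO.setup(PAN, GPIO.OUT)
pan = GPIO.PWM(PAN, 50)               # SG90 expects 50 Hz
pan.start(7.5)                        # ~center position

def ping_cm(timeout=0.03):
    """One HC-SR04 ping; returns distance in cm, or None on timeout."""
    GPIO.output(TRIG, True)
    time.sleep(0.00001)               # 10 us trigger pulse
    GPIO.output(TRIG, False)
    deadline = time.time() + timeout
    while GPIO.input(ECHO) == 0:      # wait for echo pulse to start
        if time.time() > deadline:
            return None
    start = time.time()
    while GPIO.input(ECHO) == 1:      # wait for echo pulse to end
        if time.time() > deadline:
            return None
    return (time.time() - start) * 17150   # speed of sound / 2, in cm/s

try:
    for angle in range(0, 181, 10):
        pan.ChangeDutyCycle(2.5 + angle / 18.0)  # 2.5%..12.5% ~= 0..180 deg
        time.sleep(0.3)                          # let the servo settle
        d = ping_cm()
        if d is None:
            print("%3d deg    ---" % angle)
        else:
            print("%3d deg %5.0f cm %s" % (angle, d, "#" * int(d / 5)))
finally:
    pan.stop()
    GPIO.cleanup()
```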

 

2017: After another SD card went read-only, and after finding that the built-in WiFi of my Pi3 board had gone flaky, I managed, somewhat, to bring it to life on the latest Raspbian Stretch.  Here is my Raspberry Pi 3 upgraded Rug Warrior Pro robot running egret.py (“think a lot, act a little”):

https://www.youtube.com/watch?v=X1hrGEDxV8w


This is a companion discussion topic for the original entry at https://community.robotshop.com/robots/show/pogo-rugwarriorpro-with-new-raspberry-pi-3-brain

Photos

Looking forward to seeing photos and videos of Pogo. Since there was no primary image, I temporarily added one of the Pi3. Planning to enter any competitions?

Thanks - image updated - please check

I wasn’t able to figure out the process well - I tried adding a photo and marking the one you added for deletion, but it doesn’t seem to have worked.

No competitions planned… I just want to finish the driver-level code and start working with the PiCam.

I need to learn OpenCV and create a set of camera-based “sensors” or “image analyzers”, like:

  • left and right halves of the image as light intensity sensors (can output directly to the motors for Braitenberg Vehicle behavior; sketched after this list)
  • r,theta,phi to shape-with-size area of maximum intensity
  • recognize a Window to the outside (during the day) - returning angle to center, window aspect ratio, and size estimate
  • recognize a Television screen (16:9 aspect with varying average intensity, surrounded by a band with relatively constant average intensity)
  • r,theta,phi to a QR code or AR code in the image, with the code’s “identifier” - put one on the baseboard of every room
  • Room recognizer - window match, door size match, 180 degree color sequence match, ??
  • Human presence recognizer, r,theta,phi to human
  • Human Identity “Verification” - not just human but what known human
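
The first item above is the easiest to prototype, so here is a minimal sketch of the idea with OpenCV: the mean brightness of each half of the frame drives the opposite wheel, which is the classic light-seeking Braitenberg wiring. The set_motors() function is a hypothetical stand-in for the real motor driver, and the capture assumes the PiCam is exposed through the V4L2 driver as /dev/video0.

```python
# Hypothetical sketch of a camera-as-light-sensor Braitenberg behavior:
# mean brightness of each half-frame drives the opposite motor, so the
# robot turns toward light. set_motors() stands in for the real driver.
import cv2

def set_motors(left, right):
    # Placeholder for the RugWarriorPro motor driver call.
    print("L=%5.2f  R=%5.2f" % (left, right))

cap = cv2.VideoCapture(0)           # PiCam, assuming the V4L2 driver
try:
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        half = gray.shape[1] // 2
        left_light = gray[:, :half].mean() / 255.0
        right_light = gray[:, half:].mean() / 255.0
        # Crossed excitatory wiring: light on the left speeds the
        # right wheel, steering the robot toward the light source.
        set_motors(right_light, left_light)
finally:
    cap.release()
```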

 

Eventually (10 years from now - I’m retiring in December), the bot should spend its “just got off the charger” time cataloging unknown objects in its environment, and later, when it sees that I am eating or just sitting down at my desk, it should ask if I have time to teach it some object names and attributes.

I really want to figure out how to get past hearing “I die in 1 hour, recharge me please”, plugging it in, coming back four hours later, and pressing the on button to start another 6-12 hours of “life”.  I think the robot community needs to get together and figure this out - standardize the dock, the contacts, the charging circuitry, etc.  It is a really hard problem, and it prevents robots from becoming independent.

Alan


Power

You need to change battery technology. Try 18650 batteries: https://www.robotshop.com/en/lg-18650-37v-3500mah-lipo-battery-cell.html

These are very lightweight for their size and capacity. Two cells in series provide 7.4 V at 3500 mAh; four cells in a 2S2P arrangement double that capacity to 7 Ah. Unlike most other 18650 cells, these actually deliver their full 3500 mAh rating (others may advertise as much but only manage around 2 Ah). We created a custom 11.1 V pack using these cells:

https://www.robotshop.com/en/111v-3500mah-3s1c-10c-li-ion-battery.html

That’s a pretty cool project!

Impressive upgrade from the original robot.
Looking forward to seeing your updates!