Visual Rover w/ Arm Control Software (VRACS)

I have recently equipped my rover with an XBee and I want to write my own control software :smiley:. I will try to keep this updated and release it to the community when I’m done.

The idea of the software is to create a “game-like” interface built around a completely interactive 3D model of the rover. The user can drag the arm around in 3D space or control it with a joystick. The camera feed will be projected in front of the robot. Later I will add support for the Sharp GP2D12 IR rangefinder so the projected image has actual depth, which will make it easier to interact with any environment. I will be using Blitz3D.

Right now I can control the robot from the interface, but it is very complicated. I’m trying to implement IK (inverse kinematics) now… :neutral_face:

Here’s the 3D model used…

http://i1007.photobucket.com/albums/af195/bucketlamp/robot_v2.jpg

Pretty cool project! Looking forward to seeing how it goes. :slight_smile:

I can’t figure out how to read from the GP2D12 sensor using the SSC-32. I have connected it as in this diagram: lynxmotion.com/images/assemb … c32m02.gif. My question is: what commands do I send to the SSC-32 to get feedback? I have tried “VA VB” (analog) and “A B C DL” (digital). With the digital command I get “1111”, and with the analog one I get nothing. Please help, anyone :smiley:

Show us an image of your wiring…

Read Analog Inputs.

Example: "VA VB VC VD "

VA, VB, VC, and VD read the value on the input as analog. Each command returns a single byte with the 8-bit (binary) value for the voltage on the pin.

When the ABCD inputs are used as analog inputs the internal pullup is disabled. The inputs are digitally filtered to reduce the effect of noise. The filtered values will settle to their final values within 8 ms of a change. A return value of 0 represents 0 VDC. A return value of 255 represents +4.98 VDC. To convert the return value to a voltage, multiply by 5/256. At power up the ABCD inputs are configured for digital input with pullup. The first time a V* command is used, the pin will be converted to analog without pullup. The result of this first read will not return valid data.

Read Analog Input Example: "VA VB "

This example will return 2 bytes with the analog values of A and B. For example, if the voltage on Pin A is 2 VDC and Pin B is 3.5 VDC, the return values will be the raw binary bytes 102 and 179.

Look into your code. Even if the GP2D12 were not connected you should receive a 0 back when doing the query. You can jumper the input to +5 VDC to see a 255 from the query.
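For reference, here is a minimal sketch of that query-and-convert sequence in Python with pyserial (the port name, baud rate, and fixed two-byte reads are assumptions for illustration; the actual project talks to the SSC-32 from Blitz3D over the XBee link):

```python
# Sketch only: query the SSC-32 analog inputs A and B and convert the
# raw bytes to volts. Port name and baud rate are assumed values.
import serial

ssc32 = serial.Serial("COM3", 115200, timeout=1)

# Per the manual excerpt above, the first V* query after power-up
# switches the pins to analog mode and does not return valid data,
# so issue one query and throw the reply away.
ssc32.write(b"VA VB \r")
ssc32.read(2)

# Query again; the SSC-32 answers with one raw binary byte per input.
ssc32.write(b"VA VB \r")
raw = ssc32.read(2)

for name, value in zip(("A", "B"), raw):
    volts = value * 5.0 / 256.0   # 0 -> 0 VDC, 255 -> ~4.98 VDC
    print(f"Pin {name}: raw={value} -> {volts:.2f} V")
```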

My connection is exactly like this: http://i1007.photobucket.com/albums/af195/bucketlamp/connection1.jpg
I am also using the 5 V on C and D for the XBee; is this the problem…?

I wasn’t able to get a read even in LynxTerm, but then I charged the battery :stuck_out_tongue: (I need to buy that 2800 mAh battery). Now I can get a reading in LynxTerm: when I click the buttons for VA, VB, etc., they all show pretty random values, mostly around 18, but not zero like before :stuck_out_tongue:. In my program I am getting characters instead of numbers… The first read is a | (instead of 0), then letters and whatnot; I think it is converting everything into a string. Moving my hand towards and away from the sensor does change the values (all of them, actually :question:), so it’s working, I just can’t interpret the values. Any ideas?

Well, I guess leaving the robot on while programming without resets kind of makes it unreliable; should have known that… Also, I had to convert the characters to their ASCII values, duh. Works great now, thanks Jim :smiley:

Yeah, low batteries will make you crazy. Your problem has nothing to do with powering the XBee from the SSC-32; it has ample current for that device. In fact there is no problem at all: it is doing exactly what you are telling it to do. The SSC-32 is sending you a binary value, which gets displayed as “random” (not really random, but seemingly random) characters because a terminal can only display ASCII values properly. Are you writing a program? If so, what language are you using? Edit: looks like you figured it out. lol
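To make the character-versus-byte confusion concrete, here is a tiny illustration (plain Python, not the actual Blitz3D code): the byte 102 from the 2 VDC example above renders in a terminal as the letter “f”, and ord() recovers the number:

```python
# A string-based read hands you characters; ord() recovers the raw
# byte value, which is what the 5/256 conversion needs.
reply = "f"                     # chr(102): what the terminal shows
value = ord(reply)              # 102
volts = value * 5.0 / 256.0     # ~1.99 V, the 2 VDC example above
print(value, volts)
```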

Here’s a screen shot of some progress.

http://i1007.photobucket.com/albums/af195/bucketlamp/panel_sc1.jpg

The red lines are distances short enough to be objects detected by the IR sensor in the front; the ones off to the side (not supposed to be there) are pretty rare (noise, probably from my giant tabletop LED lamp). Now I’m working to eliminate the noise, determine the width of the object, and finally program it to pick up the object on command… :smiley:
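One common way to attack both the noise and the width problem, sketched here in Python (the distance threshold, filter window, and angular step are made-up values, not the rover’s):

```python
# Median-filter the distance sweep to reject single-sample noise
# spikes, then find contiguous runs of "close" readings and report
# each run's angular width.
import statistics

def median_filter(readings, window=3):
    half = window // 2
    out = []
    for i in range(len(readings)):
        lo, hi = max(0, i - half), min(len(readings), i + half + 1)
        out.append(statistics.median(readings[lo:hi]))
    return out

def find_objects(readings, threshold_cm=40, step_deg=2.0):
    """Return (start_index, end_index, approx_width_deg) per object."""
    objects, start = [], None
    for i, d in enumerate(readings + [threshold_cm + 1]):  # sentinel
        if d < threshold_cm and start is None:
            start = i
        elif d >= threshold_cm and start is not None:
            objects.append((start, i - 1, (i - start) * step_deg))
            start = None
    return objects

scan = [80, 80, 12, 80, 80, 35, 33, 34, 36, 80]  # noise spike + object
print(find_objects(median_filter(scan)))          # [(5, 8, 8.0)]
```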

I am interested in adding more sensors to my robot (GPS, sonar). I am only using the SSC-32 right now. Is there a way to send commands from the SSC-32 (using byte output or discrete output) to an input on the BotBoard? I want the BotBoard to read individual sensor data and relay it back to one input on the SSC-32.

Is this possible? I’m pretty sure there is an easier way… Thank you

You can read the A/D inputs (sensors) on the SSC-32 under command of the BB II.

Alan KM6VV

Sorry, I didn’t explain that well. I am using XBees to control the SSC-32 from my program (EDIT: from a computer). The A and B inputs are occupied by one IR sensor, and C and D are being used by the XBee. I want to send a signal through the SSC-32 to the BotBoard; the BotBoard processes the signal and decides which sensor to read from. Then the data is sent back to the SSC-32, where it can be interpreted by my program. The BotBoard would just hold the processing and sensor-reading code. All the servos will be connected to the SSC-32.

Pardon, that sounds backwards. XBee -> BBII -> SSC-32

Any reason to connect the XBee to the BBII?

Alan KM6VV

XBee -> SSC-32 -> BBII

I only want to use the BBII to read sensor data.

That’s how it should be. The XBee can also be connected to the BBII.

Alan KM6VV

http://i1007.photobucket.com/albums/af195/bucketlamp/panel_sc2.jpg

Shadows; ambient light from the camera used as the background color; improved GUI; the screen now moves with the camera.

I have completely integrated my Saitek X52 Flight Control System and have 4 modes of control (long range, short range, arm control, and scan mode).

Amazon Link to Joystick

Smooth sailing so far. I am still reading up on inverse kinematics, so difficult…
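Since IK keeps coming up in this thread, here is the planar two-link case as a starting point, sketched in Python using the law of cosines (the link lengths are made-up numbers, and the real arm has more joints than this):

```python
# Elbow-down IK for a two-link planar arm: given a wrist target (x, y)
# and link lengths L1, L2, solve the elbow with the law of cosines and
# the shoulder with two atan2 terms.
import math

def ik_2link(x, y, L1=10.0, L2=10.0):
    """Return (shoulder, elbow) joint angles in radians."""
    d2 = x * x + y * y
    if d2 > (L1 + L2) ** 2:
        raise ValueError("target out of reach")
    cos_elbow = (d2 - L1 * L1 - L2 * L2) / (2 * L1 * L2)
    elbow = math.acos(max(-1.0, min(1.0, cos_elbow)))
    shoulder = math.atan2(y, x) - math.atan2(L2 * math.sin(elbow),
                                             L1 + L2 * math.cos(elbow))
    return shoulder, elbow

print([round(math.degrees(a), 1) for a in ik_2link(12.0, 8.0)])
```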

I am looking for a way to accurately map the environment by using previous scans for obstacle avoidance and guidance.

This is for autonomous control. The rover will either use a preloaded map or start in a “mapping” mode, where it autonomously roams around the area. Real-time sensors will tell it to stop; it will then scan a 2-foot-wide area (already accomplished), with the location of the scanned area determined by displacement data from either the encoders or an accelerometer, and turn around to continue evaluating. The “map” can be saved and loaded; to load a map, the user just drags and drops the robot model into the proper location in the virtual environment (the map is projected using cubes).

EDIT: The mapping process lets the user set checkpoints for the robot to navigate to, making navigation much faster. :open_mouth:
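A minimal sketch of that cube-map idea in Python: an occupancy grid where each scan hit, shifted by the rover’s reported displacement, marks a grid cell (the cell size and the scan format are assumptions; in the actual program each occupied cell would presumably be drawn as a cube in the Blitz3D scene):

```python
# Accumulate IR scan hits into a coarse occupancy grid, offset by the
# rover's displacement (from the encoders or an accelerometer).
import math

CELL_CM = 5.0
occupied = set()   # (col, row) cells marked as obstacles

def add_scan(rover_x, rover_y, heading_deg, hits):
    """hits: list of (bearing_deg relative to the rover, distance_cm)."""
    for bearing, dist in hits:
        angle = math.radians(heading_deg + bearing)
        wx = rover_x + dist * math.cos(angle)
        wy = rover_y + dist * math.sin(angle)
        occupied.add((int(wx // CELL_CM), int(wy // CELL_CM)))

# One scan taken after the rover reports moving 30 cm forward (+x):
add_scan(30.0, 0.0, 0.0, [(-10, 42.0), (0, 40.5), (10, 41.0)])
print(sorted(occupied))
```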

Should I use encoders or an accelerometer to send position change data back to my program? Which one is more accurate for what I am doing?

Use all the sensors you can! Use sensor fusion to combine them. Use Kalman filters.

Look up SLAM navigation and mapping.

Alan KM6VV
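To make the Kalman filter suggestion concrete, here is a toy one-dimensional version in Python that fuses the two sensors being discussed: the accelerometer drives the prediction step and the encoder reading corrects it. All of the noise constants are made-up tuning values, and a real rover would track a full 2D state with calibrated covariances (full SLAM goes well beyond this):

```python
# Toy scalar Kalman filter: predict position by integrating the
# accelerometer, then correct with the wheel-encoder position.
DT = 0.05   # update period in seconds (assumed)
Q = 0.01    # process noise, i.e. accelerometer drift (assumed)
R = 0.25    # measurement noise, i.e. encoder error (assumed)

def kalman_step(x, v, p, accel, encoder_pos):
    # Predict: dead-reckon from the accelerometer.
    v = v + accel * DT
    x = x + v * DT
    p = p + Q
    # Correct: blend in the encoder's position reading.
    k = p / (p + R)                 # Kalman gain
    x = x + k * (encoder_pos - x)
    p = (1 - k) * p
    return x, v, p

x, v, p = 0.0, 0.0, 1.0             # position, velocity, variance
for accel, enc in [(0.4, 0.0), (0.4, 0.01), (0.0, 0.02)]:
    x, v, p = kalman_step(x, v, p, accel, enc)
    print(f"x={x:.3f} m  variance={p:.3f}")
```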