Hello, I am a hobbyist who has tinkered with SSC-32 control using the standard serial and Basic Atom approaches made readily available by Lynxmotion. I am currently working on a project involving an arm completing a relatively static task for a competition (on occasion the containers I'm trying to place things in are jostled out of position, and the whole recorded sequence of events falls out of whack). I would like to improve the arm's accuracy by monitoring positions using Matlab video processing in real time.

The Matlab code for object tracking is not too big an issue for me, but I am not familiar with how to send the info generated by this code to the servo controller. I looked into rewriting the Excel document generated by the RIOS software (editing coordinates by adding and subtracting to compensate for displacement was my ultimate goal) with a test Matlab script and reloading my project within the RIOS window. However, I could not see how to carry this strategy into real time. I am not very familiar with Basic or the intricacies of the RIOS software (I can make the arm run a nice recorded sequence with a pretty high degree of accuracy).

Is there any way to interface in real time with the RIOS software as I described? Is the best strategy to control the arm directly from Matlab? Any suggestions would be greatly appreciated!
Details:
My arm is connected to my computer via a serial cable throughout the task.
I'm using an SSC-32.
I do have a Bot Board and Basic Atom Pro available from an old project, if that would be helpful.
No need to post in multiple topics; most of the people who would reply read all of them anyway.
I think you'll have to drive the arm live with command strings. If you can see the destination point of the wrist (I'm assuming you can) and the end of the wrist (the gripper) that you intend to pick the object up with, then it boils down to planning the trajectory and generating the inverse kinematics. At various locations along the trajectory you will need to compare the actual location of the gripper to the desired location; the difference is called the tracking error. Add the negative of the tracking error to the next waypoint along the trajectory, and do this as often as you can.
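A minimal sketch of that correction step (Python here just for illustration; in practice the "actual" position would come from your vision code, and the planned positions from your trajectory):

```python
# Sketch of the waypoint-correction loop described above.
# All position values are (x, y) pairs in whatever units your vision
# code reports; nothing here is SSC-32-specific.

def corrected_waypoint(planned, actual, target):
    """Shift the next planned waypoint by the negative of the tracking error.

    planned: where the gripper was supposed to be right now
    actual:  where the vision code says it really is
    target:  the next waypoint on the planned trajectory
    """
    # tracking error = actual - planned
    error = [a - p for a, p in zip(actual, planned)]
    # apply -error to the next waypoint
    return [t - e for t, e in zip(target, error)]

# Example: the gripper is 0.5 cm too far along x, so the next waypoint
# gets nudged 0.5 cm back.
next_wp = corrected_waypoint(planned=[10.0, 5.0],
                             actual=[10.5, 5.0],
                             target=[12.0, 6.0])
# next_wp == [11.5, 6.0]
```

Doing this at every waypoint keeps the error from accumulating over the whole move.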
Can I assume that you can correct for camera distortion and translate viewed coordinates into the proper frame? What vision library do you have to work with (CMVision, etc.)?
Matlab, like most any other computer language, can generate a string of ASCII characters terminated with CR and LF (0Dh and 0Ah). The SSC-32 manual has examples, or look at the code samples. I know they are in BASIC and you want to write code that is closer to C, but the strings consist of constants, variables and punctuation, which you can lift straight out of the examples. Get some strings together and try them, then show us your strings and we can discuss them if needed.
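For illustration, here is one way such a command string could be assembled (Python rather than BASIC or Matlab; the group-move format "#&lt;ch&gt; P&lt;pulse-width&gt; ... T&lt;time&gt;" terminated by a carriage return follows the SSC-32 manual, but double-check it against your firmware version):

```python
# Sketch of building an SSC-32 group-move command string.
# Each servo gets "#<channel> P<pulse width in microseconds>", and a
# trailing "T<milliseconds>" makes all servos arrive together.

def ssc32_move(channels_pulses, time_ms):
    """channels_pulses: list of (channel, pulse_width_us) pairs."""
    parts = ["#%d P%d" % (ch, pw) for ch, pw in channels_pulses]
    return " ".join(parts) + " T%d\r" % time_ms

cmd = ssc32_move([(0, 1500), (1, 1600)], 1000)
# cmd == "#0 P1500 #1 P1600 T1000\r"
```

From Matlab you would build the same string with `sprintf` and write it to the serial port object; the SSC-32 only cares that the bytes arrive terminated with a CR.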
Fortunately for me, I don't really need to worry about tracking error to the extent you just detailed. The competition involves putting a 1/2" i.d. PVC pipe into a half-gallon milk jug at a range of around 35 cm. I am able to do this with suitable accuracy just using the RIOS software; however, occasionally the milk carton is jostled a bit (up to an inch or so) by other objects being dropped in. My primary goal for the adjustment was to observe the change by tracking the circular opening in the carton and planning ahead for the new position. If possible, from there I may go ahead and actively track the arm. The carton is at a constant height, so I'm really just interested in the carton's change in x,y for now.

So should I use a "read file" in the Basic script to read in new coordinate data generated from a Matlab script that is tracking displacement? Or could you send me (or tell me where I can find) such a live command string? Where do I find documentation and examples for RIOS or the relevant Basic code? Sorry if I seem lost; I'm relatively new to this type of problem in Matlab. Oh, and my Matlab is a 2013a pre-release, and I have the Computer Vision add-on.
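If the file hand-off route is pursued, the idea is simply that the tracking script overwrites a small text file with the latest x,y offset, and the arm-control side re-reads that file before each move. A minimal sketch of both sides (shown in Python purely for illustration; the file name and format are arbitrary choices):

```python
# Sketch of a file-based hand-off between a tracker and an arm controller.
# The tracker calls write_offset() whenever it sees the carton move; the
# control side calls read_offset() just before commanding a move.

def write_offset(path, dx, dy):
    """Overwrite the file with the latest x,y displacement (cm)."""
    with open(path, "w") as f:
        f.write("%.2f %.2f\n" % (dx, dy))

def read_offset(path):
    """Read back the most recently written displacement."""
    with open(path) as f:
        dx, dy = f.read().split()
    return float(dx), float(dy)

write_offset("offset.txt", -1.25, 0.50)
dx, dy = read_offset("offset.txt")
# dx, dy == (-1.25, 0.5)
```

This works, but it is slower and clumsier than sending live command strings straight to the SSC-32, which is why the replies below lean that way.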
Thanks!
Oh, and I was planning on using a fixed webcam located around a foot directly above the milk carton looking straight down. I hadn’t planned on accounting for distortion and whatnot… should I?
Can't nail it, unfortunately. Okay, I shall look into that and see where it goes. Thanks for the advice! I might check in later if I have more questions or can't figure out what I'm doing.
One option: convert the output of the vision-processing software to coordinates, then use IK to convert those coordinates to angles, and output those angles via your COM port to the SSC-32.
Another option: output the x,y,z coordinates from the vision software to the Basic board, which then calculates the angles based on IK and sends them to the SSC-32 controller.
In either case, you’ll need x,y,z coordinates from Matlab.
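The first option's chain, from camera pixels through IK to servo pulse widths, might be sketched like this. This is Python rather than Matlab, the 2-link planar IK is a deliberate simplification of a real multi-DOF arm, and every constant here (pixel scale, link lengths, servo centre and slope) is a placeholder to be replaced with measured values:

```python
import math

# Rough sketch of the vision -> coordinates -> IK -> servo-pulse chain.
# All numbers are placeholders; measure your own camera scale, link
# lengths, and servo characteristics.

def pixels_to_cm(px, py, cx, cy, cm_per_pixel):
    """Fixed overhead camera at constant height: linear map from pixel
    offset to cm. (cx, cy) is the pixel location of a known reference
    point, e.g. the carton's home position. Ignores lens distortion,
    which is roughly reasonable near the image centre."""
    return ((px - cx) * cm_per_pixel, (py - cy) * cm_per_pixel)

def ik_2link(x, y, L1, L2):
    """Planar 2-link IK: return (shoulder, elbow) angles in radians for
    target (x, y), elbow-down solution, via the law of cosines."""
    c2 = (x * x + y * y - L1 * L1 - L2 * L2) / (2.0 * L1 * L2)
    if abs(c2) > 1.0:
        raise ValueError("target out of reach")
    elbow = math.acos(c2)
    shoulder = math.atan2(y, x) - math.atan2(L2 * math.sin(elbow),
                                             L1 + L2 * math.cos(elbow))
    return shoulder, elbow

def angle_to_pulse(rad, center_us=1500.0, us_per_rad=600.0):
    """Convert a joint angle to a servo pulse width in microseconds.
    1500 us centre is standard; the us-per-radian slope varies by servo,
    so check yours before trusting this."""
    return int(round(center_us + us_per_rad * rad))

# Fully extended arm straight along x: both angles come out ~0,
# i.e. both servos near centre.
s, e = ik_2link(18.0, 0.0, 10.0, 8.0)
pulses = (angle_to_pulse(s), angle_to_pulse(e))  # (1500, 1500)
```

The pulse widths then drop straight into the "#&lt;ch&gt; P&lt;pw&gt; T&lt;time&gt;" command strings discussed earlier, so the whole loop can run as fast as the vision code updates.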