SSC-32 servo smoothing/timing (Visual Basic driven)

I am using an SSC-32 connected to 3 HSR-5990TG servos, driven through my computer's serial port by a program I wrote in Visual Basic 2008. I am trying to drive the servos to smoothly follow sinusoids of the same frequency but different phases for a robotics application. The critical part of the code is attached below. Essentially I split the sinusoid into multiple points using a timer and tell the servos to reach an updated position at each timer tick. I use the very convenient time-to-complete feature of the SSC-32 group move to set the servo velocities so each move finishes on time, before the next update. Of course, depending on the timer tick duration, the motion can be smooth (the servos can't move fast enough to reach the end position before the next command arrives) or choppy (the servos have enough time to decelerate and stop at the desired end position).
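For reference, the group-move command described above (positions plus a shared time-to-complete) can be sketched like this. This is a Python illustration of the string format, not the actual VB program; the channel numbers and pulse widths are made up:

```python
def group_move(positions, time_ms):
    """Build an SSC-32 group-move command string.

    positions maps channel number -> pulse width in microseconds;
    time_ms is the time for the whole move, which the SSC-32 uses to
    scale each servo's speed so all channels arrive together."""
    parts = ["#%d P%d" % (ch, int(round(pw))) for ch, pw in sorted(positions.items())]
    return " ".join(parts) + " T%d\r" % time_ms

cmd = group_move({0: 1500, 1: 1600, 2: 1400}, 100)
# cmd == "#0 P1500 #1 P1600 #2 P1400 T100\r"
```

The trailing carriage return is the Chr(13) that terminates each command in the VB code.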

It is desirable for the setup I am trying to build to have the sinusoidal motion smooth independent of the update rate/timer tick interval. So I modified the code such that the servos are first told to reach a certain position in a certain amount of time, then before that position is reached, they are told to go to another position. So the desired position is always leading the actual position. This works well, servo operation is smooth across a variety of update rates. To get a better idea about what I did, check out the picture below:

http://i19.photobucket.com/albums/b188/type11969/drivingvectors.png

The blue sinusoid is what I am trying to follow. The various colored lines represent the steps that I am telling the servos to follow. As you can see, before each step completes, the servo gets an updated position. The penalty is a decrease in amplitude (although I’ve found the HSRs overshoot a decent amount anyway), which doesn’t really show itself unless the update interval is large.
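The lead-the-target scheme can be sketched as follows (Python for brevity; the lead factor of 1.5 mirrors the numDtplus = 1.5 * numDT.Value in the VB code, but the exact factor is a tuning choice):

```python
import math

def next_command(t, dt, amp, freq, phase, center=1500, lead=1.5):
    """Command the sinusoid position `lead` ticks ahead of the current
    time t, with a move time of lead*dt.  Because lead > 1, a fresh
    target arrives before the servo can finish the move, so it never
    decelerates to a stop at an intermediate point."""
    target = center + (amp / 2.0) * math.sin(2 * math.pi * freq * (t + lead * dt) + phase)
    return int(round(target)), int(round(lead * dt * 1000))  # (pulse us, move time ms)
```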

So now for the problem/question/oddity. If I have the update rate set to 15ms (position updated every 30ms), the servos respond well. Same at 30ms, 90ms, 120ms, etc.; they visually and audibly seem to be moving at the same speed. BUT, if I set the update rate to 20ms, 40ms, etc., or really anything off that 30ms multiple, the servos slow down dramatically even though the driving frequency is the same.

So what could be causing this? I think it boils down to the serial port or SSC-32 rates of some sort, but I am but a lowly Mech Eng so I don’t understand all that business. Any ideas?

Apologies for the length.

[code]Private Sub Timer1_Tick(ByVal sender As System.Object, ByVal e As System.EventArgs) Handles Timer1.Tick
    If Time <= halfstep Then
        Led1.Value = True
        'phase00m = freq0 * (Time - dT) + phase0
        'Vel0 = Math.Abs(Amp0 * (((1 / 2) * Math.Sin(phase00)) - ((1 / 2) * Math.Sin(phase00m))) / dT)
        'If Vel0 <= 100 Then
        'Vel0 = 100
        'End If
        Pos0 = initPos0 + Amp0 * ((1 / 2) * Math.Sin((freq0 * dT) + phase0))

        'phase11m = freq1 * (Time - dT) + phase1
        'Vel1 = Math.Abs(Amp1 * (((1 / 2) * Math.Sin(phase11)) - ((1 / 2) * Math.Sin(phase11m))) / dT)
        'If Vel1 <= 100 Then
        'Vel1 = 100
        'End If
        Pos1 = initPos1 + Amp1 * ((1 / 2) * Math.Sin((freq1 * dT) + phase1))

        'phase22m = freq2 * (Time - dT) + phase2
        'Vel2 = Math.Abs(Amp2 * (((1 / 2) * Math.Sin(phase22)) - ((1 / 2) * Math.Sin(phase22m))) / dT)
        'If Vel2 <= 100 Then
        'Vel2 = 100
        'End If
        Pos2 = initPos2 + Amp2 * ((1 / 2) * Math.Sin((freq2 * dT) + phase2))

        knobPos0.Value = (Pos0 - 1500) / 11.1
        'KnobVel0.Value = Vel0
        KnobPos1.Value = (Pos1 - 1500) / 11.1
        'KnobVel1.Value = Vel1
        KnobPos2.Value = (Pos2 - 1500) / 11.1
        'KnobVel2.Value = Vel2

        SerialPort1.Write("#0 P" & Pos0 & "#1 P" & Pos1 & "#2 P" & Pos2 & "T" & numDT.Value & Chr(13))

        Time = Time + halfstep
        inc = 1
    ElseIf Time > halfstep Then
        If inc = 1 Then
            Led1.Value = False
            Pos0 = initPos0 + Amp0 * ((1 / 2) * Math.Sin((freq0 * (Time + plusstep)) + phase0))
            Pos1 = initPos1 + Amp1 * ((1 / 2) * Math.Sin((freq1 * (Time + plusstep)) + phase1))
            Pos2 = initPos2 + Amp2 * ((1 / 2) * Math.Sin((freq2 * (Time + plusstep)) + phase2))

            knobPos0.Value = (Pos0 - 1500) / 11.1
            KnobPos1.Value = (Pos1 - 1500) / 11.1
            KnobPos2.Value = (Pos2 - 1500) / 11.1

            SerialPort1.Write("#0 P" & Pos0 & "#1 P" & Pos1 & "#2 P" & Pos2 & "T" & numDtplus & Chr(13))
            inc = 0
            Time = Time + halfstep
        ElseIf inc = 0 Then
            Led1.Value = True
            Pos0 = initPos0 + Amp0 * ((1 / 2) * Math.Sin((freq0 * (Time + dT)) + phase0))
            Pos1 = initPos1 + Amp1 * ((1 / 2) * Math.Sin((freq1 * (Time + dT)) + phase1))
            Pos2 = initPos2 + Amp2 * ((1 / 2) * Math.Sin((freq2 * (Time + dT)) + phase2))

            knobPos0.Value = (Pos0 - 1500) / 11.1
            KnobPos1.Value = (Pos1 - 1500) / 11.1
            KnobPos2.Value = (Pos2 - 1500) / 11.1

            SerialPort1.Write("#0 P" & Pos0 & "#1 P" & Pos1 & "#2 P" & Pos2 & "T" & numDT.Value & Chr(13))
            inc = 1
            Time = Time + halfstep

        End If
    End If

End Sub

Private Sub Start_but_Click(ByVal sender As System.Object, ByVal e As System.EventArgs) Handles but_Start.Click
    SerialPort1.Close()
    SerialPort1.Open()
    'Vel0 = 0
    'Vel1 = 0
    'Vel2 = 0
    Time = 0
    numDtplus = 1.5 * numDT.Value
    dT = numDT.Value / 1000
    halfstep = dT / 2
    plusstep = dT + halfstep
    initPos0 = 1500
    Amp0 = 11.1 * sldAmp0.Value
    phase0 = (Pi / 180) * sldph0.Value
    freq0 = 2 * Pi * numFreq.Value
    initPos1 = 1500
    Amp1 = 11.1 * sldAmp1.Value
    phase1 = (Pi / 180) * sldph1.Value
    freq1 = 2 * Pi * numFreq.Value
    initPos2 = 1500
    Amp2 = 11.1 * sldAmp2.Value
    phase2 = (Pi / 180) * sldph2.Value
    freq2 = 2 * Pi * numFreq.Value
    Pos0 = initPos0 + Amp0 * ((1 / 2) * Math.Sin((freq0 * Time) + phase0))
    Pos1 = initPos1 + Amp1 * ((1 / 2) * Math.Sin((freq1 * Time) + phase1))
    Pos2 = initPos2 + Amp2 * ((1 / 2) * Math.Sin((freq2 * Time) + phase2))
    SerialPort1.Write("#0 P" & Pos0 & "#1 P" & Pos1 & "#2 P" & Pos2 & "T" & numDT.Value & Chr(13))
    knobPos0.Value = (Pos0 - 1500) / 11.1
    KnobPos1.Value = (Pos1 - 1500) / 11.1
    KnobPos2.Value = (Pos2 - 1500) / 11.1
    inc = 0
    Timer1.Interval = numDT.Value / 2
    Timer1.Enabled = True
End Sub[/code]

Hello type11969,

Welcome to the forum. I took the liberty of placing the program in a code box for easier reading; I hope you don’t mind. I don’t know the answer to your question, but I will bring it to the attention of the SSC-32 designer. The SSC-32 has a lot going on inside the little chip. I suspect it’s just an anomaly or artifact that may not be changeable. I seem to remember RIOS updates the SSC-32 at 30mS intervals as well.

Also, here is a thread about servo smoothness and the SSC-32. It gets very informative when Laureatus chimes in.
lynxmotion.net/viewtopic.php?t=4332

Thanks, I’ve been reading a lot on this forum, definitely read that acc/dec thread a few times now. Missed the code box formatting part, thanks for that. I’m looking forward to hearing what the SSC-32 designer has to say!

Is it safe to assume that you understand the signal from the SSC-32 only gets sent to a servo every 20mS, yes?

And the potential sampling issues that represents when combined with the asynchronous updating of that information in the SSC-32 by your controlling program?

And last but not least, if you are attempting to use a timer based on the Windows WM_TIMER message to control something in the < 100mS range… well, it just plain doesn’t do that very accurately. The message is fine for durations in the hundreds of mS, where 5, 10, even 20mS of delta is a small percentage, but it’s all over the place when trying to actually time events in that lower range.
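For what it’s worth, the stock Windows timer tick is quantized to the system clock’s ~15.6 ms (64 Hz) granularity, which lines up suspiciously well with the 15/30 ms sweet spot reported above. This is only a plausible contributor, not a confirmed explanation; a rough model of the rounding:

```python
import math

def actual_interval_ms(requested_ms, granularity_ms=15.625):
    """Requested WM_TIMER intervals fire on the next multiple of the
    system clock granularity (~15.625 ms on a stock Windows box)."""
    return granularity_ms * math.ceil(requested_ms / granularity_ms)

# actual_interval_ms(15) -> 15.625  (close to what was requested)
# actual_interval_ms(20) -> 31.25   (a 20 ms request actually fires ~31 ms apart)
# actual_interval_ms(30) -> 31.25   (only ~4% slower than requested)
```

If the tick really fires ~31 ms apart while the program assumes 20 ms per tick, the commanded sinusoid would play back roughly 1.5x slower, which would look like the reported slowdown at off-multiple intervals.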

This is all stuff I don’t really know. I ran an experiment to evaluate the servo response as a function of the DT value (keep in mind that I was using the stopwatch feature on my cellphone and was only visually evaluating the end points):

http://i19.photobucket.com/albums/b188/type11969/responsetime.png

A .1Hz sinusoid was used to drive the servo so I am looking for a half cycle time of 5 seconds.

I am going to strip out everything running in the background and reset the priority of the VB program to see how it changes.

Another question . . . Is it possible to describe a step in the GP sequencer such that the step ends before the prescribed position is reached, due to a velocity constraint? Or does the step time take precedence over the prescribed velocity? It seems like the step time does take precedence, but I am curious if people have had success coming up with a workaround.

Just looking for the smoothness I’ve achieved using VB, but with the sequence dropped down to the EEPROM on the board.

Thanks,

Chris

Hi,

I am the fabled “SSC-32 designer.” I believe there is some interaction between the SSC-32 update rate (20ms) and the position command rate, but I have not determined exactly what that is yet. I’ll post again when I get that figured out.

In the GP sequencer the next step will not be started until the target pulse width for the current step has been reached. If the time for a step is in conflict with the servo speeds, then the winning constraint is whichever would make the step take longer. So which constraint takes precedence depends on how far the servos need to move.
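In other words, the effective step duration is the maximum of the commanded step time and the slowest servo’s travel time. A sketch of that rule (Python; units assumed to be microseconds of pulse-width travel and microseconds-per-second speed, matching the SSC-32 S parameter):

```python
def effective_step_time_ms(step_time_ms, moves):
    """Per the description above, the step lasts as long as the slowest
    constraint.  `moves` is a list of (delta_us, speed_us_per_s) pairs;
    each servo needs delta/speed seconds at its commanded speed, and
    the step ends no sooner than the step time or the slowest servo."""
    slowest_ms = max((1000.0 * d / s for d, s in moves), default=0.0)
    return max(step_time_ms, slowest_ms)

# A 500 us move at 250 us/s needs 2000 ms, so it overrides a 1000 ms step time:
# effective_step_time_ms(1000, [(500, 250)]) -> 2000.0
```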

Mike

Thanks for checking in to this Mike, I look forward to hearing back from you.

So it sounds like the only way to obtain a smooth sinusoid is to generate the curve outside the SSC-32 and send updates such that the servos never complete their move. I did kill all unnecessary background programs and set the VB .exe that was driving the servos to a “realtime” priority with basically no change in the servo response plot (as shown a few posts up). Guess I have to try programming the bot board next . . .

-Chris

Fwiw, I am working on this as well. I was also unable to get the SSC-32 to respond quickly enough over serial to have the level of smoothness I expected when I shortened the curves past a certain point. I was not able to determine if it was the SSC-32 or just the inherent latency of PC based serial communications though.

I have moved over to a microprocessor instead (a Parallax Propeller) and am finding the servo accel/decel now works exactly as I wanted/expected. In my case, I’m driving the servos’ accel/decel position curves directly from the microprocessor. My PC sends only higher-level commands to the micro.

One thing I would still like to check, though, is what would happen if I drove the servos with the SSC-32 but sent the SSC-32 serial commands from the microprocessor (real-time) instead of a PC. I think that would determine whether the SSC-32 is the limiter or not. Even before that, though, I started thinking it might be necessary to calculate the theoretical timing limitations on sending SSC-32 updates, given its command protocol and the max 115.2 kbps serial speed.
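The wire-time side of that calculation is straightforward. Assuming 8N1 framing (10 bits per byte), a three-servo group move is only a few milliseconds at 115.2 kbps, which suggests raw serial throughput is not the bottleneck at 20-30 ms update rates; a sketch:

```python
def tx_time_ms(command, baud=115200):
    """Wire time for one command: 10 bits per byte (8N1 framing)
    divided by the baud rate, expressed in milliseconds."""
    return len(command) * 10 * 1000.0 / baud

cmd = "#0 P1500 #1 P1600 #2 P1400 T100\r"
# 32 bytes -> about 2.8 ms on the wire at 115.2 kbps
```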

Also…

This might be obvious, but just in case… the servos physically only ever move at one speed: full speed. So to follow your curves, your external program cannot simply send positional commands that never complete; it can only send ever-changing “velocity/speed” commands. You’ll need to calculate the slope at each point on your positional curve. However, sending only velocity commands makes determining accurate servo position at any given time very difficult (without some sort of feedback loop). This may or may not matter depending on your application.

(An exception to the “servos always move at full speed” statement is that servos seem to have some sort of built-in decel handling to avoid overshooting the position. Zoomkat mentioned this in another thread. I think using that feature would require custom servo modifications, though.)
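The slope calculation mentioned above is just the derivative of the commanded sinusoid; a sketch in Python (amplitude in microseconds of pulse width, so the result is in us/s, comparable to the SSC-32 S parameter):

```python
import math

def servo_speed(t, amp_us, freq_hz, phase=0.0):
    """Instantaneous slope of pos(t) = center + (amp/2)*sin(2*pi*f*t + phase),
    i.e. the speed you would command at time t if driving the curve
    with velocity commands instead of timed position moves."""
    w = 2 * math.pi * freq_hz
    return (amp_us / 2.0) * w * math.cos(w * t + phase)

# Speed peaks at the sine's zero crossings and is zero at the extremes.
```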

Thanks kaliatech. I’m still just getting my feet wet with programming and am definitely in over my head when it comes to hardware timing, etc. What language did you use to program the Propeller microprocessor? What are you using to send commands down to it from your PC? I have a Bot Board here that I plan on playing with, but I think the SSC-32 via the serial port will work well enough if I keep the update interval to a multiple of 30ms. At least for now . . .

Yup, thanks, I do realize that I am only approximating the sinusoid with a series of average velocities over a period of time. I think that updating the position/time command before the previous command can finish will minimize the effect of the servo’s internal PID controller accelerating/decelerating the servo to the desired velocity. Fortunately, accurate positioning does not matter for my application . . . yet.

For the “custom” part of my programs, I’ve been programming the Propeller chip using the higher-level Spin language that it supports. I’m new to microprocessors too, and even though Spin is a thousand times easier than assembly (to me), it still required some getting used to. My program makes use of various libraries for serial communications and servo pulse handling; these libraries were written in Propeller assembly.

I’m also over my head regarding many of the lower-level details, but that’s also kind of the fun of it. I haven’t experimented with the Bot Boards yet. When I was researching options previously, I remember thinking that the Atom Pro chip would probably be fast enough to do everything I needed (serial comms, accel/decel calcs, for ~10 servos), but I wasn’t positive. The Bot Boards and supported chips do look really nice to work with.

I communicate with the Propeller chip from my PC wirelessly over Bluetooth using a BlueSMIRF transceiver. I’ve only recently gotten everything working together (PC -> Bluetooth -> Propeller -> smooth accel/decel servos) and am now putting it all on a simple custom circuit board so I can get back to my main project. Point being, I haven’t tested everything completely yet; only that the basic premise does work regarding smooth (and dynamically configurable) servo accel/decel.

One more thought … :confused: As EddieB noted above, the default Windows timers do not have sufficient resolution (and are not consistent enough) to go much faster than your current 0.1Hz tests. You can get access to lower-level timers (DirectX/game programmers do this often), but I’d be surprised if that was the default in the VB Timer control. I haven’t used VB in a while though, so I don’t know for sure.

If you’re also not sure, you might want to do a quick test to see what the max resolution of your Timer is without any other code running.

If the Timer is limiting you, then an alternative approach would be to run everything in an infinite loop with no timer. At the start of each loop, get your real time delta (not ticks) from the previous loop and run your calculations on that. Then you would be limited only by CPU and I/O speed, and your dT would be consistent under those limits.

How would I check the max resolution of the timer? I’ll definitely take a look for a lower level timer, thanks for the tip. I do think I am getting what I need right now out of the current VB timer, I don’t need an interval less than 30mS for now.

I understand what you are saying regarding the infinite-loop idea, but I am not sure how to implement it. Basically it sounds like you are building a timer from the loop. Can you tell the software to wait for a given amount of time? Hmm

To check the max resolution, write a small program that sets your Timer to “1ms” or something. Also, at the beginning of your program, store the system’s time in milliseconds. In the timer callback handler, keep track of how many milliseconds have passed since the start time and how many times your handler has been called. Every 500 ticks (or whatever makes sense), calculate how much time on average passed between each call and write that out to the console or something.
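A minimal version of that measurement, sketched in Python rather than VB (the callback is faked here with a short sleep; in the real test it would be the Timer’s tick handler):

```python
import time

def measure_timer_resolution(schedule_tick, n_ticks=500):
    """Average interval between timer callbacks, as described above.
    `schedule_tick` blocks until the timer fires once; faked below
    with a short sleep standing in for a 1 ms timer."""
    start = time.perf_counter()
    for _ in range(n_ticks):
        schedule_tick()
    elapsed = time.perf_counter() - start
    return 1000.0 * elapsed / n_ticks  # average ms per tick

# With a requested 1 ms "tick", the result exposes the OS's real granularity:
avg_ms = measure_timer_resolution(lambda: time.sleep(0.001), n_ticks=50)
```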

I was curious myself, so I checked the MSDN docs. If your Timer in VB is a Windows Forms Timer (which I think it is, though I can’t tell for sure from the code you posted), the docs say the resolution is, at best, ~55ms. The same docs also mention that you can use the .NET framework’s Timer class to get better resolution. More info here: msdn.microsoft.com/en-us/library … timer.aspx .

The idea of writing an infinite loop is that your program never waits; it simply runs as fast as it can all the time. Your code is responsible for skipping anything that shouldn’t run if enough elapsed time hasn’t passed between loops. All calculations are done using elapsed time rather than the number of loops, which follows real physics more closely. I think this way of thinking is a hard concept for higher-level business programmers to grasp (it was for me, anyway), but it is very natural to anyone who works in embedded and microprocessor systems. Using elapsed time rather than ticks is also common in game programming.
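A minimal sketch of that free-running loop in Python (send_position is a stand-in for whatever routine computes and writes a servo command from the real elapsed time):

```python
import time

def run_loop(send_position, duration_s=0.1, update_s=0.03):
    """Free-running loop: no timer.  Spin as fast as possible and act
    only when enough *elapsed* time has passed; all motion math uses
    real elapsed time, so a slow iteration doesn't slow the sinusoid."""
    start = time.perf_counter()
    last_sent = -update_s
    sends = 0
    while True:
        now = time.perf_counter() - start  # real elapsed time, not ticks
        if now >= duration_s:
            break
        if now - last_sent >= update_s:
            send_position(now)  # compute the position from `now`
            last_sent = now
            sends += 1
    return sends

n = run_loop(lambda t: None)  # ~4 sends in 100 ms at a 30 ms cadence
```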

If interested, you can get a sense of some of the lower level details around timing on Windows from this older article: msdn.microsoft.com/en-us/magazine/cc163996.aspx .

(edit: link fixed)

Thanks for the info. I came across this which seems like it will be helpful:

codebeach.com/tutorials/high … -basic.asp

The link on the bottom of your last post is broken.

I took your advice and made a program to compare the system clock to the VB timer at different intervals . . . the VB timer certainly is garbage. Even at 200mS over 50 ticks the error is 4%. Additionally, the error is all over the place; it obviously can get worse at smaller intervals, but it sometimes gets better too. Gonna use the system clock as a timer and see how the response goes . . .

I am using an SSC-32 hooked up to 3 HSR-5990TG servos, driven via the serial port on my computer. My goal is to program the robot to follow a specific path. I want to connect the SSC-32 terminal to Microsoft Excel with a Visual Basic program; it makes the programming much easier than before, because this way I just enter values for position, time, and so on, and it converts the values to text and sends them to the SSC-32 terminal.

I managed to make the connection, but I ran into some problems along the way that I want to ask about.

When I connect, Visual Basic writes the values as a text file and sends it to one of the terminal’s macros. The problem is that when I change the values on the Excel sheet, the values are sent as a macro, but the macro shown in the terminal doesn’t change. I mean, Excel updates the macro in the SSC-32 source file, but the terminal can’t show it, and I have to close and reopen the terminal to update the macro. How can I solve this problem?

The other thing I want to ask is this:
I wrote the program like this:

[code]#1P 1100 s 1000
#2P 1000 s 1000
#3P 2000 s 1500
#4P 2000 S 100
#5P 1500 S 100
#6P 2000 S 100

#1P 750 s 1000
#2P 750 s 1000
#3P 750 s 1000
#4P 750 S 1000
#5P 750 S 1000
#6P 750 S 1000[/code]

When Excel sends it as a macro to the SSC-32 terminal, the servos only execute the second group of commands (#1P 750 s 1000, #2P 750 s 1000, and so on) and never go to the first position (#1P 1100 s 1000, etc.).
In simple words, the SSC-32 skips the first part of the command.
How can I fix this problem? :frowning:

Sorry for the long post.
Thanks in advance.

Did you wait the 1000ms for the servo to move before sending it the next command? The SSC-32 doesn’t queue up commands; it runs off the most recently received one.
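A sketch of what that pacing looks like from the host side (Python; `write` stands in for the serial-port write, and the waits are shortened here for illustration, since a 750 us move at S 1000 actually needs most of a second):

```python
import time

def send_sequence(write, steps):
    """The SSC-32 acts on the most recent command it has received, so
    a host that wants sequential moves must pace them itself.  Each
    step is (command, wait_ms): write it, then wait for the move."""
    for command, wait_ms in steps:
        write(command + "\r")
        time.sleep(wait_ms / 1000.0)  # let the move finish before the next

sent = []
send_sequence(sent.append, [
    ("#1 P1100 S1000", 50),  # waits shortened for illustration
    ("#1 P750 S1000", 50),
])
```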