BTW, you can try using the CCP module in “PWM mode” to do everything, but I think you’ll find that you can’t get it to go slow enough (down to 50 Hz) unless you run the PIC on a very slow clock.
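(For reference, a rough back-of-the-envelope check, assuming the standard mid-range PIC PWM-period formula from the datasheet and an 8 MHz clock:)

    /* PWM period = (PR2 + 1) * 4 * Tosc * (Timer2 prescale)
     * With Fosc = 8 MHz (Tosc = 125 ns) and everything maxed out (PR2 = 255, prescale = 16):
     *    period = 256 * 4 * 125 ns * 16 = 2.048 ms  ->  ~488 Hz minimum
     * To stretch the period out to 20 ms (50 Hz) you'd need Fosc around 820 kHz or slower.
     */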
Pete
Another concern with the CCPs, even if you can get one running at the frequency you like, is that most PICs don’t have very many of them. It sounds like Pete can offer some code too (probably much cleaner than this), but here’s mine (6 servos from a 16F88):
ServoController.c in CCS C
If it would work, one reason the CCP might be a better route than my version (and I’d guess Pete’s is similar) is that the CCP gives you Ronco-style “Set it and Forget It!” action. So if you’re using this single chip as the main processor on the bot as well as using it to drive the PWM lines, you’re not tying the processor up for those 2.5 ms per cycle counting pulses when you should be reading sensors and deciding what to do next.
Using timer interrupts you can do a non-busy wait for one or maybe two servos, but if you want to handle many of them, I don’t see any way other than a busy wait or a second chip.
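For anyone following along, here’s a bare-bones sketch of that busy-wait approach in CCS C (this is not the attached ServoController.c; the pin assignments, pulse widths, and frame padding are made-up placeholders). Each pass raises one servo line at a time, holds it for that servo’s pulse width, then pads out the rest of the ~20 ms frame:

    #include <16F88.h>
    #use delay(clock=8000000)

    #define NUM_SERVOS 6
    int16 width_us[NUM_SERVOS] = {1500, 1500, 1500, 1500, 1500, 1500};  // ~1000-2000 us each

    void main(void) {
       int8  i;
       int16 t;
       set_tris_b(0x00);                     // RB0..RB5 drive the six servos (assumed)
       while (TRUE) {
          for (i = 0; i < NUM_SERVOS; i++) {
             output_b(1 << i);               // raise only servo i's line
             for (t = width_us[i]; t >= 10; t -= 10)
                delay_us(10);                // hold it for the pulse width, in 10 us steps
          }
          output_b(0x00);                    // all lines low again
          delay_ms(12);                      // roughly pad the frame out to ~20 ms
       }
    }

The whole frame is spent either pulsing or delaying, which is exactly the “tied up” problem being discussed here.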
heh… I’m getting a bit of a kick reading through the code now (it’s quite old)… I remember removing the arrays and expanding everything out into inline calls because I was sure that those array dereferences were the source of an elusive performance bottleneck.
I got my CCP stuff working last night. There were a couple of pitfalls because (I think) the docs don’t explain the ‘compare mode’ very clearly, but it all makes sense now.
Andy is quite right that if you want to do multiple servos, it needs some careful thought about how you’ll generate the timing. Before I changed to the CCP approach, I found that interrupt overhead was big enough to mess up my ‘calculated’ timing - doing an interrupt every 64 µs is a lot of overhead when running at 8 MHz.
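For the curious, here’s one possible shape of the compare-mode approach in CCS C (a sketch only, not necessarily how Pete set his up; the output pin, the 1 µs tick rate, and the single-servo scope are all assumptions). Timer1 free-runs and the CCP1 match interrupt fires only twice per 20 ms frame, once for each edge of the pulse:

    #include <16F88.h>
    #use delay(clock=8000000)

    int16 width = 1500;               // servo pulse width in us
    int1  pulse_high = 0;

    #int_ccp1
    void ccp1_isr(void) {
       if (!pulse_high) {
          output_high(PIN_B3);                    // rising edge (pin choice is arbitrary)
          CCP_1 = CCP_1 + width;                  // schedule the falling edge
          pulse_high = 1;
       } else {
          output_low(PIN_B3);                     // falling edge
          CCP_1 = CCP_1 + (20000 - width);        // schedule the start of the next 20 ms frame
          pulse_high = 0;
       }
    }

    void main(void) {
       setup_timer_1(T1_INTERNAL | T1_DIV_BY_2);  // 1 us per tick at 8 MHz
       setup_ccp1(CCP_COMPARE_INT);               // interrupt on match; the pin is handled in software
       CCP_1 = 1000;                              // first match ~1 ms after startup
       enable_interrupts(INT_CCP1);
       enable_interrupts(GLOBAL);
       while (TRUE) {
          // free to read sensors, run the PID, talk to the SSC-32, etc.
       }
    }

Two interrupts every 20 ms is negligible next to one every 64 µs, and the edge-to-edge timing comes from the compare hardware rather than from counted instructions.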
Pete
Hmm…
I didn’t know that it would tie up the micro so much!
Maybe I should just have the SSC-32 control the correction servos and have the microcontroller send the corrected positions as ASCII instructions to the SSC-32?
But then… I’ve got to add the bluetooth into the mix, there…
I’m starting to get confused as to how I’ll get all these buggers to talk to each other properly.
What I want to do is have the bluetooth talk to both the SSC-32 and the microcontroller.
The SSC-32 does its usual stuff.
The microcontroller reads the sensors, does the PID algorithm (using the reference of the current positions of the correcting servos that it gets from the bluetooth) and then tells another microcontroller to drive the correcting servos to the proper new position to balance.
My question is…
Is there a way to send the positioning data that will be coming to my computer to both the SSC-32 and the micro (without having to send it first to one and have that one send it to the other)?
Well there’s only one stream of data, so do you really want both to see it?
It think offloading the servo pulse generation to the SSC32 is a good plan.
I’d suggest that the bluetooth data go into your PIC and have the PIC decide what gets passed on to the SSC-32. Since almost all of your communications with the SSC-32 are going to be one way, this is a prime candidate for the PIC to bit-bang a serial interface to the SSC-32 while the UART can be used for the connection to the bluetooth module.
If you go this route, you can use the UART interrupts to discover new data from the PC via bluetooth and spend the rest of your time working on balancing and the like. With the PIC sitting between the radio and the SSC-32, you also get to define your own over-the-air protocol and not be stuck with the verbosity of the SSC-32’s ASCII interface.
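A rough sketch of that split in CCS C, assuming a 16F88 (the pin choices, baud rates, and the single-byte “just forward it” logic are placeholders; a real version would buffer more and translate a compact radio protocol into SSC-32 ASCII commands):

    #include <16F88.h>
    #use delay(clock=8000000)
    #use rs232(baud=9600, xmit=PIN_B5, rcv=PIN_B2, stream=BT)   // hardware UART to the bluetooth module
    #use rs232(baud=9600, xmit=PIN_B1, stream=SSC, FORCE_SW)    // bit-banged, TX-only line to the SSC-32

    int8 rx_byte;
    int1 rx_ready = 0;

    #int_rda                        // fires whenever the hardware UART receives a byte
    void uart_rx_isr(void) {
       rx_byte = fgetc(BT);         // grab it quickly and get back out of the ISR
       rx_ready = 1;
    }

    void main(void) {
       enable_interrupts(INT_RDA);
       enable_interrupts(GLOBAL);
       while (TRUE) {
          if (rx_ready) {
             rx_ready = 0;
             fputc(rx_byte, SSC);   // placeholder: pass the byte straight through
          }
          // ...the rest of the time goes to sensors and balancing...
       }
    }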
Go go WikiGadget!
Ahh…
So that’s what bit-banging means.
For once, a term that makes sense…
Good point, Andy.
One question, though…
Is bit-banging fast enough that it won’t unduly tie up my micro (I’m guessing so, since a UART seems to be just a queued form of bit-banging)?
It’ll consume the processor for as long as it takes.
The processor will be used full time for the duration of the transaction… so look at the baud rate and how much data you’re sending to determine how long that will be.
Another option might be to choose a PIC with more than one UART
LOL :point: LOL
You probably don’t have to be “tied up” while bit-banging the serial bits. Especially if you were to use a slower baud rate, you can do a fair amount of processing between each bit.
If the bit-banging is interrupt-driven, you can be sending out serial data and doing other stuff at the same time.
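To make that concrete, here’s a minimal interrupt-driven bit-bang transmitter sketch in CCS C (the TX pin is an assumption, and 2400 baud is chosen only because the timer math works out neatly at 8 MHz). A timer interrupt runs once per bit time and shifts the frame out, so the main loop stays free between bits:

    #include <16F88.h>
    #use delay(clock=8000000)

    #define TX_PIN PIN_B1            // bit-banged serial output (assumed)

    int16 tx_frame;                  // start bit + 8 data bits + stop bit, sent LSB first
    int8  bits_left = 0;             // 0 means the line is idle

    #int_timer2                      // one interrupt per bit time (~417 us at 2400 baud)
    void bit_isr(void) {
       if (bits_left) {
          if (tx_frame & 1) output_high(TX_PIN);
          else              output_low(TX_PIN);
          tx_frame >>= 1;
          bits_left--;
       }
    }

    void bb_putc(int8 c) {
       while (bits_left) ;                    // wait out any byte still in progress
       tx_frame = ((int16)c << 1) | 0x200;    // bit 0 = start (0), bits 1-8 = data, bit 9 = stop (1)
       bits_left = 10;
    }

    void main(void) {
       output_high(TX_PIN);                   // serial idle level is high
       setup_timer_2(T2_DIV_BY_16, 51, 1);    // 8 MHz / 4 / 16 / 52 ticks = ~2400 interrupts per second
       enable_interrupts(INT_TIMER2);
       enable_interrupts(GLOBAL);
       while (TRUE) {
          bb_putc('U');                       // example byte; the loop keeps running between bits
          delay_ms(100);
       }
    }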
Another option would be to ‘multiplex’ the outgoing side of the UART. That is, add a logic ‘switch’ on the TX signal, and control it with another PIC pin. Then the PIC chooses whether it is talking to the BTooth or to the SSC. This approach assumes that you never need to talk to both the BTooth and the SSC at exactly the same time (it seems like a reasonable restriction). If need be, you could arrange the SW to send one byte to the BTooth, then one byte to the SSC, etc.
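In code the MUX idea costs almost nothing: one spare pin steers the TX line before you transmit, and the external gating does the actual switching. A tiny sketch (the select pin, its polarity, the baud rate, and the settle delay are all assumptions):

    #include <16F88.h>
    #use delay(clock=8000000)
    #use rs232(baud=9600, xmit=PIN_B5, rcv=PIN_B2)   // the one hardware UART

    #define SEL_PIN PIN_A1    // assumed: low = TX routed to the BTooth, high = TX routed to the SSC

    void main(void) {
       output_low(SEL_PIN);              // default: talk to the BTooth side
       while (TRUE) {
          output_high(SEL_PIN);          // steer the TX line to the SSC-32
          printf("#0 P1500");            // example command: servo 0 to 1500 us
          putc(0x0D);                    // SSC-32 commands end with a carriage return
          delay_ms(2);                   // let the last byte finish shifting out before switching back
          output_low(SEL_PIN);           // back to the BTooth side
          delay_ms(500);
       }
    }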
Pete
Hmmm…
Well, let’s see…
32 MHz = 32 million hertz = 32 million bps
So, I’m guessing that the micro can do 115200 bps without a problem.
Each ASCII character is what, a byte of information?
The max characters per servo command is 7, plus the carriage return (“<cr>”) that it wants to see at the end.
7 * 19 servos = 133 bytes = 1064 bits
1064 bits / 115200 bps = .00924 seconds
Even if my numbers are off a bit, a hundredth of a second or two shouldn’t be enough to affect whatever rate I wish to sample the sensors at (probably only about ten times per second).
Now that I think about it, I’ll only want to send info to the SSC-32 every 20ms or so, since the servos can only update that fast, anyway.
That way, I don’t overfill the SSC-32’s buffer and get nasty lags (I’m guessing that’s what would occur in such a situation?)
So, I suppose I’ll sample the sensors every 20 ms, and have the micro do the PID algorithm and then bit-bang the corrected instructions in the interim.
Interrupts and such are neato for you guys, but we have to remember that my experience is based upon a single QBasic (eek!) class and what VB.NET I’ve picked up on my own.
Nice linear programs are what I can accomplish at this point.
Or sticking an I2C- or SPI-interfaced UART on the board could be another option for you, if you are already using one of those interfaces for other devices.
Whoa there, cowboy!
A 32 MHz CPU does not execute instructions at that rate (typically), and it takes multiple instructions to bit-bang even a single bit.
But, you’re correct that you have plenty of throughput anyway…
If you don’t use interrupts, you’ll be very “single task”, but it will initially be easier to get going.
When the time comes, perhaps Andy and/or I can help you with the SW.
Pete
One more thing: A serial ‘byte’ is actually 10 bits, because of start/stop bits. So it’s handy to take a “baud” number like 115200, divide it by 10, and you get a rough number for “bytes per second”.
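Applying that to the earlier estimate:

    /* 115200 baud / 10 bits per byte  =  ~11,520 bytes per second
     * 19 commands * 7 chars = 133 bytes  ->  133 * 10 = 1330 bits on the wire
     * 1330 bits / 115200 bps  =  ~11.5 ms per full update
     * Still well inside a 20 ms frame, just with less margin than the first estimate suggested.
     */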
Also, I doubt that you will want to update the SSC every 20 ms. It has its own processing to do, and it may not keep up. Even if it can keep up, the mechanical stuff in the servos cannot keep up, so it’s a waste to update them that often.
Pete
I had thought about tacking on another UART, Eddie, but Andy advised that might be difficult.
Ahhhh.
Well, if the throughput is there, then I don’t have to worry about messy numbers.
10 bits, eh?
Well, I guess that makes sense.
Oh, I guess I better tone down that number a bit, then.
How does every 50ms sound?
I could go to every 100ms, but with the added 20ms delay of the SSC-32 plus the time it takes for the PID algorithm, the time in between each balance correction might be too great for adequate correction of a quick blow.
If it was necessary to get a fast response between my PC and my biped, then I could understand the need for interrupts, but even if it takes a quarter of a second to get my biped to move after I press a button, there shouldn’t be any real problems.
Don’t get me wrong, I plan upon implementing them later on, whether or not I actually need them, as every little improvement will help.
But for starters, I’ll stay my hand.
Sorry if I gave that impression, but I don’t recall any such exchange.
I don’t know what you’re actually building, but it sounds like your requirement is NOT that you need to update the servos every ‘n’ ms, but rather that when an update is needed, the update needs to get there quickly.
Once a position update has been started, you have a lot of ‘free time’ while the servos actually move.
When the time comes, you should measure the actual latency in the SSC-32. You mentioned 20 ms - is that a real number? And how does the SSC’s latency change with a different # of servos? As I recall the SSC docs have some of these numbers…
Also, what is the latency in the servos themselves? They will have both SW latency and mechanical latency.
In the future, if need be, we can talk about ways to optimize your ‘servo response time’. For example, if it turns out that the SSC and its serial port interface is not quick enough, you could consider using your 18F PIC to talk directly to 1 or 2 of the most-critical servos. Then the bottleneck would be the servos themselves.
Pete
It was me I think. In an earlier post, I described ways of getting another serial port, and I said that adding a separate dedicated UART chip might be more trouble than it’s worth. But I haven’t really looked into the choices…
For now, I would suggest doing the “hardware MUX” approach as a way to talk to 2 different devices via a single UART. A single 74HCT02 chip (for example) would be the only extra hardware required.
Pete
A quick survey of the SSC32 firmware that I did a while ago suggests that it does send a new pulse to the servos every 20ms and is capable of getting any update into the next frame. Pete is right about the servos not responding that quickly.
I’ve said some things in other posts about making sure various radios would allow full updates every 20ms, which it sounds as though you’re picking up on, but that was aimed at the goal of off-board processing being the primary objective. The idea being that if the servos can conceivably be updated that quickly, I shouldn’t be creating any extra bottlenecks between my control code (on a PC) and the servos, so that I’d have plenty of room for the software to grow into before I hit any hardware limitations.
If the majority of your control code will reside on board, then you need only as much bandwidth as your control code uses.
Woops, sorry about the mix up!
I do that a lot.
Oh, I see what you’re getting at.
Rather than have a timer control when to send the information, have the information itself control whether or not it gets sent.
With that in mind, my setup would run somewhat like this:
Bluetooth instructions get received by the micro’s UART, are stored in local memory, and are then immediately passed on to the SSC-32.
As frequently as the accelerometer allows, the microcontroller will read it.
If the values have deviated from the balanced position by a certain threshold, then the PID algorithm will run and the correcting instructions will be sent immediately thereafter.
The only problem that I see with the above method is overfilling the SSC-32’s buffer.
True, the threshold levels can be adjusted to make that impossible to do, but that harms the effective accuracy of the sensors.
I think there still has to be some sort of timer to at least limit the frequency of servo updates.
You’re right, though, that they shouldn’t be made to wait around if there haven’t been any recent updates.
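Something like the following is the shape of that loop (a sketch only; the balanced value, threshold, 5 ms sampling delay, and the read_accel / run_pid / send_corrections helpers are placeholders standing in for the real sensor read, PID math, and SSC-32 output):

    #include <16F88.h>
    #use delay(clock=8000000)

    #define BALANCED         512    // accelerometer reading when upright (made up)
    #define TILT_THRESHOLD    20    // ignore errors smaller than this
    #define MIN_INTERVAL_MS   50    // never update the SSC-32 faster than this

    /* Placeholders standing in for the real hardware and control code */
    signed int16 read_accel(void)            { return BALANCED; }
    void         run_pid(signed int16 error) { }
    void         send_corrections(void)      { }

    void main(void) {
       signed int16 err, mag;
       int16 since_update = MIN_INTERVAL_MS;

       while (TRUE) {
          delay_ms(5);                               // rough sensor sampling period
          if (since_update < MIN_INTERVAL_MS)
             since_update += 5;                      // time since the last command went out

          err = read_accel() - BALANCED;             // how far off balance are we?
          mag = (err < 0) ? -err : err;

          if (mag > TILT_THRESHOLD && since_update >= MIN_INTERVAL_MS) {
             run_pid(err);                           // compute the corrected servo positions
             send_corrections();                     // send them to the SSC-32
             since_update = 0;
          }
       }
    }

The threshold decides whether a correction is worth sending, and the interval check keeps the SSC-32’s buffer from being flooded even when the error stays above the threshold.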
Here’s the times that I know:
20ms is the time that I throw around when talking about SSC-32 delays for this reason:
Every 5ms, one of the four servo banks (each has 8 servos) gets sent its instructions.
Depending on what part of the cycle incoming instructions hit, they can take 5-20ms just to get sent.
Then there’s the time that it takes for the micro to retrieve those incoming instructions, the time it takes for the servos to receive them and do their own processing, and then the time to actually move at their max speed (I’ll be correcting them at maximum speed, of course), which is 160ms per 60 degrees.
Since the largest time there is the servo’s mechanical movement, the total time it’ll take is greatly dependent upon how much correction is needed.
I doubt, however, that any correction greater than 15 degrees at a time would be needed, so that’s 40 ms there…
You’re right, math simply won’t work here.
I’ll have to get off my bum and actually test it.
>.<
Well, I have a good idea of what I’m doing now, thanks to you guys.
When my biped comes back and the parts that I’m ordering come in, I’ll be prepared.
Thanks again!