PIC18F1220

measurements
The frequency measurement was obtained using a digital multimeter. It's just a cheap off-brand that imitates the Fluke line, but it seems to work really well. It's autoranging and measures frequency all the way into the MHz range. It also has a button to switch between frequency and duty cycle, so I could measure something like the duty cycle of a PWM that I created if I wanted to.

Sounds cool, checked mine
Sounds cool. I checked mine and it only went up to 200 kHz. It has duty cycle too, but I found it can misread that at times; you need an o-scope as backup.

Delay subroutine

In school I developed some delay functions that worked on a PIC18F8680, but they don't seem to work on my PIC18F1220. Both PIC microcontrollers have an instruction clock of 8 MHz: the one in school used a 32 MHz crystal divided by 4 to get an instruction clock of 8 MHz, and my PIC18F1220 uses the internal clock configured to run at 8 MHz. I tested the frequency of the instruction clock by outputting FOSC/4 on pin RA4, which gave me 2 MHz; I think that means the instruction clock is running at 8 MHz. I guess it's possible that FOSC/4 is the instruction clock, which would mean it's only running at 2 MHz.

 

Does anyone know how to determine the frequency of the instruction clock when using the internal oscillator?

Would that not just be a

Would that not just be a matter of running a known delay and seeing how long it actually took?

Sorry if it was a dumb comment, but it seemed so straightforward?!?

**It's probably that simple.**

That seems like a logical, simple solution. What I was wondering is whether the internal clock gets divided by 4 like an external clock, or whether it is used directly. If it's used directly, then I have programmed something wrong, since I'm positive it's not executing instructions at 8 MHz.

 

Can someone check my math? I'm creating a delay of 100 ms, so I start with the 8 MHz clock and use a prescaler of 16, which gives 500 kHz. At that frequency I think the timer has to count 50,000 times to get a 100 ms delay, so I'd set my 16-bit timer to 15,536 (2^16 − 50,000) and have it count up. I'd loop while it's counting, and once it overflows I'd exit the loop.

2 MHz instruction clock
It appears that the instruction clock runs at 2 MHz. When I recalculated the delays for a 2 MHz clock, the delays seem to be fairly accurate. It's possible that the internal oscillator is at 8 MHz and it's being divided by 4 like an external oscillator is. I've been reading the datasheet, but it doesn't seem to specify whether it is divided or not.

Correct

Dunno about the 18 series, but in the 16 series, four clock cycles are required per instruction cycle. (Don't forget, of course, that jumps require an extra instruction cycle.) With an 8 MHz crystal, each instruction takes 4 / (8×10^6) = 0.5 µs.

So, I think your calculations are correct (again, they ARE correct for a 16 series chip).

Cycles

For clarity, I use British/American punctuation in numbers, so a comma "," is a thousands separator and a period "." is a decimal point; i.e. 100,000.75 is one hundred thousand point seven five.

I suggest you think of delays in terms of "instruction cycles" rather than in real-time units like seconds. It will make things easier if you ever change your oscillator. A delay of 100,000 instruction cycles will take 1 / 8,000,000 × 4 × 100,000 = 50 ms.

The answer to your question is "yes." The clock frequency (whether internal or external) is divided by four to obtain the instruction frequency.

I found it very helpful to code everything for a 4 MHz clock (i.e. 1 µs per instruction), then quarter my values and switch up to a 16 MHz crystal.

thanks, At first I was under
Thanks. At first I was under the assumption that the internal oscillator was not being divided by four; that assumption is obviously wrong. Thanks for everyone's help.