How millis() can be used to average sensor readings

Hi,

I am working on a current sensing project using an Uno and an ACS712. I can get current values by averaging in a for loop of 5000 iterations. The issue is that this takes time to process and acts like a delay, an unusable delay during which the processor could be running another process or function. Is there any way I can get rid of these main-loop delays and use millis() for better performance?

Please guide me on this.

There was a very nice

There was a very nice tip/walkthrough for this not so long ago: https://www.robotshop.com/letsmakerobots/running-average-filter

ADC CPU utilization

You could put the ADC read inside a timer ISR and apply the aforementioned running-average scheme.

I am not sure how it can be

I am not sure how it can be done. Is there any sample I can get some ideas from? My sketch is also posted.

ADC inside ISR

1. Create an ISR to service an unused timer.

2. Setup the timer to match your sample interval.

3. Initialize the ring buffer, insertion pointer or index, total, average etc.

4. Upon timer interrupt:

    a. Subtract oldest data entry from running total

    b. Read new A/D value

    c. Store new A/D value in ring buffer

    d. Add new A/D value to total

    e. Compute average

    f. Start new A/D conversion.

You will need some additional code to keep track of the number of valid samples, or else expect the first N-1 averages to be invalid. Make sure the "total" variable is large enough to hold the sum of N conversions: with a 10-bit ADC the maximum reading is 1023, so 64 samples can sum to 65,472, which just fits in a uint16_t; for larger N use a uint32_t.