Hi!
I'm having trouble making sense of the output of a Wheatstone bridge amplifier, and I'm afraid I've misunderstood something along the way since I'm new to load cells.
I'm using a single channel (Strain 1/A0) of the RB-Onl-38 amp shield installed on an Arduino Uno, with a pre-calibrated load cell from Transducer Techniques. I've left the factory gain resistor in place, and I've set the reference pot on the shield as described in the documentation (to produce a raw ADC value of 338 with no load cell connected - which also seems to be the baseline output with the cell connected and no load applied). I've used load cells a number of times before, but always small ones that were easy to calibrate with known weights and a manually applied conversion factor. The tricky part here is that this load cell is a 20 ton cell and I don't have any standard or known weights large enough to calibrate against, so I have to accomplish this using the certificate of calibration.
Measuring the excitation pins, I see that the shield provides an excitation voltage of 3.3V (I expected 5V, and my load cell was calibrated at 10V).
According to my load cell calibration certificate, with 10V of excitation the load cell produces 2.2387 mV/V at a full-scale load of 20 tons, and 1.119 mV/V at half scale (10 tons).
As I understand it, this means that with 3.3V excitation the non-amplified output would be about 0.7388 mV/V at 20 tons (since 3.3V/10V = 0.33). Similarly, if the excitation voltage were 5V, I would expect the mV/V output at full scale to be half the 2.2387 mV/V reported at 10V excitation.
I also have a gain of 495 to factor in from the amp. To complicate matters further, the shield outputs the amplified signal into an analog input that reads 5V at full scale, so I'm not confident how to reconcile the gain, the 3.3V excitation, and the 5V ADC in order to extrapolate the expected raw ADC value at 20 tons for a load cell certified at 10V excitation.
The manual mentions setting the reference pot to offset the output to "mid-supply voltage", but in the test procedure it has you set that pot to a raw ADC value of about 338, which corresponds to half of the 3.3V excitation voltage (338 counts ≈ 1.65V on a 5V/1023-count ADC) - but not half the 5V rail, and not half the count range of the Uno's 5V analog pins. I don't understand why.
At any rate, my assumption was that with the reference set to output an ADC value of 338 with no load cell connected, I could expect an ADC reading of 584 with the full 20 tons applied (0.7388 mV/V multiplied by the 3.3V excitation, then multiplied by the gain of 495, for a final value of about 1.207V - which itself would appear as an analogRead() value of 246, which I assume gets added to the unloaded baseline of 338 established by the reference pot during setup). So, can someone who is comfortable with load cells, gain calculations, etc. tell me if this is all correct?