What is the difference between A/D resolution and A/D accuracy?

Resolution and accuracy are two commonly confused specifications for analog-to-digital (A/D) conversion.

Resolution defines the smallest increment of change that can be detected in an analog measurement. If an A/D input has a range of 0 to +10 volts with 12-bit resolution, the 10-volt range is divided into 2^12, or 4096, counts. A 1-count change at the A/D input therefore corresponds to a difference of 0.00244 volts (10 volts / 4096 counts). Increasing the number of bits returned by the A/D input increases the resolution by decreasing the voltage represented by each count.
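
A minimal sketch of the arithmetic above, assuming a unipolar 0 to +10 volt range; the function name lsb_volts and the 16-bit comparison are illustrative additions, not part of the original:

```python
def lsb_volts(full_scale_volts: float, bits: int) -> float:
    """Return the voltage represented by one A/D count (1 LSB)."""
    counts = 2 ** bits          # number of divisions across the full-scale range
    return full_scale_volts / counts

# 0 to +10 V range with 12-bit resolution -> one count = ~0.00244 V
print(lsb_volts(10.0, 12))      # 0.00244140625

# More bits -> a smaller step per count, i.e. finer resolution
print(lsb_volts(10.0, 16))      # 0.000152587890625
```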

Accuracy specifies the maximum difference between the actual analog value and the value the converter reports. If you feed a 2.5-volt signal to an A/D input, accuracy is a measure of how close the A/D's reading is to the applied 2.5 volts.
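
To make the distinction concrete, here is a sketch that converts a raw reading back to volts and compares it with the known 2.5-volt input; the counts_to_volts helper and the measured code of 1021 are hypothetical values chosen for illustration (the ideal 12-bit code for 2.5 V on a 0 to 10 V range is 1024):

```python
def counts_to_volts(code: int, full_scale_volts: float = 10.0, bits: int = 12) -> float:
    """Convert a raw A/D count back to the voltage it represents."""
    return code * full_scale_volts / (2 ** bits)

applied_volts = 2.5           # known signal fed to the A/D input
measured_code = 1021          # hypothetical raw reading; the ideal code would be 1024
measured_volts = counts_to_volts(measured_code)

error = measured_volts - applied_volts
print(f"measured {measured_volts:.5f} V, error {error * 1000:.2f} mV")
# Accuracy is the worst-case magnitude of this error over the input range.
```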