What is the difference between A/D resolution and A/D accuracy?

Resolution and accuracy are two commonly confused specifications for analog-to-digital (A/D) conversion.

Resolution defines the smallest increment of change that can be detected in an analog measurement. If you have an A/D input range of 0 to +10 volts with 12-bit resolution, the 10 volt range is divided into 2^12, or 4096, divisions. A 1-bit change at the A/D input therefore corresponds to a difference of 0.00244 volts (10 volts / 4096 counts). Increasing the bits of precision returned by the A/D input increases the resolution by decreasing the voltage represented by each count.
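As a quick illustration of this arithmetic, the sketch below computes the size of one count (one LSB) for a few common bit depths. The unipolar 0 to +10 volt span and the bit depths shown are example values, not specifications for any particular device.

    # Size of one A/D count (LSB) for a given input span and bit depth.
    # Example values only: a unipolar 0 to +10 V range is assumed.
    def lsb_volts(span_volts: float, bits: int) -> float:
        return span_volts / (2 ** bits)

    # Convert a raw A/D count back to an input voltage.
    def count_to_volts(count: int, span_volts: float, bits: int) -> float:
        return count * lsb_volts(span_volts, bits)

    for bits in (8, 12, 16):
        print(f"{bits}-bit over 10 V: 1 count = {lsb_volts(10.0, bits):.6f} V")
    # 8-bit over 10 V: 1 count = 0.039062 V
    # 12-bit over 10 V: 1 count = 0.002441 V
    # 16-bit over 10 V: 1 count = 0.000153 V

Note how each additional bit halves the step size: moving from 12 to 16 bits shrinks one count from about 2.44 millivolts to about 0.15 millivolts over the same 10 volt span.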

Accuracy specifies the maximum difference between the actual analog value and the value measured. If you feed a 2.5 volt signal to an A/D input, accuracy is a measure of how close the value the A/D reports is to the actual 2.5 volts applied.
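To see how the two specifications interact, the sketch below bounds the worst-case reading for an applied 2.5 volt signal on the same 12-bit, 0 to +10 volt input. The ±2 LSB accuracy figure is a hypothetical value chosen for illustration, not a spec for any real device.

    # Worst-case error band around a 2.5 V reading, assuming a
    # hypothetical accuracy spec of +/-2 LSB (illustration only).
    span_volts = 10.0
    bits = 12
    lsb = span_volts / (2 ** bits)   # 0.00244 V per count

    applied = 2.5                    # actual input voltage
    accuracy_counts = 2              # hypothetical +/-2 LSB spec
    error = accuracy_counts * lsb    # +/-0.00488 V

    print(f"Reading falls within {applied - error:.5f} V "
          f"to {applied + error:.5f} V")
    # Reading falls within 2.49512 V to 2.50488 V

In other words, resolution tells you the finest step the converter can distinguish, while accuracy tells you how far the reported value may sit from the true input, and a converter can have fine resolution yet poor accuracy.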
