Category: Tech Update with Jon Titus

Q: I need an analog-to-digital converter, probably one with 16 bits. The more bits, the better the resolution, right?

A: Not so fast. To start, understand the difference between resolution and accuracy. I have an inexpensive 6-inch plastic ruler and a steel machinist’s ruler. Both have a resolution of sixteenths of an inch, but which provides the better accuracy? Right, the machinist’s ruler. The manufacturer used fine engraving to mark the ruler divisions. The plastic ruler has thick “bumps” with paint on them.

Let’s look at a 16-bit ADC with an input range of 0 to 2.5 volts. It has a resolution of:

1 / 2^16 = 1 / 65,536

or one part in 65,536, or 15.3 parts-per-million. That means the ADC can detect 65,536 separate voltages over the 0-to-2.5-volt input range, or:

2.5 V / 65,536 = 0.0000381 V

Each voltage step has a difference of 0.038 mV, or 38 μV, from its nearest-neighbor steps, as shown in Figure 4. No, that’s not an error. The ADC provides 65,536 voltage values, from 0000000000000000₂ to 1111111111111111₂, but only 65,535 voltage increments, or “steps,” across that range.
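The counts-versus-steps arithmetic above is easy to check in a few lines of Python; the 2.5-volt range and 16-bit width are the values from this example:

```python
# Step size of a 16-bit ADC with a 0-to-2.5 V input range.
FULL_SCALE = 2.5                # volts, the example's input range
BITS = 16

codes = 2 ** BITS               # 65,536 distinct output codes
steps = codes - 1               # but only 65,535 increments between them
step_size = FULL_SCALE / codes  # volts per code, about 38 microvolts

print(codes, steps)             # 65536 65535
print(f"{step_size * 1e6:.1f} uV")  # 38.1 uV
```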

Of those 16 bits, the left-most bit — the most-significant bit (MSB) — has a decimal “weight” of either 0 or 32,768. The right-most bit — the least-significant bit (LSB) — has a weight of either 0 or 1. The bits between the LSB and MSB have increasing weights of 2, 4, 8, 16, 32, and so on. Thus, the LSB has a voltage “weight” of:

(1 / 65,536) × 2.5 V = 0.0000381 V

or just 0.038 mV. But the MSB has a voltage weight of:

32,768 × 0.038 mV = 1.245 V

or 1.245 volts, which is approximately half of the 2.5-volt full-scale range for the 16-bit ADC.
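The bit-weight arithmetic can be sketched the same way. Note that multiplying the exact LSB weight gives an MSB weight of exactly 1.25 V; the 1.245-volt figure comes from rounding the LSB to 0.038 mV before multiplying:

```python
FULL_SCALE = 2.5                            # volts
BITS = 16

lsb_weight = FULL_SCALE / 2 ** BITS         # exact LSB weight, ~38.1 uV
msb_weight = lsb_weight * 2 ** (BITS - 1)   # exactly 1.25 V, half of full scale

rounded = 0.038e-3 * 2 ** (BITS - 1)        # 0.038 mV x 32,768, the rounded path

print(f"{msb_weight:.3f} V")                # 1.250 V
print(f"{rounded:.3f} V")                   # 1.245 V
```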

Q: The LSB means I can measure a voltage as small as 0.038 millivolts?

A: Well, not quite. A resolution of 16 bits from an ADC doesn’t necessarily mean accuracy down to the level of 0.038 mV. Accuracy depends on many things, such as electrical noise, the accuracy of the ADC’s voltage reference, tolerances of components, and so on. I’ll let James Bryant, an engineer who worked for ADC manufacturer Analog Devices, explain (with slight editing):

No presently available 16-bit ADC has an absolute accuracy of 15 ppm relative to its full-scale input voltage. The best 16-bit ADCs have gain errors of several least-significant bits (LSBs). So even with a perfect voltage reference, their initial absolute accuracy off the shelf is, at best, about 14 bits.

Because most applications require linearity but not absolute precision, the voltage reference on many ADC integrated circuits has an accuracy of about 10 bits (1 part in 1,024), and some are less accurate. This is because a high-precision reference is quite large, would make the converter more expensive, and is not needed by most users.

Separate external reference-voltage sources are better, but still nowhere near 16-bit accuracy. The best available reference has an initial accuracy of 1 mV in 10 V, or about 13 bits. Most high-performance references are on the order of 11- to 12-bit accuracy. Even with calibration it is hard to achieve 16 bits, and it is very difficult to maintain over temperature. (Ref. 1.)

So if you choose a 16-bit ADC, assume you will have an accuracy of about 13 bits. That means a resolution of one part in 2^13:

1 / 2^13 = 1 / 8,192

So over the 0-to-2.5-volt ADC input range you still get a 16-bit result, but you can’t treat the three least-significant bits as accurate enough to give you useful information. If you ignore the three LSBs, you have a resolution of:

2.5 V / 8,192 = 0.000305 V

or about 0.305 mV/step in the 13-bit result.

You can ignore those bits in calculations, but with one warning: the 13-bit result might not reach all the way to the maximum voltage in an ADC range. In practice, though, it’s unlikely you would ever choose a voltage range that an input voltage might exceed.
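As a sketch of that bookkeeping, here is one way to discard the three least-trusted bits and compute the effective 13-bit step size of 2.5 V / 8,192, about 0.305 mV; the raw conversion code below is made up for illustration:

```python
FULL_SCALE = 2.5                 # volts
BITS = 16
USABLE_BITS = 13                 # roughly what a real 16-bit ADC delivers

raw_code = 0b1010110011110111    # hypothetical 16-bit conversion result
trusted = raw_code >> (BITS - USABLE_BITS)   # drop the 3 noisy LSBs

step_13 = FULL_SCALE / 2 ** USABLE_BITS      # ~0.305 mV per 13-bit step
voltage = trusted * step_13

# The largest trusted code, 8191, maps to 8191 * step_13 = 2.4997 V,
# slightly below the 2.5 V top of the range -- the warning above.
print(f"{voltage:.4f} V")
```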