ADC Resolution Calculator

Calculate ADC LSB size, quantization levels, max code, and convert between voltage and digital code for any resolution (6–24 bit) and reference range.

Calculator · Electronics · Updated Apr 18, 2026
How to Use
  1. Enter ADC resolution (bits) and reference voltage range.
  2. The tool shows LSB (smallest step), total levels (2^N), and max code.
  3. Optionally convert a voltage to digital code or a code back to voltage.
  4. The staircase plot shows how an ideal ADC quantizes a voltage ramp.
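The quantities in steps 2 and 3 are simple arithmetic and can be checked offline; a minimal Python sketch of the same calculation (the function name is illustrative):

```python
def adc_params(bits: int, v_ref: float):
    """Return (lsb, levels, max_code) for an ideal N-bit ADC."""
    levels = 2 ** bits          # total distinct output codes, 2^N
    lsb = v_ref / levels        # smallest voltage step
    max_code = levels - 1       # highest representable code
    return lsb, levels, max_code

lsb, levels, max_code = adc_params(10, 3.3)  # 10-bit ADC, 3.3 V reference
print(f"LSB = {lsb * 1e3:.2f} mV, levels = {levels}, max code = {max_code}")
# prints: LSB = 3.22 mV, levels = 1024, max code = 1023
```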

Formulas

LSB Size
LSB = V_ref / 2^N
Smallest detectable voltage step.
Total Levels
N_levels = 2^N
Distinct output codes.
Max Code
Code_max = 2^N − 1
Highest representable code.
Voltage → Code
Code = round(V × 2^N / V_ref)
Quantizes voltage to the nearest code; results above Code_max are clamped to Code_max.
Code → Voltage
V = Code × LSB
Reconstructed voltage.
Dynamic Range
DR = 6.02 × N + 1.76 dB
Theoretical SNR for a full-scale sine.
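The formulas above map directly to code; a minimal Python sketch, with out-of-range voltages clamped the way a real converter saturates:

```python
def volts_to_code(v: float, v_ref: float, bits: int) -> int:
    """Quantize a voltage to the nearest code, clamped to [0, 2^N - 1]."""
    code = round(v * (2 ** bits) / v_ref)
    return max(0, min(code, (2 ** bits) - 1))

def code_to_volts(code: int, v_ref: float, bits: int) -> float:
    """Reconstruct the voltage a code represents: V = code * LSB."""
    return code * v_ref / (2 ** bits)

def dynamic_range_db(bits: int) -> float:
    """Theoretical SNR of a full-scale sine: 6.02*N + 1.76 dB."""
    return 6.02 * bits + 1.76
```

For a 12-bit, 3.3 V converter, `volts_to_code(1.65, 3.3, 12)` returns 2048 (mid-scale) and `code_to_volts(2048, 3.3, 12)` recovers 1.65 V; `dynamic_range_db(12)` gives about 74 dB.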

History of the Analog-to-Digital Converter

Alec Reeves patented Pulse-Code Modulation (PCM) in 1938 while working for ITT's Paris research lab — the first description of sampling a continuous waveform and encoding each sample as a binary number. Bernard Widrow's MIT doctoral thesis (1956) formalized the statistical theory of quantization, giving engineers the tools to analyze quantization noise rigorously. Bell Labs' T1 carrier system (1962) was the first commercial digital telephony system, using 8-bit ADCs at an 8 kHz sample rate to carry 24 voice channels on a single pair.

The four major ADC architectures appeared in rapid succession: successive-approximation (SAR) in the 1950s (slow, moderate resolution); flash/parallel in the 1960s (very fast, low resolution); sigma-delta (Σ-Δ) oversampling in the 1970s (slow, very high resolution); pipelined in the 1980s (high speed and high resolution). Each trades resolution, speed, power, and cost differently; all follow the same LSB = V_ref / 2^N relationship this calculator uses.

Modern ADCs span an extraordinary range: 1 Gsps, 8-bit flash converters for oscilloscopes and radar; 250 Msps, 14-bit pipelined for software-defined radio; 24-bit Σ-Δ at 32 ksps for seismic recording; 6.02·N + 1.76 dB is still the theoretical SNR limit for any resolution. Oversampling and noise-shaping push effective resolution well beyond the physical element count — a 1-bit Σ-Δ modulator running at 10 MHz can produce 20-bit equivalent resolution at audio rates.

About This Calculator

Enter ADC resolution in bits (6–24) and reference voltage. The tool computes the LSB (smallest voltage step), total quantization levels (2^N), maximum code (2^N − 1), and converts between a given voltage and its quantized code — useful for debugging firmware that converts raw ADC counts into physical units.
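As a concrete firmware-style example, here is a minimal sketch converting raw counts to a physical unit. It assumes a 12-bit ADC, a 3.3 V reference, and a TMP36-style temperature sensor (500 mV offset, 10 mV/°C) — the sensor and settings are illustrative, not part of the tool:

```python
V_REF = 3.3   # reference voltage (assumed)
BITS = 12     # ADC resolution (assumed)

def counts_to_volts(raw: int) -> float:
    """Convert a raw ADC count to volts: V = raw * LSB."""
    return raw * V_REF / (2 ** BITS)

def volts_to_celsius(v: float) -> float:
    """TMP36-style transfer function: 500 mV at 0 degC, 10 mV per degC."""
    return (v - 0.5) / 0.010

raw = 1000                      # example raw reading from the ADC register
temp_c = volts_to_celsius(counts_to_volts(raw))   # about 30.6 degC
```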

Remember that nominal resolution ≠ effective resolution. A 16-bit ADC with 4 LSB of noise has effective resolution closer to 14 bits (ENOB ≈ 14). For precision work, also check INL/DNL specs, reference drift, and the data-sheet noise histogram at your target sample rate. Everything runs client-side; no values leave your browser.

Frequently Asked Questions

What is an LSB?

LSB (Least Significant Bit) is the smallest voltage change an ADC can resolve. LSB = V_ref / 2^N. A 10-bit ADC with 3.3V reference has LSB = 3.3 / 1024 ≈ 3.22 mV per step.

How much resolution do I actually need?

Match resolution to your noise floor and signal range. A 12-bit ADC gives ~74 dB of theoretical dynamic range (6.02 × 12 + 1.76) — fine for most sensor inputs. Going above 16 bits matters only if you've addressed noise, analog gain stages, and reference quality.
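Inverting the dynamic-range formula gives the minimum resolution for a target SNR; a minimal sketch (the helper name is illustrative):

```python
import math

def bits_for_dynamic_range(target_db: float) -> int:
    """Minimum ideal resolution meeting a target SNR: N >= (DR - 1.76) / 6.02."""
    return math.ceil((target_db - 1.76) / 6.02)

# A 90 dB requirement needs a 15-bit ideal converter
# (before real-world noise and ENOB loss are considered).
```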

What's the difference between resolution and accuracy?

Resolution = smallest step size (theoretical). Accuracy = how close the reading is to truth (real-world, after INL, DNL, offset, gain errors, reference drift). A 16-bit ADC with ±4 LSB error has only ~14-bit usable accuracy.

What's ENOB?

Effective Number Of Bits is the noise-adjusted resolution, derived from measured SINAD. A nominally 16-bit ADC whose noise and distortion limit it to 12 bits of SNR delivers only ~12 ENOB. Datasheets typically plot ENOB versus input frequency for SAR and Σ-Δ ADCs.
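ENOB is the SNR formula run backward; a minimal Python sketch:

```python
def enob(sinad_db: float) -> float:
    """Effective number of bits from measured SINAD: (SINAD - 1.76) / 6.02."""
    return (sinad_db - 1.76) / 6.02
```

For example, a converter measuring 74 dB SINAD delivers about 12 effective bits, regardless of its nominal resolution.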

Why is max code 2^N−1, not 2^N?

With N bits you have 2^N unique codes (0 to 2^N−1). Code 0 represents 0V; code 2^N−1 represents just below V_ref. V_ref itself would require 2^N, which doesn't fit in N bits.
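A few lines make the off-by-one concrete; a sketch of an ideal 10-bit quantizer, assuming round-to-nearest with clamping at the top code:

```python
BITS = 10
V_REF = 3.3
MAX_CODE = (2 ** BITS) - 1      # 1023 -- the highest code N bits can hold

def quantize(v: float) -> int:
    """Round-to-nearest quantization; full-scale input clamps to the top code."""
    code = round(v * (2 ** BITS) / V_REF)
    return max(0, min(code, MAX_CODE))

# Unclamped, an input of V_ref would need code 1024 -- one more than 10 bits hold.
print(quantize(V_REF))          # prints: 1023
```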

Common Use Cases

Arduino ADC (10-bit, 5V)

LSB = 4.88 mV. Measure a 0-5V sensor with ~5mV resolution — fine for most robotics.

ESP32 ADC (12-bit, 3.3V)

LSB = 0.806 mV. Battery monitoring, pot reading, any analog input under 3.3V.

Audio ADC (16-bit, 2V)

LSB = 30.5 µV. CD-quality audio requires this to avoid audible quantization noise.

Precision Instrumentation (24-bit)

LSB ≈ 200 nV at 3.3V range. For strain gauges, thermocouples, any low-level measurement where signal < 1mV.

Oscilloscope Front-End (8-bit)

LSB = 7.8 mV at a 2 V full-scale range. Good enough for waveform display; trade resolution for speed.
