A history of digital-to-analog conversion

People have been converting digital quantities to analog ones for a long time. Probably among the earliest uses was the summing of calibrated weights in weighing applications (figure, left center).
Early electrical D/A conversion inevitably involved switches and resistors of different values, usually arranged in decades. The typical application was calibrated bridge balancing or reading an unknown voltage via null detection. The most accurate resistor-based DAC of this type is Lord Kelvin's Kelvin-Varley divider (figure, large box). Based on switched resistor ratios, it can achieve ratio accuracies of 0.1 ppm (23+ bits), and it is still widely employed in standards laboratories. See part 2 (April 26, 2001) of this series for the details of Kelvin-Varley dividers.
Higher-speed D/A conversion requires electronic switching of the resistor network. Early electronic DACs were built at the board level from discrete precision resistors and germanium transistors (figure, center foreground: a 12-bit DAC from a Minuteman missile D-17B inertial-navigation system, circa 1962). In the mid-1960s, Pastoriza Electronics was probably the first to offer electronically switched DACs as standard products. Other manufacturers followed, and modular DACs built from discrete and monolithic components (figure, right and left) became popular by the 1970s. The units were often potted (figure, left) for ruggedness, performance, or preservation of proprietary knowledge. Hybrid technology produced smaller packages (figure, left foreground), and the development of Si-Chrome resistors permitted fully monolithic precision DACs, such as the LTC1595 (figure, immediate foreground).
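Every resistor-network DAC in this lineage, from the Minuteman board to the LTC1595, implements the same ideal transfer function: the output is the reference scaled by the binary input code. A minimal sketch of that arithmetic in Python (illustrative only; the function name and the 10V reference are assumptions, not details of the hardware described above):

    # Ideal transfer function of an N-bit binary DAC: V_out = V_ref * code / 2**n_bits
    def ideal_dac_output(code: int, n_bits: int, v_ref: float) -> float:
        if not 0 <= code < 2 ** n_bits:
            raise ValueError("code out of range for n_bits")
        return v_ref * code / (2 ** n_bits)

    # 12 bits (the Minuteman-era DAC's resolution), assuming a 10V reference:
    print(ideal_dac_output(2048, 12, 10.0))  # half scale -> 5.0V; 1 LSB is ~2.44mV
    # 16 bits (the LTC1595's resolution): 1 LSB of a 10V reference is ~153uV
    print(ideal_dac_output(1, 16, 10.0))

At 16 bits, each step is 16 times finer than the 12-bit Minuteman converter could resolve, which is the progress the paragraph above describes.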
In keeping with all things monolithic, modern high-resolution IC DACs are a cost-performance bargain. Think of it! A 16-bit DAC in an eight-pin IC package. What Lord Kelvin would have given for a credit card and LTC's phone number!