== Abstract ==
Digital calibration techniques are widely utilized to linearize pipelined A/D converters (ADCs). However, their power dissipation can be prohibitively high, especially when high-order gain calibration is needed. For high-order gain calibration, this paper proposes a design methodology to optimize the data precision (number of bits) within the digital calibration unit, so that the power dissipation of the calibration unit is minimized without affecting the linearity of the pipelined ADC. A 90-nm FPGA synthesis of a 2nd-order digital gain-calibration unit shows that the proposed optimization methodology results in a 59% reduction in power dissipation.
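To illustrate the idea behind the abstract, the sketch below applies a 2nd-order digital gain correction to a pipelined-ADC output with the calibration coefficients quantized to a chosen fixed-point precision, then compares two word lengths. This is a minimal illustration only: the correction form, the coefficient values, and the bit widths are assumptions made here and are not taken from the paper's actual calibration unit or optimization procedure.

```python
# Minimal sketch (not the authors' implementation): 2nd-order digital gain
# correction with coefficients stored at a chosen fixed-point precision.
# Coefficient values and bit widths below are illustrative assumptions.

def quantize(value, frac_bits):
    """Round a real coefficient to signed fixed point with `frac_bits`
    fractional bits, as a narrow hardware multiplier would store it."""
    scale = 1 << frac_bits
    return round(value * scale) / scale

def corrected_output(raw_code, a1, a2, frac_bits):
    """Apply y_cal = a1*y + a2*y^2 using quantized coefficients."""
    a1_q = quantize(a1, frac_bits)
    a2_q = quantize(a2, frac_bits)
    return a1_q * raw_code + a2_q * raw_code * raw_code

# Compare a wide (16-bit) and a narrower (8-bit) fractional precision.
# A narrower data path needs smaller multipliers and hence less power;
# the design question is how narrow it can be before ADC linearity degrades.
raw = 0.3712                 # normalized raw ADC output (hypothetical)
a1, a2 = 1.0021, -0.0043     # hypothetical 1st- and 2nd-order gain-error terms
for bits in (16, 8):
    print(bits, corrected_output(raw, a1, a2, bits))
```

The word-length optimization described in the abstract can be thought of as choosing the smallest such precision for which the residual error stays below the ADC's linearity requirement.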
Published on 01/01/2008
Volume 2008 (2008)
DOI: 10.1109/icecs.2007.4511078
Licence: CC BY-NC-SA