Calibration and ranging are two tasks associated with establishing an accurate correspondence between any instrument’s input signal and its output signal.
To calibrate an instrument means to check and adjust its response so the output accurately corresponds to its input throughout a specified range.
In order to do this, one must expose the instrument to an actual input stimulus of precisely known quantity.
For a pressure gauge, indicator, or transmitter, this would mean subjecting the pressure instrument to known fluid pressures and comparing the instrument response against those known pressure quantities.
One cannot perform a true calibration without comparing an instrument’s response to a known physical stimulus.
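The comparison described above can be quantified as an error at each test point, conventionally expressed as a percentage of the instrument's span. The following sketch uses hypothetical test data for a 0-200 PSI gauge; the function name and values are illustrative, not from any standard.

```python
def percent_of_span_error(applied, indicated, lrv, urv):
    """Error at one calibration test point, as a percentage of span."""
    return 100.0 * (indicated - applied) / (urv - lrv)

# Hypothetical test: known applied pressures vs. gauge readings (0-200 PSI span)
applied   = [0.0, 50.0, 100.0, 150.0, 200.0]
indicated = [0.5, 50.8, 101.0, 151.2, 201.5]

errors = [percent_of_span_error(a, i, 0.0, 200.0)
          for a, i in zip(applied, indicated)]
# e.g. a reading of 0.5 PSI at an applied 0.0 PSI is a +0.25% of span error
```

A technician would compare each error value against the instrument's accuracy specification to decide whether adjustment is needed.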
To range an instrument means to set its lower and upper range values (LRV and URV) so that it responds with the desired sensitivity to changes in input.
For example, a pressure transmitter set to a range of 0 to 200 PSI (0 PSI = 4 mA output ; 200 PSI = 20 mA output) could be re-ranged to respond on a scale of 0 to 150 PSI (0 PSI = 4 mA ; 150 PSI = 20 mA).
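The linear mapping implied by this example can be sketched as a short Python function (an illustration of the arithmetic, not any vendor's firmware):

```python
def pressure_to_ma(pressure_psi, lrv=0.0, urv=200.0):
    """Linearly map a pressure in [LRV, URV] onto a 4-20 mA output."""
    span = urv - lrv
    return 4.0 + 16.0 * (pressure_psi - lrv) / span

# Original range 0-200 PSI: 0 PSI -> 4 mA, 200 PSI -> 20 mA
print(pressure_to_ma(0.0))              # 4.0
print(pressure_to_ma(200.0))            # 20.0

# Re-ranged to 0-150 PSI: the same 150 PSI input now drives full output
print(pressure_to_ma(150.0, urv=150.0)) # 20.0
```

Note that re-ranging changes only the LRV/URV parameters of the mapping; it says nothing about whether the sensor itself reads pressure accurately, which is the concern of calibration.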
In analog instruments, re-ranging could only be accomplished by re-calibration, since the same adjustments were used to achieve both purposes.
In digital instruments, such as smart transmitters using the HART or FOUNDATION Fieldbus protocols, calibration and ranging are typically separate adjustments (i.e. it is possible to re-range a digital transmitter without having to perform a complete re-calibration), so it is important to understand the difference.
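The separation between the two adjustments in a digital transmitter can be illustrated with a toy model (purely hypothetical; the class and attribute names do not correspond to any real HART or Fieldbus API). Calibration corrections ("trim") act on the raw sensor signal, while the range settings only scale the corrected value to the output signal:

```python
class SmartTransmitter:
    """Toy model of a digital transmitter: trim and range are independent."""

    def __init__(self, lrv=0.0, urv=200.0):
        self.lrv, self.urv = lrv, urv
        self.zero_trim = 0.0   # calibration: corrects sensor offset
        self.span_trim = 1.0   # calibration: corrects sensor gain

    def measure(self, true_pressure_psi):
        # Calibration stage: trim corrects the raw sensor reading...
        trimmed = (true_pressure_psi + self.zero_trim) * self.span_trim
        # ...ranging stage: LRV/URV scale the corrected value to 4-20 mA
        return 4.0 + 16.0 * (trimmed - self.lrv) / (self.urv - self.lrv)

    def re_range(self, lrv, urv):
        # Re-ranging touches no calibration (trim) data
        self.lrv, self.urv = lrv, urv

xmtr = SmartTransmitter()
xmtr.re_range(0.0, 150.0)   # re-ranged without re-calibrating
```

In this sketch, calling `re_range()` leaves `zero_trim` and `span_trim` untouched, which is precisely why a digital transmitter can be re-ranged without re-exposing it to known pressures.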