The following article covers the magnetic flowmeter principle, magnetic flowmeter calibration, and magnetic flowmeter configuration.
Magnetic Flowmeter Calibration
In measurement and control loops where the process flow is a conductive liquid, magnetic flowmeters can be used to measure flow.
As fluid passes through the meter’s magnetic field, the fluid acts as a conductor, and a potential is induced across the meter’s electrodes. This potential varies directly with the fluid velocity.
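The principle above is Faraday’s law of induction: the induced voltage is proportional to the field strength, the conductor length (electrode spacing), and the fluid velocity. The sketch below illustrates that direct relationship; the symbols and numeric values are illustrative assumptions, not figures from the article.

```python
# Magmeter principle (Faraday's law of induction), as a sketch:
#   E = k * B * D * v
# k = proportionality constant, B = field strength (tesla),
# D = electrode spacing (meters), v = fluid velocity (m/s).
# All values below are hypothetical, for illustration only.

def induced_emf(k: float, b_tesla: float, d_meters: float, v_mps: float) -> float:
    """Voltage induced across the electrodes, proportional to fluid velocity."""
    return k * b_tesla * d_meters * v_mps

# Doubling the velocity doubles the induced potential:
e1 = induced_emf(k=1.0, b_tesla=0.01, d_meters=0.1, v_mps=1.0)
e2 = induced_emf(k=1.0, b_tesla=0.01, d_meters=0.1, v_mps=2.0)
print(e1, e2)
```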
Input and Output Standards
Disconnect the flow tube from the transmitter. A magnetic flowmeter calibrator simulates the signal provided by the electrodes in the flow tube. The operating voltage and frequency range of the calibrator must match those of the magnetic flowmeter. Select the maximum output signal using the calibrator range switch.
The signal options include 5, 10, or 30 mV AC. The magnetic flowmeter calibrator has pre-determined test points, so the percent output knob is used to set each output for a five-point check. Since the output is in milliamps, a milliammeter is the appropriate output measurement standard for this calibration setup.
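The five-point check steps the calibrator through fixed percentages of span and compares each milliammeter reading against an expected value. A minimal sketch, assuming a standard 4-20 mA output and test points at 0/25/50/75/100 percent (a common convention; the article does not specify these values):

```python
# Expected milliamp outputs for a five-point check, assuming a 4-20 mA
# output range and evenly spaced test points (illustrative assumptions).

def expected_ma(percent: float, lrv: float = 4.0, urv: float = 20.0) -> float:
    """Expected output current at a given percent of span."""
    return lrv + (urv - lrv) * percent / 100.0

for pct in (0, 25, 50, 75, 100):
    print(f"{pct:3d}% -> {expected_ma(pct):.1f} mA")
```

At each point, the value printed here would be compared against the milliammeter reading.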
Calibration - Five-Point Check
To begin the calibration of a magnetic flowmeter, calculate the input signal value. The input signal is equal to the upper range multiplied by the calibration factor and by the phase band factor. These values are indicated on the instrument’s data plate.
Input Signal = Upper Range x Calibration Factor x Phase Band Factor
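The data-plate calculation above can be sketched as a one-line function. The numeric values here are hypothetical, not taken from any real instrument’s data plate:

```python
# Input-signal calculation from the article's formula:
#   Input Signal = Upper Range x Calibration Factor x Phase Band Factor
# The example values below are illustrative placeholders.

def input_signal(upper_range: float,
                 calibration_factor: float,
                 phase_band_factor: float) -> float:
    """Input signal to set on the calibrator, from the data-plate values."""
    return upper_range * calibration_factor * phase_band_factor

print(input_signal(upper_range=30.0, calibration_factor=0.95, phase_band_factor=1.02))
```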
Record the output values at each test point, and from this data determine if the instrument is within manufacturer’s specifications.
The following formulas determine whether the error is within the manufacturer’s specifications:
Accuracy (% of span) = (Deviation / Span) x 100
Deviation = Expected Value - Actual Value
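The check above can be sketched as a short script. The expected values, measured readings, span, and the ±0.5% tolerance are all illustrative assumptions, not manufacturer figures:

```python
# Error check per the formulas above:
#   Deviation = Expected - Actual;  error (% of span) = Deviation / Span * 100
# Readings and the 0.5% tolerance are hypothetical examples.

def percent_error(expected_ma: float, actual_ma: float, span_ma: float) -> float:
    """Deviation expressed as a percentage of span."""
    return (expected_ma - actual_ma) / span_ma * 100.0

span = 16.0  # assumed 4-20 mA output span
readings = [(4.0, 4.02), (8.0, 7.98), (12.0, 12.05), (16.0, 15.96), (20.0, 20.03)]

for expected, actual in readings:
    err = percent_error(expected, actual, span)
    status = "PASS" if abs(err) <= 0.5 else "FAIL"
    print(f"expected {expected:5.2f} mA, actual {actual:5.2f} mA, "
          f"error {err:+.3f}% -> {status}")
```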
Adjust zero at the lowest point in the instrument’s range by turning the zero adjust screw until the output reading is correct. Then adjust span and, since zero and span often interact, verify both until no further adjustment is necessary.
To conclude the calibration, recheck the upscale and downscale readings to verify that the instrument is properly calibrated.