Resistance thermometer Self-heating

A resistance thermometer is a passive resistance sensor; it requires a measuring current to produce a useful signal.

Because this measuring current heats the element wire above the true ambient temperature, errors will result unless the extra heat is dissipated.

Self-heating is most often expressed in mW/°C, which is the power in milliwatts (1000 · I² · R) required to raise the thermometer's internal temperature by 1 °C. The higher the mW/°C figure, the lower the self-heating.

As an example, assume a 5 mA measuring current is driven through a 100 Ω platinum RTD at 100 °C.

Self-heating is specified as 50 mW/°C in water moving at 3 ft/sec. The amount of heat generated is:

1000 mW/W × (0.005 A)² × 138.5 Ω ≈ 3.5 mW

The self-heating error is:

(3.5 mW) / (50 mW/°C) = 0.07 °C
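The same calculation can be written as a short script. This is only a minimal sketch of the arithmetic above, assuming a Pt100 element with a resistance of 138.5 Ω at 100 °C (the standard Pt100 value) and a constant-current measurement; the function name and parameters are illustrative, not from any particular library.

```python
def self_heating_error(current_a, resistance_ohm, coefficient_mw_per_c):
    """Return the self-heating temperature error in °C.

    current_a            -- measuring current in amperes
    resistance_ohm       -- element resistance at the operating temperature, in ohms
    coefficient_mw_per_c -- self-heating specification in mW/°C
    """
    power_mw = 1000.0 * current_a ** 2 * resistance_ohm  # dissipated power, mW
    return power_mw / coefficient_mw_per_c               # temperature error, °C

# Worked example: 5 mA through a Pt100 at 100 °C, spec 50 mW/°C in moving water
error = self_heating_error(0.005, 138.5, 50.0)
print(f"Self-heating error: {error:.2f} °C")  # ≈ 0.07 °C
```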

The generated heat increases with higher sensor element resistance (when a constant-current measurement device is used) or with a larger measuring current.

The resulting error is inversely proportional to the ability of the thermometer to shed the extra heat, which in turn depends on the thermometer's materials, construction, and environment.

The worst self-heating occurs when a high resistance is packed into a small body. Thin-film elements, with little surface area to dissipate heat, are an example.

Self-heating also depends on the medium in which the thermometer is immersed. Error in still air may be over 100 times greater than in moving water.
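To illustrate the effect of the medium, the sketch above can be reused with a much lower self-heating coefficient for still air. The 0.5 mW/°C figure below is only an assumed value chosen to reflect a roughly 100× worse case than the 50 mW/°C moving-water specification; it is not a datasheet number.

```python
# Hypothetical still-air coefficient, assumed only for illustration
# (about 100x worse heat dissipation than the moving-water figure).
still_air_coefficient = 0.5  # mW/°C (assumed, not a datasheet value)

error_still_air = self_heating_error(0.005, 138.5, still_air_coefficient)
print(f"Self-heating error in still air: {error_still_air:.1f} °C")  # ≈ 6.9 °C
```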
