What is uncertainty and why is it important? What is the difference between uncertainty and accuracy?

According to the “Guide to the expression of uncertainty in measurement” (GUM), which is widely regarded as the international standard for the evaluation of uncertainty, the uncertainty of measurement is: “…non-negative parameter characterizing the dispersion of the quantity values being attributed to a measurand, based on the information used…”. In practice, this means that the “true” value of the measurand (e.g. line pressure, temperature, velocity, frequency, density, volume, etc.) lies within an interval around the result of measurement reported by a sensor, the width of which is given by the stated uncertainty.
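As a minimal sketch of how such an interval is read in practice: the GUM convention is to report a result as y ± U, where U = k·u is the expanded uncertainty obtained from the standard uncertainty u and a coverage factor k (commonly k = 2 for roughly 95 % coverage, assuming a normal distribution). All numbers below are illustrative, not taken from any real instrument.

```python
# Sketch: interpreting a measurement result together with its uncertainty.
# Illustrative (hypothetical) numbers throughout.

result = 52.4       # measured line pressure, bar
standard_u = 0.15   # standard uncertainty u, bar
k = 2               # coverage factor for ~95 % confidence

expanded_u = k * standard_u
interval = (result - expanded_u, result + expanded_u)

print(f"Result: {result} ± {expanded_u} bar (k={k})")
print(f"The 'true' pressure is believed to lie within {interval}")
```

The point is that a measurement without its uncertainty is incomplete: the single number 52.4 bar says nothing about how wide the interval around it is.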

Uncertainties can be expressed in absolute or relative terms. The word “uncertainty” without any qualifier means absolute uncertainty and is expressed in the engineering units of the measurand. “Relative uncertainty” is the official metrology term for the absolute uncertainty divided by the result of measurement. However, there is another, “unofficial” way of expressing uncertainty used by various manufacturers: “relative to full span”, i.e. divided by the instrument’s full measurement range. It is important to understand how an uncertainty is expressed before using it in any practical application, so pay attention to the wording in datasheets, calibration certificates, manuals, etc.
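The difference between these conventions matters most at low readings, where a “% of span” figure can look far better than the equivalent “% of reading”. A small sketch, with purely illustrative transmitter numbers:

```python
# Sketch: the same transmitter uncertainty expressed three ways.
# All numbers are illustrative, not from any real datasheet.

FULL_SPAN_BAR = 100.0      # transmitter range: 0-100 bar
READING_BAR = 40.0         # current result of measurement
ABS_UNCERTAINTY_BAR = 0.2  # absolute uncertainty, in the measurand's units

# Relative uncertainty (official GUM term): absolute / result of measurement
relative = ABS_UNCERTAINTY_BAR / READING_BAR            # 0.5 % of reading

# "Relative to full span" (unofficial manufacturer convention):
# absolute / instrument range -- looks smaller whenever reading < span
relative_to_span = ABS_UNCERTAINTY_BAR / FULL_SPAN_BAR  # 0.2 % of span

print(f"absolute:         {ABS_UNCERTAINTY_BAR} bar")
print(f"relative:         {relative:.1%} of reading")
print(f"rel. to full span: {relative_to_span:.1%} of span")
```

At 40 % of span, “0.2 % of span” and “0.5 % of reading” describe exactly the same 0.2 bar of absolute uncertainty, which is why the wording in the datasheet must be checked before the numbers are compared.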

There is another term, “accuracy”, which is sometimes used interchangeably with “uncertainty”. According to GUM, measurement accuracy is a qualitative concept describing the closeness of a measurement result to the “true” value. So if an instrumentation document assigns a numeric value to accuracy, this is, strictly speaking, incorrect by definition. In such cases, one should assume that the term “accuracy” is being used in place of “uncertainty”.

The level of uncertainty shows how confident you can be in your measurements. This confidence, or conversely the doubt in a measurement, will impact decision making.

Generally, the uncertainty of measurement in the oil and gas industry varies between fiscal metering skids and measurements at the wellsite, the latter being the least accurate due to complex fluid behaviour.

The uncertainty of measurement is not a constant value; it tends to change over time, mostly towards deterioration. In other words, there will inevitably be more cases of mismeasurement, especially in field measurement applications. The worst part is that these mismeasurements mostly remain undetected unless specific measures are taken.

Mismeasurements are mainly caused by:

  • Wrong calibration – bias in the measurement
  • Lack of maintenance/mishandling – drifting measurement
  • Wrong flow calculation – a simplistic approach instead of proper real-time calculations (z-factor, fluid density, etc.)
  • Wrong parameters – design parameters used instead of actual parameters (which depend on pressure, temperature and composition)
  • Scattered calculation sources (Excel, DCS, SCADA, PI, reporting tools, etc.)
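To illustrate how large the “simplistic calculation / wrong parameters” effect can be, here is a sketch comparing gas density computed with an ideal-gas assumption (z = 1, a typical design shortcut) against the same density using an actual compressibility factor at operating conditions. The z-factor, molar mass and operating conditions are assumed values for illustration only.

```python
# Sketch of the "wrong parameters / simplistic calculation" pitfall:
# fixed design assumptions vs. actual conditions. Numbers are illustrative.

R = 8.314   # J/(mol*K), universal gas constant
M = 0.0175  # kg/mol, molar mass of a lean natural gas (assumed)

def gas_density(p_pa: float, t_k: float, z: float) -> float:
    """Real-gas density: rho = p * M / (z * R * T)."""
    return p_pa * M / (z * R * t_k)

p, t = 60e5, 310.0                        # 60 bar, 310 K operating conditions
rho_design = gas_density(p, t, z=1.0)     # simplistic: ideal gas (z = 1)
rho_actual = gas_density(p, t, z=0.88)    # actual z-factor at p, T (assumed)

error_pct = (rho_design - rho_actual) / rho_actual * 100
print(f"design density: {rho_design:.2f} kg/m3")
print(f"actual density: {rho_actual:.2f} kg/m3")
print(f"density bias:   {error_pct:.1f} %")
```

With these assumed numbers the ideal-gas shortcut understates density by 12 %, a bias that flows straight into any mass-flow figure derived from it, far exceeding typical transmitter uncertainties.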

How does your organisation manage uncertainties in production measurement?