
Definition of Measurement Terms for Mass Properties Instruments

Mass properties measurement instruments use a wide variety of terms to describe the differences between the real quantity and the measured value.

No universal definitions – There is considerable difference of opinion regarding the terms error, uncertainty, precision, accuracy, sensitivity, and resolution. Some of this stems from the lack of universally accepted definitions, and some from recent technology that has required redefining traditional terms. When comparing mass properties instruments, make sure that the terms used to describe accuracy refer to the same quantities.

Accuracy

  • No measurements have absolute accuracy
  • Accuracy is defined as the closeness with which a measurement agrees with the standard
  • Accuracy is usually specified as a tolerance on a measurement where the tolerance is the amount of uncertainty in the stated value
  • Accuracy data may be graphically displayed (calibration, correction, or error curve)
  • Accuracy is generally stated as a percentage. But a % of what: the reading, or the instrument's full scale? (See the sketch after this list.)
  • Accuracy must be defined over a given range
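
To make the "% of what?" question concrete, here is a minimal Python sketch with hypothetical numbers, contrasting a tolerance quoted as a percent of full scale against the same percentage quoted as a percent of reading; at readings well below full scale the two differ dramatically.

```python
# Hypothetical numbers: the same "0.1% accuracy" claim means very
# different things depending on whether the percentage refers to the
# instrument's full scale or to the actual reading.

full_scale = 1000.0   # assumed instrument range
reading = 50.0        # assumed measured value

tol_full_scale = 0.001 * full_scale   # 0.1% of full scale -> +/- 1.0
tol_of_reading = 0.001 * reading      # 0.1% of reading    -> +/- 0.05

print(f"0.1% of full scale: +/- {tol_full_scale}")
print(f"0.1% of reading:    +/- {tol_of_reading}")
```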

Error (General)

  • Error is the KNOWN difference between a measurement and the true value
  • In a calibration procedure, a standard is measured and the error is the difference between the indicated measurement value and the standard value
  • Since the error is known, it can be corrected or compensated for (a numeric sketch follows this list)
  • Frequently it is known that certain errors exist but the degree to which they exist is unknown. These errors are then more correctly called uncertainty. ANY TIME AN UNCERTAINTY CAN BE QUANTIFIED IT BECOMES AN ERROR AND CAN BE COMPENSATED.
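
As a minimal sketch of the error/correction relationship, assuming a hypothetical single-point calibration with a simple constant-offset error:

```python
# Minimal sketch of error vs. correction in calibration, assuming a
# constant-offset error found from one hypothetical standard.

standard_value = 100.000    # certified value of the calibration standard
indicated_value = 100.250   # what the instrument reads for that standard

error = indicated_value - standard_value   # known error: +0.250
correction = -error                        # compensation to apply

def corrected(raw_reading: float) -> float:
    """Apply the known calibration correction to a raw reading."""
    return raw_reading + correction

print(corrected(57.130))   # ~56.88: the known error has been removed
```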

Some Specific Types of Error:

  • Linearity errors, where the (classical) sensitivity varies with the magnitude of the measured quantity rather than following the expected mathematical relationship. This applies to non-linear relationships as well as linear ones. Flow, for example, is often measured by sensing a pressure drop across a restriction; the pressure drop varies as the square of the flow rate. Linearity error is still the deviation of the true flow from that theoretical (non-linear) relationship. Applied to Center of Gravity (CG) measurement, this quantity is usually stated as a percent of measurement. Simply stated, this means the measurement becomes more uncertain as the CG offset from the reference point on the measuring instrument grows. In some cases this may be compensated for (a numeric sketch follows this list).
  • Hysteresis error is the difference between two measurements of the same quantity when the measurement is approached from opposite directions. In some measurement situations a skilled operator can eliminate or minimize hysteresis error, but it cannot usually be fully compensated for. The best practice is to always approach the measurement from the same direction, which improves repeatability if not accuracy.
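
The flow example above can be made concrete. The sketch below uses a hypothetical meter constant and imagined calibration data to evaluate linearity error against the theoretical non-linear curve flow = k * sqrt(delta_p), expressing each deviation as a percent of measurement.

```python
import math

# Hedged sketch: linearity error judged against a theoretical non-linear
# curve. For an orifice meter the pressure drop varies as the square of
# flow rate, so flow = k * sqrt(delta_p). All values are hypothetical.

k = 2.0   # assumed meter constant

def theoretical_flow(delta_p: float) -> float:
    return k * math.sqrt(delta_p)

# (pressure drop, true flow) pairs from an imagined calibration run
calibration = [(1.0, 2.03), (4.0, 3.95), (9.0, 6.10), (16.0, 7.92)]

for dp, true_flow in calibration:
    predicted = theoretical_flow(dp)
    # linearity error: deviation from the theoretical (non-linear) curve,
    # expressed as a percent of measurement
    pct_error = 100.0 * (true_flow - predicted) / predicted
    print(f"dp={dp:5.1f}  predicted={predicted:5.2f}  "
          f"true={true_flow:5.2f}  error={pct_error:+.2f}%")
```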

Uncertainty

  • Uncertainty is the most troublesome quantity in any measurement.
  • It is an accumulation of the unknowns. Even our best measurement standards introduce uncertainty, since we have no means of duplicating and confirming their values identically every time. The farther removed we are from the primary standards, the greater the uncertainty introduced into a given measurement.
  • Likewise, the more terms required to define a measurement, the more uncertainties are introduced.
  • Finally, the larger the mathematical effect of a given term, the greater the uncertainty it contributes: quantities raised to the third power in defining a measurement have a greater effect than those with a linear, first-power relationship (see the propagation sketch after this list).
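
A short sketch of the power-law point, using the standard first-order uncertainty propagation result with an assumed 1% input uncertainty: for y = x**n, the relative uncertainty in y is roughly n times the relative uncertainty in x.

```python
# First-order propagation sketch: why a term raised to a higher power
# contributes more uncertainty. For y = x**n, a small relative
# uncertainty in x produces roughly n times that relative uncertainty
# in y. (Hypothetical 1% input uncertainty.)

rel_unc_x = 0.01   # assumed 1% uncertainty in the input quantity

for n in (1, 2, 3):
    rel_unc_y = n * rel_unc_x   # first-order result for power laws
    print(f"y = x**{n}:  ~{100 * rel_unc_y:.0f}% uncertainty in y")
# A cubed term turns a 1% input uncertainty into roughly 3% in the result.
```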

Resolution

Resolution is the size of the smallest increment which can be shown on the measurement display. On a digital display, it is the value of the least significant digit. On an analog display it is the smallest display change detectable by a “qualified” operator.
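
A minimal sketch of this definition for a digital display, assuming a hypothetical 0.01 display resolution:

```python
# Minimal sketch: resolution as the smallest displayable increment.
# A digital display with 0.01 resolution quantizes every reading to
# that step (hypothetical values).

resolution = 0.01

def displayed(true_value: float) -> float:
    return round(true_value / resolution) * resolution

print(f"{displayed(3.14159):.2f}")   # 3.14 -- changes below 0.01 are invisible
print(f"{displayed(3.14659):.2f}")   # 3.15
```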

Sensitivity

  • Sensitivity today is more commonly used to describe what was formerly called responsiveness. This is the smallest change in the measured quantity which consistently causes the output of the measuring instrument to change. It is largely a function of friction in mechanical systems. In some systems the resolution of the digital display is the limiting factor, since it is often selected to be compatible with the transducer's limitations.
  • The classical definition of sensitivity is the ratio of the change in measurement to the change in measured quantity.
  • It may also be expressed as a (dimensional) gain. This is most clearly applied to dimensional measurements or analog displays. For example, a typical micrometer indicator with a 0.5 inch diameter barrel moves one full turn, or 1.57 inches, when the dimension being measured changes by 0.025 inch. The sensitivity would be 1.57/0.025, or 62.8 inches per inch (gain = 62.8); this arithmetic is worked in the sketch after this list. The sensitivity of an analog voltmeter would be stated as volts per inch of pointer travel.
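
The micrometer arithmetic works out as follows (values taken from the example above; one full turn of a 0.5 inch diameter barrel is one circumference, pi times 0.5, of display travel):

```python
import math

# Worked version of the micrometer example in the text: a 0.5 inch
# diameter barrel travels one circumference per revolution while the
# measured dimension changes by 0.025 inch.

barrel_diameter = 0.5                         # inches
display_travel = math.pi * barrel_diameter    # ~1.571 inches per turn
dimension_change = 0.025                      # inches per turn

gain = display_travel / dimension_change
print(f"sensitivity (gain) = {gain:.1f} inches per inch")   # ~62.8
```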

Repeatability

  • Repeatability is the degree to which an instrument duplicates its measurement when given the same input. It is an overall measure of the quality of the measurement.
  • Non-repeatability may be represented as a +/- % tolerance for an instrument (see the sketch after this list).
  • Repeatability is often the most important characteristic where small changes are being measured.
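
A hedged sketch of how non-repeatability might be quantified from repeated measurements of the same item; the readings and the half-range convention here are assumptions, not a prescribed method:

```python
# Hedged sketch: quantifying non-repeatability from repeated measurements
# of the same item. Readings are hypothetical; one common convention is
# to quote half the observed spread as a +/- percent of the mean reading.

readings = [100.02, 99.98, 100.05, 99.97, 100.03]

mean = sum(readings) / len(readings)
half_range = (max(readings) - min(readings)) / 2
pct = 100.0 * half_range / mean

print(f"non-repeatability ~ +/- {pct:.3f}% of reading")   # ~ +/- 0.040%
```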

Precision

  • The term precision is one of the least useful terms in the measurement vocabulary.
  • It is often not a valid measure of anything.
  • The classical definition of precision is similar to the definition of resolution: the number of significant digits to which a measurement may be read by a qualified operator. This definition is abused when the measurement system includes calculations or high-resolution digital displays where, for example, the average of several 2-decimal-place readings is calculated and displayed to 6 decimal places. A 6-place display implies a degree of accuracy which is not at all valid, as the sketch after this list shows.
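
The trap described above is easy to demonstrate. In this sketch (hypothetical readings), four 2-decimal-place readings are averaged; displaying the mean to 6 places implies an accuracy the data cannot support:

```python
# Sketch of the false-precision trap: averaging several 2-decimal-place
# readings and displaying the result to 6 places implies accuracy the
# underlying data cannot support. Readings are hypothetical.

readings = [12.34, 12.36, 12.33, 12.37]   # each read to 2 decimal places

average = sum(readings) / len(readings)
print(f"displayed: {average:.6f}")   # 12.350000 -- implied accuracy is false
print(f"honest:    {average:.2f}")   # 12.35 -- matches the input resolution
```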

This measurement terminology applies to all our mass properties instruments, including center of gravity measurement instruments, moment of inertia measurement instruments, and spin balance machines.