In metrology, the term measurement touches virtually every scientific, industrial, commercial, and everyday human activity. Measurement is defined as the assignment of a number to a characteristic of an object or event, so that it can be compared with other objects or events. Our knowledge of the reality that surrounds us rests on the measurement of physical quantities; indeed, it can be said that to know means to measure.

The need to establish precise rules for trade and for the organization of territory has made measurement necessary since the Mesolithic period. The development of craft activities, the construction of dwellings and public works, exchange mediated by coins, and the organization of work and property led the earliest civilizations to devise practical systems of measurement: the division of time into months and days (and later into smaller fractions), and units for length, area, weight, and monetary value.

Each people developed its own system of measurement, but conversion from one system of units to another remained approximate until equivalence criteria were established based on precise standards for the various units of measurement.

The earliest examples of units of length and time, found in Egyptian and Assyrian cultures, were kept in temples and other sacred buildings and were preserved for centuries without much change. More rigorous were the units of measurement of ancient Rome, which spread throughout Europe and, after the fall of the empire, fragmented into numerous, often very different, systems of measurement. It was not until the end of the 18th century that measurement systems emerged that gradually acquired a worldwide character.

Measurement applications

Types of measurement applications can be divided into only three broad categories:

  • Monitoring of processes and operations: refers to situations where the instrument is used to monitor some physical quantity (without any control functions).
  • Control of processes and operations: is one of the most important classes of measurement applications. This usually refers to an automatic feedback control system.
  • Experimental engineering analysis: is that part of engineering design, development, and research that relies on laboratory tests of one kind or another to answer questions.

Any application of measurement, including those that have not yet been “invented,” can be classified in one of the three groups just listed, or any combination of them.

The primary goal of measurement in industrial inspection is to determine the quality of the manufactured component. Various quality requirements, such as allowable tolerance limits, shape, surface finish, size, and flatness, must be considered to verify that the component meets the quality specifications. To do this, quantitative information about a physical object or process must be obtained by comparing it with a reference. The three basic elements of measurement are the following:

  • Measurand, a physical quantity to be measured (such as length, weight, and angle);
  • Comparator, to compare the measurand (physical quantity) with a known standard (reference) for evaluation;
  • Reference, the physical quantity or property to which quantitative comparisons are to be made and which is internationally recognized.

These three elements together describe a direct measurement against a calibrated fixed reference. For example, to determine the length of a component, the measurement is made by comparing it with a steel scale (a known standard).

Interferences in measurements

In metrology, in cases where the environmental conditions of the actual use of the transducer differ significantly from the environmental calibration conditions, the effects of the influence quantities must be taken into account. In these cases, specific tests must be performed on a population of transducers or at least on a single transducer.

It is worth emphasizing that attention must be paid to environmental conditions not only during operation of the sensor, but also during the preceding phases, such as storage and transport; if these conditions are not checked and verified, they can significantly and, above all, unpredictably alter the metrological performance of the transducer. Some of the most important influence effects in mechanical and thermal measurements are summarized below.

Temperature effects

For each transducer, the operating temperature range is specified within which it can be used without damage. Within this range, the trends of both the zero drift and the sensitivity drift are generally provided by the manufacturer. For example, in the case of resistance strain gages, the manufacturer gives both the apparent strain as a function of temperature (zero drift) and the variation of the gage (calibration) factor as a function of temperature (sensitivity drift).

Another way of expressing the effect of temperature is to state a range of variation of the temperature-induced error, expressed, for example, as a percentage of full scale. It is also necessary to know the maximum and minimum temperatures to which the transducer can be exposed without permanent damage, i.e. without its metrological characteristics changing. Changes in ambient temperature affect not only the static metrological characteristics but also the dynamic ones. The values supplied by the manufacturer must therefore refer to a specific range of temperature variation. Temperature can also have significant effects during step changes, not only under steady conditions.
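As a sketch of how such manufacturer drift data might be applied in practice, the snippet below corrects a raw reading for linear zero and sensitivity drift about a reference temperature. The function name, reference temperature, and drift coefficients are hypothetical values chosen for illustration; real data sheets may specify nonlinear curves instead of linear coefficients.

```python
def corrected_reading(raw_output, temperature, t_ref=23.0,
                      zero_drift_per_degC=0.002,
                      sens_drift_per_degC=-0.0005):
    """Correct a transducer reading for zero and sensitivity drift.

    Assumes linear drifts about the calibration temperature t_ref:
    the zero drift shifts the whole output by a fixed amount per degree,
    while the sensitivity drift scales the output proportionally.
    All coefficients here are illustrative, not from a real data sheet.
    """
    dt = temperature - t_ref
    zero_shift = zero_drift_per_degC * dt            # additive offset
    sensitivity_factor = 1.0 + sens_drift_per_degC * dt  # multiplicative gain
    return (raw_output - zero_shift) / sensitivity_factor
```

At the reference temperature the correction leaves the reading unchanged; away from it, the offset is removed first and the gain error is then divided out.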

Acceleration effects

Errors due to acceleration can arise either directly on the sensitive element or on the connection or support elements, and can be large enough to cause deformations that render the measurements meaningless. In general, transducers show greater sensitivity to acceleration along some axes than others; it is therefore necessary to indicate the chosen triad of reference axes and to express the acceleration error with respect to them.

The acceleration error is defined as the maximum difference between the output of the transducer in the absence and in the presence of a specified constant acceleration applied along a specific axis. Finally, it should be noted that some sensors are sensitive to the acceleration due to gravity, so the position of the transducer with respect to the gravitational field is an essential constraint.
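The definition above translates directly into a computation: compare the outputs recorded with and without the specified acceleration, point by point over the tested input values, and take the largest difference. The function name and sample data are illustrative assumptions, not from any real data sheet.

```python
def acceleration_error(outputs_at_rest, outputs_under_accel):
    """Acceleration error per the definition in the text: the maximum
    difference between the transducer output without acceleration and
    with a specified constant acceleration applied along a given axis,
    taken over the tested input values (paired element-by-element).
    """
    return max(abs(a - r)
               for r, a in zip(outputs_at_rest, outputs_under_accel))
```

The same pattern (maximum output deviation between a reference condition and a perturbed condition) also applies to the vibration and pressure errors defined in the following sections.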

Effects due to vibrations

Varying the frequency of vibrations applied along a specific reference axis can produce significant effects in the output signal of the transducer (for example, due to resonance phenomena).

To express the effect of vibrations, it is necessary to define the maximum variation of the output, for each value of the physical input quantity, when a vibration of given amplitude is applied along an axis of the transducer over a given frequency range.

Effects of ambient pressure

Sometimes it is necessary to verify that the transducer will operate under pressures significantly different from the pressure at which the calibration was performed, which is generally ambient pressure. Pressures appreciably different from those of the calibration tests can alter the internal geometry of the transducer and thus change the metrological characteristics provided by the manufacturer.

Such a deviation from the calibration conditions is more insidious than outright damage to the transducer, which the experimenter can easily detect. The error due to pressure is defined as the maximum variation of the transducer output, for each value of the input quantity within the measurement range, when the operating pressure is varied over specified intervals.

Effects of transducer commissioning

If the transducer is not installed with care, it may be damaged (e.g. by deformation of its structure) and its operating conditions may change. The manufacturer cannot account for this source of error, so the user must ensure that the instrument is installed properly and correctly.

Measurement methods

When precision measurements are made to determine the values of a physical quantity, different measurement methods are used. A measurement method is defined as the logical sequence of efficient operations used to measure the physical quantities under observation.

The better the measurement method and the better the instruments and their technology, the closer the measured value comes to the true state of the measured physical quantity. In principle, therefore, a measurement represents physical reality with a certain approximation, or error, which can be made very small but never zero.

The choice of measurement method depends on the accuracy required and the amount of error that can be tolerated. Regardless of the method used, the primary objective is to minimize the uncertainty associated with the measurement. The most common methods of measurement are as follows:

Direct method

In this method, the quantity being measured is compared directly with a primary or secondary standard. Scales, calipers, micrometers, goniometers, etc. are used in the direct method, which is widely used in production. In the direct method there is a very small difference between the actual value and the measured value of the quantity, due to the limitations of the person making the measurement.

The advantage of direct measurement is that it is more difficult to make gross errors, since the instrument used for comparison is generally simple and therefore not subject to hidden errors.

Indirect method

In this method, the value of a quantity is obtained by measuring other quantities that are functionally related to it. The related quantities are measured directly, and the required value is then determined through a mathematical relationship.

Most measurements are made indirectly, almost always for cost reasons. For example, the density of a given substance could be measured directly using a device called a densimeter, but it is usually more convenient to measure the mass and volume of the substance directly and then compute the density as their ratio.

Indirect measurements, on the other hand, are more subject to approximations, since error propagation acts through the formula that represents the physical law. Special attention must therefore be paid to the approximations made in the underlying direct measurements.
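A minimal sketch of the error propagation mentioned above, using the density example (ρ = m/V). It assumes uncorrelated uncertainties and first-order (quadrature) propagation, under which the relative uncertainties of a pure quotient combine as the root sum of squares; the function name and the units implied by the arguments are illustrative.

```python
import math

def density_with_uncertainty(mass, u_mass, volume, u_volume):
    """Indirect measurement of density rho = m / V with first-order
    propagation of the uncertainties of the two direct measurements.

    For a quotient of uncorrelated quantities, the relative
    uncertainties add in quadrature:
        u_rho / rho = sqrt((u_m / m)**2 + (u_V / V)**2)
    """
    rho = mass / volume
    u_rho = rho * math.sqrt((u_mass / mass) ** 2
                            + (u_volume / volume) ** 2)
    return rho, u_rho
```

Note how a 1 % relative uncertainty on the volume dominates a 0.3 % uncertainty on the mass: the propagated uncertainty is driven by the worst direct measurement, which is exactly why the approximations of the direct measurements deserve special attention.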

Fundamental or absolute method

In this case, the measurement is based on measurements of the base quantities used to define the quantity of interest; the quantity is determined directly from its definition.

Comparative method

In this method, as the name suggests, the quantity to be measured is compared with the known value of the same quantity or another quantity that is practically related to it. The quantity is compared to the reference and only the deviations from the reference are recorded after the comparison. The most common examples are comparators, dial indicators, etc.

Transposition method

In this method, the measurement is made by direct comparison: the quantity to be measured, V, is first balanced against a known value, X, of the same quantity; the two are then transposed (swapped in position) and V is balanced against another known value, Y. The value of V is then given by

V = √(X · Y)
An example of this method is the determination of mass by balancing methods and known weights.
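A sketch of why the transposition result is useful: on a balance with unequal arms, the arm-length ratio biases each single comparison, but it cancels in the product X·Y, so √(X·Y) recovers the true value. The names and numbers below are illustrative.

```python
import math

def transposition_value(x, y):
    """Transposition-method result: if the unknown V balances against X
    in one configuration and against Y after the positions are swapped,
    then V = sqrt(X * Y). On an unequal-arm balance, X = V * (l1/l2) and
    Y = V * (l2/l1), so the arm ratio l1/l2 cancels in the product.
    """
    return math.sqrt(x * y)

# Simulated unequal-arm balance: arms differ by 2 %, true mass is 2.0.
arm_ratio = 1.02
x = 2.0 / arm_ratio   # apparent value in the first comparison
y = 2.0 * arm_ratio   # apparent value after transposition
```

Each individual comparison is off by 2 %, yet the geometric mean of the two restores the true value.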

Coincidence Method

This is a “differential” method of measurement in which a very small difference between the quantity to be measured and the reference is determined by carefully observing the coincidence of certain lines and signals. The vernier scale of a caliper, where a graduation line on the vernier coincides with one on the main scale, is a typical example, as is the analogous scale of a micrometer.

Deflection method

In this method, the value of the quantity to be measured is indicated directly by the deflection of a pointer on a calibrated scale. Pressure measurement is an example of this method.

Complementary method

The value of the quantity to be measured is combined with a known value of the same quantity. The combination is adjusted so that the sum of the two values is equal to the predetermined reference value. An example of this method is determining the volume of a solid by displacing a liquid.
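The displacement example reduces to simple arithmetic: the unknown volume of the solid plus the known volume of the liquid make up the total level read after immersion, so the unknown is recovered as a difference. The function name and readings below are illustrative.

```python
def solid_volume_by_displacement(level_before_ml, level_after_ml):
    """Complementary method sketch: the solid's (unknown) volume combined
    with the liquid's (known) volume equals the total reference level
    read after immersion, so the unknown is the difference of levels.
    """
    return level_after_ml - level_before_ml
```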

Zero measurement method

In this method, the difference between the value of the quantity to be measured and the known value of the same quantity to be compared with is set to zero.
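One way to picture the zero method is as a search driven only by the sign of the detector deflection: the known quantity is adjusted until the difference reads zero, so the detector never needs an accurate scale of its own. The bisection below is an idealized sketch; the function name, search range, and iteration count are assumptions, not a real instrument's behavior.

```python
def zero_method(unknown, lo=0.0, hi=10.0, iters=60):
    """Idealized zero (null) measurement method as a bisection search.

    The detector is assumed to report only the *sign* of the difference
    between the known and unknown quantities, never its magnitude; the
    known value is repeatedly adjusted until that difference is nulled.
    This sign-only property is what makes null methods insensitive to
    detector scale errors.
    """
    for _ in range(iters):
        known = (lo + hi) / 2.0
        if known < unknown:   # detector deflects: known is too small
            lo = known
        else:                 # detector deflects the other way
            hi = known
    return (lo + hi) / 2.0
```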

Substitution method

This is a direct comparison method. In this method, the value of the quantity to be measured is replaced by a known value of the same quantity, chosen so that the effects produced by these two values in the indicator are the same. The Borda method of mass determination is an example of this method.

Contact method

In this method, the surface to be measured is touched by the sensor or measuring tip of the instrument. Care must be taken to maintain a constant contact pressure and to avoid errors due to excessive contact pressure. Examples of this method include measurements using a micrometer, caliper gauge, and dial indicator.

Non-Contact method

As the name implies, there is no direct contact with the surface being measured. Examples of this method include the use of optical instruments, a toolmaker’s microscope, and a profile projector.

Composite method

The actual contour of a part to be inspected is compared to its maximum and minimum tolerance limits. This method can be used to check the cumulative errors of the interconnected elements of the component, which are controlled by a combined tolerance. This method is very reliable for ensuring interchangeability and is usually performed using composite GO gauges. The use of a GO plug gauge to check the thread of a nut is an example of this method.
