Guide to Measurement and Instrumentation -- Calibration of Measuring Sensors and Instruments



We just examined the various systematic and random measurement error sources in the last section. As far as systematic errors are concerned, we observed that recalibration at a suitable frequency was an important weapon in the quest to minimize errors due to drift in instrument characteristics. The use of proper and rigorous calibration procedures is essential in order to ensure that recalibration achieves its intended purpose; to reflect the importance of getting these procedures right, this whole section is dedicated to explaining the various facets of calibration.

We start in Section 2 by formally defining what calibration means, explaining how it’s performed and considering how to calculate the frequency with which the calibration exercise should be repeated. We then go on to look at the calibration environment in Section 3, where we learn that proper control of the environment in which instruments are calibrated is an essential component in good calibration procedures. Section 4 then continues with a review of how the calibration of working instruments against reference instruments is linked by the calibration chain to national and international reference standards relating to the quantity that the instrument being calibrated is designed to measure. Finally, Section 5 emphasizes the importance of maintaining records of instrument calibrations and suggests appropriate formats for such records.

Principles of Calibration

Calibration consists of comparing the output of the instrument or sensor under test against the output of an instrument of known accuracy when the same input (the measured quantity) is applied to both instruments. This procedure is carried out for a range of inputs covering the whole measurement range of the instrument or sensor. Calibration ensures that the measuring accuracy of all instruments and sensors used in a measurement system is known over the whole measurement range, provided that the calibrated instruments and sensors are used in environmental conditions that are the same as those under which they were calibrated. For use of instruments and sensors under different environmental conditions, appropriate correction has to be made for the ensuing modifying inputs, as described in Section 3. Whether applied to instruments or sensors, calibration procedures are identical, and hence only the term instrument will be used for the rest of this section, with the understanding that whatever is said for instruments applies equally well to single measurement sensors.
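To make the comparison step concrete, the procedure can be sketched in code: the instrument under test and the reference instrument are read at several points covering the whole measurement range, and the error at each point is checked against the permitted inaccuracy. The function names, readings, and the 1% tolerance below are illustrative assumptions, not anything specified in this text.

```python
def calibration_errors(test_readings, reference_readings):
    """Error of the instrument under test at each calibration point,
    taking the reference instrument's output as the true value."""
    if len(test_readings) != len(reference_readings):
        raise ValueError("readings must be paired point for point")
    return [t - r for t, r in zip(test_readings, reference_readings)]

def within_tolerance(errors, full_scale, tolerance_fraction):
    """True if every error lies inside the permitted fraction of
    full-scale output over the whole measurement range."""
    limit = tolerance_fraction * full_scale
    return all(abs(e) <= limit for e in errors)

# Five inputs covering the whole 0-100 unit measurement range
reference = [0.0, 25.0, 50.0, 75.0, 100.0]
under_test = [0.1, 25.3, 50.4, 75.6, 100.8]

errors = calibration_errors(under_test, reference)
print(within_tolerance(errors, full_scale=100.0, tolerance_fraction=0.01))  # True
```

Note that the same input must be applied to both instruments at each point; the reference readings here stand in for that common input.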

Instruments used as a standard in calibration procedures are usually chosen to be of greater inherent accuracy than the process instruments that they are used to calibrate. Because such instruments are only used for calibration purposes, greater accuracy can often be achieved by specifying a type of instrument that would be unsuitable for normal process measurements. For instance, ruggedness is not a requirement, and freedom from this constraint opens up a much wider range of possible instruments. In practice, high-accuracy, null-type instruments are used very commonly for calibration duties, as the need for a human operator is not a problem in these circumstances.

Instrument calibration has to be repeated at prescribed intervals because the characteristics of any instrument change over a period. Changes in instrument characteristics are brought about by such factors as mechanical wear, and the effects of dirt, dust, fumes, chemicals, and temperature change in the operating environment. To a great extent, the magnitude of the drift in characteristics depends on the amount of use an instrument receives and hence on the amount of wear and the length of time that it’s subjected to the operating environment. However, some drift also occurs even in storage as a result of aging effects in components within the instrument.

Determination of the frequency at which instruments should be calibrated is dependent on several factors that require specialist knowledge. If an instrument is required to measure some quantity and an inaccuracy of ±2% is acceptable, then a certain amount of performance degradation can be allowed if its inaccuracy immediately after recalibration is ±1%. What is important is that the pattern of performance degradation be quantified, such that the instrument can be recalibrated before its accuracy has reduced to the limit defined by the application.

Susceptibility to the various factors that can cause changes in instrument characteristics varies according to the type of instrument involved. Possession of an in-depth knowledge of the mechanical construction and other features involved in the instrument is necessary in order to be able to quantify the effect of these quantities on the accuracy and other characteristics of an instrument. The type of instrument, its frequency of use, and the prevailing environmental conditions all strongly influence the calibration frequency necessary, and because so many factors are involved, it’s difficult or even impossible to determine the required frequency of instrument recalibration from theoretical considerations. Instead, practical experimentation has to be applied to determine the rate of such changes. Once the maximum permissible measurement error has been defined, knowledge of the rate at which the characteristics of an instrument change allows a time interval to be calculated that represents the moment in time when an instrument will have reached the bounds of its acceptable performance level.

The instrument must be recalibrated either at this time or earlier. This measurement error level that an instrument reaches just before recalibration is the error bound that must be quoted in the documented specifications for the instrument.
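The interval calculation described above can be sketched as follows. The linear drift model and all of the numbers are assumptions made for the example; in practice the drift rate comes from practical experimentation and calibration history records, as the text explains.

```python
def recalibration_interval(error_after_cal, max_permissible_error,
                           drift_rate_per_month):
    """Months until an instrument drifting linearly at
    drift_rate_per_month (% of reading per month) grows from its
    post-calibration error to the maximum permissible error."""
    if drift_rate_per_month <= 0:
        raise ValueError("drift rate must be positive")
    headroom = max_permissible_error - error_after_cal
    if headroom <= 0:
        raise ValueError("instrument is already outside its error bound")
    return headroom / drift_rate_per_month

# +/-1% just after calibration, +/-2% acceptable, 0.1% drift per month
print(recalibration_interval(1.0, 2.0, 0.1))  # 10.0 (months)
```

The instrument would be recalibrated at or before the 10-month mark, and ±2% is the error bound quoted in its documented specifications.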

A proper course of action must be defined that describes the procedures to be followed when an instrument is found to be out of calibration, that is, when its output is different to that of the calibration instrument when the same input is applied. The required action depends very much on the nature of the discrepancy and the type of instrument involved. In many cases, deviations in the form of a simple output bias can be corrected by a small adjustment to the instrument (following which the adjustment screws must be sealed to prevent tampering). In other cases, the output scale of the instrument may have to be redrawn or scaling factors altered where the instrument output is part of some automatic control or inspection system. In extreme cases, where the calibration procedure shows signs of instrument damage, it may be necessary to send the instrument for repair or even scrap it.
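For the simplest case mentioned above, a constant output bias, the required adjustment can be estimated from the calibration data itself. This is a hedged sketch with invented function names and data; it applies only when the deviation really is a simple offset rather than a gain or linearity error.

```python
def bias_corrector(test_readings, reference_readings):
    """Estimate a constant output bias as the mean deviation from the
    reference instrument and return a function that corrects future
    readings by that offset."""
    deviations = [t - r for t, r in zip(test_readings, reference_readings)]
    offset = sum(deviations) / len(deviations)
    return lambda reading: reading - offset

# Instrument reads consistently 0.5 units high against the reference
correct = bias_corrector([10.5, 20.5, 30.5], [10.0, 20.0, 30.0])
print(correct(25.5))  # 25.0
```

In an instrument with adjustment screws, the equivalent physical adjustment would be made and the screws then sealed, as described above.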

Whatever system and frequency of calibration are established, it’s important to review this from time to time to ensure that the system remains effective and efficient. It may happen that a less expensive (but equally effective) method of calibration becomes available with the passage of time, and such an alternative system must clearly be adopted in the interests of cost efficiency. However, the main item under scrutiny in this review is normally whether the calibration interval is still appropriate. Records of the calibration history of the instrument will be the primary basis on which this review is made. It may happen that an instrument starts to go out of calibration more quickly after a period of time, either because of aging factors within the instrument or because of changes in the operating environment. The conditions or mode of usage of the instrument may also be subject to change. As the environmental and usage conditions of an instrument may change beneficially as well as adversely, there is the possibility that the recommended calibration interval may decrease as well as increase.

Control of Calibration Environment

Any instrument used as a standard in calibration procedures must be kept solely for calibration duties and must never be used for other purposes. Most particularly, it must not be regarded as a spare instrument that can be used for process measurements if the instrument normally used for that purpose breaks down. Proper provision for process instrument failures must be made by keeping a spare set of process instruments. Standard calibration instruments must be totally separate.

To ensure that these conditions are met, the calibration function must be managed and executed in a professional manner. This will normally mean setting aside a particular place within the instrumentation department of a company where all calibration operations take place and where all instruments used for calibration are kept. As far as possible this should take the form of a separate room rather than a sectioned-off area in a room used for other purposes as well. This will enable better environmental control to be applied in the calibration area and will also offer better protection against unauthorized handling or use of calibration instruments. The level of environmental control required during calibration should be considered carefully with due regard to what level of accuracy is required in the calibration procedure, but should not be over-specified, as this will lead to unnecessary expense. Full air conditioning is not normally required for calibration at this level, as it’s very expensive, but sensible precautions should be taken to guard the area from extremes of heat or cold; also, good standards of cleanliness should be maintained.

While it’s desirable that all calibration functions are performed in this carefully controlled environment, it’s not always practical to achieve this. Sometimes, it’s not convenient or possible to remove instruments from a process plant, and in these cases, it’s standard practice to calibrate them in situ. In these circumstances, appropriate corrections must be made for the deviation in the calibration environmental conditions away from those specified. This practice does not obviate the need to protect calibration instruments and maintain them in constant conditions in a calibration laboratory at all times other than when they are involved in such calibration duties on plant.

As far as management of calibration procedures is concerned, it’s important that the performance of all calibration operations is assigned as the clear responsibility of just one person. That person should have total control over the calibration function and be able to limit access to the calibration laboratory to designated, approved personnel only. Only by giving this appointed person total control over the calibration function can the function be expected to operate efficiently and effectively. Lack of such definite management can only lead to unintentional neglect of the calibration system, resulting in the use of equipment in an out-of-date state of calibration and subsequent loss of traceability to reference standards.

Professional management is essential so that the customer can be assured that an efficient calibration system is in operation and that the accuracy of measurements is guaranteed.

Calibration procedures that relate in any way to measurements used for quality control functions are controlled by the international standard ISO 9000 (this subsumes the old British quality standard BS 5750). One of the clauses in ISO 9000 requires that all persons using calibration equipment be adequately trained. The manager in charge of the calibration function is clearly responsible for ensuring that this condition is met. Training must be adequate and targeted at the particular needs of the calibration systems involved. Staff must understand not only what they need to know but also why they need that information. Successful completion of training courses should be marked by the award of qualification certificates.

These attest to the proficiency of personnel involved in calibration duties and are a convenient way of demonstrating that the ISO 9000 training requirement has been satisfied.

Calibration Chain and Traceability

The calibration facilities provided within the instrumentation department of a company provide the first link in the calibration chain. Instruments used for calibration at this level are known as working standards. Provided that working standard instruments are kept by the instrumentation department of a company solely for calibration duties, and for no other purpose, it can be assumed that they will maintain their accuracy over a reasonable period of time because use-related deterioration in accuracy is largely eliminated. However, over the longer term, the characteristics of even such standard instruments will drift, mainly due to aging effects in components within them. Therefore, over this longer term, a program must be instituted for calibrating working standard instruments at appropriate intervals of time against instruments of yet higher accuracy. The instrument used for calibrating working standard instruments is known as a secondary reference standard. This must obviously be a very well-engineered instrument that gives high accuracy and is stabilized against drift in its performance with time. This implies that it will be an expensive instrument to buy. It also requires that the environmental conditions in which it's used be controlled carefully in respect of ambient temperature, humidity, and so on.

When the working standard instrument has been calibrated by an authorized standards laboratory, a calibration certificate will be issued. This will contain at least the following information:

  • identification of the equipment calibrated
  • calibration results obtained
  • measurement uncertainty
  • any use limitations on the equipment calibrated
  • date of calibration
  • authority under which the certificate is issued
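The certificate contents listed above map naturally onto a simple record structure. The class name, field names, and sample values below are hypothetical, chosen only to mirror the six items in the list:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class CalibrationCertificate:
    """The minimum certificate contents listed above."""
    equipment_id: str             # identification of the equipment calibrated
    results: dict                 # calibration point -> observed error
    measurement_uncertainty: str
    use_limitations: str
    calibration_date: date
    issuing_authority: str

cert = CalibrationCertificate(
    equipment_id="micrometer, company serial 042",
    results={"25 mm": "+0.002 mm"},
    measurement_uncertainty="+/-0.004 mm (k = 2)",
    use_limitations="use at 20 +/- 1 deg C only",
    calibration_date=date(2014, 3, 1),
    issuing_authority="accredited standards laboratory",
)
print(cert.calibration_date)  # 2014-03-01
```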

The establishment of a company standards laboratory to provide a calibration facility of the required quality is economically viable only in the case of very large companies where large numbers of instruments need to be calibrated across several factories. In the case of small to medium size companies, the cost of buying and maintaining such equipment is not justified.

Instead, they would normally use the calibration service provided by various companies that specialize in offering a standards laboratory. What these specialist calibration companies do effectively is to share out the high cost of providing this highly accurate but infrequently used calibration service over a large number of companies. Such standards laboratories are closely monitored by national standards organizations.

In the United States, the appropriate national standards organization for validating standards laboratories is the National Institute of Standards and Technology (NIST, formerly the National Bureau of Standards), whereas in the United Kingdom it's the National Physical Laboratory. An international standard now exists (ISO/IEC 17025, 2005), which sets down criteria that must be satisfied in order for a standards laboratory to be validated. These criteria cover the management requirements necessary to ensure proper operation and effectiveness of a quality management system within the calibration or testing laboratory and also some technical requirements that relate to the competence of staff, specification, and maintenance of calibration/test equipment and practical calibration procedures used.

National standards organizations usually monitor both instrument calibration and mechanical testing laboratories. Although each different country has its own structure for the maintenance of standards, each of these different frameworks tends to be equivalent in its effect in ensuring that the requirements of ISO/IEC 17025 are met. This provides confidence that the goods and services that cross national boundaries from one country to another have been measured by properly calibrated instruments.

The national standards organizations lay down strict conditions that a standards laboratory has to meet before it's approved. These conditions control laboratory management, environment, equipment, and documentation. The person appointed as head of the laboratory must be suitably qualified, and independence of operation of the laboratory must be guaranteed. The management structure must be such that any pressure to rush or skip calibration procedures for production reasons can be resisted. As far as the laboratory environment is concerned, proper temperature and humidity control must be provided, and high standards of cleanliness and housekeeping must be maintained. All equipment used for calibration purposes must be maintained to reference standards and supported by calibration certificates that establish this traceability. Finally, full documentation must be maintained. This should describe all calibration procedures, maintain an index system for recalibration of equipment, and include a full inventory of apparatus and traceability schedules. Having met these conditions, a standards laboratory becomes an accredited laboratory for providing calibration services and issuing calibration certificates. This accreditation is reviewed at intervals of approximately 12 months to ensure that the laboratory continues to satisfy the conditions for approval laid down.

Primary reference standards describe the highest level of accuracy achievable in the measurement of any particular physical quantity. All items of equipment used in standards laboratories as secondary reference standards have to be calibrated themselves against primary reference standards at appropriate intervals of time. This procedure is acknowledged by the issue of a calibration certificate in the standard way. National standards organizations maintain suitable facilities for this calibration. In the U.S., this is the National Institute of Standards and Technology (NIST), and in the UK it's the National Physical Laboratory.

Similar national standards organizations exist in many other countries. In certain cases, such primary reference standards can be located outside national standards organizations. For instance, the primary reference standard for dimension measurement was for many years defined by the wavelength of the orange-red line of krypton-86 light (the meter is now defined in terms of the speed of light), and such an optical standard can be realized in any laboratory equipped with a suitable interferometer. In certain cases (e.g., the measurement of viscosity), such primary reference standards are not available and reference standards for calibration are achieved by collaboration between several national standards organizations who perform measurements on identical samples under controlled conditions [ISO 5725 (1994) and ISO 5725-2/Cor1 (2002)].

What has emerged from the foregoing discussion is that calibration has a chain-like structure in which every instrument in the chain is calibrated against a more accurate instrument immediately above it in the chain. All of the elements in the calibration chain must be known so that the calibration of process instruments at the bottom of the chain is traceable to the fundamental measurement standards. This knowledge of the full chain of instruments involved in the calibration procedure is known as traceability and is specified as a mandatory requirement in satisfying the ISO 9000 standard. Documentation must exist that shows that process instruments are calibrated by standard instruments linked by a chain of increasing accuracy back to national reference standards. There must be clear evidence to show that there is no break in this chain.
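The chain-and-traceability requirement can be expressed as a simple check: every link must be present, and each instrument must be calibrated against a strictly more accurate one above it. The function name below is an invention for the example; the data is the micrometer chain described in this section.

```python
def traceable(chain):
    """True if a calibration chain, ordered from the process instrument
    up to the national primary standard, is unbroken: at least two
    links, with accuracy strictly increasing toward the top.
    Uncertainty is a fraction, e.g. 1e-4 means 1 part in 10^4."""
    if len(chain) < 2:
        return False
    return all(upper["uncertainty"] < lower["uncertainty"]
               for lower, upper in zip(chain, chain[1:]))

micrometer_chain = [
    {"name": "shop-floor micrometer",         "uncertainty": 1e-4},
    {"name": "standard gauge blocks",         "uncertainty": 1e-5},
    {"name": "reference-grade gauge blocks",  "uncertainty": 1e-6},
    {"name": "spectral lamp",                 "uncertainty": 1e-7},
    {"name": "iodine-stabilized He-Ne laser", "uncertainty": 1e-9},
]
print(traceable(micrometer_chain))  # True
```

Documentation demonstrating traceability would list exactly such a chain, together with the calibration certificate for each link.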

To illustrate a typical calibration chain, consider the calibration of micrometers.

A typical shop-floor micrometer has an uncertainty (inaccuracy) of less than 1 in 10^4. Such micrometers would normally be calibrated in the instrumentation department or standards laboratory of a company against laboratory standard gauge blocks with a typical uncertainty of less than 1 in 10^5.

A specialist calibration service company would provide facilities for calibrating these laboratory standard gauge blocks against reference-grade gauge blocks with a typical uncertainty of less than 1 in 10^6. More accurate calibration equipment still is provided by national standards organizations. NIST and the National Physical Laboratory each maintain two sets of standards for this type of calibration: a working standard and a primary standard. Spectral lamps are used to provide a working reference standard with an uncertainty of less than 1 in 10^7.

The primary standard is provided by an iodine-stabilized helium-neon laser that has a specified uncertainty of less than 1 in 10^9. All of the links in this calibration chain must be shown in any documentation that describes the use of micrometers in making quality-related measurements.


The complete calibration chain for micrometers is summarized below:

  • National standards organization (primary reference standard): iodine-stabilized helium-neon laser, inaccuracy 1 in 10^9; spectral lamp (working standard), inaccuracy 1 in 10^7
  • Standards laboratory (secondary reference standards): reference-grade gauge blocks, inaccuracy 1 in 10^6
  • Company instrument laboratory (working standards): standard gauge blocks, inaccuracy 1 in 10^5
  • Process instruments: shop-floor micrometer, inaccuracy 1 in 10^4


Calibration Records

An essential element in the maintenance of measurement systems and the operation of calibration procedures is the provision of full documentation. This must give a full description of the measurement requirements throughout the workplace, instruments used, and calibration system and procedures operated. Individual calibration records for each instrument must be included within this. This documentation is a necessary part of the quality manual, although it may exist physically as a separate volume if this is more convenient. An overriding constraint on the style in which the documentation is presented is that it should be simple and easy to read. This is often facilitated greatly by a copious use of appendices.

The starting point in the documentation must be a statement of what measurement limits have been defined for each measurement system documented. Such limits are established by balancing the costs of improved accuracy against customer requirements, and also with regard to what overall quality level has been specified in the quality manual. The technical procedures required for this, which involve assessing the type and magnitude of relevant measurement errors, are described in Section 3. It's customary to express the final measurement limit calculated as ±2 standard deviations, that is, within 95% confidence limits (for an explanation of these terms, see Section 3).
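The ±2 standard deviation limit can be computed directly from repeated readings. This sketch uses Python's standard statistics module; the sample data is invented for illustration.

```python
import statistics

def measurement_limits(readings):
    """Limits at +/-2 sample standard deviations about the mean,
    i.e. approximately 95% confidence limits for normally
    distributed measurement errors."""
    mean = statistics.mean(readings)
    half_width = 2 * statistics.stdev(readings)
    return mean - half_width, mean + half_width

# Repeated measurements of a nominally 10-unit quantity
readings = [9.98, 10.02, 10.01, 9.99, 10.00]
low, high = measurement_limits(readings)
print(round(low, 3), round(high, 3))
```

The resulting interval is the measurement limit that would be quoted in the documentation for this measurement system.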

Instruments specified for each measurement situation must be listed next. This list must be accompanied by full instructions about the proper use of the instruments concerned. These instructions will include details about any environmental control or other special precautions that must be taken to ensure that the instruments provide measurements of sufficient accuracy to meet the measurement limits defined. The proper training courses appropriate to plant personnel who will use the instruments must be specified.

Having disposed of the question about what instruments are used, documentation must go on to cover the subject of calibration. Full calibration is not applied to every measuring instrument used in a workplace because ISO 9000 acknowledges that formal calibration procedures are not necessary for some equipment where it’s uneconomic or technically unnecessary because the accuracy of the measurement involved has an insignificant effect on the overall quality target for a product. However, any equipment excluded from calibration procedures in this manner must be specified as such in the documentation. Identification of equipment that is in this category is a matter of informed judgment.

For instruments that are the subject of formal calibration, documentation must specify what standard instruments are to be used for the purpose and define a formal procedure of calibration.

This procedure must include instructions for the storage and handling of standard calibration instruments and specify the required environmental conditions under which calibration is to be performed. Where a calibration procedure for a particular instrument uses published standard practices, it’s sufficient to include reference to that standard procedure in the documentation rather than to reproduce the whole procedure. Whatever calibration system is established, a formal review procedure must be defined in the documentation that ensures its continued effectiveness at regular intervals. The results of each review must also be documented in a formal way.

A standard format for the recording of calibration results should be defined in the documentation.

A separate record must be kept for every instrument present in the workplace, irrespective of whether the instrument is normally in use or is just kept as a spare. A form should be used that includes details of the instrument's description, required calibration frequency, date of each calibration, and calibration results on each occasion. Where appropriate, documentation must also define the manner in which calibration results are to be recorded on the instruments themselves.
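A minimal per-instrument record of the kind just described can be sketched as follows; the class name, field names, and the idea of deriving a due date from the last calibration are illustrative assumptions rather than a prescribed format.

```python
from datetime import date, timedelta

class InstrumentRecord:
    """Per-instrument record: description, required calibration
    frequency, and the date and result of each calibration."""
    def __init__(self, description, calibration_interval_days):
        self.description = description
        self.interval = timedelta(days=calibration_interval_days)
        self.history = []  # list of (date, result) pairs, oldest first

    def record_calibration(self, when, result):
        self.history.append((when, result))

    def next_due(self):
        """Date of the next calibration, or None if the instrument has
        never been calibrated (in which case it is due immediately)."""
        if not self.history:
            return None
        return self.history[-1][0] + self.interval

record = InstrumentRecord("pressure gauge, serial 007", 180)
record.record_calibration(date(2014, 1, 15), "within limits")
print(record.next_due())  # 2014-07-14
```

A record of this shape is kept for every instrument, including spares, so that overdue calibrations can be detected from the records alone.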

Documentation must specify procedures that are to be followed if an instrument is found to be outside the calibration limits. This may involve adjustment, redrawing its scale, or withdrawing an instrument, depending on the nature of the discrepancy and the type of instrument involved.

Instruments withdrawn will either be repaired or be scrapped. In the case of withdrawn instruments, a formal procedure for marking them as such must be defined to prevent them being put back into use accidentally.

Two other items must also be covered by the calibration document. The traceability of the calibration system back to national reference standards must be defined and supported by calibration certificates. Training procedures must also be documented, specifying the particular training courses to be attended by various personnel and what, if any, refresher courses are required.

All aspects of these documented calibration procedures will be given consideration as part of the periodic audit of the quality control system that calibration procedures are instigated to support. While the basic responsibility for choosing a suitable interval between calibration checks rests with the engineers responsible for the instruments concerned, the quality system auditor will need to see the results of tests that show that the calibration interval has been chosen correctly and that instruments are not going outside allowable measurement uncertainty limits between calibrations. Particularly important in such audits will be the existence of procedures instigated in response to instruments found to be out of calibration. Evidence that such procedures are effective in avoiding degradation in the quality assurance function will also be required.


Proper instrument calibration is an essential component in good measurement practice, and this section has been dedicated to explaining the various procedures that must be followed in order to perform calibration tasks efficiently and effectively. We have learned how working instruments are calibrated against a more accurate "reference" instrument that is maintained carefully and kept just for performing calibration tasks. We considered the importance of carefully designing and controlling the calibration environment in which calibration tasks are performed and observed that proper training of all personnel involved in carrying out calibration tasks had similar importance. We also learned that "first stage" calibration of a working instrument against a reference standard is part of a chain of calibrations that provides traceability of the working instrument calibration to national and international reference standards for the quantity being measured, with the latter representing the most accurate standards of measurement accuracy achievable. Finally, we looked at the importance of maintaining calibration records and suggested appropriate formats for these.



A typical format for an instrument calibration record is shown below:

  Type of instrument:
  Manufacturer's part number:
  Manufacturer's serial number:
  Company serial number:
  Date introduced:
  Measurement limit:
  Calibration frequency:
  Instructions for use:

  Calibration date | Calibrated by | Calibration results

  Signature of person responsible for calibration:



Questions

1. Explain the meaning of instrument calibration.

2. Explain why calibration is necessary.

3. Explain how the necessary calibration frequency is determined for a measuring instrument.

4. Explain the following terms: (a) calibration chain, (b) traceability, (c) standards laboratory.

5. Explain how the calibration procedure should be managed, particularly with regard to control of the calibration environment and choice of reference instruments.

6. Will a calibrated measuring instrument always be accurate? If not, explain why not and explain what procedures can be followed to ensure that accurate measurements are obtained when using calibrated instruments.

7. Why is there no fundamental reference standard for temperature calibration? How is this difficulty overcome when temperature sensors are calibrated?

8. Discuss the necessary procedures in calibrating temperature sensors.

9. Explain the construction and working characteristics of the following three kinds of instruments used as a reference standard in pressure sensor calibration: dead-weight gauge, U-tube manometer, and barometer.

10. Discuss the main procedures involved in calibrating pressure sensors.

11. Discuss the special equipment needed and procedures involved in calibrating instruments that measure the volume flow rate of liquids.

12. What kind of equipment is needed for calibrating instruments that measure the volume flow rate of gases? How is this equipment used?

13. Discuss the general procedures involved in calibrating level sensors.

14. What is the main item of equipment used in calibrating mass-measuring instruments? Sketch the following instruments and discuss briefly their mode of operation: beam balance, weigh beam, and pendulum scale.

15. Discuss how the following are calibrated: translational displacement transducers and linear-motion accelerometers.

16. Explain the general procedures involved in calibrating (a) vibration sensors and (b) shock sensors.

17. Discuss briefly the calibration procedures for the following: rotational displacement sensors, rotational velocity sensors, and rotational acceleration sensors.

18. How are dimension-measuring instruments normally calibrated?

19. Discuss briefly the two main ways of calibrating angle-measuring instruments.

20. Discuss the equipment used and procedures involved in calibrating viscosity-measuring instruments.

21. Discuss briefly the calibration of moisture and humidity measurements.


Updated: Sunday, 2014-03-30 5:31 PST