Calibration


Definition and Importance of Calibration
Calibration is the comparison of measurement values delivered by a device under test with those of a calibration standard of known accuracy; such a standard may be another measurement device of known accuracy, a device generating the quantity to be measured (such as a voltage or a sound tone), or a physical artifact such as a meter ruler.
– The comparison can result in no significant error being noted on the device under test, a significant error being noted but no adjustment made, or an adjustment being made to correct the error to an acceptable level.
– The calibration standard is normally traceable to a national or international standard held by a metrology body.
– The formal definition of calibration by the International Bureau of Weights and Measures (BIPM) is the establishment of a relation between the quantity values provided by measurement standards and the corresponding indications of the device, each with its associated measurement uncertainty.
– The calibration process is purely a comparison and does not include any subsequent adjustment.
– National Metrology Institutes (NMIs) maintain primary standards of measurement to provide traceability to customers’ instruments by calibration.
– Examples of NMIs include NPL in the UK, NIST in the United States, and PTB in Germany.
– The CIPM Mutual Recognition Arrangement allows traceability to be taken from any participating NMI, so a company no longer needs to obtain traceability from the NMI of its own country.
– NMIs establish an unbroken chain from top-level standards to instruments used for measurement.
– The need for known accuracy and uncertainty has led to the establishment of national laboratories and consistent and comparable standards internationally.
Calibration and subsequent measurements should be traceable to internationally defined measurement units to improve quality and gain acceptance from outside organizations.
– Traceability is achieved through a formal comparison to a standard related to national or international standards.
– Quality management systems require an effective metrology system, including periodic and documented calibration of all measuring instruments.
– Standards such as ISO 9000 and ISO/IEC 17025 specify the extent to which measurement actions must be traceable and how that traceability is quantified.
Calibration values are often accompanied by a traceable statement of measurement uncertainty at a stated confidence level.
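
As an illustration of such a statement, the sketch below (plain Python, with hypothetical instrument readings and uncertainty components) combines several standard-uncertainty contributions in quadrature and reports an expanded uncertainty using a coverage factor k = 2, which corresponds to roughly a 95 % confidence level for an approximately normal distribution.

    import math

    # Hypothetical calibration of a pressure gauge at a single test point.
    reference_value = 100.000   # value applied by the standard (kPa)
    dut_indication = 100.012    # reading of the device under test (kPa)

    # Hypothetical standard-uncertainty components (kPa), combined in quadrature.
    u_components = {
        "reference standard": 0.004,
        "repeatability": 0.003,
        "resolution": 0.002,
    }
    u_combined = math.sqrt(sum(u ** 2 for u in u_components.values()))

    # Expanded uncertainty with coverage factor k = 2 (about 95 % confidence).
    k = 2
    expanded_u = k * u_combined

    error = dut_indication - reference_value
    print(f"Error of indication: {error:+.3f} kPa")
    print(f"Expanded uncertainty (k={k}): +/-{expanded_u:.3f} kPa")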

Instrument Calibration Process
Calibration may be required for a new instrument, after repair or modification, when moving to a different location, or after a specified time period or usage.
Calibration may be necessary before and/or after a critical measurement or after an event such as exposure to shock, vibration, or physical damage.
– Sudden changes in weather or questionable observations may also prompt calibration.
Calibration may be specified by customer requirements or instrument manufacturer recommendations.
Calibration often involves adjusting the output or indication of a measurement instrument to agree with the value of the applied standard (a minimal two-point correction is sketched below).
– The calibration process begins with the design of the measuring instrument, which should be capable of holding a calibration through its calibration interval.
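
One common form of the adjustment mentioned above is a linear zero-and-span correction derived from two calibration points. The sketch below is a minimal illustration, with hypothetical reference values and instrument readings; the actual adjustment mechanism depends on the instrument and is defined by its calibration procedure.

    def two_point_correction(raw, zero_ref, zero_raw, span_ref, span_raw):
        """Map a raw indication onto the reference scale using a linear
        (zero/span) correction derived from two calibration points."""
        gain = (span_ref - zero_ref) / (span_raw - zero_raw)
        return zero_ref + gain * (raw - zero_raw)

    # Hypothetical calibration points: applied reference value vs. instrument reading.
    corrected = two_point_correction(
        raw=50.30,                          # later reading to be corrected
        zero_ref=0.0, zero_raw=0.12,        # applied 0.0, instrument read 0.12
        span_ref=100.0, span_raw=100.45,    # applied 100.0, instrument read 100.45
    )
    print(f"Corrected value: {corrected:.2f}")
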
Calibration ensures the quality of measurement and the proper working of the instrument.
– The exact mechanism for assigning tolerance values and calibration intervals varies by country and industry type.
– The manufacturer assigns the measurement tolerance, suggests a calibration interval, and specifies the environmental range of use and storage.
– The using organization assigns the actual calibration interval based on specific measurement requirements.
Calibration methods can be manual or automatic.
– Manual calibration involves multiple steps, such as connecting the device under test to a reference gauge and comparing readings at a series of test points (a minimal comparison loop is sketched after this list).
– Automatic calibration uses specialized equipment, such as an electronic control unit and pressure transducer, with data collection facilities for automation.
– Manual calibration requires manual record-keeping, while automatic calibration streamlines data gathering.
– Both methods have their advantages and are suitable for different types of devices.
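
To make the comparison step concrete, the sketch below (hypothetical test points and tolerance) checks the readings of a device under test against a reference at several points and flags any error outside the acceptance limit; in a manual calibration the same values would simply be written into a record sheet.

    # Hypothetical test points: (applied reference value, DUT reading), e.g. in kPa.
    test_points = [
        (0.0, 0.08),
        (50.0, 50.12),
        (100.0, 100.31),
    ]
    tolerance = 0.25  # hypothetical acceptance limit (kPa)

    for reference, indication in test_points:
        error = indication - reference
        status = "PASS" if abs(error) <= tolerance else "FAIL"
        print(f"ref={reference:7.2f}  dut={indication:7.2f}  error={error:+.2f}  {status}")
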
Calibration procedures capture all the steps needed for a successful calibration.
– Manufacturers or organizations may provide calibration procedures that meet specific requirements.
Calibration procedures can be found in clearinghouses like the Government-Industry Data Exchange Program (GIDEP).
– Traceability is established by repeating the calibration process up the chain of standards, using transfer standards or certified reference materials to link each level.
– Metrologists also consider other factors, such as environmental conditions and the required measurement uncertainty, when developing a calibration process.

Calibration Standards
– ISO/IEC 17025 does not provide clear recommendations for calibration intervals.
– ANSI/NCSL Z540 states that equipment should be calibrated at periodic intervals to maintain acceptable reliability.
– ISO 9001 requires calibration or verification of measuring equipment at specified intervals or prior to use.
– MIL-STD-45662A mandates calibration at periodic intervals for acceptable accuracy and reliability, with the option to adjust intervals based on the results of previous calibrations (a simple adjustment rule is sketched below).
Calibration intervals may be agreed upon with the customer or regulated by legal requirements.
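
The interval adjustment mentioned above can take many forms. The sketch below shows one simple, hypothetical rule, not taken from any of the standards cited: lengthen the interval after a run of in-tolerance results and shorten it after an out-of-tolerance result, within fixed bounds.

    def next_interval(current_months, history, min_months=3, max_months=24):
        """Suggest the next calibration interval in months.

        `history` lists recent results, most recent last; True means the
        instrument was found in tolerance.  Illustrative rule only.
        """
        if not history:
            return current_months
        if not history[-1]:                 # last calibration out of tolerance
            proposed = current_months // 2
        elif len(history) >= 3 and all(history[-3:]):
            proposed = int(current_months * 1.5)
        else:
            proposed = current_months
        return max(min_months, min(max_months, proposed))

    print(next_interval(12, [True, True, True]))  # lengthened to 18
    print(next_interval(12, [True, False]))       # shortened to 6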

Accuracy Ratio
– The goal is to have a standard with less than 1/4 of the measurement uncertainty of the device being calibrated.
– This 4:1 test accuracy ratio, originally 10:1, is commonly used for calibration (a simple ratio check is sketched after this list).
– Maintaining a 4:1 accuracy ratio with modern equipment is challenging.
– If the accuracy ratio is less than 4:1, the calibration tolerance can be reduced to compensate, or the accuracy assigned to the device being calibrated can be revised.
– Informal calibration processes can lead to tolerance stacks and post-calibration issues.
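
The 4:1 rule can be expressed as a simple ratio check. The sketch below uses hypothetical values: the test accuracy ratio is taken as the tolerance of the device under test divided by the uncertainty of the standard, and the result is compared against the 4:1 target.

    def test_accuracy_ratio(dut_tolerance, standard_uncertainty):
        """Ratio of the DUT's tolerance to the standard's uncertainty."""
        return dut_tolerance / standard_uncertainty

    # Hypothetical values in the same unit (e.g. mV).
    ratio = test_accuracy_ratio(dut_tolerance=1.0, standard_uncertainty=0.2)
    verdict = "meets" if ratio >= 4 else "does not meet"
    print(f"Accuracy ratio {ratio:.1f}:1 {verdict} the 4:1 target")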

Historical Development of Calibration
– The words ‘calibrate’ and ‘calibration’ entered the English language during the American Civil War.
– Ancient civilizations like Egypt, Mesopotamia, and the Indus Valley used angular gradations for construction.
Calibration was associated with precise division of linear distance and angles.
– Direct linear measurements and weighing scales sufficed for commerce and technology until around AD 1800.
– Early standardization attempts, such as the measurement clauses of the Magna Carta, preceded the eventual establishment of the metric system.
– Early measurement devices had direct units, such as length using a yardstick and mass using a weighing scale.
– During the reign of Henry I, a yard was defined as the distance from the tip of the King’s nose to the end of his outstretched thumb.
– The Assize of Measures in the reign of Richard I standardized measurements throughout the realm.
– The Mètre des Archives from France and the establishment of the Metric system further standardized measurements.
Calibration errors, such as zeroing errors, can be corrected by the user by adjusting the instrument.
– The mercury barometer and water-filled manometers were early pressure measurement devices.
– The Industrial Revolution led to the adoption of indirect pressure measuring devices.
– The Bourdon tube, invented by Eugène Bourdon, is an example of an indirect reading instrument.
Source: https://en.wikipedia.org/wiki/Calibration
