Calibration
Calibration is the process of establishing a usable scale on a measuring instrument. For instance, a mercury thermometer can be calibrated with a Celsius scale by marking the length of the mercury column at two fixed temperatures, typically the freezing point (0 °C) and boiling point (100 °C) of water. The interval between the two marks is then divided into 100 equal parts, and the scale is extended above and below these points at the same intervals if required.
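A two-point calibration of this kind amounts to linear interpolation between the fixed points. The sketch below illustrates the arithmetic; the function names and readings are illustrative, not drawn from any standard library:

```python
def two_point_calibration(raw_low, raw_high, ref_low=0.0, ref_high=100.0):
    """Return a function mapping a raw reading (e.g. mercury column
    length in mm) to the reference scale (e.g. degrees Celsius).

    raw_low / raw_high: instrument readings at the two reference
    points (here, the freezing and boiling points of water).
    """
    span = raw_high - raw_low
    if span == 0:
        raise ValueError("reference readings must differ")

    def to_scale(raw):
        # Linear interpolation between the two fixed points;
        # readings outside [raw_low, raw_high] extrapolate the scale.
        return ref_low + (raw - raw_low) * (ref_high - ref_low) / span

    return to_scale

# Example: column lengths of 12.0 mm at 0 °C and 212.0 mm at 100 °C.
celsius = two_point_calibration(12.0, 212.0)
print(celsius(112.0))  # 50.0, halfway between the fixed points
```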
To ensure uniformity and accuracy, electric meters are calibrated against the established measurement standards for the relevant electrical unit, such as the volt, ampere, ohm, or watt, which in the United States are maintained by the National Institute of Standards and Technology (NIST).
The basic values for the ohm and ampere rest on the internationally accepted definitions of these units in terms of mass, conductor dimensions, and time. Measurement methods that realize these fundamental definitions directly are called absolute measurements, and they are highly precise and reproducible. For instance, an absolute measurement of the ampere uses a current balance, a weighing apparatus that measures the force between a set of fixed coils and a moving coil.
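In such a balance, the electromagnetic force between the coils, which grows with the square of the current and the gradient of the coils' mutual inductance, is weighed against a known mass. A minimal sketch of the arithmetic, with purely illustrative values for the mass and the inductance gradient:

```python
import math

# Current-balance arithmetic: the force between the fixed and moving
# coils, F = I^2 * dM/dx (M = mutual inductance), is balanced against
# the weight of a known mass, F = m * g. Solving for the current
# gives I = sqrt(m * g / dM_dx).

g = 9.80665      # standard gravity, m/s^2
m = 0.5e-3       # balancing mass in kg (illustrative value)
dM_dx = 2.0e-3   # mutual-inductance gradient, H/m (illustrative value)

current = math.sqrt(m * g / dM_dx)
print(f"absolute current: {current:.4f} A")
```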
Absolute measurements of current and potential difference are chiefly important in laboratory work; for nearly all other purposes, comparative measurements are adequate. Frequency calibrations measure the performance of frequency standards. The frequency standard being calibrated is referred to as the device under test (DUT).
In most instances the device under test is a quartz, rubidium, or cesium oscillator. For the calibration to be meaningful, the device under test must be compared against a reference standard of higher accuracy.
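In practice, the comparison is often made by measuring how much time the device under test gains or loses against the reference over a known interval, which yields its fractional frequency offset. A minimal sketch, assuming the common convention that a device gaining time on the reference has a positive phase drift (function name and values are illustrative):

```python
def fractional_frequency_offset(phase_drift_s, interval_s):
    """Estimate the fractional frequency offset of a device under
    test (DUT) from the time (phase) it gains or loses against a
    reference standard over a measurement interval.
    """
    return -phase_drift_s / interval_s

# Example: the DUT gains 1 microsecond on the reference over one day,
# giving an offset of about -1.16e-11.
offset = fractional_frequency_offset(1e-6, 86400.0)
print(f"fractional frequency offset: {offset:.2e}")
```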