Instrument error
Instrument error refers to the error of a measuring instrument, that is, the difference between the value indicated by the instrument and the actual value of the quantity being measured. Errors of various types can occur, and the overall error is the sum of the individual errors.
Types of errors include
- systematic errors
- random errors
- absolute error
- other errors
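As noted above, the overall error is treated here as the sum of the individual error components. A minimal sketch of that bookkeeping, with purely illustrative numbers:

```python
# Minimal sketch with illustrative numbers (not from the article): the
# indicated value differs from the actual value by the sum of the error
# components affecting the measurement.

actual_value = 20.0
systematic_error = 1.0   # e.g. the instrument consistently reads high
random_error = -0.3      # varies from one reading to the next

indicated_value = actual_value + systematic_error + random_error
overall_error = indicated_value - actual_value

print(round(overall_error, 2))  # 0.7, i.e. systematic_error + random_error
```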
Systematic errors
The size of the systematic error is sometimes referred to as the accuracy. For example, the instrument may always indicate a value 5% higher than the actual value, or the relationship between the indicated and actual values may be more complicated than that. A systematic error may arise because the instrument has been incorrectly calibrated, or because a defect has developed in the instrument since it was calibrated. Instruments should be calibrated against a standard instrument that is known to be accurate, and ideally the calibration should be repeated at intervals. The most rigorous standards are those maintained by a standards organization such as NIST in the United States or the International Organization for Standardization (ISO).
If the users know the amount of the systematic error, they may decide to adjust for it manually rather than having the instrument expensively adjusted to eliminate the error: e.g. in the above example, since a reading 5% too high is 1.05 times the actual value, they might divide every reading by 1.05, a reduction of about 4.8%.
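A minimal sketch of such a manual correction, assuming the purely multiplicative 5% error of the example above:

```python
# Minimal sketch (assumed 5% multiplicative error, as in the example above):
# the user corrects the readings instead of having the instrument adjusted.

SYSTEMATIC_GAIN = 1.05  # instrument indicates 1.05 times the actual value

def correct_reading(indicated):
    """Remove the known multiplicative systematic error from one reading."""
    return indicated / SYSTEMATIC_GAIN

readings = [10.5, 21.0, 52.5]                    # indicated values
corrected = [correct_reading(r) for r in readings]
print([round(v, 2) for v in corrected])          # [10.0, 20.0, 50.0]
# Dividing by 1.05 reduces each reading by about 4.8%.
```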
Random errors
The likely size of the random errors is sometimes referred to as the precision. Random errors may arise because of the design of the instrument. In particular they may be subdivided into
- errors in the amount shown on the display, and
- errors in how accurately the display can actually be read.
Amount shown on the display
Sometimes the effect of random error can be reduced by repeating the measurement several times and taking the average result; for independent errors, the scatter of the average falls roughly in proportion to the square root of the number of readings.
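A minimal sketch of this, simulating repeated readings with an assumed true value and noise level:

```python
# Minimal sketch (simulated data; true value and noise level are assumed):
# averaging repeated readings reduces the effect of random error.
import random
import statistics

TRUE_VALUE = 10.0
NOISE_SD = 0.5  # standard deviation of the random error on one reading

def take_reading():
    """Simulate one reading: the true value plus a random error."""
    return TRUE_VALUE + random.gauss(0, NOISE_SD)

single = take_reading()
averaged = statistics.mean(take_reading() for _ in range(25))

print(f"single reading: {single:.3f}")
print(f"mean of 25    : {averaged:.3f}")  # typically much closer to 10.0
```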
How accurately the display can be read
If the instrument has a needle which points to a scale graduated in steps of 0.1 units, then, depending on the design of the instrument, it is usually possible to estimate tenths of the interval between successive marks on the scale, so it should be possible to read off the result to an accuracy of about 0.01 units.
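For example (illustrative numbers), a needle judged to lie about seven tenths of the way between the 2.3 and 2.4 marks would be read as roughly 2.37:

```python
# Minimal sketch (illustrative numbers): reading a needle by eye between
# scale marks graduated in steps of 0.1 units.

SCALE_STEP = 0.1           # graduation interval of the scale

lower_mark = 2.3           # mark just below the needle
estimated_fraction = 0.7   # needle judged about 7/10 of the way to 2.4

reading = lower_mark + estimated_fraction * SCALE_STEP
print(f"{reading:.2f}")    # 2.37, i.e. read to roughly 0.01 units
```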
Other errors
The act of taking the measurement may alter the quantity being measured. For example, an ammeter has its own built-in resistance, so if it is connected in series with an electrical circuit, it will slightly reduce the current flowing through the circuit.
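A minimal sketch using Ohm's law, with assumed values for the supply voltage and resistances:

```python
# Minimal sketch (assumed values): an ammeter's internal resistance, in
# series with the circuit, slightly reduces the current being measured.

VOLTAGE = 12.0               # supply voltage, volts
CIRCUIT_RESISTANCE = 100.0   # ohms
AMMETER_RESISTANCE = 1.0     # ohms, the meter's own resistance

undisturbed = VOLTAGE / CIRCUIT_RESISTANCE
with_meter = VOLTAGE / (CIRCUIT_RESISTANCE + AMMETER_RESISTANCE)

print(f"current without meter: {undisturbed * 1000:.1f} mA")  # 120.0 mA
print(f"current with meter   : {with_meter * 1000:.1f} mA")   # about 118.8 mA
```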