Instrument error
Instrument error refers to the combined accuracy and precision of a measuring instrument, or to the difference between the actual value and the value indicated by the instrument (the error). Measuring instruments are usually calibrated at regular intervals against a standard. The most rigorous standards are those maintained by a standards organization such as NIST in the United States or the international ISO.

In physics, however, precision, accuracy, and error are computed from the instrument and the measurement data. Precision is taken to be one half of the granularity of the instrument's measurement capability, and in a sequence of measurements and computations it is limited to the number of significant digits of the coarsest instrument or constant involved. Error is ± the granularity of the instrument's measurement capability, and when several measurements are combined to calculate a single quantity, their error magnitudes add. When a calculated result is reported to a specific number of significant digits, any rounding must be done properly.

Accuracy can be determined in two ways: by making multiple measurements of the same thing with the same instrument and applying a statistical function to the results, or by measuring a known standard, for example weighing a five-pound reference weight on a scale and taking the difference between five pounds and the reading. The second definition ties accuracy to calibration; the first does not.
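
A minimal sketch in Python of the rules above. The 1 mm granularity, the two length readings, and the 4.93 lb scale reading are hypothetical values chosen only for illustration; they do not come from the article.

    # Hypothetical ruler graduated in 1 mm steps.
    granularity_mm = 1.0                 # smallest division the instrument resolves (assumed)
    precision_mm = granularity_mm / 2    # precision taken as half the granularity
    reading_error_mm = granularity_mm    # each reading carries an error of +/- one granularity

    # Two length readings combined into one quantity: the error magnitudes add.
    length_a_mm = 123.0
    length_b_mm = 78.0
    total_mm = length_a_mm + length_b_mm
    total_error_mm = reading_error_mm + reading_error_mm   # +/- 2 mm on the sum

    print(f"total = {total_mm} mm +/- {total_error_mm} mm (precision {precision_mm} mm)")

    # Accuracy by comparison with a known standard (the five-pound-weight example):
    # the difference between the reference value and the scale reading.
    reference_lb = 5.0    # known reference weight
    measured_lb = 4.93    # hypothetical scale reading
    accuracy_lb = measured_lb - reference_lb
    print(f"accuracy = {accuracy_lb:+.2f} lb relative to the {reference_lb} lb standard")

Running this prints a total of 201.0 mm with a combined error of ±2.0 mm, and an accuracy of −0.07 lb relative to the five-pound standard, illustrating both the error-addition rule and the calibration-based definition of accuracy.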
