The term accuracy class is used in analog metrology, particularly for the accuracy of pointer instruments. Because pointer instruments differ in accuracy depending on the type of movement, the scale resolution and the construction of the pointer, they are divided into accuracy classes.
The accuracy class specifies the maximum permissible deviation of the value read on a pointer instrument from the true (reference) value, expressed as a percentage of the full measuring range. The higher the accuracy class number, the larger the permissible percentage deviation. For example, accuracy class 1 permits a deviation of 1 % of the measuring range: on a 10 V range, the maximum deviation is 0.1 V. Because the deviation is referred to the full range, the relative error grows toward the lower end of the scale: at a reading of 2 V, the same 0.1 V deviation corresponds to a relative error of 5 %.
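The arithmetic above can be sketched in a few lines. This is an illustrative calculation only; the function names are invented for this example and the formulas assume the class limit is referred to the full measuring range, as described:

```python
def max_deviation(accuracy_class: float, full_scale: float) -> float:
    """Maximum absolute deviation permitted by an accuracy class,
    given as a percentage of the full measuring range."""
    return accuracy_class / 100.0 * full_scale

def worst_case_relative_error(accuracy_class: float, full_scale: float,
                              reading: float) -> float:
    """Worst-case relative error (in percent) at a given reading."""
    return max_deviation(accuracy_class, full_scale) / reading * 100.0

# Class 1 instrument on a 10 V range:
print(max_deviation(1, 10))                      # 0.1 V absolute deviation
print(worst_case_relative_error(1, 10, 2))       # 5.0 % at a 2 V reading
print(worst_case_relative_error(1, 10, 10))      # 1.0 % at full scale
```

The last two lines show why readings should be taken in the upper part of the scale: the absolute deviation is fixed, so the relative error is smallest near full scale.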
The accuracy classes are defined in DIN 1319 and in the European standard EN 60051.