What is the difference between sensitivity and accuracy?
Answers
Sensitivity is an absolute quantity: the smallest absolute change in the measured quantity that an instrument can detect.
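As a rough illustration (the 1 mV figure below is an assumed value, not taken from any particular instrument), a change smaller than the sensitivity simply cannot be resolved:

    # Hypothetical sketch: a change smaller than the sensitivity is not detectable.
    SENSITIVITY = 0.001  # assumed smallest detectable change, e.g. 1 mV for a voltmeter

    def is_detectable(change, sensitivity=SENSITIVITY):
        """A change is detectable only if it is at least as large as the sensitivity."""
        return abs(change) >= sensitivity

    print(is_detectable(0.0004))  # False: a 0.4 mV change is below the 1 mV sensitivity
    print(is_detectable(0.0030))  # True:  a 3 mV change exceeds it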
Accuracy can be defined as the amount of uncertainty in a measurement with respect to an absolute standard. Accuracy specifications usually include the effects of gain and offset errors. Offset errors are expressed in the unit of measurement, such as volts or ohms, and are independent of the magnitude of the input signal; gain errors, by contrast, scale with the size of the reading.
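As a minimal sketch, assuming a typical "±(% of reading + offset)" style of accuracy specification (the 0.05 % and 2 mV figures are made-up example values), the worst-case uncertainty of a reading can be estimated like this:

    # Hypothetical sketch: worst-case error from a gain (% of reading) plus offset spec.
    def worst_case_error(reading, gain_error_pct, offset_error):
        """Gain error scales with the reading; offset error is a fixed amount
        in the same unit as the reading (e.g. volts or ohms)."""
        return abs(reading) * gain_error_pct / 100.0 + offset_error

    # A 5.000 V reading with an assumed ±0.05 % gain error and ±0.002 V offset error:
    error = worst_case_error(5.000, 0.05, 0.002)
    print(f"5.000 V ± {error:.4f} V")  # 5.000 V ± 0.0045 V

Note how the offset term (0.002 V) would dominate for small readings, while the gain term grows with the signal, which is why both are quoted separately in accuracy specifications.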