In instrumentation engineering, high-precision measurements are essential. To achieve them, engineers must understand basic concepts such as resolution, accuracy, and repeatability, and why proper calibration of the measurement system is necessary.
Accuracy is a fundamental parameter that quantifies how close a measured value is to the true or expected value. In the context of sensors, accuracy refers to the deviation of the measured value from the actual value. Sensor accuracy is influenced by various factors, including non-linearity, hysteresis, and repeatability.
Non-linearity is a crucial aspect of sensor accuracy. It refers to the sensor's ability to maintain a linear relationship between the input stimulus and the corresponding output signal. A perfectly linear sensor exhibits a consistent change in output signal for a uniform change in the input stimulus. In other words, non-linearity is the maximum deviation of the calibration curve from a straight line drawn between the no-load and rated-load outputs, expressed as a percentage of the rated output and measured on increasing load only.
According to the FUTEK Glossary, accuracy is the limit tolerance that defines the average deviation between the actual output and the theoretical output. In practical transducer applications, the potential errors of nonlinearity, hysteresis, nonrepeatability, and temperature effects do not normally occur simultaneously, nor are they necessarily additive. Therefore, accuracy is calculated based upon the RMS value of the potential errors, assuming a temperature band of ±10 °F, full rated load applied, and proper setup and calibration. Potential errors from the readout, crosstalk, or creep effects are not included.
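The root-sum-square (RMS) combination described above can be sketched in a few lines of Python. The specific error values below are hypothetical spec-sheet numbers chosen for illustration, not figures for any particular FUTEK sensor:

```python
import math

def rms_accuracy(*error_terms_pct):
    """Combine independent error terms (each in % of rated output)
    into a single accuracy figure via root-sum-square."""
    return math.sqrt(sum(e ** 2 for e in error_terms_pct))

# Hypothetical datasheet values, in % of rated output:
nonlinearity = 0.10
hysteresis = 0.10
nonrepeatability = 0.05

accuracy = rms_accuracy(nonlinearity, hysteresis, nonrepeatability)
print(round(accuracy, 3))  # 0.15
```

Note that the RMS figure (0.15%) is smaller than the simple sum of the terms (0.25%), which reflects the assumption that the individual errors are independent and do not all peak at once.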
Repeatability characterizes the consistency and precision of measurements obtained from the same sensor under similar operating conditions. It quantifies the sensor's ability to produce the same output value when subjected to multiple consecutive measurements of the same input stimulus.
In other words, repeatability (sometimes referred to as non-repeatability) is the maximum difference between transducer output readings for repeated loadings under identical loading and environmental conditions. Repeatability is typically expressed as a standard deviation or a percentage of the full-scale measurement range.
Factors that can affect sensor repeatability include mechanical tolerances, temperature variations, and electronic noise. Engineers strive to minimize repeatability errors to ensure consistent and reliable measurements.
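As a sketch of the definition above, repeatability can be quantified from a set of repeated readings either as the maximum spread or as a standard deviation, both expressed relative to full scale. The readings and full-scale value below are invented for illustration:

```python
import statistics

def repeatability_pct_fs(readings, full_scale):
    """Maximum difference between repeated readings of the same input,
    expressed as a percentage of the full-scale range."""
    return (max(readings) - min(readings)) / full_scale * 100.0

# Ten repeated readings (in N) of the same 500 N load on a 1000 N load cell:
readings = [500.2, 499.9, 500.1, 500.0, 499.8,
            500.3, 500.1, 499.9, 500.0, 500.2]

spread = repeatability_pct_fs(readings, full_scale=1000.0)
sigma = statistics.stdev(readings)  # alternative: standard deviation in N
print(round(spread, 3), round(sigma, 3))
```

Whether a datasheet quotes the spread or the standard deviation varies by manufacturer, so it is worth checking which convention a given specification uses.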
Sensor resolution refers to the smallest incremental change in the input stimulus that a sensor can detect and express in its output signal. It indicates the level of detail or granularity with which a sensor can measure.
In other words, resolution is the smallest change in mechanical input that produces a detectable change in the output signal. Resolution is commonly expressed in terms of the least significant bit (LSB), digits, or a percentage of the full-scale range.
The resolution of a measurement depends on the capabilities of the sensor, signal conditioning circuitry, and the analog-to-digital converter (ADC) used in the measurement system. A higher resolution allows for more precise measurements, enabling engineers to discern smaller changes in the input stimulus.
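For the ADC contribution alone, one LSB of an ideal converter corresponds to the full-scale range divided by the number of quantization levels. This is a simplified sketch: in a real system, noise in the sensor and signal conditioner usually limits the effective resolution to something coarser than this ideal figure. The 16-bit, 1000 N values are illustrative assumptions:

```python
def adc_resolution(full_scale, bits):
    """Smallest input change one LSB can represent for an ideal ADC
    spanning the full-scale range (ignores noise and conditioning)."""
    return full_scale / (2 ** bits)

# A 16-bit ADC digitizing a 0-1000 N range:
print(adc_resolution(1000.0, 16))  # 0.0152587890625 (N per LSB)
```

Adding bits halves the LSB size each time, which is why the ADC word length is often the headline resolution figure, even though the noise floor decides what is actually detectable.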
Image 1: The difference between Accuracy and Repeatability in sensors.
The signal conditioner bandwidth and sampling rate play vital roles in determining the resolution of a measurement system. The signal conditioner, which often includes amplification and filtering stages, prepares the sensor's output signal for further processing.
The bandwidth of the signal conditioner refers to the range of frequencies that it can process without significant distortion. A narrower bandwidth can attenuate high-frequency components of the signal, leading to a loss of information and reduced resolution. Therefore, selecting a signal conditioner with an appropriate bandwidth is crucial to preserve the fidelity of the measured signal.
The sampling rate, on the other hand, determines how frequently the analog signal is converted to a digital representation by the ADC. A higher sampling rate allows for more samples per unit time, capturing fine details and increasing the effective resolution of the measurement system. However, sampling at excessively high rates may introduce noise and aliasing effects, which can compromise measurement accuracy.
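The aliasing effect mentioned above can be illustrated with a short sketch: any signal component above half the sampling rate (the Nyquist frequency) folds back into the measurable band and appears at a false, lower frequency. The 900 Hz / 1000 S/s numbers are illustrative assumptions:

```python
def alias_frequency(f_signal, f_sample):
    """Apparent frequency of a sampled sinusoid, folded into the
    0..f_sample/2 band (illustrative model of aliasing)."""
    f = f_signal % f_sample
    return min(f, f_sample - f)

# A 900 Hz component sampled at 1000 samples/s masquerades as 100 Hz:
print(alias_frequency(900.0, 1000.0))  # 100.0
```

This is why anti-aliasing filtering in the signal conditioner and a sampling rate comfortably above twice the signal bandwidth go hand in hand.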
Check out our Resolution Calculator page to estimate the actual resolution of a force and/or torque measurement system consisting of any FUTEK sensor connected to any of FUTEK's signal conditioners.
Accuracy relates to the closeness of a measured value to the true value or reference standard. It accounts for all errors, including systematic errors such as sensor non-linearity, drift, and calibration inaccuracies. Achieving high accuracy requires minimizing sources of error and calibrating the measurement system to account for any inherent biases.
Repeatability, on the other hand, refers to the consistency of measurements obtained from the same sensor under identical conditions. It quantifies the variation in output values when measuring the same input stimulus multiple times. Repeatability errors are typically random in nature and can arise from factors such as electronic noise, thermal fluctuations, and minor mechanical variations.
There is also a common misunderstanding of the difference between resolution and accuracy. Accuracy, as discussed earlier, refers to the deviation of a measured value from the true value. It encompasses all sources of error, both systematic and random. Achieving high accuracy involves minimizing systematic errors through calibration, compensating for sensor non-linearity, and accounting for environmental factors.
Resolution, on the other hand, describes the level of detail or granularity with which a measurement system can detect and represent changes in the input stimulus. It quantifies the smallest discernible change that the system can reliably capture. While accuracy focuses on the deviation from the true value, resolution emphasizes the system's ability to resolve fine changes within the measurement range.
Image 2: The resolution of the measurement system.
For more than three decades, FUTEK has been a pioneer and a leader in the force and torque measurement industry. We pride ourselves on designing and manufacturing sensors with high accuracy, repeatability, and resolution.
FUTEK's A2LA and NIST accredited load cell calibration lab offers calibration and recalibration services for load cells, torque sensors, and strain gage amplifiers. FUTEK's load cell calibration equipment is ISO 17025 and ANSI Z540-1 certified for high accuracy and fast turnaround.