Calibrating the Force Measuring System
In force measurement, the measuring system comprises all the elements in the end-to-end signal flow, from the force sensor on one end to the output device on the other. Each element in the system can introduce error into the measurement. This article discusses the importance of calibration to the accuracy of the measuring system’s output.
The article is broken up into the following topics:
- The Measuring System
- Errors in the Measuring System
- Calibrating the Measuring System
- Calibration Methods
- Calibration Standards
- Calibration Procedure
The Measuring System
NOTE: In this article, we refer to the quantity to be measured as the measurand. Measurands of interest here include load, weight, force and pressure.
A force measuring system has elements assembled in a “chain” to measure the force and output its value. The elements of the measuring system include the following:
- Input transducer: transforms the force being measured into an electrical signal
- Signal conditioners: conditions the signal (e.g., filters, isolators, amplifiers)
- Monitoring systems
- Output device: actuators, switches or displays
These same measuring elements are used in control systems, with the addition of feedback. A measuring system can add a feedback loop when the measurement is used to control an industrial process.
Errors in the Measuring System
Errors are unavoidable in measuring systems. Error is defined as the difference between the output from the measuring system and the actual value of the quantity being measured. In this regard, error equates to the level of uncertainty in the measurement. In designing a measuring system, knowing this level of uncertainty is important because it affects the interpretation of the measurement.
There are two categories of errors in measuring systems: systematic errors and random errors.
Systematic errors are errors inherent in the operation of the measuring devices themselves. They may also arise from improper handling or operation of the instruments by the system technician. They can only be mitigated through frequent calibration and proper training.
There are two types of systematic errors: zero errors and span errors. Zero errors occur when the measuring system produces a non-zero output at no load. An instrument or device with a zero error will produce an input-output curve that is parallel to the true input-output curve (see Figure 1).
Span errors occur when the slope of the input-output curve of the measuring system differs from the slope of the true input-output curve (see Figure 2).
Of course, these two error types can also be combined to form a compound systematic error (see Figure 3).
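These three systematic error types can be illustrated with a short sketch; the offset and gain values below are arbitrary example numbers, not specifications of any real instrument.

```python
# Illustrative model of systematic errors in a measuring system.
# A zero error shifts the input-output curve in parallel; a span error
# changes its slope; a compound error combines both.
# The error magnitudes below are invented example values.

def ideal_output(force):
    """Ideal instrument: output equals the true input."""
    return force

def with_zero_error(force, offset=0.5):
    """Zero error: non-zero reading at no load (parallel shift of the curve)."""
    return force + offset

def with_span_error(force, gain=1.05):
    """Span error: slope differs from the true input-output curve."""
    return gain * force

def with_compound_error(force, offset=0.5, gain=1.05):
    """Compound systematic error: zero and span errors combined."""
    return gain * force + offset

for f in (0.0, 10.0, 20.0):
    print(f, ideal_output(f), with_zero_error(f),
          with_span_error(f), with_compound_error(f))
```

Note that the zero error is visible even at zero load, while the span error grows with the applied force.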
Random errors are statistical variations (both positive and negative) in the measurement result due to the precision limitations of a device. Causes of random errors in a measuring system include electrical noise and variations in environmental conditions.
Even though these errors appear unpredictable, they often have a normal distribution about their mean value, so statistical methods can be used to quantify them. That is, using repeated measurements to find the mean and standard deviation, random errors can be expressed as an uncertainty to a number of standard deviations (e.g., 3σ implies a confidence interval of about 99.7%).
In practical applications, the total measuring system error is expressed as the known systematic error plus the uncertainty to three standard deviations from the mean (for the random error).
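This way of reporting total error can be sketched with the standard library; the readings and the systematic error below are invented example data.

```python
import statistics

# Sketch: quantifying random error from repeated measurements.
# Total error is reported as the known systematic error plus a
# 3-sigma uncertainty (~99.7% confidence). Example data only.

readings = [10.02, 9.98, 10.05, 9.97, 10.01, 10.03, 9.99, 10.00]
known_systematic_error = 0.04  # e.g., a documented zero offset (example value)

mean = statistics.mean(readings)
sigma = statistics.stdev(readings)   # sample standard deviation
random_uncertainty = 3 * sigma       # 3-sigma bound on the random error

total_error = known_systematic_error + random_uncertainty
print(f"mean = {mean:.4f}, 3-sigma uncertainty = {random_uncertainty:.4f}")
print(f"total error = {total_error:.4f}")
```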
Sources of Error
Technicians should be mindful of the ways in which errors can be introduced into a measuring system. Sources of error include:
Environmental factors: examples include temperature fluctuations and the aforementioned electrical noise, as well as other environmental conditions such as air pressure, vibration, etc.
Inconsistent measurement methods: each measurement must be conducted with the same set of procedures on the entity being measured to avoid introducing other variables into the measurement. For example, the placement of the measurand in the device, or even the rate at which the load is applied, can alter the reading.
Instrument drift: instruments, over time and with use, eventually drift from their original settings, motivating the need for frequent calibration.
Lag: some instruments have a delay in reaching their steady state. If the measurement is made before steady state is reached, the reading will be off.
Hysteresis: similar to lag, a device in the measuring system may not return to equilibrium due to “memory” effects caused by use. The result is a different output between increasing and decreasing force, at any given force. In other words, the output of a load cell at a given force depends on whether the load is increased from zero to that force or decreased from rated capacity to that force.
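Hysteresis is easy to quantify once readings are taken on both the ascending and descending branches of a load cycle; the readings below are invented example data.

```python
# Sketch: computing hysteresis error from ascending vs. descending readings.
# The values below are invented example data for a load cell exercised
# from zero to rated capacity and back.

loads      = [0, 25, 50, 75, 100]                  # % of rated capacity
ascending  = [0.00, 24.90, 49.85, 74.90, 100.00]   # output on increasing load
descending = [0.10, 25.15, 50.10, 75.05, 100.00]   # output on decreasing load

# Hysteresis at each load level: the gap between the two branches.
hysteresis = [abs(d - a) for a, d in zip(ascending, descending)]
max_hysteresis = max(hysteresis)
print(f"max hysteresis = {max_hysteresis:.2f} at the same applied load")
```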
Calibrating the Measuring System
According to the International Society of Automation (ISA), calibration is defined as “a test during which known values of a measurand are applied to the transducer and corresponding output readings are documented under specified conditions.” Specified conditions for proper calibration include the following:
- the ambient conditions
- the force standard machine
- the technical personnel
- the operating procedures
As stated previously, variations in these conditions introduce random errors, increasing the uncertainty of the overall measuring system.
The ambient conditions refer to the temperature and the relative humidity in the testing environment.
The force standard machine refers to the reference standard instrument used to verify testing machines. It should have a measurement uncertainty that is better than one-third of the nominal uncertainty of the instrument being calibrated. This reference equipment covers the required measuring ranges and levels of uncertainty, and is traceable to the SI units through national calibration laboratories and national metrology institutes (NMIs).
The technical personnel should be well-trained, qualified system technicians who perform technical and management procedures according to the equipment manufacturer’s manual or laboratory’s documented procedure. They must have knowledge of pneumatic, hydraulic, mechanical and electrical instrumentation. They must have received training in subjects of dynamic control theory; analog and digital electronics; microprocessors; and field instrumentation.
Beyond technical knowledge, technicians must have keen attention to detail to ensure that operating procedures are strictly followed and that all steps and results are clearly documented, preserving the integrity of the calibration results. Qualified technical personnel are the ones who carry out the operational procedures.
Every calibration should be performed to a specified tolerance, or acceptable deviation from the actual value. The calibration tolerance may not only be based on the manufacturer’s specified value, but also on other factors such as the requirements of the process, the capability of available test equipment and the consistency with similar instruments in the laboratory or in the field.
Not every organization is equipped to perform proper calibration on the measuring system. In those cases, it’s best to choose a manufacturer with the expertise to offer load cell calibration services.
Calibration Methods
There are four primary methods used in calibrating a measuring system: 1) zero and span calibration; 2) individual instrument and loop calibration; 3) formal calibration; and 4) field calibration.
Zero and Span Calibration
Zero and span calibration are done to eliminate the zero and span errors of the instrument. Zero adjustments are used to reduce the parallel shift of the input-output curve while span adjustments are used to change the slope of the input-output curve. Tacuna Systems’ instruments allow for both zero and span adjustments as can be seen in this manual.
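A two-point (zero and span) calibration can be sketched as follows; the raw readings below are hypothetical example values, not taken from any particular instrument.

```python
# Sketch of a two-point (zero and span) calibration. First the zero offset
# is removed, then the span is scaled so full-scale input maps to
# full-scale output. Raw readings are invented example values.

raw_at_zero = 0.42    # reading at no load (example value)
raw_at_full = 98.70   # reading at 100% load (example value)
full_scale = 100.0    # desired output at rated capacity

zero_offset = raw_at_zero                             # zero adjustment
span_gain = full_scale / (raw_at_full - raw_at_zero)  # span adjustment

def calibrated(raw_reading):
    """Apply zero and span corrections to a raw reading."""
    return (raw_reading - zero_offset) * span_gain

print(calibrated(raw_at_zero))   # corrected no-load reading
print(calibrated(raw_at_full))   # corrected full-scale reading
```

The zero adjustment shifts the curve back through the origin; the span adjustment restores the correct slope.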
Individual Instrument and Loop Calibration
Individual instrument calibration is performed on each component in the measuring system in isolation. This is done by decoupling it from the measuring system, applying an input from a known source and measuring the output at various inputs. Loop calibration, on the other hand, is performed on the measuring system as a whole. This has the advantage of verifying the whole system at once, which saves time.
Formal Calibration
Formal calibration is performed by an approved laboratory where instruments used for official measurements are periodically calibrated, traceable to primary standards, and certified. This type of calibration is performed under controlled temperature, atmospheric pressure and humidity. Within the measuring system, formal calibration is performed on the transducers and on the measuring, analyzing and computing instruments.
Field Calibration
Field calibration is a maintenance operation performed where the measuring system is installed. It ensures there is no deviation from the normal operation determined during formal calibration. Field calibration determines the repeatability of the measurements, not the absolute accuracy of the measurement.
Tacuna Systems offers load cell calibration services for its equipment prior to shipment. Customers receive a calibration certificate with the product when this service is added to a load cell purchase.
Calibration Standards
Standards are necessary for calibration, as detailed in earlier sections. There are three types of standards used in calibration: 1) the primary force standard; 2) the secondary force standard; and 3) the working force standard.
According to the American Society for Testing and Materials (ASTM) standard ASTM E74, Standard Practices for Calibration and Verification for Force-Measuring Instruments, the definitions of these standards are as follows.
Primary Force Standard
A primary force standard is a deadweight force applied directly to the measuring system without any intermediate mechanism such as levers, hydraulic multipliers or the like. The value of the applied load, force, weight or pressure is determined by comparison with reference standards, traceable to national standards of mass.
Secondary Force Standard
A secondary force standard is an instrument or mechanism whose calibration has been established by comparison with a primary force standard; it must be so calibrated before it can be used to calibrate other instruments.
Working Force Standard
Working force standards are the instruments used to verify testing machines. They may be calibrated by comparison with primary force standards or secondary force standards. They are also called the force standard machines.
Calibration Procedure
A measuring system should be calibrated prior to installation. Once the system is installed, the calibration options are as follows:
- For a permanent installation, use a transfer standard to perform calibration in place. The “transfer standard” is an already calibrated transducer. Note the calibration range may differ from the instrument range; the latter refers to the capability of the instrument (often greater than the desired measurement range or calibration range).
- Design the installed system so that it can be removed at any time for frequent recalibration.
There are different approaches to performing calibration, but the general steps for calibrating a measuring system are listed below.
1. Set the input signal to 0%, that is, zero or no load. Then adjust the initial scale of the measuring system being calibrated.
2. Set the input signal to 100%, the full-scale capacity. Then adjust the full scale of the measuring system being calibrated.
3. Reset the input signal to 0% and check the system’s output reading. If the value is off by more than one-quarter of the rated value on the instrument’s datasheet, readjust the scale until it falls within the tolerance level. This is also called the zero adjustment.
4. Reset the input signal to 100% and check the instrument’s output reading. If the value is off by more than one-quarter of the rated value specified in the instrument’s datasheet, readjust the scale until it falls within a tolerable range.
5. Repeat steps 3 and 4 until the zero balance and the full scale are within the tolerance of one-quarter of their nominal values. This is the span adjustment.
6. Perform the measuring cycle by changing the input signal in steps of 20% or 25% and recording the instrument’s output reading throughout the span of calibration values. In other words, apply loads sequentially as 0%, 20%, 40%, 60%, 80%, 100%, 80%, 60%, 40%, 20%, 0% or as 0%, 25%, 50%, 75%, 100%, 75%, 50%, 25%, 0%. Observe and record each reading after a sufficient stabilization period, to avoid lag errors.
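The up-and-down load sequence of the measuring cycle can be generated programmatically; this small helper is a sketch, not part of any standard tooling.

```python
# Sketch: generating the ascending/descending load sequence for a
# measuring cycle, given a step size of 20% or 25% of full scale.

def measuring_cycle(step_percent):
    """Return load levels (% of full scale): 0 up to 100 and back down."""
    up = list(range(0, 101, step_percent))
    return up + up[-2::-1]  # descend back through the same levels

print(measuring_cycle(20))
print(measuring_cycle(25))
```

Exercising both directions of the cycle is what makes the hysteresis of the system visible in the recorded readings.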
Calibration of a measuring system prior to any measurement is essential. It’s also part of ongoing maintenance over its useful lifespan.
The calibration process results in a calibration curve showing the relationship between the input signal and the response of the system. Information derived from the calibration curve includes system sensitivity, linearity and hysteresis.
For best results, Tacuna Systems recommends the purchase of professional calibration services with the purchase of every load cell.
References
- Calibration Principles
- The Calibration Handbook of Measuring Instruments by Alessandro Brunelli
- The Instrumentation Reference Book by Walt Boyes
- A Guide to the Measurement of Force by Andy Hunt
- ASTM E74, Standard Practices for Calibration and Verification for Force-Measuring Instruments
- Force Measurement Glossary
- MechTeacher – Generalized Measurement System