
Load Cell Classes: NIST Requirements


Our article, Measurement Uncertainty in Force Measurement, covers standards-defined total measurement uncertainty calculations for a given load cell. These procedures rely on very specific environmental conditions with very specific laboratory test equipment traceable to a primary standard.

In reality, this testing is costly, even when performed on just one device of a particular design. It is therefore generally reserved for the most sensitive instruments, or for instruments where the application merits, or the customer demands, this type of rigor. Standards bodies have instead created load cell classes with agreed-upon standard performance parameters and tolerances. Manufacturers seeking certification from these standards bodies must submit each new load cell design to their approved laboratories for testing. The load cell is certified when it meets all the performance criteria for its load cell class. The manufacturer must then maintain certification throughout the production of the product by meeting the maintenance requirements of each of these bodies.

If you buy a certified load cell, then what does its load cell class tell you? This document gives a high-level answer to this question for NTEP-certified load cells. NTEP stands for National Type Evaluation Program, a non-profit corporation that verifies a device’s compliance with the US Department of Commerce’s National Institute of Standards and Technology (NIST) Handbook 44 [1]. Further details appear in NIST’s published requirements described herein.

Standards Bodies Governing Load Cells

This document exclusively covers requirements published in NIST Handbook 44 [1]. Its companion document, Load Cell Classes: OIML Requirements, covers similar requirements by the International Organization of Legal Metrology (known as OIML, from the organization’s name in French) in its Recommendation 60-1 (abbreviated R60-1) [2]. Together, NIST and OIML define the standards for most load cells sold globally. Compliance with OIML specifications is important for load cells sold outside of the United States.

Background Concepts in Load Cell Requirements

This section reviews basic terminology used in requirements for load cells and measuring systems. Figure 1 depicts them for further clarity.

Figure 1: Illustration of Load Definitions (Figure 3 in [2])

Load Definitions

The common symbols used to describe loads, and their definitions, appear below.

Minimum Dead Load (\(E_{min}\))
The smallest load (expressed in mass units) that can be applied to a load cell.

Minimum Load of the Measuring Range (\(D_{min}\))
The smallest load (expressed in mass units) applied to the load cell under test; it is required to be within 10% of \(E_{min}\).

Maximum Capacity (\(E_{max}\))
The largest load (expressed in mass units) that can be applied to a load cell, per the manufacturer’s specifications. It is usually well below the safe load limit.

Maximum Load of the Measuring Range (\(D_{max}\))
The largest load (expressed in mass units) applied to the load cell under test; it is required to be within 90–100% of \(E_{max}\).

Safe Load Limit (\(E_{lim}\))
The greatest load (expressed in mass units) that can be applied to the load cell without permanently changing its performance.

Measuring Range Definitions

The common expressions for load cell measuring ranges are as follows.

Load Cell Measuring Range (\(D_{max} - D_{min}\))
The range of mass values used for testing.

Maximum Measuring Range (\(E_{max} - E_{min}\))
The maximum range of mass values a load cell can accurately measure.

Verification Interval Definitions

The concept of a verification interval is important to the definition of load cell classes. The common related terms are as follows.

Load Cell Scale Division (\(d\))
The smallest unit quantity of weight the measuring device indicator can read; the resolution of the scale.

Load Cell Verification Interval (\(v\))
The quantity into which the load cell measuring range is divided for testing.

Minimum Value of the Verification Interval (\(e\))
The minimum quantity into which the scale’s measuring range is divided for testing. This must be no more than 10 × \(d\) per NIST Handbook 44 requirements.

Maximum Number of Scale Divisions (\(n_{LC}\))
The maximum number of divisions into which the maximum measuring range can be divided. Note \(d\) × \(n_{LC}\) = the maximum measuring range.
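These relationships lend themselves to a quick sanity check in code. The sketch below is illustrative only; the function names and units are assumptions, not taken from Handbook 44:

```python
# Illustrative helpers for the verification-interval relationships above.
# Any consistent mass unit works; kilograms are used in the example.

def num_divisions(max_measuring_range: float, d: float) -> int:
    """n_LC, from the relation d x n_LC = maximum measuring range."""
    return round(max_measuring_range / d)

def e_within_limit(e: float, d: float) -> bool:
    """Handbook 44 requires the minimum verification interval e <= 10 x d."""
    return e <= 10 * d

# Example: a 50 kg maximum measuring range with d = 0.01 kg gives 5,000 divisions.
print(num_divisions(50.0, 0.01))   # 5000
print(e_within_limit(0.05, 0.01))  # True
```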

Other Important Terms

Apportioning Factor (\(p_{LC}\))
A unit-less multiplier applied to a load cell’s observed error to quantify the portion of that error attributable to the load cell alone.

NIST Load Cell Classes

As previously stated, NIST requirements are in its Handbook 44 (going forward, “the Handbook”); they apply to load cells for commercial use in the United States and Canada. They are much like the OIML requirements with some additional specifications.

NIST Class Designations and Their Commercial Application

NIST assigns five classes to load cells: I, II, III, III L, and IIII. Table 7a in the Handbook describes the application of each class as follows:

| NIST Load Cell Class | Typical Application |
|---|---|
| I | Precision laboratory weighing |
| II | Laboratory weighing; precious metals and gems; grain test scales |
| III | All commercial weighing including grain test scales, retail precious metals and semi-precious gem weighing, grain-hopper scales, animal scales, postal scales, laundry scales, and on-board vehicle weighing systems up to 30,000 lb capacity |
| III L | Large-capacity commercial scales such as vehicle scales, on-board vehicle weighing systems with a capacity greater than 30,000 lb, axle-load scales, and livestock, railway track, crane, and hopper scales (other than grain hoppers) |
| IIII | Weight measurement for highway weight enforcement, such as axle-load weighers |

Scale Divisions (Minimum Resolution) and Full Scale Requirements Per NIST Load Cell Class

For each of these load cell classes, NIST specifies the number of divisions (\(n\)), the value of the load cell scale divisions (\(d\)), and the minimum value of the verification scale (\(e\)). The general cases appear in the table below, based on Table 3 from [1]. NIST has specific verification scale minimums depending on the application; therefore, the user should always refer to Handbook 44 for the most up-to-date information.

Table 1: Parameters for Accuracy Classes

| Class | Value of Verification Scale Division (\(d\) or \(e\)) | Minimum \(n\) | Maximum \(n\) |
|---|---|---|---|
| SI Units | | | |
| I | equal to or greater than 1 mg | 50,000 | — |
| II | 1 mg to 50 mg, inclusive | 100 | 100,000 |
| II | equal to or greater than 100 mg | 5,000 | 100,000 |
| III | 0.1 g to 2 g, inclusive | 100 | 10,000 |
| III | equal to or greater than 5 g | 500 | 10,000 |
| III L | equal to or greater than 2 kg | 2,000 | 10,000 |
| IIII | equal to or greater than 5 g | 100 | 1,200 |
| U.S. Customary Units | | | |
| III | 0.0002 lb to 0.005 lb, exclusive | 100 | 10,000 |
| III | 0.005 oz to 0.125 oz, inclusive | 100 | 10,000 |
| III | equal to or greater than 0.01 lb | 500 | 10,000 |
| III | equal to or greater than 0.25 oz | 500 | 10,000 |
| III L | equal to or greater than 5 lb | 2,000 | 10,000 |
| IIII | greater than 0.01 lb | 100 | 1,200 |
| IIII | greater than 0.25 oz | 100 | 1,200 |

Maximum Permissible Error Per Load Cell Class

Unlike OIML, NIST specifies the MPE mostly to quantify permissible errors due to creep and creep recovery. The table below (from Table T.N.4.6 of [1]) gives the MPE per load cell class for type evaluation testing only. Type evaluation testing is the NIST term for testing performed for certification.

Table 2: MPE During Type Evaluation For Each Load Cell Class

MPE in load cell verification intervals (\(v\)) = \(p_{LC}\) × basic tolerance in \(v\)

| Class | Basic tolerance 0.5 \(v\) | Basic tolerance 1.0 \(v\) | Basic tolerance 1.5 \(v\) |
|---|---|---|---|
| I | 0 – 50,000 \(v\) | 50,001 – 200,000 \(v\) | 200,001 \(v\) + |
| II | 0 – 5,000 \(v\) | 5,001 – 20,000 \(v\) | 20,001 \(v\) + |
| III | 0 – 500 \(v\) | 501 – 2,000 \(v\) | 2,001 \(v\) + |
| IIII | 0 – 50 \(v\) | 51 – 200 \(v\) | 201 \(v\) + |
| III L | 0 – 500 \(v\) | 501 – 1,000 \(v\) | Add 0.5 \(v\) to the basic tolerance for each additional 500 \(v\) or fraction thereof, up to a maximum load of 10,000 \(v\) |

As with OIML requirements, \(v\) represents the load cell verification interval (similar to \(e\)), and \(p_{LC}\) is the apportioning factor applied to the basic tolerances above. NIST assigns the following values of \(p_{LC}\):


  • \(p_{LC}\) = 0.7 for load cells of any class other than III L marked “S” for single load cell application use
  • \(p_{LC}\) = 1 for load cells of any class other than III L marked “M” as intended for multiple load cell applications
  • \(p_{LC}\) = 0.5 for Class III L load cells marked either M or S
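As a sketch of how Table 2 and the apportioning factor combine, the snippet below looks up the basic tolerance and scales it by \(p_{LC}\). It covers classes I, II, III and IIII only (Class III L's incremental rule is omitted), and all names are illustrative assumptions:

```python
# Band thresholds transcribed from Table 2, in verification intervals v.
TABLE_2_BANDS = {
    "I":    (50_000, 200_000),
    "II":   (5_000, 20_000),
    "III":  (500, 2_000),
    "IIII": (50, 200),
}

def basic_tolerance_v(cell_class: str, load_v: float) -> float:
    """Basic tolerance (0.5, 1.0 or 1.5 v) for a given test load in v."""
    lo, hi = TABLE_2_BANDS[cell_class]
    if load_v <= lo:
        return 0.5
    return 1.0 if load_v <= hi else 1.5

def mpe_v(cell_class: str, load_v: float, p_lc: float) -> float:
    """MPE in v = p_LC x basic tolerance."""
    return p_lc * basic_tolerance_v(cell_class, load_v)

# Example: a Class III cell marked "S" (p_LC = 0.7) at a 1,500 v test load
# falls in the 1.0 v band, so its MPE is 0.7 x 1.0 v.
print(mpe_v("III", 1_500, 0.7))  # 0.7
```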

Maximum Tolerances Per Load Cell Class

In addition to MPE, NIST specifies tolerances per load cell class as a function of the load cell scale divisions. For type evaluation testing, the tolerance values account for hysteresis and for variations in temperature, barometric pressure, and power supply within specified ranges. For all other testing, the load cell must fall within the tolerances regardless of ambient conditions; however, those tolerances are generally twice those used for type evaluation testing.

Specifically, NIST defines two types of tolerances:

  • Acceptance tolerances and
  • Maintenance tolerances

Acceptance tolerances, as previously stated, are half the maintenance tolerances. They are used for load cell type evaluation, initial commercial use, within 30 days of corrective service for non-compliance, or within 30 days of an overhaul or reconditioning of the system.

Maintenance tolerances apply to equipment already in service; they are the tolerances used in field testing.

The general tolerance requirements per load cell class are given in the table below for maintenance testing (from Table 6 of [1]). NIST, however, has additional requirements for specific applications (such as postal scales and multiple- vs. single-load-cell applications); therefore, the user should consult the Handbook.

The table tells us that if we have, for example, a Class III load cell with a test load equal to 6000 × \(d\) (again, \(d\) being the load cell’s scale division value), the test load falls in the 4,001 + band, so the load cell must read within 4 scale divisions of 6000 × \(d\) (meaning it must read between 5996\(d\) and 6004\(d\)).

Table 3: Maintenance Tolerances Per Accuracy Class

| Class | Test load for tolerance of 1 \(d\) | 2 \(d\) | 3 \(d\) | 4 \(d\) |
|---|---|---|---|---|
| I | 0 – 50,000 | 50,001 – 200,000 | 200,001 + | |
| II | 0 – 5,000 | 5,001 – 20,000 | 20,001 + | |
| III | 0 – 500 | 501 – 2,000 | 2,001 – 4,000 | 4,001 + |
| IIII | 0 – 50 | 51 – 200 | 201 – 400 | 401 + |
| III L | 0 – 500 | 501 – 1,000 | Add 1 \(d\) for each additional 500 \(d\) or fraction thereof | |

(Test loads are expressed in scale divisions \(d\).)
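Table 3's band lookup can likewise be sketched in code. The thresholds below are transcribed from Table 3 for classes I, II, III and IIII (Class III L's incremental rule is again omitted); the function name is an illustrative assumption:

```python
# Maintenance-tolerance band lookup for classes I, II, III and IIII.
# Each list holds the upper bound of a band, in scale divisions d.
import bisect

TABLE_3_BANDS = {
    "I":    [50_000, 200_000],
    "II":   [5_000, 20_000],
    "III":  [500, 2_000, 4_000],
    "IIII": [50, 200, 400],
}

def maintenance_tolerance_d(cell_class: str, test_load_d: float) -> int:
    """Tolerance in scale divisions (1, 2, 3 or 4) for a given test load."""
    # bisect_left keeps band upper bounds inclusive (e.g. 500 d is band 1).
    return bisect.bisect_left(TABLE_3_BANDS[cell_class], test_load_d) + 1

# Example: a Class III cell at a 3,000 d test load falls in the 2,001 - 4,000
# band, so it must read within 3 divisions (2,997 d to 3,003 d).
print(maintenance_tolerance_d("III", 3_000))  # 3
```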

Limits on Repeatability Error

NIST requires only that multiple weighings of the same load (the number of which is unspecified) must agree within the maintenance tolerances given in Table 3.

Limits on Creep Error

NIST requirements here are similar to OIML’s but slightly more permissive. Like OIML, NIST assumes a maximum recommended load (\(D_{max}\)) that is within 90–100% of the maximum capacity (\(E_{max}\)). Handbook 44 requires that a compliant load cell, when loaded with its maximum recommended load (\(D_{max}\)) for 30 minutes, have a final reading that differs from its initial reading by no more than the absolute value of the MPE, |MPE|, calculated from Table 2.

The NIST requirement for minutes 20 through 30 of loading at \(D_{max}\) is identical to OIML’s. That is, a compliant load cell, when loaded with its maximum recommended load (\(D_{max}\)) for 30 minutes, must have a final reading that differs from its reading at 20 minutes by no more than 0.15 × |MPE|.
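The two creep limits can be combined into a minimal check. Readings and the MPE must share the same units; the function and parameter names are illustrative assumptions:

```python
# Minimal creep-test check: the 30-minute reading must stay within |MPE| of the
# initial reading, and within 0.15 x |MPE| of the 20-minute reading.
def creep_within_limits(r_initial: float, r_20min: float, r_30min: float,
                        mpe: float) -> bool:
    full_test_ok = abs(r_30min - r_initial) <= abs(mpe)
    last_10min_ok = abs(r_30min - r_20min) <= 0.15 * abs(mpe)
    return full_test_ok and last_10min_ok

# Example: drift of 0.7 units over 30 minutes, 0.1 of it in the final
# 10 minutes, against an MPE of 1.0 unit:
print(creep_within_limits(1000.0, 1000.6, 1000.7, 1.0))  # True
```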

Limits on Dead Load Output Return Per Load Cell Class

Recall this requirement describes the performance of the load cell when measuring a load equal to the minimum weight (\(D_{min}\)) immediately before and immediately after the above 30-minute creep test.

NIST requires that the reading of the load \(D_{min}\) taken immediately before the creep test not differ from the reading taken immediately after the test by more than a specified recovery value, which depends on the load cell class and the number of divisions (\(n\)); in particular, the Handbook distinguishes Class III load cells with \(n \leq 4000\) from those with \(n > 4000\). The recovery values for each case appear in [1].

Ambient Conditions Where Limits Must Be Met


Like OIML, NIST’s default temperature range within which the tolerances and MPE apply is \(-10^{\circ}\)C through \(+40^{\circ}\)C (\(14^{\circ}\)F through \(104^{\circ}\)F) for all classes.

If the temperature range differs from this, the load cell, depending on class, must have a minimum temperature range as follows:





| Accuracy Class | Minimum Temperature Span |
|---|---|
| I | \(5^{\circ}\)C (\(9^{\circ}\)F) |
| II | \(15^{\circ}\)C (\(27^{\circ}\)F) |
| III, III L and IIII | \(30^{\circ}\)C (\(54^{\circ}\)F) |

When the load cell is in an environment outside its specified temperature range, NIST requires that the measuring system not give a reading.

Temperature Effect on Zero Balance

The zero value for a compliant load cell cannot vary by more than 3 divisions per \(5^{\circ}\)C or \(9^{\circ}\)F for a Class III L load cell; it cannot vary by more than 1 division per \(5^{\circ}\)C or \(9^{\circ}\)F for all other load cell classes.

Barometric Pressure

For all load cell classes except for Class I, the zero indication variance must be one scale division or less for a change in barometric pressure of 1 kPa over the total barometric pressure range of 95 kPa to 105 kPa (28 in to 31 in of Hg).

Other Factors

NIST further specifies tolerances and cut off values (beyond which the system should not give a reading) for radio disturbances, power interruptions and low power. These are independent of the tolerances given above and can be found in Handbook 44 Section 2.

Testing Accuracy

NIST has requirements for the accuracy of test equipment used to certify measuring systems. However, they are far more concerned with the response of the load cell and its performance in changing environmental conditions than the uncertainty of the reference standard.

In general, they require the error in the test process to be equal to or less than 1/3 the allowed load cell tolerance or 70% of the tolerance of the entire end-to-end measuring system.

NIST Required Labeling

There are quite a few labeling requirements provided by NIST, particularly for specific weighing applications. The list below gives a few of the required markings.

  • Manufacturer’s identification
  • Model number
  • Serial number
  • Load cell accuracy class
  • Nominal capacity and value of the scale division (\(d\))
  • Any special operating temperature range, if outside the \(14^{\circ} – 104^{\circ}\)F default
  • Whether rated for single or multiple load cell applications

A Few Words about NTEP Certificates of Compliance

NTEP stands for the National Type Evaluation Program. It is a non-profit body responsible for carrying out the testing of measuring equipment. A load cell receives an NTEP certificate of compliance when testing by an NTEP-accepted laboratory proves it adheres to NIST requirements. Generally, a manufacturer that wishes to have its product used for trade, commerce, enforcement of laws, or government data gathering submits a sample unit of a design to one of these laboratories, where it undergoes testing over several months. Equipment that fails testing must be resubmitted within 90 days; a design that fails three testing cycles must be scrapped unless acceptable proof of corrections to the deficiencies is provided to the lab.

Active vs Inactive NTEP Compliance

A load cell with active NTEP conformance status is a device manufactured, sold, or deployed under a current certificate of conformance. This means the tested prototype of that load cell has attained NTEP compliance, and the certification continues to be maintained. A load cell sold with inactive NTEP conformance status is a device that was manufactured under an active certificate of conformance, but the certificate has since expired while the device has been in inventory or in commercial use.

Whole System vs. Component Compliance

Note that laboratories often test only components of a measuring system rather than the entire system built with these components. NIST places the responsibility of compliance of the whole on the system manufacturer.

When an NTEP laboratory certifies a whole scale or measuring system rather than a component, the individual load cell(s) in the system is/are not considered individually certified. Therefore, if its internal load cells are replaced with a different model, the system will no longer be NTEP compliant.


This document has explained at a high level the tolerance requirements imposed by NIST. Ideally, a tested load cell of a specified class, given regular calibration, will always meet these tolerances. (Note that the specifications given in the load cell data sheet express the characteristics of that particular load cell when properly calibrated, and fall within any required tolerances for certification.) When a certified load cell does not meet these tolerances, it must be taken out of service to be repaired or replaced.

This document also covers how a load cell’s class relates to its divisions of resolution. These divisions help determine the measurement intervals detectable by a weighing system using that particular load cell. For more detailed information on this topic, see our article What Is the Lowest Weight a Load Cell Can Measure.

As always, contact Tacuna Systems with any issues regarding our load cells behaving outside of their specifications.



References

[1] NIST Handbook 44, Specifications, Tolerances, and Other Technical Requirements for Weighing and Measuring Devices, as adopted by the 104th National Conference on Weights and Measures, 2019, National Institute of Standards and Technology, US Department of Commerce (latest version, 2023, as adopted by the 107th National Conference on Weights and Measures).

[2] OIML R 60-1, Metrological Regulation for Load Cells, Part 1: Metrological and Technical Requirements, Organisation Internationale de Métrologie Légale, Edition 2017 (latest version 22 November 2021).

[3] R 60 OIML-CS rev. 2, Additional Requirements from the United States: Accuracy Class III L, Organisation Internationale de Métrologie Légale, January 2018 (latest version 22 November 2021).
