Selecting Pressure Gauge Accuracy: ASME & EN 837-1 | Manogauge

2026-05-08

Specifying a pressure gauge requires balancing performance, safety, and cost. Over-specifying accuracy adds unnecessary expense, while under-specifying can compromise process control and plant safety. The optimal choice depends on a clear understanding of international accuracy standards, the specific demands of the application, and the environmental factors that degrade instrument performance over time. This technical overview provides a systematic framework for selecting the appropriate pressure gauge accuracy class for industrial service, ensuring reliable measurement and operational integrity.

Understanding Accuracy Standards: ASME B40.100 and EN 837-1

Pressure gauge accuracy is defined as the conformity of the indicated value to the true pressure value. For analog (Bourdon tube) gauges, this is typically expressed as a percentage of the full-scale range, or span. Two primary international standards govern these classifications: ASME B40.100 in North America and EN 837-1 in Europe. Both standards define accuracy grades or classes that specify the maximum permissible error across the gauge's scale.

It is critical to distinguish between accuracy of full-scale value and accuracy of indication. Nearly all industrial analog gauges are specified by accuracy of full-scale value. For a 1.0% accuracy gauge with a 0-100 bar range, the maximum permissible error is ±1 bar at *any point on the scale*. This means the relative error is much higher at the low end of the range (e.g., a ±1 bar error at a 10 bar reading is a 10% error relative to the reading). For this reason, the normal operating pressure should always fall in the middle half of the gauge's range (approximately 25% to 75% of span), where the relative error is lowest and readability is best.

In contrast, accuracy of indication, common for digital gauges, defines the error as a percentage of the specific reading. This provides a more consistent relative accuracy across the measurement range but is not the standard for mechanical gauges.
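The difference between the two conventions is easiest to see numerically. This short sketch works through the article's own example (a class 1.0, 0-100 bar gauge read at 10 bar) under each convention; the figures are illustrative only.

```python
# Contrast the two error conventions for a reading of 10 bar on a
# 0-100 bar, class 1.0 gauge (the example from the text above).

SPAN = 100.0       # bar
READING = 10.0     # bar
CLASS_PCT = 1.0    # accuracy class, percent

# Percent of span (analog convention): the error band is fixed across the scale.
err_span = SPAN * CLASS_PCT / 100            # +/-1.0 bar at any point
rel_at_reading = err_span / READING * 100    # 10% relative to the 10 bar reading

# Percent of reading (digital convention): the error band scales with the reading.
err_reading = READING * CLASS_PCT / 100      # +/-0.1 bar at 10 bar
```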

Mapping Accuracy Grades Between Standards

Engineers often encounter specifications referencing either ASME or EN standards. While the principles are similar, the grade designators differ. Understanding the equivalence is essential for global sourcing and equipment specification. ASME B40.100 uses a letter system (e.g., 1A, 2A, B), while EN 837-1 uses a class number corresponding to the percentage error (e.g., 1.0, 1.6).

The following table compares common accuracy grades and their typical applications. Note that the mapping is approximate: the ASME commercial grades (A through D) actually specify split tolerances that are tighter over the middle half of the scale (Grade A, for example, permits ±1% over the middle half and ±2% over the remainder), whereas EN 837-1 classes apply a single tolerance across the full scale.

| ASME B40.100 Grade | EN 837-1 Class | Accuracy (% of Span) | Typical Application                |
|--------------------|----------------|----------------------|------------------------------------|
| 4A                 | 0.1            | ±0.1%                | Test Gauges, Laboratory Standards  |
| 3A                 | 0.25           | ±0.25%               | Precision Measurement, Calibration |
| 2A                 | 0.6            | ±0.5% / ±0.6%        | Critical Process Control           |
| 1A                 | 1.0            | ±1.0%                | Industrial Process Monitoring      |
| A                  | 1.6            | ±1.6%                | General Purpose, OEM Equipment     |
| B                  | 2.5            | ±2.5%                | Utility Service (Air, Water)       |
| C, D               | 4.0            | ±4.0%                | Low-Cost Indicators, Regulators    |
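When reconciling a specification written against one standard with stock listed against the other, the table above can be expressed as a lookup. The sketch below mirrors the table's pairings; treat them as approximate equivalences, not a formal cross-certification, and note the deliberate 2A/0.6 pairing (ASME 2A is ±0.5% of span; the nearest EN class is 0.6).

```python
# Sketch of the comparison table as a lookup: ASME grade -> (nearest EN
# class, max permissible error as % of span). Approximate equivalences only.

ASME_TO_EN = {
    "4A": ("0.1",  0.1),
    "3A": ("0.25", 0.25),
    "2A": ("0.6",  0.5),   # ASME 2A is +/-0.5% of span; nearest EN class is 0.6
    "1A": ("1.0",  1.0),
    "A":  ("1.6",  1.6),
    "B":  ("2.5",  2.5),
    "C":  ("4.0",  4.0),
    "D":  ("4.0",  4.0),
}

def max_error(asme_grade: str, span: float) -> float:
    """Maximum permissible error, in the same units as span, for an ASME grade."""
    _, pct = ASME_TO_EN[asme_grade]
    return span * pct / 100

# A 0-250 bar Grade 1A gauge may read up to 2.5 bar off at any point on scale:
worst_case = max_error("1A", 250.0)
```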

Manogauge manufactures instruments compliant with all major industrial grades, ensuring interchangeability and adherence to project specifications regardless of the governing standard.

Application-Driven Accuracy Selection

The required accuracy is dictated entirely by the application's tolerance for measurement error.

How Service Conditions Degrade In-Service Accuracy

A gauge's nameplate accuracy is its performance under controlled, static laboratory conditions. In the field, several factors can introduce additional error, degrading its effective accuracy.

Calibration Intervals and Lifecycle Management

Accuracy is not permanent. All mechanical gauges are subject to drift over time due to mechanical wear, fatigue, and environmental stress. A calibration program is essential for verifying the performance of installed instruments.

The frequency of calibration depends on the criticality of the application and the severity of the service conditions: critical or harsh-service instruments warrant shorter intervals than benign utility service.

Calibration records reveal the instrument's performance trend. A gauge that consistently requires significant adjustment is a candidate for replacement. The calibration standard used should be at least four times more accurate than the device under test (a 4:1 Test Uncertainty Ratio) to ensure a valid calibration. Regular calibration is not just a best practice; it is a core component of process safety management and quality control systems like ISO 9001.

