What is gauge, absolute and differential pressure?
Pressure is a widely measured physical variable across a diverse range of modern applications and industries worldwide. Pressure (P) is defined as the force (F) divided by the area (A) over which that force is evenly distributed, i.e. P = F/A. This simple relationship means that pressure increases proportionally with force if the working area remains constant.
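The definition above can be sketched in a few lines of Python; the force and area values are illustrative assumptions, not from the text:

```python
def pressure(force_n: float, area_m2: float) -> float:
    """Pressure (Pa) from force (N) evenly distributed over area (m^2): P = F / A."""
    return force_n / area_m2

# A 1000 N force on a 0.01 m^2 piston gives 100,000 Pa, i.e. 1 bar.
print(pressure(1000.0, 0.01))  # 100000.0

# Doubling the force over the same area doubles the pressure.
print(pressure(2000.0, 0.01))  # 200000.0
```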
Atmospheric pressure is often used as a zero reference in pressure measurement, in which case the resulting measurement is referred to as gauge (g) pressure. Atmospheric pressure, also called barometric pressure, has a mean sea-level value of approximately 1 bar absolute (1013.25 mbar). Atmospheric pressure decreases with increasing altitude, producing an increasingly negative gauge pressure (vacuum) until the absolute zero point (total vacuum) is approached in outer space. Conversely, liquid pressure increases below the water surface at a rate of approximately 1 bar for every 10 m of static water (hydrostatic) depth, and the world’s deepest ocean depth is equivalent to a pressure of over 1000 bar (approximately 1000 times atmospheric pressure). Hydrostatic pressure increases in proportion to depth or level, measured from the surface, because of the increasing weight of fluid exerting downward force from above.
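The rule of thumb of roughly 1 bar per 10 m of water follows from the hydrostatic relation p = ρ·g·h. A minimal sketch, assuming fresh-water density and standard gravity (seawater would read slightly higher):

```python
RHO_WATER = 1000.0  # kg/m^3, fresh water (assumed; seawater is ~1025 kg/m^3)
G = 9.80665         # m/s^2, standard gravity

def hydrostatic_pressure_bar(depth_m: float, density_kg_m3: float = RHO_WATER) -> float:
    """Gauge pressure (bar) at a given depth of static liquid: p = rho * g * h."""
    return density_kg_m3 * G * depth_m / 1e5  # 1 bar = 100,000 Pa

print(hydrostatic_pressure_bar(10.0))     # ~0.98 bar: roughly 1 bar per 10 m
print(hydrostatic_pressure_bar(11000.0))  # ~1079 bar at ~11 km ocean depth
```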
Where pressure is measured in the region below atmospheric pressure, this is known as vacuum or negative pressure. It can be measured as “absolute” pressure (“a”), where the zero point is a full vacuum (e.g. barometric/meteorological measurements, altitude monitoring, sealed-container pressure), although most applications use a “gauge” reference with the zero point at ambient atmospheric pressure. This automatically deducts the effect of ambient pressure, which suits the most common measurement requirements. For applications monitoring the difference between two input pressures (such as filter condition or orifice-plate flow), the measurement is known as differential pressure (“d” or “DP”) and the zero point occurs whenever the two input pressures are equal.
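The three references described above differ only in their zero point, so converting between them is simple arithmetic. A hedged sketch, assuming standard atmospheric pressure of 1.01325 bar:

```python
ATM_BAR = 1.01325  # standard mean sea-level atmospheric pressure, bar absolute

def gauge_to_absolute(p_gauge_bar: float, p_atm_bar: float = ATM_BAR) -> float:
    """Absolute pressure = gauge reading plus the ambient atmospheric pressure."""
    return p_gauge_bar + p_atm_bar

def absolute_to_gauge(p_abs_bar: float, p_atm_bar: float = ATM_BAR) -> float:
    """Gauge pressure = absolute reading minus the ambient atmospheric pressure."""
    return p_abs_bar - p_atm_bar

def differential(p1_bar: float, p2_bar: float) -> float:
    """Differential pressure: reads zero whenever the two inputs are equal."""
    return p1_bar - p2_bar

print(absolute_to_gauge(0.0))  # -1.01325: a full vacuum expressed in gauge terms
print(differential(2.5, 2.5))  # 0.0
```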
How are pressure sensors used for level measurement?
Pressure sensors can be used to measure liquid level using the hydrostatic principle, from open water such as reservoir level or sea depth to storage tanks and vessels. This assumes that the medium is a static head of liquid with constant specific gravity; where the liquid density differs from that of clean water, the system must be compensated proportionally. For example, when measuring oil level in a tank, a lower specific gravity means a proportionally lower pressure for every metre of hydrostatic depth or level. Also, if the tank is not vented, or is sealed under pressure, it is necessary to measure and compensate for the surface pressure above the liquid.
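Rearranging p = ρ·g·h gives the level from a pressure reading. A minimal sketch incorporating the two compensations mentioned above (specific gravity and sealed-tank surface pressure); the constants and example values are assumptions:

```python
G = 9.80665         # m/s^2, standard gravity
RHO_WATER = 1000.0  # kg/m^3, reference density for specific gravity

def level_m(p_bottom_bar: float,
            specific_gravity: float = 1.0,
            p_surface_bar: float = 0.0) -> float:
    """Liquid level (m) from the hydrostatic pressure at the tank bottom,
    compensated for specific gravity and any gas pressure above the surface."""
    dp_pa = (p_bottom_bar - p_surface_bar) * 1e5  # net hydrostatic head, Pa
    return dp_pa / (specific_gravity * RHO_WATER * G)

# Oil (SG ~0.85 assumed) produces less pressure per metre, so the same
# reading corresponds to a greater level than it would for water.
print(level_m(0.980665))                         # 10.0 m of water
print(level_m(0.980665, specific_gravity=0.85))  # ~11.8 m of oil
```

Subtracting the surface pressure is exactly what a differential pressure sensor on a sealed tank does in hardware, with its two ports connected to the tank bottom and the gas space above the liquid.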
In practice, hydrostatic level measurement can be achieved using a fully submersible probe with a hermetically sealed cable assembly to sense the liquid pressure head above the probe inlet. Alternatively, a pressure sensor can be externally mounted at the bottom of the tank or vessel. For sealed tanks, a differential pressure sensor can provide a single output representing the difference between the tank-bottom pressure and the surface gas pressure, or the two pressures can simply be measured separately using two sensors and outputs.