
Measurement of Dewpoint in Industrial Gases


08 Feb, 2024

This post was originally published on Sustainability Matters

Measurement of dew point in industrial gases is a strength of Michell Instruments and AMS. Even so, it is worth knowing some basic rules of “Good Measurement Practice” for this important KPI.

Good Measurement Practice

Michell manufactures a vast range of devices to monitor dew point. As you would expect from experts, the equipment is designed for all types of application, from the ‘simple’ dew-point transmitter to sophisticated chilled-mirror, quartz-resonance and tunable diode laser analysers.

Moisture measurement is needed in many industries and applications. Let us focus on the ‘simple’ dew-point transmitter to start. The measurement ‘cells’ of these are generally solid-state devices built of ceramics and base metals. They are usually designed to operate in a flowing gas stream and are suitable for measuring the moisture content of a wide variety of gases. In general, if the gas (in conjunction with water vapour) is not corrosive to ceramics or base metals, it will be suitable for measurement.
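As background for interpreting transmitter readings, a dew-point value is often converted into a moisture content. A minimal sketch in Python, using the Magnus approximation for saturation vapour pressure over water (the coefficients and example figures here are illustrative textbook values, not Michell’s own conversion):

```python
import math

def vapour_pressure_hpa(dew_point_c: float) -> float:
    """Saturation vapour pressure over water at the dew point (hPa),
    via the Magnus approximation (valid roughly -45 to +60 degC)."""
    return 6.112 * math.exp(17.62 * dew_point_c / (243.12 + dew_point_c))

def ppmv(dew_point_c: float, pressure_hpa: float = 1013.25) -> float:
    """Moisture content in parts per million by volume at a given
    total pressure, from the partial-pressure ratio."""
    e = vapour_pressure_hpa(dew_point_c)
    return 1e6 * e / (pressure_hpa - e)

# A -20 degC dew point at atmospheric pressure corresponds to
# roughly 1200-1300 ppmV of water vapour.
print(round(ppmv(-20.0)))
```

For frost points (below 0 °C over ice) a different coefficient set applies, so treat this strictly as an order-of-magnitude aid.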

These transmitters, while not ‘looking’ like it, are still analysers, so the next question is: how do we get them into the gas, or the gas to them?

Sampling Considerations

There are two basic methods of measuring a sample with a dew point transmitter, such as Michell’s Easidew, SF82 or one of the packaged systems based around these transmitters:

  • In-situ measurements are made by placing the transmitter inside the environment to be measured.
  • Extractive measurements are made by installing the sensor into a block within a sample handling system, and flowing the sample through this system outside of the environment to be measured.

Extractive measurements are recommended when the conditions in the environment to be measured are not conducive to making reliable measurements with the product.

Examples of such limiting conditions are:

  • Excessive flow rate
  • Presence of particulate matter
  • Presence of entrained liquids
  • Excessive sample temperature

The basic considerations for each measurement type are as follows:

In-Situ

Dew-Point Sensor Position — will the sensor see an area of the environment that is representative of what you want to measure?

For example, if the sensor is to be mounted into a glove box, there are three different positions in which it could be installed — each giving a different measurement:

  • Position A is on the purge inlet. In this position the sensor will confirm the dew point of the gas entering the glove box, but will not detect any leaks in the glove box itself, or any moisture released from the work piece.
  • Position B is on the gas outlet. In this position the sensor will be exposed to the gas leaving the glove box, and will therefore be detecting any moisture which has entered into the system (e.g. ingress/leaks), or has been released by the work piece.
  • Position C is in the glove box itself. In this position the sensor will only detect moisture in its immediate vicinity. Leaks not in close proximity to the measurement point may not be detected, as this moisture could be drawn directly to the outlet.

If the transmitter is to be mounted directly into a pipe or duct, then consider that the installation point should not be too close to the bottom of a bend where oil or other condensate may collect.

When installed directly into a process line, the gas velocity should be kept below 10 m/s. When taking a sample there are many things to consider (see below), but in general a flow of between 1 and 6 Nl/min is a good fit. In either installation, care must be taken with the gas being presented to the dew-point sensor.
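The 10 m/s guideline is easy to check: mean velocity is simply the volumetric flow divided by the pipe’s cross-sectional area. A rough sketch (the pipe size and flow are example values, and line pressure is assumed close to atmospheric so that normal litres approximate actual litres):

```python
import math

def gas_velocity_m_s(flow_nl_min: float, pipe_id_mm: float) -> float:
    """Mean gas velocity (m/s) for a given flow and pipe inside
    diameter. Assumes line pressure near atmospheric, so that
    Nl/min ~ actual l/min."""
    flow_m3_s = flow_nl_min / 1000.0 / 60.0         # l/min -> m^3/s
    area_m2 = math.pi * (pipe_id_mm / 2000.0) ** 2  # mm diameter -> m radius
    return flow_m3_s / area_m2

# 4 Nl/min through a 6 mm ID sample line:
v = gas_velocity_m_s(4.0, 6.0)
print(f"{v:.2f} m/s")  # comfortably below the 10 m/s guideline
```

At elevated line pressure the actual volumetric flow, and hence the velocity, is proportionally lower, so the atmospheric case above is the conservative one.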

Flow Rates

Theoretically, flow rate has no direct effect on the measured moisture content, but in practice it can have unanticipated effects on response speed and accuracy.

An inadequate flow rate can:

  • Accentuate adsorption and desorption effects on the gas passing through the sampling system.
  • Allow pockets of wet gas to remain undisturbed in a complex sampling system, which will then gradually be released into the sample flow.
  • Increase the chance of contamination from back diffusion: ambient air that is wetter than the sample can flow from the exhaust back into the system. A longer exhaust (sometimes called a pigtail) can also help alleviate this problem.
  • Slow the response of the sensor to changes in moisture content.
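The “pockets of wet gas” and slow-response points above can be put in rough numbers with a washout estimate: a well-mixed dead volume dilutes exponentially, so the time to purge it down to a given residual fraction scales with volume over flow. A simple sketch (the volume and flow figures are illustrative assumptions, not Michell data):

```python
import math

def washout_time_s(volume_l: float, flow_nl_min: float,
                   residual_fraction: float = 0.01) -> float:
    """Time (s) for a well-mixed wet volume to purge down to a
    residual fraction of its initial moisture, assuming ideal
    exponential dilution with time constant V/Q."""
    tau_s = volume_l / (flow_nl_min / 60.0)  # time constant V/Q in seconds
    return -tau_s * math.log(residual_fraction)

# A 0.5 l dead volume purged at 1 Nl/min takes over two minutes
# just to reach 1% of its initial moisture:
print(f"{washout_time_s(0.5, 1.0) / 60.0:.1f} min")
```

Real systems respond more slowly still, because adsorption on tube walls is not captured by this ideal-mixing model.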

An excessively high flow rate can:

  • Introduce back pressure, causing slower response times and unpredictable effects on equipment such as humidity generators.
  • Cause physical damage to the sensor at extremely high gas speeds.

Extractive

For samples, the main objective of any sample handling system is to get a representative, timely, clean sample to the measurement point. With that in mind, flow rate, particulate loading, length and type of tubing, expected dew point, ambient temperature range and process temperature range are all important.
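One quick timeliness check is the transport lag of the sample line, i.e. how long the gas takes to travel from the tap to the sensor: the tube’s internal volume divided by the flow. A simple sketch (tube dimensions and flow are example values; line pressure is again assumed near atmospheric):

```python
import math

def sample_lag_s(tube_length_m: float, tube_id_mm: float,
                 flow_nl_min: float) -> float:
    """Transport lag (s): internal volume of the sample tube divided
    by the volumetric flow. Assumes line pressure near atmospheric,
    so Nl/min ~ actual l/min."""
    volume_l = math.pi * (tube_id_mm / 2000.0) ** 2 * tube_length_m * 1000.0
    return volume_l / (flow_nl_min / 60.0)

# 10 m of 4 mm ID tubing at 2 Nl/min gives a lag of a few seconds:
print(f"{sample_lag_s(10.0, 4.0, 2.0):.1f} s")
```

Note this is plug-flow transport only; adsorption on the tube walls (especially at low dew points) will stretch the observed response well beyond this figure.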

In the meantime —
isn’t it time to be more Analytical!

For further information contact AMS Instrumentation & Calibration Pty Ltd on 03-9017 8225, or Freecall (NZ) 0800 442 743, alternatively on e-mail: sales@ams-ic.com.au or visit our web site at www.ams-ic.com.au.
