
From Crisis to Control: AI Offers New Hope for Water Management


19 Jun, 2025

This post was originally published on Sustainability Matters

Nearly 40 million people — roughly 12% of the U.S. population — rely on the Colorado River for water. This iconic river, which stretches across seven states, supports irrigation, generates power, and serves as a vital source of drinking water. Yet its flow has diminished by about 20% over the past century — a seemingly modest decline that carries significant consequences. A mere 10% reduction in flow jeopardizes $1.4 trillion in economic activity.

Three critical factors compound the challenges facing the Colorado River: overdependence, climate change, and aging infrastructure. Together, they create a pressing need for innovative water management solutions as water scarcity becomes a growing regional crisis.

The impacts of water scarcity are particularly visible in states like Arizona and California. A 2023 Arizona Department of Water Resources report predicted a groundwater shortage of 4.6 million acre-feet over the next century. For perspective, one acre-foot of water can support up to three households for a year, depending on the community. This looming shortfall could disrupt new development approvals in the Phoenix metropolitan area, home to 4.6 million people, unless alternative water sources are secured.

California faces similar challenges. The 2023 State Water Project Delivery Capability Report estimated that by 2043, the state’s water delivery capacity could decline by 23% due to shifting water flow patterns and extreme weather events. This reduction — equal to 496,000 acre-feet annually — represents enough water to supply 1.7 million homes for a year.

Beyond shortages, water loss due to leaks, theft, or metering inaccuracies — referred to as Non-Revenue Water (NRW) — exacerbates the crisis. Worldwide, approximately 35% of treated drinking water is lost each year as NRW. In the U.S. alone, six billion gallons of treated water are wasted daily, adding up to two trillion gallons annually. This staggering loss of resources costs municipalities around $8 billion every year, according to the American Society of Civil Engineers (ASCE).
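To put those figures in perspective, here is a quick back-of-the-envelope calculation as a minimal Python sketch. It uses only the numbers quoted above; the per-gallon cost it prints is an implied figure derived from those numbers, not a separate ASCE statistic.

```python
# Illustrative arithmetic based on the NRW figures cited above.
GALLONS_LOST_PER_DAY_US = 6_000_000_000  # treated water wasted daily in the U.S.
ANNUAL_COST_USD = 8_000_000_000          # yearly cost to municipalities (ASCE)

annual_loss_gal = GALLONS_LOST_PER_DAY_US * 365
cost_per_thousand_gal = ANNUAL_COST_USD / (annual_loss_gal / 1_000)

print(f"Annual loss: {annual_loss_gal / 1e12:.2f} trillion gallons")         # ~2.19 trillion
print(f"Implied cost per 1,000 gallons lost: ${cost_per_thousand_gal:.2f}")  # ~$3.65
```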

Image caption: CivilSense acoustic sensors are compatible with any pipe material.

Aging infrastructure further compounds the issue. The Environmental Protection Agency (EPA) estimates that $625 billion will be needed over the next two decades to address deteriorating drinking water systems. Leaks and inefficiencies drive up costs for utilities, hampering their ability to invest in critical infrastructure upgrades. The burden often falls on consumers, as utilities are forced to raise rates, adding financial pressure on households and businesses.

To address these challenges, Oldcastle Infrastructure has developed CivilSense™, a cutting-edge water infrastructure asset management solution that combines advanced artificial intelligence with decades of expertise. CivilSense uses network and acoustic data to identify leaks and predict pipe failures before they occur, offering municipalities a proactive and sustainable way to improve their water management. By leveraging data-driven insights, this solution helps cut operational costs, prevent major line breaks, and reduce water loss effectively, making it an invaluable tool for communities struggling with water scarcity.

CivilSense is particularly impactful as municipalities face staffing shortages, skill gaps due to retirements, and tight budgets that hinder necessary repairs. With a scalable and efficient solution in place, water utilities can mitigate resource loss and plan for the future with greater confidence.

Image caption: Field deployment of CivilSense technology.

Forward-thinking communities are already coping with water scarcity through innovative measures like CivilSense. Consider Bartow County, a community about 50 miles north of Atlanta that buys about 95% of its water from neighboring cities and counties.

In an Oldcastle Infrastructure pilot program, CivilSense analyzed the water distribution network and identified nine separate leaks, ranging from small (1–4 gallons per minute) to medium (5–9 gallons per minute) and large (more than 10 gallons per minute). Of the nine, two were small, three were medium, and four were large. Together, the nine leaks were losing 83 gallons per minute, which translates to nearly 120,000 gallons a day, or roughly 43 million gallons per year.
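The scale of that loss follows directly from the flow rate. A minimal Python sketch, using only the combined flow reported above, reproduces the daily and annual totals; the individual flows passed to the classifier at the end are placeholder examples, not the pilot's measured per-leak rates.

```python
# Illustrative check of the Bartow County pilot arithmetic.

def classify_leak(gpm: float) -> str:
    """Bucket a leak by flow rate, following the size bands described above."""
    if gpm < 5:
        return "small"    # 1-4 gallons per minute
    if gpm < 10:
        return "medium"   # 5-9 gallons per minute
    return "large"        # 10+ gallons per minute

TOTAL_LEAK_FLOW_GPM = 83  # combined flow of the nine leaks found in the pilot

daily_loss_gal = TOTAL_LEAK_FLOW_GPM * 60 * 24   # ~119,500 gallons per day
annual_loss_gal = daily_loss_gal * 365           # ~43.6 million gallons per year

print(f"Daily loss: {daily_loss_gal:,.0f} gallons")
print(f"Annual loss: {annual_loss_gal / 1e6:.1f} million gallons")
print([classify_leak(gpm) for gpm in (3, 7, 12)])  # placeholder flows, one per band
```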

“Repairing small leaks that are three-to-five gallons per minute may not sound exciting, but having the ability to find and fix leaks before they create more damage is a much more proactive and less costly approach,” said Lamont Kiser, director of Bartow County Water. “Our proactive approach is working for Bartow County and our citizens.”

Water scarcity is no longer a distant threat; it’s a present-day reality demanding action. With CivilSense, municipalities of all sizes can adopt smarter water management practices to protect their most vital resource. As the challenges grow, so do the opportunities to innovate and secure a sustainable water future for generations to come.

Chris Cummings is a Smart Water Consultant, Digital Water Market, at Oldcastle Infrastructure.

He specializes in software solutions and go-to-market strategy and is dedicated to advancing sustainable water management technologies.

Top image caption: CivilSense detects and prioritizes leaks for better resource allocation.


You may also like…

‘Poisoning the Well’ Authors Sharon Udasin and Rachel Frazin on PFAS Contamination and Why It ‘Has Not Received the Attention It Deserves’

In the introduction to Sharon Udasin and Rachel Frazin’s new book, Poisoning The Well: How Forever Chemicals Contaminated America, the authors cite an alarming statistic from 2015 that PFAS (per- and polyfluoroalkyl substances) are present in the bodies of an estimated 97% of Americans. How did we ever get to this point? Their book is […]
The post ‘Poisoning the Well’ Authors Sharon Udasin and Rachel Frazin on PFAS Contamination and Why It ‘Has Not Received the Attention It Deserves’ appeared first on EcoWatch.

Turning down the heat: how innovative cooling techniques are tackling the rising costs of AI's energy demands

As enterprises accelerate their AI investments, the energy demand of AI’s power-hungry systems is worrying both the organisations footing the power bills and those tasked with supplying reliable electricity. From large language models to digital twins crunching massive datasets to run accurate simulations of complex city systems, AI workloads require a tremendous amount of processing power.

Of course, at the heart of this demand are data centres, which are evolving at breakneck speed to support AI’s growing potential. The International Energy Agency’s AI and Energy Special Report recently predicted that data centre electricity consumption will double by 2030, identifying AI as the most significant driver of this increase [1].

The IT leaders examining these staggering predictions are rightly zeroing in on improving the efficiency of these powerful systems. However, the scarcity of expertise in navigating such intricate systems, combined with the pace of new developments, makes the task daunting. Although savvy organisations are baking efficiency considerations into IT projects at the outset, and are looking across the entire AI life cycle for opportunities to minimise impact, many don’t know where to start or are leaving efficiency gains on the table. Most are underutilising the multiple IT efficiency levers that could be pulled to reduce the environmental footprint of their IT, such as using energy-efficient software languages and optimising data use to ensure maximum data efficiency of AI workloads.

Among the infrastructure innovations, one of the most exciting advancements in data centres is direct liquid cooling (DLC). Because the systems running AI workloads produce more heat, traditional air cooling is simply not enough to keep up with the demands of the superchips in the latest systems.

DLC technology pumps liquid coolants through tubes in direct contact with the processors to dissipate heat, and has been proven to keep high-powered AI systems running safely. Switching to DLC has had a measurable and transformative impact across multiple environments, reducing cooling power consumption by nearly 90% compared to air cooling in supercomputing systems [2].

Thankfully, the benefits of DLC are now extending beyond supercomputers to a broader range of higher-performance servers that support both supercomputing and AI workloads. Shifting DLC from a niche offering to a mainstream option available across more compute systems is enabling more organisations to tap into the efficiency gains it makes possible; in some cases, DLC has been shown to deliver up to 65% in annual power savings [3]. Combining this kind of cooling innovation with new and improved power-use monitoring tools, able to report highly accurate and timely insights, is becoming critical for IT teams wanting to optimise their energy use. All this is a welcome evolution for organisations grappling with rising energy costs and carefully considering the total cost of ownership (TCO) of their IT systems, and it is an area of innovation to watch in the coming years.
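To see why those percentages matter for TCO, consider a hypothetical back-of-the-envelope model in Python. The IT load, air-cooling overhead and electricity price below are assumptions made purely for illustration; only the roughly 90% cooling-power reduction comes from the supercomputing figure cited above.

```python
# Hypothetical DLC savings estimate; all inputs except DLC_REDUCTION are assumptions.
IT_LOAD_KW = 500             # assumed IT load of a small AI cluster
AIR_COOLING_OVERHEAD = 0.40  # assumed: air cooling draws ~40% of the IT load
DLC_REDUCTION = 0.90         # ~90% lower cooling power vs air (cited above for supercomputing)
PRICE_PER_KWH = 0.25         # assumed electricity price

HOURS_PER_YEAR = 24 * 365
air_cooling_kwh = IT_LOAD_KW * AIR_COOLING_OVERHEAD * HOURS_PER_YEAR
dlc_cooling_kwh = air_cooling_kwh * (1 - DLC_REDUCTION)
saved_kwh = air_cooling_kwh - dlc_cooling_kwh

print(f"Cooling energy saved per year: {saved_kwh:,.0f} kWh")
print(f"Approximate cost saved per year: ${saved_kwh * PRICE_PER_KWH:,.0f}")
```

Under these assumptions the annual cooling load drops from roughly 1.75 GWh to about 0.18 GWh; the point is not the exact figure but how quickly the cited percentages translate into energy and cost at scale.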

In Australia, this kind of technical innovation is especially timely. In March 2024, the Australian Senate established the Select Committee on Adopting Artificial Intelligence to examine the opportunities and impacts of AI technologies [4]. Among its findings and expert submissions was a clear concern about the energy intensity of AI infrastructure. The committee recommended that the Australian Government legislate for greater regulatory clarity, stronger energy efficiency standards and increased investment in renewable energy solutions. For AI sustainability to succeed, it must be driven by policy that sets actionable standards, which in turn fuel innovative solutions.

Infrastructure solutions like DLC will play a critical role in making this possible, not just in reducing emissions and addressing the energy consumption challenge, but also in supporting the long-term viability of AI development across sectors. We’re already seeing this approach succeed in the real world. For example, the Pawsey Supercomputing Centre in Western Australia has adopted DLC technology to support its demanding research workloads and, in doing so, has significantly reduced energy consumption while maintaining the high performance required for AI and scientific computing. It’s a powerful example of how AI data centres can scale sustainably, and it offers an actionable blueprint for others to follow.

Furthermore, industry leaders are rethinking how they handle the heat generated by these large computing systems in order to drive further efficiency in AI. Successfully reusing heat from data centres will be a vital component of mitigating both overall energy security risks and the efficiency challenges that AI introduces. Data centres are being redesigned to capture by-product heat and use it as a valuable resource rather than dispose of it as waste. Several industries are already benefiting from captured data centre heat, from warming agricultural greenhouses to heating healthcare and residential buildings. This has been successfully implemented in the UK with the Isambard-AI supercomputer and in Finland with the LUMI supercomputer, setting the bar for AI sustainability best practice globally.

The message is clear: as AI becomes a bigger part of digital transformation projects, so too must the consideration given to resource-efficient solutions grow. AI sustainability considerations must be factored into each stage of the AI life cycle, with solutions like DLC playing a part in a multifaceted IT sustainability blueprint.

By working together with governments to set effective and actionable environmental frameworks and benchmarks, we can encourage the growth and evolution of the AI industry, spurring dynamic innovation in solutions and data centre design for the benefit of all.

1. IEA News: “AI is set to drive surging electricity demand from data centres while offering the potential to transform how the energy sector works”
2. HPE: https://www.hpe.com/us/en/newsroom/blog-post/2024/08/liquid-cooling-a-cool-approach-for-ai.html
3. HPE: “HPE introduces next-generation ProLiant servers engineered for advanced security, AI automation and greater performance”
4. Parliament of Australia: https://www.aph.gov.au/Parliamentary_Business/Committees/Senate/Adopting_Artificial_Intelligence_AI

Image credit: iStock.com/Dragon Claws

The Rise of Chemical Recycling: What Recyclers Should Know

During WWII, plastic appeared as a “material with 1,000 uses.” Fast forward to today, when global production of plastic has surpassed 359 million tons. While plastic has been helpful in many areas, it’s also created problems within the environment. Microscopic particles of plastic are in the soil, air, and water. They’re in animals, fish, and […]
The post The Rise of Chemical Recycling: What Recyclers Should Know appeared first on RecycleNation.
