
The US is spending billions to reduce forest fire risks


17 Sep, 2023

This post was originally published on Sustainability Times


Photo: Pixabay/DaveMeier

The U.S. government is investing over US$7 billion in the coming years to try to manage the nation’s escalating wildfire crisis. That includes a commitment to treat at least 60 million acres in the next 10 years by expanding forest-thinning efforts and controlled burns.

While that sounds like a lot – 60 million acres is about the size of Wyoming – it’s nowhere close to enough to treat every acre that needs it.

So, where can taxpayers get the biggest bang for the buck?

I’m a fire ecologist in Montana. In a new study, my colleagues and I mapped out where forest treatments can do the most to simultaneously protect communities – by preventing wildfires from turning into disasters – and also protect the forests and the climate we rely on, by keeping carbon out of the atmosphere and stored in healthy soils and trees.

Wildfires are becoming more severe

Forests and fires have always been intertwined in the West. Fires in dry conifer forests like ponderosa pine historically occurred frequently, clearing out brush and small trees in the understory. As a result, fires had less fuel and tended to stay on the ground, doing less damage to the larger, older trees.

That changed after European colonization of North America ushered in a legacy of fire suppression that wouldn’t be questioned until the 1960s. In the absence of fire, dry conifer forests accumulated excess fuel that now allows wildfires to climb into the canopy.

In addition to excess fuels, all forest types are experiencing hotter and drier wildfire seasons due to climate change. And the expanding number of people living in and near forests, along with their roads and power lines, increases the risk of wildfire ignitions. Taken together, it's not surprising that more area is burning at high severity in the West.

In response, the U.S. is facing increasing pressure to protect communities from high-severity wildfire, while also reducing the country’s impact on climate change – including from carbon released by wildfires.

High-risk areas that meet both goals

To find the locations with greatest potential payoff for forest treatments, we started by identifying areas where forest carbon is more likely to be lost to wildfires compared to other locations.

Photo: Pixabay/Schwoaze

In each area, we considered the likelihood of wildfire and calculated how much forest carbon might be lost through smoke emissions and decomposition. Additionally, we evaluated whether the conditions in burned areas would be too stressful for trees to regenerate over time. When forests regrow, they absorb carbon dioxide from the atmosphere and lock it away in their wood, eventually making up for the carbon lost in the fire.
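
As a rough illustration of that expected-loss arithmetic, the sketch below multiplies an area's wildfire likelihood by the carbon it stands to lose. The function, the input values and the simple two-fraction loss model are hypothetical assumptions for illustration, not the study's actual methods.

```python
# Illustrative sketch of an expected carbon-loss calculation for one area.
# All numbers and the two-fraction loss model are hypothetical assumptions;
# the study's actual methods are more sophisticated.

def expected_carbon_loss(p_fire, carbon_stock_t, frac_combusted, frac_decomposed):
    """Expected carbon loss (tonnes) = fire probability x carbon lost if burned.

    p_fire          -- annual probability of wildfire in the area
    carbon_stock_t  -- forest carbon stored in the area, in tonnes
    frac_combusted  -- fraction emitted immediately as smoke
    frac_decomposed -- fraction lost later as fire-killed trees decompose
    """
    loss_if_burned = carbon_stock_t * (frac_combusted + frac_decomposed)
    return p_fire * loss_if_burned

# Hypothetical area: 50,000 t of stored carbon, 2% annual fire probability,
# 15% combusted in the fire and 35% lost afterward to decomposition.
loss = expected_carbon_loss(p_fire=0.02, carbon_stock_t=50_000,
                            frac_combusted=0.15, frac_decomposed=0.35)
print(f"Expected annual carbon loss: {loss:.0f} t")  # -> 500 t

# Areas where hot, dry conditions make post-fire regeneration unlikely
# would keep that carbon out of the forest for much longer.
```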

In particular, we found that forests in California, New Mexico and Arizona were more likely to lose a large portion of their carbon in a wildfire and also have a tough time regenerating because of stressful conditions.

When we compared those areas to previously published maps detailing high wildfire risk to communities, we found several hot spots for simultaneously reducing wildfire risk to communities and stabilizing stored carbon.

Forests surrounding Flagstaff, Arizona; Placerville, California; Colorado Springs, Colorado; Hamilton, Montana; Taos, New Mexico; Medford, Oregon; and Wenatchee, Washington, are among the locations with good opportunities to achieve both goals.

Why treating forests is good for carbon, too

Forest thinning is like weeding a garden: It removes brush and small trees in dry conifer forests to leave behind space for the larger, older trees to continue growing.

Repeatedly applying controlled burns maintains that openness and reduces fuels in the understory. Consequently, when a wildfire occurs in a thinned and burned area, flames are more likely to remain on the ground and out of the canopy.

Although forest thinning and controlled burning remove carbon in the short term, living trees are more likely to survive a subsequent wildfire. In the long term, that’s a good outcome for carbon and climate. Living trees continue to absorb and store carbon from the atmosphere, as well as provide critical seeds and shade for seedlings to regenerate, grow and recover the carbon lost to fires.

Of course, forest thinning and controlled burning are not a silver bullet. Advice and recommended materials from the National Fire Protection Association's Firewise program can help people make their properties less vulnerable to wildfires.

Allowing wildfires to burn under safe conditions can reduce future wildfire severity. And the world needs to rapidly transition away from fossil fuels to curb climate change impacts that increase the risk of wildfires becoming community disasters.

This article was written by Jamie Peeler, a postdoctoral research fellow at the University of Montana. It is republished from The Conversation under a Creative Commons license. Read the original article.



You may also like…

‘Poisoning the Well’ Authors Sharon Udasin and Rachel Frazin on PFAS Contamination and Why It ‘Has Not Received the Attention It Deserves’


In the introduction to Sharon Udasin and Rachel Frazin’s new book, Poisoning the Well: How Forever Chemicals Contaminated America, the authors cite an alarming statistic from 2015 that PFAS (per- and polyfluoroalkyl substances) are present in the bodies of an estimated 97% of Americans. How did we ever get to this point? Their book is […]

The Rise of Chemical Recycling: What Recyclers Should Know


During WWII, plastic appeared as a “material with 1,000 uses.” Fast forward to today, when global production of plastic has surpassed 359 million tons. While plastic has been helpful in many areas, it’s also created problems within the environment. Microscopic particles of plastic are in the soil, air, and water. They’re in animals, fish, and […]

Turning down the heat: how innovative cooling techniques are tackling the rising costs of AI's energy demands


As enterprises accelerate their AI investments, the energy demand of AI’s power-hungry systems is worrying both the organisations footing the power bills and those tasked with supplying reliable electricity. From large language models to digital twins crunching massive datasets to run accurate simulations of complex city systems, AI workloads require a tremendous amount of processing power.

Of course, at the heart of this demand are data centres, which are evolving at breakneck speed to support AI’s growing potential. The International Energy Agency’s AI and Energy Special Report recently predicted that data centre electricity consumption will double by 2030, identifying AI as the most significant driver of this increase [1].

The IT leaders examining these staggering predictions are rightly zeroing in on improving the efficiency of these powerful systems. However, the lack of expertise in navigating these intricate systems, combined with the rapid pace of innovation, is causing heads to spin. Although savvy organisations are baking efficiency considerations into IT projects at the outset, and are looking across the entire AI life cycle for opportunities to minimise impact, many don’t know where to start or are leaving efficiency gains on the table. Most are underutilising the multiple IT efficiency levers that could be pulled to reduce the environmental footprint of their IT, such as using energy-efficient software languages and optimising data use to maximise the data efficiency of AI workloads.

Among the infrastructure innovations, one of the most exciting advancements in data centres is direct liquid cooling (DLC). Because the systems running AI workloads produce more heat, traditional air cooling simply cannot keep up with the demands of the superchips in the latest systems.

DLC technology pumps liquid coolants through tubes in direct contact with the processors to dissipate heat, and it has been proven to keep high-powered AI systems running safely. Switching to DLC has had a measurable and transformative impact across multiple environments, showing reductions in cooling power consumption of nearly 90% compared to air cooling in supercomputing systems [2].

Thankfully, the benefits of DLC are now extending beyond supercomputers to a broader range of higher-performance servers that support both supercomputing and AI workloads. Shifting DLC from a niche offering to a mainstream option across more compute systems lets more organisations tap into these efficiency gains, which in some cases have been shown to deliver up to 65% in annual power savings [3]. Combining this kind of cooling innovation with new and improved power-use monitoring tools, able to report highly accurate and timely insights, is becoming critical for IT teams wanting to optimise their energy use. All of this is a welcome evolution for organisations grappling with rising energy costs and carefully weighing the total cost of ownership (TCO) of their IT systems, and it is an area of innovation to watch in the coming years.
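
To put those percentages in perspective, here is a minimal back-of-the-envelope sketch. The 1 MW IT load, the fixed air-cooling overhead and the electricity tariff are illustrative assumptions; only the roughly 90% cooling-power reduction figure comes from the article above.

```python
# Back-of-the-envelope estimate of DLC cooling savings.
# The IT load, cooling overhead fraction and tariff are hypothetical;
# only the ~90% reduction figure is cited in the article.

HOURS_PER_YEAR = 8760

it_load_kw = 1_000            # hypothetical IT load of a small AI cluster
air_cooling_fraction = 0.40   # assumed: air cooling draws 40% of IT load
dlc_reduction = 0.90          # cited ~90% cut in cooling power vs. air

air_cooling_kw = it_load_kw * air_cooling_fraction
dlc_cooling_kw = air_cooling_kw * (1 - dlc_reduction)

saved_kwh = (air_cooling_kw - dlc_cooling_kw) * HOURS_PER_YEAR
price_per_kwh = 0.30          # assumed tariff in AUD/kWh

print(f"Cooling power: {air_cooling_kw:.0f} kW (air) vs {dlc_cooling_kw:.0f} kW (DLC)")
print(f"Annual savings: {saved_kwh:,.0f} kWh, about ${saved_kwh * price_per_kwh:,.0f}")
# -> roughly 3.15 million kWh and $946,080 per year under these assumptions
```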

In Australia, this kind of technical innovation is especially timely. In March 2024, the Australian Senate established the Select Committee on Adopting Artificial Intelligence to examine the opportunities and impacts of AI technologies [4]. Among its findings and expert submissions was a clear concern about the energy intensity of AI infrastructure. The committee recommended that the Australian Government legislate for increased regulatory clarity, greater energy efficiency standards and increased investment in renewable energy. For AI sustainability to succeed, it must be driven by policy that sets actionable standards, which in turn fuel innovative solutions.

Infrastructure solutions like DLC will play a critical role in making this possible, not just by reducing emissions and addressing the energy consumption challenge, but also by supporting the long-term viability of AI development across sectors. We’re already seeing this approach succeed in the real world. For example, the Pawsey Supercomputing Centre in Western Australia has adopted DLC technology to support its demanding research workloads and, in doing so, has significantly reduced energy consumption while maintaining the high performance required for AI and scientific computing. It’s a powerful example of how AI data centres can scale sustainably, and it offers an actionable blueprint for others to follow.

Furthermore, industry leaders are rethinking how they handle the heat generated by these large computing systems in order to drive further efficiency in AI. Putting data centre heat to productive use will be a vital component of mitigating both overall energy security risks and the efficiency challenges that AI introduces. Data centres are being redesigned to capture by-product heat and treat it as a valuable resource rather than waste. Several industries are already benefiting from captured data centre heat, from warming greenhouses in agriculture to heating buildings in healthcare and residential facilities. This has been successfully implemented in the UK with the Isambard-AI supercomputer and in Finland with the LUMI supercomputer, setting the bar for AI sustainability best practice globally.

The message is clear: as AI becomes a bigger part of digital transformation projects, so too must the consideration given to resource-efficient solutions. AI sustainability must be factored into each stage of the AI life cycle, with solutions like DLC playing a part in a multifaceted IT sustainability blueprint.

By working together with governments to set effective and actionable environmental frameworks and benchmarks, we can encourage the growth and evolution of the AI industry, spurring dynamic innovation in solutions and data centre design for the benefit of all.

1. IEA, “AI is set to drive surging electricity demand from data centres while offering the potential to transform how the energy sector works”, IEA News.
2. HPE newsroom blog post: https://www.hpe.com/us/en/newsroom/blog-post/2024/08/liquid-cooling-a-cool-approach-for-ai.html
3. HPE, “HPE introduces next-generation ProLiant servers engineered for advanced security, AI automation and greater performance”.
4. Parliament of Australia, Select Committee on Adopting Artificial Intelligence (AI): https://www.aph.gov.au/Parliamentary_Business/Committees/Senate/Adopting_Artificial_Intelligence_AI

Image credit: iStock.com/Dragon Claws
