
An Arbor Day Question – Are there too many trees or not enough?


28 Apr, 2023

This post was originally published on Healthy Forest

By Kyle Johnson

Editor’s note: Kyle Johnson is a forester with the Bureau of Land Management’s Missoula Field Office. Mr. Johnson is not affiliated with Healthy Forests, Healthy Communities but gave us permission to share his Arbor Day message.

For Arbor Day (this Saturday, April 29th) I thought we’d tackle a question that’s been vexing me for a while: How can it be that we often hear forests are imperiled, while also hearing that our forests are overcrowded? Are there too many trees or not enough? This question bothers me because the lay person and the public at large may be confused by these seemingly contradictory messages, so let’s take a deeper look.

As mentioned in previous posts, here in the USA we are fortunate to have some of the most robust and stringent environmental laws and regulations in the world. Our collective awareness of caring for the environment is remarkable, and it is also a benefit of being a developed, first-world country. Many nations in the world are still struggling to get by, and taking full advantage of their natural resources is one of the few avenues they have for profit, let alone survival. We did the same here in the US not that long ago, so I don’t write this from a place of judgment, but of understanding. That said, there are indeed places, especially the jungle regions, where deforestation and exploitation of natural resources are advancing at such a rate that these forests may be forever lost if we as a global community don’t change our course. And I’ll add that we in the US contribute to these seemingly faraway problems by buying from those regions, helping to fund and incentivize the destruction. Closer to home, habitat loss is a real problem in our nation as well, typically due to development and urban sprawl. Our cultural pursuit of “a place of our own” often comes at the cost of the wild critters who once lived there. So it’s true: globally, deforestation and loss of habitat are real problems facing our forests.

Conversely, here at home our public forests are, by and large, overcrowded and suffering an epidemic of too many trees. This is due in large part to the fire suppression that became national policy around 1915, and to past harvesting practices that replaced stands of large, widely spaced trees with densely stocked regenerated stands of small trees. It sounds counterintuitive, perhaps, to suggest that cutting more trees is the cure for having cut trees in the past, but that is in fact often the case. Through selective harvesting, thinning and returning fire to the landscape, we aim to produce healthier trees and, by extension, healthier forests that may one day again resemble their natural condition and will be resilient to the challenges of a changing climate. While we have too many trees in these forests today, that could all change in the future if they succumb to the insects, disease and megafires that have become more and more common in recent years.

It’s a tough nut to crack, perhaps, and I appreciate your patience as I try to unravel it with you. The truth is that there are too many trees and not enough at the same time: at the global scale not enough, and we have to look a little closer to home to see the overcrowding. With that said, if you are inspired to plant a tree in your yard for Arbor Day, please do! “Blessed Are Those Who Plant Trees Under Whose Shade They Will Never Sit” – Unknown.


Source: Healthy Forest

You may also like…

‘Poisoning the Well’ Authors Sharon Udasin and Rachel Frazin on PFAS Contamination and Why It ‘Has Not Received the Attention It Deserves’


In the introduction to Sharon Udasin and Rachel Frazin’s new book, Poisoning The Well: How Forever Chemicals Contaminated America, the authors cite an alarming statistic from 2015 that PFAS (per- and polyfluoroalkyl substances) are present in the bodies of an estimated 97% of Americans. How did we ever get to this point? Their book is […]
The post ‘Poisoning the Well’ Authors Sharon Udasin and Rachel Frazin on PFAS Contamination and Why It ‘Has Not Received the Attention It Deserves’ appeared first on EcoWatch.

Turning down the heat: how innovative cooling techniques are tackling the rising costs of AI's energy demands


As enterprises accelerate their AI investments, the energy demand of AI’s power-hungry systems is worrying both the organisations footing the power bills as well as those tasked with supplying reliable electricity. From large language models to digital twins crunching massive datasets to run accurate simulations on complex city systems, AI workloads require a tremendous amount of processing power.

Of course, at the heart of this demand are data centres, which are evolving at breakneck speed to support AI’s growing potential. The International Energy Agency’s AI and Energy Special Report recently predicted that data centre electricity consumption will double by 2030, identifying AI as the most significant driver of this increase.1

The IT leaders examining these staggering predictions are rightly zeroing in on improving the efficiency of these powerful systems. However, the lack of expertise in navigating these intricate systems, combined with the rapidity of innovative developments, is causing heads to spin. Although savvy organisations are baking efficiency considerations into IT projects at the outset, and are looking across the entire AI life cycle for opportunities to minimise impact, many don’t know where to start or are leaving efficiency gains on the table. Most are underutilising the multiple IT efficiency levers that could be pulled to reduce the environmental footprint of their IT, such as using energy-efficient software languages and optimising data use to ensure maximum data efficiency of AI workloads.

Among the infrastructure innovations, one of the most exciting advancements we are seeing in data centres is direct liquid cooling (DLC). Because the systems that are running AI workloads are producing more heat, traditional air cooling simply is not enough to keep up with the demands of the superchips in the latest systems.

DLC technology pumps liquid coolant through tubes in direct contact with the processors to dissipate heat, and it has been proven to keep high-powered AI systems running safely. Switching to DLC has had a measurable and transformative impact across multiple environments, showing reductions in cooling power consumption of nearly 90% compared to air cooling in supercomputing systems.2

Thankfully, the benefits of DLC are now also extending beyond supercomputers to a broader range of higher-performance servers that support both supercomputing and AI workloads. Shifting DLC from a niche offering to a more mainstream option available across more compute systems is enabling more organisations to tap into its efficiency gains, which in some cases have been shown to deliver up to 65% in annual power savings.3 Combining this kind of cooling innovation with new and improved power-use monitoring tools, able to report highly accurate and timely insights, is becoming critical for IT teams wanting to optimise their energy use. All this is a welcome evolution for organisations grappling with rising energy costs and carefully considering the total cost of ownership (TCO) of their IT systems, and it is an area of innovation to watch in the coming years.
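To make these percentages concrete, here is a rough back-of-envelope sketch in Python of what a cooling-power reduction of nearly 90% could mean over a year. The assumed cooling share of facility power and the 1 MW example workload are illustrative assumptions for the sake of the sketch, not figures reported in this article.

```python
# Hypothetical estimate of annual energy saved by moving from air cooling
# to direct liquid cooling (DLC). The 90% figure is the cooling-power
# reduction quoted in the article; the cooling share is an assumption.

AIR_COOLING_SHARE = 0.35      # assumed fraction of total facility power spent on air cooling
DLC_COOLING_REDUCTION = 0.90  # cooling-power reduction vs air cooling (article figure)

def annual_savings_kwh(it_load_kw: float, hours: float = 8760) -> float:
    """Estimate annual kWh saved if DLC replaces air cooling for a given IT load."""
    facility_kw = it_load_kw / (1 - AIR_COOLING_SHARE)  # total draw incl. cooling
    cooling_kw = facility_kw - it_load_kw               # portion spent on cooling
    saved_kw = cooling_kw * DLC_COOLING_REDUCTION       # DLC removes ~90% of it
    return saved_kw * hours

print(round(annual_savings_kwh(1000)))  # ~4.2 GWh/year for a 1 MW IT load
```

Under these assumptions the savings scale linearly with IT load, which is why the gains become significant at data-centre scale even before the up-to-65% annual power savings quoted for newer DLC-equipped servers are considered.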

In Australia, this kind of technical innovation is especially timely. In March 2024, the Australian Senate established the Select Committee on Adopting Artificial Intelligence to examine the opportunities and impacts of AI technologies.4 Among its findings and expert submissions was a clear concern about the energy intensity of AI infrastructure. The committee recommended that the Australian Government legislate for increased regulatory clarity, greater energy efficiency standards, and increased investment in renewable energy solutions. For AI sustainability to succeed, it must be driven by policy that sets actionable standards, which then fuel innovative solutions.

Infrastructure solutions like DLC will play a critical role in making this possible — not just in reducing emissions and addressing the energy consumption challenge, but also in supporting the long-term viability of AI development across sectors. We’re already seeing this approach succeed in the real world. For example, the Pawsey Supercomputing Centre in Western Australia has adopted DLC technology to support its demanding research workloads and, in doing so, has significantly reduced energy consumption while maintaining the high performance required for AI and scientific computing. It’s a powerful example of how AI data centres can scale sustainably — and it offers an actionable blueprint for others to follow.

Furthermore, industry leaders are shifting how they handle the heat generated by these large computing systems in order to drive further efficiency in AI. Successfully using heat from data centres for other uses will be a vital component to mitigating both overall energy security risks and the efficiency challenges that AI introduces. Data centres are being redesigned to capture by-product heat and use it as a valuable resource, rather than dispose of it as waste heat. Several industries are already benefiting from capturing data centre heat, such as in agriculture for greenhouses, or heating buildings in healthcare and residential facilities. This has been successfully implemented in the UK with the Isambard-AI supercomputer and in Finland with the LUMI supercomputer — setting the bar for AI sustainability best practice globally.

The message is clear: as AI becomes a bigger part of digital transformation projects, so too must consideration for resource-efficient solutions grow. AI sustainability considerations must be factored into each stage of the AI life cycle, with solutions like DLC playing a part in a multifaceted IT sustainability blueprint.

By working together with governments to set effective and actionable environmental frameworks and benchmarks, we can encourage the growth and evolution of the AI industry, spurring dynamic innovation in solutions and data centre design for the benefit of all.

1. AI is set to drive surging electricity demand from data centres while offering the potential to transform how the energy sector works – News – IEA
2. https://www.hpe.com/us/en/newsroom/blog-post/2024/08/liquid-cooling-a-cool-approach-for-ai.html
3. HPE introduces next-generation ProLiant servers engineered for advanced security, AI automation and greater performance
4. https://www.aph.gov.au/Parliamentary_Business/Committees/Senate/Adopting_Artificial_Intelligence_AI

Image credit: iStock.com/Dragon Claws
