Access, not mobility

13 Jul, 2020

This post was originally published on 15 Minutes City

It’s not about how fast you can go

Charlottesville, Virginia | UrbanizeHub

“Location, location, location.” It’s a real-estate mantra for very good reason. WHERE you are in a city is much more important than how fast you can move. Being at or close to the things you need is the first rule of real estate. And the second rule. And the third!

Mobility?

For most of the last century, transportation and urban planning discussions have focused not on location, but on speed. The planning keyword has been mobility: helping people get from point A to point B as quickly as possible. Live anywhere you want, and planners will build ways to quickly move you to where you need to be.

Here’s the fundamental flaw in mobility-focused planning: it is based on the fantastical idea that points A and B are forever fixed. But cities are not fixed. Cities evolve, people’s lives change, and travel patterns adjust. As travel speeds increase, people take jobs further away. Grocery stores relocate. A family’s circle of daily errands expands. Points A and B (not to mention C, D, and E) get ever farther apart. And as they get farther apart, “good” mobility (high speed) starts to look like this:

Singapore | Singapore Land Transport Authority

In the United States, the focus on mobility has funneled billions of dollars of investment into transportation infrastructure designed to move people farther and farther at higher and higher speeds. Predictably, rather than making things easier to get to, this investment has mostly pushed destinations out over an ever wider area.

In evaluating proposals, U.S. transportation planners focus on “level of service” (LOS) traffic flow standards; in most places, this method is mandated by environmental review laws. LOS measures the amount of delay experienced by motor vehicles and therefore only grades streets based on their ability to process motor vehicles. LOS evaluation completely ignores those who are not in a motor vehicle or whose mobility is impaired. It does not measure the economic value of a street, nor does it bring destinations closer.
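To see how narrow that lens is, consider what an LOS calculation actually looks at: average delay per motor vehicle, mapped to a letter grade. The sketch below is illustrative only; it uses the commonly cited Highway Capacity Manual delay thresholds for signalized intersections, and the function name and example values are hypothetical.

```python
# Illustrative sketch only: LOS grades a street purely by average control
# delay per motor vehicle (seconds), using commonly cited Highway Capacity
# Manual thresholds for signalized intersections. Function name and example
# values are hypothetical.

def los_grade(avg_delay_seconds: float) -> str:
    """Map average motor-vehicle delay (seconds) to an LOS letter grade."""
    thresholds = [(10, "A"), (20, "B"), (35, "C"), (55, "D"), (80, "E")]
    for limit, grade in thresholds:
        if avg_delay_seconds <= limit:
            return grade
    return "F"

# A street where drivers wait 12 seconds scores a "B" -- regardless of
# whether anyone can cross it on foot, or whether it leads anywhere useful.
print(los_grade(12))  # B
print(los_grade(95))  # F
```

Nothing in that grade says anything about people walking, cycling, or riding transit, or about what the street gives access to.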

No, access

Mobility – speed – is merely a means to an end. The purpose of mobility is to get somewhere, to points B, C, D, and E, wherever they may be. It’s the “getting somewhere” — the access to services and jobs — that matters [1]. Strong Towns contributor Daniel Herriges defines the distinction like this: “Mobility is how far you can go in a given amount of time. Accessibility is how much you can get to in that time.” Good access comes from having a diversity of services intermingled within your own neighborhood, so you don’t have to go all the way across town — or outside of town — to get to what you need. Here’s what good access looks like:

Maastricht, The Netherlands | Gali Freund
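To make Herriges' distinction concrete, here is a toy calculation: mobility asks how far you can travel in a fixed time budget, while accessibility counts how many destinations you can actually reach within it. The layouts, speeds, and travel times below are hypothetical, invented purely to illustrate the difference.

```python
# Toy comparison of mobility vs. accessibility (all numbers hypothetical).
# Mobility: how far you can go in a fixed time budget.
# Accessibility: how many needed destinations you can reach in that budget.

TIME_BUDGET_MIN = 15

def mobility_km(speed_kmh: float, minutes: float = TIME_BUDGET_MIN) -> float:
    """Distance coverable within the time budget at a given speed."""
    return speed_kmh * minutes / 60

def accessibility(travel_times_min: dict[str, float],
                  minutes: float = TIME_BUDGET_MIN) -> int:
    """Count of destinations reachable within the time budget."""
    return sum(1 for t in travel_times_min.values() if t <= minutes)

# Sprawling layout: fast roads, but everything is far apart.
sprawl = {"grocery": 20, "school": 25, "clinic": 30, "park": 18, "job": 35}
# Compact layout: slower travel, but services intermingled nearby.
compact = {"grocery": 5, "school": 8, "clinic": 12, "park": 4, "job": 14}

print(mobility_km(60), accessibility(sprawl))   # 15.0 km covered, 0 destinations
print(mobility_km(15), accessibility(compact))  # 3.75 km covered, 5 destinations
```

The point of the toy numbers: high speed can still leave you with nothing reachable, while a slower but compact neighborhood can put everything within the budget.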

Fortunately, the policy landscape is slowly beginning to acknowledge this. There is growing awareness among decision makers as well as advocates that access, not mobility, should guide planning. Signs of hope include the California Environmental Quality Act, which was updated in 2019 to shift the focus from LOS to “Vehicle Miles Traveled” (VMT). This still does not measure access or economic value, but it’s a step in the right direction. At the national level, the new Future of Transportation Caucus, a group of 12 Democrats in the U.S. House of Representatives, has introduced legislation to refocus transportation planning on improving access to services and jobs, and to create service standards for planners reviewing and implementing transportation projects. Organizations such as Transportation for America have made connecting people to jobs and services a key plank of their advocacy platform.

Most promisingly, the full U.S. House of Representatives recently passed an overarching spending bill called the Moving Forward Act. The transportation section of this bill requires recipients of federal transportation funding to measure how well their system connects people to the things they need, regardless of how they travel. (Here is Transportation for America’s analysis of this bill. Smart Growth America lauds the bill for remaking transportation policy in a more equitable way.) The bill is considered “dead on arrival” in the U.S. Senate, but even if this particular bill doesn’t become law, the House’s passage of it confirms that access is on its way to becoming the new yardstick for measuring transportation investments. The Moving Forward Act will die — at least in its current form — but access is here to stay.

Access and the 15-minute city

This policy evolution is great news for the development of 15-minute cities. A focus on access puts people’s needs — not their speed of travel — at the forefront of planning decisions. This makes for far more equitable planning, and over time it is likely to make transportation investments less costly, since infrastructure for pedestrians and cyclists is much cheaper to provide. As key services and jobs become closer and more accessible, the overall demand for travel will decline. And as urban land is used in a more efficient and compact way, municipal budgets will improve as tax receipts per acre go up [2].

That, ultimately, is mobility done right.


[1] Ideally the journey itself should be pleasurable, and at a minimum must feel safe. These both need much more consideration and are topics for future blog posts.

[2] The fiscal impacts of urban land use will also be the subject of deeper attention in a future blog post, but Joseph Minicozzi of Urban3 has written a great primer on this topic, and Charles Marohn’s Strong Towns book focuses on this.


You may also like…

‘Poisoning the Well’ Authors Sharon Udasin and Rachel Frazin on PFAS Contamination and Why It ‘Has Not Received the Attention It Deserves’

In the introduction to Sharon Udasin and Rachel Frazin’s new book, Poisoning The Well: How Forever Chemicals Contaminated America, the authors cite an alarming statistic from 2015 that PFAS (per- and polyfluoroalkyl substances) are present in the bodies of an estimated 97% of Americans. How did we ever get to this point? Their book is […]

Turning down the heat: how innovative cooling techniques are tackling the rising costs of AI's energy demands

As enterprises accelerate their AI investments, the energy demand of AI’s power-hungry systems is worrying both the organisations footing the power bills and those tasked with supplying reliable electricity. From large language models to digital twins crunching massive datasets to run accurate simulations of complex city systems, AI workloads require a tremendous amount of processing power.

Of course, at the heart of this demand are data centres, which are evolving at breakneck speed to support AI’s growing potential. The International Energy Agency’s AI and Energy Special Report recently predicted that data centre electricity consumption will double by 2030, identifying AI as the most significant driver of this increase [1].

The IT leaders examining these staggering predictions are rightly zeroing in on improving the efficiency of these powerful systems. However, the lack of expertise in navigating these intricate systems, combined with the rapid pace of innovation, is causing heads to spin. Although savvy organisations are baking efficiency considerations into IT projects at the outset, and are looking across the entire AI life cycle for opportunities to minimise impact, many don’t know where to start or are leaving efficiency gains on the table. Most are underutilising the multiple IT efficiency levers that could be pulled to reduce the environmental footprint of their IT, such as using energy-efficient software languages and optimising data use to ensure maximum data efficiency of AI workloads.

Among the infrastructure innovations, one of the most exciting advancements in data centres is direct liquid cooling (DLC). Because the systems running AI workloads produce more heat, traditional air cooling simply cannot keep up with the demands of the superchips in the latest systems.

DLC technology pumps liquid coolants through tubes in direct contact with the processors to dissipate heat and has been proven to keep high-powered AI systems running safely. Switching to DLC has had measurable and transformative impact across multiple environments, showing reductions in cooling power consumption by nearly 90% compared to air cooling in supercomputing systems [2].
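To put a figure like that into context, a back-of-envelope calculation shows how a cut in cooling power flows through to a facility's total draw. The IT load and air-cooling overhead below are assumed values for illustration, not measurements from any particular data centre.

```python
# Back-of-envelope: how a ~90% cut in cooling power flows through to total
# facility power. All input figures are illustrative assumptions.

it_load_mw = 10.0            # assumed IT (compute) load
cooling_share_of_it = 0.40   # assumed air-cooling overhead: 40% of IT load
dlc_reduction = 0.90         # reported ~90% cut in cooling power with DLC

air_cooling_mw = it_load_mw * cooling_share_of_it
dlc_cooling_mw = air_cooling_mw * (1 - dlc_reduction)

total_before = it_load_mw + air_cooling_mw
total_after = it_load_mw + dlc_cooling_mw

print(f"Cooling power:  {air_cooling_mw:.1f} MW -> {dlc_cooling_mw:.1f} MW")
print(f"Facility total: {total_before:.1f} MW -> {total_after:.1f} MW "
      f"({1 - total_after / total_before:.0%} lower)")
```

Under those assumptions, a 90% cut in cooling power trims roughly a quarter off the facility's total draw; the exact share depends entirely on how much overhead the cooling represented in the first place.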

Thankfully, the benefits of DLC are now extending beyond supercomputers to a broader range of higher-performance servers that support both supercomputing and AI workloads. Shifting DLC from a niche offering to a mainstream option available across more compute systems is enabling more organisations to tap into its efficiency gains, which in some cases have been shown to deliver up to 65% in annual power savings [3]. Combining this kind of cooling innovation with new and improved power-use monitoring tools, able to report highly accurate and timely insights, is becoming critical for IT teams wanting to optimise their energy use. All this is a welcome evolution for organisations grappling with rising energy costs and carefully weighing the total cost of ownership (TCO) of their IT systems, and it is an area of innovation to watch in the coming years.
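For teams weighing TCO, a percentage saving only becomes meaningful once it is translated into energy and dollars over a year. A minimal sketch, assuming a hypothetical fleet power draw and electricity tariff:

```python
# Minimal TCO sketch: annual energy and cost saved for a given reduction in
# average server power draw. All inputs are hypothetical assumptions.

avg_draw_kw = 500.0      # assumed average fleet power draw
power_savings = 0.65     # the up-to-65% annual power savings cited above
tariff_per_kwh = 0.25    # assumed electricity price per kWh
hours_per_year = 24 * 365

baseline_kwh = avg_draw_kw * hours_per_year
saved_kwh = baseline_kwh * power_savings

print(f"Baseline energy: {baseline_kwh:,.0f} kWh/year")
print(f"Energy saved:    {saved_kwh:,.0f} kWh/year")
print(f"Cost saved:      {saved_kwh * tariff_per_kwh:,.0f} per year")
```

Swapping in a fleet's real draw and local tariff turns the headline percentage into a concrete line item in the operating budget.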

In Australia, this kind of technical innovation is especially timely. In March 2024, the Australian Senate established the Select Committee on Adopting Artificial Intelligence to examine the opportunities and impacts of AI technologies [4]. Among its findings and expert submissions was a clear concern about the energy intensity of AI infrastructure. The committee recommended that the Australian Government legislate for increased regulatory clarity, greater energy efficiency standards, and increased investment in renewable energy solutions. For AI sustainability to succeed, policy must set actionable standards that in turn fuel innovative solutions.

Infrastructure solutions like DLC will play a critical role in making this possible — not just in reducing emissions and addressing the energy consumption challenge, but also in supporting the long-term viability of AI development across sectors. We’re already seeing this approach succeed in the real world. For example, the Pawsey Supercomputing Centre in Western Australia has adopted DLC technology to support its demanding research workloads and, in doing so, has significantly reduced energy consumption while maintaining the high performance required for AI and scientific computing. It’s a powerful example of how AI data centres can scale sustainably — and telegraphs an actionable blueprint for others to follow.

Furthermore, industry leaders are shifting how they handle the heat generated by these large computing systems in order to drive further efficiency in AI. Successfully reusing heat from data centres will be a vital component of mitigating both overall energy security risks and the efficiency challenges that AI introduces. Data centres are being redesigned to capture by-product heat and treat it as a valuable resource rather than dispose of it as waste. Several industries are already benefiting from captured data centre heat, from warming greenhouses in agriculture to heating buildings in healthcare and residential facilities. This has been successfully implemented in the UK with the Isambard-AI supercomputer and in Finland with the LUMI supercomputer — setting the bar for AI sustainability best practice globally.

The message is clear: as AI becomes a bigger part of digital transformation projects, so too must consideration of resource-efficient solutions grow. AI sustainability considerations must be factored into each stage of the AI life cycle, with solutions like DLC playing a part in a multifaceted IT sustainability blueprint.

By working together with governments to set effective and actionable environmental frameworks and benchmarks, we can encourage the growth and evolution of the AI industry, spurring dynamic innovation in solutions and data centre design for the benefit of all.

1. AI is set to drive surging electricity demand from data centres while offering the potential to transform how the energy sector works – News – IEA
2. https://www.hpe.com/us/en/newsroom/blog-post/2024/08/liquid-cooling-a-cool-approach-for-ai.html
3. HPE introduces next-generation ProLiant servers engineered for advanced security, AI automation and greater performance
4. https://www.aph.gov.au/Parliamentary_Business/Committees/Senate/Adopting_Artificial_Intelligence_AI

Image credit: iStock.com/Dragon Claws
