Understanding AI’s Water Use: Definitions matter

When it comes to debunking nonsense numbers, you can’t do better than the BBC’s More or Less radio programme and podcast.

The team, which includes the economist and writer Tim Harford, regularly takes politicians and companies to task for crimes against statistics and the misrepresentation of data.

So it’s not surprising that the More or Less team recently turned their analytical guns on the misinformation flowing (pun intended) around data center water consumption. (Note: the incidents highlighted in the episode actually date back to late 2025 or early 2026. The issue is obviously still very much alive and relevant).

The episode focused on a book called Empire of AI, which stated that AI demand could drive up consumption of fresh water to between 1.1 trillion and 1.7 trillion gallons (roughly 4.2 to 6.4 trillion litres) per year by 2027. This amount is apparently equivalent to half the water consumed in the UK.

Consumption vs withdrawal

However, as More or Less explained, when it comes to complex topics such as AI and water usage, definitions matter. It seems that Empire of AI conflated water consumption with withdrawal. The 1.1 to 1.7 trillion gallon figure was sourced from a University of California, Riverside study that was actually for water withdrawal rather than water consumption. There is an important difference:

+ Water withdrawal refers to the amount of water taken out of a system or source. Importantly, some of that water will be consumed but some will also be returned. (Measures demand)

+ Water consumption is a subset of withdrawal and refers to water that is taken out of the water system and, importantly, not returned. (Measures loss)
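For readers who like to check the arithmetic, here is a minimal sketch of the two definitions above in code. The gallon-to-litre conversion verifies the headline figures; the `returned_fraction` value in the example is a made-up placeholder purely to illustrate the withdrawal/consumption relationship, not a real data center figure.

```python
# Illustrative sketch of the withdrawal vs consumption distinction,
# using the figures quoted above. Not from the programme itself.

GALLONS_TO_LITRES = 3.785  # US gallons to litres

def to_litres(gallons: float) -> float:
    """Convert US gallons to litres."""
    return gallons * GALLONS_TO_LITRES

# The UC Riverside projection for 2027 was for *withdrawal*
# (water taken from a source), not consumption.
low_withdrawal_gal = 1.1e12   # 1.1 trillion gallons
high_withdrawal_gal = 1.7e12  # 1.7 trillion gallons

print(f"Low:  {to_litres(low_withdrawal_gal) / 1e12:.1f} trillion litres")
print(f"High: {to_litres(high_withdrawal_gal) / 1e12:.1f} trillion litres")

# Consumption is the subset of withdrawal that is NOT returned.
# The returned fraction here is a hypothetical placeholder.
def consumption(withdrawal: float, returned_fraction: float) -> float:
    return withdrawal * (1.0 - returned_fraction)

# Withdraw 100 units, return half, consume half.
example = consumption(100.0, 0.5)
```

Running the conversion shows why the 1.1 to 1.7 trillion gallon range maps to roughly 4.2 to 6.4 trillion litres.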

The discrepancy was highlighted by independent researcher Andy Masley, who published a detailed analysis of the claims and potential discrepancies on his Substack, which is well worth reading for more depth on this topic. The author of Empire of AI acknowledged the mistake and issued a correction, which also contained links to some other important sources on the topic.

However, according to the More or Less team, the story does not end there. The University of California, Riverside paper also has some potential issues, they claim. The water usage numbers in the paper are apparently based on an estimate for global electricity use in 2027. However, it seems the author of those numbers, Alex de Vries-Gao, was only basing his analysis on servers that could be deployed in 2027 and excluded all existing infrastructure.

So now you have an overestimate intertwined with a potential underestimate!

Further analysis by de Vries-Gao, estimating total AI server capacity and eventual water consumption, postulated that AI systems at the end of 2025 were consuming water at a rate of up to 750 billion litres, according to More or Less.

Is that a big number?

As ever, the programme likes to ask the question: “Is that a big number?”

The answer seems to be yes. In fact, it could exceed the total global consumption of bottled water, which is more than 450 billion litres, according to de Vries-Gao. Another important caveat, however, is that only about 10 percent of that consumption happens on-site at data centres; the rest happens mostly at power stations.
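The comparison above is easy to sanity-check with a quick back-of-envelope calculation, using the figures as quoted on More or Less (all in billions of litres):

```python
# Back-of-envelope check of the "is that a big number?" comparison.
# Figures as quoted on More or Less; all in billions of litres.
ai_water_consumption = 750.0   # upper estimate for AI systems, end of 2025
bottled_water_global = 450.0   # global bottled-water consumption
onsite_share = 0.10            # fraction consumed on-site at data centres

# AI consumption could exceed global bottled-water consumption.
exceeds_bottled = ai_water_consumption > bottled_water_global

onsite = ai_water_consumption * onsite_share          # at data centres
elsewhere = ai_water_consumption * (1 - onsite_share)  # mostly power stations

print(f"Exceeds bottled water: {exceeds_bottled}")
print(f"On-site at data centres: {onsite:.0f}bn litres")
print(f"Power stations and elsewhere: {elsewhere:.0f}bn litres")
```

In other words, only around 75 billion litres of the upper estimate would actually be consumed at data centres themselves, which matters when assessing where mitigation efforts should focus.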

The original consumption vs withdrawal issue is also important to keep in mind. While the two measures are different, withdrawal is arguably the more important consideration than consumption alone.

Power availability: the ultimate bottleneck

The sign-off to the episode, unsurprisingly, focused on power availability rather than water consumption or withdrawal. Accurate estimates of future water use depend on accurate projections of installed AI infrastructure capacity, and growing power availability constraints are making those projections increasingly difficult.

It’s also important to remember that the More or Less programme was created for a generalist audience, and the issue of AI and data center water use has a huge number of other variables, including direct liquid versus air-based cooling, silicon diversification (GPUs versus other chip types, and the heat-rejection implications), and training versus inference (centralised versus distributed AI).

What is clear is that we have entered the third wave of sustainable IT, and this time it is very different. The previous two cycles – in the early 2000s and 2010s – were largely driven by the industry responding to governmental scrutiny. This time it is much more grass-roots and bottom-up, with community groups often leading the charge. The industry is responding. However, it remains to be seen whether the response will match the speed and scale of the actual data center build-outs.

Data Center Thought Leadership: Vertiv Frontiers

Vertiv Frontiers looks at the macro forces and technology trends shaping the future of the data center industry.

Proud to have finally published the Vertiv Frontiers report which looks at four macro forces and five key technology trends informing the future of the data center.

These range from Extreme densification and Gigawatt scaling to Powering up for AI and Adaptive resilient liquid cooling.

After two decades of steady evolution – when cloud computing reshaped location and scale but core infrastructure remained largely constant – the next wave of transformation is accelerating at unprecedented speed.

Driven by AI and accelerated compute, this new era is redefining how digital infrastructure is designed, deployed, and scaled. The pace of change is unmatched, creating new possibilities to push the frontiers of innovation.

Vertiv Frontiers offers a lens on the future – an exploration of macro forces and the technology trends reshaping digital infrastructure.

It brings together the expertise of Vertiv specialists across power, thermal, IT systems, prefabricated modular infrastructure, advanced services, and AI infrastructure, reinforcing Vertiv’s position as a leading voice guiding the future of critical digital infrastructure.

The report was published with close cooperation from Vertiv’s chief technology and product officer Scott Armul, as well as other internal thought leaders such as Martin Olsen, Peter Panfil and Steve Madera.

Thanks to all of the other contributors from the Creative, Content and Web teams who helped to get this published.

The full report is available at Vertiv.com

Must-Read Research Blog: It’s the data center economy, stupid! Or is it?

Excerpt from Vertiv Must-Read Research Blog


Some fundamental truths cut through.

“It’s the economy, stupid”, did so for presidential hopeful Bill Clinton in the early 1990s. The phrase, and the strategy behind it, allegedly helped the Democratic candidate become US president. Focusing on the economy, while painting opponent George H.W. Bush as out of touch, sealed the deal with US voters.

The upshot? Prioritizing economics is always a sensible strategy. Unless, that is, you happen to be a hyperscale data center operator in the mid-2020s. Some might argue that short-term economics have been deprioritized in the current AI arms race.

Billions are being spent on new capacity by large operators, and LLM owners, who believe that almost no price is too high to pay right now to achieve future AI supremacy. According to the Financial Times, spending on data centers will jump from $333bn in 2024 to about $1trn in 2030. More than 80 percent of that investment will go into AI-related infrastructure.

But that level of AI investment is not representative of data center owners and operators as a whole. For the vast majority, made up of smaller enterprise-owned facilities, shorter-term economics and cost control are still key considerations.

For more go to Vertiv.com

Must-Read Research Blog: Bringing the Cloud Back Down to (Sovereign) Earth


Excerpt from Vertiv Must-Read Research Blog

Disruptive technologies, by their very nature, are intended to upend the existing status quo.

Digital cameras have usurped traditional film and electric vehicles will eventually do the same to the combustion engine.

However, while change may be inevitable, invariably it is not pretty: often the disrupted party refuses to go quietly. The intangible future collides with the very tangible present and the status quo usually likes to do some disrupting back.

Take ridesharing apps, for example. The mission of ridesharing companies is to revolutionize urban transport for the betterment of all – unless, that is, you happen to be deeply invested in the existing taxi industry.

True, a lot of headway has been made, but ridesharing companies have also faced plenty of roadblocks – both figurative and literal – from demonstrations by aggrieved taxi drivers to outright bans in some cities. Plans to create a virtual workforce of self-employed drivers have also come face-to-face with the realities of employment law.

For more go to Vertiv.com

Must-Read Research Blog: The Digital Resilience Rope-a-Dope

Excerpt from Vertiv Must-Read Research blog


The dictionary definition of resilience is: ‘the capacity to recover quickly from difficulties; toughness’.

When it comes to current real-life examples of resilience, we should probably look no further than front-line health workers, teachers, or anyone else who has stoically provided vital services amidst the pandemic.

Pre-pandemic however, we might have turned to a more obvious example of toughness: a boxer perhaps. And if you’re looking for a pugilist to illustrate a point, then there is no better example than Muhammad Ali.

One of Ali’s most famous fights was the so-called ‘Rumble in the Jungle’ in Zaire (now the Democratic Republic of Congo) in 1974, sometimes referred to as ‘arguably the greatest sporting event of the 20th century’. The 32-year-old Ali found himself up against a younger and stronger world champion, 25-year-old George Foreman. Despite his fame and experience, Ali was the 4-1 underdog going into the match.

Float Like a Butterfly

As the story goes, in the run-up to the fight, Ali reinforced the preconception that he would use his famous agility – “float like a butterfly, sting like a bee” – to ‘dance’ around the slower, but stronger, Foreman. But the reality on the day was quite different. Rather than avoid Foreman, Ali seemingly allowed himself to get cornered. Not only that, but he often leant back into the ropes and let his body – and, crucially, the ropes – soak up the punishment. The result? Foreman eventually tired himself out and Ali won the fight. The ‘rope-a-dope’, as it has become known, proved decisive.

For more go to Vertiv.com

Must-Read Research Blog: Pandemic-Proof Data Centers Offer Hope for the Future

Must-Read Research, Vertiv

Excerpt from Vertiv Must-Read Research Blog.

The adage, ‘hope for the best, plan for the worst’, is particularly apt for the data center industry right now. 

Investment in new data center infrastructure is often based on an optimistic take on future technology demand, but operators are also aggressively pragmatic when it comes to preventing downtime.

No matter what the cause – faulty equipment, cybercriminals or grid-level power outages – investment in resilient infrastructure combined with rigorous operating practices should ensure the lights stay on or, at worst, only go off for the minimum amount of time. 

Unfortunately, as the recently released Uptime Institute Intelligence report, Post-Pandemic Data Centers, points out, many operators were largely blindsided by Covid-19.

While there appear to have been relatively few (public) examples of Covid-19 related downtime over the last few months, the pandemic has put additional pressure on everything from data center design and construction to supply chains and staffing.

More at Vertiv.com

Crossing the edge knowledge chasm

 

My new role at Vertiv has kept me busy – in an exciting way – over the last couple of months, so I’ve been a bit remiss in keeping this blog up to date.

However, events last week deserve a special mention.

Vertiv held its first Innovation Summit in Zagreb, Croatia on 16th and 17th April. The event had a regional focus, with more than 300 delegates from across Central and Southern Europe, from Croatia to Israel.

The central theme of the event was edge compute.

If that term sends a small shudder down your spine, it shouldn’t. Edge is an important trend, but it has also attracted a lot of hype, which has at times outpaced technical clarity on what edge actually means in practice.

As Vertiv EMEA president Giordano Albertazzi puts it succinctly in this LinkedIn post, there is something of an edge knowledge gap between how some suppliers use the term and how end-users understand ‘the edge’.

“…one of the questions raised during a round table with journalists was about whether edge is really a new phenomenon or simply a re-branding exercise for existing branch office computing or content distribution networks?

I can understand that view, but I think edge as it is being defined now is most definitely something new and on a different scale to anything that we have seen before.

True, we have infrastructure – such as our range of prefabricated modular data centres manufactured outside of Zagreb – which predate the current focus on edge. But we are also already seeing demand for those systems in a range of new edge deployments.

So while there is certainly a ‘legacy edge’, there will also be a large number of clearly distinct and disruptive use cases which we believe will proliferate well beyond any pre-existing notions of edge.”

Vertiv is doing its bit to bridge the knowledge gap with an ongoing research project to put more meat on the bones of edge, including defining a series of edge use cases and archetypes.

The full edge research report is available from the Vertiv website.  

 

From commentator to supporter


It’s great to be able to finally reveal that I have joined Vertiv as director of influencer marketing*, EMEA. 

It’s a bit of a shift from being an analyst and journalist for the last 20 years, but hopefully a rewarding one.

I’m really looking forward to supporting a team after years spent in the commentary box. 

And Vertiv is a great company to be working with right now as it moves into the next phase of its standalone journey: combining a start-up ethos with a great heritage and reputation. 

*If you want to know more about influencer marketing in business to business – this white paper is a good start. 

Norway Wants to Win Hyper-Scale Gold in the Data Center Olympics

Bergen, Norway

My latest and last (see below) Critical Thinking column over at Data Center Knowledge is on Norway’s bid to build up its data center industry.

To that end the Norwegian government recently published Powered by Nature: Norway as a Data Center Nation, a report that details the country’s credentials as an ideal data center location.

Coincidentally, I recently returned from my second trip to Norway, where I witnessed first-hand some of the things it has to offer data center operators.

A chilly boat trip on the fjords just outside Bergen

First the positives: One that springs immediately to mind is the temperature. After visiting in February, I can report that Norway is indeed a great place for free cooling (the only thing that is free in Norway, it seems). The temperature where I was, in Bergen, barely rose above 5°C (41°F), and it’s one of the warmer parts of the country, thanks to the Gulf Stream.

Mock-up of Lefdal Mine Datacentre

Norway also has some established data center operators, such as Green Mountain, Digiplex, and Basefarm. One of the most recent projects is also the most interesting: the Lefdal Mine Datacentre (which we have written about before) has ambitions to be the largest facility in Europe and, as the name suggests, it is completely underground.

So given all that, why hasn’t Norway been able to attract a hyper-scale operator to date? Head over to Data Center Knowledge to read the full column.

NOTE: As mentioned above, this was my last column for Data Center Knowledge as I’ve got an exciting new role which I will be discussing soon. 

Big thanks to Yevgeniy Sverdlik and the team for allowing me to contribute to the great editorial over at Data Center Knowledge. I look forward to continuing to work with them in my new role. 

 

Would you trust Siri or Alexa to manage your datacenter?


Ok. The headline is a little extreme but it has some basis in truth.

This week I spoke with Litbit co-founder JP Balajadia whose company is developing AI personas to help with the management of critical infrastructure including datacenters.

Specifically we spoke about the deal the AI-start-up has done with CBRE.

The facilities management specialist is licensing Litbit’s AI ‘persona’ technology to help it improve the management services it provides to datacenter customers.

The deal is still at an early stage and we didn’t discuss too much in the way of specifics, but it will be interesting to keep tabs on how it all progresses.

In particular, I’d like to know how many of CBRE’s 600 to 800 datacenter customers will sign up to the initiative and what the data privacy and security implications might be.

The other big challenge is how, and from where, Litbit and CBRE are going to pull data into the system to train the AI persona which will be known as REMI.

We did touch on it during our conversation – it will be text-based initially – but it will also be interesting to see what the user interface for the REMI persona ends up looking like.

For the full article go to Data Center Knowledge.

I’ve also written before about the wider challenges of using AI in datacenter management.