
AI’s water demand spurs new solutions

Water Solutions

In the fast-paced world of artificial intelligence, the technology’s thirst for water is becoming a growing concern. A recent study found that datacenter water consumption in Northern Virginia had increased by two-thirds over the past five years. “ChatGPT needs to ‘drink’ a 500 ml bottle of water for a simple conversation of roughly 20-50 questions and answers, depending on when and where ChatGPT is deployed,” researchers estimated in a paper published early last year.
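The researchers' bottle-per-conversation figure implies a small per-exchange footprint, which a quick back-of-envelope check makes concrete. This sketch simply divides the study's 500 ml figure by the stated conversation lengths; it is illustrative arithmetic, not measured data.

```python
# Back-of-envelope check of the study's figure: a 500 ml bottle
# spread over a conversation of 20-50 questions and answers.
BOTTLE_ML = 500

def water_per_exchange_ml(questions: int) -> float:
    """Rough water footprint of one question-and-answer exchange."""
    return BOTTLE_ML / questions

low = water_per_exchange_ml(50)   # longer conversation: 10 ml per exchange
high = water_per_exchange_ml(20)  # shorter conversation: 25 ml per exchange
print(f"roughly {low:.0f}-{high:.0f} ml of water per exchange")
```

In other words, on the study's own numbers, each individual exchange costs on the order of a tablespoon of water; the concern is the aggregate across billions of queries.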

This is for a GPT-3-class model with roughly 175 billion parameters, a figure that seems small by today’s standards. GPT-4 is estimated to be between 1.7 and 1.8 trillion parameters, and these models are only going to get bigger. While this growth doesn’t bode well for datacenter power consumption, the same may not hold true for water consumption.

Datacenters do not consume water in the traditional sense. The real issue is water being removed from the local environment rather than being returned to its source. The IT infrastructure itself is not what consumes the water; even liquid-cooled systems are usually closed loops that lose very little fluid.

Instead, the water is used by the datacenter’s air handlers, often evaporative or “swamp” coolers, which keep systems from overheating. This, however, is a design choice. In colder climates, for example, dry coolers and “free cooling” can be sufficient, while in hotter, drought-prone regions, datacenters might opt for refrigerant-based systems.
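Why evaporative coolers draw so much water follows directly from the physics: the heat leaves as the latent heat of vaporization of water. The idealized sketch below assumes every joule is rejected by evaporation alone, ignoring real-world drift and blowdown losses, so it is a lower bound for illustration rather than a facility figure.

```python
# Idealized water draw of an evaporative cooler: all heat rejected
# as latent heat of vaporization (ignores drift and blowdown losses).
LATENT_HEAT_KJ_PER_KG = 2260  # water, near-ambient conditions

def litres_evaporated_per_hour(heat_mw: float) -> float:
    """Litres of water evaporated per hour to reject `heat_mw` megawatts."""
    kg_per_s = (heat_mw * 1000) / LATENT_HEAT_KJ_PER_KG  # kJ/s over kJ/kg
    return kg_per_s * 3600  # 1 kg of water is about 1 litre

print(f"{litres_evaporated_per_hour(1.0):,.0f} L/h per MW of heat rejected")
```

Even in this best case, a single megawatt of IT load evaporates on the order of 1,600 litres of water per hour, which is why siting and climate matter so much to the design choice.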

Managing datacenter water consumption

Microsoft, for instance, is using refrigerant-based cooling in its datacenter developments in Goodyear, Arizona. While there are alternatives to evaporative cooling, such systems often come at the expense of higher power consumption.


In the world of hyperscalers, every decision ultimately comes down to margins. If using water to cool is cheaper and more efficient than using more electricity, many will choose the former despite water’s scarcity in certain regions. However, the trend may be shifting with the pace of AI innovation.

As chips grow ever hotter, passing the one-kilowatt mark, there’s a growing transition to liquid cooling. Nvidia’s Grace Blackwell Superchips, for example, are rated for 2,700 W with two designed to fit into a single chassis, leading to the adoption of direct liquid cooling (DLC). DLC is substantially more efficient than using fans but poses retrofitting challenges for many existing facilities.
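The scale of the heat problem is easy to see from the chip figures above. This sketch multiplies out the rated 2,700 W per superchip and two chips per chassis; the chassis-per-rack count is a hypothetical parameter for illustration, since real rack densities vary by deployment.

```python
# Rough rack heat load from Grace Blackwell-class parts alone,
# using the rated figures above (2,700 W per superchip, two per chassis).
# The chassis-per-rack count is an assumed, illustrative value.
CHIP_W = 2700
CHIPS_PER_CHASSIS = 2

def rack_heat_kw(chassis_per_rack: int) -> float:
    """Kilowatts of chip heat in one rack, ignoring other components."""
    return CHIP_W * CHIPS_PER_CHASSIS * chassis_per_rack / 1000

print(f"{rack_heat_kw(18):.1f} kW per rack from the superchips alone")
```

At these densities, a single rack dissipates as much heat as dozens of traditional air-cooled racks, which is what pushes operators toward direct liquid cooling despite the retrofitting cost.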

Though a transition to liquid cooling represents a challenge, it could ironically cut water consumption in the long run. Because liquid carries heat far more effectively than air, coolant can run at higher temperatures, which makes dry coolers, which consume no water, a viable option. There is also the potential for reusing the heat generated, such as contributing to district heating grids or supporting greenhouses.

Ultimately, we may continue to hear about datacenter water consumption issues until a critical mass of liquid-cooled systems is deployed. The race to find a balance between water and power use in AI and datacenter operations remains ongoing.

About Our Editorial Process

At DevX, we’re dedicated to tech entrepreneurship. Our team closely follows industry shifts, new products, AI breakthroughs, technology trends, and funding announcements. Articles undergo thorough editing to ensure accuracy and clarity, reflecting DevX’s style and supporting entrepreneurs in the tech sphere.
