The rapid development and deployment of powerful generative AI models come with significant environmental consequences, including increased electricity demand and water consumption. The excitement surrounding generative AI’s potential, from improving worker productivity to advancing scientific research, is hard to ignore. Nevertheless, the environmental consequences of this generative AI “gold rush” remain complex and challenging to mitigate.
The computational power required to train generative AI models, which can have billions of parameters, demands a staggering amount of electricity. This leads to increased carbon dioxide emissions and pressures on the electric grid. Deploying these models for real-world applications, allowing millions to use generative AI daily, and then fine-tuning the models to improve performance requires substantial energy long after the models have been initially developed.
Beyond electricity demands, a significant amount of water is needed to cool the hardware used for training, deploying, and fine-tuning generative AI models. This can strain municipal water supplies and disrupt local ecosystems. Additionally, the increasing number of generative AI applications has driven demand for high-performance computing hardware, adding indirect environmental impacts from its manufacture and transport.
“When we think about the environmental impact of generative AI, it is not just the electricity you consume when you plug the computer in. There are much broader consequences that go out to a system level,” says Elsa A. Olivetti, professor in the Department of Materials Science and Engineering and the lead of the Decarbonization Mission of MIT’s new Climate Project.
The electricity demands of data centers significantly contribute to the environmental impacts of generative AI. Data centers are used to train and run the deep learning models behind popular tools like ChatGPT and DALL-E. For instance, Amazon operates numerous data centers, each housing about 50,000 servers to support cloud computing services.
“What is different about generative AI is the power density it requires. Generative AI training clusters might consume seven or eight times more energy than typical computing workloads,” says Noman Bashir, lead author of a paper on the environmental impact of generative AI and a postdoc at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL).
Generative AI’s electricity and water burden
Scientists have estimated that the power requirements of data centers in North America increased from 2,688 megawatts at the end of 2022 to 5,341 megawatts at the end of 2023, driven partly by the demands of generative AI. Globally, the electricity consumption of data centers rose to 460 terawatt-hours in 2022. By 2026, the electricity consumption of data centers is expected to approach 1,050 terawatt-hours.
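A quick back-of-the-envelope calculation puts these figures in perspective. The sketch below uses only the numbers cited above; the calculations are illustrative, not an official estimate.

```python
# Back-of-the-envelope check of the growth figures cited in the article.

power_2022_mw = 2_688   # North American data center power, end of 2022 (MW)
power_2023_mw = 5_341   # North American data center power, end of 2023 (MW)

growth = (power_2023_mw - power_2022_mw) / power_2022_mw
print(f"Year-over-year growth: {growth:.0%}")  # roughly a doubling (~99%)

# Global data center electricity consumption (terawatt-hours)
consumption_2022_twh = 460
consumption_2026_twh = 1_050
ratio = consumption_2026_twh / consumption_2022_twh
print(f"Projected 2022-2026 increase: {ratio:.1f}x")  # about 2.3x
```

In other words, North American data-center power demand roughly doubled in a single year, and global consumption is projected to more than double again within four years.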
While not all data center computation involves generative AI, the technology has been a significant driver of increasing energy demands. “The demand for new data centers cannot be met in a sustainable way. The pace at which companies are building new data centers means the bulk of the electricity to power them must come from fossil fuel-based power plants,” says Bashir.
The water consumption of data centers also poses environmental impacts. It has been estimated that for every kilowatt-hour of energy a data center consumes, it needs two liters of water for cooling. “Just because this is called ‘cloud computing’ doesn’t mean the hardware lives in the cloud. Data centers are present in our physical world, and their water usage has direct and indirect implications for biodiversity,” says Bashir.
The computing hardware inside data centers brings additional environmental impacts. While it is difficult to estimate the power needed to manufacture a GPU, it would be more than what is required to produce a simpler CPU due to the more complex fabrication process.
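Combining the two-liters-per-kilowatt-hour estimate with the 2022 global consumption figure cited earlier gives a rough sense of scale. This is a sketch that multiplies the article's two numbers together, not a rigorous estimate: real water usage varies widely by cooling technology and climate.

```python
# Illustrative estimate of cooling-water demand, using the ~2 L/kWh
# figure and the 460 TWh global consumption figure cited in the article.

LITERS_PER_KWH = 2            # estimated cooling water per kWh consumed
global_consumption_twh = 460  # global data center electricity use, 2022

kwh = global_consumption_twh * 1e9   # 1 TWh = 1 billion kWh
water_liters = kwh * LITERS_PER_KWH
print(f"Estimated cooling water: {water_liters / 1e9:.0f} billion liters")
# prints "Estimated cooling water: 920 billion liters"
```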
A GPU’s carbon footprint is further compounded by emissions from material and product transport, and sourcing the raw materials used to fabricate GPUs often involves unsustainable mining practices. The environmental demands of generative AI are likely to continue increasing.
While electricity demands of data centers have garnered significant attention, water consumption and the production of computing hardware also pose serious concerns. As generative AI models become more ubiquitous and evolve, addressing these environmental impacts will require a multi-faceted approach involving advancements in technology and changes in policy and practice.
April Isaacs is a news contributor for DevX.com. She is a long-term, self-proclaimed nerd. She loves all things tech and computers and still has her first Dreamcast system. It is lovingly named Joni, after Joni Mitchell.