The data centers needed to run AI take an incredible amount of energy and water—far more than traditional internet searches require. Tech companies have promised carbon-neutral solutions, but those seem unlikely to materialize.
From Scientific American (Parshall 2024): “It costs about 30 times as much energy to generate text versus simply extracting it from a source.” A key takeaway from this research is that a large AI model can consume millions of liters of water for training. Big tech companies typically do not disclose the resources required to run their models, but outside researchers such as Luccioni have come up with estimates (though these numbers are highly variable and depend on an AI’s size and its task). She and her colleagues calculated that the large language model BLOOM emitted greenhouse gases equivalent to 19 kilograms of CO2 per day of use, or the amount generated by driving 49 miles in an average gas-powered car. They also found that generating two images with AI could use as much energy as the average smartphone charge. Others have estimated, in research posted on the preprint server arXiv.org, that every 10 to 50 responses from ChatGPT running GPT-3 evaporate the equivalent of a bottle of water to cool the AI’s servers.
Here is how the brilliant Dr. Christiana Zenner addresses the environmental risks of AI on her syllabus:
- AI & ENVIRONMENT: AI involves a truly massive increase in freshwater consumption at every scale, beyond the already-thirsty data centers and tech production sector, which require clean, fresh, cool water (hence additional energy demand). See: Forbes (Feb 2024) for overall numbers accessibly presented; WaPo (Sept 2024), where, despite some quirks, the bottled-water scale and visualization helps students grasp their individual impacts; and this article in Nature (2021), which analyzes the pre-AI tech sector's water consumption, signaling the already-massive issues with available data and methods.
- LABOR and LLMs: This recent video (13 min) conveys a great deal about how human labor creates the baselines for AI analysis, as well as the conditions faced by the people who sort that data. This global labor is often low-wage and rendered invisible, termed “tech sweatshops” by one advocate.