Resource Consumption by AI Technology

Building an AI model such as ChatGPT is highly resource-intensive. It is difficult to measure all the costs, and most people are unaware of the resource usage that underlies ChatGPT.

Building an LLM means training the model on vast amounts of human-written text. That takes a great deal of computing, and therefore a great deal of electricity, which in turn generates heat. To keep the hardware cool on hot days, data centres need to pump in water, often to cooling towers outside their warehouse-sized buildings.

Microsoft’s water consumption rose 37 per cent from 2021 to 2022, reaching 1.7 billion gallons, roughly the volume of 2,500 Olympic-sized swimming pools. Most of the increase is attributed to AI.
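As a rough sanity check on that comparison, here is a minimal back-of-envelope calculation in Python. The assumed pool volume of 2,500 cubic metres is a standard nominal figure, not something stated above.

```python
# Back-of-envelope check: how many Olympic-sized pools is 1.7 billion gallons?
# Assumption: an Olympic pool holds about 2,500 m^3 (not stated in the article).

GALLONS_PER_M3 = 264.172       # US gallons per cubic metre
OLYMPIC_POOL_M3 = 2_500        # nominal Olympic pool volume

total_gallons = 1.7e9          # Microsoft's reported 2022 consumption
pool_gallons = OLYMPIC_POOL_M3 * GALLONS_PER_M3

print(f"One pool is about {pool_gallons:,.0f} gallons")
print(f"1.7 billion gallons is about {total_gallons / pool_gallons:,.0f} pools")
```

The result, roughly 2,600 pools, is consistent with the comparison above.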

Microsoft sourced the water from the Raccoon and Des Moines rivers in central Iowa to cool a powerful supercomputer used to teach AI systems how to mimic human writing.

The growing demand for AI tools carries hefty costs, from expensive semiconductors to rising water consumption. Few people in Iowa knew of the state’s status as a birthplace of GPT-4.

When a user runs ChatGPT through a series of 5 to 50 prompts or questions, it consumes roughly 500 ml of water. The exact amount varies with where the servers are located and with the season.
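To put that figure in perspective, the short sketch below converts the 500 ml per session into a per-prompt range. Only the 500 ml figure and the 5-to-50 prompt session size come from the article; the rest is simple arithmetic.

```python
# Rough per-prompt water estimate from the reported ~500 ml per 5-50 prompt session.
SESSION_WATER_ML = 500
MIN_PROMPTS, MAX_PROMPTS = 5, 50

low_ml = SESSION_WATER_ML / MAX_PROMPTS    # long session: about 10 ml per prompt
high_ml = SESSION_WATER_ML / MIN_PROMPTS   # short session: about 100 ml per prompt

# Scale up to a million prompts and convert millilitres to litres.
low_litres = low_ml * 1_000_000 / 1_000
high_litres = high_ml * 1_000_000 / 1_000

print(f"Per prompt: {low_ml:.0f}-{high_ml:.0f} ml")
print(f"Per million prompts: {low_litres:,.0f}-{high_litres:,.0f} litres")
```

Even at the low end, a million prompts works out to around 10,000 litres of water.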

There is also indirect water usage, such as the water used to cool the power plants that supply electricity to the data centres.

Google uses water as well: its consumption doubled outside Las Vegas, and it was also thirsty in Iowa, drawing more potable water to its data centres there.

Microsoft is researching how to measure AI’s energy and carbon footprint, and it is working on ways to optimise resource consumption.
