Could AI Be Heading Towards an “Energy Crisis”?

Speculation suggests that a Microsoft engineer involved in the GPT-6 training cluster project has warned that deploying more than 100,000 H100 GPUs in a single state could bring down the power grid. Even as OpenAI shows signs of progress toward training GPT-6, the availability of electricity may emerge as a critical bottleneck.
Kyle Corbitt, co-founder and CEO of AI startup OpenPipe, revealed in a post on the social platform X that he recently spoke with a Microsoft engineer responsible for the GPT-6 training cluster project. The engineer complained that provisioning InfiniBand-class links between GPUs in different regions has been a painful task.
Continuing the conversation, Corbitt asked, “why not just colocate the cluster in one region?” The Microsoft engineer replied, “Oh yeah, we tried that first. We can’t put more than 100K H100s in a single state without bringing down the power grid.”
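Taken at face value, the engineer’s 100,000-GPU figure implies a very large sustained load. A minimal back-of-the-envelope sketch, assuming NVIDIA’s published 700 W TDP for the H100 SXM and illustrative (assumed) values for per-server overhead and cooling efficiency (PUE), shows the scale:

```python
# Rough estimate of the power draw of a 100,000-GPU H100 cluster.
# The 700 W TDP is NVIDIA's published spec for the H100 SXM; the
# server-overhead and PUE values below are illustrative assumptions.

NUM_GPUS = 100_000
H100_TDP_W = 700        # per-GPU thermal design power (H100 SXM)
SERVER_OVERHEAD = 1.5   # assumed: CPUs, NICs, memory, fans per GPU slot
PUE = 1.2               # assumed power usage effectiveness (cooling etc.)

gpu_power_mw = NUM_GPUS * H100_TDP_W / 1e6
total_power_mw = gpu_power_mw * SERVER_OVERHEAD * PUE

print(f"GPUs alone:     {gpu_power_mw:.0f} MW")    # ~70 MW
print(f"Whole facility: {total_power_mw:.0f} MW")  # ~126 MW
```

Under these assumptions, a single such cluster would draw a sustained load in the low hundreds of megawatts, comparable to the output of a small power plant, which helps explain why concentrating it in one state would concern grid operators.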
At the recently concluded CERAWeek 2024, which gathered top executives from the global energy industry, discussions revolved around AI’s advancement in the energy sector and the significant energy demand that AI is driving.
According to a report from Bloomberg, Toby Rice, CEO of EQT Corp., the largest US natural gas driller, cited in his speech a forecast predicting that AI could gobble up more power than households by 2030.
Additionally, OpenAI CEO Sam Altman has expressed concerns about AI’s energy demands, particularly for electricity. According to a report from Reuters, he stated at the Davos Forum earlier this year that AI’s development requires an energy breakthrough, as AI is expected to consume significantly more electricity than anticipated.
According to a March 9th report by The New Yorker, citing data from Alex de Vries, a data expert at the Dutch National Bank, ChatGPT consumes over 500,000 kilowatt-hours of electricity daily to process around 200 million user requests, equivalent to more than 17,000 times the daily electricity consumption of an average American household. As for search giant Google, if it were to use AIGC for every user search, its annual electricity consumption would rise to around 29 billion kilowatt-hours, exceeding the annual electricity consumption of countries such as Kenya and Guatemala.
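The per-request arithmetic behind those figures is easy to reproduce. A minimal sketch, using only the totals reported above plus an assumed US household average of roughly 29 kWh per day (about 10,500 kWh per year):

```python
# Quick arithmetic behind The New Yorker figures: energy per ChatGPT
# request and the household comparison. The 29 kWh/day household figure
# is an assumption, not from the article.

DAILY_KWH = 500_000              # reported ChatGPT daily consumption
DAILY_REQUESTS = 200_000_000     # reported daily user requests
HOUSEHOLD_KWH_PER_DAY = 29       # assumed average US household usage

wh_per_request = DAILY_KWH * 1000 / DAILY_REQUESTS
household_multiple = DAILY_KWH / HOUSEHOLD_KWH_PER_DAY

print(f"~{wh_per_request:.1f} Wh per request")              # ~2.5 Wh
print(f"~{household_multiple:,.0f}x a household's daily use")  # ~17,241x
```

At about 2.5 watt-hours per request, the energy cost of a single query is small; it is the sheer request volume that pushes the daily total past half a million kilowatt-hours.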
Looking back at 2022, before AI had sparked such widespread enthusiasm, data centers accounted for roughly 3% of total societal electricity consumption in China and 4% in the United States.
As global computing power continues to grow, a March 24th research report from Huatai Securities projects that by 2030, total data center electricity consumption will reach approximately 0.65 to 0.95 trillion kilowatt-hours in China and 1.2 to 1.7 trillion kilowatt-hours in the United States, depending on the scenario, representing over 3.5 times and 6 times their respective 2022 levels. In an optimistic scenario, AI electricity consumption in China and the US by 2030 would account for 20% and 31%, respectively, of each country’s total societal electricity consumption in 2022.
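As a rough sanity check, the 2022 baselines implied by those projections can be backed out by dividing each 2030 figure by its growth multiple. The sketch below assumes the multiples refer to the higher, optimistic-scenario numbers:

```python
# Back out the 2022 data center baselines implied by the Huatai Securities
# projections (baseline = 2030 projection / growth multiple). Pairing the
# multiples with the optimistic-scenario figures is an assumption.

CHINA_2030_T_KWH = 0.95   # optimistic-scenario projection for China
US_2030_T_KWH = 1.7       # optimistic-scenario projection for the US
CHINA_MULTIPLE = 3.5      # "over 3.5 times" the 2022 level
US_MULTIPLE = 6.0         # "6 times" the 2022 level

china_2022 = CHINA_2030_T_KWH / CHINA_MULTIPLE
us_2022 = US_2030_T_KWH / US_MULTIPLE

print(f"Implied China 2022 baseline: ~{china_2022:.2f} trillion kWh")  # ~0.27
print(f"Implied US 2022 baseline:    ~{us_2022:.2f} trillion kWh")     # ~0.28
```

Under that assumption, both implied 2022 baselines land near a quarter of a trillion kilowatt-hours per country.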
(Photo credit: Taiwan Business Topics)