ChatGPT Uses 17,000 Times More Electricity Than a US Household: Report

  • ChatGPT uses more than half a million kilowatt-hours of electricity daily, The New Yorker reported.
  • In comparison, the average US household uses just 29 kilowatt-hours daily.
  • Exactly how much electricity the booming AI industry consumes is tough to pin down.


AI is using up a ton of electricity. 

OpenAI’s buzzy chatbot, ChatGPT, likely consumes more than half a million kilowatt-hours of electricity daily to respond to some 200 million requests, according to The New Yorker.

The publication reported that the average US household uses around 29 kilowatt-hours daily. Dividing ChatGPT’s daily electricity use by that figure shows the chatbot consumes more than 17,000 times as much electricity as the average household.
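The headline ratio comes straight from those two numbers. A quick back-of-the-envelope check, using only the figures reported above:

```python
# Figures as reported by The New Yorker.
chatgpt_kwh_per_day = 500_000  # ChatGPT's estimated daily electricity use
household_kwh_per_day = 29     # average US household's daily use

ratio = chatgpt_kwh_per_day / household_kwh_per_day
print(f"ChatGPT uses about {ratio:,.0f} times a household's daily electricity")
# prints: ChatGPT uses about 17,241 times a household's daily electricity
```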

That’s a lot. And if generative AI is further adopted, it could drain significantly more.

For example, if Google integrated generative AI technology into every search, it would drain about 29 billion kilowatt-hours a year, according to calculations made by Alex de Vries, a data scientist for the Dutch National Bank, in a paper for the sustainable energy journal Joule. That’s more electricity than countries like Kenya, Guatemala, and Croatia consume in a year, according to The New Yorker.
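To put that annual figure next to ChatGPT’s daily one, a rough division of the numbers above (assuming nothing beyond the article’s own estimates) gives:

```python
# Rough comparison using only figures cited in the article.
google_ai_search_kwh_per_year = 29e9  # de Vries' estimate for AI in every Google search
chatgpt_kwh_per_day = 500_000         # The New Yorker's ChatGPT estimate

google_daily = google_ai_search_kwh_per_year / 365
print(f"AI-powered Google search: ~{google_daily / 1e6:.0f} million kWh/day")
print(f"ChatGPT today:           ~{chatgpt_kwh_per_day / 1e6:.1f} million kWh/day")
print(f"ratio: ~{google_daily / chatgpt_kwh_per_day:.0f}x")
```

That works out to roughly 79 million kilowatt-hours a day, on the order of 160 times ChatGPT’s current estimated draw.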

“AI is just very energy intensive,” de Vries told Business Insider. “Every single of these AI servers can already consume as much power as more than a dozen UK households combined. So the numbers add up really quickly.”

Still, exactly how much electricity the booming AI industry consumes is tough to pin down. There’s considerable variability in how large AI models operate, and Big Tech companies — which have been driving the boom — haven’t been exactly forthcoming about their energy use, according to The Verge.

In his paper, however, de Vries came up with a rough calculation based on numbers put out by Nvidia — which some have dubbed “the Cisco” of the AI boom. According to figures from New Street Research reported by CNBC, the chipmaker has about 95% of the market share for graphics processors.

De Vries estimated in the paper that by 2027, the entire AI sector will consume between 85 and 134 terawatt-hours (one terawatt-hour is a billion kilowatt-hours) annually.

“You’re talking about AI electricity consumption potentially being half a percent of global electricity consumption by 2027,” de Vries told The Verge. “I think that’s a pretty significant number.”
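De Vries’ “half a percent” framing checks out against his own range, assuming global electricity consumption of roughly 25,000 terawatt-hours a year (our assumption for this sketch, broadly in line with recent figures, not a number from the article):

```python
# Sanity check on the "half a percent" claim.
GLOBAL_TWH_PER_YEAR = 25_000  # assumed global electricity consumption
low_twh, high_twh = 85, 134   # de Vries' 2027 estimate for the AI sector

print(f"low end:  {low_twh / GLOBAL_TWH_PER_YEAR:.2%} of global consumption")
print(f"high end: {high_twh / GLOBAL_TWH_PER_YEAR:.2%} of global consumption")
```

The high end of the range lands at just over half a percent under that assumption.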

Some of the world’s biggest corporate electricity users pale in comparison. Samsung uses close to 23 terawatt-hours a year, while tech giants like Google use a little more than 12 terawatt-hours, and Microsoft a bit more than 10 terawatt-hours, to run data centers, networks, and user devices, according to BI’s calculations based on a report from Consumer Energy Solutions.

OpenAI did not immediately respond to a request for comment from BI.