
ChatGPT may not be as power-hungry as once assumed

2025/2/14

TechCrunch Industry News

@Joshua Yu: I think ChatGPT's current energy consumption is not a big problem compared with ordinary household appliances. I did this analysis because previous research was outdated and didn't accurately describe AI's energy consumption, especially today's. The widely reported estimate of 3 watt-hours per query was based on fairly old research and, by some napkin math, seemed too high. But I do expect ChatGPT's baseline power consumption to rise. Future AI will be more advanced, training it will probably require more energy, and it will be used more intensively, handling more and more complex tasks. Reasoning models will take on tasks older models couldn't and will generate more data to do so, all of which requires more data centers. So I suggest that people worried about AI's energy footprint use ChatGPT less frequently, or choose models that minimize computation: for example, try smaller AI models, and be sparing with uses that require processing or generating a lot of data.

Shownotes Transcript

This is TechCrunch.


ChatGPT, OpenAI's chatbot platform, may not be as power-hungry as once assumed, but its appetite largely depends on how ChatGPT is being used and the AI models that are answering the queries, according to a new study. A recent analysis by Epoch AI, a non-profit AI research institute, attempted to calculate how much energy a typical ChatGPT query consumes.

A commonly cited stat is that ChatGPT requires around 3 watt-hours of power to answer a single question, or 10 times as much as a Google search. Epoch believes that's an overestimate. Using OpenAI's latest default model for ChatGPT, GPT-4o, as a reference, Epoch found the average ChatGPT query consumes around 0.3 watt-hours, less than many household appliances. "The energy use is really not a big deal compared to using normal appliances or heating or cooling your home, or driving a car," Joshua Yu, the data analyst at Epoch who conducted the analysis, told TechCrunch.
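Epoch's per-query figure is easiest to grasp against everyday loads. A minimal back-of-envelope sketch, where only the 0.3 Wh and 3 Wh figures come from the article; the 10 W bulb and the daily query count are illustrative assumptions:

```python
# Back-of-envelope comparison of Epoch AI's per-query estimate with
# everyday electricity use. The 0.3 Wh and 3 Wh figures are from the
# article; the bulb wattage and query count are assumed for illustration.

WH_PER_QUERY_EPOCH = 0.3   # Epoch AI's estimate for a GPT-4o query
WH_PER_QUERY_OLD = 3.0     # the widely cited older estimate

queries_per_day = 15       # hypothetical heavy-ish personal usage
daily_wh = queries_per_day * WH_PER_QUERY_EPOCH

led_bulb_watts = 10        # a typical LED bulb draws roughly 10 W
minutes_of_bulb = daily_wh / led_bulb_watts * 60

print(f"{queries_per_day} queries/day = {daily_wh:.1f} Wh")
print(f"= {minutes_of_bulb:.0f} minutes of a 10 W LED bulb")
print(f"old estimate was {WH_PER_QUERY_OLD / WH_PER_QUERY_EPOCH:.0f}x higher")
```

At these assumed rates, a day of chatting costs about as much electricity as running a single LED bulb for half an hour.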

AI's energy usage and its environmental impact, broadly speaking, is the subject of contentious debate as AI companies look to rapidly expand their infrastructure footprints. Just last week, a group of over 100 organizations published an open letter calling on the AI industry and regulators to ensure that new AI data centers don't deplete natural resources and force utilities to rely on non-renewable sources of energy.

Yu told TechCrunch his analysis was spurred by what he characterized as outdated previous research. Yu pointed out, for example, that the author of the report that arrived at the 3 watt-hours estimate assumed OpenAI used older, less efficient chips to run its models. "I've seen a lot of public discourse that correctly recognized that AI was going to consume a lot of energy in the coming years, but didn't really accurately describe the energy that was going to AI today," Yu said.

"Also, some of my colleagues noticed that the most widely reported estimate of 3 watt-hours per query was based on fairly old research and, based on some napkin math, seemed to be too high." Granted, Epoch's 0.3 watt-hours figure is an approximation as well; OpenAI hasn't published the details needed to make a precise calculation. The analysis also doesn't consider the additional energy costs incurred by ChatGPT features like image generation or input processing.

Yu acknowledged that long-input ChatGPT queries (queries with long files attached, for instance) likely consume more electricity upfront than a typical question. Yu said he does expect baseline ChatGPT power consumption to rise, however. "The AI will get more advanced, training this AI will probably require much more energy, and this future AI may be used much more intensely, handling many more tasks, and more complex tasks, than how people use ChatGPT today," Yu said.

While there have been remarkable breakthroughs in AI efficiency in recent months, the scale at which AI is being deployed is expected to drive enormous, power-hungry infrastructure expansion. In the next two years, AI data centers may need nearly all of California's 2022 power capacity (68 gigawatts), according to a RAND report. By 2030, training a frontier model could demand power output equivalent to that of eight nuclear reactors (8 gigawatts), the report predicted. ChatGPT alone reaches an enormous and expanding number of people, making its server demands similarly massive. OpenAI, along with several investment partners, plans to spend billions of dollars on new AI data center projects over the next few years.
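The RAND figures above can be sanity-checked with simple arithmetic. The 8 GW and 68 GW numbers come from the article; the roughly 1 GW output per reactor is an assumed typical value for a large nuclear unit:

```python
# Sanity check of the RAND projections quoted above. The 8 GW and 68 GW
# figures are from the article; ~1 GW per reactor is an assumption.

frontier_training_gw = 8        # RAND: frontier-model training power by 2030
gw_per_reactor = 1.0            # assumption: one large reactor produces ~1 GW
reactors = frontier_training_gw / gw_per_reactor

california_2022_gw = 68         # RAND: California's 2022 power capacity
share = frontier_training_gw / california_2022_gw

print(f"{reactors:.0f} reactors")  # consistent with the report's "eight reactors"
print(f"{share:.0%} of California's 2022 capacity")
```

So under that assumption a single frontier training run would tie up about one eighth of the capacity the report projects AI data centers needing overall.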

OpenAI's attention, along with the rest of the AI industry's, is also shifting to reasoning models, which are generally more capable in terms of the tasks they can accomplish but require more computing to run. As opposed to models like GPT-4o, which respond to queries nearly instantaneously, reasoning models "think" for seconds to minutes before answering, a process that sucks up more computing, and thus power.

"Reasoning models will increasingly take on tasks that older models can't, and generate more data to do so, and both require more data centers," Yu said. OpenAI has begun to release more power-efficient reasoning models like o3-mini, but it seems unlikely, at least at this juncture, that the efficiency gains will offset the increased power demands from reasoning models' "thinking" process and growing AI usage around the world.
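Why the "thinking" step costs more can be sketched with a simple token-count model: if inference energy scales roughly with tokens generated, hidden reasoning tokens multiply the per-answer energy. Every token count below is a hypothetical illustration, not a measured value:

```python
# Illustrative sketch of why reasoning models cost more energy per answer.
# Assumption: inference energy scales roughly with tokens generated, and
# Epoch's 0.3 Wh figure corresponds to a ~500-token visible reply.

wh_per_token = 0.3 / 500          # hypothetical per-token energy

answer_tokens = 500               # visible reply (assumed)
thinking_tokens = 4000            # hypothetical hidden chain-of-thought length

plain_wh = answer_tokens * wh_per_token
reasoning_wh = (answer_tokens + thinking_tokens) * wh_per_token

print(f"plain model: {plain_wh:.2f} Wh per answer")
print(f"reasoning model: {reasoning_wh:.2f} Wh per answer")
print(f"about {reasoning_wh / plain_wh:.0f}x more energy")
```

The exact multiplier depends entirely on how long the hidden reasoning runs, which is why efficiency gains in the models themselves may not offset it.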

Yu suggested that people worried about their AI energy footprint use apps such as ChatGPT infrequently, or select models that minimize the computing necessary, to the extent that's realistic. "You could try using smaller AI models like [OpenAI's] GPT-4o mini," Yu said, "and sparingly use them in a way that requires processing or generating a ton of data."
