Introduction: Courtesy Has a Cost
Who knew our good manners could carry an environmental price tag? Recent revelations show that saying “please” and “thank you” to ChatGPT – a habit of a majority of users – actually consumes extra energy. According to OpenAI CEO Sam Altman, all those polite prompts have cost the company “tens of millions of dollars” in additional electricity usage. In other words, being courteous to an AI isn’t just a social choice – it’s an energy-expensive one.
Quick Take: A Brief Summary
- Energy per ChatGPT prompt: ~0.34 watt-hours (Wh) of electricity – about what an oven uses in one second.
- Water usage per prompt: ~0.32 milliliters (0.000085 gallons) for data center cooling (a tiny amount, but significant at scale).
- Daily usage footprint: ChatGPT processes 2.5 billion prompts daily, totaling roughly 311 GWh of electricity per year (enough to power nearly 30,000 U.S. homes annually).
- Politeness “tax”: Extra words like “please” and “thank you” in prompts waste energy. OpenAI’s CEO says being polite to ChatGPT has added “tens of millions of dollars” to their electric bill.
- ChatGPT vs Google: Early studies claimed a ChatGPT query uses 10× more energy than a Google search. New official data suggests the gap is much smaller (only ~1.1× as much energy per query) due to updated efficiency figures.
Why does politeness matter here? Every unnecessary word in your prompt means more tokens for the AI to process, which in turn means more compute cycles and power draw. And with 67% of people in a recent U.S. survey reporting they are polite to AI chatbots, it’s not a trivial amount of extra work for ChatGPT. (Interestingly, 18% of those polite users said they do it to stay on a robot’s good side – insurance against a hypothetical AI uprising.) These fun human quirks aside, it raises a serious question: How much energy does ChatGPT actually use – and how much of that is wasted on our niceties?
Wattage per Chat: The Energy Behind Each Prompt

Every time you ask ChatGPT a question or give it a task, it consumes electricity on the servers powering the AI. OpenAI’s first official figures, disclosed in mid-2025, indicate that the average ChatGPT query uses about 0.34 Wh of electricity. To put that in perspective, that’s roughly the energy needed to power a high-efficiency LED light bulb for a couple of minutes, or an electric oven for just over one second. Along with that electricity, each prompt also evaporates about 0.32 milliliters of water (approximately one-fifteenth of a teaspoon) from the cooling systems in data centers. A single AI response doesn’t exactly drain a reservoir – but remember, ChatGPT serves billions of these responses every day.
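If you want to check those comparisons yourself, the math is simple. Here’s a quick sketch – the 10 W LED and 1.2 kW oven wattages are typical assumed values for illustration, not OpenAI figures:

```python
# Back-of-the-envelope check on the per-query comparisons above.
QUERY_WH = 0.34        # Wh per ChatGPT query (OpenAI's reported average)
LED_WATTS = 10         # assumed high-efficiency LED bulb
OVEN_WATTS = 1200      # assumed electric oven draw

led_minutes = QUERY_WH / LED_WATTS * 60      # Wh / W = hours -> minutes
oven_seconds = QUERY_WH / OVEN_WATTS * 3600  # hours -> seconds

print(f"LED runtime:  {led_minutes:.1f} minutes")   # ~2.0 minutes
print(f"Oven runtime: {oven_seconds:.1f} seconds")  # ~1.0 second
```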
It’s worth noting that earlier public estimates of ChatGPT’s per-query energy usage were much higher. Some studies in 2023 had pegged it at around 2.9 Wh per prompt, which is over 8 times the official figure. Those estimates suggested AI chats were dramatically more power-hungry, but Altman’s data (0.34 Wh average) implies ChatGPT is more efficient than we thought – or at least that the previous numbers were overestimates. The discrepancy likely comes from differences in methodology (what counts as an “average” query) and improvements in model efficiency over time. Regardless, we now have a clearer baseline for ChatGPT’s energy consumption per interaction.
(Even beyond electricity, consider the often-overlooked water footprint: One study found that generating a 100-word email with an AI like ChatGPT can consume about 1.4 liters of water behind the scenes (for server cooling) – roughly the volume of three 16-oz water bottles. A simple three-word response (“You are welcome”) might sip 40–50 mL of water. Tiny per use, but it scales with millions of users.)
From Watts to GWh: ChatGPT’s Massive Yearly Footprint
A third of a watt-hour per query might not sound like much, but when you scale that up to the enormous usage of ChatGPT, the totals become staggering. As of mid-2025, ChatGPT processes over 2.5 billion user prompts each day [2]. Multiply that by 0.34 Wh per prompt, and you get approximately 850 million Wh per day – which is 850 megawatt-hours (MWh) daily. Over a full year, that’s about 311 gigawatt-hours (GWh) of electricity consumption just for answering ChatGPT questions.
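That scale-up (plus the matching water figure) takes only the two per-prompt numbers reported above, small rounding differences aside:

```python
# Scaling the official per-prompt figures to ChatGPT's reported volume.
PROMPTS_PER_DAY = 2.5e9     # reported daily prompts
WH_PER_PROMPT = 0.34        # OpenAI's average energy per query
ML_WATER_PER_PROMPT = 0.32  # cooling water per query

daily_mwh = PROMPTS_PER_DAY * WH_PER_PROMPT / 1e6      # Wh -> MWh
annual_gwh = daily_mwh * 365 / 1000                    # MWh -> GWh
daily_water_liters = PROMPTS_PER_DAY * ML_WATER_PER_PROMPT / 1000

print(f"Electricity: {daily_mwh:,.0f} MWh/day, ~{annual_gwh:,.0f} GWh/year")
print(f"Cooling water: {daily_water_liters:,.0f} liters/day")  # ~800,000 L/day
```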

To put 311 GWh in context, that’s roughly the annual electricity usage of 28,000–30,000 U.S. homes. In terms of cost, if we assume an average industrial electricity rate of roughly $0.10–0.13 per kWh, running ChatGPT likely incurs on the order of $30–40 million in electricity expenses per year for OpenAI (in line with Altman’s “tens of millions” remark). In terms of environmental impact, 311 GWh consumed in a year could translate to well over 100,000 tons of CO₂ emissions (depending on the energy source mix) – equivalent to the yearly emissions of ~20,000 cars on the road.
For a more relatable comparison, consider the alternative uses of that much energy. Just one year of ChatGPT’s operation (at current usage levels) consumes about as much electricity as it would take to fully charge over 3 million electric cars or deliver more than 20 billion smartphone charges (at roughly 15 Wh per charge). In short: the convenience of AI chat at a global scale carries a hefty energy cost, hidden behind each individual query.
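All of these comparisons come from a handful of rule-of-thumb conversion factors. Here’s a hedged sketch – every constant below is a commonly assumed value, not an OpenAI figure, and the CO₂ numbers in particular swing with the grid mix:

```python
# Putting ~311 GWh/year in context, using assumed rule-of-thumb factors.
ANNUAL_GWH = 311
annual_kwh = ANNUAL_GWH * 1e6

homes = annual_kwh / 10_800            # ~10,800 kWh per U.S. home per year
cost_low = annual_kwh * 0.10 / 1e6     # $0.10/kWh industrial rate, in $M
cost_high = annual_kwh * 0.13 / 1e6    # $0.13/kWh
co2_tons = annual_kwh * 0.4 / 1000     # ~0.4 kg CO2 per kWh (grid-mix dependent)
cars = co2_tons / 5                    # ~5 t CO2 per car per year
ev_charges = annual_kwh / 100          # ~100 kWh per large EV battery
phone_charges = annual_kwh / 0.015     # ~15 Wh per smartphone battery

print(f"Homes powered:    ~{homes:,.0f}")           # ~29,000
print(f"Electricity cost: ~${cost_low:.0f}M-${cost_high:.0f}M/year")
print(f"CO2:              ~{co2_tons:,.0f} t (~{cars:,.0f} cars)")
print(f"EV full charges:  ~{ev_charges:,.0f}")      # ~3.1 million
print(f"Phone charges:    ~{phone_charges:,.0f}")   # ~20 billion
```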
The Politeness Tax: How “Please” and “Thank You” Add Up
One of the most fascinating (and ironic) contributors to ChatGPT’s energy bill is human politeness. By design, large language models generate and consume tokens for every word in a conversation. This means when you add extra courtesy words like “Hello,” “please,” “thank you,” or “how are you?”, the model dutifully processes them – using a bit more electricity each time. Individually, a couple of extra words draw negligible power, but across billions of interactions those tokens become a sizeable chunk of compute time.
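To see how those tokens add up, here’s a hedged back-of-the-envelope sketch of two ways the tax accrues. The token counts and the share of standalone courtesy messages are illustrative assumptions, not measured values, and the true total is highly sensitive to them:

```python
# How the "politeness tax" accrues, under illustrative assumptions.
WH_PER_PROMPT = 0.34       # OpenAI's average energy per query
PROMPTS_PER_DAY = 2.5e9    # reported daily volume

# Case 1: courtesy words embedded in an otherwise useful prompt.
# Assume energy scales roughly with tokens, ~400 tokens per exchange,
# ~4 extra tokens for "please ... thank you", and 67% of users polite.
embedded_wh = PROMPTS_PER_DAY * 0.67 * (4 / 400) * WH_PER_PROMPT

# Case 2: standalone "thank you!" messages, each of which triggers a
# full model response and so costs roughly a full average query.
# Assume 1% of daily prompts are pure courtesy.
standalone_wh = PROMPTS_PER_DAY * 0.01 * WH_PER_PROMPT

for label, wh in [("embedded", embedded_wh), ("standalone", standalone_wh)]:
    print(f"{label}: ~{wh * 365 / 1e9:.1f} GWh/year")  # a few GWh/yr each
```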
Sam Altman recently pointed out this phenomenon when asked how much polite users were costing OpenAI. His answer: on the order of “tens of millions of dollars” in electricity have been spent handling gratuitous pleases and thank-yous [4]. In other words, there is a real monetary and energy tax on politeness at scale. We might call it the “courtesy cost” of AI. And it’s not a rare occurrence – as mentioned, a large majority of users tend to be polite in prompts by default.
Why do we do it if it’s not required (ChatGPT doesn’t mind if you’re blunt)? Likely it’s just human nature and habit. Some users feel it’s morally right to treat AI respectfully; others half-jokingly say they’re polite “just in case” AI attains some form of agency or memory of our behavior. There’s also an argument that polite language might condition the AI to respond more helpfully or safely – some designers note anecdotally that “using polite language sets a tone for the response.” Regardless of the reasons, this quirk of human-AI interaction has a measurable efficiency impact.
ChatGPT vs. Google Search: Which Is Greener per Query?
Ever since AI chatbots exploded in popularity, people have compared their efficiency to that of traditional search engines. After all, both involve typing a question and getting an answer – but under the hood, the processes differ greatly. So, does asking ChatGPT use more energy than Googling something? The answer appears to be yes, but how much more has been a moving target.
Early analyses in 2024 suggested a dramatic gap. A report by the Electric Power Research Institute (EPRI) estimated that sending a query through ChatGPT uses about 10× more energy than a standard Google search without any AI features. This figure (echoed by other researchers) stemmed from the assumption that a typical Google search consumes ~0.3 Wh, whereas ChatGPT was thought to consume ~2.9 Wh or more per query [5]. In essence, AI chat was considered an order of magnitude more power-hungry than search.
However, as we discussed, OpenAI’s official metric significantly revised ChatGPT’s per-query usage downward. Using Altman’s number of 0.34 Wh per prompt, the comparison changes: ChatGPT likely uses only slightly more energy than a Google search – on the order of ~1.1× a basic search query’s energy. Google itself doesn’t publish exact figures for per-search energy, but 0.3 Wh per query is a commonly cited estimate from the late 2010s. If those figures hold, the gap between ChatGPT and Google is much narrower than initially feared.
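The revised ratio is just the two per-query figures side by side (keeping in mind that the 0.3 Wh Google number is a dated, commonly cited estimate rather than an official figure):

```python
# The revised ChatGPT-vs-Google comparison in one line of arithmetic.
chatgpt_wh = 0.34   # OpenAI's official average per prompt
google_wh = 0.30    # commonly cited (and dated) per-search estimate

print(f"ChatGPT uses ~{chatgpt_wh / google_wh:.2f}x a Google search")  # ~1.13x
```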
Why the discrepancy? It could be that early estimates didn’t benefit from internal data and perhaps included inefficiencies or worst-case assumptions. It’s also possible that Google’s energy per search has risen in recent years (especially with AI enhancements creeping into search results), while OpenAI has optimized their inference servers. In any case, AI chat is still less efficient than search on a per-query basis – just not by an order of magnitude according to the latest data.
Toward a More Efficient (and Informed) AI Future
Understanding ChatGPT’s energy footprint helps users and companies make informed decisions about deploying such AI at scale. There’s an environmental imperative to improve efficiency, given that hundreds of millions of people now interact with ChatGPT daily. The good news is that awareness is growing: OpenAI’s transparency about per-query usage and the spotlight on the “politeness problem” could incentivize optimizations (in software and user behavior alike). Small changes – like slightly truncating overly polite prompts or fine-tuning models to handle conversational niceties more efficiently – might save significant energy when multiplied by billions of uses.
On the user side, this doesn’t mean we all need to become curt or stop using ChatGPT. But it’s a reminder that the digital world isn’t immaterial – our AI habits have a real-world footprint. As AI integrations expand (in search engines, virtual assistants, business tools, etc.), efficiency will be a key part of responsible AI development. Engineers are exploring everything from improved algorithms to specialized hardware (AI chips) and renewable-powered data centers to mitigate these impacts. After all, if AI is the future, it ought to be a sustainable one.
In summary
ChatGPT’s “hidden costs” aren’t so hidden anymore. Each question we casually ask the AI carries a small cost in watts and water, which at scale becomes a significant global resource demand. Even our habit of politeness – normally a virtue – has an energy price when multiplied by millions of interactions. By recognizing these costs, we can push for greater efficiency in AI systems and perhaps tweak our own behaviors, ensuring that being polite and being sustainable don’t have to be at odds. The next time you thank an AI, you’ll know exactly what it’s costing – and that knowledge is power (in more ways than one).
FAQ
Q: How much energy does a single ChatGPT query actually use?
A: According to OpenAI CEO Sam Altman, each ChatGPT query uses approximately 0.34 watt-hours of electricity. That’s roughly equivalent to powering an LED lightbulb for 2 minutes or running an electric oven for just over one second.
Q: Does saying “please” and “thank you” to ChatGPT really waste energy?
A: Yes, but the individual impact is minimal. Extra words like “please” and “thank you” require additional processing tokens, which consume more energy. At scale, with 67% of users being polite to AI, this has cost OpenAI “tens of millions of dollars” in additional electricity costs.
Q: How much more energy does ChatGPT use compared to Google Search?
A: Current data suggests ChatGPT uses approximately 1.1× the energy of a basic Google search (0.34 Wh vs ~0.3 Wh). Earlier estimates claimed 10× more energy, but official figures show the gap is much smaller than initially thought.
Q: What’s ChatGPT’s total environmental impact?
A: ChatGPT processes 2.5 billion queries daily, consuming approximately 311 gigawatt-hours annually – enough to power nearly 30,000 U.S. homes. This translates to over 100,000 tons of CO₂ emissions yearly, equivalent to about 20,000 cars.
Q: How much water does ChatGPT use for cooling?
A: Each ChatGPT query uses about 0.32 milliliters of water for data center cooling. While tiny per query, this adds up to significant water usage at global scale.
Q: Should I stop being polite to ChatGPT to save energy?
A: While being direct saves energy, the individual impact is minimal. The bigger opportunity is for AI companies to optimize their systems for common conversational patterns and for users to be mindful of unnecessarily long prompts.

