Quick answer: A typical ChatGPT prompt produces around 0.16 g CO₂e and a typical Claude prompt around 0.21 g CO₂e. Multiply your weekly prompt count by these figures, then by 52, to get an annual estimate. For a heavy user sending 100 prompts a day, that works out to roughly 6 to 8 kg CO₂e per year, less than a short car journey.
Estimating the carbon footprint of your AI usage is more straightforward than it sounds. You combine per-prompt emission factors with your own prompt counts, then adjust for your electricity mix if you want extra accuracy. This guide walks through the method, gives you sensible numbers to plug in, and puts the result in context so you can see how AI use compares with everyday activities like driving, streaming or boiling a kettle.
Read more about whether AI is sustainable here.
How much CO₂ does AI use per prompt?
Neither OpenAI nor Anthropic publish official per-prompt emission figures yet, so you need to rely on independent estimates and treat them as approximations. The best current numbers come from researchers and methodology groups working with publicly available data on data centre power draw and grid carbon intensity.
A reasonable, order-of-magnitude set of factors drawn from recent methodology work is:
| Assistant | Energy per prompt | Emissions per prompt |
|---|---|---|
| ChatGPT | ~0.34 Wh | ~0.16 g CO₂e |
| Claude | ~0.35 Wh | ~0.21 g CO₂e |
These figures cover “typical” text prompts, not large code generation or long-context jobs. They already include assumptions about how power is used in data centres and about the average grid mix.
Step-by-step method
1. Track your usage
Decide what you want to count: number of prompts per day, week or month sent to ChatGPT and Claude. A simple spreadsheet with date, model and prompt count is usually enough. If you use both services, track them separately because their per-prompt emissions differ.
For a typical knowledge worker, daily prompt counts vary widely. Light users might send 5 to 10 a day, while heavy users such as developers, researchers and content creators often send 50 to 200.
2. Pick per-prompt emission factors
Use the figures from the table above as your starting point. If you want to adjust for prompt length or task type, see step 4.
3. Calculate your annual emissions
Convert prompts to emissions for each system:
- ChatGPT emissions = prompts × 0.16 g CO₂e
- Claude emissions = prompts × 0.21 g CO₂e
Example: if over a year you send 3,000 prompts to each service:
- ChatGPT: 3,000 × 0.16 ≈ 480 g CO₂e
- Claude: 3,000 × 0.21 ≈ 630 g CO₂e
- Total ≈ 1.11 kg CO₂e
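The calculation above can be sketched in a few lines of Python; the factors are the independent per-prompt estimates from the table earlier, not official figures:

```python
# Per-prompt emission factors in grams of CO2e (independent estimates, not official)
FACTORS_G = {"chatgpt": 0.16, "claude": 0.21}

def annual_emissions_kg(prompts_by_service):
    """Total annual emissions in kg CO2e for a dict of {service: annual prompt count}."""
    total_g = sum(FACTORS_G[s] * n for s, n in prompts_by_service.items())
    return round(total_g / 1000, 2)  # grams -> kilograms

# 3,000 prompts to each service over a year, as in the example above
print(annual_emissions_kg({"chatgpt": 3000, "claude": 3000}))  # -> 1.11
```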

For rough intuition, one analysis suggests that 10 ChatGPT prompts a day works out to around 0.6 kg CO₂e per year, assuming a global average grid.
4. Optionally adjust for your location
The factors above assume an average global grid intensity. The actual data centre that processes your prompt may run on cleaner or dirtier electricity. France, with heavy nuclear, has a very low grid intensity; Poland, India and parts of the United States have a much higher one.
If you want to be more precise, you can:
- Find a typical grid carbon intensity for the region where the model is likely hosted (often North America or Western Europe; data-centre regions are not usually disclosed).
- Use the formula: CO₂ per prompt = Energy per prompt × PUE × Grid carbon intensity
PUE is power usage effectiveness; many hyperscale data centres sit around 1.1 to 1.3. The ITU's Greening Digital Action report is a good starting point for grid intensity figures. Without information on where your specific sessions run, sticking with the "global average" numbers is usually the honest, transparent choice.
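As a rough sketch, the formula can be applied directly. The PUE and grid-intensity values below are illustrative assumptions (roughly a global-average grid versus a low-carbon one such as France's), not measured figures:

```python
def co2_per_prompt_g(energy_wh, pue, grid_g_per_kwh):
    """CO2e per prompt in grams: energy (Wh) x PUE x grid intensity (g CO2e/kWh)."""
    return energy_wh * pue * grid_g_per_kwh / 1000  # Wh -> kWh

# Illustrative values: 0.34 Wh per prompt, PUE of 1.2
print(round(co2_per_prompt_g(0.34, 1.2, 475), 2))  # global-average-style grid -> 0.19
print(round(co2_per_prompt_g(0.34, 1.2, 56), 2))   # low-carbon grid -> 0.02
```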
5. Include training only if you really need to
Per-user training emissions are hard to allocate. They can dwarf usage-phase numbers for very large models, but dividing total training emissions by billions of prompts makes the per-prompt share very small.
For an individual user, most practical calculators focus on usage-phase emissions, as shown in the numbers above.
ChatGPT vs Claude: which has the bigger footprint?
Based on current public estimates, Claude prompts produce roughly 30% more CO₂e per prompt than ChatGPT prompts. This reflects differences in model size, hardware and data centre choices rather than any fundamental gap in approach.
Two things worth keeping in mind:
- These numbers shift as both companies update their models and infrastructure. A figure from 2024 may not hold in 2026.
- Per-prompt differences are tiny compared with the variation between users. Someone sending 500 prompts a week to ChatGPT will easily out-emit someone sending 50 prompts a week to Claude.

The practical takeaway: model choice matters far less than how often you prompt and how long your prompts are.
How AI emissions compare to everyday activities
Putting prompt-level numbers in context helps:
| Activity | Approximate CO₂e |
|---|---|
| One ChatGPT prompt | 0.16 g |
| One Google search | ~0.2 g |
| One Claude prompt | 0.21 g |
| Boiling a kettle for one cup of tea | ~15 g |
| Streaming an hour of HD video | ~36 g |
| Sending an email with attachment | ~50 g |
| Driving 1 km in an average petrol car | ~180 g |
| Return flight London to New York | ~1,000,000 g (1 tonne) |

A heavy AI user sending 200 prompts a day to ChatGPT generates around 12 kg CO₂e a year. That is roughly the same as driving 70 km in a petrol car, or boiling the kettle around 800 times.
What is being done to reduce AI emissions?
Both major AI providers and their data centre partners are working on the energy and carbon side of the picture, though progress and disclosure vary.
- Renewable matching. Microsoft, Google and Amazon, which between them host most major AI workloads, have set carbon-free or renewable-energy targets for 2030, with Google aiming for 24/7 carbon-free energy. Note that "matching" often means buying renewable certificates rather than running on clean electricity in real time.
- Hardware gains. Newer chips such as Nvidia's H200 and Blackwell families deliver more inference per watt, lowering per-prompt energy.
- Model compression. Smaller, distilled models can serve many queries at a fraction of the energy of frontier models. Anthropic’s Haiku and OpenAI’s mini variants are examples.
- Greater transparency. Pressure from regulators, particularly the EU AI Act and emerging UK rules, is pushing providers to publish more detail on training and inference emissions.
The most useful thing to know is that the per-prompt figure is likely to drop over the next few years, but total AI energy use will probably grow because more people will use it more often.
How to make this practical for you
Simple personal estimate
- Count your prompts over a typical week (e.g. 70 ChatGPT, 30 Claude).
- Annualise the figure: multiply by 52.
- Apply the per-prompt factors above, then convert grams to kilograms by dividing by 1,000.
- This gives you a transparent, repeatable estimate you can update as your usage changes.
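The four steps above can be turned into a tiny calculator; the weekly counts are the hypothetical example from the first bullet, and the factors are the estimates from earlier in this article:

```python
def personal_estimate_kg(weekly_chatgpt, weekly_claude):
    """Annualise weekly prompt counts (x52) and convert grams to kilograms."""
    weekly_g = weekly_chatgpt * 0.16 + weekly_claude * 0.21  # g CO2e per prompt
    return round(weekly_g * 52 / 1000, 2)

# e.g. 70 ChatGPT and 30 Claude prompts in a typical week
print(personal_estimate_kg(70, 30))  # -> 0.91 kg CO2e per year
```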
If you want a bit more rigour
- Separate short “chatty” prompts from long, document-heavy or code prompts. Multiply long ones by 2 to 3 times as a crude correction, because longer contexts and outputs draw more compute and energy.
- Document your assumptions (per-prompt factors, any multipliers, grid mix) so you or others can revisit them as better data appears.
- If you use AI for work, allocate the resulting emissions to your scope 3 reporting under category 1 (purchased goods and services) or category 11 (use of sold products), depending on whether you are the buyer or the seller.
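The long-prompt correction can be sketched as a simple weighting. The 2.5x multiplier (a midpoint of the 2 to 3 range above), the 40/10 split and the use of the ChatGPT factor are all illustrative assumptions:

```python
def weighted_prompts(n_short, n_long, long_multiplier=2.5):
    """Express a mix of short and long prompts as short-prompt equivalents."""
    return n_short + n_long * long_multiplier

# 40 short chats plus 10 long document/code prompts in a typical week
equivalent = weighted_prompts(40, 10)       # 65 short-prompt equivalents
annual_kg = equivalent * 52 * 0.16 / 1000   # ChatGPT factor, grams -> kg
print(round(annual_kg, 2))                  # -> 0.54
```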
How big is this in context?
Several independent analyses suggest that even fairly heavy personal ChatGPT use is a tiny fraction (well under 1%) of an average person's annual carbon footprint in high-income countries. One analysis, for example, estimates that 50 typical prompts use around 15 to 20 Wh of electricity, corresponding to under 10 g of CO₂e on a typical grid.
To put that in perspective, the average UK resident has a footprint of around 7 to 10 tonnes of CO₂e per year. AI use, even heavy AI use, sits at the gram level rather than the tonne level. That doesn’t make it irrelevant. Multiplied across millions of users it adds up. But it does mean reducing your personal AI use is a far smaller lever than changes to diet, transport or home heating.
Frequently asked questions
How much CO₂ does one ChatGPT prompt produce?
A typical ChatGPT prompt produces approximately 0.16 g CO₂e, based on independent estimates of energy use per query and average grid carbon intensity. Longer prompts, image generation or code-heavy tasks can use 2 to 3 times more.
Do I need to count training emissions?
For practical purposes, no. Training emissions for large models are split across billions of prompts, making the per-user share negligible. Most personal carbon calculators focus on usage-phase emissions only.
How can I reduce my AI carbon footprint?
Three practical steps:
1. Send fewer, longer prompts rather than many short ones;
2. Use smaller models such as Haiku, Mini or Flash variants for routine tasks;
3. Avoid repeating the same query when you already have an answer.
None of these will move your overall footprint by much, but they do reduce wasted compute.
Do AI companies publish official emissions figures?
Not yet. Neither OpenAI nor Anthropic publish official per-prompt figures. Most public estimates come from third-party researchers using disclosed model sizes, hardware specifications and grid data. EU AI Act requirements will likely force more disclosure from 2026 onwards.
Should businesses report AI emissions?
If AI is a material part of your operations, yes. Most companies place purchased AI services under scope 3 category 1 (purchased goods and services). The numbers will be small compared with travel or cloud hosting in general, but reporting them demonstrates complete, transparent accounting.
Why not start measuring your website’s carbon footprint?
Before you improve your website’s performance, start by measuring the carbon footprint – then you can report on the carbon footprint savings as you speed up your site.
Use our Kanoppi carbon footprint plugin. This intuitive tool provides measurements and insights about your WordPress website’s carbon footprint and helpful recommendations for reducing it.
Sources and further reading
- Methodology and transparency calculations, Offset AI
- Measuring the environmental impact of AI inference, Google Cloud
- AI prompt carbon emissions, CNN
- Greening Digital Action, ITU [PDF]
- Carbon footprint of ChatGPT, Hannah Ritchie
- AI energy carbon emissions, Science News
- Measuring the carbon cost of AI, Everything Green
- Environmental impact of AI study, PMC
Ready to get started with Kanoppi?
Our innovative WordPress plugin is in private beta testing and launching soon. If you are interested, please request a demo and join our waiting list.