Is ChatGPT Wasteful of Energy?

Peter Coates
16 min read · Nov 28, 2024


It seems as if everywhere I turn, there are articles about the extraordinary energy cost of ChatGPT, which is OpenAI’s Large Language Model (LLM) artificial intelligence query service. It’s an interesting question because it’s partly technical and partly a matter of human values.

The articles and postings I’ve seen mostly say essentially the same thing, which is that LLM-based AI requires huge, energy-intensive data centers. This is true, but hand-wringing about the size of the industrial plant required to support AI makes no sense outside of the context of what the service costs per query, and how that cost compares to other things that we do, both with and without computers.

At the risk of spoiling the surprise, it is pretty trivial to show that not only is ChatGPT not very expensive in terms of energy use compared to many of the online services that we routinely use, it is in fact not very energy intensive compared to most of the things we do in the real world!

To get started, the energy cost per ChatGPT query is easy to come by. ChatGPT-4 reports that it consumes between 0.001 and 0.01 kWh per query. But is that a lot or a little? My first reaction was, holy cow, that seems like a lot! But for the cost per query to mean anything, we need to have a feel for how much electrical power that is, and how it compares to other things we routinely do with power.

Power

Before looking at the details, it’s important to understand the terminology of power consumption. Electrical power is measured in Watts, and energy consumption is measured in Watt-hours, abbreviated Wh; one Watt-hour is one Watt of power sustained for one hour. One Wh is a tiny amount, so we usually talk about kilowatt-hours, or kWh, each of which is 1000 Watt-hours, i.e., 1000 Watts of power for one hour.

Space heaters are a great way to visualize small-scale energy cost for two reasons. Firstly, because the high setting of a space heater typically uses one kilowatt of power, or 1kWh per hour (one kilowatt-hour per hour). Secondly, it also gives a feeling for the heat involved, because when you use electricity in an electronic device, almost all of the power consumed is converted to heat. If an electronic device uses a kilowatt of power, you have to somehow do something with approximately one space heater’s worth of heat. That’s why your 50W laptop has a strong fan — it has to blow 50/1000=0.05 of a kilowatt’s worth of heat, i.e., 1/20th of a space heater, out of the case and into the ambient room air.

You can also think of other forms of power in terms of the equivalent amount of electricity. An adult human body burns about 100W of power just doing nothing in particular, or about two laptops’ worth of energy, but the actual fuel we are burning is plants and animals.

We normally think of the energy cost of a car in terms of miles/gallon, but there are about 34kWh of energy in a gallon of gas, so you can also think of the energy cost of driving in terms of the equivalent kWh per mile. If your car gets 15 miles per gallon, you can think of it as 34kWh/15 miles, or about 2.27kWh/mile, even though it’s not used in the form of electricity.

(Note, battery powered cars are very different from internal combustion engine cars in this respect. A Tesla, for instance, gives about 32 miles per 11.5 kWh of charging, which would work out to 0.35kWh per mile, far less than an internal combustion engine. The biggest reason is that internal combustion engines convert only 20–25% of the energy in the fuel into motion. Also, the miles/kWh for electric vehicles doesn’t include the waste heat in creating the electricity at the plant, etc. The subject is its own rabbit hole, but suffice it to say, electric cars are much more efficient users of energy.)

This means that driving your 15mpg car for one mile consumes about as much power as running a full size space-heater for two hours and sixteen minutes, or equivalently, you could say that running your space heater for one hour equals the energy consumption of 0.44 miles of driving. Different ways of saying the same thing.
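These conversions are easy to replay in a few lines of Python. This is just a sketch using the figures from the text: 34kWh per gallon, a 15mpg car, and a 1kW space heater.

```python
# Convert driving energy into space-heater equivalents.
# Figures from the text: 34 kWh per gallon of gas, 15 mpg, a 1 kW heater.
KWH_PER_GALLON = 34.0
MPG = 15.0
HEATER_KW = 1.0

kwh_per_mile = KWH_PER_GALLON / MPG                # energy cost of one mile
heater_hours_per_mile = kwh_per_mile / HEATER_KW   # heater-hours per mile
miles_per_heater_hour = HEATER_KW / kwh_per_mile   # driving equal to 1 heater-hour

print(f"{kwh_per_mile:.2f} kWh per mile")                     # ~2.27
print(f"{heater_hours_per_mile:.2f} heater-hours per mile")   # ~2.27 (2 h 16 min)
print(f"{miles_per_heater_hour:.2f} miles per heater-hour")   # ~0.44
```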

Practically anything people in an industrialized society do has an energy cost. For example, take going for a run. In many areas of the US, for much of the year, simply opening the door incurs a small extra energy cost for either heating or air conditioning the outside air you let in. The clothing you run in will need to be washed and dried, which also uses power. The run will use some fraction of the useful life of your running shoes and clothes, which took power to manufacture and to ship to you. You used a small amount of power to charge your iPhone and ear buds so that you can listen to music while you run. More significantly, the servers and the network that stream the music to you also use power.

A laptop computer burns between 30 and 70 Watts depending on the model and what you are doing. If we say that your laptop averages 50W, that means you could run it for about 680 hours, or round-the-clock for about a month, on the energy in a gallon of gas. Or looked at the other way around, the energy to run your laptop for an hour would take your gasoline powered car about 116 feet, i.e., the length of the street frontage of about five modest size row-houses.
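The laptop figures follow from the same assumptions (50W laptop, 34kWh per gallon, 15mpg); the only new number is the 5280 feet in a mile:

```python
# How far does a gallon of gas take a laptop, and a laptop-hour take a car?
KWH_PER_GALLON = 34.0
LAPTOP_WATTS = 50.0
MPG = 15.0
FEET_PER_MILE = 5280.0

laptop_hours_per_gallon = KWH_PER_GALLON * 1000 / LAPTOP_WATTS  # 680 hours
days_round_the_clock = laptop_hours_per_gallon / 24             # ~28 days

wh_per_mile = KWH_PER_GALLON * 1000 / MPG                       # ~2267 Wh/mile
feet_per_laptop_hour = LAPTOP_WATTS / wh_per_mile * FEET_PER_MILE

print(f"{laptop_hours_per_gallon:.0f} laptop-hours per gallon")  # 680
print(f"{days_round_the_clock:.0f} days round the clock")        # ~28
print(f"{feet_per_laptop_hour:.0f} feet of driving")             # ~116
```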

Compared to Just Typing

We went through all that because saying whether ChatGPT takes “a lot of energy” is meaningless unless you have a way to compare it to the energy expenditure for the alternative ways people might otherwise spend that time or effort.

One alternative way to spend the time is to try to use ordinary Google Search to get answers. ChatGPT reports that a plain vanilla Google query takes about 0.0003kWh of power, i.e. 3/10 of one Watt-hour. Given the numbers above, a ChatGPT query takes between 3.3 and 33.0 times as much power as a Google query. Yikes! That sounds like a lot, right?

It does sound like a lot, but actually, we’ve only pushed the question back, because we don’t know if Google queries are “expensive” in terms of energy. Think of the difference in cost between table salt and the rock salt you throw down to melt ice. A one-pound box of table salt costs about $2.00, which is about 60 times as expensive as rock salt, which costs about $0.0325 per pound ($65/ton). However, nobody complains about the relatively high cost of table salt because, who cares? No matter how much of a premium you are paying for someone to make rock salt into uniform snow-white crystals, you’re probably not spending more than a few pennies a week on it, so what does it matter if rock salt for the sidewalk is 60x cheaper?

So how expensive are ChatGPT queries in comprehensible terms?

We could ask ChatGPT for the final answers, but let’s do some back of the envelope computation before doing so.

According to ChatGPT-4, a query costs between 1Wh and 10Wh. For simplicity, we’ll just arbitrarily pick a number in the middle of the range, say, 5Wh/query. This is probably high, because in most circumstances, simple questions will be disproportionately more common than complex questions, but let’s go with the middle value anyway.

That means if you did ChatGPT-4 queries at a rate of ten queries per hour (10 * 5Wh = 50Wh) the datacenter power behind your ChatGPT queries would be approximately equal to the ongoing energy cost of using the laptop (not the total cost, which would include all the energy used to manufacture and ship it, etc, which is a surprisingly large amount.)

Ten ChatGPT queries an hour would be a query every six minutes, which would be a very tough pace for a user to sustain throughout the day, every day, but still, it’s conceivable that someone using GPT-4 could cause the datacenter to burn as much energy as the laptop uses already.

It would have to be an unusual case, because what kind of queries could a user come up with that take an average of only six minutes each to conceive, type in, and then read and use the answer, hour after hour, day after day? Very likely usage at 1/10 of that rate, or one query an hour, would qualify a person as a heavy user.

So intuitively, it’s probably almost impossible for a person to use ChatGPT so much that the energy consumption is equal to what they would use just sitting at a laptop typing, and in a realistic case it would average over time to be a small fraction of the energy it takes just to run the laptop.
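As a sanity check on the break-even rate, here’s the arithmetic, using the article’s 5Wh-per-query midpoint and 50W laptop, plus (as an assumption) a heavy user doing 32 queries over an 8-hour day:

```python
# How fast would you have to query to match the laptop's own draw?
LAPTOP_WATTS = 50.0
WH_PER_QUERY = 5.0   # mid-range estimate used throughout the article

breakeven_queries_per_hour = LAPTOP_WATTS / WH_PER_QUERY   # 10 queries/hour

# A heavy user doing 32 queries compressed into an 8-hour day:
heavy_queries_per_hour = 32 / 8                            # 4 queries/hour
heavy_datacenter_wh = heavy_queries_per_hour * WH_PER_QUERY  # 20 Wh/hour
share_of_laptop = heavy_datacenter_wh / LAPTOP_WATTS         # 0.4

print(f"break-even: {breakeven_queries_per_hour:.0f} queries/hour")  # 10
print(f"heavy user: {share_of_laptop:.0%} of the laptop's draw")     # 40%
```

Even at the very top of the heavy-user range, the datacenter side stays well under what the laptop itself is burning.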

Back of The Envelope Driving

Another way to look at it: dividing the kilowatt-hours in a gallon of gas by 5Wh tells you that it would take about 6800 queries to use as much energy as our 15mpg reference vehicle uses driving fifteen miles.

Let’s say (just an intuitive estimate) that the heaviest users query ChatGPT on average once every twenty minutes during an eight hour day, i.e., 24 times a day.

At 6800 queries/gallon, that works out to 283 days of heavy querying. That’s pretty close to a year of heavy ChatGPT use, and more than a year for someone who takes weekends and holidays off.

That estimate turns out to be close to what GPT reports. GPT defines heavy use as 500 to 1000 queries a month, which is between 16 and 32 per day. They define moderate use as 100 to 500 queries a month, or between 3.3 and 16 queries a day. The average of 16 and 32 just happens to be 24, which is what we estimated was the absolute high end, so our estimate for the max was only a little lower than what ChatGPT reports as the high end for a heavy user. Good guess.

If we divide the number of queries in a gallon of gas by the query rate ChatGPT gives for a moderate user, we see that such a user would get 704 days of ChatGPT, or about two years, for the energy used in a fifteen-mile round trip in the car (7.5 miles each way).
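The queries-per-gallon arithmetic above is simple enough to lay out in full (the 24/day heavy estimate and the 3.3–16/day moderate range are the figures from the text):

```python
# How many days of ChatGPT use fit in one gallon of gas?
KWH_PER_GALLON = 34.0
WH_PER_QUERY = 5.0

queries_per_gallon = KWH_PER_GALLON * 1000 / WH_PER_QUERY  # 6800

heavy_per_day = 24                    # our intuitive estimate above
moderate_per_day = (3.3 + 16) / 2     # ~9.65, midpoint of GPT's moderate range

days_heavy = queries_per_gallon / heavy_per_day        # ~283 days
days_moderate = queries_per_gallon / moderate_per_day  # ~704 days

print(f"{queries_per_gallon:.0f} queries per gallon")   # 6800
print(f"{days_heavy:.0f} days of heavy use")            # ~283
print(f"{days_moderate:.0f} days of moderate use")      # ~704
```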

How Does It Compare to Other Activities?

We saw above that it would be very difficult, maybe impossible, for even a compulsive ChatGPT user to burn as much energy on the datacenter side as his or her laptop consumes, because it would require almost triple the query volume of the heavy user category.

Realistically, for ordinary users, say, 250 queries a month, i.e., one query per working hour, ChatGPT doesn’t add very much to the amount of energy the user is consuming just by having the computer turned on. It would bump the 50Wh per hour for the laptop up to 55Wh per hour. Therefore, moderate use of ChatGPT only increases the energy you use simply playing solitaire by about 1/10th, and considerably less for a heavier machine like a desktop tower. This is actually pretty typical of almost all online activity. As a rule, most of the total energy consumed messing around on the Internet is spent on the user side, not on the datacenter side.

Also, as a general rule, the so-called “embodied energy” of the computer, i.e., the total energy used to produce the materials and manufacture the device, usually substantially exceeds the total energy consumed in its useful life. In other words, most of the total energy cost of your computer over its useful life happened before you opened the box.

Consider that the average American burns about 1.36 gallons of gas per day. The datacenter energy that the heaviest ChatGPT users would use in a year is less than the energy in the gas an average American burns each day. For a more typical user, average daily gas consumption would represent years of ChatGPT use.

In energy terms, iPhones are a remarkable bargain, with a full charge holding about 5.45Wh. An entire day’s charge is approximately the energy cost we’re estimating for a single ChatGPT query. So ChatGPT is relatively energy intensive compared to the baseline cost of keeping an iPhone charged all day. But, charging a phone is amazingly cheap compared to almost anything else we do. On the other hand, the embodied energy of the phone is huge — about 81% of the total lifetime energy cost of the device. Only about 16% is the cost of charging it, the network, datacenter, etc.

The exact cost of any given digital activity depends on many factors, such as the particular device (phone vs TV vs computer, screen size, processor, video card, location, what we do, etc.), but for a general idea of what some more expensive computing activities cost in terms of energy, watching streamed video on your smartphone takes something like 0.07kWh per hour, most of which is on the datacenter and streaming side, because phones are such efficient display devices. Therefore, an hour of streaming to your phone equals seven computationally expensive ChatGPT queries, or 70 cheap queries, or 14 of our hypothetical 0.005kWh queries. Rearranging the numbers, we see that one hour of streaming video to your phone equals one query every 34 minutes for an entire work-day, which is heavy use, or a few work days of querying with moderate use. So realistic levels of ChatGPT use are quite cheap compared to streaming video on your phone.
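The streaming equivalences work out like this, using the article’s 0.07kWh/hour figure and its three per-query price points:

```python
# One hour of phone streaming, expressed in ChatGPT queries.
STREAM_KWH_PER_HOUR = 0.07
stream_wh = STREAM_KWH_PER_HOUR * 1000    # 70 Wh per hour of video

expensive_queries = stream_wh / 10        # at the 10 Wh high end
cheap_queries = stream_wh / 1             # at the 1 Wh low end
mid_queries = stream_wh / 5               # at our 5 Wh midpoint

minutes_per_query = 8 * 60 / mid_queries  # spread over an 8-hour workday

print(f"{expensive_queries:.0f} expensive queries")   # 7
print(f"{cheap_queries:.0f} cheap queries")           # 70
print(f"{mid_queries:.0f} mid-range queries")         # 14
print(f"one query every {minutes_per_query:.0f} minutes")  # ~34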

On the other hand, watching streaming video on a 50" LED TV consumes as much as 100 times as much energy as watching the same video on your phone. Part of the expense is because the TV displays a lot more pixels, which means that the datacenter has to send more data out, but the bulk of the higher cost is from the large, bright display and the sound. If you are watching streaming video on a 50" TV, the energy cost of five hours of viewing, i.e., about three Netflix movies, is equal to the energy cost of a year of ChatGPT use. It is not unusual for people to run their TV five hours a day, so it seems fair to say that the energy cost of almost any conceivable level of ChatGPT use for a year is negligible compared to the energy cost of a season of Game of Thrones.

Baking a cake takes about 1.5kWh. That’s equivalent to 300 of the standard 5Wh queries we’ve been talking about, which would be the low end of moderate ChatGPT use for three months. So anywhere from four cakes a year to a cake every two weeks would be in the energy cost range of a moderate ChatGPT user. Therefore, in energy terms, you might say that using ChatGPT moderately heavily would be similar to being a home baker.

Suppose the local library is 2.5 miles away. That’s a five-mile round trip, so it takes 1/3 of a gallon of gas in our reference 15mpg vehicle. At 34kWh per gallon, the energy for the round trip is equivalent to 11.3kWh, which is 2266 ChatGPT queries, i.e., well into the moderate range of usage for a year. So ChatGPT is extremely cheap compared to driving to the library.

The low cost of baking compared to driving surprised me. If you drove our reference vehicle on the highway for the half-hour that a cake would be in the oven it would go about thirty miles and burn two gallons of gas. That is enough energy to bake 45 cakes. So both ChatGPT and baking are insanely cheap compared to driving, but ChatGPT is significantly cheaper.
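The library-trip and cake-versus-driving numbers come straight out of the same constants (the 60mph highway speed for the half-hour drive is an assumption implied by the text’s “about thirty miles”):

```python
# Library trips and cakes, priced in gas and in ChatGPT queries.
KWH_PER_GALLON = 34.0
MPG = 15.0
WH_PER_QUERY = 5.0
CAKE_KWH = 1.5

# Five-mile round trip to the library:
trip_kwh = 5.0 / MPG * KWH_PER_GALLON            # ~11.3 kWh
trip_queries = trip_kwh * 1000 / WH_PER_QUERY    # ~2266 queries

# Half an hour of highway driving (~30 miles at 60 mph) vs. baking:
drive_kwh = 30.0 / MPG * KWH_PER_GALLON          # 68 kWh (2 gallons)
cakes_per_drive = drive_kwh / CAKE_KWH           # ~45 cakes

print(f"library trip: {trip_kwh:.1f} kWh = {trip_queries:.0f} queries")
print(f"half-hour drive bakes {cakes_per_drive:.0f} cakes")
```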

If 1/3 of a gallon of gas seems like too much energy to waste driving to the library, you could walk. But it’s a hot day, and you get all sweaty walking to and from the library, so you decide to wash your clothes. There’s an energy cost to that. For the sake of argument, say a full change of clothes is 1/5 of a load of laundry. Washing a load might be 0.4kWh, and drying might be 2.25kWh, for a total of 2.65kWh. Divide by five for the proportion of the load that is your five-mile walk’s worth of sweaty clothing, and you get a clothes-washing energy cost for your walk of about 0.53kWh, which is about 106 ChatGPT queries just for washing and drying the clothes you wore to go to the library. There is also your part of the energy cost of running the entire library, plus the energy spent by vehicles stopping and starting so that you can cross the street, etc. All considered, it’s hard to imagine any scenario in which walking to the library saved energy over doing the research on ChatGPT.
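The laundry figures check out the same way (washer and dryer energies are the rough per-load estimates given in the text):

```python
# Energy cost, in queries, of washing the clothes from one library walk.
WASH_KWH = 0.4        # rough per-load estimate from the text
DRY_KWH = 2.25
SHARE_OF_LOAD = 1 / 5  # one change of clothes out of a full load
WH_PER_QUERY = 5.0

walk_kwh = (WASH_KWH + DRY_KWH) * SHARE_OF_LOAD   # ~0.53 kWh
walk_queries = walk_kwh * 1000 / WH_PER_QUERY     # ~106 queries

print(f"{walk_kwh:.2f} kWh = {walk_queries:.0f} ChatGPT queries")
```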

A big surprise (at least to me) is the cost of gaming. Doing the math, 750 queries a month for a heavy user, divided by 8*30=240 working hours, equals 3.125 queries an hour, or 15.625Wh per hour for heavy use, assuming an 8 hour online day.

Typical online gaming uses about 10Wh per hour on the server side for most games, which is somewhat less datacenter energy than our heavy ChatGPT user. However, cloud-based games such as Google Stadia (now discontinued) might use as much as 100Wh to 300Wh per hour on the datacenter side, so a Stadia player burns many times more datacenter energy than a heavy ChatGPT user.

But here’s the thing about gaming: home gaming computers typically use between 250W and 500W, which is five to ten times as much power as a typical laptop (much of the power is for the graphics card). Subtracting the baseline 50W for the laptop, a lightly equipped gamer might be using an extra 200Wh per hour beyond the baseline laptop wattage, or the equivalent of about 13 heavy ChatGPT users. A gamer with a more powerful rig might be using as much as an extra 450Wh per hour beyond the baseline laptop wattage, or the energy equivalent of about 29 heavy ChatGPT users. So even the most extreme ChatGPT consumption amounts to almost nothing compared to the energy cost of online gaming.
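The gaming comparison, spelled out with the heavy-user rate from the previous section (750 queries a month over 240 online hours):

```python
# Gaming rigs measured in heavy-ChatGPT-user equivalents.
WH_PER_QUERY = 5.0
LAPTOP_WATTS = 50.0

# Heavy ChatGPT use: 750 queries/month over 240 online hours.
heavy_wh_per_hour = 750 * WH_PER_QUERY / 240      # ~15.6 Wh/hour

light_rig_extra = 250 - LAPTOP_WATTS              # 200 W beyond the laptop
big_rig_extra = 500 - LAPTOP_WATTS                # 450 W beyond the laptop

print(f"light rig = {light_rig_extra / heavy_wh_per_hour:.1f} heavy users")  # ~12.8
print(f"big rig   = {big_rig_extra / heavy_wh_per_hour:.1f} heavy users")    # ~28.8
```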

Finally, a 100W-equivalent LED lightbulb actually uses about 14Wh per hour. That’s equivalent to almost three ChatGPT queries per hour just to read an old-fashioned paper book in a poorly lit room, which puts you into the heavy ChatGPT user range. Even with energy-saving lamps, a well-lit room could easily equal the energy consumption level of several heavy GPT users. With old-fashioned conventional incandescent bulbs, say, three 100W bulbs, lighting the room would use the energy equivalent of 60 queries per hour, or the average consumption of 20 heavy ChatGPT users.
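The lighting comparison reduces to the same kind of division:

```python
# Room lighting measured in ChatGPT queries and heavy users.
WH_PER_QUERY = 5.0
HEAVY_WH_PER_HOUR = 750 * 5.0 / 240   # ~15.6 Wh/hour, from the gaming section

led_wh_per_hour = 14.0                # one 100W-equivalent LED bulb
incandescent_wh_per_hour = 3 * 100.0  # three old-style 100W bulbs

print(f"LED reading lamp: {led_wh_per_hour / WH_PER_QUERY:.1f} queries/hour")  # ~2.8
print(f"incandescents: {incandescent_wh_per_hour / WH_PER_QUERY:.0f} queries/hour")  # 60
print(f"incandescents: {incandescent_wh_per_hour / HEAVY_WH_PER_HOUR:.1f} heavy users")  # ~19.2
```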

What You Get

The second part of the equation is what you get for that energy expenditure. This is subjective, but my experience of ChatGPT, and what I hear from others, is that it’s extremely valuable.

It’s an incredible time saver for things like generating letters and email. You have to be a very capable writer to do as well as ChatGPT for ordinary business writing, and it is much faster than doing it yourself. As we saw above, if the task would take you more than a few minutes on your 50W laptop, you’re consuming more electricity doing it yourself than you would having ChatGPT do it.

ChatGPT also does deep analysis on both creative writing and non-fiction writing. I routinely use it for checking consistency of point-of-view in fiction, consistency of tense, and for usage questions, things that are mostly out of scope for an ordinary search engine. This would be prohibitively expensive to have a human do.

Another creative writing task that I use it for constantly is checking out plausible scenarios, either before or after I have written them. If I tell it that “My villain poisoned her victim with tetrodotoxin she got from a puffer fish she caught in a nearby lake, but fortunately he got the antidote in time,” ChatGPT will tell me instantly that puffer fish don’t live in fresh water and that there is no antidote, so no, that scenario doesn’t fly (I asked.)

ChatGPT is a remarkably effective learning tool if you know how to use it. Pick some abstruse topic and you can go back and forth with ChatGPT, refining your questions while GPT refines its answers, functioning like a very knowledgeable tutor. I don’t know if it’s the best way to get an overview of a subject, but my experience is that it’s excellent for zeroing in on particular things you don’t understand. Just today I realized I didn’t really understand why high-speed objects heat up in the atmosphere. “Friction” was all I knew. I went back and forth with ChatGPT for a few minutes, and now I actually understand it, and why the heating effect relates to the speed of sound, etc.

There is considerable value even in things that ChatGPT does relatively poorly, like writing computer code. It’s not perfect, and it sometimes makes egregious screw-ups when coding, but it’s highly effective for producing working examples, generating shell programs, etc. On any given task, you would have to do the work in less than ten minutes to use less energy than asking ChatGPT, and few procedures take only ten minutes to write if you don’t already know how.

Does all of that result in a net energy savings? Who knows. So called labor-saving devices don’t usually result in people doing less labor as much as they let people get more done with the same amount of labor. Either way you look at it, it’s fantastically effective.

Summary

The ubiquitous criticism of the supposed high energy cost of ChatGPT and similar LLM services is wildly at odds with the reality.

For any reasonable set of assumptions about how a typical person might use it, ChatGPT actually seems to be relatively cheap in terms of energy when compared with many routine activities, whether online or in the physical world.

The personal energy cost of having even a modestly powered laptop turned on far exceeds the datacenter cost of even the heaviest ChatGPT use. This is true for almost any online activity, so comparisons of one online activity to another are of questionable value. That said, while it is difficult to compare streaming video or gaming to ChatGPT precisely, because the modes of use are so different, almost any way you compare them, both consume much more energy than ChatGPT, often multiple orders of magnitude more.

Likewise, it’s easy to show that many ordinary activities that we don’t usually think of as being at all energy intensive far exceed the energy demands of using ChatGPT. For instance, simply reading a book with one energy-saving lightbulb turned on takes more energy per hour than heavy ChatGPT use would. The energy cost of almost anything involving a motor vehicle makes the energy cost of ChatGPT negligible by comparison.

Could LLM processing be made cheaper? Yes; in fact, since ChatGPT debuted on November 30, 2022, the energy cost per query has dropped by between one and two orders of magnitude (according to ChatGPT itself) and continues to decline. But even without further efficiency increases that are specific to ChatGPT, the price of datacenter computation in general drops significantly every year, if only because of Moore’s Law.

In summary, contrary to the idea that LLM-based AI is an extravagant consumer of energy, at realistic levels of usage ChatGPT is strikingly inexpensive in direct energy cost compared to most other activities, both online and in the physical world. Moreover, it replaces many common activities that are far more expensive in terms of energy use, particularly anything involving learning through streaming video.

Harder to quantify, but almost certainly real and possibly much more important are the more diffuse energy use reductions that are made possible by ChatGPT and similar services. If AI can produce a software routine or a business letter in seconds that would otherwise take a human worker an hour, that’s an hour that a 50W laptop doesn’t need to run, or could be applied to something else.

If I had to put everything above into one line I’d say “OMG! All the AI you can freakin’ eat in a year takes less energy than a gallon of gas?! Sign me up.”
