A few days ago I posted that creating one 10-second AI video on OpenAI’s Sora app takes about 1 kilowatt hour of energy – about 10 per cent of the daily energy used by an average German household. After I posted I got one question which I’ve heard a lot whenever AI’s energy use is discussed:
What about Netflix and online gaming - surely they use a lot of energy too?
It’s a good question, and answering it helps to put AI’s energy demand in perspective.
So….
Let's start with Netflix. A credible estimate from The Carbon Trust in 2021 put one hour's streaming at around 180 watt hours. So on that basis, making one 10-second Sora video takes the same energy as watching about five and a half hours of Netflix.
And in case you’re wondering, watching HD or 4K is pretty much the same.
By far the biggest share of that energy is used by your own device, so a large TV takes a lot more than a smartphone. The Sora estimate, by contrast, doesn't include a viewing device at all.
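If you want to check that arithmetic, here's a quick back-of-envelope sketch in Python - the 1 kWh Sora figure and the roughly 180 Wh per streamed hour are the estimates above, not measurements:

# One 10-second Sora video vs. Netflix streaming (rough estimates only)
sora_video_kwh = 1.0              # estimated energy for one 10-second Sora video
streaming_wh_per_hour = 180.0     # Carbon Trust 2021 estimate, viewing device included
streaming_kwh_per_hour = streaming_wh_per_hour / 1000
hours_of_netflix = sora_video_kwh / streaming_kwh_per_hour
print(f"One Sora video is roughly {hours_of_netflix:.1f} hours of streaming")   # ~5.6 hours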
—————————-
Now, online gaming is far harder to estimate - there are no clear numbers - but I found some clues.
One analysis looks at hosting Destiny 2, a popular online game, on an AWS “g4dn.12xlarge” instance. That's a low-spec GPU setup, of which Amazon's AWS cloud has millions (no one would use this for serious AI applications).
This instance can host 9 games, with up to 9 players each. If it's roughly half full, that's 36 players. Overall power for that could be around 372 watts (workings in link below).
That's roughly 10 watts per player on the server side. If you add a PlayStation or PC, that could be another 250 watts, so you get about 0.26 kWh for every hour of gaming. So making one Sora video takes the same as nearly 4 hours of gaming. And dozens of hours more if you play on a smartphone.
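And the gaming sum, sketched the same way - the server share, the 36 players and the 250 watt console figure are all the rough guesses above:

# One 10-second Sora video vs. online gaming (rough estimates only)
server_watts = 372.0          # estimated power for one Destiny 2 hosting instance
players = 36                  # assumed half-full instance
console_watts = 250.0         # rough figure for a PlayStation or gaming PC
watts_per_player = server_watts / players + console_watts     # ~260 W in total
kwh_per_hour_of_gaming = watts_per_player / 1000              # ~0.26 kWh
hours_of_gaming = 1.0 / kwh_per_hour_of_gaming
print(f"One Sora video is roughly {hours_of_gaming:.1f} hours of gaming")   # ~3.8 hours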
——————————-
But with AI’s energy use, we should take a systemic view; after all, an AI model is a massive industrial system with a truly global impact.
We don't have precise energy stats, but we can use overall cloud hosting costs as a proxy - in some ways they're even better, because to some extent they also capture the material impact of the hardware.
Netflix is widely assumed to be paying around $1.3bn a year to Amazon's AWS cloud service - it was AWS's biggest customer, up until AI that is.
And gaming is likely to require far less than streaming.
An investor has made some estimates of OpenAI's compute costs, based on the huge deals it has recently signed.
This year OpenAI will spend around $6bn on compute (which is likely well below the true cost).
But here’s the thing: Netflix and online gaming are currently static - AI costs are spiralling upwards.
Next year OpenAI will pay $14bn for compute, and in 2030 they'll pay - wait for it - $295bn.
So in 5 years, OpenAI will be spending over 200 times more on compute than Netflix.
And that’s just based on the deals OpenAI’s already signed. They have every intention of signing even more.
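For what it's worth, the spend comparison is the same kind of sum - the Netflix figure and the OpenAI projections are the third-party estimates referenced above and in the workings below:

# Cloud spend as a rough proxy for compute scale (all figures are estimates)
netflix_aws_spend_bn = 1.3                                          # widely cited annual AWS bill
openai_compute_bn = {"this year": 6, "next year": 14, "2030": 295}  # investor estimates
for period, spend in openai_compute_bn.items():
    print(f"{period}: OpenAI at roughly {spend / netflix_aws_spend_bn:.0f}x Netflix's cloud spend")
# 2030: ~227x - i.e. over 200 times more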
So there you have it; there really is nothing you can do online that comes remotely close to the energy cost of using AI.
Below are my workings (which are very boring - but you’re welcome to look)
please read the original here :
or even better subscribe ….
and I’d love to read any comments you have
————————————————
Workings
My original post on the calculations is here - so I won't repeat them here.
Suffice it to say that an average German household uses around 3,383 kilowatt hours of energy a year - you can find that here:
destatis.de/EN/Themes/S…
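That's where the 10 per cent figure in my original post comes from:

# One 1 kWh Sora video vs. a day of German household energy use
household_kwh_per_year = 3383                            # Destatis figure above
household_kwh_per_day = household_kwh_per_year / 365     # ~9.3 kWh per day
sora_share = 1.0 / household_kwh_per_day
print(f"One Sora video is roughly {sora_share:.0%} of a day's household energy")   # ~11%, i.e. about 10 per cent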
It's important here to note the difference between watts and watt hours (and their 1,000x multiples: kilowatts and kilowatt hours, megawatts and megawatt hours, and so on).
A watt is a measure of power - how much electricity something is drawing at any given moment; for a device, it's usually quoted as the peak amount of power it needs.
A watt hour is a measure of energy - the total amount used over time. So the two units are related, but very different, and useful for different things.
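A concrete example helps - take a TV that draws a steady 100 watts (a made-up figure, purely for illustration):

# power (watts) x time (hours) = energy (watt hours)
tv_power_watts = 100            # the rate at which the TV uses electricity
viewing_hours = 2.5
energy_wh = tv_power_watts * viewing_hours    # 250 watt hours
energy_kwh = energy_wh / 1000                 # 0.25 kilowatt hours
print(f"{tv_power_watts} W for {viewing_hours} hours = {energy_kwh} kWh")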
There were a lot of exaggerated estimates of video streaming's energy use a few years ago - especially one from the French “Shift Project”, which was way off.
This Carbon Trust 2021 estimate is considered pretty sound:
carbontrust.com/our-wor…
Essentially, the compute needed for streaming is very low: it consists of storage, which takes very little energy, and file transfer, which is quite efficient even for large files. Most of the energy goes into actually displaying the picture on your screen.
With online gaming, there is far less information. Interestingly, most online gaming platforms - Epic Games and Valve/Steam, for example - are privately owned, so they don't release the sustainability reports where you might find some information on energy use.
I found this article – parsec.app/blog/publish…
- which considers hosting Destiny 2 from a business perspective. But it gives me the numbers I need to make some estimates.
BTW Destiny 2 is a fast-paced online multiplayer shooter (if that means anything to you) - so likely to be a lot more energy intensive than, say, Minecraft or Roblox, which take up a big chunk of online gaming time.
The “g4dn.12xlarge” is a very cheap instance - it's a cluster of chips provided as a unit, comprising 4 Nvidia T4 GPUs. These chips are the specialist “Graphics Processing Units” that were originally designed for games, but are now used for AI because they can handle many computations concurrently - which is ideal for AI workloads.
But these GPUs are old, and are no longer competitive for AI use. Each T4 requires about 70 watts - to give some context, a current state-of-the-art AI GPU, say the Nvidia GB300, needs around 1,200 watts, and then huge amounts more power on top for cooling.
Anyway, our g4dn.12xlarge needs 280 watts for its GPUs. It can also use up to 48 virtual CPUs (like normal PC processors), but in practice the CPUs will need a fraction of the power of the GPUs. There is also air cooling to consider. So I've added an extra third on top for those - it's a guess, but probably an OK one.
So that gets us to around 372 watts. One Destiny 2 instance hosts 9 games, which vary in format between 3 and 9 players maximum. As AWS instances are easy to scale up and down, my guess is that servers are kept quite full most of the time, so I feel assuming 50% capacity - around 36 players - is generous.
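Put as a sketch - the one-third uplift for CPUs and cooling is my guess, as is taking the 36-player figure at face value:

# Rough power budget for one Destiny 2 hosting instance (4 x Nvidia T4)
t4_watts = 70
gpu_watts = 4 * t4_watts              # 280 W for the four GPUs
instance_watts = gpu_watts * 1.33     # ~372 W with a guessed third extra for CPUs and cooling
players = 36                          # the half-full assumption from above
watts_per_player = instance_watts / players
print(f"Roughly {watts_per_player:.0f} W per player on the server side")   # ~10 W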
And of course, with games your console or PC is doing most of the work - well over 20 times as much as the server, as it turns out.
A PlayStation has its own GPU, as does a gaming PC – they vary, but 250 watts for the system seems a good estimate – a really powerful PC GPU will be double that, but they are less common.
So that's how we get to roughly 0.26 kilowatt hours for every hour of gaming.
So then we get to the system-level calculations, and here there's a pretty obvious point to make:
We are not seeing Google, Microsoft or Amazon building out vast gigawatt data centres for Netflix or Steam.
Amazon have just built a 2 GW data centre for Anthropic; Microsoft have just built a similar one for OpenAI.
We do not hear the CEO of Netflix or Steam screaming to anyone who will listen – like Sam Altman - that they are running out of GPUs.
So when we compare AI with any other internet service, we are clearly not in the same ballpark.
The Netflix spend is an estimate which is widely used - for example here: vocal.media/writers/unv…
(admittedly not a great source - but I've seen it elsewhere too)
OpenAI is anything but open when it comes to its costs, but these are estimates that came from an investor, based on the whopping $1.4 trillion of compute deals OpenAI has agreed this year.
tomtunguz.com/openai-ha…
And as Altman has said he's aiming for 250 GW of compute by 2033, he will obviously be planning to add multiples of that.
This amount of compute capacity is truly hard to… well, compute.