
Digital Foundry: Xbox Series X: Thermal + Power Consumption Analysis.

So far, the importance of async compute has been comparable to its importance in the Nvidia cards that lead the market.
Here's what Cerny said:
"Our belief is that by the middle of the PlayStation 4 console lifetime, asynchronous compute is a very large and important part of games technology."
Why do you think people are waiting for receipts on his I/O expectations?
Maybe his prediction will be right this time, maybe it won't. We'll see.

I didn't use it for long because you were less active, but it's time for a
41b9tt.jpg
 
Last edited:
30W in standby mode?? Why?? I'm surely keeping mine powered off thanks.

Yeah, that's way too high. The One X was 10 watts; I don't see why it's 3x that with a more power-efficient chipset. Maybe the console was doing something when he took the measurement? I'll measure when I get my box.
 
Most interesting thing for me is that both Yakuza and Dirt 5 use way less power than Gears 5. Looks like neither of those games is fully taking advantage of the console.
Good analysis by DF, but the power consumption on newer titles will be higher compared to BC titles.
Backwards Compatibility through software is extremely processor intensive. My gaming PC struggles to play emulated PS2 games at 1440p whereas I can play Forza Horizon 4 at 1440p ultra at almost locked 60fps.
 
Backwards Compatibility through software is extremely processor intensive. My gaming PC struggles to play emulated PS2 games at 1440p whereas I can play Forza Horizon 4 at 1440p ultra at almost locked 60fps.
But it's the x86 architecture by AMD, isn't it? Emulation should be easy for those.
 
It is literally impossible for you to know the things you are confidently stating are true.

So you are either a liar or an idiot; choose one.
why-not-both.jpg


But it's the x86 architecture by AMD, isn't it? Emulation should be easy for those.

If anything, games NOT made for the hardware but emulated are going to be running far less efficiently and will generate more heat and sustained power use. The fact is that all games have access to the full 12TF of power. Gears 5 running at PC ultra settings at 4K60 with all the bells and whistles will be pushing that console just as hard from a power draw perspective as a next gen only game.

As I said in one of the other concern threads about this, I could write a 10-line exe that would bring the biggest, most powerful machine to its knees and have the fans spinning like it's ready to take off to the moon. Using the maximum power draw is not hard. Using it efficiently is.
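The claim above is easy to demonstrate. Here's a hypothetical sketch in Python of what such a "power virus" looks like (not the poster's actual exe; real stress tools like Prime95 use heavy vector math, but the principle of keeping every core busy with useless work is the same):

```python
import multiprocessing as mp
import os
import time

def spin(seconds: float) -> int:
    """Busy-wait: keep one core near 100% doing pointless arithmetic."""
    end = time.monotonic() + seconds
    n = 0
    while time.monotonic() < end:
        n += 1  # useless work: maximum power draw, zero useful output
    return n

def burn_all_cores(seconds: float = 1.0) -> int:
    """Pin every core with one spinning worker; return the worker count."""
    workers = os.cpu_count() or 1
    with mp.Pool(workers) as pool:
        pool.map(spin, [seconds] * workers)
    return workers

if __name__ == "__main__":
    burn_all_cores(0.5)
```

A handful of spinning loops maxes out the power draw of any CPU; it takes a renderer like Gears 5 to turn that same draw into something worth looking at.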
 
why-not-both.jpg




If anything, games NOT made for the hardware but emulated are going to be running far less efficiently and will generate more heat and sustained power use. The fact is that all games have access to the full 12TF of power. Gears 5 running at PC ultra settings at 4K60 with all the bells and whistles will be pushing that console just as hard from a power draw perspective as a next gen only game.
So you're saying that the games used in that video are already taxing the hardware?
 
So you're saying that the games used in that video are already taxing the hardware?
Absolutely. They can absolutely be drawing the maximum amount of power that the console has available and therefore generating as much heat as it possibly can.

There is a difference between using the full amount of power and using it efficiently and "getting the most out of it" though, which is what you and others are failing to understand. You can use 12TF of power and be getting Pong level graphics, or you can be getting RDR2 equivalent graphics. Same power draw, same heat, vastly different results.
 
Absolutely. They can absolutely be drawing the maximum amount of power that the console has available and therefore generating as much heat as it possibly can.

There is a difference between using the full amount of power and using it efficiently and "getting the most out of it" though, which is what you and others are failing to understand. You can use 12TF of power and be getting Pong level graphics, or you can be getting RDR2 equivalent graphics. Same power draw, same heat, vastly different results.
Right, the power is used inefficiently under emulation, but it still taxes the hardware.
 
62°C at the top, damn. But that's the 'exhaust', so to speak, so it's not the internals. The hot air is supposed to leave. Just don't put it in an enclosed space and you'll be fine.

Seems efficient enough.
Hope consumers are smart and put these machines in a well-ventilated place, out of reach of children and pets.
62°C can hurt a child if he or she puts a hand on the top exhaust area.

Hope consumers are smart. Lol
 
Ports are not made to use the potential of the hardware, just like cross-gen games, like I said... they are patched to run best on the new system, but their scope/design/code is still based on the old-gen hardware.

Is Gears 5 a remake? Remakes are made to use the potential of the new hardware.

If you really think Gears 5 is using the potential of Xbox Series X then I can only feel sorry for you because the machine has a lot more to show with proper next-gen titles... and I'm the PS fanboy.
The Gears 5 engine is scalable; it can take full advantage of anything from low-end to very-high-end hardware. There's an Insane settings tier on PC for that very purpose. Series X uses features from the Ultra and Insane settings, plus some brand-new ones added for Series X.
 
I'd be interested to see how this does using more of its PSU (say close to 300W) while lying on its side. If the top in vertical mode is 62°C while drawing 200W, how will the less efficient sideways orientation coupled with 100W more power draw fare? 90°C?
 
I'd be interested to see how this does using more of its PSU (say close to 300W) while lying on its side. If the top in vertical mode is 62°C while drawing 200W, how will the less efficient sideways orientation coupled with 100W more power draw fare? 90°C?

You'll never see the XSX using that much power; a PSU that efficient doesn't exist, nor would it be cost-effective. It's most likely a standard 80+ unit, meaning the console has a far more capable PSU than it'll ever need. I think ~250W is the peak we will see, somewhere in the next 3-4 years, once current-gen consoles are abandoned and there are only next-gen-focused titles that fully use the new hardware.



No game on the market comes anywhere close to 95% GPU utilization.

I think the word you're looking for is "efficiency". Yeah, we will never see a game that utilizes 100% of a GPU's true capabilities, but that doesn't mean you cannot load a GPU to 99-100% with shitty, unoptimized code, or by simply unlocking the framerate.
 
BTW Wichard, you can't say it's the highest temp ever measured and then excuse it away by saying it's also the highest-power console when 20 seconds earlier you showed the XBX using more power. Do you read your script?
 
BTW Wichard, you can't say it's the highest temp ever measured and then excuse it away by saying it's also the highest-power console when 20 seconds earlier you showed the XBX using more power. Do you read your script?
By highest power I think he means it's the most powerful console on the market.
 
Did he need to have three quarters of the video be about Xbox features? I mean, I get that a fair bit of that stuff is relevant to heat generation and power consumption, but it was a tad indulgent.

The exhaust is 60 degrees. He calls 46 degrees skin temp... I dunno, man, it seems a bit warm to me.

Anyway, I'm not warring; I don't care about Xbox heat or PS5 heat, but yeah, this video seems to show that the Xbox pumps out really hot air.

Also, I'll just say again, I think the reason very few objects have fan vents on their tops is that it is so easy for dust to fall into them... I think the XSX is gonna be a dust trap.
 
Last edited by a moderator:
By highest power he means it's the most powerful console on the market.

Which is irrelevant and sloppy. Of course newer parts get more done per unit time. But power (in watts) and temperature are intimately related. So either he is an idiot, confusing his audience by using "power" in two senses, the scientific one (watts), which he measures, and then the street usage (12TF!!), or he is an apologist, switching terminology to excuse his favorite console. Either way it is a bad look.
 
Which is irrelevant and sloppy. Of course newer parts get more done per unit time. But power (in watts) and temperature are intimately related. So either he is an idiot, confusing his audience by using "power" in two senses, the scientific one (watts), which he measures, and then the street usage (12TF!!), or he is an apologist, switching terminology to excuse his favorite console. Either way it is a bad look.
Actually, rewatching the video, it seems you picked up the information wrong: the video never shows Gears 5 running on the XBoneX. That 204-206W is from the native Series X version of Gears 5.
 
So far, the importance of async compute has been comparable to its importance in the Nvidia cards that lead the market.
Here's what Cerny said:
"Our belief is that by the middle of the PlayStation 4 console lifetime, asynchronous compute is a very large and important part of games technology."
Why do you think people are waiting for receipts on his I/O expectations?
Maybe his prediction will be right this time, maybe it won't. We'll see.

I didn't use it for long because you were less active, but it's time for a
Here is a question: how can you, martino, tell whether a particular game is or is not using asynchronous compute?

Let's take, for instance, Gears 5, Doom, Horizon Zero Dawn, Claybook, The Witcher 3, The Division, Hitman, Tomb Raider, Battlefield 1, Deus Ex, and RDR2. Which ones would you say use or do not use asynchronous compute?
 
Actually, rewatching the video, it seems you picked up the information wrong: the video never shows Gears 5 running on the XBoneX. That 204-206W is from the native Series X version of Gears 5.

Did you see 14:48? No doubt if he did show Gears 5 on XBX it would be higher.
 
It is the same claim, lol.
Just because I did not type 95% doesn't make it false.
No game on the market comes anywhere close to 95% GPU utilization.
Okay but this has already been disproven in this very thread by a FS2020 screenshot.

Why is it so hard to be like "oh shoot, my mistake guys, carry on"?
 
Everything looks solid for the XSX. The noise levels being lost in the ambient wash is great news. Thermals are less of a concern when it's like that, as opposed to if the fan was screaming at full load. Looks like XSX offers better efficiency for dash and last gen titles.



Well, they conveniently forgot the most important info, the mode most people will use: standby. Yes, you know why (it's not good: 30W).

DF have been doing this since the XB360 days; don't underestimate them. This is simple but effective cherry-picking (they highlight only the most favorable information). Bonus: suddenly standby mode is bad and you shouldn't use it.

BTW, standby mode will get used, as it takes around 1 minute to load a game from a cold boot.
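For what 30W in standby actually costs: watts convert straight to kWh over time, so the back-of-envelope is trivial. A quick sketch (the electricity price here is an assumption; adjust for your region):

```python
def standby_energy_per_year(watts: float, price_per_kwh: float = 0.15) -> tuple[float, float]:
    """Return (kWh per year, cost per year) for a device left in standby 24/7."""
    kwh = watts * 24 * 365 / 1000  # watts -> kilowatt-hours over a full year
    return kwh, kwh * price_per_kwh

kwh, cost = standby_energy_per_year(30)  # the 30 W figure from the video
print(f"{kwh:.1f} kWh/year, ~${cost:.2f}/year")  # 262.8 kWh/year
```

So whether 30W matters depends on whether a minute of boot time is worth roughly a cheap game's price per year in electricity to you.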
 
Did you see 14:48? No doubt if he did show Gears 5 on XBX it would be higher.

If it were running as a backwards-compatible title, most likely; but the Gears 5 running on Series X is not the same as the one running on One X. It's not running in back-compat mode; it's running as a native Series X title. At least that's my understanding.

But maybe someone here with access to a watt meter and an Xbox One X can recreate the scene and give us numbers?



Well, they conveniently forgot the most important info, the mode most people will use: standby. Yes, you know why (it's not good: 30W).
Seriously? Watch the video for 5 more seconds and he tells us exactly what the wattage is for standby mode.
 
Well, they conveniently forgot the most important info, the mode most people will use: standby. Yes, you know why (it's not good: 30W).

DF have been doing this since the XB360 days; don't underestimate them. This is simple but effective cherry-picking (they highlight only the most favorable information). Bonus: suddenly standby mode is bad and you shouldn't use it.

BTW, standby mode will get used, as it takes around 1 minute to load a game from a cold boot.

No one can force you to watch the video, but I suggest you should at least refrain from posting false claims and retarded conspiracy theories.
 
I'd be interested to see how this does using more of it's PSU (say close to 300W) while lying on it's side. If top in vertical mode is 62C while drawing 200W power, how will the less efficient sideways orientation coupled with 100W in power draw fare? 90C?
What kind of logic is that, lol?! There is some very good thermodynamics literature on Amazon, and I'm sure you could learn a bit from a 5-minute video on YouTube. Pretty sure you already know 90°C is outside the operating temperature, and don't be surprised when the XSX runs cooler horizontal than the PS5 does vertical.
 
Here is a question: how can you, martino, tell whether a particular game is or is not using asynchronous compute?

Let's take, for instance, Gears 5, Doom, Horizon Zero Dawn, Claybook, The Witcher 3, The Division, Hitman, Tomb Raider, Battlefield 1, Deus Ex, and RDR2. Which ones would you say use or do not use asynchronous compute?

Why the examination? Will it change the level of support for asynchronous compute in games technology in 2016?
 
Why the examination? Will it change the level of support for asynchronous compute in games technology in 2016?
I'm just curious, you said you were waiting for receipts and I'm wondering what type of receipt you were waiting for as proof that async compute is now part of game development.
Why do you think people are waiting for receipts on his I/O expectations?
Maybe his prediction will be right this time, maybe it won't. We'll see.
I can tell you, in my honest opinion, that async compute is very common. Here we have a variety of game engines from various studios that support async compute across both AMD and Nvidia GPUs, all except The Witcher 3's engine, though I have no doubt Cyberpunk will support async compute, as it's just a better way to do things. At this point, I doubt there is a game engine used to make PC and console games that does not support async compute. That is what Mark Cerny was talking about. All of these games except The Witcher 3 support async compute.

Gears 5 = Unreal Engine 4
Doom = id Tech 6
Horizon Zero Dawn = Decima
Clay Book = Unreal Engine 4
The Witcher 3 = RED Engine (no support)
The Division = Snowdrop
Hitman = Glacier Engine
Tomb Raider = Foundation Engine
Battlefield 1 = Frostbite
Deus Ex = Dawn Engine
RDR2 = RAGE
 
I'm just curious, you said you were waiting for receipts and I'm wondering what type of receipt you were waiting for as proof that async compute is now part of game development.
A massive industry shift, with most games supporting mainly, or better yet only, the APIs where it's possible.
Can you really improve performance that much while keeping legacy support at the same time?
 
A massive industry shift, with most games supporting mainly, or better yet only, the APIs where it's possible.
Can you really improve performance that much while keeping legacy support at the same time?
See my post above. It is built into nearly all game engines used by most developers at this point. Unreal Engine 4 and Unity which are the most popular 3rd party game engines in the world have it. There is no downside of using async compute, it is just a better way to dispatch jobs for the GPU which is a highly multithreaded processor.
 
See my post above. It is built into nearly all game engines used by most developers at this point. Unreal Engine 4 and Unity which are the most popular 3rd party game engines in the world have it. There is no downside of using async compute, it is just a better way to dispatch jobs for the GPU which is a highly multithreaded processor.
You're moving the goalposts...
I never said it's bad tech, and it's not about its existence; it's about its adoption in actual games.
Who cares if an engine supports it when not a lot of games use it?
Is RT already big now? All engines support it... lol.
Now count the games released this gen with DX12/Vulkan, then count all the other games released without them (you'll see a lot of Unity/UE games 🤷‍♂️).
Yeah, "a big and large part of games tech"...
And let's not even go into the hardware front... the mid-gen leader of the market was only beginning to support it.
 
If it were running as a backwards-compatible title, most likely; but the Gears 5 running on Series X is not the same as the one running on One X. It's not running in back-compat mode; it's running as a native Series X title. At least that's my understanding.

But maybe someone here with access to a watt meter and an Xbox One X can recreate the scene and give us numbers?




Seriously? Watch the video for 5 more seconds and he tells us exactly what the wattage is for standby mode.
Timestamp?
 
You're moving the goalposts...
I just listed nine games and engines from various studios that use it. Stands to reason that other games using the same engines would also support it? It took a while for us to get here, owing to game development cycles being long. Raytracing in its current form is new to games and is a performance killer; surely you understand that async compute, which is a way to asynchronously dispatch and execute compute jobs, is not the same thing as raytracing?

What if I also told you that one way developers improve performance while doing raytracing is to schedule those jobs asynchronously alongside other work. Lol

Denoising RT effects is essential. We've packaged up best-in-class denoisers with the NVIDIA RTX Denoiser SDK.
Overlap the acceleration structure (BLAS/TLAS) build/update and denoising with other regimes (G-buffer, shadow buffer, physical simulation) using asynchronous compute queues.



Checkerboard rendering and various temporal upscaling jobs are done via async compute. Async timewarp, or reprojection, is done via async compute. It is an industry-wide accepted way of doing things now because it is a free performance uplift: you can slot jobs into the moments when the GPU would otherwise be doing nothing.
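The scheduling idea is the same one any concurrent system uses: when one queue stalls, fill the gap with independent work. A loose CPU-side analogy in Python (illustrative only; `graphics_frame` and `compute_job` are made-up stand-ins, not a real GPU API):

```python
import concurrent.futures as cf
import time

def graphics_frame() -> str:
    """Stand-in for a graphics queue that stalls (e.g. fixed-function or memory-bound work)."""
    time.sleep(0.05)  # the "bubble" that async compute exists to fill
    return "frame"

def compute_job(i: int) -> int:
    """Independent compute work, e.g. denoising or a checkerboard resolve."""
    return sum(range(1000)) + i

def render_with_async_compute(n_jobs: int = 4) -> tuple[str, list[int]]:
    with cf.ThreadPoolExecutor() as pool:
        frame = pool.submit(graphics_frame)                          # "graphics queue"
        jobs = [pool.submit(compute_job, i) for i in range(n_jobs)]  # "compute queue"
        # the compute jobs complete inside the graphics stall, so they are ~free
        return frame.result(), [j.result() for j in jobs]
```

The design point is that the compute work costs almost no extra wall time because it runs in a gap that would otherwise be idle; that is the whole appeal of async compute on a GPU.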
 
Due to NDA I doubt they are allowed to do a teardown on these preview units. I'm looking forward to a site like Gamers Nexus doing a teardown of the XSX and comparing and contrasting it to what they've found with Zen 2 and RDNA 2 on PC.
 
What kind of logic is that, lol?! There is some very good thermodynamics literature on Amazon, and I'm sure you could learn a bit from a 5-minute video on YouTube. Pretty sure you already know 90°C is outside the operating temperature, and don't be surprised when the XSX runs cooler horizontal than the PS5 does vertical.

Yes I don't know much about thermodynamics...that's why I asked a question? I think you've seen too many long winters my friend.
 
BTW, has anyone besides digital foundry independently verified the settings on gears 5 are really ultra, on an actual series x?

At least for Doom, High, Ultra, Nightmare, and Ultra Nightmare are very close visually. You could lower some of the most taxing settings and it wouldn't be easy to tell.

Another sony warrior that claims to know so much, but in reality knows fucking nothing





The sky is brown, I support that so it must be true!
Doesn't Flight Simulator lack RT? It's not using the RT cores, and without DLSS it also isn't using the tensor cores. That's a good chunk of the GPU.
 
Yes I don't know much about thermodynamics...that's why I asked a question? I think you've seen too many long winters my friend.
Nah, it just seemed like blatant trolling is all. 62°C to 90°C is 144°F to 194°F :messenger_hushed: :messenger_hushed: . That's a phenomenal difference, to say the least. I could understand a 1-2°C difference at most, especially with the airflow they have. Once you get into the high 70s °C and up, you start to shorten the life of the chip, especially in a console, where airspace isn't as plentiful.
 
I just listed nine games and engines from various studios that use it. Stands to reason that other games using the same engines would also support it? It took a while for us to get here, owing to game development cycles being long. Raytracing in its current form is new to games and is a performance killer; surely you understand that async compute, which is a way to asynchronously dispatch and execute compute jobs, is not the same thing as raytracing?

What if I also told you that one way developers improve performance while doing raytracing is to schedule those jobs asynchronously alongside other work. Lol

Main points:
Optimize your acceleration structure (BLAS/TLAS) build/update to take at most 2ms via pruning and selective updates.
Denoising RT effects is essential. We've packaged up best-in-class denoisers with the NVIDIA RTX Denoiser SDK.
Overlap the acceleration structure (BLAS/TLAS) build/update and denoising with other regimes (G-buffer, shadow buffer, physical simulation) using asynchronous compute queues.


Or do you want me to list, one after the other, every game that uses raytracing?

You fail to see the point and move the goalposts too quickly for me here.
 
Nah, it just seemed like blatant trolling is all. 62°C to 90°C is 144°F to 194°F :messenger_hushed: :messenger_hushed: . That's a phenomenal difference, to say the least. I could understand a 1-2°C difference at most, especially with the airflow they have. Once you get into the high 70s °C and up, you start to shorten the life of the chip, especially in a console, where airspace isn't as plentiful.

Well, like I said, I don't know much about thermal calculations and performance. I just based it off 200W producing 60°C, which in my lizard brain meant that 100W probably produces 30°C, which in my monkey brain again meant 300W produces 90°C... it was intended more as a hyperbolic way to pose the question.
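For what it's worth, the usual rule of thumb is that the rise above ambient, not the absolute temperature, scales roughly linearly with power. A sketch using a simple thermal-resistance model (the ~22°C ambient and the 0.2°C/W figure are assumptions backed out of the 200W/62°C data point, not measurements):

```python
AMBIENT_C = 22.0  # assumed room temperature

def thermal_resistance(power_w: float, surface_c: float, ambient_c: float = AMBIENT_C) -> float:
    """Back out degrees-above-ambient per watt from one measured point."""
    return (surface_c - ambient_c) / power_w

def exhaust_temp(power_w: float, r_th: float, ambient_c: float = AMBIENT_C) -> float:
    """Steady-state temperature: ambient plus power times thermal resistance."""
    return ambient_c + power_w * r_th

r = thermal_resistance(200, 62)  # ~0.2 °C/W from the DF measurement
hot = exhaust_temp(300, r)       # ~82 °C at 300 W, not 90
cool = exhaust_temp(100, r)      # ~42 °C at 100 W, not 30
```

So even on this crude model, 300W would land around the low 80s °C rather than 90°C, because the 22°C of room temperature doesn't scale with the load.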
 
You fail to see the point and move the goalposts too quickly for me here.
I'm telling you that you are woefully wrong about the adoption of asynchronous compute in the industry as a whole. When Mark Cerny said at the beginning of this gen, in 2013, that he hoped asynchronous compute would become a large part of game development, he was right: it is a large part of game development.

You can't tell just from looking at games; you have to read developer documentation and presentations to see it. Luckily, that's the type of shit I enjoy and spend my time doing.


 
BTW, has anyone besides digital foundry independently verified the settings on gears 5 are really ultra, on an actual series x?

At least for Doom, High, Ultra, Nightmare, and Ultra Nightmare are very close visually. You could lower some of the most taxing settings and it wouldn't be easy to tell.


Doesn't Flight Simulator lack RT? It's not using the RT cores, and without DLSS it also isn't using the tensor cores. That's a good chunk of the GPU.

IDK why you'd assume that they and MS would just straight up lie, but a number of things look to be exact matches for PC Ultra, such as the reflectivity of floors. It's worth noting that this is Ultra and not Insane, so in some areas the XSX version is behind "Maxed" PC settings.
 