
Next-Gen PS5 & XSX |OT| Console tEch threaD

Status
Not open for further replies.

Corndog

Banned
Keep listening and it seems to be explained why. If you have something that is barely using the CPU at all (engine designed around Jaguar) then if there’s a manual profile on the developer kit that allows you to manually allocate power ratios then they could just keep it pegged in favour of the GPU.
And this is second hand information relayed by the same Leadbetter that couldn’t quite wrap his head around what Cerny was saying about how the system works in the Eurogamer interview he conducted.
And you don’t “choose to downclock” but have the variable frequency system choose to do so on the component you specify if it needs to due to what is being run, which again is unlikely with low CPU utilisation.

Once again, watts consumed/heat generated is based on frequency AND the instructions being run, not just the frequency. A CPU or GPU can idle at a high clock all day without consuming many watts or generating much heat.
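To put toy numbers on that (a rough first-order CMOS model; every constant below is invented, none are real PS5 figures): dynamic power scales roughly as P = alpha * C * V^2 * f, where the activity factor alpha, the fraction of transistors switching each cycle, is set by the code being run, not by the clock alone.

```python
# First-order dynamic power model: P = alpha * C * V^2 * f.
# All constants are invented for illustration, not real silicon figures.

def dynamic_power_watts(alpha, capacitance_f, voltage_v, freq_hz):
    """alpha: activity factor (fraction of transistors switching per cycle)."""
    return alpha * capacitance_f * voltage_v ** 2 * freq_hz

C_EFF = 30e-9   # effective switched capacitance, farads (made up)
VDD = 1.1       # core voltage, volts (made up)
FREQ = 2.23e9   # clock, Hz

# Same clock, wildly different power, because the code sets the activity:
idle = dynamic_power_watts(0.02, C_EFF, VDD, FREQ)  # near-idle code at full clock
burn = dynamic_power_watts(0.90, C_EFF, VDD, FREQ)  # synthetic burn test

print(f"near-idle at 2.23GHz: {idle:.1f} W")
print(f"burn test at 2.23GHz: {burn:.1f} W")
```

Identical frequency in both cases; only the activity factor changed, which is the whole point.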
So now you agree with me that developers need to be able to choose what to downclock or idle if you prefer that. The cpu or gpu. That’s been my main point this entire time.

And while Leadbetter may be incorrect in his understanding he says he was told this by multiple devs. Are they wrong too?
I think I’m done. Maybe I’m wrong but the evidence doesn’t seem to say that.
 
So now you agree with me that developers need to be able to choose what to downclock or idle if you prefer that. The cpu or gpu. That’s been my main point this entire time.

And while Leadbetter may be incorrect in his understanding he says he was told this by multiple devs. Are they wrong too?
I think I’m done. Maybe I’m wrong but the evidence doesn’t seem to say that.

It can be helpful for developers to use a fixed profile for evaluating the performance of different code, but it is not part of the production code that runs on real PS5s.
It’s a feature of the dev-kit.

It’s explained in the Eurogamer interview that the video you link to comes from.

On release hardware it’s handled by PS5’s “model SoC”.

You seem to be focusing on one point instead of the context in which it was said, and the other points from the same article that say how both CPU and GPU stay at or near their full clock speeds.
It seems a bit like cherry picking to force a point, to be blunt about it.

The entire quote and context from the original article the video references:

Several developers speaking to Digital Foundry have stated that their current PS5 work sees them throttling back the CPU in order to ensure a sustained 2.23GHz clock on the graphics core. It makes perfect sense as most game engines right now are architected with the low performance Jaguar in mind - even a doubling of throughput (ie 60fps vs 30fps) would hardly tax PS5's Zen 2 cores. However, this doesn't sound like a boost solution, but rather performance profiles similar to what we've seen on Nintendo Switch. "Regarding locked profiles, we support those on our dev kits, it can be helpful not to have variable clocks when optimising. Released PS5 games always get boosted frequencies so that they can take advantage of the additional power," explains Cerny.

This isn’t the first time this has been explained in this thread.

As to my earlier point of synthetic burn-tests exceeding power draw limits, the same interview also has something to say about that:

I wondered whether there were 'worst case scenario' frequencies that developers could work around - an equivalent to the base clocks PC components have. "Developers don't need to optimise in any way; if necessary, the frequency will adjust to whatever actions the CPU and GPU are performing," Mark Cerny counters. "I think you're asking what happens if there is a piece of code intentionally written so that every transistor (or the maximum number of transistors possible) in the CPU and GPU flip on every cycle. That's a pretty abstract question, games aren't anywhere near that amount of power consumption. In fact, if such a piece of code were to run on existing consoles, the power consumption would be well out of the intended operating range and it's even possible that the console would go into thermal shutdown. PS5 would handle such an unrealistic piece of code more gracefully."

Emphasis mine.

This isn’t “bad hardware design”. It’s how fixed clock systems work when they’re designed to be performant with realistic pieces of code and not pointless loops.

Exceeding power draw simply is not a problem unique to variable clocks at all, and by targeting maximum power draw and varying clocks a “few percent” to peg it there, you’re actually making it a lot easier to get close to the limit of what the chip can do as far as useful work goes.

Power-hungry scenes aren't complex action sequences. They are low-triangle, high-frame-rate scenes.

Writing code to try and consume as many Watts as possible (stress test benchmark) is trivial.
Writing code to get the absolute most out of a chip while doing useful work (calculating prime numbers, scientific work etc) is a very complex subject and not at all trivial.
Writing very efficient game code (high utilisation), with lots of memory look-ups and adjustments and staged pipelines that depend on other work being completed, and then trying to parallelise it, is extremely difficult. Even then it doesn't come close to the power consumption of the other two tasks, especially the first one, which is the "unrealistic" code Cerny means.

Fixed clocks on a games console in no way whatsoever means it can run all three scenarios without exceeding TDP. If it was clocked low enough to sustain the first scenario, gaming code would be running way slower than it otherwise could, and the chip would be running very cool.

This is the kind of guesswork on cooling and power supply that gets done when designing an APU with fixed clocks. You have to bake in some kind of margin.
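To sketch that margin problem with invented numbers (none of these are real console figures): if you fix the clock low enough that even a burn test stays inside the power budget, ordinary game code ends up running far below what the silicon could do, while a power-capped design runs the same game code at its ceiling.

```python
# Invented figures for illustration; not real PS5/XSX numbers.
POWER_BUDGET_W = 180.0
MAX_CLOCK_GHZ = 2.23

# Assumed "watts per GHz" of two workload classes: a synthetic burn test
# flips far more transistors per cycle than realistic game code does.
WATTS_PER_GHZ = {"game_code": 70.0, "burn_test": 160.0}

def max_fixed_clock(worst_case):
    """Highest fixed clock keeping the assumed worst-case workload in budget."""
    return POWER_BUDGET_W / WATTS_PER_GHZ[worst_case]

def power_capped_clock(workload):
    """Clock a power-capped (variable-frequency) design settles at."""
    return min(POWER_BUDGET_W / WATTS_PER_GHZ[workload], MAX_CLOCK_GHZ)

# Fixed clocks sized for the burn test leave game code crawling;
# a power-capped design runs the same game code at the ceiling.
print(f"fixed clock sized for burn test: {max_fixed_clock('burn_test'):.3f} GHz")
print(f"power-capped clock for game code: {power_capped_clock('game_code'):.2f} GHz")
```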
 

Fordino

Member
So now you agree with me that developers need to be able to choose what to downclock or idle if you prefer that. The cpu or gpu. That’s been my main point this entire time.

And while Leadbetter may be incorrect in his understanding he says he was told this by multiple devs. Are they wrong too?
I think I’m done. Maybe I’m wrong but the evidence doesn’t seem to say that.
The developers are able to specify hardware profiles only on the PS5 devkit; they're completely optional and are there to help them optimise code.

In the actual PS5, barely any throttling will be needed; games will not be demanding full CPU and GPU capability for long periods of time. Cerny says it will happen very rarely, if at all.

Cerny quotes:
"Regarding locked profiles, we support those on our dev kits, it can be helpful not to have variable clocks when optimising. Released PS5 games always get boosted frequencies so that they can take advantage of the additional power,"

"Developers don't need to optimise in any way; if necessary, the frequency will adjust to whatever actions the CPU and GPU are performing," Mark Cerny counters. "I think you're asking what happens if there is a piece of code intentionally written so that every transistor (or the maximum number of transistors possible) in the CPU and GPU flip on every cycle. That's a pretty abstract question, games aren't anywhere near that amount of power consumption. In fact, if such a piece of code were to run on existing consoles, the power consumption would be well out of the intended operating range and it's even possible that the console would go into thermal shutdown. PS5 would handle such an unrealistic piece of code more gracefully."
 
In the actual PS5, barely any throttling will be needed; games will not be demanding full CPU and GPU capability for long periods of time. Cerny says it will happen very rarely, if at all.

There is this assumption—likely due to how variable clocks work in the PC and mobile space—that there is just always some inherent compromise that needs to be made, and that because it’s variable that surely means that these are peak and not typical figures.

The reality of it is that it's entirely a way of having your actual game code running at 2.23GHz without some high-power-draw map screen spiking the power and crashing the chip.

It means running your actual game code—the only thing that really matters—at speeds you otherwise wouldn’t be able to do so safely.
Who cares if some uncapped 2D menu screen gets declocked, if the highly complex scene full of triangles and lots of dependent processing doesn’t and gets to run full beans?

This is how next-next-generation consoles will run. If the XSX GPU had the same technology, it would be even more potent than it is. It's more power out of less silicon space.
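A minimal sketch of that idea, assuming invented per-workload power costs (this is not Sony's actual algorithm): per workload, pick the highest clock whose modeled power fits the cap. Typical complex scenes sit at max clock; only pathological simple-scene/uncapped cases shave a few percent off.

```python
# Invented numbers; a sketch of workload-driven (deterministic) clocking,
# not Sony's actual implementation.
POWER_CAP_W = 180.0
MAX_GHZ = 2.23

frames = [
    ("complex 3D scene", 70.0),  # modeled watts per GHz (made up)
    ("uncapped 2D menu", 85.0),  # simple geometry at silly frame rates
    ("complex 3D scene", 70.0),
]

results = []
for name, w_per_ghz in frames:
    ghz = min(POWER_CAP_W / w_per_ghz, MAX_GHZ)  # highest clock under the cap
    results.append((name, round(ghz, 2), round(ghz * w_per_ghz)))

for name, ghz, watts in results:
    print(f"{name}: {ghz} GHz at {watts} W (cap {POWER_CAP_W:.0f} W)")
```

The complex scene runs flat out; the menu gets a small declock while power pegs at the cap, which is the "vary clocks a few percent" behaviour described above.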
 

Fordino

Member
It means running your actual game code—the only thing that really matters—at speeds you otherwise wouldn’t be able to do so safely.
Who cares if some uncapped 2D menu screen gets declocked, if the highly complex scene full of triangles and lots of dependent processing doesn’t and gets to run full beans?
Yeah, my PS4 Pro gets crazy loud when the uncapped multiplayer menu in COD Modern Warfare is showing, when the soldiers are walking towards the screen.

The same used to happen with the Rocket League menus before they patched it.

There's no need for uncapped scenes like that to go crazy hard on the console.
 

XO_o

Member
[Screenshots and animated GIFs from the Stray devblog]
 

Corndog

Banned
Yeah, my PS4 Pro gets crazy loud when the uncapped multiplayer menu in COD Modern Warfare is showing, when the soldiers are walking towards the screen.

The same used to happen with the Rocket League menus before they patched it.

There's no need for uncapped scenes like that to go crazy hard on the console.
One thing I’ve never understood is why leave the frame rate uncapped? I honestly have never played a game that has fan issues on a map screen. What games do this on Xbox?

Edit: and yes this is a genuine question. I’m not trying to start some console comparison.
 

Fordino

Member
One thing I’ve never understood is why leave the frame rate uncapped? I honestly have never played a game that has fan issues on a map screen. What games do this on Xbox?

Edit: and yes this is a genuine question. I’m not trying to start some console comparison.
I'm not sure; I don't have an Xbox, but I'm assuming the (shared) code for those games is doing the same. It's likely just less noticeable, as the Xboxes have better cooling this gen.
 

DrDamn

Member
One thing I’ve never understood is why leave the frame rate uncapped? I honestly have never played a game that has fan issues on a map screen. What games do this on Xbox?

Edit: and yes this is a genuine question. I’m not trying to start some console comparison.

The latest CoD does on the same screen the other poster mentioned. Warzone, party of people walking towards the screen prior to starting the game. First time I've noticed the fan in my X1X. I hear my OG PS4 fan all the time. Other player in my party actually noticed it first on their machine (X1X too).
 

PaintTinJr

Member
…...

The reality of it is that it's entirely a way of having your actual game code running at 2.23GHz without some high-power-draw map screen spiking the power and crashing the chip.

It means running your actual game code—the only thing that really matters—at speeds you otherwise wouldn’t be able to do so safely.
Who cares if some uncapped 2D menu screen gets declocked, if the highly complex scene full of triangles and lots of dependent processing doesn’t and gets to run full beans?

…..
I'm in agreement with most of the great points you make about the PS5's deterministic clocks, but the point above isn't how it works (AFAIK) from Cerny's GDC talk. The only reason I mention it is that I think you made a similar comment a couple of weeks back, and I thought it was just a mistake in your explanation to someone, not how you understood it.

In the scenario above, the PS4 Pro is pulling high power for intensive 3D workload, drops power draw on the map screen with nothing to do, and the latency in going from high power draw, to modest power draw results in excess power converted to heat with a raised APU temp for a prolonged period - which then causes the fan to run crazy.

The PS5 is designed so that in the high-intensity 3D scene the clock will be set to provide a workload that fits the fixed power available. Then, when the switch to a 2D map screen happens - with nothing to render - the clock will change to the very maximum (so probably a jump from 2.17GHz to 2.23GHz) with no additional heat rise, because the rise in clock renders the 2D map as a higher load, needing the excess power for the higher clock.
 

Corndog

Banned
I'm in agreement with most of the great points you make about the PS5's deterministic clocks, but the point above isn't how it works (AFAIK) from Cerny's GDC talk. The only reason I mention it is that I think you made a similar comment a couple of weeks back, and I thought it was just a mistake in your explanation to someone, not how you understood it.

In the scenario above, the PS4 Pro is pulling high power for intensive 3D workload, drops power draw on the map screen with nothing to do, and the latency in going from high power draw, to modest power draw results in excess power converted to heat with a raised APU temp for a prolonged period - which then causes the fan to run crazy.

The PS5 is designed so that in the high-intensity 3D scene the clock will be set to provide a workload that fits the fixed power available. Then, when the switch to a 2D map screen happens - with nothing to render - the clock will change to the very maximum (so probably a jump from 2.17GHz to 2.23GHz) with no additional heat rise, because the rise in clock renders the 2D map as a higher load, needing the excess power for the higher clock.
Couldn't this be solved by the GPU lowering the power faster, or am I missing something?
 

PaintTinJr

Member
Couldn't this be solved by the GPU lowering the power faster, or am I missing something?
I don't think it can, because the circuit is the wiring distance from the positive to negative terminals around the circuit, so I would assume dropping power takes the time a change needs to travel that distance around the circuit. Whereas an increase in clock is probably the time of one wavelength at the current clock frequency before it can be changed, which would be much quicker in theory.
 
I'm in agreement with most of the great points about the PS5's deterministic clocks you make, but the point above isn't how it works (AFAIK) from Cerny's GDC talk, and the only reason I mention it, is because I think you made a similar comment a couple of weeks back and I though it was just a mistake in your explanation to someone., and not how you understood it

In the scenario above, the PS4 Pro is pulling high power for intensive 3D workload, drops power draw on the map screen with nothing to do, and the latency in going from high power draw, to modest power draw results in excess power converted to heat with a raised APU temp for a prolonged period - which then causes the fan to run crazy.

The PS5 is designed so that in the high intensity 3D scene the clock will be set to provide a workload that fits the fixed power available. Then when the switch to a 2D map screen happens -with nothing to render - the clock will change to the very maximum (so probably a jump from 2.17Ghz to 2.23Ghz) with no additional heat rise because the rise in clock renders the 2D map as a higher load - needing the excess power for the higher clock.

What you’ve written makes no sense to me, and I can’t see how it relates to the GDC talk.

What kind of latency are you talking about? Simple scenes at uncapped frame rates can draw a lot of power indefinitely. It doesn't matter what came before. There is zero latency in power consumption changing as instructions change; the instructions being run are what determine power consumption.

A clock speed jumping up when there’s apparently no work to do is also what is described as a “race to idle” condition and explicitly said not to be the reason PS5 hits maximum clocks, too.

GPU burn tests to max out power consumption use simple geometry at uncapped frame rates. Not complex scenes.

Sorry if I’ve misunderstood what you were trying to say, but I can’t make any sense of it.
 
I don't think it can, because the circuit is the wiring distance from the positive to negative terminals around the circuit, so I would assume dropping power takes the time a change needs to travel that distance around the circuit. Whereas an increase in clock is probably the time of one wavelength at the current clock frequency before it can be changed, which would be much quicker in theory.

This just isn't how any of this works at all. The power consumed is a result of the number of transistors switching per unit of time. The number of transistors flipped depends on the code being run. Power consumption is not some arbitrarily requested thing that takes time to be fulfilled and adjusted independently of the machine code being run.

A power hungry workload transitioning into a low power workload would result in power consumption immediately going down, and even if for some reason it didn’t and took a second or whatever, it would quickly diminish before there has been any time for the die to saturate and spin the fan up, let alone keep it pegged there.

This all seems like nonsense, but if I have misunderstood your point, please help me understand it.
 

PaintTinJr

Member
What you’ve written makes no sense to me, and I can’t see how it relates to the GDC talk.

What kind of latency are you talking about? Simple scenes at uncapped frame rates can draw a lot of power indefinitely. It doesn't matter what came before. There is zero latency in power consumption changing as instructions change; the instructions being run are what determine power consumption.

Sorry if I’ve misunderstood what you were trying to say, but I can’t make any sense of it.
When a game is running in 3D, it is using more of the transistors in the GPU than in 2D, and so more power is drawn to keep those transistors switching at that clock speed. When there is a sudden drop in processing demand - and those transistors become idle - the demand for power drops, but the power for that cycle is already in the circuit (AFAIK) before the drop in demand. Because the power won't dissipate as work done while the transistors aren't switching - without a signal at the base, the power travelling between the collector and emitter temporarily stores as capacitance - it dissipates as wasted heat energy.
 
When a game is running in 3D, it is using more of the transistors in the GPU than in 2D, and so more power is drawn to keep those transistors switching at that clock speed. When there is a sudden drop in processing demand - and those transistors become idle - the demand for power drops, but the power for that cycle is already in the circuit (AFAIK) before the drop in demand. Because the power won't dissipate as work done while the transistors aren't switching - without a signal at the base, the power travelling between the collector and emitter temporarily stores as capacitance - it dissipates as wasted heat energy.

That’s just not what’s happening or the cause for simple scenes to increase power consumption. “Complex scenes” also don’t use “more” of a GPU, either. It’s about flipped transistors per unit of time, hence uncapped frame rates and simple scenes.
The scene used as an example of a power hungry one by Sony is one with low complexity.
It's like you're imagining some kind of water-hammer effect in electronics that is responsible for adding heat, but that effect, if even measurable, would last for such an insanely short time as to never cause any sustained extra heat requiring a fan to spin up, let alone keep it spun up indefinitely.
I think your imagination is taking over a bit.
 

Nickolaidas

Member
These look amazing.
I don't even know what's going on there; in fact, I don't want to know.
Kill it with FIRE.:messenger_beaming:
If by 'fire' you mean sex, I'll kill it with my own hands.

And by 'hands', I mean cock.

Because in the end, it's all about expanding your horizons and being open-minded to new, exciting experiences.
 
Here's what I expect to see on the Microsoft event:

First Party Content:
- Perfect Dark Reboot (The Initiative)
- Fable Reboot (Playground Games)
- New Forza (Turn 10)
- Halo Infinite gameplay reveal (343)
- New look at Hellblade 2 (Ninja Theory) <---- This one could be something like Microsoft trying to say "hey, look! we've got nice graphics too!"
- New look at Battletoads, maybe gameplay (Dlala Studios, Rare)
- Everwild (Rare)
- Psychonauts 2 gameplay (Double Fine)
- Trailers from Wasteland 3 (inXile) and Grounded (Obsidian)

Third Party Content + Extras
- Elden Ring (From Software)
- Cyberpunk 2077 running on Series X (CD Projekt RED) + Back Compat enhancements showcase
- Dying Light 2 release date announcement (Techland)
- Japanese Studio exclusive game
- Assassin's Creed Valhalla real gameplay (Ubisoft)

This is all that I can think of. I'm pretty sure we are going to see new IPs from both first and third parties, and more surprises. What do you guys think?
 
Here's what I expect to see on the Microsoft event:

First Party Content:
- Perfect Dark Reboot (The Initiative)
- Fable Reboot (Playground Games)
- New Forza (Turn 10)
- Halo Infinite gameplay reveal (343)
- New look at Hellblade 2 (Ninja Theory) <---- This one could be something like Microsoft trying to say "hey, look! we've got nice graphics too!"
- New look at Battletoads, maybe gameplay (Dlala Studios, Rare)
- Everwild (Rare)
- Psychonauts 2 gameplay (Double Fine)
- Trailers from Wasteland 3 (inXile) and Grounded (Obsidian)

Third Party Content + Extras
- Elden Ring (From Software)
- Cyberpunk 2077 running on Series X (CD Projekt RED) + Back Compat enhancements showcase
- Dying Light 2 release date announcement (Techland)
- Japanese Studio exclusive game
- Assassin's Creed Valhalla real gameplay (Ubisoft)

This is all that I can think of. I'm pretty sure we are going to see new IPs from both first and third parties, and more surprises. What do you guys think?

Seems fine, but they need to be careful with that Hellblade 2 reveal. Ideally it should be pure gameplay running at native 4K. Anything less and there will be backlash.
 

Nickolaidas

Member
If true, good on Microsoft. Any IP revival is good news. Still hurts they never went through with Phantom Dust. I remember being heavily invested in that game about 15 years ago.
 

Sinthor

Gold Member
Some say Fable had the potential to be MS's Zelda, but they didn't invest in it.

I was always intrigued by what I saw of the Fable games. Were they really good and fun to play? If they were, it would sure be good for the Xbox community to get that back up and running. As I recall, the last one ended up being delayed, delayed, and finally cancelled, wasn't it? Maybe they could resurrect that one and re-engineer it so they could continue the series?
 

Sinthor

Gold Member
These fuckers are really dragging this shit out, aren't they?

I think a big part of the 'delays' is that neither MS nor Sony wants to release pricing yet and the more they talk without mentioning price, the more people will be asking about it. So just trying to drag things out till they are ready. I also think that both of them want to see what the other is going to be pricing at first as well.
 

Sinthor

Gold Member
Here's what I expect to see on the Microsoft event:

First Party Content:
- Perfect Dark Reboot (The Initiative)
- Fable Reboot (Playground Games)
- New Forza (Turn 10)
- Halo Infinite gameplay reveal (343)
- New look at Hellblade 2 (Ninja Theory) <---- This one could be something like Microsoft trying to say "hey, look! we've got nice graphics too!"
- New look at Battletoads, maybe gameplay (Dlala Studios, Rare)
- Everwild (Rare)
- Psychonauts 2 gameplay (Double Fine)
- Trailers from Wasteland 3 (inXile) and Grounded (Obsidian)

Third Party Content + Extras
- Elden Ring (From Software)
- Cyberpunk 2077 running on Series X (CD Projekt RED) + Back Compat enhancements showcase
- Dying Light 2 release date announcement (Techland)
- Japanese Studio exclusive game
- Assassin's Creed Valhalla real gameplay (Ubisoft)

This is all that I can think of, I'm pretty sure we are going to see new IP's from both first and third parties and more surprises. What do you guys think?
That would be a cool lineup to see! The only one I think is less likely is Elden Ring. I don't think we're going to see anything on that till at least the end of the year. I'm just thinking, if George R.R. Martin can't even finish his last Song of Ice and Fire BOOK... he probably can't write a game script quickly either! :) Would be awesome though!
 