
Microsoft: We Could Have Used Variable Clocks for Xbox Series X, But We’re Not Interested in TFLOPS Numbers

mckmas8808

Mckmaster uses MasterCard to buy Slave drives
That is 100% NOT how it works. The dev does not decide where the power goes. All they have control over is the code they write, so they can write code that is less CPU-intensive if they want to ensure max GPU power, but they can't explicitly do that by just setting power levels or anything.

They do have the ability to set a power budget for their workload. That is possible. On the dev kits it's limited because Sony is setting the power budget, but that'll change on the retail units.

You literally just said the same thing I said. They would move power to the GPU and take power from the CPU, which would throttle the CPU and reduce its performance. The question is what the impacts are.

If the CPU and GPU can run full power all the time, what would the point of a variable clock speed be? If they can run full throttle together all the time, why would you shift power? Makes no sense.

Cerny talked about this and answered this question in the Road to PS5 show. It's to deal with the cooling issues mainly. And to get more TFs as an added bonus.
 
Last edited:

IntentionalPun

Ask me about my wife's perfect butthole
They do have the ability to set a power budget for their workload. That is possible. On the dev kits it's limited because Sony is setting the power budget, but that'll change on the retail units.
You have that absolutely backwards.

The devkits let them set power profiles for testing... that is not how games will ever work when actually running on a user's PS5.
Regarding locked profiles, we support those on our dev kits, it can be helpful not to have variable clocks when optimising. Released PS5 games always get boosted frequencies so that they can take advantage of the additional power.

The system is workload based, and automatic.

Devs can lock frequencies on dev kits likely to test isolated pieces of code at a given frequency.
 
Last edited:

mckmas8808

Mckmaster uses MasterCard to buy Slave drives
You have that absolutely backwards.

The devkits let them set power profiles for testing... that is not how games will ever work when actually running on a user's PS5.

You sure? I can't find the article that I'm thinking about. It's possible you are correct. I read it months ago. I'd be curious if you could find that link.
 

GODbody

Member
Here's what I'm talking about:



Although Cerny did say "at or near", he does explicitly say there's enough power for them to both be running at full frequency.


It's all workload dependent; although taking all of these statements into account.. it actually sounds like most of the time one or the other will be budging a bit. Him throwing around "at or NEAR" is telling.

Yeah that's what I based my understanding of it off of. If you have a fixed budget of 100w and the CPU is drawing 35w and the GPU is drawing 65w and both are at their peak, then there's no need to shift power, and even if the power does get shifted, is the unit receiving the power suddenly going to exceed its peak frequency?

If it were possible for them both to be running at the peak at the same time then the whole variable frequency design becomes moot.
 

IntentionalPun

Ask me about my wife's perfect butthole
Yeah that's what I based my understanding of it off of. If you have a fixed budget of 100w and the CPU is drawing 35w and the GPU is drawing 65w and both are at their peak, then there's no need to shift power, and even if the power does get shifted, is the unit receiving the power suddenly going to exceed its peak frequency?

If it were possible for them both to be running at the peak at the same time then the whole variable frequency design becomes moot.

They each have a max frequency (so no, they won't go above that) and a max power draw, with the budget favoring the GPU in general, and it can also be shifted around.

But they can't both be at max frequency all the time, or likely even most of the time (reading between the lines from Mark Cerny.)

There are likely workloads where they both can be maxed though, Cerny said as much. I dunno.. you just aren't really getting that power usage isn't ONLY tied to frequency. It's frequency times what code is actually running.

It's really pretty much the exact same concept as throttling when your power draw is too high. The difference is the PS5 does it predictably, and ahead of time (before the power draw happens, it predicts that it would and lowers frequencies instead.) Whereas most of those kinds of throttling systems are reactionary, or are based on heat (which is affected by ambient temperatures.)
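Rough sketch of that difference, if it helps (the numbers and the power model here are completely made up, this is just the shape of the idea, not Sony's actual algorithm):

```python
# Made-up numbers and a made-up power model; just to show the shape of the idea,
# not Sony's actual algorithm.

MAX_FREQ_GHZ = 2.23      # hypothetical GPU max clock
POWER_BUDGET_W = 100.0   # hypothetical power budget for this component
PEAK_DRAW_W = 120.0      # what a worst-case "power virus" workload would pull at max clock

def modeled_power(freq_ghz, activity):
    """Power estimated from a fixed reference model: it depends on the clock AND on
    how much of the chip the workload actually lights up (activity, 0..1)."""
    return PEAK_DRAW_W * activity * (freq_ghz / MAX_FREQ_GHZ) ** 3

def pick_frequency(activity):
    """Deterministic and ahead of time: the same workload always gets the same clock,
    on every console, regardless of ambient temperature."""
    freq = MAX_FREQ_GHZ
    while modeled_power(freq, activity) > POWER_BUDGET_W and freq > 0.5:
        freq -= 0.01   # shave the clock a little until the modeled draw fits the budget
    return round(freq, 2)

print(pick_frequency(0.7))   # typical scene: stays at 2.23 GHz
print(pick_frequency(1.0))   # worst-case workload: drops a few percent, to ~2.1 GHz
```

The point is the clock decision is driven by a fixed workload model instead of a temperature reading after the fact, so it never depends on how hot the room is.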
 
Last edited:

John Wick

Member
Funny how all of the talk about how the PS4 ran games at 1080p and the Xbox One ran games at 720p/900p magically vanished in a puff of smoke as soon as the Xbox One X came out.

Salty? So the Xbox One X came out and what? What happened? The PS4 continued to dominate. Which system had the best games? It wasn't the one with the flops.
The X came out 1 year after the Pro, so naturally it would be more powerful, duh.
 

Gavon West

Spread's Cheeks for Intrusive Ads
MS is so full of crap. I have NO IDEA why they seem to be off their game since the PS5 showing. They need more confidence. There's no way they could know if the PS5 is harder to optimize for or not.
You seriously don't think they would know if it's easier to develop for variable performance or fixed performance??? Why wouldn't they?
 

GODbody

Member
They each have a max frequency (so no, they won't go above that) and a max power draw, with the budget favoring the GPU in general, and it can also be shifted around.

But they can't both be at max frequency all the time, or likely even most of the time (reading between the lines from Mark Cerny.)

There are likely workloads where they both can be maxed though, Cerny said as much. I dunno.. you just aren't really getting that power usage isn't ONLY tied to frequency. It's frequency times what code is actually running.

It's really pretty much the exact same concept as throttling when your power draw is too high. The difference is the PS5 does it predictably, and ahead of time (before the power draw happens, it predicts that it would and lowers frequencies instead.) Whereas most of those kinds of throttling systems are reactionary, or are based on heat (which is affected by ambient temperatures.)

Power draw and frequency have a linear relationship. One goes up as the other goes up and one goes down as the other goes down. If the GPU is drawing more power that means it's running at a higher frequency; if a GPU doesn't need to run at its peak frequency it is going to be drawing less power. Vice versa for the CPU.

If you reduce the frequency you are also reducing the power consumed. Frequency and power consumption are directly tied together and cannot be altered separately. A CPU isn't going to be drawing more power while running at a lower frequency, or reducing power consumption while running at its peak.

The power budget is fixed on the PS5 so it's going to be drawing the same amount of power no matter what. Cerny's algorithms come in to determine which needs the power the most for the given workload.
 
Last edited:

IntentionalPun

Ask me about my wife's perfect butthole
Power draw and frequency have a linear relationship. One goes up as the other goes up and one goes down as the other goes down. If the GPU is drawing more power that means it's running at a higher frequency; if a GPU doesn't need to run at its peak frequency it is going to be drawing less power. Vice versa for the CPU.

If you reduce the frequency you are also reducing the power consumed. Frequency and power consumption are directly tied together and cannot be altered separately. A CPU isn't going to be drawing more power while running at a lower frequency, or reducing power consumption while running at its peak.

I didn't say they didn't have a linear relationship.

Either stop responding to me or actually address what I'm saying about the workload being the thing you are missing. Good day.
 

GODbody

Member
I didn't say they didn't have a linear relationship.

Either stop responding to me or actually address what I'm saying about the workload being the thing you are missing. Good day.
Oh okay. You've already addressed what you've been saying yourself though.

Not true; it will run both at max frequency as long as the workload wouldn't cause a higher power draw.

Frequencies aren't what cause power draw on their own.

According to Cerny they can both be max most of the time
(likely because most of the time games aren't fully stressing one or the other from a power draw standpoint). We'll really need to wait to hear from devs about the downclocking and how much of an issue it is.
Again.. you are missing one huge factor.. what is running on the CPU/GPU (workload.)

They can both be running at max as long as the workload isn't too high.
They each have a max frequency (so no, they won't go above that) and a max power draw, with the budget favoring the GPU in general, and it can also be shifted around.

But they can't both be at max frequency all the time, or likely even most of the time (reading between the lines from Mark Cerny.)

There are likely workloads where they both can be maxed though, Cerny said as much. I dunno.. you just aren't really getting that power usage isn't ONLY tied to frequency. It's frequency times what code is actually running.

It's really pretty much the exact same concept as throttling when your power draw is too high. The difference is the PS5 does it predictably, and ahead of time (before the power draw happens, it predicts that it would and lowers frequencies instead.) Whereas most of those kinds of throttling systems are reactionary, or are based on heat (which is affected by ambient temperatures.)

You've got quite a few contradictory statements there.
 

IntentionalPun

Ask me about my wife's perfect butthole
Yes I corrected myself; he didn't say "max most of the time" he said "at or near max most of the time." I corrected myself in this statement:


But you know that isn't the main point here, so again.. address what I'm saying or have a nice day.

Here are 2 fun facts for you:

The PS4 has a locked CPU and a locked GPU.

The PS4 draws different amounts of power depending on what it's doing without changing these clocks.

Now wrap your brain around that and re-read my posts.
 
Last edited:

GODbody

Member
Yes I corrected myself; he didn't say "max most of the time" he said "at or near max most of the time." I corrected myself in this statement:


But you know that isn't the main point here, so again.. address what I'm saying or have a nice day.

Here are 2 fun facts for you:

The PS4 has a locked CPU and a locked GPU.

The PS4 draws different amounts of power depending on what it's doing without changing these clocks.

Now wrap your brain around that and re-read my posts.

...yes the PS4 draws different amounts of power between different activities because those different activities are running at different frequencies.

I have never stated that the full frequencies needed to be used constantly; I only made the statement that the PS5 is not likely to be running the CPU and GPU at peak frequencies simultaneously.

I'm really not sure what point you're trying to make here. You've made many different statements against what I've said only to turn around and incorporate my statements into your own posts against me.
 

sendit

Member
...yes the PS4 draws different amounts of power between different activities because those different activities are running at different frequencies.

I have never stated that the full frequencies needed to be used constantly; I only made the statement that the PS5 is not likely to be running the CPU and GPU at peak frequencies simultaneously.

I'm really not sure what point you're trying to make here. You've made many different statements against what I've said only to turn around and incorporate my statements into your own posts against me.

Okay. No one is saying the PS5 will be running CPU/GPU at peak frequencies simultaneously. I feel like you're having an argument in your own head. Additionally, find me a game/use case where both the CPU && GPU are being utilized simultaneously at 100% at any given moment.
 
With the way people badmouth variable clocks in relation to game development it makes you wonder how games on PC ever get released. Talk about variety!

The support the PC world gives to a huge variety of CPUs/GPUs and other chips, as well as hardware and brand configurations, is brilliant. Now objectively look at how varied players' game experiences and quality/fps are as a result; it's complex and also wildly variable. Many gamers misconfigure settings, have poor performance or crashes, or outright select weird configurations based on what customisation a game's settings menu affords them. Now look at in-game outcomes and realise there is an imbalance based on an almost pay-to-win aspect, e.g. the reaction times possible in racing games, lower input latency in multiplayer FPS games, or a resolution that lets you see enemies in the distance or in more detail, etc.

No one is complaining about the possibility of such support, or that multiple game targets can be developed for; it's more that consoles are by nature designed around a curated and sustainable performance mark to enable optimisations and repeatable, predictable player experiences. When you introduce a variable rate like Sony has, there is a balancing act that was not present for console games before, and developers have to tune around such events. So devs now have to decide to support targets across console hardware versions, generation versions, and now variable rates as well. It's just another layer of why the fuck complicate things?

The answer, IMO, is that they shouldn't have, and didn't set out for the PS5 to be this way, but found themselves behind the performance mark of the XSX and so had to come up with a workable solution to push themselves over the 10TF mark for marketing reasons more than anything. It's sure as shit not made to extract more from the console or to ease platform/developer pipelines.
 
Last edited:

Ascend

Member
...yes the PS4 draws different amounts of power between different activities because those different activities are running at different frequencies.

I have never stated that the full frequencies needed to be used constantly; I only made the statement that the PS5 is not likely to be running the CPU and GPU at peak frequencies simultaneously.

I'm really not sure what point you're trying to make here. You've made many different statements against what I've said only to turn around and incorporate my statements into your own posts against me.
Okay. No one is saying the PS5 will be running CPU/GPU at peak frequencies simultaneously. I feel like you're having an argument in your own head. Additionally, find me a game/use case where both the CPU && GPU are being utilized simultaneously at 100% at any given moment.
It's like we constantly keep going in circles in this forum...

Short version... The PS5 can have both the CPU and GPU at max clock speeds, but it cannot have both the CPU and GPU at 100% workload and maintain max clock speeds, since that would exceed the power budget. The same thing applies in Windows, where you can have your CPU locked at max clocks but being idle. If you run something like Prime95, suddenly everything becomes hot. Frequency and workload are not the same thing.
The reason it is expected that the max clock speeds can be maintained most of the time is that workloads very rarely max out both the CPU and the GPU at the same time. The amount of power available will constantly be shifted from the component with the lower workload to the component with the higher workload so that both can maintain their clocks. This can change multiple times a second, since it happens automatically and dynamically. In the cases where both have a high enough workload to exceed the power budget, lowering the clocks of at least one of the components will be required to stay within the power budget. Since power consumption rises much faster than linearly with clocks (roughly with the cube, once voltage scaling is factored in), a slight downclock gives a relatively large decrease in power consumption...
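If it helps, here's the shifting idea as a toy sketch (the watt numbers and the simple proportional trim are invented; real hardware is smarter than this):

```python
# Toy model of a fixed power budget shared by the CPU and GPU (numbers invented).
TOTAL_BUDGET_W = 100.0
CPU_MAX_W = 40.0   # what the CPU would pull flat-out at max clock
GPU_MAX_W = 75.0   # what the GPU would pull flat-out at max clock (40 + 75 > 100)

def split_budget(cpu_load, gpu_load):
    """cpu_load / gpu_load: 0..1, how hard the current workload pushes each side.
    Returns (cpu_watts, gpu_watts)."""
    cpu_want = CPU_MAX_W * cpu_load
    gpu_want = GPU_MAX_W * gpu_load
    if cpu_want + gpu_want <= TOTAL_BUDGET_W:
        # Both sides get what they ask for: both stay at max clocks.
        return cpu_want, gpu_want
    # Over budget: something has to give, so trim both proportionally
    # (which in practice means a slight downclock on one or both).
    scale = TOTAL_BUDGET_W / (cpu_want + gpu_want)
    return cpu_want * scale, gpu_want * scale

print(split_budget(0.5, 1.0))   # GPU-heavy scene: 20 W + 75 W, both at full clocks
print(split_budget(1.0, 1.0))   # both maxed out: 115 W wanted, both trimmed to fit 100 W
```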

It's a way to increase performance on a limited power budget. That is all there is to it. There is not much more to talk about on this subject.
 

IntentionalPun

Ask me about my wife's perfect butthole
...yes the PS4 draws different amounts of power between different activities because those different activities are running at different frequencies.

No.. it doesn't change clocks while gaming, but it can use a variable amount of power depending on what game is running / the intensity of the graphics / etc.

BECAUSE OF THE VARIANCE IN WORKLOAD

Say it slowly... wooooork...loooooaaad.
 
Last edited:

sendit

Member
Short version... The PS5 can have both the CPU and GPU at max clock speeds, but it cannot have both the CPU and GPU at 100% workload and maintain max clock speeds, since that would exceed the power budget. The same thing applies in Windows, where you can have your CPU locked at max clocks but being idle. If you run something like Prime95, suddenly everything becomes hot. Frequency and workload are not the same thing.

We can finally agree to something.
 

Hawke502

Member
The CPU and GPU of the PS5 can run at their max power at the same time MOST OF THE TIME, but there will be occasions when the PS5 WILL NEED to downclock the CPU or the GPU so that one or the other can run at max frequency. HOW is this better than fixed clocks? And HOW is this easier to develop for? I'm sure it's easy, but why would it be easier than fixed clocks?
 

TheGrat1

Member
No one is complaining about the possibility of such support or that multiple game targets can be developed for
No, some people are just playing up the idea that a console designed by a game developer, based on consulting other game developers on what they want, is going to make those developers' lives hell because of a few % variability in CPU and GPU clock speed that uses SmartShift technology from AMD.
When you introduce a variable rate like Sony has there is a balancing act that was not present for console games before
So an element of PC game development that has existed for decades suddenly becomes a significant hurdle because it comes in a console (with vastly less variability to boot)? It is just a box, dude. Not an alternate dimension where different rules come into play. The PS5 is not doing anything new in this area, even set PC rigs have constantly fluctuating clock speeds.

Developing for the PS5 will be like developing and optimizing for a specific Alienware PC.
and developers alike have to tune around such events.
Good thing they have dev kits that give them feedback on exactly how their game code affects the console, then.

The answer is IMO they shouldn't have and didn't set out the PS5 to be this way but found themselves behind the performance mark of XSX so had to come up with a workable solution to top themselves over the 10TF mark for marketing reasons more than anything. It's sure as shit not made to extract more from the console or easy platform/developer pipelines.
This is just speculation bordering on FUD, while failing Occam's Razor to boot.
The simplest explanation for the PS5's final specs is: price. They have a maximum acceptable MSRP, thus a limited budget to spread around the system. We know that they were going all in on the SSD and I/O complex going all the way back to the beginning of 2019 with the Wired article, so they skimped on CU count on the GPU. (I cannot remember who said it, but I heard that keeping the CU count a multiple of 18 helps with backwards compatibility as well.) Clocking it higher while working on cooling (SmartShift is a part of that) was likely the cheapest solution, and one that did not compromise their I/O solution.
These consoles are planned years in advance; to think that they would panic from a competitor's reveal and radically change their internals in 6 months, to the point that it would severely affect the rest of the console's physical design, is, frankly, ridiculous.
 
No, some people are just playing up the idea that a console designed by a game developer, based on consulting other game developers on what they want, is going to make those developers' lives hell because of a few % variability in CPU and GPU clock speed that uses SmartShift technology from AMD.

So an element of PC game development that has existed for decades suddenly becomes a significant hurdle because it comes in a console (with vastly less variability to boot)? It is just a box, dude. Not an alternate dimension where different rules come into play. The PS5 is not doing anything new in this area, even set PC rigs have constantly fluctuating clock speeds.

Developing for the PS5 will be like developing and optimizing for a specific Alienware PC.

Good thing they have dev kits that give them feedback on exactly how their game code affects the console, then.


This is just speculation bordering on FUD, while failing Occam's Razor to boot.
The simplest explanation for the PS5's final specs is: price. They have a maximum acceptable MSRP, thus a limited budget to spread around the system. We know that they were going all in on the SSD and I/O complex going all the way back to the beginning of 2019 with the Wired article, so they skimped on CU count on the GPU. (I cannot remember who said it, but I heard that keeping the CU count a multiple of 18 helps with backwards compatibility as well.) Clocking it higher while working on cooling (SmartShift is a part of that) was likely the cheapest solution, and one that did not compromise their I/O solution.
These consoles are planned years in advance; to think that they would panic from a competitor's reveal and radically change their internals in 6 months, to the point that it would severely affect the rest of the console's physical design, is, frankly, ridiculous.

I think you missed my point entirely. It's simpler to have a fixed sustained rate like all consoles prior than it is to go variable. You also completely did not reply to any points about the mess and variance of gamer experiences on the sheer variety of PC hardware and games across developers. There is a far larger percentage of gamers with poor settings, poor performance and outright crashes in the PC domain over consoles. This is magnified when you look at build your own systems. It also creates more development targets and testing scenarios, of which many PC developers don't cover 100%, unlike the small handful of console hardware revisions, as I stated.

I never said Sony reacted at reveal timing, they would've known about the power gap well before the public reveals. So your "frankly ridiculous" statement falls flat due to your own timing assumptions, poor science or evidence at best.
 
Last edited:

TheGrat1

Member
I think you missed my point entirely. It's simpler to have a fixed sustained rate like all consoles prior than it is to go variable.
Is it more complicated? Yes. Will it significantly hinder game development? No. I did not miss the point, it is simply such an elementary one to make that I did not care to address it.
You also completely did not reply to any points about the mess and variance of gamer experiences on the sheer variety of PC hardware and games across developers.
I ignored those points because they are irrelevant. There will be no significant variance in gamer experience for PS5 users. The game the dev ships will be experienced the same way by everyone who plays it. They can not (read: should not) change the console's components. When PlayStation games start shipping with resolution, texture quality, depth of field, field of view and frames per second sliders, plus allow you access to console commands as standard, let me know.
There is a far larger percentage of gamers with poor settings, poor performance and outright crashes in the PC domain over consoles. This is magnified when you look at build your own systems. It also creates more development targets and testing scenarios, of which many PC developers don't cover 100%, unlike the small handful of console hardware revisions, as I stated.
Thank you for making the case that developing for the PS5 will be vastly simpler than developing for PC, something that small indie devs (sometimes as small as 1 person) do all the time.

I never said Sony reacted at reveal timing, they would've known about the power gap well before the public reveals. So your "frankly ridiculous" statement falls flat due to your own timing assumptions, poor science or evidence at best.
It is still baseless speculation on your part.
And no, Sony would not "know" until the reveal unless someone from MS spilled the beans to them or they engaged in corporate espionage. Assuming any of the big three design their consoles (a process that takes years) based on rumors and "you did not hear this from me" whisperings is still ridiculous.
 
Last edited:
They do have the ability to set a power budget for their workload. That is possible. On the dev kits it's limited because Sony is setting the power budget, but that'll change on the retail units.



Cerny talked about this and answered this question in the Road to PS5 show. It's to deal with the cooling issues mainly. And to get more TFs as an added bonus.
Mmmm.. sounds more like dealing with cooling issues due to the added TFs. Makes no sense from the perspective of cooling unless the cooling system wasn't designed around the current clocking, or the expectation is that the hardware will be maxed out all the time.
 

Yoshi

Headmaster of Console Warrior Jugendstrafanstalt
Sony's variable clock rate method is more predictable than usual though.

"The same scene" will always run at the same frequencies at all times on every machine. So your description really isn't valid for the PS5.

However the same piece of code won't always run at the same frequencies, because the overall workload might be different from scene to scene. So some code that does X while Y is happening will always run the same speed... but if you try to do X while Y AND Z are happening, then the frequencies might drop.

However it would always have that same drop while X/Y/Z are happening so at least the dev can detect that. Cerny claims no dev should ever have to optimize around those scenarios, but I personally take that with a grain of salt.. as Cerny basically described everything about the PS5 to be perfect and the right way to do things lol
No, even in the same scene you have no guarantee to have the same clock speeds, because the history of what you did may influence e.g. memory management, which in turn may influence CPU load. See the Ori example. If the garbage collector kicks in at a certain point of time, if there is a memory leak or if the memory gets too fragmented, additional load on the CPU may happen as well and this could lead to a downclock on PS5 (of course, it can also lead to issues on non-variable clock rate, as I said, the variable clock rate is just one additional place where more variability comes into play).
 

ZywyPL

Banned
Now many of those games are jaw dropping on a............1.8TF console.

So imagine what will be achieved on a 9TF console. I say 9TF because we don't want to push that 9TF to 10TF :messenger_fire: do we?!?!? Mark Cerny and his team have potentially made a bomb here, some folk would have you believe.


We already saw recently that 4K is a bitch when it comes to eating up GPU resources; we saw that with the Pro and 1X (and PC for that matter), and PS5/XBX won't be any different: about 60-80% of those 10-12TF will go into 4K, with not much computing power left for the actual graphics enhancement. If we look at the Pro, which has 4TF and does 1440p, double that for 4K and that's the 8TF Cerny was talking about. Some of the work will be offloaded onto RT cores, but that's about it.

I think the variable clocks in the PS5 would have a completely different feedback if it actually gave it an edge over the XBX, like a boost to 4-4.5GHz on the CPU, or 13-14TF GPU, but if both chips at their peak are already less capable than XBX, and on top of that they have to fight for power for that peak, then it is what it is.
 

Clear

CliffyB's Cock Holster
Native resolution is so 2013, guys. No one should care that much when there are so many effective upscaling methods, from CBR to DLSS.
 

pyrocro

Member
I was going to reply to a few posters but I'll just put this simple point below.

Why would a CPU or GPU draw more power?
Well, to complete a workload task: the more areas of the GPU/CPU being engaged for that task, the greater the power draw.

Which is plain and simple performance. Greater power draw = greater performance.

Fluctuating the power draw on the CPU reduces or increases its performance.
Fluctuating the power draw on the GPU reduces or increases its performance.

A lot of posts here seem to want to disassociate performance from power draw, but the only reason for a power spike would be that more performance is being asked of that component.
With this see-saw act SmartShift will be performing, one component will be getting the shit end of the stick.


Also, running the CPU or GPU past their sweet spot has diminishing returns.
Shunting 35 watts of power from the CPU and giving it to the GPU is likely to yield a modest performance boost on the GPU (a couple hundred MHz, I would say).
Also, the CPU is most likely a ~65 watt part in both machines, so how much power can it give to the GPU? 35-50 watts, leaving the CPU with what kind of performance?
Sounds like there are going to be a lot of sleep commands in the code, special logic loops and such.
 

Tulipanzo

Member
The CPU and GPU of the PS5 can run at their max power at the same time MOST OF THE TIME, but there will be occasions when the PS5 WILL NEED to downclock the CPU or the GPU so that one or the other can run at max frequency. HOW is this better than fixed clocks? And HOW is this easier to develop for? I'm sure it's easy, but why would it be easier than fixed clocks?
The idea is that at fixed clock power varies based on the intensity of the calculations.
Now, if that translated directly to graphical prowess, that would seemingly put PS5 at a disadvantage.
However, that's not the case.

Peak power is required for relatively simpler calculations, which flip every transistor with every clock cycle, not for more graphically intensive scenes that will leave some idle. For example, in HZD, the map heats up your console way more than playing the actual game.
The idea with PS5 boost clocks is that you vary the clocks so that those simpler but power-hungry calculations run at lower clocks, while allowing everything else to run at overall higher clocks.
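In textbook terms (this is the generic dynamic-power approximation, nothing PS5-specific):

```latex
P_{\text{dyn}} \approx \alpha \cdot C \cdot V^{2} \cdot f
```

where α is the switching activity (what fraction of the transistors actually flip each cycle, i.e. the workload), C is capacitance, V is voltage and f is frequency. At a fixed clock, power still swings with α, which is why a "simple" map screen can pull more watts than a heavy gameplay scene; and because V has to rise with f, power climbs roughly with the cube of frequency, so a small downclock buys back a lot of watts.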

The Series X as it is, could not do this, because you need an internal controller for power consumption for this approach to function.
The guy interviewed here is a Program Manager, and is not qualified to talk tech.

EDIT: Here's that in detail
 
Last edited:

pyrocro

Member
The idea is that at fixed clock power varies based on the intensity of the calculations.
Now, if that translated directly to graphical prowess, that would seemingly put PS5 at a disadvantage.
However, that's not the case.

Peak power is required for relatively simpler calculations, which flip every transistor with every clock cycle, not for more graphically intensive scenes that will leave some idle. For example, in HZD, the map heats up your console way more than playing the actual game.
The idea with PS5 boost clocks is that you vary the clocks so that those simpler but power-hungry calculations run at lower clocks, while allowing everything else to run at overall higher clocks.

The Series X as it is, could not do this, because you need an internal controller for power consumption for this approach to function.
The guy interviewed here is a Program Manager, and is not qualified to talk tech.

EDIT: Here's that in detail
So you know more than him about how they developed the XSX?
He only works at MS. But you figured it out, didn't you.
Do you think MS didn't have access to AMD's SmartShift?
Let's hear what else you know MS didn't have or didn't know how to do.
 

cormack12

Gold Member
The Series X as it is, could not do this, because you need an internal controller for power consumption for this approach to function.

Wait - what?

Where does that post say this?? If you're gonna say this stuff, then you need to show us, not try and misdirect us with a thread about something else. Where's the examples, the articles the proof?

The Series X as it is, could not do this, because you need an internal controller for power consumption for this approach to function.

 
Last edited:

Tulipanzo

Member
Wait - what?

Where does that post say this?? If you're gonna say this stuff, then you need to show us, not try and misdirect us with a thread about something else. Where's the examples, the articles the proof?
In the interview it suggested the Series X could have done this, but it can't without a power controller (the "model SOC" Cerny talked about). Running clocks variable without a way to know power draw would fry the Series X, and this was already a concern at Sony (you can check out the talk DF had with him).



Again, XSX as it is right now, needs fixed clocks. If MS were to install a power controller, then theoretically it could run higher, but that might lead to some heat dissipation problems given the larger APU. That is a way more realistic scenario than what this guy is talking about ("forced clocks" is very clueless).
 
Last edited:

Dnice1

Member
It's like we constantly keep going in circles in this forum...

Short version... The PS5 can have both the CPU and GPU at max clock speeds, but it cannot have both the CPU and GPU at 100% workload and maintain max clock speeds, since that would exceed the power budget. The same thing applies in Windows, where you can have your CPU locked at max clocks but being idle. If you run something like Prime95, suddenly everything becomes hot. Frequency and workload are not the same thing.
The reason it is expected that the max clock speeds can be maintained most of the time is that workloads very rarely max out both the CPU and the GPU at the same time. The amount of power available will constantly be shifted from the component with the lower workload to the component with the higher workload so that both can maintain their clocks. This can change multiple times a second, since it happens automatically and dynamically. In the cases where both have a high enough workload to exceed the power budget, lowering the clocks of at least one of the components will be required to stay within the power budget. Since power consumption rises much faster than linearly with clocks (roughly with the cube, once voltage scaling is factored in), a slight downclock gives a relatively large decrease in power consumption...

It's a way to increase performance on a limited power budget. That is all there is to it. There is not much more to talk about on this subject.

Hmm, seems like something developers have to optimize for, which was the point the MS engineer was making. This wasn't about whether the CPU/GPU can potentially run at peak frequency; those were Mark Cerny's words. The article was about whether it requires more developer optimization, which even Cerny said it will, to get the most benefit from the setup.
 

Tulipanzo

Member
So you know more than him about how they developed the XSX?
He only works at MS. But you figured it out, didn't you.
Do you think MS didn't have access to AMD's SmartShift?
Let's hear what else you know MS didn't have or didn't know how to do.
Well, I'm saying he works in Program Management and it shows. Even if he didn't, I don't think he has much business or knowledge to comment on the PS5, since he most definitely did not work on that.

I'm not suggesting at all MS wasn't capable of doing something similar*, but that the Sony solution might not have been ideal for them.
For one thing, it requires a custom controller for power draw, which the XSX doesn't have (so it couldn't just "do it" as this guy seems to suggest).
For another, a "boost clock" approach might have made the larger APU too expensive to cool.

I genuinely don't get the aggressiveness, I don't think MS's approach is inferior or anything.
I however think that if MS genuinely considered this, the reason they didn't do it wasn't "because they don't like boost clocks", but rather because it conflicted with their existing design goals.

*Sony's approach includes AMD SmartShift, but it is not the same thing, just pointing it out.
 

IntentionalPun

Ask me about my wife's perfect butthole
No, even in the same scene you have no guarantee to have the same clock speeds, because the history of what you did may influence e.g. memory management, which in turn may influence CPU load. See the Ori example. If the garbage collector kicks in at a certain point of time, if there is a memory leak or if the memory gets too fragmented, additional load on the CPU may happen as well and this could lead to a downclock on PS5 (of course, it can also lead to issues on non-variable clock rate, as I said, the variable clock rate is just one additional place where more variability comes into play).
I mean... Playstation games generally do NOT have managed memory. They are built in C++ and the developer explicitly manages the memory.

So you don't have things like automatic garbage collection; same with the vast majority of XBox Games.

It's mostly only Indies that are using something like Unity with C# on Xbox and automatic memory management.

So no, that's not going to be an issue on PS5 for the vast majority of games.
 

mckmas8808

Mckmaster uses MasterCard to buy Slave drives
Mmmm.. sounds more like dealing with cooling issues due to the added TFs. Makes no sense from the perspective of cooling unless the cooling system wasn't designed around the current clocking, or the expectation is that the hardware will be maxed out all the time.

You have to understand how silly the bolded sounds. You say this as if you are an engineer building a console yourself. You can't possibly know if they designed their console this way due to cooling "issues" from the added TFs. But Cerny literally told us himself that it was due to them not wanting a loud console again (like the PS4 and PS4 Pro).
 

cormack12

Gold Member
In the interview it suggested the Series X could have done this, but it can't without a power controller (the "model SOC" Cerny talked about). Running clocks variable without a way to know power draw would fry the Series X, and this was already a concern at Sony (you can check out the talk DF had with him).

Where? I've read that article before and I've just read it again. Specifically where does it state/suggest the below

it suggested the Series X could have done this, but it can't without a power controller

Where does it even mention other consoles. Don't drop a link and run - point out exactly in that article where it says what you are trying to pass off.


I genuinely don't get the aggressiveness, I don't think MS's approach is inferior or anything.
I however think that if MS genuinely considered this, the reason they didn't do it wasn't "because they don't like boost clocks", but rather because it conflicted with their existing design goals.

There is no aggressiveness, it's just something you're trying to hide behind because you're being asked to show exactly what you're saying. You remind me of someone who was banned.
 
Last edited:

quest

Not Banned from OT
You have to understand how silly the bolded sounds. You say this as if you are an engineer building a console yourself. You can't possibly know if they designed their console this way due to cooling "issues" from the added TFs. But Cerny literally told us himself that it was due to them not wanting a loud console again (like the PS4 and PS4 Pro).
That is BS and you know it; the reason for the noisy PS4 was quality control and Sony cheaping out on cooling to make a few dollars. The loud fans on a map screen of a game that sold 10 million copies? They could patch it. The whole cooling argument is for the Sony stockholder pocket protectors. Cerny is in marketing mode right now; if you believe they could not make a quiet PS4, I have this sweet bridge for sale. The variable clocks are to make Sony more money: smaller APU, lower build cost, bigger profits, very simple. The cooling is marketing speak to distract from the subject, which a great marketing person does. It is beautiful sleight-of-hand distraction and marketing by Sony. It helps to have the media in your pocket who won't ask the hard questions.
 

Tulipanzo

Member
Where? I've read that article before and I've just read it again. Specifically where does it state/suggest the below
There's literally a part called PlayStation 5's boost clocks and how they work.
In it:
"We don't use the actual temperature of the die, as that would cause two types of variance between PS5s," explains Mark Cerny. "One is variance caused by differences in ambient temperature; the console could be in a hotter or cooler location in the room. The other is variance caused by the individual custom chip in the console, some chips run hotter and some chips run cooler. So instead of using the temperature of the die, we use an algorithm in which the frequency depends on CPU and GPU activity information. That keeps behaviour between PS5s consistent."

Inside the processor is a power control unit, constantly measuring the activity of the CPU, the GPU and the memory interface, assessing the nature of the tasks they are undertaking. Rather than judging power draw based on the nature of your specific PS5 processor, a more general 'model SoC' is used instead. Think of it as a simulation of how the processor is likely to behave, and that same simulation is used at the heart of the power monitor within every PlayStation 5, ensuring consistency in every unit.
As I said, thermal requirements were a major concern when developing this approach for PS5

Where does it even mention other consoles. Don't drop a link and run - point out exactly in that article where it says what you are trying to pass off.
It's literally in the OP: "we could have used forced clocks, we could have used variable clock rates ".
That's an obvious reference to Sony's approach, and it's just plain wrong. For one thing, there's no "forcing" (and he seemingly corrects himself there), but he follows it up by suggesting it's been done to boast a higher TF count, which is laughable.
The claim it's harder for devs is also wholly unsubstantiated (and in fact directly contradicting most reports).

I'm not out there criticizing MS for their approach to clocks, but a guy not involved with tech is openly misunderstanding how the competition's approach works.

There is no aggressiveness, it's just something you're trying to hide behind because you're being asked to show exactly what you're saying. You remind me of someone who was banned.
If you open the Quote in my og post, there's an extended explanation of the logic behind Sony's approach, which was asked about. The claims in the OP are misinformed.




PS: This is the guy with the huge beard that said "The Series X is just called XBox". Off topic, it's just really funny
 

Yoshi

Headmaster of Console Warrior Jugendstrafanstalt
I mean... Playstation games generally do NOT have managed memory. They are built in C++ and the developer explicitly manages the memory.

So you don't have things like automatic garbage collection; same with the vast majority of XBox Games.

It's mostly only Indies that are using something like Unity with C# on Xbox and automatic memory management.

So no, that's not going to be an issue on PS5 for the vast majority of games.
Unreal Engine has a garbage collector, too, even though you use C++ as a scripting language in it. Maybe some Sony-owned engine or Frostbite does not have automated garbage collection? Most engines will have a semi-autonomous framework that can lead to variance in a scene's performance.
 

cormack12

Gold Member
There's literally a part called PlayStation 5's boost clocks and how they work.
In it:

As I said, thermal requirements were a major concern when developing this approach for PS5


This is all wrong, you're mixing up about 3/4 different things. I know how the clocks work. Your quote:

In the interview it suggested the Series X could have done this, but it can't without a power controller (the "model SOC" Cerny talked about). Running clocks variable without a way to know power draw would fry the Series X, and this was already a concern at Sony (you can check out the talk DF had with him).

I'm asking where that is? Where's the material citing that specifically? I don't have an Xbox. I have a PlayStation. So I'm not bothered if you turd all over Xbox, but I want to know where that is being said and who by? Your latest post doesn't prove that - you've taken a generalised paragraph and made it specific. Again, to be totally clear, I'm asking you to provide the quotes or relevant sections that support that statement you have made (above).

There are lots of things you're stitching together to try and create a story. Where are the elements individually? Both these 'interviews and articles' take less than ten minutes to read. As you're versed in them, provide the exact quotes that support your statements above. Your last post just avoids the question again and drops a link that doesn't have the evidence. It's just the basic console warring that's in every thread; you've made the claims, now back them up or retract the statement. Those are your options, otherwise you're just trolling.
 
Last edited:

IntentionalPun

Ask me about my wife's perfect butthole
Unreal Engine has a garbage collector, too, even though you use C++ as a scripting language in it. Maybe some Sony-owned engine or Frostbite does not have automated garbage collection? Most engines will have a semi-autonomous framework that can lead to variance in a scene's performance.
Interesting; guess I assumed wrong, thanks for the info. Either way, whether there is an automated GC or not, for something like a game wouldn't you want to make manual .GC() calls to avoid exactly the scenario described?
 
Last edited:

Tulipanzo

Member
This is all wrong, you're mixing up about 3/4 different things. I know how the clocks work. Your quote:

I'm asking where that is? Where's the material citing that specifically? I don't have an Xbox. I have a PlayStation. So I'm not bothered if you turd all over Xbox, but I want to know where that is being said and who by? Your latest post doesn't prove that - you've taken a generalised paragraph and made it specific. Again, to be totally clear, I'm asking you to provide the quotes or relevant sections that support that statement you have made (above).
1) That's a literal Cerny quote about how they manage thermals; he openly talked about it in his Road to PS5 video as well
2) I'm not "turding"[?] all over anything; I specifically said XBox's approach in not inferior; it's the approach used by every single other console, and I got an OG X1, so why would I be bothered
3) If anything, Jason here is clearly "turding" on the Sony approach with unsubstantiated and largely incorrect assumptions
There are lots of things you're stitching together to try and create a story. Where are the elements individually? Both these 'interviews and articles' take less than ten minutes to read. As you're versed in them, provide the exact quotes that support your statements above. Your last post just avoids the question again and drops a link that doesn't have the evidence. It's just the basic console warring that's in every thread; you've made the claims, now back them up or retract the statement. Those are your options, otherwise you're just trolling.
In this interview, Jason said
We focus on optimizing the developer experience to deliver the best possible experience for players, rather than trying to 'hunt' down certain record numbers. We've always talked about consistent and sustained performance.
We could have used forced clocks, we could have used variable clock rates: the reality is that it makes it harder for developers to optimize their games even though it would have allowed us to boast higher TFLOPS than we already had, for example. But you know, that's not the important thing. The important thing is the gaming experiences that developers can build.
I don't get what I'm making up; he's openly suggesting that "variable clocks" (what the PS5 does) make it harder to optimize games, do not deliver consistent performance, and are used to "boast higher TFs". That's in the OP...

He literally said "we could have done this", which is very misleading. Had MS put in extra components and managed the cooling properly then yes, but the XSX as it exist right now can't do it.
Unless XSX manages thermals in a completely different way, then no "model SOC" means no clear info on power draw, which means certain consoles could get too hot and prevent variable clocks from working (which is what happens on phones and PCs). Again, the exact same issue Cerny talked about in the quote I posted...
 
Last edited:

Yoshi

Headmaster of Console Warrior Jugendstrafanstalt
Interesting; guess I assumed wrong, thanks for the info. Either way, whether there is an automated GC or not, for something like a game wouldn't you want to make manual .GC() calls to avoid exactly the scenario described?
Not in general, no, because if you perform too many GC calls, you fragment your memory, which is bad for performance, it is better if garbage collection happens in large chunks. Of course, timing it manually is a good thing if you do it properly and we have done so with our 3D platformer under Unity (garbage collection upon level change), but this is not necessarily viable for all games and it still does not necessarily stop the garbage collector from kicking in whenever.
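For what it's worth, the scheduling idea looks roughly like this (Python's collector here purely as an analogy; Unity's C# GC is a different beast, but the "collect at a point you choose" part is the same):

```python
import gc

def load_level(name):
    # Stand-in for real asset loading.
    return {"name": name, "props": [object() for _ in range(100_000)]}

def change_level(old_level, new_name):
    # A level transition is a point we control: the player is looking at a
    # loading screen, so a big collection here never shows up as a frame hitch.
    old_level.clear()   # drop references so the old level's objects become garbage
    gc.collect()        # pay the collection cost now, not in the middle of gameplay
    return load_level(new_name)

level = load_level("forest")
level = change_level(level, "ruins")
```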
 

mckmas8808

Mckmaster uses MasterCard to buy Slave drives
You have that absolutely backwards.

The devkits let them set power profiles for testing... that is not how games will ever work when actually running on a user's PS5.


The system is workload based, and automatic.

Devs can lock frequencies on dev kits likely to test isolated pieces of code at a given frequency.

Thanks for this. This is what I was trying to say. Developers have the ability to set power profiles to get an understanding of what the console is doing and how the game will react. So it's not like they'll have no idea how things will play out when the variable frequencies occur once the game releases.

They each have a max frequency (so no, they won't go above that) and a max power draw, with the budget favoring the GPU in general, and it can also be shifted around.

But they can't both be at max frequency all the time, or likely even most of the time (reading between the lines from Mark Cerny.)

There are likely workloads where they both can be maxed though, Cerny said as much. I dunno.. you just aren't really getting that power usage isn't ONLY tied to frequency. It's frequency times what code is actually running.

It's really pretty much the exact same concept as throttling when your power draw is too high. The difference is the PS5 does it predictably, and ahead of time (before the power draw happens, it predicts that it would and lowers frequencies instead.) Whereas most of those kinds of throttling systems are reactionary, or are based on heat (which is affected by ambient temperatures.)

On PCs games don't run maxed out all the time, so this isn't unheard of. Like you said (and Cerny of course) it's just more predictable now. Which is a good thing.

That is BS and you know it; the reason for the noisy PS4 was quality control and Sony cheaping out on cooling to make a few dollars. The loud fans on a map screen of a game that sold 10 million copies? They could patch it. The whole cooling argument is for the Sony stockholder pocket protectors. Cerny is in marketing mode right now; if you believe they could not make a quiet PS4, I have this sweet bridge for sale. The variable clocks are to make Sony more money: smaller APU, lower build cost, bigger profits, very simple. The cooling is marketing speak to distract from the subject, which a great marketing person does. It is beautiful sleight-of-hand distraction and marketing by Sony. It helps to have the media in your pocket who won't ask the hard questions.

I wasn't trying to say that the PS4 didn't cheap out on cooling. But you are wrong to think that the cooling stuff in the PS5 is just for stockholders' pockets. You can clearly see from the patent and the PS5 design that it's HUGELY based on how to cool it. Plus I think you are going cuckoo for Cocoa Puffs with the bolded.
 

Bryank75

Banned
Great comment from under a YouTube video on the subject:

Paulo Lameiras, 2 hours ago:
'MS have put themselves in a tough situation. Series X main argument against PS5 is the teraflops advantage, but with a Series S also available, they have to convince people that teraflops may not be that important. They basically will have to kill their best argument to make both systems equally worth buying.'
 
Last edited: