
Balancing clocks for max power draw is a different animal than variable clock throttling

Genx3

Member
If the PS5 can run at max performance during intense gaming, what is the point of saying that it has variable clocks? If it only downclocks on the dashboard or during a movie, for all intents and purposes it's fixed clocks, right? Comparisons always pit the XSX's 12 TF against the PS5's 10 TF. There will never be a time where the PS5 won't have max clocks during games, correct?



Does this mean MS made a mistake choosing fixed clocks with XSX?

No, that is not what this means.
When the load gets too high the PS5 will need to downclock either its GPU, its CPU or both, depending entirely on the load, which affects the wattage.
PS5 will not run at full clocks all the time; Mark Cerny himself stated that.
 

Genx3

Member
I think it's more that it was co-developed with Sony and they couldn't avail of it; in 2020 only one laptop is currently rocking AMD SmartShift.

Btw it's more of an educated guess, since why wouldn't XSX avail of it? Next-gen consoles are loosely based on AMD mobile APUs, and SmartShift is considered their next big thing for mobile devices.

This is existing AMD Tech.
 
For the bold part, this tweet is addressing that. The choice is with the developers - and they make that choice based on their desired performance. They optimise for power consumption - if they allow the power consumption to rise above the SoC budget they can expect the system to downclock. If they're happy with that they can let the system manage it for them - if they're not happy they optimise power consumption by reducing the workload. That's what this tweet is referring to.

MS chose fixed clocks because that's traditional for consoles so it's the simplest solution - engineer the console and leave it to developers to predict and code for power consumption/cooling/stability.

The compromise to xsex is a 20% lower clock speed than PS5, which gives them slower speeds on cache access, geometry engine, ROPs and so on. MS will hope having more CUs will allow devs to make up the gap for the slower clock.

From what I understand, SmartShift is totally automated and developers do not have to code for it at all.
More nonsense. This is existing AMD Tech.

That is why I thought it was an educated guess: when it was announced at CES last January, before the specs of either console were revealed, many made the assumption that of course both consoles would be getting it, since they both run AMD tech and it was obvious how much extra performance it would bring to closed systems. Because why wouldn't MS add it to their console? The tech is usually reserved for closed systems that run both an AMD CPU and GPU.
 

Bo_Hazem

Banned
To that discord channel? Most haven't, no.

You should probably spend more time in the real world instead:
[image]


[image]


according to 'balls' aka gaf's 'proelite' lol:
[image]

Doncabesa, you're one of them. :lollipop_crying:

 

enver

Banned
Ass Creed will be 4K30 on both. That's all we have to go on so far, confirmed.

No, they said the XSX version will be at least 30fps. We don't know yet what the final FPS will be. So we have to wait. Where did they confirm 4K@30fps for PS5, btw?

The choice is with the developers - and they make that choice based on their desired performance. They optimise for power consumption - if they allow the power consumption to rise above the SoC budget they can expect the system to downclock. If they're happy with that they can let the system manage it for them - if they're not happy they optimise power consumption by reducing the workload. That's what this tweet is referring to.

It's not a choice though. If you want a stable FPS you really can't just leave it to the system to lower the FPS on its own; this would result in a horrible gaming experience.

MS chose fixed clocks because that's traditional for consoles so it's the simplest solution - engineer the console and leave it to developers to predict and code for power consumption/cooling/stability.

What kind of spinning is this? Seriously?!? Devs don't have to do shit. Everything is sustained, everything is the same under all circumstances! They don't have to care about power budget, power consumption, temperature etc. Everything is sustained all the time... what are you even talking about?

The compromise to xsex is a 20% lower clock speed than PS5, which gives them slower speeds on cache access, geometry engine, ROPs and so on. MS will hope having more CUs will allow devs to make up the gap for the slower clock.

What? Compromise? How so? You could also say that Sony compromised by using such a low number of CUs. XSX has ~44% more CUs than PS5.
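For reference, the arithmetic behind these headline numbers is just the standard FP32 FLOPS formula; a quick sketch using the publicly stated CU counts and clocks, nothing console-specific:

```python
# Where the headline TF figures come from: standard FP32 FLOPS arithmetic
# using the publicly stated CU counts and maximum clocks.
def fp32_tflops(cus, clock_ghz, shaders_per_cu=64, ops_per_clock=2):
    # shaders * 2 ops per clock (fused multiply-add) * clock, expressed in TFLOPS
    return cus * shaders_per_cu * ops_per_clock * clock_ghz / 1000

print(fp32_tflops(36, 2.23))    # PS5: ~10.28 TF at its maximum clock
print(fp32_tflops(52, 1.825))   # XSX: ~12.15 TF at its fixed clock
# 52 vs 36 CUs   -> ~44% more CUs on XSX
# 2.23 vs 1.825  -> ~22% higher clock on PS5 (XSX clock ~18% lower)
```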
 
Folks,


We've been through the thread and issued a couple of warnings. No reply bans or bans have been issued as of this time. However, they will be forthcoming after this time. Obviously it's not everyone in the thread - there are a couple of new people who have had courtesy warnings, and the usual crowd with multiple existing warnings for console warring, or antagonistic derails. This is the line in the sand going forward. Do not be surprised or feel affronted, if you are removed from the topic for persistently dragging it down into a quagmire. If you're unable to discuss the technical points (which is what this thread should be about), without making this into another console war thread then you probably shouldn't be here anyway.


The tweet gives the traditional definition of what a variable clock rate is, and then says it is completely different to balancing clocks against power draw. I will update the title to reflect this.
Nice, but dude, why did I get a warning just for saying something about teraflops? It's maybe the second time I've mentioned that number and I got a warning for it...

With a threat of a ban just for that... I feel attacked.

Sorry if my words aren't the right ones; I'm sometimes using Google Translate because I don't speak fluent English. Sorry for that.
 

T-Cake

Member
No, they said the XSX version will be at least 30fps. We don't know yet what the final FPS will be. So we have to wait. Where did they confirm 4K@30fps for PS5, btw?

Yeah, people keep missing this. And with VRR being a standard thing, why would every single game not have an unlocked frame rate option to let the FPS go anywhere between 40-120 for those users that have an appropriate display?
 

kraspkibble

Permabanned.
LOL

Basically, because everyone has been pissing and moaning "my PS4 sounds like a jet engine!!", it has scared Sony into making a quiet console. The PS4 isn't even loud... You think it is? Then what on earth was the PS3?

So now we'll get a PS5 that is not only the size of a house but will throttle itself to keep power/temps down.

Y'all gone done played yourselves for this "muh PS4 is a jet engine!!1!!" bullshit.
 

HawarMiran

Banned
Read an article that put everything in the PS5 into perspective. Primitive shaders, cache scrubbers and so on. The fact that the PS5 is trying to remove bottlenecks is the key here. Putting every resource only into the things you can see on the display will increase the power of the console manifold. But I will probably wait for the Pro version. This gen I only played on the OG PS4.
 

psorcerer

Banned
Variable clocks are a way to squeeze more juice out of either limited cooling or the limited power available, as in the PS5's case.

It is not something to brag about. It's basically a Band-Aid for poor design decisions like having too small of a power supply.

Any closed box (including servers in datacenters) has a limited power budget.
It's physics.
The only place where power envelope doesn't matter is custom PCMR boxes, nobody cares if they overheat.
 

Genx3

Member
Any closed box (including servers in datacenters) has a limited power budget.
It's physics.
The only place where power envelope doesn't matter is custom PCMR boxes, nobody cares if they overheat.

This is true.
However, since it is indeed a closed box, Sony could have easily taken the maximum power draw into account when considering the power supply.
I believe that going over the max power draw would have likely compromised the cooling as well, so maybe Sony engineers purposely want to keep the load down in order to allow for proper cooling. The size of the power supply smells like the console got a last-minute OC imo.
 

On Demand

Banned
Variable clocks are a way to squeeze more juice out of either limited cooling or the limited power available, as in the PS5's case.

It is not something to brag about. It's basically a Band-Aid for poor design decisions like having too small of a power supply.


You have no idea what you're talking about, man.

There's no poor decision or band-aid. They designed the PS5 like this from the beginning. You can't come up with the way Sony designed the PS5 on a whim. Variable clocks here don't work like they do in the traditional sense. What the PS5 is doing is completely new.

The only poor decision is the posts in this thread.

This is true.
However, since it is indeed a closed box, Sony could have easily taken the maximum power draw into account when considering the power supply.
I believe that going over the max power draw would have likely compromised the cooling as well, so maybe Sony engineers purposely want to keep the load down in order to allow for proper cooling. The size of the power supply smells like the console got a last-minute OC imo.

You don't create the PS5's CPU and GPU clock/power-draw variation at the last minute; you have to design it that way years in advance. There's no way silicon can be changed in mere months. There's years of planning and testing.
 

Genx3

Member
You have no idea what you're talking about, man.

There's no poor decision or band-aid. They designed the PS5 like this from the beginning. You can't come up with the way Sony designed the PS5 on a whim. Variable clocks here don't work like they do in the traditional sense. What the PS5 is doing is completely new.

The only poor decision is the posts in this thread.



You don't create the PS5's CPU and GPU clock/power-draw variation at the last minute; you have to design it that way years in advance. There's no way silicon can be changed in mere months. There's years of planning and testing.

Nonsense.
This is a band-aid.
Why would they purposely gimp their power supply?
SmartShift is how they got around the overclocking.
Even Mark Cerny calls it "Boost".
Why does he call it boost? They overclocked after the original design was almost final.
 

Panajev2001a

GAF's Pleasant Genius
Nonsense.
This is a band-aid.
Why would they purposely gimp their power supply?
SmartShift is how they got around the overclocking.
Even Mark Cerny calls it "Boost".
Why does he call it boost? They overclocked after the original design was almost final.

Then he says the system is expected to run at those clocks most of the time and that it does not work like the boost clocks people are used to / think of (and that is not something they did at the last minute... even in the scenario where he said clocks could be lowered by a couple of percentage points, you are still way above the XSX GPU clocks).

They built a design targeting high clock speed from the very beginning (and he mentions the trouble they had while working on that). If they allowed for a bigger console volume and tweaked voltage and thus frequency a bit towards the latter phase of the cycle... possibly, but you are driving at the usual “scared of competition last minute overclock” as if they went from 1.7 GHz to 2.2 GHz on the GPU at the last minute :LOL:.

Sure.. sure... the much more likely explanation is that Cerny wanted to draw attention to how reactionary they were to the XSX, because the world revolves only around anticipating it and reacting to it (:rolleyes:), and thus used the words "boost clocks" (later clarified better as a variable clock speed for a chip that is capable of running higher but is capped)... sometimes it must be frustrating for him to speak and have to repeat the same thing 4 times as some people work overtime to add their own F.U.D. spin to it (see the raytracing bull crap).
 

jimbojim

Banned
Yeah, people keep missing this. And with VRR being a standard thing, why would every single game not have an unlocked frame rate option to let the FPS go anywhere between 40-120 for those users that have an appropriate display?

You don't want that kind of fluctuation in frame-rate even with VRR. You'll have a problem with input latency then. Locked 30 is ALWAYS better than unlocked frame-rate with VRR On. VRR doesn't magically make variances in frame-rate invisible by any means
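To put rough numbers on that fluctuation (just frame-time arithmetic, not a claim about any particular game), this is the swing VRR has to present, tear-free but not invisibly:

```python
# Frame-time arithmetic for an unlocked 40-120 fps range. VRR removes tearing,
# but the delivery cadence (and the input latency tied to it) still swings by
# the difference shown here.
def frame_time_ms(fps):
    return 1000.0 / fps

low_fps_ms, high_fps_ms = frame_time_ms(40), frame_time_ms(120)
print(low_fps_ms, high_fps_ms, low_fps_ms - high_fps_ms)  # 25.0 ms, ~8.3 ms, ~16.7 ms swing
```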
 

Genx3

Member
Then he says the system is expected to run at those clocks most of the time and that it does not work like the boost clocks people are used to / think of (and that is not something they did at the last minute... even in the scenario where he said clocks could be lowered by a couple of percentage points, you are still way above the XSX GPU clocks).

They built a design targeting high clock speed from the very beginning (and he mentions the trouble they had while working on that). If they allowed for a bigger console volume and tweaked voltage and thus frequency a bit towards the latter phase of the cycle... possibly, but you are driving at the usual “scared of competition last minute overclock” as if they went from 1.7 GHz to 2.2 GHz on the GPU at the last minute :LOL:. Sure.. sure...

Come on, you're way too smart to believe that PR.
They boosted after they decided they wanted to close the gap.
Nothing wrong with that.
It's a smart way to get more performance from a design that was never meant to hit 10.2 TFs.
 

T-Cake

Member
You don't want that kind of fluctuation in frame-rate even with VRR. You'll have a problem with input latency then. Locked 30 is ALWAYS better than unlocked frame-rate with VRR On. VRR doesn't magically make variances in frame-rate invisible by any means

Hmm, I did not know that. How does it work with Gsync then? That's supposed to be smooth as butter regardless of frame rate isn't it?
 

Panajev2001a

GAF's Pleasant Genius
Come on, you're way too smart to believe that PR.
They boosted after they decided they wanted to close the gap.
Nothing wrong with that.
It's a smart way to get more performance from a design that was never meant to hit 10.2 TFs.

You are not reading my posts all the way through, hence why I referenced voltage and frequency (and their relation to power). Please re-read :).

The design was always built for very high clocks; there is no framing way they went from XSX's ~1.8 GHz to 2.2 GHz GPU clocks in the last 6 months, so close to mass production. This is borderline delusional.
Did they tweak the clock at all in the last year based on the console volume and cooling max capability, as they left "some" headroom during design? That is possible, but that means extending the current approach that was already in use/implemented.

Nope... the more sensible/reasonable approach is that the world revolves around the XSX and Sony just made a massive overclock in the last few months with a November launch on the horizon... :rolleyes:...
 

jimbojim

Banned
That's supposed to be smooth as butter regardless of frame rate isn't it?

Comparing G-Sync and FreeSync? I dunno, are there some differences (except that FreeSync is an open standard)? But I think they are practically the same from what I've read. With very little differentiation in the technologies between them, the choice really comes down to brand preference. If you love Nvidia, of course you'll choose G-Sync. :D
You'll still have frame-rate drops with VRR on, but it only gives you the illusion of "smoothness" because there is no visible stutter or jittering.
 

cormack12

Gold Member
Come on, you're way too smart to believe that PR.
They boosted after they decided they wanted to close the gap.
Nothing wrong with that.
It's a smart way to get more performance from a design that was never meant to hit 10.2 TFs.

But you're not smart enough to listen to actual explanations of how stuff works. This is all explained, and you are making yourself look stupid, or like you can't follow simple technical talks. You need to go away, read, and show you understand what's being said. Right now the only reason to even take note of you is to read your posts in disbelief and then ignore you.
 

Genx3

Member
You are not reading my posts all the way through, hence why I referenced voltage and frequency (and their relation to power). Please re-read :).

The design was always built for very high clocks; there is no framing way they went from XSX's ~1.8 GHz to 2.2 GHz GPU clocks in the last 6 months, so close to mass production. This is borderline delusional.
Did they tweak the clock at all in the last year based on the console volume and cooling max capability, as they left "some" headroom during design? That is possible, but that means extending the current approach that was already in use/implemented.

Nope... the more sensible/reasonable approach is that the world revolves around the XSX and Sony just made a massive overclock in the last few months with a November launch on the horizon... :rolleyes:...

I read it and I also listened to Mark Cerny himself when they unveiled the PS5 HW.
Of course Mark is going to want to make it seem like the final PS5 HW was the original plan.
It would sound bad PR-wise if he didn't, but most of us here who don't lap up PR know the truth.

No, the world doesn't revolve around the XSX, but it would be absolutely foolish of any company to ignore what their direct competitors are putting out.
Just the same way Samsung's Galaxy S series phones keep Apple on their toes.
 

Genx3

Member
But you're not smart enough to listen to actual explanations of how stuff works. This is all explained, and you are making yourself look stupid, or like you can't follow simple technical talks. You need to go away, read, and show you understand what's being said. Right now the only reason to even take note of you is to read your posts in disbelief and then ignore you.

Feel free to ignore me. No need for the insults.
I'm stating my opinion. Just because I don't lap up corporate PR shouldn't be a reason for you to get your panties in a bunch.
The PS5 was overclocked. Mark Cerny himself calls it boost. No PR or technical breakdown will change those facts.
 
We're all too stupid to know how these systems actually work. Let's not pretend like we understand the engineering side of this equation, unless you're going to tell me you're a hardware engineer yourself and that you have experience designing hardware yourself... let's keep from making definitive statements like we actually know.

Edit: I'm not sure what the argument is with the variable clocks. No one in here even knows how much it can, or would, "vary". We had Mark Cerny give his take, but apparently we can't trust that marketing guy. I don't think anyone in here has EVER designed a game, am I wrong? If you haven't, then how do you know how much utilization a CPU/GPU goes through during the course of a game? Also, I keep hearing this 9TF number being tossed around, but no one has EVER given us the math on how they came to 9TF. What a convenient whole number. I get the feeling that 9 was thrown around because it's a single-digit number, and a double-digit number looks much more impressive side by side with it.
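For what it's worth, the math people presumably have in mind for a "9 TF" figure is just the standard FLOPS formula run backwards; whether the clock ever actually drops that far is the part nobody knows:

```python
# The usual back-of-the-envelope behind a "9 TF" claim: the same FP32 formula
# run in reverse. It says nothing about whether such a clock drop ever happens.
cus, shaders_per_cu, ops_per_clock = 36, 64, 2
clock_for_9tf_ghz = 9000 / (cus * shaders_per_cu * ops_per_clock)  # GFLOPS / (FLOPs per cycle)
print(clock_for_9tf_ghz)              # ~1.95 GHz
print(1 - clock_for_9tf_ghz / 2.23)   # ~12% below the stated 2.23 GHz maximum
```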
 

psorcerer

Banned
This is true.
However, since it is indeed a closed box, Sony could have easily taken the maximum power draw into account when considering the power supply.
I believe that going over the max power draw would have likely compromised the cooling as well, so maybe Sony engineers purposely want to keep the load down in order to allow for proper cooling. The size of the power supply smells like the console got a last-minute OC imo.

How can you measure the maximum draw?
If you go for the theoretical max you will over-cool your system. If you go for a practical one, there is a high chance that some specific code paths may overheat or make the noise insufferable.
And with a more powerful (more complex silicon) console, the chance of some specific code path making things blow up is higher.
MSFT just built the console for average usage. They know that it will never be used at max power anyway. Sony cannot do that.
 
No that is not what this means.
When the load get too high the PS5 will need to down clock either its GPU, its CPU or both. Totally dependent on the load which affects the wattage.
PS5 will not run at full clocks all the time Mark Cerny himself stated that.
If this is true why are people attacked when there is any mention that PS5 runs at less than 10TF? I get that all TF discussions are theoretical, but either the clocks are fixed or they aren't, right? It doesn't mean that the PS5 doesn't have other positive attributes. Even if it ran at less than 10TF from time to time, Spider-Man will still be great, right?
 

Genx3

Member
If this is true why are people attacked when there is any mention that PS5 runs at less than 10TF? I get that all TF discussions are theoretical, but either the clocks are fixed or they aren't, right? It doesn't mean that the PS5 doesn't have other positive attributes. Even if it ran at less than 10TF from time to time, Spider-Man will still be great, right?

The way Mark Cerny explained it, they will only need to drop the frequency a little to bring the load down disproportionately.
 

psorcerer

Banned
If this is true why are people attacked when there is any mention that PS5 runs at less than 10TF?

Because people who say that immediately follow it up by claiming that the PS5 would run on reduced clocks not in the simple scenarios where those clocks are not needed, but solely in the complex ones.
 

MastaKiiLA

Member
Nonsense.
This is a band-aid.
Why would they purposely gimp their power supply?
SmartShift is how they got around the overclocking.
Even Mark Cerny calls it "Boost".
Why does he call it boost? They overclocked after the original design was almost final.
You have no clue what you're talking about. Supplying power to a system is like the lowest bar to clear in its design. We're not talking about a power station here. The amount of power needed for a game console isn't complicated. The design decisions made with the PS5 architecture are NOT dictated by power supply.

I always wonder why non-engineers, or people with limited engineering knowledge feel the need to spout nonsense online, when experts actually can pick this stuff apart in seconds. I've got a computer engineering degree, but even I know it's foolish to comment on a system that (a) you have the barest of details on, and (b) you have no understanding of the real design philosophy, outside of the normal PR nonsense that's distilled into a consumable form for the general public. Have you no shame?
 

Genx3

Member
You have no clue what you're talking about. Supplying power to a system is like the lowest bar to clear in its design. We're not talking about a power station here. The amount of power needed for a game console isn't complicated. The design decisions made with the PS5 architecture are NOT dictated by power supply.

I always wonder why non-engineers, or people with limited engineering knowledge feel the need to spout nonsense online, when experts actually can pick this stuff apart in seconds. I've got a computer engineering degree, but even I know it's foolish to comment on a system that (a) you have the barest of details on, and (b) you have no understanding of the real design philosophy, outside of the normal PR nonsense that's distilled into a consumable form for the general public. Have you no shame?

OK, Mr. Engineer.
So tell us why they need SmartShift if there is no power limit and it's the simplest, lowest-bar-to-clear design.
Tell us why they are shifting load instead of making things simple and locking everything in.
 

S0ULZB0URNE

Member
Yep. Way better. On the order of 15-20 percent better.... half the gap of the PS4 Pro and Xbox One X better. And all that at only a 50% drop in I/O speed!

And we all know how the One X WIPED THE FLOOR with the PS4 Pro at retail.. and in exclusives.... Man, there are a SHITLOAD of Xbox One X games better looking than God of War or The Last of Us 2 for sure.
Not sure if serious?
 
Because people who say that immediately follow it up by claiming that the PS5 would run on reduced clocks not in the simple scenarios where those clocks are not needed, but solely in the complex ones.
But again it goes back to my point. If the PS5 can run max clocks during heavy load, why bother saying the clocks are variable? Do we need power savings as gamers? It seems like a weird thing to be focused on when designing an SoC.
 

Panajev2001a

GAF's Pleasant Genius
But again it goes back to my point. If the PS5 can run max clocks during heavy load, why bother saying the clocks are variable? Do we need power savings as gamers? It seems like a weird thing to be focused on when designing an SoC.

People scream abuse about Sony's jet-engine-like noise, obviously exaggerating, but exaggeration makes good news and I think it got to them. The way they were designing the PS5 at the very, very beginning would have been even louder, so they really made a concerted effort to ensure power consumption and noise were under control while still hitting incredibly high clocks for an RDNA2 design on 7nm (not EUV).
 

Sony

Nintendo
I think Sony implemented variable clocks in a different way than usage/heat-based power throttling. In the traditional throttling case, CPU speeds differ based on the load and performance fluctuates in real time. I think Sony's approach allows the developer to be flexible within certain game scenes. For example,
Scene 1 = GPU at 100%, CPU at 100%
Scene 2 = GPU at 80%, CPU at 120%

I think it's highly unlikely that Sony allows the performance of a game to be variable depending on the instances the OP is talking about...

In short, I think Sony allows developers to use a variable power/performance budget to develop for specific instances as opposed to traditional throttling.
 

psorcerer

Banned
But again it goes back to my point. If the PS5 can run max clocks during heavy load, why bother saying the clocks are variable? Do we need power savings as gamers? It seems like a weird thing to be focused on when designing an SoC.

Because a simple unlocked load is a heavier load than a complex locked one.
I.e. Cerny was saying: don't worry, you won't hear fan noise on the title screen.
 

MastaKiiLA

Member
OK, Mr. Engineer.
So tell us why they need SmartShift if there is no power limit and it's the simplest, lowest-bar-to-clear design.
Tell us why they are shifting load instead of making things simple and locking everything in.
I don't know for certain. That's the only valid answer in this whole thread, not speculation bandied around as fact.

Now, if I had to guess, I'd say it was to afford devs flexibility with game design. CPU-heavy game or scene? You can shift extra power to that unit. Graphics-heavy scene? Shift focus over there. You have a fixed transistor budget, and fixed die size based on yield expectations. You can only fit so many logic units in the design of either CPU or GPU, but you have the ability to modulate clocks to provide a boost.

Now that you have your processor design, how far can you push the operating range on your design, while remaining within an acceptable cooling profile for the packaging you want? That's where you set a limit on the frequency range, and thus current draw, which then determines the size of the PU you need to power it. This is how the first gigahertz processor was being designed when I interned on the IBM Spinnaker project. It's 22 years now, so I'm sure it's okay for me to talk about it. Besides, I think the project got folded into S/390 anyway.

Again, this is all speculation, and based on nothing more than conjecture. But I'm not pretending to be an expert on what the design decisions were with the console. I can only make assumptions based on my own experience working with a microprocessor design team. Power supply was never discussed in team meetings, and while I only worked on the top via routing design, I don't believe basing a chip design around something as simple as a PU makes any sense. And do understand that the "revolutionary" thing about Spinnaker was that it was supposed to be designed from the outside-in, meaning that you would start with the external interfaces, and make sure that everything down to the smallest logic units could comply with the target 1GHz clock. You're not building an electric car, or something that has the PU as the source of its functionality. The source in a console is the microprocessor, and you can spec your PU to meet the system afterwards.

tl;dr I'd say yields and performance would dictate the microprocessor designs, followed by cooling and packaging. Those would then dictate the PU. Although the PU would play some role in packaging, you can always make it external to hit your targets on the latter 2 points. Either way, PU shouldn't be the cost leader in the design. That should be the chips, memory, and new-tech SSD, so why let a PU hamstring you? These are professional engineers, so they wouldn't.
 

kuncol02

Banned
Scene 1 = GPU at 100%, CPU at 100%
Scene 2 = GPU at 80%, CPU at 120%
Kind of, but actually it will be something like:
Scene 1 = GPU at 100%, CPU at 80% (or maybe even much lower)
Scene 2 = GPU at 98%, CPU at 100%

The clocks given by Sony are the absolute highest the PS5 will be able to reach.
In addition, if what Cerny said is true, a 2% lower GPU clock will save 10% of power. That 2% of GPU clock then gives back around 20W. Current-gen 8-core Ryzen CPUs are ~60W TDP with clocks slightly higher than the PS5's. How low would the CPU need to go to save that 20W? And that's only 2% of the GPU.
Why even bother with SmartShift if you are underclocking the CPU that much? I feel like that number given by Cerny is not correct, because it makes no sense.
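For what it's worth, the "2% clock, 10% power" claim isn't obviously impossible if voltage has to climb steeply with frequency near the top of the curve, since dynamic power scales roughly with voltage squared times frequency. A toy model (the voltage-scaling exponent here is invented purely for illustration, not a measured value):

```python
# Toy model of why a small clock drop can buy a disproportionate power saving
# near the top of the frequency/voltage curve: dynamic power ~ V^2 * f, with
# the (invented, illustrative) assumption that voltage scales super-linearly
# with frequency at the high end.
def relative_power(f_rel, v_vs_f_exponent=2.5):
    v_rel = f_rel ** v_vs_f_exponent   # assumed voltage needed for this clock
    return (v_rel ** 2) * f_rel        # P ~ C * V^2 * f, with C held constant

print(relative_power(0.98))   # ~0.89 -> roughly 10% less power for a 2% lower clock
```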
 

ToadMan

Member
From what I understand, SmartShift is totally automated and developers do not have to code for it at all.

Quite right. Developers optimise for power use, and that optimisation is driven by the SmartShift technology and indeed the variable clocking.

Implement a heavy GPU load and SmartShift will put power to the GPU; code a heavy CPU load and SmartShift will allocate power there. In this sense, developers are in control of what SmartShift is doing.

The dev kits provide profiling tools to understand what SmartShift is doing in reaction to the code.

SmartShift profiles enable devs to suspend the power allocation during development while they optimise.

It's a different "animal" in terms of optimisation but not significantly so - it's removed the unknown element of power usage.
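As a rough mental model of that shared budget (an assumption about the general idea only, not Sony's or AMD's actual SmartShift algorithm or any real API):

```python
# Minimal sketch of a fixed SoC power budget shared between CPU and GPU.
# Illustrative only; not Sony's or AMD's actual SmartShift algorithm or API.
SOC_BUDGET_W = 200.0   # hypothetical total budget

def allocate(cpu_demand_w, gpu_demand_w, budget_w=SOC_BUDGET_W):
    """Grant full demand when it fits; otherwise trim both proportionally
    (a real implementation would lower clocks a few percent instead)."""
    total = cpu_demand_w + gpu_demand_w
    if total <= budget_w:
        return cpu_demand_w, gpu_demand_w
    scale = budget_w / total
    return cpu_demand_w * scale, gpu_demand_w * scale

print(allocate(40, 120))   # light CPU scene: everything fits, GPU keeps full power
print(allocate(90, 150))   # combined demand exceeds the budget: both get trimmed
```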
 

martino

Member
Power draw depends on the silicon lottery though.
Does this mean that the top X% could perform even beyond 2.23GHz?
Or will they put a hard limit on this frequency?
 

PaintTinJr

Member
Cerny talked to that point explicitly too.

Their solution to activity monitoring removes the silicon lottery aspect.
Yeah, after reading the additional info in the DF follow-up interview to the Road to PS5, it was interesting that the power control unit in the APU uses a simulation model of the silicon, so that the specifics of each individual APU aren't important to the deterministic clock and power versus workload behaviour.

I was surprised to see Cerny mention that CPU, GPU and memory controller activity all play a part in the clock rate chosen by the power control unit. So in theory, an algorithm rewritten to be more compute-intensive than memory-intensive might see a clock boost (and performance boost) if it reduced power draw, or vice versa.
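A minimal sketch of what "deterministic" buys you here, as described above (the thresholds and clock steps are invented; the point is only that the clock depends on modelled workload activity, not on temperature or the individual chip):

```python
# Sketch of deterministic, activity-based clock selection (numbers invented).
# The clock depends only on the modelled activity of the workload, never on
# temperature or per-chip silicon quality, so every unit behaves identically.
ACTIVITY_TO_GPU_CLOCK_GHZ = [
    (0.70, 2.23),   # modelled activity <= 70% of worst case -> full clock
    (0.85, 2.20),
    (1.00, 2.13),
]

def gpu_clock(modelled_activity):
    for threshold, clock_ghz in ACTIVITY_TO_GPU_CLOCK_GHZ:
        if modelled_activity <= threshold:
            return clock_ghz
    return ACTIVITY_TO_GPU_CLOCK_GHZ[-1][1]

print(gpu_clock(0.60), gpu_clock(0.95))   # -> 2.23 2.13, the same on every console
```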
 

Jon Neu

Banned
You are not reading my posts all the way through, hence why I referenced voltage and frequency (and their relation to power). Please re-read :).

The design was always built for very high clocks; there is no framing way they went from XSX's ~1.8 GHz to 2.2 GHz GPU clocks in the last 6 months, so close to mass production. This is borderline delusional.
Did they tweak the clock at all in the last year based on the console volume and cooling max capability, as they left "some" headroom during design? That is possible, but that means extending the current approach that was already in use/implemented.

Nope... the more sensible/reasonable approach is that the world revolves around the XSX and Sony just made a massive overclock in the last few months with a November launch on the horizon... :rolleyes:...

Every design has room to increase clocks and power at the last minute. They do it precisely to power up the console if the competition is ahead of you, because yes, Sony cares about the Xbox specs and Xbox cares about the Sony specs. There are millions at stake.

MS could also give the console another power-up right now if they wanted to, but there's no need because they have already won the power battle.
 

PaintTinJr

Member
Every design has room to increase clocks and power at the last minute. They do it precisely to power up the console if the competition is ahead of you, because yes, Sony cares about the Xbox specs and Xbox cares about the Sony specs. There are millions at stake.

MS could also give the console another power-up right now if they wanted to, but there's no need because they have already won the power battle.
What about the Xbox 360? The PS3 clocks were higher and Xbox bumped its clocks to match, but in turn went through 3-4(?) hardware revisions before they were able to overcome the RRoD problem that it almost certainly created, off the back of being too aggressive to beat the PS3 to market with such a short gen time between the Xbox and the Xbox 360 launching. Even 360 devkits RRoDed at expos for developers because they pushed too hard, and that might be the reason Xbox shows Xbox games on PC from time to time.

Would pushing the XsX harder really be worth a repeat of that RRoD scenario? IMHO, if they got the same outcome as the 360 gen then they probably would from a business-strategy perspective, but reputationally I doubt you get to RRoD twice.

XsX has a further problem when comparing to PS5 performance: any high-utilisation solution for the XsX's CU count will automatically saturate the 36 CUs (at higher clock) of the PS5, while in reverse, 36-CU-optimised code will possibly need to be refactored to make best use of the XsX CU count, or just fail to use that greater width.

Edit:
And look at the shorter time between the X1X and XsX compared to the Pro and PS5 release dates. Do they have the headroom they had when going from the 360 to the X1, when they did a last-minute clock bump?
 

Ascend

Member
You have no clue what you're talking about. Supplying power to a system is like the lowest bar to clear in its design. We're not talking about a power station here. The amount of power needed for a game console isn't complicated. The design decisions made with the PS5 architecture are NOT dictated by power supply.

I always wonder why non-engineers, or people with limited engineering knowledge feel the need to spout nonsense online, when experts actually can pick this stuff apart in seconds. I've got a computer engineering degree, but even I know it's foolish to comment on a system that (a) you have the barest of details on, and (b) you have no understanding of the real design philosophy, outside of the normal PR nonsense that's distilled into a consumable form for the general public. Have you no shame?
Even though it's true that supplying power might not be the highest priority in designing a system, it still is a huge influence indirectly. Why? Because of heat.
Consoles are rarely designed to have a power supply over 300W. Not only do you want to avoid power-hungry systems, the amount of power used directly influences the size and the acoustics of the console. That's why I find it a bit odd to say that decisions made for the PS5 are not dictated by power supply. That's probably one of the first things that are set in stone, and everything else grows from that, based on the capability of the hardware manufacturer, in this case AMD.

Specifically in the case of the PS5, power is really central to its design. They went with a (relatively) constant power delivery and variable clocks on the different components, as opposed to the fixed clocks and variable power design of the XSX and pretty much every other console that ever existed.

As for the amount of power needed for a game console not being complicated, Cerny disagrees with that in the Road to PS5 presentation. Short version is that engineers have to 'guess' how much power a system will use.

XsX has a further problem when comparing to PS5 performance: any high-utilisation solution for the XsX's CU count will automatically saturate the 36 CUs (at higher clock) of the PS5, while in reverse, 36-CU-optimised code will possibly need to be refactored to make best use of the XsX CU count, or just fail to use that greater width.
I don't think it works like that. A game optimized for an RTX 2070 doesn't somehow fail to make use of the additional SMs on a 2080 Ti.
 