
Microsoft: We Could Have Used Variable Clocks for Xbox Series X, But We’re Not Interested in TFLOPS Numbers

pyrocro

Member
Ridiculous. It's not a bad thing when it lets them achieve clocks they couldn't reach by opting for a fixed clock.

Yeah, in a perfect world Sony could just go balls to the wall with clocks without having to worry about thermals or noise. Same with Microsoft.

If we were in a perfect world.
It's as though you want to be right and wrong at the same time.
You're making his point for him and want to say he's ridiculous.

Say it with me: "variable clocks are a compromise". It literally is a give (power to CPU) and take (power from GPU) on the PS5.

If it's that much better, why don't any PC GPUs use it?
More unknown variables to contend with on the PC. Everything is known about the console by the manufacturer of that console.
Why not work within the limits of the design and power envelope in a consistent, predictable way?
The real question is why this technology hasn't taken over the laptop world. Should Sony have put this into their laptops to kill the competition, if it's free performance with no compromise?

The point has been driven into the ground. The Xbox One still runs those games at 720p/900p. The 6TF Xbox One X still had to downgrade its resolution in RE2 Remake to match the same FPS output as the 4TF PS4 Pro.

We can just chalk that up as "Optimized for the Xbox One X".
You seem to be suggesting the Xbox Series X is weaker than the PS4 Pro?
Developers can optimize their games using a fixed profile. They don't even need to pay attention to variable clocks.
Do you run the games on a dev kit or on the retail console you have at home (with variable clocks)?
Another day, another Microsoft talk. Their talk is really all over the place.

The way they talk, they are only making it more confusing for everyone.
Nah, I think most people get it.
It might just be you.
Also, you should have read the article before posting; not reading it is a sure way to be confused about its contents.
 
MS is so full of crap. I have NO IDEA why they seem to be off their game since the PS5 showing. They need more confidence. There's no way they could know if the PS5 is harder to optimize for or not.
You should know that both companies must have each other's dev kits by now. MS even knew they had a more powerful console before Sony's announcement; that's why they stated "the most powerful console" before any official announcement from Sony.
These are multi-billion-dollar companies, not casual people digging for info.
 
More unknown variables to contend with on the PC.
What unknown variables would stop, say, Asus or MSI from releasing a card that always runs at its maximum advertised clock speed? The reason they're not doing that is because there simply isn't a need for it. Oh, and the tech press and customers would rip them a new one.

Everything is known about the console by the manufacturer of that console.
Why not work within the limits of the design and power envelope in a consistent, predictable way?
I don't really get what's unpredictable about only having the GPU clock up when the workload demands it. That's still predictable behavior. What's the point of having the thing run at the same speeds whether you're playing Halo Infinite or Spelunky?
 
It's as though you want to be right and wrong at the same time.
You're making his point for him and want to say he's ridiculous.
Say it with me: "variable clocks are a compromise". It literally is a give (power to CPU) and take (power from GPU) on the PS5.

What? No, that's SmartShift, and it only works one way: unused power from the CPU goes to the GPU.

Yeah, both consoles have compromises. The XSX had to compromise with a lower GPU clock speed by opting for a fixed clock, but it has sustained performance. The PS5 had to compromise with fluctuating clocks but can push peak clocks higher than normal thanks to it.

As I said before, claiming variable > fixed clocks when you actually factor in reality is just wrong.

Do you run the games on a dev kit or on the retail console you have at home (with variable clocks)?

Do developers optimise on the retail console or the dev kit?
 
MS is so full of crap. I have NO IDEA why they seem to be off their game since the PS5 showing. They need more confidence. There's no way they could know if the PS5 is harder to optimize for or not.
I would say the contrary. They seem to be pretty consistent in their messaging since the PS5 reveal. They keep saying they aren't worried and still have the most powerful console. Also, I would imagine their decades as a software developer might give them some insight into development challenges.

I have no doubt that there will be some fantastic games on the PS5, especially from 1st party devs. However, I think it's pretty reasonable to think that a moving target (variable rates) would be harder to hit. Personally, I'm a bit concerned about the upclock on the PS5 GPU. I know that if I overclocked the 2080 in my PC to 2.3GHz it would be a crashfest, even with a 750 watt power supply. I think MS's decision to keep everything moderately clocked will pay dividends in heat and noise. I'm glad this is the approach they are staying with.
 

GODbody

Member

Please educate yourself.

PlayStation 5's boost clocks and how they work

One of the areas I was particularly interested to talk about was the boost clock of the PlayStation 5 - an innovation that essentially gives the system on chip a set power budget based on the thermal dissipation of the cooling assembly. Interestingly, in his presentation, Mark Cerny acknowledged the difficulties of cooling PlayStation 4 and suggested that having a maximum power budget actually made the job easier. "Because there are no more unknowns, there's no need to guess what power consumption the worst case game might have," Cerny said in his talk. "As for the details of the cooling solution, we're saving them for our teardown, I think you'll be quite happy with what the engineering team came up with."


Regardless, the fact is that there is a set power level for the SoC. Whether we're talking about mobile phones, tablets, or even PC CPUs and GPUs, boost clocks have historically led to variable performance from one example to the next - something that just can't happen on a console. Your PS5 can't run slower or faster than your neighbour's. The developmental challenges alone would be onerous to say the least.

"We don't use the actual temperature of the die, as that would cause two types of variance between PS5s," explains Mark Cerny. "One is variance caused by differences in ambient temperature; the console could be in a hotter or cooler location in the room. The other is variance caused by the individual custom chip in the console, some chips run hotter and some chips run cooler. So instead of using the temperature of the die, we use an algorithm in which the frequency depends on CPU and GPU activity information. That keeps behaviour between PS5s consistent."

Inside the processor is a power control unit, constantly measuring the activity of the CPU, the GPU and the memory interface, assessing the nature of the tasks they are undertaking. Rather than judging power draw based on the nature of your specific PS5 processor, a more general 'model SoC' is used instead. Think of it as a simulation of how the processor is likely to behave, and that same simulation is used at the heart of the power monitor within every PlayStation 5, ensuring consistency in every unit.
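To make the "model SoC" idea concrete, here is a minimal sketch of how frequency selection from activity counters could work, assuming a single fixed power budget and one shared power model. The names and numbers (model_power, pick_frequency, the budget, the clock steps) are hypothetical illustrations, not Sony's actual implementation.

```python
# Minimal sketch of a power-budget clock controller in the spirit of the
# "model SoC" described above. Frequency is chosen from reported activity
# fed through one shared power model, never from the real die's temperature
# or silicon quality, so every console picks the same clocks for the same
# workload. All constants are invented for illustration.

FREQ_STEPS_MHZ = [2000, 2100, 2230]   # candidate GPU clocks (illustrative)
POWER_BUDGET_W = 180                  # fixed SoC power budget (made-up figure)

def model_power(activity: float, freq_mhz: float) -> float:
    """Estimate power draw from activity counters via the shared model.

    `activity` is a normalised 0..1 measure of how busy the unit is; the
    cubic term stands in for the strongly non-linear frequency/voltage/power
    relationship.
    """
    return 40 + 150 * activity * (freq_mhz / 2230) ** 3

def pick_frequency(activity: float) -> int:
    """Return the highest clock whose *modelled* power fits the budget."""
    for freq in sorted(FREQ_STEPS_MHZ, reverse=True):
        if model_power(activity, freq) <= POWER_BUDGET_W:
            return freq
    return FREQ_STEPS_MHZ[0]

print(pick_frequency(0.6))   # 2230 -- a typical scene pegs the clock at the cap
print(pick_frequency(1.0))   # 2100 -- a worst-case scene backs off slightly
```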


"The behaviour of all PS5s is the same," says Cerny. "If you play the same game and go to the same location in the game, it doesn't matter which custom chip you have and what its transistors are like. It doesn't matter if you put it in your stereo cabinet or your refrigerator, your PS5 will get the same frequencies for CPU and GPU as any other PS5."



[Embedded video: PlayStation 5 New Details From Mark Cerny: Boost Mode, Tempest Engine, Back Compat + More (23:24) - a new video report from Digital Foundry on what we've learned about the system since the reveal. Watch on YouTube.]

Feedback from developers highlighted two areas of concern. The first was the idea that not all PS5s would run in the same way - something the model SoC concept addresses. The second was the nature of the boost: would frequencies hit a peak for a set amount of time before throttling back? This is how smartphone boost tends to operate.

"The time constant, which is to say the amount of time that the CPU and GPU take to achieve a frequency that matches their activity, is critical to developers," adds Cerny. "It's quite short, if the game is doing power-intensive processing for a few frames, then it gets throttled. There isn't a lag where extra performance is available for several seconds or several minutes and then the system gets throttled; that isn't the world that developers want to live in - we make sure that the PS5 is very responsive to power consumed. In addition to that the developers have feedback on exactly how much power is being used by the CPU and GPU."

Mark Cerny sees a time where developers will begin to optimise their game engines in a different way - to achieve optimal performance for the given power level. "Power plays a role when optimising. If you optimise and keep the power the same you see all of the benefit of the optimisation. If you optimise and increase the power then you're giving a bit of the performance back. What's most interesting here is optimisation for power consumption, if you can modify your code so that it has the same absolute performance but reduced power then that is a win. "

In short, the idea is that developers may learn to optimise in a different way, by achieving identical results from the GPU but doing it faster via increased clocks delivered by optimising for power consumption. "The CPU and GPU each have a power budget, of course the GPU power budget is the larger of the two," adds Cerny. "If the CPU doesn't use its power budget - for example, if it is capped at 3.5GHz - then the unused portion of the budget goes to the GPU. That's what AMD calls SmartShift. There's enough power that both CPU and GPU can potentially run at their limits of 3.5GHz and 2.23GHz, it isn't the case that the developer has to choose to run one of them slower."
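A toy sketch of the budget-sharing behaviour Cerny describes, treating SmartShift as a one-way handover of unused CPU budget to the GPU. The function name and wattages are invented purely to show the direction of the transfer, not real figures.

```python
# Toy illustration of the budget sharing described above (numbers invented):
# each unit has its own power budget, and whatever the CPU does not draw in
# a frame can be handed to the GPU -- the transfer only flows that way.

CPU_BUDGET_W = 60
GPU_BUDGET_W = 140

def share_budget(cpu_draw_w: float) -> tuple:
    """Return (cpu_power, gpu_power_ceiling) for one frame."""
    cpu_power = min(cpu_draw_w, CPU_BUDGET_W)   # the CPU is capped at its budget
    unused = CPU_BUDGET_W - cpu_power           # leftover CPU headroom
    return cpu_power, GPU_BUDGET_W + unused     # headroom shifts to the GPU

print(share_budget(45))   # (45, 155): a light CPU frame gifts 15 W to the GPU
print(share_budget(60))   # (60, 140): CPU at its cap, GPU keeps its own budget
```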


"There's another phenomenon here, which is called 'race to idle'. Let's imagine we are running at 30Hz, and we're using 28 milliseconds out of our 33 millisecond budget, so the GPU is idle for five milliseconds. The power control logic will detect that low power is being consumed - after all, the GPU is not doing much for that five milliseconds - and conclude that the frequency should be increased. But that's a pointless bump in frequency," explains Mark Cerny.

PS5 caps its CPU and GPU clocks at 3.5GHz and 2.23GHz respectively, but how stable are the frequencies?

At this point, the clocks may be faster, but the GPU has no work to do. Any frequency bump is totally pointless. "The net result is that the GPU doesn't do any more work, instead it processes its assigned work more quickly and then is idle for longer, just waiting for v-sync or the like. We use 'race to idle' to describe this pointless increase in a GPU's frequency," explains Cerny. "If you construct a variable frequency system, what you're going to see based on this phenomenon (and there's an equivalent on the CPU side) is that the frequencies are usually just pegged at the maximum! That's not meaningful, though; in order to make a meaningful statement about the GPU frequency, we need to find a location in the game where the GPU is fully utilised for 33.3 milliseconds out of a 33.3 millisecond frame.

"So, when I made the statement that the GPU will spend most of its time at or near its top frequency, that is with 'race to idle' taken out of the equation - we were looking at PlayStation 5 games in situations where the whole frame was being used productively. The same is true for the CPU, based on examination of situations where it has high utilisation throughout the frame, we have concluded that the CPU will spend most of its time at its peak frequency."

Put simply, with race to idle out of the equation and both CPU and GPU fully used, the boost clock system should still see both components running near to or at peak frequency most of the time. Cerny also stresses that power consumption and clock speeds don't have a linear relationship. Dropping frequency by 10 per cent reduces power consumption by around 27 per cent. "In general, a 10 per cent power reduction is just a few per cent reduction in frequency," Cerny emphasises.
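The two figures quoted here line up with a roughly cubic relationship between clock and power under combined voltage/frequency scaling. The cubic exponent is an assumption (a common rule of thumb), not something the article states, but the arithmetic works out:

```python
# Rough check of the quoted figures against a cubic power model, P ~ f**3
# (power scales with f * V**2 and voltage tends to fall with frequency, so a
# cube is a common rule of thumb; the exponent is an assumption, not Sony's).

print(1 - 0.9 ** 3)         # ~0.271: a 10% frequency drop => ~27% less power
print(1 - 0.9 ** (1 / 3))   # ~0.035: a 10% power drop => "just a few per cent" less frequency
```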

It's an innovative approach, and while the engineering effort that went into it is likely significant, Mark Cerny sums it up succinctly: "One of our breakthroughs was finding a set of frequencies where the hotspot - meaning the thermal density of the CPU and the GPU - is the same. And that's what we've done. They're equivalently easy to cool or difficult to cool - whatever you want to call it."


There's likely more to discover about how boost will influence game design. Several developers speaking to Digital Foundry have stated that their current PS5 work sees them throttling back the CPU in order to ensure a sustained 2.23GHz clock on the graphics core. It makes perfect sense, as most game engines right now are architected with the low-performance Jaguar in mind - even a doubling of throughput (i.e. 60fps vs 30fps) would hardly tax PS5's Zen 2 cores. However, this doesn't sound like a boost solution, but rather performance profiles similar to what we've seen on Nintendo Switch. "Regarding locked profiles, we support those on our dev kits, it can be helpful not to have variable clocks when optimising. Released PS5 games always get boosted frequencies so that they can take advantage of the additional power," explains Cerny.
 

pyrocro

Member
What unknown variables would stop, say, Asus or MSI from releasing a card that always runs at its maximum advertised clock speed? The reason they're not doing that is because there simply isn't a need for it. Oh, and the tech press and customers would rip them a new one.
What computer case do you have?
Asking yourself what PC case I (I as in me, not you) have should clear up the misunderstanding you're having.

I don't really get what's unpredictable about only having the GPU clock up when the workload demands it. That's still predictable behavior. What's the point of having the thing run at the same speeds whether you're playing Halo Infinite or Spelunky?
All modern systems downclock, even the consoles, but the PS5's implementation of AMD's SmartShift is different from a "simple" downclocking of a CPU or GPU.
If the CPU is required to be maxed out in a particular situation, the GPU cannot be maxed out at the same time, and vice versa.
Which means that for a given situation the GPU or the CPU is getting a certain amount of power from the budget, but when the power budget is reached it may be that the CPU or the GPU was downclocked to stay within it. Which means more planning and optimizing games around this new paradigm.
 
And maybe Cerny is just making it up? Timestamped.



Also consider the Sony patent on cooling - we have not seen it yet. Is 22% extra GPU speed worth it? We shall see.

The "when triangles are small" discussion from Cerny's speech - what do you think he is referring to?

What else had lots of small triangles............


Hmmm yes, do consider the fact that you haven't seen the cooling yet. I saw the Xbox Series X's cooling about 2-3 months ago in highly detailed breakdowns. Why are Sony keeping the cooling system so hush-hush? I'd be worried.

I mean, you'd better just hope this isn't another Cell processor type situation, where it's technically better than the Xbox but the end results prove otherwise. (Everything ran better on the Xbox 360 until the arse end of the generation.)
 

mckmas8808

Mckmaster uses MasterCard to buy Slave drives
You should know that both companies must have each other's dev kits by now. MS even knew they had a more powerful console before Sony's announcement; that's why they stated "the most powerful console" before any official announcement from Sony.
These are multi-billion-dollar companies, not casual people digging for info.

You sure about this?

I would say the contrary. They seem to be pretty consistent in their messaging since the PS5 reveal. They keep saying they aren't worried and still have the most powerful console. Also, I would imagine their decades as a software developer might give them some insight into development challenges.

I have no doubt that there will be some fantastic games on the PS5, especially from 1st party devs. However, I think it's pretty reasonable to think that a moving target (variable rates) would be harder to hit. Personally, I'm a bit concerned about the upclock on the PS5 GPU. I know that if I overclocked the 2080 in my PC to 2.3GHz it would be a crashfest, even with a 750 watt power supply. I think MS's decision to keep everything moderately clocked will pay dividends in heat and noise. I'm glad this is the approach they are staying with.

I'm sorry, but you don't seem to understand "how" the variable clocks work in the PS5. You can't compare it to your 2080 in your PC. This is totally different.
 

pyrocro

Member
What? No, that's SmartShift, and it only works one way: unused power from the CPU goes to the GPU.
The example was more to show what a compromise is.
Also, in my example I say take power from the GPU and give it to the CPU, so I'm not sure what the point of your statement above is, as it's the opposite of your interpretation.


Yeah, both consoles have compromises. The XSX had to compromise with a lower GPU clock speed by opting for a fixed clock, but it has sustained performance. The PS5 had to compromise with fluctuating clocks but can push peak clocks higher than normal thanks to it.
Why even say this? You're just restating unrelated facts and correlating them in an attempt to make a false equivalence.

As I said before, claiming variable > fixed clocks when you actually factor in reality is just wrong.

Do you have that ">" sign the right way around? Your argument before was the opposite of that.
That still doesn't change the fact that there would have been benefits to them implementing a variable clock over the fixed clock they opted for.

Which is my point. In the real world, "fixed clocks > variable clocks" is a false narrative.
Which is it?


Do developers optimise on the retail console or the dev kit?
Where do the clients/gamers/players play games?
Do you want a developer to optimize a game for themselves on their dev kit, which no one else has access to, or for the clients'/gamers'/players' machine?

It's easy to figure out if you ask yourself the above questions.
 
D

Deleted member 775630

Unconfirmed Member
Yeah, and you have that: you have an octa-core Ryzen with SMT over 3GHz, vs a shitty Jaguar at 2.3GHz maximum (Xbox One X). Even at 3GHz it would be > 4x more powerful than Jaguar.

Anyway, seriously, you think better AI next gen? lol. We have the same AI levels as the PS2 gen, with much better CPUs.
Well that's what I heard when people said games would be held back by current gen...
 

Joho79

Member
MS is so full of crap. I have NO IDEA why they seem to be off their game since the PS5 showing. They need more confidence. There's no way they could know if the PS5 is harder to optimize for or not.

To be fair, he doesn’t mention the PS5.
 
MS is so full of crap. I have NO IDEA why they seem to be off their game since the PS5 showing. They need more confidence. There's no way they could know if the PS5 is harder to optimize for or not.

But they know if their console is easier to develop for with fixed clocks or variable clocks. They don't need the PS5 for that.
And they say the former is true.
 

longdi

Banned
I doubt it. lol

MS definitely knows the lowdown; they have been confidently dropping mini-bombs if you are paying attention.

Unlike internet fans, MS do not need to rely on tech/PR 'deep dives' - there are no grey areas for them to imagine wonderful scenarios in.
 

mckmas8808

Mckmaster uses MasterCard to buy Slave drives
MS definitely knows the lowdown; they have been confidently dropping mini-bombs if you are paying attention.

Unlike internet fans, MS do not need to rely on tech/PR 'deep dives' - there are no grey areas for them to imagine wonderful scenarios in.

But they simply can't know this without working on a PS5 dev kit and having the available software toolkit that comes with it. It's simply not possible.
 
It goes to show that in the console gaming world, the PC rules don't apply. In console-land, you over-engineer a certain component of the system to make it more marketable. In PC-land, it's all about balancing the different components of the system. Microsoft clearly didn't get the memo.
 
Something can't be easier to develop for if its clocks are varied rather than stable.
Something can't be easier to develop for if the hardware delivers less power with varied clocks.

Devs are also whores who will say anything to hype up things. Just read developer interviews about any other previous generation jump.

Sony fans won't accept this logic.
 
But they know if their console is easier to develop for with fixed clocks or variable clocks. They don't need the PS5 for that.
And they say the former is true.

I think that's what people need to take into account. They are talking about their system, so it's probably true for them. Sony might be different, though, because of the way they are handling variable clocks on their system.

I've heard developers mention that the PS5 is really easy to make games for. However, I haven't heard anyone say yet that the variable clocks make development a pain.
 

J_Gamer.exe

Member
The clever implementation Sony has gone for is superior to the fixed-clock setup, and I think Ronald knows this. It allows you to get more out of a GPU than you otherwise would under a fixed clock.

The Xbox could probably hit 2GHz most of the time with the same implementation and only drop when that rare worst-case code comes around. It's limited by that worst case, when most of the time it could have run higher.

Microsoft cares a lot about power, so I think if they could have had the Xbox hitting even higher performance they would absolutely have gone for that.

Maybe they played it safe, or maybe they never thought of it - a bit like with the I/O hardware, where they seem to have been left behind a bit.

Devs have said both consoles are close, closer than the numbers would suggest. This clever clocking is one reason why, giving the PS5 advantages in other areas of the GPU. Cache scrubbers and the coherency engines are another, which should lead to higher CU occupancy.
 

sendit

Member

Do you actually understand what you posted? The frequencies at which the GPU/CPU operate are based on a fixed power draw (which is how they're able to cool the console predictably and efficiently). What you suggested in your original post is that this is based on thermal throttling.

"Shortcomings, as in downclocking the CPU and/or GPU in order to keep your device cool (which in turn keeps the fans quiet)" -GODBody :messenger_loudly_crying:
 

B_Boss

Member
Quite possibly. Anything Cerny says means no more to me than what Phil says means to the Sony fans who frequently like to imply or outright state that Phil is lying.

Why not just take some of what they both say with a grain of salt, using discretion and respecting their individual skill sets/talents? It is difficult not to appreciate (not worship) Cerny’s technical knowledge when he discusses the specifics.

No, it's just a normal gaming forum thread. You have to understand that when you step out of your next-gen safe space and don't have all your Sony loyalists to back you up, opinions other than the hive-mind narrative that permeates that thread are permitted.

Opinions are certainly permitted here and elsewhere, no? And those “Sony loyalists” could just as well (and as much) post here. Also, could you clarify the “hive-mind narrative”? What is it? I’ve read an awful lot but am totally ignorant of the narrative you’re speaking of. I could just be ignorant, and I accept that lol.

I think the strongest and best “next gen safe space” should be the undeniable and scientific facts with supported evidence, but that is me.

But it's OK, I'm sure you won't notice, as your eyes can't look away from the car-crash PS5 internet-router stylings.

I know what you mean 🤣:



I’m not ultimately sure what you’re implying. The PS5 and Series X resemble something else, but that’s it? I actually really like both their designs.
 

sendit

Member
It's as though you want to be right and wrong at the same time.
You're making his point for him and want to say he's ridiculous.

Say it with me: "variable clocks are a compromise". It literally is a give (power to CPU) and take (power from GPU) on the PS5.


More unknown variables to contend with on the PC. Everything is known about the console by the manufacturer of that console.
Why not work within the limits of the design and power envelope in a consistent, predictable way?
The real question is why this technology hasn't taken over the laptop world. Should Sony have put this into their laptops to kill the competition, if it's free performance with no compromise?


You seem to be suggesting the Xbox Series X is weaker than the PS4 Pro?

Do you run the games on a dev kit or on the retail console you have at home (with variable clocks)?

Nah, I think most people get it.
It might just be you.
Also, you should have read the article before posting; not reading it is a sure way to be confused about its contents.

Yes, that is exactly what I'm saying. The PS4 Pro is weaker than the Xbox Series X. /s :messenger_dizzy:

Reference: https://www.udemy.com/course/how-to-support-your-adolescents-reading-development/
 

IntentionalPun

Ask me about my wife's perfect butthole
He really didn't imply they aren't interested in TFLOPs; he implied they aren't interested in increasing their max TFLOPs by way of variable clock rates.

"Variable clocks aren't worth the extra TFLOPs on paper" would be a better paraphrase.

Now obviously one could argue he's wrong, but that's more aligned to what he said.
 

Journey

Banned
Microsoft is so transparently jealous. If you're that confident in your engineering decisions, then let your exclusive games do the talking.

Wait, never mind... I understand. Carry on then, Microsoft.

The constant bashing of MS exclusives has got to stop - or there should be some kind of warning at least. It's getting old; we don't know what the new generation will bring after the new acquisitions.
 

pyrocro

Member
He really didn't imply they aren't interested in TFLOPs; he implied they aren't interested in increasing their max TFLOPs by way of variable clock rates.

"Variable clocks aren't worth the extra TFLOPs on paper" would be a better paraphrase.

Now obviously one could argue he's wrong, but that's more aligned to what he said.
Stop trying to be rational, it's only going to make you informed. :messenger_beaming:
 

Yoshi

Headmaster of Console Warrior Jugendstrafanstalt
He says it would have made it harder to develop for if they used variable clocks? How so? I really wish he would expand on that.
Because it is harder to predict the behaviour of your game. E.g. if a scene runs just fine at the higher clock rate, and the system happens to deliver that most of the time you test the scene, but struggles to keep performance at a lower clock, which also sometimes happens in the same scene, it is very difficult to replicate the issue in order to fix it. Of course, variable clock rates are only one of many factors that can lead to variable performance. In fact, a Microsoft game recently ran into issues with another variable aspect: Ori and the Will of the Wisps had issues with Unity's memory management and had inconsistent loading times and framerates as a consequence.
 

GODbody

Member
Do you actually understand what you posted? The freq at which the the GPU/CPU operate at are based on a fixed power draw (Which is how they're able to predictably cool the console efficiently). What you suggested in your original post is that this is based on thermal throttling.

"Shortcomings, as in downclocking the CPU and/or GPU in order to keep your device cool (which in turn keeps the fans quiet)" -GODBody :messenger_loudly_crying:
What's the difference? Instead of having the chip run at locked frequencies and letting the power vary, they chose to set limits (peaks) for the frequencies and give the chip a fixed power budget. These peaks are not sustained and are the top end of the spectrum. It's likely that the CPU and GPU won't both be running at peak frequencies simultaneously at any given time, given the fixed power budget, with one of the units needing to reduce its frequency (downclocking and reducing power draw) in order to increase the frequency (upclocking and increasing power draw) of the other unit. Cerny states this decision made the device easier to cool. I said downclocking because the unit downclocks from those peak frequencies.
 

IntentionalPun

Ask me about my wife's perfect butthole
It's likely that the CPU and GPU won't both be running at peak frequencies simultaneously at any given time, given the fixed power budget
Not true; it will run both at max frequency as long as the workload wouldn't push power draw over the budget.

Frequencies aren't what cause power draw on their own.

According to Cerny, they can both be at max most of the time (likely because most of the time games aren't fully stressing one or the other from a power-draw standpoint). We'll really need to wait to hear from devs about the downclocking and how much of an issue it is.
 

IntentionalPun

Ask me about my wife's perfect butthole
Because it is harder to predict the behaviour of your game. E.g. if a scene runs just fine at the higher clock rate, and the system happens to deliver that most of the time you test the scene, but struggles to keep performance at a lower clock, which also sometimes happens in the same scene, it is very difficult to replicate the issue in order to fix it. Of course, variable clock rates are only one of many factors that can lead to variable performance. In fact, a Microsoft game recently ran into issues with another variable aspect: Ori and the Will of the Wisps had issues with Unity's memory management and had inconsistent loading times and framerates as a consequence.
Sony's variable clock rate method is more predictable than usual though.

"The same scene" will always run at the same frequencies at all times on every machine. So your description really isn't valid for the PS5.

However, the same piece of code won't always run at the same frequencies, because the overall workload might be different from scene to scene. So some code that does X while Y is happening will always run at the same speed... but if you try to do X while Y AND Z are happening, then the frequencies might drop.

However, it would always have that same drop while X/Y/Z are happening, so at least the dev can detect that. Cerny claims no dev should ever have to optimize around those scenarios, but I personally take that with a grain of salt... as Cerny basically described everything about the PS5 as perfect and the right way to do things lol
 

GODbody

Member
Not true; it will run both at max frequency as long as the workload wouldn't push power draw over the budget.

Frequencies aren't what cause power draw on their own.

According to Cerny, they can both be at max most of the time (likely because most of the time games aren't fully stressing one or the other from a power-draw standpoint). We'll really need to wait to hear from devs about the downclocking and how much of an issue it is.

If both units are running at their peak frequencies constantly, then there is no need for a variable frequency.

Frequency and power draw are explicitly tied together. When you overclock a CPU on a PC you increase the amount of power given to it, thereby increasing the frequency at which it runs.

When the GPU is at its peak it's drawing power away from the CPU to hit that peak frequency, and if the power budget is fixed the CPU is not getting extra power to hit its peak frequency at the same time.
 

sendit

Member
Not true; it will run both at max frequency as long as the workload wouldn't push power draw over the budget.

Frequencies aren't what cause power draw on their own.

According to Cerny, they can both be at max most of the time (likely because most of the time games aren't fully stressing one or the other from a power-draw standpoint). We'll really need to wait to hear from devs about the downclocking and how much of an issue it is.

Precisely. Here are a few examples:


I would wager that the PS5's GPU will be at 2.23GHz 99% of the time.
 

IntentionalPun

Ask me about my wife's perfect butthole
If both units are running at their peak frequencies constantly, then there is no need for a variable frequency.

Frequency and power draw are explicitly tied together. When you overclock a CPU on a PC you increase the amount of power given to it, thereby increasing the frequency at which it runs.

When the GPU is at its peak it's drawing power away from the CPU to hit that peak frequency, and if the power budget is fixed the CPU is not getting extra power to hit its peak frequency at the same time.

Again... you are missing one huge factor: what is running on the CPU/GPU (the workload).

They can both be running at max as long as the workload isn't too high.
 
You sure about this?



I'm sorry, but you don't seem to understand "how" the variable clocks work in the PS5. You can't compare it to your 2080 in your PC. This is totally different.
I'm not comparing variable clocking to my 2080. I was saying if I overclocked my 2080 to the GPU clock speed of the PS5 it would cause stability and heat problems even with a large power supply.

I get the power shift from CPU to GPU that Sony is promoting, but less power equals less performance, so there is going to have to be a compromise if the shift needs to occur. For example, if a developer needs more GPU power and it throttles the CPU to provide it, will CPU functions like AI suffer? These are the questions that haven't really been answered.
 

MastaKiiLA

Member
If both units are running at their peak frequencies constantly, then there is no need for a variable frequency.

Frequency and power draw are explicitly tied together. When you overclock a CPU on a PC you increase the amount of power given to it, thereby increasing the frequency at which it runs.

When the GPU is at its peak it's drawing power away from the CPU to hit that peak frequency, and if the power budget is fixed the CPU is not getting extra power to hit its peak frequency at the same time.
He might be referring to multithreaded processes versus single-threaded processes. You might want a higher clock to process a single thread faster than usual, because it's not possible to run the task in parallel. So while you bump the clock, you might not be using the full number of logic units available, so your current draw might not be that much higher over a given interval of time. Thus you wouldn't be drawing more power.

Just speculation. I'm not sure what level of granularity these chips operate at. Just trying to guess the context of his post.
 

GODbody

Member
Again... you are missing one huge factor: what is running on the CPU/GPU (the workload).

They can both be running at max as long as the workload isn't too high.
Why would they both be running at max frequencies if the workload is not intensive and doesn't require it?
 

IntentionalPun

Ask me about my wife's perfect butthole
Here's what I'm talking about:

"So, when I made the statement that the GPU will spend most of its time at or near its top frequency, that is with 'race to idle' taken out of the equation - we were looking at PlayStation 5 games in situations where the whole frame was being used productively. The same is true for the CPU, based on examination of situations where it has high utilisation throughout the frame, we have concluded that the CPU will spend most of its time at its peak frequency."

Put simply, with race to idle out of the equation and both CPU and GPU fully used, the boost clock system should still see both components running near to or at peak frequency most of the time.

Although Cerny did say "at or near", he does explicitly say there's enough power for them to both be running at full frequency.

There’s enough power that both CPU and GPU can potentially run at their limits of 3.5GHz and 2.23GHz

It's all workload dependent, although taking all of these statements into account... it actually sounds like most of the time one or the other will be budging a bit. Him throwing around "at or NEAR" is telling.
 

mckmas8808

Mckmaster uses MasterCard to buy Slave drives
I'm not comparing variable clocking to my 2080. I was saying if I overclocked my 2080 to the GPU clock speed of the PS5 it would cause stability and heat problems even with a large power supply.

I get the power shift from CPU to GPU that Sony is promoting, but less power equals less performance, so there is going to have to be a compromise if the shift needs to occur. For example, if a developer needs more GPU power and it throttles the CPU to provide it, will CPU functions like AI suffer? These are the questions that haven't really been answered.

That's not exactly how this works. The CPU and GPU can run at full power at the same time. But if the dev wants to put some CPU power to the side, they can. They can develop their game with this lesser CPU power in mind (say 3.2GHz instead of 3.5GHz) so that the GPU is always at 10.3 TFs of power.
 

Dnice1

Member
Not true; it will run both at max frequency as long as the workload wouldn't push power draw over the budget.

Frequencies aren't what cause power draw on their own.

According to Cerny, they can both be at max most of the time (likely because most of the time games aren't fully stressing one or the other from a power-draw standpoint). We'll really need to wait to hear from devs about the downclocking and how much of an issue it is.

Cerny made so many conflicting or vague statements in that Eurogamer article. The phrase "most of the time" is mentioned throughout the boost clock talk. Then there is "CPU and GPU can potentially run at their limits of 3.5GHz and 2.23GHz". There are more statements I could point out, including Digital Foundry's about developers throttling back on the CPU in order to sustain the GPU's 2.23GHz clock speed. Now why would they need to do that if it can run both the CPU and GPU at peak frequency?

Now here is the heart of the matter Jason Ronald is talking about...
"Mark Cerny sees a time where developers will begin to optimise their game engines in a different way - to achieve optimal performance for the given power level. "Power plays a role when optimising. If you optimise and keep the power the same you see all of the benefit of the optimisation. If you optimise and increase the power then you're giving a bit of the performance back. What's most interesting here is optimisation for power consumption, if you can modify your code so that it has the same absolute performance but reduced power then that is a win. " In short, the idea is that developers may learn to optimise in a different way, by achieving identical results from the GPU but doing it faster via increased clocks delivered by optimising for power consumption. "

Several times the word optimized is used when talking about programming for variable clocks. It's going to take more developer work to get the most out of it. That's all he was saying.
 

IntentionalPun

Ask me about my wife's perfect butthole
Why would they both be running at max frequencies if the workload is not intensive and doesn't require it?

Why wouldn't it run at max frequency if it can?

What does "doesn't require it" mean? There is no such thing as requiring a frequency. The XSX will always have its GPU/CPU set to the same frequencies whether you are playing Halo Infinite or Doom 2D, because it uses locked clocks. It would just have a higher power draw when playing Halo than Doom 2D.
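To make the contrast in this exchange concrete: a locked-clock design holds frequency constant and lets power draw follow the workload, while a fixed-power design holds the draw constant and lets frequency follow the workload. A rough sketch with invented wattages and an invented load threshold (only the 1.825GHz and 2.23GHz clock caps are published figures):

```python
# Rough contrast of the two approaches being argued about (wattages and the
# load threshold are invented; only the clock caps are published figures).

def locked_clock(workload: float) -> tuple:
    """XSX-style: frequency never moves; power draw rises with the workload."""
    freq_ghz = 1.825                 # fixed GPU clock
    power_w = 60 + 120 * workload    # invented power curve
    return freq_ghz, power_w

def fixed_power(workload: float) -> tuple:
    """PS5-style: power budget never moves; clocks dip on the heaviest workloads."""
    power_w = 180.0                              # invented fixed budget
    freq_ghz = 2.23 if workload < 0.9 else 2.1   # invented threshold
    return freq_ghz, power_w

for load in (0.3, 1.0):   # a light 2D game vs a heavy AAA scene
    print(locked_clock(load), fixed_power(load))
```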
 

IntentionalPun

Ask me about my wife's perfect butthole
That's not exactly how this works. The CPU and GPU can run at full power at the same time. But if the dev wants to put some CPU power to the side, they can. They can develop their game with this lesser CPU power in mind (say 3.2GHz instead of 3.5GHz) so that the GPU is always at 10.3 TFs of power.
That is 100% NOT how it works. The dev does not decide where the power goes. All they have control over is the code they write, so they can write less CPU-intensive code if they want to ensure max GPU power, but they can't explicitly do that by just setting power levels or anything.

edit: I kinda realize you might be trying to say the same thing as me... I do apologize if that is the case... but you saying "put some CPU power to the side" made it sound like they could literally lower the clocks or something.
 

IntentionalPun

Ask me about my wife's perfect butthole
Cerny made so many conflicting or vague statements in that Eurogamer article. The phrase "most of the time" is mentioned throughout the boost clock talk. Then there is "CPU and GPU can potentially run at their limits of 3.5GHz and 2.23GHz". There are more statements I could point out, including Digital Foundry's about developers throttling back on the CPU in order to sustain the GPU's 2.23GHz clock speed. Now why would they need to do that if it can run both the CPU and GPU at peak frequency?

Now here is the heart of the matter Jason Ronald is talking about...
"Mark Cerny sees a time where developers will begin to optimise their game engines in a different way - to achieve optimal performance for the given power level. "Power plays a role when optimising. If you optimise and keep the power the same you see all of the benefit of the optimisation. If you optimise and increase the power then you're giving a bit of the performance back. What's most interesting here is optimisation for power consumption, if you can modify your code so that it has the same absolute performance but reduced power then that is a win. "

In short, the idea is that developers may learn to optimise in a different way, by achieving identical results from the GPU but doing it faster via increased clocks delivered by optimising for power consumption. "

Six times the word optimized is used when talking about programming for variable clocks. It's going to take more developer work to get the most out of it. That's all he was saying.
Yeah, see my next post on the topic, where I revisited the actual quotes; he actually says "at or near" when talking about max frequencies, so if you read between the lines... it actually means most of the time one or the other probably gives.

There are more statements I could point out, including Digital Foundry's about developers throttling back on the CPU in order to sustain the GPU's 2.23GHz clock speed.

To be clear, they can't throttle the CPU. What they can do is write different / less intensive / power-draw-optimized code targeting the CPU to make sure the GPU stays at max.

But yeah, you are right, Cerny sure likes to talk about optimization in that one article, and then he explicitly tried to deny that devs would have to optimize in another interview lol
 
That's not exactly how this works. The CPU and GPU can run at full power at the same time. But if the dev wants to put some CPU power to the side, they can. They can develop their game with this lesser CPU power in mind (say 3.2GHz instead of 3.5GHz) so that the GPU is always at 10.3 TFs of power.
You literally just said the same thing I said. They would move power to the GPU and take power from the CPU, which would throttle the CPU and reduce its performance. The question is what the impacts are.

If the CPU and GPU can run at full power all the time, what would the point of a variable clock speed be? If they can run at full throttle together all the time, why would you shift power? Makes no sense.
 