
Explanation of the misconception about PS5's Variable Frequency (GPU/CPU)

phil_t98

#SonyToo
The GPU operates most of the time at the 2.28 GHz clock. In the rare case where the temperature is rising, the GPU temporarily operates at 2.21 GHz so the cooling system doesn't get loud. So there is no boost: 2.28 GHz is the main clock, but in rare cases it drops to 2.21 GHz for a short period of time to cut the power draw by 10%.

So what is "most of the time"?
 

Hobbygaming

has been asked to post in 'Grounded' mode.
There is no chart or official numbers from Sony, just this BS PR. Cerny was very careful to avoid that; he used generic terms like "minor" and "a couple", etc. Sony is hiding the truth like Microsoft in 2013. This horrible feature is the ESRAM of this generation. Push too hard, downclock, and lose frames. Lock it like every other home console.
PS5's reveal can't come soon enough. Comparing PS5's SSD to ESRAM is asinine 🙄
 
So, looking into it a bit more...

The AMD SmartShift tech this is based on varies frequency and voltage per component based on the workload each one is handling.

Meaning the CPU and GPU each have a power budget that shifts with the workload, so they downclock and lower voltage when the workload is light. The CPU and GPU are each budgeted a portion of the total allowed power, and their clocks are varied between them to stay within that budget depending on whether the CPU or the GPU is being tasked harder. Because the split is driven by activity rather than temperature, the behaviour is deterministic.

I think what is confusing everyone is that Cerny only described a maximum-workload scenario for the PS5, with the GPU and CPU running near their max frequencies.
He didn't mean that every workload would always push the console to its maximum power budget.
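To make that concrete, here is a rough, purely illustrative Python sketch of how a fixed power budget can be split deterministically between CPU and GPU based on activity rather than temperature. Every number, name, and the cube-root mapping below is my own assumption for illustration; this is not Sony's or AMD's actual algorithm.

```python
# Purely illustrative sketch of a SmartShift-style power-budget split.
# Every number and function name here is invented for the example;
# this is not Sony's or AMD's actual implementation.

TOTAL_POWER_W = 200.0               # assumed fixed SoC budget (made-up figure)
CPU_MAX_W, GPU_MAX_W = 60.0, 140.0  # assumed per-block maximums (made up)
CPU_MAX_GHZ, GPU_MAX_GHZ = 3.5, 2.23

def split_budget(cpu_activity: float, gpu_activity: float):
    """Split the fixed budget in proportion to reported activity (0..1).

    The same activity inputs always produce the same split, which is
    what makes the scheme deterministic: it never looks at temperature.
    """
    total = (cpu_activity + gpu_activity) or 1.0
    return (TOTAL_POWER_W * cpu_activity / total,
            TOTAL_POWER_W * gpu_activity / total)

def clock_for_power(power_w: float, max_power_w: float, max_ghz: float) -> float:
    """Map an allotted power share to a clock, capped at the rated maximum.

    Dynamic power scales roughly with f * V^2, and V tracks f, so power
    goes roughly with the cube of frequency; invert that relationship.
    """
    return min(max_ghz, max_ghz * (power_w / max_power_w) ** (1 / 3))

# Example: a demanding scene where the GPU's share comes in a bit under
# its peak; its clock barely moves, because power falls off much faster
# than frequency does.
cpu_w, gpu_w = split_budget(cpu_activity=0.5, gpu_activity=0.9)
print(round(clock_for_power(cpu_w, CPU_MAX_W, CPU_MAX_GHZ), 2),   # ~3.5 (capped)
      round(clock_for_power(gpu_w, GPU_MAX_W, GPU_MAX_GHZ), 2))   # ~2.17
```

The point the sketch tries to capture is that the inputs are activity measurements, not thermistor readings, so the same scene should produce the same clocks on every PS5.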
 

darkinstinct

...lacks reading comprehension.
The GPU operates most of the time at the 2.28 GHz clock. In the rare case where the temperature is rising, the GPU temporarily operates at 2.21 GHz so the cooling system doesn't get loud. So there is no boost: 2.28 GHz is the main clock, but in rare cases it drops to 2.21 GHz for a short period of time to cut the power draw by 10%.

Got a source for that claim? Typically console games are CPU-limited, so devs will want to use the most CPU power they can get for good frame rates. They can easily drop resolution to make up for a lack of GPU power. And with one console at a fixed CPU clock of 3.8 GHz and another with a variable 3.5 GHz max that would run at 3.2 GHz (see GitHub) whenever the GPU runs at 2.28 GHz, there is no way devs opt for GPU over CPU. Unless PS5 targets 30 fps and XSX 60 fps. That might be possible.
 
Got a source for that claim? Typically console games are CPU-limited, so devs will want to use the most CPU power they can get for good frame rates. They can easily drop resolution to make up for a lack of GPU power. And with one console at a fixed CPU clock of 3.8 GHz and another with a variable 3.5 GHz max that would run at 3.2 GHz (see GitHub) whenever the GPU runs at 2.28 GHz, there is no way devs opt for GPU over CPU. Unless PS5 targets 30 fps and XSX 60 fps. That might be possible.
3.8 GHz is for 8 threads (8c/8t), which would be at a huge disadvantage compared to 8c/16t.
XSX is 3.6 GHz at 8c/16t; 3.8 GHz is great for BC, though.

Source for what I said is Cerny.

Lol, 30 fps vs 60 fps, what? Are you living in la-la land?
 
Last edited:

Hobbygaming

has been asked to post in 'Grounded' mode.
What are you talking about, variable clocks are the SSD now lol? I said it clearly: variable clocks, not the SSD. I said the SSD is badass, jesus.
Oh, but that can't be compared either, as it would still have the higher clocks.
 
Remember, the system is not based on temperature (internal or external); it's based on the "activity level" of the components...

Otherwise, playing during a hot summer day would change performance compared to winter...
 

NXGamer

Member
Digital Foundry has spoken with multiple developers making games for the PS5. There is no need for a wall of text from armchair analysts.

Confirmed by DF that the variable clocks use fixed targets (profiles) that the developer chooses from. If you run max GPU, then the CPU is downclocked, and vice versa. This isn't rocket science.
It was me that Alex was speaking to, but he and DF have since gone to clarify with devs, which I have also been doing. It is likely the option is a Priority mode, i.e. you choose to allow the CPU to dip IF demand gets too high, to keep the GPU constant, OR you allow them both to shift dynamically within the predictable states.
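For what it's worth, here is a hypothetical way to picture the two behaviours described above. Nothing in this snippet comes from Sony's SDK; the enum, function, and percentages are invented purely to illustrate the difference between a "keep the GPU constant" priority and a "let both shift" mode.

```python
# Hypothetical illustration of the two options described above.
# Nothing here comes from Sony's SDK; all names and numbers are invented.
from enum import Enum

class PowerMode(Enum):
    GPU_PRIORITY = "keep the GPU clock constant, let the CPU dip"
    DYNAMIC = "let both clocks shift within their predictable states"

def resolve_clocks(mode: PowerMode, over_budget: bool,
                   cpu_ghz: float, gpu_ghz: float) -> tuple:
    """Decide which block gives up clock when the power budget is exceeded."""
    if not over_budget:
        return cpu_ghz, gpu_ghz                   # within budget: no change
    if mode is PowerMode.GPU_PRIORITY:
        return cpu_ghz * 0.97, gpu_ghz            # only the CPU dips (illustrative 3%)
    return cpu_ghz * 0.985, gpu_ghz * 0.985       # both dip a little (illustrative 1.5%)

# Only the CPU clock dips; the GPU stays at its 2.23 GHz target.
print(resolve_clocks(PowerMode.GPU_PRIORITY, True, cpu_ghz=3.5, gpu_ghz=2.23))
```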
 

JägerSeNNA

Banned
What happens if this variable frequency thing is not up to the devs but is instead prioritized by the system itself? Let's say you want to develop a game that should run at 4K/60fps. How do you optimize your game to hit 60fps all the time?

Will the GPU get slower to max out the CPU in order to maintain a constant 60fps scene by scene, dropping the resolution dynamically? Or the reverse: maintain the visual effects at a base resolution and sacrifice some fps?

Some questions need to be asked. What happens if the system has to slow down because of high temperatures? It's a bit complicated.
 
It was me that Alex was speaking to, but he and DF have since gone to clarify with devs, which I have also been doing. It is likely the option is a Priority mode, i.e. you choose to allow the CPU to dip IF demand gets too high, to keep the GPU constant, OR you allow them both to shift dynamically within the predictable states.
Thanks for the input
 

shoegaze

Member
The way I see this: Sony wanted the PS5 to be a console-sized machine. That could have been one of their most important considerations, leading to the decisions regarding variable clocks and a unique cooling solution. The XSX may look ridiculously big next to the PS5.
 

longdi

Banned
Please think logically and don't fall for Cerny's BS.

RDNA1 to RDNA2 still uses 7nm.
If the 5700 XT can only do 2.1 GHz at insane voltages with custom water cooling, it is a tall order to think the 6700 can do 2.23 GHz sustained 98% of the time with air cooling.

The 5700 XT is a full-CU chip, while the 6700 will be the cut-down 36 CU version.
Cut-down versions are failed full versions that cannot enable all CUs.
Also, the PS5's 6700 is welded into an APU, which means it also has the Zen 2 CPU, the I/O controllers, the PCIe controllers, and the sound chip, all to be cooled by some 'patented' air cooling.

This is not even hating on Sony here, just simple unbiased logic. Mark Cerny and Sony have used sneaky words that don't represent truths, only half-truths. :lollipop_poop:
 
Confirmed by DF that the variable clocks use fixed targets (profiles) that the developer chooses from. If you run max GPU, then the CPU is downclocked, and vice versa. This isn't rocket science.

I’m curious to see how many devs favor GPU vs CPU (and vice versa) for their game’s power profile
 

JägerSeNNA

Banned
Please think logically and don't fall for Cerny's BS.

RDNA1 to RDNA2 still uses 7nm.
If the 5700 XT can only do 2.1 GHz at insane voltages with custom water cooling, it is a tall order to think the 6700 can do 2.23 GHz sustained 98% of the time with air cooling.

The 5700 XT is a full-CU chip, while the 6700 will be the cut-down 36 CU version.
Cut-down versions are failed full versions that cannot enable all CUs.
Also, the PS5's 6700 is welded into an APU, which means it also has the Zen 2 CPU, the I/O controllers, the PCIe controllers, and the sound chip, all to be cooled by some 'patented' air cooling.

This is not even hating on Sony here, just simple unbiased logic. Mark Cerny and Sony have used sneaky words that don't represent truths, only half-truths. :lollipop_poop:
This is totally what I think, word for word, but let's see whether the dream comes true!
 

Iorv3th

Member
I don't even get why people say a boost clock can't be held all the time. It easily can be in a PC or any other system if you have adequate cooling in place.
 

DunDunDunpachi

Patient MembeR
How does this save power?
How does reducing frequency save electricity? By not pulling as much electricity through the tubes in your wall. 🤷‍♀️

(That's essentially what "overclocking" is: pushing more electricity through the same chipset. Linear growth in frequency, but much faster than linear, roughly cubic, growth in power consumption.)
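A quick back-of-the-envelope on that relationship, using the usual dynamic-power approximation P ≈ C·V²·f and the crude assumption that voltage scales with frequency. These are illustrative numbers, not measured console figures.

```python
# Dynamic power P ≈ C * V^2 * f. If voltage has to rise roughly in
# step with frequency, power grows roughly with the cube of the clock.
# Illustrative only; these are not measured console figures.

def relative_power(freq_ratio: float) -> float:
    """Power relative to baseline when frequency (and voltage) scale by freq_ratio."""
    voltage_ratio = freq_ratio          # crude assumption: V tracks f
    return freq_ratio * voltage_ratio ** 2

print(f"{relative_power(1.10):.2f}x")   # +10% clock -> ~1.33x power
print(f"{relative_power(0.98):.2f}x")   # -2% clock  -> ~0.94x power
```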
 

longdi

Banned
I don't even get why people say a boost clock can't be held all the time. It easily can be in a PC or any other system if you have adequate cooling in place.

Sure you can,
if Sony paid for the best 6700 dies
and their 'patented' cooling is a liquid AIO.
Water cooling, even just an AIO, is proven perfect for GPU dies.
This would allow them to max the voltage of their premium RDNA2 36 CU dies and hold stable max boost clocks.

[Images: a hybrid (AIO) cooled RTX 2080 Ti card, plus its 3DMark temperature and clock charts]
 
Last edited:

DaMonsta

Member
How does reducing frequency save electricity? By not pulling as much electricity through the tubes in your wall. 🤷‍♀️

(That's essentially what "overclocking" is: pushing more electricity through the same chipset. Linear growth in frequency, but much faster than linear, roughly cubic, growth in power consumption.)
Obviously I understand that reducing frequency saves power.

I’m asking how Sony’s variable frequency approach saves power in comparison to a clock fixed at ~2% lower.
 

Gamernyc78

Banned
Very informative, thank you

TE="Agnostic2020, post: 257521728, member: 717954"]
I have seen many people make this mistake, so it's worth explaining this common mistake and the term that is being used incorrectly, "boost clock", when the correct term is variable frequency.

The PS5's 'boost clock' is not the same as what we know as a boost clock on PC.
Lots of people have been talking about how the teraflop number is a sham because the PS5 won't be able to run at this 'boost clock' most of the time. That is simply false, and here's why:

The PS5 has a specific power limit, i.e. its power consumption is not variable and is a consistent figure at all times. Thus the heat the chip puts out is the same at all times. This allows the designers to build the cooling system around that exact heat output. What this means is that the PS5's fan is not going to spin much faster or slower depending on how big or small the power consumption is. It's going to spin based on ambient temps, as the thermal output of the processor is already known and they only need to account for the variance in ambient temps (which makes the temperature range the cooling system has to be designed for much more predictable and exact, i.e. a one-degree rise in ambient temperature can be accounted for far more accurately than a swing in power consumption). So all the people who suffered through the obnoxiously loud and hot PS4s and PS4 Pros, rejoice, for you have to suffer no longer (that is, if you're getting a PS5)!

How is Sony achieving this, though? Well, here is where the variable frequency part comes in. The PS5's processor will be able to see what the games are actually doing, i.e. what activity is going on in-game, and when those scenarios occur where power consumption would spike up, it'll downclock. Game developers will be able to tell exactly when this power consumption goes up and, as such, account for the reduced frequency (so in a way you have the predictability and reliability that comes with setting a specific frequency target, but it also means devs will have to work a bit harder to fine-tune and optimise things). The crucial thing here is that it doesn't have to downclock by a lot. Remember when I said this a couple of paragraphs above:


Well, the opposite is happening here: a decrease in frequency leads to a disproportionately large decrease in power consumption.

So a 2-3% decrease in frequency (that's about 40-70 MHz in the PS5's case) can deliver at least a 10% decrease in power consumption. What this basically means is that the PS5 is going to be hitting the targeted clock speed of 2.23 GHz most of the time (unlike a PC processor with boost clocks), when it downclocks it's not going to be by a significant amount (again, unlike a PC processor), and while doing all this it is going to remain cool and quiet. Quite an innovative and novel concept, huh? This is why the PS5's variable frequency is unlike the boost clocks found on PC and shouldn't be compared to them.
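As a sanity check on that 2-3% → ~10% figure, here is the back-of-the-envelope arithmetic using the same dynamic-power approximation as above. The voltage numbers are assumptions made up for illustration; only the clock figures come from Sony's stated spec.

```python
# Rough arithmetic behind "a 2-3% clock drop can cut power by ~10%".
# Near the top of the voltage/frequency curve, the voltage needed for
# the last few MHz falls off faster than the clock does. The voltage
# figures below are invented for illustration; only the clocks are real.

base_freq, base_volt = 2.23, 1.00       # GHz, normalised voltage
low_freq,  low_volt  = 2.18, 0.96       # ~2% lower clock, assumed ~4% lower voltage

def dyn_power(freq: float, volt: float) -> float:
    return freq * volt ** 2             # the constant C cancels in the ratio

saving = 1 - dyn_power(low_freq, low_volt) / dyn_power(base_freq, base_volt)
print(f"~{saving:.0%} lower power")     # ~10% lower power
```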

So the summary for this section:

- Xbox Series X: Variable power/temps but constant frequency

- PC: Variable power/temps and variable frequency (unless you overclock it, in which case it'll perform like the Series X processor)

- PS5: Constant power/temps but variable frequency.

For the full write-up:


 

DunDunDunpachi

Patient MembeR
Obviously I understand that reducing frequency saves power.

I’m asking how Sony’s variable frequency approach saves power in comparison to a clock fixed at ~2% lower.
The difference is that one is fixed and the other is variable. The underlying physics dictating the heat-generation of the electrical input remains the same, I would assume.
 

LordOfChaos

Member
It's almost like people criticizing it for advertising boost clocks over base clocks didn't listen to the talk...

This isn't the thermally bound PC turbo boost; this is workload-based power sharing, which is expected to stay near max at all times unless you're doing the most power-hungry operations (games aren't using AVX all day).

Cerny also said temperatures would not be a constraint and that we'd be pleased with the cooling setup, and even acknowledged that they didn't do the best job with noise before. Excited to see it and hope that's true.
 
10.2 TF + 2.3 TF = 12.5 TF PS5 > 12 TF XSX

Checkmate, Microsoft. Sony won next generation.
This is a bit off topic, but you know Sony is going to win next gen anyway, right? I seriously don't understand why they went with this charade (variable frequency). They are going to sell more consoles than MS anyway. Unless MS is willing to sell the Series X for $399 and Lockhart for $199. Even with all that, it's still going to be a tough battle for MS.
 
Last edited:

Deto

Banned
Digital Foundry has spoken with multiple developers making games for the PS5. There is no need for a wall of text from armchair analysts.

Confirmed by DF that the variable clocks use a fixed target (profiles) that the developer chooses from. If you run Max GPU then the CPU is down-clocked and vice versa. This isn't rocket science.

They actually said, right after that, that they are not completely certain and will ask Cerny for clarification.

Cerny said:
not all games will be able to perform at the highest level, but for titles at the lowest level, the differences shouldn't be too apparent.

"When that worst-case game arrives, it will run at a lower clock speed. But not too much lower. To reduce power by 10 percent it only takes a couple of percent reduction in frequency, so I'd expect any downclocking to be pretty minor,"



DF also said that the new generation would be 8 TF. I don't see anyone saying that Sony and MS lied.
 

JägerSeNNA

Banned
This is a bit off topic, but you know Sony is going to win next gen anyway, right? I seriously don't understand why they went with this charade (variable frequency). They are going to sell more consoles than MS anyway. Unless MS is willing to sell the Series X for $399 and Lockhart for $199.
I didn't know that Sony will win next gen anyway. What makes you so sure?
 

Saber

Gold Member
As someone who doesn't understand or care much about specs, this is a very informative post. At least the fan explanation is pretty nice to read.
 
Last edited:

LordOfChaos

Member


This also means the thermal design engineers have a constant power output to design around and can build the cooling as needed. With Cerny saying we'd be pleased with the cooling setup, and that Bloomberg article describing it as "lavish", I don't share this concern unless it turns out they screwed up the cooling.


Plus... this is the status quo for nearly all consoles. We know what it would do if it overheated: it would shut down, because a console running at throttled clocks isn't running properly.
 

Goliathy

Banned
? He never said that, lol. Did you even watch the Road to PS5 talk?

He mentioned Horizon:


00:33:33,079
Processing dense geometry typically consumes less power than processing simple geometry, which is, I suspect, why Horizon's map screen, with its low triangle count, makes my PS4 Pro heat up so much. Our process on previous consoles has been to try to guess what the maximum power consumption during the entire console lifetime might be, which is to say the worst-case scene in the worst-case game, and prepare a cooling solution that we think will be quiet at that power level. If we get it right, fan noise is minimal; if we get it wrong, the console will be quite loud for the higher-power games, and there's even a chance that it might overheat and shut down if we misestimate power too badly.

PlayStation 5 is especially challenging because the CPU supports 256-bit native instructions that consume a lot of power. These are great here and there, but presumably only minimally used. Or are they? If we plan for major 256-bit instruction usage, we need to set the CPU clock substantially lower or noticeably increase the size of the power supply and fan. So after much discussion we decided to go with a very different direction on PlayStation 5.

But he never mentioned Seller on Nioh

Full script here: https://pastebin.com/SR9kTApx
 
Last edited:

JägerSeNNA

Banned
This also means the thermal design engineers have a constant power output to design around and can build the cooling as needed. With Cerny saying we'd be pleased with the cooling setup, and that Bloomberg article describing it as "lavish", I don't share this concern unless it turns out they screwed up the cooling.


Plus... this is the status quo for nearly all consoles. We know what it would do if it overheated: it would shut down, because a console running at throttled clocks isn't running properly.
Then the PS5 won't be cheap like most people think. Cooling a 2.23 GHz GPU needs a monstrously powerful cooling system in a console, which will significantly increase its cost.

But to be honest, this boost performance thing came out at the last second just to be able to show double-digit numbers to the public. It would make me happy to be proven wrong in the future.
 

DunDunDunpachi

Patient MembeR
It's almost like people criticizing it for advertising boost clocks over base clocks didn't listen to the talk...

This isn't the thermally bound PC turbo boost; this is workload-based power sharing, which is expected to stay near max at all times unless you're doing the most power-hungry operations (games aren't using AVX all day).

Cerny also said temperatures would not be a constraint and that we'd be pleased with the cooling setup, and even acknowledged that they didn't do the best job with noise before. Excited to see it and hope that's true.
Upgrading from my base PS4 to the Pro was a revelation. The system is quieter than my HDD-modded PS2.

Didn't help that I had one of those "jet engine" PS4s. I even attempted to replace the thermal paste, which worked for about two weeks and then it went back to its old ways. The system would actually stop and throw up a "PS4 is too hot, please shut me down, senpai" warning in the middle of certain intense games.

So even from a noise/cooling perspective I am eager to see what they pull off.
 

LordOfChaos

Member
Then the PS5 won't be cheap like most people think. Cooling a 2.23 GHz GPU needs a monstrously powerful cooling system in a console, which will significantly increase its cost.

But to be honest, this boost performance thing came out at the last second just to be able to show double-digit numbers to the public. It would make me happy to be proven wrong in the future.

I don't think either will be "cheap", but I think you could build around 3 PS5 APUs for every 2 Series X ones, and even this "lavish" cooling is a step up from cooling systems that cost a grand total of...


$1.

Say it's 5x that price; that's still going to be cheaper than the APU cost. If it weren't, this whole approach would seem like an insane avenue to chase.
 
I didn't know that Sony will win next gen anyway. What makes you so sure?

Seriously, I don't care who wins. I love tech and want the best and the newest, so I'm definitely getting the Series X on launch day, even if it's €600. But let's face it: the PlayStation brand is so strong that unless MS undercuts Sony on price, there is no way they can outsell Sony.
Just look at previous generations, and at the number of fans Sony has all over the world: on forums, in media outlets, in real life...
 
Last edited:

Iorv3th

Member
Sure you can,
if Sony paid for the best 6700 dies
and their 'patented' cooling is a liquid AIO.
Water cooling, even just an AIO, is proven perfect for GPU dies.
This would allow them to max the voltage of their premium RDNA2 36 CU dies and hold stable max boost clocks.

None of that is required. Most of the time, boost clocks are stable on all GPUs/CPUs that are advertised with that clock rate. Now, if you are talking about custom overclocking, then you might need a better cooling system, but even on air you can still get pretty good gains over the stock boost rates.

As far as we know, these chips aren't even overclocked; they are designed to run at that clock and core configuration. So it's dumb to assume they won't be able to achieve those speeds most of the time.

The only reason CPUs/GPUs don't run at boost speeds unless needed is that you're not going to want the extra power consumption when you don't need it. But they have no trouble holding those clocks at all times if you set them to always run at those speeds.
 
Last edited: