
Explaining the misconception about PS5 variable frequency (GPU/CPU)

I have seen many people make this mistake, so it's worth explaining it: the term being used mistakenly is "boost clock", when the correct term is variable frequency.

The PS5's boost clock is not the same as a PC boost clock.
Lots of people have been saying the teraflop number is a sham because the PS5 won't be able to run at this "boost clock" most of the time. That is simply false, and here's why:

The PS5 has a specific power limit, i.e. its power consumption is not variable but a consistent figure at all times. The heat the chip puts out is therefore the same at all times, which lets the designers build the cooling system around that exact thermal output. What this means is that the PS5's fan is not going to spin much faster or slower depending on how big or small the power consumption is. It's going to spin based on ambient temperature: the thermal output of the processor is already known, so the cooling system only has to account for variance in ambient temps (which makes the temperature range it must be designed for far more predictable and exact, i.e. a one-degree rise in ambient temperature can be accounted for much more accurately than a swing in power consumption). So all the people who suffered through the obnoxiously loud and hot PS4s and PS4 Pros, rejoice, for you have to suffer no longer (that is, if you're getting a PS5)!

How is Sony achieving this, though? Well, here is where the variable frequency part comes in. The PS5's processor will be able to see what the games are actually doing, i.e. what activity is going on in-game, and when game scenarios occur where power consumption would spike, it downclocks. Game developers will be able to tell exactly when power consumption goes up and can account for the reduced frequency then (so in a way you keep the predictability and reliability that come with a specific frequency target, but it also means devs will have to work a bit harder to fine-tune and optimise things). The crucial thing here is that it doesn't have to downclock by much. Remember when I said this a couple of paragraphs above:

Increase in power doesn't correspond to an equal increase in frequency, i.e. it does not scale linearly. To hit higher frequencies you need to put in more and more power, to the point of diminishing returns.
Well, the opposite applies here: a decrease in frequency yields an outsized decrease in power consumption (the relationship is closer to cubic than linear).

So a 2-3% decrease in frequency (about 40-70 MHz in the PS5's case) can deliver at least a 10% decrease in power consumption. What this basically means is that the PS5 will hit its target clock of 2.23 GHz most of the time (unlike a PC processor with boost clocks); when it does downclock, it won't be by a significant amount (again, unlike a PC processor); and while doing all this it will stay cool and quiet. Quite an innovative concept, huh? This is why the PS5's variable frequency is unlike the boost clocks found on PC and shouldn't be compared to them.
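The "couple of percent of clock buys ~10% of power" claim follows from how dynamic power scales. A toy model, assuming power goes roughly as the cube of frequency (voltage has to rise with clock, and power scales with frequency times voltage squared; the PS5's real voltage/frequency curve is not public, so the numbers here are purely illustrative):

```python
# Toy model of the frequency/power relationship. Dynamic power scales
# roughly with frequency times voltage squared, and voltage must rise
# with frequency, so P ~ f^3 is a common rule of thumb. Illustrative
# only -- the PS5's real curve is not public.

def relative_power(freq_scale: float) -> float:
    """Power relative to maximum when running at freq_scale of max clock."""
    return freq_scale ** 3

MAX_CLOCK_GHZ = 2.23  # PS5 GPU target clock

for drop_pct in (2, 3, 5):
    scale = 1 - drop_pct / 100
    clock = MAX_CLOCK_GHZ * scale
    saved = (1 - relative_power(scale)) * 100
    print(f"{drop_pct}% clock drop -> {clock:.2f} GHz, ~{saved:.0f}% less power")
```

Under this cubic assumption a 3% frequency drop saves roughly 9% of power, which lines up with the "2-3% for 10%" figure quoted above.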

So the summary for this section:

- Xbox Series X: Variable power/temps but constant frequency

- PC: Variable power/temps and variable frequency (unless you lock an overclock, in which case it behaves like the Series X processor: constant frequency, variable power)

- PS5: Constant power/temps but variable frequency.

For the full write-up:
Last edited:

BeardGawd

Banned
Digital Foundry has spoken with multiple developers making games for the PS5. There is no need for a wall of text from armchair analysts.

DF confirmed that the variable clocks use fixed targets (profiles) that the developer chooses from. If you run max GPU then the CPU is downclocked, and vice versa. This isn't rocket science.
 

B0xel-new

Neo Member
Very nice read, thanks for posting it!
I've been saying variable frequency is actually quite a clever idea; it will be interesting to see how it works in the real world. I hope Sony is providing enough power, though, for games later in the generation that might push it hard.
 
Digital Foundry has spoken with multiple developers making games for the PS5. There is no need for a wall of text from armchair analysts.

DF confirmed that the variable clocks use fixed targets (profiles) that the developer chooses from. If you run max GPU then the CPU is downclocked, and vice versa. This isn't rocket science.
They actually said, right after that, that they are not completely certain and will ask Cerny for clarification.
 

quest

Not Banned from OT
There are no charts or official numbers from Sony, just PR. Cerny was very careful to avoid them; he stuck to generic terms like "minor" and "a couple", etc. Sony is hiding the truth like Microsoft did in 2013. This horrible feature is the eSRAM of this generation: push it too hard and it downclocks and you lose frames. Lock the clocks like every other home console.
 

Kenpachii

Member
Source:


PROCESSING POWER AND COOLING
I'm not gonna talk about the processors themselves, because they're fairly similar, standard components. What I want to talk about is how they're configured.

The Series X uses a tried and tested design. They set a specific frequency target for both the CPU and the GPU and supply the chip with increasing amounts of power until it hits those frequencies. This power draw is not uniform, i.e. it is a variable range: in certain game scenarios more power is required to hit the target frequency, in others less (note that an increase in power doesn't correspond to an equal increase in frequency, i.e. it does not scale linearly; to hit higher frequencies you need to put in more and more power, to the point of diminishing returns). Power consumed translates directly into heat output, so the temperature the chip puts out is also a variable range, like the power consumed (you can actually hear the variance in power draw in real time from the noise the fan makes: if it's spinning faster and louder, more power is being consumed).

So the design and engineering teams have to build a cooling system that works well across a range of temperatures and power draws. The catch is that they're making a prediction about how much power a game might draw, and this prediction might not hold for certain games, i.e. the cooling system might not be sufficient for them, which means the console will run hot and loud while those games are being played. The Series X designers have clearly taken note of this, because the entire design of the console is built around the cooling system; it's fair to say the Series X has one of the most distinctive designs out there for exactly that reason. How well it'll run remains to be seen, but the design team seems very confident, and I think we can trust them.

Now onto the PS5. The PS5 eschews years of traditional console design and goes for variable frequencies, or "boost clocks". However, there is a huge misconception because of the words "boost clock". To understand it, let's look at something that has boost clocks in the traditional sense: a PC processor (GPU or CPU, either will do). A PC processor usually has two clock targets, a base clock (lower frequency) and a boost clock (higher frequency). Variable power is supplied to the processor so that it consistently hits the base clock (basically what the Series X is doing). If there is thermal headroom, i.e. more power can be supplied without the chip overheating, the processor is fed more power so that it hits the boost clock. At the boost clock, however, temperatures start to rise, and eventually the processor has to come back down to the base clock to prevent overheating. In this type of configuration, both the power and the frequency are variable.

The PS5's boost clock is not the same as this. Lots of people have been saying the teraflop number is a sham because the PS5 won't be able to run at this "boost clock" most of the time. That is simply false, and here's why:

The PS5 has a specific power limit, i.e. its power consumption is not variable but a consistent figure at all times. The heat the chip puts out is therefore the same at all times, which lets the designers build the cooling system around that exact thermal output. What this means is that the PS5's fan is not going to spin much faster or slower depending on how big or small the power consumption is. It's going to spin based on ambient temperature: the thermal output of the processor is already known, so the cooling system only has to account for variance in ambient temps (which makes the temperature range it must be designed for far more predictable and exact, i.e. a one-degree rise in ambient temperature can be accounted for much more accurately than a swing in power consumption). So all the people who suffered through the obnoxiously loud and hot PS4s and PS4 Pros, rejoice, for you have to suffer no longer (that is, if you're getting a PS5)!

How is Sony achieving this, though? Well, here is where the variable frequency part comes in. The PS5's processor will be able to see what the games are actually doing, i.e. what activity is going on in-game, and when game scenarios occur where power consumption would spike, it downclocks. Game developers will be able to tell exactly when power consumption goes up and can account for the reduced frequency then (so in a way you keep the predictability and reliability that come with a specific frequency target, but it also means devs will have to work a bit harder to fine-tune and optimise things). The crucial thing here is that it doesn't have to downclock by much. Remember when I said a couple of paragraphs above that an increase in power doesn't correspond to an equal increase in frequency, i.e. that it does not scale linearly?

Well, the opposite applies here: a decrease in frequency yields an outsized decrease in power consumption (the relationship is closer to cubic than linear). So a 2-3% decrease in frequency (about 40-70 MHz in the PS5's case) can deliver at least a 10% decrease in power consumption. What this basically means is that the PS5 will hit its target clock of 2.23 GHz most of the time (unlike a PC processor with boost clocks); when it does downclock, it won't be by a significant amount (again, unlike a PC processor); and while doing all this it will stay cool and quiet. Quite an innovative concept, huh? This is why the PS5's variable frequency is unlike the boost clocks found on PC and shouldn't be compared to them.

So the summary for this section:

- Xbox Series X: Variable power/temps but constant frequency

- PC: Variable power/temps and variable frequency (unless you lock an overclock, in which case it behaves like the Series X processor: constant frequency, variable power)

- PS5: Constant power/temps but variable frequency.
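The contrast in this summary can be sketched as two toy control loops. Every number below (the power curve, the 200 W budget) is invented for illustration; neither console's real curves are public:

```python
# Two toy control loops for the strategies summarised above.
# fixed_frequency: hold the clock, let power (and heat) vary with load.
# fixed_power: hold the power budget, trim the clock until the load fits.
# The power model and the 200 W budget are made-up numbers.

def power_needed(clock_ghz: float, load: float) -> float:
    """Hypothetical watts needed at this clock for a workload in 0..1."""
    return 120 + 60 * load * (clock_ghz / 2.0) ** 3  # rough cubic model

def fixed_frequency(clock_ghz: float, load: float) -> float:
    """Series X style: the clock never moves; power draw follows the load."""
    return power_needed(clock_ghz, load)

def fixed_power(budget_w: float, max_clock_ghz: float, load: float) -> float:
    """PS5 style: the budget never moves; the clock drops until it fits."""
    clock = max_clock_ghz
    while power_needed(clock, load) > budget_w:
        clock = round(clock - 0.01, 2)  # step down 10 MHz at a time
    return clock

for load in (0.5, 0.9, 1.0):  # light scene, heavy scene, worst case
    watts = fixed_frequency(1.825, load)    # Series X GPU clock
    clock = fixed_power(200.0, 2.23, load)  # PS5 clock, assumed 200 W budget
    print(f"load {load:.1f}: fixed-freq draws {watts:.0f} W, "
          f"fixed-power runs at {clock:.2f} GHz")
```

Under these made-up numbers the fixed-power loop holds 2.23 GHz on the lighter loads and sheds only about 1% in the worst case, which is the behaviour the write-up describes.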

The exciting thing for me, and maybe for others interested in hardware and engineering, is that these are three different ways of hitting the same general target. It goes to show that these companies are putting real effort into design and engineering, not just copying each other, and certainly not skimping on anything.

For the full write-up:

Your PC part is straight up wrong.

You can overclock and push more power into your GPU, and once the heat piles up the GPU will downclock accordingly to stay at the right temps. So no, this is exactly what PCs already do with their GPUs.

The difference is that PCs aren't bottlenecked like consoles, which have fixed-performance GPUs.

Sony is simply overclocking an AMD GPU to the point where it overheats, so they reduce the clocks the moment the CPU gets used. That is a terrible system for a fixed console. And if it already does this straight out of the box, have fun when the thing collects dust and slows down even more over time.

It's a trash system, for the sake of hitting a 10 TFLOP number, for absolutely no reason.

If they only lose 3% performance (that's 1.8 fps out of 60) to keep it from overheating, why not just lock it at that frequency? Because it's bullshit: that thing will be downclocking hard, and that's why they didn't share the real numbers with you; it would probably look extremely bad for them.

If Sony wants to prove me wrong, they'll have to show numbers. The fact that they don't, but spent 30 minutes on super-detailed SSD information, should already tell you enough. It's the worst-case scenario.
 
We won't know what it's actually capable of until it's compared against the Series X. Cerny's own words are contradictory: he said they were struggling to hit a locked 2 GHz on the GPU and even 3 GHz on the CPU, yet with variable clocks they can far exceed both the majority of the time?

Something doesn't add up, but platform holders will always try to shine the best light on things. Time will tell, but it looks like a late effort to artificially pad out the spec sheet rather than a genius move that was planned all along.
 

martino

Member
Digital Foundry has spoken with multiple developers making games for the PS5. There is no need for a wall of text from armchair analysts.

DF confirmed that the variable clocks use fixed targets (profiles) that the developer chooses from. If you run max GPU then the CPU is downclocked, and vice versa. This isn't rocket science.

I don't see why it couldn't work like a manual OC on PC: a power target or a temp target, with the frequency adjusting around it. Strange that DF is not aware of this one.

My 1080 Ti has worked between 1940 and 2020 MHz like that since day one.

edit: I'm just realizing my 2% math at 3am in another thread was really wrong
 
I don't see why it couldn't work like a manual OC on PC: a power target or a temp target, with the frequency adjusting around it. Strange that DF is not aware of this one.

My 1080 Ti has worked between 1940 and 2020 MHz like that since day one.
Exactly, and this is not considered a boost clock; it's a varying frequency, and the two clocks are extremely close anyway.
 

BeardGawd

Banned
They actually said, right after that, that they are not completely certain and will ask Cerny for clarification.

It's very simple. Most of the time when dealing with "boost clocks", the CPU only boosts temporarily while the rest of the system stays the same, until it reaches a certain temperature and then reverts to base clocks. In the PS5's case they will downclock the GPU so the CPU can stay at max (or very close to it), and vice versa.

Basically, you target full GPU and the CPU underclocks so the power budget keeps the GPU clock high. The power that would have been reserved for the CPU goes over to the GPU to keep its clock more stable, and since the CPU is now clocked lower, more intense utilisation or instructions will not tip the power balance, at least for a game that isn't absolutely thrashing both. Indeed, this info comes from people who work on the thing.

Basically, if the GPU is at 10.2 TF, the CPU is not at 3.5 GHz.

Cerny said all this on stage, just not in the most direct way. The only reason to mention SmartShift is if this happens, just like it does with SmartShift.
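The teraflop figures being thrown around fall straight out of the clocks. For an RDNA-style GPU, FP32 TFLOPS = CUs × 64 shader lanes × 2 ops per cycle (FMA) × clock in GHz; a quick sanity check with the published CU counts:

```python
def fp32_tflops(compute_units: int, clock_ghz: float) -> float:
    """FP32 TFLOPS: CUs x 64 shader lanes x 2 ops/cycle (FMA) x clock (GHz)."""
    return compute_units * 64 * 2 * clock_ghz / 1000

print(round(fp32_tflops(36, 2.23), 2))         # PS5: 36 CUs at up to 2.23 GHz
print(round(fp32_tflops(52, 1.825), 2))        # Series X: 52 CUs, fixed 1.825 GHz
print(round(fp32_tflops(36, 2.23 * 0.97), 2))  # PS5 after a 3% downclock
```

That gives 10.28, 12.15 and 9.97 TFLOPS respectively: even a 3% downclock leaves the PS5 just under its headline number.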
 

Goliathy

Banned
If Sony wants to prove me wrong, they'll have to show numbers. The fact that they don't, but spent 30 minutes on super-detailed SSD information, should already tell you enough. It's the worst-case scenario.

I agree, but according to Mr. Cerny:

00:38:03,429
"When that worst-case game arrives, it will run at a lower clock speed, but not too much lower; to reduce power by 10% it only takes a couple of percent reduction in frequency, so I'd expect any downclocking to be pretty minor."

So "a couple" = 2-4% max, according to Cerny. We'll see how it turns out.
 

Vroadstar

Member
Your PC part is straight up wrong.

You can overclock and push more power into your GPU, and once the heat piles up the GPU will downclock accordingly to stay at the right temps. So no, this is exactly what PCs already do with their GPUs.

The difference is that PCs aren't bottlenecked like consoles, which have fixed-performance GPUs.

Sony is simply overclocking an AMD GPU to the point where it overheats, so they reduce the clocks the moment the CPU gets used. That is a terrible system for a fixed console. And if it already does this straight out of the box, have fun when the thing collects dust and slows down even more over time.

It's a trash system, for the sake of hitting a 10 TFLOP number, for absolutely no reason.

If they only lose 3% performance (that's 1.8 fps out of 60) to keep it from overheating, why not just lock it at that frequency? Because it's bullshit: that thing will be downclocking hard, and that's why they didn't share the real numbers with you; it would probably look extremely bad for them.

If Sony wants to prove me wrong, they'll have to show numbers. The fact that they don't, but spent 30 minutes on super-detailed SSD information, should already tell you enough. It's the worst-case scenario.

Look, we've got an armchair system architect here. Someone will hire this guy soon; he's great on forums!
 
I agree, but according to Mr. Cerny:

00:38:03,429
"When that worst-case game arrives, it will run at a lower clock speed, but not too much lower; to reduce power by 10% it only takes a couple of percent reduction in frequency, so I'd expect any downclocking to be pretty minor."

So "a couple" = 2-4% max, according to Cerny. We'll see how it turns out.
Exactly. If we think he lied, then that's a different scenario.

But I also heard from some people non-stop that he lied in 2019 when he said the PS5's SSD speed had not been seen in the PC market, and guess what: he was, and still is, correct, at least until this summer when faster PCIe 4.0 drives that saturate the interface are released.
 

martino

Member
TL;DR

PS5 is faster than you thought.
Sony is using AMD's tech for splitting TDP between parts in a notebook setting in a very innovative way: to split TDP between parts in a console setting.

Amazing tech.
No, it just keeps the console as near as possible to the advertised number: not above it, but not a lot less either.
 

Kenpachii

Member
I agree, but according to Mr. Cerny:

00:38:03,429
"When that worst-case game arrives, it will run at a lower clock speed, but not too much lower; to reduce power by 10% it only takes a couple of percent reduction in frequency, so I'd expect any downclocking to be pretty minor."

So "a couple" = 2-4% max, according to Cerny. We'll see how it turns out.

Then why not do the following:

1) Make the cooling solution a bit better
2) Make the box a tiny bit bigger
3) Lower those clocks by 3% and not have the issue at all?

Why are they so focused on that 3%, and why did they feel the need to make a thing of it?

How does this make any sense to anybody?

It's clear they decided to push this solution for the sake of inflating their numbers.
 

martino

Member
Then why not do the following:

1) Make the cooling solution a bit better
2) Make the box a tiny bit bigger
3) Lower those clocks by 3% and not have the issue at all?

Why are they so focused on that 3%, and why did they feel the need to make a thing of it?

How does this make any sense to anybody?

edit: misread

Again with my 1080 Ti as an example:
the card is rated for 1670 MHz boost (a really conservative clock),
a lot of them can run 1800-1900 MHz,
mine runs 1950-2020 MHz,
and some people can keep theirs over 2000, nearing 2100 (on a 14 nm process).

IMO that means they lose more dies in binning to get all of them to a 2.23 GHz top frequency (than they would at 2 GHz)... but if they are doing it, the loss must not be that much.
 

Goliathy

Banned
Then why not do the following:

1) Make the cooling solution a bit better
2) Make the box a tiny bit bigger
3) Lower those clocks by 3% and not have the issue at all?

Why are they so focused on that 3%, and why did they feel the need to make a thing of it?

How does this make any sense to anybody?

I don't know; there isn't an obvious reason. Maybe to close the gap to the Xbox Series X and hit the sweet double-digit TFLOPS. Maybe Cerny will give us more insight into why they did it.
 
Then why not do the following:

1) Make the cooling solution a bit better
2) Make the box a tiny bit bigger
3) Lower those clocks by 3% and not have the issue at all?

Why are they so focused on that 3%, and why did they feel the need to make a thing of it?

How does this make any sense to anybody?
He talked about it in Road to PS5: the PS4 Pro would get loud when playing God of War, and they don't want the PS5 to get loud; they want a constant noise level. In the unlikely event that GPU or CPU temperature rises, they will drop the frequency by 2-3% to reduce the power draw by 10% and keep the cooling system from getting loud.
 

Bernkastel

Ask me about my fanboy energy!
Thank you for posting an indie dev with an Xbox logo in her display picture who constantly talks about Xbox and xCloud, lol. And she is wrong, BTW.
Thank you for reminding me that NeoGAF posters know more about tech (especially of a console we have not seen, whose specs we were only told in an ASMR sermon) than actual game devs.
 

llien

Member
3) Lower those clocks by 3% and not have the issue at all?

10.28 => 9.97
You infidel!!!!

PS
Seriously, why not? Perhaps because they wanted to be "under 20% slower" than the Series X. Oh wait, it would still be less than a 20% difference.
 

Kenpachii

Member
I don't know, there is not a valid reason. Maybe to close the gap to the xbox series x and to hit the sweet double digit TFlops. Maybe cerny gives us more insights on why they did it.

And that's the funny thing: they couldn't go into more detail about it, yet they had time enough to come up with 300 SSD slides and insane amounts of information. They had the time; they just didn't want to mention it. It's pretty obvious.

Maybe Sony will make changes after the backlash they got over it.

He talked about it in Road to PS5: the PS4 Pro would get loud when playing God of War, and they don't want the PS5 to get loud; they want a constant noise level. In the unlikely event that GPU or CPU temperature rises, they will drop the frequency by 2-3% to reduce the power draw by 10% and keep the cooling system from getting loud.

Then get a better fan, or a bigger box, or better cooling, or don't push your chip past 2.2 GHz. The reason the PS4 Pro makes so much noise is that its fan is dog shit, most likely on a cooler that cost less than a buck. I have 12 fans in my PC and I still can't hear a thing. It's a poor man's excuse at best.
 

quest

Not Banned from OT
He talked about it in road to ps5 and how ps4 pro would get loud when playing god of war and they don't want ps5 to get loud. They want constant noise . In an unlikely event that tempreture rises for gpu or CPU they will drop the freq by 2% 3%to reduce the power draw by 10% and ensure the cooling system doesn't get loud and has consistent noise level
Do you have a link to the chart that shows this 2-3%, or are you guessing from generic terms? If you want constant noise, put good cooling in the box, not the cheap cooling Sony used last generation; it's not hard to do. Multiple third parties visited Microsoft, and the Series X was quiet after a full day's use, without downclocking and risking lost frames.
 

UnNamed

Banned
In simple terms (note: the numbers are invented, just for example):

Say the Ryzen APU consumes 200 W. According to their testing, 200 W means up to 80 degrees with their cooling.
In an ideal case we would have 50 W for the CPU and 150 W for the GPU; this is what past consoles did.
Since the PS5 prefers to boost the GPU over the CPU, we could instead divide it as 25 W for the CPU and 175 W for the GPU. 25 W for the CPU is low, and you might only reach 2.8 GHz with that power, but it's enough for some uses; 175 W at 2.23 GHz is reasonably high for an overclocked GPU.
Play an indie game with relatively simple logic, so little CPU use, and the GPU is boosted all the time.
Play Mortal Kombat 13, which uses the CPU more, and the GPU is boosted most of the time.
Play an open-world game with tons of graphics, NPCs, physics and stuff, or GT with 40 cars on track, physics, simulation and impressive graphics: forget your GPU boost.

Am I right?
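The invented numbers in the post above can be turned into a SmartShift-style sketch: a fixed total budget that shifts between CPU and GPU with demand. (All watt figures are hypothetical, as in the post, and the real system reportedly uses developer-chosen profiles rather than this simple proportional split.)

```python
# Hypothetical SmartShift-style split of a fixed power budget between
# CPU and GPU. All watt figures are invented, as in the post above.

TOTAL_BUDGET_W = 200.0
CPU_MAX_W = 50.0    # hypothetical full-tilt CPU draw
GPU_MAX_W = 175.0   # hypothetical full-tilt GPU draw

def split_budget(cpu_demand: float, gpu_demand: float) -> tuple[float, float]:
    """Return (cpu_watts, gpu_watts) for demands in 0..1.

    If both parts fit under the budget they run flat out; otherwise both
    are scaled back proportionally so the total never exceeds the budget.
    """
    cpu_want = CPU_MAX_W * cpu_demand
    gpu_want = GPU_MAX_W * gpu_demand
    total = cpu_want + gpu_want
    if total <= TOTAL_BUDGET_W:
        return cpu_want, gpu_want
    scale = TOTAL_BUDGET_W / total
    return cpu_want * scale, gpu_want * scale

for name, cpu, gpu in [("indie game", 0.3, 0.9),
                       ("fighting game", 0.6, 1.0),
                       ("open-world worst case", 1.0, 1.0)]:
    c, g = split_budget(cpu, gpu)
    print(f"{name}: CPU {c:.0f} W, GPU {g:.0f} W (total {c + g:.0f} W)")
```

A light CPU load leaves the GPU its full share; only when both parts push hard at once does anything get shaved back, which matches the scenarios in the post.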
 
Do you have a link to the chart that shows this 2-3%, or are you guessing from generic terms? If you want constant noise, put good cooling in the box, not the cheap cooling Sony used last generation; it's not hard to do. Multiple third parties visited Microsoft, and the Series X was quiet after a full day's use, without downclocking and risking lost frames.
Go watch the video, man; I gave a link and a timestamp. They said that this time they have engineered a specific cooling system, which they will tear down and show later on. He says all this in the Road to PS5 video: from a couple of percent reduction in clock frequency to a 10% reduction in power draw, it's all there.
 
In simple terms (note: the numbers are invented, just for example):

Say the Ryzen APU consumes 200 W. According to their testing, 200 W means up to 80 degrees with their cooling.
In an ideal case we would have 50 W for the CPU and 150 W for the GPU; this is what past consoles did.
Since the PS5 prefers to boost the GPU over the CPU, we could instead divide it as 25 W for the CPU and 175 W for the GPU. 25 W for the CPU is low, and you might only reach 2.8 GHz with that power, but it's enough for some uses; 175 W at 2.23 GHz is reasonably high for an overclocked GPU.
Play an indie game with relatively simple logic, so little CPU use, and the GPU is boosted all the time.
Play Mortal Kombat 13, which uses the CPU more, and the GPU is boosted most of the time.
Play an open-world game with tons of graphics, NPCs, physics and stuff, or GT with 40 cars on track, physics, simulation and impressive graphics: forget your GPU boost.

Am I right?
There is no GPU boost. The GPU operates most of the time at its 2.23 GHz clock. In the rare case where temperature is rising, it temporarily runs a couple of percent lower so the cooling system doesn't get loud. So there is no boost: 2.23 GHz is the main clock, and only in rare cases does it drop slightly for a short period to cut the power draw by 10%.
 

martino

Member
In simple terms (note: the numbers are invented, just for example):

Say the Ryzen APU consumes 200 W. According to their testing, 200 W means up to 80 degrees with their cooling.
In an ideal case we would have 50 W for the CPU and 150 W for the GPU; this is what past consoles did.
Since the PS5 prefers to boost the GPU over the CPU, we could instead divide it as 25 W for the CPU and 175 W for the GPU. 25 W for the CPU is low, and you might only reach 2.8 GHz with that power, but it's enough for some uses; 175 W at 2.23 GHz is reasonably high for an overclocked GPU.
Play an indie game with relatively simple logic, so little CPU use, and the GPU is boosted all the time.
Play Mortal Kombat 13, which uses the CPU more, and the GPU is boosted most of the time.
Play an open-world game with tons of graphics, NPCs, physics and stuff, or GT with 40 cars on track, physics, simulation and impressive graphics: forget your GPU boost.

Am I right?
We'll see; CPUs are in another category this time, and they will be doing a lot of things.
 
So according to this explanation and several others I've seen... when the PS5 is sitting on the main menu or playing a Blu-ray, it's going to pull the same power from the wall as when it's crunching God of War 2???

The thing would be cranking out 250 to 300 watts at all times. I highly doubt they would do this.
 
So according to this explanation and several others I've seen... when the PS5 is sitting on the main menu or playing a Blu-ray, it's going to pull the same power from the wall as when it's crunching God of War 2???

The thing would be cranking out 250 to 300 watts at all times. I highly doubt they would do this.
The power draw of the system is constant while playing games, not in the OS.
 

phil_t98

#SonyToo
So let's get this straight, using Assassin's Creed as an example: more NPCs means more CPU work, so the GPU takes a hit? And much better graphics means the CPU takes a hit?
 

hyperbertha

Member
Thank you for reminding me that NeoGAF posters know more about tech (especially of a console we have not seen, whose specs we were only told in an ASMR sermon) than actual game devs.
Neither do indie devs know more about the console than Cerny; two can play at that game. Besides, the quote you posted was a very simplistic point of view. I'm no expert, but RDNA 2 is said to run at much higher clocks, and Cerny has stated clearly that they've addressed the cooling. So in this case, the dev in your link is simply uninformed.
 

DunDunDunpachi

Patient MembeR
Nice writeup. I saw 10x more armchair analysis of what variable frequency really means than evaluation of what Cerny (and then outlets like DF) explained. Kinda sad that a bunch of folks who are quite obviously interested in hardware specs wrote it off so snidely, because the payoff of low heat + low fan noise at the occasional cost of (let's say) 50 MHz seems like a no-brainer.

Oh no, the clock drops from its usual 2.23 down to 2.18, my precious terafloooooops.
 

quest

Not Banned from OT
Go watch the video, man; I gave a link and a timestamp. They said that this time they have engineered a specific cooling system, which they will tear down and show later on. He says all this in the Road to PS5 video: from a couple of percent reduction in clock frequency to a 10% reduction in power draw, it's all there.
Exactly: he gave a general term, not a chart or hard numbers like he did for the SSD. We have hundreds of hard numbers on the SSD but zero on this, for a reason. "Minor", "a couple", "most of the time", versus real numbers on that badass SSD. Post the charts showing which loads cause downclocks.

There is no GPU boost. The GPU operates most of the time at its 2.23 GHz clock. In the rare case where temperature is rising, it temporarily runs a couple of percent lower so the cooling system doesn't get loud. So there is no boost: 2.23 GHz is the main clock, and only in rare cases does it drop slightly for a short period to cut the power draw by 10%.
Another generic term, "most of the time": is that 51%, 55%, 61%, 90%? We have no clue, since Sony hid the numbers. It's funny: we all questioned why Microsoft was hiding things in 2013, including me, but not now? Go search my 2013 posts; I was on Microsoft's case for the same thing.
 

DarkestHour

Banned
The OP is not correct. It hasn't been standard for a PC to run fixed frequencies in the longest time, and one won't work that way unless forced, which is an old-school way of overclocking. Both the CPU and the GPU run at variable frequencies.
 
Well, explain it more clearly then: exactly how are variable clock speeds hypothetically going to work?
There is no GPU boost. The GPU operates most of the time at its 2.23 GHz clock. In the rare case where temperature is rising, it temporarily runs a couple of percent lower so the cooling system doesn't get loud. So there is no boost: 2.23 GHz is the main clock, and only in rare cases does it drop slightly for a short period to cut the power draw by 10%.
 