
Explanation of the misconception about PS5 Variable Frequency (GPU/CPU)

Sota4077

Member
Jesus christ there has got to be some astroturfing going on from Sony at this point. These posts downplaying power and inflating specifically Sony's SSD are coming out every 20 damn minutes for the last week. It is like after Microsoft shit the bed for their Xbox One launch and then we spent the next month getting constant clarifications to everything that came out. Suddenly Sony are having every developer under their umbrella come out and defend them while literally every other developer seems to say it is awesome that SSD's are now in consoles but it doesn't do anything beyond loading games and assets faster.
 
Sota4077 said: Jesus christ there has got to be some astroturfing going on from Sony at this point. These posts downplaying power and inflating specifically Sony's SSD are coming out every 20 damn minutes for the last week. […]
Dude, what the hell does anything you said have to do with this thread? This is an explanation of the cooling system in the ps5. Wtf 🤣
 

DunDunDunpachi

Patient MembeR
Sota4077 said: Jesus christ there has got to be some astroturfing going on from Sony at this point. These posts downplaying power and inflating specifically Sony's SSD are coming out every 20 damn minutes for the last week. […]
Turnabout is fair play.

Or do you disapprove of the two-dozen bait threads about how salty the Sony fanboys are that immediately followed the conference?

POLL: How salty are the saltybois?

O 10.2 TF salty
O SSD speed salty
O A balance of sodium and chloride
☉ Himalayan salt lamp salty
 
We know this, OP. It's nothing but overzealous Xbox fanboys who keep pushing the narrative that the PS5 won't run constantly at 2.23GHz and will only downclock if the workload requires it, which is rarely.
 

SlimySnake

Flashless at the Golden Globes
I dont know man. I feel like Cerny just made that shit up just to get over 10 tflops.

this is probably a 9.2 tflops gpu that will run at that frequency most of the time. if what cerny is saying is true then it will force devs to literally play through the entire game and code on a second by second basis since they now have to look at activity being sent to the cpu and gpu to make sure it never goes above a certain temp.

no one is gonna do that. not even first party devs. they will just run this thing downclocked at all times to ensure they never hit that worst case scenario. imagine you are running this physics and destruction heavy ps5 game at 1800p. maybe its horizon. now imagine you are running away from enemies and then all of a sudden several enemies start chasing you while other herds join in. all of a sudden, you have overloaded the cpu. it needs those a.i routines to run at 3.5 ghz, but at the same time the gpu needs to also render all these new enemies. boom, their power control system throttles both the cpu and gpu because the activity is so high. what now? does it impact the a.i? does it impact the framerate? destruction? resolution?

best case scenario, its just resolution that takes a hit but devs will have to code around it. its going to be coded on a case by case basis. its a shitty system and no one is going to use it.
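For reference, the 9.2 and 10.28 TF numbers being thrown around both fall out of the standard shader-math formula for an RDNA-class GPU. A quick sketch (the clock values are just the publicly discussed figures, nothing insider):

```python
# Peak FP32 throughput for an RDNA-class GPU:
# TFLOPS = CUs * 64 shaders/CU * 2 FLOPs per shader per clock * clock (GHz) / 1000
def tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000

print(tflops(36, 2.23))  # PS5's 36 CUs at the 2.23 GHz cap -> ~10.28 TF
print(tflops(36, 2.0))   # the same 36 CUs at 2.0 GHz -> ~9.22 TF
```

So the "9.2 TF" claim is just the same 36-CU part assumed to settle around 2.0 GHz instead of its cap.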
 

DunDunDunpachi

Patient MembeR
SlimySnake said: I dont know man. I feel like Cerny just made that shit up just to get over 10 tflops. […]
Why would a company design a variable clock speed controller, a power system, and a tailored cooling system just to get over an arbitrary number that 95% of consumers won't even see?
 

M1chl

Currently Gif and Meme Champion
PS5's reveal can't come any sooner. Comparing PS5's SSD to ESRAM is asinine 🙄
Any comparison with ESRAM this gen and next gen is pure stupidity. There isn't anything even remotely close to that, just because it's some memory pool, does not make it similar.
 

DunDunDunpachi

Patient MembeR
surely you arent serious.
Then explain it to me. Sony appears to have been figuring out this spinning-plates design for a while. We saw glimmers of it all the way back in early 2019, long before Microsoft revealed their system specs.

Show me your evidence that they rushed the variable frequency, the cooling system, and the underlying philosophy of "constant power" in response to an arbitrary number. Because that is the implication here.

So far this comment about Sony pushing things just to get over 10 teraflops (which you aren't the first to parrot) reads like FUD from a company that very much wants people to care about teraflops, a metric that has only recently gained notoriety among console players. Hmmm...
 

SlimySnake

Flashless at the Golden Globes
DunDunDunpachi said: Then explain it to me. Show me your evidence that they rushed the variable frequency, the cooling system, and the underlying philosophy of "constant power" in response to an arbitrary number. […]
evidence? i already listed the workflow in my original post. go back and read it. they do it based on activity on the gpu and cpu. the example i listed shows that its going to require an insane amount of playtesting to ensure those A.I routines, destruction and physics dont fail when the system throttles itself. imagine a.i just stopping because the cpu was being throttled. how do you code for that on a scene by scene basis? you dont. you set a limit on both the cpu and gpu and run the game at that. its why consoles have always had fixed frequency. they knew they had access to 1.6 ghz of cpu speed and coded around that without worrying about going over the power budget.

you are moving goalposts by ignoring all of that. you are pretending as if MS and Sony havent spent the last 7 years training everyone that tflops matter. its marketing 101. you want to avoid bad PR. that video was trending #1 for days and got 13 million views in like two days. gamers set trends, its why xbox one was so mercilessly crucified by the 5% of the gaming population that did see the conference. the rest of the drones just follow what we do. sony knows this more than anyone. they used the #NoPS4DRM movement on twitter and set aside literally five minutes to take shots at the xbox one's policy, capitalizing on the anger of a small but vocal fraction of the gaming landscape. they know what wouldve happened if they had released a 9 tflops gpu.
 

Ascend

Member
So a 2-3% decrease in frequency (that's about 40-70 MHz in the PS5's case) can deliver at least a 10% decrease in power consumption.
That line alone tells me everything. They clocked the GPU way beyond the optimal point on the efficiency curve. Generally, that is not something you want to do for a power-efficient system like a laptop or a console. It makes things suspect, and is evidence for Sony bumping up speeds at the last minute. It sounds like they were trying hard to advertise a value above 10 TFLOPS.
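Whatever you make of the motive, the quoted "couple percent of frequency for ~10% of power" claim is at least plausible from first principles: dynamic power scales roughly with f·V², and voltage has to rise with frequency near the top of the curve, so power grows close to cubically there. A rough sketch of that rule of thumb (the cubic model is a textbook simplification, not PS5 silicon data):

```python
# Rough DVFS rule of thumb: dynamic power P ∝ f * V^2, and V scales
# roughly linearly with f near the top of the curve, so P ∝ f^3.
# This is a simplification; real silicon curves differ per chip.
def relative_power(freq_ratio: float) -> float:
    return freq_ratio ** 3

drop_3pct = 1 - relative_power(0.97)
print(f"{drop_3pct:.1%}")  # a 3% clock cut -> ~8.7% less power
```

That is in the same ballpark as Cerny's "10% for a couple percent" figure, which is exactly what you'd expect if the chip sits near the steep end of its voltage/frequency curve.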
 

DunDunDunpachi

Patient MembeR
SlimySnake said: evidence? i already listed the workflow in my original post. go back and read it. […]
You have the wrong fanboy, friend, and I'm not going to run in circles to talk you out of your standpoint.

Substantiate this harebrained notion that Sony rushed their clock speeds -- and the underlying architecture and hardware to facilitate it -- in order to compete with Microsoft.

Hard mode: try not to mention GitHub leaks.

I'll wait.
 

mejin

Member
Thanks for this post.

Cerny and some media outlets were clear on how this works, and the latter were even clearer in explaining the concept. However, I think the problem is that people don't want to understand.

oh, people do understand, but xbox fans stayed in their holes for more than a decade. they deserve to go crazy now.
 

SleepDoctor

Banned
Haha don't be upset. You should tell that to the others posting 100 threads a day about how bad the ps5 is, creating fake news from wccftech like our resident xbox "fan" sonomamashine


Ive said it in most threads. I don't care for damage control or slandering either. You guys should make an ot and circle jerk each other there.

You guys can make 100 threads and you still ain't changing anybody's mind on what to buy.
 

SlimySnake

Flashless at the Golden Globes
DunDunDunpachi said: You have the wrong fanboy, friend. Substantiate this harebrained notion that Sony rushed their clock speeds. Hard mode: try not to mention GitHub leaks. I'll wait. […]
haha. two posts in a row and you have continued to ignore my post on how the activity based throttling would require far too much work on the devs side to get it to work at max clocks.

didnt expect any less.

you do you and continue to believe everything cerny says. fyi, he also said that his 36 cu gpu will be more powerful than a 48 cu gpu because of higher clocks, and that 3 ghz cpu was such a headache they had to come up with this variable power feature to hit over 3 ghz. meanwhile at MS, they ran into no such issues.
 
If the CPU can run at 3.2ghz with the full GPU clock, I'll be happy with that. Someone predicted it would drop to 2.7ghz under full GPU load; that seems way too low.
 

DunDunDunpachi

Patient MembeR
SlimySnake said: haha. two posts in a row and you have continued to ignore my post on how the activity based throttling would require far too much work on the devs side to get it to work at max clocks. […]
Okay, if you suggest that I listen to someone other than Cerny, go right ahead.

I'm asking you to substantiate what you claimed, and all I get is handwaving with a snide comment about how I shouldn't believe Cerny.

This is GameFAQs circa 2007 Aaron Greenberg garbo. You'll have to try harder.
 
Digital Foundry has spoken with multiple developers making games for the PS5. There is no need for a wall of text from armchair analysts.

Confirmed by DF that the variable clocks use a fixed target (profiles) that the developer chooses from. If you run Max GPU then the CPU is down-clocked and vice versa. This isn't rocket science.
Damn, I got more out of this statement than the wall of text.
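If DF's description of developer-selected profiles is accurate, the mechanism is conceptually just a fixed power budget shared between CPU and GPU: push one side's clock up and the other side comes down to stay under the cap. A toy model of that idea (all wattage numbers and the linear watts-per-GHz split are made up purely for illustration, not real PS5 figures):

```python
# Toy model of a shared power budget: boosting the GPU clock leaves
# less of the budget for the CPU, so its clock drops, and vice versa.
# All numbers are hypothetical; real hardware uses per-chip
# voltage/activity tables, not a linear split like this.
BUDGET_W = 200.0

def cpu_clock_ghz(gpu_clock_ghz: float) -> float:
    gpu_watts = gpu_clock_ghz * 60.0   # pretend the GPU costs 60 W per GHz
    cpu_watts = BUDGET_W - gpu_watts   # the CPU gets whatever is left over
    return min(3.5, cpu_watts / 20.0)  # pretend 20 W per GHz, capped at 3.5 GHz

print(cpu_clock_ghz(2.23))  # GPU maxed -> CPU settles below its 3.5 GHz cap
print(cpu_clock_ghz(2.0))   # GPU backed off -> CPU can sit at its cap
```

The point of the sketch is only that the tradeoff is deterministic for a given workload, which is why Cerny could claim identical consoles behave identically.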
 

DaMonsta

Member
The difference is that one is fixed and the other is variable. The underlying physics dictating the heat-generation of the electrical input remains the same, I would assume.
Again all that is understood.

What I'm saying is: if we are only talking about a ~2% variance in clock speed, why not just lock it lower and save the power and the heat?

Will ~2% result in any meaningful difference in performance?
 

yurinka

Member
Please think logically and not fall for Cerny BS.

Rdna1 to Rdna2 still uses 7nm.
If 5700xt can only do 2.1ghz at insane voltage with custom water cooling, it is a tall order to think 6700 can do 2.23ghz 98% sustained with air cooling.

The 5700xt is a full CU chip while 6700 will be the cut down 36CU version.
Cut down versions are failed full versions that cannot enable all CUs.
Also the PS5's 6700 is welded into an APU, which means it also has the zen2 chiplets, the i/o controllers, the pcie controllers, the sound chip. All of this to be cooled by some 'patented' air cooling.

It is not even hating Sony here, just simple unbiased logic Mark Sony has used sneaky words that dont represent truths, but half truths. :lollipop_poop:
If I have to choose between official specs and the words of the console architect explaining its architecture to the devs versus a user forum with an avatar of its direct competition, I'm not sure who is biased here. xD

I think that in addition to being integrated into the APU, the GPU (and several related non-GPU pieces) features many customizations: extra custom hardware for specific work, a different I/O focus, and several tweaks, changes and optimizations everywhere, to the point that they are even using a new and unique frequency/power/cooling management system.

I think all these things change a lot about how the GPU works and how the overall system will perform compared to a discrete PC GPU with more or less similar headline specs, so I don't think it's fair to treat it as just a specific GPU welded into the APU. Especially when the PC GPU you mention is RDNA 1 instead of RDNA 2, which is what the next-gen consoles will have.
 

DunDunDunpachi

Patient MembeR
DaMonsta said: Again all that is understood. What I'm saying is: if we are only talking about a ~2% variance in clock speed, why not just lock it lower and save the power and the heat? Will ~2% result in any meaningful difference in performance?
Great question. Seems like Sony wanted to control heat from a power-usage perspective instead of from a temperature-reading perspective, so that is why they went that route. Or at least that is what they are claiming, based on that snooze-fest of a presentation.

Let me reframe the question and return it: if we are only talking about a 2% variance in clock speed, why not push the clock as high as your cooling system can handle under that power budget with a failsafe that keeps everything in budget for the sake of heat? Same fundamental question as yours.
 

DaMonsta

Member
DunDunDunpachi said: Great question. Seems like Sony wanted to control heat from a power-usage perspective instead of from a temperature-reading perspective. […]
I assume that’s why they capped it where they did.
 

quest

Not Banned from OT
DaMonsta said: Again all that is understood. What I'm saying is: if we are only talking about a ~2% variance in clock speed, why not just lock it lower and save the power and the heat? Will ~2% result in any meaningful difference in performance?
well if my aunt had testicles she be my uncle. There are 0 hard numbers for the clocks but 100 for that SSD and compression system. Ask yourself why that is. There is no 'if', there are only the facts that Sony is clearly hiding. If it was minor we would see the facts. Go read about 2013, this is similar to some of the Microsoft cover up. Astroturfing, PR, deflection until the media got the truth. A simple chart to match the SSD one is not much to ask. Exact low clocks, the loads that cause issues etc. Transparency.
 
well if my aunt had testicles she be my uncle.

[reaction GIF]
 

DunDunDunpachi

Patient MembeR
quest said: well if my aunt had testicles she be my uncle. There is 0 hard numbers but 100 for that SSD and compression system ask yourself why that is. […]
You know what else comes from 2013?

Sony Too™

Don't use Microsoft's bad behavior as some kind of evidence or precedent for what Sony is supposedly doing right now.
 

JägerSeNNA

Banned
Something is weird here. If the frequency of the GPU does not fall dramatically, only a few percent (I guess he means 3-5% in the worst situation), then why do you need to openly explain it in front of the public?

Honesty? very bad PR?
 
so lets get this straight, using assassins creed as an example. more NPC's which is more compute power then the GPU takes a hit? then much better Graphics then the CPU takes a hit?
Is it common on console games for both the GPU and CPU to be using 100%?

I don’t know about consoles but I can’t think of a single game on my PC that does that. I tune settings to use as much of my GPU as it can, but not CPU
 

vpance

Member
DaMonsta said: Again all that is understood. What I'm saying is: if we are only talking about a ~2% variance in clock speed, why not just lock it lower and save the power and the heat? Will ~2% result in any meaningful difference in performance?

It's probably going to be more like around 5%, but it's not going to happen often. So in other words, they gained 5% more performance plus quieter and cooler operation, in exchange for giving up top-end power in these rare situations.

The alternative is designing a system that is always 5% weaker all the time, or one that runs louder and hotter whenever a demanding game fully saturates the APU (PS4 Pro style, random ramping of heat and fan). In other words, Sony chose to tame the spikes in power.

Edit: one more alternative. Build an even bigger box and add more cooling to maintain 10.28TF all the time, while keeping the same peak heat and noise levels. Maybe that wasn't worth the cost to them.

If PS5 is notably smaller and cheaper than XSX, then I think we can more confidently say there was some calculated logic to their planning.
 

TGO

Hype Train conductor. Works harder than it steams.
It's amazing people actually think Mark Cerny would sit in front of an audience who can call bullshit the moment it leaves his mouth.
It was a developers conference.
There is zero PR. He is not trying to sell anything, and he can't hide anything.
These fuckers have a Dev Kit.
It is a lecture on the architecture of the PS5.
 
SleepDoctor said: Haha don't be upset. You should tell that to others posting 100 threads a day how bad ps5 is by creating fake news from wccftech like our resident xbox "fan" sonomamashine […]
you're reaching mate, this is a conspiracy theory. can't wait to read your take on the moon landing & 9/11. take a break, you are way too invested in this. go kiss your wife or walk your dog if you have either.
 

Goliathy

Banned
Ascend said: That line alone tells me everything. They clocked the GPU way beyond the optimal point on the efficiency curve. […]

from a psychological standpoint, 9 (single digit) sounds WAY less than 10 (double digit). It would make sense from that perspective.
That's also why we see prices like $499 and not $500.
 

quest

Not Banned from OT
JägerSeNNA said: Something is weird here. If the frequency of the GPU does not fall dramatically, why do you need to openly explain it in front of the public? Honesty? very bad PR? […]
If I'm paying 500 dollars, damn straight. Just like Microsoft had to explain the crappy esram and the always-online DRM. Any performance loss like that needs to be explained, period, not just the good stuff. Microsoft is not hiding their weaker SSD and slower clocks; they are putting hard numbers out there. Transparency.
 

phil_t98

#SonyToo
Is it common on console games for both the GPU and CPU to be at 100%? […]
it sounds like, the way it's designed, if one is at 100% the other isn't, due to the shared thermal/power budget
 

longdi

Banned
Agnostic2020


This is my take on variable frequency.

PS5 and Series X will use the same underlying AMD design. Sony just chose to push their 36 CUs to the limit, to make up the marketing numbers.

Hence the term 'variable' gets appended.

No more BS, no more half truths. All is clear for team green vision yo!
 

longdi

Banned
I expect PS5 to be at least $100 cheaper than X-box series x :(

You better hope Sony is not so far gone with their reputation that they think gamers will pay the same msrp as Series X.

Will you buy the best iPad Pro or a creaky Galaxy Tab? This is how things stand atm. MS engineering has been better this next gen.
 

psorcerer

Banned
We've seen this same spin before, about how high frequency is good.

One difference though. MSFT lied.
Pixel fillrate was lower, texel fillrate was lower, vertex rate was lower.
RAM speed was horribly lower.
Etc. etc.
Nevertheless a lot of games were pretty close by image quality.
 