
PlayStation 5’s Boost Clock Design Opens Up a Lot of Opportunities, Says Developer

And just for that second sentence you will probably be banned.

I love how we went from 2013 "TF are very important" to 2017 "TF don't matter at all" to 2020, where it was first "Who needs raw power? It's all about other stuff" and now people are arguing about 0.28 just to get the oh-so-important 10 before the point. You can say whatever you want about NeoGAF, but it's always entertaining here.

More ghastly straw men than you can shake a Tflop at.
 

Gavin Stevens

Formerly 'o'dium'
I'm well-versed in persecution complexes, yes. That said, PS4 has sold more than double the units of the Xbox One, so wouldn't there naturally be more Sony gamers on any forum that isn't catering specifically to the Xbox crowd?

Naturally, of course. But we aren’t talking 2-1 here, we are talking one box guy walks into a thread and can get pummelled by a LOT of people. But yeah of course, more fans would equal, well, more fans.
Read the comments above, developers do not have to "carefully evaluate" anything.

Not 100%. It's not as simple as it all being an automatic process, and you can still over-throttle if you don't take measures to make sure you don't. But it's in no way an automatic "you are using this, so it will downclock this". Nothing is automatic in this execution.

Edit: as for minimum speed, the minimum the console will drop to is, yes, that lovely figure people are so scared to say. However, the reality is that more than likely the console will achieve its max GPU clock the vast majority of the time, and the CPU will be the one to clock down. You won't see the console hitting those low-end figures, there isn't much reason for it to. I can see a few cases where they want the GPU to drop for higher CPU utilisation, but those will likely be nice-looking but complex indie-type games or side-scrolling games, physics-based Ori-type deals, you know? I can't see a reason for the console to ever drop to voldemort flops, it has no reason to.
 
Naturally, of course. But we aren’t talking 2-1 here, we are talking one box guy walks into a thread and can get pummelled by a LOT of people. But yeah of course, more fans would equal, well, more fans.


Not 100%. It's not as simple as it all being an automatic process, and you can still over-throttle if you don't take measures to make sure you don't. But it's in no way an automatic "you are using this, so it will downclock this". Nothing is automatic in this execution.

I really haven't seen it much. You did recc a specific thread to me but if it's contained to one thread... oh well? Like I temporarily had trouble with being ganged up on opinion-wise in one specific thread pertaining to one specific game... I stopped trying to fight everyone in that thread and perused the myriad other threads for the same game and had fewer problems. Each thread has its own flow to it and if you disrupt that flow, even if what you're saying is reasonable you can derail a thread. For the moderator the job is to make sure discussion remains healthy and doesn't get sidetracked, so the options are to give every single person who usually frequents a specific thread shit or to warn the single person they're all reacting to. For all I know they do both! Heck we just had a massive train wreck of a thread from a big Sony guy who claims Xbox fans skate for the same offenses he was temp banned for. I feel like I peruse this forum enough to know things don't feel as lopsided in either direction as either side makes out.
 

FranXico

Member
Not 100%. It's not as simple as it all being an automatic process, and you can still over-throttle if you don't take measures to make sure you don't. But it's in no way an automatic "you are using this, so it will downclock this". Nothing is automatic in this execution.
That's what the power profiles are for. Devs can use those to optimize for certain scenarios, but on retail boxes it is fully automatic. It's a feedback controller managing where the power budget is allocated (yes, oversimplifying, but that's the mental picture I use).
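Not Sony's actual algorithm, but here is a toy sketch of that mental picture, assuming a fixed total power budget and a simple proportional controller that shifts share between CPU and GPU based on which side is busier. Every number and helper name in it is made up for illustration:

```python
# Toy sketch of the "feedback controller" mental picture above.
# Not Sony's algorithm: the shares, clamps, and step size are invented.

GPU_MAX_MHZ = 2230   # PS5's stated GPU frequency cap
CPU_MAX_MHZ = 3500   # PS5's stated CPU frequency cap

def allocate(gpu_share: float, gpu_demand: float, cpu_demand: float) -> float:
    """Nudge the GPU's share of a fixed power budget toward whichever
    side is under more load (demands are 0..1 activity estimates)."""
    error = gpu_demand - cpu_demand        # >0 means the GPU wants more power
    gpu_share += 0.1 * error               # small proportional step
    return min(max(gpu_share, 0.3), 0.7)   # clamp so neither side starves

def clocks(gpu_share: float) -> tuple:
    """Map the budget split to (capped) clock speeds; the GPU is at full
    clock when its share is 0.7, the CPU when the GPU's share is 0.6."""
    gpu_mhz = min(GPU_MAX_MHZ, GPU_MAX_MHZ * gpu_share / 0.7)
    cpu_mhz = min(CPU_MAX_MHZ, CPU_MAX_MHZ * (1 - gpu_share) / 0.4)
    return round(gpu_mhz), round(cpu_mhz)

share = 0.7  # start GPU-heavy
for gpu_demand, cpu_demand in [(0.9, 0.3), (0.9, 0.9), (0.4, 0.95)]:
    share = allocate(share, gpu_demand, cpu_demand)
    print(clocks(share))
```

The point of the toy: the power budget is the fixed quantity, and the clocks simply fall out of wherever the controller currently puts the split.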
 

GymWolf

Member
Opportunities of wasting time balancing CPU and GPU all the time?

I think you are onto something here OP.
 

thelastword

Banned

No matter how many times it is said and debunked, you will hear the same thing in an upcoming thread, it's cyclical. Next thread you will hear about heat issues again....
No matter how many times people explain this it still sounds like the most retarded fanboy nonsense
So Mark Cerny and several multiplat devs are spewing fanboy nonsense, but you aren't? These are the guys saying people are trolling XBOX threads.....

Cerny also said that before they implemented SmartShift it was hard for the CPU and GPU to maintain 3 and 2GHz clocks respectively, so it does bring some valid concerns.
Don't remember that, can I get a quote? All I heard from Cerny was that PS5 could be clocked much higher, but they had to cap it at the current rates to offset instability issues....It would be great to see that quote where PS5 was limited in clocks prior to SmartShift. Must have missed it....
No. Why do people have so many problems with this.
Great writeup btw, but the people pretending they can't understand very clear and precise explanations of PS5's boost clocks are being willingly obtuse. They are not that hard-headed, just hard coated....The Series X and PS5 have different designs, it's set in stone at this point, no one is saying the Series X is not going to boot. What we've noticed though is that a bevy of devs are extremely excited by PS5's design choices. Doesn't mean Series X won't run the games, but if this helps a dev be more creative, the better.....It's just reassuring to see Cerny getting many pats on the back for his hard work. He didn't go the traditional route, he put lots of thought into the design to make "DEVS" primarily happy......In essence, if the devs are happy with the hardware, then gamers will be extremely happy with the devs' output......Check...Such genius thinking, from this man called Cerny...
Opportunities of wasting time balancing CPU and GPU all the time?

I think you are onto something here OP.
PS5 has a lower time to triangle than the PS1......I fail to see where devs are wasting time balancing GPU to CPU when this is done by the SmartShift algorithm. Perhaps devs can manually tweak it for more performance, based on their specific needs, but they surely won't be spending it in wasteful fashion. As it stands, the blazing SSD already saves them lots of time in the design process: no more making lots of meandering environments to mask load times, or elevator scenes, and adding more logic and AI and physics algos to traverse those. If there was ever a time when devs would not be wasting time, it's next gen on PS5. No need to pad a disk with the same assets either. So the boost clocks and SmartShift are really there to help devs minimize time wastage.....Remember, they can get more out of the 36 CUs faster than the alternative or the traditional method......
 
The problem is, however, that we don't know how far the PS5 GPU will downclock, and most likely we will never know (Digital Foundry has no tools to determine PS5 GPU clock). I hope the PS5 GPU will stay at 10TF even in the worst possible scenario, but anything below 2170MHz would already give sub-10TF numbers.
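For reference, the arithmetic behind that 2170MHz figure, using the standard shader-FLOPS formula (CUs × 64 shaders × 2 ops per clock × frequency):

```python
# Peak TFLOPS for a 36-CU GPU at a given clock:
# 36 CUs x 64 shaders/CU x 2 ops per clock x frequency
def tflops(clock_mhz: float, cus: int = 36) -> float:
    return cus * 64 * 2 * clock_mhz * 1e6 / 1e12

print(tflops(2230))  # ~10.28 TF -- the advertised peak
print(tflops(2170))  # ~10.00 TF -- roughly the break-even point for "10TF"
print(tflops(2000))  # ~9.22 TF -- the figure behind the "9.2TF" claims
```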

I really don't think it's going to matter whether we know what the TFs are while playing our games. As long as the games look and play great I doubt anyone is going to worry about TFs while playing them.

When people look at those analysis by Digital Foundry they don't care about Teraflops, they care about the actual visuals and performance. As long as those are good the console should be fine.
 

kraspkibble

Permabanned.
nah.

why on earth would you want your CPU/RAM/GPU to downclock when playing a game? i don't overclock my CPU to 5.1GHz for it to dip down to 5.0/4.9/4.8/4.7GHz! i don't overclock my GPU because i want it to drop down the clock/memory speeds when playing a game. shit why did i even bother overclocking my RAM?

you want it running as fast as possible when playing a game. there is no excuse for PS5 to downclock at all next gen unless the game can easily go over whatever the display is capable of. i mean, there is no benefit to running a game at 4k 120fps if your tv is only 60hz. there is no benefit at running at 5K 240fps if the TV is only capable of 4K 120hz!

Sony most likely downclocks because it reduces heat. That's also why they made the PS5 beefy as heck: they let the people who cried "BUT MUH PS4 SOUNDS LIKE A JET ENGINE!!11!!!!" get into their heads. Mark my words, they will regret it, and 3-4 years from now when someone posts a Digital Foundry video and cries that PS5 performs worse than XSX... well, you can blame Sony for half-assing it.
 

Lone Wolf

Member
No matter how many times it is said and debunked, you will hear the same thing in an upcoming thread, it's cyclical. Next thread you will hear about heat issues again....
So Mark Cerny and several multiplat devs are spewing fanboy nonsense, but you aren't? These are the guys saying people are trolling XBOX threads.....

Don't remember that, can I get a quote? All I heard from Cerny was that PS5 could be clocked much higher, but they had to cap it at the current rates to offset instability issues....It would be great to see that quote where PS5 was limited in clocks prior to SmartShift. Must have missed it....

Great writeup btw, but the people pretending they can't understand very clear and precise explanations of PS5's boost clocks are being willingly obtuse. They are not that hard-headed, just hard coated....The Series X and PS5 have different designs, it's set in stone at this point, no one is saying the Series X is not going to boot. What we've noticed though is that a bevy of devs are extremely excited by PS5's design choices. Doesn't mean Series X won't run the games, but if this helps a dev be more creative, the better.....It's just reassuring to see Cerny getting many pats on the back for his hard work. He didn't go the traditional route, he put lots of thought into the design to make "DEVS" primarily happy......In essence, if the devs are happy with the hardware, then gamers will be extremely happy with the devs' output......Check...Such genius thinking, from this man called Cerny...

PS5 has a lower time to triangle than the PS1......I fail to see where devs are wasting time balancing GPU to CPU when this is done by the SmartShift algorithm. Perhaps devs can manually tweak it for more performance, based on their specific needs, but they surely won't be spending it in wasteful fashion. As it stands, the blazing SSD already saves them lots of time in the design process: no more making lots of meandering environments to mask load times, or elevator scenes, and adding more logic and AI and physics algos to traverse those. If there was ever a time when devs would not be wasting time, it's next gen on PS5. No need to pad a disk with the same assets either. So the boost clocks and SmartShift are really there to help devs minimize time wastage.....Remember, they can get more out of the 36 CUs faster than the alternative or the traditional method......
Road to PS5 video, 37 minutes in.
 
But I also think PS5 is 9.2+ TFLOPS. How big of a flop difference will it be to XSX? Noticeable in a few places probably. That's about it.

The Series X has extra performance available, so a developer can use that overhead to put in a little bit more GPU-specific rendering demand like more particles, details, etc... Kinda like a game running on PC on a 2070/S vs a 2080/Ti.
You will see more native 4K 60FPS games running on XSX than PS5, which would have to resort to resolution scaling, meaning it would then run at 1800p to get the exact same graphical look of a game and hit 60FPS, whereas the Series X can just do it at native 4K and hit its limit right at 60FPS.
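For rough context on that 1800p-versus-native-4K argument, the pixel-count arithmetic (illustrative only; real performance never scales perfectly linearly with pixels or TFLOPS):

```python
# Pixel counts behind the resolution-scaling argument above.
native_4k = 3840 * 2160   # 8,294,400 pixels
p1800 = 3200 * 1800       # 5,760,000 pixels

print(native_4k / p1800)  # ~1.44x more pixels at native 4K
print(12.15 / 10.28)      # ~1.18x peak-compute ratio, XSX vs PS5
```

The gap between those two ratios is why the real-world resolution difference will depend on where each game is actually bottlenecked.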
 

StormCell

Member
The Series X has extra performance available, so a developer can use that overhead to put in a little bit more GPU-specific rendering demand like more particles, details, etc... Kinda like a game running on PC on a 2070/S vs a 2080/Ti.
You will see more native 4K 60FPS games running on XSX than PS5, which would have to resort to resolution scaling, meaning it would then run at 1800p to get the exact same graphical look of a game and hit 60FPS, whereas the Series X can just do it at native 4K and hit its limit right at 60FPS.

And that's why I like the Series X a little more.
 

Hendrick's

If only my penis was as big as my GamerScore!
It's simple. Boost is better than no boost if you need more power, but not better than just always having more power and not needing to boost.
 

IntentionalPun

Ask me about my wife's perfect butthole
Why is no one pointing out this is clearly a quote from a dev who misunderstood the Cerny talk and likely has never used a PS5 devkit?
 

IntentionalPun

Ask me about my wife's perfect butthole
Precisely. Boost mode makes sense if you're going to be releasing games that push the hardware to its limits. Personally, I like to imagine that a game like Cities: Skylines would take full advantage of this by beefing up the CPU every once in a while for micro simulations. Still, the bottom line is the system is designed this way, and that design is prepared for games that ask a little more.

I think it makes sense. But I also think PS5 is 9.2+ TFLOPS. How big of a flop difference will it be to XSX? Noticeable in a few places probably. That's about it.
A game can't choose to beef up the CPU when it wants.
 

Psykodad

Banned
Do you get banned if you say XSX is 11 TF? It all seems a bit silly just for saying 9.2 TF.
Think it has to do with VFXveteran who got banned for claiming PS5 will run at 9Tf mostly, without backing it up with proof when asked by the mods.
His word carries more weight somehow since he claims to have inside sources.

After that, it's concern trolling, since claiming it is 9TF mostly contradicts Cerny, who says 10.28TF mostly.

Edit:

Think that's the reason.
No idea, I don't say it, so I don't get banned over it.
 

thelastword

Banned
Timestamped:


That has nothing to do with SmartShift, but instead with issues from the traditional fixed-frequency method employed in current consoles, or the Series X for that matter. SmartShift is just an added layer available to devs to shift power from CPU to GPU. Devs can choose to use it or not.
 

Deto

Banned
we'll see if history repeats:

MS before: "hurrr durrr true 4K, PS4 Pro 4K fake"
MS now: checkerboarding > all, uses it in Gears 5, an intelligent and superior engineering solution.

Now: "hurrr durrr true clock, PS5 9TF clock fake"
MS tomorrow: "consumption-based clock is a superior engineering solution, let's use it on the Xbox SXXXXXX platinum gold extreme edition"
 

Lone Wolf

Member
That has nothing to do with SmartShift, but instead with issues from the traditional fixed-frequency method employed in current consoles, or the Series X for that matter. SmartShift is just an added layer available to devs to shift power from CPU to GPU. Devs can choose to use it or not.
It has everything to do with Smartshift, Cerny states it right at the time stamp. Why are you ignoring that?
 

longdi

Banned
Timestamped:



Thanks for linking, replaying the clip.

Mark Cerny said "they need to cap at 2.23GHz to ensure on-chip logic operates properly." Does it sound like that's the max they can push (overclock) the GPU before instability kicks in?

Mark Cerny also said "they expect the GPU to spend most of its time at or close to 2.23GHz." Does it sound like sometimes it can't even sustain close to their marketed performance? As someone mentioned, even a 3-5% drop in frequency brings its performance below 10TF, where a 5% loss ≈ a 9.77TF machine 🤷‍♀️
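The percentage arithmetic behind that point, for what it's worth (TFLOPS scale linearly with clock, so an X% frequency drop is the same X% compute drop):

```python
# An X% clock drop maps one-to-one onto an X% drop off the 10.28 TF peak.
PEAK_TF = 10.28
for drop in (0.03, 0.05, 0.10):
    print(f"{drop:.0%} clock drop -> {PEAK_TF * (1 - drop):.2f} TF")
# 3% -> 9.97 TF, 5% -> 9.77 TF, 10% -> 9.25 TF
```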
 

Jigsaah

Gold Member
Anybody who says a tech option or design isn't debatable is FOS. By design everything can have its pros and cons; in some instances, if you do a comparison, a design or product can excel in all areas tested, A vs B. The debate is in the pudding, aka the results. It's one thing to try and convince folk that a specific design is superior when your results do not reflect that at all.....

I think out the gate so far, PS5 has shown us, yet again, not one, not just Unreal, but a bevy of games that are realtime and look the next gen part. Its design uptick which the devs are talking about is being proven true, not only through their words, but through what we have seen. If early UE code can look so good, if early Guerrilla games can look so good, then such a new paradigm for devs is bringing an age or evolution to how games are made...


Boost mode worked well on the Pro, games used the extra power. I think it was a great feature. Boost clocks, on the other hand, are great too; they allow more flexibility for devs. You realize, if a dev needs more CPU power than GPU, the scales can balance toward what he needs most. I think you guys need to bring better arguments here. Please expound on one scenario where a dev making an indie game idealizes a console staying at peak clocks when he is already hitting the target?


You have not seen spellbinding visuals, but you believe the visuals shown on PS5 were as you expected. So this means you were "really not impressed with PS5 visuals", "your expectations were low on PS5 visuals", and then you expound that PS5 games don't look dramatically better than PS4 games outside of resolution. I'd better bookmark this post to see what games impress you down the line.....Seems like you are just the "curb your enthusiasm for PS5 visuals" police.


As for reality: PS5 games did impress millions, watch the numbers per video; some people thought the Unreal demo was actually "Unreal". I and many others thought Kena was bonkers. Some people swore by Pragmata; just a few days back a poster told me the Demon's Souls remake was the best thing he saw in that reveal week. So many different types of people are impressed by different visuals and presentations. Ratchet could not be done on PS4. Then there are other ways to be impressed, like in physics, AI and animation, which we have not seen much of yet, or world design. Even next gen draw distance and the quality of visuals maintained way into the distance, which Ghost of Tsushima already gives us a first look at on a paltry mechanical drive....

What you have seen is already impressive to millions upon millions, and the truth is we know even better visuals will be shown come launch day and beyond. The heavy hitters like Santa Monica and Naughty Dog have not shown anything yet, Kojima has not shown anything, nor has the rumored Silent Hill. We know GT7 will knock socks off when it launches, and tbh, these are expectations even if the hardware were a bit lopsided like PS3, but this gen, devs are singing the praises of the hardware design in the PS5 like we have not seen in ages.......Truly you ask yourself, if so many devs are singing PS5's praise, then it must be gravy. We have not even seen the new gen Killzone yet, which lit a fire in 2013 and still stands as a stalwart of visuals to this day.... So it's not only about the hardware, it's about the talent, but the fact that the hardware is so balanced and good.....with these devs in tow?...."Oh Boyyyy"

 
So ps5 is basically 9.2tf?
I love the 9.2 TERAFLOPS cry.

9.2 TERAFLOPS! Ahaha Sony you are shit, go home! You suck!

Oh and btw, all next gen Xbox games will also have to run on 4TF hardware! But THAT DOESN'T MATTER! Sony 9.2 TERAFLOPS ahahah!
I wonder what the minimum speed of the GPU is. I suppose it's around 9.2TF as per the GitHub leaks a while back.

Guys please stop doing this to yourselves. You know that kind of stuff isn't allowed unless you can prove it.

 

ZywyPL

Banned
Mark Cerny said "they need to cap at 2.23GHz to ensure on-chip logic operates properly." Does it sound like that's the max they can push (overclock) the GPU before instability kicks in?

Yup, that's how it works.


Mark Cerny also said "they expect the GPU to spend most of its time at or close to 2.23GHz." Does it sound like sometimes it can't even sustain close to their marketed performance? As someone mentioned, even a 3-5% drop in frequency brings its performance below 10TF, where a 5% loss ≈ a 9.77TF machine 🤷‍♀️

10.3, 9.7, it's still ±10, who cares.


You know that kind of stuff isn't allowed unless you can prove it.

Actually, it's very easy to prove, because we have Cerny's own words to back it up, the PS5 lead architect. We know from Cerny himself that the clocks aren't fixed but variable, and capped at 2230MHz, but neither he nor Sony ever said what the minimum clock is, so based on that we can assume that the PS5 GPU operates at anything between 1-2230MHz.

And actually, as per the so-called "race to idle", again explained by Cerny himself, the GPU tries to minimize the clock/thermals depending on the workload, so technically there is a scenario where the GPU will be operating at 2000MHz, a.k.a. the infamous 9.2TF. Because TFlops are only theoretical, right? More than that, the clock can actually be even lower and generate just 3-4TF in scenarios like when you're looking at an empty skybox or a wall, because there's nothing to render other than the main character and a flat texture.

Also bear in mind that games on consoles, despite having the framerate locked at 30/60, still operate with some headroom (that's how they achieve a steady framerate), which can easily be seen in all the games with an uncapped framerate mode. So if the engine is forced to render 30 frames while all the 10.3TF would allow it to render 35-40FPS, the GPU will most likely tune down its clocks, as all that power is not really needed for 30, and as per SmartShift it might give all that saved power to the CPU, if needed of course. There you go.
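A toy sketch of that last paragraph's logic, assuming the "race to idle" reading the post describes: a frame-capped game that finishes each frame early gets its clock trimmed, and clocks back up when the scene gets heavy. Purely illustrative, with made-up numbers; not how PS5 firmware is documented to behave:

```python
# Toy model of the "headroom" argument: a 30fps-capped game where
# finishing a frame early lets the power manager trim the clock.
# Illustrative only -- not actual PS5 firmware behavior.

FRAME_BUDGET_MS = 33.34   # 30fps frame-time budget
MAX_CLOCK_MHZ = 2230

def render_ms(scene_cost_ms: float, clock_mhz: float) -> float:
    """Pretend render time scales inversely with clock speed."""
    return scene_cost_ms * MAX_CLOCK_MHZ / clock_mhz

clock = MAX_CLOCK_MHZ
for cost in [20.0, 20.0, 31.0, 33.0]:  # per-frame GPU cost at max clock, in ms
    t = render_ms(cost, clock)
    headroom = FRAME_BUDGET_MS - t
    if headroom > 2.0:
        clock = max(2000, clock - 50)           # plenty of slack: save power
    elif headroom < 0.5:
        clock = min(MAX_CLOCK_MHZ, clock + 50)  # tight: clock back up
    print(f"cost={cost}ms clock={clock}MHz frame={t:.1f}ms")
```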

 

Resenge

Member
Guys please stop doing this to yourselves. You know that kind of stuff isn't allowed unless you can prove it.

Actually, it's very easy to prove, because we have Cerny's own words to back it up, the PS5 lead architect. We know from Cerny himself that the clocks aren't fixed but variable, and capped at 2230MHz, but neither he nor Sony ever said what the minimum clock is, so based on that we can assume
 

ZywyPL

Banned


Well, it can't operate at 0MHz, that's for sure, that would mean it's turned off. So 1MHz is the absolute lower limit, unless confirmed otherwise. But please read the next sentence, because taking stuff out of context is so easy, dare I say low.
 


Yeah, it still isn't proof of anything to be honest. You can assume that due to the clocks being variable the console can drop all the way down to 1TF.

What the moderators want is actual proof that it isn't a 10.28TF system. It should be easy to prove if developers release a benchmark and the system doesn't hit 10.28TFs most of the time.

However nobody has come up with that kind of proof yet.


If you can provide the confirmation that the console is in fact 9TF (by which we mean runs the majority of the time at 9TF as per the insinuation) and not 10.28TF as detailed in the release materials, then we can roll back now. Until then the onus is on captainraincoat to provide that evidence via the appeal address. Or alternatively justify what the purpose of dropping that statement was. It should be relatively easy to clear up.

The bait comment was in response to baiting other posters (probably PlayStation owners) to come in and go through the same cyclical argument of 9TF, it's not 9TF, etc. We see no reason this is not the case here. It did not need to be mentioned to make the underlying point. But we are open to being convinced otherwise.


All we are asking for is proof that it drops to 9.2TF and doesn't spend the majority of the time at around 10.28TF. Nobody has been able to provide us with proof of that yet.
 

Resenge

Member
Well, it can't operate at 0MHz, that's for sure, that would mean it's turned off. So 1MHz is the absolute lower limit, unless confirmed otherwise. But please read the next sentence, because taking stuff out of context is so easy, dare I say low.
It's more of a comment on everybody console warring on here right now. Until you get the console or until you get real proof you are assuming. You could be right for all I know but you do not know either.
 

ZywyPL

Banned
It's more of a comment on everybody console warring on here right now. Until you get the console or until you get real proof you are assuming. You could be right for all I know but you do not know either.

Just saying. If someone wants to claim PS5 is a 9TF machine, he can use Cerny's words to back it up, and it's up to the mods what they'd do with such justification. I personally don't care as much about the TFlops as I do about the fan speed/noise; the PS4 Pro was a mistake and it should never happen again, ever.
 

Resenge

Member
Just saying. If someone wants to claim PS5 is a 9TF machine, he can use Cerny's words to back it up, and it's up to the mods what they'd do with such justification. I personally don't care as much about the TFlops as I do about the fan speed/noise; the PS4 Pro was a mistake and it should never happen again, ever.
Cerny's words were 10.28 TF, anything else is assuming until you have proof.
 

ZywyPL

Banned
Cerny's words were 10.28 TF, anything else is assuming until you have proof.

10.28TF (variable), a subtle difference, but a difference nevertheless.


Sure, if someone is spamming this shit, it's obviously dumb. We'll probably never find out what the base clock is anyway, so it doesn't matter.

I don't think even the devs know how much power their games/scenes are using. The most important metric for them is the frame time (<16.67 or <33.34ms), and that's what is being monitored by the devkits, as seen for example on the X1X devkit and its built-in LCD. They don't care how much power they use, but whether the scene fits within that set frame-time budget; if it doesn't, they optimize. Because they cannot add extra power to the consoles obviously, so monitoring that would be totally worthless. So while XBX power is a given, I truly believe that PS5 clocks will forever remain a mystery.
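That frame-time framing, sketched out (the captured numbers are hypothetical; the point is that devs profile against a time budget, not a wattage readout):

```python
# Devs optimize against frame-time budgets, not power draw:
# 60fps -> ~16.67ms per frame, 30fps -> ~33.33ms per frame.
BUDGET_60_MS = 1000 / 60

captured_frames_ms = [14.2, 15.9, 17.8, 16.1]  # hypothetical profiler capture

for i, ms in enumerate(captured_frames_ms):
    verdict = "OK" if ms <= BUDGET_60_MS else "over budget -> optimize"
    print(f"frame {i}: {ms:.2f}ms ({verdict})")
```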
 

M1chl

Currently Gif and Meme Champion
10.28TF (variable), a subtle difference, but a difference nevertheless.




I don't think even the devs know how much power their games/scenes are using. The most important metric for them is the frame time (<16.67 or <33.34ms), and that's what is being monitored by the devkits, as seen for example on the X1X devkit and its built-in LCD. They don't care how much power they use, but whether the scene fits within that set frame-time budget; if it doesn't, they optimize. Because they cannot add extra power to the consoles obviously, so monitoring that would be totally worthless.
I am not really sure why a console manufacturer cares about peak power output. Unless it's some regulation from the FCC.... And the need for carbon fan blades. Any aviators in here? No.
 

ZywyPL

Banned
I am not really sure why a console manufacturer cares about peak power output. Unless it's some regulation from the FCC.... And the need for carbon fan blades. Any aviators in here? No.

PR/marketing. Believe it or not, but having "the world's most powerful console" is a title to behold. It's easy, simple and effective, while the 2nd place holder will have to figure out some fancy slogans that will lure the crowd towards his platform.
 
Cerny's words were 10.28 TF, anything else is assuming until you have proof.

Same goes for any claims about Xbox. It's ok to speculate but the truth always comes first.

For example, we can speculate that not all games will be BC on Xbox, but if Microsoft says all of them, then that's the truth until proven wrong.
 

M1chl

Currently Gif and Meme Champion
PR/marketing. Believe it or not, but having "the world's most powerful console" is a title to behold. It's easy, simple and effective, while the 2nd place holder will have to figure out some fancy slogans that will lure the crowd towards his platform.
No, I meant why the clocks are not at 10+ TF at all times, you know. I guess mainly heat and power drain.
 