
Cerny: Devs Don’t Have to Optimize in Any Way for PS5’s Variable Clocks, It’s All Automatic

Great post, but don't forget, 360 had a 54% defect rate; people keep forgetting that. Had it not been for those high defect rates, the 360 would have sold the same ~40 million consoles the Xbox One has. There really isn't an 80 million install base for Xbox once you discount people rebuying their consoles at such an alarming rate. Lots of Xbox fans came into this gen thinking the Xbox One would sell 80 million easily, but never factored in RROD.
This is probably why MS figured they could charge so much for a console: if people are willing to re-purchase our product 5 times during the generation, we can sell a console at 500+ dollars... It's an insane deal, if it doesn't break.
 

Dory16

Banned
I have never based my console purchases on power. Do I wish PS5 was 20 TF? Of course, but it comes down to the games...

Do you prefer Xbox 1st party games/studios? Get an XSX or a gaming PC

Do you prefer Sony 1st party games/ studios? Get a PS5

For me it will be Gaming PC and PS5.

We all know what games we love and which console will match our wants. No amount of TFLOPS is going to change that...
MS understood that and I think their studio buying spree has already achieved at least one goal. Microsoft's first party output this gen is a question mark, whereas any other gen it was easy to call it for Sony long in advance. They have a lot of capable studios now and with the most powerful hardware and large budgets, there is at least room for doubt that Sony will dominate both in quantity and quality of first party titles.
Not only that, but Game Pass changes a lot of the picture. I never want to leave a game that I enjoy to go start another one, but GP is literally like Netflix. Like, I'm halfway through A Plague Tale, but oh look, some new free DLC for Gears (which was also free). Or The Surge 2 just arrived. And Bleeding Edge. And 3 Yakuza games I never played.
There will be a lot more good games available for a lot less money, playable from more places, with Xbox (let's not forget that the consoles will stream games to mobiles).
 
yeah, this looks like a simpler and more elegant solution tbh.

sony fanboy: can you imagine ND and Santa Monica with 10+ TF after the games they created with a miserable 1.8TF machine? 5 times the power yo!

me: yeah, but can you imagine ND and Santa Monica with 12+ TF under their asses? 6 times the power bromigo :messenger_sunglasses:

sony fanboy: BZZZZ CAN'T COMPUTE THIS INFORMATION-BZZZZZ- WE CARE ABOUT GAMES- BZZZZZZ

me: :messenger_expressionless:
That makes the difference look pretty small... But you can't argue with the numbers, only with how relevant you think they are to the final product compared to its competition.

Let's say I think they'll end up close enough in performance, after having seen a couple of third party games compared. All we are left with as purchase decision points are relative prices and the games library (the exclusives). Sony tends to deliver on the games front, MS is still unproven (that's a nice way to put it), unless you've joined the Game Pass cult, in which case all you see is the number of games, which is lower than the number of games on PS Now.
 

GymWolf

Gold Member
That makes the difference look pretty small... But you can't argue with the numbers, only with how relevant you think they are to the final product compared to its competition.

Let's say I think they'll end up close enough in performance, after having seen a couple of third party games compared. All we are left with as purchase decision points are relative prices and the games library (the exclusives). Sony tends to deliver on the games front, MS is still unproven (that's a nice way to put it), unless you've joined the Game Pass cult, in which case all you see is the number of games, which is lower than the number of games on PS Now.
my point is, if it's OK to talk and get excited about PS5 being 4x a PS4, why can't you be even more excited when PS5 is 5x a PS4? it doesn't make sense.
people like ND or Guerrilla, who squeeze the console at 101%, can make even more spectacular games with 2 more TF, there is absolutely no question about that, and TF are not only used for resolution...

and believe me, in 4-5 years, when all games are gonna use 100% of the console to maintain 4K, a solid framerate, great graphics and maybe some ray tracing here and there, 2 more TF are gonna be useful as fuck.

more power is ALWAYS a good thing, not only for third party games but especially for exclusives, which always try to push the graphics bar further.

that's why it sounds stupid to me when people get excited about 10TF but lie to themselves saying that 12TF is useless because reasons...

if, absurdly, PS5 was a 15TF machine, the SSD people would be dancing naked in the street (with me as a front runner)
 
Last edited:
Not only that, but Game Pass changes a lot of the picture. I never want to leave a game that I enjoy to go start another one, but GP is literally like Netflix. Like, I'm halfway through A Plague Tale, but oh look, some new free DLC for Gears (which was also free). Or The Surge 2 just arrived. And Bleeding Edge. And 3 Yakuza games I never played.
I find that way of thinking pretty bad, didn't you like A Plague Tale? And aren't those Yakuza games cheap by now?

If you buy games (even digitally) and only play AAA games, Game Pass costs you about 3 full games at release-day prices a year... After 5 years that's 15 games that you get to keep, and assuming you get a certain number of games at a discount it could easily be 20-25 games, and as time goes on the equation only moves in favor of ownership.

Now you can say: but I have 300+ games on Game Pass... The problem is that unless you have the attention span of a clownfish you will end up playing some games for longer than a few hours, going back and replaying the ones you really like, etc.
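To put rough numbers on that argument (a sketch only; the $12.50/month and $60 prices below are assumptions, actual prices vary by region and tier):

# Back-of-envelope version of the math above; the prices are assumptions, not official
gp_monthly = 12.50                        # assumed Game Pass cost per month, USD
aaa_price = 60.00                         # assumed release-day price of a AAA game

yearly_gp = gp_monthly * 12               # ~$150/year on the subscription
print(yearly_gp / aaa_price)              # ~2.5 full-priced games a year
print(yearly_gp / aaa_price * 5)          # ~12.5 owned games over 5 years
print(yearly_gp / (aaa_price * 0.5) * 5)  # ~25 games if you buy at ~50% off instead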
 

Dory16

Banned
I find that way of thinking pretty bad, didn't you like A Plague Tale? And aren't those Yakuza games cheap by now?

If you buy games (even digitally) and only play AAA games, Game Pass costs you about 3 full games at release-day prices a year... After 5 years that's 15 games that you get to keep, and assuming you get a certain number of games at a discount it could easily be 20-25 games, and as time goes on the equation only moves in favor of ownership.

Now you can say: but I have 300+ games on Game Pass... The problem is that unless you have the attention span of a clownfish you will end up playing some games for longer than a few hours, going back and replaying the ones you really like, etc.
Not sure how this became a trial of my behaviour on the subscription service that I pay for. Way to miss my point, buddy. All I'm saying is that GP has so much value that it will be a factor when people decide which next gen console to get (especially if it comes bundled with the XSX for a few months).
And if you think that a better investment than playing and trying a library of thousands of games (plus every future first party title on day one) is buying the ones you like outright because you get to "keep" them, more power to you.
 
Last edited:

Tripolygon

Banned
The 1st answer is that 2.2GHz is way past the sweet spot of 1.9GHz for this architecture, meaning you're creating a lot more heat but only making minimal gains in performance.
The 2nd answer is that, as DF already proved, more CUs are more performant than fewer CUs even if the TFs are equal.
Uhh what? Where did you get this number from?
 
Last edited:
that's why it sounds stupid to me when people get excited about 10TF but lie to themselves saying that 12TF is useless because reasons...
The problem is that a 20% difference is not that big, so everything will be scalable within reason... Sure, TFs are not only used for resolution, but if you cut resolution during heavy action (not by much) you free up compute power to calculate those physics interactions, expensive shaders, etc., assuming you needed more than 10TF in the first place.

It really is better to have more compute power than less, no question about that... The question is: given how much difference there is, will it matter much in the end?
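The trade being described is just pixel arithmetic (the 90% scale factor below is illustrative, not from any actual game):

# Pixel arithmetic behind the resolution-for-compute trade (percentages illustrative)
native = 3840 * 2160                         # 8,294,400 pixels at native 4K
scaled = int(3840 * 0.9) * int(2160 * 0.9)   # ~90% scale per axis during heavy action
print(1 - scaled / native)                   # ~0.19: roughly 19% of per-pixel work freed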
 
The problem is that a 20% difference is not that big, so everything will be scalable within reason... Sure TFs are not only used for resolution, but if you cut resolution during heavy action (not much) you free the compute power to calculation these physics interactions, extensive shaders, etc. assuming you needed more than 10TF in the first place.

It really is better to have more compute power than less, no questions about that... The question is given how much difference there is, will it matter much in the end?
Of course it will, especially as it relates to ray tracing. RT absolutely tanks the raster capability of the GPU even with RT cores. Having a surplus of compute available means Microsoft's system could push more RT at a scene than Sony's, to a level that would cripple the PlayStation 5 while still allowing the Series X to competently render the scene without consequence.
 

GymWolf

Gold Member
The problem is that a 20% difference is not that big, so everything will be scalable within reason... Sure, TFs are not only used for resolution, but if you cut resolution during heavy action (not by much) you free up compute power to calculate those physics interactions, expensive shaders, etc., assuming you needed more than 10TF in the first place.

It really is better to have more compute power than less, no question about that... The question is: given how much difference there is, will it matter much in the end?
dude, these consoles are already "old" compared to a max spec PC; even 1 teraflop matters in the long run.

why do you think Cerny is trying to squeeze power from a 9TF GPU to get to 10TF with an exotic boost clock? just because he likes to pay more for a better cooling solution? c'mon...

9TF is already capable of melting brains at 4K30, so why does Cerny need 10+ TF? because more power is more future proof.
 
Last edited:
There's a list of games on PS4 that look better than those. Gears 5 is a good game and it looks good

TLOU Part 2
Ghost of Tsushima
Uncharted 4
The Order 1886
Detroit
Death Stranding

All look better

GoT isn’t out yet, none of us have played it
Take Detroit off the list

Otherwise I agree with you
 

Tripolygon

Banned
dude, these consoles are already "old" compared to a max spec PC; even 1 teraflop matters in the long run.
These consoles are using the latest AMD CPU and GPU architectures, with forward-looking technologies that will help developers create fantastic looking games.

Two years ago, everyone was expecting an 8-12TF GCN GPU, a 2.6GHz Zen 1 CPU, regular hard drives and 12-16GB of RAM.

We are getting a 10-12TF RDNA 2 GPU with ray tracing, a 3.5-3.6GHz 8-core/16-thread Zen 2 CPU, an 800GB-1TB super fast NVMe SSD and 16GB of GDDR6 RAM.

This is what 1.8TF produced this gen
[screenshots: The Order: 1886 and other PS4 games]

why do you think Cerny is trying to squeeze power from a 9TF GPU to get to 10TF with an exotic boost clock? just because he likes to pay more for a better cooling solution? c'mon...
What does this even mean? What is a 9TF GPU? How do you define what a 1TF GPU is?
9TF is already capable of melting brains at 4K30, so why does Cerny need 10+ TF? because more power is more future proof.
Or hear me out for a second: maybe 10TF was their goal? Just like 12TF was Microsoft's goal?
Remember when Mark Cerny said you'd need about 8TF minimum to ensure native 4K for current gen games with no compromises?
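For reference, the TF figure being argued over is just shader count times clock; plugging in the official CU counts and clocks gives the familiar numbers:

# How the TF numbers everyone argues about are computed: FP32 TFLOPS =
# CUs x 64 shaders per CU x 2 ops per clock (FMA) x clock in GHz / 1000
def tflops(cus, ghz):
    return cus * 64 * 2 * ghz / 1000

print(tflops(36, 2.23))    # PS5: 36 CUs at up to 2.23 GHz -> ~10.28 TF
print(tflops(52, 1.825))   # XSX: 52 CUs at a fixed 1.825 GHz -> ~12.15 TF
print(tflops(36, 2.0))     # the "9TF" framing assumes PS5 sits near 2.0 GHz -> ~9.2 TF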
 
Last edited:

GymWolf

Gold Member
These consoles are using the latest AMD CPU and GPU architectures, with forward-looking technologies that will help developers create fantastic looking games.

Two years ago, everyone was expecting an 8-12TF GCN GPU, a 2.6GHz Zen 1 CPU, regular hard drives and 12-16GB of RAM.

We are getting a 10-12TF RDNA 2 GPU with ray tracing, a 3.5-3.6GHz 8-core/16-thread Zen 2 CPU, an 800GB-1TB super fast NVMe SSD and 16GB of GDDR6 RAM.

This is what 1.8TF produced this gen
[screenshots: The Order: 1886 and other PS4 games]


What does this even mean? What is a 9TF GPU? How do you define what a 1TF GPU is?

Or hear me out for a second: maybe 10TF was their goal? Just like 12TF was Microsoft's goal?
Remember when Mark Cerny said you'd need about 8TF minimum to ensure native 4K for current gen games with no compromises?
These consoles are already old compared to the new GPUs that are gonna come out in a couple of months (before the consoles) in terms of raw power and probably ray tracing technology and other minor things (Nvidia is usually the leader for this shit on PC).
even a 2080 Ti is better than both consoles.

I know how good the games are gonna be with 9-10 teraflops, I defended this concept in another topic, no need to post pictures of games I have already finished.

My point was: why does PS5 use this contrived boost clock to achieve higher frequencies that raise the power consumption and heat of the console, if TF are useless and 1 or 2 more aren't needed to make a difference? Just put out a console with 9TF at 400 dollars and easily win the gen before it starts.
People in here act like the jump from 9 to 10 is more significant than 10 to 12.

notice this: people get mad when you tell 'em that PS5 is a 9TF machine when it's actually a 10TF machine (so just one more TF), but it's not that important when the difference between PS5 and XSX is double that number... if that sounds right to you, I don't know what to say...
 
Last edited:

Dory16

Banned
GoT isn’t out yet, none of us have played it
Take Detroit off the list

Otherwise I agree with you
There's a list of games on PS4 that look better than those. Gears 5 is a good game and it looks good

TLOU Part 2
Ghost of Tsushima
Uncharted 4
The Order 1886
Detroit
Death Stranding

All look better




The problem with the world today is that facts no longer matter and I trace it back to the moment a famous president started freely using the term alternative facts. You don't know anything about the performance or quality of at least 2 games on that list. Other than Death Stranding, the other ones are not up to par.
 

LordOfChaos

Member
The 1st answer is that 2.2GHz is way past the sweet spot of 1.9GHz for this architecture, meaning you're creating a lot more heat but only making minimal gains in performance.
The 2nd answer is that, as DF already proved, more CUs are more performant than fewer CUs even if the TFs are equal.

We have no examples of the RDNA 2.0 architecture out yet, let alone any bespoke alterations.

It's past the sweet spot for RDNA 1.0. Things change, maybe this one did or maybe it didn't.
 

Hobbygaming

has been asked to post in 'Grounded' mode.


The problem with the world today is that facts no longer matter and I trace it back to the moment a famous president started freely using the term alternative facts. You don't know anything about the performance or quality of at least 2 games on that list. Other than Death Stranding, the other ones are not up to par.
The Last of Us Part II and Ghost already look better, and Gears 5 being 60 FPS doesn't make it look better; it has better performance.

Those others I listed also look better and don't perform as well, but are still mostly stable at 30 FPS.
 
Last edited:

Dory16

Banned
The Last of Us Part II and Ghost already look better, and Gears 5 being 60 FPS doesn't make it look better; it has better performance.

Those others I listed also look better and don't perform as well, but are still mostly stable at 30 FPS.
This is a technical thread but I of course respect all opinions, no matter how subjective.
 

darkinstinct

...lacks reading comprehension.
The only way it can work if it's automatic is if there is an upper ceiling, like 9.2 TF, for devs, and the boost clocks are only used to stabilize frame rate at whatever their target is. You can't develop a game for a 10.28 TF console that can automatically scale down; it's just not possible because of a myriad of things that can break if the dev isn't accounting for them. If you design it around the limit, it will run abysmally whenever the console throttles. The messaging from Sony feels strangely familiar; it's similar to how Microsoft said they would have some sort of game sharing but nobody had figured it out exactly and the public messaging around it was awful. That's what Cerny is trying now, to manage a shitstorm. Devs weren't happy with the idea of two SKUs for Xbox, now imagine how they feel about a PS5 that can automatically change its performance targets during runtime. Sony should do the smart thing and just use locked specs. The boost isn't going to win them anything, but it might cost them reliability.
 

SleepDoctor

Banned
These consoles are already old compared to the new GPUs that are gonna come out in a couple of months (before the consoles) in terms of raw power and probably ray tracing technology and other minor things (Nvidia is usually the leader for this shit on PC).

I know how good the games are gonna be with 9-10 teraflops, I defended this concept in another topic, no need to post pictures of games I have already finished.

My point was: why does PS5 use this contrived boost clock to achieve higher frequencies that raise the power consumption and heat of the console, if TF are useless and 1 or 2 more aren't needed to make a difference? Just put out a console with 9TF at 400 dollars and easily win the gen before it starts.
People in here act like the jump from 9 to 10 is more significant than 10 to 12.

notice this: people get mad when you tell 'em that PS5 is a 9TF machine when it's actually a 10TF machine (so just one more TF), but it's not that important when the difference between PS5 and XSX is double that number... if that sounds right to you, I don't know what to say...


As has been said before, if tflops don't matter, why did they overclock the GPU to squeeze out another tflop?

The games will be good regardless. It's just a matter of these warriors trying to save face after shitposting everywhere ever since XSX was revealed to be 12 TF.

My most anticipated game this year is GoT. But all these Cerny interviews feel more like damage control than anything else. It's reminiscent of the Xbox One damage control lol.
 

FranXico

Member
The only way it can work if it's automatic is if there is an upper ceiling, like 9.2 TF, for devs, and the boost clocks are only used to stabilize frame rate at whatever their target is. You can't develop a game for a 10.28 TF console that can automatically scale down; it's just not possible because of a myriad of things that can break if the dev isn't accounting for them. If you design it around the limit, it will run abysmally whenever the console throttles. The messaging from Sony feels strangely familiar; it's similar to how Microsoft said they would have some sort of game sharing but nobody had figured it out exactly and the public messaging around it was awful. That's what Cerny is trying now, to manage a shitstorm. Devs weren't happy with the idea of two SKUs for Xbox, now imagine how they feel about a PS5 that can automatically change its performance targets during runtime. Sony should do the smart thing and just use locked specs. The boost isn't going to win them anything, but it might cost them reliability.
Devs can use profiles (prioritize CPU or GPU always) to optimize their engine. The frequency is not intended to be their concern.
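A toy model of the mechanism being described (purely illustrative; the formulas, coefficients and budget are invented, not Sony's): clocks are derived from a fixed power budget and the workload, never from temperature, so behavior is deterministic across units:

# Toy model of deterministic power-budget clocking; every number and formula here
# is invented for illustration -- this is not Sony's actual algorithm.
def watts(ghz, activity, coeff):
    # dynamic power scales roughly with frequency x voltage^2, and voltage rises
    # with frequency, so approximate power as activity * coeff * frequency^3
    return activity * coeff * ghz ** 3

def pick_gpu_clock(cpu_act, gpu_act, budget=200.0):
    cpu_ghz, gpu_ghz = 3.5, 2.23   # PS5's stated caps; the coefficients below are made up
    while watts(cpu_ghz, cpu_act, 2.0) + watts(gpu_ghz, gpu_act, 12.0) > budget:
        gpu_ghz *= 0.98            # shaving ~2% off the clock cuts ~6% of GPU power here
    return round(gpu_ghz, 3)

# Same workload in -> same clocks out, on every unit, at any room temperature.
print(pick_gpu_clock(cpu_act=0.5, gpu_act=1.0))  # GPU-heavy scene: 2.23 (no drop)
print(pick_gpu_clock(cpu_act=1.0, gpu_act=1.0))  # worst case, both maxed: ~2.1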
 

Genx3

Member
Uhh what? Where did you get this number from?
I used RDNA 1 figures since that's the only RDNA architecture available right now.

We have no examples of the RDNA 2.0 architecture out yet, let alone any bespoke alterations.

It's past the sweet spot for RDNA 1.0. Things change, maybe this one did or maybe it didn't.

RDNA 2 is RDNA-based, so there could be a slight improvement, but I doubt it's a huge improvement.
 

Clear

CliffyB's Cock Holster
The 1st answer is that 2.2GHz is way past the sweet spot of 1.9GHz for this architecture, meaning you're creating a lot more heat but only making minimal gains in performance.
The 2nd answer is that, as DF already proved, more CUs are more performant than fewer CUs even if the TFs are equal.

DF's "tests" proved nothing. Its painfully simple-minded and unscientific to switch discrete GPU's in an otherwise fixed system architecture and conclude that the function and performance of everything outside of that changed part has zero impact on any overall gains/losses observed.

Its funny because the One X is a great example of how much you can gain from overclocking both GPU and memory.
 

SleepDoctor

Banned
DF's "tests" proved nothing. Its painfully simple-minded and unscientific to switch discrete GPU's in an otherwise fixed system architecture and conclude that the function and performance of everything outside of that changed part has zero impact on any overall gains/losses observed.

Its funny because the One X is a great example of how much you can gain from overclocking both GPU and memory.


I didn't know 1.172GHz on the Xbox One X GPU was considered overclocked 🤔🤔🤔
 

Genx3

Member
DF's "tests" proved nothing. Its painfully simple-minded and unscientific to switch discrete GPU's in an otherwise fixed system architecture and conclude that the function and performance of everything outside of that changed part has zero impact on any overall gains/losses observed.

Its funny because the One X is a great example of how much you can gain from overclocking both GPU and memory.

Yup, scientific data proves nothing...
 

Hobbygaming

has been asked to post in 'Grounded' mode.
As has been said before, if tflops don't matter, why did they overclock the GPU to squeeze out another tflop?

The games will be good regardless. It's just a matter of these warriors trying to save face after shitposting everywhere ever since XSX was revealed to be 12 TF.

My most anticipated game this year is GoT. But all these Cerny interviews feel more like damage control than anything else. It's reminiscent of the Xbox One damage control lol.
Who said they overclocked it just to get another tflop?? Mark Cerny wanted to get to a certain frequency for what they wanted to achieve with the console.

They could've gone with more CUs from the start, but they chose not to.
 

Clear

CliffyB's Cock Holster
The only way it can work if it's automatic is if there is an upper ceiling, like 9.2 TF, for devs, and the boost clocks are only used to stabilize frame rate at whatever their target is. You can't develop a game for a 10.28 TF console that can automatically scale down; it's just not possible because of a myriad of things that can break if the dev isn't accounting for them. If you design it around the limit, it will run abysmally whenever the console throttles. The messaging from Sony feels strangely familiar; it's similar to how Microsoft said they would have some sort of game sharing but nobody had figured it out exactly and the public messaging around it was awful. That's what Cerny is trying now, to manage a shitstorm. Devs weren't happy with the idea of two SKUs for Xbox, now imagine how they feel about a PS5 that can automatically change its performance targets during runtime. Sony should do the smart thing and just use locked specs. The boost isn't going to win them anything, but it might cost them reliability.

No one develops a game specifically rated for a 10.28 TF part! It doesn't work like that.

You do what you can within your frame-time and memory budget; in fact memory (RAM utilization specifically) tends to be the only absolutely immutable aspect, because everything else can be massaged into shape.

The whole point of it being autonomous is that every unit performs consistently irrespective of ambient temperature. So if your dev build is running stably even though it's requiring >9.2TF of GPU resources, it's going to perform identically on retail hardware, regardless of whether the console is running in Alaska or Arizona.
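For anyone unfamiliar with the budgets being referenced, the arithmetic is simple (the 2% downclock figure below is an assumption based on Cerny's "couple of percent" remark, not a measured value):

# The frame-time budgets devs actually work against (plain arithmetic)
for fps in (30, 60, 120):
    print(f"{fps} fps -> {1000 / fps:.2f} ms per frame")

# A ~2% GPU downclock stretches GPU-bound work by ~1/0.98, i.e. about 2%:
print(16.67 / 0.98)  # ~17.01 ms -- the kind of margin a frame budget already absorbs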
 

Clear

CliffyB's Cock Holster
Yup, scientific data proves nothing...

Data generated from a flawed experimental model is worthless. And that's exactly what their sloppy methodology is.

Let's not forget back when they'd run face-offs between vsync'd and non-vsync'd builds, despite the outcome being obviously a fait accompli!
 

GymWolf

Gold Member
Data generated from a flawed experimental model is worthless. And that's exactly what their sloppy methodology is.

Let's not forget back when they'd run face-offs between vsync'd and non-vsync'd builds, despite the outcome being obviously a fait accompli!
not to come to DF's defense, but everyone can make mistakes; the loud and hot as fuck PS4/PS4 Pro, or the PS3 YLOD problem, proved that Cerny (or Sony in general) can make mistakes too (and on a muuuuch bigger scale than some YouTube nerd blog)

funny how people get shit for doubting Cerny the expert, but it's okay to devalue DF's expertise, when both of them have made mistakes in the past...
and believe me, I have ZERO interest in defending DF, I just don't care about them at all.
 
Last edited:

Tripolygon

Banned
These consoles are already old compared to the new GPUs that are gonna come out in a couple of months (before the consoles) in terms of raw power and probably ray tracing technology and other minor things (Nvidia is usually the leader for this shit on PC).
even a 2080 Ti is better than both consoles.
That is a no-brainer. Consoles have not been ahead of the curve since the original Xbox was released; you can argue that perhaps the Xbox 360 GPU was a gen ahead when it released, but it was surpassed within months. Consoles remain the same for the whole generation while PC parts keep changing every year. This time the consoles are releasing with tech that is currently not prevalent in games: ray tracing, VRS, VRR, machine learning are all just starting to be used, and the consoles support them out of the gate.
My point was: why does PS5 use this contrived boost clock to achieve higher frequencies that raise the power consumption and heat of the console, if TF are useless and 1 or 2 more aren't needed to make a difference? Just put out a console with 9TF at 400 dollars and easily win the gen before it starts.
People in here act like the jump from 9 to 10 is more significant than 10 to 12.
Why is it contrived? Literally everyone uses a similar variation of this type of design: mobile processors from MediaTek, Qualcomm and Samsung, desktop parts from AMD, Nvidia and Intel. Consoles have usually followed a different design, but now Sony has gone the other route with their take on it. Contrived? I think not.
notice this: people get mad when you tell 'em that PS5 is a 9TF machine when it's actually a 10TF machine (so just one more TF), but it's not that important when the difference between PS5 and XSX is double that number... if that sounds right to you, I don't know what to say...
The better question is: notice how people are hell-bent on making the PS5 into a 9TF console when the designers have stated its specifications. Why is that? Why are people trying to make the difference between the consoles bigger than it is?
 

Tripolygon

Banned
I used RDNA 1 figures since that's the only RDNA architecture available right now.



RDNA 2 is RDNA-based, so there could be a slight improvement, but I doubt it's a huge improvement.
There have been great performance improvements within the same architecture: GCN 1 to GCN 5, Vega 10 to Vega 20. Below is AMD's projection for RDNA 2 compared to RDNA 1: 50% better performance per watt.
[AMD roadmap slide: RDNA 2 / Radeon RX "Navi 2X" GPUs]
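If the slide's projection holds, the arithmetic is straightforward (the 180 W budget below is hypothetical, and the 50% figure is AMD's own marketing projection, not a measured result):

# What a +50% performance-per-watt claim buys, if it holds (arithmetic only)
budget_watts = 180.0                      # hypothetical console GPU power budget
rdna1_perf = budget_watts * 1.0           # normalize RDNA 1 to 1.0 perf per watt
rdna2_perf = budget_watts * 1.5           # 1.5x the performance at the same power
print(rdna2_perf / rdna1_perf)            # 1.5
print(budget_watts / 1.5)                 # or the same performance at 120 W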
 

GymWolf

Gold Member
That is a no-brainer. Consoles have not been ahead of the curve since the original Xbox was released; you can argue that perhaps the Xbox 360 GPU was a gen ahead when it released, but it was surpassed within months. Consoles remain the same for the whole generation while PC parts keep changing every year. This time the consoles are releasing with tech that is currently not prevalent in games: ray tracing, VRS, VRR, machine learning are all just starting to be used, and the consoles support them out of the gate.

Why is it contrived? Literally everyone uses a similar variation of this type of design: mobile processors from MediaTek, Qualcomm and Samsung, desktop parts from AMD, Nvidia and Intel. Consoles have usually followed a different design, but now Sony has gone the other route with their take on it. Contrived? I think not.

The better question is: notice how people are hell-bent on making the PS5 into a 9TF console when the designers have stated its specifications. Why is that? Why are people trying to make the difference between the consoles bigger than it is?
when I said that these consoles are already old, it was not some gotcha moment on my part trying to win a conversation, I was stating factual stuff: the 3000 series from Nvidia is gonna shit on these consoles from a relatively high place even before the consoles come out, that was the whole meaning of the word "old".

listen, I'm pretty noob with this technical stuff, but I've seen both noobs and experts, from DF to NX to insiders on Twitter to people who know about this stuff on this forum and others, being really confused by all this boost clock thing; we had a technical presentation from Cerny and various overviews from DF etc., and we still don't know how this thing is gonna perform in real games, especially with an eye on the future, when scenes that use both CPU and GPU at 100% will probably occur.
my "contrived" was more of a general sentiment; it surely is something more contrived than the fixed clocks in XSX, we can at least agree on that, can't we?

I don't know what the better question is, my point was always "more power is better" and any excuse to say otherwise is stupid; also, if 9 to 10 is a noticeable upgrade worth doing something "unusual" for (if "contrived" sounds too bad to you), then the jump from 10 to 12 is even more noticeable, and it's funny when people try to devalue that.

(sorry for my english)
 
Last edited:

Panajev2001a

GAF's Pleasant Genius
Cerny also said devs will need to optimize specifically for the PS5 to get the most out of the "boost clock" or whatever you want to call it.

Optimise for a console to get best results (with a baseline easy to reach)?!


I tend to trust Cerny when he says they worked hard on making the console powerful yet very easy, not a puzzle, to program for. He said the same about PS4 (PS5 being even easier to get the intended performance out of, aka his time-to-triangle metric) and look at how it turned out:
[image]
 
Last edited:

DaMonsta

Member
Optimise for a console to get best results (with a baseline easy to reach)?!


I tend to trust Cerny when he says they worked hard on making the console powerful yet very easy, not a puzzle, to program for. He said the same about PS4 (PS5 being even easier to get the intended performance out of, aka his time-to-triangle metric) and look at how it turned out:
[image]
I’m just saying he has comments on both sides of this topic.

On one hand he says devs don’t “have” to optimize for the variable clocks

Then he says that they will need to optimize specifically for the variable clocks in order to get the best out of them.

You gotta realize Mark Cerny's public comments are marketing first, tech second.
 
Faster clock speed for the GPU -> slower clock speed for the CPU, and vice versa. That is AMD's SmartShift technology that Cerny mentions, in a nutshell. People can sugarcoat it all they want and spend insane amounts of energy on damage control, but that is the way things are.
 

LordOfChaos

Member
RDNA2 is RDNA based so there could be a slight improvement but I doubt its a huge improvement.

Which is a guess for now. RDNA 1 still has a lot of GCN-ey bits. We'll have to see when we can, it could be they substantially rework the front end.
 

quest

Not Banned from OT
With others, many at Sony included yes, not with Cerny... quite the opposite.
When his console crushed Microsoft's, of course he did not need marketing. Now that he is in a dogfight spec-wise, he is pulling out marketing speak 101: "mostly", "a couple", "slightly", etc. It's easy not to resort to marketing speak when you've kicked the other guy's ass; when it is not an ass-kicking, it's 100% marketing speak. If it was not, we would have legitimate hard numbers for the variable clocks, instead of each side guessing at best and worst case and using it as fact. The SSD gets 100% hard numbers because it way outdoes Microsoft's; no coincidence.
 

Kusarigama

Member
I think there's a misunderstanding here.

I think Cerny means that devs won't have to write code thinking about whether it will overheat the system, since PS5 avoids that automatically.

But of course, devs will balance the power between the CPU and GPU as they see fit for each scenario.
This is precisely what I think it is. Mark Cerny gave examples of how the map screen of certain games causes the PS4's fans to speed up even though there's nothing very graphically intensive on screen. Now such situations can be avoided because of this. Also, in the Digital Foundry deep dive article, Mark Cerny talked about another (hypothetical) piece of code which could make the PS4 go into a thermal shutdown; on PS5 it won't be problematic.
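The map-screen example makes sense once you see the arithmetic (the 2 ms frame cost below is made up for illustration): an uncapped cheap scene keeps the GPU 100% busy, which is exactly the power spike a fixed power budget tames:

# Why a cheap map screen can spike power: with no frame cap the GPU races
# flat out through easy frames (illustrative numbers only)
frame_cost_ms = 2.0                # an easy map-screen frame
print(1000 / frame_cost_ms)        # uncapped: 500 fps, GPU ~100% busy the whole time
print(30 * frame_cost_ms / 1000)   # the same scene capped at 30 fps: GPU ~6% busy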
 
Last edited:
The 1st answer is that 2.2GHz is way past the sweet spot of 1.9GHz for this architecture, meaning you're creating a lot more heat but only making minimal gains in performance.
The 2nd answer is that, as DF already proved, more CUs are more performant than fewer CUs even if the TFs are equal.
RDNA 1 was not meant to run at 2.2GHz, only 1.8GHz. The PS5 APU (which is custom RDNA 2) has been designed from the get-go for high clocks. In some tests we can even see the performance increase with each stepping (using the same 2GHz clock). This APU has a minimum of 5 steppings, which is a lot for an AMD APU.

Also, on console the bandwidth will be managed very differently (more efficiently) than on PC, so it shouldn't be a problem in most cases.

DF's tests on RDNA 1 (Hitman 2 to test the GPU? unusual) were only there to bring confusion, FUD and clicks.
 
Last edited: