
Sony PS5 Vs. Xbox Series X Technical Analysis: Why The PS5’s 10.3 TFLOPs Figure Is Misleading

In his field he is one of the best in the world. If you're interested, I recommend looking into this man's work over the course of his career. Not everyone reaches that level.
I know, I never said that he isn't a great engineer, but he is speaking as a Sony person. Why is this so difficult to understand? He knows about a trillion times more about computers than I do (and I have a college diploma), but listening to him was still painful. The guy said the opposite of what he was saying during the PS4 presentation. That I can accept, I get it: he has to present and hype a new product that is inferior to the competition. But him saying that he hates a large number of CUs, I'm sorry, is inexcusable; he should have sugarcoated it with a different phrase.
 

FStubbs

Member
If you liked your PS4, you'll love the PS5.

This is the closest generation ever in terms of overall performance envelope. I've said that since last year.

Maybe if you're considering the jump from PS4 Pro to PS5 rather than PS4 to PS5. PS4 to PS5, even at 9/10 TFLOPS, is still a huge leap.

GameCube to Wii was probably the smallest performance jump ever.
 

Proelite

Member
Maybe if you're considering the jump from PS4 Pro to PS5 rather than PS4 to PS5. PS4 to PS5, even at 9/10 TFLOPS, is still a huge leap.

GameCube to Wii was probably the smallest performance jump ever.

I was talking about the closest in terms of Xbox vs. PS.

The jump from the previous gen is massive. Biggest for Xbox, and probably for PS too.
 
Last edited:

FranXico

Member
Him saying that he hates a large number of CUs, I'm sorry, is inexcusable; he should have sugarcoated it with a different phrase.
That was particularly embarrassing. He could have mentioned ensuring BC while keeping costs down (kind of like he said for the SSD size), and it would not have been an insult to people's intelligence.
 
Anyone expecting a locked 2.23 GHz on the PS5 is gonna be disappointed.

The XSX GPU clock @1.8 GHz can actually be SUSTAINED.

The PS5 GPU clock @2.23 GHz definitely can NOT. At best it will pulse up to that frequency for a brief instant before throttling back down to something more sustainable. And it doesn't matter if it's temperature throttling or power throttling. They are two sides of the same coin.

That 2.23 GHz number is PURE PR speak to make it seem like the two are closer in power than they really are.
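For reference, the peak-TFLOPs figures being argued over fall straight out of CU count times clock. A minimal sketch in Python (assuming the usual RDNA 2 layout of 64 shader ALUs per CU doing 2 FLOPs per clock; the 52-CU / 1.825 GHz Series X figures are the commonly cited specs, not numbers from this thread):

```python
# Peak FP32 throughput for an RDNA 2 style GPU:
# TFLOPs = CUs * 64 ALUs per CU * 2 FLOPs per ALU per clock * clock (GHz) / 1000
def peak_tflops(cus, clock_ghz):
    return cus * 64 * 2 * clock_ghz / 1000.0

print(peak_tflops(36, 2.23))   # PS5 at its 2.23 GHz cap:  ~10.28 TF ("10.3")
print(peak_tflops(36, 2.00))   # PS5 if it sat at 2.0 GHz:  ~9.22 TF (the "9.2" figure)
print(peak_tflops(52, 1.825))  # Series X at a fixed clock: ~12.15 TF ("12.1")
```

Whether the PS5 actually spends its time at 2.23 GHz is exactly what is being argued; the formula only gives the ceiling at whatever clock it is running.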
 

frogger

Member
Maybe if you're considering the jump from PS4 Pro to PS5 rather than PS4 to PS5. PS4 to PS5, even at 9/10 TFLOPS, is still a huge leap.

GameCube to Wii was probably the smallest performance jump ever.
GameCube to Wii is almost like a lateral pass.
 

FStubbs

Member
I was talking about the closest in terms of Xbox vs. PS.

The jump from the previous gen is massive. Biggest for Xbox, and probably for PS too.

Ah. Then I'm sorry, I still disagree. Xbox 360 vs. PS3 was nearly a dead heat. SNES vs. Genesis basically was as well. The PS3 was theoretically a bit stronger, and either the SNES or the Genesis was theoretically a bit stronger depending on which developer you asked.

PS5 and Series X are closer than PS4 and Xbone were, or PS2 and Xbox OG, I'll give you that.
 
Last edited:

frogger

Member
Anyone expecting a locked 2.23 GHz on the PS5 is gonna be disappointed.

The XSX GPU clock @1.8 GHz can actually be SUSTAINED.

The PS5 GPU clock @2.23 GHz definitely can NOT. At best it will pulse up to that frequency for a brief instant before throttling back down to something more sustainable. And it doesn't matter if it's temperature throttling or power throttling. They are two sides of the same coin.

That 2.23 GHz number is PURE PR speak to make it seem like the two are closer in power than they really are.
Yep. But some people will continue to deny it. Anyway, buy the console and be happy.
 
Anyone expecting a locked 2.23 GHz on the PS5 is gonna be disappointed.

The XSX GPU clock @1.8 GHz can actually be SUSTAINED.

The PS5 GPU clock @2.23 GHz definitely can NOT. At best it will pulse up to that frequency for a brief instant before throttling back down to something more sustainable. And it doesn't matter if it's temperature throttling or power throttling. They are two sides of the same coin.

That 2.23 GHz number is PURE PR speak to make it seem like the two are closer in power than they really are.

Wrong in so many ways....

The machine is supposed to run with no problems at the highest clocks most of the time....

That's why they chose a power draw target, instead of the classic approach where you boost the clocks and cause overheating
 

Proelite

Member
Ah. Then I'm sorry, I still disagree. Xbox 360 vs. PS3 was nearly a dead heat. SNES vs. Genesis basically was as well. The PS3 was theoretically a bit stronger, and either the SNES or the Genesis was theoretically a bit stronger depending on which developer you asked.

PS5 and Series X are closer than PS4 and Xbone were, or PS2 and Xbox OG, I'll give you that.

Not really.

PS3 had a very bad first few years in terms of third-party titles. That hasn't happened since. Even the Xb1 didn't have the third-party issues that the PS3 had.

Wrong in so many ways....

The machine is supposed to run with no problems at the highest clocks most of the time....

True, but the CPU is running at sub-2.0 GHz if the GPU is at max.

That's SmartShift.
 
Last edited:

FStubbs

Member
Not really.

PS3 had a very bad first few years in terms of third-party titles. That hasn't happened since. Even the Xb1 didn't have the third-party issues that the PS3 had.

I thought we were talking about sheer raw power, not third-party support.

Also, the PS3 never had problems with third-party support. Wii U or Vita, OTOH...
 

martino

Member
Yep. I do.

When you invest yourself in your PC, you will discover things like this:
I have a 1080 Ti EVGA iCX2.
The card is advertised as running between 1556 and 1670 MHz (11.153 TFLOPs to 11.97 TFLOPs).

In game it runs between 1950 and 2020 MHz (13.97 to 14.4 TFLOPs).

(Those are Pascal TFLOPs, missing a lot of features though :'( )

The OC is a lot higher than advertised, isn't it?
Do you see the difference with the PS5 PR? I hope you can.
Also, the advertised figure tells you the worst-case scenario...
 
Last edited:
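The numbers in the post above do check out once you plug in the 1080 Ti's shader count (3584 CUDA cores, a spec-sheet figure rather than something quoted in the thread), with the clocks read as MHz. A quick sketch:

```python
# Peak FP32 throughput for a GeForce-style GPU:
# TFLOPs = CUDA cores * 2 FLOPs per core per clock * clock (MHz) / 1e6
def peak_tflops(cuda_cores, clock_mhz):
    return cuda_cores * 2 * clock_mhz / 1e6

print(peak_tflops(3584, 1556))  # advertised boost floor:  ~11.15 TF
print(peak_tflops(3584, 2020))  # observed in-game clocks: ~14.48 TF
```

The point being made: desktop GPU Boost advertises a conservative number and typically runs above it, which is the opposite of quoting a capped maximum.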
Not really.

PS3 had a very bad first few years in terms of third-party titles. That hasn't happened since. Even the Xb1 didn't have the third-party issues that the PS3 had.



True, but the CPU is running at sub 2.0ghz if the GPU is at max.

That's SmartShift.

SmartShift is used when you reach that power draw, not ALWAYS....
 

Proelite

Member
SmartShift is used when you reach that power draw, not ALWAYS....

PS5's power draw will be constant for all machines. It's the division of the power between the CPU and GPU that changes.

If you've seen any chart of power vs. clock, you can see that it's an exponential curve, not linear.

Here is one for SmartShift. The faster you want to push the GPU, the more power you need to take away from the CPU.

[Image: AMD Ryzen Mobile Tech Day slide showing SmartShift shifting power between CPU and GPU]
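To make the "constant total power, variable split" idea concrete, here is a toy model of a SmartShift-style budget (the wattages are illustrative placeholders, not Sony's actual numbers): the package budget stays fixed, and whatever the CPU workload does not need is handed to the GPU.

```python
# Toy model of a SmartShift-style fixed power budget (illustrative numbers only).
TOTAL_BUDGET_W = 200.0   # hypothetical total SoC power budget
CPU_MAX_W = 60.0         # hypothetical CPU share at 100% CPU load

def split_budget(cpu_load):
    """cpu_load in [0, 1]; returns (cpu_watts, gpu_watts)."""
    cpu_w = CPU_MAX_W * cpu_load
    gpu_w = TOTAL_BUDGET_W - cpu_w   # unused CPU headroom shifts to the GPU
    return cpu_w, gpu_w

for load in (1.0, 0.6, 0.3):
    cpu_w, gpu_w = split_budget(load)
    print(f"CPU load {load:.0%}: CPU {cpu_w:.0f} W, GPU {gpu_w:.0f} W, total {cpu_w + gpu_w:.0f} W")
```

The chart makes the second half of the argument: because power rises much faster than linearly with clock, the last few hundred MHz on either chip cost a disproportionate slice of that fixed budget.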
 

psorcerer

Banned
When you invest yourself in your PC, you will discover things like this:
I have a 1080 Ti EVGA iCX2.
The card is advertised as running between 1556 and 1670 MHz (11.153 TFLOPs to 11.97 TFLOPs).

In game it runs between 1950 and 2020 MHz (13.97 to 14.4 TFLOPs).

(Those are Pascal TFLOPs, missing a lot of features though :'( )

The OC is a lot higher than advertised, isn't it?
Do you see the difference with the PS5 PR? I hope you can.
Also, the advertised figure tells you the worst-case scenario...

I don't know why you want to support my side of the argument, but thanks.
Just one thing: how do you know that it runs at 1950 sustained?
And if it doesn't run sustained, how long do these high-frequency peaks last?
 
Cerny is the man responsible for the PS5; he is not an impartial tech person. Some of the things he said were embarrassing. Had an MS tech person said some of this stuff, he would have been laughed off the Internet, and rightfully so. Again, compare his thesis on console design for the PS4 and for the PS5.

Well, the same people's previous semi-gods were Kleegamefan and osirisblack, and now that their god has spoken they advise us to listen to him and believe everything he said on blind faith. No thanks...

WTF?? He isn't a tech person? He is the lead system architect of the PS5!

People, do you see what this Xbox fan is saying here? The lead system architect of the PS5 isn't a tech person.
 
Last edited:
WTF?? He isn't a tech person? He is the lead system architect of the PS5!

People, do you see what this Xbox fan is saying here? The lead system architect of the PS5 isn't a tech person.
Do you have reading comprehension problems? He isn't an IMPARTIAL tech person; it's right there in the post you quoted.
 
He doesn't work for Sony.
OK, Sony doesn't pay him; he's been doing all the work for free all these years. Oh Christ, really? I get it: he is an independent contractor who has been on Sony's payroll for almost a decade, designs Sony's products, and just presented a product he designed. He is the definition of impartial.
 
OK, Sony doesn't pay him; he's been doing all the work for free all these years. Oh Christ, really? I get it: he is an independent contractor who has been on Sony's payroll for almost a decade, designs Sony's products, and just presented a product he designed. He is the definition of impartial.

Just saying he is not a Sony employee.
 

SleepDoctor

Banned
WTF?? He isn't a tech person? He is the lead system architect of the PS5!

People, do you see what this Xbox fan is saying here? The lead system architect of the PS5 isn't a tech person.

Literally had me lol

Fanboy logic "he's not an impartial tech person". Holy shit some of you need some reading comprehension 🤦🏻‍♂️🤦🏻‍♂️🤦🏻‍♂️
 

V4skunk

Banned
I thought I was the only one! This isn't a 10.3 TF machine. It's a 9.2 TF machine that can occasionally run at 10.3. Probably for less complex indie games for example.
But why would a system clock up to run something less intense?
As a PC gamer, this makes no sense to me.
Also, as a PC gamer, the way I understood the presentation is that up-clocking the GPU would down-clock the CPU and vice versa!
 

Genx3

Member
You're most definitely correct.
Conversely, that means adding 10% more power only adds about 2% more performance.

So it will make very little sense to run those high clocks on the GPU, because doing so will disproportionately drop the frequency of the CPU in order to sustain them.
Basically, the 10.3 TF on the spec sheet will probably rarely be used.
 
PS5's power draw will be constant for all machines. It's the division of the power between the CPU and GPU that changes.

If you've seen any chart of power vs. clock, you can see that it's an exponential curve, not linear.

Here is one for SmartShift. The faster you want to push the GPU, the more power you need to take away from the CPU.

That's not what Cerny said:

"It's a completely different paradigm: rather than running at constant frequency and letting power vary based on the workload. We run at essentially constant power and let the frequency vary based on the workload we then tackled the engineering challenge of a cost-effective and high-performance cooling solution designed for that specific power level. In some ways it becomes a simpler problem because there are no more unknowns there's no need to guess what power consumption the worst case game might have.

So how fast can we run the GPU and CPU with this strategy: the simplest approach would be to look at the actual temperature of the silicon die and throttle the frequency on that basis but that won't work it fails to create a consistent PlayStation 5 experience it wouldn't do to run a console slower simply because it was in a hot room so rather than look at the actual temperature of the silicon die we look at the activities that the GPU and CPU are performing and set the frequencies on that basis which makes everything deterministic and repeatable. While we're at it we also use AMD's smart shift technology and send any unused power from the CPU to the GPU so it can squeeze out a few more pixels.

The benefits of this strategy are quite large: running a GPU at 2 ghz was looking like an unreachable target with the old fixed frequency strategy. With this new paradigm we're able to run way over that in fact we have to cap the GPU frequency at 2.3ghz so that we can guarantee that the on chip logic operates properly
36CU use at 2.3 ghz is 10.3 teraflops and we expect the GPU to spend most of its time at or close to that frequency and performance.

Similarly running the CPU at 3 ghz was causing headaches with the old strategy but now we can run it as high as 3.5 gigahertz in fact it spends most of its time at that frequency
That doesn't mean all games will be running in 2.23 gigahertz and 3.5 gigahertz when that worst case game arrives it will run at a lower clock speed but not too much lower: to reduce power by 10% it only takes a couple of percent reduction in frequency so I'd expect any down clocking to be pretty minor.

All things considered the change to a variable frequency approach will show significant gains for PlayStation gamers"
 
Last edited:
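Cerny's "reduce power by 10% with only a couple of percent of frequency" line implies a very steep power-versus-frequency curve near the cap. A back-of-the-envelope check, assuming power scales roughly as frequency^n in that region (the exponent model is an assumption for illustration, not anything Sony has published):

```python
import math

# "A couple of percent" of frequency buys ~10% power: what local exponent n
# (power ~ frequency**n) would that imply?
freq_drop = 0.02    # ~2% frequency reduction
power_drop = 0.10   # ~10% power reduction

n = math.log(1 - power_drop) / math.log(1 - freq_drop)
print(f"implied local exponent n ≈ {n:.1f}")   # ≈ 5.2

# With that exponent, even a 20% power cut would only cost about:
print(f"clock penalty for -20% power ≈ {(1 - 0.80 ** (1 / n)) * 100:.1f}%")   # ≈ 4.2%
```

This is just a local fit to the two numbers in the quote; real power curves also depend on voltage and the workload mix, so treat the exponent as illustrative.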

Proelite

Member
So it will make very little sense to run those high clocks on the GPU, because doing so will disproportionately drop the frequency of the CPU in order to sustain them.
Basically, the 10.3 TF on the spec sheet will probably rarely be used.

That's not true. CPU load can vary greatly in games.

The problem that I am hearing is that it takes developer input to balance CPU/GPU power usage.
 

ethomaz

Banned
Can you show me proof? For example, evidence from a GPU profiling tool.
I already posted the DF guy explaining how it works.
I found it weird at first that people were talking about the clock dropping, when Cerny clearly said the clock can be chosen by the devs and doesn't automatically change due to thermals.
 
Last edited:

TLZ

Banned
Again with this 9.2 bullshit. Just for the love of God, stop it. Absolutely no official source mentioned it. Cerny confirmed to DF that 10.3 is what it'll be at most. He never mentioned how far it'd drop. NX Gamer explained what it actually meant: that it really is 10.3 and would only drop around 50 MHz at most to keep that power output in check. So that's around 10 TF.

There's absolutely no evidence for this 9.2 bullshit. Again, please, just drop it. Let's talk about what we actually officially have in our hands, 10.3 and 12.1, and the various other specs.

Stop trying too hard.
 
Last edited:
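For what it's worth, the "~50 MHz at most" figure attributed to NX Gamer above is easy to sanity-check with the same peak-FLOPs arithmetic used earlier in the thread (36 CUs × 64 ALUs × 2 FLOPs per clock):

```python
# 36 CUs at the 2.23 GHz cap vs. 50 MHz lower
print(36 * 64 * 2 * 2.23 / 1000)   # ~10.28 TF at the cap
print(36 * 64 * 2 * 2.18 / 1000)   # ~10.05 TF after a 50 MHz drop, i.e. "around 10 TF"
```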