
The PlayStation 5 and Xbox Series X/S versions of Squadrons have a critical performance difference.

Status
Not open for further replies.

Kagey K

Member
Dec 18, 2013
9,886
13,679
985
This does not have to do with hardware - you should know better.
Many things won’t at the beginning of a new gen. It’ll be API and software based, we won’t see the hardware kick in for a bit yet.

The fact that the game came out about a month ago and was written with these new consoles in mind should be worrying, though.


If this was a 5 or 6 year old game it would make sense, but it doesn’t on one that had the July PS5 mandate in mind before it released.
 

DForce

Member
Nov 23, 2017
4,422
8,133
525
Here is the topical information, for those interested - where Cerny himself states they increase the frequency/speed of the CPU. Frequency means speed. So they do in fact increase (not decrease) the standard CPU speed, and both CPU and GPU run in boost mode.

They then decrease the CPU speed as needed to alleviate heating/strain on the CPU and to increase the performance of the GPU.

But make no mistake, the CPU as standard on PS5 is burdened by a higher speed than it was intended for.

Translation - we continuously overburden the CPU and GPU by overclocking both and attempt to alleviate that burden by using variable frequencies in order to relieve strain on either the CPU or GPU.

Overclocking as a solution to catch up performance-wise is not a preferred solution; it stresses the CPU and GPU and may eventually limit their lifespan.

Again, Cerny literally, verbatim, states - "We increase the frequency of CPU and GPU, until they reach the capabilities of the system's cooling solution"
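The mechanism both sides are arguing over - a fixed power budget shared between CPU and GPU, with clocks shifting inside it (AMD's SmartShift idea) - can be sketched in a few lines. All wattages and clock caps below are made-up illustrations, not actual PS5 values:

```python
# Minimal sketch of a fixed-power-budget, variable-frequency scheme.
# All numbers are hypothetical illustrations, not real PS5 figures.

def allocate_clocks(power_budget_w, cpu_demand_w, gpu_demand_w,
                    cpu_max_mhz=3500, gpu_max_mhz=2230):
    """Run both units at their capped clocks when the combined power
    demand fits the budget; otherwise scale both down proportionally."""
    total = cpu_demand_w + gpu_demand_w
    if total <= power_budget_w:
        return cpu_max_mhz, gpu_max_mhz
    # Scale both clocks down proportionally to stay inside the budget.
    scale = power_budget_w / total
    return int(cpu_max_mhz * scale), int(gpu_max_mhz * scale)

# Light combined load: both run at their cap.
print(allocate_clocks(200, 60, 120))   # (3500, 2230)
# Heavy combined load: both shed a few percent of frequency.
print(allocate_clocks(200, 80, 160))   # clocks scale by 200/240
```

Note that in this sketch the clocks are *capped* and occasionally reduced, never pushed past the cap - which is the crux of the "is it overclocking?" dispute.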



You took his words out of context.

Mark Cerny
In general I like running the GPU at a higher frequency. Let me show you why.

That's just one part of the GPU. There are a lot of other units, and those other units all run faster when the GPU frequency is higher. At 33% higher frequency, rasterization goes 33% faster, processing the command buffer goes that much faster, the L2 and other caches have that much higher bandwidth, and so on.
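Cerny's point here is simple linear scaling: fixed-function units do a unit of work per clock, so their throughput rises in direct proportion to frequency. A quick back-of-envelope with purely hypothetical numbers:

```python
# Fixed-function GPU throughput scales roughly linearly with clock.
# All numbers here are hypothetical illustrations.
base_clock = 1.5                 # GHz, made-up baseline
fast_clock = base_clock * 1.33   # 33% higher clock

triangles_per_clock = 4          # made-up rasterizer rate
base_rate = triangles_per_clock * base_clock * 1e9  # triangles/sec
fast_rate = triangles_per_clock * fast_clock * 1e9

speedup = fast_rate / base_rate
print(f"{(speedup - 1) * 100:.0f}% faster rasterization")
```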

Their plan from the beginning was to have high frequencies. They didn't use it to "catch up". I don't know why people go with this narrative when it's not true.

They then decrease the CPU speed as needed to alleviate heating/strain on the CPU and to increase the performance of the GPU.

It decreases if developers want more power to the GPU, not heat related unless it's a worst case scenario.

You can't call Cerny a liar and then use his comments as proof. It doesn't surprise me that a lot of new NeoGAFers are coming in trying to be dishonest when discussing information from Mark Cerny's Deep Dive presentation.
 
  • Like
Reactions: TurboSpeed

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Mar 31, 2011
5,339
2,460
990
 

phil_t98

Member
Oct 10, 2014
3,831
3,646
620
So how come when Resident Evil performed worse on the Xbox X than the Pro, it was because the console wasn't powerful enough to do those resolutions, but now that this performs better on Series X, it's because of lazy devs?
 
Last edited:

SegaSnatcher

Member
Jan 31, 2020
321
686
320
Scottsdale, Arizona
The quote by Cerny plainly states they in fact do. You are free to believe what you wish, but the fact is you're just speaking gibberish and ignoring Cerny's own words.

A binned CPU is a CPU cherry-picked by engineers and voltage-checked; then higher frequencies are applied to speed the CPU past its intended speed. The CPU in question was not originally intended to reach the speeds specified, but since these are binned samples put into production, they are considered stable CPUs at variable frequencies - which is why the CPU in fact now slows down too, because it was taken off the shelf, sped up internally, and applied to a console.

INDISPUTABLE. Just as the quote is indisputable.

Not speaking gibberish. It's all about the form factor and cooling solution. It's not the CPU itself that is the issue; again, it's the same core as the XSX CPU, which is clocked even higher.
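For what it's worth, "binning" is just sorting chips by the clock they sustain at a given voltage, then shipping the ones that make the target. A toy illustration with made-up numbers:

```python
# Toy illustration of chip "binning". All numbers are hypothetical.

def max_stable_mhz(chip_quality):
    # Better silicon sustains a higher clock at the same voltage.
    return 3000 + int(chip_quality * 800)

# Measured "silicon quality" of a hypothetical production run (0..1).
chips = [0.84, 0.76, 0.42, 0.26, 0.51, 0.40, 0.78, 0.30, 0.48, 0.58]
target = 3500  # MHz bin we want to ship

binned = [q for q in chips if max_stable_mhz(q) >= target]
print(f"{len(binned)}/{len(chips)} chips make the {target} MHz bin")
```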
 

Shmunter

Member
Aug 25, 2018
7,679
17,710
725
Oh snap, just in time to add some spice to the mix.

Personally I make nothing of it, as it sounds unrealistic
 

Panajev2001a

GAF's Pleasant Genius
Jun 7, 2004
17,982
10,282
2,095
The quote by Cerny plainly states they in fact do. You are free to believe what you wish, but the fact is you're just speaking gibberish and ignoring Cerny's own words.

A binned CPU is a CPU cherry-picked by engineers and voltage-checked; then higher frequencies are applied to speed the CPU past its intended speed. The CPU in question was not originally intended to reach the speeds specified, but since these are binned samples put into production, they are considered stable CPUs at variable frequencies - which is why the CPU in fact now slows down too, because it was taken off the shelf, sped up internally, and applied to a console.

INDISPUTABLE. Just as the quote is indisputable.

No, sorry, but no. You are interpreting the statements one way and pretending it is gospel for others. The boost clock is the fully intended speed; these chips are not overclocked.
 

SegaSnatcher

Member
Jan 31, 2020
321
686
320
Scottsdale, Arizona
Yeah it’s only a full PS4 of power difference (1.7 tf), and we saw how shitty the PS4 was.

So the power levels are essentially the same.

Really?

That’s not how it works. Again, that amount of difference doesn’t equal 2x the performance. They both use RDNA 2.

The 20% delta between PS5 and XSX is smaller than both PS4 vs XB1 and Pro vs X1X.
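The delta claim checks out with simple arithmetic on the commonly cited paper-spec TFLOPS figures (12.15 for XSX vs 10.28 for PS5, 1.84 for PS4 vs 1.31 for XB1, 6.0 for X1X vs 4.2 for Pro):

```python
# Relative advantage of the stronger console in each generation pair,
# using the commonly cited paper-spec TFLOPS figures.
pairs = {
    "XSX vs PS5": (12.15, 10.28),
    "PS4 vs XB1": (1.84, 1.31),
    "X1X vs Pro": (6.0, 4.2),
}
for name, (stronger, weaker) in pairs.items():
    delta = (stronger / weaker - 1) * 100
    print(f"{name}: {delta:.0f}% advantage")
```

That works out to roughly an 18% edge for XSX over PS5, versus roughly 40% and 43% for the two older pairs.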
 
Last edited:

DForce

Member
Nov 23, 2017
4,422
8,133
525
You can tell a lot of members from XboxGAF never owned or played on a gaming PC.


The GPU, CPU, RAM etc. are much better on the PS5 than the XsS, but they think the XsS is closer to the PS5.


Embargoes will be up later today. We'll start getting comparison videos soon.
 

Kagey K

Member
Dec 18, 2013
9,886
13,679
985
That’s not how it works. Again, that amount of difference doesn’t equal 2x the performance. They both use RDNA 2. The 20% delta is smaller than both PS4 vs XB1 and Pro vs X1X.
It works fine, unless you don’t want it to.

In RDNA flops the difference is actually more like a PS4 Pro. Math can go up and down. You only choose to see it one way.
 

GymWolf

Member
Jun 11, 2019
13,638
19,747
565
This seems like a combo of the Series X being more powerful and lazy devs not using PS5 power to get better performance than a PS4 Pro.
 