
Sony PS5 Vs. Xbox Series X Technical Analysis: Why The PS5’s 10.3 TFLOPs Figure Is Misleading


StreetsofBeige

Gold Member
I just wanted to ask if anybody in here has any hands-on knowledge of both dev kits, because I keep hearing "well, this is how it's always been, so this is how I think it's supposed to happen." I get that we are just speculating at this point, but I see a lot of people spitting out their speculation as fact, like they already have both consoles and developed a game on each. It's really getting tiring.
Good luck with getting dev kit specs.

We couldn't even get reliable retail unit specs; there were BS rumours about the PS5 for a year among the probably 20 insiders on game forums.

Even if someone claims "Ah, here's some secret dev kit specs I got off a buddy", given the general track record, I wouldn't believe it for a second.

I also wouldn't trust too many devs saying MS or Sony dev kits are good. 99% of them will say it's "super duper to work with MS and Sony and the dev kits are hum didilly fantastic!"

At best, you'll get some Jason Schreier dirt digging, or an indie dev who has balls and doesn't mind speaking their mind (they have no allegiance) telling gamers the goods and the bads.
 

Coolwhhip

Neophyte
Considering how bad that show was and how weak the PS5 is (aside from the SSD), I wouldn't be surprised if they scrapped it entirely, rode out the PS4/Pro for another two years and released the PS5 in 2022. They surely have some other R&D designs they could green-light that have more power. Every tech company has multiple blueprints floating around the product development teams.

At one electronics company I worked at, R&D already had a road map of products to come out every year for the next 6 years. But most people in the office were only told about products 1 year out (at most).

Coronavirus is a perfect excuse for any company to change products.

Still in the bargaining phase?
 

Tumle

Member
My problem is not that I'm not accepting that the PS5 is less powerful, TF-wise, than the XSX.
My problem is the FUD fanboys are spreading..
What Cerny said:
"The PS5 will run mostly at 10.2 TF but will drop a couple of % if it gets too hot"
What Xbox fanboys are saying:
"PS5 is only 9.2 TF but occasionally it will hit 10.2 TF once in a blue moon lol!"
But thankfully it looks like the board has moved on, so general gaming threads are back :)
 
My problem is not that I'm not accepting that the PS5 is less powerful, TF-wise, than the XSX.
My problem is the FUD fanboys are spreading..
What Cerny said:
"The PS5 will run mostly at 10.2 TF but will drop a couple of % if it gets too hot"
What Xbox fanboys are saying:
"PS5 is only 9.2 TF but occasionally it will hit 10.2 TF once in a blue moon lol!"
But thankfully it looks like the board has moved on, so general gaming threads are back :)
You have to understand people's concerns; what Sony, or any company, says doesn't always turn into reality.
A few examples.
Sony said: [embedded link]
Ubisoft said: [embedded link]
MS said: [embedded link]
EA said: [embedded link]
Now the same people you have problems with might say the same thing about you.
 
[Chart: Xbox Series X vs Sony PS5 graphics performance]


Remember when I said that these next-gen consoles are revolutionary? Well, I wanted to compare these to PC GPUs (sustained TFLOPs only) and while I want to point out that comparisons across architectures can be slightly off, these are accurate enough for some light comparison. NVIDIA has retained its performance crown (as always) at 16.1 TFLOPs on the RTX 2080 Ti but just barely.

The Xbox Series X with its 12.1 TFLOPs actually beats out some lower clocked variants of the RTX 2080 SUPER! This is the first time that a console has been able to take on the PC high-end market and I think that deserves applause. Part of the reason for this is, of course, the fact that next-gen consoles are built like miniature PCs and architected as x86 devices. We even threw AMD's RX 5700 XT in there and that is roughly equivalent to the Sony PS5 once you factor in variable clock rates.

[Chart: Xbox Series X vs Sony PS5 graphics performance per dollar]

I was looking at these graphs and I had a thought.

The PS5 will be based on the RDNA 2 architecture, correct?

AMD has told us there would be roughly a 50% performance-per-watt increase between the two. While this certainly doesn't mean a 50% IPC increase between the two architectures, we can expect some degree of IPC increase, correct? Could be anything between 10% and 40%, maybe?

The PS5 GPU is a custom build; it doesn't come from an established GPU on the market, such as a 5700 XT, but it could be a cut-down of a theoretical RDNA 2 5800 XT or 5900 XT.

My point is that, even if we go by the worst TFLOP count of 9.2 TF, the custom (and maybe cut-down 5800 XT or 5900 XT) GPU of the PS5 should deliver performance superior to a 5700 XT, by either a small margin or a considerable amount, because of the IPC increase between RDNA 1 and RDNA 2.

If we go by the 10.26 TF, or even just 10 TF, count then the gap between the two would be even wider, in favor of the PS5.
 
This is the insiders' 13 TF all over again. There's a bigger gap between these two consoles than there was between the Xbox One X and the PS4 Pro. Remember what Microsoft was able to get out of the Xbox One X, how it shot way beyond what people expected of it. What do people expect to happen with this insane thing they've built? I'm sure the same extensive game-engine analysis and approaches they used for the Xbox One X will have been amplified for the Series X.

The games will pretty much put this one to rest. The PS5 isn't weak at all, and I won't dare suggest that, but the Series X quite obviously murders the PS5. Any sane person can see that in raw performance potential.
 

Tumle

Member
You have to understand people's concerns; what Sony, or any company, says doesn't always turn into reality.
A few examples.
Sony said: [embedded link]
Ubisoft said: [embedded link]
MS said: [embedded link]
EA said: [embedded link]
Now the same people you have problems with might say the same thing about you.

Sorry, my work PC is blocking your links..

The difference is they are just making stuff up without anything to back it up. It's like they are so eager for the PS5 to only be 9.2 TF max.
 

longdi

Banned
This confusion is all down to Mark Sony.
If he had been truly transparent and clear, we wouldn't be going in circles. He spent too much time downplaying TFLOPs and high frequency. Stinky move.

Here is how I would have done my GDC script:
-PS5 uses the best AMD RDNA 2 36 CU chip yet
-we clock it to 2.23 GHz
-our AMD CPU is clocked to 3.5 GHz with SMT
-2.23 GHz and 3.5 GHz are the max clocks 95% of the time
-clocks will drop with the most intensive games
-as such, the PS5 has 10.3 TFLOPs max
-the GPU clock may look high, but we have created a good cooling solution to handle it all the time. Trust us.
-we are also applying fixed, deterministic voltage/frequency values
-consoles need fixed performance in all conditions
-all three factors above help us achieve 2.23 GHz
-we also use AMD SmartShift, which gives developers flexibility in distributing load between the GPU and CPU
-in summary, we give you the best AMD 36 CU dies in a PlayStation.
 

yurinka

Member
I was talking about the 5700 and 5800. I thought both were RDNA 1, but I saw the 5800 is RDNA 2. And the "PS5 has 9.2" comes from the GitHub stuff, which as I remember also said the PS5 was RDNA 1. So when people compared it to the 5800, I thought they were comparing the PS5 GPU to an RDNA 1 GPU, but later I saw the 5800 is RDNA 2. Not sure about the 5700, but I think it's RDNA 1.
 

sendit

Member
My problem is not that I'm not accepting that the PS5 is less powerful, TF-wise, than the XSX.
My problem is the FUD fanboys are spreading..
What Cerny said:
"The PS5 will run mostly at 10.2 TF but will drop a couple of % if it gets too hot"
What Xbox fanboys are saying:
"PS5 is only 9.2 TF but occasionally it will hit 10.2 TF once in a blue moon lol!"
But thankfully it looks like the board has moved on, so general gaming threads are back :)

It’s not the temps, it’s the power draw. PS5 has a constant power draw.

This confusion is all down to Mark Sony.
If he had been truly transparent and clear, we wouldn't be going in circles. He spent too much time downplaying TFLOPs and high frequency. Stinky move.

Here is how I would have done my GDC script:
-PS5 uses the best AMD RDNA 2 36 CU chip yet
-we clock it to 2.23 GHz
-our AMD CPU is clocked to 3.5 GHz with SMT
-2.23 GHz and 3.5 GHz are the max clocks 95% of the time
-clocks will drop with the most intensive games
-as such, the PS5 has 10.3 TFLOPs max
-the GPU clock may look high, but we have created a good cooling solution to handle it all the time. Trust us.
-we are also applying fixed, deterministic voltage/frequency values
-consoles need fixed performance in all conditions
-all three factors above help us achieve 2.23 GHz
-we also use AMD SmartShift, which gives developers flexibility in distributing load between the GPU and CPU
-in summary, we give you the best AMD 36 CU dies in a PlayStation.

What are you doing on this forum? Someone get this individual's resume to Sony and get him hired.:messenger_astonished:
 

Tumle

Member
It’s not the temps, it’s the power draw. PS5 has a constant power draw.



What are you doing on this forum? Someone get this individuals resume to Sony and get him hired.:messenger_astonished:
Yeah, sorry.. I just made the confusion worse with my bad English :p
 

geordiemp

Member
Your explanation doesn't even make sense. If most games never need to be that high then it would almost never be that high going by your logic. Thus it would not hit those clocks the majority of the time. So your explanation sounds wrong. If I only need 9.5 TFlops most of the time and according to you the GPU delivers me 9.5TFlops it is not running 10.2 the majority of the time. It is running at 9.5 TFlops.

It feels like he was intentionally vague. He could have said if you are at 10.28 TFlops GPU you are at ____ for the CPU. If you need max CPU you are at ___ GPU power. I suspect he doesn't want people to know those details though or at least Sony doesn't.

I think most games will be pretty demanding on the GPU/CPU especially when there is a console that has higher specs. Devs will be trying to max out the PS5 and then scaling to the XSX.

Most don't think in the time domain. A frame might be 16.6 ms, but the GPU is doing things every half nanosecond, so out of all the things that happen in one frame, nothing runs at maximum all the time. The CPU is not running all 16 threads for the hundreds of things it is doing in that one frame.

But posters like to say "if it's 3.5 GHz it's not 10.2 TF because ...." I don't understand.

The simple message is: when the PS5 needs 10.2 TF it will get it, and when the XSX needs 12.1 it will get it, but nothing is running at 100% all the time. Look at the below, Spider-Man activity for one frame.

You realise HZD only got hot running the map screen; simple jobs run too fast. This solves that problem, in theory.

The XSX would spin up its cooling on that HZD map screen if it got to run it; the PS5 will put a brake on the power for a frame or two, or whatever it needs, but deliver more of the PS5's grunt for running the gameplay. It's smart.



[Image: Spider-Man GPU/CPU activity across a single frame]
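To put that time-domain point in numbers, here is a quick back-of-the-envelope sketch in Python (the 16.6 ms frame and the 2.23 GHz / 3.5 GHz caps are the figures discussed in this thread; the rest is plain arithmetic):

```python
# Rough frame-budget arithmetic for a 60 fps title on PS5-class hardware.
FRAME_TIME_S = 1 / 60            # ~16.6 ms per frame
GPU_CLOCK_HZ = 2.23e9            # PS5 GPU clock cap from Cerny's talk
CPU_CLOCK_HZ = 3.5e9             # PS5 CPU clock cap (with SMT)

gpu_cycles_per_frame = GPU_CLOCK_HZ * FRAME_TIME_S
cpu_cycles_per_frame = CPU_CLOCK_HZ * FRAME_TIME_S

print(f"GPU cycles per frame: {gpu_cycles_per_frame:,.0f}")   # ~37 million
print(f"CPU cycles per core per frame: {cpu_cycles_per_frame:,.0f}")  # ~58 million

# The point: a single frame is tens of millions of clock cycles, and the mix of
# work (geometry, shading, async compute, idle bubbles) changes constantly within
# it, so "running at 10.2 TF" never means every ALU is busy on every cycle.
```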
 
I was looking at these graphs and I had a thought.

The PS5 will be based on the RDNA 2 architecture, correct?

AMD has told us there would be roughly a 50% performance-per-watt increase between the two. While this certainly doesn't mean a 50% IPC increase between the two architectures, we can expect some degree of IPC increase, correct? Could be anything between 10% and 40%, maybe?

The PS5 GPU is a custom build; it doesn't come from an established GPU on the market, such as a 5700 XT, but it could be a cut-down of a theoretical RDNA 2 5800 XT or 5900 XT.

My point is that, even if we go by the worst TFLOP count of 9.2 TF, the custom (and maybe cut-down 5800 XT or 5900 XT) GPU of the PS5 should deliver performance superior to a 5700 XT, by either a small margin or a considerable amount, because of the IPC increase between RDNA 1 and RDNA 2.

If we go by the 10.26 TF, or even just 10 TF, count then the gap between the two would be even wider, in favor of the PS5.

PPW is not 1:1 with IPC. 50% more PPW doesn't equal 50% IPC over RDNA1.

IPC gains on RDNA 2 are probably 10%-20% at the absolute most, which might also depend on the specific node process (i.e. whether 7nm "enhanced" is EUV or just improved DUV; AMD haven't clarified that, but it's been said both systems are on 7nm enhanced).
 
So RDNA was 1.25x the performance of GCN per TFLOP, and I expect a similar gain from RDNA 1 to RDNA 2. If so, that gives the XSX performance similar to roughly 18 TFLOPs of GCN. Simplified, I know, but it gives a good indication of how well the PS5 and XSX will perform next gen. Add onto that ray tracing, mesh shading, VRS etc., and then the CPU jump, and next gen is gonna shine.
Imagine what some of the best devs will do with that kit.
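A quick sanity check on that conversion, treating both 1.25x figures as per-TFLOP efficiency multipliers (the RDNA 2 multiplier is the poster's guess, not an AMD figure):

```python
# "GCN-equivalent" TFLOPs if RDNA 1 ~= 1.25x GCN per TFLOP and RDNA 2 ~= 1.25x RDNA 1.
RDNA1_VS_GCN = 1.25      # AMD's quoted per-clock gain for RDNA 1 over GCN
RDNA2_VS_RDNA1 = 1.25    # the poster's assumption for RDNA 2 over RDNA 1

def gcn_equivalent(rdna2_tflops: float) -> float:
    """Scale RDNA 2 paper TFLOPs into 'GCN-equivalent' TFLOPs under the above ratios."""
    return rdna2_tflops * RDNA2_VS_RDNA1 * RDNA1_VS_GCN

print(f"XSX 12.1 TF  -> ~{gcn_equivalent(12.1):.1f} GCN-equivalent TF")   # ~18.9
print(f"PS5 10.28 TF -> ~{gcn_equivalent(10.28):.1f} GCN-equivalent TF")  # ~16.1
```

Which lands close to the ~18 TF figure quoted above for the XSX.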
 
PPW is not 1:1 with IPC. 50% more PPW doesn't equal 50% IPC over RDNA1.

IPC gains on RDNA 2 are probably 10%-20% at the absolute most, which might also depend on the specific node process (i.e. whether 7nm "enhanced" is EUV or just improved DUV; AMD haven't clarified that, but it's been said both systems are on 7nm enhanced).

I agree with your assessment.

Then let's speculate a bit.

Let's create three estimates: a conservative one, a middle-ground one, and an optimistic one.


1.
Conservative estimate: We assume an IPC increase between RDNA and RDNA 2 of around 10%. Let's also assume the conservative estimate of "only" 9.2 TF for the PS5.

9.2 * 1.1 = approx. 10.12 RDNA 1 TFLOPs

10.12 / 9.98 (5700 XT TFLOP count) = 1.014 = performance approx. 1.4% above a 5700 XT.


2.
Middle-ground estimate: We assume an IPC increase between RDNA and RDNA 2 of around 10%. Let's also assume a safe estimate of 10 TF for the PS5, because of both thermal throttling and profile choices by the developers.

10 * 1.1 = approx. 11 RDNA 1 TFLOPs

11 / 9.98 (5700 XT TFLOP count) = 1.1022 = performance approx. 10.22% above a 5700 XT.


3.
Optimistic estimate: We assume an IPC increase between RDNA and RDNA 2 of around 20%. Let's also assume the optimistic estimate of 10.26 TF for the PS5 (which it should hold most of the time).

10.26 * 1.2 = 12.312 RDNA 1 TFLOPs.

12.312 / 9.98 (5700 XT TFLOP count) = 1.233 = performance approx. 23.3% above a 5700 XT.


Assuming this console is sold at 399/400€, which is a must in my opinion, this is a very good TFLOP per €/$ count.

This console would be a beast for the price.
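For anyone who wants to tweak the assumptions, here is a small Python sketch reproducing the three estimates above (the 9.98 TF 5700 XT figure and the IPC percentages are the post's assumptions, not confirmed numbers):

```python
# Reproduce the three PS5-vs-5700 XT estimates above.
R5700XT_TFLOPS = 9.98   # 5700 XT figure used in the post

def rdna1_equivalent(ps5_tflops: float, ipc_gain: float) -> float:
    """Scale PS5 (assumed RDNA 2) TFLOPs into 'RDNA 1 equivalent' TFLOPs."""
    return ps5_tflops * (1 + ipc_gain)

scenarios = {
    "conservative": (9.2, 0.10),
    "middle":       (10.0, 0.10),
    "optimistic":   (10.26, 0.20),
}

for name, (tflops, ipc) in scenarios.items():
    equivalent = rdna1_equivalent(tflops, ipc)
    advantage = equivalent / R5700XT_TFLOPS - 1
    print(f"{name:>12}: {equivalent:5.2f} RDNA1-equivalent TF, {advantage:+.1%} vs 5700 XT")
```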
 

Kumomeme

Member
My problem is not that I'm not accepting that the PS5 is less powerful, TF-wise, than the XSX.
My problem is the FUD fanboys are spreading..
What Cerny said:
"The PS5 will run mostly at 10.2 TF but will drop a couple of % if it gets too hot"
What Xbox fanboys are saying:
"PS5 is only 9.2 TF but occasionally it will hit 10.2 TF once in a blue moon lol!"
But thankfully it looks like the board has moved on, so general gaming threads are back :)
Yep.. even wccftech confidently made a benchmark with 9.2 TF as the lower baseline lol..
 

Kumomeme

Member
So RDNA was 1.25x the performance of GCN per TFLOP, and I expect a similar gain from RDNA 1 to RDNA 2. If so, that gives the XSX performance similar to roughly 18 TFLOPs of GCN. Simplified, I know, but it gives a good indication of how well the PS5 and XSX will perform next gen. Add onto that ray tracing, mesh shading, VRS etc., and then the CPU jump, and next gen is gonna shine.
Imagine what some of the best devs will do with that kit.
We already saw what Naughty Dog and Santa Monica Studio did with the base PS4's 1.8 TF, a Jaguar CPU and an HDD... considering what Sony's first party has released, the TF number doesn't seem to matter much.
 

Neofire

Member
My problem is not that I'm not accepting that the PS5 is less powerful, TF-wise, than the XSX.
My problem is the FUD fanboys are spreading..
What Cerny said:
"The PS5 will run mostly at 10.2 TF but will drop a couple of % if it gets too hot"
What Xbox fanboys are saying:
"PS5 is only 9.2 TF but occasionally it will hit 10.2 TF once in a blue moon lol!"
But thankfully it looks like the board has moved on, so general gaming threads are back :)
Exactly, especially people saying that the "PS5 is so weak" that Sony should just scrap it and wait till 2022 😂🤣 What? So much FUD and other nonsense because the other console is 12 TF..
 

LordOfChaos

Member
Clocks are deterministic, not thermally bound, and both chips run near their peaks most of the time? When a power-constrained situation arises, switching is on the order of milliseconds rather than sustained downclocks? Almost like people either didn't watch the GDC talk or were willfully spreading FUD?


 

kbear

Member
Clocks are deterministic, not thermally bound, and both chips run near their peaks most of the time? When a power-constrained situation arises, switching is on the order of milliseconds rather than sustained downclocks? Almost like people either didn't watch the GDC talk or were willfully spreading FUD?


"Several developers speaking to Digital Foundry have stated that their current PS5 work sees them throttling back the CPU in order to ensure a sustained 2.23GHz clock on the graphics core."
 

LordOfChaos

Member
"Several developers speaking to Digital Foundry have stated that their current PS5 work sees them throttling back the CPU in order to ensure a sustained 2.23GHz clock on the graphics core."

This is more recent, direct from the horse's mouth, and they also say retail units have different boosting than dev kits, as dev kits should assume the worst.

If he's saying it outright, what are we doing here, assuming he's lying?

"There's enough power that both CPU and GPU can run at their limits of 3.5GHz and 2.23GHz, it isn't the case that the developer has to choose to run one of them slower."

"Regarding locked profiles, we support those on our dev kits, it can be helpful not to have variable clocks when optimising. Released PS5 games always get boosted frequencies so that they can take advantage of the additional power," explains Cerny.


^Right there
 

Genx3

Member
I agree. I think the developer will be more important than the gap between PS5 and XSX.

Absolutely.
The fact that Sony's 1st party games mostly aim for 30 FPS will mean their games still look a lot better than XSX 1st party games that mostly aim for 60 FPS.
 

Genx3

Member
One thing DF found was that Cerny was wrong about faster clocks running games better than more CUs.
They ran two different 5700s, one with 36 CUs and another with 40 CUs, then set the clocks so that they were both 9.6 TF cards.
The 40 CU card outperformed the 36 CU card even though it had slower clocks.
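For reference, the clocks needed to make a 36 CU and a 40 CU card both land on 9.6 TF fall straight out of the usual paper-FLOPs formula. A minimal sketch (DF's exact test clocks aren't quoted in this thread, so these are derived values):

```python
# Clock needed for a given TFLOP target: TFLOPs = CUs * 64 SPs * 2 FLOPs/clock * GHz / 1000
TARGET_TFLOPS = 9.6

def clock_for_tflops(cus: int, tflops: float) -> float:
    """GHz required for an RDNA part with `cus` CUs to hit `tflops` on paper."""
    return tflops * 1000 / (cus * 64 * 2)

print(f"36 CU card: {clock_for_tflops(36, TARGET_TFLOPS):.3f} GHz")  # ~2.083 GHz
print(f"40 CU card: {clock_for_tflops(40, TARGET_TFLOPS):.3f} GHz")  # ~1.875 GHz
# Same paper TFLOPs, yet the wider card won DF's test despite the lower clock.
```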
 

sinnergy

Member
One thing DF found was that Cerny was wrong about faster clocks running games better than more CUs.
They ran two different 5700s, one with 36 CUs and another with 40 CUs, then set the clocks so that they were both 9.6 TF cards.
The 40 CU card outperformed the 36 CU card even though it had slower clocks.
Of course, that's how it works and how it will play out! With ray tracing, even more so.
 

StreetsofBeige

Gold Member
One thing DF found was that Cerny was wrong about faster clocks running games better than more CUs.
They ran two different 5700s, one with 36 CUs and another with 40 CUs, then set the clocks so that they were both 9.6 TF cards.
The 40 CU card outperformed the 36 CU card even though it had slower clocks.
Interesting to hear. I'm not a techie or engineer.

So at face value, doing the math, I think most people who can at least interpret the math would assume it leads to the same performance, because the final number is the same (on paper).

With 16 more CUs, the Series X will blow away the PS5 regardless of its slower GPU speed.
 

HarryKS

Member
Absolutely.
The fact that Sony's 1st party games mostly aim for 30 FPS will mean their games still look a lot better than XSX 1st party games that mostly aim for 60 FPS.

What are you talking about? What games? The ones built on the Xbox One? The cross-gen ones?
 

sendit

Member
One thing DF found was that Cerny was wrong about faster clocks running games better than more CUs.
They ran two different 5700s, one with 36 CUs and another with 40 CUs, then set the clocks so that they were both 9.6 TF cards.
The 40 CU card outperformed the 36 CU card even though it had slower clocks.

Agreed. The PS5 is in for a rude awakening. Check out the massive difference here:

5700 vs 5700 XT

1 RDNA CU = 64 stream processors
RDNA TFLOPs = (CU_COUNT * 64 * 2 * GPU_CLOCK_GHz) / 1000

5700
36 CU
~2 GHz (Stock 1.53 GHz)
~9.2 TFLOPs @ 2 GHz

5700 XT
40 CU
~1.9 GHz (Stock 1.76 GHz)
~9.7 TFLOPs @ 1.9 GHz

Example 1 (Red Dead): [benchmark screenshot]

Example 2 (Tomb Raider): [benchmark screenshot]

Example 3 (AC): [benchmark screenshot]

Ref Video

Additional stuff for those that love TFlops:

Stock TFlop 5700: 7.2 Tflop
Stock TFlop 5700 XT: 9 TFlop

%Diff: 25%

PS5: ~10.28 TFlop
XSX: ~12.1 TFlop

%Diff: 17%

Difference is absolutely massive!
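The formula above is just paper math; here is a quick Python version of it for sanity-checking the figures quoted in this thread (the 5700-series clocks are the post's approximations; the XSX's 52 CUs at 1.825 GHz are its announced specs):

```python
# Paper TFLOPs for RDNA-style GPUs: CUs * 64 SPs * 2 FLOPs per clock * clock (GHz) / 1000.
def rdna_tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000

print(f"5700    @ 2.0 GHz  : {rdna_tflops(36, 2.0):.2f} TF")    # ~9.22
print(f"5700 XT @ 1.9 GHz  : {rdna_tflops(40, 1.9):.2f} TF")    # ~9.73
print(f"PS5     @ 2.23 GHz : {rdna_tflops(36, 2.23):.2f} TF")   # ~10.28
print(f"XSX     @ 1.825 GHz: {rdna_tflops(52, 1.825):.2f} TF")  # ~12.15
```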
 

Genx3

Member
What are you talking about? What games? The ones built on the Xbox One? The cross-gen ones?
I'm talking about PS4/Pro vs XB1/XB1X games. In most games XB 1st party aims for 60 fps whole Sony 1st party aims for 30 fps. There are a few exceptions but the majority of the games.
If that trend continues than Sony 1st party games on PS5 will look significantly better than XB 1st party games on XSX.
 

Ascend

Member
Agreed. The PS5 is in for a rude awakening. Check out the massive difference here:

5700 vs 5700 XT

1 RDNA CU = 64 stream processors
RDNA TFLOPs = (CU_COUNT * 64 * 2 * GPU_CLOCK_GHz) / 1000

5700
36 CU
~2 GHz (Stock 1.53 GHz)
~9.2 TFLOPs @ 2 GHz

5700 XT
40 CU
~1.9 GHz (Stock 1.76 GHz)
~9.7 TFLOPs @ 1.9 GHz

Example 1 (Red Dead): [benchmark screenshot]

Example 2 (Tomb Raider): [benchmark screenshot]

Example 3 (AC): [benchmark screenshot]

Ref Video

Additional stuff for those that love TFlops:

Stock TFlop 5700: 7.2 Tflop
Stock TFlop 5700 XT: 9 TFlop

%Diff: 25%

PS5: ~10.28 TFlop
XSX: ~12.1 TFlop

%Diff: 17%

Difference is absolutely massive!
The clearer things become, the weaker the PS5 will seem.
Compared to the PS4 Pro, it's a beast. Compared to the XSX, well, it's a disappointment, but still a viable machine.
 

S0ULZB0URNE

Member
This is more recent, direct from the horse's mouth, and they also say retail units have different boosting than dev kits, as dev kits should assume the worst.

If he's saying it outright, what are we doing here, assuming he's lying?

"There's enough power that both CPU and GPU can run at their limits of 3.5GHz and 2.23GHz, it isn't the case that the developer has to choose to run one of them slower."

"Regarding locked profiles, we support those on our dev kits, it can be helpful not to have variable clocks when optimising. Released PS5 games always get boosted frequencies so that they can take advantage of the additional power," explains Cerny.


^Right there
He said it initially as well.
Yet people are still going to troll and say otherwise.
 

HarryKS

Member
I'm talking about PS4/Pro vs XB1/XB1X games. In most games Xbox 1st party aims for 60 fps while Sony 1st party aims for 30 fps. There are a few exceptions, but that's the majority of the games.
If that trend continues, then Sony 1st party games on PS5 will look significantly better than Xbox 1st party games on XSX.

What games on Xbox One run at 60 fps? 1st party?
 

Gamerguy84

Member
DF putting in work trying to make the PS5 look bad, lol no bias.

I switched to NX Gamer after RDR2. They pointed out how rough it looked on Pro, while DF stated it was flawless 4K with perfect framerates.

That turned out to be a lie.
 

Clear

CliffyB's Cock Holster
One thing DF found was that Cerny was wrong about faster clocks running games better than more CUs.
They ran two different 5700s, one with 36 CUs and another with 40 CUs, then set the clocks so that they were both 9.6 TF cards.
The 40 CU card outperformed the 36 CU card even though it had slower clocks.

And once again, THIS IS A FALLACY, because when these tests are done on a PC the only thing that's changing is the internal configuration of the GPU. It's an entirely different scenario when the whole memory system and I/O stack is built in service of the principle of maximal resource occupancy.

These are processor elements: they take input data, transform it, and output it. Hence, when DF do these amateur-hour "tests", all that's indicated is the relative impact of changing these discrete processes in isolation, whereas the thrust of Cerny's approach is end-to-end optimization.
 
DF putting in work trying to make the PS5 look bad, lol no bias.

I switched to NX Gamer after RDR2. They pointed out how rough it looked on Pro, while DF stated it was flawless 4K with perfect framerates.

That turned out to be a lie.

So you're saying DF are MS shills and this 'NX Gamer' made up lies about PS4 Pro?
 

Ridley1

Neo Member
Most don't think in the time domain. A frame might be 16.6 ms, but the GPU is doing things every half nanosecond, so out of all the things that happen in one frame, nothing runs at maximum all the time. The CPU is not running all 16 threads for the hundreds of things it is doing in that one frame.

But posters like to say "if it's 3.5 GHz it's not 10.2 TF because ...." I don't understand.

The simple message is: when the PS5 needs 10.2 TF it will get it, and when the XSX needs 12.1 it will get it, but nothing is running at 100% all the time. Look at the below, Spider-Man activity for one frame.

You realise HZD only got hot running the map screen; simple jobs run too fast. This solves that problem, in theory.

The XSX would spin up its cooling on that HZD map screen if it got to run it; the PS5 will put a brake on the power for a frame or two, or whatever it needs, but deliver more of the PS5's grunt for running the gameplay. It's smart.

And pray do tell, at the start of a frame, how does the GPU know what clock rate it's going to need to finish it on time? Can it see into the future? Are they using Monster Cable?

For that reason, I'm fairly sure your rationale is wrong. From what you describe, you'd need some sort of PID controller for the clock rate? But even then the frame rate would be all over the place for the end user.
 
This fanboy nonsense from both sides is getting ridiculous. Infographics and mile-long posts about why one's favorite box is better than the other, when no one has seen a single side-by-side comparison.

The Series X will most probably be my multiplatform machine and the PS5 my exclusives machine, but I'm already starting to see the DF head-to-head articles with games looking fully identical and running at the same framerate: "THOSE LAZY DEVELOPERS NOT TAKING ADVANTAGE OF MY 1.7 TERAFLOPS REEEEE"

It's going to be Assassin's Creed Unity ad nauseam.

I hear you. But it does seem the Series X will dominate, likely leading to a PS5 Pro to compensate down the line. That being said, regardless of Halo, Sony will have the better games. As a pure multiplat system, you have to go with the Series X. If you have to choose one, you start with the PS5 and then get a Series X two years later. I'll be getting both! BLADOW! But what I'm really commenting about: where are these DF (or any) "head to head" game comparisons being shown? I haven't seen much of any gameplay outside of Minecraft and Gears. Please post!
 
Any exact specs for both GPUs? As in TMUs, ROPs..etc?
Rule of thumb for AMD GPU uArchs: TMUs = 4 * CUs, ROPs = 16 * shader arrays.

PS5 = 144 TMUs, 64 ROPs
XSX = 208 TMUs, 64 ROPs

Unless they have some custom upgrades. Sony has previously quadrupled Async Compute Engines, so it is possible.
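A tiny Python sketch of that rule of thumb (the 36 / 52 CU counts are the ones discussed in this thread; four shader arrays per console is an assumption implied by the 64-ROP figures above):

```python
# Rule-of-thumb TMU/ROP counts for AMD RDNA-style GPUs.
def tmus(cus: int) -> int:
    return 4 * cus                 # 4 texture units per CU

def rops(shader_arrays: int) -> int:
    return 16 * shader_arrays      # 16 ROPs per shader array

print(f"PS5: {tmus(36)} TMUs, {rops(4)} ROPs")   # 144 TMUs, 64 ROPs
print(f"XSX: {tmus(52)} TMUs, {rops(4)} ROPs")   # 208 TMUs, 64 ROPs
```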
 
One thing DF found was that Cerny was wrong about faster clocks running games better than more CUs.
They ran two different 5700s, one with 36 CUs and another with 40 CUs, then set the clocks so that they were both 9.6 TF cards.
The 40 CU card outperformed the 36 CU card even though it had slower clocks.
Of course, that's how it works and how it will play out! With ray tracing, even more so.
I find it crazy how everyone just eats up whatever Mark Cerny says as fact. Dude is a salesman.
Interesting to hear. I'm not a techie or engineer.

So at face value, doing the math, I think most people who can at least interpret the math would assume it leads to the same performance, because the final number is the same (on paper).

With 16 more CUs, the Series X will blow away the PS5 regardless of its slower GPU speed.
Xbox > PlayStation. Why can't they accept that MS did a better job this time around? This is ridiculous.
2 > 1, not 2 = 1 or 2 < 1.
 
Guys, we're all missing the point here. Maybe there are no "teraflops"? Maybe WE are the teraflops?! Maybe the "teraflops" are actually something INSIDE us! Did ANYONE take that into consideration? And if that is the case, the next gen isn't about the "consoles" or their "processing power", it's about the friendships we make on the road to launch!
 