
12.15 - 10.28 = 1.87 Teraflops difference between the XSX and PS5 (52 CUs vs. 36 CUs)

16 extra CUs at lower clock speeds = a cooler, more stable console

also when you think about it
PS5 CU = 0.285 TFLOPS / CU @ 2.23 GHz (when peaked)

XSX CU = 0.234 TFLOPS / CU @ 1.825 GHz (consistently available at all times)
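Those per-CU figures can be sanity-checked with the usual shader-math formula, assuming RDNA's 64 FP32 shaders per CU and 2 ops per clock (one fused multiply-add); the function name is just for illustration:

```python
# Peak FP32 throughput for an RDNA-class GPU:
# TFLOPS = CUs * 64 shaders/CU * 2 ops/clock (FMA) * clock in GHz / 1000
def peak_tflops(cus, clock_ghz):
    return cus * 64 * 2 * clock_ghz / 1000

ps5 = peak_tflops(36, 2.23)    # peak clock
xsx = peak_tflops(52, 1.825)   # fixed clock

print(f"PS5: {ps5:.2f} TF total, {ps5 / 36:.4f} TF per CU")
print(f"XSX: {xsx:.2f} TF total, {xsx / 52:.4f} TF per CU")
```

That works out to about 10.28 TF and 12.15 TF, i.e. roughly 0.285 and 0.234 TF per CU.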

When you overclock something that high, you'll get more power out of your hardware, but that doesn't mean it's better. Let's also not forget that the CPU steals power from the GPU when needed, and the GPU steals power from the CPU when it's needed. To maintain framerate the CPU will take precedence, so 4K becomes less achievable and variable resolution will start to kick in. Power stealing is a bottleneck in itself.

Does this make the PS5 a bad system? No. Does it make the XSX a better system? Yes. Both systems sound great, but I'm not really happy with Sony's messaging, as all they did was confuse the public with technical details. Xbox was very clear this time around and can carry the "World's Fastest, Most Powerful Console" slogan around for some time, which is an easy sell.
 
SSD magic doesn't beat raw power directly, but it's sure as hell going to be interesting to see how the first party titles utilize it. I have enough damn TFs on muh nvidia beast, I want more innovation and customization in my consoles. CERNY IS A CUSTOMISED LOVER, HE'S SLOW TO START BUT HE MAKES ME JUICIER AT FASTER CLOCKS.
 

sdrawkcab

Banned
It is pretty insane. They definitely squeezed as much as they could out of just 36CUs. I'd like to see the power consumption at 2.23GHz and I find it insane that Cerny says 2.23GHz can be sustained.
2.23 GHz cannot be sustained. Why? Common sense tells me this.

If it could be sustained, then why isn't the base clock speed 2.23GHz? Why is there even a consideration to mention "up to"?

Why didn't he mention the base clock speed? I would love to know what percentage of the time it actually runs at 2.23GHz. I'd say less than 40% of the time, but I'd be guessing.
 

StreetsofBeige

Gold Member
2.23 GHz cannot be sustained. Why? Common sense tells me this.

If it could be sustained, then why isn't the base clock speed 2.23GHz? Why is there even a consideration to mention "up to"?

Why didn't he mention the base clock speed? I would love to know what percentage of usage takes up the 2.23GHz. I say less than 40% of the time, but I'll be guessing.
Ya.

There was some fishy shit going on with that boost mode. I'm too lazy to find it, but I think DF explained it better than Cerny.

You'll never know what the usual combination of cpu + gpu speeds are being used because it depends on the game and what's going on at that moment where the cpu and gpu are adjusting clock speeds to balance each other out.

But by the sounds of it, the PS5 running at max 3.5 GHz CPU and 2.23 GHz GPU at the same time isn't happening. And if it magically does, not for very long. As you said, if it can be sustained, the base clock might as well be 2.23 GHz all the time.

It's like your internet provider saying download speeds are up to 1 Gbps. The chances of getting that all the time are zero.
 
I'll need someone like a tech vetted dev on GAF to elaborate more on this.

You don't need a dev, it's just how chips work. Power consumption grows much faster than linearly with frequency. I had to look this up because I'm not smart enough to remember it on a day-to-day basis, but the formula for power consumption is:

"P = a * C * (V*V) * f

Where P is power, C is capacitance, V is the voltage across the gate (typically, Vdd), f is the clock frequency and a is some constant. "

So the power increase per CU for upping clocks is going to be the voltage increase squared, multiplied by the frequency increase.

So a hypothetical 20% clock speed increase paired with a 20% voltage increase would give you roughly a 73% power draw increase. It's why Cerny stated that shaving only a few percent off the clock results in around a 10% reduction in overall power used.
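Plugging those hypothetical 20% numbers into the formula (treating a and C as fixed for a given chip) shows the compounding:

```python
# Relative dynamic power change when voltage and frequency scale:
# P = a * C * V^2 * f, so P_new / P_old = (v_scale ** 2) * f_scale
def power_ratio(v_scale, f_scale):
    return (v_scale ** 2) * f_scale

# Hypothetical: +20% clock paired with +20% voltage
increase = (power_ratio(1.20, 1.20) - 1) * 100
print(f"~{increase:.0f}% more power draw")  # ~73%
```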

It's possible the PS5 will end up drawing more power than the XSX despite having fewer TF. I have faith in whatever cooling solution Sony have come up with though.
 

Shmunter

Member
2.23 GHz cannot be sustained. Why? Common sense tells me this.

If it could be sustained, then why isn't the base clock speed 2.23GHz? Why is there even a consideration to mention "up to"?

Why didn't he mention the base clock speed? I would love to know what percentage of usage takes up the 2.23GHz. I say less than 40% of the time, but I'll be guessing.
I’ve been thinking about this, and I’ve watched the presentation twice to get a grip on the information overload and it’s making more sense.

Let’s look at how a GPU works today. You play 2 very different games: one spins the crap out of your fan, the other is a whisper. Why does that happen? Because the GPU is under more load in one than the other. We all know that: more load = more power needed. Yet the GPU still runs at a fixed clock of x MHz.

But why is the GPU clock fixed when the silent game, for argument's sake, is only using 50% of the capacity? I believe the theory is that you can downclock the GPU 50% for this game and retain exactly the same performance for the user.

If you’re able to downclock the GPU, you essentially now have a predictable amount of electricity in reserve that you can direct to boosting a different component e.g. the cpu.

I think I like it, it’s about letting devs harness the console in ways that works best for their individual project. Pushing its baseline out to areas that benefit them, not just a static canvas that may never be fully utilised in one corner, but over used in another.

The theory is sound. A higher starting baseline with this technique would always be better, but the concept is valid in all scenarios.
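A toy sketch of that reserve-and-redirect idea under a fixed power budget. All the wattages and the cubic power model here are invented for illustration; real hardware uses on-chip activity counters, not anything this simple:

```python
# Toy model: CPU and GPU share one fixed power budget.
TOTAL_BUDGET_W = 200.0

def gpu_watts(clock_ghz, max_clock=2.23, max_watts=180.0):
    # Crude cubic model: power ~ f^3 once voltage tracks frequency.
    return max_watts * (clock_ghz / max_clock) ** 3

# If a light workload lets the GPU drop its clock, the freed
# watts become a predictable reserve for boosting the CPU.
for gpu_clock in (2.23, 2.0, 1.8):
    reserve = TOTAL_BUDGET_W - gpu_watts(gpu_clock)
    print(f"GPU @ {gpu_clock:.2f} GHz -> {reserve:5.1f} W spare for the CPU")
```

The point is the predictability: the reserve depends only on the clocks chosen, not on the room temperature.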
 

Aidah

Member
The main reason for using 36 instead of something bigger is to reduce cost. However here are some of the reasons they think the reduced GPU hardware cost is worth it, and how it's doable:

1. They are implementing variable frequency and pushing GPU frequency really high, with the claim that, at the same teraflop capability, a smaller GPU with a higher clock will give better overall real world performance than a bigger GPU with a lower clock.

2. They are implementing variable frequency in a non-typical way. Instead of basing it on thermals, they're basing it on power usage. That way performance will be consistent across units, since it won't be affected by the environment they're in, and from a dev perspective, situations are repeatable and consistent.

3. In situations where the CPU is being stressed to the max but the GPU isn't, or vice versa, the power split can be shifted to one part more than the other so it can sustain the highest clock possible, while the other part gets downclocked without much impact to performance, since it wasn't being fully stressed anyway, freeing up unneeded power.

4. In the worst case scenario where both GPU and CPU are being fully stressed and a downclock is required, a small downclock results in a relatively big reduction in power usage, so the downclock should be minimal.
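Point 4 follows from the same dynamic-power formula quoted earlier in the thread: if voltage has to track frequency, power falls off roughly with the cube of the clock, so a small downclock buys a disproportionate saving. A rough sketch (the cubic model is a simplification; near the top of the voltage/frequency curve the real-world saving is even steeper):

```python
# Approximate dynamic power vs. clock, with voltage scaling with frequency:
# P = a*C*V^2*f, and V ~ f gives P ~ f^3.
def relative_power(f_scale):
    return f_scale ** 3

for cut in (0.02, 0.03, 0.05):
    saved = (1 - relative_power(1 - cut)) * 100
    print(f"{cut:.0%} clock cut -> ~{saved:.0f}% power saved")
```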

5. The more CUs you have, the harder it gets to use them very efficiently.

6. In terms of thermals and noise management, they've commented on how current gen cooling wasn't adequate and that it would be better this time around, with the details on cooling being revealed at the teardown. So, at the very least it should be better than last time.

In theory, this sounds very smart. However, real world performance is the real judge and it will be very interesting to find out. If this does work out well then yes, a 52CU Xbox that uses this same idea would perform better than a 52CU Xbox that doesn't, i.e. it would give more bang for buck.

CPU performance comparison is a lot more straightforward between the two consoles. With multithreading enabled, the difference is less than 3%, which is insignificant. However, in the worst case scenario the difference could grow to, say, 10 or 12 percent. Xbox also has a 3.8 GHz CPU clock without multithreading, which could be useful during the transition period to next gen.

Now, all this is purely hardware, and theoretical from our perspective. Software and software features will play just as significant a role in real world performance and perceived quality; Sony hasn't talked about VRS or using AI, for instance, where Microsoft has. Xbox could also possibly have a significant advantage in hardware ray tracing capability. Again, given the difference in approach, real world performance is going to be the real judge.

Now, going back to the beginning, the reason is to reduce GPU cost. However, this doesn't necessarily mean that the PS5 itself is going to be cheaper than the Xbox, as Sony have invested much more in storage capability and audio, and possibly other things we don't yet know about, like controller features or VR-related hardware being built in, etc.

So basically, in terms of hardware, seems like, with the above considered, Sony thinks a much faster storage system and innovation in audio will be a bigger game changer (and it very possibly could be), where Microsoft thinks a bigger GPU, possibly significant advantage in ray tracing performance, and slightly faster CPU, with locked clocks is the way to go.

Personally, from a hardware perspective and being mainly a PC gamer, I'm currently finding the PS5 more interesting than the Xbox. The reason is that the Xbox is more straightforward and just seems like a worse version of the PC I'm building later this year, in every way, whereas the PS5 might actually provide hardware capability in some regard that I won't find on PC for at least a while, like the audio engine or the ridiculously fast storage system, maybe the controller features and whatever else.

As a side note, I'm finding the difference in approach between the two this time very interesting, however I wonder if the significant difference will result in neither of the two being fully utilized.

Alright, I'm done taking a shit. Time to end this.
 

Azurro

Banned
2.23 GHz cannot be sustained. Why? Common sense tells me this.

If it could be sustained, then why isn't the base clock speed 2.23GHz? Why is there even a consideration to mention "up to"?

Why didn't he mention the base clock speed? I would love to know what percentage of usage takes up the 2.23GHz. I say less than 40% of the time, but I'll be guessing.

This is the worst part of the talk yesterday. Not only is the PS5 an underwhelming system, you have people like the one I'm quoting speaking with no idea of what they are talking about.

The system architect mentions the typical performance is 10.3 TF. The great majority of the time you don't need the CPU and GPU running at full blast, and if that situation does arise, then it will downclock by a few percentage points, it's not going to be downclocked to 9 TF.

The whole point of this is to have a consistent power envelope for more reliability that's easier to cool and power can go to where it needs to go.

The system is just underpowered in the number of CUs and bandwidth.
 

Dane

Member
Ya.

There's was some fishy shit going on with that boost mode. I'm too lazy to find it but I think DF explained it better than Cerny.

You'll never know what the usual combination of cpu + gpu speeds are being used because it depends on the game and what's going on at that moment where the cpu and gpu are adjusting clock speeds to balance each other out.

But by the sounds of it, PS5 running at max 3.5 ghz cpu and 2.23 ghz gpu at the same time isn't happening. And if it magically does, not for very long. As you said, if it can sustained the base clock might as well be 2.23 ghz all the time.

It's like your internet provider saying download speeds are up to 1gb. Chances of getting that all the time is zero.

Reducing to around 2.1 GHz would end up at roughly 9.68 TFLOPS. I wouldn't be surprised if the sustained clock is at very best 2.2 GHz, for about 10.1 TF, maybe even slightly less than that.
 

Sacred

Member
A whole PS4 of a difference!


Cerny confirmed to Richard from DF that it's always at 10.28tf.

No, this is the max boost that was talked about in the presentation. When under full load, to keep from overheating, the boost MUST be managed, which is why they put the word "variable" under the GPU specs.
 

Shmunter

Member
Ya.

There's was some fishy shit going on with that boost mode. I'm too lazy to find it but I think DF explained it better than Cerny.

You'll never know what the usual combination of cpu + gpu speeds are being used because it depends on the game and what's going on at that moment where the cpu and gpu are adjusting clock speeds to balance each other out.

But by the sounds of it, PS5 running at max 3.5 ghz cpu and 2.23 ghz gpu at the same time isn't happening. And if it magically does, not for very long. As you said, if it can sustained the base clock might as well be 2.23 ghz all the time.

It's like your internet provider saying download speeds are up to 1gb. Chances of getting that all the time is zero.


Both max boosts simultaneously ain’t happening, but one can be sustained at max while the other is at min. It’s about dynamic clocks with a consistent power draw: the power is always level and the heat profile is capped.

I'm also convinced this is entirely under developer control, not some auto-throttling like on a phone.
 

Arkam

Member
Cerny is of no consequence to the power requirements, thermal implications, and noise output of pushing frequencies that high.

He got out-engineered day and date by Microsoft employees, something people said was impossible. The man isn't a god.
Out-engineered? Get the fuck out of here with that absolute garbage statement. He made choices (in a vacuum) and was incorrect about what the competition was doing. Out-engineered, lol.
 

sdrawkcab

Banned
This is the worst part of the talk yesterday. Not only is the PS5 an underwhelming system, you have people like the one I'm quoting speaking with no idea of what they are talking about.

The system architect mentions the typical performance is 10.3 TF. The great majority of the time you don't need the CPU and GPU running at full blast, and if that situation does arise, then it will downclock by a few percentage points, it's not going to be downclocked to 9 TF.

The whole point of this is to have a consistent power envelope for more reliability that's easier to cool and power can go to where it needs to go.

The system is just underpowered in the number of CUs and bandwidth.
Everything you said is correct! And it fully reinforces my point, and the point of almost everyone else: THE PS5 IS THE WEAKER SYSTEM! What's there to debate?
 

ZywyPL

Banned
Now MS ups the clock of the GPU and drops the mic.

Aww man, imagine the XSX being OCed to 2 GHz, reaching that sweet 13.3 TF PS5 fanboys were dreaming about before the actual reveal ;D


2.23 GHz cannot be sustained. Why? Common sense tells me this.

If it could be sustained, then why isn't the base clock speed 2.23GHz? Why is there even a consideration to mention "up to"?

Why didn't he mention the base clock speed? I would love to know what percentage of usage takes up the 2.23GHz. I say less than 40% of the time, but I'll be guessing.

If you think about it, games like CoD, BF, FIFA, fighting games, sports games, and racing games, basically everything that already runs at 60FPS on that weak-ass Jaguar, will do way more than fine if the CPU is downclocked to its minimum value, allowing the GPU to run at a constant 2.2 GHz. Even at 3 GHz we are talking about a ~4x more capable CPU than what the devs got this gen, even more so with multithreading. Conversely, games that do require that 3.5 GHz CPU clock will most definitely implement dynamic resolution to keep the framerate up whenever the GPU's 9 TF or so is insufficient for native res.
 

sdrawkcab

Banned
I’ve been thinking about this, and I’ve watched the presentation twice to get a grip on the information overload and it’s making more sense.

Let’s look at how a GPU works today. You play 2 very different games, one spins the crap out of your fan, the other is a whisper. Why does that happen? Because the GPU is under load more in one than the other. We all know that, more load = more power needed. The GPU is still a fixed clock at x MHz.

But why is the gpu clock fixed when the silent game e.g. is only using 50% of the capacity for arguments sakes. I believe the theory is that you can downclock the GPU 50% for this game and retain exactly the same performance to the user.

If you’re able to downclock the GPU, you essentially now have a predictable amount of electricity in reserve that you can direct to boosting a different component e.g. the cpu.

I think I like it, it’s about letting devs harness the console in ways that works best for their individual project. Pushing its baseline out to areas that benefit them, not just a static canvas that may never be fully utilised in one corner, but over used in another.

The theory is sound, a higher starting baseline with this technique is always better, but the concept is valid in all scenarios.
No one is arguing against the concept. The concept has been working for at least 15 years, with CPUs and GPUs. What are you people talking about? We're not talking about whether this works or not; of course it works! It's been implemented for years! The argument (and there shouldn't even be one!) is that the PS5 is significantly less powerful than the Xbox Series X, that's all. It's not the more powerful system, due to the compromises Sony made. No one is arguing that those compromises don't work. But to sit there and say, or even suggest, that those compromises/trade-offs/sacrifices don't make the system weaker than the Series X is absurd!
 

Azurro

Banned
Everything you said is correct! And it fully reinforces my point, and the point of almost everyone else; THE PS5 IS THE WEAKER SYSTEM! What's their to debate?

The debate is that, yes, the PS5 is an underwhelming machine with underpowered specs. But people shouldn't be misrepresenting what the machine is doing, or presenting disingenuous arguments that amount to children going "neener neeneeeeer" in the playground.

The real question is, will this box that is roughly 20 to 25% less powerful be good enough for a next generation leap? How good will 4K support be? Will the performance be similar to the new Xbox with slightly lower graphical settings? Or will it be a 60 fps vs 30 fps difference?

There's enough stuff to be annoyed and disappointed about without the misinformation going around.
 

Shmunter

Member
No one is arguing against the concept. The concept has been working for at least 15 yrs, with CPUs and GPU. What are you people talking about? We're not talking about whether this works or not; of course it works! It's been implemented for years! The argument is (which there shouldn't be!), is that the PS5 is significantly less powerful than the Xbox Series X, that's all. It's not the more powerful system, due to the compromises Sony made. No one is arguing that those compromises don't work. But to sit there and say or even suggest that those compromises/trade offs/sacrifices don't make the system weaker than the Series X is absurd!
I replied to your post on sustained clocks, and then you replied with some unrelated mumbo jumbo. Did you lose track of the conversation? Go have a nap and come back later when you're fit for posting.
 

sdrawkcab

Banned
The real question is, will this box that is roughly 20 to 25% less powerful be good enough for a next generation leap? How good will 4K support be? Will the performance be similar to the new Xbox with slightly lower graphical settings? Or will it be a 60 fps vs 30 fps difference?

The "real" question? Really? Of course it's good enough. It's vastly more powerful than the PS4 Pro or Xbox One X! I don't think any rational human being thinks otherwise. And if you have to convince someone that it's quite the console for a new gen, then you'll be wasting your time, as that person clearly cannot be reasoned with.

But that's the point of this thread, isn't it? This is a comparison between two great consoles. And between them, one is less powerful than the other. I don't even know why it's a debate. Honestly, I love trolling people, and I push their buttons because they let me, but this really isn't that serious to me. But I'd be lying if I said I don't find the strawman arguments being used by Sony fans incredibly amusing. Moving goalposts, or changing the narrative to evade the facts, is... it's downright funny.

"I think the PS5 is the more balanced system, hear me out...",
"I've been doing some research, and Mark Cerny made the more powerful system" ,
"The difference is only 18%, tops",
Complete denial and delusion, at best!
 

sdrawkcab

Banned
I replied to your post on sustained clocks, and then you reply with some unrelated mumbo jumbo. Did you lose tract of the conversation? Go have a nap and come back later when your fit for posting.
Go tell me what the base clock is, please? You don't know, do you.

Oh, and Mark saying they can "maintain that performance most of the time" means nothing to me. Ask him to define "most of the time". Give me percentages, Mark. And in essence ONLY one component can sustain its clock "most of the time", because both can't operate at boost simultaneously. So, which is it? Does the GPU sustain its boost clock most of the time (which means the CPU runs at its base clock most of the time)? Or does the CPU sustain its boost clock most of the time, which means the GPU runs at its base clock most of the time?

Either way, ONE of those things will never be performing optimally, constantly.

Give me a break. I don't know about you, but common sense seems to be seriously lacking in this place.
 

Neur4lN01s3

Neophyte
So, conclusion is:

The PS5 has faster loading times, but will run 4K (or sub-4K, because 4K is "supported" rather than guaranteed native) games at 30 fps (9 TF overclocked to 10 when underclocking the CPU; slower CPU, slower memory bandwidth).
The XSX has slightly slower loading times, but will run native 4K games at 60 fps (12 TF sustained, faster CPU, faster memory access for 10 GB of the pool on the GPU side).

Well, I choose better graphics (resolution, assets, framerate) over faster loading times.
 

Shmunter

Member
Go tell me what the base clock is, please? You don't know, do you.

Oh, so because Mark says that they can "maintain that performance most of the time" means nothing to me. Ask him to define "most of the time". Give me percentages Mark. And, in essence ONLY one component can sustain that clock "most of the time", because both can't operate at boost simultaneously. So, which is it? Is it that the GPU sustains its boost clock most of the time (which means the CPU runs at its base clock most of the time)? Or is it that the CPU sustains its boost clock most of the time, which means the GPU runs at its base clock most of the time?

Either way, ONE of those things will never be performing optimally, constantly.

Give me a break. I don't know about you, but common sense seems to be seriously lacking in this place.
It’s almost like you sort of kinda read my post but not really??
 

SleepDoctor

Banned
Can mods make a "PS5 spin OT" and just merge all this FUD into one?

It's really just clogging up the first page with FUD, and some of these guys have terrible writing and reading comprehension. Couple that with not even understanding the hardware.
 

GymWolf

Member
Little do you all see the hidden gem is the 3d audio. With the 3d audio the system can use sound waves to detect silicon particle positioning in the SSD without transmitting any power through the bus to CPU and GPU. The tflops argument becomes null as the console can theoretically be turned off while the 3d audio sound is running in standby mode. In turn it can create billions of micro vibrations to emulate virtual CUs on the cohesion engine plane, therefore boosting the virtual clock speeds to over 3.6Ghz with the machine turned off.
 

Fun Fanboy

Banned
Don't the extra CUs that the Xbox has also help with that 44% better ray tracing power they were talking about? Or was that something else?
 

wintersouls

Member
As I have read elsewhere...

Yesterday's talk was the one addressed to the experts from the GDC sector, so there were no games to watch, much less design. It was slow and boring at times.

Understand for once that yesterday was a talk for engineers and programmers, not for players.


For those who did not understand it and only looked at the TFs, it fell flat. For those who understood it, and then went to Eurogamer to read everything laid out more simply and condensed, they loved it.
 

Journey

Banned
I cannot believe that Cerny and his team were able to achieve 10.28 Teraflops with just 36 CUs!

The XSX needed 52 CUs (16 additional CUs) to achieve an additional 1.87 Teraflops, which is not much of a difference when you really think about it. I guess one of the senior engineers at Microsoft really loves Blink-187 and wanted to pay homage to them in a way.

The power consumption of the XSX will be massive because of its heavy reliance on 52 CUs, while the PS5 was cost- and power-efficient with just 36 CUs.

This thread is to start a conversation about whether the XSX's 52 CUs were an efficient way to achieve that additional 1.87 Teraflops gain.


The problem is that it's not consistent, and there's no data right now that tells us what the average TF number will be in the real world for the PS5. The Xbox Series X is ALWAYS running at 12.155 TF without breaking a sweat, while the PS5 can only achieve its max 10.28 at its peak frequency, which it has already been stated won't be held all the time. So the average TF number for the PS5 might be 9.5 TF, while the average TF for the XSX remains at 12.155, since it never changes.


When DF or others run the numbers, we could be looking at something like this:

PS5 average TF performance = 9.2 TF with an occasional spike going up to 10.2
XSX average TF performance = 12.155 consistently.
 

Journey

Banned
Lol, it's not enough that they have a 12.1 vs 10.3 scenario; Xbox fans have to spread even more FUD. Are you guys still that worried about the PS5? Cerny said that the system will be running constantly in Boost Mode, due in large part to their cooling system being able to handle that kind of heat. 10.3 TFLOPS isn't just a rare occurrence. That is what it will run at the vast majority of the time. And it will only downclock slightly if the CPU is demanding more.


So not only does Xbox Series X have a CPU running at a higher frequency than PS5, but it will NEVER downclock the GPU no matter how demanding the CPU load is :pie_thinking:

We have a clear winner here boys!
 

Neur4lN01s3

Neophyte
  • PS5 CPU up to 3.5 GHz when GPU is underclocked - LOSER
  • XSX CPU @ 3.8 GHz sustained, 3.6 GHz with SMT - WINNER
  • PS5 SSD up to 8-9 GB/s when compressed - WINNER
  • XSX SSD up to 4.8 GB/s when compressed - LOSER
  • PS5 SSD 825 GB - LOSER
  • XSX SSD 1000 GB - WINNER
  • PS5 GPU up to 10.3 TFlops when CPU is underclocked - LOSER
  • XSX GPU 12.1 TFlops sustained - WINNER
  • PS5 GPU bandwidth 448 GB/s - LOSER
  • XSX GPU bandwidth 560 GB/s - WINNER
  • PS5 CU count (and ray tracing units) 36 - LOSER
  • XSX CU count (and ray tracing units) 52 - WINNER
  • PS5 low latency mode: NO - LOSER
  • XSX low latency mode: YES - WINNER
  • PS5 fan noise TBA (higher clocks can lead to a very noisy fan)
  • XSX fan noise TBA
 

Shmunter

Member
So...how much are you willing to pay for the weaker system?

I'll pay $399, tops. You?
Here's my take: the PS5 brings more to the next-gen table based on what’s been presented.

The extra 2 TF on the XSX does not make up for the innovations and potential that the PS5 has for future gaming. The XSX is not much more than a nice graphics card and will have trouble keeping up with games that are built around the PS5 architecture, but not the other way around. There is also the matter of VR as a point of difference, which I won’t go into.

Any PlayStation fans initially disappointed by the specs will soon realise that not only should they not be disappointed, but they should rejoice by what is undoubtedly a landmark console that will smash all records next gen and provide the best gaming experiences they have ever imagined. Analysts will soon catch on.

So to answer your question I will not pay $399 for XsX, but I will easily pay $499 for a PS5.
 

sdrawkcab

Banned
As I have read elsewhere...

Yesterday's talk was the one addressed to the experts from the GDC sector, so there were no games to watch, much less design. So it was slow and boring at times.

Understand for once that yesterday was a talk for engineers and programmers not for players


Those who did not understand it and only looked at the TFs, it is a landmark. For those who understood it and then went to Eurogamer to read everything exposed something more simple and concentrated, they loved it
Oh, you read this elsewhere? (Couldn't figure it out on your own, huh?)

Please, tell me more.
 

sdrawkcab

Banned
Heres my take, the PS5 brings more to the next gen table based on what’s been presented.

The extra 2 TF on the XsX does not make up for the innovations and potential that PS5 has for future gaming. The xsx is not much more than a nice gfx card and will have trouble keeping up with games that are built round the ps5 architecture, but not the other way around. There is also the matter of VR as a point of difference which I won’t go into.

Any PlayStation fans initially disappointed by the specs will soon realise that not only should they not be disappointed, but they should rejoice by what is undoubtedly a landmark console that will smash all records next gen and provide the best gaming experiences they have ever imagined. Analysts will soon catch on.

So to answer your question I will not pay $399 for XsX, but I will easily pay $499 for a PS5.
You sound like the type of guy that, if you walk in on your wife in bed with your best friend, and they tell you they were planning a surprise party, you'd show up to that party, expecting cake...
 

Stuart360

Member
Heres my take, the PS5 brings more to the next gen table based on what’s been presented.

The extra 2 TF on the XsX does not make up for the innovations and potential that PS5 has for future gaming. The xsx is not much more than a nice gfx card and will have trouble keeping up with games that are built round the ps5 architecture, but not the other way around. There is also the matter of VR as a point of difference which I won’t go into.

Any PlayStation fans initially disappointed by the specs will soon realise that not only should they not be disappointed, but they should rejoice by what is undoubtedly a landmark console that will smash all records next gen and provide the best gaming experiences they have ever imagined. Analysts will soon catch on.

So to answer your question I will not pay $399 for XsX, but I will easily pay $499 for a PS5.
Oh boy, those DF game comparison vids can't come soon enough; it's going to be Biblical.
 

Jonsoncao

Banned
Heres my take, the PS5 brings more to the next gen table based on what’s been presented.

The extra 2 TF on the XsX does not make up for the innovations and potential that PS5 has for future gaming. The xsx is not much more than a nice gfx card and will have trouble keeping up with games that are built round the ps5 architecture, but not the other way around. There is also the matter of VR as a point of difference which I won’t go into.

Any PlayStation fans initially disappointed by the specs will soon realise that not only should they not be disappointed, but they should rejoice by what is undoubtedly a landmark console that will smash all records next gen and provide the best gaming experiences they have ever imagined. Analysts will soon catch on.

So to answer your question I will not pay $399 for XsX, but I will easily pay $499 for a PS5.

Why are all these posters emerging to make playstation fans look bad?
 

Shmunter

Member
You sound like the type of guy that, if you walk in on your wife in bed with your best friend, and they tell you they were planning a surprise party, you'd show up to that party, expecting cake...
You sound like a guy that needs to look up some jokes because that’s the lamest thing I’ve read all night, and I’ve seen some real lame shit today.
 

Journey

Banned
PS5 vs Xbox Series X - The TRUTH about which is Faster! | The Tech Chap




You know what's funny? He emphasized what Mark Cerny said about the CUs in the PS5 being 62% larger because they're based on RDNA 2 vs. the PS4's CUs, making them equivalent to 58 PS4 CUs.


BUT....

Xbox Series X CUs are also based on RDNA 2, making the Xbox Series X equivalent to 84 PS4 CUs :messenger_tears_of_joy:

It's a loop, no matter how you slice it, Xbox Series X is superior, welcome to 2013 all over again!
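Taking the quoted 62% per-CU uplift at face value, the same conversion applied to both chips checks out:

```python
# Cerny's claim: one RDNA 2 CU does ~62% more work than a PS4-era GCN CU.
UPLIFT = 1.62

print(f"PS5: 36 CUs -> ~{36 * UPLIFT:.0f} PS4-equivalent CUs")  # ~58
print(f"XSX: 52 CUs -> ~{52 * UPLIFT:.0f} PS4-equivalent CUs")  # ~84
```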
 

Roxkis_ii

Member
You know what's funny? he emphasized what Mark Cerny said about the CUs in PS5 being 62% larger because they're based on RDNA2 vs the PS4 CU's making them equivalent to 58 PS4 CUs.


BUT....

Xbox Series X CUs are also based on RDNA 2 making Xbox Series X equivalent to 84 PS4 CUs :messenger_tears_of_joy:

It's a loop, no matter how you slice it, Xbox Series X is superior, welcome to 2013 all over again!
So how did being the most powerful console work out for the Xbox One X? Oh.....


 