
Dusk Golem reiterates that the Xbox will be more powerful than the PS5 (Admitted to starting console wars, demodded)

Reindeer

Member
Don't know why people are getting emotional over this. I mean, it's common sense: unless a game specifically takes advantage of the extra throughput of the PS5's SSD, games should always look better on Series X (if devs give both systems equal attention), because it's the more powerful system. It's as simple as that.
 

KingT731

Member
"People see what they want to see" - nicy try kid, but I'm not falling for that one - 8.5TF+30% = 11TF, so I don't know, maybe you simply suck at math, or you are indeed frighted, uncertain, and doubtful about your beloved piece of plastic of choice and see everyone/everything as a personal attack, but I don't give a flying fuck about such shitty persons, welcome to my ignore list kid, and get a life it it hurts you so much. Bye.
What's 30% of 12.15? It's 3.645. Now what's 12.15 - 3.645? 8.505
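Spelled out, since the two sides keep picking different baselines (figures from the posts above):

```python
# +30% on top of 8.5 TF and -30% off of 12.15 TF are different claims,
# because percentage gaps aren't symmetric.
print(8.5 * 1.30)          # 11.05 -> "8.5 TF + 30% = ~11 TF"
print(12.15 * (1 - 0.30))  # 8.505 -> "30% less than 12.15 TF = ~8.5 TF"
```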
 

T-Cake

Member
It’s all gonna look worse then a pc anyways if you care so much about power.
Here's a fun blind test, lets label these, I bet everyone will have a hard time telling which is which

[image: 8TP7gD7.png]

The Xbox One X is on the left because the grass is way sharper.
 

SlimySnake

Flashless at the Golden Globes
This is troubling to say the least, and something I had anticipated when I heard about Sony going narrow and fast. A 16 CU difference was never going to save them much money, and the high clocks drive up the costs of the cooling solution. The SSD costs seem to be biting them in the ass.

The BOM math doesn't add up though; the PS5 should be cheaper at the very least, but who knows at this point. GitHub was mostly true. Anything can happen.

I still don't know how an 18% difference in tflops and a 25% difference in VRAM bandwidth can get you a 4K checkerboard vs. native 4K difference between games. That's 100% more pixels. Even if the Xbox fans are right and the PS5 is 9.2 or even 8 tflops, that's still only a 50% difference. For a 100% difference, the PS5 must be RDNA 1.0 or, worse, GCN-based; it must have variable clocks that can't even hit 8 tflops consistently, let alone 10. It must have severe RAM bottlenecks caused by 3D audio. It must have software-based ray tracing that eats a lot of shader power. The power budget must be lower than the Xbox's, which means only certain low-power instructions can run the game at 10 tflops, while the rest, presumably the ones devs actually use, simply don't have enough power and get downgraded to 5-6 tflops.

All of that has to happen for us to see a 4K CB vs. native 4K situation. I'm not buying it. Sounds like wishful thinking from Xbox fans to me.
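The pixel math, for anyone who wants to check it (assuming checkerboard shades about half the samples of native per frame):

```python
# Native 4K vs. 4K checkerboard, in shaded samples per frame.
native_4k = 3840 * 2160           # 8,294,400
checkerboard_4k = native_4k // 2  # ~4,147,200 (approximation)

print(native_4k / checkerboard_4k)  # 2.0   -> "100% more pixels"
print(round(12.15 / 10.28, 3))      # 1.182 -> the ~18% tflops gap
```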
 

Journey

Banned
Exactly! Like, come on, bro! No game has been shown on the Xbox Series X dev kits yet, but I'm supposed to believe the gap between the two is so big that the PS5 can barely do 1080p while the XSX can easily do 4K?!



Percentages ALWAYS matter. It's how you honestly measure the difference between things.



Not necessarily, especially when it comes to memory bandwidth, for example. Say you have a 30-gallon tank to fill: one person has a bucket that holds 10 gallons of water and the other guy has a bucket that holds 9 gallons. See where I'm going? 9 vs. 10 is tiny, but three trips later the guy with the 10-gallon bucket has finished his job while the other guy still has an entire trip to run. In other words, you're looking at just the TF difference without accounting for all the other inner workings of the graphics pipeline. 560GB/s vs. 448GB/s can make a difference at higher resolutions, especially 4K.
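To put numbers on the bucket analogy (the 8 GB working set is just an arbitrary illustration, and these are peak figures, not sustained):

```python
# Time to move a fixed chunk of data at each console's peak bandwidth.
data_gb = 8.0
for name, bw_gbs in [("XSX", 560.0), ("PS5", 448.0)]:
    print(f"{name}: {data_gb / bw_gbs * 1000:.1f} ms")
# XSX: 14.3 ms
# PS5: 17.9 ms
```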
 
I'd rather take "fake" 4K with reconstruction and upscaling techniques than "real" 4K that needlessly wastes processing power on raw pixel density, tbqh. It'd spare system power for other tasks related to asynchronous programming models, which would have more of an impact on the actual gameplay.

So with that being the case, I'm hoping MS isn't trying to pursue native 4K in their games or mandating it for third parties; I think RE3 Remake showed that native is not always the best choice when you can stabilize/improve performance by dropping the native resolution and upscaling instead. Native 4K won't really be required for games until 8K displays become the household standard, but that is DEFINITELY like 3-4 years away from now.

This is troubling to say the least, and something I had anticipated when I heard about Sony going narrow and fast. A 16 CU difference was never going to save them much money, and the high clocks drive up the costs of the cooling solution. The SSD costs seem to be biting them in the ass.

The BOM math doesn't add up though; the PS5 should be cheaper at the very least, but who knows at this point. GitHub was mostly true. Anything can happen.

I still don't know how an 18% difference in tflops and a 25% difference in VRAM bandwidth can get you a 4K checkerboard vs. native 4K difference between games. That's 100% more pixels. Even if the Xbox fans are right and the PS5 is 9.2 or even 8 tflops, that's still only a 50% difference. For a 100% difference, the PS5 must be RDNA 1.0 or, worse, GCN-based; it must have variable clocks that can't even hit 8 tflops consistently, let alone 10. It must have severe RAM bottlenecks caused by 3D audio. It must have software-based ray tracing that eats a lot of shader power. The power budget must be lower than the Xbox's, which means only certain low-power instructions can run the game at 10 tflops, while the rest, presumably the ones devs actually use, simply don't have enough power and get downgraded to 5-6 tflops.

All of that has to happen for us to see a 4K CB vs. native 4K situation. I'm not buying it. Sounds like wishful thinking from Xbox fans to me.

I think the misconception comes from the fact that, when people discuss these differences in relation to Series X's advantages, suddenly paper specs are the only thing that matters. But if it were the inverse, we're told paper specs aren't all that matters, and the discussion quickly shifts to things well outside the GPU paper specs. It's a bit of a double standard tbh.

The truth is, there may be custom architecture changes to Series X that give it a bigger performance gap than the paper specs convey: a larger L3 cache on the GPU (it naturally has more due to having more CUs, but they could've increased the cache even further than normal), mip-blending hardware to facilitate SFS, GPU customizations to facilitate a greatly expanded ExecuteIndirect, an increase over the normal ROP count, whatever hardware they've implemented for DXR on the GPU, larger L2 caches on the CUs, or other custom silicon that might enable the expanded compute performance for INT8 and INT4 calculations (these may not be done completely generically on the CUs). Those could be things he knows about, and customizations like that, at enough of a scale, would produce a notable performance gap between the two systems beyond what the paper specs (which only really cover the most general things, like CPU clock, GPU clock, and CU count) convey.

I guess we'll see in due time; we've only got four more days to find out.
 

pawel86ck

Banned
I do not think you understand how this technology works. Just because a game runs with an unlocked frame rate does not imply that the APU needs to be throttled (however minimally) to compensate. Just as an example, a GPU's stream processors often lie idle during rendering, in periods referred to as 'bubbles' or pipeline stalls. Asynchronous compute can help fill those bubbles, but based on Cerny's talk it sounds like this is unlikely to have much if any effect as far as throttling is concerned. He offers an atypical example, and then says throttling by only a couple percent is plenty to curtail heat production.

Hopefully dynamic resolution scaling is commonly employed, because it is a more efficient method of rendering, allowing spare power to be used elsewhere.
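As a rough illustration of that last idea, here's a toy loop that chases a frame-time target by adjusting render scale (purely illustrative; real engines smooth and clamp this far more carefully):

```python
# Toy dynamic resolution scaling: nudge the render scale toward a
# frame-time target. GPU cost is roughly proportional to pixel count,
# i.e. to scale^2, hence the square root on the correction factor.
TARGET_MS = 16.7  # 60fps budget

def next_scale(last_frame_ms: float, scale: float) -> float:
    correction = TARGET_MS / last_frame_ms
    return max(0.5, min(1.0, scale * correction ** 0.5))

print(round(next_scale(last_frame_ms=20.0, scale=1.0), 3))  # 0.914
```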
Cerny has said a 2GHz clock was already too much for the PS5 GPU under the fixed-clock strategy, so why do you think the PS5 GPU can sustain an even higher clock (2.2GHz) now? Since you know everything, then please explain to me how the technology works 😀.
 

Larvana

Member
Forgive me if I'm misremembering, but haven't we had months of devs talking about how easy the PS5 is to develop for, with better tools and easier access to the system's power than the XSX?

Given that pretty much everything we saw at the PS5 Showcase was running on actual PS5 hardware, while possibly everything at the XSS/X one was on PC, if not flat-out CGI for a game barely in production, it seems a bit of a leap to suddenly believe MS's machine is leagues better than Sony's, with most devs finding the former a breeze and the latter a nightmare to create on.

It may well be true of course, but I find it suspicious given that this all comes from one random guy on twitter, pretty much the day after Xbox had to delay Halo Infinite due to very obvious development problems, as well as speculation Fable was basically vaporware.

It's all a little bit too convenient.
I'm talking about third-party games, not first party. It's common sense that you'll be able to push a bit more in the way of effects on the Xbox Series X compared to the PS5.
 
Not surprised if true.

Imo it's more a case of smarter engineering on MS's side. They managed to pull a PS4 role reversal, just spending the design effort to make a good, practical, performant console.

Sony has to bin for the best GPU dies to hit those 2.23GHz variable clocks, they need a beefier PSU to feed the high clocks, they needed a solid heatsink and fan in an irregular shape to fit the slim, tall console form factor, they have to custom-make the 12-channel SSD, plus the DualSense's gimmicky vibrations... it all adds up.

You can debate games, but hardware- and network-wise, MS has left Sony in the dust this round.
You're making an assumption that the SSD costs more to build.
I'm sure they put a lot of money into engineering the SSD's custom controller etc. - i.e., R&D expenditure. But that has absolutely jack shit to do with what it costs in parts.
As for the actual storage, all Sony needs are NAND modules. If I've calculated correctly, they're using 12x64GiB modules for 825GB. Using the same density modules for a 1TB drive, you'd need 16.
So a 1TB drive actually costs more in parts than the 825GB configuration in the PS5.
Not to mention the cost advantage of buying NAND modules directly, mapped to your custom configuration, rather than buying an off-the-shelf drive.
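The conversion, if anyone wants to check it (NAND is sized in binary GiB, while drive capacities are marketed in decimal GB):

```python
# 12 x 64 GiB of NAND, expressed in decimal gigabytes.
GIB = 2**30  # bytes per GiB

print(12 * 64 * GIB / 1e9)  # ~824.6 -> the PS5's "825 GB"
print(16 * 64 * GIB / 1e9)  # ~1099.5 -> a 16-module, "1 TB-class" drive
```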

The bulk of any extra cost will be in the custom memory/IO controller. R&D costs to design that aside, I don't think it would cost significantly more than most other controllers to manufacture at volume. But I can't claim to know that.
Either way, it's not cut and dried. High R&D costs don't necessarily translate directly to a high BOM.

You can't do math.

10.28 / 12.1 = 0.8496 = 84.96% -> 100% - 84.96% = 15.04%

Hence, the difference is ~15%, NOT 18%.

10.28 is 15% less than 12.1
12.1 is 18% more than 10.28.

What matters is the value you take as 100%. That's how mathematics works.
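Or, spelled out:

```python
# Same two numbers, two baselines, two "correct" percentages.
ps5_tf, xsx_tf = 10.28, 12.1

print(f"{(1 - ps5_tf / xsx_tf) * 100:.1f}%")  # 15.0% -> PS5 relative to XSX
print(f"{(xsx_tf / ps5_tf - 1) * 100:.1f}%")  # 17.7% -> XSX relative to PS5
```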

Cerny didn't engineer shit. He just selected the parts to go on a console and he did a poor job at it.
Wonder what all those patents are about. Wonder what all that R&D went towards, if he just picked parts from a bin and threw them together.
 
I skipped from page 1 to page 7 and this thread is going exactly as I figured it would.

Xbox will be cheaper and more powerful and offer more value in Gamepass at a lower overall cost to the consumer.

Playstation will be more expensive and less powerful but will offer small niche mechanics the Xbox doesn't have, like being able to blow into the controller.

Both systems are going to have half baked games at launch and in a year or two we will start seeing better games.

Playstation will have some amazing looking first party games but suffer when it comes to third party.

The end.. we can close the thread now.
I agree, but please add VR to the description of the PS5, because VR is really coming in hard and fast.
 

truth411

Member
The only issue for the PS5 is bandwidth. 448GB/s isn't enough imo; 16Gbps GDDR6 would give 512GB/s, and 18Gbps GDDR6 would give 576GB/s.
The Tempest engine uses up to 20GB/s of that.
The rest is split between the CPU/GPU.
PS5 games may very well be bandwidth-starved a year or two after launch.
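Where those numbers come from (the standard peak-bandwidth formula, assuming the PS5's 256-bit bus):

```python
# Peak bandwidth = per-pin data rate (Gbps) x bus width (bits) / 8.
bus_width_bits = 256
for gbps in (14, 16, 18):
    print(f"{gbps} Gbps -> {gbps * bus_width_bits / 8:.0f} GB/s")
# 14 -> 448, 16 -> 512, 18 -> 576
```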
 

RCU005

Member
The PS5 needs to be $499/$399 regardless of what the competition does. You all act like every move each company makes is in direct response to the competition.

If Microsoft wants to sell their XSX at $299/$199, good for them. It doesn't mean people will not buy the PS5 because it's more expensive THAN the other.

However, realistically, the XSX won't be cheap. People are also acting like the Series S could be sold at $99 or something very cheap, and still it won't matter to Sony.

Sony must know that they can't go beyond $499. It doesn't matter if the XSX is $99 or $699.
 
performance mode means that there are some compromises made (this applies to all games and platforms that have options like this); it is obvious that 30fps will be the target for matching the artistic vision for this game, while 60fps will have things cut back

Lmao what? 60fps means you need to compromise regardless of whether it's a performance mode or not.

Point is, Sony will have 60fps modes and 30fps modes. Best of both worlds
 

BluRayHiDef

Banned
You're right...


Here's a fun blind test - let's see if you can label these. I bet everyone will have a hard time telling which is which

[image: 8TP7gD7.png]

I'm looking at this on a 43" 4K TV (LG 43UH6500), and I can easily tell that the one on the left is sharper than the one on the right; everything looks sharper - the trees, grass, rocks, mountains, and characters. However, I have both versions of the game (Xbox One X and PlayStation 4 Pro), and while the Pro version isn't as sharp as the One X version, it looks fantastic and not as blurry as it does in the still on the right.
 

Javthusiast

Banned
Xbox = not a single IP I wanna play
PS = a ton of exclusive IPs I already wanna play, plus all the new ones we will get.

Done deal for me, like it was in every console generation.

I play on my base ps4 still and have zero problems, therefore ps5 being weaker does fuck all to my buying decision.
 

bargeparty

Member
performance mode means that there are some compromises made (this applies to all games and platforms that have options like this); it is obvious that 30fps will be the target for matching the artistic vision for this game, while 60fps will have things cut back

Just like how Infinite was compromised for 60fps?
 

pasterpl

Member
Lmao what? 60fps means you need to compromise regardless of whether it's a performance mode or not.

Point is, Sony will have 60fps modes and 30fps modes. Best of both worlds

Yes, I am not denying that; it's just that it seems xbsex games will be designed around a 60fps target while PS5 games will be designed around a 30fps target. We will see if that 2TF advantage can deliver the same fidelity at 60fps as the PS5 does at 30fps. The RE rumours suggest that hitting high performance on xbsex is easier than on ps5 (I am not saying that it is impossible on ps5).
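For context on why 30fps vs. 60fps is such a big ask, the frame-time budgets versus the raw TF gap:

```python
# Doubling the frame rate halves the GPU time available per frame.
for fps in (30, 60):
    print(f"{fps} fps -> {1000 / fps:.1f} ms per frame")  # 33.3 / 16.7

print(f"{12.1 / 10.28:.2f}x")  # ~1.18x TF ratio, nowhere near the 2x needed
```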
 
Reactions in this thread are particularly enlightening.

7 years ago, and even to this day, the OG Xbox One got ridiculed for the graphical downgrades it suffered versus the PS4 (900p vs. 1080p, 30 vs. 60fps, etc.).

Now that the shoe is on the other foot, attitudes towards the underpowered console have radically changed (e.g., "native resolution doesn't matter", "gameplay matters more than graphics", etc.).

Moving the goalposts, much?
 

Journey

Banned
I'm looking at this on a 43" 4K TV (LG 43UH6500), and I can easily tell that the one on the left is sharper than the one on the right; everything looks sharper - the trees, grass, rocks, mountains, and characters. However, I have both versions of the game (Xbox One X and PlayStation 4 Pro), and while the Pro version isn't as sharp as the One X version, it looks fantastic and not as blurry as it does in the still on the right.


Things usually look fine until you put something better next to it lmao.
 

Elog

Member
I think the misconception comes from the fact that, when people discuss these differences in relation to Series X's advantages, suddenly paper specs are the only thing that matters. But if it were the inverse, we're told paper specs aren't all that matters, and the discussion quickly shifts to things well outside the GPU paper specs. It's a bit of a double standard tbh.

The truth is, there may be custom architecture changes to Series X that give it a bigger performance gap than the paper specs convey: a larger L3 cache on the GPU (it naturally has more due to having more CUs, but they could've increased the cache even further than normal), mip-blending hardware to facilitate SFS, GPU customizations to facilitate a greatly expanded ExecuteIndirect, an increase over the normal ROP count, whatever hardware they've implemented for DXR on the GPU, larger L2 caches on the CUs, or other custom silicon that might enable the expanded compute performance for INT8 and INT4 calculations (these may not be done completely generically on the CUs). Those could be things he knows about, and customizations like that, at enough of a scale, would produce a notable performance gap between the two systems beyond what the paper specs (which only really cover the most general things, like CPU clock, GPU clock, and CU count) convey.

I guess we'll see in due time; we've only got four more days to find out.

It sounds like you do not believe that the PS5 has the same amount of customization as the XSX? It is true that we do not know the architecture of these two platforms yet, and hopefully we get to know more soon, starting with the XSX on Monday. However, I would argue that so far all information indicates that there are more customizations in the PS5 hardware than in the XSX.

We will see, but I do not think it is reasonable to assume that the XSX has more advantageous customizations to the GPU than the PS5, as you do above - based both on history and on the information released so far.
 

bargeparty

Member
Things usually look fine until you put something better next to it lmao.

Well, that's kind of why these comparisons are generally useless (probably from a DF video, right?). People will play the games on their console of choice and generally not give a shit about minute differences.

Yeah, I recall RDR2 on my PS4 Pro having a slightly soft image, but imo that didn't really detract from the game at all. Some people seem way too focused on sharpness. I saw some shots from a screenshot thread of HZD on PC with a custom ReShade where the sharpness was on 1000, and it looked fucking terrible.

I wonder where RDR2 sold more, PS4 or Xbox? I'd be willing to bet PS4.

Another thing I've been wanting to mention with all this RDR2 stuff: I tried the game on both the One X and PS4 Pro, and guess what, I preferred it on the PS4. The Xbox version had this weird... motion to it, like the frametimes were off, and it didn't feel as smooth as the PS4 version. That alone solidified my choice.
 
Well, that's kind of why these comparisons are generally useless (probably from a DF video, right?). People will play the games on their console of choice and generally not give a shit about minute differences.

Yeah, I recall RDR2 on my PS4 Pro having a slightly soft image, but imo that didn't really detract from the game at all. Some people seem way too focused on sharpness. I saw some shots from a screenshot thread of HZD on PC with a custom ReShade where the sharpness was on 1000, and it looked fucking terrible.

I wonder where RDR2 sold more, PS4 or Xbox? I'd be willing to bet PS4.

What does that have to do with the comparison? The PS4 has like 2.5x the install base of the Xbox; of course you'd expect the game to sell more on PS4.
 

Astral Dog

Member
Wait so PS5 struggles with 4K but it's apparently a cake walk for Series X despite the gap in raw power being smaller than it is this gen?

Hmm
2 TF more is a lot of rendering power, dude. There will be SOME differences. That doesn't mean the PS5 isn't well designed, but the XSX is just better on the numbers (especially if it's a little cheaper after all, but we will see).
 

BluRayHiDef

Banned
Can you really do a comparison with 2 completely different shots?

Yes, because the difference in resolution is very obvious, at least on a sizeable 4K screen. I'm using a 43" 4K TV at the moment, and I have a 55" 4K TV as well; the difference is obvious on both.
 

BluRayHiDef

Banned
Really not sure what you're trying to prove there. I can make the Xbox One S look negligible compared to the One X, a difference of 500% in TF power:

[image: NnjNjj3.png]





But the reality is below:


[image: 3jJ1618.png]

I'm not trying to prove anything; I simply showed two unedited screen-captures from the versions of the game that run on the mid-gen refreshes. Also, unlike you, I didn't zoom in or crop anything; I showed the entire frames. So, I don't get your point.
 

SlimySnake

Flashless at the Golden Globes
I'd rather take "fake" 4K with reconstruction and upscaling techniques than "real" 4K that needlessly wastes processing power on raw pixel density, tbqh. It'd spare system power for other tasks related to asynchronous programming models, which would have more of an impact on the actual gameplay.

So with that being the case, I'm hoping MS isn't trying to pursue native 4K in their games or mandating it for third parties; I think RE3 Remake showed that native is not always the best choice when you can stabilize/improve performance by dropping the native resolution and upscaling instead. Native 4K won't really be required for games until 8K displays become the household standard, but that is DEFINITELY like 3-4 years away from now.

I think the misconception comes from the fact that, when people discuss these differences in relation to Series X's advantages, suddenly paper specs are the only thing that matters. But if it were the inverse, we're told paper specs aren't all that matters, and the discussion quickly shifts to things well outside the GPU paper specs. It's a bit of a double standard tbh.

The truth is, there may be custom architecture changes to Series X that give it a bigger performance gap than the paper specs convey: a larger L3 cache on the GPU (it naturally has more due to having more CUs, but they could've increased the cache even further than normal), mip-blending hardware to facilitate SFS, GPU customizations to facilitate a greatly expanded ExecuteIndirect, an increase over the normal ROP count, whatever hardware they've implemented for DXR on the GPU, larger L2 caches on the CUs, or other custom silicon that might enable the expanded compute performance for INT8 and INT4 calculations (these may not be done completely generically on the CUs). Those could be things he knows about, and customizations like that, at enough of a scale, would produce a notable performance gap between the two systems beyond what the paper specs (which only really cover the most general things, like CPU clock, GPU clock, and CU count) convey.

I guess we'll see in due time; we've only got four more days to find out.
Yeah, don't forget VRS, which doesn't seem to be in the PS5.

Way too much obfuscation from devs and Sony tbh. Just fucking come out and say it. I don't want to be tricked into buying a console.
 

Journey

Banned
I'm not trying to prove anything; I simply showed two unedited screen-captures from the versions of the game that run on the mid-gen refreshes. Also, unlike you, I didn't zoom in or crop anything; I showed the entire frames. So, I don't get your point.


I cropped to hide the labels. My point is that there IS a clear difference between the Pro and the One X.

The comparison here is 1:1, perfectly fair when representing 4K and working with a much bigger image.

[image: kvyJ2Ax.png]








Hey ma, look at this standalone random screenshot of RDR2 for Xbox One S, looks fine to me! I'll just go ahead and miss the point entirely and post it anyway; after all, if I have nothing to compare it to, it should look fine on Xbox One S. Who the heck needs next-gen consoles when I can just choose to ignore that there's something better out there :messenger_winking_tongue:

[image: UaMoHrJ.png]
 
It sounds like you do not believe that the PS5 has the same amount of customization as the XSX? It is true that we do not know the architecture of these two platforms yet, and hopefully we get to know more soon, starting with the XSX on Monday. However, I would argue that so far all information indicates that there are more customizations in the PS5 hardware than in the XSX.

We will see, but I do not think it is reasonable to assume that the XSX has more advantageous customizations to the GPU than the PS5, as you do above - based both on history and on the information released so far.

Elog... Elog... you know that's not what I'm saying. What I'm saying is simply that the Series systems DO have customizations... and they do. But people tend to ignore that reality and just say they are "PCs in a box". Which, I mean, relative to older systems like the PS2, GameCube, Saturn, SNES, Mega Drive, etc., BOTH of them are PCs in a box. They're both x86-based, and that is primarily a PC architecture.

What you're referring to as "information" is either unsubstantiated rumor (the RDNA3 ones from people like MLID were destroyed by a PS5 software engineer on Twitter, btw) or patents that may or may not be reflective of actual hardware in the PS5. There are just as many such patents relating to technologies that may or may not be in the Series systems; what we're trying to gauge is the probability of such patents showing up in a finished retail product.

For instance, the "dual-GPU chiplet" Sony patents of late are 100% not relevant to the PS5 releasing later this year. Maybe a PS5 Pro, maybe just something for servers...the point is, patents are just ideas and you need enough factual evidence (directly from those applying them, general technologies that already have proven equivalents in other fields, etc.) to deduce those patents as being implemented in an actual product.

As for historical precedent, well, there are things to explain almost all of that. The OG Xbox is actually a pretty customized system despite being a "DirectX box"; if it were nearly as straightforward a design as people think, we'd have had robust OG Xbox emulators on PC a long time ago, but it hasn't happened. And that has nothing to do with popularity, when the 3DO and even the Jaguar have more mature emulators on the platform. The 360 similarly had a lot of custom design; IIRC it was the first console ever with a unified memory pool, and it had an insanely large framebuffer for the time. The XBO was not even primarily developed with pushing core gaming as its main focus, so several design decisions reflected that. However, it still had various customizations, like the Move engines (I think that's what they were called).

So there's historical precedent to show that MS has never literally put a PC in a box and called it a day. If you see a proliferation of common features between the Series systems and the PC, it's because Microsoft has leveraged R&D with the Series systems as the targets for developing technological designs and implementations that they then spread to the PC side. It's a means of maximizing the investment, among other things like synergizing the console and PC scenes together (which massively benefits 3rd parties). I don't know what range of alterations MS has made to their system (btw, I'm not just talking GPU; a lot of word going around seems to indicate they have an unusually large L3 cache on their CPU as well), but is it possible they have either more of them, or at least more of the sort that would help their system punch above its weight? Of course. Besides, you can only lean on historical precedent up to a point, anyway.

To be clear, I do think both systems have a notable range of customized optimizations. I just wanted to stress that the Series systems do, in fact, have customizations in their design, which nullifies any attempt to pigeonhole them as "PCs in a box". That's mostly it.

But yeah, in any case, we'll see to what extent MS has made optimized alterations to their system, deviating from the generic RDNA2 spec, within four or so days.

Yeah, don't forget VRS, which doesn't seem to be in the PS5.

Way too much obfuscation from devs and Sony tbh. Just fucking come out and say it. I don't want to be tricked into buying a console.

TBF, VRS is a term MS has coined for their particular implementation of such a technique. The PS5 could have a variant of it, maybe something branching off foveated rendering, but implemented differently in the actual design.

I would be more surprised if there were no equivalent of it on the PS5 whatsoever.
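For anyone unfamiliar with the technique being argued about, here's the gist of shading-rate selection as a toy sketch; the heuristic is entirely made up, and real VRS is a hardware/API feature (e.g. in DX12 Ultimate), not engine-side Python:

```python
# Toy picture of variable rate shading: shade coarser where the eye is
# unlikely to notice (fast-moving or low-detail screen tiles). The
# thresholds below are invented purely for illustration.
def shading_rate(tile_contrast: float, tile_velocity: float) -> str:
    if tile_velocity > 0.5 or tile_contrast < 0.1:
        return "2x2"  # one shading result shared across a 2x2 pixel block
    if tile_velocity > 0.2:
        return "2x1"  # half rate along one axis
    return "1x1"      # full rate

print(shading_rate(tile_contrast=0.05, tile_velocity=0.1))  # 2x2
```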
 

BluRayHiDef

Banned
I didn't zoom; I did, however, crop to hide the labels.

Here's the full-screen, un-cropped image from the YouTube video - and it's not even representing 4K, which would make for an even bigger difference on a large screen:

[image: kvyJ2Ax.png]




Whoops, looks like the difference is still clear. Look ma, no zoom or crop!


My point is that there IS a clear difference between the Pro and the One X.



Hey ma, look at this standalone random screenshot of RDR2 for Xbox One S, looks fine to me! I'll just go ahead and miss the point entirely and post it anyway; after all, if I have nothing to compare it to, it should look fine on Xbox One S. Who the heck needs next-gen consoles when I can just choose to be ignorant about there being something better out there, lmao

[image: UaMoHrJ.png]

But I never said that the difference was not clear; I've said from the beginning that it is clear (here's the post in which I said so: Link). So why are you arguing against me? As for my picture comparison, it wasn't accompanied by any commentary, so there was no indication that I posted it to prove a point; I simply wanted to see people's genuine opinions on the comparison, to determine whether the difference is obvious to most people or not.
 

yurinka

Member
performance mode means that there are some compromises made (this applies to all games and platforms that have options like this); it is obvious that 30fps will be the target for matching the artistic vision for this game, while 60fps will have things cut back
Halo Infinite's visuals also show that compromises were made there. But at least in Spider-Man there is an option, so you can choose whichever you prefer: the better-looking version or the one with higher FPS.

Because a small difference in graphics won't matter in the long run.
In the big majority of cases, people can't say which version looks better just by looking at the games themselves. They're only able to do it when watching a DF or similar comparison video with the two side by side, an FPS counter, and so on.

Cerny has said a 2GHz clock was already too much for the PS5 GPU under the fixed-clock strategy, so why do you think the PS5 GPU can sustain an even higher clock (2.2GHz) now? Since you know everything, then please explain to me how the technology works 😀.
Cerny said it was too much for the previous architecture, not for the one they had in the PS5. In fact, you "forgot" to mention that right after that, Cerny explained that one of the reasons they chose this architecture was to allow a higher clock without experiencing issues.
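As for why "too much at a fixed clock" and "fine as a variable cap" can both be true - a rule-of-thumb sketch of how power scales with clock (the wattage baseline here is arbitrary):

```python
# Rule of thumb only: dynamic power ~ C * V^2 * f, and voltage tends to
# rise with frequency, so power grows roughly cubically with clock.
base_f_ghz, base_power_w = 2.0, 100.0
for f in (2.0, 2.1, 2.23):
    print(f"{f} GHz -> {base_power_w * (f / base_f_ghz) ** 3:.1f} W")
# 2.0 GHz -> 100.0 W, 2.1 GHz -> 115.8 W, 2.23 GHz -> 138.6 W
```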
 

Journey

Banned
But I never said that the difference was not clear; I've said from the beginning that it is clear (here's the post in which I said so: Link). So why are you arguing against me? As for my picture comparison, it wasn't accompanied by any commentary, so there was no indication that I posted it to prove a point; I simply wanted to see people's genuine opinions on the comparison, to determine whether the difference is obvious to most people or not.


Gotcha, I was replying to Javthusiast initially and got my wires crossed.
 