
Ali Salehi, a rendering engineer at Crytek, contrasts the next-gen consoles in an interview (Update: tweets/article removed)

rnlval

Member
Yeah, but it lags behind AMD's solution, which is truly simultaneous (it executes both at the same time, in the same wave).
nVidia's solution uses a queue alongside the render tasks... it runs render in one wave and compute in another, of course using a priority level.

What Turing added is the ability for critical compute tasks to jump the queue and execute first, so that time-critical compute work isn't delayed.

nVidia GPUs are fast enough to do that.
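For what it's worth, this is roughly how a PC app expresses that split in D3D12: a separate compute queue marked high priority next to the normal render queue. Just an illustrative sketch of the queue/priority idea, not a claim about how AMD's or nVidia's hardware schedulers actually handle it internally.

```cpp
// Sketch: render queue + higher-priority compute queue in D3D12.
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>
using Microsoft::WRL::ComPtr;
#pragma comment(lib, "d3d12.lib")

int main() {
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                 IID_PPV_ARGS(&device)))) {
        std::puts("No D3D12 device available");
        return 1;
    }

    // Normal-priority queue for render (direct) work.
    D3D12_COMMAND_QUEUE_DESC gfxDesc = {};
    gfxDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    ComPtr<ID3D12CommandQueue> gfxQueue;
    device->CreateCommandQueue(&gfxDesc, IID_PPV_ARGS(&gfxQueue));

    // Separate compute queue marked high priority, so time-critical compute
    // can be favored by the scheduler rather than waiting behind render work.
    D3D12_COMMAND_QUEUE_DESC csDesc = {};
    csDesc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;
    csDesc.Priority = D3D12_COMMAND_QUEUE_PRIORITY_HIGH;
    ComPtr<ID3D12CommandQueue> csQueue;
    device->CreateCommandQueue(&csDesc, IID_PPV_ARGS(&csQueue));

    std::puts("Created render queue + high-priority compute queue");
    return 0;
}
```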
 

SonGoku

Member
That puts you in conflict with Lisa Su announcing Big Navi's arrival.
The point of the paper is: the more CUs, the more threads needed to reach higher CU utilization.
I don't think anyone can argue against the idea that, for any given game, the PS5's ALUs will reach a higher percentage of utilization than the XSX's; neither will reach 100% outside of edge cases (if those even exist).

Just in case: XSX will still be overall stronger!
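Rough back-of-the-envelope sketch of that point. The 36 vs 52 CU counts are the announced figures; the two SIMD32 units per CU follow the public RDNA descriptions, and the 8 wavefronts in flight per SIMD is just an assumed latency-hiding target.

```cpp
#include <cstdio>

// Toy occupancy math: threads that must be in flight to keep every SIMD busy.
// simdPerCU = 2 and lanes = 32 follow public RDNA descriptions;
// wavesPerSIMD = 8 is only an assumed latency-hiding target.
int threadsToFill(int cus, int wavesPerSIMD = 8, int simdPerCU = 2, int lanes = 32) {
    return cus * simdPerCU * lanes * wavesPerSIMD;
}

int main() {
    std::printf("PS5 (36 CU): %d threads in flight\n", threadsToFill(36));  // 18432
    std::printf("XSX (52 CU): %d threads in flight\n", threadsToFill(52));  // 26624
    // More CUs -> more independent work needed before utilization climbs.
}
```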
 

rnlval

Member
The point of the paper is: the more CUs, the more threads needed to reach higher CU utilization.
I don't think anyone can argue against the idea that, for any given game, the PS5's ALUs will reach a higher percentage of utilization than the XSX's; neither will reach 100% outside of edge cases (if those even exist).

Just in case: XSX will still be overall stronger!
A whitepaper doesn't beat demonstrated results.
 

ethomaz

Banned
Your use of the AMD whitepaper to support your argument is flawed.

Hint: 4K resolution leads to wider parallelism.
:messenger_tears_of_joy: :messenger_tears_of_joy: :messenger_tears_of_joy:

Next made-up excuse, please kkkkkkk


A whitepaper doesn't beat demonstrated results.
For a demonstration to beat the whitepaper and support your point, something like Gears should run way, way better than an RTX 2080, not close to an RTX 2080... but you fail to understand your own comments.

A 10TF PS5 should reach RTX 2080 results with good utilization of its CUs.
 
Hold your fanboy horses, the original one is back......

I am just saying both consoles will be similarly bandwidth limited compared to a 2080 Ti... which they BOTH are... and I was responding to a poster comparing the TFs of nvidia cards to try to show the PS5 will be poor or whatever crap he was spouting.

My point is that anyone expecting a massive difference, with one console blowing the other away, is probably wrong. They will not be too different: yes, 560 >> 448, but 448 >> 336. Neither is an IDEAL bandwidth setup, and let's hope they both have some tricks.

If you want to read a good explanation, Lady Gaia does a nice one:

And trying to convince me that 560 here and 336 there is a good solution is just funny; it's not much better than 448 on average, but believe what you want.



The difference between me and you is I recognise both consoles are equally good / equally deficient, you just see one side only (fanboy blinkered view). Read what you typed.
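Just to put a number on the "not much better than 448 on average" bit, here's a naive weighted average over the two XSX pools, assuming accesses are spread evenly across all 16GB (which real games won't do, so treat it as a rough bound, not a measurement):

```cpp
#include <cstdio>

int main() {
    // XSX: 10 GB at 560 GB/s + 6 GB at 336 GB/s; PS5: 16 GB at 448 GB/s.
    const double xsxAvg = (10.0 * 560.0 + 6.0 * 336.0) / 16.0;
    std::printf("XSX naive blended bandwidth: %.0f GB/s\n", xsxAvg);  // ~476
    std::printf("PS5 uniform bandwidth:       448 GB/s\n");
    // In practice devs will try to keep bandwidth-heavy GPU data in the
    // 10 GB pool, so the effective figure sits somewhere between 476 and 560.
}
```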

Nope, all nonsense. I know the PS5 is a good console, and I'm not saying it isn't. I just don't buy certain things, like the variable clock speed always being that high and never dropping in any meaningful way. I don't buy it, and nobody with a working brain should either. Also, you're way wrong in stating the Series X won't clearly outperform it by a very noticeable margin. The PS5 will produce exclusives that easily look better than plenty of Xbox Series X exclusives, and the same will be true in reverse. But on multi-platform games, especially further into the gen, the Series X is going to greatly outperform the PS5. That's not up for debate in my eyes.

Sony has yet to even confirm some of the features that will contribute to that performance gap, such as VRS, Texture Space Shading, Sampler Feedback and Mesh Shaders. Apparently Primitive Shaders aren't the same as Mesh Shaders, though they share some similarities.

As to Lady Gaia's explanation and the CPU queuing up work, I assume she isn't 100% correct about that.


GPU Work Creation
Xbox Series X adds hardware, firmware and shader compiler support for GPU work creation that provides powerful capabilities for the GPU to efficiently handle new workloads without any CPU assistance. This provides more flexibility and performance for developers to deliver their graphics visions.
 

rnlval

Member
:messenger_tears_of_joy: :messenger_tears_of_joy: :messenger_tears_of_joy:

Next made-up excuse, please kkkkkkk



For a demonstration to beat the whitepaper and support your point, something like Gears should run way, way better than an RTX 2080, not close to an RTX 2080... but you fail to understand your own comments.

A 10TF PS5 should reach RTX 2080 results with good utilization of its CUs.
Your arguments are ridiculous.
 

rnlval

Member
Results don't contradict the whitepaper; you are not looking at the full picture or taking all the variables into account.

4K doesn't enable 100% utilization, and smaller polygons hurt parallelism.
100% utilization is rare, if it happens at all.
Your theoretical arguments were wrecked by the XSX's demonstrated Gears 5 results.
 
I'd just like to add to your comment so there isn't any confusion: the takeaway here is that the XSX needs the extra bandwidth to feed its more powerful GPU; that's why they went for this weird compromise instead of 16GB @ 448GB/s.
Both consoles have roughly the same amount of bandwidth available proportionate to their GPU's peak performance, or, in more layman's terms: similar GB/s per TF.

As far as bandwidth goes, both consoles are equally good or equally bottlenecked depending on how you look at it (glass half full/empty).

The thing I think people don't seem to get is I don't believe at all that the Series X has some bottleneck lol. Just because the Series X memory setup isn't done in the most traditional way doesn't make it a bottleneck for performance. I don't see what stops devs from quickly swapping data in and out as necessary, from either the SSD or the slower portion of RAM, if something needs the highest bandwidth. And, again, that 10GB of GPU-optimal memory is more than either the RTX 2080 or 2080 Super has, literally one GB below the 2080 Ti. Add to that the fact that not everything in RAM is for graphics processing in the first place; there are other parts to a game as well, and those parts don't need the highest speeds to begin with. They have more than enough RAM on the Series X, or are people going to tell me that the RTX 2080 and Super don't have enough? The 2080 Ti only has 1 more GB of RAM than the Series X's GPU-optimal pool.

And to stress again, Sampler Feedback is specifically designed to increase physical memory efficiency, and it's apparently so effective that it can/will lead to an effective multiplier of 2 to 3 times physical memory. If what Microsoft said is true and there's a lot of wastage that comes from textures being loaded into RAM that are never actually visible on screen, and they've come up with a far more efficient solution that cuts down on memory usage significantly, then that automatically changes a lot of people's calculations about that 10GBs of GPU optimal memory and what it can accomplish.

And as far as the GPUs having roughly the same amount of bandwidth available proportionate to their GPU's peak performance, that doesn't really mean anything to be honest. And keep in mind I'm not saying the PS5 sucks or is weak. No, it's a beast, and because the power the PS5 has is more than enough (shit even less than that would be more than enough), it will easily produce titles that are more impressive than some games that will release on Series X. Developers and the games matter, and always will matter, but the Series X in terms of pure performance is in another league based on the information we have.

I mean, people can say whatever, but isn't it telling that after a full DF deep dive and a full deep dive from the system architect there's still no mention of VRS, Texture Space Shading, Sampler Feedback, Mesh Shaders. I'm assuming the PS5 may potentially have the newer ray tracing based features announced because of everything Cerny said about it in his deep dive. And, no, we can't assume that just because it may be some version of RDNA2 that it automatically has all the features confirmed for RDNA2 PC and DirectX 12 Ultimate.

Even Mark Cerny himself seemed to be implying as much in his presentation. It's a custom chip, and what he seemed keen to discuss about the GPU was the gpu cache scrubbers, primitive shaders (Vega/RDNA1 feature) and ray tracing. If I'm leaving anything out let me know.
 

ethomaz

Banned
Your quote basically describes what a close-to-the-metal API on consoles does.
 
Your theoretical arguments were wrecked by the XSX's demonstrated Gears 5 results.

You said it, the Gears 5 results speak for themselves. RTX 2080 class performance on just 1 month of work? Better than Gears 5 PC Ultra settings at 4K, running at over 100fps? Come the fuck on, that's incredible. Real results put to bed the myth of the mystical bottleneck.
 

ethomaz

Banned
You said it, the Gears 5 results speak for themselves. RTX 2080 class performance on just 1 month of work? Better than Gears 5 PC Ultra settings at 4K, running at over 100fps? Come the fuck on, that's incredible. Real results put to bed the myth of the mystical bottleneck.
That result is not in 4K lol
It was misinterpreted.

Gears 5 runs at 4K Ultra on Xbox a bit below RTX 2080.

Using fake results to try to make a point does the opposite lol
 
Your quote basically describes what a close-to-the-metal API on consoles does.

In other words, Lady Gaia's analysis would be wrong then, since the Xbox Series X GPU doesn't always require the CPU to queue up work for it. And even if it did, which it doesn't, that's to be expected in a console. The CPU doesn't just sit around doing nothing at all.
 

ethomaz

Banned
In other words, Lady Gaia's analysis would be wrong then, since the Xbox Series X GPU doesn't always require the CPU to queue up work for it. And even if it did, which it doesn't, that's to be expected in a console. The CPU doesn't just sit around doing nothing at all.
It does.
You need the CPU no matter how close to the metal you are.
It's just that you need very little CPU to make the GPU calls.

To be clear, any GPU call needs to be issued by the CPU... the GPU can't work alone; it needs the CPU to tell it what to do.

After the CPU says what the GPU will do... the GPU won't need more CPU assistance until the next call... that is what any close-to-the-metal API does.
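If it helps, this is roughly the kind of thing the "GPU work creation" blurb points at on the PC side: the CPU records one ExecuteIndirect call, and a compute pass fills the argument buffer, so the per-draw decisions happen on the GPU without another CPU round trip. It's only a hedged D3D12 sketch — it assumes an already-created device, command list and buffers, and says nothing about the consoles' internal APIs.

```cpp
#include <d3d12.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

// Records a single indirect-draw submission. A compute pass (not shown)
// writes D3D12_DRAW_ARGUMENTS records into argBuffer and the draw count
// into countBuffer, so the CPU never touches the per-draw decisions.
void RecordGpuDrivenDraws(ID3D12Device* device,
                          ID3D12GraphicsCommandList* cmdList,
                          ID3D12Resource* argBuffer,
                          ID3D12Resource* countBuffer,
                          UINT maxDraws) {
    D3D12_INDIRECT_ARGUMENT_DESC arg = {};
    arg.Type = D3D12_INDIRECT_ARGUMENT_TYPE_DRAW;

    D3D12_COMMAND_SIGNATURE_DESC sigDesc = {};
    sigDesc.ByteStride = sizeof(D3D12_DRAW_ARGUMENTS);
    sigDesc.NumArgumentDescs = 1;
    sigDesc.pArgumentDescs = &arg;

    // In real code you would create and cache this once and keep it alive
    // until the command list has executed.
    ComPtr<ID3D12CommandSignature> cmdSig;
    device->CreateCommandSignature(&sigDesc, nullptr, IID_PPV_ARGS(&cmdSig));

    // One CPU-issued call; the GPU reads how many draws to run, and with
    // what parameters, from its own buffers.
    cmdList->ExecuteIndirect(cmdSig.Get(), maxDraws,
                             argBuffer, 0, countBuffer, 0);
}
```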
 

SonGoku

Member
Your theoretical arguments were wrecked by the XSX's demonstrated Gears 5 results.
Not really
The thing I think people don't seem to get is I don't believe at all the series x has some bottleneck lol
I don't think it's a bottleneck either, at least not inherently; the main takeaway is that both consoles have similar bandwidth available proportional to their GPUs' needs.
Of course more would be better, but consoles are a compromise.
And as far as the GPUs having roughly the same amount of bandwidth available proportionate to their GPU's peak performance, that doesn't really mean anything to be honest.
It means it doesn't have surplus bandwidth compared to the PS5. The more powerful GPU needs more bandwidth to feed it and materialize its computational advantage.
PS5 games will run at 10-20% lower dynamic resolution; that should free enough GPU and bandwidth resources to match XSX settings and performance.
Series X in terms of pure performance is in another league based on the information we have.
I wouldn't call a 17-20% difference another league; it's a way smaller difference than the current gen's 40%.
As for the features mentioned, they are present on PS5 as well.
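Quick sanity check on those two numbers using the announced figures (12.15 TF / 560 GB/s for the XSX fast pool vs 10.28 TF / 448 GB/s, treating the PS5 at its peak clock):

```cpp
#include <cstdio>

int main() {
    const double xsxTF = 12.15, ps5TF = 10.28;  // announced peak FP32 TFLOPS
    const double xsxBW = 560.0, ps5BW = 448.0;  // GB/s (XSX fast pool)

    std::printf("GB/s per TF  XSX: %.1f  PS5: %.1f\n",
                xsxBW / xsxTF, ps5BW / ps5TF);   // ~46 vs ~44
    std::printf("Compute gap: %.0f%%\n",
                (xsxTF / ps5TF - 1.0) * 100.0);  // ~18%
}
```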
 
That result is not in 4K lol
It was misinterpreted.

Gears 5 runs at 4K Ultra on Xbox a bit below RTX 2080.

Using fake results to try to make a point does the opposite lol

Actually, no, I think you're wrong there, the game itself, as in real gameplay, was running at 4K, and at settings beyond PC Ultra.


The team showcased a technical demo of Gears 5, powered by Unreal Engine, for Xbox Series X using the full PC Ultra Spec settings, which included higher resolution textures and higher resolution volumetric fog, as well as a 50% higher particle count than the PC Ultra Specs allowed. They also showed off the opening cutscene, which now runs at 60 FPS in 4K.

The obvious implication here is that if the cutscene is running at a rock solid 4K 60fps, then surely the gameplay is running at 4K. Come on now. Can you actually disprove a single thing I just said? This isn't from Digital Foundry, this one is straight from Microsoft themselves. So did Microsoft also misinterpret their own console? Ultra settings with elements that were beyond what the PC Ultra setting has, and it was killing it performance wise.

The actual benchmark that ran just under RTX 2080 performance was running without the beyond-PC-Ultra settings, at equivalent PC Ultra. The benchmark is designed to be a little more punishing than the actual game; that's what ran slightly below the RTX 2080 at equivalent PC Ultra settings. And the reason it was behind on some elements is that the PC running the RTX 2080 had a beast of a CPU, which further skewed results toward the PC side, as DF confirmed.
 

ethomaz

Banned
Actually, no, I think you're wrong there, the game itself, as in real gameplay, was running at 4K, and at settings beyond PC Ultra.




The obvious implication here is that if the cutscene is running at a rock solid 4K 60fps, then surely the gameplay is running at 4K. Come on now. Can you actually disprove a single thing I just said? This isn't from Digital Foundry, this one is straight from Microsoft themselves. So did Microsoft also misinterpret their own console? Ultra settings with elements that were beyond what the PC Ultra setting has, and it was killing it performance wise.

The actual benchmark that ran just under RTX 2080 performance was running without the beyond-PC-Ultra settings, at equivalent PC Ultra. The benchmark is designed to be a little more punishing than the actual game; that's what ran slightly below the RTX 2080 at equivalent PC Ultra settings. And the reason it was behind on some elements is that the PC running the RTX 2080 had a beast of a CPU, which further skewed results toward the PC side, as DF confirmed.
That's your misinterpretation again.

There are two parts of the article.

One mentions 4K:

“The team showcased a technical demo of Gears 5, powered by Unreal Engine, for Xbox Series X using the full PC Ultra Spec settings, which included higher resolution textures and higher resolution volumetric fog, as well as a 50% higher particle count than the PC Ultra Specs allowed. They also showed off the opening cutscene, which now runs at 60 FPS in 4K.”

The other doesn't mention 4K:

“Rayner also shared that the game is already running over 100 FPS and that the team is investigating implementing 120 FPS gameplay for multiplayer modes, giving players an experience never before seen on consoles.”

The first one is 60fps.
The second one is 100fps, which is what they are trying to achieve for MP... they want to run at 120fps even if they need to drop the resolution... that is why they didn't say it is 4K.

The first showcase is what DF showed and confirmed to be close to RTX 2080 performance in 4K Ultra... that will probably be what you will play on campaign.

Or maybe they will give players two options for the campaign... a 4K graphics mode and a 1080p (or some other resolution) performance mode with added effects, which makes sense.

Edit: Reading a bit more, the 100fps is really the MP mode, where they are aiming for 120fps, but it is below 4K for that.
 
Not really

I don't think it's a bottleneck either, at least not inherently; the main takeaway is that both consoles have similar bandwidth available proportional to their GPUs' needs.
Of course more would be better, but consoles are a compromise.

It means it doesn't have surplus bandwidth compared to the PS5. The more powerful GPU needs more bandwidth to feed it and materialize its computational advantage.
PS5 games will run at 10-20% lower dynamic resolution; that should free enough GPU and bandwidth resources to match XSX settings and performance.

I wouldn't call a 17-20% difference another league; it's a way smaller difference than the current gen's 40%.

The difference is bigger than 17-20%; I wouldn't go off the teraflop count alone, especially since the Series X GPU has more confirmed advanced GPU features that will stretch that performance quite a bit. The PS5 GPU clock speed is also, again, variable for a reason. Even a 5% drop in frequency makes it a 9.x teraflop GPU, in the same range as the 5700 XT. Some here might think it will never drop by even 5%, and it will probably drop further, but let's assume it doesn't and only drops 5%: that's a 9.x teraflop PS5 GPU.
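The teraflop math behind that, for reference: peak FP32 = CUs × 64 ALUs × 2 ops per clock (FMA) × clock. The 36/52 CU counts and 2.23/1.825 GHz clocks are the announced figures; the 5% drop is just the hypothetical from the paragraph above.

```cpp
#include <cstdio>

// Peak FP32 TFLOPS = CUs * 64 shader ALUs * 2 ops per clock (FMA) * GHz / 1000
double tflops(int cus, double ghz) { return cus * 64 * 2 * ghz / 1000.0; }

int main() {
    std::printf("XSX 52 CU @ 1.825 GHz: %.2f TF\n", tflops(52, 1.825));      // ~12.15
    std::printf("PS5 36 CU @ 2.23 GHz:  %.2f TF\n", tflops(36, 2.23));       // ~10.28
    std::printf("PS5 with a 5%% clock drop: %.2f TF\n", tflops(36, 2.23 * 0.95));  // ~9.76
}
```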

Next, there are those important features like Texture Space Shading, Sampler Feedback, VRS and Mesh Shaders. None of those are confirmed for the PS5... yet. I assume machine learning is also there due to an old Wired piece, but Cerny didn't seem to detail any of that in his deep dive either.

As far as the bandwidth argument, I again think that's a pretty pointless exercise, because of course the more powerful GPU needs more memory bandwidth to feed it. That's one of the reasons more powerful GPUs outperform less powerful ones. This is why the 2080 Ti couldn't possibly have the same bandwidth as the RTX 2080, and why the slightly faster RTX 2080 Super also needed its memory bandwidth bumped up. The Xbox Series X has the perfect amount of memory bandwidth for its GPU. I think the gap will be bigger than between the PS4 Pro and Xbox One X due to those Series X GPU features. And DirectX 12 engines should become far more common, which means the Series X will really be stretching its legs. The original Xbox One, though it supported some, never got to benefit from the full range of DirectX 12 features because it simply didn't support them all.

I mean, I still don't think people comprehend how big a deal some of these confirmed Xbox Series X features are.


Sampler Feedback
The final marquee feature for Direct X 12 Ultimate/feature level 12_2 is sampler feedback. This is a very new feature that has only recently been exposed, and has received very little publicity so far; though like everything else here, the hardware capabilities first showed up in Turing.


Previously demoed by NVIDIA as texture-space shading, sampler feedback is a broader feature with a few different uses. At a very high level, the idea behind sampler feedback is to allow game engines to track how the texture samplers are being (or will be) used – thus, the samplers give feedback to the engine – allowing the engine to make more intelligent decisions about how the samplers are used and what resources are kept in VRAM.

The principle use case for this, Microsoft envisions, will be in improving texture streaming. By using sampler feedback, game engines can determine what texture tiles are actually going to be needed, and thus only loading up the necessary tiles. This keeps overall VRAM pressure down, ultimately allowing developers to use higher quality textures overall by losing less VRAM to unneeded tiles. Fittingly for the Xbox Series X, this is especially handy when your games are stored on a high speed SSD, as it means the necessary tiles can be pulled in from storage incredibly quickly (almost in a just-in-time fashion), instead of having to stage them in RAM or take measures to mitigate the long access time of a HDD.


Even with mesh shaders, one of the big benefits is significantly cutting down on the memory bandwidth cost of using far more complex geometry. So everything we've seen confirmed for the Series X thus far seems tailor-made for allowing the available resources to be used for a far better end result than what would have been possible without these features. Developers will have to build them into their engines, but with DirectX 12 Ultimate bringing PC and console much closer together, I can now see that happening.
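To make the streaming idea concrete, here's a toy model of feedback-driven texture residency. This is not the actual D3D12 sampler feedback API, just the logic it enables: a per-frame "feedback" map records which mip each tile was actually sampled at, and only tiles whose resident mip is too coarse get queued for loading.

```cpp
#include <cstdio>
#include <map>
#include <utility>
#include <vector>

// Toy residency manager: key = (tileX, tileY), value = finest mip resident.
// A per-frame feedback pass reports which mip each tile was actually sampled
// at (lower mip index = more detail needed).
struct TileStreamer {
    std::map<std::pair<int, int>, int> residentMip;  // absent = nothing loaded

    std::vector<std::pair<int, int>> tilesToLoad(
        const std::map<std::pair<int, int>, int>& requestedMip) const {
        std::vector<std::pair<int, int>> loads;
        for (const auto& [tile, wantMip] : requestedMip) {
            auto it = residentMip.find(tile);
            const bool tooCoarse =
                (it == residentMip.end()) || (it->second > wantMip);
            if (tooCoarse) loads.push_back(tile);  // only tiles actually sampled
        }
        return loads;
    }
};

int main() {
    TileStreamer streamer;
    streamer.residentMip[{0, 0}] = 2;  // only a coarse mip of this tile is in RAM

    std::map<std::pair<int, int>, int> feedback = {
        {{0, 0}, 0},  // camera moved close: wants the full-res tile
        {{5, 7}, 4},  // newly visible distant tile: a coarse mip will do
    };
    for (const auto& t : streamer.tilesToLoad(feedback))
        std::printf("stream in tile (%d, %d)\n", t.first, t.second);
    // Tiles never sampled are never loaded, which is where the VRAM saving comes from.
}
```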
 
That's your misinterpretation again.

The first part of your quote is not 4K.
The second part of your quote is.

This is one showcase, not in 4K:

“The team showcased a technical demo of Gears 5, powered by Unreal Engine, for Xbox Series X using the full PC Ultra Spec settings, which included higher resolution textures and higher resolution volumetric fog, as well as a 50% higher particle count than the PC Ultra Specs allowed.”

This is the other showcase, in 4K:

“They also showed off the opening cutscene, which now runs at 60 FPS in 4K.”

The first one is related to what they are trying to achieve with MP... they want to run at 120fps even if they need to drop the resolution... that is why they didn't say it is 4K.

The second showcase is what DF showed and confirmed to be close to RTX 2080 performance in 4K Ultra... that will probably be what you will play on campaign.

Or maybe they will give players two options for the campaign... a 4K graphics mode and a 1080p (or some other resolution) performance mode with added effects, which makes sense.

Edit: Reading a bit more, the 100fps is really the MP mode, where they are aiming for 120fps, but it is below 4K for that.


Uhh, there is a video showing it... and there is actual gameplay, running at 4K with the same exact higher-than-PC-Ultra settings for us to see. This is the tech demo, running at 4K, not just the cutscene at settings beyond PC Ultra... Why would they be able to run the more impressive cutscenes at higher than PC Ultra settings at 4K and a flawless 60fps, but somehow NOT be able to run the less flashy actual gameplay at 4K? Come on.



Have you not actually played Gears 5 at all? That 4K cutscene transitions RIGHT into actual gameplay at the start of the game with the same settings that are beyond PC's Ultra at 4K. We have the footage, so this is why I find it so bizarre you keep denying this.

Also, the Gears 5 internal benchmark that ran slightly below RTX 2080 performance was a 2-week-old unoptimized port using NONE of the Series X's new GPU features. In other words, once it does use them, it probably beats the RTX 2080. Then again, the RTX 2080 ALSO supports those same features, so the RTX 2080 will be getting a performance boost once they're implemented as well, but I think it's safe to say the Series X will likely beat the RTX 2080, because everything the Series X did, it did on a 2-week-old unoptimized port. That's some impressive shit.
 

ethomaz

Banned
Uhh, there is a video showing it... and there is actual gameplay, running at 4K with the same exact higher-than-PC-Ultra settings for us to see. This is the tech demo, running at 4K, not just the cutscene at settings beyond PC Ultra... Why would they be able to run the more impressive cutscenes at higher than PC Ultra settings at 4K and a flawless 60fps, but somehow NOT be able to run the less flashy actual gameplay at 4K? Come on.



Have you not actually played Gears 5 at all? That 4K cutscene transitions RIGHT into actual gameplay at the start of the game with the same settings that are beyond PC's Ultra at 4K. We have the footage, so this is why I find it so bizarre you keep denying this.

Also, the Gears 5 internal benchmark that ran slightly below RTX 2080 performance was a 2-week-old unoptimized port using NONE of the Series X's new GPU features. In other words, once it does use them, it probably beats the RTX 2080. Then again, the RTX 2080 ALSO supports those same features, so the RTX 2080 will be getting a performance boost once they're implemented as well, but I think it's safe to say the Series X will likely beat the RTX 2080, because everything the Series X did, it did on a 2-week-old unoptimized port. That's some impressive shit.

Two demonstrations.

SP: 4K Ultra, aiming for 60fps.
MP: sub-4K Ultra, aiming for 120fps.

With 12TFs it should easily beat the RTX 2080.

Now, to run at 4K Ultra with added effects at 100fps it would need more than double the power of an RTX 2080.

That is why MS didn't say 4K for the MP 100fps demonstration.
 
And here's the section of the video where Richard from DF points out that, without even using the Series X's new GPU features, a 2 week old unoptimized port on the Gears 5 internal benchmark at equivalent PC settings practically matched the RTX 2080.



Also, if you read the xbox article carefully, the over 100fps was not specifically just about multiplayer. Multiplayer was just mentioned as an additional potential benefit of the kind of performance they're seeing from the system. It was specifically about everything that was mentioned up to that point regarding a 4K beyond PC ultra version of gears with higher textures, particles, contact shadows and some ray traced global illumination.
 

ethomaz

Banned
And here's the section of the video where Richard from DF points out that, without even using the Series X's new GPU features, a 2 week old unoptimized port on the Gears 5 internal benchmark at equivalent PC settings practically matched the RTX 2080.



Also, if you read the xbox article carefully, the over 100fps was not specifically just about multiplayer. Multiplayer was just mentioned as an additional potential benefit of the kind of performance they're seeing from the system. It was specifically about everything that was mentioned up to that point regarding a 4K beyond PC ultra version of gears with higher textures, particles, contact shadows and some ray traced global illumination.

They didn’t say 4k for 100fps lol
You need to selective read and add 4k to that happen.
Both MS and DF state only Ultra, add features running at 100fps.
 
Two demonstrations.

SP: 4K Ultra, aiming for 60fps.
MP: sub-4K Ultra, aiming for 120fps.

With 12TFs it should easily beat the RTX 2080.

Now, to run at 4K Ultra with added effects at 100fps it would need more than double the power of an RTX 2080.

That is why MS didn't say 4K for the MP 100fps demonstration.

There is a video showing the actual gameplay footage running on the Xbox Series X at 4K 60fps with beyond-PC-Ultra settings, with DF even dropping the technical details of what the game was running at in the lower right-hand portion of the screen. Are you intentionally denying the truth right now?






With these settings all enabled, the game transitions directly into actual gameplay. The Xbox Series X, on a 2-week-old port, was already running Gears 5 at better than PC Ultra settings at 4K and at over 100fps. New contact shadows, real-time screen-space global illumination and 50% more particles. We have the footage, my man.
 

ethomaz

Banned
There is a video showing the actual gameplay footage running on the Xbox Series X at 4K 60fps with beyond-PC-Ultra settings, with DF even dropping the technical details of what the game was running at in the lower right-hand portion of the screen. Are you intentionally denying the truth right now?






With these settings all enabled, the game transitions directly into actual gameplay. The Xbox Series X, on a 2-week-old port, was already running Gears 5 at better than PC Ultra settings at 4K and at over 100fps. New contact shadows, real-time screen-space global illumination and 50% more particles. We have the footage, my man.
Yes.

4K Ultra 60fps
Unknown resolution, Ultra, 100fps

That is what they showed.
 
They didn’t say 4k for 100fps lol
You need to selective read and add 4k to that happen.
Both MS and DF state only Ultra, add features running at 100fps.

Now you're just trolling. They never at any point state that the game is NOT running at 4K in any example they gave. The only resolution you hear them mention is 4K. What exactly are you playing at? Where do they mention sub 4K resolutions exactly? :messenger_tears_of_joy:

You gotta stop spreading this fud. Your bias is really showing, jeez...

Seriously, show me where you are getting sub 4K resolution from when that is NEVER said? If you can't show me, then we know you aren't being serious, and should not be taken as such.
 

ethomaz

Banned
Now you're just trolling. They never at any point state that the game is NOT running at 4K in any example they gave. The only resolution you hear them mention is 4K. What exactly are you playing at? Where do they mention sub 4K resolutions exactly? :messenger_tears_of_joy:

You gotta stop spreading this fud. Your bias is really showing, jeez...
Not trolling, just telling you what was shown, and you keep misinterpreting it.
 

ethomaz

Banned
Coalition only showcased a 4K version of actual gameplay of the game to Digital Foundry running at higher than PC Ultra settings...
To DF, yes... they didn't show the 100fps.
All footage was aiming for 60fps.

But The Coalition has the game at sub-4K resolution at Ultra running at 100fps, aiming for 120fps.

They don't state 4K for the 100fps.
 

ethomaz

Banned
Alright, simple question. Where does it say anywhere on DF's coverage or Microsoft's website that what they showed was running at sub 4K resolution? Where? I'm waiting.
Where does it say it is running at 4K and 100fps on the MS site or at DF?
The MS site says 100fps but not 4K. What does that mean, if not that it isn't 4K? I'm waiting.

You are adding words where they don't exist.

BTW, the 100fps quote:

“Rayner also shared that the game is already running over 100 FPS and that the team is investigating implementing 120 FPS gameplay for multiplayer modes, giving players an experience never before seen on consoles.”

1: 4K Ultra with added effects at 60fps
2: Unknown resolution at 100fps, aiming for 120fps.
 

SonGoku

Member
especially since the Series X GPU has more confirmed advanced gpu
Your whole thesis rests on "Sony hasn't mentioned it yet, therefore it doesn't exist" 🤦‍♂️? How does that make any sense to you?
You are suggesting the PS5 lacks basic RDNA2 features without any supporting evidence. That's especially odd considering both the PS4 and Pro had the latest architectural enhancements, plus some more.

By that logic the PS4 wouldn't have tessellation, since Sony didn't mention it at the reveal.

The PS5 GPU clock speeds are also, again, variable for a reason.
Going by the information available and developers' comments, it stays at 10-10.27TF most of the time, which is where my 17-21% estimates come from.
But just to satisfy you, let's say it hypothetically drops to 9.2TF; that's still only a 30% difference (smaller than Pro vs X).
Next, there are those important features like Texture Space Shading, Sampler Feedback, VRS, Mesh Shaders. None of those are confirmed for the PS5..
Those are all basic, global RDNA2 features that will be present on the PS5 and on RDNA2 cards.
There's no info that suggests the PS5 will use a stripped-down RDNA2, and how would you possibly justify that? The assumption is just so detached from reality, with no sound logic behind it.
I assume machine learning is also there due to an old wired piece, but cerny didn't seem to detail any of that in his deep dive either.
So glad you brought this up, as it'll help you understand that just because a basic feature isn't mentioned, it doesn't mean it's not there. It just wasn't the focus of the presentation.

That's one of the reasons more powerful GPUs outperform less powerful ones
Yes! They need the extra bandwidth to materialize their compute advantage, which in the XSX's case is 17% to 21%.

Question: How many multiplatform games consistently showing a 10-20% resolution difference at the same settings and performance will it take for you to accept that the gap is way smaller than the current gen's?
Where does it say it is running at 4K and 100fps on the MS site or at DF?
The MS site says 100fps but not 4K. What does that mean, if not that it isn't 4K? I'm waiting.
Hilarious hypocrisy:
- "PS5 has an RDNA2 GPU without any RDNA2 features because they didn't confirm them yet"
- "Gears was running at 4K/100fps because they didn't deny it"

I don't know the Gears situation, I just find his double standard hilarious.
lol
 
Where does it say the 100fps mode is running at 4K on the MS site or at DF?
What does that mean, if not that it isn't 4K? I'm waiting.

You are adding words where they don't exist.

BTW, the 100fps quote:

“Rayner also shared that the game is already running over 100 FPS and that the team is investigating implementing 120 FPS gameplay for multiplayer modes, giving players an experience never before seen on consoles.”

4K Ultra with added effects at 60fps
Unknown resolution at 100fps, aiming for 120fps.

Alright, I'll accept that the over-100fps figure is multiplayer for the sake of argument. And upon asking someone, I'm leaning towards you being right on that part. Even DF says in the video below that they didn't see the 120fps footage.

But back to 4K gameplay at beyond-PC-Ultra settings: I intentionally saved this video for last because I wanted to see just how far you would go in denying what was clearly shown, largely because you know it goes against every bit of the claims that the Series X has some weird performance issue that will keep it from reaching what the specs claim. Here is the 14-minute video of exactly that beyond-PC-Ultra-settings version of Gears 5 running on the Series X.



"The game is 60, the cutscenes are 60." Exact quote. Not "aiming" for 60 like you've been saying. These settings are beyond PC's Ultra settings, and here is a 14-minute video showing the entire thing. Are you ready to admit it now? They even show features and pick apart stuff in the Series X demo that was clearly not in the PC Ultra version of the game. Hell, just looking at the performance you can tell how well it's running. A 2-week-old port, not even using the new GPU features that would make it run even better.

 

ethomaz

Banned
Alright, I'll accept that the over-100fps figure is multiplayer for the sake of argument. And upon asking someone, I'm leaning towards you being right on that part. Even DF says in the video below that they didn't see the 120fps footage.

But back to 4K gameplay at beyond-PC-Ultra settings: I intentionally saved this video for last because I wanted to see just how far you would go in denying what was clearly shown, largely because you know it goes against every bit of the claims that the Series X has some weird performance issue that will keep it from reaching what the specs claim. Here is the 14-minute video of exactly that beyond-PC-Ultra-settings version of Gears 5 running on the Series X.



"The game is 60, the cutscenes are 60." Exact quote. Not "aiming" for 60 like you've been saying. These settings are beyond PC's Ultra settings, and here is a 14-minute video showing the entire thing. Are you ready to admit it now? They even show features and pick apart stuff in the Series X demo that was clearly not in the PC Ultra version of the game. Hell, just looking at the performance you can tell how well it's running. A 2-week-old port, not even using the new GPU features that would make it run even better.


I did not at any time disagree that it is running at 4K Ultra with added effects at performance similar to an RTX 2080, and that they will deliver that at 60fps.

I said that your claim that they were running at 4K 100fps was a misinterpretation, because they didn't say that.
 
Those are all basic, global RDNA2 features that will be present on the PS5 and on RDNA2 cards.
There's no info that suggests the PS5 will use a stripped-down RDNA2, and how would you possibly justify that? The assumption is just so detached from reality, with no sound logic behind it.

The rest of this isn't worth addressing; the games will prove my point. Both consoles are custom RDNA 2, but there's no guarantee that any feature available on one, or on RDNA 2 in general, is automatically available on the other. Only Microsoft has confirmed the specific major headline features of RDNA 2. Sony has only directly confirmed ray tracing and primitive shaders. I'll go ahead and say machine learning also, because of that Wired article.

Nowhere does Sony or Cerny confirm VRS. Nowhere does Sony or Cerny confirm Sampler Feedback or Texture Space Shading, and nowhere do they confirm Mesh Shaders. We already have "insiders" over at ResetEra claiming that the PS5 supports VRS, but we all know how accurate many insiders have been on these consoles so far, so we won't automatically take what they say seriously. Sony has yet to confirm these features, and until they do, the safe assumption is that they don't have them. Even Mark Cerny in his presentation was cautioning people about expectations, since the chips are custom and each company makes the choices it feels are right for it. Among all the things he talked about, he focused on the GPU cache scrubbers. DF have specifically inquired about VRS and apparently not received any confirmation yet. Now why would that be, exactly? The safe assumption is that the PS5 does not support these things. They may both be custom RDNA 2, but the Xbox Series X is clearly more advanced.
 
I did not at any time disagree that it is running at 4K Ultra with added effects at performance similar to an RTX 2080, and that they will deliver that at 60fps.

I said that your claim that they were running at 4K 100fps was a misinterpretation, because they didn't say that.

There are posts in this thread where you literally say 4K Ultra AIMING for 60fps (why use the word aiming, when they've clearly achieved it?). 4K Ultra 60fps SP. Multiple times you say this. You aren't acknowledging that the Series X was running a 2-week-old, unoptimized port of Gears 5, using none of the new GPU features, at 4K resolution and 60fps, with settings that went beyond the PC's Ultra settings: superior shadows, superior textures, new real-time screen-space global illumination, 50% more particles, I mean come on. You were not acknowledging that. You weren't just denying the 100fps, you were not acknowledging the beyond-PC-Ultra settings at 4K during actual gameplay, with video proof. You kept implying regular PC Ultra settings for a good stretch. That's probably why so many of your posts are now edited.

I literally had to drop 3 different videos just for you to admit that the Series X, after 2 weeks, is running Gears 5 at better than the highest existing PC Ultra settings with a ton of new features. The one area I concede on is that the 100fps may not have been for the SP and was maybe actually MP after all, as you said. But one key dishonest claim you made is that it ran at sub-4K. Nowhere, anywhere, is there a claim that the 100fps is sub-4K resolution. That was you pulling something totally out of your ass. DF don't say it across 3-plus videos, it isn't said on DF's website, it isn't said in Microsoft's news post about the demo. You literally made up the sub-4K resolution, and you know that's what you did. The question is why? Don't answer; it's rhetorical. Alright, officially done with the thread.
 

ethomaz

Banned
There are posts in this thread where you literally say 4K Ultra AIMING for 60fps (why use the word aiming, when they've clearly achieved it?). 4K Ultra 60fps SP. Multiple times you say this. You aren't acknowledging that the Series X was running a 2-week-old, unoptimized port of Gears 5, using none of the new GPU features, at 4K resolution and 60fps, with settings that went beyond the PC's Ultra settings: superior shadows, superior textures, new real-time screen-space global illumination, 50% more particles, I mean come on. You were not acknowledging that. You weren't just denying the 100fps, you were not acknowledging the beyond-PC-Ultra settings at 4K during actual gameplay, with video proof. You kept implying regular PC Ultra settings for a good stretch. That's probably why so many of your posts are now edited.

I literally had to drop 3 different videos just for you to admit that the Series X, after 2 weeks, is running Gears 5 at better than the highest existing PC Ultra settings with a ton of new features. The one area I concede on is that the 100fps may not have been for the SP and was maybe actually MP after all, as you said. But one key dishonest claim you made is that it ran at sub-4K. Nowhere, anywhere, is there a claim that the 100fps is sub-4K resolution. That was you pulling something totally out of your ass. DF don't say it across 3-plus videos, it isn't said on DF's website, it isn't said in Microsoft's news post about the demo. You literally made up the sub-4K resolution, and you know that's what you did. The question is why? Don't answer; it's rhetorical. Alright, officially done with the thread.
The 100fps part is not 4K.
What is "not 4K"? Dynamic 4K? 1800p? 1080p? Those are all sub-4K.

I would never have quoted you if you hadn't bullshitted about 4K 100fps, because I never argued against the fact that they are targeting 4K 60fps Ultra with added effects.

They are aiming at 120fps for MP, but not at 4K.
 

SonGoku

Member
both consoles are custom RDNA 2, but there's no guarantee that any feature that's available on one, or on RDNA 2 is automatically available on the other.
I'm not talking about the XSX, I'm talking about basic RDNA2 features that will be present in every RDNA2 card.
Why would the PS5 lack global RDNA2 features?
Only Microsoft has confirmed specific major headline features of RDNA2.
"They haven't discussed it yet" is not a good argument; Sony didn't discuss tessellation at the PS4 reveal, or ML at the PS5's.
Nowhere does Sony or Cerny confirm VRS. Nowhere does Sony or Cerny confirm Sampler Feedback or Texture Space Shading, nowhere do they confirm Mesh Shaders.
They confirmed RDNA2, which has all those features.
Sony has yet to confirm them, and until they do the safe assumption is that they don't have them.
The reasonable and realistic assumption is that the PS5 has the basic RDNA2 features. The PS4 and Pro didn't miss any features, and the Pro had extra features.
Even Mark Cerny in his presentation was cautioning people about expectations since they are custom and make choices that they felt were right for them.
He was cautioning people not to just assume they bought an off-the-shelf design just because a similar GPU becomes available as a PC card at roughly the same time the console releases (like the Pro and the RX 470/480). They didn't just buy an RDNA2 card; they helped design RDNA2 as part of the collaboration, and AMD then releases it as a discrete card.

If anything, this should be definitive evidence that the PS5 will support every feature found on RDNA2 cards at the time of release.
he focused on the GPU cache scrubbers.
Because that's a feature unique to the PS5, not part of RDNA2's global features, and it's also related to the SSD/IO optimization, which was one of the three pillars of the presentation.
 
To add some info to the thread, Dark from Digital Foundry confirmed that the Series X version of Gears 5 that was running beyond PC Ultra settings takes advantage of dynamic resolution scaling. That's basically how it maintained its locked 60fps after being ported in 2 weeks without the newer GPU features utilized. Don't know the minimum resolution yet, but I asked.


edit:

Lowest he saw it drop was to 1080p, in a cutscene.
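For anyone curious what dynamic resolution scaling boils down to, here's a toy version of the control loop: scale the render resolution against a frame-time budget, clamped between 1080p and native 4K. Purely illustrative; the actual heuristics in UE4/Gears are obviously more sophisticated.

```cpp
#include <algorithm>
#include <cstdio>

// Toy dynamic-resolution controller: nudge the resolution scale so GPU frame
// time tracks the 16.6 ms (60fps) budget, clamped between 1080p and native 4K.
struct DrsController {
    double scale = 1.0;                 // 1.0 = 3840x2160, 0.5 = 1920x1080
    void update(double gpuMs, double budgetMs = 16.6) {
        scale *= budgetMs / gpuMs;      // over budget -> shrink, under -> grow
        scale = std::clamp(scale, 0.5, 1.0);
    }
    int width()  const { return static_cast<int>(3840 * scale); }
    int height() const { return static_cast<int>(2160 * scale); }
};

int main() {
    DrsController drs;
    const double frames[] = {15.0, 18.5, 22.0, 16.0, 14.0};  // measured GPU ms
    for (double ms : frames) {
        drs.update(ms);
        std::printf("gpu %.1f ms -> render at %dx%d\n", ms, drs.width(), drs.height());
    }
}
```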
 