
Dirt 5 devs: Next-gen GPU differences don't impact development, but there are efficiencies when optimising and tuning

Md Ray

Member
Hitman 2 was 4K on One X vs 1440p on Pro, so PS5 will most likely not run much better.
It will be locked 60fps on PS5 at 1440p. From One X to Series X, the GPU's computational power is doubled (2.0x) but memory bandwidth isn't, as Richard said in his Series X back-compat vid. 326 GB/s to 560 GB/s is a 1.7x (70%) uplift.

However, from PS4 Pro to PS5, the GPU's compute is more than doubled (nearly 2.5x) and bandwidth is doubled (a 2.05x, or 105%, uplift). So expect at least twice the frame-rate of PS4 Pro on PS5 in HM2 using PS5's boost mode.
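A quick sketch of the uplift arithmetic in the post above, using publicly quoted spec figures (12.15/6.0 TF and 560/326 GB/s for Series X vs One X; 10.28/4.2 TF and 448/218 GB/s for PS5 vs PS4 Pro) as assumptions rather than measurements:

```python
# Uplift ratios behind the post above. The spec figures are public
# numbers treated here as assumptions, not measurements.

def uplift(new, old):
    """Ratio of new to old, as a multiplier."""
    return new / old

# One X -> Series X
xsx_compute = uplift(12.15, 6.0)   # ~2.03x TFLOPs
xsx_bw      = uplift(560, 326)     # ~1.72x memory bandwidth (GB/s)

# PS4 Pro -> PS5
ps5_compute = uplift(10.28, 4.2)   # ~2.45x TFLOPs
ps5_bw      = uplift(448, 218)     # ~2.06x memory bandwidth (GB/s)

print(f"XSX: compute {xsx_compute:.2f}x, bandwidth {xsx_bw:.2f}x")
print(f"PS5: compute {ps5_compute:.2f}x, bandwidth {ps5_bw:.2f}x")
```

The point being argued: Series X's bandwidth uplift (~1.72x) lags its compute uplift (~2.03x), while PS5's two uplifts are closer to each other.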
 
Exactly, combat is nowhere near a stable 60, nor was it in every encounter shown.

Oh and I know you'll ignore me, but here's Sekiro running at 60fps, from multiple different sources, on max settings on XSX




Imagine thinking people here were on the level of your usual Twitter warriors who don't know the difference between an average 60fps and occasional dropped frames.
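Since the thread keeps conflating the two, here's a tiny illustration of how a capture can average close to 60fps while still containing dropped frames. The frame times below are invented example values, not real Sekiro data:

```python
# Average fps vs dropped frames: these are different claims.
# Frame times are made-up illustration values, not real captures.

frame_times_ms = [16.7] * 57 + [33.3] * 3   # 57 clean frames, 3 hitches

# Average fps over the capture window
avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)

# Count frames that took noticeably longer than a 60fps budget
hitches = sum(1 for t in frame_times_ms if t > 20.0)

print(f"average: {avg_fps:.1f} fps with {hitches} dropped frames")
```

A run like this would be reported as "roughly 60fps" by most casual counters, yet it still contains visible hitches, which is exactly the distinction being made here.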


See, told you he'd just ignore this.

Doesn't fit with his narrative.
 
With this I agree. We need a more thorough analysis.

True, and here's my defense of the XSX. When Digital Foundry says the GPU gets taxed, that's probably because the game is only taking advantage of 40 CUs and not the full 52 CUs that the XSX has. That's because the game hasn't been patched for Series X hardware yet, so all you're seeing is an increase in frames due to the additional clock speed. Once the game takes advantage of the XSX's CUs, the results will be better.

The real question is whether From Software will patch it for the XSX.
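For what it's worth, the "40 vs 52 CUs" arithmetic can be sketched directly. Peak FP32 throughput for these GPUs is roughly CUs × 64 shaders × 2 ops per clock × clock; the clock figures below are public spec numbers used as assumptions:

```python
# Peak FP32 TFLOPs from CU count and clock (GCN/RDNA-style GPUs):
# CUs * 64 shaders/CU * 2 FP32 ops/clock * clock (GHz) / 1000.

def peak_tflops(cus, clock_ghz):
    return cus * 64 * 2 * clock_ghz / 1000.0

xsx_full = peak_tflops(52, 1.825)   # ~12.15 TF: all 52 CUs active
xsx_40cu = peak_tflops(40, 1.825)   # ~9.34 TF: hypothetical 40-CU case
one_x    = peak_tflops(40, 1.172)   # ~6.0 TF: One X baseline

print(f"{xsx_full:.2f} / {xsx_40cu:.2f} / {one_x:.2f} TF")
```

Even in the hypothetical 40-CU case, the clock bump alone would put XSX well above One X, which is why BC frame-rates improve either way; the question in the thread is whether the remaining gap to 12 TF is being used.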
 

Neo_game

Member
To be fair, Digital Foundry makes it look worse than it is, because their job is to point out the flaws. What we really need is a good solid 30 minutes of someone playing the game with a framerate counter.

Maybe the game drops even lower in some parts but they simply have not come across it? I do not know why people give DF such benefit of the doubt. They can make a game look or run good or bad, and most people will never know unless they benchmark it themselves.
 

Panajev2001a

GAF's Pleasant Genius
True, and here's my defense of the XSX. When Digital Foundry says the GPU gets taxed, that's probably because the game is only taking advantage of 40 CUs and not the full 52 CUs that the XSX has. That's because the game hasn't been patched for Series X hardware yet, so all you're seeing is an increase in frames due to the additional clock speed. Once the game takes advantage of the XSX's CUs, the results will be better.

The real question is whether From Software will patch it for the XSX.

I thought XSX's Xbox One BC support offered all CUs, but DF said not all of the new "RDNA2 architectural improvements", or something like that.
 
I thought XSX's Xbox One BC support offered all CUs, but DF said not all of the new "RDNA2 architectural improvements", or something like that.

How do they get the game to run with more CUs without a patch?

I remember with the PS4 Pro they needed a patch to take advantage of those extra CUs, and if not they would default to the PS4's CU count.
 

geordiemp

Member
Yeah, that was an instantaneous reading, not the average.

Since you like this video, you can go back to 11:20 of the same video and he will explain why.

He explains that back-compat running on RDNA 2 must be struggling, but that logic is funny, as the RDNA-based 5700 has no problems with the same DX code. The 5700 is also RDNA, at full 4K:



What MS said is its BC mode does not take full advantage of RDNA 2 features (meaning VRS or ray tracing). Some are thinking DX12 on RDNA 1 vs RDNA 2 is another strange language, or some emulation difference, both using DX12. It's hilarious.

Anyway, when PS5 tears through Sekiro at 60 FPS, what will be the narrative then? Covid?

It is amusing
 
Last edited:

JackMcGunns

Member
We're just a few weeks away, folks; we might as well wait to see the head-to-heads from DF and others. My interpretation is that there is a difference, but not anything that will impact development. That's not to say DF won't expose a difference, and that will be important for gamers who own both: if one version is better in some way, however small, for some it will be the version they buy.

One thing is certain, there will be some serious meltdowns based on some expectations.
 

Zathalus

Member
Yes, it's capped at 1800p; the point is that 1800p is easier to run than 2160p. Your point is not logical or thought out.

Also, the line is below 60 for the whole of the grab, it's not a dip. :messenger_beaming:

Here is a bigger version to help you see. That green line is the FPS; it's green. The line above it is the 60 FPS line. If you look closely, the green line is below the 60 FPS line.


B3qXgaY.png


If you want, I can use arrows?
It has been explained to you multiple times that Microsoft themselves have said it's running in a GCN backwards-compatibility layer. You refuse to believe them because you are frankly just another fanboy with an agenda. Gears 5 and Tactics also go against your narrative, but you always neglect to even mention them. Your behavior is about as sad as those who refuse to believe Mark Cerny about variable frequency and are convinced the PS5 is a 9.2 TFLOPs console.

You take one console manufacturer at their word, but refuse to believe anything the other states. Hypocrisy at its finest.
 
Last edited:

geordiemp

Member
It has been explained to you multiple times that Microsoft themselves have said it's running in a GCN backwards-compatibility layer. You refuse to believe them because you are frankly just another fanboy with an agenda. Gears 5 and Tactics also go against your narrative, but you always neglect to even mention them. Your behavior is about as sad as those who refuse to believe Mark Cerny about variable frequency and are convinced the PS5 is a 9.2 TFLOPs console.

You take one console manufacturer at their word, but refuse to believe anything the other states. Hypocrisy at its finest.

It's been explained to you multiple times the 5700 is RDNA; it is not a GCN card.

The code is not running GCN, it's running the DX API, so I don't know where this GCN narrative came from; MS did not say that. They said it's not taking advantage of RDNA 2 FEATURES - if you understand what that means, of course.

I am discussing a 3rd-party game, which we use because it's the same game; it's called a BENCHMARK. Keep it to a common 3rd-party benchmark.

Also I did not insult you, keep it civil, remember ad hominem is for the stupid.
 
Last edited:
It will be locked 60fps on PS5 at 1440p. From One X to Series X, the GPU's computational power is doubled (2.0x) but memory bandwidth isn't, as Richard said in his Series X back-compat vid. 326 GB/s to 560 GB/s is a 1.7x (70%) uplift.

However, from PS4 Pro to PS5, the GPU's compute is more than doubled (nearly 2.5x) and bandwidth is doubled (a 2.05x, or 105%, uplift). So expect at least twice the frame-rate of PS4 Pro on PS5 in HM2 using PS5's boost mode.

But the point is, PS5 will run the PS4 Pro versions of the games, so the extra compute is essentially wasted.

X1X had more games with either 4K, 60fps, or both as options than PS4 Pro by a large margin, which means XSX is better placed to take advantage of its upgrade over current gen.
 
We're just a few weeks away, folks; we might as well wait to see the head-to-heads from DF and others. My interpretation is that there is a difference, but not anything that will impact development. That's not to say DF won't expose a difference, and that will be important for gamers who own both: if one version is better in some way, however small, for some it will be the version they buy.

One thing is certain, there will be some serious meltdowns based on some expectations.

Will it be massive or will it be small? I can't wait to find out.
 

Md Ray

Member
Receipts please.

Likely you're referring to the DF video showing Whittleton Creek, perhaps the most unoptimised and demanding zone in any current-gen game, running between 52-60fps in 4K mode.

I expect you'll show evidence of this stage running better at 4K on a 5700 XT to back up your claims.
Sure.

The 5700 XT at native 4K, with settings equivalent to X1X, consistently runs at 60fps and even a touch above.
urzgZd0.png


Series X here, in roughly the same scene, is consistently getting below 60fps compared to the 5700 XT.
94eaZr3.png


I matched the scene where One X is getting 32fps in both comparisons. The 5700 XT here is getting a 9% uplift over SX.

5700 XT vs X1X vid (timestamped):



Series X vs X1X vid (timestamped):



Source that it's unoptimised?
 
Last edited:

Concern

Member
It will be locked 60fps on PS5 at 1440p. From One X to Series X - the GPU's computational power is doubled (2.0x) but memory bandwidth isn't as Richard said in his Series X back-compat vid. 326 GB/s to 560 GB/s is a 1.7x (70%) uplift.

However, from PS4 Pro to PS5 - the GPU's compute is more than doubled (nearly 2.5x) and bandwidth is doubled (2.05x, 105%) uplift. So expect at least twice the frame-rate of PS4 Pro on PS5 in HM2 using PS5's boost mode.


Expectation and reality are two different things, unless of course you have Hitman 2 running on PS5. We don't even know exactly how Sony will handle BC.

All PS4 games after July are supposed to be able to run on PS5. Hitman 2 released before that, so we don't even know if it'll get a boost at all.
 

geordiemp

Member
Sure.

The 5700 XT at native 4K, with settings equivalent to X1X, consistently runs at 60fps and even a touch above.
urzgZd0.png


Series X here, in roughly the same scene, is consistently getting below 60fps compared to the 5700 XT.
94eaZr3.png


I matched the scene where One X is getting 32fps in both comparisons. The 5700 XT here is getting a 9% uplift over SX.

5700 XT vs X1X vid (timestamped):



Series X vs X1X vid (timestamped):



Source that it's unoptimised?


They are both RDNA-based GPUs as well, both running code through the DX12 API. The 5700 is quite strong, actually.
 

Panajev2001a

GAF's Pleasant Genius
How do they get the game to run with more CUs without a patch?

I remember with the PS4 Pro they needed a patch to take advantage of those extra CUs, and if not they would default to the PS4's CU count.

The piece said:
while Series X runs old games with full clocks, every compute unit and the full 12 teraflop of compute, it does so in compatibility mode - you aren't getting the considerable architectural performance boosts offered by the RDNA 2 architecture

XSX is apparently running the games natively, but I would not be able to explain the exact difference with the XOX running Sekiro beyond the game having some difficulties scaling properly.

PS4 Pro patches helped the PS4 games run natively and fully support the new GPU (new GPU having new instructions and features the old one did not) while boost mode just enabled higher clocks I would think. In this case it sounds like XSX has been designed with its GPU being able to present itself as an Xbox One X++ of sorts and that is how it is running Xbox One and the ones that received Xbox One X patches.
 
Sure.

The 5700 XT at native 4K, with settings equivalent to X1X, consistently runs at 60fps and even a touch above.
urzgZd0.png


Series X here, in roughly the same scene, is consistently getting below 60fps compared to the 5700 XT.
94eaZr3.png


I matched the scene where One X is getting 32fps in both comparisons. The 5700 XT here is getting a 9% uplift over SX.

5700 XT vs X1X vid (timestamped):



Series X vs X1X vid (timestamped):



Source that it's unoptimised?


The Series X is running at the same clock speed as the 5700 XT, plus it has more CUs than that card, and it's based on RDNA 2.

I would definitely say it's some sort of optimization issue. There's no way the Series X should perform worse than a 5700 XT.
 
The piece said:


XSX is apparently running the games natively, but I would not be able to explain the exact difference with the XOX running Sekiro beyond the game having some difficulties scaling properly.

PS4 Pro patches helped the PS4 games run natively and fully support the new GPU (new GPU having new instructions and features the old one did not) while boost mode just enabled higher clocks I would think. In this case it sounds like XSX has been designed with its GPU being able to present itself as an Xbox One X++ of sorts and that is how it is running Xbox One and the ones that received Xbox One X patches.

Well that is interesting. I'm still not sure how it can take advantage of the extra CU count without being programmed for it.
 

Zathalus

Member
It's been explained to you multiple times the 5700 is RDNA; it is not a GCN card.

The code is not running GCN, it's running the DX API.

Also I did not insult you, keep it civil; remember ad hominem is for the stupid.
How did I insult you? I called you a fanboy, as anybody looking at your post history can clearly see.

I'm not even sure why you keep bringing the 5700 card into the picture. The XSX is emulating the behaviour of GCN via a compatibility layer, as explained by Microsoft themselves. The near-14 TFLOP Radeon VII (GCN as well), which has 1 TB/s of memory bandwidth, also drops frames into the 50s at 4K.

You refuse to believe what Microsoft says is occurring, so basically you are spreading FUD. The exact same FUD about the PS5 being 9.2 TFLOPs that was spread around by delusional Xbox fanboys.

You also once again neglected Gears 5 and Tactics; can you please show me benchmarks of any 5700-class card running those games with ultra settings at 4K 60fps?
 

JackMcGunns

Member
The Series X is running at the same clock speed as the 5700Xt plus it has more CUs than that card and it's based on RDNA2.

I would definitely say it's some sort of optimization issue. There's no way the Series X should perform worse than a 5700Xt.


It won't matter that it's an RDNA 2 GPU when running backward-compatible games. XSX is running them in GCN mode, which puts it at a massive disadvantage compared to what it can achieve with RDNA 2.
 

geordiemp

Member
Well that is interesting. I'm still not sure how it can take advantage of the extra CU count without being programmed for it.

The game programs do not instruct each CU to do work, and Microsoft already stated the full 12 TF is being applied in BC mode.

Unless someone is not being upfront.

How did I insult you? I called you a fanboy, as anybody looking at your post history can clearly see.

I'm not even sure why you keep bringing the 5700 card into the picture. The XSX is emulating the behaviour of GCN via a compatibility layer, as explained by Microsoft themselves. The near-14 TFLOP Radeon VII (GCN as well), which has 1 TB/s of memory bandwidth, also drops frames into the 50s at 4K.

You refuse to believe what Microsoft says is occurring, so basically you are spreading FUD. The exact same FUD about the PS5 being 9.2 TFLOPs that was spread around by delusional Xbox fanboys.

You also once again neglected Gears 5 and Tactics; can you please show me benchmarks of any 5700-class card running those games with ultra settings at 4K 60fps?

I am comparing the 5700 vs XSX on a 3rd-party game to show TF is not everything. Again, fewer insults; be civil.

Both are RDNA cards, there is no GCN here in sight; why is this confusing you?

The game code is not GCN, lol, it's just game code. The interface is the DX12 API; the hardware layer is RDNA.

Why do you want to compare games from first parties? Is it because they have a recent patch to make them run better and do less work (VRS)? What has that got to do with benchmarking a 3rd-party game?

Anyway, bookmark it: PS5 will run Sekiro better, and then you can blame Covid, devs, GCN, DX12, DX11, Vulkan and Mantle.
 
Last edited:

Md Ray

Member
True, and here's my defense of the XSX. When Digital Foundry says the GPU gets taxed, that's probably because the game is only taking advantage of 40 CUs and not the full 52 CUs that the XSX has. That's because the game hasn't been patched for Series X hardware yet, so all you're seeing is an increase in frames due to the additional clock speed. Once the game takes advantage of the XSX's CUs, the results will be better.

The real question is whether From Software will patch it for the XSX.
No, they didn't say that. All 52 CUs are being used in BC mode. What DF's Richard said was that it's due to bandwidth that the games aren't seeing a 2x frame-rate increase over One X. When you look at the bandwidth situation, Series X only offers a 70% boost over One X while TF is doubled.



As Mr. Greenberg here says, Series X is using its full power to deliver back-compat One X games.
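The bandwidth argument above can be put into a back-of-envelope sketch: if a frame is limited by its weakest resource, expected frame-rate scales with the smaller of the compute and bandwidth uplifts. This is a simplification (real scaling depends on where each frame actually bottlenecks), and the 32fps baseline reuses the One X figure quoted earlier in the thread:

```python
# Rough model: a bottlenecked frame scales with the smaller uplift.
# This is a simplification for illustration, not a real performance model.

def bc_fps_estimate(base_fps, compute_ratio, bandwidth_ratio):
    # Whichever resource improves least caps the expected speed-up.
    return base_fps * min(compute_ratio, bandwidth_ratio)

# One X scene at 32fps; Series X offers ~2.0x compute but ~1.72x bandwidth
est = bc_fps_estimate(32, 2.0, 560 / 326)
print(f"estimated: {est:.0f} fps")
```

Under this model the estimate lands around 55fps rather than the 64fps a pure 2x compute scaling would suggest, which is consistent with the "bandwidth, not CUs" explanation.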
 

Concern

Member
No, they didn't say that. All 52 CUs are being used in BC mode. What DF's Richard said was that it's due to bandwidth that the games aren't seeing a 2x frame-rate increase over One X. When you look at the bandwidth situation, Series X only offers a 70% boost over One X while TF is doubled.



As Mr. Greenberg here says, Series X is using its full power to deliver back-compat One X games.



Greenberg? The most clueless dude on the Xbox team? Lol. The same guy who gets bashed for constantly posting cringeworthy and/or incorrect information on Twitter? But now, since it fits the fanboy narrative, he's correct? Do better, guys.
 

geordiemp

Member


Yes, I understand that statement and you don't. The full architectural improvements of RDNA 2 over RDNA 1 (which is the 5700) means things like VRS (optional) and mesh shading (optional, not used much in any games).

It tries to say GCN level, so I used RDNA 1 hardware to compare against, so there is no GCN hardware in sight. We are comparing a game running DX12 on two RDNA cards.

The 5700 is running the same code at 9.75 TF. It is also not GCN.

Anyway, not long now; your argument will not look so good when PS5 runs this game.
 
Last edited:

Zathalus

Member
The game programs do not instruct each CU to do work, and Microsoft already stated the full 12 TF is being applied in BC mode.

Unless someone is not being upfront.



I am comparing the 5700 vs XSX on a 3rd-party game to show TF is not everything. Again, fewer insults; be civil.

Both are RDNA cards, there is no GCN here in sight; why is this confusing you?

The game code is not GCN, lol, it's just game code. The interface is the DX12 API; the hardware layer is RDNA.

Why do you want to compare games from first parties? Is it because they have a recent patch to make them run better and do less work (VRS)? What has that got to do with benchmarking a 3rd-party game?

Anyway, bookmark it: PS5 will run Sekiro better, and then you can blame Covid, devs, GCN, DX12, DX11, Vulkan and Mantle.
As per the tweet I linked, the XSX is emulating the behavior of GCN. Why is this confusing you?
 
No, they didn't say that. All 52 CUs are being used in BC mode. What DF's Richard said was that it's due to bandwidth that the games aren't seeing a 2x frame-rate increase over One X. When you look at the bandwidth situation, Series X only offers a 70% boost over One X while TF is doubled.

He clearly says that it's because it's not tapping into the features of RDNA 2; this is straight power through a compatibility mode to emulate GCN.

Your argument, and geordiemp's, seems centred on this "limitation" giving PS5 the edge in current-gen BC - assuming it uses the full architecture.

However, this is misplaced. Hitman 2 doesn't have a 4K mode on PS4 Pro, or Hitman 1 for that matter, so PS5 won't have the opportunity to do better at 4K even if it theoretically could.

In fact, most games don't have a 4K mode on PS4 Pro, even simplistic ones. Final Fantasy X/X-2 HD is a straight remaster of a PS2 game, but only One X got a 4K mode.

Obviously comparisons can take place at 1440p, but then 60fps is a given there.
 
Last edited by a moderator:

Zathalus

Member
Yes, I understand that statement and you don't. The full architectural improvements of RDNA 2 over RDNA 1 (which is the 5700) means things like VRS (optional) and mesh shading (optional, not used much in any games).

Where does it mention GCN code lol
OK, good to know you can't read. Are the words GCN-based invisible to you?
 

MC68000

Member
The game programs do not instruct each CU to do work, and Microsoft already stated the full 12 TF is being applied in BC mode.

Unless someone is not being upfront.



I am comparing 5700 vs XSX on a 3rd party game to show TF is not everything. Again less the insults and be civil.

Both are RDNA2 cards, there is no GCN here in sight, why is this confusing you ?

The game code is not GCN lol its just game code, The interface is DX12 api, the hardware layer is RDNA.

Why do you want to compare games from first parties, is it because they have a recent patch to make them run better and do less work (VRS). What has that got to do with benchmark a 3rd party game ?

Anyway, book mark it, Ps5 will run Sekiro better, and then you can blame covid, devs, and GCN, DX12, DX11 Vulkan and mantle.
I was going to ask if the 5700 was RDNA2. Genuine question. I see you changed your post after the fact to reflect RDNA. The original post is in bold.
 

sircaw

Banned
Yes, I will admit it if so. But we will need to see the 2nd round of games on a matured XDK before we give the final verdict.

I'm interested in the next-gen subplot of Mark Cerny vs Jason Ronald.

The venerable old Cerny, much revered by the fans, the man who speaks from his bathtub, who found success by selecting mid-range parts and blessing them with dev-friendly tools.

Against an unknown upstart who burst onto the scene with his odd beard. Tbf, xbeard had a small victory with his custom One X design, which smashed the loud, oversized and weaker PS4 Pro, though skeptics harshly dismissed it for being a year later.

Well, fine, we will have no holds barred in 3 weeks' time. Head to head, price to price. :messenger_bicep:

Ah, matured XDK; what will the next goalpost be, I wonder.

I get ya, the next round of games, I see you. Lol.
 
No, it's not, and I never claimed as such. I claimed the XSX is emulating GCN. Oh wait, Microsoft did as well.

But what do they know; some guy on an internet forum knows better than them.

This is the guy who doesn't know the difference between average FPS and frame rate drops. He's being willfully ignorant of any argument which destroys his narrative.
 

geordiemp

Member
As per the tweet I linked, the XSX is emulating the behavior of GCN. Why is this confusing you?

LOL, because it's hilarious; that's DF's or Eurogamer's poor interpretation. I know what they were trying to say; the sentence was poorly constructed and a bad analogy.

No, it's not emulating GCN. You should read some developers' comments on that tweet; it gave them a good laugh. That's not what MS said either, and it's a very poor understanding of hardware.

You don't switch hardware features like cache, CU cores, geometry or render back-ends on or off... It's invisible to the game code.

nu2sxhp.png


Not long now until we see more emulated games lol

Oh, and is the new Yakuza game also running emulated? You need to think up some new excuses.
 
Last edited:

longdi

Banned
Ah, matured XDK; what will the next goalpost be, I wonder.

I get ya, the next round of games, I see you. Lol.

That's the news. The XDKs are behind, that's the truth.

If Jason Ronald's box beats Cerny's box, will you submit to Jason too? 🤷‍♀️
 
Ever heard of emulation?

So it's just emulating an Xbox One X via software?

Usually emulation takes quite a bit of resources, so I don't think Microsoft is doing that, as there are more efficient ways of doing BC. I'm thinking it has to be something similar to what PCs do.
 

Zathalus

Member
LOL, because it's hilarious; no, it's not emulating GCN. You should read some developers' comments on that tweet; it gave them a good laugh. That's not what MS said either, and it's a very poor understanding of hardware.

You don't switch hardware features like cache, CU cores, geometry or render back-ends on or off...

nu2sxhp.png


Not long now until we see more emulated games lol
Emulation does not require you to switch off any hardware, but I am sure Microsoft is just making stuff up. Whatever floats your boat, keep on living in that bubble.
 
LOL, because it's hilarious; that's DF's or Eurogamer's poor interpretation. I know what they were trying to say; the sentence was poorly constructed and a bad analogy.

No, it's not emulating GCN. You should read some developers' comments on that tweet; it gave them a good laugh. That's not what MS said either, and it's a very poor understanding of hardware.

You don't switch hardware features like cache, CU cores, geometry or render back-ends on or off... It's invisible to the game code.

nu2sxhp.png


Not long now until we see more emulated games lol

Oh, and is the new Yakuza game also running emulated? You need to think up some new excuses.

Why do you think a random poster on Era ignoring the point means anything more than when you do it?

And what does Yakuza have to do with anything?
 