
Next-Gen PS5 & XSX |OT| Console tEch threaD

Status
Not open for further replies.

HAL-01

Member
Not sure how many times it has to be explained to you, but that is an emulated BC game not taking full advantage of the hardware.
"all of the thousands of backwards compatible games run natively on the Xbox Series X|S, running with the full power of the CPU, GPU & SSD. No boost mode, no downclocking, the full power of the consoles for each & every game." -Mr Aaron Greenberg
 

VCL

Member
I mean... They aren't lying lol


It's running at the full power of the Series X, and Gears 5 is absolutely not made from the ground up for it.
It's a remastered version of Gears 5, so it runs natively on Series X
 
Last edited:

xacto

Member
Ok this is completely off topic and highly pedantic.
But over the past year I have seen this phrase come up more and more often on the internet. Is that slang, or is it actually a grammatically correct phrase? Shouldn't it be "would have known"?

I'm really sorry to be that pedantic, but I need to know :D

Sincerely

a non native English speaker :)

It should be "would have." Unless grammar is genuinely the issue, "would of" is probably just faster to type, but it's not correct.
 

Zathalus

Member
Full advantage again, eh? You will be saying that for all of next year. It's a benchmark data point, run by DF, and we have the data.

I do not want to discuss a new optimisation of Gears or whatever it is, as it needs the same benchmarking of the same code.

Sekiro has been benchmarked on PC and XSX using the same game code.

Dog ate my homework sir.
Lots to unpack here.

1) You accuse Microsoft of lying about BC games not taking advantage of the full console capabilities. God knows why; I'd wager you wouldn't question the same claim made by Sony.

2) You still fail to grasp that emulation comes with a performance overhead, despite this being clear with every emulator ever made.

3) You fail to engage with the Gears 5 point as it goes against your narrative. DF already took a look at an early build 6 months ago, there are videos out there that clearly show the new settings, and even Ars Technica said it was like for like compared to the PC version. Per the article:

This showed the game's campaign mode nearly reaching parity with the PC version while offering updates to its global illumination model. In the months since, the PC version has apparently adopted some of those illumination tweaks, so like-for-like comparisons show the PC version narrowly beating that of Xbox Series X on a visual basis, primarily due to superior ambient occlusion on the PC version... but, again, that's a PC version jacked up to "ultra" and "maximum" settings while running on an i7-8700K and RTX 3080. For pretty much every other graphical element you can think of (draw distance, shadow resolution, geometric detail), Series X is neck-and-neck.

Lastly, we can use Gears Tactics as well:

The only other game we've tested in the "optimized for Series X" pool at this point is Gears Tactics, which is currently a Windows 10 exclusive but will arrive on XB1 and Xbox Series on November 9. Its ability to stick to a 4K/60fps performance target on Series X, while running at the PC version's "ultra" settings preset, is commendable—really, it's about as pretty as Gears 5.

PC to compare:

[attached image]


So my point stands: 2080 Ti-class performance demonstrated in two optimized games.
 

Zathalus

Member
"all of the thousands of backwards compatible games run natively on the Xbox Series X|S, running with the full power of the CPU, GPU & SSD. No boost mode, no downclocking, the full power of the consoles for each & every game." -Mr Aaron Greenberg
https://www.eurogamer.net/articles/digitalfoundry-2020-xbox-series-x-back-compat-is-transformative

while Series X runs old games with full clocks, every compute unit and the full 12 teraflop of compute, it does so in compatibility mode - you aren't getting the considerable architectural performance boosts offered by the RDNA 2 architecture.
but again, this is the GPU running in compatibility mode, where it emulates the behaviour of the last generation Xbox - you aren't seeing the architectural improvements to performance from RDNA 2,
 

Kerlurk

Banned


Yeah, the Empire really strikes back! Good one! :)

If so, how is it that MS claims 5 IPC (instructions per cycle) for RDNA1 CUs (Compute Units) and 7 IPC for RDNA2 CUs in their Xbox Hot Chips talk? This is not necessarily a negative, but just wondering about the discrepancy.

I guess we will have to wait for the RDNA2 reveal.

Should be interesting to see what changes AMD has made to the CU's.
 
Last edited:

VCL

Member
Means absolute jack shit. Guess what, they are running the DX API.

Oh and remember, as stated by MS, BC throws the XSX's full 12 TF of power at it, no boost mode, no nothing. How do you think MS is doing so great with BC?
12 teraflops thrown at a game not optimized for Series X only runs better because of brute force... I'm not buying it; just look at Gears Tactics and Gears 5.
 

Yoboman

Member
Ok this is completely off topic and highly pedantic.
But over the past year I have seen this phrase come up more and more often on the internet. Is that slang, or is it actually a grammatically correct phrase? Shouldn't it be "would have known"?

I'm really sorry to be that pedantic, but I need to know :D

Sincerely

a non native English speaker :)
Just an example of where the wrong use of similar-sounding words in a phrase becomes so common that it gets accepted as proper English.

Like "one in the same" should be "one and the same." Or "I could care less" should be "I couldn't care less". Or "this peaked my interest" should be "this piqued my interest"
 

geordiemp

Member
Lots to unpack here.

1) You accuse Microsoft of lying about BC games not taking advantage of the full console capabilities. God knows why; I'd wager you wouldn't question the same claim made by Sony.

2) You still fail to grasp that emulation comes with a performance overhead, despite this being clear with every emulator ever made.

3) You fail to engage with the Gears 5 point as it goes against your narrative. DF already took a look at an early build 6 months ago, there are videos out there that clearly show the new settings, and even Ars Technica said it was like for like compared to the PC version. Per the article:



Lastly, we can use Gears Tactics as well:



PC to compare:

[attached image]


So my point stands: 2080 Ti-class performance demonstrated in two optimized games.

I don't care about a first-party game that has been recompiled and is being re-released with new optimisations; it will run better on PC as well. So what.

Do you still think XSX will have an 18% advantage over PS5 in third-party games lol. Oh how times have changed.
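
For anyone wondering where that 18% figure comes from, it's just the headline teraflop maths; a rough sketch, assuming the announced CU counts and clocks:

```python
# Rough sketch of the on-paper "18%" gap, using the headline specs:
# XSX: 52 CUs @ 1825 MHz fixed, PS5: 36 CUs @ up to 2230 MHz (variable clock).
def tflops(cus, mhz, shaders_per_cu=64, ops_per_clock=2):
    return cus * shaders_per_cu * ops_per_clock * mhz / 1e6  # MHz -> TFLOPS

xsx = tflops(52, 1825)  # ~12.15 TF
ps5 = tflops(36, 2230)  # ~10.28 TF at peak clock
print(f"{xsx:.2f} TF vs {ps5:.2f} TF -> ~{100 * (xsx / ps5 - 1):.0f}% on paper")
```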
 
Last edited:

geordiemp

Member
I'm still trying to wrap my head around that claim that the PS5's GPU is like a 2060 Super. It seems more like a 2070 Super based on what I've read.

We will compare how games run, and their ability to ray trace (which is very expensive in performance terms), to determine how good they are at running games.

Your post will not age well. After the 28th, things will become clearer.
 
Last edited:

Zathalus

Member
The key word(s) are full clocks, every compute unit and the full 12 teraflop of compute.
Yes, that full power is being used to emulate an Xbox One X. Just like I can run a PS3 emulator on my PC, but that doesn't mean my GPU transforms into a PS3, or that my PC is not using all of its clocks while emulating.
 

geordiemp

Member
The key word(s) are full clocks, every compute unit and the full 12 teraflop of compute.

No GPU gets close to using 100% of the TF for gaming; it's something like 40%.

You realise that the recent L1 common cache AMD patent gives up to a 20% perf increase on workloads it's designed to optimise? And that's just one cache...

TF is only a small part of the picture, and still SOME posters don't get it, but it is one easy-to-grasp number I guess (not you moussy). Oh well.

Time out, see ya l8r
 
Last edited:

Riky

$MSFT
The only thing we don't really know is how much more powerful it will be. Will it be a small difference or a massive one?

We have to wait and see what the answer will be.

It will start close as devs get used to the systems, then as next gen engines appear and the consoles get pushed the XSX will pull away due to a lot more powerful GPU, a faster CPU and much faster memory bandwidth. A triple advantage won't be hidden for long and these are the specs that matter like they always have done in the PC market.
 

Zathalus

Member
That's what I am saying, it doesn't transform into a GCN card. It's running at full. We can argue it isn't optimised but that's a different matter.
Of course it doesn't transform into a GCN card, the software emulates a GCN card. Like a PS3 emulator emulates a PS3.
 

Mr Moose

Member
No GPU gets close to using 100% of the TF for gaming; it's something like 40%.

You realise that the recent L1 common cache AMD patent gives up to a 20% perf increase on workloads it's designed to optimise? And that's just one cache...

TF is only a small part of the picture, and still posters don't get it, but it is one easy-to-grasp number I guess. Oh well.

Time out, see ya l8r
I'm just quoting Digital Foundry. Full 12 TF compute.

Of course it doesn't transform into a GCN card, the software emulates a GCN card. Like a PS3 emulator emulates a PS3.
However, it's important to stress one thing: while Series X runs old games with full clocks, every compute unit and the full 12 teraflop of compute, it does so in compatibility mode - you aren't getting the considerable architectural performance boosts offered by the RDNA 2 architecture.
But it just sounds like bullshit; you are getting the performance boost, because you are seeing the performance boost (apart from the res being locked to 1800p or some shit). It's not optimised for it, but you are seeing the boost from it.
 
Last edited:
This looks like post-launch language? Was Sony beating the drum about the world's most powerful console before games were actually out?
This has a lot more context to it than just bragging about the most powerful console. The gap was more significant than XSX/PS5, and it was clearly visible in game comparisons. However, the most important point was that the PS4 was the most powerful and the cheaper console! Combined, this made the PS4 the real deal back in 2013.

It’s nothing like today.
 

geordiemp

Member
It will start close as devs get used to the systems, then as next gen engines appear and the consoles get pushed the XSX will pull away due to a lot more powerful GPU, a faster CPU and much faster memory bandwidth. A triple advantage won't be hidden for long and these are the specs that matter like they always have done in the PC market.

TF has never mattered in the PC market; tell that to Vega. It's a simple number for users who don't understand.

Here is one for you: the recent AMD patent on the L1 cache gives up to a 20% perf increase for ML workloads, 20%, without any TF increase.

Go figure.

Just wait until you see how stuff runs; there is no advantage. :messenger_beaming:

Also you like to trigger on all of my posts; I notice that trend in a few posters, they all have a certain something... I won't say any more lol
 
Last edited:
"all of the thousands of backwards compatible games run natively on the Xbox Series X|S, running with the full power of the CPU, GPU & SSD. No boost mode, no downclocking, the full power of the consoles for each & every game." -Mr Aaron Greenberg
I believe it's simple marketing. Generally some work has to be done in the transition between old and new hardware to truly use it, and I do not believe MS has done that for every single game without a patch. Where PS4 Pro is not supported, not even PS4 titles actually use that console fully. I don't know why I should believe that's the case between two different technologies.
 
It will start close as devs get used to the systems, then as next gen engines appear and the consoles get pushed the XSX will pull away due to a lot more powerful GPU, a faster CPU and much faster memory bandwidth. A triple advantage won't be hidden for long and these are the specs that matter like they always have done in the PC market.

So the XSX is three times as powerful?

I don't know if I believe that.
 
It will start close as devs get used to the systems, then as next gen engines appear and the consoles get pushed the XSX will pull away due to a lot more powerful GPU, a faster CPU and much faster memory bandwidth. A triple advantage won't be hidden for long and these are the specs that matter like they always have done in the PC market.
The CPU is 100MHz faster; the rumor is the PS5 might have a single-die Zen chiplet vs two dies on Series X, and if true that will easily eclipse a measly 100MHz difference.

We also have to see what custom efficiency enhancements Sony has made to the GPU. If, say, the PS5 GPU reaches 75% utilization and the Xbox reaches 60% utilization, the PS5's performance would exceed the Series X's.
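
Rough numbers on that hypothetical, purely illustrative; the 75%/60% utilization figures are the assumption from the post, applied to the headline TF numbers:

```python
# Purely illustrative: effective throughput = peak TF x achieved utilization.
# The 75% / 60% figures are the post's hypothetical, not measurements.
ps5_peak, xsx_peak = 10.28, 12.15   # headline TF figures
ps5_eff = ps5_peak * 0.75           # ~7.7 "effective" TF
xsx_eff = xsx_peak * 0.60           # ~7.3 "effective" TF
print(f"PS5 {ps5_eff:.2f} vs XSX {xsx_eff:.2f} effective TF")
```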
 

Riky

$MSFT
The CPU is 100MHz faster; the rumor is the PS5 might have a single-die Zen chiplet vs two dies on Series X, and if true that will easily eclipse a measly 100MHz difference.

We also have to see what custom efficiency enhancements Sony has made to the GPU. If, say, the PS5 GPU reaches 75% utilization and the Xbox reaches 60% utilization, the PS5's performance would exceed the Series X's.

Rumours don't mean anything.

The cold hard reality is that in the three most important specs the XSX wins them all, two of them by quite a margin. It'll become very obvious as time goes by.
 

Yoboman

Member
It was indeed? And they delivered. But...

[attached screenshot]
This is such a massive tell

You don't just change your language from "world's most powerful console" - which MS have been using for years now - to "most powerful Xbox ever" by accident.

These messages are heavily workshopped, and then dispersed to internal teams and agencies to maintain consistent language.

Something happened along the way for them to go back and update the language they are using
 
I didn't say that.

The XSX has the advantage on the three most important metrics; that will show easily over time, like it always has in the past.

I don't know what you mean by "easily".

There has to be a big difference between the two for me to notice it without a Digital Foundry analysis. Like if one game is 1080p on one system and 4K on the other. If it's something like 1800p vs 4K, I won't notice it at all.
 