
DF: Control PS5 Vs Xbox Series X Raytracing Benchmark

Riky

$MSFT
I don't think you should hate me for saying that I don't believe photo mode is representative of what people will experience during gameplay.

I understand why it was done, but I have my doubts it's something that will be seen often in actual gameplay. When will developers allow their games to run the same as photo mode, with uncapped framerates? Something like that will rarely ever happen.

Most of the time, if both systems have the same settings and the same capped framerates, the results should be extremely similar between the two. I'm definitely not expecting a massive difference. Although I will admit that Hitman 3 was a larger difference than what I was expecting. Will have to wait and see if that becomes the norm.

P.S.

Also, could you tone down your obsession with VRR? It's getting a little annoying and just makes discussions awkward, in my opinion.

I didn't mention VRR😂

You're still missing the point. It's a glitch in a mode; it isn't supposed to show what people will experience in a run of gameplay. It shows a technical fact, and that's the point of the video.
 

CrustyBritches

Gold Member
I thought the photo mode had an unlocked framerate while the gameplay mode was capped at 30fps. That's why I said photo mode doesn't really represent the actual gameplay, due to the cap.
Ok, now I understand. I was referring to PC with similar hardware. On that basis, photo mode is indeed shown to represent actual gameplay.

Multiple people have tested this and reached the same findings in this thread as DF did with their benchmark.
 
Ok, now I understand. I was referring to PC with similar hardware. On that basis, photo mode is indeed shown to represent actual gameplay.

Multiple people have tested this and reached the same findings in this thread as DF did with their benchmark.

I was just thinking about consoles and the results that Digital Foundry got from both modes.

Sorry about that.
 

SlimySnake

Flashless at the Golden Globes
c6gkvTX.jpg
You have one crash every 2 weeks. That's not bad at all, my man.

Besides, not all of those are crashes. The "something went wrong" is the only one where the PS5 hard crashed on you. Two of them are related to Immortals Fenyx Rising. The others are telling you to go online. What's that about?
 
DF said the versions are pretty much identical during gameplay. DF was testing photo mode to see which console could push more frames if the developer decided to.

The X is a clear winner.

The devs decided on parity at 30fps. Neither console can hit 60fps all the time.

With an unlocked frame rate the Series would win comfortably.
Z9y8Jn9.png


Control is an older release, not actually a new title, so it's not really a test of potential power.
Exactly. This is the same thing DF did back in the last gen with The Division on the OG Xbox One, where DF turned the camera up to the sky to get the game to render at a higher resolution. PS5 is the new OG Xbox One: nothing is on screen (look at the skybox), so the PS5 hardware is under no pressure, and neither is the XSX.
 
Wow. It has nothing to do with gameplay; they say in the video it's purely academic.

Well, in theory, if a developer uncaps the framerate in a game you would get a higher FPS on the XSX, but capping it would make the experience between the two a lot closer. That's basically what I'm getting from their analysis. I'm definitely not expecting the XSX to run at a locked 30 while the PS5 drops to 10FPS, for example. Maybe the XSX will be at 30FPS while the PS5 drops a few frames.
 

x@3f*oo_e!

Member
It looks like Series X might have an advantage in the future, but only on ray-traced games (usually the 30fps 'visuals' mode), because it has more compute unit (CU) power; whilst PS5 can be equal or better on purely rasterized games (usually the 60fps 'performance' mode), because the higher GPU clocks give more performance to the texture mapping and render output units than on Series X.

Just summarizing what other people have told me, but it seems to make sense. That's excluding the stutters (or dips) that I think may be texture-streaming related on Series X, which PS5 doesn't seem to have (or not as badly).

Worth bearing in mind for the future.

Also PS5 will get VRR eventually.
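
To put rough numbers on the narrow/fast vs wide/slow idea, here's a quick back-of-envelope in Python using the public paper specs. The CU counts and clocks are the announced figures; the 64 ROPs per console is just the commonly reported number, so treat that part as an assumption:

# Compute throughput scales with CUs x clock; pixel fillrate scales
# with ROPs x clock (ROP counts assumed at 64 each, per common reports).
def tflops(cus, ghz):
    # 64 shaders per CU, 2 FLOPs per shader per clock (FMA)
    return cus * 64 * 2 * ghz / 1000

def gpix_per_s(rops, ghz):
    return rops * ghz

print(tflops(52, 1.825), gpix_per_s(64, 1.825))  # XSX: ~12.1 TF, ~116.8 Gpix/s
print(tflops(36, 2.230), gpix_per_s(64, 2.230))  # PS5: ~10.3 TF, ~142.7 Gpix/s
# XSX wins the compute-bound (RT-heavy) metric; PS5 wins the ROP-bound one.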
 
Last edited:

DinoD

Member
Exactly. This is the same thing DF did back in the last gen with The Division on the OG Xbox One, where DF turned the camera up to the sky to get the game to render at a higher resolution. PS5 is the new OG Xbox One: nothing is on screen (look at the skybox), so the PS5 hardware is under no pressure, and neither is the XSX.

You both have no idea what you're talking about. All this potentially proves is that the SX has more GPU power. To play the game you also need the CPU and I/O. If the frame rate were uncapped in the game mode, the results would not be the same as in photo mode.
 

Riky

$MSFT
Well, in theory, if a developer uncaps the framerate in a game you would get a higher FPS on the XSX, but capping it would make the experience between the two a lot closer. That's basically what I'm getting from their analysis. I'm definitely not expecting the XSX to run at a locked 30 while the PS5 drops to 10FPS, for example. Maybe the XSX will be at 30FPS while the PS5 drops a few frames.

They have already done an analysis of runs of gameplay; this video isn't about that. In this mode neither console would reach a constant 60fps, as they found, which is why it's capped at 30.
However, the glitch they found in photo mode shows that on average the Xbox would be closer to 60fps and outperforms the PS5 by 16%, somewhere near the paper specs of the machines.
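
For what it's worth, a quick sanity check of that 16% against the paper TFLOPS (a rough sketch in Python; the TF figures are the announced ones):

xsx_tf, ps5_tf = 12.15, 10.28
print(round((xsx_tf / ps5_tf - 1) * 100, 1))  # ~18.2% on paper vs the ~16% DF measured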
 
It looks like Series X might have an advantage in the future, but only on ray-traced games (usually the 30fps 'visuals' mode), because it has more compute unit (CU) power; whilst PS5 can be equal or better on purely rasterized games (usually the 60fps 'performance' mode), because the higher GPU clocks give more performance to the texture mapping and render output units than on Series X.

Just summarizing what other people have told me, but it seems to make sense. That's excluding the stutters that I think may be texture-streaming related on Series X, which PS5 doesn't seem to have.

Worth bearing in mind for the future.

Also PS5 will get VRR eventually.

Kind of reminds me of what's happening between AMD GPUs and Nvidia GPUs.

Take away RT and AMD GPUs perform extremely well compared to Nvidia GPUs. Add RT and they start to fall behind a lot more.

I'm seeing the same thing here. However, it doesn't appear to be a massive difference, if photo mode is truly representative of what future RT experiences might be like.

Curious to see how many games actually push RT on these systems.
 

AgentP

Thinks mods influence posters politics. Promoted to QAnon Editor.
They have already done an analysis of runs of gameplay; this video isn't about that. In this mode neither console would reach a constant 60fps, as they found, which is why it's capped at 30.
However, the glitch they found in photo mode shows that on average the Xbox would be closer to 60fps and outperforms the PS5 by 16%, somewhere near the paper specs of the machines.
There are also render differences, and who knows what else is different in photo mode. If there were a 16% advantage, you would think it would show in performance mode when the alpha effects get heavy and the frame rate drops.
 
They have already done an analysis of runs of gameplay; this video isn't about that. In this mode neither console would reach a constant 60fps, as they found, which is why it's capped at 30.
However, the glitch they found in photo mode shows that on average the Xbox would be closer to 60fps and outperforms the PS5 by 16%, somewhere near the paper specs of the machines.

Is photo mode truly the same as actual gameplay? That's why that 16% might not be applicable to all situations, since each system has its own strengths and weaknesses.

Also, wasn't the actual delta supposed to be 20%-30% between the two?
 

CrustyBritches

Gold Member
I was just thinking about consoles and the results that Digital Foundry got from both modes.

Sorry about that.
Same thing here, I was thinking PC. No worries.

Their settings are built around a 30fps minimum. In most scenarios the XSX has headroom, but not enough to get to a 60fps baseline, especially in the Corridor of Doom, etc.

The real mystery is why the Corridor of Doom behaves the way it does on PS5 and XSX, at 32fps and 33fps respectively in this scene. It's not CPU bottlenecked. It has to be either:
1. XSX dropping performance close to PS5's, with the same bottleneck on both systems governing the rest of the pipeline.
2. PS5 being faster at something and catching up to XSX perf in these specific scenarios.

It's weird that this occurs in specific areas, but in general the XSX enjoys more overhead. You could argue that the XSX could have better settings, but it can't, because of those similar minimums right above 30fps; there would be too much frame rate variation. It's an XSX win for AVG, but basically a tie for MIN.
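
A toy illustration of that AVG-win/MIN-tie pattern, in Python; the frame rates below are made-up stand-ins, not DF's captures:

# Hypothetical per-scene frame rates: XSX pulls ahead in lighter scenes,
# but both converge in the Corridor-of-Doom-style worst case.
ps5 = [45, 50, 48, 32, 47]
xsx = [55, 58, 56, 33, 54]
for name, fps in (("PS5", ps5), ("XSX", xsx)):
    print(name, "avg:", sum(fps) / len(fps), "min:", min(fps))
# XSX avg 51.2 vs PS5 avg 44.4, but mins of 33 vs 32 are basically a tie,
# and settings have to be chosen around the min, so the AVG headroom goes unused.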
 
Last edited:

John Wick

Member
It's a raw GPU benchmark. The Series X, as I've been saying, has distinct advantages in GPU size and memory bandwidth. People tried to deny this with custom secret sauce, but here we see what we have always seen in PC land: a bigger GPU is better.

Those in denial are about to have a very long generation, best to accept it now.

Ladies and gentlemen, damn it was really only a matter of time.
Riky, you first-rate clown and Mr. Fuckin' Obvious. We know the SX GPU is more powerful in certain aspects, especially in compute and RT. There is no one in denial, you thick idiot. It isn't the gap you think it is. Photo mode allows the GPU to stretch its legs without getting bogged down or bottlenecked.
 
Same thing here, I was thinking PC. No worries.

Their settings are built around a 30fps minimum. In most scenarios the XSX has headroom, but not enough to get to a 60fps baseline, especially in the Corridor of Doom, etc.

The real mystery is why the Corridor of Doom behaves the way it does on PS5 and XSX, at 32fps and 33fps respectively in this scene. It's not CPU bottlenecked. It has to be either:
1. XSX dropping performance close to PS5's, with the same bottleneck on both systems governing the rest of the pipeline.
2. PS5 being faster at something and catching up to XSX perf in these specific scenarios.

It's weird that this occurs in specific areas, but in general the XSX enjoys more overhead. You could argue that the XSX could have better settings, but it can't, because of those similar minimums right above 30fps; there would be too much frame rate variation. It's an XSX win for AVG, but basically a tie for MIN.

Correct me if I'm wrong, but it could very well be that the XSX isn't enough stronger than the PS5 to hold a higher locked framerate or better settings. It probably didn't make sense to have an unlocked framerate, so the developers likely decided the same cap was best on both, even though the XSX has a higher average framerate. In the end the results will probably feel identical to people who play the game.
 
Honest question for GAF members:

Why do all system performance comparison threads have to include a snarky and childish remark such as “AND ANOTHER ONE! LOLOLOL” instead of a measured, impartial discussion regarding the differences between the games?

As much as I dislike ResetEra, their strict intolerance of system warring is one area of moderation that I think NeoGaf could benefit from, and it would enhance GAF's reputation tremendously. To a casual passer-by who might hate Era's totalitarian politics, but comes to visit NeoGaf and sees what appears to be a group of recalcitrant 14-year-olds covered in Cheeto dust, arguing fruitlessly over their shiny plastic boxes amidst a cloud of marijuana smoke, it's almost enough to make political ideology tolerable.

Are we going to have adult discussions or not? It’s as simple as that.

LOL it's just fun trash talk.
 
Last edited:

Rentahamster

Rodent Whores
Well, in theory, if a developer uncaps the framerate in a game you would get a higher FPS on the XSX, but capping it would make the experience between the two a lot closer. That's basically what I'm getting from their analysis. I'm definitely not expecting the XSX to run at a locked 30 while the PS5 drops to 10FPS, for example. Maybe the XSX will be at 30FPS while the PS5 drops a few frames.
No, not necessarily. Gameplay mode and photo mode utilize the CPU/RAM/SSD/GPU differently, so the 16% performance difference wouldn't necessarily be reflected directly with an uncapped framerate in gameplay mode.

It's entirely possible that, with the way Control works, both systems would run at nearly the same uncapped rate, because some other hardware bottleneck or unoptimized bit of the programming makes it so the XSX can't utilize the extra GPU power even if it wanted to. Or maybe it blows the PS5 out of the water for other reasons. Who knows.

The point is, the devs can't achieve a smooth 60fps on either console in pretty-graphics mode, so the prudent thing to do is cap both at 30fps, as that's the best compromise they can make.

Any console being able to do a few more fps in uncapped mode is entirely irrelevant. The only time it would matter is if you could get one to a near-constant 60fps, but since neither can do that, the point is moot, and for all intents and purposes both systems are the same.
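
A minimal sketch of why the cap hides the gap (Python, with hypothetical render times just to show the mechanism):

CAP_MS = 1000 / 30  # a 30fps cap means a 33.3ms frame budget

def displayed_fps(render_ms):
    # A capped game waits out the remainder of the frame budget before
    # presenting, so any render time under the cap still displays as 30fps.
    return 1000 / max(render_ms, CAP_MS)

print(displayed_fps(28.0))  # faster console: renders in 28ms, shows 30fps
print(displayed_fps(31.5))  # slower console: renders in 31.5ms, also shows 30fps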
 
No, not necessarily. Gameplay mode and photo mode utilize the CPU/RAM/SSD/GPU differently, so the 16% performance difference wouldn't necessarily be reflected directly with an uncapped framerate in gameplay mode.

It's entirely possible that, with the way Control works, both systems would run at nearly the same uncapped rate, because some other hardware bottleneck or unoptimized bit of the programming makes it so the XSX can't utilize the extra GPU power even if it wanted to. Or maybe it blows the PS5 out of the water for other reasons. Who knows.

The point is, the devs can't achieve a smooth 60fps on either console in pretty-graphics mode, so the prudent thing to do is cap both at 30fps, as that's the best compromise they can make.

Any console being able to do a few more fps in uncapped mode is entirely irrelevant. The only time it would matter is if you could get one to a near-constant 60fps, but since neither can do that, the point is moot, and for all intents and purposes both systems are the same.

Kind of like a 1070 Ti compared to a 1080. One is obviously more powerful than the other, but they provide pretty much the same experience. Now, going from a 1060 to a 1070 Ti is very noticeable.

My desktop had a 1070 Ti while my laptop has a 1060. I know mobile GPUs are generally weaker, but I believe my experience still matters.
 

ABnormal

Member
Because the framerate is capped to 30fps in gameplay mode. If, in any one particular scene, PS5 does 30fps and XSX does 35fps, they're both still only going to display 30fps during regular gameplay. Then there's also the point that photo mode doesn't use as many CPU resources, and that affects framerate as well. During normal gameplay, if both consoles dip below 30fps, it might not always reflect that 16% average from photo mode, because there are a lot of other variables to account for under the hood, and a surface analysis of the game isn't always going to tell us all the answers.
Development on consoles doesn't work that way. When a developer decides in advance on a fixed framerate cap, all the spare computation is utilized on rendering or some other useful aspect. On consoles, leaving processing power unused is just absurd; everyone pushes as much as possible. The result depends on the engine and the quality of the coding (and, obviously, the money spent on assets and optimization). But every ounce of usable power is always used.
 

Rentahamster

Rodent Whores
Kind of like a 1070 Ti compared to a 1080. One is obviously more powerful than the other, but they provide pretty much the same experience. Now, going from a 1060 to a 1070 Ti is very noticeable.
Not really, but sorta. When you compare a PC component like a 1070 Ti vs a 1080, the comparison is pretty easy, because one assumes the rest of the PC is the same and you're only comparing that one part. Also, when benchmarking on PC, the software you're running is exactly the same.

When you're comparing XSX vs PS5, while it's true that the XSX has more CUs, there are a lot of other differences under the hood that may or may not make an impact on the overall performance of any given game. Additionally, you're running two different versions of the software, one for each console.

What this means is that with both the XSX vs PS5 and the 1070 Ti vs 1080 comparisons, you can make the qualitative assessment that, for all intents and purposes, their performance is pretty much the same and gives the same experience to the majority of users.

The difference is that the "why" is a lot more easily answered with the 1070 Ti vs 1080: you only changed out one part from the same series of GPU. In the case of XSX vs PS5, there are a helluva lot more variables to consider, and that makes the "why" part of the question that much harder to answer.

This is why the photo mode test is enlightening: we are able to eliminate some variables, like CPU utilization. It doesn't tell us the whole picture, but it's food for thought.
 

Rentahamster

Rodent Whores
Development on consoles doesn't work that way. When a developer decides in advance on a fixed framerate cap, all the spare computation is utilized on rendering or some other useful aspect. On consoles, leaving processing power unused is just absurd; everyone pushes as much as possible. The result depends on the engine and the quality of the coding (and, obviously, the money spent on assets and optimization). But every ounce of usable power is always used.
I'm talking about photo mode.

edit: Actually on a second reading, I'm not sure what your point is and why that is relevant to what I said.
 
Last edited:

x@3f*oo_e!

Member
Development on consoles doesn't work that way. When a developer decides in advance on a fixed framerate cap, all the spare computation is utilized on rendering or some other useful aspect. On consoles, leaving processing power unused is just absurd; everyone pushes as much as possible. The result depends on the engine and the quality of the coding (and, obviously, the money spent on assets and optimization). But every ounce of usable power is always used.
"Every ounce of usable power is always used" - but not if they didn't fully optimise, and it kind of looks like they didn't totally optimise here.
 

CrustyBritches

Gold Member
Correct me if I'm wrong, but it could very well be that the XSX isn't enough stronger than the PS5 to hold a higher locked framerate or better settings. It probably didn't make sense to have an unlocked framerate, so the developers likely decided the same cap was best on both, even though the XSX has a higher average framerate. In the end the results will probably feel identical to people who play the game.
According to these benchmarks the XSX does have overhead for a better setting or two. Refer to Nvidia's Control settings performance guide for the options. It was a prudent decision to base the settings around 30fps for Quality and 60fps for Performance, considering both consoles get pushed down with their heads barely above the 30fps water in the Corridor of Doom and a few other scenarios. Above all, it's a stable frame rate experience when capped, instead of jumping between 40-55fps all the time.

Now, let us all take a moment to praise the gods for the 30fps and 60fps options on so many next-gen games so far. Even 120fps sometimes. We don't even have to kill each other!:messenger_peace:
 
Last edited:

Rea

Member
Dude, the thing has been out for months. Multiple people are taking it apart on video, changing the paste, etc. The fucking video is from PlayStation.

You do the math. Are you trolling?
I'm not trolling, I just want to know. What is the actual size of the APU, and who is saying what it is? Where is the die shot measuring the actual size of the APU? I'm curious and can't find one. If you have one, please share.
 

truth411

Member
Is photo mode truly the same as actual gameplay? That's why that 16% might not be applicable to all situations, since each system has its own strengths and weaknesses.

Also, wasn't the actual delta supposed to be 20%-30% between the two?

That 20-30% was FUD. Xfans wrongly thought the PS5 was a 9.2Tflops box that boosts up to 10.28Tflops.
 
That 20-30% was FUD. Xfans wrongly thought the PS5 was a 9.2Tflops box that boosts up to 10.28Tflops.

Ok, I thought the 20%-30% was calculated from the flops and the other advantages the XSX has. The stated delta is always changing, which is why I'm getting confused.
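
Part of the confusion is just arithmetic: the same paper gap reads differently depending on which console you pick as the baseline. A rough sketch in Python with the announced TF figures:

xsx_tf, ps5_tf = 12.15, 10.28
print(round((xsx_tf / ps5_tf - 1) * 100, 1))  # ~18.2%: "XSX is ~18% faster"
print(round((1 - ps5_tf / xsx_tf) * 100, 1))  # ~15.4%: "PS5 is ~15% slower"
# The 20-30% claims assumed PS5 sat near 9.2TF instead of its 10.28TF boost clock:
print(round((xsx_tf / 9.2 - 1) * 100, 1))     # ~32.1%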
 

ABnormal

Member
"every ounce of usable power is always used" - but not if they didn't fully optimise - kindof looks like they didn't totally optimise here.
This is a common misunderstanding. They always use 100% of what they are ABLE to use. As the coding is improved and optimized over time, the CUs' average idle time is reduced, the final result on screen improves, and what they are able to use increases. But they never purposely leave power unused.
 

ABnormal

Member
I'm talking about photo mode.

edit: Actually on a second reading, I'm not sure what your point is and why that is relevant to what I said.
It means that frame rate is not the only thing that consumes processing power. When developers cap the frame rate, all the remaining processing power is used on rendering detail, according to availability; there is never purposely unused processing power. So if the XSX had processing power to spare beyond the frame rate cap, it would already have been used on some rendering aspect (you can see this in every game comparison since comparisons have existed). If there isn't any significant difference in rendering, it means the developers didn't have more available processing power.
 

CrustyBritches

Gold Member
Ok I thought the 20%-30% was calculated with the flops and other advantages the XSX has. The stated delta is always changing which is why I'm getting confused.
The TFLOPS gap between the systems is 17%. However, the PS5 has more than been holding its own, with the XSX pushing back a bit lately. The differences in perf between these systems are negligible. Something to remember, too, is that certain engines and settings configs will favor narrow/fast or wide/less-fast designs. We should see a back-and-forth like the later-gen HD-twin PS3 vs 360 multiplats.
 
Last edited:
The TFLOPS gap between the systems is 17%. However, the PS5 has more than been holding its own, with the XSX pushing back a bit lately. The differences in perf between these systems are negligible. Something to remember, too, is that certain engines and settings configs will favor narrow/fast or wide/less-fast designs. We should see a back-and-forth like the later-gen HD-twin PS3 vs 360 multiplats.

That's probably true. This really isn't an X1 vs PS4 kind of situation. Both systems seem really well designed, in my opinion.
 

SlimySnake

Flashless at the Golden Globes
Posted this in the next-gen thread but figured I'd leave this here too. The CPU and GPU usage is virtually identical in photo mode and in-game. It doesn't matter if you are in combat, standing still, or throwing shit about; it's pretty much the same in both modes.

Took several screenshots, but this gif shows it well.

lHhPGbN.gif
 
D

Deleted member 17706

Unconfirmed Member
Both are pathetic and neither should use ray tracing at all outside of maybe some very custom stuff.
 
Last edited by a moderator:

SlimySnake

Flashless at the Golden Globes
Both are pathetic and neither should use ray tracing at all outside of maybe some very custom stuff.
Nah. Control is an extremely poor performer even on PC; it's not a good benchmark for anything, especially RT. Part of it is the destruction, which is unlike anything this gen, but it's mostly just an extremely poorly optimized game.

Here is Spider-Man running at native 4K 30fps on a 10-tflops PS5. This is an open-world game. Drop the resolution down to 1440p like the UE5 demo and Demon's Souls, and you are going to get games with some amazing RT and photorealistic visuals.
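
For context on how much headroom that resolution drop buys, simple pixel-count arithmetic in Python:

px_4k = 3840 * 2160
px_1440p = 2560 * 1440
print(px_4k / px_1440p)  # 2.25: native 4K shades 2.25x the pixels of 1440p
# so dropping to 1440p frees a large chunk of GPU time for RT effects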

ATnFJUC.jpg
AMaWVh5.jpg
qo4FgcO.jpg
EnBHtjS.jpg
 
Last edited:
Was busy today, so I missed all the 'fun' here. It was interesting to read from the start and see how quickly the misdirection machine was spun up to downplay this. It's clearly the best and most obvious indicator of the actual power difference to date, but there are just soooo many people desperate to bury it in confusion (and sometimes outright lies) that it's actually quite sad to see.

More time and more examples are still needed to be sure, but it's becoming clearer that, as the generation progresses, the specs will not have lied. As games get more complex, the gap will widen further and more obviously, IMO. It's only a big deal to those who will be suicidal at the realisation, though. Both systems are fit for purpose.
 