
Digital Foundry: Thief for Xbox One edges out the PS4 version

chubigans

y'all should be ashamed
There's a lot of misinformation going on in here. Thinking that the 1080p resolution resulted in lower filtering...no. That's not at all how that works. Clearly performance wasn't at the forefront of the dev's concerns, given the sub-30fps of both versions.

In fact I'm seeing a lot of "they should just lower the resolution to 900p/720p and get extra effects on PS4" and that's just a load of malarkey. What we're seeing with XB1, with the resolution issues and such, is a result not necessarily of performance, but of bandwidth concerns regarding ESRAM. To work within the 32MB, you have to pick and choose what you want to output. Right now devs have to actually choose between resolution and post-processing/effects on XB1, and the easy route at the moment is lower resolution. In time, there will be better tools so that devs can pump more into the ESRAM using smarter coding techniques enabled by better SDKs and such. But that will always be a factor for XB1.
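To make that 32MB budget concrete, here's some back-of-the-envelope math. The render-target layout below is an assumed deferred-style example, not Thief's actual setup:

```python
# Rough framebuffer footprint vs. the 32MB of ESRAM.
# The layout is an assumed deferred-style example, not Thief's real one.
TARGETS = [
    ("albedo", 4),     # RGBA8, 4 bytes/pixel
    ("normals", 4),    # RGBA8
    ("depth", 4),      # D24S8 or D32
    ("hdr_light", 8),  # RGBA16F light accumulation
]

def footprint_mib(width, height):
    """Total size in MiB of all render targets at a given resolution."""
    return sum(width * height * bpp for _, bpp in TARGETS) / (1024 ** 2)

for name, (w, h) in [("1080p", (1920, 1080)),
                     ("900p", (1600, 900)),
                     ("720p", (1280, 720))]:
    mib = footprint_mib(w, h)
    verdict = "fits in" if mib <= 32 else "blows past"
    print(f"{name}: {mib:5.1f} MiB -> {verdict} 32 MiB of ESRAM")
```

Under that assumed layout, 1080p needs ~40 MiB and doesn't fit, while 900p squeezes into ~27 MiB. That's exactly the resolution-vs-effects trade being described.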

The reason PS4 is almost always 1080p is because there are no major bandwidth hurdles like on XB1. You output at 1080p because it's low cost and barely affects your memory/graphics budget on PS4. Literally the only exception to this is BF4, which was using early SDKs that, like all launch games, didn't fully harness all the hardware the PS4 had to offer. Consider that PS4 launch games got resolution bumps to 1080p once they had better dev tools/SDKs (Assassin's Creed, Call of Duty) with no trade-off in performance.

At this point it is straight up FUD to believe PS4 titles are suffering graphically due to 1080p, or that the filtering effect from Thief would be due to the resolution bump.

edit: to be clear, I'm not saying running at 1080p isn't going to have a performance hit, because it is. But the costs of running 1080p on XB1 and PS4 are significantly different.
 

driver116

Member
I think with this resolution/framerate-gate that every website is talking about... the PS user base seems obsessed with resolution. Maybe the devs wanted to satisfy that "type" of user?

I think if they had used AF on PS4 they would have ended up with the same framerate (clearly the resolution has to be taken into consideration too, eh).

Well done anyway man - you've done great with the Xbox stuff.
 

Biker19

Banned
Amazing, isn't it? Seeing them scramble for every explanation other than the one in front of them is amusing.

Just sit back and smile at their hypocrisy.

It's fairly easy to see: the 1st page is full of people discrediting DF, citing money hats, and using the "lazy dev" excuse, and you being equally disingenuous about it doesn't help. These explanations were brought up and rejected by many for Xbox One versions of games, yet they're valid explanations here for some. Textbook definition of hypocrisy. Don't be upset at me for pointing it out.

LOL, you can't be serious with these posts.
 

TGO

Hype Train conductor. Works harder than it steams.
That makes zero sense...
So this is basically like Crysis 2 on consoles: PS3 had the better texture filtering but was lower res and lower framerate, except there the 360 version was the DF recommendation because of its higher res and framerate, which made sense, and this one doesn't go by the usual standard they measure these comparisons by.
Interesting. Good win for Xbox One then... even though technically it doesn't make sense :-/
 

SapientWolf

Trucker Sexologist
There's a lot of misinformation going on in here. Thinking that the 1080p resolution resulted in lower filtering...no. That's not at all how that works. [...] At this point it is straight up FUD to believe PS4 titles are suffering graphically due to 1080p, or that the filtering effect from Thief would be due to the resolution bump.
I don't think the XB1's GPU has the grunt to handle 1080p in games like BF4 even if it didn't have the bandwidth issues.

I also don't think the PS4's GPU has the grunt to do something like the Infiltrator demo at 1080p.
 

benny_a

extra source of jiggaflops
AFAIK the PS4 doesn't use OpenGL.
Maybe they are using LibGNMX. Maybe Strider is the same, and the issue only exists if you use that API instead of straight LibGNM.

This is always the case. Anyone not praising the PS4 is a Nintendo defender, PC fanatic, or Xbone fan. Some folks can't accept bad news on PS4. Get over it is what I say.
What is the bad news here? DF thinks the Xbone version is slightly preferred over the PS4, because it lacks some things and has other things?
 

Kimawolf

Member
This is how it goes with the internet's response to Digital Foundry lately:
The PS4 is more powerful than the Xbox One. Therefore, if DF says a PS4 game looks better, they are right. If DF says an Xbox One game looks better, they are wrong.

If people really believe this, why read DF at all? To me it seems like the recent fixation on 1080p forced the PS4 version's texture filtering choice, and that impacted image quality.

This is always the case. Anyone not praising the PS4 is a Nintendo defender, PC fanatic, or Xbone fan. Some folks can't accept bad news on PS4. Get over it is what I say.
 
That makes zero sense...
So this is basically like Crysis 2 on consoles [...] Good win for Xbox One then... even though technically it doesn't make sense :-/

You can't call it a good win and in the same breath call it nonsensical; it sounds completely disingenuous. It's clear which side of the debate you come down on, so just state it plainly without trying to act impartial.
 

Wotanik

Banned
I've been following this a lot now, not because Thief interests me, but because I find it interesting that both of the ports are so poorly made. Were the devs on a strict deadline? Every port has flaws; even the PC version struggles. It's quite horrible considering how much I liked Deus Ex: HR and how well it played (to my recollection).

So, are there some patches coming up? :)
 

Biker19

Banned
There's a lot of misinformation going on in here. Thinking that the 1080p resolution resulted in lower filtering...no. That's not at all how that works. [...] At this point it is straight up FUD to believe PS4 titles are suffering graphically due to 1080p, or that the filtering effect from Thief would be due to the resolution bump.

This. Thief's issues on PS4 are clearly down to lazy developers and an awful port.

The PS4 is definitely more than powerful enough to run games without any problems whatsoever at native 1080p. Plenty of other games on the system have proven that.
 

chubigans

y'all should be ashamed
I don't think the XB1's GPU has the grunt to handle 1080p in games like BF4 even if it didn't have the bandwidth issues.

I also don't think the PS4's GPU has the grunt to do something like the Infiltrator demo at 1080p.

Right, I'm not saying 1080p doesn't impact performance. But the costs of running 1080p on XB1 and PS4 are significantly different.
 

Respawn

Banned
There's a lot of misinformation going on in here. Thinking that the 1080p resolution resulted in lower filtering...no. That's not at all how that works. [...] At this point it is straight up FUD to believe PS4 titles are suffering graphically due to 1080p, or that the filtering effect from Thief would be due to the resolution bump.

Great post, but you know there are some who will ignore it :). This has to be discussed, and it should be. It will all be forgotten when Infamous is released.
 

Jtrizzy

Member
Next generation sucks because games don't run 1440p at 60fps for less than $500?
 

Warewolf

Member
Wait. That has to be some kind of mistake from a development perspective, right? Trilinear filtering hasn't provided a noticeable performance increase over 16x anisotropic since 2002. What is going on?
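For context on why this matters visually, here's the mip-selection math in sketch form, simplified from the EXT_texture_filter_anisotropic spec (real hardware differs in the details):

```python
import math

def mip_lod(du, dv, max_aniso=1):
    """Pick a mip level from the screen-space texel gradients.
    Trilinear (max_aniso=1) uses the larger gradient; anisotropic
    filtering divides it by up to max_aniso taps taken along the
    stretched axis. Simplified from EXT_texture_filter_anisotropic."""
    p_max, p_min = max(du, dv), min(du, dv)
    taps = min(math.ceil(p_max / p_min), max_aniso)
    return math.log2(p_max / taps)

# A floor seen at a grazing angle: texels stretched 16:1 on screen.
du, dv = 16.0, 1.0
print(f"trilinear: mip {mip_lod(du, dv):.1f}")      # 4.0 -> blurry mip
print(f"16x AF:    mip {mip_lod(du, dv, 16):.1f}")  # 0.0 -> sharpest mip
```

A higher mip level means a blurrier texture, which is why trilinear smears oblique floors and walls while 16x AF keeps them sharp, at a cost that has been trivial on GPUs for over a decade.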
 

Elios83

Member
I still don't get why Digital Foundry is mistaking the Unreal Engine's texture loading bug for a difference in texture filtering.
 

benny_a

extra source of jiggaflops
I still don't get why Digital Foundry is mistaking the Unreal Engine's texture loading bug for a difference in texture filtering.
They do not mistake those two. It's posters not reading the article, fueled by other posters submitting these screenshots without context. (Context is given in the article.)
 
Half a million more pixels and a better framerate on the PS4, and better filtering on the XO.

Man, even when given the nod, the XO falters heavily in comparison with the actual data. Trilinear filtering though, what the hell?
 

FranXico

Member
Is there a 2nd opinion/comparison on these drapes? I kinda refuse to believe that the texture quality is that bad on the PS4 version.


I think the OP should be updated to clarify that those textures only look like that due to a delay in asset streaming.
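For anyone unclear on what that streaming delay means in practice, here's a toy sketch of the general idea (an illustration only, not UE3's actual streamer; the numbers are made up):

```python
class StreamedTexture:
    """Toy mip streamer: rendering always samples the best mip loaded
    so far, so a screenshot taken right after a load or camera cut
    shows a blurry low-res mip until background reads catch up."""

    def __init__(self, full_res=2048, mip_count=6):
        self.full_res = full_res
        self.mip_count = mip_count
        self.resident = 1  # only the tiny lowest mip is loaded at first

    def stream_one_mip(self):
        # Stands in for one async disk read completing per frame.
        self.resident = min(self.resident + 1, self.mip_count)

    def best_resident_res(self):
        return self.full_res >> (self.mip_count - self.resident)

tex = StreamedTexture()
for frame in range(6):
    print(f"frame {frame}: sampling a {tex.best_resident_res()}px mip")
    tex.stream_one_mip()
```

Screenshot on frame 0 and you get the 64px mip; moments later the same scene samples the full 2048px texture.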
 

SapientWolf

Trucker Sexologist
Right, I'm not saying 1080p doesn't impact performance. But the costs of running 1080p on XB1 and PS4 are significantly different.
I agree with the overall point, but the XB1's GPU is going to crap out at 1080p well before the PS4's, even if bandwidth weren't an issue. ESRAM isn't the only problem. It just makes the problem even worse.
 

FranXico

Member
I agree with the overall point, but the XB1's GPU is going to crap out at 1080p well before the PS4's, even if bandwidth weren't an issue. ESRAM isn't the only problem. It just makes the problem even worse.

I thought the worst thing about the XB1 design was that the ESRAM is not large enough? If it were larger, it would be able to cope with higher-resolution framebuffers without much of a problem.
 

chubigans

y'all should be ashamed
I agree with the overall point, but the XB1's GPU is going to crap out at 1080p well before the PS4's, even if bandwidth weren't an issue. ESRAM isn't the only problem. It just makes the problem even worse.

Yeah, dunno what MS was thinking there. :(

I thought the worst thing about the XB1 design was that the ESRAM is not large enough? If it were larger, it would be able to cope with higher-resolution framebuffers without much of a problem.

Well, the problem wouldn't go away if the XB1 had GDDR5 instead of ESRAM, but it would definitely be much less of a problem.
 

Biker19

Banned
I agree with the overall point, but the XB1's GPU is going to crap out at 1080p well before the PS4's, even if bandwidth weren't an issue. ESRAM isn't the only problem. It just makes the problem even worse.

And what's going to happen when future 3rd-party games start being more demanding? It'll be even worse for the Xbox One than for the PS4, especially in resolution. Heck, we could even be seeing a ton of sub-HD and native 720p games on Xbox One.
 

Elios83

Member
They do not mistake those two. It's posters not reading the article, fueled by other posters submitting these screenshots without context. (Context is given in the article.)

Umm, no. In their article they're posting screens of the PS4 version at a moment where the textures have not been loaded, and they're talking about texture filtering? WTF?

Also, it's weird that this issue is present when the assets are loaded from the HDD; it's as if the textures are being loaded from the Blu-ray drive at a slow speed.
 
Umm, no. In their article they're posting screens of the PS4 version at a moment where the textures have not been loaded, and they're talking about texture filtering? WTF? [...]

Yeah, that seems a bit blatant to me. Why post screens of instances where the PS4 textures are not fully loaded?
 

BibiMaghoo

Member
Has anyone actually asked a DF employee whether they did this with full installs on both consoles?

It would clear up the recurring question, and possibly resolve the doubt people have about the point at which the game was tested. Then we wouldn't see the question on every page.

50 ppp, sue me
 
Reality check:

PS4 and Xbox One have the same CPU, but the PS4 has less performance reserved for the OS and thus more freed up for games and/or apps.

The PS4 has 8GB of GDDR5, while the Xbox One has 8GB of DDR3 plus 32MB of ESRAM. GDDR (Graphics DDR) is tuned for bandwidth rather than latency, which suits GPUs: they are far more tolerant of memory latency than CPUs are. The GDDR5 in the PS4 has vastly superior bandwidth (176GB/s vs. 68GB/s for the DDR3 and 102GB/s for the ESRAM) to the memory in the Xbox One.

The Radeon GPU in the PS4 is superior to the Xbox One's in every area. It has many more shader cores (1152 vs. 768) and better floating-point performance (1.84 TFLOPS vs. 1.31 TFLOPS); more unified shaders means more vertex, pixel, and geometry shader performance. It also has much more texel and pixel fillrate, since it has more texture mapping units (TMUs) and double the render output units (ROPs). The featureset is newer as well: the PS4 uses GCN v1.1 while the Xbox One uses GCN v1.0.
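Those headline figures check out against the commonly reported clocks. A quick sanity check, assuming 800MHz (PS4 GPU) and 853MHz (XB1 GPU) core clocks, with 5500MT/s GDDR5 and 2133MT/s DDR3 on 256-bit buses:

```python
def tflops(shader_cores, clock_ghz):
    # One FMA per core per cycle counts as two floating-point ops.
    return shader_cores * clock_ghz * 2 / 1000

def bandwidth_gbs(mt_per_s, bus_bits):
    # Transfers per second times bytes moved per transfer.
    return mt_per_s * (bus_bits // 8) / 1000

print(f"PS4 GPU: {tflops(1152, 0.800):.2f} TFLOPS")    # ~1.84
print(f"XB1 GPU: {tflops(768, 0.853):.2f} TFLOPS")     # ~1.31
print(f"GDDR5:   {bandwidth_gbs(5500, 256):.0f} GB/s") # 176
print(f"DDR3:    {bandwidth_gbs(2133, 256):.0f} GB/s") # ~68
```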

The HDD and Blu-ray drives are nearly identical.

To sum up: the PS4 is better in every aspect. With an unbiased developer there shouldn't be any reason why the PS4 version isn't vastly superior: better framerates, higher resolution, more effects, or a combination of the three.

Anisotropic filtering uses a very small amount of texel fillrate and memory bandwidth. It has been next to free (less than a 1% performance cost) since the Radeon X1900/GeForce 7900 era, so it should be enabled in every PS4 and Xbox One game by default.

Things aren't so straightforward. I already gave you valid hypotheticals; let me run one of them by you again.

Say you cannot get close to hitting 1080p on Xbox One. You pick a resolution below that which performs well and lets you do the level of effects you want. Say you can ALMOST hit 1080p on PS4 with the performance and level of effects you want; you may well, as a developer, feel that native output outweighs losing some effects here and there.

I am not saying that is what has happened here; I don't know. But if a game is almost at the point of 1080p on PS4, and a few cuts can get it there, sometimes making those cuts to reach a native output is going to be preferred. Where you cannot get there on Xbox One, you'll just use whatever resolution lets you keep the performance and effects you want, because no scaling is much better than, say, a 5% upscale, and a 5% upscale is better than a 10% one.
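The pixel arithmetic behind that trade-off (nothing assumed here beyond the resolutions themselves):

```python
FULL_HD = 1920 * 1080  # 2,073,600 pixels

for name, (w, h) in [("1080p", (1920, 1080)),
                     ("900p", (1600, 900)),
                     ("720p", (1280, 720))]:
    px = w * h
    print(f"{name}: {px:>9,} px = {px / FULL_HD:6.1%} of 1080p")
```

900p pushes only ~69% of the pixels of 1080p, so a game that just misses 1080p natively may be able to buy the remaining headroom with a few small effect cuts instead.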
 

Saty

Member
The framerate drops plus the frame pacing issues that result in lag are far bigger concerns than texture filtering, streaming, or whether there is or isn't POM.

To call the PS4/XB1 versions lazy/shitty/a mess/unoptimized based on the latter, and to hope the game suffers commercially because of it, is way too much. Next time another AAA studio shuts down and people brush away the notion that users fixate way too much on production values and only tolerate ever-increasing standards of technical proficiency, to the detriment of the industry, I'll point them here.
 

Kimawolf

Member
Maybe they are using LibGNMX. Maybe Strider is the same, and the issue only exists if you use that API instead of straight LibGNM.


What is the bad news here? DF thinks the Xbone version is slightly preferred over the PS4, because it lacks some things and has other things?
Well, read the thread. Folks are calling them biased and such. It's crazy. I agree, by the way. If you are stuck on consoles, get the Xbone version. But really you should be getting it on PC, is what I got from it.
 

test_account

XP-39C²
I think the OP should be updated to clarify that those textures only look like that due to a delay in asset streaming.
True. I did see some comments about it being a texture streaming issue; I was mostly wondering whether someone had done a 2nd comparison to debunk it 100% :)
 

benny_a

extra source of jiggaflops
Umm, no. In their article they're posting screens of the PS4 version at a moment where the textures have not been loaded, and they're talking about texture filtering? WTF?
Maybe your browser is broken, this is how it looks for me:
[two screenshots of the DF article, showing the comparison images alongside their captions]


Correct context is given. The only thing one can complain about is that the first issue would be better demonstrated with an animated image.

First time I've heard of LibGNMX. What's the difference between it and LibGNM?
A DirectX-style wrapper for libgnm.

Well, read the thread. Folks are calling them biased and such. It's crazy. I agree, by the way. If you are stuck on consoles, get the Xbone version. But really you should be getting it on PC, is what I got from it.
I have read the thread. I asked what the "news" was. Digital Foundry's debatable verdict is not news.
 

btrboyev

Member
So now framerate and res don't matter? Seems a strange outcome if the one with the higher res and higher framerate isn't the winner.


24fps vs. 20 is barely noticeable... they are both shit.

1080p and 900p are hard to distinguish here because the Xbox is using a better form of filtering, so it ends up looking on par, and in some cases better.
 

H6rdc0re

Banned
24fps vs. 20 is barely noticeable... they are both shit.

1080p and 900p are hard to distinguish here because the Xbox is using a better form of filtering, so it ends up looking on par, and in some cases better.

Filtering has absolutely nothing to do with resolution.
 