
Digital Foundry: Performance of Dying Light [PS4 1080p/XO 1080pr]

So which one to get out of the consoles? Seems like six of one and half a dozen of the other.

Do you prefer framerate or IQ?
AF really doesn't take much to do. I've never noticed an FPS decrease at all when turning it on on PC, and I'm fairly certain that's usual. I remember Nvidia's recommended settings blog for Far Cry 4 showed benchmarks with AF on/off and there was no difference between settings.
The PS4 has to use the ~176GB/s of bandwidth for normal RAM usage as well. AF is, for all intents and purposes, an expensive operation on consoles.
 

RoboPlato

I'd be in the dick
AF really doesn't take much to do. I've never noticed an FPS decrease at all when turning it on on PC, and I'm fairly certain that's usual. I remember Nvidia's recommended settings blog for Far Cry 4 showed benchmarks with AF on/off and there was no difference between settings.

I would be super shocked if the lack of AF isn't a mistake. It's an easy, cheap visual upgrade that shouldn't require dropping the resolution or cause any dips to FPS.
It really does depend on your specs. The idea of AF having zero effect on performance is a myth. The framerate of Source-based games on my MacBook Pro drops noticeably above 2x AF.
 

mintylurb

Member
No, not really.

The idea that a marginally better CPU should mean better frame rates for multiplatform games is absurd.
It can help make up for the GPU deficit, but it does not mean that Xbox One games are going to achieve higher frame rates.

X1's CPU is 150MHz faster per core. That's 900MHz total across the 6 cores usable for games.
And X1 can utilize up to 7 cores, compared to PS4's 6 cores.
This is akin to Goku going from Kaioken 1x to 3x to beat PS4... I mean Vegeta.
 

Shredderi

Member
Do you prefer framerate or IQ?

The PS4 has to use the ~176GB/s of bandwidth for normal RAM usage as well. AF is, for all intents and purposes, an expensive operation on consoles.

Expensive on consoles it may be, but it also improves the visuals significantly, so it's worth it IMO.
 

danowat

Banned
I guess the question is: is the framerate of the XO version close enough to the PS4 version's to warrant choosing it for the better IQ?
 

Kezen

Banned
It really does depend on your specs. The idea of AF having zero effect on performance is a myth. The framerate of Source-based games on my MacBook Pro drops noticeably above 2x AF.

The impact of 16x AF is negligible in 99% of cases, even on PC GPUs with less bandwidth than the PS4.
[Image: Skyrim anisotropic filtering performance chart]


This is on a GeForce 560.
 

dr_rus

Member
Do you prefer framerate or IQ?

The PS4 has to use the ~176GB/s of bandwidth for normal RAM usage as well. AF is, for all intents and purposes, an expensive operation on consoles.

No, it's not. It's the exact same operation as on the same-architecture GPUs on PC. All AF queries are completely cached on chip nowadays; this is why several generations of GPU architectures haven't shown big drops in performance.

The only explanation for this is a bug in the PS4 SDK or the lack of a clear requirement for AF to be enabled on such surfaces.

This is beyond stupid, really. Having a +50% GPU advantage and losing on AF.
 

kitch9

Banned
I played Dying Light (XB1) for about 4 hours last night, and I thought it looked & played splendidly. I did not notice any glaring frame-rate issues, pop-in, or screen tearing. It just looked and played great. No complaints here, I'm glad I bought it.

Thanks for coming into a DF thread and telling us you think it looks great.

It's good to know you think it looks great.
 

Elandyll

Banned
Obviously, to some, AF being sacrificed is a bigger deal than lower resolution and/or framerate.
Although PS4's theoretical memory bandwidth is 176GB/s, considering this isn't the first PS4 game that lacks AF, I can only presume that running at 1080p while having the CPU/GPU share the bandwidth could be causing this issue. But who knows. Maybe dark could ask Techland why the PS4 version is lacking AF.

If it were missing in every single PS4 game running at 1080p, one would suspect tech limitations...

It's not the case, thus one should probably look at the devs and their priorities...
 

kitch9

Banned
X1's CPU is 150MHz faster per core. That's 900MHz total across the 6 cores usable for games.
And X1 can utilize up to 7 cores, compared to PS4's 6 cores.
This is akin to Goku going from Kaioken 1x to 3x to beat PS4... I mean Vegeta.

That's not how it works lol.
 

On Demand

Banned
X1's CPU is 150MHz faster per core. That's 900MHz total across the 6 cores usable for games.
And X1 can utilize up to 7 cores, compared to PS4's 6 cores.
This is akin to Goku going from Kaioken 1x to 3x to beat PS4... I mean Vegeta.

I hope you're joking.

Yeah you have to be.
 

Kinthalis

Banned
AF really doesn't take much to do. I've never noticed an FPS decrease at all when turning it on on PC, and I'm fairly certain that's usual. I remember Nvidia's recommended settings blog for Far Cry 4 showed benchmarks with AF on/off and there was no difference between settings.

I would be super shocked if the lack of AF isn't a mistake. It's an easy, cheap visual upgrade that shouldn't require dropping the resolution or cause any dips to FPS.

As mentioned before, this may be a bandwidth issue. Modern PC GPUs have gobs of bandwidth that only the GPU uses.

On consoles, with a unified memory architecture, that bandwidth is in demand from both GPU and CPU.
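To make the contention concrete, here's a minimal back-of-the-envelope sketch (assuming the commonly quoted 176GB/s peak and a nominal 20GB/s of CPU traffic; real arbitration overhead costs more than a simple subtraction):

```python
# Rough sketch of unified-memory contention, not a real model.
# The 176GB/s peak is the commonly quoted PS4 figure; the CPU
# traffic number is an assumption for illustration.

PS4_PEAK_BW_GBPS = 176.0   # shared GDDR5 bus, quoted peak
CPU_TRAFFIC_GBPS = 20.0    # assumed CPU demand on the same bus

gpu_share = PS4_PEAK_BW_GBPS - CPU_TRAFFIC_GBPS
print(f"Bandwidth left for the GPU: ~{gpu_share:.0f} GB/s")
# A discrete PC card, by contrast, keeps its entire bus to itself.
```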
 

Angel_DvA

Member
X1's CPU is 150MHz faster per core. That's 900MHz total across the 6 cores usable for games.
And X1 can utilize up to 7 cores, compared to PS4's 6 cores.
This is akin to Goku going from Kaioken 1x to 3x to beat PS4... I mean Vegeta.

Not sure if serious or not..

 

omonimo

Banned
Well, articles before said so. So I am surprised that you are surprised. But wait, are you suggesting it can't at all? Then that is of course false. Or are you suggesting it can't with this game? Then I am looking forward to a detailed explanation of why.
Do I have to explain to you again why Xbone is less capable than PS4 of hitting 1080p? Or are you just deliberately trolling? Not sure why you feel so hurt by something that's been explained tons of times. Xbone has 16 ROPs and an ESRAM bottleneck. The only games at full 1080p are ones coming from the PS360 generation.
 

Seanspeed

Banned
Sure it is the issue. Tons of people do this debate dance on this very forum and others.

You say PS4 is also a sacrifice, which is absolutely correct, but we're talking about XB1 in comparison to PS4.

Hence saying that you're going to get sacrifices on XB1 by default, relative to what PS4 can do, based on the hardware differential. This is the case for resolution and other things.

Although I really doubt this game is GPU limited... the CPU of the XB1 should in theory at least keep it a few frames above...
You can't just say the XB1 can ONLY do 1080p with sacrifices and then think this doesn't apply to the PS4 as well. There is no weaseling out of this.
 

FlyinJ

Douchebag. Yes, me.
I definitely noticed the lack of any AF when playing it yesterday on my PS4.

Ground textures that are more than 10 feet from you look awful.
 

Cidd

Member
Anyone think the AF issue has anything to do with allowing the Vita streaming?

I don't know if that would cause any technical issue.
 

HTupolev

Member
The impact of 16x AF is negligible in 99% of cases, even on PC GPUs with less bandwidth than the PS4.
[Image: Skyrim anisotropic filtering performance chart]


This is on a GeForce 560.
A 560 has more bandwidth relative to its innards than PS4's GPU (especially considering the reduced contention, since there's no CPU fighting for bandwidth on the 560!), in addition to banks of I/O blocks (TMUs & ROPs) which are also bigger relative to its processing innards than those on PS4's GPU. Hence, although the PS4 should have generally better performance, I/O is much more likely to be a bottleneck on it.

I'm not saying that that's necessarily what's going on, but the example of the 560 doesn't really prove anything.

No, it's not. It's the exact same operation as on the same-architecture GPUs on PC. All AF queries are completely cached on chip nowadays
Which should work great, as long as everything is always sitting comfortably in cache and no external accesses have to be made. Caching will hide costs up to a point, after which you start having to pay real penalties. AF requires that larger MIP levels reside in the cache, so high AF should be harder on your cache when rendering surfaces at oblique angles.

The question would be how quickly this becomes an issue on PS4.
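As a minimal sketch of that texel-traffic scaling (the textbook AF idea, not any console's actual implementation; the function and constants below are made up for illustration):

```python
# Minimal sketch of the textbook AF idea, not any GPU's actual
# implementation: AF takes several bilinear taps along the long
# axis of the pixel's texture footprint, from a more detailed MIP
# level than plain trilinear would pick. All names are illustrative.
import math

def texels_fetched(dudx: float, dvdy: float, max_aniso: int) -> int:
    """Approximate texels pulled through the cache for one pixel."""
    major = max(abs(dudx), abs(dvdy))            # footprint's long axis
    minor = max(min(abs(dudx), abs(dvdy)), 1e-6) # footprint's short axis
    taps = max(1, math.ceil(min(major / minor, max_aniso)))
    return taps * 4                              # ~4 texels per bilinear tap

# A floor texture seen at an oblique angle (very stretched footprint):
for aniso in (1, 2, 4, 8, 16):
    print(f"{aniso:2d}x AF -> ~{texels_fetched(16.0, 1.0, aniso)} texels")
# Traffic grows with the AF level until the footprint's own anisotropy
# caps it, which is exactly the cache/bandwidth pressure described above.
```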
 

omonimo

Banned
You can't just say the XB1 can ONLY do 1080p with sacrifices and then think this doesn't apply to the PS4 as well. There is no weaseling out of this.
Come on now. You are trying really hard there. So because I said Xbone requires more sacrifices than PS4, you think I believe the PS4 has unlimited power to hit 1080p? Lol.
 

Marlenus

Member
Do you prefer framerate or IQ?

The PS4 has to use the ~176GB/s of bandwidth for normal RAM usage as well. AF is, for all intents and purposes, an expensive operation on consoles.

The CPU-GDDR5 bus tops out at 20GB/s, which leaves 156GB/s available for the GPU-GDDR5 connection. That is more than a 7870 or a 7850 has, and for the class of card in the PS4 it is enough.

AF is not expensive; why people say it is, I have no clue. Perhaps some specific implementations are expensive, but in general it is not that expensive to add.

In this particular game, though, the Xbox One version has AF. Considering the frame buffer size, I doubt they are using the ESRAM for anything other than render targets, which means the texture fetches for AF have to go through the DDR3 at 68GB/s. If the CPU is using its 30GB/s allocation, that means AF is working with only 38GB/s of bandwidth.

There is no technical reason why it is not on the PS4 version, other than dev oversight.
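Spelled out as rough arithmetic, using the peak figures above (sustained numbers will be lower):

```python
# The bandwidth budget above, worked out. These are the peak
# figures quoted in the thread, not measured sustained rates.

ps4_gddr5 = 176.0      # GB/s, shared GDDR5 bus
ps4_cpu_cap = 20.0     # GB/s, CPU<->GDDR5 ceiling
print(ps4_gddr5 - ps4_cpu_cap)   # 156.0 GB/s left for the GPU

xb1_ddr3 = 68.0        # GB/s, main DDR3 memory
xb1_cpu_cap = 30.0     # GB/s, CPU allocation
print(xb1_ddr3 - xb1_cpu_cap)    # 38.0 GB/s for texture fetches if the
                                 # ESRAM only holds render targets
```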
 
The IQ choice isn't obvious, though. Some may prefer the higher resolution (PS4), and others may prefer the slightly lower resolution with AF (XB1). FPS is a clear winner on PS4, though.
It's clear in the XB1's favour, IMO, even with a lower resolution.

This outside comparison looks a lot better on XB1 than PS4. The corrugated roofs look, well, corrugated and the crane can actually be resolved as a crane.

The indoor scene we know looks better on XB1, too.
 

c0de

Member
Do I have to explain to you again why Xbone is less capable than PS4 of hitting 1080p? Or are you just deliberately trolling? Not sure why you feel so hurt by something that's been explained tons of times. Xbone has 16 ROPs and an ESRAM bottleneck. The only games at full 1080p are ones coming from the PS360 generation.

You decided not to answer my questions; that's ok :) It isn't about being hurt, it's about questioning some posts. If you state something, people will ask you to argue for your statements.
 

foxbeldin

Member
The lack of AF in an undemanding game like Strider points to a driver/SDK issue more than to a lack of bandwidth or horsepower.
 

Marlenus

Member
A 560 has more bandwidth relative to its innards than PS4's GPU (especially considering the reduced contention, since there's no CPU fighting for bandwidth on the 560!), in addition to banks of I/O blocks (TMUs & ROPs) which are also bigger relative to its processing innards than those on PS4's GPU. Hence, although the PS4 should have generally better performance, I/O is much more likely to be a bottleneck on it.

I'm not saying that that's necessarily what's going on, but the example of the 560 doesn't really prove anything.

The 560 has 128GB/s of memory bandwidth; that is less than the PS4 has even after taking the maximum CPU allocation off.


Which should work great, as long as everything is always sitting comfortably in cache and no external accesses have to be made. Caching will hide costs up to a point, after which you start having to pay real penalties. AF requires that larger MIP levels reside in the cache, so high AF should be harder on your cache when rendering surfaces at oblique angles.

The question would be how quickly this becomes an issue on PS4.

It will be the same as it is on the 7850, which has no issues with AF.
 

omonimo

Banned
You decided to not answer to my questions, that's ok :) it isn't about being hurt, it's questioning some posts. If you state something, people will ask you to argue for your statements.
What exactly was your question? I thought you were just asking why we can't be surprised if Xbone can't handle full 1080p like PS4.
 
The lack of AF in an undemanding game like Strider points to a driver/SDK issue more than to a lack of bandwidth or horsepower.

Wouldn't it be funny if it were default 8x on XB1 but default 1x on PS4? As in... to turn it up you have to manually configure it on PS4... to turn it off on XB1 you need to manually configure it...
 

Durante

Member
It's clear in the XB1's favour, IMO, even with a lower resolution.

This outside comparison looks a lot better on XB1 than PS4.
Looking at those shots, and some of the same area with the same view direction posted in the PC performance thread, it seems like the draw distance on both consoles is equal to or lower than the "25%" setting on PC.

(Just noting this since someone asked earlier in the thread)
 

foxbeldin

Member
Wouldn't it be funny if it were default 8x on XB1 but default 1x on PS4? As in... to turn it up you have to manually configure it on PS4... to turn it off on XB1 you need to manually configure it...

It would be funny, but also concerning, if after years of development the devs just forgot to tick a box.
 
It's clear in the XB1's favour, IMO, even with a lower resolution.

This outside comparison looks a lot better on XB1 than PS4. The corrugated roofs look, well, corrugated and the crane can actually be resolved as a crane.

The indoor scene we know looks better on XB1, too.

NO HOT LINKING PLEASE

THE HORROR
 

HTupolev

Member
The 560 has 128GB/s of memory bandwidth; that is less than the PS4 has even after taking the maximum CPU allocation off.
I didn't say it was bigger; I said it was bigger relative to the innards of the GPU it was feeding, and thus I/O is less likely to be a bottleneck.

Hyperbolic example:
2GB/s is more than 1GB/s, but a 2GB/s bus would be much more of a bottleneck on a modern desktop CPU than a 1GB/s bus would be on a 32kHz PIC16.
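To put rough numbers on "bandwidth relative to innards" (spec-sheet peaks from memory, so treat them as assumptions; only the ratios mean anything):

```python
# Spec-sheet peaks for a stock GTX 560 and the PS4 GPU, quoted from
# memory as an illustration; only the bytes-per-FLOP ratios matter.

gtx560_bw_gbps, gtx560_gflops = 128.0, 1088.0
ps4_bw_gbps, ps4_gflops = 176.0, 1843.0   # and the 176 is shared!

print(f"GTX 560: {gtx560_bw_gbps / gtx560_gflops:.3f} B/FLOP")  # ~0.118
print(f"PS4 GPU: {ps4_bw_gbps / ps4_gflops:.3f} B/FLOP")        # ~0.095
# The PS4 ratio shrinks further whenever the CPU hits the bus,
# so I/O-bound cases show up sooner there.
```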

It will be the same as it is on the 7850, which has no issues with AF.
This is a more meaningful question, since the GPU's configuration and circumstances start to look vaguely like-for-like.
 

cakely

Member
X1's CPU is 150MHz faster per core. That's 900MHz total across the 6 cores usable for games.
And X1 can utilize up to 7 cores, compared to PS4's 6 cores.
This is akin to Goku going from Kaioken 1x to 3x to beat PS4... I mean Vegeta.

The Xbox One and the PS4 have identical 8-core CPUs. The Xbox One runs at a clock speed that is 9.4% faster than the one in the PS4.

Saying that the Xbox One has a "900MHz faster" CPU just demonstrates that you're not very good at math.
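The arithmetic, for anyone who wants it spelled out (using the well-known Jaguar clocks):

```python
# Both consoles use 8-core AMD Jaguar CPUs; these are the
# well-known clock speeds.
xb1_ghz = 1.75
ps4_ghz = 1.60

print(f"{(xb1_ghz - ps4_ghz) * 1000:.0f} MHz faster per core")  # 150 MHz
print(f"{(xb1_ghz / ps4_ghz - 1) * 100:.1f}% faster")           # 9.4%
# Adding 150 MHz x 6 cores into a "900 MHz faster" CPU makes no
# more sense than calling six 1.6 GHz cores a 9.6 GHz CPU.
```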
 
Looking at those shots, and some of the same area with the same view direction posted in the PC performance thread, it seems like the draw distance on both consoles is equal or lower than the "25%" setting on PC.

(Just noting this since someone asked earlier in the thread)



Judging by the roof, I think it's lower than 25%.

The roof is the same as 0%.
 

Conduit

Banned
It's clear in the XB1's favour, IMO, even with a lower resolution.

This outside comparison looks a lot better on XB1 than PS4. The corrugated roofs look, well, corrugated and the crane can actually be resolved as a crane.

The indoor scene we know looks better on XB1, too.

I don't see any superior image quality in the outside comparison in Xbone's favor, except the noticeable AF. And I noticed worse AA on the Xbone screenshot (pipe, fence).
 