
PS4's AF issue we need answers!

thelastword

Banned
Again, this is the theoretical maximum which you won't see "in real life", as it is shared with the CPU, which will always steal bandwidth from the GPU. Xbone has exclusive access to the eSRAM, not to mention that access to eSRAM should have far fewer wait states.
My goodness, are you pretending you know what you're talking about, or that what you're saying here is actually factual? You begin your post with "Again", to reiterate that you're wrong?
 
You mean the jaggies-ridden chromatic-aberrated-mess Dying Light?
Evolve is really nothing to write home about, tech-wise, but is definitely better than DL.

Both games use PBR and have very high-polygon models; DL has per-object motion blur, SSR, large draw distances, DOF, subsurface scattering, etc...

Dying Light is rather impressive from a pure "tech" standpoint.
 

Dehnus

Member
Yeah, Rock Paper Shotgun should do an interview with the PS4. We might finally get to the bottom of this.

*Phone rings -- click: "Hello, this is Mark"*
RPS: "Mark Cerny, are you a scam?"
Cerny: "W.. what? I?"
RPS: "You told us this would be the most powerful, yet I look at this picture and clearly see less image quality....."
Cerny: "Surely you realise that a game engine is a complex beast to program, and that I set out.."
RPS: "So are you saying you are scamming?"
Cerny: "No, I.."
RPS: "WE DEMAND TO KNOW! WE THE GAMERS DEMAND!"


There, I just gave you a re-enactment. Now we can save poor Mister Cerny the trouble of having to deal with that rude asshole.
 

c0de

Member
My goodness, are you pretending you know what you're talking about, or that what you're saying here is actually factual? You begin your post with "Again", to reiterate that you're wrong?

Well, when you say I am wrong, I guess you know the truth. So instead of using the words you did, you are free to prove anyone wrong, including me, in a tone that doesn't make people lose interest in talking to you, now and in future threads.
 

Dehnus

Member
Xbox eSRAM is still only about as fast as PS4 memory generally. So that shouldn't be an issue

Bandwidth is not speed, it is the size of the lorry's loading area ;)..

eSRAM actually is quite a bit "faster" than the GDDR5 in the PS4. (As in, say, change 1 bit in memory and count the nanoseconds... eSRAM will simply win big time.)

But a Ferrari will simply not be able to outpace a lorry if it has to bring food to a starving village ;). Sure, it can make 3 trips in the same time as the lorry makes one, but the lorry carries a SHITLOAD more :D.

And that is actually what we need in graphics: big lorries delivering data in good time ;). Not a Ferrari delivering us a receipt of what is to come ;).
 

c0de

Member
Bandwidth is not speed, it is the size of the lorry's loading area ;)..

eSRAM actually is quite a bit "faster" than the GDDR5 in the PS4. (As in, say, change 1 bit in memory and count the nanoseconds... eSRAM will simply win big time.)

When using "speed" you have to consider both bandwidth and latency. Looking at only one number just doesn't tell you enough.
 

RoboPlato

I'd be in the dick
That I assumed, especially concerning 60fps. Do you find it odd as a PS4 dev that people are turning it off / limiting it to 4x in games targeting 30fps with high-quality textures and complex shading? It seems like the opposite of what I would do if I authored some great 2048x2048 PBR texture.
I know you're a PC gamer but even 4x makes a pretty big difference compared to what console users are used to. I'd love to see at least 8x across the board in PBR games though.
 
eSRAM = a U-Haul truck
GDDR5 = Wal-Mart

Of course eSRAM is fast. If it was eGDDR5 it would be even faster too, lol. Embedded RAM is fast because... it's embedded, directly on the chip right along with the APU. But it's basically cache, with limited usability.

Durante has already stated the AF situation couldn't be a bandwidth issue, as it's an entirely GPU-related process. Because of that, it is going to be utilizing the 156GB/s pool, which is more than enough for 1080p. Obviously, since most games use AF. It's those few that don't that we were trying to figure out.
 
I know you're a PC gamer but even 4x makes a pretty big difference compared to what console users are used to. I'd love to see at least 8x across the board in PBR games though.

One of us! One of us! One of us!

Yeah, it is definitely better than what was found on most last gen games. But 8x (16x is not wholly necessary for most games) is definitely a sweet spot.

EDIT: I also own a Sega Genesis. So I am not a complete PC-only gamer :p
 

RoboPlato

I'd be in the dick
One of us! One of us! One of us!

Yeah, it is definitely better than what was found on most last gen games. But 8x (16x is not wholly necessary for most games) is definitely a sweet spot.

EDIT: I also own a Sega Genesis. So I am not a complete PC-only gamer :p
I have a Mac so I'm not a console-only guy either. I can play Source games on Medium at 60fps :D

Actually been playing around with AF on that since there is a noticeable performance drop with it on. 16x is almost identical to 8x for me and there's a big performance hit moving from 8 to 16.
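
That 8x-vs-16x observation lines up with how AF is specified: the setting is a cap, and the sampler only takes as many probes as a surface's actual anisotropy calls for. A rough sketch of the idealized EXT_texture_filter_anisotropic cost model (tap counts per the spec's model, not any particular GPU's sampler behavior):

```python
# Idealized AF cost: probes = min(user cap, surface anisotropy ratio),
# with each probe being one trilinear fetch of 8 texels.
def texel_fetches(surface_ratio, max_af):
    probes = min(max_af, max(1, round(surface_ratio)))
    return probes * 8

for ratio in (2, 4, 8, 16):
    print(f"anisotropy {ratio:2}:1 -> 8x cap: {texel_fetches(ratio, 8):3} texels, "
          f"16x cap: {texel_fetches(ratio, 16):3} texels")
```

Only surfaces steeper than 8:1 ever pay for the jump from 8x to 16x, which is why the two look nearly identical in most scenes even when the 16x setting costs extra in fetch-bound passes.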
 
Need to change the thread title to: The PS4 - Cerny does not give AF

 

c0de

Member
Looked back a few pages in the thread and didn't see any picture. I remember that the Oddworld New and Tasty devs said they were getting 172GB/s pretty early on, so most teams will probably be relatively close to that. Doesn't the cache coherency of the RAM pool also help prevent some of the wait time and competition for bandwidth?

Did you see it now? Just in case, I'll repost it:
It has yet to be proven false.
I actually was looking for the 172GB/s and probably found what you were mentioning:
http://gamingbolt.com/oddworld-inha...act-that-memory-operates-at-172gbs-is-amazing
But we don't know if he actually measured it or was just assuming it'd work at that speed; the article is from 8/2013. We also thought for a while that PS4 would have more RAM available than it currently has.
 
I have no idea why some games on the PS4 have AF absent. I am not even going to guess what it could be, or play Monday Morning Developer and try to explain it.




However, from the videos of Bloodborne I have seen, it appears effective AF is in place. So unless the game gets downgraded or something before release, that is something positive moving forward.
 
I have a Mac so I'm not a console-only guy either. I can play Source games on Medium at 60fps :D

Actually been playing around with AF on that since there is a noticeable performance drop with it on. 16x is almost identical to 8x for me and there's a big performance hit moving from 8 to 16.

What is the GPU in that thing? Also, remember the Mac ports of most games are rather lackluster in terms of performance / visuals.
I would definitely consider running Windows via Boot Camp if you can.
 

Necro900

Member
Both games use PBR and have very high-polygon models; DL has per-object motion blur, SSR, large draw distances, DOF, subsurface scattering, etc...

Dying Light is rather impressive from a pure "tech" standpoint.

You may be running freakin' ray tracing, but if your IQ is this (view it at 100%):

dying-light-ps4.png


You may as well reconsider some technical choices in favor of something more polished. Plus, everything you listed is becoming pretty much standard by now (even launch title KZ:SF used physically based rendering, for one).
There's a clear difference between just implementing effects and actually leveraging them, and that's where developers' skills shine. Yes, on paper we're talking about pretty advanced engines, but the result is highly questionable (Evolve being definitely better overall anyway).
To me, it doesn't matter what you implement in your engine, what matters is how well you do it. But that's my opinion; everyone is free to drool over specs..
 

RoboPlato

I'd be in the dick
Did you see it now? Just in case, I'll repost it: It has yet to be proven false.
I actually was looking for the 172GB/s and probably found what you were mentioning:
http://gamingbolt.com/oddworld-inha...act-that-memory-operates-at-172gbs-is-amazing
But we don't know if he actually measured it or was just assuming it'd work at that speed; the article is from 8/2013. We also thought for a while that PS4 would have more RAM available than it currently has.
I see it now. Isn't that slide really old? Like 14+4 old?
 

c0de

Member
I see it now. Isn't that slide really old? Like 14+4 old?

It is 7 months old. But no matter how old it is, we should get used to words like "peak" and "theoretical" when we talk about bandwidths (and TFlops ;)) and not take these numbers for comparisons or use them as if they represented what's actually provided by the system. Sites like Ars Technica do this, too.
 

omonimo

Banned
It is 7 months old. But no matter how old it is, we should get used to words like "peak" and "theoretical" when we talk about bandwidths (and TFlops ;)) and not take these numbers for comparisons or use them as if they represented what's actually provided by the system. Sites like Ars Technica do this, too.
I'm not sure what you are trying to argue. Xbone has some sort of hardware advantage for AF? Because that's absolutely weird. There are multiplats with better AF on PS4 compared to the Xbone.
 

RoboPlato

I'd be in the dick
It is 7 months old. But no matter how old it is, we should get used to words like "peak" and "theoretical" when we talk about bandwidths (and TFlops ;)) and not take these numbers for comparisons or use them as if they represented what's actually provided by the system. Sites like Ars Technica do this, too.
Could you link me to the full presentation? I'd like to read it.
 

Ombala

Member
You may be running freakin' ray tracing, but if your IQ is this (view it at 100%):

dying-light-ps4.png


You may as well reconsider some technical choices in favor of something more polished. Plus, everything you listed is becoming pretty much standard by now (even launch title KZ:SF used physically based rendering, for one).
There's a clear difference between just implementing effects and actually leveraging them, and that's where developers' skills shine. Yes, on paper we're talking about pretty advanced engines, but the result is highly questionable (Evolve being definitely better overall anyway).
To me, it doesn't matter what you implement in your engine, what matters is how well you do it. But that's my opinion; everyone is free to drool over specs..
Don't know what you are talking about, that looks awesome.
 

c0de

Member
I'm not sure what you are trying to argue. Xbone has some sort of hardware advantage for AF? Because that's absolutely weird. There are multiplats with better AF on PS4 compared to the Xbone.

Nope. I am trying to argue that the unified RAM pool could, if a game has huge bandwidth demands on the CPU side, be a disadvantage for the bandwidth that will be available to the GPU. This doesn't mean it will apply to all games, but to games that put a huge bandwidth load on CPU tasks. This doesn't change the raw numbers in any way, but it's software that makes hardware fly, and there could be cases where certain engine loads don't "favor" the way PS4 hardware is designed.
 

c0de

Member
Could you link me to the full presentation? I'd like to read it.

There is, of course, no full presentation available, so as I already said yesterday, we don't know how credible the source for this is. But it is definitely false to assume that the hardware specs we all read a thousand times represent the actual numbers that are always available to devs. Believe me, I would be very interested in reading it, as I was when I read the leaked XDK from November last year.
 

dr_rus

Member
AF is not free; there are actual latency issues inside the shader. When a texture fetch is done, even when coming from the cache, the shader still has to wait before getting the result. That wait can be hidden by other independent ALU operations in some cases, but not always (especially when the shader is texture bound and not ALU bound). The GCN architecture also does a lot to hide those latencies by parallelising computation between wavefronts (a bit like pipelining), but in the case of big shaders with a lot of register pressure that parallelisation cannot be done (not enough resources to actually execute more wavefronts in parallel).
PS4 GPU profiling tools easily show that kind of behavior.
Of course this does not explain those specific cases (only the related devs could explain that), but I'm just saying that it's a lot more complicated than people think.

So what are we talking about if that doesn't explain the issue this thread is about? No one is saying that AF is "free", only that the performance hit from AF is minimal (it doesn't really matter here where that minimal cost comes from - additional fetches from VRAM or added latency in memory access from the SPs) and that the lack of AF in some PS4 titles can't be explained by either bandwidth/performance limitations or design choice.

We're back to some translation software being the likely candidate for the source of the issue.
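
The register-pressure point in the quote is easy to put rough numbers on. A back-of-envelope occupancy sketch, assuming the commonly cited GCN limits (256 VGPRs per SIMD lane, at most 10 wavefronts per SIMD) and ignoring allocation granularity:

```python
# Fewer wavefronts resident per SIMD means fewer candidates to execute
# while a texture fetch is outstanding, i.e. worse latency hiding.
def wavefronts_per_simd(vgprs_per_thread, vgpr_budget=256, max_waves=10):
    return min(max_waves, vgpr_budget // vgprs_per_thread)

for vgprs in (24, 48, 84, 128):
    waves = wavefronts_per_simd(vgprs)
    print(f"{vgprs:3} VGPRs/thread -> {waves:2} wavefront(s) to hide fetch latency")
```

A shader heavy enough to need 128 registers leaves only 2 wavefronts per SIMD versus the maximum of 10, which is the quoted dev's "big shaders with a lot of register pressure" case: the fetch latency AF adds has nowhere to hide.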
 

Conduit

Banned
It is 7 months old. But no matter how old it is, we should get used to words like "peak" and "theoretical" when we talk about bandwidths (and TFlops ;)) and not take these numbers for comparisons or use them as if they represented what's actually provided by the system. Sites like Ars Technica do this, too.

It is a year and a half old. 14+4 was debunked last year. Devs can use the CU ratio however they want.
 

c0de

Member
It is old year and a half. 14+4 were debunked last year. Dev can use CU's ratio how they want.

I don't know what you are talking about, but obviously not the same thing I was when I replied to RoboPlato. Click through the conversation to see what the topic is.
 

dr_rus

Member
Nope. I am trying to argue that the unified RAM pool could, if a game has huge bandwidth demands on the CPU side, be a disadvantage for the bandwidth that will be available to the GPU. This doesn't mean it will apply to all games, but to games that put a huge bandwidth load on CPU tasks. This doesn't change the raw numbers in any way, but it's software that makes hardware fly, and there could be cases where certain engine loads don't "favor" the way PS4 hardware is designed.

a. CPU bandwidth is a very low number in general. For most CPU tasks latency is more important than bandwidth. That's why you don't generally see much difference in performance on PC between 2- and 4-channel memory platforms.
b. The maximum possible CPU bandwidth is a known number on PS4.
c. Even if we subtract that number from the whole PS4 GDDR5 bandwidth, we're still left with a figure which is several times higher than that of slower PC GCN cards _and_ of the XBO.
d. AF is cached on modern GPUs and its external bandwidth requirement is actually very low. Hence why it's nearly "free" even on low-end GPUs which don't have even 1/10th of PS4's memory bandwidth.

This argument is invalid and can't be the reason why PS4 has no AF in some titles when compared to XBO and PS3.
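
For b) and c), the usual back-of-envelope goes like this (176 GB/s and 68 GB/s are the official peaks; the ~20 GB/s CPU ceiling is the commonly cited figure for the Jaguar cores, an assumption rather than a measurement):

```python
# Worst-case "CPU takes its full ceiling" arithmetic on peak figures.
PS4_GDDR5_PEAK = 176    # GB/s, official spec
PS4_CPU_CEILING = 20    # GB/s, commonly cited Jaguar maximum (assumption)
XBO_DDR3_PEAK = 68      # GB/s, official spec

gpu_leftover = PS4_GDDR5_PEAK - PS4_CPU_CEILING
print(f"Left for the PS4 GPU: {gpu_leftover} GB/s "
      f"(~{gpu_leftover / XBO_DDR3_PEAK:.1f}x the XBO's DDR3 peak)")
```

Even in that worst case the GPU keeps ~156 GB/s, the same figure quoted earlier in the thread, so bandwidth contention is a hard sell as the AF culprit.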
 

thelastword

Banned
Well, when you say I am wrong, I guess you know the truth. So instead of using the words you did, you are free to prove anyone wrong, including me, in a tone that doesn't make people lose interest in talking to you, now and in future threads.
Don't be so sensitive, you've been corrected many times on the issue already, but you keep repeating the same thing. The PS4 has more bandwidth than the XBONE, that's a fact, unless you have a special Xbone of course.

Though eSRAM is fast, that is also its shortcoming: it cannot carry much through the gate, hence the issue with any modern engine that tries to push the envelope. MGS PP is a deferred renderer, the engine has a very advanced lighting system, it's 60fps and very dynamic on many levels: day/night, weather, clouds, a bevy of effects.

If any game is pushing the envelope and requires lots of read/write routines, this is it (since you have a fascination with read/write routines), based on the dynamism of the engine, player unpredictability and choice, AI, etc... but that's mostly CPU anyway. Mostly based on the former paragraph, though, the eSRAM just can't push all of that to the screen at 900p/60fps on the Xbone; it has to do so at 720p. So in most cases the eSRAM is a bottleneck rather than the huge open gate you think it is.

Using eSRAM properly is a headache in many circumstances because of its limitations, and you will begin to see even more of those limitations as games try to push 1080p/60fps open worlds etc. with advanced rendering techniques.

This kinda puts it into perspective. QLOC is not some genius developer that can work eSRAM better than any other. Based on their work on Xenoverse and now DmC, they're simply devs outsourced to bring a product to light. They did better work on the DmC port, but the better AF and slightly better framerate on Xbone have nothing to do with proper use of eSRAM or the power of the Xbox.
 

c0de

Member
a. CPU bandwidth is a very low number in general. For most CPU tasks latency is more important than bandwidth. That's why you don't generally see much difference in performance on PC between 2- and 4-channel memory platforms.
b. The maximum possible CPU bandwidth is a known number on PS4.
c. Even if we subtract that number from the whole PS4 GDDR5 bandwidth, we're still left with a figure which is several times higher than that of slower PC GCN cards _and_ of the XBO.
d. AF is cached on modern GPUs and its external bandwidth requirement is actually very low. Hence why it's nearly "free" even on low-end GPUs which don't have even 1/10th of PS4's memory bandwidth.

This argument is invalid and can't be the reason why PS4 has no AF in some titles when compared to XBO and PS3.

I don't say you are wrong, as I don't say I am right, but the whole discussion just lacks any specific numbers, so either one can say what he wants to prove the other wrong when in fact he doesn't.
But on b) and c): you are also assuming that you only have to subtract the CPU max bandwidth from the total max bandwidth to get what is available for the GPU, but we don't know if this is true in reality. Of course you can do this when arguing, but don't assume people will just follow statements when there is no proof that this is valid.
Again, we don't know any specifics on the actual numbers, so unless we hear from a dev this is all fishing in muddy waters, no matter how hard one tries to present his opinion as fact.
 

c0de

Member
Don't be so sensitive, you've been corrected many times on the issue already, but you keep repeating the same thing.

In fact, I am not. I get different opinions. That is not the same as proving any theory wrong.


The PS4 has more bandwidth than the XBONE, that's a fact, unless you have a special Xbone of course.

Yes. It has. It's the better machine. It has the better hardware. Everything is better. Yes.
AF is still worse in some cases. Goto Start:
 

thelastword

Banned
You may be running freakin' ray tracing, but if your IQ is this (view it at 100%):

dying-light-ps4.png


You may as well reconsider some technical choices in favor of something more polished. Plus, everything you listed is becoming pretty much standard by now (even launch title KZ:SF used physically based rendering, for one).
There's a clear difference between just implementing effects and actually leveraging them, and that's where developers' skills shine. Yes, on paper we're talking about pretty advanced engines, but the result is highly questionable (Evolve being definitely better overall anyway).
To me, it doesn't matter what you implement in your engine, what matters is how well you do it. But that's my opinion; everyone is free to drool over specs..
You know, I've been impressed with Dying Light and sometimes not; first impressions were not that good, but it has its moments. I do get your point and agree, though: some people believe that using every modern graphical effect in the book makes your game look great, and that is certainly not so.

Some devs go way overboard on certain effects, sacrificing framerate, and the picture you get overcompensates for its lack of subtlety and balance. Great art married with good tech and its proper application is what sets a good-looking game apart from games just pushing tech willy-nilly.
 

R_Deckard

Member
You mean the jaggies-ridden chromatic-aberrated-mess Dying Light?
Evolve is really nothing to write home about, tech-wise, but is definitely better than DL.

What the hell is this?!!?

Dying Light is a great-looking game with a decent set of post-processing effects, great dynamic lighting, and models! Some people's expectations are so whack I cannot even fathom??
 

leeh

Member
What the hell is this?!!?

Dying Light is a great-looking game with a decent set of post-processing effects, great dynamic lighting, and models! Some people's expectations are so whack I cannot even fathom??

I agree with the posts above; I honestly feel developers should prioritize IQ before going all out on different post techniques. Why ruin the hard work of implementing nice post-processing with a lack of AF/AA?
 

R_Deckard

Member
It is 7 months old. But no matter how old it is, we should get used to the words like "peak" and "theoretical" if we talk about bandwidths (and TFlops ;)) and not take these numbers for comparisons or to use them as they would represent what's actually provided by the system. Sites like arstechnica do this, too.

It's not; it is in fact from a Paris talk in 2013.
 

Mastperf

Member
In fact, I am not. I get different opinions. That is not the same as proving any theory wrong.




Yes. It has. It's the better machine. It has the better hardware. Everything is better. Yes.
AF is still worse in some cases. Goto Start:

You guys are just arguing in circles. Since textures are handled via DDR3 on XB1, bandwidth wouldn't be the problem: there's no way the GDDR5 in the PS4 is dropping anywhere near the real-world performance of the DDR3. We're seeing games at 1080p on both lacking AF on PS4, as well as PS3 ports having it removed. I can't see any reasonable hardware explanation where the PS3 can handle it but the PS4 couldn't.
My point is that arguing about it will get us nowhere without more developer input.
 

c0de

Member
Neither was I, the slide is from 2013

The one with the bandwidth? I only know that the news where I saw it for the first time is 7 months old. But as you seem to know more about it, can you point me to the presentation where this slide occurs?
 