I thought the 204 number was from adding DDR3 and eSRAM. You sure that's for eSRAM alone?
Yes:
http://www.eurogamer.net/articles/digitalfoundry-vs-the-xbox-one-architects
Wait, what? Those are brilliant looking games.
My goodness, are you pretending you know what you're talking about or that what you're saying here is actually factual? You begin your post with "Again", to reiterate that you're wrong?

Again, this is the theoretical maximum which you won't see "in real life" as it is shared with the CPU which will always steal bandwidth from the GPU. Xbone has exclusive access to the esram, not to mention that access to esram should have way less wait-states.
You mean the jaggies-ridden chromatic-aberrated-mess Dying Light?
Evolve is really nothing to write home about, tech-wise, but is definitely better than DL.
Yeah, Rock Paper Shotgun should do an interview with the PS4. We might finally get to the bottom of this.
Xbox eSRAM is still only about as fast as PS4 memory generally, so that shouldn't be an issue.
Bandwidth is not speed; it is the size of the lorry's loading area, not how fast the lorry drives.
eSRAM actually is quite a bit "faster" than the GDDR5 in the PS4. (As in: change 1 bit in memory and count the nanoseconds; eSRAM will simply win big time.)
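The latency-vs-bandwidth distinction the two posts above are drawing can be sketched with a toy model. All figures below are illustrative assumptions, not measured numbers for either console's memory subsystem:

```python
# Toy model of latency vs bandwidth. Serialized (dependent) accesses are
# dominated by latency; bulk streaming transfers are dominated by bandwidth.

def dependent_access_time_us(n_accesses: int, latency_ns: float) -> float:
    """Pointer-chasing style accesses: each read must wait for the previous
    one, so total time scales with latency, not bandwidth."""
    return n_accesses * latency_ns / 1000.0

def streaming_time_us(bytes_moved: int, bandwidth_gb_s: float) -> float:
    """One big burst transfer: total time scales with bandwidth."""
    return bytes_moved / (bandwidth_gb_s * 1e9) * 1e6

# Hypothetical latencies: low-latency on-die SRAM vs higher-latency DRAM.
sram_latency_ns, dram_latency_ns = 20.0, 150.0

# One million dependent reads: the low-latency pool "wins big time".
print(dependent_access_time_us(1_000_000, sram_latency_ns))  # 20000.0 us
print(dependent_access_time_us(1_000_000, dram_latency_ns))  # 150000.0 us

# Moving 64 MB in one burst: only bandwidth matters (~381 us at 176 GB/s).
print(streaming_time_us(64 * 2**20, 176.0))
```

The point of the sketch: which memory "wins" depends entirely on whether the workload looks like the first pattern or the second.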
I know you're a PC gamer but even 4x makes a pretty big difference compared to what console users are used to. I'd love to see at least 8x across the board in PBR games though.

That I assumed, especially concerning 60fps. Do you find it odd as a PS4 dev that people are turning it off / limiting it to 4x in games targeting 30fps with high-quality textures and complex shading? It seems like the opposite of what I would do if I authored some great 2048x2048 PBR texture.
I have a Mac so I'm not a console only guy either. I can play Source games on Medium at 60fps.

One of us! One of us! One of us!
Yeah, it is definitely better than what was found on most last gen games. But 8x (16x is not wholly necessary for most games) is definitely a sweet spot.
EDIT: I also own a sega genesis. So I am not a complete PC-only gamer
Need to change the thread title to: The PS4 - Cerny does not give AF
lmao
Looked back a few pages in the thread and didn't see any picture. I remember that the Oddworld New and Tasty devs said they were getting 172 pretty early on so most teams will probably be relatively close to that. Doesn't the cache coherency of the RAM pool also help with preventing some of the wait time and competition for bandwidth?
It has yet to be proven false.
Actually been playing around with AF on that since there is a noticeable performance drop with it on. 16x is almost identical to 8x for me and there's a big performance hit moving from 8 to 16.
Both games use PBR and have very high-polygon models; DL has per-object motion blur, SSR, large draw distances, DOF, subsurface scattering, etc...
Dying light is rather impressive from a pure "tech" standpoint.
I see it now. Isn't that slide really old? Like 14+4 old?

Did you see it now? Just in case, I'll repost it: It has yet to be proven false.
I actually was looking for the 172GB/s and probably found what you were mentioning:
http://gamingbolt.com/oddworld-inha...act-that-memory-operates-at-172gbs-is-amazing
But we don't know if he actually measured it or if he was just assuming that it'd work at that speed; the article is from 8/2013. We also thought for a while that the PS4 would have more RAM available than it currently has.
Don't know what you are talking about, that looks awesome.

You may be running freakin' ray tracing, but if your IQ is this (100% view it):
You may as well reconsider some technical choices in favor of something more polished. Plus, everything you listed is becoming pretty much standard by now (even launch title KZ:SF used physically based rendering, for one).
There's a clear difference between just implementing effects and actually leveraging them, and that's where developers' skills shine. Yes, on paper we're talking about pretty advanced engines, but the result is most questionable (Evolve being definitely better overall anyway).
To me, it doesn't matter what you implement in your engine; what matters is how well you do it. But that's my opinion, everyone is free to drool over specs.
I'm not sure what you are trying to argue. Xbone has a sort of hardware advantage for the AF? Because it's absolutely weird. There are multiplat with better AF on ps4 console compared the xbone.
Could you link me to the full presentation? I'd like to read it.
AF is not free; there are actual latency issues inside the shader. When a texture fetch is done, even when coming from the cache, the shader still has to wait before getting the result. That wait can be hidden by other independent ALU operations in some cases, but not always (especially when the shader is texture bound and not ALU bound). The GCN architecture also does a lot to hide those latencies by parallelising computation between wavefronts (a bit like pipelining), but in the case of big shaders with a lot of register pressure that parallelisation cannot be done (not enough resources to actually execute more wavefronts in parallel).
PS4 GPU profiling tools easily show that kind of behavior.
Of course this does not explain those specific cases (only the related devs could explain that), but I'm just saying that it's a lot more complicated than people think.
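To make the register-pressure point above concrete, here is a rough occupancy sketch. The 10-wavefront cap and 256-VGPR budget per SIMD are the commonly cited public GCN limits; treating vector registers as the only occupancy limiter is a simplification for illustration:

```python
# Back-of-envelope GCN occupancy model: how register pressure limits the
# number of wavefronts a SIMD can keep in flight to hide fetch latency.

def waves_per_simd(vgprs_per_thread: int, max_waves: int = 10,
                   vgpr_file: int = 256) -> int:
    """Wavefronts one SIMD can run concurrently. Fewer wavefronts means
    fewer independent instruction streams available to cover the latency
    of a texture fetch, so big shaders stall more often."""
    return min(max_waves, vgpr_file // vgprs_per_thread)

# Lightweight shader: full 10 waves in flight, fetch latency well hidden.
print(waves_per_simd(24))   # 10

# Heavy shader, 100 registers per thread: only 2 waves in flight, so a
# texture fetch (AF included) is far more likely to stall the SIMD.
print(waves_per_simd(100))  # 2
```

This is exactly the "not enough resources to execute more wavefronts in parallel" situation described above; the real hardware also has scalar-register and LDS limits, which this sketch ignores.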
It is 7 months old. But no matter how old it is, we should get used to the words like "peak" and "theoretical" if we talk about bandwidths (and TFlops ) and not take these numbers for comparisons or to use them as they would represent what's actually provided by the system. Sites like arstechnica do this, too.
It is a year and a half old. 14+4 was debunked last year; devs can use the CU ratio however they want.
Nope. I am trying to argue that the unified RAM pool could, if a game makes huge bandwidth demands on the CPU side, be a disadvantage for the bandwidth that remains available to the GPU. This doesn't mean this will apply to all games, only to games that put a huge bandwidth load on CPU tasks. This doesn't change the raw numbers in any way, but it's software that makes hardware fly, and there could be cases where certain engine loads don't "favor" the way the PS4 hardware is designed.
Don't be so sensitive, you've been corrected many times on the issue already but you keep repeating the same thing. The PS4 has more bandwidth than the XBONE, that's a fact, unless you have a special xbone of course.

Well, when you say I am wrong I guess you know the truth. So instead of using the words you did, you are free to prove anyone wrong, including me, in a tone where people don't lose interest in talking to you now and in future threads.
That is gold.
a. CPU bandwidth is a very low number in general. For most CPU tasks, latency is more important than bandwidth; that's why you generally don't see much difference in performance on PC between 2- and 4-channel memory platforms.
b. Maximum CPU bandwidth possible is a known number on PS4.
c. Even if we subtract that number from the whole PS4 GDDR5 bandwidth we're still left with a figure which is several times higher than that on slower PC GCN cards _and_ on XBO.
d. AF is cached on modern GPUs and its external bandwidth requirement is actually very low. Hence why it's nearly "free" on low end GPUs which don't have even 1/10th of PS4 memory bandwidth.
This argument is invalid and can't be the reason why PS4 has no AF in some titles when compared to XBO and PS3.
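Points (b) and (c) are easy to sanity-check with back-of-envelope arithmetic. The 176 GB/s and 68 GB/s DRAM peaks are the published specs; the 20 GB/s CPU ceiling is an assumed round figure for illustration, not an official measurement:

```python
# Rough numbers behind points (b) and (c). Peak specs only; real sustained
# bandwidth is lower on every platform, as discussed elsewhere in the thread.

PS4_GDDR5_PEAK   = 176.0  # GB/s, published peak
XBO_DDR3_PEAK    = 68.0   # GB/s, published peak (main RAM, excluding eSRAM)
CPU_PEAK_ASSUMED = 20.0   # GB/s, assumed worst-case CPU draw (illustrative)

# Even granting the CPU its full assumed ceiling, the GPU's leftover share
# of the GDDR5 pool is still well above the XBO's main-RAM peak.
gpu_leftover = PS4_GDDR5_PEAK - CPU_PEAK_ASSUMED
print(gpu_leftover)                      # 156.0 GB/s left for the GPU
print(gpu_leftover > 2 * XBO_DDR3_PEAK)  # True: over 2x XBO main RAM
```

The comparison deliberately leaves the eSRAM out, since its 32 MB serves a different role than a main RAM pool.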
You know, I've been impressed with Dying Light and sometimes not; first impressions were not that good, but it has its moments. I do get your point and agree though: some people believe that using every modern graphical effect in the book makes your game look great, and that is certainly not so.
What the hell is this?!!?
Dying Light is a great-looking game with a decent set of post-processing effects, great dynamic lighting, and great models! Some people's expectations are so whack I cannot even fathom them.
It's not; it is in fact from a Paris talk in 2013.
In fact, I am not. I am getting different opinions, and that is not what proves any theory wrong.
Yes. It has. It's the better machine. It has the better hardware. Everything is better. Yes.
AF is still worse in some cases. Goto Start:
I wasn't talking about the 14+4.
Neither was I; the slide is from 2013.
My point is arguing about it will get us nowhere without more developer input.
Just tweeted yosp about the matter... Let's wait.