
VGLeaks - Orbis GPU Detailed - compute, queues and pipelines

AgentP

Thinks mods influence posters' politics. Promoted to QAnon Editor.
Ok, then. What are the pros and cons of the PS4's UMA for AA and AF?

MSAA is very bandwidth intensive, so 176GB/s is good, but the days of 4xMSAA are probably over. I don't know what goes into AF calculations.
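
For a quick back-of-envelope on why it's bandwidth-heavy (my own rough numbers, assuming uncompressed RGBA8 colour plus a 32-bit depth/stencil per sample; real GPUs compress, so treat this as an upper bound):

# Rough MSAA framebuffer footprint at 1080p, assuming 4 bytes of
# colour and 4 bytes of depth/stencil per sample, no compression.
width, height = 1920, 1080
bytes_per_sample = 4 + 4  # RGBA8 + D24S8

for samples in (1, 2, 4):
    size_mb = width * height * samples * bytes_per_sample / (1024 ** 2)
    print(f"{samples}x: ~{size_mb:.0f} MB")

# Prints roughly 16, 32 and 63 MB; that target then gets read and
# written many times per frame, which is where the bandwidth goes.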
 

Biggzy

Member
MSAA is very bandwidth intensive, so 176GB/s is good, but the days of 4xMSAA are probably over. I don't know what goes into AF calculations.

AF is basically 'free' on any decent GPU now, so bye-bye to horrible blurry textures this gen. With the switch to deferred rendering seemingly happening more and more, I wouldn't be surprised to see other forms of AA being chosen for the PS4.
 
MSAA is very bandwidth intensive, so 176GB/s is good, but the days of 4xMSAA are probably over. I don't know what goes into AF calculations.

If a game is 1080p native, though, AA becomes less important, and 2xMSAA at 1080p looks as good as 4xMSAA does at 720p.
 

Biggzy

Member
Yeah, that will result in a huge improvement in IQ for next-gen console visuals. PC gamers have been benefiting from this for eons, it seems.

IQ improved with the jump to HD this gen, but the jump next gen should be even bigger than the one from last gen to this gen.
 

TronLight

Everybody is Mikkelsexual
Yeah, that will result in a huge improvement in IQ for next-gen console visuals. PC gamers have been benefiting from this for eons, it seems.

Yes, AF is really a non-issue; you can crank it up to 16x without losing a frame.
And if they can use SMAA as the standard for AA and not those blur-fests that are FXAA/TXAA/MLAA, even better.
I'm not really hoping for MSAA, not more than 2x anyway, since it tanks FPS on every deferred engine.
But if it's really a bandwidth problem, then maybe the PS4 will be able to pull it off. I don't really know if that's the case, though; on my PC, MSAA in Max Payne 3 or Batman: Arkham City kills the framerate, and my GPU has a lot of bandwidth...
Maybe if Forward+ rendering becomes a standard... AMD seemed enthusiastic about it; they basically described it as the best of both worlds.
 

Biggzy

Member
Yes, AF is really a non-issue; you can crank it up to 16x without losing a frame.
And if they can use SMAA as the standard for AA and not those blur-fests that are FXAA/TXAA/MLAA, even better.
I'm not really hoping for MSAA, not more than 2x anyway, since it tanks FPS on every deferred engine.
But if it's really a bandwidth problem, then maybe the PS4 will be able to pull it off. I don't really know if that's the case, though; on my PC, MSAA in Max Payne 3 or Batman: Arkham City kills the framerate, and my GPU has a lot of bandwidth...
Maybe if Forward+ rendering becomes a standard... AMD seemed enthusiastic about it; they basically described it as the best of both worlds.

MSAA is not worth the performance hit you take for deferred rendering - at least in my opinion - and you are better off using that bandwidth for other graphical features. I have heard of AMD's Forward+ but I don't know anything about it; we will see if the industry adopts it in the coming year.
 

TronLight

Everybody is Mikkelsexual
MSAA is not worth the performance hit you take for deferred rendering, and you are better off using that bandwidth for other graphical features. I have heard of AMD's Forward+ but I don't know anything about it; we will see if the industry adopts it in the coming year.

I don't know if you've already seen these:
http://www.youtube.com/watch?v=zYweEn6DFcU AMD Leo tech demo
http://www.youtube.com/watch?v=s2y7e3Zm1xc the same tech demo, but with some tech chat over it.

AMD says that, basically, you can have the advantages of deferred rendering (so lots of dynamic lights) but with forward's benefits (MSAA doesn't kill the framerate, and it's not so memory-intensive).

CD Projekt RED said that their new engine will support it.
Hopefully, if it really solves those problems without creating new ones, it'll become widely used.
 
MSAA is not worth the performance hit you take for deferred rendering - at least in my opinion - and you are better off using that bandwidth for other graphical features. I have heard of AMD's Forward+ but I don't know anything about it; we will see if the industry adopts it in the coming year.

No one is going to be using deferred rendering any more; the big push is towards forward tiled rendering.

https://www.youtube.com/watch?v=6DyTk7917ZI
https://www.youtube.com/watch?v=5cLOLE9Tn-g (before it got axed, Intel's Larrabee was specializing in forward tiled rendering solutions)

It gives you the same number of lights as deferred, but you get the bonus of AA, alpha and a variety of better shader effects.
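
For anyone curious what "forward tiled" actually does, here's a toy sketch of the per-tile light culling step. It's plain Python just to show the idea; in a real engine this runs in a compute shader, and the 16x16 tile size plus screen-space point lights are my own assumptions, not anything from AMD's slides.

TILE = 16

def build_light_lists(width, height, lights):
    # lights: list of (x, y, radius) circles of influence in screen space
    tiles_x = (width + TILE - 1) // TILE
    tiles_y = (height + TILE - 1) // TILE
    light_lists = [[] for _ in range(tiles_x * tiles_y)]
    for ty in range(tiles_y):
        for tx in range(tiles_x):
            x0, y0 = tx * TILE, ty * TILE      # tile bounds in pixels
            x1, y1 = x0 + TILE, y0 + TILE
            for i, (lx, ly, radius) in enumerate(lights):
                # closest point on the tile to the light's centre
                cx = min(max(lx, x0), x1)
                cy = min(max(ly, y0), y1)
                if (cx - lx) ** 2 + (cy - ly) ** 2 <= radius ** 2:
                    light_lists[ty * tiles_x + tx].append(i)
    return light_lists

# Shading is then a normal forward pass (so MSAA and alpha still work),
# but each pixel only loops over its tile's short light list instead of
# every light in the scene, which is how you get deferred-style light counts.
lists = build_light_lists(1920, 1080, [(400, 300, 120), (1500, 800, 60)])
print(len(lists), "tiles,", sum(len(l) for l in lists), "light references")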
 

Biggzy

Member
I don't know if you've already seen these:
http://www.youtube.com/watch?v=zYweEn6DFcU AMD Leo tech demo
http://www.youtube.com/watch?v=s2y7e3Zm1xc the same tech demo, but with some tech chat over it.

AMD says that, basically, you can have the advantages of deferred rendering (so lots of dynamic lights) but with forward's benefits (MSAA doesn't kill the framerate, and it's not so memory-intensive).

CD Projekt RED said that their new engine will support it.
Hopefully, if it really solves those problems without creating new ones, it'll become widely used.

If what they say is true, then I don't see why a lot of developers won't use this form of rendering.

I have seen that tech demo before, very impressive stuff.

No one is going to be using deferred rendering any more; the big push is towards forward tiled rendering.

https://www.youtube.com/watch?v=6DyTk7917ZI

It gives you the same number of lights as deferred, but you get the bonus of AA, alpha and a variety of better shader effects.

Doesn't tech advance quickly? I can remember when deferred rendering was all the rage; it still is, in fact.
 
If what they say is true, then I don't see why a lot of developers won't use this form of rendering.

I have seen that tech demo before, very impressive stuff.



Doesn't tech advance quickly? I can remember when deferred rendering was all the rage; it still is, in fact.

While this is all just my personal opinion:

I think the reason that MS isn't so concerned with the memory, and memory bandwidth, is partially due to their emphasis on these solutions, since you don't have the same sized g-buffer.

Judging by demos like these
http://www.youtube.com/watch?v=cdcqtjlBWCE
http://www.youtube.com/watch?v=M04SMNkTx9E

it's probably playing heavily into the architecture of their next console. It'll be interesting to see how this all plays out. I think Sony went for flexibility and MS went for specialization, which makes sense since MS is trying to homogenize development across its platforms.
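
To put a very rough number on the g-buffer point (the layout below is just an illustrative assumption; real engines vary a lot):

# Rough g-buffer footprint at 1080p for a typical deferred layout:
# four RGBA8 render targets (albedo, normals, specular, misc) plus
# a 32-bit depth/stencil buffer. Target count and formats are assumptions.
width, height = 1920, 1080
colour_targets = 4
bytes_per_pixel = colour_targets * 4 + 4

gbuffer_mb = width * height * bytes_per_pixel / (1024 ** 2)
print(f"g-buffer: ~{gbuffer_mb:.0f} MB")  # ~40 MB before any MSAA

# That's already bigger than a 32 MB scratchpad like the rumoured eSRAM,
# whereas a tiled forward renderer mostly just needs one colour target,
# a depth target and the per-tile light lists.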
 

Biggzy

Member
While this is all just my personal opinion:

I think the reason that MS isn't so concerned with the memory, and memory bandwidth, is partially due to their emphasis on these solutions, since you don't have the same sized g-buffer.

Judging by demos like these
http://www.youtube.com/watch?v=cdcqtjlBWCE
http://www.youtube.com/watch?v=M04SMNkTx9E

it's probably playing heavily into the architecture of their next console. It'll be interesting to see how this all plays out. I think Sony went for flexibility and MS went for specialization, which makes sense since MS is trying to homogenize development across its platforms.

I was discussing that Lionhead tech demo with a user in another thread not too long ago. We agreed that we would be surprised if that tech demo wasn't being influenced by Durango.
 

RoboPlato

I'd be in the dick
Yes, AF is really a non-issue; you can crank it up to 16x without losing a frame.
And if they can use SMAA as the standard for AA and not those blur-fests that are FXAA/TXAA/MLAA, even better.
I'm not really hoping for MSAA, not more than 2x anyway, since it tanks FPS on every deferred engine.
But if it's really a bandwidth problem, then maybe the PS4 will be able to pull it off. I don't really know if that's the case, though; on my PC, MSAA in Max Payne 3 or Batman: Arkham City kills the framerate, and my GPU has a lot of bandwidth...
Maybe if Forward+ rendering becomes a standard... AMD seemed enthusiastic about it; they basically described it as the best of both worlds.

I hope we see a lot of SMAA next gen. It's a great, low-impact method that cleans up the image well, especially at 1080p.
 
I was discussing that Lionhead tech demo with a user in another thread not too long ago. We agreed that we would be surprised if that tech demo wasn't being influenced by Durango.

Same, but I think it actually stems more from the memexport instruction on the 360, and I think the DMEs are supposed to act as a hardware implementation of that feature.
 
Yeah, that will result in a huge improvement in IQ for next-gen console visuals. PC gamers have been benefiting from this for eons, it seems.

This alone sold me on the PS4. I know PC gamers will scoff at this because they've been privy to godly IQ, but honestly, Sony's first party is second to none IMHO.
 

Biggzy

Member
Same, but I think it actually stems more from the memexport instruction on the 360, and I think the DMEs are supposed to act as a hardware implementation of that feature.

Wouldn't surprise me, and it makes some sense because the tech demo was running on the 360, but I find it hard to believe that the Lionhead engineers either didn't have any input on the hardware of Durango or didn't hear where it was heading architecture-wise, even that early in development.

I hope we see a lot of SMAA next gen. It's a great, low-impact method that cleans up the image well, especially at 1080p.

SMAA is fantastic as it combines the best qualities of MLAA and MSAA into one.
 
Wouldn't surprise me, and it makes some sense because the tech demo was running on the 360, but I find it hard to believe that the Lionhead engineers either didn't have any input on the hardware of Durango or didn't hear where it was heading architecture-wise, even that early in development.



SMAA is fantastic as it combines the best qualities of MLAA and MSAA into one.

Why don't they do SMAA on current consoles?
 

Biggzy

Member
Crysis 3 does. I'm not sure why other games don't.

SMAA T2X, to be precise, and thank goodness they went away from temporal AA. I don't want to see that anywhere near the next-gen consoles.

IQ won't be much of a problem on either console if the specs hold up.

PS4 should have no trouble at all: 32 ROPs and 176GB/s of bandwidth should be plenty for a 1080p buffer with a nice layer of quality AA, and as I already said, AF is basically 'free' now. So you have the holy trinity of IQ right there.
 

Gorgon

Member
That poster got it wrong; it's content, not quality. But it's still not far-fetched.

"Titles for Xbox 360 must ship at least simultaneously with other video game platform, and must have at least feature and content parity on-disc with the other video game platform versions in all regions where the title is available. If these conditions are not met, Microsoft reserves the right to not allow the content to be released on Xbox 360."

If that were correct, no game would ever have shipped with exclusive content on the PS3, which we know isn't true. Obviously, MS may "reserve the right" to say no, but they never would to a big title. It would hurt them more than any perceived inferiority.
 

i-Lo

Member
Sorry to give it a bump, but I have some pertinent questions. It is becoming more apparent that both Sony and MS are looking to maximise efficiency to reach the performance implied by their GPUs' theoretical "FLOPS" numbers. As such, here are four questions:

  • What kind of efficiency percentage are Sony (given Reiko keeps repeating that MS are going for 100%) and AMD looking to pull from this 1.84 TFLOPS Pitcairn derivative?
  • Also, when they talk of reaching higher efficiency, do they mean peak or sustainable performance?
  • What would the efficiency of an HD 7850 be in a PC environment?
  • Given Nvidia FLOPS are lower than AMD FLOPS despite the former being able to match or exceed the latter (e.g. GTX 680 vs HD 7970) in real-world performance, does that indicate that Nvidia cards are more efficient?
 

Mario007

Member
Sorry to give it a bump, but I have some pertinent questions. It is becoming more apparent that both Sony and MS are looking to maximise efficiency to reach the performance implied by their GPUs' theoretical "FLOPS" numbers. As such, here are four questions:

  • What kind of efficiency percentage are Sony (given Reiko keeps repeating that MS are going for 100%) and AMD looking to pull from this 1.84 TFLOPS Pitcairn derivative?
  • Also, when they talk of reaching higher efficiency, do they mean peak or sustainable performance?
  • What would the efficiency of an HD 7850 be in a PC environment?
  • Given Nvidia FLOPS are lower than AMD FLOPS despite the former being able to match or exceed the latter (e.g. GTX 680 vs HD 7970) in real-world performance, does that indicate that Nvidia cards are more efficient?

From what I remember being written on GAF, that 100% efficiency is a feature of the AMD GPU architecture rather than an MS-exclusive feature.
 
If that were correct, no game would ever have shipped with exclusive content on the PS3, which we know isn't true. Obviously, MS may "reserve the right" to say no, but they never would to a big title. It would hurt them more than any perceived inferiority.

Probably in cases where a developer makes it big on one franchise title, or is a big player anyway, and then has the leverage to get exclusive content by strong-arming the manufacturer into forgetting about "reserve the right to say no".
 

antic604

Banned
I have a different question for those in the know: if Durango is built to take advantage of forward tiled renderers, how would the PS4 cope with such tech?
 

KidBeta

Junior Member
I have a different question for those in the know: if Durango is built to take advantage of forward tiled renderers, how would the PS4 cope with such tech?

Fine. Unless you're doing something that requires the super-low latency of the eSRAM (which you're probably not going to be), the extra bandwidth of the GDDR5 will probably even help, and you wouldn't even need to tile.
 

mrklaw

MrArseFace
Sorry to give it a bump, but I have some pertinent questions. It is becoming more apparent that both Sony and MS are looking to maximise efficiency to reach the performance implied by their GPUs' theoretical "FLOPS" numbers. As such, here are four questions:

  • What kind of efficiency percentage are Sony (given Reiko keeps repeating that MS are going for 100%) and AMD looking to pull from this 1.84 TFLOPS Pitcairn derivative?
  • Also, when they talk of reaching higher efficiency, do they mean peak or sustainable performance?
  • What would the efficiency of an HD 7850 be in a PC environment?
  • Given Nvidia FLOPS are lower than AMD FLOPS despite the former being able to match or exceed the latter (e.g. GTX 680 vs HD 7970) in real-world performance, does that indicate that Nvidia cards are more efficient?

The recent rumour about the multiple ACEs in the PS4 suggests they are pushing for a lot of efficiency around compute in particular. It has been a weakness of GCN to some extent, but the PS4's multiple queues look to address that and should also help mask latency to an extent.
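
As a toy illustration of the latency-masking point (purely my own sketch with made-up numbers, nothing to do with how the ACEs actually arbitrate work):

# Toy model: each job is (busy_ms, stall_ms), where a stall is time the
# shader cores would sit idle waiting on memory. With one queue a stall
# blocks everything behind it; with more queues the GPU can pull ready
# work from another queue while one job waits. Best-case approximation.
jobs = [(4, 2), (3, 5), (6, 1), (2, 4)]

busy = sum(b for b, _ in jobs)
stall = sum(s for _, s in jobs)

single_queue = busy + stall      # stalls fully exposed
many_queues = max(busy, stall)   # stalls overlapped with other queues' work

print("one queue  :", single_queue, "ms")  # 27 ms
print("many queues:", many_queues, "ms")   # 15 ms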
 