miladesn said:
It does to some extent, but not nearly as effectively as edges, although MSAA is not based on edge detection.
Yeah, I don't mean that it uses edge detection in the 2D image processing sense, just that MSAA 'only' anti-aliases edges (using edge coverage etc.).
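To illustrate what I mean by 'edge coverage': an MSAA resolve just averages the sub-samples, so only pixels whose samples actually differ get blended. A toy sketch (made-up types, not any real API):

struct Color { float r, g, b; };

// 4x MSAA keeps one color per sub-sample; the resolve averages them.
Color resolve4x(const Color samples[4]) {
    Color out = {0.0f, 0.0f, 0.0f};
    for (int i = 0; i < 4; ++i) {
        out.r += samples[i].r * 0.25f;
        out.g += samples[i].g * 0.25f;
        out.b += samples[i].b * 0.25f;
    }
    // Interior pixels have four identical samples, so this is a no-op;
    // only pixels straddling a geometric edge get blended. Aliasing
    // inside a surface (shader/texture aliasing) is untouched.
    return out;
}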
The Intel paper is still the best public reference on MLAA (AFAIK), including the trade-offs vs SSAA. It shows failure cases like pixel-width objects and so on, but also discusses MLAA's advantages relative to SSAA's own problems.
http://visual-computing.intel-research.net/publications/mlaa.pdf
miladesn said:
I think the problem is that it's not something standard like MSAA; developers need to jump through hoops to get a custom algorithm working on an SPU. I think it requires a considerable amount of R&D, which 3rd parties aren't willing to spend money on when standardized, already-implemented methods exist. If it were implemented in dev kits, something developers could just turn on and use like MSAA or QAA, then yes.
This is all very true. I think it's exactly why Sony should make their implementation available, they'd be stupid not to.
For RDR, I'm sure that by the time the technique was gaining interest, the game was already late in development, and I highly doubt they had the time to start an experimental trip down this particular avenue.
However, I'm not sure how custom each game's implementation needs to be. I think if you could take, for example, GoW's implementation, it would be a huge head start over rolling your own. You'd probably want to tweak some variables for the best results in your frames, and you'd need to do your own testing and comparisons etc., but having the benefit of someone else's experience would be a big help. Enough of a help, I think, to make it a no-brainer to test.
miladesn said:
IQ is definitely superior to low levels of MSAA in most cases.
MLAA isn't cost-free performance-wise; basically an SPU is assigned to do AA full time.
The worst-case frame cost in GoW was 3-4ms, running on 5 SPUs. To try to boil that down to a percentage of CPU performance, for a 30fps game you're talking maybe 8% of CPU time. For latency reasons you probably wouldn't want one SPU working on it for half your frame time; it's better to split it across as many SPUs as you can fit it on.
Another big deal is the saving on the GPU - they freed up 6 or 7ms of RSX time (~20% of the frame in a 30fps game).
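For anyone who wants to check my arithmetic, here's the back-of-the-envelope version (the 6-SPUs-for-games figure and the 3.5ms/6.5ms midpoints are my own assumptions):

#include <cstdio>

int main() {
    const float frame_ms   = 1000.0f / 30.0f; // ~33.3ms per frame at 30fps
    const float mlaa_ms    = 3.5f;            // worst-case wall time quoted above
    const float spus_used  = 5.0f;
    const float spus_total = 6.0f;            // assumption: SPUs available to game code

    float spu_budget = frame_ms * spus_total; // ~200 SPU-ms of CPU per frame
    float mlaa_cost  = mlaa_ms * spus_used;   // ~17.5 SPU-ms per frame
    printf("CPU share: %.1f%%\n", 100.0f * mlaa_cost / spu_budget); // ~8.8%

    float rsx_saved_ms = 6.5f; // midpoint of the 6-7ms GPU saving
    printf("GPU saving: %.1f%%\n", 100.0f * rsx_saved_ms / frame_ms); // ~19.5%
    return 0;
}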
miladesn said:
In games with a massive open world that requires constant streaming, or where complex AI or physics is used, I think this method is not a good solution. RDR is a game with all of these things: you have Euphoria, a big open world, and lots of NPCs.
That's a big assumption. I doubt this stuff is hurting the CPU to the point of no headroom (unless it's being wildly inefficient or something).
miladesn said:
The problem is that edge detection algorithms are far from perfect (at least those that I've seen and worked with, like the well-known Canny detector); their performance depends on many factors like gamma, color contrast, brightness, etc. They are even less effective on complex surfaces like foliage, trees, or fences.
To say what you're saying in another way: MSAA's cost is relatively constant regardless of edge complexity, whereas MLAA's cost scales with how edge-heavy the frame is.
If even your worst-case frames with MLAA are cheaper than with 2xMSAA, it doesn't really matter that the cost increases on more edge-heavy frames. Even if MLAA were the same cost, or perhaps even a little more expensive, it could still be worthwhile if you're heavily GPU-bound (and if you're getting a better result, that doesn't hurt either).
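To make the contrast-sensitivity point concrete: the first pass of an MLAA-style filter is typically just a luma-delta test against a threshold, and that threshold is exactly where gamma/contrast/brightness bite. A toy version (my own sketch, nothing to do with the GoW code):

#include <cmath>

// Rec.601 luma weights; lumaBuf below is assumed to hold this per pixel.
float luma(float r, float g, float b) {
    return 0.299f * r + 0.587f * g + 0.114f * b;
}

// Flags a discontinuity between pixel (x, y) and the pixel below it.
// Too low a threshold and texture detail (foliage, fences) floods the
// edge list; too high and low-contrast geometric edges are missed.
bool edgeBelow(const float* lumaBuf, int width, int x, int y, float threshold) {
    return std::fabs(lumaBuf[y * width + x] - lumaBuf[(y + 1) * width + x]) > threshold;
}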
Anyway, we're straying a bit OT. But I do hope R* will have the opportunity to try out new AA tech for their next game on this engine.