
The end of your AA woes! MLAA coming to AMD drivers

UPDATE 3: Comparison pics time! :D

JADS said:


Valru said:
valrumlaasp2d.png


If you want to take your own screenshots, simply run your games in windowed mode and use the Windows 7 Snipping Tool. FRAPS will not work.



UPDATE 2: Now working on all 5xxx series cards!


Simply install the driver linked below (the 10.10a hotfix with the MLAA edit slipstreamed in, provided by mr_nothin), then restart, and you should have the option to enable MLAA in CCC:

http://www.mediafire.com/?5h6cpooivo4sbox

If this doesn't work then try:

You can try to add MLAA manually:
Open Regedit and navigate to HKEY_LOCAL_MACHINE\SYSTEM\ControlSet001\Control\Class\{4D36E968-E325-11CE-BFC1-08002BE10318}\0000, then look for MLF_NA and change its value from 1 to 0.
If MLF_NA is not present: add it!
Restart your computer and relaunch CCC.
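If you'd rather not click around in Regedit, the same tweak can be saved as a .reg file and double-clicked. This is just a sketch of the edit described above; it assumes MLF_NA is a DWORD value (the posts here don't confirm the value type), so compare against a working install before trusting it:

```reg
Windows Registry Editor Version 5.00

; Same key as in the manual instructions (the display-adapter class key,
; subkey 0000, under ControlSet001). Assumes MLF_NA is a DWORD; setting
; it to 0 is what unlocks the MLAA option in CCC.
[HKEY_LOCAL_MACHINE\SYSTEM\ControlSet001\Control\Class\{4D36E968-E325-11CE-BFC1-08002BE10318}\0000]
"MLF_NA"=dword:00000000
```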


RadeonPro is currently being updated, so soon you'll have the ability to apply MLAA per application without any driver or registry fiddling.
----------------------------------------------------------------------------------------


As many know, MSAA is incompatible with multiple render targets under DX9, and as the popularity of MRTs has grown, there's been an increasing number of PC games with no AA solution at all; notable examples include Dead Space and GTA 4. It's sometimes possible to enable brute-force supersampling, but even when this does work (and it's often a crapshoot), the insane performance penalty makes it rather infeasible in modern games.

However, fear not! Since MLAA (as popularised by GOW3 on the PS3) is a simple post-process filter and nothing more, it's long been argued that it should be quite simple to force it on top of any DX9/10/11 application without interrupting the rendering process. In fact, some work has already been done in this area and the early performance numbers are hugely encouraging. A simple 9800GTX can deliver AA of similar quality to 4xMSAA in just 0.5ms! Now AMD have stepped up to the plate and are going to include an MLAA setting in their drivers for their new 6xxx series cards! Coupled with RadeonPro, you'll be able to force MLAA per application with ease in any DX9/10/11 game! :D Fantastic!
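To make the "simple post-process filter" point concrete, here's a toy sketch of the idea in Python/NumPy: find luminance discontinuities and blend colour across them. This is my own illustration, not AMD's or Sony's actual algorithm; real MLAA also classifies edge shapes (L/Z/U patterns) and computes coverage-based blend weights instead of a flat 50/50 mix.

```python
import numpy as np

def mlaa_sketch(img, threshold=0.1):
    """Toy morphological-AA-style pass (illustration only).

    img: HxWx3 float array in [0, 1].
    Real MLAA classifies edge shapes and weights blends by edge length;
    this sketch just does a flat blend across any luma discontinuity.
    """
    # Per-pixel luminance (Rec. 601 weights).
    luma = img @ np.array([0.299, 0.587, 0.114])
    out = img.copy()
    h, w = luma.shape

    # Vertical edges: blend horizontally across the discontinuity.
    for y in range(h):
        for x in range(w - 1):
            if abs(luma[y, x] - luma[y, x + 1]) > threshold:
                avg = 0.5 * (img[y, x] + img[y, x + 1])
                out[y, x] = 0.5 * (out[y, x] + avg)
                out[y, x + 1] = 0.5 * (out[y, x + 1] + avg)

    # Horizontal edges: blend vertically across the discontinuity.
    for y in range(h - 1):
        for x in range(w):
            if abs(luma[y, x] - luma[y + 1, x]) > threshold:
                avg = 0.5 * (img[y, x] + img[y + 1, x])
                out[y, x] = 0.5 * (out[y, x] + avg)
                out[y + 1, x] = 0.5 * (out[y + 1, x] + avg)
    return out
```

On a hard black/white edge this nudges the two boundary pixels towards each other while leaving flat areas untouched; because it only reads and writes the final framebuffer, it can sit at the end of any rendering pipeline (which is also why it hits HUDs and text).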


Here's the leaked slide with details of their implementation:

8.jpg


There's not a single technical reason why this can't work on 5xxx series cards, since the 6xxx series brings no new functionality. And since it's just a simple post-process that has been proven to work under DX9, there's no reason it can't be brought to even older cards either. The limitation is more marketing than technical, though reworking their implementation for non-DX11-class hardware could prove more work. We'll see.

Hopefully this lights a fire under Nvidia's arse, as this is a great USP to have imo. As demonstrated previously, even at 1080p MLAA should be practically free on a modern PC GPU, and it's roughly 1000% faster than 8xMSAA in most cases (at least it was on a 9800GTX).

I'm eager to see how well AMD's implementation works.

UPDATE:

It's the real deal! :D

It has practically no noticeable impact on performance at all and the results, while slightly destructive, are still very good indeed.

morphological%20aa.png


morphological%20aa%20comparo.gif


The one real "gotcha" is that it affects the HUD as well. That's not surprising, since it's a post-process applied after the fact rather than integrated into the game's rendering pipeline, but it's not ideal either. Here's a gif to demonstrate the effect.

morphological%20aa%20text.gif
 
i wonder if this'll be a similar situation to SSAA being unsupported but sort of possible to get going on cards lower than the 5000 series. i hope so.
 

GWX

Member
I hope MLAA hits NVIDIA cards soon. If my 9800GTX+ can do it (as brain_stew said in the OP), gimme!
 
plagiarize said:
i wonder if this'll be a similar situation to SSAA being unsupported but sort of possible to get going on cards lower than the 5000 series. i hope so.


I'm hoping it is completely supported without jumping through hoops.
 
AlStrong said:
That looks awful. But hey, it's FASTER THAN SUPER SAMPLING.

There have been plenty of practical implementations showing that MLAA can deliver very high quality edge smoothing without destroying texture detail. I'll wait to see it applied to an actual game scenario before I judge the quality of AMD's implementation; it's not as if they can't improve the algorithm over time. The important thing is that they're committed to supporting MLAA at the driver level.

Anything is better than no AA support at all, and there are currently dozens of DX9 games in this bracket, so any improvement is welcome. MRTs can often break driver-forced supersampling as well; since this is just a post-process filter and isn't messing with the framebuffer in any way, it should have much greater compatibility.

And again, if supersampling is your only current option for AA in a particular game, anything that delivers decent AA at a lower cost is welcome. So yes, just being "FASTER THAN SUPER SAMPLING" is reason enough to celebrate this news.
 

BeeDog

Member
This is probably a retarded question, but how come the PS3 has mighty PC's beaten to the punch in regards to MLAA? Is it simply because the Sony teams researched the subjects before, or was it simply that PC devs didn't bother implementing MLAA since PC hardware has the power to brute-force the classic AA options?
 

Stink

Member
BeeDog said:
This is probably a retarded question, but how come the PS3 has mighty PC's beaten to the punch in regards to MLAA? Is it simply because the Sony teams researched the subjects before, or was it simply that PC devs didn't bother implementing MLAA since PC hardware has the power to brute-force the classic AA options?

Necessity is the mother of invention.
 
BeeDog said:
This is probably a retarded question, but how come the PS3 has mighty PC's beaten to the punch in regards to MLAA? Is it simply because the Sony teams researched the subjects before, or was it simply that PC devs didn't bother implementing MLAA since PC hardware has the power to brute-force the classic AA options?

MSAA is dog slow on RSX, so the PS3 really didn't have any good option for AA previously, while developers still had half a dozen SPEs to hand tasks to. The need necessitated the solution but as you can see in the presentation I linked, MLAA should actually be (much) faster on a modern GPU than it is on CELL. The first white paper on MLAA was actually from an Intel employee and he ran the algorithm on a single core x86 CPU.


Stink said:
Necessity is the mother of invention.

:lol

You said in 5 words what I said in three lines! This is basically all there was to it.
 
mikespit1200 said:
So, it's feasible that someone could mod the new AMD drivers so my 5770 can get in on some MLAA goodness?

It's possible AMD will offer official support (especially since 5770 cards are soon to become part of the 6xxx series), but we'll have to wait and see. There's nothing stopping some enterprising individual from creating a third-party app that applies the algorithm on all DX10/DX11 GPUs; it's just a case of wait and see. This is an important first step towards universal MLAA support on modern PC GPUs.
 

JB1981

Member
didn't this first show up in that open world nazi-era game from pandemic? what was the name again? anyway, the gow 3 implementation is the best yet. game has almost bullshot-level AA
 

AlStrong

Member
brain_stew said:
There's been plenty of practical implementations that show MLAA can deliver very high quality edge smoothing without destroying texture detail.

That screenshot above isn't good quality and neither is the performance claim.

And again, if supersampling is your only current option for AA in a particular game, anything that delivers decent AA at a lower cost is welcome. So yes, just being "FASTER THAN SUPER SAMPLING" is reason enough to celebrate this news.

The point I was making was just how ridiculously obvious the statement was. My car is faster than a dog. ORLY.

But what's more worrying is that they claim similar performance to EDCFAA. This is not that awesome.
 

jett

D-Member
AlStrong said:
That looks awful. The Intel MLAA algorithm shouldn't be that blurry.

But hey, it's FASTER THAN SUPER SAMPLING.

You say awful, but if the end result is anything like GOW3 (by MILES the best image quality in a console game), it should be pretty great.
 

AlStrong

Member
jett said:
You say awful but if the end result is anything like GOW3(by MILES the best image quality in a console game), it should be pretty great.

The blur in the above is worse than that of God of War 3.
 
AlStrong said:
That screenshot above isn't good quality and neither is the performance claim.



The point I was making was just how ridiculously obvious the statement was. My car is faster than a dog. ORLY.

But what's more worrying is that they claim similar performance to EDCFAA. This is not fast.

It's not as if I disagree, but I'm willing to wait to see how it actually looks and performs in-game before I make a final judgement. We all know what the algorithm is capable of, so I'm not particularly worried if the first implementation isn't exactly ideal; the mere indication that this is something AMD are willing to work on is enough to set the ball rolling.

GPU vendors hate not having feature parity, so it's pretty safe to say this has greatly increased the likelihood of Nvidia working on their own solution. If Nvidia are able to deliver better image quality than God of War, then the onus will be on AMD to improve their implementation.

I'm not trying to sell this as a silver bullet (it's not), but it has major promise and it's great to see progress being made.
 

BobsRevenge

I do not avoid women, GAF, but I do deny them my essence.
Zombie James said:
If it's just a simple post-process filter, get it working on my 4770.
Now that Crysis can already run at 60fps on currently released hardware, they have to sell a new series of video cards somehow. :lol :lol
 
BobsRevenge said:
Now that Crysis can already run at 60fps on currently released hardware, they have to sell a new series of video cards somehow. :lol :lol

This is actually the reason why I worried it may never come to AMD/Nvidia drivers. Now the first move has been made, all bets are off.
 
DonMigs85 said:
Say, do modern Nvidia GPUs still even offer Quincunx at all? It should die a swift death now.

Yup, though not officially. It's useful for some scenarios imo, nice to play with anyway.
 

Konosuke

Member
This was a rollercoaster ride :lol

Reads thread title, fuck yeah!
Reads "drivers for their new 6xxx series cards", damn it!
Remembers having a 5770, hope restored!
Reads "No reason why this can't work on 5xxx series cards", fuck yeah!

I want to test it, hurry it up AMD.
 

Lord Error

Insane For Sony
brain_stew said:
The need necessitated the solution but as you can see in the presentation I linked, MLAA should actually be (much) faster on a modern GPU than it is on CELL. The first white paper on MLAA was actually from an Intel employee and he ran the algorithm on a single core x86 CPU.
Performance impact depends on the quality of the implementation. Based on the image above, comparing this to SCE's algorithm is apples and oranges.
 

Mr_Brit

Banned
I NEED SCISSORS said:
Holy crap. Best USP of the new cards by far. The only real reasons to go with Nvidia are nHancer and 3D Vision and now AMD are rivalling both.
And you know, drivers that actually work and don't break older games with every release.:D
 

Ceebs

Member
While this sounds awesome, the fact that I will never see it on my 4870 makes me less than excited. Someone needs to hurry up and give me a reason to upgrade!
 
Lord Error said:
Performance impact depends on quality of implementation. Based on the image above it's comparing apples and oranges between this and SCEs algorithm.

I was talking about this implementation when quoting render times, and the quality there is a very good match for GOW3 despite taking only 0.5ms of render time on a puny 9800GTX.
 
Ceebs said:
While this sounds awesome, the fact that I will never see it on my 4870 makes me less than excited. Someone needs to hurry up and give me a reason to upgrade!

I don't know how you can draw that conclusion at all. There's no technical reason why a tweaked implementation can't run on a 4870, it just won't have (official) support at launch.
 

BobsRevenge

I do not avoid women, GAF, but I do deny them my essence.
Darklord said:
So it makes it less jagged by making it fuzzy? I dunno...I think I'd rather clear and jagged.
It's just zoomed in really close. This is what anti-aliasing does.
 

kittoo

Cretinously credulous
DieH@rd said:
6870 here i come! Finally I will be able to play Darksiders in TRUE 1080p@60 bullshots per second.

we should wait for reviews and comparisons etc
 
Cheeto said:
Is this the filter that makes most PS3 games look like poop?

No, you're thinking of quincunx. This is the filter used in GOW3 and LBP2 and those two games probably have the best image quality of any console game on the market. We'll wait and see how AMD's implementation measures up.
 

DieH@rd

Banned
kittoo said:
we should wait for reviews and comparisons etc

Darksiders already has stylized, muddy textures; even if MLAA muddies them further, there wouldn't be much of a difference. And the game itself has pretty low system requirements, so there's tons of headroom for the eventual performance hit.
 
Well-implemented MLAA would've made me switch to AMD, no joke. Then I saw the blurry mess. It looks like edge detection plus a Gaussian blur.

I don't know if Intel's algorithm is infeasible for real-time rendering, but it looks much better.

Here's an example of Intel's solution in Dead Space:
6sIcU.gif
 