
The end of your AA woes! MLAA coming to AMD drivers

Ceebs

Member
Konosuke said:
Better IQ and performance trump cosmetic-only graphical improvements. If PhysX ever had a meaningful gameplay impact on a large number of games it would be a more persuasive feature.
 
So what are all the graphics card families that support MLAA? I want this but I still don't know what my next graphics card will be.
 

Shai-Tan

Banned
I don't see what's wrong with that Dead Space image. It does a good job smoothing the edges. The textures in that game are already horribly blurry even with 0xAA.
 
ChoklitReign said:
So what are all the graphics card families that support MLAA? I want this but I still don't know what my next graphics card will be.

Only 6xxx series AMD cards so far; we'll see how this develops.
 

1-D_FTW

Member
AlStrong said:
That looks awful. The Intel MLAA algorithm shouldn't be that blurry.

But hey, it's FASTER THAN SUPER SAMPLING.

Reminds me of the early days when I refused to use AA because of the smeared look.
 

Lathentar

Looking for Pants
brain_stew said:
Clearly it's not possible for the 360. 4 ms is much too long for most games.

It would be nice if they shared their code. I did a rewrite of the original MLAA adapting the guy's code to make it a bit more generic and use a lot less memory (in place instead of in a copy of the frame buffer).
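For anyone curious what that kind of pass looks like, here's a toy Python sketch of the basic idea (mine, not Lathentar's code): detect luminance discontinuities and blend across them. Real MLAA classifies edge patterns (L/Z/U shapes) and uses coverage-based blend weights rather than a flat average, and an in-place version has to be careful to read original values rather than already-blended ones — which is why this sketch writes into a copy.

```python
import numpy as np

def edge_blend(img, threshold=0.1):
    """Toy MLAA-style pass: blend across vertical luminance edges.
    Reads from the original buffer and writes into a copy, so
    already-blended pixels never feed back into the filter."""
    out = img.copy()
    h, w = img.shape
    for y in range(h):
        for x in range(w - 1):
            if abs(img[y, x] - img[y, x + 1]) > threshold:
                # Average the two pixels on either side of the edge.
                avg = 0.5 * (img[y, x] + img[y, x + 1])
                out[y, x] = avg
                out[y, x + 1] = avg
    return out

# One scanline with a single hard edge between 0.0 and 1.0.
row = np.array([[0.0, 0.0, 1.0, 1.0]])
print(edge_blend(row))  # the two edge pixels become 0.5
```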
 
Shai-Tan said:
I don't see what's wrong with that Dead Space image. It does a good job smoothing the edges. The textures in that game are already horribly blurry even with 0xAA.
That's not what AMD is using, apparently.
 

Corky

Nine out of ten orphans can't tell the difference.
So what's the possibility of Nvidia countering this with their own version of MLAA that's supported by current GTXs?

Please say high.
 
Lathentar said:
Clearly it's not possible for the 360. 4 ms is much too long for most games.

That's with XNA, and it's still quicker than 2x MSAA in games that use deferred rendering, according to some of the developers at B3D.
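To put that 4 ms figure in perspective, here's the quick frame-budget arithmetic (the fps targets are my own illustrative numbers, not from the thread):

```python
# A 4 ms post-process pass as a share of common frame budgets.
MLAA_MS = 4.0
for fps in (30, 60):
    budget_ms = 1000.0 / fps     # total time available per frame
    share = MLAA_MS / budget_ms  # fraction of the frame eaten by the pass
    print(f"{fps} fps: {budget_ms:.1f} ms/frame, 4 ms pass = {share:.0%}")
```

At 30 fps that's roughly 12% of the frame, and at 60 fps nearly a quarter of it, which is why a 4 ms pass reads as expensive.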
 
Corky said:
So what's the possibility of Nvidia countering this with their own version of MLAA that's supported by current GTXs?

Please say high.

I'd wager high; Nvidia are big on adding driver enhancements.
 
Konosuke said:

Not really a big deal - it's nice to have, but outside of a tiny number of games it means very little in the bigger picture (and you need either a dedicated PhysX or very powerful graphics card to pull it off at good framerates). MLAA and 3D on the other hand - these can retroactively work in most games.
 
Lathentar said:
Clearly it's not possible for the 360. 4 ms is much too long for most games.

It would be nice if they shared their code. I did a rewrite of the original MLAA adapting the guy's code to make it a bit more generic and use a lot less memory (in place instead of in a copy of the frame buffer).
Wouldn't it be possible to make an ENB-series kind of program that hooks into the game and applies MLAA? That would own.
 

Cheeto

Member
brain_stew said:
No, you're thinking of quincunx. This is the filter used in GOW3 and LBP2, and those two games probably have the best image quality of any console game on the market. We'll wait and see how AMD's implementation measures up.
Oh my bad, excuse my ignorance... the example image just seemed to give the same effect as quincunx.
 

Gwanatu T

Junior Member
Withnail said:
GOW3 has some of the best IQ I've ever seen in a game. This sounds like good news for PC gamers.

Agreed, and it's an incredibly efficient method from what I understand. This is another reason for me to go 69xx this year.
 

jarosh

Member
Well, it looks pretty damn good here, that's for sure:

results.jpg
 

teiresias

Member
Don't worry, no one can increase inefficiency and decrease performance with supposedly low-overhead processing better than the AMD driver team.
 
Could someone with the know-how animate/blink those two images above ^^ like the Dead Space image?

e: never mind, I just compared them in tabs. They look very similar.
 
Lathentar said:
Clearly it's not possible for the 360. 4 ms is much too long for most games.
Probably worth noting that the tests were done in XNA. I'd love for someone to loan them a devkit to see what sort of difference it makes.
 

forrest

formerly nacire
opticalmace said:
Could someone with the know-how animate/blink those two images above ^^ like the Dead Space image?

e: never mind, I just compared them in tabs. They look very similar.

You need to compare the bottom two with the original posted. The original is showing the MLAA.

Being a 5850 owner, I'll wait and see, but I fully expect someone to get it working on older cards. Any idea on when AMD plans to roll this out?
 

nubbe

Member
Seems like it might be worth going from a 4870 to a 6870.
But I'll wait for the Crysis benchmarks... 120fps or bust.
 
kinggroin said:
Which should be the equivalent of the 5970? Which means no way in hell you're getting 120fps in the first Crysis.

It's 2x Cayman, which is better than Cypress, but no one knows by how much at this point.
 

subversus

I've done nothing with my life except eat and fap
AMD has been on a roll lately. NVIDIA, get your shit together, there's no crown on your head anymore and you haven't noticed yet.
 

Shambles

Member
subversus said:
AMD has been on a roll lately. NVIDIA, get your shit together, there's no crown on your head anymore and you haven't noticed yet.

gtx-480chip.jpg


Until the 6900 series comes out, they're still sitting on the throne.
 

Combichristoffersen

Combovers don't work when there is no hair
[Montgomery Burns]Excellent[/Montgomery Burns]

Bring this to my 5770 kthxbye

And I guess this means Nvidia has to get their shit together. Hasn't AMD/ATI been consistently kicking Nvidia in the balls for the last year or so?
 

subversus

I've done nothing with my life except eat and fap
Combichristoffersen said:
[Montgomery Burns]Excellent[/Montgomery Burns]

Bring this to my 5770 kthxbye

And I guess this means Nvidia has to get their shit together. Hasn't AMD/ATI been consistently kicking Nvidia in the balls for the last year or so?
Yes, they have.

But it's good. Expect something cool from Nvidia in their next generation.
 

Lord Error

Insane For Sony
JaseC said:
MLAA-induced blur is practically non-existent in God of War III.
Yeah, their implementation is even better than Intel's prototype. Not just in image quality, but in temporal consistency as well, which is where Intel's implementation falls short (antialiased pixels shift too much from frame to frame).
 
Binabik15 said:
How'd MLAA affect Crysis and its foliage aliasing?

PS: First time in months I *might* feel silly for buying a 5850 last winter.

As per the slide, it works with alpha test, so it should be a much better fit than msaa for Crysis so long as AMD don't bork their implementation.
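The reason alpha-test support matters: alpha *testing* discards fragments by thresholding in the shader, so the resulting hard edge never goes through MSAA's geometry-edge resolve, while a post-process filter sees the final image and can smooth it anyway. A minimal illustration (my own numbers, not from the slide; the 3-tap average just stands in for a real MLAA blend):

```python
import numpy as np

# Foliage-style alpha values along one row of a texture.
alpha = np.array([0.9, 0.7, 0.45, 0.2, 0.05])

# Alpha test: each pixel is kept or discarded outright, so coverage
# is binary no matter how many MSAA samples you take.
coverage = (alpha > 0.5).astype(float)
print(coverage)  # hard staircase edge: [1. 1. 0. 0. 0.]

# A post-process pass (MLAA-style) filters the final image, so it
# can soften that edge; a simple 3-tap average stands in here.
smoothed = np.convolve(coverage, np.ones(3) / 3, mode="same")
print(smoothed)
```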
 

Binabik15

Member
brain_stew said:
As per the slide, it works with alpha test, so it should be a much better fit than msaa for Crysis so long as AMD don't bork their implementation.


A9gwV.gif


Full disclaimer: I play Crysis at 1080p 0xAA, so anything would be an improvement ;)
 
If we assume that a 6xxx card exists that is better than a 5970 2GB, then that means Crysis at 60fps, on Very High, in 1680x1050 with MLAA and 32xAF is a very real prospect.

Homer Simpson drool is an understatement.
 
Binabik15 said:
How'd MLAA affect Crysis and its foliage aliasing?

PS: First time in months I *might* feel silly for buying a 5850 last winter.

I'd be shocked if there doesn't end up being a way of enabling this on 5xxx series cards. AMD will have to do it anyway, since they're rebadging 57xx cards as 67xx series cards, so just wait for an unofficial tweak to pop up. Radeon Pro will probably build in support.
 

Erasus

Member
Seems awesome if it has a smaller performance hit than MSAA! I usually play without AA because I don't really mind it in motion, and my 4830 can only render so much at 1920x1080.
 