
The end of your AA woes! MLAA coming to AMD drivers

Nabs

Member
question: anyone getting some weird purple/pink thing going on with the new drivers? gaf looks kinda funny.
 

Wallach

Member
Nabs said:
question: anyone getting some weird purple/pink thing going on with the new drivers? gaf looks kinda funny.

Yeah, I totally thought it was me. :lol

Not sure what the deal is with the color settings, looking into it.
 

Nabs

Member
apparently the color temp got changed to 6600K from 6500K. just go into desktop & display and configure your monitor. Color > Color Temp Control. glad that's over with :lol
 

gillty

Banned
not so good results with HL2, also the menus look a mess.

valrumlaa-hl2-1xsia.png

valrumlaa-hl2-2mruj.png
 

Wallach

Member
Nabs said:
apparently the color temp got changed to 6600K from 6500K. just go into desktop & display and configure your monitor. Color > Color Temp Control. glad that's over with :lol

Yeah, that fixed it.

Also, New Vegas does not like MLAA one bit. Totally gross.
 
Disappointing but not surprising. I thought it would fall apart with thin geometry and lo and behold it does. Thanks for those shots, they're precisely what I was looking for.
 

Wallach

Member
brain_stew said:
Disappointing but not surprising. I thought it would fall apart with thin geometry and lo and behold it does. Thanks for those shots, they're precisely what I was looking for.

Yeah, this is definitely New Vegas' problem as well. Tons of thin geometry in the world, especially as the LOD kicks in on distant structures.

Will have to go back to adaptive MSAA on that one for now.
 
Nabs said:
question: anyone getting some weird purple/pink thing going on with the new drivers? gaf looks kinda funny.

The new driver changed the default colour temperature. You'll have to recalibrate your screen.
 

Dalauz

Member
Wallach said:
Yeah, this is definitely New Vegas' problem as well. Tons of thin geometry in the world, especially as the LOD kicks in on distant structures.

Will have to go back to adaptive MSAA on that one for now.
just wondering, are you using that "fixed" d3d9.dll?
 

Wallach

Member
Dalauz said:
just wondering, are you using that "fixed" d3d9.dll?

Yessir. The reason I even bothered to try it was because using that spoof prevents you from enabling transparency multisampling, which is annoying. I wouldn't mind if I could afford to run 4xSSAA, but I can't really without it causing a bit too much stuttering in VATS (though I'm tempted to because 4xSSAA looks fuck awesome).
 

stuminus3

Member
Sorry, I think this looks like shit so far. I've tried to play some stuff that's really bright so I can see the jaggies/blurs clearly (stuff like OutRun 2006 that has very high contrast), and compared to straight 4x MSAA it looks like mud on my 1600x900 screen. I like my PC gaming crisp, whether it's aliased or not, and MLAA isn't crispy.

A reasonable solution for games that currently don't work with MSAA or have serious performance problems with MSAA? Sure, it might be a good enough replacement. I don't see it replacing more traditional AA methods though.

Note: my opinion may change as I play more games. But even a glance at the Steam overlay with MLAA on makes me want to puke. :lol

EDIT: just a thought, too... I'm thinking there's going to be a lot of people turning on MLAA while still having their MSAA options set, and thinking that's how great MLAA is, when in fact they're just running proper MSAA with MLAA on top of it...
 

Truespeed

Member
stuminus3 said:
Sorry, I think this looks like shit so far. I've tried to play some stuff that's really bright so I can see the jaggies/blurs clearly (stuff like OutRun 2006 that has very high contrast), and compared to straight 4x MSAA it looks like mud on my 1600x900 screen. I like my PC gaming crisp, whether it's aliased or not, and MLAA isn't crispy.

A reasonable solution for games that currently don't work with MSAA or have serious performance problems with MSAA? Sure, it might be a good enough replacement. I don't see it replacing more traditional AA methods though.

Note: my opinion may change as I play more games. But even a glance at the Steam overlay with MLAA on makes me want to puke. :lol

EDIT: just a thought, too... I'm thinking there's going to be a lot of people turning on MLAA while still having their MSAA options set, and thinking that's how great MLAA is, when in fact they're just running proper MSAA with MLAA on top of it...

Blame the implementation, not the technique. The MLAA on GOW3 is incredible. Incidentally, I'm the same way - I like my text and graphics crisp.
 

Wallach

Member
For what it's worth, I'm almost certain they are going to put a lot of effort into improving this because the potential benefits are enormous. It's not there yet, but I wouldn't be surprised to see them back it 100%. I look forward to revisiting it in the future.
 
so i think it looks pretty awesome on Dead Rising 2 and GTA4, oh and Batman: AA.

it makes my graphics card scream like it's overheating or not getting enough power though, but that could be something more general about these drivers.
 
MLAA is one of those common sense things that's been running through my head for years. "Instead of doing all this AA sampling during rendering, why not just blur edges after rendering?" Obviously not as simple as it sounds, but the results are showing some excellent promise. Loved MLAA in GoW3, hope someone manages to get it working on my 4770 (somehow). Hopefully with more work and refinement, both ATI and Nvidia will support it and we can do away with all of the overly costly forms of AA.
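The post-render idea above can be sketched in a few lines of Python. This is purely a toy illustration of the concept (find strong discontinuities in the finished image, then blend across them), not AMD's driver implementation, which classifies edge shapes and computes coverage-based blend weights:

```python
# Toy post-process edge smoothing on a 1-D grayscale scanline.
# Real MLAA works in 2-D, classifies edge patterns (L/Z/U shapes),
# and derives blend weights from estimated pixel coverage; this
# sketch only shows the core idea: the "AA" happens entirely after
# rendering, with no extra samples taken during rasterization.

def smooth_edges(pixels, threshold=0.5):
    out = list(pixels)
    for i in range(len(pixels) - 1):
        # An "edge" is a large luminance step between neighbours.
        if abs(pixels[i] - pixels[i + 1]) > threshold:
            # Blend both sides toward their average to soften the step.
            avg = (pixels[i] + pixels[i + 1]) / 2.0
            out[i] = (pixels[i] + avg) / 2.0
            out[i + 1] = (pixels[i + 1] + avg) / 2.0
    return out

# A hard black-to-white step gains intermediate values;
# flat regions pass through untouched.
scanline = [0.0, 0.0, 0.0, 1.0, 1.0, 1.0]
print(smooth_edges(scanline))  # → [0.0, 0.0, 0.25, 0.75, 1.0, 1.0]
```

Because the filter only sees the final image, it has no idea whether a high-contrast step is a polygon edge, a texture detail, or UI text, which is also why it can smear thin geometry and overlays, as several posts in this thread report.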
 

M3d10n

Member
TOAO_Cyrus said:
Or it could be the current implementation is a compute 5 shader because it is intended as a feature for the 6xxx cards and they haven't backported it yet.
Just like deferred rendering could be "backported" from DX9.0 to DX8.0, heh?

Without looking at the shader itself, I can't tell if it can be done in CS4 or not, but it's a given that CS5 is flexible enough that Intel's MLAA could probably be dropped on it without major changes to the technique (just like Sony did on the PS3's SPUs). Porting it to CS4 might, at the very least, require some re-thinking.

Anyway, at gamedev.net's forums someone came up with a DX9 MLAA-like effect (but it is more aggressive on textures), so they could go that route if they really wanted.
 

Konosuke

Member
I can't get this to work. I reinstalled several times and every time I use the OP's method the video driver doesn't install correctly. Help me GAF : <

Edit: I have a 5770 VaporX
 

sfried

Member
Okay ATi folks...should I install 10.10 or 10.10a hotfix?

I only have Catalyst 10.7 on a Mobility Radeon HD 4200 on Win 7 64-bit. Going to DriverSweep before I install, so I'm doing a backup.
 

Luigiv

Member
brain_stew said:
Time for a visit to the optician. The edge blurring in Dead Space is terrible.
Hmm, looking at it again now that I am no longer sleep deprived, you may be right. Still, the more important question remains: if AA with MRT is only a problem for DX9 as your OP suggests, and the MLAA solution only works on DX11-calibre GPUs, wouldn't running your games in DX10/11 modes make this solution redundant anyway?

Though going to see an optician isn't a bad idea. Haven't had my eyes checked in years despite the fact that my eyesight is very obviously weakening.
 

pestul

Member
Here are my findings..

S.T.A.L.K.E.R.: Call of Pripyat - didn't work but really slowed the game down
FC2 - didn't appear to work
Anno 1404 - again, didn't notice a difference
Crysis - didn't notice a huge difference; probably wasn't working

Dragon Age: Origins - looked great.. liked this one a lot
Assassins Creed II - looked great
Tomb Raider: Underworld - good again

Based on that pic above, I'm going to have to test Anno again. Oh, I should mention that the 10.10 series of drivers really improves Crysis performance.
 

gillty

Banned
sfried said:
Wait, so let me get this straight: Morphological Anti-Aliasing is only available in 10.10a, am I correct?
Technically the driver only supports MLAA on 6000 series cards; however, with 10.10a and modified settings files, MLAA also works on 5000 series cards.
 

Fafalada

Fafracer forever
brain_stew said:
Disappointing but not surprising. I thought it would fall apart with thin geometry and lo and behold it does.
To be fair that screenshot shows primarily texture-aliasing (and thus MSAA does nothing for it either outside of the power lines).

M3d10n said:
Just like deferred rendering could be "backported" from DX9.0 to DX8.0, heh?
Not sure why you'd need to "backport" anything - it can be implemented (and it was) just fine on DX8.
 
I have a 5870, I updated to 10.10 in Steam and then installed the hotfix from the link in the OP and restarted, but I am not seeing any option for MLAA, just the normal 3 modes that have always been there.

I also tried editing the registry entry mentioned in the OP to 0, but it was already set to 0.

Any idea what I am doing wrong?
 
SuicideUZI said:
I have a 5870, I updated to 10.10 in Steam and then installed the hotfix from the link in the OP and restarted, but I am not seeing any option for MLAA, just the normal 3 modes that have always been there.

I also tried editing the registry entry mentioned in the OP to 0, but it was already set to 0.

Any idea what I am doing wrong?
Might be a stupid question but did you reboot?
 

Extollere

Sucks at poetry
Meh, I usually play all my games with no AA, so it doesn't bother me anymore. Quick question though for those in the know... when screen resolutions (and game resolutions) start to get higher than 1080p, will AA even matter anymore? It seems to me that the higher the resolution, the less noticeable aliasing is.
 

sfried

Member
Valru said:
Technically the driver only supports MLAA on 6000 series cards; however, with 10.10a and modified settings files, MLAA also works on 5000 series cards.
So I probably shouldn't even bother with the hotfix since I only have a Mobility Radeon 4200? But someone said the hotfix is actually slightly improved. In that case do I still need to use the .ini or slipstream version or just go with the hotfix vanilla?
 

pestul

Member
Someone on Rage3d reported that enabling MLAA with vsync in game will disable triple buffering. That would probably explain some of the more brutal looking textures.
 

pestul

Member
Here's my Anno 1404. It looks like the Windows Snipping Tool degrades the image so much that it can give the false impression of AA. Honestly I couldn't tell the difference between MLAA on and no AA. Don't think it was working..

anno1404.jpg
 
Extollere said:
Meh, I usually play all my games with no AA, so it doesn't bother me anymore. Quick question though for those in the know... when screen resolutions (and game resolutions) start to get higher than 1080p, will AA even matter anymore? It seems to me that the higher the resolution, the less noticeable aliasing is.


I think it depends on the game, I play at 1920x1200 and some games without AA the jaggies stick out like a sore thumb, while others you have to pay close attention to notice them.

Either way I play at a high resolution and the difference between AA and no AA in general is pretty noticeable to me
 

pestul

Member
felipepl said:
Yes... running Crysis in DX9 allows MLAA to work.
Looks GORGEOUS and runs a lot better, amazing job AMD.
Only working in DX9 you say? Certainly plausible given the results we've gotten.. wait a tick, is Star Craft 2 DX11?
 
opticalmace said:
Might be a stupid question but did you reboot?


yeah, I was trying to find it by just right clicking on the ATI task manager icon; didn't realize you had to actually go into CCC and check "Morphological filtering"

it is kind of confusing because you'd think having application managed checked would not use any AA from CCC, but if you uncheck application managed then it's going to use whatever AA mode CCC is set to

It seems to be working for me in Starcraft II, but does not work at all in City of Heroes; think I will try Crysis next
 

Peterthumpa

Member
pestul said:
Only working in DX9 you say? Certainly plausible given the results we've gotten.. wait a tick, is Star Craft 2 DX11?
DX9.

I'm pretty sure that for now, MLAA is only working with DX9 titles.
 

M3d10n

Member
Luigiv said:
Hmm, looking at it again now that I am no longer sleep deprived, you may be right. Still, the more important question remains: if AA with MRT is only a problem for DX9 as your OP suggests, and the MLAA solution only works on DX11-calibre GPUs, wouldn't running your games in DX10/11 modes make this solution redundant anyway?
Running an "MRT" game in DX10/11 doesn't automatically grant AA.

1) These "MRT" games usually fill multiple buffers (render targets) with information other than colors, the most common being surface normals (aka: direction vectors) and depth (a distance value). This information is used in other parts of the rendering (depth can be used to apply fog, SSAO or depth of field and normals can be used for deferred lighting, decals, etc).

2) AA works by storing multiple colors for each pixel (the samples), which are then blended together to form the final pixel color which is displayed to the screen.

In DX9, the contents of a MSAA buffer can only be read after it has been "resolved" (aka: the samples for each pixel are blended together to form single pixels). So, if you render non-color information in a MSAA buffer and try to read from it later, the pixels at geometry edges will contain incorrect data: you cannot simply blend normals, depth or other custom data like you do with colors. Using this incorrect data will cause glitches on the edges.

DX10.1 and DX11 allow MSAA buffers to be accessed without blending the samples: the shaders can read the individual samples and work at sub-pixel level. This allows the shaders to read all samples of whatever data the buffer contains for each pixel, do the calculations, and blend the (color) results. However, this isn't free: MSAA buffers use more bandwidth (since they are 2x, 4x, 8x larger) and MRT rendering techniques already use more bandwidth than normal. Also, the same shader needs to run multiple times for a single pixel (on the edges). And all shaders need to be coded to make the MSAA work: it's no longer a simple flag the devs can turn on like MSAA was in DX8/9.
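The resolve problem described above can be sketched with a toy example. The depth values and the exponential fog function here are made up purely for illustration:

```python
import math

# A pixel on a geometry edge: one MSAA sample hits a near object,
# the other hits the far background (depths are view-space distances).
samples = [1.0, 100.0]

def fog(depth, density=0.05):
    # Simple exponential fog factor: 1.0 = no fog, 0.0 = fully fogged.
    return math.exp(-density * depth)

# DX9-style: the MSAA buffer is resolved (samples averaged) before a
# shader can read it, producing a depth that belongs to neither
# surface -- and fog computed from that depth is wrong on the edge.
resolved_depth = sum(samples) / len(samples)
fog_from_resolved = fog(resolved_depth)

# DX10.1/11-style: the shader reads each sample individually, shades
# it, and blends the resulting (color) values afterwards.
fog_per_sample = sum(fog(d) for d in samples) / len(samples)

print(resolved_depth, fog_from_resolved, fog_per_sample)
```

Fog computed from the pre-averaged depth (about 0.08, heavily fogged) is nowhere near the correct per-sample result (about 0.48), which is exactly the kind of edge glitch that appears when a DX9 game reads non-color data out of a resolved MSAA buffer.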

Now, it is very likely that upcoming DX10/11 games will start shipping their own MLAA implementations properly integrated into the game (applied before the UI is drawn, so it doesn't get blurred).
 

pestul

Member
felipepl said:
DX9.

I'm pretty sure that for now, MLAA is only working with DX9 titles.
I think you're right. I just read up a bit on the two games I saw the largest impact in (Dragon Age and Assassin's Creed II), apparently both are DX9 at the core.

For the record, I also didn't notice any difference in Metro 2033.
 