
AMD Fidelity FX Review - Featuring RAGE 2

thelastword

Member
Apr 7, 2006
8,160
1,994
1,380
Fidelity FX? Is it good or not? DLSS was a big talking point after RTX games were found sparse and Arizona-dry around launch... It was said that DLSS would provide an image just as sharp as native, or even sharper, at a fraction of the resolution and cost per frame... Yet it was proven that DLSS can take a bigger hit on framerate than traditional upscaling in some instances, whilst looking blurrier, jaggier and generally butchering detail like a Friday the 13th movie on speed...

So what is it? Is Fidelity FX better? Can it deliver a sharper image than native without butchering frames? See below for the first title to support the feature...

https://www.overclock3d.net/reviews/software/amd_fidelity_fx_review_-_featuring_rage_2/1

Fidelity FX OFF


Fidelity FX ON



Fidelity FX OFF


Fidelity FX ON



"In this second set of screenshots, we can see that AMD's Fidelity FX has a notable impact on the sharpness of RAGE 2's terrain, the player's weapon and on the textures for rocks and other debris. Again, this difference isn't huge, but it is more than enough to sell FidelityFX as a worthwhile feature."

-----------
Image is improved, not made worse, unlike DLSS... These are early days for Fidelity FX, though, and support is there day one, even before Navi's launch, unlike DLSS... Also, no proprietary bullshit here: it works on Nvidia cards, just as Radeon Rays will work on Nvidia and FreeSync works on Nvidia cards... As a matter of fact: "These screenshots were taken at 1080p on an Nvidia GeForce RTX 2080 Ti at Ultra settings with resolution scaling disabled." Yes, they used an NV card to test this feature, since Navi is still under NDA, so we will see how it performs on Radeon cards at launch...
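For anyone wondering what a post-process sharpener of this kind actually does: conceptually it is close to a classic unsharp mask, where each pixel is pushed away from the average of its neighbours. A toy sketch in Python/NumPy (illustrative only; this is not AMD's shader, and the function and array names are made up):

```python
import numpy as np

def unsharp_mask(img, amount=0.5):
    """Naive global sharpen: add back the difference between a pixel
    and the mean of its 4 neighbours, scaled by `amount`."""
    img = img.astype(np.float32)
    # mean of up/down/left/right neighbours (borders clamped via edge padding)
    padded = np.pad(img, 1, mode="edge")
    blur = (padded[:-2, 1:-1] + padded[2:, 1:-1] +
            padded[1:-1, :-2] + padded[1:-1, 2:]) / 4.0
    return np.clip(img + amount * (img - blur), 0.0, 255.0)

# A flat region is untouched; a pixel on an edge gets extra contrast.
flat = np.full((5, 5), 100.0)
edge = np.zeros((5, 5)); edge[:, 2:] = 200.0
print(unsharp_mask(flat)[2, 2])               # 100.0
print(unsharp_mask(edge)[2, 2] > edge[2, 2])  # True
```

FidelityFX CAS refines this basic idea by adapting the strength per pixel to local contrast, which is why it shows fewer halo artifacts than a fixed-strength filter like this one.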


Pic Slider Comparisons

https://cdn.knightlab.com/libs/juxtapose/latest/embed/index.html?uid=05336874-9da3-11e9-b9b8-0edaf8f81e27

https://cdn.knightlab.com/libs/juxtapose/latest/embed/index.html?uid=4c78fc62-9da3-11e9-b9b8-0edaf8f81e27
 

Al3x1s

Gold Member
Nov 24, 2018
2,071
1,431
475
Was all that really "proven" for DLSS? I recall some pretty sweet screenshot comparisons in the first supported games showing it basically looked like native 4K. Though it did mess with the image in some other games, I'd chalk that up to the implementation rather than the tech, since it worked right elsewhere, no? (Edit: I might be confusing this with the CAS feature, which appears to work better but offer less benefit.) Nice that this works on a variety of cards. So it's not really a GPU-selling feature, rather a cool technique they developed that games can implement regardless of the user's system? Like, say, TressFX in games like Tomb Raider? I don't get why this stuff couldn't be a checkbox in the GPU driver options rather than needing to be implemented by the game itself: just let the user choose between scaling methods, from stretching to no scaling to whatever reconstruction method the GPU company has. Ubisoft has its own reconstruction technique in its games, which works pretty decently in my experience too. But I love TAA-likes because of how they often eliminate motion shimmering; I don't care about jaggies in stills as much.
 

JohnnyFootball

The Last of Us may be third person, but it is hardly third person.
Jan 20, 2014
9,261
2,586
800
Was all that really "proven" for DLSS? I recall some pretty sweet screenshot comparisons in the first supported games showing it basically looked like native 4K for way less of a perf hit than that. Though it did mess with the image too much in some other games, I'd chalk that up to the implementation rather than the tech, since it worked right elsewhere, no? Nice that this works on a variety of cards. So it's not really a GPU-selling feature, rather a cool tool they developed that games can implement regardless of the user's system? Like, say, TressFX in games like Tomb Raider?
You recall incorrectly. DLSS was an epic fail. It made image quality noticeably worse; it looked like Vaseline was smeared on the screen.
Go to GamersNexus or Hardware Unboxed and you will see all the comparison videos where DLSS gets crapped on.

The conclusion for DLSS is that you get far better results merely by dropping the resolution. Doing that gets you the performance of DLSS with much less image-quality loss.

It's been several months since then, so it's possible it has improved by now.
 

Soltype

Member
Mar 30, 2015
1,773
417
410
You recall incorrectly. DLSS was an epic fail. It made image quality noticeably worse; it looked like Vaseline was smeared on the screen.
Go to GamersNexus or Hardware Unboxed and you will see all the comparison videos where DLSS gets crapped on.

The conclusion for DLSS is that you get far better results merely by dropping the resolution. Doing that gets you the performance of DLSS with much less image-quality loss.

It's been several months since then, so it's possible it has improved by now.
Yeah... it was crap... only one game used it decently: Metro Exodus...
 

JohnnyFootball

The Last of Us may be third person, but it is hardly third person.
Jan 20, 2014
9,261
2,586
800
Yeah... it was crap... only one game used it decently: Metro Exodus...
Disagree. I tried it both on and off. Dropping the resolution gave similar performance with less of an image-quality penalty.
 

LOLCats

Member
Sep 5, 2013
2,751
619
495
Nice to have this available for, I guess, all games? I want to say mods have been doing this for a while, but obviously on a per-game basis.

The FXAA tool for Skyrim offers sharpening features, for example, with little to no performance hit. Probably different methods, but I love that tool and it makes a noticeable difference.
 

Soltype

Member
Mar 30, 2015
1,773
417
410
Disagree. I tried it both on and off. Dropping the resolution gave similar performance with less of an image-quality penalty.
I don't doubt it... just going by what Hardware Unboxed said... they said Metro's use of it was decent... It's still crap in its current state... I don't see it getting better either...
 

Dr.D00p

Member
May 23, 2019
68
118
185
Disagree. I tried it both on and off. Dropping the resolution gave similar performance with less of an image-quality penalty.
My only experience of DLSS is with Anthem, and it works very well: native 4K at max settings averages about 40-50fps on my RTX 2080, and it's pretty much a locked 60 with DLSS enabled. Image quality with DLSS is still very good, and you'd have to look very hard to notice you're actually looking at an upscaled 1440p image. To me, running with DLSS on looked better than running Anthem at a non-native 1800p with DLSS off. I also think the kind of experience you'll have with DLSS is very much dependent on how good your monitor is at scaling non-native resolutions.
 

Whitecrow

Member
May 7, 2018
674
568
345
I don't think this could be a selling point.
You need some great eyes to see the differences in static images, let alone in motion...

I like it nonetheless.
 

Armorian

Member
Jan 17, 2018
785
578
360
Sharpening is required to combat shit implementations of TAA; Rage looked like a blurry mess when I played it (and was still aliased). Why can games like Shadow of War or AC Odyssey/Origins have TAA working great without smearing the image (the only downside is ghosting) while many others look like garbage?

Sharpening filters in ReShade do the same thing and work in all DX9/11 games (too bad they affect fonts). This one needs to be implemented in-game, so... developers won't give a fuck, as usual.
 

thelastword

Member
Apr 7, 2006
8,160
1,994
1,380
Sharpening is required to combat shit implementations of TAA; Rage looked like a blurry mess when I played it (and was still aliased). Why can games like Shadow of War or AC Odyssey/Origins have TAA working great without smearing the image (the only downside is ghosting) while many others look like garbage?

Sharpening filters in ReShade do the same thing and work in all DX9/11 games (too bad they affect fonts). This one needs to be implemented in-game, so... developers won't give a fuck, as usual.
Well, so far it's started on the right foot... I guess we can assume it's easy enough to implement, as Rage 2, a title already released, has it... It will be interesting to see if games which implement Fidelity FX from the ground up will have even better results... What's shown here is very promising as a first example...
 

thelastword

Member
Apr 7, 2006
8,160
1,994
1,380
Even OP didn't read the OP chuckle, so I can't blame you.

Tech is company agnostic, works on Radeons and Geforces alike.
Well, I did tell him what you just said: that the tech is not proprietary, and that if it works on a 2080 Ti it should work on a Vega 64... and it does... So, just to verify...

 

thelastword

Member
Apr 7, 2006
8,160
1,994
1,380
Whilst CAS (Contrast Adaptive Sharpening), under Fidelity FX, has to be implemented by the developer, who can manually tweak it and build their game with the feature from the ground up for greater results...

RIS (Radeon Image Sharpening) is an AMD-exclusive feature in the control panel which works on all games... You simply toggle it on/off... It just works...

Here, RIS is switched on in Strange Brigade and Shadow of the Tomb Raider...


https://hothardware.com/news/amd-radeon-rx-5700-image-sharpening-comparison


Strange Brigade
RIS OFF


RIS ON


Direct Comparison



Shadow of the Tomb Raider
RIS OFF


RIS ON


Direct Comparison



Performance Hit





 

Armorian

Member
Jan 17, 2018
785
578
360
Whilst CAS (Contrast Adaptive Sharpening), under Fidelity FX, has to be implemented by the developer, who can manually tweak it and build their game with the feature from the ground up for greater results...

RIS (Radeon Image Sharpening) is an AMD-exclusive feature in the control panel which works on all games... You simply toggle it on/off... It just works...

Here, RIS is switched on in Strange Brigade and Shadow of the Tomb Raider...


https://hothardware.com/news/amd-radeon-rx-5700-image-sharpening-comparison


Strange Brigade
RIS OFF


RIS ON


Direct Comparison



Shadow of the Tomb Raider
RIS OFF


RIS ON


Direct Comparison



Performance Hit
Now this shit is quite great, the best Navi feature so far.

I wonder if Navi 20 will really be any different in hardware compared to Navi 10 (aside from more CUs); maybe current GPUs are as RT-friendly as the console versions (and N20) will be, and just the software needed for RT implementation in games will be ready next year.
 

Remij

Member
Apr 23, 2009
2,187
158
835
lmao... yea, that's a sharpening filter.....

You can do this shit on NV hardware with no performance hit too. 🤷‍♂️
 

thelastword

Member
Apr 7, 2006
8,160
1,994
1,380
lmao... yea, that's a sharpening filter.....

You can do this shit on NV hardware with no performance hit too. 🤷‍♂️
Yeah, damn right it is, and it works well... It just works...

FYI, RIS is exclusive to Radeon; it's in the GPU dashboard and works on hundreds/thousands of games day one... Only supported on Navi cards atm...


CAS/Fidelity FX is implemented by the developer; they can optimize and develop their games with CAS in mind from the ground up, so we might see some interesting results, as was shown in the Unity demo at E3... Fidelity FX is developer-focused, it's not proprietary, and it is available on all cards... including Nvidia.
 
Jun 26, 2013
3,279
1,513
635
lmao... yea, that's a sharpening filter.....

You can do this shit on NV hardware with no performance hit too. 🤷‍♂️
If RIS > DLSS and Nvidia's sharpening filter = RIS, then Nvidia's sharpening filter > DLSS.

Why then, did Nvidia push DLSS in the first place if, to paraphrase you, you can simply do the same thing RIS does on Nvidia hardware?
 

Remij

Member
Apr 23, 2009
2,187
158
835
If RIS > DLSS and Nvidia's sharpening filter = RIS, then Nvidia's sharpening filter > DLSS.

Why then, did Nvidia push DLSS in the first place if, to paraphrase you, you can simply do the same thing RIS does on Nvidia hardware?
Because this isn't doing what DLSS is doing. Take into consideration that Hardware Unboxed even admits that DLSS (with a base 1440p, upscaled) looks better than AMD's FidelityFX at 1440p. If you compare them at the same resolutions, then DLSS can look better. But it has its drawbacks as well. It completely depends on the implementation... which, admittedly, DLSS has to improve. BF5 looks terrible and was even worse on release. Metro is one of the better test cases. Anthem is pretty decent as well.

The biggest downside of DLSS, as Hardware Unboxed rightfully said, is that there's a fixed performance hit from the tensor cores. So it's costing you more performance to get arguably worse (or better) results. That said... it doesn't mean the technology itself is bad... it's just that at the moment it's not optimal. They definitely have to improve their algorithms and implementations... however, in the future, with more tensor cores, they could potentially lower the cost even further.

FidelityFX has its benefits. But at the end of the day, it's a simple sharpening filter. Nvidia has Freestyle, which can also do a lot of other stuff, and of course there's always ReShade.

The Monster Hunter World DLSS patch is coming next week, so it will be interesting to see how that performs. They can't really afford to fuck up their DLSS implementations like they did with BF5 and Metro in the beginning. Now, every game that supports it is going to be compared immediately to FidelityFX, so it's going to look even worse for Nvidia if it continues to look like shit at launch.
 

Remij

Member
Apr 23, 2009
2,187
158
835
Yeah, damn right it is, and it works well... It just works...

FYI, RIS is exclusive to Radeon; it's in the GPU dashboard and works on hundreds/thousands of games day one... Only supported on Navi cards atm...


CAS/Fidelity FX is implemented by the developer; they can optimize and develop their games with CAS in mind from the ground up, so we might see some interesting results, as was shown in the Unity demo at E3... Fidelity FX is developer-focused, it's not proprietary, and it is available on all cards... including Nvidia.
From what I understand, RIS is in fact CAS. There's no difference between them. The only difference is that developers can implement CAS in their games, customize it for their game, and it will work on both AMD and Nvidia GPUs.

RIS is essentially a generic, driver-wide implementation, not customized for any specific game.

Let's be real for a second, though... it only works on DX12, Vulkan, and DX9... That's not a whole shitload of games...

DX11 or bust.
 
Jun 26, 2013
3,279
1,513
635
Because this isn't doing what DLSS is doing. Take into consideration that Hardware Unboxed even admits that DLSS (with a base 1440p, upscaled) looks better than AMD's FidelityFX at 1440p. If you compare them at the same resolutions, then DLSS can look better. But it has its drawbacks as well. It completely depends on the implementation... which, admittedly, DLSS has to improve. BF5 looks terrible and was even worse on release. Metro is one of the better test cases. Anthem is pretty decent as well.

The biggest downside of DLSS, as Hardware Unboxed rightfully said, is that there's a fixed performance hit from the tensor cores. So it's costing you more performance to get arguably worse (or better) results. That said... it doesn't mean the technology itself is bad... it's just that at the moment it's not optimal. They definitely have to improve their algorithms and implementations... however, in the future, with more tensor cores, they could potentially lower the cost even further.

FidelityFX has its benefits. But at the end of the day, it's a simple sharpening filter. Nvidia has Freestyle, which can also do a lot of other stuff, and of course there's always ReShade.

The Monster Hunter World DLSS patch is coming next week, so it will be interesting to see how that performs. They can't really afford to fuck up their DLSS implementations like they did with BF5 and Metro in the beginning. Now, every game that supports it is going to be compared immediately to FidelityFX, so it's going to look even worse for Nvidia if it continues to look like shit at launch.
Um... FidelityFX and RIS are separate features.

Also, simple sharpening filters sharpen everything on screen, whereas RIS selectively sharpens some objects but not others, as mentioned by Hardware Unboxed.

RIS is something that can be applied to any Vulkan, DX9, or DX12 game without the developer needing to tweak anything, whereas DLSS depends on whether the developer implements it. As for 1440p RIS vs. 1440p DLSS, Hardware Unboxed also mentioned that DLSS has a larger performance hit, so you're getting a slightly better image while sacrificing significant performance.
 

vpance

Member
Jun 20, 2005
7,044
274
1,380
Developers have been doing shader-based sharpening filters since 360/PS3. It's not something new.
Not something new, but if it's not "built in", then not every dev will think to add it.

If next gen is going to be filled with CBR 4K games, this should be mandatory.
 

thelastword

Member
Apr 7, 2006
8,160
1,994
1,380
Because this isn't doing what DLSS is doing. Take in consideration that Hardware Unboxed even admits that DLSS (with a base 1440p upscaled) looks better than AMD's FidelityFX at 1440p. If you compare them at the same resolutions then DLSS can look better. But it has it's drawbacks as well. It completely depends on the implementation... to which, admittedly DLSS has to improve. BF5 looks terrible and was even worse on release. Metro is one of the better test cases. Anthem is pretty decent as well.

The biggest downside of DLSS as Hardware Unboxed rightfully said, is that there's a fixed performance hit from the tensor cores. So it's costing you more performance to get arguable worse/better results. That said.. it doesn't mean the technology itself is bad... it's just that at the moment it's not optimal. They definitely have to improve their algorithms and implementations.. however in the future with more tensor cores they could potentially lower the cost even further.

FidelityFX has its benefits. But at the end of the day, it's a simple sharpening filter. Nvidia has freestyle which can also do a lot of other stuff, and of course there's always reshade.

Monster Hunter World DLSS patch is coming next week so it will be interesting to see how that performs. They can't really afford to fuck up their DLSS implementations like they did with BF5 and Metro in the beginning. Now, every game that supports it is going to be compared immediately to FidelityFX, so it's going to look even worse for Nvidia if it continues to look like shit in the beginning.
What exactly is DLSS doing? DLSS was supposed to mostly improve IQ; remember, it was pitched as a better AA solution than TAA. The lie detector proved that was a lie, because all DLSS did was add more aliasing to the image, especially on background objects, only with lots of blur on top. All you have to do is go look at the feature in an NV-favoured game called FF15: severe shimmering, lots of blur, loss of detail everywhere as it smudges the image. What a recipe indeed... DLSS is actually an antonym of GIQ (good image quality)... DLSS is like being promised Halle Berry for prom in 1990 and then Whoopi Goldberg shows up...

From what I understand, RIS is in fact CAS. There's no difference between them. The only difference is that developers can implement CAS in their games, customize it for their game, and it will work on both AMD and Nvidia GPUs.

RIS is essentially a generic, driver-wide implementation, not customized for any specific game.

Let's be real for a second, though... it only works on DX12, Vulkan, and DX9... That's not a whole shitload of games...

DX11 or bust.
RIS is based on CAS, but Fidelity FX takes it a step further: more can be done when it's built into a game from the ground up... The post-processing pipeline can be tinkered with, and shader passes reduced, while keeping high-quality PP effects. The Fidelity FX suite contains CAS for sure, and it also supports FP16 and other effects... The whole draw is to improve the image whilst reducing the load, or at least keeping the hit minimal... If Fidelity FX and RIS say anything to you, it's that they bode well for Radeon Rays: high-quality implementation without a huge hit, or a hit reduced via FP16, PP, GPGPU, etc...

And of course, like clockwork, DX11 is suddenly the most important API ever. But you should remember that there are many DX12 titles available and even more DX9 titles, plus a few Vulkan titles... I suspect there are more supported Vulkan titles than there are DLSS games available, but I guess it's still not enough RIS games, right? AMD can never do enough, even when Nvidia's output there is paper-thin... ten months in...

Also, I'm pretty sure DX11 support will come, but they need to work on it, since DX11 always had a bigger hit on AMD hardware... I'm pretty sure they just don't want the hit in DX11 titles to be much more than in DX12...
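On the FP16 point being argued here: half precision simply halves the bytes per value, which is where the bandwidth and storage savings in a post-processing pass would come from. A trivial, hypothetical illustration in Python/NumPy (nothing AMD-specific; the array names are made up):

```python
import numpy as np

frame32 = np.zeros((2160, 3840), dtype=np.float32)  # one 4K plane of values, FP32
frame16 = frame32.astype(np.float16)                # the same plane in FP16

print(frame32.nbytes // 1024 // 1024)  # 31 (MiB)
print(frame16.nbytes // 1024 // 1024)  # 15 (MiB), half the memory traffic
```

The trade-off is precision: FP16 keeps only ~11 bits of mantissa, which is generally fine for post-process filters but not for every computation.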
 

Remij

Member
Apr 23, 2009
2,187
158
835
RIS is based on CAS, but Fidelity FX takes it a step higher where more can be done when done ground up in games........The post processing pipeline is tinkered with, shader passes are reduced with high quality PP effects. The fidelity FX suite contains CAS for sure, it also contains FP16 and Effects......Yet, the whole draw is to improve the image whilst reducing the load or making the hit minimal.......If Fidelity FX and RIS says anything to you, is that it bodes well for Radeon Rays...High quality implementation without a huge hit or reduced with FP16, PP, GPGPU etc....
You don't even know what the hell you're talking about. FidelityFX also "contains FP16"... and effects?

FidelityFX can utilize FP16 to slightly reduce the performance impact... it's not like it "contains FP16". FidelityFX and RIS say nothing to me other than... "We have a sharpening filter." LOL

Oh and:

 

thelastword

Member
Apr 7, 2006
8,160
1,994
1,380
From what I understand, RIS is in fact CAS. There's no difference between them. The only difference is that developers can implement CAS in their games, customize it for their game, and it will work on both AMD and Nvidia GPUs.

RIS is essentially a generic, driver-wide implementation, not customized for any specific game.

Let's be real for a second, though... it only works on DX12, Vulkan, and DX9... That's not a whole shitload of games...

DX11 or bust.
Yes, RIS is derived from CAS... CAS is a tool under Fidelity FX; end users don't have access to CAS... Now, with CAS, the developer can actually tinker with how sharpening is done in their game and enhance scenes with it; unlike with AMD's locked system, they can go for broke. They also have access to the high-quality PP effects pipeline that reduces shader passes, and they can use FP16 and enhance TAA, all within the Fidelity FX suite... RIS, meanwhile, is AMD's own proprietary sharpening tool, but it's locked for everyone; it just works, and by the looks of it, it works well...


Some Quotes


"Following the Radeon RX 5700 series launch, AMD has now open-sourced their Contrast Adaptive Sharpening (CAS) technology under FidelityFX on GPUOpen.

Contrast Adaptive Sharpening provides sharpening and optional scaling and is implemented as HLSL and GLSL shaders for Direct3D and Vulkan. CAS is designed to provide better sharpness with fewer artifacts and to increase the quality of temporal anti-aliasing."
https://www.phoronix.com/scan.php?page=news_item&px=AMD-GPUOpen-FidelityFX-CAS
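The "contrast adaptive" idea in the quote above can be sketched roughly: derive a per-pixel sharpening weight from the local min/max range, so that neighbourhoods that are already high-contrast or near-saturated receive less of the negative-lobe kernel. Below is a toy NumPy approximation of that published idea, not AMD's actual HLSL/GLSL shader (the function name and constants are made up for illustration):

```python
import numpy as np

def cas_like_sharpen(img, strength=0.8):
    """Toy contrast-adaptive sharpen on a greyscale image in [0, 255].
    The sharpening weight shrinks where the local min/max range shows
    little headroom, which limits halo and clipping artifacts."""
    img = img.astype(np.float32) / 255.0
    p = np.pad(img, 1, mode="edge")
    up, down = p[:-2, 1:-1], p[2:, 1:-1]
    left, right = p[1:-1, :-2], p[1:-1, 2:]
    lo = np.minimum.reduce([img, up, down, left, right])
    hi = np.maximum.reduce([img, up, down, left, right])
    # adaptive amount: headroom below the local min and above the local max
    amt = np.sqrt(np.clip(np.minimum(lo, 1.0 - hi) / np.maximum(hi, 1e-5), 0.0, 1.0))
    w = -amt * strength / 5.0                     # negative ring weight
    out = (img + w * (up + down + left + right)) / (1.0 + 4.0 * w)
    return np.clip(out, 0.0, 1.0) * 255.0

step = np.full((5, 5), 100.0)
step[:, 3:] = 200.0                               # a step edge
out = cas_like_sharpen(step)
# the dark side of the edge gets darker, the bright side brighter,
# while flat regions pass through unchanged
```

Note how a flat area is a fixed point of the filter (the numerator and denominator cancel), which is the behaviour a driver-wide toggle like RIS needs to avoid amplifying noise in smooth regions.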


"As of now, however, RIS is only available on AMD’s new Radeon RX 5700-series graphics cards, and doesn’t yet include support for DX11. AMD says it’s only a matter of resources, but Radeon Image Sharpening is currently only available for DX9, DX12, and Vulkan.

AMD has taken its DLSS solution even further. It also announced FidelityFX, a open-source developer toolkit that could have an even wider impact on future games. The most significant tool in the kit is CAS (Contrast Adaptive Sharpening), which is based on a similar set of technology as Radeon Image Sharpening.
Unlike Radeon Image Sharpening, Contrast Adaptive Sharpening must be implemented individually by games, but it’s all completely open source and free for developers to use through AMD’s GPUopen.com. It’ll even work on Nvidia cards, according to AMD.

The features in FidelityFX give developers more control over in-game visuals, tweaking sharpness for ultimate results. It’s even possible to give players manual control for custom results. Radeon Image Sharpening, by contrast, is an on/off switch that can’t be customized."

https://www.digitaltrends.com/computing/amd-radeon-image-sharpening-dlss-ray-tracing-e3-2019/

------------
Just read the whole article, it's very good... and 1440p with RIS or Fidelity FX on looks much better than DLSS when you enable GPU scaling in the Radeon graphics menu... These tools have lots of benefits for the end user...
 

Remij

Member
Apr 23, 2009
2,187
158
835
Just read the whole article, it's very good... and 1440p with RIS or Fidelity FX on looks much better than DLSS when you enable GPU scaling in the Radeon graphics menu... These tools have lots of benefits for the end user...
1440p RIS doesn't look better than DLSS rendered internally at 1440p... Hardware Unboxed even said as much.

I guess it could depend on how good the DLSS implementation is though on a per game basis. Some games might look as good, some might look better, and some might look worse.
 

thelastword

Member
Apr 7, 2006
8,160
1,994
1,380
1440p RIS doesn't look better than DLSS rendered internally at 1440p... Hardware Unboxed even said as much.

I guess it could depend on how good the DLSS implementation is though on a per game basis. Some games might look as good, some might look better, and some might look worse.
I don't necessarily agree with that. I know you would hang onto that, but do some more looking; look at some of the videos I posted in this thread... especially the EPOSVOX video. These guys do video editing for a living and have a wealth of knowledge on the subject... FYI, I also didn't agree with Tim when he said that 1440p DLSS looks better than 1440p upscaled, primarily because DLSS smudges the detail more than the upscaled image: you actually lose detail when the AI tries to reconstruct the scene, whilst the upscaled image maintains all detail relative to native, just not as crisp. So I find it odd that he said a sharpened 1440p image that enhances detail looks worse than 1440p DLSS... When EPOSVOX was comparing 1440p RIS to native 4K, he said that unless you're looking quite closely it looks very close... The video clearly shows Epos to be right...

Also, if you go back to when Tim did the Metro video, I agreed with him on some aspects and not on others... I mostly agreed, of course, but I felt he was still trying to throw Nvidia a bone, so they would not sack them whole... lol...

In any case, if you have doubts about what I'm saying, revisit those old threads... I posted lots of PNGs of the Infiltrator demo and even FF15... In the Infiltrator demo you lost lots of light sources in the background; in FF15, not only did you lose foreground detail, but the backgrounds became a shimmery mess at 1440p DLSS... There's no way a sharpened image, where no reconstruction is being guessed in, is not more accurately detailed and sharper than a 1440p DLSS image. DLSS guesses away image information and also makes the image blurry; it was not even close to how well checkerboarding did it. Remember, there is little to no artefacting with RIS... 1440p RIS vs 1440p DLSS: RIS will always win in sharpness, maintaining the image composition and, of course, losing no detail... All it does is enhance the low-contrast areas so you can see more detail (in essence, detail embossment); there's no way that loses to DLSS at any resolution...


Yet I can guarantee you there will be more comparisons coming up... I'll try to do some PNG comparisons myself... The 1440p comparisons will be done, I guarantee you...
 
Jun 26, 2013
3,279
1,513
635
1440p RIS doesn't look better than DLSS rendered internally at 1440p... Hardware Unboxed even said as much.

I guess it could depend on how good the DLSS implementation is though on a per game basis. Some games might look as good, some might look better, and some might look worse.
Doesn't look better by what magnitude? Does the better image quality of 1440p DLSS cancel out the higher performance penalty?
 

Remij

Member
Apr 23, 2009
2,187
158
835
Yet, I can guarantee you there will be more comparisons coming up.....I'll try to do some PNG comparisons myself.....The 1440p comparisons will be done, I guarantee you...
For sure there will be. And I honestly hope it pushes Nvidia to make DLSS better. Trust me, I recognize that DLSS has some major flaws in comparison to a sharpening filter that can be applied to any game (once AMD adds DX11 support)... By comparison, DLSS has to be implemented by developers, it comes with a fixed reduction in performance, and it only works at specific resolutions, and even then only at specific resolutions on specific cards. Those are huge barriers they have to figure out. RIS will always be superior in that it works at any resolution, in any game, on any Navi card... with little performance impact.

One of the advantages of DLSS is that it can provide a very temporally stable image. Those "fuller" grass blades and branches, among other aspects, can look nice at times. There's also the way DLSS can reduce or remove dithering artifacts, like in FF15: regardless of the TAA implementation being garbage in that game, DLSS does an amazing job of completely removing the terrible dithering.

At the end of the day, though... it's a simple sharpening shader. We've had them for a long time. Watching people and the press act like this hasn't been possible before is just meh for me.

If this being branded as some AMD technology forces Nvidia to improve DLSS or implement other things, then that's great for everyone.
 

Remij

Member
Apr 23, 2009
2,187
158
835
Doesn't look better at what magnitude? Does the better image quality on 1440p DLSS cancel out the higher performance penalty?
I suppose if it looks better and still performs better on Nvidia GPUs... then yes.

If not.. then use Freestyle or Reshade and apply a slight sharpening filter.. you have that option 🤷‍♂️
 
Jun 26, 2013
3,279
1,513
635
I suppose if it looks better and still performs better on Nvidia GPUs... then yes.

If not.. then use Freestyle or Reshade and apply a slight sharpening filter.. you have that option 🤷‍♂️
But that's not what I was asking about. DLSS has a higher performance penalty than RIS, so you can't have both.
 

Remij

Member
Apr 23, 2009
2,187
158
835
But that's not what I was asking about. DLSS has a higher performance penalty than RIS, so you can't have both.
Huh?

It doesn't matter if DLSS has a higher performance impact than RIS. If Nvidia running DLSS looks better and still runs faster than RIS on AMD... who would care?

Like I said.. even if it does perform or look worse... at that point you can do exactly what RIS does on Nvidia hardware as well using Reshade or Freestyle.
 
Jun 26, 2013
3,279
1,513
635
It doesn't matter if DLSS has a higher performance impact than RIS.

If Nvidia running DLSS looks better and still runs faster than RIS on AMD... who would care?
You already know my answer to that based on my point about DLSS having a higher performance penalty than RIS's. Not sure why you even asked this question in the first place.

Like I said.. even if it does perform or look worse... at that point you can do exactly what RIS does on Nvidia hardware as well using Reshade or Freestyle.
And how much performance can you gain from Reshade or Freestyle? Or does performance take a further hit on top of DLSS's already worse performance?
 

Remij

Member
Apr 23, 2009
2,187
158
835
You already know my answer to that based on my point about DLSS having a higher performance penalty than RIS's. Not sure why you even asked this question in the first place.


And how much performance can you gain from Reshade or Freestyle? Or does performance take a further hit on top of DLSS's already worse performance?
Dude... what the fuck are you even talking about?
 
Jun 26, 2013
3,279
1,513
635
Dude... what the fuck are you even talking about?
I would ask you the same thing. Let's analyze how inept these two sentences are in sequence, shall we?
It doesn't matter if DLSS has a higher performance impact than RIS. If Nvidia running DLSS looks better and still runs faster than RIS on AMD... who would care?
The first sentence claims that DLSS having a higher performance impact than RIS does not matter. It is then followed up with the insinuation that no one would care if DLSS looks and performs better than RIS.

This contradicts the fact that a significant portion of gamers choose PC in the first place for performance. But I guess performance goes out the window in this specific situation. The cherry-picking becomes even more apparent when you fail to answer my questions. There's nothing harmful about asking whether Reshade and Freestyle incur a performance hit or actually boost performance.

But let me guess, they do not boost performance based on your dodging...
 

Remij

Member
Apr 23, 2009
2,187
158
835
I would ask you the same thing. Let's analyze how inept these two sentences are in sequence, shall we?

The first sentence claims that DLSS having a higher performance impact than RIS does not matter. It is then followed up with the insinuation that no one would care if DLSS looks and performs better than RIS.

This contradicts the fact that a significant portion of gamers choose PC in the first place for performance. But I guess performance goes out the window in this specific situation. The cherry-picking becomes even more apparent when you fail to answer my questions. There's nothing harmful about asking whether Reshade and Freestyle incur a performance hit or actually boost performance.

But let me guess, they do not boost performance based on your dodging...


I mean... 2070 4K DLSS (1440p base res) looks better than the 5700XT at 1440p+Sharpening. Hardware Unboxed said that. Of course, the performance hit of DLSS brings it well below what standard 1440p+sharpening does on the 5700XT... fair enough... but DLSS still looks better.

So, to make RIS look better than DLSS, they run the 5700XT at 4K with a 70% internal rendering resolution+sharpening. So now it looks better... but look at the minimum framerates...

Running the 2070 at native 4K gives average fps damn near identical to the 5700XT running at 70% of 4K resolution... but much, much better min fps.

Running the 2070 at 4K with a 70% scale performs WAY better than the 5700XT at the same. Look at the min fps for crying out loud! :pie_tears_joy:
 
Jun 26, 2013
3,279
1,513
635


I mean... 2070 4K DLSS (1440p base res) looks better than the 5700XT at 1440p+Sharpening. Hardware Unboxed said that. Of course, the performance hit of DLSS brings it well below what standard 1440p+sharpening does on the 5700XT... fair enough... but DLSS still looks better.
But how much better? I've already asked you about the magnitude, but you've never answered that question. If the degree to which DLSS looks better matched the magnitude of the performance hit, then you'd have a case. However, "looks better" isn't informative enough on its own. How much better?

Also, taking one game and then extrapolating that across all games is a massive leap.

So, to make RIS look better than DLSS, they run the 5700XT at 4K with a 70% internal rendering resolution+sharpening. So now it looks better... but look at the minimum framerates...

Running the 2070 at native 4K gives average fps damn near identical to the 5700XT running at 70% of 4K resolution... but much, much better min fps.

Running the 2070 at 4K with a 70% scale performs WAY better than the 5700XT at the same. Look at the min fps for crying out loud! :pie_tears_joy:
And you're basing this on one game and assuming it's the case for all games. Look at how the 5700XT's 4K performance in Metro Exodus is worse than the 2070's. Based on this figure, though, that ends up being more the exception than the rule:

 

Remij

Member
Apr 23, 2009
2,187
158
835
But how much better? I've already asked you about the magnitude, but you've never answered that question. If the degree to which DLSS looks better matched the magnitude of the performance hit, then you'd have a case. However, "looks better" isn't informative enough on its own. How much better?

Also, taking one game and then extrapolating that across all games is a massive leap.


And you're basing this on one game and assuming it's the case for all games. Look at how the 5700XT's 4K performance in Metro Exodus is worse than the 2070's. Based on this figure, though, that ends up being more the exception than the rule:
It's all subjective... To some people, TAA and a slightly blurrier but more temporally stable image looks better than a sharper, more aliased one... Hardware Unboxed said one looks better than the other... it's up to each individual person to decide if it's worth the performance hit.

In the Metro example I gave, on the 5700XT... is running the game at 70% of 4K + RIS and losing almost 50% of your framerate worth the visual improvement over 1440p+RIS?? The only person who can decide that is you. But in this case, they said they thought DLSS looks better than RIS at 1440p.

And I never took one game and extrapolated it across all games... I took an example that was compared and spoke about it. In fact, I said as much in a post above:

1440p RIS doesn't look better than DLSS rendered internally at 1440p... Hardware Unboxed even said as much.

I guess it could depend on how good the DLSS implementation is though on a per game basis. Some games might look as good, some might look better, and some might look worse.
And no.. I'm not assuming this is the case for all games. I never said that was the case. In that quote I'm specifically talking about Metro Exodus.

Again, since I made a post about it in the other thread but never did here, I'll say it here... The games they've tested have internal scalers... so they can actually set the resolution to something between 1440p and 4K on AMD cards. They're testing games at ~70% of 4K resolution. Most games don't have the ability to manually adjust the internal resolution. So in games which don't have that option.. you're stuck with either the full performance hit of native 4K... or dropping the res to 1440p. So to get that "looks like 4K but runs much better" talking point.. the game has to support the ability to scale the internal resolution.
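For what it's worth, the "70% of 4K" talking point comes straight down to pixel counts: the scale applies per axis, so the rendering cost falls with the square of the scale. A quick back-of-the-envelope check (plain Python; the resolutions are the standard 4K and 1440p figures, not anything Hardware Unboxed published):

```python
def pixel_count(width, height, scale=1.0):
    """Pixels actually rendered at a given internal resolution scale.
    The scale is per axis, so pixel cost falls with scale**2."""
    return round(width * scale) * round(height * scale)

native_4k = pixel_count(3840, 2160)        # 8,294,400 pixels
scaled_70 = pixel_count(3840, 2160, 0.7)   # 2688x1512 = 4,064,256 pixels
native_1440 = pixel_count(2560, 1440)      # 3,686,400 pixels
```

So a 70% scale of 4K renders 49% of the pixels of native 4K, while still pushing about 10% more pixels than native 1440p — which lines up with it looking better than 1440p+sharpening but costing noticeably more to render.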
 