
Nvidia's DLSS 2 vs. AMD's FSR 2: The Ultimate Analysis

MrTroubleMaker

Gold Member
FSR works on all Nvidia GPUs from the 10XX series onward. Unless your GPU is super old, you should be good.
Thanks, then it must be broken in Elite Dangerous; I have a 4080. I'm just using the game's other "default" scaler option and it works, but I would have liked to use the better solution.
 

YCoCg

Member
So quick question: why are people even arguing over this and picking "sides"? It's good tech, the competition pushes both to improve, and that only benefits gamers in the end.
 

OmegaSupreme

advanced basic bitch
DLSS is objectively superior and likely always will be. That makes choosing cards easy, not to mention all the other things Nvidia has going for it.
 

yamaci17

Member
DLSS is amazing. Even Performance mode is leagues ahead of FSR Quality (1440p upscaled to 4K), native 1440p, or regular upscalers that upscale from 1440p to 4K.

Clueless people brush it off saying "lol, you play at 1080p, even consoles do 1440p".

4K DLSS Performance (internal 1080p) demolishes native 1440p.

It's just no competition. The per-pixel quality is so much better than regular upscalers/TAA that it puts RTX cards on another level. It really makes a direct 1-to-1 hardware comparison moot when you can get better image quality at a 1080p render resolution with DLSS than a console rendering at 1440p.

Look at this mess. Even FSR Quality at 4K is hit and miss, whereas 4K DLSS Performance has been consistently amazing for me.

Obviously this is near impossible to break down into statistics.
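For reference, here is a rough sketch of the internal render resolutions behind those modes, using the commonly cited DLSS scale factors (assumed values for illustration, not figures from the video):

```python
# Commonly cited DLSS scale factors (assumptions for illustration; exact ratios
# can vary by game and DLL version).
DLSS_SCALE = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5, "Ultra Performance": 1 / 3}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Return the approximate internal render resolution for a DLSS mode."""
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

print(internal_resolution(3840, 2160, "Performance"))  # -> (1920, 1080)
# Pixel counts behind the "4K Performance vs native 1440p" comparison:
print(1920 * 1080, "pixels rendered with 4K DLSS Performance vs", 2560 * 1440, "at native 1440p")
```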
 

Kilau

Gold Member
When I see the DLSS option:

[hell yeah GIF]


When I see the FSR option:

[captain-america-no.gif]
 

MikeM

Member
This won't age well.
Nintendo is going to produce a tablet that outperforms the PS5 and Series X|S in image quality, thanks to DLSS.
Pass me whatever you're on, my man.

Back on topic: DLSS is superior. That said, FSR looks good when playing at 4K. Not as good as DLSS, but close enough in most games.
 

PeteBull

Member
So quick question: why are people even arguing over this and picking "sides"? It's good tech, the competition pushes both to improve, and that only benefits gamers in the end.
Simply because DLSS/FSR affects purchasing decisions. For myself and other people who went with Nvidia, it's a valid feature that makes a card last longer, because it gives tons more performance for barely any loss in image quality. Think of it like turning settings from max/ultra down to high. For people who only have access to FSR, it's more like going from ultra to medium: you get tons more fps, but the downgrade is very noticeable in most scenarios, i.e. at 1440p and 1080p.

Contrary to popular belief, tons of AMD customers are 1080p/1440p monitor users rather than 4K users, since historically AMD cards were never the go-to choice for 4K; that was the top/high end of Nvidia's lineup, going back to the first GTX Titan in early 2013.
 

Tarin02543

Member
Death Stranding DC DLSS looks incredible on my 768p plasma

Does DLSS get better with each Nvidia driver update or is it tied to the game itself?
 

Kenpachii

Member
It doesn't get better with drivers, but you can replace the DLSS version in a game with a newer one by swapping a file in the game's directory: https://www.techpowerup.com/download/nvidia-dlss-dll/
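A minimal sketch of that manual swap, assuming a hypothetical game folder and a DLL downloaded from the page above (the file is usually named nvngx_dlss.dll, though some games keep it in a subfolder):

```python
# Back up the game's bundled DLSS DLL and drop in a newer one.
# Paths are hypothetical examples; adjust for your own install.
import shutil
from pathlib import Path

game_dir = Path(r"C:\Games\SomeGame")           # hypothetical game folder
new_dll = Path(r"C:\Downloads\nvngx_dlss.dll")  # newer DLL from TechPowerUp

target = game_dir / "nvngx_dlss.dll"
shutil.copy2(target, target.with_suffix(".bak"))  # keep the original as a backup
shutil.copy2(new_dll, target)                     # overwrite with the newer version
print("Swapped DLSS DLL; restore the .bak copy if the game misbehaves.")
```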

Pretty sure I read that Nvidia is going to let games update to the newest DLSS automatically through their drivers at some point.

But yeah, the main reason I bought a 3080 over a 1080 Ti was DLSS. My ultrawide only renders a bit more than 1080p with DLSS Quality in Control and fills an entire ultrawide 1440p screen with it, and it looks even better than native. That alone made it worth it.
 

ToTTenTranz

Banned
DLSS is great, and popular RTX cards like the 3070 and 3060 Ti are even better.

Or they would be, if they could play all those 2023 games that require 10GB+ of VRAM.
I guess some users really do enjoy getting fucked over by their favorite corporation's planned obsolescence.
You know, like DLSS 3 only working on the RTX 4000 series because of reasons, while FSR 3 works on virtually anything.

It's time to give up a kidney to get a 4070 Ti, boys. It'll last a whole other year before those 12GB run out.

EDIT: just to clarify, I have nothing against the test and HUB's conclusions. It's the sticky and smelly corporate fanboying that is puke material.
I hope some people are at least getting paid for this, otherwise they're making themselves clowns for nothing.
 

skneogaf

Member
I'm not a fan of either, but DLSS is so much better than FSR, yet we're starting to see less DLSS in games, e.g. Resident Evil 4 Remake and The Callisto Protocol.

I was just playing Forza Horizon 5, and that has every solution.

I play that at 4K@120fps with 8x MSAA, which still isn't enough.

I prefer increasing the render resolution, so in Resident Evil 4 I can play at 200 percent of 4K with TAA 👍
 

01011001

Banned
Which one will be useful on consoles:

DLSS wins: 0
FSR wins: 1

With the rise of Unreal Engine and more and more devs jumping on that engine, FSR2 will be used less and less.

Developers will use Epic's TSR instead. Also, there were tons of reconstruction methods on consoles before FSR or even DLSS were a thing.
FSR2 isn't really needed on consoles... in fact, I'd be way happier if developers continued to use CBR and improved upon it instead of using FSR2.
 

Buggy Loop

Member
Really good showing for DLSS. Better than native 40% of the time, and even when it's not, it's still close enough in quality that it's worth using for the perf boost (strictly DLSS Quality).

Swap the .dll for the best version and it becomes 100% of the time.

You can even tweak the resolution factor for Quality mode so that it's higher res than Nvidia's default.

And if you have the performance for it and the game supports it, DLAA (DLSS at native res) is a killer.
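As a rough illustration of what that ratio tweak means in pixels (the 0.667 default and the 0.75 override below are assumed example values; third-party tools such as DLSSTweaks expose this kind of override rather than the game itself):

```python
# Internal render resolution for a given output resolution and scale ratio.
def render_res(out_w: int, out_h: int, ratio: float) -> tuple[int, int]:
    return round(out_w * ratio), round(out_h * ratio)

print("Default Quality (0.667):", render_res(2560, 1440, 0.667))  # -> (1708, 960)
print("Custom ratio (0.75):    ", render_res(2560, 1440, 0.75))   # -> (1920, 1080)
print("DLAA (1.0, native):     ", render_res(2560, 1440, 1.0))    # -> (2560, 1440)
```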
 

SmokedMeat

Gamer™
DLSS is absolutely preferred, but it still didn't save me from having to drop texture settings on my 3070 Ti in Spider-Man and Resident Evil 4 Remake.

I'll take the extra VRAM over the small difference between DLSS and FSR. Especially when you have to use screenshots to pick out a difference you'd likely never notice while playing.
 

SmokedMeat

Gamer™
DLSS is amazing. Even Performance mode is leagues ahead of FSR Quality (1440p upscaled to 4K), native 1440p, or regular upscalers that upscale from 1440p to 4K.

Clueless people brush it off saying "lol, you play at 1080p, even consoles do 1440p".

4K DLSS Performance (internal 1080p) demolishes native 1440p.

It's just no competition. The per-pixel quality is so much better than regular upscalers/TAA that it puts RTX cards on another level. It really makes a direct 1-to-1 hardware comparison moot when you can get better image quality at a 1080p render resolution with DLSS than a console rendering at 1440p.

Look at this mess. Even FSR Quality at 4K is hit and miss, whereas 4K DLSS Performance has been consistently amazing for me.

Obviously this is near impossible to break down into statistics.

You can run 4K Performance on a 1440p monitor? If that's the case, I'll switch to that.
 

PeteBull

Member
From what I can see/guess, the lower we go down the stack and the more budget-conscious the buyers, the more of an advantage AMD should have vs Nvidia. At the top end, the 4090 and even the 4080 have a big advantage in consumers' minds because those buyers aren't price-sensitive, and even the 16GB of VRAM in the 4080's case seems plenty for 4K for the next few years.

DLSS and frame generation work best at higher frame rates and higher resolutions too.

Below the 4070, where Nvidia cards will only have 8GB of VRAM, DLSS 3.0's effectiveness will be vastly reduced (because those cards won't get high frame rates anyway), and the same goes for DLSS 2.0, because most people won't even try to play at 4K on a new 4060/Ti with only 8GB of VRAM. That should push more buyers into AMD's arms, provided they offer an appealing product: decently priced, with 50% more VRAM than the direct Nvidia competitor.

Not even mentioning RT, because it's a nice feature to check out but not actually to play games with turned on, and I say that as a 3080 Ti owner. When you actually play, more fps/stable fps is always preferable to better visual effects; I can only imagine lower-tier card owners are even more strongly in favour of fps.
 

Cryio

Member
FSR2 is wonderful. Unfortunately, modded-in FSR2 often looks better than a game's native FSR2 implementation, which is more on the devs than on the tech.

FSR 2.2 is only available in a few titles for now. And it's the latest version, so it should look the best.

Most games compared in the video have either outdated builds or poor implementations, or the difference was a tie/slight DLSS edge, which is good enough for a performant, hardware-agnostic solution.

FSR2 Quality is IMO always better than native when properly implemented, and even FSR2 Balanced is plenty fine.

Performance and Ultra Performance, though? Nah, stay away. DLSS is still miles better there.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not


TLDR: XeSS 1.1 is nearly on par with DLSS 3.1 in image quality, and well ahead of FSR 2.1.
It loses a couple of fps compared to DLSS, probably because it uses DP4a instead of Nvidia's tensor cores.

Bring on the Battlemage!!!
 

kiphalfton

Member
So quick question: why are people even arguing over this and picking "sides"? It's good tech, the competition pushes both to improve, and that only benefits gamers in the end.

Gotta always pick sides, and make it into an argument wherever possible.
 

01011001

Banned


TLDR: XeSS 1.1 is nearly on par with DLSS 3.1 in image quality, and well ahead of FSR 2.1.
It loses a couple of fps compared to DLSS, probably because it uses DP4a instead of Nvidia's tensor cores.


Jesus Christ, that first scene... the F in FSR apparently stands for Flicker.
Also, this was recorded on an Nvidia card; XeSS often looks better on Intel Arc, with fewer artifacts.

Checkerboard rendering died so that AMD's shitty FSR could live... sad...
 

winjer

Gold Member
XeSS looks very soft in the side-by-sides compared to FSR/DLSS. Has Intel got any sharpening pass in XeSS? It looked like it was upscaling from a lower resolution than the other two in some of those comparisons.

As far as I know, it doesn't.
Also consider that XeSS 1.0 usually looks blurrier than XeSS 1.1, especially on the DP4a path.
 

hlm666

Member
As far as I know, it doesn't.
Also consider that XeSS 1.0 usually looks blurrier than XeSS 1.1, especially on the DP4a path.
Ah ok, I figured all the examples would have been the full-blown XeSS running on Arc's XMX hardware. If it was DP4a, that kinda makes sense; that mode has never been impressive.
 

winjer

Gold Member
Ah ok, I figured all the examples would have been the full-blown XeSS running on Arc's XMX hardware. If it was DP4a, that kinda makes sense; that mode has never been impressive.

In XeSS 1.0, there was a significant difference in image quality between the XMX path and the DP4a path.
But with XeSS 1.1 they have the same quality; it's just the performance that changes.
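For context, DP4a is a general-purpose GPU instruction: a dot product of four packed 8-bit integers accumulated into a 32-bit value. A toy sketch of the operation (my illustration, not Intel's code) shows why running a network on it is slower than on XMX or tensor cores, which chew through whole matrix tiles per instruction:

```python
# Toy emulation of a single DP4a operation: four int8 multiplies summed into a
# 32-bit accumulator. Real GPUs do this in one instruction per 4-element group,
# while XMX/tensor cores process entire matrix tiles at once.
def dp4a(a: list[int], b: list[int], acc: int = 0) -> int:
    assert len(a) == len(b) == 4
    return acc + sum(x * y for x, y in zip(a, b))

print(dp4a([1, -2, 3, 4], [5, 6, -7, 8]))  # 5 - 12 - 21 + 32 = 4
```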
 

LiquidMetal14

hide your water-based mammals
Because of a sharpening filter. Don't be dishonest.
Thank you, I just came in here to say that, since I watched that video too. Hey, I love AMD for their processors, but I'm not going to overshill for their graphics cards when they're leagues behind Nvidia on that front.
 

Gaiff

SBI’s Resident Gaslighter
I don't care about sharp or not sharp; if a game has different options, then I can choose the better quality picture.
You can also sharpen the image on the NVIDIA GPU. You can't make that awful ghosting and smearing go away on the AMD GPU.

It's not "better quality", it's simply sharpness.
 

SolidQ

Member
You can't make that awful ghosting
Never noticed ghosting in the games I'm playing, but anyway, I've only played two games with FSR, and zero with DLSS.

You can also sharpen the image on the NVIDIA GPU.
Never use it, because I play at native resolution, and the quality there is already sharp.
 