
Is DLSS really the way of the future?

Jon Neu

Banned
Disprove his point. It already happened with Control: try running it with ray tracing but without DLSS. You get like 25-30 frames per second, in a linear game. Remedy are hot garbage at optimizing their games. Remember Quantum Break? Yeah, go watch some benchmarks of that as well. DLSS is a nice feature, but it shouldn't be used as a crutch by lazy devs, and yet it will be.

Ray tracing is extremely demanding already. The fact that with DLSS you can enable ray tracing and still get insane resolutions and framerates at the highest graphical settings is proof that it's not only not ruining games, it's making them so much better.

Mentioning Quantum Break is also kind of weird, as it was made long before DLSS existed. Accusing DLSS of being responsible for Quantum Break's brokenness is just plain nonsensical bullshit. Badly optimized PC ports have existed forever, but somehow now they're DLSS's fault. You all couldn't hide your bias even if you tried.

And we all know that if DLSS had been invented by Sony and were a PS5 exclusive, you, thelastword, geordiemp, bo_hazem and the rest would be singing quite a different song, and half the forum would be filled with threads explaining how DLSS is the best thing that ever happened to humanity. The SSD hype would be a joke compared to that.

There you have it.
 

GymWolf

Member
Ray tracing is extremely demanding already. The fact that with DLSS you can enable ray tracing and still get insane resolutions and framerates at the highest graphical settings is proof that it's not only not ruining games, it's making them so much better.

Mentioning Quantum Break is also kind of weird, as it was made long before DLSS existed. Accusing DLSS of being responsible for Quantum Break's brokenness is just plain nonsensical bullshit. Badly optimized PC ports have existed forever, but somehow now they're DLSS's fault. You all couldn't hide your bias even if you tried.

And we all know that if DLSS had been invented by Sony and were a PS5 exclusive, you, thelastword, geordiemp, bo_hazem and the rest would be singing quite a different song, and half the forum would be filled with threads explaining how DLSS is the best thing that ever happened to humanity. The SSD hype would be a joke compared to that.

There you have it.
Yeah, it's pretty strange reading people who think DLSS is the cause of future broken games, when broken games have always existed.
At least now we have something to counterbalance lazy devs and shitty ports.
 

The Cockatrice

Gold Member
Ray tracing is extremely demanding already. The fact that with DLSS you can enable ray tracing and still get insane resolutions and framerates at the highest graphical settings is proof that it's not only not ruining games, it's making them so much better.

Mentioning Quantum Break is also kind of weird, as it was made long before DLSS existed. Accusing DLSS of being responsible for Quantum Break's brokenness is just plain nonsensical bullshit. Badly optimized PC ports have existed forever, but somehow now they're DLSS's fault. You all couldn't hide your bias even if you tried.

And we all know that if DLSS had been invented by Sony and were a PS5 exclusive, you, thelastword, geordiemp, bo_hazem and the rest would be singing quite a different song, and half the forum would be filled with threads explaining how DLSS is the best thing that ever happened to humanity. The SSD hype would be a joke compared to that.

There you have it.

Mentioning QB was a jab at Remedy. Way to twist the narrative from ray tracing performance to how DLSS makes games amazing, lmfao. Metro Exodus, with ray tracing and no DLSS, runs so much better than Control, and it looks miles better, besides being semi-open-world. Also, way to make assumptions about me. I'm a PC gamer, my dude. I've got a 2080 and I like the feature. No one is saying it's bad; at least I'm not. All I'm saying is that it will be used as a lazy method to optimize games. Fucking read.
 
Last edited:

GymWolf

Member
Mentioning QB was a jab at Remedy. Way to twist the narrative from ray tracing performance to how DLSS makes games amazing, lmfao. Metro Exodus, with ray tracing and no DLSS, runs so much better than Control, and it looks miles better, besides being semi-open-world. Also, way to make assumptions about me. I'm a PC gamer, my dude. I've got a 2080 and I like the feature. No one is saying it's bad; at least I'm not. All I'm saying is that it will be used as a lazy method to optimize games. Fucking read.
And what is the negative part of being a lazy method to fix stuff? I bet Nvidia created DLSS precisely because they know how sloppy and unoptimized RTX implementations are going to be in future games.

Is it better to have no lazy fix while still having lazy devs?

The point we're trying to discuss here is that lazy devs and shitty ports have always existed. Hell, I'd even say the majority of PC games have some kind of problem; stuff like Doom or Gears 5 are rare exceptions.

The only risk is that already-lazy teams get even lazier, but this time we can counterbalance that with DLSS. Before, we had jack shit. I still see this as an overall improvement.
 
Last edited:

DavidGzz

Member
Yeah, it killed my next-gen console boner. I'm getting a new PC for sure, earlier than I wanted, unless MS has something up their sleeve that can compete. It's ironic that these consoles are beasts compared to last gen, but this DLSS secret sauce makes them even worse off, or at least just as shitty as they were in 2013.


Yeah, it's pretty strange reading people who think DLSS is the cause of future broken games, when broken games have always existed.
At least now we have something to counterbalance lazy devs and shitty ports.

People always pull shit out of their ass. It reminds me of GaaS games being blamed on Game Pass. No, they have been around and will keep coming regardless.
 
Last edited:

thelastword

Banned
Ok then, but be aware that reconstruction and machine learning for IQ are not new things. As you can see currently, DLSS is not something that can be done by the GPU directly, so devs have to incorporate it, which means adoption across games may be slow. We were aware of many other reconstruction techniques before DLSS, and they could be done directly on the GPU itself... checkerboard rendering, temporal injection, geometry rendering... We all know these techniques will be improved further and will be native to the hardware, without waiting on an AI'd image beforehand...

People who think DLSS will have no competition, or that consoles, primarily the PS5, won't have an answer for DLSS or a proper, enhanced reconstruction feature, are simply not paying attention. Even James Stephenson from Insomniac suggests that the 60fps mode in Ratchet and Clank will use temporal injection... AMD, Sony, Guerrilla and ICE have not been twiddling their thumbs on tech and its advancement. As I said before, we hardly have many DLSS games yet; when AMD delivers an open-source AI reconstruction technique, or CB 2.0, that's what will be most widely used and adopted, since most games will lead on consoles, which are AMD-based, and those consoles will popularize AMD-based techniques...
 

GymWolf

Member
Ok then, but be aware that reconstruction and machine learning for IQ are not new things. As you can see currently, DLSS is not something that can be done by the GPU directly, so devs have to incorporate it, which means adoption across games may be slow. We were aware of many other reconstruction techniques before DLSS, and they could be done directly on the GPU itself... checkerboard rendering, temporal injection, geometry rendering... We all know these techniques will be improved further and will be native to the hardware, without waiting on an AI'd image beforehand...

People who think DLSS will have no competition, or that consoles, primarily the PS5, won't have an answer for DLSS or a proper, enhanced reconstruction feature, are simply not paying attention. Even James Stephenson from Insomniac suggests that the 60fps mode in Ratchet and Clank will use temporal injection... AMD, Sony, Guerrilla and ICE have not been twiddling their thumbs on tech and its advancement. As I said before, we hardly have many DLSS games yet; when AMD delivers an open-source AI reconstruction technique, or CB 2.0, that's what will be most widely used and adopted, since most games will lead on consoles, which are AMD-based, and those consoles will popularize AMD-based techniques...
Yes, like I said many times: if they can't manage to have DLSS AT LAUNCH for every important game, this thing is gonna be utterly useless for all the people who buy games at day one (you know, a lot of hardcore gamers with RTX GPUs...).

I hope people aren't gonna defend this particular aspect with stuff like "gne gne, but nobody forces you to buy at launch" in a goddamn hardcore videogame forum...
 

Jon Neu

Banned
And what is the negative part of being a lazy method to fix stuff? I bet Nvidia created DLSS precisely because they know how sloppy and unoptimized RTX implementations are going to be in future games.

Is it better to have no lazy fix while still having lazy devs?

The point we're trying to discuss here is that lazy devs and shitty ports have always existed. Hell, I'd even say the majority of PC games have some kind of problem; stuff like Doom or Gears 5 are rare exceptions.

The only risk is that already-lazy teams get even lazier, but this time we can counterbalance that with DLSS. Before, we had jack shit. I still see this as an overall improvement.

But now they are going to blame every lazy port on DLSS's existence, as they are already doing... retroactively!

The funny thing is that DLSS is going to make room for amazing improvements in videogames that wouldn't be possible without its existence. We are going to see things that shouldn't be possible given the way we've always understood how hardware and software work. PCs (and eventually consoles) are going to have much greater performance than their hardware suggests. The benefits are going to be mind-blowing, almost magic-like.

But some people just need to search for something negative to put the blame on it.
 

The Cockatrice

Gold Member
And what is the negative part of being a lazy method to fix stuff? I bet Nvidia created DLSS precisely because they know how sloppy and unoptimized RTX implementations are going to be in future games.

Is it better to have no lazy fix while still having lazy devs?

The point we're trying to discuss here is that lazy devs and shitty ports have always existed. Hell, I'd even say the majority of PC games have some kind of problem; stuff like Doom or Gears 5 are rare exceptions.

The only risk is that already-lazy teams get even lazier, but this time we can counterbalance that with DLSS. Before, we had jack shit. I still see this as an overall improvement.

It won't be an improvement if everyone becomes lazy because of the feature. I mean, yeah, if we could get paid and not work, everyone would accept it, but would it be good in the long run? It wouldn't. Same shit here. DLSS is nice, no doubt, but native will always look better and more accurate, and a sharpening filter can easily be added in case of blurry rendering/AA with SweetFX or the Nvidia control panel. Lazy people should not be rewarded, and this is coming from a lazy person.
 

GymWolf

Member
But now they are going to blame every lazy port on DLSS's existence, as they are already doing... retroactively!

The funny thing is that DLSS is going to make room for amazing improvements in videogames that wouldn't be possible without its existence. We are going to see things that shouldn't be possible given the way we've always understood how hardware and software work. PCs (and eventually consoles) are going to have much greater performance than their hardware suggests. The benefits are going to be mind-blowing, almost magic-like.

But some people just need to search for something negative to put the blame on it.
They really can't. I just have to open my Uplay or Steam library to make a list of the hundreds of old broken games from when DLSS wasn't even in Jensen's mind...
 

GymWolf

Member
It won't be an improvement if everyone becomes lazy because of the feature. I mean, yeah, if we could get paid and not work, everyone would accept it, but would it be good in the long run? It wouldn't. Same shit here. DLSS is nice, no doubt, but native will always look better and more accurate, and a sharpening filter can easily be added in case of blurry rendering/AA with SweetFX or the Nvidia control panel. Lazy people should not be rewarded, and this is coming from a lazy person.
Dude, spoiler alert: the majority of PC devs who work on ports are lazy fucks already. They don't need DLSS to do shitty ports, because PC optimization is already pretty rare in current games.

How many games run like Gears 5 and Doom while having great graphics? I can do 4K60 at high-to-ultra details (and maybe something on medium) in those games with a filthy 2070 Super. Do you wanna know how many other big games can achieve that, or even come close to it?
Well, I can start with... or... and don't forget...

Do you get my point?
 
Last edited:

The Cockatrice

Gold Member
Dude, spoiler alert: the majority of PC devs who work on ports are lazy fucks already. They don't need DLSS to do shitty ports, because PC optimization is already pretty rare in current games.

How many games run like Gears 5 and Doom while having great graphics? I can do 4K60 at high-to-ultra details (and maybe something on medium) in those games with a filthy 2070 Super. Do you wanna know how many other big games can achieve that? Well, I can start with... or... and don't forget...

Do you get my point?

Yeah, I'm aware of what you're saying. Of course we need DLSS, but just like all good things, it gets abused. Still, we've seen some good ports lately, and now with the new consoles and how they're built, PC ports should be better, hopefully. There's also the fact that we need some competition for Nvidia for a healthy environment.
 

GymWolf

Member
Yeah, I'm aware of what you're saying. Of course we need DLSS, but just like all good things, it gets abused. Still, we've seen some good ports lately, and now with the new consoles and how they're built, PC ports should be better, hopefully. There's also the fact that we need some competition for Nvidia for a healthy environment.
Yeah, of course. We're all crossing our pubic hairs waiting for AMD's response to DLSS; I absolutely want something similar on console too.
 
Last edited:

Jon Neu

Banned
I absolutely want something similar on console too.

I really do hope that next year the consoles can come up with something similar, to be implemented in 2022 games.

Hell, a PS5 Pro and an Xbox Series X Ultra (or whatever) would be justified just for putting in hardware that helps with this tech alone.

With HDMI 2.1 becoming a standard in the next few years, they can market the new Pro versions as 120fps + full RT machines.
 

thelastword

Banned
Yes, like I said many times: if they can't manage to have DLSS AT LAUNCH for every important game, this thing is gonna be utterly useless for all the people who buy games at day one (you know, a lot of hardcore gamers with RTX GPUs...).

I hope people aren't gonna defend this particular aspect with stuff like "gne gne, but nobody forces you to buy at launch" in a goddamn hardcore videogame forum...
I see your point. I'm not against reconstruction, you probably know that... DLSS is a feature that looks good, and I must say 2.0 is a nice improvement over 1.0, which was just abysmal. Yet it does have its drawbacks, and its adoption just won't be as quick as a hardware-based solution native to the hardware, done on the fly; that's the future of reconstruction, tbh...

The other point is that any technique that lowers the footprint of the render cycle will sacrifice something... I don't trust DF to show you this now, but you will see it soon enough when more DLSS games come out and our comparison tools get better for the next gen; tools that can compare 4-8K native, and framerates above 60-120fps, a bit better. It's just like people saying VRS was the answer: yes, these techniques are great for saving frames, but they all have compromises.

So, to your point: I don't think a PS5 without reconstructed games will be at a detriment at launch. COD looks good and will be 4K native, Ratchet is 4K native, GT7 is 4K native, Kena is 4K native; pretty much all the excellent-looking games shown at the Sony conference were 4K native with RT... Now, if they want to push higher frames, like 30 to 60fps or 60 to 120fps, they can use reconstruction for the latter. It will be a nice option and it will be there, but I think many will be satisfied with better graphics at native resolution and 30fps, or at native resolution and 60fps with RT... I say let the consoles push their 4K native prowess now; when things get even more ambitious, they can start using reconstruction as their main target. For now, they can render 4K native no problem, with room to spare, because they are doing RT alongside native 4K at 30-60fps... And the games look excellent.

A picture of the different images was posted. DLSS does not look sharper than native 4K: the native image is using TAA whilst DLSS is not, and DLSS also applies sharpening before the final image, plus downsampling from a 4K-16K blowdown. All you have to do is sharpen the 4K native picture just the same, then zoom into the pictures to see which holds its resolve better. Tbh, a proper DLSS 2.0 vs native comparison has not been done, partly because there are not many DLSS 2.0 games. Just wait till the next gen begins and you will see the discrepancies between the native render and DLSS... I still remember DF praising DLSS 1.0 when the very image of the Infiltrator demo was missing tonnes of lights and geometry detail in the background. So stay tuned for proper comparisons of native vs DLSS vs CB 2.0... You can't say something is the future when it has not taken flight yet; all the noise you hear about DLSS is marketing at this point. They are trying to sell you NV cards; the feature has not taken over anything yet, or seen wide adoption... If and when it does, then we can talk...
 

geordiemp

Member
I really do hope that next year the consoles can come up with something similar, to be implemented in 2022 games.

Hell, a PS5 Pro and an Xbox Series X Ultra (or whatever) would be justified just for putting in hardware that helps with this tech alone.

With HDMI 2.1 becoming a standard in the next few years, they can market the new Pro versions as 120fps + full RT machines.

No, you don't really hope that at all.

And as I said, Nvidia are doubling down on DLSS, and it's good.

Everyone else seems to be utilising more advanced temporal techniques, evolutions of older, similar approaches, whether that's Unreal Engine, Decima, or anybody else. And it's equally as good.

We don't need ML for upscaling; temporal is great as well.

Spin that however you want. Does it upset you that Nvidia isn't the only one allowed good upscaling?
 
Last edited:

Jon Neu

Banned
No, you don't really hope that at all.



tenor.gif


And its equally as good.

Yeah, that's why the UE5 tech demo was running at 1440p and 30fps. Without NPCs.

Spin that however you want.

sony-spin-media-kit-1-728.jpg
 

Jon Neu

Banned
You really are incapable of intelligent and respectful discourse. I pity you.

Says the man who quoted me saying this:

No, you don't really hope that at all.

You're so butthurt that Nvidia is leading the way that you have to resort to ridiculous mind reading and claim that I don't want consoles to have tech similar to DLSS.

I'm a console gamer; I'm going to buy both the XSX and the PS5. Of course I want that tech in the consoles, because it means we're going to see much better games. Much better games = more enjoyment for console users. Me = console user.

You will go far in life with that attitude.

I can't complain.


The fact that you are having this tantrum over a gif of a girl singing a song is amusing.
 

GymWolf

Member
I see your point. I'm not against reconstruction, you probably know that... DLSS is a feature that looks good, and I must say 2.0 is a nice improvement over 1.0, which was just abysmal. Yet it does have its drawbacks, and its adoption just won't be as quick as a hardware-based solution native to the hardware, done on the fly; that's the future of reconstruction, tbh...

The other point is that any technique that lowers the footprint of the render cycle will sacrifice something... I don't trust DF to show you this now, but you will see it soon enough when more DLSS games come out and our comparison tools get better for the next gen; tools that can compare 4-8K native, and framerates above 60-120fps, a bit better. It's just like people saying VRS was the answer: yes, these techniques are great for saving frames, but they all have compromises.

So, to your point: I don't think a PS5 without reconstructed games will be at a detriment at launch. COD looks good and will be 4K native, Ratchet is 4K native, GT7 is 4K native, Kena is 4K native; pretty much all the excellent-looking games shown at the Sony conference were 4K native with RT... Now, if they want to push higher frames, like 30 to 60fps or 60 to 120fps, they can use reconstruction for the latter. It will be a nice option and it will be there, but I think many will be satisfied with better graphics at native resolution and 30fps, or at native resolution and 60fps with RT... I say let the consoles push their 4K native prowess now; when things get even more ambitious, they can start using reconstruction as their main target. For now, they can render 4K native no problem, with room to spare, because they are doing RT alongside native 4K at 30-60fps... And the games look excellent.

A picture of the different images was posted. DLSS does not look sharper than native 4K: the native image is using TAA whilst DLSS is not, and DLSS also applies sharpening before the final image, plus downsampling from a 4K-16K blowdown. All you have to do is sharpen the 4K native picture just the same, then zoom into the pictures to see which holds its resolve better. Tbh, a proper DLSS 2.0 vs native comparison has not been done, partly because there are not many DLSS 2.0 games. Just wait till the next gen begins and you will see the discrepancies between the native render and DLSS... I still remember DF praising DLSS 1.0 when the very image of the Infiltrator demo was missing tonnes of lights and geometry detail in the background. So stay tuned for proper comparisons of native vs DLSS vs CB 2.0... You can't say something is the future when it has not taken flight yet; all the noise you hear about DLSS is marketing at this point. They are trying to sell you NV cards; the feature has not taken over anything yet, or seen wide adoption... If and when it does, then we can talk...
It's not really the fact that they already run at 4K; it's what they waste to get there (and you know they are wasting resources on native 4K).
All these games would look even noticeably better with DLSS, and not because it's better than native 4K, but because all the things you could add (more detailed characters or locations, more effects, more RT, etc., while STILL HAVING 60 fps) are ALL more noticeable than the little loss in IQ (if there is any loss).

There is always room for improvement if you have free resources.

But yeah, I get your point.
 
Last edited:

GymWolf

Member
Says the man who quoted me saying this:



You're so butthurt that Nvidia is leading the way that you have to resort to ridiculous mind reading and claim that I don't want consoles to have tech similar to DLSS.

I'm a console gamer; I'm going to buy both the XSX and the PS5. Of course I want that tech in the consoles, because it means we're going to see much better games. Much better games = more enjoyment for console users. Me = console user.



I can't complain.



The fact that you are having this tantrum over a gif of a girl singing a song is amusing.

Maybe he's more of a Belle Delphine guy :lollipop_squinting:

I hope he doesn't get mad at me too...
 

SF Kosmo

Al Jazeera Special Reporter
Out of the small handful of games it's available for, there are really only a couple of instances where the reconstructed image is better.
Basically the games that use 2.0, I know. But even when it is not "better" than native, at 4K it's certainly "close enough" that few would be able to tell the difference without freezing the frame and zooming in.

It has its limits. Going from a 540p or 720p base up to whatever is going to be more obvious than using a 1080p or 1440p base. But even then I'd be hard pressed to say it looks "bad."

The "native rendering is the way to go" issue is that the engines these games are built on are not targeting ultra-high-end, movie-like quality assets, which is why there are so many instances of games having a lot of rough edges when scaled to 4K.
It's not really that. Yes, games will get better at handling 4K, but it's difficult to imagine we will ever reach a point where a 35-70% performance boost is not useful in many cases.

Where I think DLSS (or something related) gets really useful is next-gen VR. When games are targeting high stereoscopic resolutions AND high frame rates, it gets really tough on the GPU, and if we pair this tech with foveated rendering, we get a situation where devs can cut the number of pixels they need to render dramatically.
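To put rough numbers on that VR argument, here's a back-of-the-envelope sketch. All figures are hypothetical (the per-eye resolution, refresh rate, and savings factors are illustrative assumptions, not measurements from any real headset or DLSS implementation); the point is just that the two savings multiply:

```python
# Hypothetical VR pixel budget: two eyes at 2160x2160, 90 Hz.
eye_w, eye_h, fps = 2160, 2160, 90
native = 2 * eye_w * eye_h * fps          # pixels shaded per second at native res

upscale = 0.5 ** 2    # reconstruct from half resolution per axis (DLSS-style)
foveated = 0.5        # assume foveation shades only ~half the remaining pixels
reduced = native * upscale * foveated     # the two reductions stack multiplicatively

print(f"native:  {native:,.0f} px/s")
print(f"reduced: {reduced:,.0f} px/s ({reduced / native:.0%} of native)")
```

Under these made-up numbers the GPU shades roughly one eighth of the native pixel count per second, which is why stacking reconstruction with foveated rendering is so attractive for VR.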
 

GymWolf

Member
Let's play a match of spot-the-differences: one is native 4K, the other is 720p DLSS'd to 4K.

Control-Native-4k.png


Control-4k-720p-DLSS.png


Thanks to reeee for the img🕺

I can only clearly spot the "emergency light" text on the lamp being blurrier. This is DLSS reconstructing from 720p, and it's already pretty great.
 
Last edited:

DeaconOfTheDank

Gold Member
The lengths people will go to discredit/downplay DLSS is insane.

It's a marvelous technology that will only push gaming forward. Does that mean Nvidia will have exclusive rights to the concept (e.g. their implementation will be the only option) till the end of time? Probably not. Literally everyone could benefit from the tech, and I fail to see how consoles wouldn't want to jump on it.

I also don't understand the logic that assumes DLSS adoption will lead to lazier PC ports... If I told you that I knew of a solution to expand your rendering budget without sacrificing visual fidelity, would your immediate reaction be to waste the extra resources? That's some flawed logic; devs would be thinking about what else they could do to maximize utilization for the best presentation.

Here's an analogy: imagine that game developers are financial investors and that developing video games within certain hardware constraints is like investing money with a fixed amount of capital -- you're hoping to get the biggest returns on what you've got. Now, if I gave an investor more capital to work with, I'd assume they would look to increase the returns on their investment. However, some people here assume that the investor's reaction would be "lol fuck it, let's buy some Funko Pops with all the extra cash."

DLSS gives developers extra capital/resources to work with by eliminating the scaling bottleneck of resolution vs visual fidelity. Ray tracing at 1080p is a lot less costly than ray tracing at 4K. In fact, the same can be said of a lot of techniques (AO, SSR, DoF, volumetrics, etc.). By allowing devs to operate at a much lower resolution, they can maximize the effects in use without worrying about how they scale with resolution. Afterwards, DLSS takes what they've rendered and upscales it without noticeable loss in image quality. You would be delusional to think this isn't fucking incredible; to do so would ironically put you in direct opposition to other image reconstruction techniques (e.g. checkerboard rendering) as well.
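The "extra capital" is easy to quantify as raw pixel counts. A quick sketch (the per-axis scale factors below are the commonly cited approximate DLSS 2.0 mode ratios, used here as assumptions, not official figures):

```python
# Back-of-the-envelope: how many pixels get shaded at each internal resolution.
# Per-axis scale factors are approximate, commonly cited DLSS 2.0 mode ratios.
MODES = {"native": 1.0, "quality": 2 / 3, "balanced": 0.58,
         "performance": 0.5, "ultra_performance": 1 / 3}

def shaded_pixels(width, height, scale):
    """Pixels actually rendered when each axis is scaled down by `scale`."""
    return int(width * scale) * int(height * scale)

target_w, target_h = 3840, 2160  # 4K output
for mode, s in MODES.items():
    px = shaded_pixels(target_w, target_h, s)
    print(f"{mode:18s} {px:>10,d} px ({px / (target_w * target_h):.0%} of native)")
```

Since ray tracing, SSR, volumetrics, and most post-processing scale with shaded pixel count, performance mode's quarter-pixel budget is exactly the freed-up "capital" the analogy describes.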
 

Kuranghi

Member
Apparently there is already an alternative by AMD.



It doesn't. I can see the difference even through YouTube compression, especially the aliasing. Adaptive sharpening like FidelityFX CAS and the Nvidia equivalent can get you down to 1800p from 2160p; after that, image quality suffers too much.

The most egregious aliasing I could see without zooming is on this light pole, on the edges of the blue part of it.

Is this the aliasing you were referring to, ScaryBrandon?:

XBfmDtY.png


...because that's on the DLSS side, and it's not there on the CAS side. This is with image compression on top of the video compression, so the aliasing is much clearer in the original video; you can still see it here, albeit blurred by the compression.

Did you mix up which side is which? Not meaning that in a shitty way; I just see the opposite of what you see (like BluRayHiDef sees on his 55") and I'm on a 65" TV, sitting 8 feet away.

When presented with this comparison I'd say the CAS side looks a bit better than the DLSS side overall to me, but I wonder what the difference in framerate is.
 

IntentionalPun

Ask me about my wife's perfect butthole
DLSS is not only software: you need special tensor cores that work some of that AI wizardry, and as of now you can only get them on RTX cards. Unless AMD works on something similar (which I think they are), you won't be getting DLSS on AMD cards or consoles.
DLSS is a big deal to me, and it's one of the reasons I'm going with these new RTX cards. It seems too good to be true, but from what I have seen it's amazing.
Some DLSS implementations actually didn't run on tensor cores; it's possible on your main GPU cores, and it still benefits.

One of the versions of Control did that.
 

EverydayBeast

thinks Halo Infinite is a new graphical benchmark
Guess what happens every generation? 3D, CD, DVD, Blu-ray, HDR, 4K, etc. You see what happens? Developers, gamers, etc. sign off on some new tech.
 

Neo_game

Member
Yes, it is a blessing in disguise. Checkerboard rendering was also an awesome PS4 Pro feature. Hopefully in a few years' time games will drop native 4K for good. It is very demanding and the resources can be used elsewhere. On console, temporal or checkerboard techniques are probably going to be used. Personally, I find it amusing that you have to zoom in so much to see the artifacts. If that is something that bothers you, then PC is definitely the way to go.
 
Last edited:

martino

Member
Let's play a match of spot-the-differences: one is native 4K, the other is 720p DLSS'd to 4K.

Control-Native-4k.png


Control-4k-720p-DLSS.png


Thanks to reeee for the img🕺

I can only clearly spot the "emergency light" text on the lamp being blurrier. This is DLSS reconstructing from 720p, and it's already pretty great.

It's impressive (because it's coming from 720p), but even without opening them I can already see aliasing (or artifacts, in this case) on all the boxes.
What also remains to be seen is how DLSS results will evolve with busier, more complex scenes.
 
Last edited:

Chromata

Member
I hope so; every game should have DLSS or something like it. Why bother with native 4K and endure the performance hit?
 

Rikkori

Member

Last year Nvidia announced DLSS 2.0, opening this new anti-aliasing solution up to all games. The game-specific deep learning is no longer required, as it was with prior iterations. For an indie dev that is especially exciting, as the chance of getting onto Nvidia's supercomputer was pretty slim.

Besides the new game-agnostic algorithm, we also get much-improved anti-aliasing results, which honestly look pretty fantastic and somewhat unbelievable in games like Control (see Digital Foundry's DLSS comparison). You'll get a chance to see how DLSS performs in Unreal Engine through my own experiments below.

For testing, I used my open-source SurvivalGame (available on GitHub) and Dekogon Studios’ City Subway Train asset.



SurvivalGame on GitHub received a graphical refresh for this DLSS Test.

What is DLSS?​

DLSS stands for Deep Learning Super Sampling, and in its 2.0 iteration it uses the Tensor Cores on Nvidia RTX cards to upscale lower-resolution render buffers to the desired 'native' scale much better than any existing upscale algorithm currently inside UE4. While DLSS upscales, it also performs anti-aliasing, effectively gaining performance by rendering the scene and post-processing at a reduced input resolution. This AA solution replaces TXAA, and so dithered materials don't currently seem to render the same with DLSS (where TXAA would soften the dither pattern).

Remember that aliasing itself comes from rasterizing a scene to pixels. Fewer pixels cause more aliasing, so the fact that DLSS actually fixes most aliasing while we feed it a much lower input than native is pretty amazing, if you ask me.

Even zoomed in, you can barely see the difference (and there is a big gain in performance; some numbers are further down).
One title using DLSS to improve performance while maintaining visual fidelity is Deliver Us The Moon, built using UE4.


Getting Started​

For this article, Nvidia hooked me up with an RTX graphics card and access to the Unreal Engine RTX/DLSS branch on GitHub. With the new graphics card, I got to play with ray tracing for things like shadows, reflections, ambient occlusion, and even the full Path Tracing that is now available. But what excited me most in practical terms is DLSS 2.0, which has been the least covered in the context of Unreal Engine.

So what does it take to get DLSS running in your Unreal Engine project? Well, for now at least, you need an AppID provided by Nvidia (you must apply to receive one) to enable the tech, along with the custom engine modifications on GitHub. I expect this to get easier in the future, but in the meantime I can show you what to be excited about in case your studio can't get access just yet.

Once you do have those prerequisites figured out, it's pretty straightforward to get up and running (so long as you know where to look).

Here are the required steps to make this process easier for you:

  1. Associate and compile your project with the UE4 RTX Branch
  2. Make sure the “Nvidia DLSS” Plugin is enabled via Edit > Plugins…
  3. Add the AppId provided by Nvidia in Config/DefaultEngine.ini (See Below)
[/Script/DLSS.DLSSSettings]
NVIDIANGXApplicationId=XXXX
  4. Enable DLSS by calling UDLSSLibrary::SetDLSSMode(UDLSSMode) (available as a Blueprint node) or via “r.NGX.DLSS.Enable 1”
  5. (Optional) Tweak the desired sharpness with UDLSSLibrary::SetDLSSSharpness(float) (or the Blueprint variant) between 0.0 and 1.0 (0.35 seems to be the recommended value)
The internal resolution is downscaled automatically based on the DLSS quality setting. You can choose between 5 modes: Ultra Performance, Performance, Balanced, Quality, and Ultra Quality (although this last mode was ‘not supported‘ for my setup). Finally, you have the option to sharpen the output.

survival_dlss_perf_sharpen_ngxsettings.jpg
Helpful (UMG) Widget to display and test DLSS. (from Nvidia Branch)
In my tests, the internal resolution can go down to 33% (in Ultra Performance, meant for 4K displays), which is a huge saving in screen-space operations such as pixel shaders, post-processing, and ray tracing in particular. Even at 50% for the Performance & Quality modes, that is still 4× fewer pixels to process. Judging from the provided UI, the internal resolution can change on the fly within a predefined range (e.g. from 50% to 69% in Quality mode); I'm not sure at this time how DLSS decides which exact internal resolution to use.
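To put those percentages in context, here is a small back-of-the-envelope sketch (my own arithmetic in plain Python, not part of the plugin) showing how the per-axis scale factor translates into total pixel counts at 1440p:

```python
# Rough pixel-count math for DLSS internal resolutions at 2560x1440.
# The per-axis scale percentages come from my observations above; exact
# values can vary per title and driver.
NATIVE_W, NATIVE_H = 2560, 1440

def internal_pixels(scale: float) -> int:
    """Pixels rendered internally at a given per-axis scale (0.0-1.0)."""
    return int(NATIVE_W * scale) * int(NATIVE_H * scale)

native = NATIVE_W * NATIVE_H          # 3,686,400 pixels at native 1440p
half   = internal_pixels(0.50)        # lower bound of Performance/Quality
third  = internal_pixels(0.33)        # Ultra Performance

print(f"native:    {native:,}")
print(f"50% scale: {half:,} ({native / half:.1f}x fewer pixels)")
print(f"33% scale: {third:,} ({native / third:.1f}x fewer pixels)")
```

Because the scale applies to both axes, 50% per axis means a quarter of the pixels, and 33% per axis means roughly a ninth, which is why the savings on pixel-bound work are so dramatic.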

survival_sp50_noaa_zoomed.jpg
Zoomed view of 1440p at 50% (r.ScreenPercentage 50, no AA), this is the input data that DLSS has to work with.

Anti-Aliasing Quality (DLSS vs. TXAA)​

The default anti-aliasing solution for Unreal Engine is TXAA (although FXAA is supported with inferior results) and so this is the main competitor for Nvidia’s DLSS in the engine.

Since DLSS is using a lower internal resolution, the real test is whether it can maintain final image quality while improving performance. A second major benefit for DLSS is how it scales to 4K Displays (unfortunately my monitor is only 1440p) as it can enable 4K Gaming on mid-range graphics cards by using a more reasonable internal resolution.

It’s telling that often I had to double-check the screenshots to make sure I had the correct one between TXAA and DLSS. Below you’ll see a few zoomed-in comparisons so that image resizing can’t interfere and honestly it’s necessary in order to see the difference.




survival_zoom_txaa_cropbox.png

survival_zoom_dlss_quality_cropbox.png

Trees even look crisper with DLSS than with native TXAA. Cables hold up incredibly well; slightly harsh in places, mainly noticeable due to the zoom level.



metro_test_fullres_35fps.png

metro_test_dlss_quality_64fps.png

Left: 1440p TXAA, Right: DLSS Quality-mode (Zoomed)
Nearly identical quality; a slight error in the ceiling lights, where a white line in the original texture got blown out by the DLSS algorithm, causing a noticeable stripe. I reckon this should be ‘fixed’ in the source texture instead.




subway_raster_zoom_txaa.jpg

subway_raster_zoom_dlss_quality.jpg

Left: TXAA 1440p, Right: DLSS Quality-mode. (Zoomed)
Reflections look ever so slightly different; this scene used ray-traced reflections on highly reflective materials.

DLSS vs. TXAA Performance​

For the quality and performance comparison, I've made a quick video toggling between the 3 different AA modes (TXAA, DLSS Quality & DLSS Performance). As you'll notice, the difference in visual quality is often difficult to see, while the framerate takes a big leap at the same time. I would even argue that DLSS Quality mode can conjure a higher-quality image in some cases (I found that my foliage scene ‘felt’ crisper with DLSS enabled).


(Make sure to watch in 1440p and fullscreen.)

The Numbers​

Please keep in mind these numbers were taken from my unoptimized scenes, running in standalone-mode (outside the editor) but not a cooked build. Several RT features were turned on to strain the system (including raytraced reflections).

Forest Scene (RTX On 2560×1440)​

Forest Scene (Note: Downscaled JPG from 1440p source)
This scene was likely bottlenecked by the ray-traced reflections, so you'll see a huge gain in framerate as DLSS lowers the internal resolution.

  • TXAA Baseline ~35 FPS
  • DLSS Quality ~56 FPS (+60%)
  • DLSS Balanced ~65 FPS (+85%)
  • DLSS Performance ~75 FPS (+114%)
  • DLSS Ultra Performance ~98 FPS (+180%?! – Noticeably blurry on 1440p, intended for 4K)
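As a sanity check on those percentages, the gain is simply the DLSS framerate over the baseline, minus one. A tiny sketch (my own helper, using the approximate forest-scene averages reported above):

```python
# Relative FPS gain over the TXAA baseline for the forest scene.
# FPS values are approximate averages, so the rounded percentages can
# land within a point of the figures quoted above.
def gain_pct(baseline_fps: float, dlss_fps: float) -> int:
    """Percentage gain of dlss_fps over baseline_fps, rounded to an int."""
    return round((dlss_fps / baseline_fps - 1) * 100)

baseline = 35
for mode, fps in [("Quality", 56), ("Balanced", 65),
                  ("Performance", 75), ("Ultra Performance", 98)]:
    print(f"DLSS {mode}: ~{fps} FPS (+{gain_pct(baseline, fps)}%)")
```

The same formula reproduces the subway-scene figures as well, e.g. 69 FPS over a 38 FPS baseline comes out at +82%.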

Subway Train (RTX On 2560×1024)​

Subway RTX On (Note: Downscaled JPG from 1440p source)
The camera used a cinematic aspect ratio, hence the 1024-pixel height. This scene used similar RTX settings, with ray-traced ambient occlusion on top.

  • TXAA Baseline ~38 FPS
  • DLSS Quality ~69 FPS (+82%)
  • DLSS Performance ~100 FPS (+163%)

Subway Train (RTX Off 2560×1024)​

Subway non-RTX (Note: Downscaled JPG from 1440p source)
Without any further RT options enabled, the difference in performance between internal resolutions appears to diminish somewhat. This was just a single test, though, and your mileage may vary (as with all performance metrics; GPUs are a complicated beast).

  • TXAA Baseline ~99 FPS
  • DLSS Quality ~158 FPS (+60%)
  • DLSS Performance ~164 FPS (+65%)

DLSS Render Pass​

The DLSS pass occurs during post-processing, much like traditional AA solutions. On my Nvidia RTX 2080 Ti, the cost to upscale to 1440p was about 0.8-1.2 ms. This number seems to be consistent regardless of DLSS mode. For reference, TXAA at full 1440p costs about 0.22 ms on my machine.
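Since that roughly fixed ~1 ms upscale cost is paid out of the frame budget, a quick frame-time sketch (my own arithmetic, not profiler output, using the approximate forest-scene framerates from above) shows why it is still a big net win when rendering itself dominates:

```python
# Frame-time arithmetic: DLSS adds a roughly fixed ~1 ms pass, but the
# rest of the frame shrinks because far fewer pixels are shaded.
# Forest scene: ~35 FPS at TXAA baseline, ~75 FPS in Performance mode.
def fps_to_ms(fps: float) -> float:
    """Convert frames per second to milliseconds per frame."""
    return 1000.0 / fps

baseline_ms = fps_to_ms(35)   # ~28.6 ms per frame with TXAA
dlss_ms     = fps_to_ms(75)   # ~13.3 ms per frame, *including* DLSS

upscale_cost = 1.0            # ~0.8-1.2 ms measured on the RTX 2080 Ti
render_ms = dlss_ms - upscale_cost   # time spent on everything else

print(f"baseline frame: {baseline_ms:.1f} ms")
print(f"DLSS frame:     {dlss_ms:.1f} ms ({upscale_cost:.1f} ms upscaling)")
print(f"net saving:     {baseline_ms - dlss_ms:.1f} ms per frame")
```

In other words, the ~1 ms spent upscaling buys back roughly 15 ms of rendering in this scene; the trade only stops paying off when the frame is already so cheap that the fixed pass becomes a meaningful fraction of it, which matches the smaller gains seen with RTX off.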

ue4_profilegpu_dlss-1.jpg
ProfileGPU Output Window.
You can measure the performance of individual render passes by using either the “ProfileGPU” or “stat GPU” console command.

Performance Conclusions​

The performance potential of DLSS is huge, depending on your game's rendering bottleneck. Reducing the internal resolution by 50% has an especially large benefit in RTX-enabled games, as ray-tracing cost is highly dependent on resolution.

Ray-tracing features appear to work better with DLSS enabled from a visual standpoint too. RTAO (ray-traced ambient occlusion) specifically causes wavy patterns, almost like a scrolling water texture, when combined with TXAA. However, enabling DLSS above Performance mode completely eliminates these issues and provides stable ambient occlusion.

Conclusion​

Throughout my experiments, I’ve been super impressed with the results of DLSS 2.0. The fact that this magically works on any game out of the box without Nvidia SuperComputer pre-processing is impressive.

The image quality remains high and sometimes even manages to be crisper than TXAA. I ran into some artifacts, but they were small compared to the large performance gains we saw across the board. I'd love to test this on a 4K display, which is where DLSS can shine even more.

For now though the DLSS Branch of Unreal Engine isn’t widely accessible and you’ll need to contact Nvidia to get access.



mxbison

Member
DLSS is amazing.

I'm playing Control on 4K DLSS with ultra details and max Ray Tracing on a mid price range GPU (RTX 3070).
Cyberpunk on 1440p DLSS with ultra details and ultra Ray Tracing, which looks absolutely nuts.

Massive performance increase for a barely noticeable quality decrease. I'll take that any day.
 