
Nvidia introduces DLDSR: AI-powered DSR driven by Tensor cores.

Braag

Member
I have a 1440p monitor, so I don't use DSR much, as I would need to upscale to 2160p (4K) in order to get a smooth picture. This could come in handy in some titles though.
 

GymWolf

Member
That's the point of the "DL" part. If your GPU has Tensor cores, they handle the upscaling load. Basically, you maintain whatever FPS you can get running at native, but with improved visual fidelity. It probably won't matter much if you can't sustain native resolution, though.
This is my problem, I can't play at 4K because I can't do 4K60 in the majority of games, so I just settle for 1440p.

This thing is only useful for 4K users who have power to spare with a 3080/3090, not for people with a mid-tier GPU like me.



I'm currently playing Quantum Break and man does the image look washed out. Never seen anything like it before.
I'm sorry, what?
 

amigastar

Member
This is my problem, I can't play at 4K because I can't do 4K60 in the majority of games, so I just settle for 1440p.

This thing is only useful for 4K users who have power to spare with a 3080/3090, not for people with a mid-tier GPU like me.




I'm sorry, what?
I mean that if you really want to experience a washed-out screen, try Quantum Break.
 

Kenpachii

Member
Sorry if I get straight to the point, but how is DLDSR useful for people like me with a 4K display but a 1440p GPU like a 2070 Super??

I can't sustain 4K, so downscaling from an even higher resolution would be pointless because I just don't have enough horsepower, right?!

Sorry for dumbing down the discussion too much.

DLSS is useless (well, not really, as it still improves image quality) in games where you are CPU bound. In those cases this can be a useful solution.

DLDSR is basically a higher visual preset of DLSS, but it no longer gives you any performance gains.

So if you need performance you go for DLSS; if you don't need GPU performance you go for DLDSR.
 

buenoblue

Member
This is my problem, I can't play at 4K because I can't do 4K60 in the majority of games, so I just settle for 1440p.

This thing is only useful for 4K users who have power to spare with a 3080/3090, not for people with a mid-tier GPU like me.




I'm sorry, what?

I also have a 2070 Super and, like you, play at 1440p on a 4K screen. I think what people are trying to say is that with DLDSR we will be able to get the same performance we do at 1440p while downscaling from a higher res. Maybe not from 4K, but 1800p sounds reasonable.

So set your desktop res to 1440p, use DLDSR to downscale from 1800p, and get the same framerate. Whether this actually looks better remains to be seen, but in theory it should have less aliasing at least.
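
To put rough numbers on that (my own back-of-envelope math, not anything from Nvidia): rendering 1800p is about 1.56x the pixel work of 1440p, so the tensor-core downscale would need to be nearly free for the framerate to hold.

```python
# Back-of-envelope pixel counts for the resolutions discussed above.
# Illustrative only: real GPU cost doesn't scale perfectly with pixel
# count, and DLDSR's own filtering overhead isn't modeled here.

resolutions = {
    "1440p": (2560, 1440),
    "1800p": (3200, 1800),
    "4K":    (3840, 2160),
}

base = 2560 * 1440  # native 1440p as the baseline

for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name}: {px / 1e6:.2f} MPix, {px / base:.2f}x the pixels of 1440p")

# 1440p: 3.69 MPix, 1.00x the pixels of 1440p
# 1800p: 5.76 MPix, 1.56x the pixels of 1440p
# 4K:    8.29 MPix, 2.25x the pixels of 1440p
```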
 

GymWolf

Member
DLDSR is basically a higher visual preset of DLSS, but it no longer gives you any performance gains.

So if you need performance you go for DLSS; if you don't need GPU performance you go for DLDSR.
But I need the power to go up to 4K to use this thing to downscale from even higher resolutions (because I have a 4K panel, not a 1440p one), and I lack the power to go up to 4K to begin with...

With DLSS I can just play at fake 4K/1800p with the performance hit of playing at 1440p, which is the most common res I can use with my 2070 Super to achieve a steady 60 FPS in games that don't support DLSS.

So in my case, DLDSR is kinda useless until I upgrade my GPU to have power to spare at 4K.
 

Kenpachii

Member
But I need the power to go up to 4K to use this thing to downscale from even higher resolutions (because I have a 4K panel, not a 1440p one), and I lack the power to go up to 4K to begin with...

With DLSS I can just play at fake 4K/1800p with the performance hit of playing at 1440p, which is the most common res I can use with my 2070 Super to achieve a steady 60 FPS in games that don't support DLSS.

So in my case, DLDSR is kinda useless until I upgrade my GPU to have power to spare at 4K.

Depends on what games you play. If you play lots of RTS/city builders, this is great even for lower-end cards. If you only play GPU-limited games, then yeah, it's useless. Unless we can use it as DLSS at the driver level, which would mean you could push it into all games that don't officially support DLSS, but that has yet to be seen.
 

GymWolf

Member
I also have a 2070 Super and, like you, play at 1440p on a 4K screen. I think what people are trying to say is that with DLDSR we will be able to get the same performance we do at 1440p while downscaling from a higher res. Maybe not from 4K, but 1800p sounds reasonable.

So set your desktop res to 1440p, use DLDSR to downscale from 1800p, and get the same framerate. Whether this actually looks better remains to be seen, but in theory it should have less aliasing at least.
Another member told me that those are too many steps and that it could result in no IQ gains.

I specifically asked about what you're describing now, because it was the only way for people like us to gain something from this tech.

What I asked:

Wait a moment, if I set my TV resolution to 1440p and then upscale to 4K and then return to 1440p with DLDSR, do I get a 1440p image that looks better than native 1440p??

What I was told:

That's an awfully convoluted way to make an image look good.
You'd be better off just using DLSS from 1440p to 4K.
Going down to go up to go back down doesn't make much sense and likely won't be worth the hassle.
 

GymWolf

Member
Depends on what games you play. If you play lots of RTS/city builders, this is great even for lower-end cards. If you only play GPU-limited games, then yeah, it's useless. Unless we can use it as DLSS at the driver level, which would mean you could push it into all games that don't officially support DLSS, but that has yet to be seen.
Yeah, I play mostly GPU-limited games.
 
Holy shit - if this is the real deal it would add another major way to improve the image of every game at almost no cost. DSR is already one of the best ways to make an image look sharper and cleaner - now you could do it without the massive performance hit. And combined with DLSS the total performance gains would be huge.
 

RoadHazard

Gold Member
I don't understand how this is different from using DLSS to generate a higher resolution image than your display can support and then downscaling that. Can't DLSS already do this?
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
This is like magic! It also adds self-shadowing too?
The GI component correctly lights things.
The AO component correctly shadows things.

Assuming the objects in screen space have actual geometry and depth information, then yes, it will add "self shadowing" to everything.
It's actually ambient occlusion, not really self-shadowing, but same shit really, no?
Wait, do I have to install GeForce Experience to use that GI Reshade thingy?!
You can decide whether you want to use Reshade or GFE to activate SSRTGI.
SSRTGI piggybacks off Reshade, and with this new driver it'll be able to piggyback off GFE.
Using GFE is the painless solution, as Nvidia is basically doing all the heavy lifting for you.
Installing Reshade in every folder of every game you want to activate this in is a hassle.

Heck, ever since Freestyle was introduced I haven't actually used Reshade except for RTGI; all the other color effects and sharpening I can do directly through GFE.
P.S. Freestyle is Nvidia's name for all the effects that GFE can add to games.

[Image: NVIDIA FreeStyle at CES 2018]
 
I don't understand how this is different from using DLSS to generate a higher resolution image than your display can support and then downscaling that. Can't DLSS already do this?
Yes, but DLSS requires implementation on a per-game basis, and is more flexible. This is a driver-level upscaler that doesn't do as good a job, but since the output gets downsampled anyway, it doesn't matter.
 

bargeparty

Member
I don't understand how this is different from using DLSS to generate a higher resolution image than your display can support and then downscaling that. Can't DLSS already do this?

DLSS upscales from a lower resolution, doesn't it? DSR is supersampling down to a lower resolution, e.g. running 1440p on a 1080p monitor. It creates a sharper image.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
I don't understand how this is different from using DLSS to generate a higher resolution image than your display can support and then downscaling that. Can't DLSS already do this?

It can/could, but you had to "trick" DLSS into thinking your display was a higher resolution than it was.
DLSS will always upscale from a resolution lower than your display resolution.
So say your panel is 4K... DLSS will always choose a resolution lower than 4K to sample from.
But if you tricked DLSS into thinking you had a 6K panel, you could choose 4K as the resolution to sample from and go higher.
This will sample from a resolution higher than your display resolution without needing the tricks.
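
A quick sketch of the resolution math being described, for anyone following along. The per-axis scales are DLSS 2.x's published quality modes; treating "6K" as 5760x3240 (1.5x a 4K panel per axis) is my own illustrative assumption.

```python
# Sketch of the "fake 6K panel" trick described above. DLSS_SCALES holds
# DLSS 2.x's published per-axis render scales; "6K" = 5760x3240 is an
# assumption for illustration, not a spec.

DLSS_SCALES = {
    "Quality":     2 / 3,
    "Balanced":    0.58,
    "Performance": 0.50,
    "Ultra Perf":  1 / 3,
}

def dlss_render_res(out_w, out_h, mode):
    """Internal resolution DLSS samples from for a given output target."""
    s = DLSS_SCALES[mode]
    return round(out_w * s), round(out_h * s)

# Normal case: on a 4K panel, DLSS always samples from below 4K.
print(dlss_render_res(3840, 2160, "Quality"))  # (2560, 1440)

# The trick: tell DLSS the panel is "6K" and Quality mode now samples
# from a full native 4K image.
print(dlss_render_res(5760, 3240, "Quality"))  # (3840, 2160)
```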
 

RoadHazard

Gold Member
DLSS upscales from a lower resolution, doesn't it? DSR is supersampling down to a lower resolution, e.g. running 1440p on a 1080p monitor. It creates a sharper image.

I understood it as first doing AI upscaling from a lower resolution (which is what DLSS does) and then downsampling. Just rendering at a higher resolution is nothing new, that's regular old supersampling.
 

RoadHazard

Gold Member
It can/could, but you had to "trick" DLSS into thinking your display was a higher resolution than it was.
DLSS will always upscale from a resolution lower than your display resolution.
So say your panel is 4K... DLSS will always choose a resolution lower than 4K to sample from.
But if you tricked DLSS into thinking you had a 6K panel, you could choose 4K as the resolution to sample from and go higher.
This will sample from a resolution higher than your display resolution without needing the tricks.

If that's all it is, it sounds like something DLSS could easily do. No reason it should be limited to your display resolution.
 
But it already uses DLSS for DSR... It's essentially the same tech; you're going from, let's say, 1440p -> 4K, but on an actual 1440p display instead of a 4K one.
You can already combine DLSS (upscaling) and DSR (downscaling). As I understand it, now you'll be able to combine them and not take the massive performance hit that comes with DSR. Of course, you can just use the new DSR tech without DLSS as well and still get all the advantages of downscaling without all the cost.

If true, this is basically like a massive performance-to-image quality upgrade for everyone, free of charge. We'll have to see, though. Sounds a little too good to be true.
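
For a concrete (hypothetical) walk-through of that combo on a 1440p display, using DLDSR's announced 2.25x factor and DLSS Quality's ~2/3 per-axis scale:

```python
# Hypothetical DLSS + DLDSR combo on a 1440p display. The 2.25x area
# factor is from Nvidia's DLDSR announcement; the ~2/3 per-axis scale is
# DLSS Quality mode. Exact behavior per game may differ.

display = (2560, 1440)

# DLDSR 2.25x = 1.5x per axis, so the game now targets a 4K frame.
dldsr_target = (int(display[0] * 1.5), int(display[1] * 1.5))  # (3840, 2160)

# DLSS Quality renders internally at ~2/3 of that target per axis...
internal = (round(dldsr_target[0] * 2 / 3), round(dldsr_target[1] * 2 / 3))

print(internal)  # (2560, 1440)
# ...so you shade roughly a native-1440p pixel count, DLSS reconstructs a
# 4K frame, and DLDSR filters it back down to the 1440p panel.
```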
 

Mister Wolf

Member
The GI component correctly lights things.
The AO component correctly shadows things.

Assuming the objects in screen space have actual geometry and depth information, then yes, it will add "self shadowing" to everything.
It's actually ambient occlusion, not really self-shadowing, but same shit really, no?

You can decide whether you want to use Reshade or GFE to activate SSRTGI.
SSRTGI piggybacks off Reshade, and with this new driver it'll be able to piggyback off GFE.
Using GFE is the painless solution, as Nvidia is basically doing all the heavy lifting for you.
Installing Reshade in every folder of every game you want to activate this in is a hassle.

Heck, ever since Freestyle was introduced I haven't actually used Reshade except for RTGI; all the other color effects and sharpening I can do directly through GFE.
P.S. Freestyle is Nvidia's name for all the effects that GFE can add to games.

[Image: NVIDIA FreeStyle at CES 2018]

So we should be disabling our in-game ambient occlusion if we want to use this?
The first game I will be using this with is Monster Hunter Rise. I also want to test it with GOW, to see if the SSRTGI is better than the PC version's GTAO paired with SSDO.
 

bargeparty

Member
I understood it as first doing AI upscaling from a lower resolution (which is what DLSS does) and then downsampling. Just rendering at a higher resolution is nothing new, that's regular old supersampling.

Honestly I haven't had the time to read into it, but just as with upscaling techniques prior to DLSS, the added AI processing of DLSS drastically improved performance and, over time, image quality as well, so applying AI processing to supersampling should provide similar benefits. Downsampling is very costly.
 

WitchHunter

Member
Another acronym, oh for fuck's sake... ABCD -> CDXA -> WTFA -> CCDDxa. If you enable all of these you'll mine digital money, play a game in 1080p retroretarded from 4K with 96x antialiasing, and it also makes coffee.
 
let's show off features that you can't use, because no one can buy a GPU unless you are a miner.
That's how I feel when all these threads about the PS5/XSX show up. Can't get a PS5 to save my life. Been trying forever... yet I own a 3060 Ti. Had a 2060 before the shortage started. Thank you, EVGA Step-Up.

Yeah, miners and scalpers suck!!! These companies need to do more to get product at MSRP into legit gamers' hands.

As for this stuff, I will try it out, although I will probably just use my native 1080p. Trying to preserve my 3060 Ti, as replacements aren't easy to get and they aren't cheap.
 
That's how I feel when all these threads about the PS5/XSX show up. Can't get a PS5 to save my life. Been trying forever... yet I own a 3060 Ti. Had a 2060 before the shortage started. Thank you, EVGA Step-Up.

Yeah, miners and scalpers suck!!! These companies need to do more to get product at MSRP into legit gamers' hands.

As for this stuff, I will try it out, although I will probably just use my native 1080p. Trying to preserve my 3060 Ti, as replacements aren't easy to get and they aren't cheap.
Neither Nvidia nor AMD has any incentive to sell to the average gamer or everyday consumer when miners are buying cards by the pallet. There's a reason the newly announced 12GB 3080 has no MSRP.
 

amigastar

Member
Will try it out on the new Gothic 2 mod called "Chronicles of Myrtana Archolos" when the driver comes out. Should work nicely.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
If that's all it is, it sounds like something DLSS could easily do. No reason it should be limited to your display resolution.
Basically that's what they have done.
DLSS's original pitch, as the name suggests, was always Deep Learning Super Sampling.
So we should be disabling our in-game ambient occlusion if we want to use this?
The first game I will be using this with is Monster Hunter Rise. I also want to test it with GOW, to see if the SSRTGI is better than the PC version's GTAO paired with SSDO.
Depends on what kind of AO the game has.
Ground Truth Ambient Occlusion in theory should match or be very, very close to ray-traced ambient occlusion, but if your game has not-so-great AO you could either add this AO on top or switch the game's AO off, depending on which image you think looks better.
SSDO should be no match for SSRTGI though; I think Nvidia specifically chose Prey to show this off because Prey uses SSDO.
The joy of Nvidia Freestyle, and PC gaming in general, is that you can mix and match whatever settings and features you want till you get the image and performance you are happy with.
 

Kenpachii

Member
DLDSR = downsampling, running a higher resolution than native (focused on higher image quality).
DLSS = upscaling, running a lower resolution than native (focused on more performance).

If you run games at a lower resolution than native, DLDSR isn't much use, because you either go native or use DLSS.

Maybe that makes it easier to understand for some people.

Another acronym, oh for fuck's sake... ABCD -> CDXA -> WTFA -> CCDDxa. If you enable all of these you'll mine digital money, play a game in 1080p retroretarded from 4K with 96x antialiasing, and it also makes coffee.

DSR is well known; they could, however, have called it AI DSR. But they probably want to keep the naming closer to DLSS, as that's a well-known solution by now. Maybe.

Anyway, hope it releases soon; I want to play AC with it when the next expansion comes out.
 

buenoblue

Member
Another member told me that those are too many steps and that it could result in no IQ gains.

I specifically asked about what you're describing now, because it was the only way for people like us to gain something from this tech.

What I asked:

Wait a moment, if I set my TV resolution to 1440p and then upscale to 4K and then return to 1440p with DLDSR, do I get a 1440p image that looks better than native 1440p??

What I was told:

That's an awfully convoluted way to make an image look good.
You'd be better off just using DLSS from 1440p to 4K.
Going down to go up to go back down doesn't make much sense and likely won't be worth the hassle.

If DLSS is an option we should always use that. If this is driver-level and works on any game, then there's no harm in trying it on games that don't support DLSS.

I've gone through some bizarre hoops to get what I perceive as better image quality lol. I recently played FF7 Remake on PS5 and found the image really blurry, so I set my PS5 to 1080p and downscaled the image. My 4K TV seems to apply a sharpening filter to 1080p content, so even though technically the image is worse, it actually looks better to my eyes.
 
That's how I feel when all these threads about the PS5/XSX show up. Can't get a PS5 to save my life. Been trying forever... yet I own a 3060 Ti. Had a 2060 before the shortage started. Thank you, EVGA Step-Up.

Yeah, miners and scalpers suck!!! These companies need to do more to get product at MSRP into legit gamers' hands.

As for this stuff, I will try it out, although I will probably just use my native 1080p. Trying to preserve my 3060 Ti, as replacements aren't easy to get and they aren't cheap.
I don't understand "preserve your 3060 Ti".
It is the complete opposite of DLSS.

DLSS = Render low resolution and AI upscale.
DLDSR = Render high resolution and AI downscale.

Two extreme poles.
There must be something else going on, because it shows no FPS loss from native.
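
One hedged reading of that: the headline comparison in Nvidia's announcement was against legacy 4x DSR, and 2.25x DLDSR simply renders far fewer pixels for (claimed) similar quality. Rough arithmetic, with my own numbers:

```python
# Pixel-count comparison behind the "almost no FPS loss" claim. DSR/DLDSR
# factors are area multipliers, so the per-axis scale is the square root.
# This explains the gap vs. 4x DSR; a literally-zero cost vs. native would
# still need the scene to be CPU- rather than GPU-limited.

native = (1920, 1080)

def scaled(res, area_factor):
    f = area_factor ** 0.5
    return round(res[0] * f), round(res[1] * f)

for label, factor in [("native", 1.0), ("DLDSR 2.25x", 2.25), ("DSR 4x", 4.0)]:
    w, h = scaled(native, factor)
    print(f"{label:12s} {w}x{h}  ({w * h / 1e6:.2f} MPix)")

# native       1920x1080  (2.07 MPix)
# DLDSR 2.25x  2880x1620  (4.67 MPix)
# DSR 4x       3840x2160  (8.29 MPix)
```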
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
That's how I feel when all these threads about the PS5/XSX show up. Can't get a PS5 to save my life. Been trying forever... yet I own a 3060 Ti. Had a 2060 before the shortage started. Thank you, EVGA Step-Up.

Yeah, miners and scalpers suck!!! These companies need to do more to get product at MSRP into legit gamers' hands.

As for this stuff, I will try it out, although I will probably just use my native 1080p. Trying to preserve my 3060 Ti, as replacements aren't easy to get and they aren't cheap.
How are you going to preserve your 3060 Ti... are you putting it in amber?
 

TonyK

Member
That's the SSRTGI and AO.
And yes, that's how light works... if there is no light bouncing off or directly hitting an object, why would it be lit?

I know video games have trained you to believe objects just get lit by magic... but that's not how the world works.
The occluded areas are more realistic and actually ground the objects in the world.
Without that occlusion the objects look like they are floating in space.

Look at the TV or bookshelves in the background... what's lighting them up?
Even the gold circle on the ground... look how much more detail you are getting from it simply because it has more occlusion on it.
The side NOT facing the lights shouldn't be that bright.
In fact no, sorry. In the real world that scene would not have those dark areas, because light bounces many times until the bounce intensity is too low. With a white floor illuminated from above, like in that screenshot, shadows that should receive bounced light from the floor would never be so black. In general, the whole image looks fake because shadows are not affected by surrounding illuminated areas.

That said, I also prefer the second image, because even though it's inaccurate, it looks better to my eyes than the first one.
 

LiquidMetal14

hide your water-based mammals
God bless America (and your home country).

This is the kind of advancement we need to see more of given the GPU climate. Pretty much free performance.

I like it coming from all sides of the competition pendulum.

Remember the times we are living in and realize we haven't had new development techniques and advancements that impact our everyday performance at no extra cost like this before.

I'll be glad to praise anything, whether it's an open standard or something like DLSS that is built with tensor cores in mind.

We are going in the right direction, and this is all organically maturing right now.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
In fact no, sorry. In the real world that scene would not have those dark areas, because light bounces many times until the bounce intensity is too low. With a white floor illuminated from above, like in that screenshot, shadows that should receive bounced light from the floor would never be so black. In general, the whole image looks fake because shadows are not affected by surrounding illuminated areas.

That said, I also prefer the second image, because even though it's inaccurate, it looks better to my eyes than the first one.
Mate, I could take this scene into Octane and it would look much more like the second image than the first.
Light doesn't bounce infinitely in any scenario unless you have a room of perfect mirrors, so even with the floor being a light color, the intensity of the light actually hitting it dictates how much light bounces off.
The occluded areas, say on the rafters in the library, would indeed be shadowed... would they be exactly that dark? Obviously not; this is a screen-space solution and doesn't even use tensor cores, so it isn't going to be offline levels of accurate, but it's way, way more accurate than everything being lit by magic lights... even fully path-traced GI doesn't work like that.
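
To put a toy number on the bounce argument (the albedo value here is my assumption, purely illustrative): each bounce only reflects a fraction of the incoming energy, so contributions shrink geometrically and deeply occluded spots stay visibly dark.

```python
# Toy geometric-decay model of bounce lighting: each bounce reflects only
# `albedo` of the incoming energy. The 0.7 albedo for a bright floor is an
# assumption for illustration; real materials are usually darker.

albedo = 0.7
energy = 1.0
for bounce in range(1, 7):
    energy *= albedo
    print(f"bounce {bounce}: {energy:.3f} of the direct light")

# bounce 1: 0.700 ... bounce 6: 0.118 -- and that's before geometric
# falloff and occlusion, so corners that only see high-order bounces end
# up far darker than directly lit surfaces.
```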
 