
NVIDIA announces DLSS 3.5 with Ray Reconstruction, launches this fall.

LiquidMetal14

hide your water-based mammals
Hell yes son!

As was said already, say what you want about the pricing (the sons of bitches), but they're great at putting the software side to work and using their hardware in ways nobody else is competing with at their level.
 

Buggy Loop

Member
I wonder how many academic papers Nvidia contributes to computer science and software journals per year. They have really smart people working for them.

A lot

They’re collaborating with every major university; Nvidia won’t be directly referenced, but authors working for them are. This is a space AMD has totally neglected.

Following the paper trail, the next tech we’ll likely see implemented is NRC (Neural Radiance Cache), an improvement over ReSTIR that was rumoured to be in development for Cyberpunk. It’s not only faster (less denoising work) than the ReSTIR used in Overdrive, but it also doesn’t have its flaws. It uses the tensor cores with low compute cost and far fewer memory accesses.
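To give a rough feel for why a radiance cache cuts work, here's a toy Python sketch (my own made-up illustration with stub numbers, not NVIDIA's actual NRC): instead of tracing every path out to full depth, the tracer stops after a couple of bounces and asks a small learned cache for the rest of the incoming light.

```python
# Toy illustration of the radiance-cache idea (not NVIDIA's implementation):
# terminate paths early and let a learned cache predict the remaining light
# transport, instead of tracing (and later denoising) long, noisy paths.
import random

FULL_DEPTH = 8      # bounces a brute-force path tracer would trace
CACHE_DEPTH = 2     # bounces traced before falling back to the cache

def trace_bounce(throughput):
    """Stub for one bounce: returns (emitted_light, new_throughput)."""
    emitted = random.uniform(0.0, 0.1) * throughput
    return emitted, throughput * 0.7          # fake surface albedo

def cache_query(throughput):
    """Stub for the trained cache: predicts the radiance still to come.
    In the real technique this is a tiny network evaluated on tensor cores."""
    return 0.23 * throughput                  # made-up learned estimate

def shade(depth_limit, use_cache):
    radiance, throughput = 0.0, 1.0
    for _ in range(depth_limit):
        emitted, throughput = trace_bounce(throughput)
        radiance += emitted
    if use_cache:
        radiance += cache_query(throughput)   # cheap lookup replaces bounces
    return radiance

brute = sum(shade(FULL_DEPTH, False) for _ in range(1000)) / 1000
cached = sum(shade(CACHE_DEPTH, True) for _ in range(1000)) / 1000
print(f"brute-force estimate: {brute:.3f}, cached estimate: {cached:.3f}")
```

Fewer traced bounces means fewer incoherent memory accesses and less residual noise left for the denoiser, which is where the claimed speedup would come from.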


Nvidia is really out there in the field
 
Yes, yes, let's give devs even more excuses not to optimize their games.

Fuck me... It's okay to prefer one graphics card manufacturer over the other, and Nvidia provides a ton of benefits over AMD for sure, but celebrating tech that basically amounts to a band-aid on an industry-wide problem is a bit "eh", to say the least lol.

That said, I guess this is the new reality of videogames, where the tech people behind them aren't getting any better at optimization because they're not working under the constraints that were present in the past.
Why do you consider it a band-aid and a crutch for developers?

This tech can take a game that already performs perfectly fine (60 fps), let you max out all the bells and whistles, and take advantage of a 4K 120 Hz display.

It's like being mad a car has a turbo...
If ThE CaR MaNuFaCtUreRs BuIlT BeTtEr EnGiNeS..

Some people are strange.
 

lh032

I cry about Xbox and hate PlayStation.
Meh, all this tech is useless if the majority of devs don't utilize it or optimize their games properly.
And Nvidia is probably going to charge even more for their GPUs.
 
Because your logic is that more tools allow developers to be lazy and not have to maximize their optimization...

Couldn't a lazy dev just say F it and let newer, more powerful cards brute-force their code?

The premise of your logic is idiotic; devs are not inherently lazy. Some are, but most are not.
 

ahtlas7

Member
My 30 series would like to know why it wasn’t invited to the party.
 

Dream-Knife

Banned
This seems really neat, but I know this will mostly get used to pump out games with horrible performance.

Of course, this may end 9th-gen consoles quicker than expected.
 

VN1X

Banned
Because your logic is that more tools allow developers to be lazy and not have to maximize their optimization...

Couldn't a lazy dev just say F it and let newer, more powerful cards brute-force their code?

The premise of your logic is idiotic; devs are not inherently lazy. Some are, but most are not.
"Idiotic" okay I guess we're off to a great start with this discussion.

Anyway, just to humour you: we're now getting games where the developers have outright come out and said "yes we developed this with upscaling in mind" (Remnant 2, Immortals are two recent examples). Obviously it's great tech, interesting in a number of ways, however this shouldn't be a crutch for developers to lean on when the games themselves don't look a hair better as to what came out an entire generation ago.

Again, it's great that companies like Nvidia, AMD and others are exploring and innovating in that area but when videogame devs are leaning on it as a shortcut (upscaling in this example) then we as consumers are getting worse products as a result, not better ones.
 

Buggy Loop

Member
Yes, yes, let's give devs even more excuses not to optimize their games.

Fuck me... It's okay to prefer one graphics card manufacturer over the other, and Nvidia provides a ton of benefits over AMD for sure, but celebrating tech that basically amounts to a band-aid on an industry-wide problem is a bit "eh", to say the least lol.

That said, I guess this is the new reality of videogames, where the tech people behind them aren't getting any better at optimization because they're not working under the constraints that were present in the past.

This particular issue is out of developers' control. The DX12 API has very basic function calls; how the GPU compiles the data and orders it with its scheduler is on the shoulders of the GPU vendors. Nvidia made an optimization in that area, and there's no way for devs to optimize at that level.

Just like AMD's compiler decided to produce a ridiculous 99 ms ubershader in Portal RTX when the game doesn't call for it.

Everyone thought Nvidia was sabotaging performance on AMD cards in Portal RTX; turns out AMD's compiler was drunk.

So devs don't have all the parameters in their hands.

But I'm not excusing them for the laziness I've already criticized them for in the past.

As for DLSS, yeah, as we saw with Remnant 2, basing a game's performance targets on upscaling rather than native performance is a bad idea.
 

Kataploom

Gold Member
Ok, this seemed cool at first, but some questions:
- Does it have to be implemented by devs manually, per game? If so, it's nothing to set the world on fire for a couple of years at least.

- Does it require the game to be using fake frames (frame generation)? If so, it's useless to me.

The tech looks cool, but if the above is true, it changes nothing for me or the average gamer; it'd be not just as niche as RT, but even more so.
 

Ev1L AuRoN

Member
It would be cool to get the ray tracing reconstruction separate from the DLSS 3 frame generation bullshit, but presumably they're doing this so they can pseudo-force consumers to buy Lovelace GPUs over Ampere.

Slightly unrelated, but why the hell does AMD open source its stuff? It never really benefits them. If (big if) FSR somehow surpasses DLSS in the future in terms of quality and AMD keeps it open source, they'll have no software advantage over Nvidia, so the default option (assuming pricing is not an issue) would be to buy Nvidia (since you'd get access to their closed stuff plus AMD-developed open source stuff).
It's open source, so developers will use it. AMD has such a tiny market share that their tech won't get any traction if it's made exclusive to their GPUs. Intel was clever about it: XeSS is open source, but it has better quality/performance on their GPUs thanks to the silicon features their architecture has.
 
"Idiotic" okay I guess we're off to a great start with this discussion.

Anyway, just to humour you: we're now getting games where the developers have outright come out and said "yes we developed this with upscaling in mind" (Remnant 2, Immortals are two recent examples). Obviously it's great tech, interesting in a number of ways, however this shouldn't be a crutch for developers to lean on when the games themselves don't look a hair better as to what came out an entire generation ago.

Again, it's great that companies like Nvidia, AMD and others are exploring and innovating in that area but when videogame devs are leaning on it as a shortcut (upscaling in this example) then we as consumers are getting worse products as a result, not better ones.
Idiotic was appropriate given the generalization you made. You then followed up by providing 2 instances where you may be correct, out of how many developers? Grow some thicker skin, my dude; if you're going to generalize, expect your opinion to be criticized.

On the topic at hand

Can this tech allow for developers to cut corners? Yes

Can this tech allow for developers to push their games even further? Yes

Some people see the glass half empty, some see it half full.
 

VN1X

Banned
Idiotic was appropriate given the generalization you made. You then followed up by providing 2 instances where you may be correct, out of how many developers? Grow some thicker skin, my dude; if you're going to generalize, expect your opinion to be criticized.

On the topic at hand

Can this tech allow for developers to cut corners? Yes

Can this tech allow for developers to push their games even further? Yes

Some people see the glass half empty, some see it half full.
I need to grow thicker skin but you're the one getting emotional over a graphics discussion and throwing insults around.

 

smbu2000

Member
No, it will work on all RTX cards, as it just improves regular DLSS 2.

Nvidia's naming scheme is dumb and it shows; they should've named frame gen separately from DLSS.
Well, then their naming scheme is pretty dumb. From what I'd seen, DLSS 3 was exclusive to 4000-series cards, and older 2000/3000-series cards were still using DLSS 2 tech.
The increased performance they've shown has been DLSS 3 with frame gen vs DLSS 3 with frame gen plus ray reconstruction. I wonder what kind of performance increase there is without frame gen included?

My desktop has a 4000 series card, but my gaming laptop has a 3000 series card.
 
I need to grow thicker skin but you're the one getting emotional over a graphics discussion and throwing insults around.

Idiotic is not an insult, it's an observation... If you took it as an insult, then I suppose you do indeed need to grow thicker skin.

Can you reply to the actual content of my posts instead of focusing on the one word you took offense to, please?
 

Gaiff

SBI’s Resident Gaslighter
I bet the timing of them coming out with this is due to the rumblings of FSR 3 around the corner...
Lol, yeah. NVIDIA is trembling in their boots because of a feature that was announced a year ago and is still nowhere to be found.

Any moment now...
 

hlm666

Member
Maybe they should start inventing reconstruction techniques for the lack of VRAM on their GPUs.
You could probably drop down a quality level in DLSS 3.5, get similar image quality to the higher DLSS 3 level, and get more performance and use less VRAM if that's an issue. The real issue going forward is going to be apples-to-apples benchmarks, since DLSS 3.5 could look like it's running at higher-detail RT settings if they're just matching base resolutions.

Then in the future, maybe DLSS 4 will use that AI texture reconstruction someone posted here a few months ago: keep lower-res, smaller textures in VRAM, then hit them with AI upscaling during the DLSS pass.
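To put rough numbers on the "drop a quality level" point: using the commonly documented per-axis DLSS render-scale factors (Quality ≈ 0.667, Balanced ≈ 0.58, Performance ≈ 0.5, Ultra Performance ≈ 0.333), a quick back-of-the-envelope Python sketch gives the internal resolutions at 4K output. Games can override these values, so treat it as an estimate, not an official tool.

```python
# Back-of-the-envelope internal render resolutions for the usual DLSS presets.
# Scale factors are the commonly documented per-axis values; individual games
# may override them, so this is only an estimate.
PRESETS = {
    "Quality":           0.667,
    "Balanced":          0.580,
    "Performance":       0.500,
    "Ultra Performance": 0.333,
}

def internal_resolution(out_w, out_h, preset):
    scale = PRESETS[preset]
    return round(out_w * scale), round(out_h * scale)

out_w, out_h = 3840, 2160  # 4K output
for name in PRESETS:
    w, h = internal_resolution(out_w, out_h, name)
    share = (w * h) / (out_w * out_h) * 100
    print(f"{name:<18} renders at {w}x{h}  (~{share:.0f}% of the output pixels)")
```

Going from Quality to Balanced shades roughly a quarter fewer pixels, which is where the extra performance and the bit of VRAM headroom would come from.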
 

mhirano

Member
Cool, DLSS is awesome. I feel bad for 2000/3000 series owners.
"NVIDIA has confirmed that DLSS 3.5 will make its debut in the fall, featuring in titles such as Cyberpunk 2077: Phantom Liberty, Portal with RTX, and Alan Wake 2. Additionally, it will be available in the NVIDIA Omniverse Platform, Chaos Vantage, and D5 Renderer. The “RR” tech will work across all RTX GPUs (unlike Frame Generation)."

Haters gonna hate, still rolling with ma 3080
 

Hudo

Member
but Nvidia's software and development is amazing.
That... depends. CUDA is the least-worst of all the GPGPU frameworks, yes, but it's still shit to use, imho. Their effort to port the C++ STL to a unified library that works seamlessly on the device and host side is cool, I admit. Some papers published by Nvidia's research team are really impressive, others not so much (to the point where I question the reviewers).

Nvidia's drivers are shit, unfortunately. It got better over the years, but they are still a pain in the ass, especially the Linux drivers. Their profiler for evaluating your CUDA kernels is also not as easy and straightforward to use as I'd like. And let's face it, they effectively hold the entire research community hostage with their closed-off CUDA ecosystem. You're vendor-locked, which is why they can charge what they charge. I still hope that Intel and AMD push either OpenCL or some alternative framework for real.

And don't get me started on Nvidia's "GeForce Experience" bullshit on Windows, where they seriously ask you to make an account. I had a good laugh when I first saw that.
 

violence

Member
I’m not sweating my 3000 series. It’s gonna take a while for this to even get support.


Edit: “The “RR” tech will work across all RTX GPUs (unlike Frame Generation).“ cool.
 

Ivan

Member
Too bad Nvidia doesn't have an x86 licence... I bet they'd be able to make something really good for next-gen consoles too.

Switching to ARM might be the only way for that to happen. I'm sick of AMD at this point.
 

mckmas8808

Mckmaster uses MasterCard to buy Slave drives
This seems really neat, but I know this will mostly get used to pump out games with horrible performance.

Of course, this may end 9th-gen consoles quicker than expected.

Mark Cerny will need to get with AMD and have a real talk about the PS6. Nvidia is just going nuts with the tech and with how they use tensor cores and AI.
 