
NVIDIA announces DLSS 3.5 with Ray Reconstruction, launches this fall.

Zathalus

Member
No, it's SCIENCE!
futurama-hail-science.gif
 

SlimySnake

Flashless at the Golden Globes
I can't wait to play the 2 to 6 games with this in the next 5 years :messenger_unamused:

Even the amount of games with RT is so disappointing :messenger_unamused:
At the end of the video, he says "we revolutionized gaming 5 years ago" and wonders what the next 5 years will look like. Yeah, I bet it will still be Cyberpunk and nothing else.

Ray tracing has simply not been adopted as well as Nvidia had hoped. Even games like Star Wars and RE4 that ship with RT nowadays do the bare minimum. Only one of the PlayStation exclusive devs seems to be using RT, and one at Microsoft.

I remember buying an RTX card in early 2019 for Metro and am honestly shocked at how few games have utilized RT since then. Control came out in 2019, but after that? Cyberpunk and what else? So many games just use RT shadows and call it a day, and I honestly can't tell the difference. Playing CoD Black Ops Cold War on PS5 right now and I don't know what these shadows are supposed to do.

I'd say DLSS has had a bigger impact than RTX, and I blame Nvidia for letting so many devs ship games without DLSS. RE4, Star Wars, and now Starfield. Simply inexcusable from arguably the most profitable video game company out there.
 

yamaci17

Member
SlimySnake said: "Ray tracing has simply not been adopted as well as Nvidia had hoped. [...]" (full post quoted above)
What do you expect? More than 80% of ray-tracing-capable Nvidia cards have 6-8 GB of VRAM. That is pathetic. No wonder devs simply ignore it.
Ignoring it is the best of both worlds:

it works nicely on console, and it stays within budget for the 6-8 GB card folks

Cyberpunk is an edge case where extreme amounts of hardcore optimization make ray tracing viable on 6-8 GB VRAM budgets. But just because it is doable doesn't mean it is feasible for every game.
 

hlm666

Member
SlimySnake said: "Yeah, I bet it will still be Cyberpunk and nothing else. [...] Ray tracing has simply not been adopted as well as Nvidia had hoped." (full post quoted above)
Alan Wake 2 is going to use it, so that's Phantom Liberty and Alan Wake 2 this year. Black Myth is still sponsored by Nvidia (the trailer the other day was running on Nvidia hardware using DLSS 3), so there's another one you can expect the full Nvidia dog and pony show with; hell, you can probably expect it to use the Nvidia UE5 branch that Desordre does.

As for your list of games using RT, there are actually a lot more than you seem to be aware of. A guy on Beyond3D has a good list you can use to get up to speed.


edit: may as well also drop in a link for a few of the upcoming games.

 
Last edited:

Honey Bunny

Member
Frame Generation is not arbitrarily blocked from the 3000 series. There have been mods to enable it on the 3000 series and it turned out terrible. The white paper makes it clear that the new OFA capabilities of Ada make it possible with very little image quality loss.

Do you think DLSS can run on the 1000 series as well?

There have been mods to enable it on the 3000 series and it turned out terrible.

Lol. I'm guessing Nvidia could do a little bit better than random modders if the economic incentive was there. Nice to know that random modders *can* get it running though.
 

Zathalus

Member
Lol. I'm guessing Nvidia could do a little bit better than random modders if the economic incentive was there. Nice to know that random modders *can* get it running though.
Well, 3.5 runs just fine on 2000 and 3000 series cards. So obviously Nvidia is not just stopping new features from working for no reason.

Can Nvidia get frame generation to work on previous gen cards? Maybe, but it seems it would lead to a drop in visual fidelity that would make enabling it rather pointless. Depends on how FSR3 stacks up I guess.
 

yamaci17

Member
Well, 3.5 runs just fine on 2000 and 3000 series cards. So obviously Nvidia is not just stopping new features from working for no reason.

Can Nvidia get frame generation to work on previous gen cards? Maybe, but it seems it would lead to a drop in visual fidelity that would make enabling it rather pointless. Depends on how FSR3 stacks up I guess.
It is delusional to think frame gen would function fine on 3000/2000 series cards.

Look at how wonky it is on the 8 GB 4060/4060 Ti. It will be UNUSABLE in the games where you actually REQUIRE it, in other words, games that stress the card a lot. And what happens in games that stress a GPU like a 4060 Ti? They tend to gobble up VRAM like there's no tomorrow. And surprise surprise, everything from the 3060 Ti to the 3080 10 GB lacks VRAM.

Even the 4070 gets hammered in Hogwarts Legacy at 1440p with ray tracing when frame gen is enabled; extra frame drops and stutters happen because the card runs out of VRAM.

Frame gen will be unusable on 8-12 GB cards going forward considering the steep VRAM consumption. Devs won't really care whether you have enough VRAM to run it or not; it is just some shiny NVIDIA tool they can quickly implement. If their game chomps up upwards of 10.5 GB of VRAM at 1440p with modest ray tracing settings, kiss goodbye to the viability and usability of frame generation on a 12 GB card.

And simply forget about it on 8 GB cards.

There's a reason NVIDIA invested a lot in Cyberpunk: it is a game that is super compliant on 6 GB cards. Heck, you can run RTGI at 1080p DLSS Quality and still be fine in terms of VRAM usage. Naturally, you will see quite nice bumps in that game with DLSS 3, because there's enough VRAM for the game to operate within an 8 GB budget.

But it is all smoke and mirrors. It would be, for example, unusable in Jedi Survivor with ray tracing.

And without ray tracing, all RTX 3000 cards are super fast and do not really need DLSS 3.

DLSS 3 could be meaningful in CPU-bound cases, but surprise surprise, ray tracing itself creates immense CPU-bound situations. Funny, right? Not that funny, but it is what it is.

Potentially it could work okay-ish on the 24 GB 3090 and maybe the 12 GB 3080/3080 Ti in NICHE situations, but that would open another can of worms.
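To put rough numbers on that budget argument, here is a minimal back-of-the-envelope sketch in Python. Every figure in it (the ~1 GB frame generation overhead, the OS reservation) is an assumed placeholder for illustration, not a measurement; only the 10.5 GB game-usage figure comes from the post above.

```python
# Back-of-the-envelope VRAM budget check for frame generation.
# All overheads are illustrative assumptions, not measurements.

def frame_gen_headroom(card_vram_gb: float,
                       game_usage_gb: float,
                       frame_gen_overhead_gb: float = 1.0,   # assumed
                       os_reserved_gb: float = 0.8) -> float:  # assumed
    """VRAM (GB) left once the game, the OS, and the assumed
    frame-generation overhead have taken their share."""
    return card_vram_gb - game_usage_gb - frame_gen_overhead_gb - os_reserved_gb

# The scenario from the post: ~10.5 GB used at 1440p with modest RT.
for vram in (8, 12, 16):
    headroom = frame_gen_headroom(card_vram_gb=vram, game_usage_gb=10.5)
    status = "OK" if headroom > 0 else "over budget -> stutters/swapping"
    print(f"{vram} GB card: {headroom:+.1f} GB headroom ({status})")
```

On those assumptions even the 12 GB card lands slightly over budget at the post's 10.5 GB figure, which is exactly the claim being made.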
 
Last edited:

supernova8

Banned
But what's the point of having gaming features exclusive to one vendor? GPUs should be a little more open as they aren't closed systems.
Sure, but the reality is that Nvidia's solution isn't open source and never will be, because they don't want anyone using Nvidia-developed features on competing GPUs.
 

Zathalus

Member
What are you guys' thoughts on this from Kepler? Same guy that leaked the PS5 Pro and other AMD GPU code names.


Intel for sure; it should run with the same performance boost as on Nvidia cards (assuming it can leverage the XMX cores). AMD would get a performance boost as well, but it won't be as large, as the matrix multiplication would still run on shaders. Consider how XeSS runs on AMD and Nvidia cards but takes a performance hit relative to FSR/DLSS, since on both the ML is running in shaders. XeSS also doesn't use its best ML model on AMD and Nvidia; that one is reserved for Intel cards, where it runs on the XMX cores.

Thus DLSS on AMD/Intel will have a performance and image quality hit compared to Nvidia.

Of course Nvidia has zero incentive to allow that, considering their 90% share of the discrete GPU market on PC.
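To make the shaders-vs-matrix-units point concrete, here is a toy cost model in Python. Both throughput figures and the per-frame workload are invented placeholders, purely to show why the same network eats a bigger slice of the frame without dedicated matrix hardware.

```python
# Toy cost model of an upscaler's ML pass: dedicated matrix units vs.
# a shader fallback. All throughput figures are invented placeholders.

ML_PASS_TERAOPS = 0.4            # assumed inference work per frame (tera-ops)
FRAME_BUDGET_MS = 1000.0 / 60.0  # 16.7 ms per frame at 60 fps

paths = {
    "dedicated matrix units (tensor/XMX)": 200.0,  # assumed effective TOPS
    "shader fallback (e.g. DP4a)": 50.0,           # assumed effective TOPS
}

for name, tops in paths.items():
    ms = ML_PASS_TERAOPS / tops * 1000.0  # seconds -> milliseconds
    share = ms / FRAME_BUDGET_MS * 100.0
    print(f"{name}: ~{ms:.1f} ms per frame ({share:.0f}% of a 60 fps budget)")
```

Same model, same output, but the shader path spends 4x the frame time on it; that is the XeSS-style performance hit described above.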
 
Last edited:
LOL, AMD pays developers to keep DLSS out of games, Nvidia just keeps paying their engineers and developers to make DLSS better.

I would be pissed if I were an AMD GPU owner, knowing that this is what the money they gave to AMD is being spent on.

Unrelated: Nvidia's versioning system for DLSS is getting pants-on-head retarded. So DLSS 2.0 tech works on RTX 20/30/40, DLSS 3.0 tech works only on RTX 40, but DLSS 3.5 tech works on RTX 20/30/40 again? What the fuck, Jensen?
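For what it's worth, the support matrix hiding behind those version numbers is simple enough to write down. This sketch just encodes what the article (and replies later in the thread) state: the DLSS version names a feature bundle, not a hardware cutoff, and only Frame Generation is limited to the 40 series.

```python
# DLSS feature support by GPU generation, per the article and thread:
# the version number names a feature bundle, not a hardware cutoff.

DLSS_SUPPORT = {
    "Super Resolution (DLSS 2.x)":   ("RTX 20", "RTX 30", "RTX 40"),
    "Reflex (bundled with DLSS 3)":  ("RTX 20", "RTX 30", "RTX 40"),
    "Frame Generation (DLSS 3)":     ("RTX 40",),  # the one exception
    "Ray Reconstruction (DLSS 3.5)": ("RTX 20", "RTX 30", "RTX 40"),
}

for feature, gpus in DLSS_SUPPORT.items():
    print(f"{feature}: {', '.join(gpus)}")
```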
 
Last edited:

LiquidMetal14

hide your water-based mammals
Great stuff I'm reading here from Nvidia.

I don't know anything about their future GPUs and pricing structure, but on the SW/AI end they are in a league of their own.
 

Hugare

Member
"So DLSS 2.0 tech works on RTX 20/30/40, DLSS 3.0 tech works only on RTX 40, but DLSS 3.5 tech works on RTX 20/30/40 again? What the fuck, Jensen?" (from the post quoted above)
Every DLSS version is compatible with every RTX card. Only one feature so far is locked out, and that's frame gen from DLSS 3. But you can still use Reflex from DLSS 3, and now RR from 3.5, in new games.

But I agree, it's confusing
 

hlm666

Member
What are you guys' thoughts on this from Kepler? Same guy that leaked the PS5 Pro and other AMD GPU code names.
It's a no-win situation: if they keep it exclusive, they're bad. If they make it work on everything that supports matrix math, then when it performs better on Nvidia's tensor hardware they'll be accused of making it run worse on competitors' hardware. They also trained the AI model with up to 5x more data than previous models, and no one is giving away their AI models.

But the biggest hurdle is that this is actually part of the DLSS pass: it doesn't work with DLSS off, and it doesn't even work with DLAA, which is Nvidia's own feature.
 

It appears that Nvidia has made another breakthrough in image reconstruction.
This is why PC gaming is awesome.
 
Yeah, imo the only cards worth buying are the 4080 or the 4090. Everything else from either Nvidia or AMD (or Intel) is too compromised in some way, whether that be VRAM amount, ray-tracing performance, or image reconstruction ability.
I hope you don't play on consoles then with their 6600XT performance.
 

LiquidMetal14

hide your water-based mammals
I hope you don't play on consoles then with their 6600XT performance.
Clearly it's all about RT and the performance the tensor cores give you with the Nvidia feature suite.

I used to champion Intel on the CPU side versus AMD, but that was just down to raw performance. Nvidia has performance and features that you cannot beat.

Until we see something different, Nvidia is the place to be for graphics if you are an enthusiast pushing for all the features; not necessarily just the high-end visuals, but overall performance.
 

Buggy Loop

Member
Zathalus said: "Can Nvidia get frame generation to work on previous gen cards? Maybe, but it seems it would lead to a drop in visual fidelity that would make enabling it rather pointless." (quoted above)

People can validate that for themselves if they think old RTX cards were left out arbitrarily. You can test it using either the older Video Codec SDK, which provides optical flow analysis functions for hardware as old as Maxwell, or the new Optical Flow SDK for Turing onward.

Prior to Ada, the way of loading data into the OFA was either from NVDEC or from very slow managed memory buffers, and even though D3D buffers were supported, the latency was nowhere near usable for interactive real-time rendering.

With Ada, the OFA fixed-function units can access the entire memory directly via address masking, as well as the L2 cache, and likely have had a substantial number of other hardware-specific changes to facilitate this. Ada's higher clock speed also helps.

Ampere is actually slower than Turing for motion vector extraction, and has fewer features, such as not going below a 4x4 grid, while FG uses 1x1 or 2x2.

At the same clock as Ampere, Ada is 2.5 to 4 times faster; leave Ada's clock at its normal, higher speed and the gap is even bigger.

So the end result is that, without even going into the artifacts, the latency would just kill any performance gain.
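A quick sketch of that last point. The OFA costs below are invented placeholders (the post itself only gives the 2.5-4x Ada-over-Ampere ratio and says the pre-Ada buffer paths were far slower), but they show how a slow optical-flow path erases the gain before artifacts even enter the picture.

```python
# Rough model: frame generation renders one real frame, spends some
# time on optical flow + interpolation, then presents two frames.
# OFA costs are invented placeholders for illustration only.

BASE_FPS = 60.0
render_ms = 1000.0 / BASE_FPS  # 16.7 ms to render one real frame

ofa_ms = {
    "Ada-style fast OFA": 1.5,         # assumed
    "pre-Ada slow buffer path": 12.0,  # assumed
}

for path, cost in ofa_ms.items():
    # One cycle renders a real frame, generates one, presents both.
    presented_fps = 2000.0 / (render_ms + cost)
    print(f"{path}: ~{presented_fps:.0f} presented fps vs {BASE_FPS:.0f} native")
```

With the slow path you barely clear native frame rate while still paying frame generation's held-back-frame input lag, so the feature buys nothing.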
 

Edder1

Member
So much fear-mongering going on in this thread, smh. It just seems like nobody read the press release, which clearly states that Ray Reconstruction is available for all RTX GPU series. It's really sad to see people spread misinformation simply because they don't like Nvidia or because they game on consoles.
 
Last edited:

LordOfChaos

Member


Actually looks amazing. Whatever else you can say about Nvidia's pricing and arrogance, pushing the boundary is routine practice for them.
 

Neilg

Member
Damn, I thought the slow reactivity of GI and reflections in ray tracing was going to go unsolved for a long time. That Cyberpunk example with the colored signage spilling into the alleyway was a huge leap in quality.
 

Protocol7

Member
This technology will be available to every rtx card, as per the article.

Nope. Works on 2000 and 3000 series, but the frame gen part doesn't. So confusing.

"NVIDIA has confirmed that DLSS 3.5 will make its debut in the fall, featuring in titles such as Cyberpunk 2077: Phantom Liberty, Portal with RTX, and Alan Wake 2. Additionally, it will be available in the NVIDIA Omniverse Platform, Chaos Vantage, and D5 Renderer. The “RR” tech will work across all RTX GPUs (unlike Frame Generation)."

Haters gonna hate, still rolling with ma 3080


I feel like people aren't reading the source. It specifically says ray reconstruction will work across all RTX GPUs, unlike frame generation.
This should be added to the OP, to avoid unnecessary misunderstandings and... people not reading the article properly...
 

kruis

Exposing the sinister cartel of retailers who allow companies to pay for advertising space.
Why do you consider it a Band Aid and a crutch for developers?

This tech can take a game that already performs perfectly fine (60 fps) and let you max out all the bells and whistles and take advantage of a 4K 120 Hz display.

It's like being mad a car has a turbo...
If ThE CaR MaNuFaCtUreRs BuIlT BeTtEr EnGiNeS..

Some people are strange.

We all know how this will end: games that run at 10 fps unless DLSS 3.5 is turned on, and then you hit 30 fps. That turbo button will become a necessity to make the car move at all.
 
We all know how this will end: games that run at 10 fps unless DLSS 3.5 is turned on, and then you hit 30 fps. That turbo button will become a necessity to make the car move at all.
Without saying lazy devs, what is your premise for making that assumption?
 

Dacvak

No one shall be brought before our LORD David Bowie without the true and secret knowledge of the Photoshop. For in that time, so shall He appear.
I just want to say that before I watched the DLSS 3.5 comparison videos, I had never once noticed the RT GI bounce latency for flickering lights before. And now I can’t unsee it in all of my games and it has kind of ruined RT for me a bit.
 

kiphalfton

Member
hlm666 said: "As for your list of games using RT, there are actually a lot more than you seem to be aware of. [...]" (full post quoted above)
Their point still stands, since the half a dozen or so games you mentioned are, what, like 10% of major releases this year? Pitiful.
 
We're already seeing it happen with PS5 games and FSR.
Consoles have a finite amount of resources with which to balance the dev's vision of their game.

There isn't limitless access to system resources, even in the PC space. Never has been, never will be.

There are three pillars that need to be juggled to produce a good-looking, good-playing game: FPS, resolution, and geometric density/complexity.

A dev choosing one over another is what they feel is best for their game to meet their vision.

Giving them access to more resources will allow them to continue to build upon the three pillars listed above.

Look at it this way: without these tools and advancements, the games you are referencing would run at 720p, at sub-30 fps, or with the geometric complexity of a PS3 game; sometimes all of the above.

It's not a get-out-of-jail-free card for the devs... It's a get-out-of-jail-free card for you...

Save money on GPUs, or budget for a console, and still have games that look and perform similarly.

The idea that devs just need to work harder is ridiculous. Enjoy your game as is or buy a 4090. Shit ain't free.
 

Xcell Miguel

Gold Member
They modded DLSS into RE4, Jedi Survivor and most likely Starfield. AMD ain't stopping shit.
I know there are mods, but sometimes you have to pay for something that should have been free if AMD were not involved.
Also, as AMD cards mostly suck at ray tracing, some games limit what RT is used for, like Halo Infinite only using it for sun shadows; the Forerunner structures would look great with RT reflections, but AMD cards suck at those.

AMD cards are the Series S of AMD-sponsored PC games 😅
 