
NVIDIA DLSS 3 “Frame Generation” Lock Reportedly Bypassed, RTX 2070 Gets Double The Frames In Cyberpunk 2077

Gaiff

Gold Member
For the record, there's nothing at the hardware level preventing Ampere and even Turing cards from running DLSS 3. An NVIDIA engineer stated that their Optical Flow Accelerators are quite a bit slower than Lovelace's, and that the end result was presumably not satisfying enough for DLSS 3. They should have simply enabled it anyway and let players decide for themselves.
 

kuncol02

Banned
Just chain 1,000 Voodoo 2 cards together.

I think I saw a picture of that card.
 

FingerBang

Member
Jesus Christ, read the goddamn article before jizzing all over the keyboard, people 🤦🏻‍♂️

This DOESN'T ENABLE FRAME GENERATION FOR NON-40XX CARDS! It just means it can be used alongside FSR and XeSS. Which is still a nothingburger, because DLSS is the best upscaler for Nvidia cards.

I get hating a company, but hate boners aren't healthy.
 

Utherellus

Member
Nothing new in that article, really.

Frame interpolation is a standalone toggle and can work without DLSS upscaling enabled.

It does not care what kind of post-processing you apply to the original frames.
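If it helps to picture it, here is a toy Python sketch of that idea (everything here is made up for illustration, it is not NVIDIA's actual API): the interpolation step only ever sees two finished frames, so it has no idea whether DLSS, FSR, XeSS or a native render produced them.

```python
# Toy illustration (not NVIDIA's API): frame generation works on finished
# frames, so it doesn't matter which upscaler produced them.

def interpolate(a, b):
    # Stand-in for optical-flow interpolation: just averages the two frames here.
    return (a + b) / 2

def present_sequence(real_frames, frame_generation=True):
    """Interleave a generated frame between each pair of real frames."""
    out = [real_frames[0]]
    for prev, curr in zip(real_frames, real_frames[1:]):
        if frame_generation:
            out.append(interpolate(prev, curr))  # the "fake" in-between frame
        out.append(curr)
    return out

# Frames here are just brightness values; in reality they'd come out of
# DLSS, FSR, XeSS or a native-resolution render. The toggle doesn't care.
upscaled = [0.0, 1.0, 2.0, 3.0]
print(present_sequence(upscaled))          # [0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0]
print(present_sequence(upscaled, False))   # [0.0, 1.0, 2.0, 3.0]
```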
 

Irobot82

Member
Jesus Christ, read the goddamn article before jizzing all over the keyboard, people 🤦🏻‍♂️

This DOESN'T ENABLE FRAME GENERATION FOR NON-40XX CARDS! It just means it can be used alongside FSR and XeSS. Which is still a nothingburger, because DLSS is the best upscaler for Nvidia cards.

I get hating a company, but hate boners aren't healthy.
Let me just glance 15 degrees to the right of this monitor that I am using right now. Let's see... a glowing RGB sign that says, uh... EVGA GeForce GTX 1080 Classified. Damn, I hate Nvidia! (your brand loyalties are showing)
 

FingerBang

Member
Let me just glance 15 degrees to the right of this monitor that I am using right now. Let's see... a glowing RGB sign that says, uh... EVGA GeForce GTX 1080 Classified. Damn, I hate Nvidia! (your brand loyalties are showing)
How is correcting bullshit brand loyalty? I've been very critical of Nvidia in other threads.

I found an update to this thread making people believe that Nvidia DLSS 3 now runs on non-RTX 40xx cards. That's false, and Nvidia being greedy and putting shitty connectors on their new cards doesn't suddenly make all news true.
 

Irobot82

Member
How is correcting bullshit brand loyalty? I've been very critical of Nvidia in other threads.

I found an update to this thread making people believe that Nvidia DLSS 3 now runs on non-RTX 40xx cards. That's false, and Nvidia being greedy and putting shitty connectors on their new cards doesn't suddenly make all news true.
It actually does run on non-40xx cards. But I didn't make that claim. DLSS 3's bullshit fake frames run on DLSS's competitors.

DLSS 3.0 and FSR 3.0 will not be good for the industry. Increasing latency to make a game appear to have more frames isn't the way forward.
 

01011001

Banned
It actually does run on non-40xx cards. But I didn't make that claim. DLSS 3's bullshit fake frames run on DLSS's competitors.

DLSS 3.0 and FSR 3.0 will not be good for the industry. Increasing latency to make a game appear to have more frames isn't the way forward.


the latency is slightly worse than running "native" frames... not really a big issue
 

FingerBang

Member
It actually does run on non-40xx cards. But I didn't make that claim. DLSS 3's bullshit fake frames run on DLSS's competitors.

DLSS 3.0 and FSR 3.0 will not be good for the industry. Increasing latency to make a game appear to have more frames isn't the way forward.

I agree, I'm not a fan of the technology either. I think it can be good for single player games to do 60 -> 120, but that's it. Bad for competitive games and awful below 60.
 

01011001

Banned
I agree, I'm not a fan of the technology either. I think it can be good for single player games to do 60 -> 120, but that's it. Bad for competitive games and awful below 60.

but that's not the use case.
if you have the choice of running a game at 80fps with the latency of 80fps, or running it at 144fps with almost the same latency as 80fps, what would you choose?

you lose basically nothing and gain motion clarity
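to put rough numbers on that (purely illustrative arithmetic; the exact buffering overhead and Reflex savings vary per game and aren't measured here):

```python
# Illustrative only: frame generation doubles the frames shown, but input
# is still sampled at the base render rate, so latency stays roughly at the
# 80fps level instead of improving to the 144/160fps level.

def frame_time_ms(fps):
    return 1000.0 / fps

base_fps = 80                                  # what the GPU actually renders
shown_fps = base_fps * 2                       # ~160 fps presented with frame generation

print(f"frame time at {base_fps} fps : {frame_time_ms(base_fps):.1f} ms")   # 12.5 ms
print(f"frame time at {shown_fps} fps: {frame_time_ms(shown_fps):.2f} ms")  # 6.25 ms
# Input-to-photon latency with frame generation is assumed to sit near the
# 12.5 ms-per-frame cadence (plus some buffering, minus whatever Reflex saves),
# not near the 6.25 ms cadence the on-screen frame rate suggests.
```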
 

FingerBang

Member
but that's not the use case.
if you have the choice of running a game at 80fps with the latency of 80fps, or running it at 144fps with almost the same latency as 80fps, what would you choose?

you lose basically nothing and gain motion clarity
Yes, again, for high fps. It can do 80 to 160 or 60 to 120 and it's going to be fine.

But this kind of technology would be amazing for taking 30fps to 60fps; unfortunately it doesn't perform that well there, with visible artifacts, and it still gives you crappy input lag.

I might be wrong, but I just think it'd be most useful on weaker cards (think 4060 and below), and on those the experience doesn't sound like it'd be that good.

I'll test it on Cyberpunk when that patch is finally out.
 

LiquidMetal14

hide your water-based mammals
For me, Plague Tale Requiem is awesome for what it's doing. This is new territory, and I'm not going to be the one to crap on them when DLSS is a prime example of how good this tech can get with R&D behind it.

I'd rather see things go in this direction than not, on any vendor.
 

01011001

Banned
Yes, again, for high fps. It can do 80 to 160 or 60 to 120 and it's going to be fine.

But this kind of technology would be amazing for taking 30fps to 60fps; unfortunately it doesn't perform that well there, with visible artifacts, and it still gives you crappy input lag.

I might be wrong, but I just think it'd be most useful on weaker cards (think 4060 and below), and on those the experience doesn't sound like it'd be that good.

I'll test it on Cyberpunk when that patch is finally out.

the use case on PC is mostly to get to your monitor's refresh rate without running natively at that refresh.

so if you have a 165Hz monitor but newer games at high settings only run at maybe 80fps, then you just turn on frame generation and reach your max refresh, or get close to it.

I don't think this was ever meant to be used at 60Hz, since most PC screens these days are at least 120Hz or 144Hz.

look at where PC screens are heading. with the new DisplayPort version we may soon see 900Hz monitors.
combine that with DLSS frame generation and you could actually get close to 900fps on high-end cards.

but even with screens that already exist, this could be a big improvement in motion/image clarity.
with 1440p 240Hz monitors on the market already, you could see it being used to get 120fps games closer to that 240Hz refresh without sacrificing graphics settings.

and if you run at 120 native frames + frame generation + Nvidia Reflex, your input latency will still be super low.
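as a quick sanity check on those numbers (purely illustrative arithmetic, assuming frame generation roughly doubles the rendered rate and the output is capped at the monitor's refresh):

```python
# Illustrative only: assumes 2x frame generation and a cap at the monitor's
# refresh rate (vsync/VRR). Real results vary per game.

def presented_fps(rendered_fps, refresh_hz):
    return min(rendered_fps * 2, refresh_hz)

for refresh_hz in (165, 240):
    for rendered_fps in (60, 80, 120):
        print(f"{rendered_fps} fps rendered on a {refresh_hz}Hz screen "
              f"-> ~{presented_fps(rendered_fps, refresh_hz)} fps shown")
```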
 

FingerBang

Member
the use case on PC is mostly to get to your monitor's refresh rate without running natively at that refresh.

so if you have a 165Hz monitor but newer games at high settings only run at maybe 80fps, then you just turn on frame generation and reach your max refresh, or get close to it.

I don't think this was ever meant to be used at 60Hz, since most PC screens these days are at least 120Hz or 144Hz.

look at where PC screens are heading. with the new DisplayPort version we may soon see 900Hz monitors.
combine that with DLSS frame generation and you could actually get close to 900fps on high-end cards.

but even with screens that already exist, this could be a big improvement in motion/image clarity.
with 1440p 240Hz monitors on the market already, you could see it being used to get 120fps games closer to that 240Hz refresh without sacrificing graphics settings.

and if you run at 120 native frames + frame generation + Nvidia Reflex, your input latency will still be super low.
Maybe it's because I'm not used to super high frame rates, but in general, once we're above 90/100Hz, I barely notice a difference up to 144/165. I do get that pushing a screen to its max rate might have its benefits, but at least at the moment this technology seems to be effective only where it's less noticeable. Happy to be proven wrong!
 

baphomet

Member
Unlike when the article was written, games with DLSS 3 are out now, and no, frame generation doesn't work on non-4000 series cards.
 

01011001

Banned
Maybe it's because I'm not used to super high frame rates, but in general, once we're above 90/100Hz, I barely notice a difference up to 144/165. I do get that pushing a screen to its max rate might have its benefits, but at least at the moment this technology seems to be effective only where it's less noticeable. Happy to be proven wrong!

Diminishing returns definitely set in above 100fps, but higher refresh also = cleaner pixel response.
High frame rates on modern screens are not only good for motion fluidity, but mainly for clarity.

On my PC screen, running at 144fps/Hz makes motion look significantly easier on the eye than at 90fps/Hz, even though it doesn't look significantly smoother.

Although if you play at 144fps and instantly switch down to 90fps, you absolutely notice the increased stutter in motion at first.
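To put rough numbers on the clarity point: on a sample-and-hold display each frame persists for the whole frame time, so perceived smear scales roughly with it (a simplified model that ignores pixel response time and strobing):

```python
# Simplified sample-and-hold model: each frame persists for the full frame
# time, so higher frame rates mean shorter persistence and less motion smear.
# Ignores pixel response time and strobing. Illustration only.

def persistence_ms(fps):
    return 1000.0 / fps

for fps in (60, 90, 120, 144, 240):
    print(f"{fps:3d} fps -> each frame on screen for ~{persistence_ms(fps):.1f} ms")
# 90 fps  -> ~11.1 ms per frame
# 144 fps -> ~6.9 ms per frame, i.e. noticeably less smear during fast motion
```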
 
I'm pretty sure I read a comment from one of the developers of DLSS 3 that the technology isn't locked to the RTX 4000 series GPUs, but that those cards have specific frame-generation hardware built in that makes DLSS 3 more efficient on them. My guess is that NVIDIA always intended DLSS 3 to come to the RTX 2000 and 3000 cards as well, but obviously they need it as a selling point for the new cards, so for now it is exclusive to the RTX 4080 and 4090.

I am quite intrigued by this as someone with an RTX 3080 in an older PC that can typically play most games at 60fps on near-maxed-out settings at 2560x1440. Could DLSS 3 be used for more demanding games, say, to take 30fps games to 60fps as well as 60fps games to 120fps? I guess the graphical artifacts are going to be more noticeable at lower framerates than at higher ones?
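For what it's worth, some back-of-the-envelope arithmetic suggests why the 30fps case is harder (this assumes interpolation error grows with how far things move between the two real frames, which is an assumption here, not a measurement):

```python
# Two assumed reasons low base frame rates are harder for interpolation:
# 1) each generated frame is displayed longer, so mistakes are easier to see;
# 2) objects move further between the two real frames, so the interpolator
#    has to guess over a bigger gap. Illustration only, not measured data.

def generated_frame_display_ms(base_fps):
    # with 2x frame generation, each frame (real or generated) is shown for
    # half of the base frame time
    return (1000.0 / base_fps) / 2

def motion_gap_px(base_fps, speed_px_per_s=2000):
    # how far an object moving at speed_px_per_s travels between real frames
    return speed_px_per_s / base_fps

for base in (30, 60, 120):
    print(f"{base:3d} fps base: generated frame visible ~{generated_frame_display_ms(base):.1f} ms, "
          f"object moved ~{motion_gap_px(base):.0f} px between real frames")
```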
 

b0uncyfr0

Member
Playing on a TV a few metres away, you'll never see these smaller artifacting/motion issues with frame generation.

I see it as a big win for PC gamers with TVs. It's finally a good time to aim for 120Hz and beyond.
 

01011001

Banned
so was this ever confirmed, btw? it seems like it has been quite a while and no one has managed to showcase this actually running on 2000 and 3000 series cards, so I guess it was a whole load of bullshit, right?
 

Haggard

Banned
Playing on a TV a few metres away, you'll never see these smaller artifacting/motion issues with frame generation.

I see it as a big win for PC gamers with TVs. It's finally a good time to aim for 120Hz and beyond.
The current version of DLSS 3 has issues you WILL notice, even from a distance. Flickering, scrambled HUDs, for example.
 

b0uncyfr0

Member
The current version of DLSS 3 has issues you WILL notice, even from a distance. Flickering, scrambled HUDs, for example.
Well shit, I guess it needs more time in the oven :messenger_expressionless: I was under the impression that from a distance everything was pretty good.

Any videos out there about these 'issues'...?
 

Haggard

Banned
Well shit, I guess it needs more time in the oven :messenger_expressionless: I was under the impression that from a distance everything was pretty good.

Any videos out there about these 'issues'...?
There are several DLSS 3 tests from the usual suspects. Just search YouTube.
 

01011001

Banned
The current version of DLSS 3 has issues you WILL notice, even from a distance. Flickering, scrambled HUDs, for example.

scrambled HUDs are an issue with developers not implementing it correctly though...
and as for the rest, somehow all of that is totally fine for everyone when it comes to other graphics effects.

I am a stern enemy of Screen Space Reflections, because they are LITERALLY A GLORIFIED FUCKING GRAPHICS GLITCH, but somehow everyone is totally OK with SSR in games... to the point where people shit on raytraced reflections for "not improving the image a lot".

SSR artifacts are visible in literally every single frame as soon as they're on screen, right in your face... but that's OK... same with Screen Space AO and Screen Space Shadows, somehow it's totally fine for everyone that these exist.

or TAA with its often soft af image quality and obvious trails and noise in motion... totally OK...

"BUT OH NO! Spider-Man's right foot is slightly warped in 1 frame during fast movement! DLSS 3 suuuucks!"

it's weird to me that DLSS 3 gets singled out the moment anyone tries to say it's not that good, while a multitude of other elements that make up the final image of a modern game often have not only more, but also more obvious issues, and not just for 1 frame but in every frame displayed
 

Mister Wolf

Member
so was this ever confirmed, btw? it seems like it has been quite a while and no one has managed to showcase this actually running on 2000 and 3000 series cards, so I guess it was a whole load of bullshit, right?

If it could be done without running like absolute garbage, we would have seen it plastered all over the web.
 
We're a few months out from this claim, and weirdly, I haven't seen anyone able to replicate it. I would've expected someone to have written an "unlock" program or whatever by now for pre-4000 series cards.

I'm happy enough with my 3080, but having this feature of DLSS locked behind what could be a toggle as a result of a marketing decision to push the 4000 series is frustrating. Not frustrating enough to drop $1000+ on a new 4000 card, mind you.
 

raduque

Member
The problem is nobody competes with them past a certain point. If I want blistering 4K performance and have the money to pay for it, I really only have the one option.
I think the biggest issue is that people are buying into the 4K hype. It's exceptionally expensive, and exceptionally useless. I would take 1080p at 144fps with max settings over 4K at 60fps on low any day.
 