
AMD FSR 3 FidelityFX Super Resolution Technology Detailed

winjer

Gold Member

In an early look provided to GDC attendees, AMD revealed that FSR 3 FidelityFX Super Resolution will leverage:

  • Motion Vectors and AMD Fluid Motion to produce interpolated frames
  • Good motion estimation is key for interpolation
  • Additional internal information from FSR 2 can be leveraged

During the last part of our temporal upscaling session, we also revealed some early information about FSR 3 and talked about the benefits and challenges in development.

With FSR 2 we’re already computing more pixels than we have samples in the current frame, and we realized we could generate even more by introducing interpolated frames. This has allowed us to achieve up to a 2x framerate boost in the process.

Frame interpolation is more complex, as there are several challenges:

  • We can’t rely on color clamping to correct the color of outdated samples.
  • Non-linear motion interpolation is hard with 2D screen space motion vectors, which is why we recommend at least 60fps input.
  • If the final frames are interpolated, then the UI and all post-processing will also need to be interpolated.

However, there is good news!

  • There’s a high probability there will be at least one sample for every interpolated pixel.
  • There’s no feedback loop as the interpolated frame will only be shown once – any interpolation artifact would only remain for one frame.
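For the curious, the idea in those bullets can be sketched in a few lines. This is a toy illustration of motion-vector frame interpolation, not AMD's actual Fluid Motion algorithm; the function name and the dict-based "frame" are made up for the example:

```python
# Toy sketch of motion-vector frame interpolation (NOT AMD's actual
# FSR 3 / Fluid Motion algorithm): advect each pixel of the previous
# frame forward by half its motion vector to synthesize a midpoint frame.

def interpolate_midpoint(prev_frame, motion_vectors, width, height):
    """prev_frame: dict {(x, y): color}; motion_vectors: dict {(x, y): (dx, dy)}
    giving per-pixel motion from the previous frame to the current one."""
    mid = {}
    for (x, y), color in prev_frame.items():
        dx, dy = motion_vectors[(x, y)]
        # Linear assumption: at t = 0.5 the sample has moved half way.
        tx, ty = round(x + dx / 2), round(y + dy / 2)
        if 0 <= tx < width and 0 <= ty < height:
            mid[(tx, ty)] = color
    return mid

# Presenting [frame0, mid, frame1, mid, frame2, ...] is what doubles
# the displayed framerate: one generated frame per rendered frame.
```

The linear half-step is also where the slide's caveat comes from: 2D screen-space motion vectors can only describe straight-line motion between two frames, so anything curving (or occluded) between samples has no correct midpoint to advect to.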

 

kingyala

Banned
Gonna be interesting to see how well this works, after Nvidia said it was too hard to get working without their updated Ada optical flow accelerator.
Nvidia are good at marketing their hardware for gimmick technologies. Motion interpolation isn't new at all; it was experimented with even back in the Xbox 360 days. What they're doing now is just reconstructing it to clean it up and reduce latency... Remember, Nvidia has a history of fraudulent advertising: they even claimed they invented the GPU! They also claimed they invented realtime raytracing and that it was only possible on RTX hardware, but later people found out you could use raytracing on older GPUs. Even PS4 and Xbox One had raytracing in Crysis Remastered, and a couple of PS4 games like Dreams used raytracing techniques.
 

Skifi28

Member
"Darn, nVidia just showed DLSS3. Quick, announce something and we'll work out the details later"

This is how FSR3 always felt. I don't have any expectations whatsoever from it. Maybe once it matures in a couple of years, right now they're just rushing it out the door to compete like they did with FSR1.
 
Last edited:

sendit

Member
nvidia are good at marketing their hardware for gimmick technologies, motion interpolation isnt new at all it was experimented even on xbox 360 days...
DLSS3’s frame generation option works well below a native input of 60 FPS.
 

hlm666

Member
nvidia are good at marketing their hardware for gimmick technologies, motion interpolation isnt new at all it was experimented even on xbox 360 days...
It's not all totally lies as you put it. DLSS has a lower time cost than FSR on Nvidia hardware thanks to the tensor cores, so you get the slightly better image upscale done slightly quicker. That's why with DLSS you normally get a few more fps than with FSR. Not a huge deal, but technically the hardware solution is doing a better, faster job than the software-only solution.

We'll find out soon enough whether the optical flow accelerator is actually helping much with frame generation, although they haven't shown anything yet like they did with Godfall for FSR when that was close, so this might still be a few months away.

edit: on a related note, I just noticed the slides specify a 2x upscale (so FSR Quality). I wonder if using Performance, which would be a 4x upscale, doesn't produce good results, or makes the time cost eat into the performance uplift.
 
Last edited:

Spyxos

Gold Member
This time there's no mention of whether it runs on Nvidia cards. After FSR 2.0 I had hoped that it would work on them too.
 

Gaiff

Member
Can consoles get this?
I'm assuming, yes.

FSR 3 is not a reaction or something quick to DLSS 3, it's absolutely something we've been working on for a while. Why is it taking a little longer to come out than you probably expected? The most important thing to remember is that the philosophy of FSR so far is that it works not only with RDNA 2 or RDNA 1, but also with other generations of AMD graphics cards, and on competitor graphics cards too. That is exponentially more difficult than if we only had to make it work on RDNA 3. We really want it to work on more than just RDNA 3.

That's from Frank Azor, the chief architect of gaming solutions & marketing at AMD.
 

YCoCg

Member
The 60fps minimum input kinda limits things.
Because 30fps to 60fps doesn't provide enough "information" and comes out looking weird and jittery. If you want to see how bad it is, look up some examples on PC: Sackboy: A Big Adventure is currently broken on PC, where it always enables Nvidia's DLSS3, and that game is capped to 60fps in places. Due to the bug the game renders at 30fps and "upscales" to 60fps, and it just looks a mess.
though arguably they should give the option to enable it for 30 -> 60
See above.
 
Because 30fps to 60fps doesn't provide enough "information" and comes out looking weird and jittery...

See above.

Yeah, I get why it has to be that way, but it just seems like something that could end up being really finicky to use.
 
Last edited:

kingyala

Banned
DLSS3’s frame generation option works well below a native input of 60 FPS.
I'm not arguing about whether it works well; any interpolation below 60 is problematic since you have fewer frames to sample from. The problem is Nvidia's old-school marketing, where they claim anything they've done is only special because of their hardware.
 

kingyala

Banned
It's not all totally lies as you put it, dlss has a lower time cost than fsr on nvidia hardware with the tensor cores...
DLSS might be quicker than FSR3, that's OK and granted, but the BS that Nvidia claims is always overhyped PR... to sell their $2000 laptop-sized GPUs. They need to find a reason for consumers to spend ridiculous amounts of money on their products, and propaganda is part of it. Claiming raytracing was invented by RTX and only possible because of tensor cores and such is just music to my ears. Basically Nvidia GPUs have an edge in performance over AMD, but please, their hardware isn't a necessity to implement certain features.
 

GymWolf

Member
Can someone explain the minimum 60 frames required thing to me?

Like, does it only work to reach framerates higher than 60, but not if your target is 60?
 
Can someone explain to me the minimum 60 frame required thing?

Latency & frame information, pretty much. You can run under 60 of course, but the latency would be more noticeable and there would likely be more artefacts due to fewer frames to sample from.
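The latency half of that answer can be put in rough numbers. This is a back-of-envelope model, not an official AMD or Nvidia figure: the interpolated frame has to be displayed before the newest rendered frame, so that frame is held back by roughly one source frame interval.

```python
# Back-of-envelope latency cost of frame interpolation (a toy model,
# not an official AMD/NVIDIA figure): the generated frame sits between
# two rendered frames, so the newer rendered frame must be buffered
# for about one source frame time before it can be shown.

def added_delay_ms(input_fps):
    frame_time_ms = 1000.0 / input_fps  # gap between rendered frames
    return frame_time_ms                 # minimum extra buffering

for fps in (30, 60, 120):
    print(f"{fps:>3} fps input -> ~{added_delay_ms(fps):.1f} ms extra delay")
```

At 30 fps input the penalty (~33 ms) is double that at 60 fps (~16.7 ms), which is one reason the 60 fps floor keeps being recommended.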
 

FingerBang

Member
Because 30fps to 60fps doesn't provide enough "information" and comes out looking weird and jittery...
I agree that 60 should be the baseline, but some people preferred interpolated 30-to-60 over clean 30 fps in some games when LTT made the comparison a while ago. I'm not sure it's a no-go for every game or for the technology in general, which can still improve the same way FSR2 and DLSS2/3 have done so far.
 

GymWolf

Member
Latency & frame information pretty much. You can run under 60 of course but the latency would be more noticeable and there would likely be more artefacts due to fewer frames to sample from

It's not required, it's recommended. Lower than 60 you will get very noticeable input latency.
Do you think this is something that AMD and Nvidia can fix with future patches, or is it an intrinsic limit of this tech?
 
Last edited:
I wonder why they've not been updated to 2.2, that one reduced ghosting a lot.
I didn’t find the ghosting too bad in Cyberpunk, mainly because it only reared its head in specific spots, like the lines in the cement at the crosswalks causing people to ghost, and other things moving across really specific patterns.

My main gripe was that it just couldn’t handle the high contrast and high speed of driving fast at night in the city, especially in the 30fps quality mode, where it would just completely mess up (though when you drive past puddles with a light reflection that lasts a single frame and then goes completely dark, it figures why it would have a hard time). I think on that front TAA actually worked a bit better despite the blur.

Witcher 3, on the other hand, no real gripes other than the performance, but that’s not the fault of FSR.
 
Last edited:

YCoCg

Member
Can someone explain to me the minimum 60 frame required thing?
It's like resolution: the more you put in, the better the output will be. DLSS3 was intended for high frame rates, mainly to hit that 4K120 line, so the higher the number in, the closer to native the "fake" frames will be.

E.g. 120 > 240 will look closer to native 240Hz than 60 > 120 does at 120Hz.
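One way to see why the higher input rate wins: linear interpolation between two samples of a curved (non-linear) motion misses the true path, and the miss shrinks rapidly as the gap between source frames shrinks. A toy model with made-up numbers, not anything from AMD's or Nvidia's materials:

```python
# Toy model (made-up numbers): for an object moving on a circle, the
# linear midpoint of two samples deviates from the true arc position,
# and the deviation shrinks as the time between source frames shrinks.
import math

def midpoint_error(radius, angular_speed, frame_gap_s):
    """Distance between the true halfway position and the linear midpoint."""
    a1 = angular_speed * frame_gap_s
    p0 = (radius, 0.0)
    p1 = (radius * math.cos(a1), radius * math.sin(a1))
    true_mid = (radius * math.cos(a1 / 2), radius * math.sin(a1 / 2))
    lin_mid = ((p0[0] + p1[0]) / 2, (p0[1] + p1[1]) / 2)
    return math.dist(true_mid, lin_mid)

# Same motion sampled at 30 fps vs 120 fps input:
err_30 = midpoint_error(radius=100, angular_speed=math.pi, frame_gap_s=1 / 30)
err_120 = midpoint_error(radius=100, angular_speed=math.pi, frame_gap_s=1 / 120)
```

The error is roughly quadratic in the frame gap, so quadrupling the input framerate cuts the interpolation miss by about 16x in this model, which lines up with "the higher the number in, the closer to native the fake frames look".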
 
Do you think that it is something that amd and nvidia can fix with future patch or it's an intrinsic limit of this tech?

One way of looking at it: if Samsung can do frame interpolation on their TVs with Game Motion Plus without it causing a fuss, then improvements to it on a GPU are all but guaranteed.
 
Last edited:

ToTTenTranz

Banned
The 60fps minimum input kinda limits things.
Can someone explain to me the minimum 60 frame required thing?

60 FPS minimum is recommended, but there's nothing in the tech blocking the frame generation from being used below 60FPS.

Non-linear motion interpolation is hard with 2D screen space motion vectors, which is why we recommend at least 60fps input.
 
Last edited:

b0uncyfr0

Member
Can't believe people are actually complaining about FSR. The nerve of you twits. Without it, Nvidia would add another $200 on top of their already uber-expensive but not-so-great selection, just because they can.

We should be thanking AMD for this; we need the damn competition. The more open options we have, the better off we are.
 
Last edited:
60 FPS minimum is recommended, but there's nothing in the tech blocking the frame generation from being used below 60FPS.
I know, but below that it will likely degrade image quality and add artifacts.

The question is:

If I have a game that is CPU-bottlenecked, where I can easily get a locked 60 in the first few hours, but later in the game, with more stuff going on, I get slight drops into the 40s and 50s for prolonged periods, am I going to be viewing it in 'shit vision' at that point vs FSR 2.2?
 
Last edited:

KungFucius

King Snowflake
nvidia are good at marketing their hardware for gimmick technologies, motion interpolation isnt new at all it was experimented even on xbox 360 days...
What the fuck are you smoking? Ray tracing has been around for ages; it's a blatantly obvious use of physics. Wikipedia says the first demo on a computer was from 1968. Nvidia developed the first commercial HW accelerator for RT. It could always be run on older GPUs; it just runs like shit. The only one committing 'fraud' here is you with this insanity.
 

MikeM

Member
Cant believe ppl are actually complaining about FSR. The nerve of you twits...
I use it all the time on my PC. Free frames and I don’t see the image quality hit. I play at 4K tho.
 

kingyala

Banned
What the fuck are you smoking? Ray Tracing was around for ages...
They said they invented the first realtime RT-capable GPU... but raytracing was already done on PS4 with games like Dreams and The Tomorrow Children, and even Crysis Remastered has raytracing on PS4 and Xbox One, which use GPUs from 2013... so it was all a PR lie.
 

kingyala

Banned
60 FPS minimum is recommended, but there's nothing in the tech blocking the frame generation from being used below 60FPS.
Recommended, not impossible... motion interpolation at 30 fps is alright in racing games. I've tried it with Horizon 5 on a TV and it looks alright, only noticeable when you spin the camera though.
 