AMD FidelityFX Super Resolution (FSR) launches June 22nd

ErRor88

Member
Thanks AMD for giving your old Nvidia GPU a few more years of life:

j7tSQAz.jpg
 

IntentionalPun

Ask me about my wife's perfect butthole
Looking at the 4K video on my phone… hope this looks better in motion, as it’s pretty blurry; but I’m not seeing artifacts.

Would like to see a comparison to native of the lower resolution it’s scaling from.

If it’s improving perf while looking any better and not producing artifacts I’ll take it even if it doesn’t beat DLSS in every way.

Since games have to code for it, I have to wonder how widespread it’ll be; good that there’s an open-source option out there, though.
 

IntentionalPun

Ask me about my wife's perfect butthole
Consoles will support that?
Sure; it’s just a piece of software that devs can use.

But devs are already using all kinds of upscaling techniques.

We’ll have to see how this compares. The IQ is notably blurrier than native in the video presentation but it is shitty YouTube. Of course both sides of the comparison are shitty YouTube lol
 

longdi

Banned
And, to drop into op-ed mode, this is where AMD has me a bit worried. In our pre-briefing with AMD, the company did confirm that FSR is going to be a purely spatial upscaling technology; it will operate on a frame-by-frame basis, without taking into account motion data (motion vectors) from the game itself.

For GPU junkies, many of you will recognize this as a similar strategy to how NVIDIA designed DLSS 1.0, which was all about spatial upscaling by using pre-trained, game-specific neural network models. DLSS 1.0 was ultimately a failure – it couldn’t consistently produce acceptable results and temporal artifacting was all too common. It wasn’t until NVIDIA introduced DLSS 2.0, a significantly expanded version of the technology that integrated motion vector data (essentially creating Temporal AA on steroids), that they finally got DLSS as we know it in working order.

Given NVIDIA’s experience with spatial-only upscaling, I’m concerned that AMD is going to repeat NVIDIA’s early mistakes. Spatial is a lot easier to do on the backend – and requires a lot less work from developers – but the lack of motion vector data presents some challenges. In particular, motion vectors are the traditional solution to countering temporal artifacting in TAA/DLSS, which is what ensures that there are no frame-by-frame oddities or other rendering errors from moving objects. Which is not to say that spatial-only upscaling can’t work, only that, if it’s competitive in image quality with DLSS, that would be a big first for AMD.
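To make the spatial-vs-temporal distinction in that quote concrete, here’s a toy sketch in Python/NumPy. To be clear, the function names and the nearest-neighbour/blend logic are my own illustration, not AMD’s or NVIDIA’s actual algorithms: the point is only that a spatial upscaler sees nothing but the current frame, while a temporal one reprojects accumulated history along motion vectors before blending.

```python
import numpy as np

def spatial_upscale(frame, scale):
    """Spatial-only upscaling (FSR-style in spirit): each output frame
    depends ONLY on the current input frame. Nearest-neighbour here is a
    stand-in for whatever edge-adaptive filter FSR actually uses."""
    h, w = frame.shape
    ys = np.arange(h * scale) // scale  # map each output row to a source row
    xs = np.arange(w * scale) // scale  # map each output col to a source col
    return frame[np.ix_(ys, xs)]

def temporal_accumulate(curr, prev_accum, motion, alpha=0.1):
    """Temporal accumulation (TAA / DLSS 2.0-style in spirit): reproject the
    previously accumulated frame along per-pixel motion vectors, then blend
    it with the current frame so detail builds up over time."""
    h, w = curr.shape
    yy, xx = np.mgrid[0:h, 0:w]
    # motion[..., 0] = dy, motion[..., 1] = dx (integer vectors for brevity):
    # where each pixel was located in the previous frame
    src_y = np.clip(yy - motion[..., 0], 0, h - 1)
    src_x = np.clip(xx - motion[..., 1], 0, w - 1)
    reprojected = prev_accum[src_y, src_x]
    return alpha * curr + (1 - alpha) * reprojected
```

Note that `temporal_accumulate` simply cannot work without the `motion` argument: if you blend history without reprojecting it, moving objects smear into ghosts, which is exactly the temporal artifacting the quote says motion vectors exist to counter, and exactly the data a spatial-only technique like FSR never receives.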

 

octiny

Banned
Saw this part in 4K. FSR looks noticeably blurrier on that mode at least.

Noticed that as well, even on my phone. Performance improvements look great, but if the image quality is trash it defeats the purpose. Extremely noticeable on the pink/green leaves. Reminds me of DLSS 1.0.

Won't be surprised if DF makes a video once it's released named "Deep Dive into AMD's FidelityFX DLSS 1.0"
 

MonarchJT

Banned
Without tensor cores it was never going to be on the same level as DLSS 2. Having said that, the real questions to ask are: is it better than checkerboard rendering? Is it better than the dynamic upscaling techniques we have today?
 

Whitecrow

Banned
fsrcxkuo.png


This is “quality” mode, the native is 1440p.

Wonder what the base resolution is for that?

edit: keep in mind this isn’t the “ultra” quality setting, I’d be most curious about that one.
Yeah, I'm not buying it. In the sea pic, the native part is just a bunch of rocks in the shadows where you can see nothing.
It won't hold a candle to DLSS.
Just like mesh shaders won't hold a candle to the Geometry Engine lmao, had to say it
 

ThisIsMyDog

Member
Nowhere near the X1X.

The X1X can brute-force RDR2 at native 4K and 30 fps.

We tried the same X1X settings at 4K and it only rendered 18-20 fps. That's nearly 50% slower than the One X.

Desktop GPUs can't properly match console GPUs; a console GPU will always work more efficiently.
RDR2 runs like shit on 1060, basically worst example you can get.
 

yamaci17

Member
RDR2 runs like shit on 1060, basically worst example you can get.
But it is also the most brutal example I can get.

It's the biggest GOTY-contender game the PC received in this entire generation.

I can give you more examples if you like... but then it would be Horizon Zero Dawn, AC Valhalla and such, and you would say they're old and it's normal, and this discussion would go on forever...

There's no real discrepancy between consoles and desktop GPUs. To be clear, I'm not saying the X1X is magically faster. The X1X has the advantage of years and years of development; developers have mastered it and manage to extract more efficient performance out of it.

That's why the Series X being equal to a 6700 XT these days is irrelevant, because the SX will be rivaling the 6800 XT by the time next-gen GPUs arrive (the same applies to PS5 as well; they're already doing a mighty fine job in RE Village against their desktop counterparts).
 