
Is DLSS Really "Better Than Native"? - 24 Game Comparison, DLSS 2 vs FSR 2 vs Native

RoboFu

One of the green rats
nope. the common bait response. there's no sharpening involved, purely 4k lods+assets+textures at work there. you cannot simply recreate what is happening there with a sharpening filter, and it is disabled at 0% regardless

good luck bringing out those face textures/details with a sharpener. I will wait for you to try

vO3vhOi.png


QMvsPI1.png



as I said, in all my comparisons, sharpening is disabled. Because I've dealt with such responses before

i don't blame you though; for most "1440p" users, 4k dlss performance destroying+demolishing+annihilating the native 1440p image quality presentation with matched or better performance is a tough pill to swallow
Those are some bs pictures 😂. 1440p doesn't make textures blurry. 🤣
 

yamaci17

Member
That’s some bs pictures 😂. 1440 doesn’t make textures blurry. 🤣

oh so you legit don't believe actual evidence, actual proof? I can make the same comparison while capturing a live video of the whole process, but if I do so, you will get yourself permanently banned for accusing me of lying/misinformation. how does that sound? if I cannot produce the same result with a live recording (I will literally choose 1440p, take a shot, then go 4K dlss perf, take another shot and compare them, all the while recording the desktop the whole time), I will get myself permanently banned.

it is not 1440p making textures "blurry". it is 4k dlss performance having "better" texture quality instead. if you cannot understand that, I don't know what to tell you. you're looking at things the wrong way. regardless, if you legit disregard actual proof and evidence, there can be no healthy discussion.
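For context on what "4k dlss performance" actually renders internally: a minimal sketch of the arithmetic, using the commonly documented per-axis scale factors for the DLSS/FSR2 quality modes (the helper and the exact factors are illustrative and can vary per title):

```python
# Internal render resolution per upscaler quality mode. The per-axis scale factors
# are the commonly documented values; treat them as approximate.
SCALE = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5, "Ultra Performance": 1 / 3}

def internal_resolution(out_w, out_h, mode):
    """Resolution actually rendered before the upscaler reconstructs the output."""
    s = SCALE[mode]
    return round(out_w * s), round(out_h * s)

for mode in SCALE:
    w, h = internal_resolution(3840, 2160, mode)
    print(f"4K {mode:<17}: {w}x{h} ({w * h / 1e6:.2f} MP internal)")
print(f"Native 1440p         : 2560x1440 ({2560 * 1440 / 1e6:.2f} MP)")
```

So 4K Performance renders roughly 1920x1080 internally, fewer pixels than native 1440p; the argument above is that the 4K output target (and the LODs, texture mips, and assets selected for it) is what makes the reconstructed image look more detailed despite that.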
 

RoboFu

One of the green rats
oh so you legit don't believe actual evidence, actual proof? I can make the same comparison while capturing a live video of the whole process, but if I do so, you will get yourself permanently banned for accusing me of lying/misinformation. how does that sound? if I cannot produce the same result with a live recording (I will literally choose 1440p, take a shot, then go 4K dlss perf, take another shot and compare them, all the while recording the desktop the whole time), I will get myself permanently banned.

it is not 1440p making textures "blurry". it is 4k dlss performance having "better" texture quality instead. if you cannot understand that, I don't know what to tell you. you're looking at things the wrong way. regardless, if you legit disregard actual proof and evidence, there can be no healthy discussion.

if you want to, I can do that. if you want to risk your account and have the guts to follow up your "bs" claims about my so called "bs" pictures.
Calm down, I am on my phone and didn't notice it was a zoomed-in part of a larger pic. It just looked like someone threw a blur filter on the photo 😂. I was like, no way, that's not how textures work.
 

yamaci17

Member
Calm down, I am on my phone and didn't notice it was a zoomed-in part of a larger pic. It just looked like someone threw a blur filter on the photo 😂. I was like, no way, that's not how textures work.
of course, even without zoom, there's a noticeable difference. zoomed in it's worse. but I just had to zoom in because the "muh sharpen" argument gets thrown at me whenever I say 4k dlss performance looks better than 1440p. what I mean to say is that it looks organically better, not artificially better through some sharpening filter. eventually they will simply accept that the in-game TAA is bad and move on. that's usually what happens. but it somehow also keeps happening.

i actively dislike sharpening. i disable it whenever possible
 

Mr.Phoenix

Member
Kind of insane people are using an upscaler to improve image quality. Why don't they release another *acronymed feature* without the upscaler but with the extra image processing?
dlss/fsr2.... is NOT upscaling.

What they do inherently is literally why they can `improve` image quality. Depending on which apples and oranges you are comparing, that is.
 

T4keD0wN

Member
it's not; regardless of screen size, 4k dlss performance produces better image quality, because it uses 4k lods+assets+textures that "native" 1440p will never use. this is a different beast that people are unable to comprehend.

the game will legit load/utilize higher quality assets+textures+lods with 4k upscaling even at internal 1080p rendering.

2yclmn8.png
6uu79Xj.png



the detail is literally not there. you cannot bring in any new detail with sharpening, you can only pronounce what is already there.

sharpening lol... give me a break

HuHQVu6.png
KXQO5hL.png



I even had to "show" the overlay to prove to someone that the effect above and below was not affected by sharpening


(I can repeat the same test in rdr2 at 1440p too. will provide similar results. 4k dlss perf will wreck 1440p there too)

it stops having anything to do with screen size when the game will legit refuse to utilize the actual high quality assets+textures when you output at 1440p. nothing can change that, unless you use DSR/custom res to go to a 4K output and downsample back again
You're absolutely right that the 4k + dlss perf output will be better than the native 1440p one, but comparing just pictures is a bit different from comparing two monitors (it's clear who is going to win that; it strips the only advantage that a 1440p monitor can possibly, but not necessarily, have). It can't really be done digitally, as comparing screenshots completely ignores the pixel density factor, which can benefit either side.

Not saying you're wrong, just that it's not a full picture comparison.
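On the quoted claim above that a 4K output target pulls in higher-detail textures even at an internal 1080p: one concrete mechanism is the negative texture LOD (mip) bias that upscaler integrations are generally expected to apply, based on the ratio of render resolution to output resolution. A minimal sketch of that commonly cited formula (the helper name is mine, and the exact bias each game applies can vary):

```python
import math

def upscaler_mip_bias(render_w: int, display_w: int) -> float:
    """Commonly cited guidance for DLSS/FSR2 integrations: bias texture mip
    selection so sampling behaves as if rendering at the display resolution."""
    return math.log2(render_w / display_w)

# 4K output with Performance mode (1920 internal): bias of about -1.0,
# i.e. one mip level sharper than the internal resolution alone would select.
print(upscaler_mip_bias(1920, 3840))   # -1.0
# Native 1440p with no upscaler: no bias, mips are chosen for a 2560-wide target.
print(upscaler_mip_bias(2560, 2560))   # 0.0
```

Under that model the 4K DLSS Performance path samples textures as if it were rendering for a 3840-wide target, while native 1440p samples them for 2560, which lines up with the texture-detail difference shown in the screenshots.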
 

Buggy Loop

Member
Obviously ML on top of Temporal reconstruction will give better results than just a Temporal solution alone.
You have to remember that FSR will continue to improve its algorithms and the tech it is using. While it isn't as good, it's good enough. It will help the console keep performance up.

Soon^TM

87-DE00-B1-CAE6-4664-8145-CC6-EF2059885.jpg


People have no idea the leap AMD has to make to catch up. Algorithms without ML can only go so far, and they're nonexistent in the AI space.

This very HUB video uses the DLSS version that came with the game at launch. If you inject the better DLSS versions, you’ll probably be looking at >90% of games where it looks better than native.
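For anyone wondering what "injecting the better DLSS versions" means in practice: games ship the DLSS runtime as nvngx_dlss.dll among the game files, and the usual approach is simply to back up the game's copy and drop in a newer DLL. A rough sketch (the paths are placeholders, the DLL sometimes lives in a subfolder, and not every game or anti-cheat tolerates the swap):

```python
import shutil
from pathlib import Path

# Placeholder paths: point these at your own game folder and a newer DLL build.
game_dir = Path(r"C:\Games\SomeGame")            # hypothetical install location
new_dll = Path(r"C:\Downloads\nvngx_dlss.dll")   # newer DLSS runtime, e.g. a 3.1.x build

old_dll = game_dir / "nvngx_dlss.dll"
backup = game_dir / "nvngx_dlss.dll.bak"

if old_dll.exists() and not backup.exists():
    shutil.copy2(old_dll, backup)    # keep the shipped version in case of problems
shutil.copy2(new_dll, old_dll)       # overwrite with the newer runtime
print(f"Replaced {old_dll} (backup at {backup})")
```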
 
i don't know, i mean i don't go out of my way to compare it, but if a game has DLSS i turn it on even if my PC can handle native at above 144fps. if i can lower the load on my GPU and have it run cooler, then i'm doing it.

quality/balanced i can't tell apart from native. the only game where i notice an issue is Flight Simulator, where there is noticeable shimmering. performance mode isn't that great there, but it is the performance mode so... can't complain.
Even when using performance mode at 1080p in some games like Cyberpunk, with new versions of DLSS, it's almost as good as DLSS on Quality

DLSS is pretty nuts

Having DLSS is almost mandatory for me to buy a new game
a game not having DLSS wouldn't be a dealbreaker, but i'd still be disappointed. every game should have it!
 

Loxus

Member
Soon^TM

87-DE00-B1-CAE6-4664-8145-CC6-EF2059885.jpg


People have no idea the leap AMD has to make to catch up. Algorithms without ML can only go so far, and they're nonexistent in the AI space.

This very HUB video uses the DLSS version that came with the game at launch. If you inject the better DLSS versions, you’ll probably be looking at >90% of games where it looks better than native.
In my opinion, AMD is keeping pace with improving tech for custom products, especially the consoles.

Yes AMD compares FSR to DLSS, but it's an unfair comparison due to FSR not utilizing dedicated hardware.

AMD has something now called XDNA for AI.
PrCEdr2.jpg

FW26Tyx.jpg



AMD also offers something similar to Intel and Nvidia, called AI Matrix Accelerators.
qZpKN62.png

Qz8pbb6.jpg
K3Sltz9.jpg


This AI Matrix Accelerator sits within the CU in the same way as with Nvidia's SM and Intel's Xe core, so for FSR 3 not to utilize this hardware would be quite odd for AMD to do.
IBUohvp.png
6kZPg7Y.jpg

oIlvBuE.jpg
bhf16tH.jpg

SL9Yu4V.jpg


Even their CDNA architecture has AI acceleration.
bXFSmtc.jpg



To say AMD is nonexistent in the AI space isn't entirely accurate, but I'll admit they're lacking software that uses these hardware features at the moment.
 

Zathalus

Member
In my opinion, AMD is keeping pace with improving tech for custom products, especially the consoles.

Yes AMD compares FSR to DLSS, but it's an unfair comparison due to FSR not utilizing dedicated hardware.

AMD has something now called XDNA for AI.
PrCEdr2.jpg

FW26Tyx.jpg



AMD also offers something similar to Intel and Nvidia, called AI Matrix Accelerators.
qZpKN62.png

Qz8pbb6.jpg
K3Sltz9.jpg


This AI Matrix Accelerator sits within the CU in the same way as with Nvidia's SM and Intel's Xe core, so for FSR 3 not to utilize this hardware would be quite odd for AMD to do.
IBUohvp.png
6kZPg7Y.jpg

oIlvBuE.jpg
bhf16tH.jpg

SL9Yu4V.jpg


Even their CDNA architecture has AI acceleration.
bXFSmtc.jpg



To say AMD is nonexistent in the AI space isn't entirely accurate, but I'll admit they're lacking software that uses these hardware features at the moment.
The AI acceleration in RDNA3 and CDNA pales in comparison to Nvidia's tensor cores though. Like not even in the same ballpark.

XDNA is a big unknown at this time.
 
Honestly, buying anything other than an Nvidia card is a mistake.
Buying an Nvidia card considering the outrageous prices is a mistake

I am part of the problem. 4090 owner here.

DLSS is heavenly, and I am more excited about the Switch 2 than any other next-gen console just because the idea of DLSS in the next-gen Switch is gonna be amazing.

AMD needs to step up, and fast, if they want any piece of the GPU market. They do not even have a mid-range GPU yet when Nvidia has so far released 4 SKUs.. what is AMD smoking?

FSR3 must use hardware-level upscaling if they wanna compete. whatever crap they have now is not working.
Yup. Switch 2 is going to have better iq in some multiplatform games than PS5 or Series X. The meltdowns are going to be legendary 😂
 

Skifi28

Member
Yup. Switch 2 is going to have better iq in some multiplatform games than PS5 or Series X. The meltdowns are going to be legendary 😂
We know next to nothing about the Switch 2, but now we're sure it'll use DLSS and have better image quality than the competition? You sound way more invested than those having hypothetical meltdowns.
 
We know next to nothing about the Switch 2, but now we're sure it'll use DLSS and have better image quality than the competition? You sound way more invested than those having hypothetical meltdowns.
Next to nothing? The chip info has been online for almost a year due to the Nvidia info hack (the same one that gave us all the PS4/5 to PC port info). It's a ginormous leap over Switch and has hardware for RT and DLSS 2.0.
 

The Cockatrice

Gold Member
nope. the common bait response. there's no sharpening involved, purely 4k lods+assets+textures at work there. you cannot simply recreate what is happening there with a sharpening filter, and it is disabled at 0% regardless

good luck bringing out those face textures/details with a sharpener. I will wait for you to try

vO3vhOi.png


QMvsPI1.png



as I said, in all my comparisons, sharpening is disabled. Because I've dealt with such responses before

i don't blame you though; for most "1440p" users, 4k dlss performance destroying+demolishing+annihilating the native 1440p image quality presentation with matched or better performance is a tough pill to swallow

DLSS uses sharpening filters. I have nothing against DLSS, I always use it, but native will always be more accurate. DLSS uses enhancements.
 

winjer

Gold Member
Soon^TM

87-DE00-B1-CAE6-4664-8145-CC6-EF2059885.jpg


People have no idea the leap AMD has to make to catch up. Algorithms without ML can only go so far, and they're nonexistent in the AI space.

This very HUB video uses the DLSS version that came with the game at launch. If you inject the better DLSS versions, you’ll probably be looking at >90% of games where it looks better than native.

That is while using FSR 2.0.
FSR 2.1 made big strides in image quality. And 2.2 a bit more.
DLSS2 still has a good advantage, but it's not as big as those 2 screens make it seem.
 

yamaci17

Member
DLSS uses sharpening filters. I have nothing against DLSS, I always use it, but native will always be more accurate. DLSS uses enhancements.
not anymore with newer versions; they dropped their sharpening implementation in version 2.5.1, and The Last of Us uses one of the more recent versions, 3.1.2



 

Hoddi

Member
I like DLSS and prefer it to the crappier TAA methods that we’ve been saddled with recently. But I think the issue is much more that modern TAA often looks like trash and not that DLSS looks so great.

Many older games from 2015-2017 had TAA methods that were much cleaner than what we have now. I recently played a bit of Doom 2016 on my old plasma TV and it looked nothing like more recent games do at 1080p. It just looked very clean and sharp.

In contrast, newer games often look plain awful at 1080p and even 4k can’t save them in many cases. This sometimes makes DLSS feel like a bandaid for something that looks far worse than it should.
 

The Cockatrice

Gold Member
not anymore with newer versions; they dropped their sharpening implementation in version 2.5.1, and The Last of Us uses one of the more recent versions, 3.1.2





That's because the AI uses something else to clean the image. Native is native, DLSS is an enhancement. We can make the native image clearer as well.
 

Zathalus

Member
That is while using FSR 2.0
FSR 2.1 made big strides in image quality. And 2.2 a bit more.
DLSS2 still has a good advantage, but it's not as it seems in those 2 screens.
And DLSS2 has had like a dozen versions since 2.3.4 with improvements to ghosting, stability, and temporal artifacts. The Last of Us uses the latest version of both, and DLSS2 has areas that are still much better than FSR2.
 

winjer

Gold Member
And DLSS2 has had like a dozen versions since 2.3.4 with improvements to ghosting, stability, and temporal artifacts. The Last of Us uses the latest version of both, and DLSS2 has areas that are still much better than FSR2.

Much better? From this analysis, HU concluded that in The Last of Us, DLSS Q at 4K is "slightly better" than FSR2, and at 1440p it's "moderately better".
DLSS2 is the best upscaler, but let's not exaggerate things.
 
Oh, come on with this ridiculous "kept relevant".
Your 3090 will run circles around anything else in the console space for years to come, "awful ports" or not. Even without DLSS.

I have a 3080ti myself and I have yet to reach the point where I struggle with ANYTHING.
I was thinking the same thing lol. you kids saying shit like that is why these companies can keep scamming you.

3090 irrelevant... my god, man.

By the way, in my opinion I want a game that looks like the next Crysis, without any raytracing, to be the reason why my 3080 struggles. I don't want some tacked-on feature like rtx or physx or whatever to be the reason why I need to upgrade.

That allows devs to make a game with muddy textures that looks like a console ubisoft game be "taxing" because they tacked rtx onto it. That's NOT ambition, and NOT what my card should be used for.

We're getting scammed, guys, but whatever
 
"But but but DLSS is just TV Motion Interpolation"

Shut the F up to those dudes who know nothing
even tv interpolation is waaay better these days (depends on the tv/implementation though).
15 years ago, it was total garbage. weirdest looking crap.

i use it with youtube videos and stuff now (lg cx, "natural" setting).
 

poppabk

Cheeks Spread for Digital Only Future
These sites write “native” but it’s native + reflex in reality.

qRLYAFT.png


I guess it can depend on the game, but here. A whopping 10ms delay versus native + reflex

plague-latency-tests.png


AMD latency?

fortnite-latency-4070-ti-perf.png


"We experimented with Radeon Anti-Lag here as well, which did seem to reduces latency on the Radeons by about 10 - 20ms, but we couldn't get reliable / repeatable frame rates with Anti-Lag enabled."

"Normally, all other things being equal, higher framerates result in lower latency, but that is not the case here. The GeForce RTX 4070 Ti offers significantly better latency characteristics versus the Radeons, though it obviously trails the higher-end RTX 4080."

But really, for sure that 10ms makes it unplayable (a recurring comment in every frame gen discussion..). I guess all AMD flagship owners find everything unplayable, while consoles comparatively have more than double the latency compared to reflex technology.

If you play a competitive game and you're a pro gamer (lol sure), the cards that have frame gen don't need to run frame gen for those games, they run on potatoes. You don't need to care about 10 ms in cyberpunk 2077 overdrive path tracing, but goddamn I wish I had frame gen right about now for that game.

People just remember « oh no! More latency! » from the frame gen articles. Dude, you're eating good with reflex and the inherent lower latency on Nvidia cards.
Yeah I would call 5 frames of latency at 60fps pretty terrible on AMD's part. But this isn't about whether AMD sucks versus Nvidia, it's about whether frame generation is a good technique or not. And the fact remains that beyond that 10ms lag you are also talking about the latency relative to the original framerate. The question still remains: why are we pursuing high framerates, so it looks smooth or so it is smooth?
 

flying_sq

Member
I'd argue against this. I can't notice any of the artifacts anymore that were there in earlier versions of DLSS. I'm playing Warzone DMZ and MW2 multiplayer at a constant 175 fps with DLSS in quality mode, everything turned to the highest setting, and it looks crisp as fuck. No artifacts whatsoever. Even tried pixel peeping to see some, but no.
What are you playing on? I do 4k dlss on a 3090ti and a 5900x and I am around 90-120 completely maxed out
 

Buggy Loop

Member
Yeah I would call 5 frames of latency at 60fps pretty terrible on AMD's part. But this isn't about whether AMD sucks versus Nvidia, it's about whether frame generation is a good technique or not. And the fact remains that beyond that 10ms lag you are also talking about the latency relative to the original framerate. The question still remains: why are we pursuing high framerates, so it looks smooth or so it is smooth?

It's not frame times, it's system latency. Totally different.

Consoles have way higher latencies than anything running reflex-enabled games, whether native, with DLSS, or even with frame gen. It can look smooth and still have lower latency than anything native on competing hardware. So what's the excuse? Like i said, if it's for pro gaming, the 4000 series already doesn't need frame gen for those types of games, nor do those games even support frame gen anyway. As of now, it's a bunch of single player games.

Then i bet everybody is perfectly fine with huge mechanical keyboard switches whose travel distances add 2ms + 3ms from the keyboard alone, and then oh boy, you better not have a goddamn peripheral on bluetooth.

It's bullshit, 10ms of system latency is nothing.
 

01011001

Banned
Yeah I would call 5 frames of latency at 60fps pretty terrible on AMD's part. But this isn't about whether AMD sucks versus Nvidia, it's about whether frame generation is a good technique or not. And the fact remains that beyond that 10ms lag you are also talking about the latency relative to the original framerate. The question still remains: why are we pursuing high framerates, so it looks smooth or so it is smooth?

frame generation is in its infancy currently. in the future it will allow you to max out ultra high refresh rate monitors even in demanding games.

imagine if your card could run Cyberpunk RT Overdrive at 144fps, but you have a 240hz monitor.
if you enable frame generation you will get closer to that 240fps with a tiny latency penalty at these framerates.

your image looks smoother and the pixel response time will be better since your refresh rate runs higher.

and even now, what do you want? 60ms latency at native, or 70ms latency at almost twice the perceived smoothness?

for a single player game anything below 80ms is absolutely good enough.
hell, many console games have higher input latency even at 60fps.
God of War Ragnarök has ~85ms latency in its 60fps mode
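To put rough numbers on that trade-off, here is a back-of-the-envelope model (the doubling factor, the 240 Hz display cap, and the ~10ms overhead are illustrative assumptions on my part, not measurements, and the helper name is hypothetical):

```python
# Toy model of frame generation: presented frame rate roughly doubles (capped by
# the display), while latency stays tied to the rendered frame rate plus a modest
# overhead. All numbers here are illustrative assumptions, not measurements.
def frame_gen_estimate(rendered_fps, base_latency_ms, display_hz=240, overhead_ms=10):
    presented_fps = min(rendered_fps * 2, display_hz)
    return presented_fps, base_latency_ms + overhead_ms

# The 144 fps / 240 Hz example from the post above:
print(frame_gen_estimate(144, 40))  # (240, 50): near the display cap for a small penalty
# The "60ms native vs 70ms at ~2x smoothness" case:
print(frame_gen_estimate(60, 60))   # (120, 70)
```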
 

squarealex

Member
How original... people spending $2000 on a GPU claiming it's the best in many ways, with fake screens and false facts.
 

Kaleinc

Banned
When Nvidia was pushing DSR, the claim was that a 1080p image on a 1080p screen is aliased shit, but downscaled from 4k to 1080p it's everything a nerd could wish for. Well, it makes sense, a 4k image contains more data.
Now that they are pushing DLSS it's the opposite: shitty 1080p blown up to 4k is awesome and even better than native 4k. It just doesn't work like that.
Jensen: these gullible gamers will buy $2000 GPUs and use upscalers from 1080p to 4k and be happy about it because we will say it's an AI blessing.
Sell your ray tracing rendered in native resolution beatch
 

Buggy Loop

Member
How original... people spending $2000 on a GPU claiming it's the best in many ways, with fake screens and false facts.




When you can't even get the flagship's price right, and you ignore that the whole 4000 series range down to $599 does the tech.

What's even your point? False facts?
 

marjo

Member
I hate Nvidia as much as any red blooded PC gamer for their shitty business practices, but there's no denying that DLSS is pretty awesome.

Still, unless they change their philosophy in the future, my 3080 is likely to be my last Nvidia card (hoping Intel's next gen will be competitive).
 

Zathalus

Member
Much better? From this analysis, HU concluded that in The Last of Us, DLSS Q at 4K is "slightly better" than FSR2, and at 1440p it's "moderately better".
DLSS2 is the best upscaler, but let's not exaggerate things.
How does performance mode hold up again? The gap widens anytime FSR has fewer pixels to upscale from. The image you quoted was at 1080p, where FSR is quite useless. 4K Quality mode is the only res/quality setting FSR can compete at, and even then it never matches DLSS.
 

Zathalus

Member
How original... people spending $2000 on a GPU claiming it's the best in many ways, with fake screens and false facts.
The 4090 ($1699 mind you) is indeed the best, even without DLSS and frame generation applied. Those are just amazing bonuses on top of that.

That being said, you can get DLSS on a $219 card, while frame generation can be had on a $599 card. Only a small minority of PC gamers own a 4090.
 

PaintTinJr

Member
nope. the common bait response. there's no sharpening involved, purely 4k lods+assets+textures at work there. you cannot simply recreate what is happening there with a sharpening filter, and it is disabled at 0% regardless

good luck bringing out those face textures/details with a sharpener. I will wait for you to try

vO3vhOi.png


QMvsPI1.png



as I said, in all my comparisons, sharpening is disabled. Because I've dealt with such responses before

i don't blame you though; for most "1440p" users, 4k dlss performance destroying+demolishing+annihilating the native 1440p image quality presentation with matched or better performance is a tough pill to swallow
It is a shame the two pictures aren't identical apart from the white text in them, so we could look at the histogram of each as a direct comparison. I personally feel the detail is much better, like you say, but I still like the 1440p tones more, and can't help but feel the DLSS inference errors are partly hidden by it altering the tones slightly in a low-frequency, solid-fill type of way.

The thing is, an AI-enhanced image will always have inference errors of some type (typically unnoticeable, as seen here, but potentially more apparent in frame-to-frame transitions), whereas the native image will have under-sample artefacts at worst but will always be error free, irrespective of the scene. So depending on the frequency composition of a scene's changing frames, the DLSS errors could become unstable, making the feature unusable in edge cases. In a formal statement of which is best, I would still claim the native 1440p is superior, because it is completely indifferent to scene composition and renders error free at all times.
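On the histogram point: if both screenshots are saved losslessly, the tone comparison asked for above is easy to run yourself. A minimal sketch with Pillow and NumPy (the filenames are placeholders):

```python
# Compare luminance histograms of two screenshots, e.g. native 1440p vs 4K DLSS Performance.
# Requires: pip install pillow numpy
import numpy as np
from PIL import Image

def luma_histogram(path, bins=64):
    """Normalized histogram of 8-bit luminance values for one image."""
    img = np.asarray(Image.open(path).convert("L"), dtype=np.float32)
    hist, _ = np.histogram(img, bins=bins, range=(0, 255))
    return hist / hist.sum()

a = luma_histogram("native_1440p.png")          # placeholder filename
b = luma_histogram("4k_dlss_performance.png")   # placeholder filename

# Simple L1 distance between the two tone distributions; ~0 means near-identical tones.
print("tone difference:", float(np.abs(a - b).sum()))
```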
 

poppabk

Cheeks Spread for Digital Only Future
It's not frame times, it's system latency. Totally different.

Consoles have way higher latencies than anything running reflex-enabled games, whether native, with DLSS, or even with frame gen. It can look smooth and still have lower latency than anything native on competing hardware. So what's the excuse? Like i said, if it's for pro gaming, the 4000 series already doesn't need frame gen for those types of games, nor do those games even support frame gen anyway. As of now, it's a bunch of single player games.

Then i bet everybody is perfectly fine with huge mechanical keyboard switches whose travel distances add 2ms + 3ms from the keyboard alone, and then oh boy, you better not have a goddamn peripheral on bluetooth.

It's bullshit, 10ms of system latency is nothing.
Maybe I am confused but surely latency depends on the frametime. Even with zero extra latency at 10fps you would see the frametime as latency?
 

Buggy Loop

Member
Maybe I am confused but surely latency depends on the frametime. Even with zero extra latency at 10fps you would see the frametime as latency?

System latency takes all of that into account, yes.
  1. The input latency (peripheral latency)
  2. To CPU (game latency)
  3. To Render queue (render latency)
  4. To GPU (render latency, the framerate's inherent latency, the latencies added to every frame for the likes of RT, DLSS, frame gen, etc.)
  5. To Display
This is not analyzed by any software. It needs something like the Reflex analyzer. But even looking at just the GPU stage, there's so much going on nowadays with post-processing effects, upscaling, and motion vectors that estimating latency from framerate alone is mostly a thing of the past. Almost no game will see a pure 16.6ms for 60 fps. The likes of Apex Legends have 25.3 ms at 248 fps, while having 43.5 ms at 95 fps. Not linear at all. Valorant at >300 fps sees no return from these render queue optimizations and will be in the range of 20ms (rather than the 1/300 frametime of 3.33 ms).
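As a toy illustration of why "latency = 1000 / fps" breaks down once a render queue and post-processing are in the chain (every number below is a made-up placeholder, not a measurement of any real system):

```python
# Toy model: end-to-end system latency as the sum of pipeline stages (values in ms).
# The stage numbers are illustrative placeholders, not measurements.
stages = {
    "peripheral":   2.0,   # input device
    "game/CPU":     6.0,   # simulation + draw submission
    "render queue": 8.0,   # frames buffered ahead of the GPU
    "GPU frame":   10.4,   # includes upscaling / post-processing cost
    "display":      5.0,   # scanout + panel response
}

fps = 1000 / stages["GPU frame"]       # ~96 fps judging by the GPU frame alone
naive_latency = 1000 / fps             # the "frametime equals latency" intuition
system_latency = sum(stages.values())  # closer to what a Reflex-style analyzer reports

print(f"{fps:.0f} fps: naive {naive_latency:.1f} ms vs end-to-end {system_latency:.1f} ms")
```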
 
Soon^TM

87-DE00-B1-CAE6-4664-8145-CC6-EF2059885.jpg


People have no idea the leap AMD has to make to catch up. Algorithms without ML can only go so far, and they're nonexistent in the AI space.

This very HUB video uses the DLSS version that came with the game at launch. If you inject the better DLSS versions, you’ll probably be looking at >90% of games where it looks better than native.
My point wasn't that FSR will match DLSS, it won't, but that it will continue to improve over what it is now, just as DLSS will continue to improve.

I also wonder if there will be an ML component down the track with MS. MS has been working on AI and ML for a long time. They have DirectML, and we saw that they had indeed been working on upscaling games with what they showed on Forza Horizon.
MS added int4 and int8 support to the Series X/S for a reason, and they have said it will be able to upscale games using that, so the question is when, or even if, we will ever see this.
 

winjer

Gold Member
How does performance mode hold up again? The gap widens anytime FSR has fewer pixels to upscale from. The image you quoted was at 1080p, where FSR is quite useless. 4K Quality mode is the only res/quality setting FSR can compete at, and even then it never matches DLSS.

Yes, FSR2 still lags behind in Performance mode. But that mode is only usable at 4K, be it with DLSS2 or FSR2.
I never used DLSS2 below Quality mode at 1440p. Even with Balanced I could immediately tell the drop in quality. Performance mode is even worse.
In static shots it kind of looks ok. But in movement, the drop in quality is immediately apparent.
 

Loxus

Member
The AI acceleration in RDNA3 and CDNA pales in comparison to Nvidia's tensor cores though. Like not even in the same ballpark.

XDNA is a big unknown at this time.
How do you know it pales in comparison if it isn't even utilized yet?
 

Zathalus

Member
How do you know it pales in comparison if it isn't even utilized yet?
What do you mean not utilized? RDNA3 and CDNA have been out for a while. That being said, the specifications for AI acceleration for all the cards have been released, and Nvidia's Tensor cores are simply on another level. The very few AI benchmarks you can get of the 7900 XTX support this.
 

winjer

Gold Member
What do you mean not utilized? RDNA3 and CDNA have been out for a while. That being said, the specifications for AI acceleration for all the cards have been released, and Nvidia's Tensor cores are simply on another level. The very few AI benchmarks you can get of the 7900 XTX support this.

RDNA does not have tensor units. It's just DP4a.
But CDNA has them. Unfortunately, AMD decided not to have these Matrix units in RDNA.
So the benchmarks in AI for the 7900XTX don't relate to CDNA.
 

Zathalus

Member
RDNA does not have tensor units. It's just DP4a.
But CDNA has them. Unfortunately, AMD decided not to have these Matrix units in RDNA.
So the benchmarks in AI for the 7900XTX don't relate to CDNA.
Yeah, but we have the specifications of both the latest CDNA and Hopper GPUs and Nvidia is still far ahead when it comes to AI acceleration.
 

winjer

Gold Member
Yeah, but we have the specifications of both the latest CDNA and Hopper GPUs and Nvidia is still far ahead when it comes to AI acceleration.

You have the specs for CDNA3 and Hopper? Care to share them?
 