> Hm... I know they are cherry-picked, but that this is even possible at all says a lot.

The important benchmark is Unreal Engine 4-based games, since so many games run on that engine.
> The ray tracing hardware is built into the CUs, which also perform rasterization. So, it's the same hardware that does both functions.

RDNA 2's RT hardware is placed inside a DCU, next to the TMUs.
> I present to you OP's genuine desktop wallpaper:
>
> Control - DLSS Modes Compared to Native 4K
> www.neogaf.com

That setup has me jelly... but the casing is very close to the wall... any issues with ventilation of the PC?
Hmm, I actually think the perf/watt numbers are pretty bad for NVIDIA. There's a serious problem if you consume more power and still underperform against an AMD card.
But who knows how AMD is getting their numbers. It could be vendor-specific optimizations on the dev front.
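Since perf/watt is doing a lot of work in this argument, note that it's just sustained fps divided by board power. A minimal sketch of that division, using the official board-power specs (320 W for the RTX 3080, 300 W for the RX 6800 XT) and placeholder fps values; the fps numbers are made up, so plug in whichever benchmark you actually trust:

```python
# Perf/watt = frames per second / board power.
# The fps values are PLACEHOLDERS, not measurements; the wattages are the
# official total-board-power specs for each card.
cards = [
    ("RTX 3080", 100.0, 320),    # 320 W TBP per Nvidia's spec sheet
    ("RX 6800 XT", 100.0, 300),  # 300 W TBP per AMD's spec sheet
]

for name, fps, watts in cards:
    print(f"{name}: {fps / watts:.3f} fps per watt")
```

At equal fps, the lower-power card wins this metric by definition; the whole fight is over what the fps numbers actually are.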
Where are people getting this idea that Nvidia's cards are underperforming relative to AMD's? The benchmarks indicate that they trade blows, and other factors indicate that Nvidia has better ray tracing performance.
BluRayHiDef: Moving the goalposts.
From the charts I've seen, it seems like they're either outperforming or roughly trading blows. But like I said, it could just be vendor-specific optimizations.
NVIDIA seems to be underperforming in perf/watt. That's going to be the key metric for the next decade, and NVIDIA will need some serious architectural upgrades to account for that.
You're right that for workloads outside of rasterization an NVIDIA card is probably going to outperform an AMD card, but most gamers don't care about that stuff. Most of them want more fps.
> You're right that for workloads outside of rasterization an NVIDIA card is probably going to outperform an AMD

The thing is, with that Zen 3 cache that RDNA2 is swinging, nobody knows.
> Nvidia's cards win a majority of the

Games you've picked.
> BluRayHiDef said:
> Are you guys on crazy pills, or something?
> Why is "how many RT games actually" and "does it include WoW" so triggering?
> I was told VRS is not compatible with TAA, and hence neither with TAA-based upscaling such as DLSS 2.0. Do you know if that's true?

So you hate that AMD can't compete in Cyberpunk and other AAA games that are already out or coming out? Do you not realize all games are coming out with ray tracing? Might as well get the better performer. Or will you continue to shill as usual? Remember, ray tracing sucks until AMD can do it, and DLSS doesn't compare until AMD has something like it, but when it does, maybe it'll be relevant again? That's your logic in a nutshell.
> So you hate that AMD can't compete in Cyberpunk and other AAA games that are already out or coming out? [...] That's your logic in a nutshell.

Ray tracing will stop sucking on November 12th. Mark my words.
> Ray tracing will stop sucking on November 12th. Mark my words.

Just like DLSS sucks until AMD can utilize something similar; then it'll be the new thing all over again.
You need to clarify how you picked 5 out of more than a dozen to make a point.
> AMD can't compete in Cyberpunk

Why would AMD somehow compete with an (unreleased, cough) game?
> Ray tracing will stop sucking on November 12th. Mark my words.

NV's take on brute-force RT is a tech that is unlikely to get any serious traction any time soon, and that won't change even if AMD's Zen 3 Infinity Cache lets RDNA2 cards wipe the floor with Ampere at RT.
What kind of "next gen features" does this PS4 game utilize?next-gen features. Death Stranding,
> What kind of "next-gen features" does this PS4 game utilize?

You got it. Hate all you want. If you had a choice to turn it on or off, you would keep that sucker on all the fucking time.
> You got it. Hate all you want. If you had a choice to turn it on or off, you would keep that sucker on all the fucking time.

Fancy-pants TAA upscaling?
These kinds of posts are why I call this forum SodiumGAF (or NeoSalt).
<snip> sorry
Hence, if Ampere were to be refreshed on TSMC's "7nm" manufacturing process, it would be outright faster than RDNA2 in rasterization.
> Because, let me guess, some cherry-picking videos convinced you so?
> How do you figure which cherry-picked pics are to be ignored, by the way?

You are wrong again, how surprising. It's because I own a 3080, and before that I owned a 2080 Ti. It's because I played games like Death Stranding and Wolfenstein, and now I'm playing Watch Dogs Legion.
> Because, let me guess, some cherry-picking videos convinced you so?
> How do you figure which cherry-picked pics are to be ignored, by the way?

Shall I go into your post history to expose how many times you have peddled that same image, over and over again? Why cherry-pick the worst-case scenario? Especially as someone who hasn't ever used DLSS to begin with? You can't really have a valid opinion without ever seeing it, especially since you can't tell the difference between certain obvious things, aka ray tracing/DLSS. That is, until AMD touts its superiority... Hmmm
If you're basing your opinion on screenshots / YouTube videos, you are disqualified from weighing in.
> Why cherry-pick the worst-case scenario?

Because yet another... individual essentially claimed it's better than native again.
Nvidia's cards outperform AMD's in The Division 2, Doom Eternal, and Resident Evil 3, whereas AMD's cards outperform Nvidia's in Call of Duty: Modern Warfare and Forza Horizon 4; so, Nvidia's cards win a majority of the time. Keep in mind that AMD's cards are benefiting from SAM (Smart Access Memory) as a result of being paired with Zen 3 CPUs, and may be benefiting from their "Rage Mode" overclocking feature.
-------------------------------------
Nvidia's cards win outright.
-------------------------------------
Nvidia's cards win due to each of them beating the correspondingly classed AMD card.
-------------------------------------
AMD's cards win due to each of them beating the correspondingly classed Nvidia card.
-------------------------------------
AMD's cards win outright.
-------------------------------------
Nvidia's cards win due to each of them beating the correspondingly classed AMD card.
So, Nvidia's cards win a majority of the time, and do so without benefiting from any performance-boosting feature.
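To be fair to the arithmetic, "a majority of the time" here is just a 3-to-2 tally. A throwaway sketch with the winners encoded exactly as described above (no new benchmark data):

```python
from collections import Counter

# Per-title winners as stated in the post above, not fresh measurements.
winners = {
    "The Division 2": "Nvidia",
    "Doom Eternal": "Nvidia",
    "Resident Evil 3": "Nvidia",
    "Call of Duty: Modern Warfare": "AMD",
    "Forza Horizon 4": "AMD",
}

print(Counter(winners.values()))  # Counter({'Nvidia': 3, 'AMD': 2})
```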
> I need to buy an overpriced card junk from team green

At least try to hide your bias... Unlike you, many of us want the best performer. Price is irrelevant, to an extent. And even at lower price points, how can you hate on the better performance and image quality of DLSS? Are you doing 800x zoom at 60 fps with RTX on? And even then, DLSS has proven to be better than native 4K.
> AMD tends to follow from behind

Yeah, like with Infinity Cache. Or Radeon Chill, or Anti-Lag, or DirectML, or FidelityFX (it's not only CAS, and it's used in 35+ games). Please, tell us more FUD.
We have a small 6900 XT on a 256-bit bus with GDDR6 beating a bigger card with higher power consumption on GDDR6X with a 384-bit bus.
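The bus-width point is easy to put numbers on: peak bandwidth is just the bus width in bytes times the effective per-pin data rate. A quick sketch using the public memory specs (16 Gbps GDDR6 on the 6900 XT, 19.5 Gbps GDDR6X on the 3090); the raw gap below is what Infinity Cache is there to compensate for:

```python
# Peak memory bandwidth in GB/s = (bus width in bits / 8) * effective Gbps/pin.
def peak_bandwidth_gbs(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits / 8 * gbps_per_pin

print(peak_bandwidth_gbs(256, 16.0))   # 6900 XT, GDDR6:   512.0 GB/s
print(peak_bandwidth_gbs(384, 19.5))   # RTX 3090, GDDR6X: 936.0 GB/s
```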
> I need to buy an overpriced card junk from team green to figure out whether the hyped upscaling tech from the said company, which is available in a whopping handful of games and is incompatible with VRS, is worth it to buy the said overpriced card from team green.

What the hell is "team green"? Don't start with this fanboy bullshit. I couldn't care less about the logo on the card; I care about advancements in technology.
Screenshots are not to be trusted. I think it sounds reasonable.
TAA blurs things inherently. TAA + some NN still blurs things. Learn to deal with it.
Of course it gives advantages in certain cases; it wouldn't exist if it did not.
Because yet another... individual essentially claimed it's better than native again:

> if you had an option to turn it on or off, you would keep it on
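For anyone wondering where the inherent blur comes from: the core of every TAA-style technique, DLSS 2.0 included, is an exponential blend between the current jittered frame and a reprojected history buffer. A bare-bones sketch of just that accumulation step; real pipelines add motion-vector reprojection and history clamping (which DLSS replaces with a network), but the blend is the part that smears:

```python
import numpy as np

def taa_blend(history: np.ndarray, current: np.ndarray, alpha: float = 0.1) -> np.ndarray:
    """One TAA accumulation step: the output is mostly history.

    With alpha = 0.1, roughly 90% of every output pixel comes from previous
    frames. That history weight is where the temporal stability comes from,
    and also where the smearing and ghosting come from whenever the
    reprojection isn't pixel-perfect.
    """
    return alpha * current + (1.0 - alpha) * history
```

Everything else in these techniques, clamping heuristics or the NN, exists to decide when that ~90% of history can be trusted.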
> If it's not a secret, how much did you pay for your 3080?

I don't know. How much did you pay for that meal you had last weekend? I don't track these kinds of expenses.
> I don't know. How much did you pay for that meal you had last weekend? I don't track these kinds of expenses.

Oh, that's not a problem at all. I didn't have any meals including GPUs last weekend.
> One is native, one is with DLSS

Unfortunate that those are jpeg and not png.
I'd expect this one to be the TAA-upscaled one:
https://i.imgur.com/qRJxtHU.jpg
I zoomed into the tree on the left.
> I was talking about software not hardware

Radeon Chill and Anti-Lag are software, and so is the FidelityFX suite (which, like nearly everything AMD, is cross-platform).
You would "expect" and you had to zoom into the tree. You just proven my point.
> But that one image you keep posting all the time is jpeg as well, it's not a problem.

It is a problem too, although a smaller one, as it is a jpeg of a zoomed-in crop, not the original.
> Nvidia doesn't do "TAA upscaling".

It does, and the ease with which I could spot which of the two images is DLSS 2.0 upscaled (the blurrier one) is a demonstration of it.
Again, sorry for your 3080 pain.
> If you expected me to examine a 4K pic on a 1080p monitor without zooming in, there is a lot about how images and PCs work that you need to learn.

Sorry for your 1080p monitor pain. If you're using a 1080p monitor to examine DLSS, then it's you who has a lot of learning to do.
> The ray tracing hardware is built into the CUs, which also perform rasterization. So, it's the same hardware that does both functions.

It still costs transistors.
> It does, and the ease with which I could spot which of the two images is DLSS 2.0 upscaled (the blurrier one) is a demonstration of it.

It was so easy that you had to zoom into a bunch of tree leaves (and even then, the difference is minuscule).
Let me guess, you spent $1500 on a 3090 and now you're not sleeping so well
Guys, you should stop jumping around and face it:
1) TAA and its derivatives have their uses
2) NV indeed has one of the best TAA derivatives
3) "better than native" and "like native" is utter BS
4) DLSS 2 still suffers from most TAA woes, including blur, artifacts on quickly moving objects, and wiped-out fine detail
So, there is that. A useful feature, but not even remotely the silver bullet some pretend it to be.
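Points 3) and 4) are two ends of the same mechanism. These upscalers jitter the camera by sub-pixel offsets each frame and accumulate, so on a static scene the history acts like many samples per pixel (which is where the honest "sometimes better than native" shots come from), while motion forces the history to be rejected. A toy calculation of the effective sample count of the exponential blend, using the standard N_eff = (Σw)² / Σw² definition; the 0.1 blend weight is a typical TAA value, not a DLSS internal:

```python
def effective_samples(alpha: float) -> float:
    """Effective number of accumulated jittered samples for an exponential
    history blend with weight alpha. For the infinite geometric series of
    weights w_k = alpha * (1 - alpha)**k, N_eff = (sum w)^2 / sum w^2
    simplifies to (2 - alpha) / alpha."""
    return (2.0 - alpha) / alpha

print(effective_samples(0.1))  # ~19 samples per pixel on a static scene
# A plain native frame has 1 sample per pixel, hence point 3's "better than
# native" stills; under motion the history is rejected and you fall back
# toward that single sample, which is point 4's blur and lost fine detail.
```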
> downplay DLSS by calling it TAA
> it uses machine learning

It uses NN inference when processing images; you can call it "uses machine learning" if that makes you feel better.
> a native-like image quality at much better performance levels.

I didn't have a problem spotting the blurry tree in the image from the last page.
> It uses NN inference when processing images
Whether this is "native-like" (as in, "we have upscaled from 1440p, doesn't it look like 4K?") is in the eye of the beholder.
"But sometimes it looks better": yeah, sometimes it does. And sometimes it looks worse too.