There is a comparison with the 2080 Ti in Doom Eternal. The RTX 3080 is 50% (up to 65%) faster, and the 2080 Ti has 1GB more VRAM.
Yes, it seems Nvidia has recently released a new comparison video against the 2080 Ti for Doom Eternal. Granted, we don't know what settings/drivers etc. they are using, and as with anything we will need to wait for independent benchmarks to see the real performance and make an informed comparison. But that looks like a nice uplift for Doom.
It looks like the RTX 3080 will indeed be around 80% faster than the standard RTX 2080, and probably more with RT.
I think 60-70% might be a more realistic figure? Either way it's an impressive uplift, but again we will need to wait for real benchmarks and comparisons to see how it plays out across a number of titles. Of course with RT on I expect a bigger uplift than in pure rasterization, as the RT gains for Ampere seem great compared to Turing.
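Since percent-faster figures against different baseline cards keep getting mixed together in this thread, here's a minimal back-of-envelope sketch of how to translate between them. The 25-30% gap assumed between the 2080 Ti and the vanilla 2080 is purely illustrative, not a benchmarked number:

```python
# Back-of-envelope only: "X% faster" figures are ratios, so they multiply
# rather than add, and the baseline card matters a lot.
# Illustrative assumption (NOT a benchmark): the 2080 Ti is ~25-30% faster
# than the vanilla RTX 2080.

def over_2080ti(gain_vs_2080: float, ti_vs_2080: float) -> float:
    """Translate a fractional gain over the RTX 2080 into the implied
    percentage gain over the 2080 Ti, given an assumed Ti-vs-2080 gap."""
    return ((1 + gain_vs_2080) / (1 + ti_vs_2080) - 1) * 100

for gain in (0.60, 0.70, 0.80):   # the uplift figures debated above
    for ti_gap in (0.25, 0.30):   # assumed 2080 Ti lead over the 2080
        print(f"{gain:.0%} over a 2080 -> ~{over_2080ti(gain, ti_gap):.0f}% "
              f"over a 2080 Ti (if the Ti leads by {ti_gap:.0%})")
```

The takeaway is just that a "50-65% faster than a 2080 Ti" claim and an "80% faster than a 2080" claim aren't directly comparable until you fix the baseline, because the ratios multiply.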
You say RDNA2 GPUs will be competitive, but Ampere is already much cheaper compared to Turing, and even if RDNA2 matches Ampere's raster performance, Nvidia will still have the upper hand (DLSS, better compatibility with existing RT games).
When I say competitive I mean roughly equal/higher rasterization performance, ya know, the actual "power" of a card that we have always compared. No matter what anyone says, this is still and will always remain the most important metric regarding a card's performance. At the moment there are thousands of games on Steam, for example, that don't use either RT or DLSS, and there will continue to be newer releases that don't support either.
I find it interesting that for Nvidia fans, or just people swept up in the current 3000 series hype train, the narrative seems to keep shifting as new rumours come out about what RDNA2 might bring.
First the narrative was "Lol, AMD would be lucky to compare to a 3070 with their highest card! Expect mid-range performance from AMD as usual!". Then, as rumours came out suggesting that RDNA2 might actually be competitive or might even exceed the 3070/3080 in power, suddenly the narrative shifted to "Well even if they do match/exceed it in rasterization, that is not important anymore! Because it is all about RT and DLSS!!".
RDNA2 cards will have ray tracing. How performant it will be is anyone's guess. Current rumours seem to suggest better than Turing, but probably not as good as Ampere. So let's assume for now that a best-case scenario for RDNA2 would be somewhere between 60-80% as performant as Ampere at RT; to put that in perspective, a scene an Ampere card runs at 60 fps with RT would land at roughly 36-48 fps. That is still pretty solid, especially for a first-generation RT product. And dare I say it, reasonably competitive? Now, maybe it could also be much worse than that figure? Hard to say until we have any actual benchmarks.
As for DLSS, it seems like a fantastic technology. Nvidia's R&D teams have come up with something really great here, and I'd really like to see them continue to advance it. Is it the be-all and end-all of GPU/graphics technology? Of course not. Is it still great for games that support it? Yeah, it seems like a great feature.
Does it look "better than native 4K"? No, of course not. Listen, a lot of people need to realize this:
Fake is never better than Real.
DLSS is not perfect: it introduces some artifacts and obviously is not as good as native res. Having said that, is it "close enough" in a lot of cases? It seems like it might be. I say "might be" because I have not used it myself, so I can't comment from first-hand experience, but most reports have it performing quite well for a small reduction in image quality, with a few scaling artifacts here and there.
Now here is the rub: there are only around 10 games that support DLSS right now, out of the thousands on PC. Most recent and older games will never support DLSS, and even a lot of newer games probably won't. The reason is that, so far, it has to be implemented on a game-by-game basis, rather than being something that can be applied to all games. So trying to use DLSS as some kind of trump card when it is barely supported as of now is a bit silly.
People also seem not to understand that this is not something you can just automatically turn on and have it work for all games. So when people try to downplay actual power (rasterization) and instead hype DLSS as more important, when it has only a tiny handful of bespoke supported games, that is just silly; it won't matter to 90% of people and, so far, 99% of games. Could it become more important/better supported in the future? Sure, and if almost every game supports DLSS in 4 years' time then I'll eat crow.
Will AMD have something to compete with DLSS? So far I haven't heard anything about it, so we can probably safely guess it is unlikely. Could they have something in the future, either through software updates or later cards? Who knows, maybe. Again, it depends on how much DLSS continues to improve and how widely it is adopted by developers. Right now, anyway, I wouldn't base my GPU purchase solely on DLSS/AI upscaling as the deciding factor or most important feature.
Do I think AMD will be competitive on price? Almost definitely. It has generally been their strategy to undercut Nvidia, so I expect either slightly cheaper or, worst-case scenario, equal pricing for the equivalent power tier from Nvidia. I'm not really trying to downplay Nvidia here and upsell AMD, nor even trying to pop the hype bubble for Nvidia/3000 series fans; my only goal is to try to rein in expectations a little and bring everyone back down to earth. Also, I'm correcting false information that we see repeated continuously, such as "Nvidia claimed 2x the performance of a 2080 Ti!" etc.
As with anything, I could be wrong about everything above. Most of what we know about Ampere right now is marketing, without any real independent benchmarks, and we only have varying, sometimes contradictory rumours about RDNA2. The best and most measured course of action is to wait until both release and we have actual benchmarks to compare them.