With all this talk on DLSS......the constant "AMD has to offer an alternative to DLSS or else".......it seems like déjà vu. They said the same thing about Navi 1.0 when the Super cards launched. People were hyped the same way about DLSS 1.0 as they are now for DLSS 2.0....and how many DLSS games do we have even after 2.0? Not many.......Initially, Nvidia said DLSS 1.0 was its own form of AA thanks to AI; they said no TAA for us, no time for that blurry shit. Then when the comparisons came, the IQ on DLSS 1.0 was awful, with jaggies and shimmering everywhere, so Nvidia simply adopted TAA for DLSS 2.0 and sharpened the snot out of the image, whilst using some AI to clean artefacts in the foreground. But there is still missing detail in backgrounds, because approximations are just that: approximations. Yet people call it the holy grail, when a proper comparison has not even been done on it yet. When more DLSS games are released, when some DirectML games are released, when CB 2.0/FidelityFX 2.0 is released, we will sit at this table and analyze properly. There are not enough games, not enough content, to justify any hype or a proper breakdown of the techniques, as even AMD has not evolved their tech beyond CB 1.0. At least let them release their software first and do their reveal.
DLSS 1.0 was hyped even more than 2.0, and still only a few games.... Raytracing through Turing was hyped just the same....People are only going to take stock of RT now, since AMD has made it viable on consoles and it's also a feature on their PC GPUs. Hype does not make anything stick in this industry.....The reality is this: RT is going to take off now because of AMD, and when AMD announces its upscaling and reconstruction tech, that is what is going to be used predominantly by the industry, because it's the consoles that determine how popular specific gaming features get by normalizing them. We have two consoles built from the ground up on AMD tech. Doing a faster reconstruction technique without using AI makes more sense for consoles; it would mean the technology just works without the need for any third party, supercomputer, or server farm. It would be innate to the console hardware, an easier implementation for devs, and of course easier to adopt in games...Nvidia's proprietary nonsense has never really gone too far, and with AMD dominating CPUs and now clutching at the GPU teat with a fierce bite going by the leaks so far, it's inevitable that their technologies and architecture will be prioritized over Nvidia's proprietary tech......Console tech is pretty much AMD tech, and that's 200 million gamers, so devs will use what's best for said architecture....
So preliminary AMD RT performance? Don't worry about that; devs will learn how to utilize the strengths of how AMD cards do RT and get it to run even faster, because it's what's in the consoles too. The Infinity Cache is great for rasterization, and if it's being used a bit more when RT comes into the picture, devs will use that better too. AMD's approach to RT is still new, and devs are wrapping their heads around it now; console devs are already doing great with it, even in cross-gen titles like Miles Morales, which happens to have a very impressive implementation of RT, and it will only get better. Devs will seek to get more RT performance on AMD hardware as the consoles launch, as RDNA 2 PC GPUs launch, and beyond. This is what devs will prioritize, not Ampere performance or DLSS 2.0, and you will see an uptick in the performance and quality of RT on consoles and RDNA 2 GPUs as we go along...
AMD GPUs have more VRAM, the Infinity Cache, better GPGPU performance; the writing is on the wall.....DLSS 2.0 is not Nvidia's savior, because there will be many solutions to combat it, from AMD and even from MS. Whatever becomes the most adopted, go-to reconstruction technique will have very little to do with Nvidia.... Stop hanging your hats on DLSS 2.0 as soon as you hear rumors that AMD has taken the rasterization performance crown.....
Competition is a good thing and you guys should be glad, but we're seeing threads like "AMD will not even match the 2080ti/3070".......tweets like "There's just no way RDNA2 gets close to RTX 3000 performance, if these numbers are real", which Herkelman answered with an emoji. That was shortly after Jensen gave his spiel on 30TF cards without the caveat.....and spoke of an 8K 3090, but was not in a hurry to emphasize that this was with 1440p DLSS upscaling to 8K...So yes, stop hiding behind marketing speak; cards with such TF power should ideally be doing 8K and 4K 120Hz imo.....You can't talk about 33TF and on the flip side use cost-cutting DLSS to sell your higher res......I mean, if you are going to defend something, defend a company which is pushing the technology forward with higher clocks, Infinity Cache, and 7nm, which nets more performance per watt; a company that has given consoles some damn good CPU+GPU power at 300-350 watts in a SFF, with no BS like the 4-core i7s which we would still be on had it not been for Ryzen...A company that's pushing open standards with Radeon Rays and FreeSync, whilst all the other company does is proprietary stuff to kill the competition and the industry, hog performance, and ask for insanely high prices because people are high on the Fellatio 1-0-Huang.....
The industry is changing, and it's for the good... In a few years, Infinity Fabric will be on everything; chiplets will be on GPUs and CPUs. What were we expecting, pushing insane clocks on ever-smaller monolithic dies forever....I guess if our intention was to limit core counts and maximize profit on 4-core i7s for another decade, that would be a sound plan....but I'm glad the industry is moving forward. Next gen, I'm all but ready to embrace multiple GPUs and CPUs in one package, and I'm ready to embrace even more revolutions in this industry.....Yet in the here and now, I can't deny who has got us here and provided this much-needed injection into the vein of this industry.....We were stagnant for far too long, almost like we were pushing daisies, but better days are here, that's for sure....