Nah, this is false. In the past you could see (like you still can with AMD, where they explicitly say which cards get a performance increase in which game) that the older cards were pretty much forgotten as soon as possible, unless it was a game-breaking issue. A lot of Nvidia's performance crown comes from game-specific driver improvements, unlike AMD, so when new cards come out they shift focus, and if you have an old card you won't get the performance tier your hardware would otherwise deserve.
Let me give you a concrete example, because I've been collating GPU data for myself. Normally a GTX 1080 was on par with a Vega 64, sometimes slightly better, sometimes slightly worse, but they were in the same tier. Now you can see more and more examples where the Vega just blows it out of the water, The Surge 2 being one of them. Why? And keep in mind, the funny thing is The Surge 2 was actually developed on Pascal GPUs; you can see this in some of the behind-the-scenes videos. So it's not even a question of the devs not testing the hardware.
On the other hand, Nvidia pumped out so many driver updates for Maxwell and Pascal for Witcher 3 that you can clearly see the result, and ofc it was a flagship title for Nvidia and got all the special treatment. So when I'm saying what I'm saying about the driver support, I'm talking from years of experience with this shit, because I've been reading driver notes since the 3dfx days. And ofc, if we compare other games you'll find the same pattern emerging (RDR 2 and Control, to name two others).
Data is from TechPowerUp; the unreleased cards at the bottom are just something I was visualising for myself, so obviously that performance is just an estimate. Shoulda cut them out, but meh, forgot.