Let’s see
The Last of Us Part 1 - Problematic - AMD sponsored
Hogwarts Legacy - No problems after the fix - No sponsor
Resident Evil 4 - Problematic - AMD sponsored
Forspoken - Problematic - AMD sponsored
A Plague Tale: Requiem - Not problematic - Nvidia sponsored
The Callisto Protocol - Problematic - AMD sponsored
Warhammer 40,000: Darktide - No problems - Nvidia sponsored
Call of Duty: Modern Warfare II - Not problematic for VRAM, but a huge performance gap in AMD's favor - No sponsor
Dying Light 2 - Not problematic - Nvidia sponsored
Dead Space - Not problematic - No sponsor
Fortnite - Not problematic - No sponsor
Halo Infinite - Problematic - AMD sponsored
Returnal - Not problematic - No sponsor (?)
Marvel's Spider-Man: Miles Morales - Not problematic - Nvidia sponsored (?, though I think that was more the remastered version)
".. definitive proof that 8GB of VRAM is no longer sufficient for high-end gaming and to be clear, i'm not talking about a single outlier here in TLOU part 1, there are a number of new titles, AAA titles, that will break 8GB GPUs, and you can expect many more before year's end and of course, into the future.."
He says this with that smug face.
It's almost as if there were some sort of pattern here between Hardware Unboxed's bias and AMD sponsorships..
Same guy who claimed that Ampere's tensor cores had no advantage over Turing's, even though the 3070's 184 tensor cores were matching the 2080 Ti's 544. Or that the 3080 is faster at things like ray tracing simply because it's a faster GPU, not because of its 2nd-gen RT cores - as if a ~90MHz clock difference could explain a 42% increase in performance.
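The clock argument falls apart with basic arithmetic. A quick sketch (the boost clock below is an assumed round number for illustration, not a quoted spec) of how much speedup a 90 MHz bump could buy even under ideal linear scaling:

```python
# Rough estimate: performance gain from core clock alone,
# assuming (generously) perfect linear scaling with frequency.
base_clock_mhz = 1620                  # slower card's boost clock (assumed)
fast_clock_mhz = base_clock_mhz + 90   # the "90 MHz" advantage

clock_speedup = fast_clock_mhz / base_clock_mhz - 1
print(f"Speedup from clocks alone: {clock_speedup:.1%}")   # ~5.6%

observed_speedup = 0.42
print(f"Left unexplained by clocks: {observed_speedup - clock_speedup:.1%}")
```

Even granting that best case, clocks explain a single-digit percentage, nowhere near 42%; the rest has to come from architectural changes like the 2nd-gen RT cores.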
HUB