Hogwarts Legacy is a fucking mess: it can consume 24 GB of RAM and 14 GB of VRAM (saw that on my 6800) while at the same time barely looking better than the Xbox version, which has 13 GB of total memory available...
I think a few games last year were really, really badly optimized on the memory front, and this massive VRAM requirement spike won't become a trend in the end. But 8 GB is really on the edge; I can't use a card like that with a 4K screen, but for a 1440p target it can probably still be decent.
yeah, 4k was fun while it lasted. it really looks gorgeous compared to 1440p. the only saving grace of 1440p for me is that it beats 1080p by a noticeable margin, but by itself it is not as impressive as 4k (of course it shouldn't be, 4k is just a brilliant target). i like 4k dlss performance better than native 1440p, for example. but it is what it is, a 4k buffer with 8 gb of vram going forward is just not feasible
1080p folks will probably do decently with 8 GB. I will have to squeeze every bit out of optimized settings to make 1440p DLSS balanced work as a minimum baseline. I feel like I will be able to hold out till GTA 6, and by then my GPU will be 6 years old, so I will be able to say I got great mileage out of it (counting all the niche path-traced old titles, the dlss/reflex stuff, low-latency gaming,
DLDSR, and so on etc. etc.) it has been 3.5 years and im mostly fine with the product, but it helps that i got it for its 500-buck MSRP. for 500 bucks, being able to play gorgeous games like alan wake 2 / avatar decently in 2023, i would say it did okay. and it will keep doing so for a while from the looks of it. but people who got it at absurd 900+ buck prices, whelp, they probably have every right to complain about it... (they also have themselves to blame for it though)
dlss itself is a great feature imo and for that alone it was a worthy investment. it is the only upscaler I can stomach at 1440p
also notice how the exact same problem now happens in the final fantasy game. it probably has 1-2 gb less vram to work with than the devs intended within the PS5's budget, and as a result the game loads garbage textures all over the place. i can't say this is a problem with unreal engine in general. why? because i played jedi survivor with ray tracing at 1440p and the textures look phenomenal at all times. so i don't even understand how certain games get so bad so quickly. there are good unreal examples and bad unreal examples, which makes it hard to make blanket statements about it.
it probably has to do with texture classification. identifying textures by their distance and importance must play a big role in proper texture streaming. for all the unoptimized mess jedi survivor is, it actually has the most decent texture streamer of any unreal engine game i've seen. even in the 4 gb vs 8 gb video hardware unboxed did, the 4 gb card actually looked "decentish" considering how heavy the game appears to be on vram. not once while playing jedi survivor did i see potato, garbage or horrible textures, despite pushing the visuals to their limits
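the distance/importance idea above can be sketched roughly like this. a minimal python toy, not any engine's actual streamer: the formula, the `importance` weight, and all numbers are made-up assumptions, just to show how a streamer might pick blurrier mips for far-away or unimportant surfaces.

```python
import math

def desired_mip(distance, base_resolution, importance=1.0, fov_scale=1.0):
    """Pick a mip level: farther / less important textures get blurrier mips.

    distance        - world-space distance from the camera to the surface
    base_resolution - texture's full resolution in texels (e.g. 4096)
    importance      - hypothetical priority weight (hero asset > background prop)
    fov_scale       - hypothetical screen-coverage factor
    """
    # Rough screen coverage: texels needed shrink with distance.
    texels_needed = base_resolution * fov_scale / max(distance, 1.0)
    # Each mip halves resolution, so the drop is log2 of the ratio.
    mip = math.log2(base_resolution / max(texels_needed, 1.0))
    # Important textures are biased toward sharper (lower) mips.
    mip -= math.log2(max(importance, 0.125))
    max_mip = int(math.log2(base_resolution))
    return min(max(round(mip), 0), max_mip)
```

so a 4096px texture right in front of the camera asks for mip 0 (full res), the same texture 16 units away only needs mip 4, and flagging it as twice as important pulls that back to mip 3. real streamers fold in uv density, screen resolution and pool pressure, but the classification step is the same shape.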
texture streaming, however, is crucial and will be the name of the game if devs want to scale their games gracefully from the series s to 8 gb cards to the series x/ps5. it will benefit everyone, and nvidia probably... preyed on/hoped for this to happen.
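scaling one asset set across series s, 8 gb cards and series x/ps5 budgets boils down to the same greedy idea: keep dropping top mips from the least important textures until the resident set fits. a toy sketch under made-up assumptions (the field names, the 4x-per-mip memory ratio, and the quality floor are all illustrative, not any engine's real policy):

```python
def fit_to_budget(textures, budget_mb):
    """Greedy streaming sketch: bias the least important textures down one
    mip at a time until the resident set fits the VRAM budget.

    textures: list of dicts with 'name', 'size_mb' (full-res footprint) and
              'priority' (higher = keep sharper). Dropping one mip roughly
              quarters a texture's memory. Inputs are copied, not mutated.
    """
    MIP_FLOOR = 4  # assumed quality floor: never degrade more than 4 mips
    resident = {t['name']: dict(t, mip_bias=0) for t in textures}
    total = lambda: sum(t['size_mb'] for t in resident.values())
    while total() > budget_mb:
        # Only textures that still have quality headroom are candidates.
        candidates = [t for t in resident.values() if t['mip_bias'] < MIP_FLOOR]
        if not candidates:
            break  # out of headroom; over budget, garbage textures ahead
        # Degrade lowest priority first; among equals, the least degraded.
        victim = min(candidates, key=lambda t: (t['priority'], t['mip_bias']))
        victim['mip_bias'] += 1
        victim['size_mb'] /= 4
    return resident
```

the interesting property is the graceful part: at a series-x-sized budget nothing gets touched, at 8 gb only the background props lose a mip or two, and only at a series-s-sized budget do hero assets start getting biased. the "break" branch is exactly the failure mode described above, where a game runs 1-2 gb short of its intended budget and starts showing garbage textures.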