RajmanGaming HD used an NVIDIA GeForce RTX 3090 and, as we can see, this $1500 GPU cannot run Watch Dogs: Legion at 60fps in 4K/Ultra settings.
Who would've thought that a graphics card that costs a whopping $1500 and packs an astounding 10,496 CUDA cores, 24GB of GDDR6X VRAM, and 936GB/s of memory bandwidth would fail to maintain 60fps in a typical, bland-looking Ubisoft game?
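For what it's worth, the 936GB/s figure follows directly from the card's memory configuration, and a 60fps target translates into a fixed per-frame time budget. Here is a rough sketch of that arithmetic in Python, assuming NVIDIA's published 384-bit bus and 19.5Gbps GDDR6X speed (neither is stated above):

```python
# Minimal sanity check of the quoted RTX 3090 numbers (a sketch; the 384-bit
# memory bus and 19.5 Gbps effective GDDR6X speed are NVIDIA's published
# specs, assumed here rather than taken from the article itself).
bus_width_bits = 384
effective_speed_gbps = 19.5  # effective data rate per pin, in Gbit/s

bandwidth_gb_s = bus_width_bits / 8 * effective_speed_gbps
print(f"Peak memory bandwidth: {bandwidth_gb_s:.0f} GB/s")  # -> 936 GB/s

# The 60fps target the card fails to hold, expressed as a per-frame budget
target_fps = 60
print(f"Frame-time budget at {target_fps}fps: {1000 / target_fps:.2f} ms")  # -> 16.67 ms
```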
Sure, the conditions under which it fails are extreme - Ultra settings and 4K resolution - but the RTX 3090 was designed to be The 4K Beast, a card that can even flirt with 8K.
So, Watch Dogs: Legion should be child's play for the Big Ferocious GPU, particularly with ray tracing turned off, considering the severe lack of detail in the textures of objects and the skin of character models, as well as the apparently low polygon counts of both object and character models.
This game doesn't appear to feature graphical techniques such as tessellation, subsurface scattering, volumetric lighting, or global illumination. So why is it so demanding?
Red Dead Redemption 2 is a much more graphically advanced game, yet the RTX 3090 runs it consistently at 70-85fps in 4K with Ultra settings.
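To put that comparison in plainer terms, frame rate is just the inverse of frame time, so the gap between the two games can be expressed as a per-frame rendering budget. A quick sketch of that conversion (the 70-85fps range comes from the paragraph above; the Legion figure is purely illustrative, not a measured number):

```python
# Convert frame rates into per-frame render times (ms) to make the gap concrete.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

samples = [
    ("Watch Dogs: Legion (just under 60fps, illustrative)", 55),
    ("Red Dead Redemption 2 (low end of range)", 70),
    ("Red Dead Redemption 2 (high end of range)", 85),
]
for title, fps in samples:
    print(f"{title}: {frame_time_ms(fps):.1f} ms per frame")
```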
What do you think is the explanation, PC gamers? Has the game been poorly optimized, or does it genuinely warrant being this demanding?