
Game Ready Driver for RDR2 Out Now; RTX 2080 Ti Can Only Do 4K@60 with a Mix of Medium & High Settings

thelastword

Banned
NVIDIA has released a new Game Ready driver (version 441.12) today, delivering optimizations for Red Dead Redemption 2 (out tomorrow) and Need for Speed Heat (out on Friday). The official changelog document also mentions new optimizations for Borderlands 3.

NVIDIA also posted a blog article today where they shared graphics card recommendations for running Red Dead Redemption 2 at 60 frames per second on PC. It looks like it won't be the easiest feat to pull off, particularly at high resolution. NVIDIA recommends a GeForce RTX 2060 to play the game at 1080P@60FPS on High settings, a GeForce RTX 2070 SUPER to play at 1440P@60FPS on High settings, and a GeForce RTX 2080 Ti to play at 4K@60FPS.

However, not even the mighty RTX 2080 Ti can pull off High settings at that resolution and frame rate, which is why the folks from NVIDIA recommend either dropping the detail level to a mix of Medium and High settings, or alternatively dropping the rendering resolution to 3264x1836 or 2880x1620 and then using the sharpening and upscaling technique now available through NVIDIA's Control Panel to make up for the resolution drop.
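For a sense of how much rendering work those fallback resolutions save, here is a quick back-of-the-envelope pixel-count comparison (my own arithmetic, not from NVIDIA's post):

```python
# Pixel counts of NVIDIA's suggested fallback resolutions relative to native 4K.
resolutions = {
    "native 4K": (3840, 2160),
    "3264x1836": (3264, 1836),
    "2880x1620": (2880, 1620),
}

native_pixels = 3840 * 2160  # 8,294,400 pixels per frame at native 4K
for name, (width, height) in resolutions.items():
    pixels = width * height
    print(f"{name}: {pixels:,} pixels ({pixels / native_pixels:.0%} of 4K)")

# 3264x1836 renders only ~72% of 4K's pixels and 2880x1620 only ~56%,
# which is why upscaling plus sharpening can claw back so much performance.
```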

The good news is that Red Dead Redemption 2 features a built-in benchmark, making it easy for everyone to measure their system's performance. We'll have our own performance impressions and benchmarks up soon enough; stay tuned.

https://wccftech.com/game-ready-dri...y-do-4k60-with-a-mix-of-medium-high-settings/

-----------------------------------
So either the PC settings are much better than the consoles', or this is not the best-optimized game out there. Medium and High settings to achieve 4K 60fps on a 2080 Ti, and 1080p 60fps on High requiring an RTX 2060, seems kinda high. So what happens to 580- and 1060-class GPUs? We shall see when it releases.
 

PhoenixTank

Member
So either the PC settings are much better than the consoles', or this is not the best-optimized game out there. Medium and High settings to achieve 4K 60fps on a 2080 Ti, and 1080p 60fps on High requiring an RTX 2060, seems kinda high. So what happens to 580- and 1060-class GPUs? We shall see when it releases.
This is strange given that an RX 480 is listed as part of the recommended specs, which as far as I can tell meant 1080p60.
 
Not surprising.

 
Eh, I'll wait for the benchmarks. And it's nice this game has a built in benchmark.

As in MOST games there are going to be a small handful of settings that DESTROY performance for barely noticeable graphical improvements.

I bet that we'll be able to get 4k 60fps out of 1080ti level GPUs with some reasonable tweaks that maintain 95%+ graphical fidelity.

I remember GTA V for PC, the "normal" graphics settings were console settings. Is "medium" the new normal?

This is also Nvidia we're talking about. Are they talking about having to lower other settings because ... we couldn't possibly turn down/off the all important RTX RayTracing now could we.
 

chinoXL

Member
And yet people are expecting miracles out of next-gen consoles, which will be much less powerful than a 2080 Ti.
That's why I said a lot of next-gen games will just look like current-gen games, but at 4K.
It's not that cut and dried though. Some of the games the PS4 and Xbone pulled off this gen with their relatively lower-powered GPUs can't be compared 1:1 to a PC GPU.
 

Alexios

Cores, shaders and BIOS oh my!
This is still relevant.
That "only" in the title and its connotations further elaborated on in the rest of the post are ignorant and misguided to say the least.
 

thelastword

Banned
This is still relevant.
That "only" in the title and its connotations further elaborated on in the rest of the post are ignorant and misguided to say the least.
Max settings is how you test the best the game offers. Sometimes the aesthetic gains are not worth it, and sometimes certain features just bring the GPU to its knees, but it is how things are tested. As a PC guy, you know what you have to do for playability, but for the raw maxed-sliders tests that show which GPUs come out on top, yes, they are good for benchmarks. No one is saying you should play at max, and no one is ignorant for playing with the best settings either. You do you; it's their money, not yours, and they can play how they want to play.
 
It's that this thread title (which I realize is not yours) is clickbait. It makes it seem like 4K60 is somehow impossible even for the most powerful GPUs.

In reality 4k 60 will likely be very possible on lesser GPUs while only having to tone down the most extreme offenders in the graphics settings.

Anyway, we will see in about 9 hours from now :messenger_sunglasses:
 

Jtibh

Banned
Ok, so how does this work?
The Xbox runs at native 4K, if I am not mistaken, at Medium to at most High settings.
And yeah, at 30fps.
But I am under the impression fps has to do with the CPU?
So why is the PC having issues?
 

Alexios

Cores, shaders and BIOS oh my!
Max settings is how you test the best the game offers. Sometimes the aesthetic gains are not worth it, and sometimes certain features just bring the GPU to its knees, but it is how things are tested. As a PC guy, you know what you have to do for playability, but for the raw maxed-sliders tests that show which GPUs come out on top, yes, they are good for benchmarks. No one is saying you should play at max, and no one is ignorant for playing with the best settings either. You do you; it's their money, not yours, and they can play how they want to play.
Where did I say anything that even begins to beg for a response like yours? It's like you're quoting the wrong post. Nowhere did I say someone is ignorant for playing at max settings or for choosing to do anything how they want. What's ignorant is to try and knock a piece of hardware or software by saying that, in combination, they "only" achieve this or that performance level at less than some nebulous, vague target like some game's High settings, before knowing anything about what that entails and whether it really shows a deficiency of the hardware, or even of the software for having options that can perform this way under the tested combination.

Basically, actually read the OP I linked. It makes a very simple and to-the-point, er, point; even someone without much knowledge of the subject can get it. There's nothing whatsoever judgmental about how one chooses to play or test, either in that or in my own post, for you to respond with this random shit I'm quoting here.

We just have two things here: the settings and the performance achieved by the GPU. To imply that this can only be because of either weak hardware or bad optimization, rather than simply reporting the facts (or rather Nvidia's statements, as we don't know if they're fact yet) where neither has to be true, is ignorant for sure.
 

thelastword

Banned
Eh, I'll wait for the benchmarks. And it's nice this game has a built in benchmark.

As in MOST games there are going to be a small handful of settings that DESTROY performance for barely noticeable graphical improvements.

I bet that we'll be able to get 4k 60fps out of 1080ti level GPUs with some reasonable tweaks that maintain 95%+ graphical fidelity.

I remember GTA V for PC, the "normal" graphics settings were console settings. Is "medium" the new normal?

This is also Nvidia we're talking about. Are they talking about having to lower other settings because ... we couldn't possibly turn down/off the all important RTX RayTracing now could we.
There is no Raytracing/RTX in RDR2...

I think the game will be playable at console settings on an RX 580 easily, with enhanced features over that too. I think the better CPUs on PC will help with the framerate quite a bit, but we shall see; PC might see a nice boost over consoles. And I still don't think it was the most optimized game on the Pro, with that resolution and some of the textures overall, but you can't deny that draw distance/LOD and foliage, and even textures and lighting, are going to be boosted on PC...
 

Siri

Banned
Can’t wait.... can I do some system bragging now... okay, here goes.... 9900K liquid-cooled CPU, RTX 2080 Ti FE GPU, DDR4 3200MHz Corsair RAM, M.2 SSD, LG C9 OLED 4K, now with G-Sync!

Only cost me 8k!
 

dotnotbot

Member
Can’t wait.... can I do some system bragging now... okay, here goes.... 9900K liquid-cooled CPU, RTX 2080 Ti FE GPU, DDR4 3200MHz Corsair RAM, M.2 SSD, LG C9 OLED 4K, now with G-Sync!

Only cost me 8k!

Big props for OLED. It always breaks my heart when I see #pcmasterrace with top specs, a 2080 Ti and shit, all that paired with a shitty, disgusting 1337 Hz TN or an IPS monitor glowing like a Christmas tree.
 

PhoenixTank

Member
Ok, so how does this work?
The Xbox runs at native 4K, if I am not mistaken, at Medium to at most High settings.
And yeah, at 30fps.
But I am under the impression fps has to do with the CPU?
So why is the PC having issues?
Simplifying: the CPU is weak on the consoles, usually forcing a target of 30fps. If you're limited in that respect and have GPU headroom, you can bump up the visuals or resolution without a hit to framerate.
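That explanation boils down to a toy model (my own sketch, not from the post): the framerate you actually get is capped by whichever of the CPU or GPU is slower, so spare GPU headroom goes into resolution or settings instead.

```python
def effective_fps(cpu_fps: float, gpu_fps: float) -> float:
    """Toy model: achieved framerate is limited by the slower component."""
    return min(cpu_fps, gpu_fps)

# Console-like case: a weak CPU caps the game at 30fps even though the GPU
# has headroom; that headroom can be spent on visuals at the same 30fps.
print(effective_fps(cpu_fps=30, gpu_fps=45))   # 30

# PC case: a faster CPU removes the cap, so the GPU becomes the limit.
print(effective_fps(cpu_fps=120, gpu_fps=60))  # 60
```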
 

Alexios

Cores, shaders and BIOS oh my!
Can’t wait.... can I do some system bragging now... okay, here goes.... 9900K liquid-cooled CPU, RTX 2080 Ti FE GPU, DDR4 3200MHz Corsair RAM, M.2 SSD, LG C9 OLED 4K, now with G-Sync!

Only cost me 8k!
Better be the ~$5k 77" C9 TV, otherwise you overpaid by a few thousand dollars.
 

ethomaz

Banned
Ok, so how does this work?
The Xbox runs at native 4K, if I am not mistaken, at Medium to at most High settings.
And yeah, at 30fps.
But I am under the impression fps has to do with the CPU?
So why is the PC having issues?
FPS has to do with both the GPU and the CPU...
To have a better idea you need to know if the game is GPU- or CPU-bound... if it is GPU-bound (like GTA) then you will need twice the GPU power to make a 30fps game run at 60fps.
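A quick sketch of the arithmetic behind that doubling claim (my own illustration, assuming frame time is entirely GPU-limited and scales inversely with GPU throughput; the function name is made up for this example):

```python
def required_gpu_scale(current_fps: float, target_fps: float) -> float:
    """Factor by which GPU throughput must grow to hit target_fps,
    assuming the game is 100% GPU-bound (frame time ~ 1 / throughput)."""
    current_frame_ms = 1000.0 / current_fps  # time per frame now
    target_frame_ms = 1000.0 / target_fps    # time budget per frame at target
    return current_frame_ms / target_frame_ms

print(required_gpu_scale(30, 60))  # 2.0 -> twice the GPU power for 30 -> 60fps
# In a CPU-bound game this breaks down: extra GPU power would not help at all.
```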
 

Celcius

°Temp. member
I know a lot of people are interested in RDR2, but I'd be interested in knowing more about how this driver improves things for BL3
 

Max_Po

Banned
No need to worry, everyone on here has SLI setup with 2 x 2080 TI.

Everyone was laughing at the Xbox One's 4K 30 fps... or the PS4 Pro's checkerboard reconstruction...

A year later their Team Green God has failed them.....



on a serious note.... can't wait to test out my 1080 Ti.
 

daninthemix

Member
On max settings GTA5 had about ten times the foliage of the console versions, and it went out about 4 times further into the distance. Complete overkill, and you could claw back a ton of performance by reining in just that one setting.

I suspect it will be the same here.
 

Kagey K

Banned
Oof, if this is master race, I’m cool being medium race.
Good luck to you all chasing that dream though.

I'd rather invest my money in something good, while using my surplus for entertainment.
 

h00jraq

Neo Member
And yet people are expecting miracles out of next-gen consoles, which will be much less powerful than a 2080 Ti.
That's why I said a lot of next-gen games will just look like current-gen games, but at 4K.
I think you have no idea about the hardware of current/next-gen consoles or about programming.
Looking at pure numbers, the GPU in the Xbox One X is, performance-wise, probably a Radeon RX 580 but with better memory bandwidth. And it has 40 "customized units", and we have no idea what that even means, to be honest.
Going by teraflops, the Xbox One X GPU just can't do 4K, no matter what. But then just look at RDR2, DMC5, and the latest Gears 5, which runs between 1584p and 2160p (4K) at 60 fps. An RX 580 can't do that.

But the tools provided by M$ and Sony give you direct access to the hardware's power, and you don't lose that power somewhere in between. Game optimization is much, much better, and because of all that, overall performance is much better than you could expect by looking at the hardware.

Taking all of that into account, it is quite easy to imagine that the new consoles will be capable of doing 60fps@4K.
 
Looking forward to seeing what graphics options are available and in-depth comparison to the console versions.

I think it's ~107 GB on PS4 but it's ~150 on PC. What will those extra GBs get us?
 

Romulus

Member
I'm thinking RDR2 on PS5 will be 4K 60fps with dynamic resolution and infrequent drops, on mixed settings.

If Gears 5 can do 1580p-4K at 60fps on the X1X with a shit tablet CPU that's worse than or barely on par with an i3, the PS5 and Scarlett will easily pull off better results, especially with a massively improved CPU, a better GPU, and even more bandwidth.
 

Alexios

Cores, shaders and BIOS oh my!
I think you have no idea about the hardware of current/next-gen consoles or about programming.
Looking at pure numbers, the GPU in the Xbox One X is, performance-wise, probably a Radeon RX 580 but with better memory bandwidth. And it has 40 "customized units", and we have no idea what that even means, to be honest.
Going by teraflops, the Xbox One X GPU just can't do 4K, no matter what. But then just look at RDR2, DMC5, and the latest Gears 5, which runs between 1584p and 2160p (4K) at 60 fps. An RX 580 can't do that.

But the tools provided by M$ and Sony give you direct access to the hardware's power, and you don't lose that power somewhere in between. Game optimization is much, much better, and because of all that, overall performance is much better than you could expect by looking at the hardware.

Taking all of that into account, it is quite easy to imagine that the new consoles will be capable of doing 60fps@4K.
DMC5 is not native 4K on the X; it's reconstructed, and I can't find a number for the actual resolution it reconstructs from. Digital Foundry did a very good comparison video where the 580, with some settings similar to the X and others higher, is absolutely flawless at native 1440p, never dropping below 60fps during gameplay. Bumping it up to 4K and enabling the interlacing option, it doesn't quite reach 60, but it hovers close enough to think that if the game offered the same 580-optimized reconstruction technique as on the X, rather than just the catch-all interlacing, and he lowered the settings enough to match the X rather than slightly surpass it based on his preference, maybe it would maintain 60 on the 580. Either way it's pretty darn close, so there's no magic-sauce difference, especially with Windows/Denuvo hogging resources. No idea about the other games; I just recalled that about DMC5.
 

klosos

Member
I don't care about 4K; I am on the 21:9 3440x1440 ultrawide train. After you have used a top-quality ultrawide you will never go back to 16:9. It's the only way I have gamed in the last two years.
 

sendit

Member
And yet people are expecting miracles out of next-gen consoles, which will be much less powerful than a 2080 Ti.
That's why I said a lot of next-gen games will just look like current-gen games, but at 4K.

Next gen games will look much better.
Lol what? A 1080 Ti Founders faster than a non-Founders 2080? RTX tax in more ways than one I guess :messenger_grinning_sweat:

This I will have to agree with. The 2080 TI should just have been the 2080.
 