
Digital Foundry: Red Dead Redemption 2 PC: What Does It Take To Run At 60fps?

Evilms

Banned



Summary Part 1:
  • The full video, as well as the comparison with the console versions, will come later; this is only a brief overview of what you need to know to hit 60fps in this PC version with the visual settings closest to the console versions.
  • The Xbox One X version released last year had, for example, some settings equivalent to High on PC (but not many), others at Medium, many at Low, and even some at sub-Low levels that aren't available on PC.
  • The Vulkan API delivers better performance on cards like the GTX 1060, while there is little difference at 4K on cards like the RTX 2080.
  • Vulkan is generally better than DX12 on both AMD and Nvidia; DX12 even seems to cause some slowdowns.
  • The many graphical options available in the PC version represent a big upgrade over the console versions.
  • Apart from the crashes and issues related to the Rockstar Games Launcher, the game is well optimized.
  • Last-generation AMD cards have a performance advantage over their Nvidia counterparts, as with the RX 580 over the GTX 1060.
  • The GTX 970 versus the R9 390 is a tighter match; the GTX 970 holds up surprisingly well in this game compared to the GTX 1060.
  • The GTX 1660 and 1660 Ti show gains of 35% and 52% respectively over the GTX 1060.
  • The RX 580's performance is similar to the GTX 1070's, with a small lead over the GTX 1660.
(Update)




Summary Part 2:

  • Every graphic aspect has been tested and analyzed.
  • Red Dead Redemption 2 on PC offers more than 40 graphic options.
  • This PC version provides a high degree of flexibility and scalability.
  • With an i7-8700K processor and a graphics card like the GTX 1060/RX 580, it is possible to get 60 frames per second at 1080p with visual quality similar to PS4/Xbox One by mixing High/Medium/Low settings.
  • The Low and Medium settings on PC are not bad in themselves and in many cases can even look better than their equivalents on Xbox One X.
  • On PC, dropping from Ultra to High settings gains about 21% in performance; to Medium, 36%; and to Low, 53%.
  • For resolution, switching from 2160p to 1440p increases performance by 56%, and by 93% at 1080p.
  • The equivalent PC settings used on the Xbox One X are:
(image: table of Xbox One X equivalent PC settings)


  • For example, grass detail and tree draw distance are set to a minimum on Xbox One X, and some settings appear to be hybrids not available on PC.
  • Most of the High and Ultra settings go far beyond what you could see on PS4 Pro and Xbox One X.
  • This PC version is almost a generational leap.
(Update 2)
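If the percentage gains quoted in the summary combine independently, a quick back-of-envelope script can sketch expected framerates. This is only a rough approximation (settings and resolution interact in practice), and the 45fps Ultra/2160p baseline below is a hypothetical figure, not a Digital Foundry measurement.

```python
# Back-of-envelope sketch using the gains quoted in the summary:
# Ultra -> High +21%, Medium +36%, Low +53%; 2160p -> 1440p +56%, 1080p +93%.
PRESET_GAIN = {"ultra": 0.00, "high": 0.21, "medium": 0.36, "low": 0.53}
RES_GAIN = {"2160p": 0.00, "1440p": 0.56, "1080p": 0.93}

def estimated_fps(baseline_fps, preset, resolution):
    """Scale an Ultra/2160p baseline by the two quoted gains.

    Treats preset and resolution as independent multipliers, which is
    only an approximation; real scaling depends on where the bottleneck is.
    """
    return baseline_fps * (1 + PRESET_GAIN[preset]) * (1 + RES_GAIN[resolution])

# Hypothetical card measuring 45fps at Ultra/2160p, dropped to Medium at 1080p:
print(round(estimated_fps(45.0, "medium", "1080p"), 1))  # -> 118.1
```

This lines up with the summary's point that preset and resolution drops compound, which is how a GTX 1060-class card gets from console-like numbers to 60fps.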

 
Last edited:

CrustyBritches

Gold Member
Preliminary console-equivalent settings (some settings can't match because consoles are "lower than low"). This is a 'Low'/'Medium' mix with some 'High' settings and a dash of 'Ultra'. My head hurts from taking screencaps and looking through all the settings. *EDIT* I fucked that count up with overlapped images. Fixing right now. *EDIT* Fixed image.
 
Last edited:

stranno

Member
Long story short: another Crysis. You can get console-level graphics if you want, but if you want everything Ultra at 4K, just wait two generations of Nvidia hardware.
 
Last edited:
Interesting that the consoles have some settings that are lower than the low preset on PC.

I still think there is quite a bit of optimization work that can be done to improve PC version performance.

I'd like to see a very detailed video that shows the graphical differences in the settings. Anyone know of one yet or are we still waiting on a future Digital Foundry vid?
 

Evilms

Banned
Interesting that the consoles have some settings that are lower than the low preset on PC.

I still think there is quite a bit of optimization work that can be done to improve PC version performance.

I'd like to see a very detailed video that shows the graphical differences in the settings. Anyone know of one yet or are we still waiting on a future Digital Foundry vid?

The comparison with the console versions will come later, maybe tomorrow.

 

Kazza

Member
The Xbox One X version released last year had, for example, some settings equivalent to High on PC (but not many), others at Medium, many at Low, and even some at sub-Low levels that aren't available on PC.

Oof!

Presumably those are settings that tax the Jaguar processors. I guess even the weakest PCs nowadays have better processors than the consoles.
 

Kenpachii

Member
So pretty much what I mentioned in the old thread.

The console runs the game nowhere near Ultra.

Oof!

Presumably those are settings that tax the Jaguar processors. I guess even the weakest PCs nowadays have better processors than the consoles.

Not really; lots of settings are also turned down on the GPU side just to get near 4K. Visual quality has to be sacrificed to push insane resolutions.

CPU-wise the game doesn't seem to be very taxing.
 
Last edited:

lukilladog

Member
So 60fps takes a 12-thread CPU (to avoid stuttering), 1080p, console-like settings, and replacing your GTX 1060... good port, ok.
 
Last edited:

nkarafo

Member
I don't understand why the 1060 is so crippled in this game, performing almost the same as the 970. I feel like they're doing this on purpose to force people to upgrade? But why the 1060 in particular? Is it the one most people have or something?

I wasn't even aiming for 60fps, more like 40fps at high settings/1080p, but even that looks unfeasible...
 

lukilladog

Member
I don't understand why the 1060 is so crippled in this game, performing almost the same as the 970. I feel like they're doing this on purpose to force people to upgrade? But why the 1060 in particular? Is it the one most people have or something?

I wasn't even aiming for 60fps, more like 40fps at high settings/1080p, but even that looks unfeasible...

Yes it is:




Or just run the sub low console version

I play on PC only; I just don't bother with mediocre ports.

Yes. The only two CPUs in the world are the Ryzen 3 2200G and the i7-8700K. There's nothing more reasonably priced in between these two extremes. Strange I know, you'd think there would be an untapped market there, but apparently not.

People are reporting stuttering with 8 threads too; Gamers Nexus comments on this.
 

nkarafo

Member
Yeah, after this I'm afraid to invest in another mid-range Nvidia card. Even though I'm an emulator enthusiast and emulators tend to do better on Nvidia, I'm going to have to switch to AMD. Their mid-range cards are much better value for the performance, and they don't seem to suddenly get crippled for no reason.

Thanks a lot Nvidia.
 
People are reporting stuttering with 8 threads too; Gamers Nexus comments on this.
Gamers Nexus also reported stuttering on the 9700K but said the 2600K ran it fine, albeit with lower averages. Does the 2600K have more threads than the 9700K now?
Don't try to paint it like nothing under 6C12T can run it; that's simply not true.
 
Last edited:

lukilladog

Member
Gamers Nexus also reported stuttering on the 9700K but said the 2600K ran it fine, albeit with lower averages. Does the 2600K have more threads than the 9700K now?
Don't try to paint it like nothing under 6C12T can run it; that's simply not true.

12T still seems like your best chance to avoid stuttering in this game. The game is too picky about hardware; JayzTwoCents and Gamers Nexus reported problems with several configurations, and it's crashing for everybody. Rockstar did a poor job and still charges full price a year after the initial release. Shame.
 
Last edited:
12T still seems like your best chance to avoid stuttering in this game. The game is too picky about hardware; JayzTwoCents and Gamers Nexus reported problems with several configurations, and it's crashing for everybody. Rockstar did a poor job and still charges full price a year after the initial release. Shame.
I'll rephrase that for you. There isn't much consistency and no good predictor for which processors will or won't stutter under high load. However, the 9700K stuttering could be fixed easily by limiting it to 60FPS; the 2200G, by limiting it to 30FPS (and let's be real, nobody gaming on a 2200G is expecting a top-tier experience). It's not really surprising. Not only can high load introduce frametime inconsistency on its own, but RAGE itself likes to have a little room to breathe. GTA5 does exactly the same thing on 4C4T CPUs at excessively high framerates. If RDR2 is more CPU-heavy, then it shouldn't be at all shocking that such stuttering would manifest at lower framerates or on higher-tier parts.
 

lukilladog

Member
I'll rephrase that for you. There isn't much consistency and no good predictor for which processors will or won't stutter under high load. However, the 9700K stuttering could be fixed easily by limiting it to 60FPS; the 2200G, by limiting it to 30FPS (and let's be real, nobody gaming on a 2200G is expecting a top-tier experience). It's not really surprising. Not only can high load introduce frametime inconsistency on its own, but RAGE itself likes to have a little room to breathe. GTA5 does exactly the same thing on 4C4T CPUs at excessively high framerates. If RDR2 is more CPU-heavy, then it shouldn't be at all shocking that such stuttering would manifest at lower framerates or on higher-tier parts.

It shouldn't be CPU-heavy; it was designed for the 8-core Jaguar and uses Vulkan/DX12 on PC, so it should run on potato processors.
 
It shouldn't be CPU-heavy; it was designed for the 8-core Jaguar and uses Vulkan/DX12 on PC, so it should run on potato processors.
And GTA5 was made to run on the 3 core PowerPC CPU in the Xbox 360. RDR2 is the first truly current gen title from Rockstar (they took their sweet ass time, it's soon to be last gen).
 

pawel86ck

Banned
And GTA5 was made to run on the 3 core PowerPC CPU in the Xbox 360. RDR2 is the first truly current gen title from Rockstar (they took their sweet ass time, it's soon to be last gen).
Yes, RDR2 was made to run on current-gen consoles from the start, so they were able to push graphics fidelity even further, but GTA5 wasn't a simple Xbox 360 port; the PS4/Xbox One and PC versions use remastered graphics made to run on modern hardware as well. GTA5 scales very well: you can play it on an old PC at normal settings or with maxed-out details, but then not even a 2080 Ti will provide a locked 60fps at 4K (in grassy areas performance dips below 50fps).



I have a feeling that he’ll still adamantly insist that the PC version doesn’t look different from the X version...
I don't know who that guy was, but I share his impressions. There are probably some differences if you pay attention to really subtle details, but they aren't nearly as big as between GTA5 on PS3 vs PS4, and not even as big as between GTA5 on PS4 vs PC. I have watched RDR2 comparison videos between the Xbox One X and PC, but I couldn't tell much difference despite the higher settings on PC.
 
Last edited:

ethomaz

Banned
I disagree with the optimization part because the game looks very unoptimized on PC.

Seems like Rockstar had the game running on PC with the console optimizations and chose to just make a new launcher and let it scale through brute force.

The sub-Low settings also show what they had to do to achieve 4K on consoles.
 
It's nice to see a game that's really pushing GCN come to the PC and allow the 5xx series to slap some Nvidia cards around.

AMD's bet on compute never really paid off for them outside of the crypto and professional arenas; it'd be interesting to know if Rockstar really worked the async compute capabilities of GCN.

OTOH it's not so nice to see that the Rockstar launcher is a buggy fucking mess that wasn't ready for release.
 

lukilladog

Member
And GTA5 was made to run on the 3 core PowerPC CPU in the Xbox 360. RDR2 is the first truly current gen title from Rockstar (they took their sweet ass time, it's soon to be last gen).

That's no excuse to make good use of the console processors and bad use of the PC processors in the port.

Why not just ask him? It was none other than lukilladog

It looks very close; that's all I've been saying. I already posted the candyland comparison.

and it probably does...at 30fps.

But there's no reason for that; potato PC processors are twice as fast as the 8-core Jaguar, or more.
 
That's no excuse to make good use of the console processors and bad use of the PC processors in the port.
How do you know they're making bad use of PC processors? Or good use of console processors, for that matter? The 9700K was running the game at like 4x the framerate before it started stuttering, and just turning it up to 1440p was enough to solve the problem. Huh...it's almost as if having your CPU be the bottleneck, regardless of how high that bottleneck is, probably isn't a great idea 🤔
It looks very close; that's all I've been saying. I already posted the candyland comparison.
Lower than low intensifies
But there's no reason for that; potato PC processors are twice as fast as the 8-core Jaguar, or more.
Single core performance, absolutely...but I don't know if you noticed this, the 2200G is a 4C4T CPU. That's half as many cores (and threads) as the Jaguar 8 core. That means that any game that prefers wide CPUs...like... I don't know...any that were developed with the aim of running well on a pathetically weak 8 core Jaguar, are probably not going to run amazingly on the relatively powerful but thin 2200G.
 

lukilladog

Member
How do you know they're making bad use of PC processors? Or good use of console processors, for that matter? The 9700K was running the game at like 4x the framerate before it started stuttering, and just turning it up to 1440p was enough to solve the problem. Huh...it's almost as if having your CPU be the bottleneck, regardless of how high that bottleneck is, probably isn't a great idea 🤔

Lower than low intensifies

Single core performance, absolutely...but I don't know if you noticed this, the 2200G is a 4C4T CPU. That's half as many cores (and threads) as the Jaguar 8 core. That means that any game that prefers wide CPUs...like... I don't know...any that were developed with the aim of running well on a pathetically weak 8 core Jaguar, are probably not going to run amazingly on the relatively powerful but thin 2200G.

It's not flying on potato processors, it's having issues on some high-end processors, and some of the crashes might be related to improper processor usage; that's how I know.

Lower than low still needs to be backed up.

Games rely on one or two threads that take most of the time; the remaining threads can easily be accommodated if the processor is fast enough for the instructions... that's why well-optimized ports fly on potato PC processors.
 

paypay88

Banned
Long story short: another Crysis. You can get console-level graphics if you want, but if you want everything Ultra at 4K, just wait two generations of Nvidia hardware.
No, Crysis wasn't future-proof or anything. It wasn't programmed to take advantage of multithreading. It was a bad port that seemed to scale but technically wasn't capable of it. That's why most systems still struggle to run it at 60fps without dips.

RDR2 isn't special either; the graphical differences seem too minor for the performance hit on the system. Optimization would do wonders even for Ultra settings. If there is really so little difference, why waste system power? It doesn't look that great or revolutionary compared to the console versions, even with everything on Ultra.
 

paypay88

Banned
That's no excuse to make good use of the console processors and bad use of the PC processors in the port.



It looks very close; that's all I've been saying. I already posted the candyland comparison.



But there's no reason for that; potato PC processors are twice as fast as the 8-core Jaguar, or more.
Since when do potato PCs have 6-7 GB of unified VRAM available with super bus speeds? Yeah, never.
 

paypay88

Banned
So pretty much what I mentioned in the old thread.

The console runs the game nowhere near Ultra.



Not really; lots of settings are also turned down on the GPU side just to get near 4K. Visual quality has to be sacrificed to push insane resolutions.

CPU-wise the game doesn't seem to be very taxing.
Yet the console version is optimized enough that it looks near Ultra levels in most cases (on the Pro consoles at least). I played on a base PS4 as well, and it was stunning if you don't mind some fps drops in the city.
 
It's not flying on potato processors, it's having issues on some high-end processors, and some of the crashes might be related to improper processor usage; that's how I know.
It runs fine on potato processors...at 30FPS. It runs fine on high-end processors...just not necessarily while bottlenecking at ~120FPS at 1080p Low with a 2080 Ti. Yeah, sure, totally sounds like a realistic use case.
Lower than low still needs to be backed up.
Lower than low further intensifies.
Games rely on one or two threads that take most of the time; the remaining threads can easily be accommodated if the processor is fast enough for the instructions... that's why well-optimized ports fly on potato PC processors.

Well, clearly that's not true of RDR2, seeing as how it's able to peg a 4C4T CPU at 100%...something GTA5 is also able to do, by the way. The only difference is that RDR2 will peg them at significantly lower framerates...almost as if it's more CPU-demanding or something. Almost as if RDR2 was developed with the current generation of consoles in mind and GTA5 wasn't 🤔
 

bilderberg

Member
No, Crysis wasn't future-proof or anything. It wasn't programmed to take advantage of multithreading. It was a bad port that seemed to scale but technically wasn't capable of it. That's why most systems still struggle to run it at 60fps without dips.

RDR2 isn't special either; the graphical differences seem too minor for the performance hit on the system. Optimization would do wonders even for Ultra settings. If there is really so little difference, why waste system power? It doesn't look that great or revolutionary compared to the console versions, even with everything on Ultra.

That's every PC game released in the last 20 years. Why do you people want "optimized" Ultra settings? Lower the damn settings. No one's going to give a shit about whether Ultra is "optimized" five years from now, and Rockstar knows this.
 
Last edited:

stranno

Member
No, Crysis wasn't future-proof or anything. It wasn't programmed to take advantage of multithreading. It was a bad port that seemed to scale but technically wasn't capable of it. That's why most systems still struggle to run it at 60fps without dips.
Nowadays we have 5.2GHz CPUs, and the most powerful rig you can build will run the Ascension level at 45-50FPS on Very High settings, so it was certainly a game of the future. I don't think it was badly programmed at all; it just pushes a crazy polycount (Nomad's Nanosuit alone has around 70,000 triangles), complex AI behavior, and advanced physics.
 
Last edited:

kraspkibble

Permabanned.
I honestly don't mind if I need to use settings similar to the XB1X, but I just want to be able to run it at 1440p 60fps. If I can increase some settings beyond what the Xbox does, then great.

Just waiting on the Steam version...
 

888

Member
I honestly don't mind if I need to use settings similar to the XB1X, but I just want to be able to run it at 1440p 60fps. If I can increase some settings beyond what the Xbox does, then great.

Just waiting on the Steam version...

Depending on your hardware it's pretty doable. I'm averaging 75 at 1440p with a good mix of settings.
 
I honestly don't mind if I need to use settings similar to the XB1X, but I just want to be able to run it at 1440p 60fps. If I can increase some settings beyond what the Xbox does, then great.

Just waiting on the Steam version...

Doesn't it force you to go through the Rockstar launcher no matter where you buy it from? I was told this, not sure if it's true.
 