
Red Dead Redemption 2: All Settings Explained by DF

888

Member



"The complete list can be found below, but suffice to say, Rockstar targeted the best bang for the buck on consoles and even effects that exhibit lower quality on X than on PC's lowest - volumetric resolution, for example - still look outstanding. Plug those settings into a perfectly affordable mainstream graphics card like the AMD Radeon RX 580 and achieving 1080p gameplay at 60 frames per second isn't difficult. However, running at low and medium settings seems to come with a stigma for the committed PC gamer. There's the sense that only high or ultra will do, and in the case of RDR2, this can cause lower than expected performance.

Is it really a 'sub-optimal' port then? In many ways, the exact opposite is true. Red Dead Redemption 2 is based on the latest version of Rockstar's Rage engine and while there are many similarities to the technology's last outing in Grand Theft Auto 5, much has evolved. For starters, the engine has transitioned away from supporting older graphics APIs and offers options for both Vulkan and DirectX 12. At 60 frames per second on the closest we could get to console equivalent settings, the game does have significant CPU usage, but core utilisation is fairly level on both APIs - usually a good sign for the quality of a PC port. The Vulkan API is the default and we'd suggest sticking to that. Comparing both APIs on GTX 1060 and RX 580, Vulkan delivered faster results on both and eliminated infrequent stutter on the AMD side we noted when DX12 was used."







Xbox One X Console Equivalent Settings
Ultra: Texture Quality, Geometry Level of Detail (5/5)
High: Shadow Quality, Mirror Quality, Soft Shadows, Water Refraction Quality
High/Medium: Tessellation, TAA, Volumetric Lighting Quality, Parallax Occlusion Mapping
Medium: Lighting Quality, Screen-Space Ambient Occlusion, Particle Quality, Particle Lighting Quality, Fur Quality, Decal Quality, Water Reflection Quality
Low/Medium: Global Illumination Quality, Grass Shadows
Low/Lower Than Low: Far Shadow Quality, Reflection Quality, Near Volumetric Resolution, Far Volumetric Resolution, Water Physics Quality (1/6), Tree Quality, Grass Level of Detail (2/10)
Disabled Features: FXAA, MSAA, Unlocked Volumetric Ray March Resolution, Full Resolution SSAO, Long Shadows, TAA Sharpening

Note: Some of Xbox One X's effects appear to be hybrid versions of low and medium, or medium and high. We'd recommend starting with the higher quality version, but should emphasise that the lower quality rendition of the effect still looks quite comparable across the run of play.
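For anyone scripting their own comparisons, here is the table above transcribed as a plain Python dict. This is purely a transcription of DF's findings using the in-game labels; it is not an official schema.

```python
# Xbox One X equivalent settings per Digital Foundry, transcribed from the
# table above. "X/Y" entries are the hybrid settings DF describes in the note.
X1X_EQUIVALENT = {
    "Texture Quality": "Ultra",
    "Geometry Level of Detail": "Ultra (5/5)",
    "Shadow Quality": "High",
    "Mirror Quality": "High",
    "Soft Shadows": "High",
    "Water Refraction Quality": "High",
    "Tessellation": "High/Medium",
    "TAA": "High/Medium",
    "Volumetric Lighting Quality": "High/Medium",
    "Parallax Occlusion Mapping": "High/Medium",
    "Lighting Quality": "Medium",
    "Screen-Space Ambient Occlusion": "Medium",
    "Particle Quality": "Medium",
    "Particle Lighting Quality": "Medium",
    "Fur Quality": "Medium",
    "Decal Quality": "Medium",
    "Water Reflection Quality": "Medium",
    "Global Illumination Quality": "Low/Medium",
    "Grass Shadows": "Low/Medium",
    "Far Shadow Quality": "Low/Lower Than Low",
    "Reflection Quality": "Low/Lower Than Low",
    "Near Volumetric Resolution": "Low/Lower Than Low",
    "Far Volumetric Resolution": "Low/Lower Than Low",
    "Water Physics Quality": "1/6",
    "Tree Quality": "Low/Lower Than Low",
    "Grass Level of Detail": "2/10",
    "FXAA": "Disabled",
    "MSAA": "Disabled",
    "Unlocked Volumetric Ray March Resolution": "Disabled",
    "Full Resolution SSAO": "Disabled",
    "Long Shadows": "Disabled",
    "TAA Sharpening": "Disabled",
}
```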


There is a very in-depth graph of GPUs running console settings that I've been trying to link here for the past 30 minutes, but I couldn't get it to work. If you go to the site, you can select your GPU and see how it performs at console settings at 4K.

summary ?

Sorry, I got too caught up trying to export that graph.
 
Last edited:

VFXVeteran

Banned
Incredible write-up. Thanks, Alex!

I can't wait to see Jedi FO next week. Hoping DF does a write-up on that too.
 

rashbeep

Banned
summary ?

[image: benchmark summary chart]
 

JoduanER2

Member
This is great content, thank you DF! Although I understand the game was made future-proof, it still needs a bit of optimization, but it isn't a bad port by any means. Not like Borderlands 3, which I'm playing at the moment and which is full of performance spikes, which are apparently getting fixed in November with a performance patch... hopefully.
 
Last edited:

pawel86ck

Banned
Interesting results

32 fps on GTX 1080
42 fps on 1080ti
62 fps on 2080ti

So in order to match Xbox One X results (30fps), people need to play on a GTX 1080? And what about 60fps? A 1080 Ti should easily double Xbox One X results like in other games, yet here it's not even close to 60fps. Maybe the optimisation is not as bad as GTA4's, but I was expecting much more from Rockstar, especially after the perfectly optimised GTA5 port.
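For what it's worth, the ratios behind that complaint check out like this (a quick sketch; the fps figures are the quoted averages, and 30fps is only the console's cap, not its true average):

```python
# Sanity check on the scaling argument: how each quoted average compares to
# the Xbox One X's 30fps cap (the console's uncapped average is unknown).
X1X_CAP = 30.0
results = {"GTX 1080": 32, "GTX 1080 Ti": 42, "RTX 2080 Ti": 62}

for gpu, fps in results.items():
    print(f"{gpu}: {fps}fps = {fps / X1X_CAP:.2f}x the 30fps cap")
# GTX 1080: 32fps = 1.07x the 30fps cap
# GTX 1080 Ti: 42fps = 1.40x the 30fps cap
# RTX 2080 Ti: 62fps = 2.07x the 30fps cap
```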
 

lukilladog

Member
I don't believe these settings justify such a performance impact, because it's generally the same effects with some more resolution and distance, something that has characterized PC ports for decades... I don't know where this "future proofing" comes from; it sounds like a cliché.
 
Last edited:
For most of the game so far I've been running at 1440p 60fps and nearly the highest settings (1080 Ti).

Even if you set everything to LOW (except for textures, leave those on Ultra) the game still looks surprisingly good. I thought it would look worse. But even at LOW the lighting is still rich and volumetric.

One of the most impressive things about this game is how it handles LOD. Sure, if you look for it you can sometimes spot details popping into existence, but for the most part it's not noticeable. And I find this extremely impressive because most other open-world games are TERRIBLE at this. A good example is AC: Odyssey. I think it's a great game and the graphics are great, BUT the pop-in half ruins it for me. Wherever you go you can constantly see the small details of the world popping into existence just a few meters in front of your character. And it doesn't matter if you run it on Ultra on the world's most powerful computer; there is no setting to get that pop-in under control.

However RDR2 is handling LOD, I want every other developer to do it like that.
 
Last edited:

CrustyBritches

Gold Member
Amazing video by Alex and Digital Foundry. One of the best breakdowns I've seen, and the fact it starts with the best console version as baseline makes it highly relevant. When you see the settings 1-by-1 it's much more apparent how much detail Ultra pushes over the X1X. In fact, Alex wraps up with the statement, "I do welcome the fact that we can scale this game so much, and that the lowest settings, basically as Xbox One X looks, still looks really great."

The X1X version is essentially a mix of 'Lower than Low'/Low/Medium with Ultra textures. That's why it runs 4K/30fps. I can't be bothered to buy it full price and run it on my RX 480 8GB with max OC, but I'm guessing once the lower-than-low .ini tweaks are ironed out it will be a match. Consoles aren't really any more optimized than the PC version, imo.
 
Last edited:

sertopico

Member
Interesting results

32 fps on GTX 1080
42 fps on 1080ti
62 fps on 2080ti

So in order to match xbox x results (30fps) people need to play on GTX 1080? And what about 60fps? 1080ti should easily double xbox x results like in other games, yet here is not even close to 60fps. Maybe optimisation is not as bad as in GTA4, but I was expecting much more from rockstar, especially after perfectly optimised GTA5 port.
Simply put, the Pascal architecture is older and lacks certain internal optimizations Turing has; that is also why the AMD 5700 performs better. I consider myself lucky I can play this game almost at full detail with my 1080 Ti @2K plus 1.250x upscaling. If we think back to Crysis more than 10 years ago, those who had a Pentium 4 HT and a GeForce 6 or 7 series card were forced to play at the lowest detail without even getting a stable 30fps. The situation is much better now, I'd say; we're able to run the game with great detail even on a 3+ year old GPU...

Just remember, volumetric effects at full sample rate, as well as lighting in general, are EXTREMELY taxing. Sometimes you just need to lower those a bit to improve the situation drastically.
 
Last edited:

pawel86ck

Banned
Simply put, the Pascal architecture is older and lacks certain internal optimizations Turing has; that is also why the AMD 5700 performs better. I consider myself lucky I can play this game almost at full detail with my 1080 Ti @2K plus 1.250x upscaling. If we think back to Crysis more than 10 years ago, those who had a Pentium 4 HT and a GeForce 6 or 7 series card were forced to play at the lowest detail without even getting a stable 30fps. The situation is much better now, I'd say; we're able to run the game with great detail even on a 3+ year old GPU...

Just remember, volumetric effects at full sample rate, as well as lighting in general, are EXTREMELY taxing. Sometimes you just need to lower those a bit to improve the situation drastically.
Yes, AMD cards perform better in this game compared to Pascal, but you still can't match Xbox One X settings on an RX 580, or double Xbox results on a 5700 XT or Radeon VII.

RX 580: 25fps
Vega 56: 36fps
Vega 64: 40fps
5700 XT: 46fps
Radeon VII: 50fps

The Xbox One X version runs with a 30fps cap, so we don't know how many fps the hardware can really push. However, it must be more than 30fps, because the game runs at a locked 30fps for the vast majority of the time, so the average is probably more like 35fps, if not higher.

There are only two possible explanations:
a) the Xbox is faster than people thought, because you need a Vega 56 to match Xbox One X settings on PC
b) the game is unoptimised
 
Last edited:
Simply put, Pascal architecture is older and lacks certain internal optimizations Turing has, that is also why the amd 5700 performs better. I consider myself lucky I can play this game almost at full detail with my 1080 Ti @2k plus 1.250x upscaling. If we think about Crysis more than 10 years ago, those who had a Pentium 4 HT and a Geforce 6xx/7xx were forced to play at lowest detail without even getting 30 stable fps. Situation is much better now I'd say, we're able to run the game with great details even on a 3+ yrs old GPU...

Just remember, volumetric effects af full sample rate as well as lighting in general are EXTREMELY taxing. Sometimes you just need to lower that a bit to improve the situation drastically.
Pascal is falling behind more than normal. For example, the 1070 typically trades blows with the 1660 Ti, while in RDR2 it barely beats the 1660. I'm hesitant to pin this down as an issue, but it's certainly not the norm.
 

CrustyBritches

Gold Member
There are only two possible explanations:
a) the Xbox is faster than people thought, because you need a Vega 56 to match Xbox One X settings on PC
b) the game is unoptimised
The simple explanation is that those were preliminary settings and testing results, and that the actual detailed console settings in Pt. II were lower than what was tested:

Part I:
We've stuck to the console equivalent settings but added 8x anisotropic filtering and where Xbox One X used hybrid settings, we've opted for PC's higher equivalent.

Additionally, there is no current way to match many of the "Lower than Low" or hybrid "Less than Medium/High, above next lowest value" settings, and DF defaulted to the higher of the 2 possibilities when faced with "mixed" possible settings. This includes, but is not limited to (X1X value first, DF tested value second):

1. Anisotropic Filtering 4x instead of 8x
2. Lighting 'Low' instead of 'Medium'
3. Global Illumination 'Low' instead of 'Medium'
4. Soft Shadows 'Medium' instead of 'High'
5. Far Shadows 'Lower than Low'
6. 'Low' instead of 'Medium'
7. SSR 'Lower than native 4K', custom fade in
8. Mirror Quality 'Lower than Low'
9. Near/Far Volumetrics 'Low' instead of 'Medium'
10. Fog Volumes 'Lower than Low'
11. Cloud resolution 'Low' instead of 'Medium'
12. Tessellation: Far 'Lower than Medium' | Near 'High' tessellation edges
13. Parallax Occlusion 'Medium' instead of 'High'

Part II:
I do welcome the fact that we can scale this game so much, and that the lowest settings, basically as Xbox One X looks, still looks really great.
 
Last edited:

pawel86ck

Banned
The simple explanation is that those were preliminary settings and testing results, and that the actual detailed console settings in Pt. II were lower than what was tested:

Part I:


Additionally, there is no current way to match many of the "Lower than Low" or hybrid "Less than Medium/High, above next lowest value" settings, and DF defaulted to the higher of the 2 possibilities when faced with "mixed" possible settings. This includes, but is not limited to (X1X value first, DF tested value second):

1. Anisotropic Filtering 4x instead of 8x
2. Lighting 'Low' instead of 'Medium'
3. Global Illumination 'Low' instead of 'Medium'
4. Soft Shadows 'Medium' instead of 'High'
5. Far Shadows 'Lower than Low'
6. 'Low' instead of 'Medium'
7. SSR 'Lower than native 4K', custom fade in
8. Mirror Quality 'Lower than Low'
9. Near/Far Volumetrics 'Low' instead of 'Medium'
10. Fog Volumes 'Lower than Low'
11. Cloud resolution 'Low' instead of 'Medium'
12. Tessellation: Far 'Lower than Medium' | Near 'High' tessellation edges
13. Parallax Occlusion 'Medium' instead of 'High'

Part II:
Currently the RX 580 runs the game at 22fps minimum and 25fps average with Digital Foundry's settings, and I don't believe these 3 'Lower than Low' settings would improve performance by more than 1-2fps. You need at least 10fps more to match Xbox One X settings, and that's Vega 56 territory.

The Xbox One X GPU is a customized part, so we can only guess how fast it really is. In some games the Xbox One X can push twice as many pixels as the PS4 Pro's 4.2 TFLOPS GPU, so it runs games more like an 8.4 TFLOPS Polaris card. Obviously MS has done something very important to improve GPU performance in their console. I remember when a War Thunder developer compared Xbox One X GPU performance to a GTX 1080. People thought it was just a joke, but to match Xbox One X performance in RDR2 on PC you do indeed need a GTX 1080, so maybe it wasn't a joke after all 😂.
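Written out, the arithmetic behind that claim looks like this (a sketch; the TFLOPS figures are the public paper specs, and the "twice the pixels" ratio is the claim itself, not a measurement):

```python
# The "effective 8.4 TFLOPS" claim, made explicit. Assumes pixel throughput
# scales linearly with TFLOPS, which is the weakest link in the argument.
PS4_PRO_TFLOPS = 4.2       # paper spec
X1X_PAPER_TFLOPS = 6.0     # paper spec
PIXEL_RATIO_CLAIMED = 2.0  # "twice as many pixels" in some games

effective = PS4_PRO_TFLOPS * PIXEL_RATIO_CLAIMED
print(f"Implied effective throughput: {effective:.1f} TFLOPS "
      f"vs {X1X_PAPER_TFLOPS:.1f} on paper "
      f"({effective / X1X_PAPER_TFLOPS:.0%} of spec)")
# Implied effective throughput: 8.4 TFLOPS vs 6.0 on paper (140% of spec)
```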
 
Last edited:

CrustyBritches

Gold Member
Currently the RX 580 runs the game at 22fps minimum and 25fps average with Digital Foundry's settings, and I don't believe these 3 'Lower than Low' settings would improve performance by more than 1-2fps. You need at least 10fps more to match Xbox One X settings, and that's Vega 56 territory.
Those were preliminary console settings with 8x AF instead of 4x, defaulting to the higher setting in all cases.

The actual console settings from Part II would require up to 13 settings being lowered. That will get you your 5fps pretty easily, even without an OC.
 
Pascal is falling behind more than normal. For example, the 1070 typically trades blows with the 1660 Ti, while in RDR2 it barely beats the 1660. I'm hesitant to pin this down as an issue, but it's certainly not the norm.
There is something wrong with the GPU usage on DX12 with Pascal. On my 1080 Ti and my GF's 1070 Ti the fps would drop under 60, yet the usage stays at ~90%.
And it's not CPU bottlenecking; it happens on my 9900K too.

It's either the driver or something Rockstar needs to fix.
The problem goes away on both our systems with Vulkan.
 

pawel86ck

Banned
Those were preliminary console settings with 8x AF instead of 4x, defaulting to the higher setting in all cases.

The actual console settings from Part II would require up to 13 settings being lowered. That will get you your 5fps pretty easily, even without an OC.
The most demanding settings are already lowered, and Alex said the performance difference between certain settings is almost nonexistent, so you can't say for a fact that these minimal setting differences would get you 10fps on PC. When it comes to AF, back in the X360 days the 4x vs 8x AF performance penalty was noticeable, but not now. Personally I always use 16x AF because the performance penalty is so minimal.
 
Last edited:

CrustyBritches

Gold Member
The most demanding settings are already lowered, and Alex said the performance difference between certain settings is almost nonexistent, so you can't say for a fact that these minimal setting differences would get you 10fps on PC. When it comes to AF, back in the X360 days the 4x vs 8x AF performance penalty was noticeable, but not now. Personally I always use 16x AF because the performance penalty is so minimal.
1. Where are you getting benchmark numbers for the RX 580 framerate? (I'm assuming the Pt. 1 dynamic benchmark graph)
2. Which demanding settings are lowered, and timestamps for each statement?
 
I've seen people get ~60fps @1440p on a 980 Ti. Lower your settings, brah

I'm already on mid/high settings. I might drop lighting from high to mid to see if that helps. I'm getting around the mid-40s to 50s.

The simple explanation is that those were preliminary settings and testing results, and that the actual detailed console settings in Pt. II were lower than what was tested:

Part I:


Additionally, there is no current way to match many of the "Lower than Low" or hybrid "Less than Medium/High, above next lowest value" settings, and DF defaulted to the higher of the 2 possibilities when faced with "mixed" possible settings. This includes, but is not limited to (X1X value first, DF tested value second):

1. Anisotropic Filtering 4x instead of 8x
2. Lighting 'Low' instead of 'Medium'
3. Global Illumination 'Low' instead of 'Medium'
4. Soft Shadows 'Medium' instead of 'High'
5. Far Shadows 'Lower than Low'
6. 'Low' instead of 'Medium'
7. SSR 'Lower than native 4K', custom fade in
8. Mirror Quality 'Lower than Low'
9. Near/Far Volumetrics 'Low' instead of 'Medium'
10. Fog Volumes 'Lower than Low'
11. Cloud resolution 'Low' instead of 'Medium'
12. Tessellation: Far 'Lower than Medium' | Near 'High' tessellation edges
13. Parallax Occlusion 'Medium' instead of 'High'

Part II:

Can't you even change them in the .ini settings? A lot of hidden settings, things like async compute, can be enabled this way but not from the menus.
 
Last edited:

CrustyBritches

Gold Member
Not even change them in the .ini settings? A lot of hidden settings things like async can be enabled this way but not from the menus.
DF didn't use .ini tweaks for their 4K built-in benchmark testing or their Every Graphics Setting video. For their built-in benchmark they defaulted to the higher PC setting in all cases and used the preliminary "Console Equivalent" settings. Part 2 details those settings much better.

AMD slaughters Nvidia in this game, similar to RE2.
---
I should have used the phrasing, "DF stated they had no way of using 'Lower than Low' or hybrid settings".
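For anyone who does want to poke at hidden values once they're mapped out, a generic sketch like this works on any XML-style settings file. Both the path and the example key are hypothetical placeholders, not confirmed RDR2 settings; back the file up first.

```python
# Generic sketch: flip one value in an XML settings file, with a backup.
# The key used in the usage example is a hypothetical placeholder.
import shutil
import xml.etree.ElementTree as ET
from pathlib import Path

def set_hidden_value(settings_path: Path, key: str, value: str) -> bool:
    # Keep a backup so a bad edit can be rolled back.
    shutil.copy2(settings_path, settings_path.with_name(settings_path.name + ".bak"))
    tree = ET.parse(settings_path)
    node = tree.getroot().find(f".//{key}")
    if node is None:
        return False  # don't invent nodes the game may never read
    node.text = value
    tree.write(settings_path, encoding="utf-8", xml_declaration=True)
    return True

# Hypothetical usage; "asyncComputeEnabled" is a placeholder key:
# set_hidden_value(Path(".../Settings/system.xml"), "asyncComputeEnabled", "true")
```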
 
Last edited:

pawel86ck

Banned
1. Where are you getting benchmark numbers for RX 580 framerate(I'm assuming Pt. 1, dynamic benchmark graph)
2. What demanding settings are lowered, and timestamps for each statement
1- The OP has linked the article with the benchmark results.

2- I'm not going to watch the entire video again just to give you exact timestamps, but haven't you watched the video yourself? I clearly remember there were instances where certain settings had little or no performance impact, and Alex therefore recommended using the higher setting.

The PC version is very demanding because certain settings on high, and especially ultra, tank performance to the extreme, but those settings are already lowered enough in Alex's Xbox settings assessment. IMO Alex's settings are fair, and I don't expect a big performance improvement from dropping water rendering from low to very low (and the RX 580 needs 30% more performance to provide a somewhat similar experience to the Xbox One X).

But let's assume the real Xbox One X settings would get you that 30% performance improvement. The problem is that on PC you can't use those settings, so you are forced to use higher settings just to match Xbox quality. So no matter how you look at it, the RX 580 on PC will not provide Xbox One X results (30fps). If you really want to play at close to Xbox One X quality, you need a Vega 56 based on the Eurogamer article (a 36fps average should be good enough for a solid 30fps experience with minor dips).
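Putting a number on that gap with the figures quoted in this thread (a quick sketch; the required uplift depends on whether you target the average or the minimum):

```python
# How much extra performance the RX 580 needs to hold the X1X's 30fps,
# using the 22fps minimum / 25fps average quoted in this thread.
RX580_AVG, RX580_MIN, TARGET = 25.0, 22.0, 30.0

print(f"vs average: {TARGET / RX580_AVG - 1:.0%} more needed")  # 20%
print(f"vs minimum: {TARGET / RX580_MIN - 1:.0%} more needed")  # 36%
```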
 
Last edited:
DF didn't use .ini tweaks for their 4K built-in benchmark testing or their Every Graphics Setting video. For their built-in benchmark they defaulted to the higher PC setting in all cases and used the preliminary "Console Equivalent" settings. Part 2 details those settings much better.

AMD slaughters Nvidia in this game, similar to RE2.
---
I should have used the phrasing, "DF stated they had no way of using 'Lower than Low' or hybrid settings".

Fair enough. I'll be interested to see LowSpecGamer's video on this game.
 
Based on my experience with a 1080 Ti and smart settings, I can get a 1440p 60fps lock 99% of the time.

4K 30fps is easy, no problem. I can get 4K 60fps, but the settings are just too low at that point to justify it. Basically, to get 4K 60 I need to set everything to LOW, with only textures staying on Ultra.
 

CrustyBritches

Gold Member
1- The OP has linked the article with the benchmark results.

2- I'm not going to watch the entire video again just to give you exact timestamps, but haven't you watched the video yourself? I clearly remember there were instances where certain settings had little or no performance impact, and Alex therefore recommended using the higher setting.
I had 2 simple questions:
1. Where are you getting benchmark numbers for the RX 580 framerate? (I'm assuming the Pt. 1 dynamic benchmark graph)
1- The OP has linked the article with the benchmark results.
->You didn't answer this question directly, but I'm sure my assumption about the dynamic graph was correct.

2. Which demanding settings are lowered, and timestamps for each statement?
2- I'm not going to watch the entire video again just to give you exact timestamps, but haven't you watched the video yourself? I clearly remember there were instances where certain settings had little or no performance impact, and Alex therefore recommended using the higher setting.
->I watched the "Every setting explained" vid twice and provided you with 13 settings that should be lowered in comparison to the "built-in benchmark dynamic graph" settings used.

The 13 settings I provided were not from Pt. 1, and I didn't source them from either provided chart. They are directly from the video and fall into 3 categories:
1. Stated to be Lower than Low
2. Stated to be a hybrid setting
3. Stated to be either/or setting
---
-Pt. 1 "60fps vid" was using highly preliminary settings by Richard Leadbetter
-Pt. 1 "built-in dynamic graph" settings were different from the "60fps vid" settings, and defaulted to the highest of options for each possible console setting, along with 8x AF
-Pt. 2 "Every setting explained" is the final, updated Console Equivalent Settings and had no benchmarking of the RX 580

DF thoroughly disputes and disproves your assertion that "optimization is to blame". That's the overarching theme of both vids.
One of the biggest complaints I have heard so far concerns the word optimization and how apparently poor the optimization is in Red Dead Redemption 2 on PC. Hopping into the game directly after launch, I had a level-headed understanding of what performance I could expect thanks to NVIDIA putting out a very small "Settings Recommendation", but wow, this game's ultra settings sure are heavy. I'm a reasonable person though, so I thought to myself this all has to make sense. How is it that an Xbox One X can run Red Dead Redemption 2 at an okay, but not completely stable, 30fps at 4K? Well, after taking the time to look into it, it actually does make a lot of sense how that is possible.
 
Last edited:

pawel86ck

Banned
I had 2 simple questions:
1. Where are you getting benchmark numbers for the RX 580 framerate? (I'm assuming the Pt. 1 dynamic benchmark graph)

->You didn't answer this question directly, but I'm sure my assumption about the dynamic graph was correct

2. Which demanding settings are lowered, and timestamps for each statement?

->I watched the "Every setting explained" vid twice and provided you with 13 settings that should be lowered in comparison to the "built-in benchmark dynamic graph" settings used.

The 13 settings I provided were not from Pt. 1, and I didn't source them from either provided chart. They are directly from the video and fall into 3 categories:
1. Stated to be Lower than Low
2. Stated to be a hybrid setting
3. Stated to be either/or setting
---
-Pt. 1 "60fps vid" was using highly preliminary settings by Richard Leadbetter
-Pt. 1 "built-in dynamic graph" settings were different from the "60fps vid" settings, and defaulted to the highest of options for each setting, along with 8x AF
-Pt. 2 "Every setting explained" is the final, updated Console Equivalent Settings and had no benchmarking of the RX 580

DF thoroughly disputes and disproves your assertion that "optimization is to blame". That's the overarching theme of both vids.
I told you I was talking about the OP's link. There's only one link there, so it should be obvious I was indeed referring to the Eurogamer article. But OK, if you want further confirmation, here's the benchmark chart I'm talking about:


[image: RX 580 benchmark chart]


22fps minimum and a 25fps average on the RX 580 is not exactly what I would call an Xbox One X experience, and the game is already running close to minimum settings (many settings at low). Maybe with ALL settings at low the RX 580 would finally be able to match Xbox One X performance (a solid 30fps), but then the game should look even worse than the console version. And let's not forget that on PC people want to play at 60fps, not 30fps.

People gaming on a 1080 Ti or RTX 2080 could always get twice the performance of the Xbox One X, and not just with console-equivalent settings but most of the time with much higher settings; didn't you know that? Here Digital Foundry tested RDR2 with close to Xbox One X settings, and the game still demolished popular high-end cards like the 1080 Ti and 2080, and even the 2080 Ti is barely enough for 60fps (62fps on average). We are not even talking about ultra settings here, far from it, because with ultra settings RDR2 is probably the most demanding game right now.

There are also other issues like stuttering and memory leaks; it almost reminds me of GTA4. I had an 8800 Ultra back then, still a powerful card at the time, and I had to turn everything to low to get 60fps.
 

Pejo

Member
This was a great vid; I love the detailed breakdown. It's also really interesting to see where the Xbone version made concessions, and I actually think they were the right choice: better up-close visuals, losing a little in the distance.
 
I'm already on mid/high settings. I might drop lighting from high to mid to see if that helps. I'm getting around the mid-40s to 50s.



Can't you even change them in the .ini settings? A lot of hidden settings, things like async compute, can be enabled this way but not from the menus.
Maybe try this as a guide?

 
Last edited:

pawel86ck

Banned
Part 1

Part 2


Another detailed analysis of the different settings. Like I said, AF performance is not much different between 4x and 8x, and certain settings no longer carry an additional performance penalty at low and medium.
 
Last edited: