
Red Dead Redemption 2 Is Having A Rough Launch On PC

Bullet Club

Member
Red Dead Redemption 2 Is Having A Rough Launch On PC

Red Dead Redemption 2 is out on PC today. A lot of players, myself included, are already having some issues. Here are some of the problems you might be running into and some possible fixes that have been discovered so far.

First things first, you should get the latest graphics drivers for your card, which is usually a good place to start when you’re having issues. I’m running the game on mostly “High” settings on a GTX 1080, and I’m getting between 50 and 60 frames per second. I’m getting some weird freezes at random places, during which the video pauses but the audio keeps playing. On advice from this reddit thread, I switched from the game’s default Vulkan to DX12, which fixed a freeze I consistently got in an early cutscene but hasn’t prevented a bunch of other ones. My freezes happened especially frequently during an early part of the “Enter, Pursued by a Memory” mission. Some players report seeing their CPU spike when the game freezes; my CPU is a somewhat creaky Intel i5-4570, so that could be the culprit, but other players with beefier specs than mine are also reporting issues. It’s not game-breaking for me yet, but it’s definitely annoying.

Many players are reporting not being able to start the game at all, with it either crashing on the intro or giving them a message that the Rockstar launcher won’t launch. Some folks are having luck fixing the intro crash by disabling their anti-virus software. Other players are getting a message saying “activation required” when they try to launch the game, with some reporting that logging in and out of the Rockstar Launcher helps.

Other players aren’t getting any audio. Some had luck by switching their audio output.

Several players are reporting getting stuck in an infinite load screen, though there doesn’t seem to be a fix for that yet.

I’ve been a PC gamer a long time, so I know better than to expect a perfect launch, but I’m not alone in feeling a little frustrated. Hopefully we’ll be seeing some patches sooner rather than later. Rockstar sent Kotaku a link advising players to update their graphics drivers to solve unexpected crashes and wrote, “We’re actively looking into any other issues as they arise and we’ll continue to update the Rockstar Support pages with more information as it becomes available.” The support page for PC issues is fairly sparse so far.

Source: Kotaku
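A side note on the Vulkan-to-DX12 switch mentioned above: the in-game graphics menu is the normal way to change it, but if the game won't get that far, the setting can reportedly also be flipped in the settings file. The sketch below is only illustrative; the file location, the <API> element name and the kSettingAPI_* values are assumptions based on community reports, so check your own system.xml (and back it up) before touching anything.

```python
# Illustrative only: flip RDR2's graphics API from Vulkan to DX12 by editing
# the settings file directly. The path, the <API> element name and the
# kSettingAPI_* values are assumptions from community reports -- verify them
# against your own file and keep a backup.
import shutil
import xml.etree.ElementTree as ET
from pathlib import Path

settings = (Path.home() / "Documents" / "Rockstar Games"
            / "Red Dead Redemption 2" / "Settings" / "system.xml")

shutil.copy2(settings, settings.with_suffix(".xml.bak"))   # back up the original first

tree = ET.parse(settings)
api = tree.getroot().find(".//API")        # element name is an assumption
if api is not None:
    api.text = "kSettingAPI_DX12"          # or "kSettingAPI_Vulkan" to switch back
    tree.write(settings, encoding="utf-8", xml_declaration=True)
    print(f"Graphics API set to {api.text}")
else:
    print("No <API> element found; edit the file by hand instead.")
```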

Even with a $1,200 graphics card, you still can’t max out RDR2 on PC

Nvidia recommends feathering in medium settings or upscaling

With Red Dead Redemption 2 now available on Windows PC, graphics card manufacturer Nvidia has announced its guidance for hitting 4K at 60 frames per second. Turns out that even its most expensive consumer-level GPU, the roughly $1,200 GeForce RTX 2080 Ti, can’t get there at the highest settings. That means there’s plenty of headroom for driver optimizations, and/or the next generation of GPUs.

Those lucky folks who aren’t dealing with crashes are already off and running, and most seem pleased with the results. We certainly were during our controlled 4K60 demo last month. But, as it turns out, we weren’t actually playing at the highest settings. As Nvidia revealed in its blog post, the press was working with a mix of medium and high settings at 4K.



If you’re dead-set on goosing all the bells and whistles, Nvidia recommends rendering at a lower resolution, either 3264x1836 or 2880x1620, and upscaling to 4K for “a sizeable performance improvement, and only a minimal reduction in image clarity.” Coupled with the company’s new latency and sharpening tech, it sounds like a decent solution. At the very least, be sure to update your drivers, as Nvidia thoughtfully dropped a day-zero patch on Monday.

Source: Polygon
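For a sense of what those two render resolutions actually save, here's a quick bit of arithmetic. This only counts pixels; the cost of the upscale and sharpening pass itself isn't modeled.

```python
# Quick arithmetic on the render resolutions Nvidia suggests above. Both are
# plain fractions of 3840x2160, so the savings fall straight out of the
# pixel counts; the upscale/sharpen pass on top of that isn't counted.
native_w, native_h = 3840, 2160
for w, h in [(3264, 1836), (2880, 1620)]:
    linear = w / native_w                        # per-axis scale factor
    load = (w * h) / (native_w * native_h)       # share of the native 4K pixel work
    print(f"{w}x{h}: {linear:.0%} per axis, {load:.0%} of the native 4K pixel count")
# 3264x1836: 85% per axis, ~72% of the pixels
# 2880x1620: 75% per axis, ~56% of the pixels
```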
 

Jayjayhd34

Member
I had the infinite load screen problem at first; fixed it by updating drivers and restarting my PC. Been playing it all night and it runs smooth without any hiccups. Shame for those with problems; once you get out of the snow the game looks incredible.
 
Why did anyone expect it to run at 4K 60 on a 2080ti?
Because it runs at 4K 30 on a $400 console and in many ways looks almost exactly the same, while that is a $1,200 GPU in a $3,000+ system which should computationally outperform it by leaps and bounds.

The end result is you either get a game that looks just like it does on the X at 60 FPS or you get it where it looks mostly the same with some marketable improvements at the same 30 FPS...

 

Airbus Jr

Banned
Maybe this is an outdated question

But does Rockstar have any plans to make DLC for this game?

I'm still waiting for a GOTY edition of RDR2
 

ruvikx

Banned
This is why I avoid PC gaming most of the time. A console thread talks about the game, a PC thread is all spec shit. Too much tinkering required, I would never leave shit alone, hoping to optimise further etc.

There's some truth to this. It's especially true when the game lacks a decent benchmark area where we can test all the best settings before starting it. It's common for PC gamers to constantly alter settings based on which part of the game they're in (especially true for open world games like The Witcher III, Assassin's Creed Odyssey & probably Red Dead 2 as well where certain parts are more taxing on hardware than others). Sim racing & stuff like that is perfect on pc, but some of these AAA blockbusters... not so much. They make good tech demos to test hardware & show off on youtube, but for relaxing in a hassle free experience & just focusing on the gameplay? Not really.
 

Lanrutcon

Member
There's some truth to this. It's especially true when the game lacks a decent benchmark area where we can test all the best settings before starting it. It's common for PC gamers to constantly alter settings based on which part of the game they're in (especially true for open world games like The Witcher III, Assassin's Creed Odyssey & probably Red Dead 2 as well where certain parts are more taxing on hardware than others). Sim racing & stuff like that is perfect on pc, but some of these AAA blockbusters... not so much. They make good tech demos to test hardware & show off on youtube, but for relaxing in a hassle free experience & just focusing on the gameplay? Not really.

Go into options menu. Click click. Play game. But any excuse to sling propaganda, eh console warrior?
 
Game is just broken for many (most?) people right now.

That sucks but I think that performance will improve dramatically as they fix things.

For example, right now a 1080ti is getting terrible performance but when you look closely you'll find that the game is only using about 35% of the GPU. It won't stay like this forever, so that means there's a TON of performance being left on the table. Framerates could easily more than double as they fix and optimize.
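A rough back-of-the-envelope on why that ~35% utilization figure implies so much headroom, assuming (optimistically) that framerate scales linearly once the GPU is the actual bottleneck; the 40 fps starting point is a made-up illustration, not a number from this thread.

```python
# Naive headroom estimate from a low GPU-utilization reading. Assumes fps
# scales linearly with how busy the GPU is once the CPU/driver bottleneck is
# removed, which is optimistic -- read it as an upper bound, not a prediction.
observed_fps = 40          # hypothetical 1080 Ti figure, for illustration only
gpu_utilization = 0.35     # the ~35% reading mentioned above
ceiling_fps = observed_fps / gpu_utilization
print(f"~{ceiling_fps:.0f} fps ceiling if the GPU could be kept fully busy")
# ~114 fps, i.e. consistent with "framerates could easily more than double"
```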
 

ultrazilla

Member
Yeah this kinda sucks. I have an 8GB Nvidia RTX 2080 Founders Edition, and at 1440p with a 144Hz monitor I'm getting low-to-mid-40s framerates, and that's not maxed out.
 

ruvikx

Banned
Go into options menu. Click click. Play game. But any excuse to sling propaganda, eh console warrior?

My God man, take some aspirin & calm the proverbial f down. I have a ps4, an xbox one & a gaming pc. Some games run so bad on console I prefer them on pc, whereas many others are perfectly fine on console so I play them on those platforms for ease of use & zero hassle. But don't come & tell me there's no hassle on pc when steam forums etc. are littered with people asking how & why they have stutter in their pc games at 60 fps (hello The Evil Within 2, aka a disgrace on pc). That's just one small example of a commonly found issue with pc games, i.e. lack of optimization & a DIY experience which can take hours to set up correctly - often with different Nvidia control panel settings for different games, thus making jumping from one to another also a hassle (vsync on/off etc.).

If you deny there's a time sink required to get a game running properly on pc, then you're the one peddling dishonest propaganda here.
 

FutureMD

Member
"Even with a $1,200 graphics card, you still can’t max out RDR2 on PC "
So it's bad that a game pushes the graphics to a degree that needs next-gen cards to max out at 4K? Would it be better to have a game limited in its max graphical features just so people can feel good about running it at ultra and 4K?
 

Lanrutcon

Member
My God man, take some aspirin & calm the proverbial f down. I have a ps4, an xbox one & a gaming pc. Some games run so bad on console I prefer them on pc, whereas many others are perfectly fine on console so I play them on those platforms for ease of use & zero hassle. But don't come & tell me there's no hassle on pc when steam forums etc. are littered with people asking how & why they have stutter in their pc games at 60 fps (hello The Evil Within 2, aka a disgrace on pc). That's just one small example of a commonly found issue with pc games, i.e. lack of optimization & a DIY experience which can take hours to set up correctly - often with different Nvidia control panel settings for different games, thus making jumping from one to another also a hassle (vsync on/off etc.).

If you deny there's a time sink required to get a game running properly on pc, then you're the one peddling dishonest propaganda here.

Yes, I'll deny it because it's not the 80s anymore. Sometimes you get an RDR2 situation where the game is a buggy mess, sometimes you just want to get some extra juice, but sometimes you don't want to tweak so you just run the game and play. You're deluded if you think every game (or even most games these days) takes time to get running. You're cherry-picking examples and trying to sell it as the norm. For every Evil Within 2 (if that is even an example, I found posts complaining about the console version too because *gasp* you can find forums complaining about anything if you look hard enough) there's a Gears 5. Or an Outer Worlds. Or a Borderlands 3, which just works.

It's the same misinformation that console warriors have been shoveling for years. We're not running Vista anymore. You don't need a degree to install and play a Steam game. Spending 2 minutes in an options menu is not a time sink. Spending 20 seconds to set vsync to on (per game in your example, for some reason?) isn't going to ruin your life. Cut the crap, cut the hyperbole, stop using the worst case scenarios as some kind of benchmark. They're fucking terrible when they happen, but they certainly don't represent the PC experience as a whole.
 
Because it runs at 4K 30 on a $400 console and in many ways looks almost exactly the same, while that is a $1,200 GPU in a $3,000+ system which should computationally outperform it by leaps and bounds.

The end result is you either get a game that looks just like it does on the X at 60 FPS or you get it where it looks mostly the same with some marketable improvements at the same 30 FPS...


The X1X game is no doubt highly optimised, and the GPU in it is faster than a 580 at base clocks with more bandwidth than a 590 and three times the memory channels (so likely faster than the 590 in some circumstances). It's got the best-performing console version, but even it doesn't always hold 30 fps.

If you really work the X1X well there's no reason to expect double (or higher) frame rate plus noticeably higher settings out of anything other than maybe the 2080 Ti. If you get it, lovely, but you have no right to expect it based on the hardware.

Too many PC gamers with expensive hardware actually buy into this "master race" nonsense and don't just mean it as an admittedly funny joke. It's my preferred and only "current gen" platform, but configuration-specific glitches and bugs, and the extent to which you're reliant on (particularly) Nvidia driver optimisations to get the best results out of your hardware, are glossed over. And there are so many gimps that just put settings too high and then complain about performance relative to console.

https://www.guru3d.com/articles_pag..._graphics_performance_benchmark_review,4.html

(Effect of resolution and settings in the link).

Performance will no doubt edge upwards with driver and game version advances, but the game gives you the ability to hammer performance for relatively small perceptual differences. If you aren't happy with performance, turn your damn settings down.
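For anyone wanting to sanity-check the X1X vs RX 580/590 comparison, here's the paper-spec math using the usual GCN throughput formula. The clocks and bandwidth are public figures, but paper TFLOPS ignore drivers and the rest of the system, so treat this as ballpark support for the claim rather than a benchmark.

```python
# Rough paper-spec check on the X1X vs RX 580/590 comparison above, using the
# usual GCN throughput formula (CUs x 64 lanes x 2 ops per clock).
def gcn_tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000.0

specs = {
    "Xbox One X":     (40, 1.172, 326),   # CUs, GPU clock (GHz), bandwidth (GB/s)
    "RX 580 (base)":  (36, 1.257, 256),
    "RX 590 (boost)": (36, 1.545, 256),
}

for name, (cus, clock, bw) in specs.items():
    print(f"{name:<15} {gcn_tflops(cus, clock):.1f} TFLOPS  {bw} GB/s")
# Xbox One X      6.0 TFLOPS  326 GB/s
# RX 580 (base)   5.8 TFLOPS  256 GB/s
# RX 590 (boost)  7.1 TFLOPS  256 GB/s
```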
 
"Even with a $1,200 graphics card, you still can’t max out RDR2 on PC "
So it's bad that a game pushes the graphics to a degree that needs next-gen cards to max out at 4K? Would it be better to have a game limited in its max graphical features just so people can feel good about running it at ultra and 4K?
You guys need to dial back what you're saying a bit because there's a little more going on here. I don't think your hardware is being used quite properly.

$400 console getting 30 FPS

[screenshot]

$3,000 PC with a 2080 Ti at maximum settings getting 30 FPS.

[screenshot]

$400 console getting 30 FPS

[screenshot]

$3,000 PC with a 2080 Ti at maximum settings getting 30 FPS.

[screenshot]

$400 console getting 30 FPS

[screenshot]

$3,000 PC with a 2080 Ti at maximum settings getting 30 FPS.

[screenshot]

$400 console getting 30 FPS

[screenshot]

$3,000 PC with a 2080 Ti at maximum settings getting 30 FPS.

[screenshot]
Is there a difference? A wee bit, but it's nothing to write home about, and it's at the same framerate, because for whatever reason the game is that demanding for that little of a return.

The X1X game is no doubt highly optimised, and the GPU in it is faster than a 580 at base clocks with more bandwidth than a 590 and three times the memory channels (so likely faster than the 590 in some circumstances). It's got the best-performing console version, but even it doesn't always hold 30 fps.

If you really work the X1X well there's no reason to expect double (or higher) frame rate plus noticeably higher settings out of anything other than maybe the 2080 Ti. If you get it, lovely, but you have no right to expect it based on the hardware.

Too many PC gamers with expensive hardware actually buy into this "master race" nonsense and don't just mean it as an admittedly funny joke. It's my preferred and only "current gen" platform, but configuration-specific glitches and bugs, and the extent to which you're reliant on (particularly) Nvidia driver optimisations to get the best results out of your hardware, are glossed over. And there are so many gimps that just put settings too high and then complain about performance relative to console.

https://www.guru3d.com/articles_pag..._graphics_performance_benchmark_review,4.html

(Effect of resolution and settings in the link).

Performance will no doubt edge upwards with driver and game version advances, but the game gives you the ability to hammer performance for relatively small perceptual differences. If you aren't happy with performance, turn your damn settings down.
Take a look above. If more people realized this I think they'd be a bit pissed off.
 
"Even with a $1,200 graphics card, you still can’t max out RDR2 on PC "
So it's bad that a game pushes the graphics to a degree that needs next-gen cards to max out at 4K? Would it be better to have a game limited in its max graphical features just so people can feel good about running it at ultra and 4K?


Well the thing is the game doesn't push graphics to any degree.

Please, post some cherry-picked screenshots showing me otherwise.

Absolutely nothing justifies the performance in this game.
 

Larxia

Member
I'll never understand people getting so hung up over the word "ultra", it really doesn't mean anything and you should look beyond that.

I'm not defending the game or whatever, I haven't played it, but "Ultra" is not always the main way a game is meant to be played. It's a bit less the case nowadays, but back in the day "ultra" modes used to be more like some kind of "future proof" mode, meaning that people years later could still have fun pushing the game's engine because the devs allowed you to do so. It certainly doesn't mean that medium or high modes suck.

I remember Witcher 2 getting a lot of backlash because of its "ubersampling" option, and people were really dumb about that. The ubersampling option would render the game at 2X or 4X your display resolution (I'm not sure anymore), so it was basically like running a game in 4K, in 2011. Of course it was extremely demanding, but it also looked crazy good. It wasn't meant to be enabled by everyone; it was just a future-proof feature to allow for high-resolution anti-aliasing, but because people couldn't enable this option (which was in red in the launcher with a warning about what it actually was), everybody started claiming the game had terrible optimization... People are so frustrating sometimes.

Ultra isn't always the best looking thing by the way, if you just enable ultra with everything maxed out, you will have all the terrible post process effects like motion blur, depth of field, chromatic aberration, sometimes some terrible implementation of ambient occlusion looking really weird.

Enabling everything and pushing everything to the max is really not always the best thing, at all. I played a lot of games that looked better with a lot of stuff disabled (including GTA V).
That's also without mentioning a lot of options that can be lowered without being really that noticeable in-game, like shadow resolution or tessellation.
I'm sure most people who complain about not being able to run a game at ultra wouldn't even notice if you changed the settings to medium while they weren't looking.
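Incidentally, the "basically like running a game in 4K" remark about ubersampling checks out arithmetically if it meant 2x per axis on a 1080p screen; whether The Witcher 2 scaled per axis or by total pixel count isn't settled in this thread, so take this as a rough pixel-count check only.

```python
# Pixel-count check on the ubersampling remark above: 2x per axis on a 1080p
# display is exactly the 4K pixel count, i.e. roughly four times the shading
# work before the AA resolve step (which isn't counted here).
w, h = 1920, 1080
super_w, super_h = w * 2, h * 2
print(f"2x per axis: {super_w}x{super_h} = {super_w * super_h:,} pixels")  # 3840x2160, ~8.3M
print(f"native 1080p: {w * h:,} pixels")                                   # ~2.1M, 4x fewer
```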
 

Alexios

Cores, shaders and BIOS oh my!
I'll never understand people getting so hung up over the word "ultra", it really doesn't mean anything and you should look beyond that.

I'm not defending the game or whatever, I haven't played it, but "Ultra" is not always the main way a game is meant to be played. It's a bit less the case nowadays, but back in the day "ultra" modes used to be more like some kind of "future proof" mode, meaning that people years later could still have fun pushing the game's engine because the devs allowed you to do so. It certainly doesn't mean that medium or high modes suck.

I remember Witcher 2 getting a lot of backlash because of its "ubersampling" option, and people were really dumb about that. The ubersampling option would render the game at 2X or 4X your display resolution (I'm not sure anymore), so it was basically like running a game in 4K, in 2011. Of course it was extremely demanding, but it also looked crazy good. It wasn't meant to be enabled by everyone; it was just a future-proof feature to allow for high-resolution anti-aliasing, but because people couldn't enable this option (which was in red in the launcher with a warning about what it actually was), everybody started claiming the game had terrible optimization... People are so frustrating sometimes.

Ultra isn't always the best looking thing by the way, if you just enable ultra with everything maxed out, you will have all the terrible post process effects like motion blur, depth of field, chromatic aberration, sometimes some terrible implementation of ambient occlusion looking really weird.

Enabling everything and pushing everything to the max is really not always the best thing, at all. I played a lot of games that looked better with a lot of stuff disabled (including GTA V).
That's also without mentioning a lot of options that can be lowered without being really that noticeable in-game, like shadow resolution or tessellation.
I'm sure most people who complain about not being able to run a game at ultra wouldn't even notice if you changed the settings to medium while they weren't looking.
Nice try but this has been said over and over and people still don't wanna get it. Some are truly ignorant, others just wanna shit post and it always goes ignored.

Some nebulous "game x ultra doesn't run 4K60 on hardware y" doesn't mean shit about either the hardware or the software quality but people gonna people.

And then you have people (supposedly) quitting/bashing "PC gaming" over it. That's your own ignorance and own loss folks, nothing more, lol.
 

Inviusx

Member
I grew out of PC gaming about a decade ago. Reading all this stuff brings back those PTSD moments of buying the hottest new title and realising my faithful old rig can't handle it. It's a never-ending cycle of chasing the highest numbers and only feeling satisfied when an arbitrary setting is switched on in the graphics menu.

Moving back to console was the best decision of my gaming career, both for my sanity and the sake of my wallet.
 

Alexios

Cores, shaders and BIOS oh my!
I grew out of PC gaming about a decade ago. Reading all this stuff brings back those PTSD moments of buying the hottest new title and realising my faithful old rig can't handle it. It's a never-ending cycle of chasing the highest numbers and only feeling satisfied when an arbitrary setting is switched on in the graphics menu.

Moving back to console was the best decision of my gaming career, both for my sanity and the sake of my wallet.
Your points are all over the place. It's one thing for your PC to be so outdated it doesn't even run a new game (there's always going to be a cut-off point, just as your PS4/Pro won't run PS5 exclusive titles once the cross-gen phase passes) and a whole other thing to actually want to always have the latest and greatest and therefore keep upgrading and spending often. It's like the polar opposite situation, and it's all a choice, not a requirement of PC gaming. So just be content with your choices about just doing console gaming rather than try to objectively trash what PC gaming stands for with subjective opinions and personal choices that stopped you from enjoying it/made you go crazy or whatever. There's always a middle ground and a way for more sensible behaviour/choices that will maximize the use/enjoyment you can get out of it with just a bit of research.

One can play AAA style demanding games (never mind the hordes of great lower end software that is constantly released and can run on a toaster) just fine without constant/expensive upgrades. My last GPU was a 7970 (launched in 2011, yet if you search online you can see it was more than adequate even for modern high end demanding games like The Witcher 3; the R9 280X that was most commonly benchmarked on the mid/low end was a rebrand of this GPU with roughly equal performance). Now I have a GTX1080 (launched in 2016, but I got it last year, so seven years after the 7970's launch) while still using my CPU/mobo/ram from the 7970 days (so yes, I will upgrade my CPU a while after the next gen consoles come along to make sure it's on par, but by then it's going to be almost a decade after I first got this i7 3770K, and then some years later I will also upgrade the GTX1080 and keep the same CPU, and so on, upgrading each part as deemed necessary rather than all at once and all so frequently), and not once have I been unable to play a game that interested me.

Max everything at 4K 60fps? No, but that doesn't mean I couldn't play AAA games that looked good and performed decently. If there is a game where I find performance or visual quality has to be too compromised, it can wait until the next upgrade cycle; there are tons that are fine to play instead.

Nothing about PC gaming requires constant expensive upgrades and number chasing; people who engage in that simply have money to burn and burn it in excess for whatever benefit they see fit. More power to them, but it's not any kind of requirement. One might as well proclaim AAA console gaming "requires" mid-gen upgrades now that consoles in the Pro and X style will most likely continue coming every gen, even though it's just as clear it's not "required" any more than the latest Titan GPU is.

Maybe that buying behaviour at certain periods doesn't give me better graphics than a new console but that's not what PC is all about, there are many more advantages than the potential to achieve that, plus I can achieve that after the next upgrade cycle without rebuying the games or hoping that any one company will invest in backwards compatibility (never mind the rarer decision to also actually enhance the old software for the new system like Microsoft is doing for a select few titles in comparison to the almost library-wide potential for the same on the PC).

Edit: OK, so maybe you were just talking about your own experience, but in this thread, alongside all the other posts trashing PC gaming/graphics cards/console users pretending to laugh at PC or whatever ignorant shit, it seemed similar. Consider this post not a direct reply despite the quote.
 
You guys need to dial back what you're saying a bit because there's a little more going on here. I don't think your hardware is being used quite properly.

The game is built around using certain features to get certain results. Pushing certain features higher will create an exponential performance hit for a significantly smaller perceived improvement.

If you don't think that's a proper use of your hardware, don't turn those settings up that high?

Take a look above. If more people realized this I think they'd be a bit pissed off.

But it's up to them if they use such high settings. The only people they should be pissed off at are themselves.

If you can't stop yourself moving a slider up and getting bad performance for improvements you either can't see or don't value, then you're the problem.

So long as the game presents you with reasonable initial settings that give you good results for good performance, everything else is on you. If the game just dumps you in with ultra settings, then yeah, that's shitty.
 

Gavin Stevens

Formerly 'o'dium'
I still wish that games that have released on both console and pc would just have an options button that changes everything to their exact console settings. Would stop a lot of whining that’s for sure...

RDR2 is a damn demanding game that is likely one of the best, if not the best looking game of the generation. And you plebs with your Voodoo 5s are crying you can’t run it. NO SHIT.
 
Your points are all over the place. It's one thing for your PC to be so outdated it doesn't even run a new game and a whole other thing to actually want to always have the latest and greatest and therefore keep upgrading and spending often. It's like the polar opposite situation, and it's all a choice, not a requirement of PC gaming. So just be content with your choices about just doing console gaming rather than try to objectively trash what PC gaming is with subjective opinions and personal choices that stopped you from enjoying it. There's always a middle ground.

One can play AAA style demanding games (never mind the hordes of great lower end software that is constantly released and can run on a toaster) just fine without constant/expensive upgrades. My last GPU was a 7970 (launched in 2011, yet if you search online you can see it was more than adequate even for modern high end demanding games like The Witcher 3; the R9 280X that was most commonly benchmarked on the mid/low end was a rebrand of this GPU), now I have a GTX1080 (launched in 2016 but I got it last year) while still using my CPU/mobo/ram from the 7970 days (so yes, I will upgrade my CPU a while after the next gen consoles come along to make sure it's on par, but by then it's going to be almost a decade after I first got this i7 3770K, and then some years later I will also upgrade the GTX1080 and keep the same CPU, and so on, upgrading each part as deemed necessary rather than all at once and all so frequently!) and not once have I been unable to play a game that interested me. Max everything at 4K 60fps? No, but that doesn't mean I couldn't play AAA games that looked good and performed decently. If there is a game where I find performance or visual quality has to be too compromised, it can wait until the next upgrade cycle; there are tons that are fine to play instead. Nothing about PC gaming "requires" multi-gen upgrades; people who engage in that simply have money to burn and burn it in excess for whatever benefit they see fit. More power to them, but it's not any kind of requirement. One might as well proclaim AAA console gaming "requires" mid-gen upgrades now that consoles in the Pro and X style will most likely continue coming every gen.

I think he's trying to say he felt compelled to run at high settings in order to get the best experience. This spoiled his enjoyment of the game and his hardware, and created a perceived need to upgrade that possibly wasn't even there yet.

Going to console eliminated the need to worry about settings, and removed the incentive to replace hardware prematurely to chase those high settings.

Therefore he's happier on console.

That's a perfectly reasonable position IMO.

Personally, I'm happy to use medium / high settings if necessary. Hell, even low sometimes if it gives the performance I want and I'm not too bothered by that particular image hit (motion blur and in-gameplay DoF can generally just fuck off entirely).
 

GymWolf

Member
After the ultra shitty ps4 pro version rockstar can't get the pc version to work properly at launch?



But hey, it's future proof guys, in 3 years you are gonna put everything on ultra and the game will still be destroyed by any proper triple-A next-gen game :ROFLMAO:
 

rofif

Banned
Based on ....??
400 USD gets you 4K30. 3K USD should get you...?
It's a game about open wilderness... there is no reason it should run poorly. GTA 5 also runs like shit if you turn everything to max. My RTX 2070 can't do 4K 60 with maxed-out GTA 5. Their optimization is crap for how the games look.
 

bitbydeath

Member
"Even with a $1,200 graphics card, you still can’t max out RDR2 on PC "
So it's bad that a game pushes the graphics to a degree that needs next-gen cards to max out at 4K? Would it be better to have a game limited in its max graphical features just so people can feel good about running it at ultra and 4K?

Just imagine if it were a more graphically impressive game like Horizon:ZD or Days Gone. The 2080Ti would struggle to reach 1080P. 😂
 

bad guy

as bad as Danny Zuko in gym knickers
Highest settings are meant for your next PC in five years or so. Like a built-in remaster you don't have to pay for.
 
400 USD gets you 4K30. 3K USD should get you...?

Those costs represent market and manufacturing factors; they aren't directly related to the abilities of the hardware. And the work required by the software at different settings isn't even related to any of that.

It's a game about open wilderness... there is no reason it should run poorly. GTA 5 also runs like shit if you turn everything to max. My RTX 2070 can't do 4K 60 with maxed-out GTA 5. Their optimization is crap for how the games look.

Then don't max the game out. You don't think the hit is worth it, so why would you do it? And why would open wilderness not be demanding... that doesn't make any sense.
 

Helios

Member
This is why I avoid PC gaming most of the time. A console thread talks about the game, a PC thread is all spec shit. Too much tinkering required, I would never leave shit alone, hoping to optimise further etc.
Literally every time a new game comes up we get a DF thread where people argue over which console runs it better. Meanwhile, some people complain about poor optimization on PC and you lot come in here with these types of trash comments.


"We console bois talk about the actual game ! "

 