
Red Dead Redemption 2 Is Having A Rough Launch On PC

BigBooper

Member
The complaints about not running maxed settings are the dumbest things I've read in a while. Yea, if you couldn't run with max settings and the game also looked like Half Life or something you might have a point.
 

NXGamer

Member
Sometimes I am still shocked by the expectations here. Once the next-gen consoles launch, the PC market is going to find just hitting 60fps MUCH, MUCH more demanding than it is currently. I then expect to see these kinds of concerns raised far more often, and, like now, incorrectly.

There are quite a few areas to cover in the Red Dead 2 PC version, and I started looking at it yesterday on 2 rigs that really cover the range. It scales, and it scales well, with the obvious issues that pushing a game already sitting atop the current generation will bring. Namely, a demand on hardware that grows non-linearly, and thus a lower perceived return.

Hopefully my video covering this soon will explain it better, and show how it is actually NOT a terrible port at all.
 
Last edited:

TacosNSalsa

Member
I'm not getting anything near 60fps; I'm getting between 30fps and the mid-40s. Can anyone troubleshoot this for me? I believe my i7-7700 CPU is bottlenecking my graphics card. Could my monitor be doing that as well? Thanks!


Acer XFA240 1080p/144Hz FreeSync monitor (running in-game resolution at 1440p), 16GB DDR4 RAM, i7-7700, Nvidia RTX 2080 Founders Edition 8GB, all drivers up to date. As for settings, here are my settings:

[settings screenshots]
I'm no PC expert and not familiar with Rockstar's settings, so take this with a grain of salt. Whenever a game isn't running the way I want, I always start with shadows and anything with the word volumetric in it; they seem to be the biggest hogs. If it's still not the way I like it, I go for the reflections. Normally the difference between ultra and high is negligible unless you're specifically looking for it.
 
I'm not getting anything near 60fps; I'm getting between 30fps and the mid-40s. Can anyone troubleshoot this for me? I believe my i7-7700 CPU is bottlenecking my graphics card. Could my monitor be doing that as well? Thanks!


Acer XFA240 1080p/144Hz FreeSync monitor (running in-game resolution at 1440p), 16GB DDR4 RAM, i7-7700, Nvidia RTX 2080 Founders Edition 8GB, all drivers up to date. As for settings, here are my settings:

I'd say your settings are way too high; turn down water physics quality, for one.
 
Running it at a flawless 1080/60. It looks and runs gloriously. The only problem I've had so far was that the first few hours of RDO had a weird bug that capped the framerate to 30, but it was quickly hotfixed.
 

Shifty

Member
PC is relegated to indie games and co-op/online games nowadays. There is literally no reason to have a high-end PC for gaming, because all the major games will be limited by the next-gen console hardware anyway. Buy a video card comparable to that of the PS5 or Xbox and you are good to go.
You do know that AAA developers don't build their assets solely targeting the current (or next, in the case of crossgen) generation of consoles, yes?

You build that stuff at as high a fidelity as possible using a resolution-independent pipeline (such as Substance for textures or subdivision modeling for 3D), then have the art tools crunch it down to console-ready levels in order to hit performance targets.

There's plenty of ceiling for PC to take advantage of.

The "Hjaelp" on the second monitor is perfect :messenger_tears_of_joy:
 

lukilladog

Member
Sometimes I am still shocked by the expectations here. Once the next-gen consoles launch, the PC market is going to find just hitting 60fps MUCH, MUCH more demanding than it is currently. I then expect to see these kinds of concerns raised far more often, and, like now, incorrectly.

There are quite a few areas to cover in the Red Dead 2 PC version, and I started looking at it yesterday on 2 rigs that really cover the range. It scales, and it scales well, with the obvious issues that pushing a game already sitting atop the current generation will bring. Namely, a demand on hardware that grows non-linearly, and thus a lower perceived return.

Hopefully my video covering this soon will explain it better, and show how it is actually NOT a terrible port at all.

But can it run at a solid 60fps with no frame-pacing issues on the likes of a 970 or GTX 1060, with at least a mix of medium/high settings, at 1080p or 1440p? I consider anything that falls below that by a considerable margin a mediocre port.

PS: I've seen a video; it needs low settings to achieve 60fps. Some port, huh!

 
Last edited:

DeepEnigma

Gold Member
Because it runs at 4K 30 on a $400 console and in many ways looks almost exactly the same, while that is a $1,200 GPU in a $3,000+ system which should computationally outperform it by leaps and bounds.

The end result is you either get a game that looks just like it does on the X at 60 FPS or you get it where it looks mostly the same with some marketable improvements at the same 30 FPS...

[shrug]

Don't forget the shitty netbook Jaguar junk calculator, tablet, and/or microwave CPU,... the Intel Zenernators will framerate the piss out of this. Just taking a piss myself.

Which reminds me, next gen is going to be even harder for PCs to "brute force" on the CPU side (at least around the launch window and a good portion after), due to the consoles having a (rumored/leaked 3.2GHz) desktop part in the box.
 
Last edited:

888

Member
I'm not getting anything near 60fps; I'm getting between 30fps and the mid-40s. Can anyone troubleshoot this for me? I believe my i7-7700 CPU is bottlenecking my graphics card. Could my monitor be doing that as well? Thanks!


Acer XFA240 1080p/144Hz FreeSync monitor (running in-game resolution at 1440p), 16GB DDR4 RAM, i7-7700, Nvidia RTX 2080 Founders Edition 8GB, all drivers up to date. As for settings, here are my settings:

[settings screenshots]




Here is what I am running, getting 55 (mostly near water) to 90fps: 1440p with G-Sync, 9700K, 16GB 3600MHz RAM, 512GB SSD (where the game is installed), Asus 2070 Super OC'd to around 2000MHz. Same as you with the drivers.


[settings screenshots]
 

ultrazilla

Member
I'd say your settings are way too high; turn down water physics quality, for one.

Thanks to all who helped guide me with some settings! I had to humble myself big time. I knocked a ton of stuff down to medium and I'm averaging 71fps. The weird thing is there are times when I'm stuck at around 41fps, and then, depending on the background scenery, it'll go up past 60fps!

So I've got to think that Rockstar didn't optimize the game to its fullest, and I'm hoping they'll push through performance patches. There's definitely something going on here...

888 888 Awesome, I'll use your settings and get back to you! :messenger_sunglasses:
 

Skyr

Member
It pisses me off that Pascal cards are falling behind in this game. In the vast majority of games a 1080 Ti is on par with the 2080, and here it's far behind for whatever reason. I fail to see the logic in that, as there is no RTX stuff involved.
I hope they can further improve this through drivers, but my guess is Nvidia doesn't care or doesn't want to.
 

joe_zazen

Member
I grew out of PC gaming about a decade ago. Reading all this stuff brings back those PTSD moments of buying the hottest new title and realising my faithful old rig can't handle it. It's a never-ending cycle of chasing the highest numbers and only feeling satisfied when an arbitrary setting is switched on in the graphics menu.

Moving back to console was the best decision of my gaming career, both for my sanity and the sake of my wallet.

lol, i hear you.

The rabbit hole of modding Skyrim with dual 680s kinda killed my enthusiasm for PC gaming. Trying to figure out which mod out of 200 was causing issues... just ewwwww. Also, realising that better graphics settings and resolution after a certain point did not materially affect my enjoyment of a game: I am still doing the exact same shit for the exact same pretend rewards whether at 4K/120 or 1080/30.

It kinda sucked knowing I'd spent over $2k on a gaming PC before I realised this, but better late than never.
 

Graciaus

Member
My God man, take some aspirin & calm the proverbial f down. I have a ps4, an xbox one & a gaming pc. Some games run so bad on console I prefer them on pc, whereas many others are perfectly fine on console so I play them on those platforms for ease of use & zero hassle. But don't come & tell me there's no hassle on pc when steam forums etc. are littered with people asking how & why they have stutter in their pc games at 60 fps (hello The Evil Within 2, aka a disgrace on pc). That's just one small example of a commonly found issue with pc games, i.e. lack of optimization & a DIY experience which can take hours to set up correctly - often with different Nvidia control panel settings for different games, thus making jumping from one to another also a hassle (vsync on/off etc.).

If you deny there's a time sink required to get a game running properly on pc, then you're the one peddling dishonest propaganda here.
The majority of games, you just click play and adjust your settings to whatever your PC can handle. If you do have an issue, most are solved with a simple Google search. A game this poorly optimized is an exception, and in almost-2020 it should not be allowed to release. But people buy it anyway.

Hours to set up? Don't spread propaganda.
 

MMaRsu

Banned
I'm not getting anything near 60fps. I'm getting between 30fps up to mid 40fps. Anyone can trouble shoot me? I'm of the belief that
my cpu i7-7700 is bottlenecking my graphics card. Could my monitor be doing that as well? Thanks!


Acer XFA240 1080p/144hz freesync monitor(running in game resolution at 1440p) 16 gig ddr 4 ram, i7-7700, Nvidia GTX 2080 Founder's Edition 8 gig, all drivers up to date. As for settings, here are my settings:

cOFFYlY.jpg


hek8Bye.jpg


QVUDeKL.jpg


hGJr9kU.jpg

Maybe lower some settings durr?
 

bilderberg

Member
Because it runs at 4K 30 on a $400 console and in many ways looks almost exactly the same, while that is a $1,200 GPU in a $3,000+ system which should computationally outperform it by leaps and bounds.

The end result is you either get a game that looks just like it does on the X at 60 FPS or you get it where it looks mostly the same with some marketable improvements at the same 30 FPS...

[shrug]

You mean this hardware that's 10x more powerful isn't running the game 10x more efficiently? Shocking. A marginally better looking game at 5x the cost has been a reality for pc gaming the past 20 years.
 

joe_zazen

Member
The majority of games, you just click play and adjust your settings to whatever your PC can handle. If you do have an issue, most are solved with a simple Google search. A game this poorly optimized is an exception, and in almost-2020 it should not be allowed to release. But people buy it anyway.

Hours to set up? Don't spread propaganda.

You missed the word 'can', and yes, there are occasions where it can take hours. I like to point these things out because I wish someone had given me straight talk before I bought my overpriced gaming hardware.
 
Last edited:

NXGamer

Member
But can it run at a solid 60fps with no frame-pacing issues on the likes of a 970 or GTX 1060, with at least a mix of medium/high settings, at 1080p or 1440p? I consider anything that falls below that by a considerable margin a mediocre port.

PS: I've seen a video; it needs low settings to achieve 60fps. Some port, huh!


This is the first issue: who is saying that this is a given? A 970 is a weaker GPU than a Pro, and certainly than an X. What is the CPU? What is the RAM? And why is High even in the conversation, when the PC version could be far higher in many aspects? It could be ray-marching volumes at 2x/4x the console versions, shadow maps could be 2x larger, LOD could be higher, and so on. No developer has a requirement to enable 60fps on lower-end machines at high settings; this is the reason PC has settings, so you choose your sacrifices.

Again, Low MAY BE identical to consoles (it is not), but that is still twice the throughput being asked for, and the CPU is not even being mentioned here.
 

bilderberg

Member
Anyone playing on PC should invest in a 120 or 144Hz monitor. The most underrated aspect of a high refresh rate is having so many divisible vsync rates. Using a 1/3 refresh cycle (48fps) on a 144Hz monitor is so convenient; you don't have to make a drastic choice between 30 or 60fps. I play many games at 40fps (1/3 of 120Hz), 48, 60, or 72fps. It really gives you a much smoother experience in really demanding games like this. You do have to use Nvidia Profile Inspector to get that fine control over which vsync fraction you want, though; it should be an option in the official driver. And I don't know if AMD has something similar.
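The divisor arithmetic behind this can be sketched in a few lines. This is just an illustration of the idea, not anything the driver exposes: `divisible_caps` is a hypothetical helper that lists the frame caps dividing a refresh rate evenly, so every frame is held on screen for a whole number of refresh cycles (which is what gives the even frame pacing the post describes). Actually applying a 1/2 or 1/3 vsync still requires a tool like Nvidia Profile Inspector.

```python
# Sketch: which frame-rate caps divide a refresh rate evenly?
# A cap of refresh_hz / n means each frame persists exactly n
# refresh cycles, so pacing stays uniform (no judder from mixed
# 2- and 3-cycle frames).

def divisible_caps(refresh_hz, min_fps=30):
    """List even-divisor frame caps of refresh_hz down to min_fps."""
    return [refresh_hz // n
            for n in range(1, refresh_hz // min_fps + 1)
            if refresh_hz % n == 0 and refresh_hz // n >= min_fps]

print(divisible_caps(144))  # [144, 72, 48, 36]
print(divisible_caps(120))  # [120, 60, 40, 30]
```

Note that 60fps is absent from the 144Hz list (144 / 60 = 2.4 cycles per frame), which is exactly why 48 or 72 paces better on a 144Hz panel.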
 

Freeman76

Member
Literally every time a new game comes up we get a DF thread where people argue over which console runs it better. Meanwhile, some people complain about poor optimization on PC, and you lot come in here with these kinds of trash comments.


"We console bois talk about the actual game ! "


Stay frosty
 

ruvikx

Banned
The majority of games, you just click play and adjust your settings to whatever your PC can handle. If you do have an issue, most are solved with a simple Google search. A game this poorly optimized is an exception, and in almost-2020 it should not be allowed to release. But people buy it anyway.

Hours to set up? Don't spread propaganda.

I ask myself how many Witcher 3 PC players never changed their settings once crossing over the Pontar river, i.e. the forest over there is lush & kills the framerate nearly ten hours into the game. You can play for an hour on one setting & then realize "shit, I need to change here because the stutter sucks". I ask myself how many Assassin's Creed Odyssey players didn't have to change their settings & resolution once they reached the first city, or lit their torch at night (in that game the benchmark is seriously flawed). Or how many Fallout players realized Fallout 4 is a stutter fest without locking the framerate in the user settings? These are rhetorical questions FYI, i.e. don't spread bullshit about click & play on PC, because even stuff like RivaTuner Statistics Server is often required to get a smooth framerate beyond what the Nvidia panel can or cannot do.

I'm also referring to the biggest titles as well, aka the blockbuster AAA games which people want. If you're coming in here attempting to get console players to go pc & tell them "don't worry about settings, these assholes who tell you you'll be tinkering with the graphics & framerates are liars spreading propaganda!", I'll call you out.
 

GymWolf

Member
Yes, with 1/10 of the texture quality. Good for you sir.
If I remember correctly, the textures are the same on console, except the trees.

You see more detail because the image quality is superior.
A mod from another forum told me that 4K on PC is noticeably clearer and more detailed even compared to the real 4K on the Xbox One X; don't ask me how that's possible.

But the low-tier textures on console are basically the same as PC at ultra.

Happy to be disproved; textures for me are far more important than LOD or shadows.
 

lukilladog

Member
This is the first issue: who is saying that this is a given? A 970 is a weaker GPU than a Pro, and certainly than an X. What is the CPU? What is the RAM? And why is High even in the conversation, when the PC version could be far higher in many aspects? It could be ray-marching volumes at 2x/4x the console versions, shadow maps could be 2x larger, LOD could be higher, and so on. No developer has a requirement to enable 60fps on lower-end machines at high settings; this is the reason PC has settings, so you choose your sacrifices.

Again, Low MAY BE identical to consoles (it is not), but that is still twice the throughput being asked for, and the CPU is not even being mentioned here.

The X is 36% more powerful than the 1060, but it's running at 4K; the 1060 should be able to retain the graphics of the X at 1080p with twice the frame rate or more, no problem, and going by the comparisons from the previous page it should be close to ultra settings. It's a lazy port: they rely on you having the latest and greatest hardware, and even if you do, they still under-deliver.
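For what it's worth, the scaling argument here can be written out as back-of-the-envelope arithmetic. This sketch assumes performance scales linearly with pixel count, which real games only approximate (CPU load, memory bandwidth, and fixed per-frame costs all break the model, and that gap is arguably NXGamer's whole point); the 1.36x figure is the poster's own claim, not a measured number.

```python
# Back-of-the-envelope version of the claim: if the One X renders 4K
# at 30fps, what should a 1060 manage at 1080p under ideal,
# pixel-proportional scaling?

PIXELS_4K = 3840 * 2160      # One X render target
PIXELS_1080P = 1920 * 1080
X_VS_1060 = 1.36             # "the X is 36% more powerful than the 1060"

def idealized_1060_fps(console_fps=30):
    pixel_ratio = PIXELS_4K / PIXELS_1080P   # 4.0x fewer pixels at 1080p
    headroom = pixel_ratio / X_VS_1060       # ~2.94x per-pixel headroom
    return console_fps * headroom

print(round(idealized_1060_fps()))  # ~88 fps in the idealized model
```

So the "twice the frame rate or more" figure is plausible only under this idealized model; the observed shortfall is where the non-linear-scaling argument from earlier in the thread comes in.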
 
Last edited:
The X is 36% more powerful than the 1060, but it's running at 4K; the 1060 should be able to retain the graphics of the X at 1080p with twice the frame rate or more, no problem, and going by the comparisons from the previous page it should be close to ultra settings. It's a lazy port: they rely on you having the latest and greatest hardware, and even if you do, they still under-deliver.

Wait, are you saying the X is close to ultra settings? Cause it ain't.
 
D

Deleted member 17706

Unconfirmed Member
Glad I'm waiting for the Steam version, but I definitely need to check my expectations considering I only have a 6700K and GTX 1080 on a 1440p 165Hz G-Sync monitor.
 

nkarafo

Member
Anyone playing on PC should invest in a 120 or 144Hz monitor. The most underrated aspect of a high refresh rate is having so many divisible vsync rates. Using a 1/3 refresh cycle (48fps) on a 144Hz monitor is so convenient; you don't have to make a drastic choice between 30 or 60fps. I play many games at 40fps (1/3 of 120Hz), 48, 60, or 72fps. It really gives you a much smoother experience in really demanding games like this. You do have to use Nvidia Profile Inspector to get that fine control over which vsync fraction you want, though; it should be an option in the official driver. And I don't know if AMD has something similar.
Yeah, I'm doing this with my 240Hz monitor. It also supports 50Hz, despite that not being an even divisor. So I can choose between 30, 40, 50 and 60fps, which is a godsend if you have a low-to-mid-range card like the 1060.
 

Tygeezy

Member
Don't forget the shitty netbook Jaguar junk calculator, tablet, and/or microwave CPU,... the Intel Zenernators will framerate the piss out of this. Just taking a piss myself.

Which reminds me, next gen is going to be even harder for PCs to "brute force" on the CPU side (at least around the launch window and a good portion after), due to the consoles having a (rumored/leaked 3.2GHz) desktop part in the box.
This game is GPU-bottlenecked on PC. On console it is more than likely CPU-bottlenecked. I think Steve from Gamers Nexus is going to do a CPU benchmark for this game soon.

What's nice about benchmarks like Gears 5's is that they show CPU frametimes. On modern CPUs you are getting 250+ fps; my guess is the One X CPU would struggle to hit 60 consistently.
 
Just to be clear, I am not spending $3k to play this shoddy port next year.

Next year, $1,200 will net you a 3080 (2080 Ti performance); that's just how Nvidia pricing works now, apparently. To get the 3080 Ti with a CPU/motherboard combo, you are spending way more than $3k next year.
 

Terenty

Member
You do know that AAA developers don't build their assets solely targeting the current (or next, in the case of crossgen) generation of consoles, yes?

You build that stuff at as high a fidelity as possible using a resolution-independent pipeline (such as Substance for textures or subdivision modeling for 3D), then have the art tools crunch it down to console-ready levels in order to hit performance targets.

There's plenty of ceiling for PC to take advantage of.

No, I don't know all this stuff, but the difference is minuscule all the same. Yes, the picture is sharper, and 60fps if you are lucky, as we can see in this thread, but IMHO it's not worth it.

The video a couple of posts above shows the difference between the PS4 Pro, Xbox One X and PC; can you honestly say the difference is so major that you need to run out and buy the best video card on the market to experience it?
 
Last edited: