
Witcher 3 PC Performance Thread

The Goat

Member
Game is running much better than I expected. Was running it at my native screen res of 1440p and averaging around 45fps, but decided to bump it down to 1080p to get 60fps. Have a mix of high with a few ultra settings. I can run Hairworks fully on and still have a decent, playable framerate, but really, it doesn't add much; not worth the framerate hit. Going to disable the streaming service and see if that gives me anything.

i7 2600K (OC 4.6GHz)
GTX 680 (2GB)
1TB Samsung 850 SSD
16GB RAM (2133MHz)

Considered grabbing a new (Ivy Bridge) CPU, as Sandy Bridge doesn't support PCIe 3.0, but it would probably be a waste. Once AMD and Nvidia start doing battle again, I'll build a new rig.
 

Croatoan

They/Them A-10 Warthog
Maybe it's because I have been playing a lot of Bloodborne, but 30fps locked (and I mean locked) with everything at ultra (save shadows and grass distance on high) runs really, really smooth on my GTX 680. Like, I almost don't notice it's not 60fps. Weird.
 

Devildoll

Member
Sweclockers just released their quick test of The Witcher 3.

Nvidia seems to have quite a bit smoother frametimes in this title.

[frametime comparison graphs]


-link
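Worth spelling out why frametime graphs matter more than a bare fps average: two cards can post the same average fps while one of them stutters. A quick Python sketch with made-up numbers (not taken from the Sweclockers data):

```python
# Two hypothetical frametime traces in milliseconds (invented numbers,
# just to show the idea -- not measurements from the article).
smooth = [16.7] * 6                              # steady ~60 fps
stutter = [10.0, 10.0, 33.3, 10.0, 10.0, 26.9]   # hitchy, but same total time

def avg_fps(frametimes_ms):
    """Average fps = frames rendered / total seconds elapsed."""
    return len(frametimes_ms) / (sum(frametimes_ms) / 1000.0)

print(round(avg_fps(smooth)), round(avg_fps(stutter)))  # both ~60
print(max(smooth), max(stutter))  # worst single frame: 16.7 vs 33.3 ms
```

Same average, very different feel: the second trace has frames that take twice the 60fps budget, which is exactly what shows up as the spikes in a frametime graph.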
 

Static Jak

Member
I tried the 30FPS Cap just out of curiosity and I just couldn't get used to it. Seemed so juddery. I'm already used to seeing the game in 60FPS I suppose.
 

Cincaid

Member
Been playing for another 2 hours, this time without any crashing, yay! Did a restart of my computer and a bit of CCleaning, though I doubt that has anything to do with it. Game looks seriously amazing.

Bit off-topic, but is there no way to take screenshots in the game? Or in GOG Galaxy?
 
Been playing for another 2 hours, this time without any crashing, yay! Did a restart of my computer and a bit of CCleaning, though I doubt that has anything to do with it. Game looks seriously amazing.

Bit off-topic, but is there no way to take screenshots in the game? Or in GOG Galaxy?

I use F12 for Steam
 
So I put foliage distance on "High" instead of "Ultra" and Hairworks on "Off". Now I get 60fps.

...I'll still take the framerate hit and put everything on max.
 

b0bbyJ03

Member
Or tech just changed. Rapid obsolescence is a well-known risk in PC gaming, and we can't blame companies every time they make a leap forward. Tessellation performance is a major step forward.

I think people are upset because it SEEMS to be related to drivers being less optimized rather than the actual hardware being outdated. A 780's performance was comparable to a 970 and a 290X; they performed at around the same level in the past, yet in this game the 290X/970 are spanking the 780. Is the 290 supposed to have better tessellation performance than the 780?
 

sammelito

Member
Game runs like a dream for me. I have everything maxed out except foliage visibility, density and shadows, which run at the high preset. Hairworks, CA, sharpening and vignette are also turned off.

R9 290 Tri-X
i5 4690K (OC'd to 4.4GHz)
8GB 1866MHz RAM

1080P at an almost solid 60 FPS. Lowest frame recorded was 54 FPS.
 

UnrealEck

Member
1080p to get 60 fps. Have a mix of high with a few ultra presets.

680gtx (2gig)

I don't believe you get 60 FPS on high-ultra on a 680.
A GTX 770 (similar to 680) doesn't even get that on High preset with all post-processing stuff entirely off (including AO).
 

viveks86

Member
I think people are upset because it SEEMS to be related to drivers being less optimized rather than the actual hardware being outdated. A 780's performance was comparable to a 970 and a 290X; they performed at around the same level in the past, yet in this game the 290X/970 are spanking the 780. Is the 290 supposed to have better tessellation performance than the 780?

I'm not as plugged in to AMD products as I am Nvidia's. I'd like to know this too.
 

johntown

Banned
This thread makes my 980 cry.

I have not played the game yet but it looks like I will need to make graphical sacrifices to maintain 60fps and hairworks. IMO hairworks is a must and 60fps is a must.

BTW GTX 980, 16GB RAM, i7 4790k and Windows 8.1. SSD too.
 

bltn

Member
You could experiment with lowering HairWorksAALevel, and see if you can still manage to keep HairWorks on. That's what I'm gonna experiment with once I get home, at least.

This actually helped a lot. Running with HairWorks on again, staying very close to 60fps at all times (I have Gsync so no hard limits).
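For anyone else wanting to try it: on my install the setting lives in rendering.ini (under bin\config\base; path and default value from memory, so double-check your copy). The default I saw was 8, and lowering it reduces the MSAA applied to the hair:

Code:
HairWorksAALevel=4

Going lower still (2 or 0) buys a bit more performance at the cost of more shimmer on the hair.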
 
Anyone else disappointed by the graphics? Even with everything on ultra and with a 4k resolution (DSR), I'm not blown away like I hoped to be. Maybe it's because it's kind of cartoony looking with the saturation of colors as well as the constant swaying vegetation.
 

UnrealEck

Member
This thread makes my 980 cry.

I have not played the game yet but it looks like I will need to make graphical sacrifices to maintain 60fps and hairworks. IMO hairworks is a must and 60fps is a must.

BTW GTX 980, 16GB RAM, i7 4790k and Windows 8.1. SSD too.

Start by turning foliage distance down to High and maybe shadows down to High too if you need to.

Anyone else disappointed by the graphics? Even with everything on ultra and with a 4k resolution (DSR), I'm not blown away like I hoped to be. Maybe it's because it's kind of cartoony looking with the saturation of colors as well as the constant swaying vegetation.

I am a bit. It still looks great but I guess it was hyped a bit too much in the graphics department.
 

Copons

Member
You could experiment with lowering HairWorksAALevel, and see if you can still manage to keep HairWorks on. That's what I'm gonna experiment with once I get home, at least.


I'm tinkering with the rendering.ini myself and I noticed that I have
Code:
UseHairWorks=false
even though in-game it's fully on, and in user.settings it's properly set to
Code:
HairWorksLevel=2

Should I just disregard the UseHairWorks option in the .ini as long as HairWorksLevel is > 0 in the .settings, or have I been playing the game without HairWorks all this time? :D
 

buffelo

Neo Member
With 2 overclocked 780s, a 4770K, 8GB RAM and Hairworks off, foliage distance on high, and SSAO, I'm struggling to hold 60fps at 1440p. Does this sound normal? If so, I'm kind of bummed out at how Kepler cards are handling this game. I could max out everything in GTA V except for a few of the advanced graphics options and keep it almost rock-solid at 60.
 

Dileas

Member
I'm not as plugged in to AMD products as I am Nvidia's. I'd like to know this too.

Most of AMD's GCN cards have less tessellation power compared to their Nvidia equivalents.

However, their recent Tonga cards (R9 285) have shown significant improvements in tessellation, taking a much lower hit in FPS when tessellation features are enabled.
 

Sober

Member
What sort of specs should I target with a 670 GTX and an i7 3770?
Maybe medium or some blend of medium/low. I have a GTX 770 and an i5 4670K, and I threw everything on medium eventually. I'm usually around 45fps or so, which I'm fine with honestly.

The only thing that's really annoying is on the rare occasions when the game goes to cutscenes (either in-game or the pre-recorded ones for loading screens), they stutter and freeze for a good 3-5 seconds before playing properly. Luckily there's no audio/visual desync happening.
 

Zakalwe

Banned
This thread makes my 980 cry.

I have not played the game yet but it looks like I will need to make graphical sacrifices to maintain 60fps and hairworks. IMO hairworks is a must and 60fps is a must.

BTW GTX 980, 16GB RAM, i7 4790k and Windows 8.1. SSD too.

Hairworks is very nice, but the framerate you lose means you don't really see it on anything other than larger creatures like the Griffon unless you're standing still.

I really like the effect, but it drops me to 40-60 instead of stable 60. I might just enable it for the battles with larger beasts.

-

On another note, I increased shadows to high from medium with no cost to fps.

Now my settings are:

i5 2500K @ 4.3GHz (OC causing no issues so far)
MSI 970 at stock
8GB Corsair Vengeance

-

Vsync on
1080
Full screen
Hairworks: off
Background Characters: high
Terrain Quality: ultra
Grass Density: high
Texture Detail: ultra
Foliage Vis Range: high
Detail Level: Ultra

Motion Blur: off
Blur: off
AA: on
Bloom: on
Sharpening: on
HBAO+
DoF: on
CA: on
Vignetting: on
Light Shafts: on

Constant 60fps so far. No stutters at all. Cutscenes can drop a little.

Blurs are off purely for preference. On and I get the same performance.

Said it already but I'm very impressed with this card. Highly recommend it if you're considering the upgrade.
 

viveks86

Member
Most of AMD's GCN cards have less tessellation power compared to their Nvidia equivalents.

However, their recent Tonga cards (R9 285) have shown significant improvements in tessellation, taking a much lower hit in FPS when tessellation features are enabled.

Thanks! Yeah, was just reading about the Tonga improvements.
 

b0bbyJ03

Member
I'm not as plugged in to AMD products as I am Nvidia's. I'd like to know this too.

Me either, which is why I tried not to make any claims, but I do remember always hearing that Nvidia was supposed to have much better tessellation performance on a 780 vs the 290X. If this is true, then you have to wonder what else is keeping the 780 from keeping up.
 

viveks86

Member
Me either, which is why I tried not to make any claims, but I do remember always hearing that Nvidia was supposed to have much better tessellation performance on a 780 vs the 290X. If this is true, then you have to wonder what else is keeping the 780 from keeping up.

Agreed. Seems like it is true.

But if tessellation were the only factor, there'd be no reason it wouldn't work well on the 780 out of the box, right? Unless Nvidia is deliberately sabotaging its own product and making it seem weaker than cheaper alternatives from competitors, which doesn't make sense to me. I can see them pushing new cards, but I don't see them doing it at the cost of giving AMD the upper hand over their older cards.
 

ChawlieTheFair

pip pip cheerio you slags!
I tried the 30FPS Cap just out of curiosity and I just couldn't get used to it. Seemed so juddery. I'm already used to seeing the game in 60FPS I suppose.

If you stick with it you will get used to it again. I had the same problem with AC4 on PC. Couldn't get a solid 60 because of the broken vsync shit, so I played for a good while at 30 and eventually it looked normal rather than gross.
 

dragn

Member
After closing the DS4Windows program and playing with m&kb, the game didn't crash anymore and I could finally change to fullscreen again.
 

Durante

Member
I tried the 30FPS Cap just out of curiosity and I just couldn't get used to it. Seemed so juddery. I'm already used to seeing the game in 60FPS I suppose.
The in-game FPS cap is not as bad as some other implementations, but also not as good as external tools. After some rather extensive testing (I'm writing an article) I suggest using no in-game framecap, no in-game Vsync, borderless fullscreen and a 30 FPS limit enforced using RTSS. Best combination of consistent performance and low latency I have found.

(Also, this game is pretty neat)
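If you'd rather set the RTSS limit by editing its config than clicking through the UI, the limit is stored per profile (on my machine in Profiles\Global inside the RTSS folder; the section and key names here are from memory, so verify against your install):

Code:
[Framerate]
Limit=30

The Global profile applies to everything; you can also make a per-game profile keyed to the game's executable so other games keep running uncapped.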
 

Redmoon

Member
Is the SLI issue still prominent with other users or was it just me?

Would like to use my second Titan X, but not if it only adds an extra 5-10 fps.
Really want to get a good framerate with andy's config at 4K (getting ~40 at 1440p).
 
Is the SLI issue still prominent with other users or was it just me?

Would like to use my second Titan X, but not if it only adds an extra 5-10 fps.
Really want to get a good framerate with andy's config at 4K (getting ~40 at 1440p).

Both my 970s are reporting constant 98-99% usage, but yeah, I only see a slight increase in performance over having just one enabled. I'm not sure what's going on.
 

Aroll

Member
Constant 60fps so far. No stutters at all. Cutscenes can drop a little.

Aren't cutscenes all output at 30FPS? Pretty sure that was the standard for them, which is why many are okay running things maxed out and getting closer to 30, so there isn't a noticeable difference between gameplay and cutscenes.

That being said, I'm in a huge debate right now. I have a gaming laptop with a 970M, i7 4710, 16GB DDR3, and an SSD. I also own an Xbox One. Is the increase in visuals I can expect worth getting the PC version over the Xbox One version? I've seen some videos, but that doesn't help my case. I don't know what I would end up setting things to on my laptop, but if the visual improvements are large enough, I will certainly grab the PC version today. If not, my Xbox One on my big screen is ideal. Anyone know what I can expect with those specs?
 

Leatherface

Member
Zero performance difference (alright, maybe a single FPS or something). SweetFX usually costs little to nothing; all that particular config is doing is adding a bit of blue to the image and lowering the vibrance of the colors. Makes a big difference artistically in how the game looks, but not much else.

Oh, yeah, I'm running a GTX780 by the way.

And here is the config I used:

https://sfx.thelazy.net/users/u/ss-89/

Follow the instructions for manual install.

oh cool. I have the same card. thanks for the info good sir. :)
 

Durante

Member
Aren't cutscenes all output at 30FPS? Pretty sure that was the standard for them, which is why many are okay running things maxed out and getting closer to 30, so there isn't a noticeable difference between gameplay and cutscenes.

That being said, I'm in a huge debate right now. I have a gaming laptop with a 970M, i7 4710, 16GB DDR3, and an SSD. I also own an Xbox One. Is the increase in visuals I can expect worth getting the PC version over the Xbox One version? I've seen some videos, but that doesn't help my case. I don't know what I would end up setting things to on my laptop, but if the visual improvements are large enough, I will certainly grab the PC version today. If not, my Xbox One on my big screen is ideal. Anyone know what I can expect with those specs?
It will certainly do better than the console version.
 
Game is freaking gorgeous. The only complaint would be the AA, which doesn't hold up so well at long distances, though it looks great up close and in dialogue cutscenes.

Also, there appears to be slight micro-stutter in the 30fps cutscenes. Pretty annoying.

Also just had a freeze/lock up in the menu screen, and I just checked and that appears to be a thing a lot of people are getting. I hope this doesn't happen all the time and they fix it soon. I seriously want to play this game. It looks absolutely incredible.
 

Aroll

Member
It will certainly do better than the console version.

But what are the real differences (besides Hairworks, which I have yet to see a great video really showing off the difference it makes)? I know, or am fairly certain, that I can play most of the game on the highest settings, but outside of FPS, is it really that big of a visual boost over the Xbox One version? I can't seem to tell from YT vids.
 

UnrealEck

Member
Most of AMD's GCN cards have less tessellation power compared to their Nvidia equivalents.

However, their recent Tonga cards (R9 285) have shown significant improvements in tessellation, taking a much lower hit in FPS when tessellation features are enabled.

I've seen tessellation mentioned in Kepler vs Maxwell debates relating to Witcher 3 a few times now, but I'm still left wondering: what significant tessellation does Witcher 3 use, if any?
 

b0bbyJ03

Member
Agreed. Seems like it is true.

But if tessellation were the only factor, there'd be no reason it wouldn't work well on the 780 out of the box, right? Unless Nvidia is deliberately sabotaging its own product and making it seem weaker than cheaper alternatives from competitors, which doesn't make sense to me. I can see them pushing new cards, but I don't see them doing it at the cost of giving AMD the upper hand over their older cards.

Well, I personally wouldn't go into conspiracy theory territory... I just think they probably put the focus of their efforts into the 900 series and let a lot of Kepler users down. For AMD, you have to remember the 2xx series is their current set of cards. It would make sense that they're still putting all their efforts into making the drivers the best they can.

Most of AMD's GCN cards have less tessellation power compared to their Nvidia equivalents.

However, their recent Tonga cards (R9 285) have shown significant improvements in tessellation, taking a much lower hit in FPS when tessellation features are enabled.

If this is true, then it rules out tessellation being the issue for the 780.
 