
Witcher 3 PC Performance Thread

Piggus

Member
Anyone else have problems with textures (and especially people's hair) loading in after camera transitions in cutscenes? Is there a way to fix that? I'm running the game on RAID 0 SSDs, so I don't think it's my storage. This has been happening since day one and I'd like to fix it for my second playthrough.
 
Could anyone recommend a good TextureMemoryBudget= value when running the game on a GTX 980 Ti? I tried googling the setting, but couldn't find any reliable information on how VRAM usage scales as that value is increased.

Anyone else have problems with textures (and especially people's hair) loading in after camera transitions in cutscenes? Is there a way to fix that? I'm running the game on RAID 0 SSDs, so I don't think it's my storage. This has been happening since day one and I'd like to fix it for my second playthrough.

It would be great to be able to tweak the game's VRAM usage to use as much as possible, to prevent texture streaming on characters and elsewhere. I have 12 gigs and wouldn't mind if the game used as much as it could (currently 1080p uses only a bit more than a quarter of that at best).
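For anyone wanting to experiment: the streaming budget lives in the [Rendering] section of user.settings (under Documents\The Witcher 3). A rough sketch — the value below is only an example, and the unit (MB vs. bytes) isn't officially documented, so back up the file first and raise it gradually while watching VRAM usage:

```ini
[Rendering]
; Texture streaming budget. Example value only -- the unit is not
; officially documented (values floating around forums suggest MB),
; and safe limits depend on your card's VRAM. Change in small steps.
TextureMemoryBudget=800
```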
 

Wag

Member
So I added my 3rd 980 Ti, mainly for better 4K performance in the game, and now it just keeps crashing to the desktop left and right (and blanking the screen). With 2 cards it was rock solid. I've benchmarked all 3 cards and they seem fine, so that's not it. Is it the game itself that's just unstable? It's pretty much unplayable for me now. 😢
 

Easy_D

never left the stone age
Welp. The latest patch kinda messed with my performance a bit, now averaging 40-45 FPS when in the overworld, used to be 55-60 with the same settings. Unless it's the Battlefront beta driver, AMD 280X here. That said, I managed to run it at 40 FPS in 1440p somehow, looked pretty hot. Almost wondered if it was my FX6300 bottlenecking and that I was just misremembering, but nope, with a Titan X, even the FX6300 can hold 55-60 FPS in Novigrad on Ultra per Digital Foundry's testing.

Maybe I should just roll back to 15.7, those ran Witcher 3 no problem, not surprising a beta driver launched for a beta game would be slightly worse for other games.

It would be great to be able to tweak the game's VRAM usage to use as much as possible, to prevent texture streaming on characters and elsewhere. I have 12 gigs and wouldn't mind if the game used as much as it could (currently 1080p uses only a bit more than a quarter of that at best).

You'd kinda think Ultra texture quality would cache them fast enough to prevent pop-in during cutscenes. The game doesn't top out my 3GB either, so it'd be cool. Witcher 2 had the same problem though, only during cutscenes.
 

comrade

Member
So I added my 3rd 980 Ti, mainly for better 4K performance in the game, and now it just keeps crashing to the desktop left and right (and blanking the screen). With 2 cards it was rock solid. I've benchmarked all 3 cards and they seem fine, so that's not it. Is it the game itself that's just unstable? It's pretty much unplayable for me now. 😢

Do other games work? Are the cards overclocked?
 
Welp. The latest patch kinda messed with my performance a bit, now averaging 40-45 FPS when in the overworld, used to be 55-60 with the same settings. Unless it's the Battlefront beta driver, AMD 280X here. That said, I managed to run it at 40 FPS in 1440p somehow, looked pretty hot. Almost wondered if it was my FX6300 bottlenecking and that I was just misremembering, but nope, with a Titan X, even the FX6300 can hold 55-60 FPS in Novigrad on Ultra per Digital Foundry's testing.

Maybe I should just roll back to 15.7, those ran Witcher 3 no problem, not surprising a beta driver launched for a beta game would be slightly worse for other games.

What settings are you using? I have a 4690k/stock 7950 3gb/oc and I average 40-45 fps with High preset/Ultra texture. Weird because on low I still get similar performance.
 

Wag

Member
Do other games work? Are the cards overclocked?

No, the cards aren't overclocked, at least not when I play The Witcher 3 (not that that seems to make a difference; I get CTDs either way). I even upped the voltage to 60% in Afterburner and that didn't help. I overclocked while benchmarking and it worked fine. Ran Heaven for quite a while.

One of my cards maxed out @ 91C, but it did that when I was running with 2 cards too; that's to be expected, especially in graphics-intensive games/benchmarks.

I just purchased 3DMark, I'll see how Firestrike runs in 4k. I'll also test a few other games. Any suggestions?
 

Grassy

Member
So I added my 3rd 980 Ti, mainly for better 4K performance in the game, and now it just keeps crashing to the desktop left and right (and blanking the screen). With 2 cards it was rock solid. I've benchmarked all 3 cards and they seem fine, so that's not it. Is it the game itself that's just unstable? It's pretty much unplayable for me now. 😢

I can't imagine many devs would devote time to worrying about 3-way SLI compatibility...though it could be driver-related.
 

knitoe

Member
No, the cards aren't overclocked, at least not when I play The Witcher 3 (not that that seems to make a difference; I get CTDs either way). I even upped the voltage to 60% in Afterburner and that didn't help. I overclocked while benchmarking and it worked fine. Ran Heaven for quite a while.

One of my cards maxed out @ 91C, but it did that when I was running with 2 cards too; that's to be expected, especially in graphics-intensive games/benchmarks.

I just purchased 3DMark, I'll see how Firestrike runs in 4k. I'll also test a few other games. Any suggestions?

Which 980 Ti cards do you have? Maybe it's not enough power. What PSU?
 

Wag

Member
Which 980 Ti cards do you have? Maybe it's not enough power. What PSU?

Antec High Current Pro 1300w PSU w/EVGA 980Ti ACX 2.0+ cards. They upgraded me from the High Current Pro 1200w, and that might be the problem there. The 1300w could be shit, although I just ran Firestrike Ultra in a loop for quite a while with all 3 cards overclocked and it ran fine. I don't know.

Then the first time I ran the benchmark for Mordor overclocked it locked up, I had to run at normal clock speed.

I don't want to upgrade the PSU if I don't have to, that would be another $200+ investment in my machine which I don't have right now.
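For what it's worth, the power question is easy to sanity-check with back-of-the-envelope math. A quick sketch — the CPU and "everything else" figures below are assumptions, not measurements; 250 W is the reference 980 Ti TDP, and overclocked cards can transiently pull well past it:

```python
# Rough power-budget estimate for a 3-way 980 Ti rig (not a measurement).
GPU_TDP_W = 250   # reference GTX 980 Ti TDP
CPU_TDP_W = 140   # assumed high-end CPU draw (hypothetical figure)
OTHER_W = 100     # rough allowance for board, RAM, drives, fans

def system_draw_w(num_gpus, spike_factor=1.0):
    """Estimated draw in watts; spike_factor > 1 models OC transients."""
    return num_gpus * GPU_TDP_W * spike_factor + CPU_TDP_W + OTHER_W

print(system_draw_w(3))        # stock: 990 W, ~310 W headroom on 1300 W
print(system_draw_w(3, 1.2))   # with 20% transient spikes: 1140 W
```

So on paper the 1300 W unit should have headroom even with spikes, which points back at drivers, a flaky card, or the factory OC rather than raw wattage — assuming the hypothetical CPU/overhead numbers above are in the right ballpark.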
 

Easy_D

never left the stone age
What settings are you using? I have a 4690k/stock 7950 3gb/oc and I average 40-45 fps with High preset/Ultra texture. Weird because on low I still get similar performance.

The 280X is essentially a 7970 GHz Edition, so I should be pulling more frames. Your CPU is a lot better than mine for sure, but a stock FX6300 shouldn't bottleneck; hell, in the DF video the bottlenecks only show up in Novigrad, where you get dips to the 40s.

Hairworks: Off
Background Characters: High (apparently this setting does nothing?)
Shadow Quality: Medium
Terrain Quality: Ultra (minuscule FPS difference, maybe 1 or 2 frames all in all)
Water Quality: High
Grass Density: Ultra (no difference fps wise between high/ultra)
Texture Quality: Ultra
Foliage Visibility Range: High
Detail Level: Ultra (No difference that I've noticed between High/Ultra setting fps wise)

All post-process effects on except CA, Vignetting, and AO.

With these settings I used to average 55 FPS when in the open world, even in Velen swamps. I'd dip to 45ish if I had HBAO+ on.

What drivers are you currently using? Thinking it might be the Battlefront beta drivers after all.

Edit: If I put it on Low I get around 75ish FPS
 

mosdl

Member
Anyone else have problems with textures (and especially people's hair) loading in after camera transitions in cutscenes? Is there a way to fix that? I'm running the game on RAID 0 SSDs, so I don't think it's my storage. This has been happening since day one and I'd like to fix it for my second playthrough.

W3 has had this issue since launch, and I ran into it a lot with the new DLC. Oddly enough, in the DLC I was also getting some bad load times after dying, even on SSDs.
 

CHC

Member
Wow! I found it =D It's a setting named ShadowDistanceScale, which the tweak guide claims has a "minimal increase in image quality" lol

Wow big difference, that same issue always annoyed me too.

What is the performance impact like? I can't really find any other information regarding that tweak.
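In case anyone else wants to test it, the setting goes in user.settings; a sketch (the key name comes from the tweak-guide quote above, the value is an arbitrary example, and the performance cost is untested):

```ini
[Rendering]
; Scales how far out shadows are drawn; 1 is the default.
; Higher values push shadowed detail further out at some GPU cost.
ShadowDistanceScale=2
```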
 

dottme

Member
I also have a much better time with Dragon Age: Inquisition than with The Witcher 3.

However, Mass Effect is normally much more action/arcade-focused than Dragon Age.
 
The 280X is essentially a 7970 GHz Edition, so I should be pulling more frames. Your CPU is a lot better than mine for sure, but a stock FX6300 shouldn't bottleneck; hell, in the DF video the bottlenecks only show up in Novigrad, where you get dips to the 40s.

Hairworks: Off
Background Characters: High (apparently this setting does nothing?)
Shadow Quality: Medium
Terrain Quality: Ultra (minuscule FPS difference, maybe 1 or 2 frames all in all)
Water Quality: High
Grass Density: Ultra (no difference fps wise between high/ultra)
Texture Quality: Ultra
Foliage Visibility Range: High
Detail Level: Ultra (No difference that I've noticed between High/Ultra setting fps wise)

All post-process effects on except CA, Vignetting, and AO.

With these settings I used to average 55 FPS when in the open world, even in Velen swamps. I'd dip to 45ish if I had HBAO+ on.

What drivers are you currently using? Thinking it might be the Battlefront beta drivers after all.

Edit: If I put it on Low I get around 75ish FPS

I am on the 15.7 drivers. Ah right, I guess that GPU difference is important here. I still find it strange, though, that even Low doesn't raise FPS by much.

Will try your settings
 

knitoe

Member
Antec High Current Pro 1300w PSU w/EVGA 980Ti ACX 2.0+ cards. They upgraded me from the High Current Pro 1200w, and that might be the problem there. The 1300w could be shit, although I just ran Firestrike Ultra in a loop for quite a while with all 3 cards overclocked and it ran fine. I don't know.

Then the first time I ran the benchmark for Mordor overclocked it locked up, I had to run at normal clock speed.

I don't want to upgrade the PSU if I don't have to, that would be another $200+ investment in my machine which I don't have right now.

The PSU has good reviews, so it's probably not the problem unless it's faulty. When you say you ran the video cards at "stock speeds," do you mean factory OC speeds or the normal default speeds for that GPU? If it's the former, try downclocking to the normal default 980 Ti speeds and retest. It's not unusual to run into an unstable factory OC.
 

Piggus

Member
Welp. The latest patch kinda messed with my performance a bit, now averaging 40-45 FPS when in the overworld, used to be 55-60 with the same settings. Unless it's the Battlefront beta driver, AMD 280X here. That said, I managed to run it at 40 FPS in 1440p somehow, looked pretty hot. Almost wondered if it was my FX6300 bottlenecking and that I was just misremembering, but nope, with a Titan X, even the FX6300 can hold 55-60 FPS in Novigrad on Ultra per Digital Foundry's testing.

Maybe I should just roll back to 15.7, those ran Witcher 3 no problem, not surprising a beta driver launched for a beta game would be slightly worse for other games.



You'd kinda think Ultra texture quality would cache them fast enough to prevent pop-in during cutscenes. The game doesn't top out my 3GB either, so it'd be cool. Witcher 2 had the same problem though, only during cutscenes.

I've noticed that more areas in the game drop my framerate below 60 than before patch 1.10, whereas before there were only one or two areas where I could expect to dip under 60. And when I go back to those same areas later, my framerate is sometimes fine. It's very random. I'll need to check my GPU usage to see if SLI scaling isn't working properly.

The hair and beard pop-in during cutscenes is NOT a texture streaming issue. It's actually an LoD issue, and there is a mod on the Nexus which fixes it already.

Thank you! I'll try this when I get home. :D
 

dr_rus

Member
Started playing Hearts of Stone and immediately noticed that the latest patches (1.10 or 1.11) made the default gamma almost unbearable on a wide-colour-gamut display. The sky has heavy banding, which now seems to stack on top of screen-space effects, colors are washed out, and some stuff flickers on faces in cutscenes when I'm pretty sure it shouldn't.

I mean seriously what the fuck is this shit? Do they even play the game which they're building?

witcher3_2015_10_26_24zsws.jpg


Here's a photo of how bad it looks, for those who have normal-gamut displays or color management set up in their browser:

2015-10-2702.59.43_re9dsjl.jpg


Some info on topic: http://forums.cdprojektred.com/threads/46806-The-Witcher-3-and-Color-Calibration

Note that it wasn't this bad on 1.08, as I went through the whole game without noticing anything like this.
 

Easy_D

never left the stone age
Okay, so my frames are a combination of CDProjekt messing something up and the AMD 15.10 (beta) drivers being trash.

Edit: Nope, my CPU was throttling apparently, same settings posted as above, now it hovers around 50-60 instead. I replaced my thermal paste and now everything's gravy. False alarm, game runs as great as ever.
 

AlanOC91

Member
Hey guys. It took me ages to narrow down this problem, but whenever I play The Witcher 3 and move Geralt, wall/ground textures flicker subtly. It's noticeable but not completely in-your-face. I've tried disabling all sorts of graphical options one at a time and changing stuff in the NVIDIA control panel, but none of it seemed to fix it.
Then I decided to lower the resolution to 1440p and it went away. Same with 1080p.
It only shows up at 4K, and The Witcher 3 is the only game that does it. Any ideas? This didn't occur with my CrossFire R9 290s. I had some slight CrossFire flickering with them, but it was different.
 

Kezen

Banned
Hey guys. It took me ages to narrow down this problem, but whenever I play The Witcher 3 and move Geralt, wall/ground textures flicker subtly. It's noticeable but not completely in-your-face. I've tried disabling all sorts of graphical options one at a time and changing stuff in the NVIDIA control panel, but none of it seemed to fix it.
Then I decided to lower the resolution to 1440p and it went away. Same with 1080p.
It only shows up at 4K, and The Witcher 3 is the only game that does it. Any ideas? This didn't occur with my CrossFire R9 290s. I had some slight CrossFire flickering with them, but it was different.

Driver bug. Update your driver or wait for AMD to fix it.
 

Kezen

Banned
Sorry, I have an NVIDIA 980 Ti at the moment! And it has persisted through two driver updates. There's another one out today that I may try, but I heard people are having issues with it.

Oh, sorry, I saw many AMD owners with the exact same issue, so I assumed it was AMD-specific.
My mistake.

I have no idea what could cause this aside from drivers.
 

AlanOC91

Member
Oh, sorry, I saw many AMD owners with the exact same issue, so I assumed it was AMD-specific.
My mistake.

I have no idea what could cause this aside from drivers.

No worries! I'll try to figure it out. It's driving me nuts and really breaking some of the immersion.

It doesn't make sense that it only happens at 4K, and only in one game, either...
 
If I use Borderless Window mode, do I need to turn vsync off or still keep it on?

I know Borderless Window enables Triple Buffering, but I'm not sure if it does the same thing as vsync too.
 
If I use Borderless Window mode, do I need to turn vsync off or still keep it on?

I know Borderless Window enables Triple Buffering, but I'm not sure if it does the same thing as vsync too.

You can turn in-game VSync off.

However, in TW3 some users have reported that doing so significantly reduces the game's performance and causes stutter. The same thing happens to me.

What I did is keep in-game VSync enabled and use borderless window. No more frame drops and stutter.
 
You can turn in-game VSync off.

However, in TW3 some users have reported that doing so significantly reduces the game's performance and causes stutter. The same thing happens to me.

What I did is keep in-game VSync enabled and use borderless window. No more frame drops and stutter.

I see. Thank you. I'll keep VSync on then.
 

MrOogieBoogie

BioShock Infinite is like playing some homeless guy's vivid imagination
I just got this game and was wondering if it's running as well as it can on my system.

i3-2100
GTX 960 2GB
8GB RAM

VSync: Off
Maximum Frames Per Second: Unlimited
Resolution: 1680x1050
Display Mode: Full Screen
NVIDIA HairWorks: Off
Number of Background Characters: Ultra
Shadow Quality: Medium
Terrain Quality: Ultra
Water Quality: High
Grass Density: Ultra
Texture Quality: Ultra
Foliage Visibility Range: High
Detail Level: Ultra
Hardware Cursor: On

Motion Blur: On
Blur: On
Anti-aliasing: On
Bloom: On
Sharpening: Low
Ambient Occlusion: HBAO+
Depth of Field: On
Chromatic Aberration: Off
Vignetting: On
Light Shafts: On

Averaging 50-60fps, although I've played only a tiny portion of the beginning. Getting quite a bit of tearing during cutscenes.

What's everyone's opinion of vignetting? Should I turn it on or off? What other settings should I adjust for optimal performance?
 
What's everyone's opinion of vignetting? Should I turn it on or off? What other settings should I adjust for optimal performance?

This is an effect that darkens the corners of the screen. Check here for a visual comparison: http://www.geforce.co.uk/whats-new/...eaking-guide#the-witcher-3-wild-hunt-vignette

If you don't like the effect or you don't mind turning it off, then do so.

In fact, this guide gives you more tips on which settings are more demanding and what they do. It's certainly helped me a lot.

Not sure about Terrain Quality, though.
 
Guys, any idea what the difference between "hairworks high" and "hairworks low" is?

I have yet to really examine it up close. The MSAA samples for HairWorks, and the option to apply it to Geralt or everything, are obvious though.
 
Guys, any idea what the difference between "hairworks high" and "hairworks low" is?

I have yet to really examine it up close. The MSAA samples for HairWorks, and the option to apply it to Geralt or everything, are obvious though.

Hairworks Low makes the hair less smooth and more segmented and also reduces the thickness of the hair.

High vs Low
hairworks_high_front_by_realghostvids-d926boa.jpg
hairworks_low_front_by_realghostvids-d926bnq.jpg


High vs Low
hairworks_high_side_by_realghostvids-d926bo1.jpg
hairworks_low_side_by_realghostvids-d926bng.jpg
 
Hairworks Low makes the hair less smooth and more segmented and also reduces the thickness of the hair.

High vs Low
hairworks_high_front_by_realghostvids-d926boa.jpg
hairworks_low_front_by_realghostvids-d926bnq.jpg


High vs Low
hairworks_high_side_by_realghostvids-d926bo1.jpg
hairworks_low_side_by_realghostvids-d926bng.jpg

Interesting! Great screens, by the way; I actually found it hard to capture Geralt standing still.

Seems to use fewer hair strands as well as being less tessellated.

The AMD-friendly option, I assume!
 
How's Witcher 3 on Windows 10? Anyone have any experiences upgrading and comparing?

Depending upon where you're coming from, Win 10 should improve CPU-related performance in all DX11 games, as far as I understand it.

I saw no performance regression upgrading to Win 10.
 

Exentryk

Member
Depending upon where you're coming from, Win 10 should improve CPU-related performance in all DX11 games, as far as I understand it.

I saw no performance regression upgrading to Win 10.

Good to know, thanks. I am coming from 8.1. Did you have to reinstall the game after upgrading?
 

Exentryk

Member
I believe not, but I did it anyway and just made sure my saves were safe (a good idea).

Cool thanks.

EDIT - Upgraded to Windows 10. Very smooth process, and all the files are in the same place as they were in 8.1. Had no issues at all. Witcher 3 still running as well as it was running in 8.1.
 

The-Bean

Member
I've decided to play through to completion after playing about 40 hours just after launch. I can't remember much of what was happening so I've decided to start a new game. Does anyone know if it's possible to re-bind quicksave to a gamepad button? It gets annoying reaching over to my keyboard to press F5 all the time, I'd like to bind it to the 'guide' button on the 360 gamepad. The in-game menu doesn't let me but is it possible in a config or does anyone know of some external program that allows you to bind keys to the guide button?

EDIT: Got a program called antimicro which lets you bind keys to gamepad buttons. The issue I'm having now is that the guide button doesn't appear to be working. =/ Completely unrecognized by Windows (10). Using a wireless 360 pad, anyone have any ideas?

EDIT 2: Got it working with help from this post on Reddit:

Same problem; the new xbox integration thing is what did it for me.
  1. Open your browser, hit Windows key+G
  2. A small window should appear, check "yes, this is a game"
  3. Open the settings button at the right end of the window
  4. Uncheck "open game bar using X on a controller" (to remove guide button conflicts) and uncheck "remember this as a game" (so your browser isn't considered a game should you want to keep using this new feature)
Hopefully that should resolve the problem.
 
I've decided to play through to completion after playing about 40 hours just after launch. I can't remember much of what was happening so I've decided to start a new game. Does anyone know if it's possible to re-bind quicksave to a gamepad button? It gets annoying reaching over to my keyboard to press F5 all the time, I'd like to bind it to the 'guide' button on the 360 gamepad. The in-game menu doesn't let me but is it possible in a config or does anyone know of some external program that allows you to bind keys to the guide button?

EDIT: Got a program called antimicro which lets you bind keys to gamepad buttons. The issue I'm having now is that the guide button doesn't appear to be working. =/ Completely unrecognized by Windows (10). Using a wireless 360 pad, anyone have any ideas?

EDIT 2: Got it working with help from this post on Reddit:

Was gonna say, AntiMicro is your friend. Only key I ever bind is F5.
 