
Witcher 3 PC Performance Thread

Vamphuntr

Member
I've only had 2 crashes so far in 30 hours of gameplay. They're pretty weird, as it seems the game simply shuts itself down: it closes cleanly, I don't have to use the task manager to kill it, and I can relaunch it from Steam right after.
 

Qassim

Member
I've just tested that Kepler fix; it seems to work and, from what I can tell, doesn't actually change anything in-game. I think it may just fall back to running on the CPU when it can't run on the GPU? Perhaps there's just a bug with the library on Kepler GPUs for now.

I'm pretty sure the clothing includes things like nets, sheets, flags waving in the wind, etc, but I could be wrong on that.
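
If anyone wants to try the rename version of that fix without poking around manually, here's a rough sketch. The install path is just a guess on my part, so point it at wherever APEX_ClothingGPU_x64.dll actually lives in your copy, and keep the backup:

```python
import os
import shutil

# Minimal sketch of the rename "fix" being discussed: move the GPU cloth library
# aside so the game can't load it. GAME_BIN is an assumed path -- adjust it to
# your own install before running.
GAME_BIN = r"C:\Games\The Witcher 3\bin\x64"
DLL = "APEX_ClothingGPU_x64.dll"

src = os.path.join(GAME_BIN, DLL)
if os.path.exists(src):
    shutil.move(src, src + ".bak")  # rename rather than delete, so it's easy to restore
    print(f"Renamed {DLL}; remove the .bak suffix to undo")
else:
    print(f"{DLL} not found in {GAME_BIN}; check the path")
```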
 

Durante

Member
This is interesting if actually true.

If the game doesn't crash when you simply delete the DLL, then this means one of two things:
  1. It didn't actually use it, in which case it also can't have a performance impact.
  2. It uses the DLL in a system folder (installed by a driver) instead.

In case 2, it could be that the system DLL is a different version which performs better in TW3.

Still strange that it would make such a big difference; my first instinct with something like this is always user error ;)
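
If anyone wants to sanity check case 2 concretely, a quick sketch: hash the game's copy of the DLL and compare it against whatever copy is sitting in a system or driver folder. Both paths below are guesses on my part, so adjust them to your own install and driver layout.

```python
import hashlib
import os

# Rough check for case 2: is there a driver-installed copy of the DLL elsewhere,
# and does it differ from the one shipped in the game folder? Both paths below
# are assumptions -- adjust them to your own install and driver layout.
GAME_COPY = r"C:\Games\The Witcher 3\bin\x64\APEX_ClothingGPU_x64.dll"
SYSTEM_DIRS = [
    r"C:\Windows\System32",
    r"C:\Program Files (x86)\NVIDIA Corporation\PhysX\Common",
]

def digest(path):
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

game_hash = digest(GAME_COPY)
for d in SYSTEM_DIRS:
    candidate = os.path.join(d, os.path.basename(GAME_COPY))
    if os.path.exists(candidate):
        same = digest(candidate) == game_hash
        print(candidate, "matches the game's copy" if same else "is a DIFFERENT build")
```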
 
Thinking about it a bit and after a little research, I'm wondering if inadequate power may be the issue. I bought a 970 about a month ago to replace a 770, and put that 770 into my old secondary PC to replace a 560 Ti. Well, the old machine only has a 500W power supply. I didn't even think about that. Before I replace the PSU, does anyone think I'm barking up the right tree in eyeing that as the culprit?

For reference, here was my last post:

After some crashes the first couple of days, the game has run like a dream on my main PC. I don't know if it was switching to unlimited frame rate, full-screen instead of borderless windowed, or the game updates, but I haven't had a crash in a week.

...Until I tried running it on my secondary PC. Not that it's a big deal or anything, as there's always the main PC to go back to, but sometimes the missus and I like to just kick back and play a game in the bedroom. Steam In-Home Streaming is an option for this, but despite a wired connection, I'm not overly enamored with the image quality. So then I tried just playing on the machine itself:

1.) AMD Phenom II X4 940
2.) 8 GB Ram
3.) GeForce GTX 770

Honestly, it runs well enough on low/medium settings, and my wife was able to play for about an hour the other night without incident. Since then, however, I can't seem to play more than 10 minutes without a crash that necessitates closing the game from task manager. I know the card has a factory overclock. I tried lowering the clock speed through Afterburner, but maybe I'm just not enough of a power user to know what I'm doing, as I never mess with overclock settings.

Anybody been able to fix constant crashing? Like I said, I'm not all that concerned about it as the game runs great on my main machine. But I just liked the flexibility of being able to play in different rooms without having to deal with In-Home Streaming.
 

Sevenfold

Member
Cross posting for more reads...
Help. My sprint key has stopped working on Roach. Tried rebinding/swapping to toggle. Reached a part of the game where I need my horse and I'm screwed. Won't even canter!
On PC, fwiw. Anyone seen this bug?
Any help greatly appreciated. Geralt is fine. Just horses.
 
Well, I'll be a dumbass. Returned to the last official AMD drivers (Dec, I think) and the fucked-up meshes are gone. Drivers would usually be one of the first things I'd think of, but I haven't changed anything since I've been playing the game. Strange.
 

tioslash

Member
This is interesting if actually true.

If the game doesn't crash when you simply delete the DLL, then this means one of two things:
  1. It didn't actually use it, in which case it also can't have a performance impact.
  2. It uses the DLL in a system folder (installed by a driver) instead.

In case 2, it could be that the system DLL is a different version which performs better in TW3.

Still strange that it would make such a big difference; my first instinct with something like this is always user error ;)

Certainly interesting. There were several areas in the game where my fps would inexplicably drop (including the one shown in that person's video) from 30 to 25-20 and even less.

And there wasn't anything particularly GPU-intensive going on in these scenes to justify the drop (like just passing by a single hut in a certain place, or entering an area with 5 NPCs and seeing a huge drop), while another huge area, with tons of NPCs and houses, kept a solid FPS.

After deleting this file, the fps stays locked at 30 in all those areas.
 

theeraser

Member
Thinking about it a bit and after a little research, I'm wondering if inadequate power may be the issue. I bought a 970 about a month ago to replace a 770, and put that 770 into my old secondary PC to replace a 560 Ti. Well, the old machine only has a 500W power supply. I didn't even think about that. Before I replace the PSU, does anyone think I'm barking up the right tree in eyeing that as the culprit?

For reference, here was my last post:
Until recently, I was powering a 980 build with a 500W PSU and had no problems whatsoever. I upgraded about two days ago because I just wanted to be safe, and give myself the necessary room for overclocking purposes down the line, but I played TW3 pretty extensively before doing so and didn't have any stability issues.

So, I'd say it's worth upgrading, but I'm not so sure you'll see improvements.
 

Qassim

Member
Screenshots:

APEX_ClothingGPU_x64 - OFF (renamed):


APEX_ClothingGPU_x64 - ON:


Pretty much the same spot; there's a slight change in camera angle, but from testing (after noticing the screenshots were slightly different), the camera angle doesn't seem to be the reason for the difference.
 

s_mirage

Member
Might be anecdotal, but any time I have Afterburner open, even at stock OC (EVGA's OC), the game will crash. When I close Afterburner the game doesn't crash (outside of a memory leak once). I'm using an EVGA 970, latest drivers, everything on ultra except shadows and foliage (both on high), hair off, full screen, frames unlimited. No ini tweaks.

My 3570K is OC'd to 4.2GHz, for what that's worth. 8GB RAM.

It's not just you; I have the same problem if Precision X is running. Freeze + driver crash.
 
Screenshots:

APEX_ClothingGPU_x64 - OFF (renamed):



APEX_ClothingGPU_x64 - ON:



Pretty much the same spot; there's a slight change in camera angle, but from testing (after noticing the screenshots were slightly different), the camera angle doesn't seem to be the reason for the difference.

Nice, can't wait to try this when I get home.
 
Until recently, I was powering a 980 build with a 500W PSU and had no problems whatsoever. I upgraded about two days ago because I just wanted to be safe, and give myself the necessary room for overclocking purposes down the line, but I played TW3 pretty extensively before doing so and didn't have any stability issues.

So, I'd say it's worth upgrading, but I'm not so sure you'll see improvements.

Yeah, but -- and I'll note in advance that there are people here who know a hell of a lot more than I do about PC hardware -- the specifications for the 970 and 980 only call for a 500W PSU. Meanwhile, the 770 specification recommends a 600W power supply. Are they full of it? I don't know. Like I said, I'm not really in my comfort zone here. All I know is that I didn't even think about the PSUs when swapping cards around, and I'm hopeful that the fact that my current PSU is simultaneously old (got it six years ago) and 100W below the recommended spec might explain crashes every 5 minutes.

http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-980/specifications
http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-770/specifications
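
For what it's worth, a rough back-of-envelope budget using approximate TDP numbers (and a guessed 80% derating for an old unit, so take it with a grain of salt):

```python
# Rough power budget for the secondary PC, using approximate TDPs rather than
# measured draw. The 0.80 derating for a six-year-old PSU is purely a guess.
gpu_tdp   = 230   # GTX 770 (W, approx.)
cpu_tdp   = 125   # Phenom II X4 940 (W, approx.)
rest      = 75    # motherboard, RAM, drives, fans (rough allowance)
psu_rated = 500   # W

peak_draw = gpu_tdp + cpu_tdp + rest          # ~430 W
usable    = psu_rated * 0.80                  # ~400 W from an aged unit
print(f"estimated peak draw ~{peak_draw} W vs ~{usable:.0f} W comfortably available")
```

That's close enough to the wire that load spikes under a heavy game could plausibly explain the crashes, which lines up with the hunch above.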
 

Google

Member
I'm having some weird cross-stitch effect on character models while playing this. It happened with The Witcher 2 also.

During gameplay, when watching a conversation play out, Geralt and co. will often have a weird blocky texture look on certain parts of their faces.

I'm playing on Ultra at 1080p.

Is it a monitor or resolution thing?
 

PnCIa

Member
My game also has the misplaced grass and floating trees bug. I wonder if it has something to do with config tweaking.
 

Qassim

Member
Okay, unless I'm being stupid, I think it has to do with GPU PhysX. I reverted the file change back to default (so the game loads it) and then went to the NVIDIA Control Panel and set PhysX processing to CPU.

Screenshot (PhysX - CPU) 71fps:


Screenshot (PhysX - Automatic (auto-selected my 2nd 780)) 62fps:


The fps counter on the second is a bit hard to see, especially with imgur compression, but I've double-checked: it's 62fps. Keep in mind, I'm running SLI also, so I'm not sure if that'll change anything in relation to your testing - but the guy in the video is running a single card.
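
For reference, that works out to roughly a 14-15% gain in that spot:

```python
# 62 fps with PhysX on the GPU vs 71 fps forced to CPU, same location
print(f"{(71 - 62) / 62:.1%}")  # ~14.5% faster
```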
 

spuckthew

Member
The fact that these OC crashes are happening consistently across a wide range of cards and varying OC levels suggests that there is something in the game causing crashes to be way more likely than normal.

Dude, it's probably the driver. I rolled back and haven't crashed since. Do the same and forget about it until Nvidia pull their finger out.


EDIT: Didn't read any of the stuff about that dll :O
 
My game also has the misplaced grass and floating trees bug. I wonder if it has something to do with config tweaking.

The only tweak I did was the one to remove the KB/gamepad message, but when I reinstalled everything, I tested with zero tweaks. It was corrupt drivers after all... ah well, I'm happy I saved $300+.
 

Derp

Member
Okay, unless I'm being stupid, I think it has to do with GPU PhysX. I reverted the file change back to default (so the game loads it) and then went to the NVIDIA Control Panel and set PhysX processing to CPU.

Screenshot (PhysX - CPU) 71fps:



Screenshot (PhysX - Automatic (auto-selected my 2nd 780)) 62fps:



The fps counter on the second is a bit hard to see, especially with imgur compression, but I've double-checked: it's 62fps. Keep in mind, I'm running SLI also, so I'm not sure if that'll change anything in relation to your testing - but the guy in the video is running a single card.
That would explain why the problem is quite area-specific. Like, where there's lots of... PhysX... Stuffs...

Right...?
 

tioslash

Member
Okay, unless I'm being stupid, I think it has to do with GPU PhysX. I reverted the file change back to default (so the game loads it) and then went to the NVIDIA Control Panel and set PhysX processing to CPU.


The fps counter on the second is a bit hard to see, especially with imgur compression, but I've double-checked: it's 62fps. Keep in mind, I'm running SLI also, so I'm not sure if that'll change anything in relation to your testing - but the guy in the video is running a single card.


Yeah, I can confirm that changing the PhysX processing setting to CPU in the NVIDIA Control Panel has the exact same effect as deleting that file: a 10-15% increase in FPS in all the specific areas where I was getting drops.

I'm running a single 760.
 

b0bbyJ03

Member
Yeah, I can confirm that changing the PhysX processing setting to CPU in the NVIDIA Control Panel has the exact same effect as deleting that file: a 10-15% increase in FPS in all the specific areas where I was getting drops.

I'm running a single 760.

If all this is true, then it may explain why the AMD counterparts were performing better: they just default PhysX to the CPU. I wonder if this affects performance on Maxwell cards. Has anyone tried?
 
Okay, unless I'm being stupid, I think it has to do with GPU PhysX. I reverted the file change back to default (so the game loads it) and then went to the NVIDIA Control Panel and set PhysX processing to CPU.

Screenshot (PhysX - CPU) 71fps:



Screenshot (PhysX - Automatic (auto-selected my 2nd 780)) 62fps:



The fps counter on the second is a bit hard to see, especially with imgur compression, but I've double-checked: it's 62fps. Keep in mind, I'm running SLI also, so I'm not sure if that'll change anything in relation to your testing - but the guy in the video is running a single card.

Hrm.. does that mean the game DOES use GPU accelerated PhysX?

The PhysX identifier says it is running on the CPU though...
If all this is true, then it may explain why the AMD counterparts were performing better: they just default PhysX to the CPU. I wonder if this affects performance on non-Maxwell cards. Has anyone tried?

You mean on Maxwell cards?

People already tested Kepler.
 

jorimt

Member
Okay, unless I'm being stupid, I think it has to do with GPU PhysX. I reverted the file change back to default (so the game loads it) and then went to the NVIDIA Control Panel and set PhysX processing to CPU.

Yup, before seeing your post, I thought of the exact same thing when I saw the original "fix." Earlier, I did the same test (running a GTX 770 4GB), first with the file deleted, the second time with PhysX set to "CPU."

Seems like both do the same thing, and only in certain areas; many areas, for me, show little to no improvement with either fix.

It looks as if the performance drop in affected areas is due to PhysX processes intended to run solely on the CPU being offloaded to the GPU instead. Strange, seeing as it's been confirmed that all PhysX in The Witcher 3 runs on the CPU.
 

spuckthew

Member
I was having trouble with my CPU overclock a few weeks ago, but I haven't BSOD'd for a while and not once in W3. Hopefully offloading PhysX to my CPU won't reveal another instability :S
 

Qassim

Member
Hrm.. does that mean the game DOES use GPU accelerated PhysX?

The PhysX identifier says it is running on the CPU though...

Well, APEX clothing can run on either the CPU or GPU, and the name of this file is 'Apex_ClothingGPU', presumably meaning they're running it on the GPU when available.

If you look at my GPU usage in the screenshots: when PhysX is set to the CPU, both GPUs are running at 98%. When I set it back to the GPU, the second GPU (which is the selected PhysX processor) is at 98% while the first GPU is now at 84%, presumably bottlenecked by the additional workload on the second GPU?
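
If anyone without Afterburner wants to watch the same thing, nvidia-smi (it ships with the driver) can poll per-card utilization; quick sketch:

```python
import subprocess
import time

# Poll nvidia-smi once a second and print utilization for every card, e.g. to see
# whether the GPU assigned to PhysX actually picks up load in the problem areas.
QUERY = ["nvidia-smi", "--query-gpu=index,name,utilization.gpu", "--format=csv,noheader"]

while True:
    out = subprocess.run(QUERY, capture_output=True, text=True).stdout.strip()
    print(out)
    print("-" * 40)
    time.sleep(1)
```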
 

Derp

Member
Yup, before seeing your post, I thought of the exact same thing when I saw the original "fix." Earlier, I did the same test (running a GTX 770 4GB), first with the file deleted, the second time with PhysX set to "CPU."

Seems like both do the same thing, and only in certain areas; many areas, for me, show little to no improvement with either fix.

It looks as if the performance drop in affected areas is due to PhysX processes intended to run solely on the CPU being offloaded to the GPU instead. Strange, seeing as it's been confirmed that all PhysX in The Witcher 3 runs on the CPU.
Has anyone tried with the file deleted AND PhysX set to CPU to see if there's a bigger improvement? Or do they just do the same thing?
 

CHC

Member
So whether you force CPU PhysX or delete the file, they achieve the same thing, right? Definitely going to try on Saturday once I'm home.
 
Would I double the fps if I bought another gtx 680?

Another 680 is not a good investment in any way at the moment due to the VRAM limitations on that card.

Put the money towards a proper GPU upgrade.

There are 4GB versions of the 680... he didn't state which he had.

I can't speak for 2 680s, but I have 3, and at 1080p with mostly ultra settings (foliage distance on high), I get 80-90 FPS in the most graphics-intensive area I've discovered so far. It would probably be higher if I offloaded PhysX to my CPU, which I'll be trying when I get home! So yeah, the scaling seems pretty great. Not sure about double, but I'd wager at least an 80% increase.
 

Serick

Married Member
Does anyone remember what post (or thread even) had the screenshot comparison of Ultra vs. Low shadow settings?

I can't find it :(
 

Leatherface

Member
Wow, thanks for this. I gave up on SweetFX after trying some popular ones, but this one seems to be EXACTLY how I wanted things to look.

edit: I did turn off the bloom effect tho, not a fan of SweetFX bloom

Nice. I figured some of you guys would like this one. My favorite by far. :)
 

[Asmodean]

Member
I'm obviously not 100% on this, but regarding people deleting/renaming the GPU PhysX library:

I'd imagine it's because, if the game doesn't detect the GPU lib at runtime, it reverts to the 'more simple' CPU physics they use on consoles.

Hence the increase in performance. Good tweak, though, if it has no major side effects.
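
Something like this, conceptually -- purely an illustration of that kind of runtime fallback, not the game's actual loader, and the path is just a placeholder:

```python
import ctypes
import os

# Illustration only: a "try the GPU library, otherwise fall back to CPU" loader,
# which is roughly the behaviour being described. Not TW3's actual code.
GPU_DLL = "APEX_ClothingGPU_x64.dll"

def pick_clothing_backend(game_bin):
    gpu_path = os.path.join(game_bin, GPU_DLL)
    try:
        lib = ctypes.WinDLL(gpu_path)   # DLL present and loadable -> GPU cloth path
        return lib, "gpu"
    except OSError:
        return None, "cpu"              # renamed/missing -> simpler CPU cloth path

_, backend = pick_clothing_backend(r"C:\Games\The Witcher 3\bin\x64")  # assumed path
print("clothing backend:", backend)
```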
 

mtlam

Member
Has anyone tried with the file deleted AND PhysX set to CPU to see if there's a bigger improvement? Or do they just do the same thing?

Yeah, tried it as well, no dice, but at least I'm able to keep a consistent 30fps across the board. Using a 660 Ti here, though.
 

mintylurb

Member
Hrm.. does that mean the game DOES use GPU accelerated PhysX?

The PhysX identifier says it is running on the CPU though...


You mean on Maxwell cards?

People already tested kepler.

I have a dedicated GPU as a PhysX card (GTX 670) and its usage is 0 in Witcher 3 when I check MSI Afterburner.
 
This is interesting if actually true.

If the game doesn't crash when you simply delete the DLL, then this means one of two things:
  1. It didn't actually use it, in which case it also can't have a performance impact.
  2. It uses the DLL in a system folder (installed by a driver) instead.

In case 2, it could be that the system DLL is a different version which performs better in TW3.

Still strange that it would make such a big difference; my first instinct with something like this is always user error ;)

What do you think it is in this case?
 

Lingitiz

Member
I've just tested that Kepler fix; it seems to work and, from what I can tell, doesn't actually change anything in-game. I think it may just fall back to running on the CPU when it can't run on the GPU? Perhaps there's just a bug with the library on Kepler GPUs for now.

I'm pretty sure the clothing includes things like nets, sheets, flags waving in the wind, etc, but I could be wrong on that.

IIRC PhysX includes fog, right? Any time fog is introduced in a scene my FPS goes to total shit, yet when I've got tons of enemies on screen it holds solid.
 

Qassim

Member
I have a dedicated GPU as a PhysX card (GTX 670) and its usage is 0 in Witcher 3 when I check MSI Afterburner.

Weird, I just dedicated one of my 780s to PhysX and I get 14-16% usage on it.

IIRC PhysX includes fog, right? Any time fog is introduced in a scene my FPS goes to total shit, yet when I've got tons of enemies on screen it holds solid.

There are PhysX libraries for stuff like volumetric fog, but I don't think this game uses them.
 