
The Witcher 2: Performance Thread [Enhanced Edition Patch - New content & 100+ fixes]

Easy_D

never left the stone age
3chopl0x said:
Yeah, at sub HD resolutions and running on graphics close to medium/low
I just tried it on 1024x768 on low, my results? Runs like shit. 360 ports? Run perfect and look way better than their console counterparts, so yeah.

If only they had warned me that the game runs like shit on dual cores I could have waited and saved some cash ): . Oh well, at least I got it for half the price.
 

Echoplx

Member
Corky said:
hey, as long as it doesn't affect the PC version (which it obviously won't, since the PC version is already out of the gate), then I'm all for console releases so more people can play it and more money goes to the devs.

Yeah I agree, but saying this game should run better because it may get a console release in the future is just stupid.
 

Minsc

Gold Member
cleveridea said:
someone should force developers to use and test on dual-core CPUs and GTX 260s

This game is coming out for consoles at some point, FFS; there's no reason for performance to be this bad on people's hardware.

Having said that, I am hoping this is GOTY. I am installing it to my SSD, the only game to have that honor. Don't let me down, W2!

Consoles are probably part of the reason it doesn't run well on dual cores... they're either three cores or eight. I think they said they optimized for three cores because of the 360.
 

jett

D-Member
Easy_D said:
I just tried it on 1024x768 on low, my results? Runs like shit. 360 ports? Run perfect and look way better than their console counterparts, so yeah.

If only they had warned me that the game runs like shit on dual cores I could have waited and saved some cash ): . Oh well, at least I got it for half the price.

When even CD Projekt's own videos ran like butt, I figured there were gonna be some issues.
 

MrOogieBoogie

BioShock Infinite is like playing some homeless guy's vivid imagination
Corky said:
big juicy OC?

Yeah, I'm not gonna tamper with that.

Problem is, I can't simply upgrade to a new CPU. I'm using an old Dell Inspiron 531 with its stock motherboard, and that 5600+ is the fastest CPU compatible with said motherboard. Sad, I know, but it is what it is. I haven't encountered a game that's kicked my PC's ass as much as this one has, though. I guess it's a wake-up call if anything.
 

MrOogieBoogie

BioShock Infinite is like playing some homeless guy's vivid imagination
jett said:
When even CD Projekt's own videos ran like butt, I figured there were gonna be some issues.

Is it possible to patch the game to run better on dual cores? Probably a dumb question, but I honestly have no clue.
 

Truant

Member
I'm 99% sure that a lot of the issues are driver related, seeing as the game runs the same on 2 GPUs as it does on 1. The nVidia drivers can't seem to distribute the load correctly, and both GPUs run at only half their capacity, roughly equal to a maxed out single GPU.
 

Corky

Nine out of ten orphans can't tell the difference.
Hmm, I'm surprised at the SLI woes people are having. Why did y'all go mess with beta drivers? I'm on the latest WHQL drivers (the ones that came out 2-3 weeks ago) and both my GPUs are @ 100% when uber is on. I'm guessing that means it's as good as it gets, basically?
 

Truant

Member
Corky said:
Hmm, I'm surprised at the SLI woes people are having. Why did y'all go mess with beta drivers? I'm on the latest WHQL drivers (the ones that came out 2-3 weeks ago) and both my GPUs are @ 100% when uber is on. I'm guessing that means it's as good as it gets, basically?

Basically, yeah.

The reason I went with the beta drivers is because I was having problems with the latest official ones. The new drivers didn't change a single thing, performance is the same, GPU load is the same. I'll try re-installing the older ones.
 

RS4-

Member
Q6600 stock
2GB ram
6950, 2GB, stock, 11.5a drivers

Getting around 30 fps with these settings on 1080p high (iirc changed only these settings): blur effects disabled, DoF gameplay disabled, ssao disabled, cinematic DoF disabled, uber disabled and vsync enabled.

I should probably disable vsync and motion blur to see how it runs. Then see how it is in 720p with vsync on.

edit - getting around the same FPS with vsync and motion blur off.

720p settings with the above along with vsync and motion blur on: around 30 as well

Yikes, I should stick to 1080p; forcing 720p seems to strongarm the game into that resolution while screwing up the aspect ratio.
 

longdi

Banned
After testing, I think 1920x1200 is bugged, or TW2 is very resolution-dependent.

I'm now running 1680x1050 and get the same smoothness as at 1920x1200, but with SSAO turned on (off at 19x12), LOD far (normal at 19x12), and shadow quality and number of shadow lights at ultra (high at 19x12). The rest are max settings with Uber disabled.

i7 3.8ghz
hd6970 2gb
intel 320 ssd
12gb ram

50-60fps with drops to ~42fps in very heavy parts.
 

KKRT00

Member
Don't know about those C2D assumptions.

A friend sent me this: 45fps in combat in the prologue on an E3110 @ 3.6 / 4GB RAM / Radeon 4850 / medium settings, at 1280x960; in scenes with tents and a huge draw distance he had 35.

--edit--
 

Truant

Member
Even more testing. I'm now 100% convinced there is something wrong with either the game or the nVidia drivers.

On my machine, there is absolutely NO difference in performance between resolutions. I get 25-35 @ 800x600, just like I did @ 1080p.

Tried reverting to the old drivers as well. No difference.
 
MrOogieBoogie said:
Wow, really? Damn. So is there basically nothing I can do to improve my performance besides buying a new CPU?

I'm getting near enough 100 percent CPU usage with around 25 to 30 percent on the GPU.

That is a classic CPU bottleneck.
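
If you want to log it rather than eyeball Task Manager, here's a rough Python sketch (assumes you have the third-party psutil package installed and nvidia-smi on your PATH; illustrative only, not any official tool):

import subprocess
import psutil  # third-party: pip install psutil

# Sample overall CPU and GPU utilization once per second for 30s.
# A CPU pinned near 100% while the GPU sits at 25-30% is the
# signature of a CPU bottleneck: lowering resolution won't help.
for _ in range(30):
    cpu = psutil.cpu_percent(interval=1)  # % across all cores
    gpus = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"]
    ).decode().split()  # one value per GPU, so SLI rigs get two
    print("CPU: %5.1f%%  GPU: %s" % (cpu, " / ".join(g + "%" for g in gpus)))

One caveat: a dual core pegged on a single thread can bottleneck while overall usage reads ~50%, so check per-core numbers too (psutil.cpu_percent(percpu=True)).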
 

MrOogieBoogie

BioShock Infinite is like playing some homeless guy's vivid imagination
KKRT00 said:
Don't know about those C2D assumptions.

A friend sent me this: 45fps in combat in the prologue on an E3110 @ 3.6 / 4GB RAM / Radeon 4850, at 1280x960; in scenes with tents and a huge draw distance he had 35.

--edit--

Weird. So what the hell could it be?
 
Fuuuuuuu-

I'm running this:
- AMD Phenom II X4 945
- 4GB RAM DDR3
- GeForce GTX 260

What video card should I get to replace the 260? Is my CPU going to be able to handle it?

Argggggh
 

jett

D-Member
Truant said:
Even more testing. I'm now 100% convinced there is something wrong with either the game or the nVidia drivers.

On my machine, there is absolutely NO difference in performance between resolutions. I get 25-35 @ 800x600, just like I did @ 1080p.

Tried reverting to the old drivers as well. No difference.

Reminds me of this one game... coincidentally, it's titled almost exactly the same as "The Witcher 2".
 

Sportbilly

Member
8800GT (new beta drivers)
C2D E6750
4 GB RAM
Win 7 64 bit

Turned everything down to lowest settings or off and ran D3D Overrider. Running at 1680x1050 gave around 22-30 fps in the initial area, playable but not really satisfactory. At 1280x960, it was fine, 30-40 fps, and looked reasonable as well. I'll try turning the settings up a bit and see what I can get away with. Some hope for those with older rigs anyway :).
 

Easy_D

never left the stone age
jett said:
Reminds me of this one game... coincidentally, it's titled almost exactly the same as "The Witcher 2".
Obviously the modified Aurora engine was not to blame in that case, given how things are now.

Edit: This sucks so hard, I just want to play the damn game!
 

Phloxy

Member
For nVidia users: try the 166 drivers, the ones before the April 17 official ones; they seem to run the game much better for me for some reason. Would love for somebody else to test it out.
 

dino1980

Member
I'm running it on Ultra

- no ubersampling
- no DoF

And running the game at 1600x900 instead of 1080p on my TV.

The result is an almost steady 60FPS, with only brief moments where the framerate drops to 50fps. Can't stand playing the game at 1080p when it's dipping to 50fps at short intervals.

My computer

Processor: i5-2500K OC 4.2GHz
Graphics card: Radeon 6970
RAM: 8GB
 

Truant

Member
Phloxy said:
For nVidia users: try the 166 drivers, the ones before the April 17 official ones; they seem to run the game much better for me for some reason. Would love for somebody else to test it out.

Can you give me a release number on those?
 

Alexios

Cores, shaders and BIOS oh my!
With high settings (and some tweaks, some lower and some higher than that) on an E8500, 4GB RAM, GTX285, at 720p, it was getting 30-60 fps in the first few scenes, less in the camp before the assault, etc., but now at the first town past the jailbreak it's barely hitting 30 fps outdoors, and lowering more settings doesn't seem to help at all. Any hidden settings that can help yet? Are any later areas less demanding, or does it keep getting more so? Kind of disappointed after the first few scenes, which looked and ran well...

Just noticed you have my 30-60 fps quoted in the OP; you might wanna remove that after this development, since it gets as low as the 20s in this scene...
 

Minsc

Gold Member
Truant said:
Basically, yeah.

The reason I went with the beta drivers is because I was having problems with the latest official ones. The new drivers didn't change a single thing, performance is the same, GPU load is the same. I'll try re-installing the older ones.

What's your CPU again, and CPU usage? Could you be CPU bound even on lower resolutions?

MrOogieBoogie said:
Weird. So what the hell could it be?

The friend didn't specify what settings he ran under, could be low settings.
 

krzy123

Member
Sportbilly said:
8800GT (new beta drivers)
C2D E6750
4 GB RAM
Win 7 64 bit

Turned everything down to lowest settings or off and ran D3D Overrider. Running at 1680x1050 gave around 22-30 fps in the initial area, playable but not really satisfactory. At 1280x960, it was fine, 30-40 fps, and looked reasonable as well. I'll try turning the settings up a bit and see what I can get away with. Some hope for those with older rigs anyway :).

minor hope :(, got e6600 @ 3.06 + gtx260-216, can't wait to try it though.
 

Truant

Member
Minsc said:
What's your CPU again, and CPU usage? Could you be CPU bound even on lower resolutions?



The friend didn't specify what settings he ran under, could be low settings.

I've got an i7 920 @ 2.67ghz

Should be fast enough?

Like I said earlier, I'm getting the same performance using a single GPU as I do with SLI enabled.
 

Sober

Member
Minsc said:
I didn't see any pop-in in the little I played, thankfully.
Well, I have a 4850 512MB, and I can't seem to reduce the amount of texture pop-in ... might it be due to having a 512MB card to begin with?

Right now, TextureMemoryBudget is set to 300 and I'm still getting it. Should I try higher, maybe even 512 at this point? The launcher config doesn't really distinguish actual texture size (atm it's 2048) from memory budget, so which one does the launcher setting control?
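
(Rough math on why I'm hesitant to go higher: on a 512MB card, a 300MB texture budget already leaves only ~200MB for the framebuffer, render targets and geometry, and 1080p with post-processing eats a good chunk of that. Setting it to 512 would presumably just trade pop-in for out-of-memory crashes.)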
 

Chiggs

Gold Member
Great thread!

Looks like I'm going to be playing in ultra mode; will save Uber for a future build.
 
So is it true that performance takes a massive nosedive in Act I towns in comparison to the prologue? If that's the case, I might have to dial down my settings from Ultra...
 

mileS

Member
The readme in the Witcher 2 folder explains a lot of this stuff, for anyone wondering. I'm still not sure what to do about "Texture Memory Size", though.

Here's the description from the readme:

" -Texture memory size: sets the amount of graphics card memory allocated to textures. Larger values will decrease the amount of streaming that occurs in game and will make the game run more smoothly, but they can also cause the graphics card to run out of memory and even result in game crashes. Choose a reasonable value based on the amount of memory available on your graphics card."

I noticed someone with a similar card to mine (5850 1GB) had it set to high or very high. I left mine at medium because that's what it was set to in the recommended settings when I clicked that (yes, I know it sets the presets to low the first time you do this, but all you have to do is click it again).

I guess I should play around with that setting a bit more, but I was wondering what other people have experienced with it so far. I don't want my game crashing because I set it too high.
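
For reference, the launcher just writes this stuff to the user config file. On my install it looks something like the below; the file location and exact key names are from memory, so double-check against your own (I believe it's User.ini under Documents\Witcher 2\Config):

[Rendering]
TextureMemoryBudget=300   ; presumably in MB - keep it comfortably below your card's VRAM
MaxTextureSize=2048       ; (key name approximate) largest texture dimension the streamer loads

So "Texture Memory Size" in the launcher should map to the budget line, and the presets presumably just write different numbers into it; you can split the difference by hand instead of jumping a whole preset.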
 

Minsc

Gold Member
Truant said:
I've got an i7 920 @ 2.67ghz

Should be fast enough?

Like I said earlier, I'm getting the same performance using a single GPU as I do with SLI enabled.

I'm not sure, fizzelopeguss mentioned he was getting ~100% CPU usage with only 25-30% GPU usage, though I'm guessing he's on a dual core.

It does sound a bit like everyone (nVidia, AMD/ATi, and CDP) has their hands full with optimization work over the next few months.

Hopefully that means playthrough #2 will be at a better framerate and with higher settings.
 

Salaadin

Member
mileS said:
The readme in the Witcher 2 folder explains a lot of this stuff, for anyone wondering. I'm still not sure what to do about "Texture Memory Size", though.

Here's the description from the readme:

" -Texture memory size: sets the amount of graphics card memory allocated to textures. Larger values will decrease the amount of streaming that occurs in game and will make the game run more smoothly, but they can also cause the graphics card to run out of memory and even result in game crashes. Choose a reasonable value based on the amount of memory available on your graphics card."

I noticed someone with a similar card to mine (5850 1GB) had it set to high or very high. I left mine at medium because that's what it was set to in the recommended settings when I clicked that (yes, I know it sets the presets to low the first time you do this, but all you have to do is click it again).

I guess I should play around with that setting a bit more, but I was wondering what other people have experienced with it so far. I don't want my game crashing because I set it too high.

Yeah, I have a 5850 1GB and set mine to very high. I wonder what they consider a "reasonable value" for a 1GB card. I'll leave mine set at max and see if I have any issues.
 

metareferential

Member
Truant said:
Even more testing. I'm now 100% convinced there is something wrong with either the game or the nVidia drivers.

On my machine, there is absolutely NO difference in performance between resolutions. I get 25-35 @ 800x600, just like I did @ 1080p.

Tried reverting to the old drivers as well. No difference.

Same here. I just get a slight bump if I lower all the settings, but it's nothing worth mentioning.

Even turning on 3D Vision doesn't seem to affect performance that much; I'd gladly take the game running at 30 fps in 3D mode if 2D mode reached 60 fps.

On GOG support they suggest installing the drivers without 3D Vision; maybe that's where the issue is.
 

mileS

Member
Salaadin said:
Yeah, I have a 5850 1GB and set mine to very high. I wonder what they consider a "reasonable value" for a 1GB card. I'll leave mine set at max and see if I have any issues.

Yeah, that's basically what I was wondering. You have very similar specs to me, so let me know how it goes for you in the long run.

Also, how many people are using in-game vsync over D3DOverrider? I know some games get better performance with D3DOverrider, but I've been getting a floaty mouse with both of them.
 

Corky

Nine out of ten orphans can't tell the difference.
Thought I'd try ubersampling to see the screenshot quality.


[screenshot: witcher22011-05-1719-5k7q0.png]
 

Truant

Member
metareferential said:
Same here. I just get a slight bump if I lower all the settings, but it's nothing worth mentioning.

Even turning on 3D Vision doesn't seem to affect performance that much; I'd gladly take the game running at 30 fps in 3D mode if 2D mode reached 60 fps.

On GOG support they suggest installing the drivers without 3D Vision; maybe that's where the issue is.

Tried this, no difference.

I mean, the game is playable, but I can feel the unresponsiveness and it sucks big time. I just want this to run as smoothly as it can.
 

Minsc

Gold Member
mileS said:
Yeah, that's basically what I was wondering. You have very similar specs to me, so let me know how it goes for you in the long run.

Also, how many people are using in-game vsync over D3DOverrider? I know some games get better performance with D3DOverrider, but I've been getting a floaty mouse with both of them.

V-sync works best locked to 60fps; when you go under, your controls begin to feel floatier, so that's to be expected.
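
If you want the math behind that: with plain double-buffered v-sync, a frame that misses a refresh waits for the next one, so frame times round up to a whole multiple of the refresh period. A quick sketch:

import math

REFRESH_HZ = 60.0
PERIOD_MS = 1000.0 / REFRESH_HZ  # ~16.7 ms per refresh tick

# Frame time rounds UP to a whole number of refresh intervals,
# so 17 ms of GPU work displays at 30 fps, not ~59.
for frame_ms in (15.0, 17.0, 25.0, 34.0):
    intervals = math.ceil(frame_ms / PERIOD_MS)
    print("%4.1f ms of work -> %d fps shown" % (frame_ms, round(REFRESH_HZ / intervals)))

That rounding is also the floatiness: the wait for the next refresh is pure added input latency. Triple buffering (which is what D3DOverrider forces) avoids most of the quantization, which would explain why some people prefer it.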
 
Truant said:
Tried this, no difference.

I mean, the game is playable, but I can feel the unresponsiveness and it sucks big time. I just want this to run as smoothly as it can.

Thanks, I didn't want to mess that much with the drivers.

There's really something strange going on, given that lesser (or at least comparable) rigs can run the game way better than my system.

It still is definitely playable, even in 20-25fps territory, though.
 

Dacon

Banned
Bisnic said:
I wonder what kind of settings I'll need to run this game at a decent FPS (say 30) if I have an AMD quad-core Phenom 3.2GHz CPU, 4GB of RAM and a 9800GT 1GB.

I'm still thinking of changing my video card, since the fan makes some annoying noise sometimes when I boot my computer.
I have a similar system. Phenom X4 3.2 with a Geforce GTS 450 and 4 gigs of ram.
 

mileS

Member
Minsc said:
V-sync works best locked to 60fps; when you go under, your controls begin to feel floatier, so that's to be expected.

Had a feeling it was something to do with vsync; the readme even talks about input lag. I guess my next question would be: is anyone playing with vsync turned off? I don't see very many people talking about the floaty mouse.

I changed MouseSmoothness and Smoothness in the settings to 0. I think it helped a bit, but not completely.
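
For anyone else hunting for those, both keys live in the same user config file as the texture settings; the section name below is my guess, so just search the file for the key names rather than trusting it:

[Input]
MouseSmoothness=0
Smoothness=0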
 

Salaadin

Member
I almost always get better performance using D3DOverrider for vsync over the in-game options. I don't even enable it in-game unless D3DOverrider is giving me issues.
 

metareferential

Member
I just tried loading a game with low settings, and guess what? 25-30 fps.

The only setting with a real impact is ubersampling. Everything else, including resolution or 3D Vision, gives me 25-30. Lol.
 

Hawk269

Member
Well, even with my rig (specs below), using Ultra/Uber I am averaging about 25fps. Sometimes it spikes all the way up to 60, but on average (using Fraps) I see around 25. I am really wondering why Uber is so demanding, which leads me to a few questions.

In the readme it says that Uber is the best possible setting, that it makes the game look better than any amount of AA or other tweaks. With that being said, should AA be enabled along with Uber? If Uber is supposed to be doing all these demanding things and we also have AA on, are we forcing AA and Uber at the same time? I am going to test this to see what happens... but it seems that having AA and the other options enabled on top of Uber may be double-dipping, applying all these effects on top of what Uber is already trying to do.
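
For scale: if Uber really is supersampling at something like 2x2 (my assumption, the readme doesn't spell it out), that's 4x the pixels per frame. 1920x1080 is about 2.07 million pixels, so Uber would be shading roughly 8.3 million before AA even runs, and stacking AA on top of that would go a long way toward explaining the nosedive.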

Here are my specs:

Antec Twelve Hundred V3 Black Steel ATX Full Tower Computer Case
ASUS P8P67 DELUXE (REV 3.0) Intel P67 SATA 6Gb/s USB 3.0 ATX Intel Motherboard
G.Skill Ripjaws X Series 8GB (2 x 4GB) 240-Pin DDR3 SDRAM DDR3 1600
EVGA GeForce GTX 580 x2 SLI OC 875/1750/2050
Cooler Master Hyper 212 Plus CPU Cooler
Intel Core i7-2600K Sandy Bridge 3.4GHz OC 4.5Ghz
Corsair Professional Series AX1200 1200W ATX12V v2.31 PSU
ASUS DRW-24B1ST/BLK/B/AS Black SATA 24X DVD Burner
Samsung Spinpoint F3 HD103SJ 1TB 7200 RPM SATA 3.0Gb/s 3.5" HDD x2
Windows 7 64Bit Ultimate Edition
Sony Bravia KDL-55HX800 55" 1080p 3D TV

With the above, right now I am playing on Ultra with Uber off. I disabled motion blur and DOF (I don't like either of those), and I am getting mostly 60fps, though I do dip into the low 50's every once in a while. All other settings are set to enabled/ultra, and I have Texture Memory Size at the highest setting.
 