
The Witcher 2: Performance Thread [Enhanced Edition Patch - New content & 100+ fixes]

teiresias

Member
So I just pulled the trigger on an EVGA GTX 460 1GB Superclocked to SLI with my current Gigabyte GTX 460. The Gigabyte is clocked about 3MHz lower than the EVGA, so my pair will clock at 760MHz. I'd been wanting to satisfy my curiosity about SLI for a while, and the combo of my Amazon credit, the mail-in rebate (meaning I'll pay about $30 out of pocket at the end of the day), and this FREAKING GAME made me pull the trigger.

Here's to hoping SLI issues get ironed out soon enough (though I won't really have time to jump into the meat of the game for another two weeks probably).
 

Erebus

Member
Q9550 @ 3.9GHz
GTX 460 1GB
4GB RAM
1680x1050
FPS varies between 35 and 60

My settings:
[settings screenshot]

*Forced vsync and triple buffering via D3DOverrider

Some screens:

[three in-game screenshots]


Need to force some AF now.
 
CabbageRed said:
Radeon Pro. Install it, and create a profile linked to the correct .exe (such as "D:\Program Files (x86)\GOG.com\The Witcher 2\bin\witcher2.exe"). In the "Tweaks" tab, you'll find a "CrossfireX Tweaks" section with a drop-down box and a slider. Drop-down = dirt2, slider = alternate frame rendering.

Radeon Pro should then change the settings as needed when it detects the game launching, but I always launch from within Radeon Pro.

This works for me in terms of nearly maxing out the settings and getting 60fps with my Radeon 6870s, but it adds annoying flicker from the game's light sources that makes it not worth it.

I think I'll just go back to playing it slightly dialed down with a solid framerate and everything else intact.
 

Minsc

Gold Member
Darklord said:
I can only find the launcher.exe. Is it OK to use that when forcing it?

Once you change the settings in your GPU control panel (CCC on ATI cards), it's a universal change and will apply to all programs you run.

EternalGamer said:
This works for me in terms of nearly maxing out the settings and getting 60fps with my Radeon 6870s, but it adds annoying flicker from the game's light sources that makes it not worth it.

I think I'll just go back to playing it slightly dialed down with a solid framerate and everything else intact.

It sounds from the latest update like ATI may have a CAP for TW2 out in a couple of days; hopefully that works properly.
 

Sethos

Banned
Every single time I mess around with custom CF settings as per recommendations, it always leads to the same issue, the one described above: flickering light sources. The same happened in SHIFT 2 - it's unbearable and completely not worth it. Think I'll shelve this game until a proper profile is released.
 

Corky

Nine out of ten orphans can't tell the difference.
TheExodu5 said:
Actually...

What's your CPU? That seems to indicate a CPU bottleneck. The GPUs have to wait for some blocking CPU threads to finish before they can continue rendering.

2600k @ 4.6

:(
 

TheExodu5

Banned
Sethos said:
Every single time I mess around with custom CF settings as per recommendations, it always leads to the same issue, the one described above: flickering light sources. The same happened in SHIFT 2 - it's unbearable and completely not worth it. Think I'll shelve this game until a proper profile is released.

To anyone thinking of going SLI, these are the issues you sometimes have to deal with. Though it's only the odd game that really needs a second GPU, unless you're running 1600p, so it's not an issue all that often, as you can just disable SLI for that particular game. Still, be prepared to have issues, and be prepared to sometimes need to Google custom profiles to get SLI working properly in that new game of yours.

Still, this is one of the very few games that will validate my SLI purchase. SLI means having my framerate locked at 60fps, which is a pretty huge plus for me.

Corky said:
2600k @ 4.6

:(

That's odd...I wonder if it could be a hyper-threading issue? Try turning off hyper-threading in the BIOS and see if it has an impact.

Just a random thought.

Otherwise, I'm not quite sure what to say. Maybe the rendering path is simply inefficient given certain settings, which is causing the GPU to be underutilized. I'm not quite sure how GPU utilization is measured, and what can affect it.
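For what it's worth, the GPU utilization figure reported by monitoring tools is generally just the percentage of the sampling interval during which the GPU was busy executing work, so low utilization alongside a low framerate usually does point at a CPU or driver bottleneck rather than the card. A minimal sketch for logging it on NVIDIA cards, assuming the nvidia-smi command-line tool is available (overlay tools like GPU-Z or Afterburner read essentially the same counter):

```python
# Minimal sketch: poll GPU utilization once per second via nvidia-smi (NVIDIA only).
import subprocess
import time

def gpu_utilization_percent():
    """Return the current GPU utilization (%) as reported by nvidia-smi."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=utilization.gpu", "--format=csv,noheader,nounits"]
    )
    # One line per GPU; take the first one.
    return int(out.decode().strip().splitlines()[0])

if __name__ == "__main__":
    for _ in range(10):
        print(f"GPU utilization: {gpu_utilization_percent()}%")
        time.sleep(1)
```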
 

Darklord

Banned
Minsc said:
Once you change the settings in your GPU control panel (CCC on ATI cards), it's a universal change and will apply to all programs you run.



It sounds from the latest update like ATI may have a CAP for TW2 out in a couple of days; hopefully that works properly.

I tried it. Forced 16xAF and... can't really tell the difference. Unless it's not working for some reason, I honestly can't see any change. I even compared the two in the same spot.
 

TheExodu5

Banned
Darklord said:
I tried it. Forced 16xAF and... can't really tell the difference. Unless it's not working for some reason, I honestly can't see any change. I even compared the two in the same spot.

No AF:
[screenshot]


16x AF:
[screenshot]


Notice the ground texture as it gets further away from Geralt.
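For anyone wondering what AF actually does in shots like that: the ground is viewed at a grazing angle, so each screen pixel covers a long, thin strip of the ground texture. Plain trilinear filtering has to blur that strip into a lower mip, while anisotropic filtering takes extra samples along it. A rough sketch of the numbers involved, using a simplified 1/sin(angle) footprint model (illustrative only):

```python
# Rough illustration: how elongated a ground-plane pixel footprint gets at grazing angles.
# Simplified model: the footprint stretches by ~1/sin(view angle above the ground);
# trilinear filtering must pick a blurrier mip to cover it, NxAF samples along it instead.
import math

for angle_deg in (90, 45, 20, 10, 5):
    ratio = 1 / math.sin(math.radians(angle_deg))
    taps = min(math.ceil(ratio), 16)  # 16x AF caps the ratio it can handle at 16:1
    print(f"view angle {angle_deg:>2} deg above ground: anisotropy ~{ratio:4.1f}:1, "
          f"16x AF would use ~{taps} taps")
```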
 

John Harker

Definitely doesn't make things up as he goes along.
I'm just going to ask this here, since I don't want to bog down the other thread.

To play with my Xbox controller, can I just buy the Play & Charge Kit?
I only have wireless controllers; is there any way to get one to connect via a wired add-on, or do I have to scrounge around to find a wired Xbox 360 controller?
 

TheExodu5

Banned
John Harker said:
I'm just going to ask this here, since I don't want to bog down the other thread.

To play with my Xbox controller, can I just buy the Play & Charge Kit?
I only have wireless controllers; is there any way to get one to connect via a wired add-on, or do I have to scrounge around to find a wired Xbox 360 controller?

You can't use the play & charge kit. You either need a wireless dongle, or a wired controller. I suggest a wired controller (which will let you use custom XBCD drivers, and work with games like Assassin's Creed). You can probably get a refurb for like $20.
 
John Harker said:
I'm just going to ask this here, since I don't want to bog down the other thread.

To play with my Xbox controller, can I just buy the Play & Charge Kit?
I only have wireless controllers; is there any way to get one to connect via a wired add-on, or do I have to scrounge around to find a wired Xbox 360 controller?
No, you need either a wired controller or a wireless one with the USB dongle.
 

Salaadin

Member
John Harker said:
I'm just going to ask this here, since I don't want to bog down the other thread.

To play with my Xbox controller, can I just buy the Play & Charge Kit?
I only have wireless controllers; is there any way to get one to connect via a wired add-on, or do I have to scrounge around to find a wired Xbox 360 controller?

I don't know about the Play & Charge.
They do make USB wireless adapters for the 360 controller. They're kinda pricey, though, and some people have issues with them not working and/or breaking. I was in a similar situation to you and ended up just getting the wired 360 controller for PC.

http://www.amazon.com/dp/B000HZFCT2/?tag=neogaf0e-20 Adapter

http://www.amazon.com/dp/B003ZSN600/?tag=neogaf0e-20 Wired controller

You can probably find better prices if you shop around though.
 

Peterthumpa

Member
Well, I was really waiting for this game after all the previews and the awesome media released, and frankly, it's living up to my expectations.

However, I should point out that the issues most people are experiencing right now are really strange, especially concerning performance. I think we should now give Crytek the praise they deserve for the job they did with their CryEngine 3. I mean, both CryEngine 3 and the RED engine are DX9 (so no "lulz Crytek sucks, DX9 only"). Also, it's strange that nobody complained "Witcher 2 is DX9 only, fail".

Anyway, with my GTX 580, getting a 40-50 average in the first scene (the raining forest) is pretty disappointing when you realize you're equipped with the fastest single card on the block right now. I've only played some bits of the prologue so far, and the framerate is averaging 50-60 FPS (on ultra @ 1600x900, supersampling off).

What I'm trying to say here is that Crysis 2 looks just as good (technically speaking; artistically it's a whole different game) and the performance is way better.

I hope that CD Projekt Red will solve those issues to make the game playable for everybody out there.
 

Dina

Member
On low/medium it runs fine, but the texture pop-in is annoying at times. It even happens in conversations when swapping from one talking head to the other. What setting controls this?
 

TheExodu5

Banned
felipepl said:
Well, I was really waiting for this game after all the previews and the awesome media released, and frankly, it's living up to my expectations.

However, I should point out that the issues most people are experiencing right now are really strange, especially concerning performance. I think we should now give Crytek the praise they deserve for the job they did with their CryEngine 3. I mean, both CryEngine 3 and the RED engine are DX9 (so no "lulz Crytek sucks, DX9 only"). Also, it's strange that nobody complained "Witcher 2 is DX9 only, fail".

Anyway, with my GTX 580, getting a 40-50 average in the first scene (the raining forest) is pretty disappointing when you realize you're equipped with the fastest single card on the block right now. I've only played some bits of the prologue so far, and the framerate is averaging 50-60 FPS (on ultra @ 1600x900, supersampling off).

What I'm trying to say here is that Crysis 2 looks just as good (technically speaking; artistically it's a whole different game) and the performance is way better.

I hope that CD Projekt Red will solve those issues to make the game playable for everybody out there.

I don't think Crysis looks nearly as good. The game looks like an upscaled console game. The Witcher 2 looks like a PC game. It might not be pushing as many shader effects, but there's nothing wrong with having fantastic texture work and extremely detailed and high polycount models. The game runs completely fine for what it is, in my opinion. The main issue is that it's simply not very scalable. The textures scale, and a few shaders can be turned off, but the game is still rendering a massive amount of polygons.

Durante said:
Well, ubersampling fixes the dithering ;)

It improves it to the point where it's not very noticeable anymore. Doesn't quite fix it though:

http://www.thejayzone.com/pics/witcher2/uber.png

Dina said:
On low/medium it runs fine, but the texture pop-in is annoying at times. It even happens in conversations when swapping from one talking head to the other. What setting controls this?

Texture memory size. If you have 1GB of VRAM on your card, set the texture memory to high, and also set the texture downsize to off.
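For a rough sense of why that setting matters for pop-in, here's some back-of-the-envelope math on how many textures fit in a streaming pool. The pool size and compression formats below are assumptions for illustration, not the game's actual values:

```python
# Back-of-the-envelope texture memory math (illustrative numbers only).
def texture_size_mb(width, height, bytes_per_texel, with_mips=True):
    """Approximate VRAM footprint of one texture; a full mip chain adds ~1/3."""
    base = width * height * bytes_per_texel / (1024 * 1024)
    return base * 4 / 3 if with_mips else base

dxt5 = texture_size_mb(2048, 2048, 1)   # BC3/DXT5 compression: 1 byte per texel
rgba8 = texture_size_mb(2048, 2048, 4)  # uncompressed RGBA8: 4 bytes per texel

pool_mb = 700  # hypothetical streaming pool on a 1 GB card (the rest goes to buffers, render targets, etc.)
print(f"2048x2048 DXT5  ~{dxt5:.1f} MB -> a {pool_mb} MB pool holds ~{pool_mb / dxt5:.0f}")
print(f"2048x2048 RGBA8 ~{rgba8:.1f} MB -> a {pool_mb} MB pool holds ~{pool_mb / rgba8:.0f}")
```

The bigger the pool, the more of the scene's textures stay resident at full resolution, so the streamer has less reason to swap in low-res placeholders when the camera cuts between talking heads.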
 

Van Owen

Banned
Salaadin said:
I don't know about the Play & Charge.
They do make USB wireless adapters for the 360 controller. They're kinda pricey, though, and some people have issues with them not working and/or breaking. I was in a similar situation to you and ended up just getting the wired 360 controller for PC.

http://www.amazon.com/dp/B000HZFCT2/?tag=neogaf0e-20 Adapter

http://www.amazon.com/dp/B003ZSN600/?tag=neogaf0e-20 Wired controller

You can probably find better prices if you shop around though.
MS doesn't even make the adapter anymore. The ones on Amazon are Chinese knockoffs, but they do work for a number of people.
 

Peterthumpa

Member
TheExodu5 said:
I don't think Crysis looks nearly as good. The game looks like an upscaled console game. The Witcher 2 looks like a PC game. It might not be pushing as many shader effects, but there's nothing wrong with having fantastic texture work and extremely detailed and high polycount models. The game runs completely fine for what it is, in my opinion. The main issue is that it's simply not very scalable. The textures scale, and a few shaders can be turned off, but the game is still rendering a massive amount of polygons.
You're kidding me, right? I think you're taking away from the fact that Crytek managed to squeeze a really good-looking PC game onto consoles, not the opposite. The character models in Crysis 2 are just as good as The Witcher 2's.

Polygons are not the reason for the performance issues, since my FPS is all over the place (albeit always in the 50-60 range, but my rig is quite good). One example is when you're sleeping with Triss right when the Prologue starts. There's absolutely nothing to render there besides the character models and some textures, but the FPS takes a hit in some unusual places.

Don't get me wrong, the game is gorgeous, but the performance should be better.
 

Durante

Member
TheExodu5 said:
It improves it to the point where it's not very noticeable anymore. Doesn't quite fix it though:
That's surprising; since it's a 1-pixel dithering pattern, I thought it would be completely eliminated by 2x supersampling.
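A toy check of that intuition: under an ideal 2x2 box-filter resolve, a 1-pixel checkerboard dither does average out exactly, so any leftover residue suggests the game's resolve uses a different filter or sample pattern than this sketch assumes:

```python
# Toy demo: a 1-pixel checkerboard dither vs. an ideal 2x2 box-filter downsample.
import numpy as np

# 8x8 "supersampled" image with a 1-pixel checkerboard dither between 0.4 and 0.6.
y, x = np.indices((8, 8))
hires = np.where((x + y) % 2 == 0, 0.4, 0.6)

# 2x supersampling resolve modelled as averaging each 2x2 block.
lowres = hires.reshape(4, 2, 4, 2).mean(axis=(1, 3))

print(lowres)  # every output pixel is exactly 0.5 -> the dither vanishes under a box filter
```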
 

Van Owen

Banned
Yeah, you can argue Witcher 2 looks better than Crysis 2, but I think you'd have a hard time saying it looks THAT much better, enough that the performance differences are justifiable.
 

TheExodu5

Banned
Van Owen said:
Yeah, you can argue Witcher 2 looks better than Crysis 2, but I think you'd have a hard time saying it looks THAT much better, enough that the performance differences are justifiable.

The thing is...the game performs nearly identically to Crysis 2 on the highest settings. Crysis 2 was simply more consistent throughout. That's understandable considering the focus on consoles (where performance needs to stay relatively consistent), and the enormous budget the game had.

Darklord said:
No AF

16xAF

Looks to me like AF is on in both shots.
 

Peterthumpa

Member
TheExodu5 said:
The thing is...the game performs nearly identically to Crysis 2 on the highest settings. Crysis 2 was simply more consistent throughout.
Not for me. Crysis 2 runs @ 60 FPS almost 90% of the time at max settings, 1080p.
 

pahamrick

Member
Sethos said:
Every single time I mess around with custom CF settings as per recommendations, it always leads to the same issue, the one described above: flickering light sources. The same happened in SHIFT 2 - it's unbearable and completely not worth it. Think I'll shelve this game until a proper profile is released.

Yeah, I started using the Dirt 2 profile when I noticed the improvement over the one I was using before. However, it wasn't until I got to a cinematic scene that I saw the flickering. Tried to ignore it at first, but it eventually got too annoying.

I'm back to the CompatAFR-1x1 profile and no more flickering, at least none that I can see. Depending on where I am, the FPS can drop as low as 33, but I've yet to have it drop lower, no matter how many NPCs/enemies are on the screen. This is with everything enabled / as high as it can go, except for Ubersampling.
 
felipepl said:
You're kidding me, right? I think you're taking away from the fact that Crytek managed to squeeze a really good-looking PC game onto consoles, not the opposite. The character models in Crysis 2 are just as good as The Witcher 2's.

Polygons are not the reason for the performance issues, since my FPS is all over the place (albeit always in the 50-60 range, but my rig is quite good). One example is when you're sleeping with Triss right when the Prologue starts. There's absolutely nothing to render there besides the character models and some textures, but the FPS takes a hit in some unusual places.

Don't get me wrong, the game is gorgeous, but the performance should be better.


Textures in Crysis 2 are low-res; many are of terrible quality, with just a repeating detail texture pasted over them.
Environments are mostly low-poly. (The on-screen graphics info in Crysis 1 would often report 2-3+ million polys rendered per frame in a scene; in Crysis 2 on ultra I never saw it go over 800k-1 million, dropping to 500k on console settings because of the aggressive LOD.)

No POM (parallax occlusion mapping) textures at all... (this is a big deal).
Turn your FOV up in Crysis 2 and watch your performance nosedive.

It very much looks like an upscaled console game because that is exactly what it is; the PC gets higher shader settings and less aggressive LOD, that is all.
 

TheExodu5

Banned
felipepl said:
Not for me. Crysis 2 runs @ 60 FPS almost 90% of the time at max settings, 1080p.

I'd have to say you're either wrong, or you're running a very well overclocked GTX 580.

A stock GTX 570 or 6950 w/ unlocked shaders averages closer to 45fps in Crysis 2.
 

Dina

Member
TheExodu5 said:
Texture memory size. If you have 1GB of VRAM on your card, set the texture memory to high, and also set the texture downsize to off.

Gracias. I got a 5850, so that should work out fine.
 

Durante

Member
I think the only performance "problem" with the game is that they enabled "ubersampling" at ultra. They should have left it disabled by default -- some people just can't stand it if they cannot run a game at "max" settings, even if those settings are designed for future systems. The same thing happened with Crysis 1.
 

Peterthumpa

Member
SneakyStephan said:
Textures in Crysis 2 are low-res; many are of terrible quality, with just a repeating detail texture pasted over them.
Environments are mostly low-poly.
No POM (parallax occlusion mapping) textures at all... (this is a big deal).
Turn your FOV up in Crysis 2 and watch your performance nosedive.

It very much looks like an upscaled console game because that is exactly what it is; the PC gets higher shader settings and less aggressive LOD, that is all.
We'll see in a few months, when TW2 gets ported to PS3/360 with settings comparable to the medium preset.

I still think some patching is needed to do the graphics justice.
 

TheExodu5

Banned
Durante said:
I think the only performance "problem" with the game is that they enabled "ubersampling" at ultra. They should have left it disabled by default -- some people just can't stand it if they cannot run a game at "max" settings, even if those settings are designed for future systems. The same thing happened with Crysis 1.

Agreed.
 

Peterthumpa

Member
TheExodu5 said:
I'd have to say you're either wrong, or you're running a very well overclocked GTX 580.

A stock GTX 570 or 6950 w/ unlocked shaders averages closer to 45fps in Crysis 2.
Nah, I'm telling you the truth. At least in the levels where I was able to test it, the framerate was a rock-solid 60 FPS almost all the time.

That's with the beta drivers from yesterday, though, which are said to give Crysis 2 a small boost.
 

seeds19

Banned
My specs:
AMD Phenom X6 1090T
MSI GTX 560 Ti
8GB DDR3 1333MHz RAM

On ultra, uber off, vsync on, 1440x900: 20-35 fps.
On high, vsync on, 1440x900: solid 60 fps.

Any performance increase from forcing vsync or triple buffering via D3DOverrider? I want to boost the fps on ultra.
 
felipepl said:
Not for me. Crysis 2 runs @ 60 FPS almost 90% of the time at max settings, 1080p.

So, this game looks "about as good" at not-maxed-out settings. Why the hell are people so obsessed with being able to max out everything and still get 60FPS? When you are playing the game, you won't even notice half that shit anyway. You can only see it when you are flipping back and forth between huge screenshots, and even then it is really difficult.
 
seeds19 said:
My specs:
AMD Phenom X6 1090T
MSI GTX 560 Ti
8GB DDR3 1333MHz RAM

On ultra, uber off, vsync on, 1440x900: 20-35 fps.
On high, vsync on, 1440x900: solid 60 fps.

Any performance increase from forcing vsync or triple buffering via D3DOverrider? I want to boost the fps on ultra.

Turn off Ubersampling and/or Depth of Field - Gameplay.
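On the vsync/triple buffering part of the question: triple buffering won't raise the raw framerate, but it stops double-buffered vsync from rounding every frame up to a whole refresh interval (the classic 60-to-30 drop). A simplified sketch of the effect at 60 Hz; this is an idealized model, and real behaviour depends on the driver and how the game queues frames:

```python
# Simplified model: effective FPS under 60 Hz vsync with and without triple buffering.
import math

REFRESH_MS = 1000 / 60  # ~16.7 ms per refresh

def fps_double_buffered_vsync(render_ms):
    """Each frame must wait for the next vblank, so render time rounds up
    to a whole number of refresh intervals (the 60 -> 30 fps cliff)."""
    intervals = math.ceil(render_ms / REFRESH_MS)
    return 1000 / (intervals * REFRESH_MS)

def fps_triple_buffered(render_ms):
    """Rendering continues into a third buffer, so throughput is only
    capped by the refresh rate, not rounded to a divisor of it."""
    return min(1000 / render_ms, 60)

for render_ms in (14, 18, 25, 35):
    print(f"{render_ms} ms/frame: "
          f"double-buffered vsync ~{fps_double_buffered_vsync(render_ms):.0f} fps, "
          f"triple buffered ~{fps_triple_buffered(render_ms):.0f} fps")
```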


EternalGamer said:
So, this game looks "about as good" at not-maxed-out settings. Why the hell are people so obsessed with being able to max out everything and still get 60FPS? When you are playing the game, you won't even notice half that shit anyway. You can only see it when you are flipping back and forth between huge screenshots, and even then it is really difficult.

Sup dude? Not on CoG anymore?
 
Durante said:
I think the only performance "problem" with the game is that they enabled "ubersampling" at ultra. They should have left it disabled by default -- some people just can't stand it if they cannot run a game at "max" settings, even if those settings are designed for future systems. The same thing happened with Crysis 1.

It really seems ridiculous to me. I get that people paid a lot of money for their setups and want to get optimal performance out of them, which is what this thread is supposed to be about.

But at this point, there seem to be a lot of people whining because they can't tick a bunch of boxes and still get a specific number on an FPS counter. Shit is ridiculous. Configure the game according to what your rig can handle and focus on the damn game, not on the little boxes you click on config screens.
 

Minsc

Gold Member
BoobPhysics101 said:
Turn off Ubersampling and/or Depth of Field - Gameplay.

I'd also try turning off AA if you're in need of a boost; a few people reported that doubling their framerate.
 

n0n44m

Member
felipepl said:
I think we should now give Crytek the praise they deserve for the job they did with their CryEngine 3. I mean, both CryEngine 3 and the RED engine are DX9 (so no "lulz Crytek sucks, DX9 only"). Also, it's strange that nobody complained "Witcher 2 is DX9 only, fail".

[...]

What I'm trying to say here is that Crysis 2 looks just as good (technically speaking; artistically it's a whole different game) and the performance is way better.

Don't want to derail the thread, but here's my €0,02:

The difference here, for me, is that Crytek has already done Far Cry and Crysis and is being funded by EA, whereas CD Projekt's first game ran on a terribly optimized 3rd-party engine.

For a first in-house engine it runs pretty smoothly, IMHO, and graphically it's more or less on par with Crysis 2 for me as well.

Crysis 2 was supposed to be the pinnacle of graphics technology... yet a small Polish developer (almost) matches them with their first serious attempt? Oh well, looks like Battlefield 3 will be satisfying my thirst for game engines filled with cutting-edge DX11 technology ;)
 

Minsc

Gold Member
EternalGamer said:
It really seems ridiculous to me. I get that people paid a lot of money for their setups and want to get optimal performance out of them, which is what this thread is supposed to be about.

But at this point, there seem to be a lot of people whining because they can't tick a bunch of boxes and still get a specific number on an FPS counter. Shit is ridiculous. Configure the game according to what your rig can handle and focus on the damn game, not on the little boxes you click on config screens.

I think a valid complaint is that some people are getting the exact same 20-40 fps on low as they are on high/ultra. The engine is not great for everyone. It doesn't seem to scale down very well at all for some people, there are a few oddities with settings giving some people massive boosts while others gain next to nothing, and there are memory/performance leaks that people report go away with a restart of the game.
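One way to make those comparisons concrete is to log a fixed run with FRAPS' benchmark mode on each preset and summarize the numbers afterwards. A small sketch for doing that, assuming a plain text/CSV log with one FPS value per line (roughly what FRAPS' per-second fps log looks like; adjust the parsing to whatever your file actually contains):

```python
# Summarize per-second FPS logs (e.g. FRAPS benchmark output) to compare presets.
# Assumes one numeric FPS value per line; header lines and blanks are skipped.
import sys
import statistics

def summarize(path):
    fps = []
    with open(path) as f:
        for line in f:
            token = line.strip().split(",")[0]
            try:
                fps.append(float(token))
            except ValueError:
                continue  # skip headers / blank lines
    if not fps:
        print(f"{path}: no numeric values found")
        return
    fps_sorted = sorted(fps)
    low_1pct = fps_sorted[len(fps_sorted) // 100]  # rough 1st-percentile low
    print(f"{path}: min {fps_sorted[0]:.0f}, avg {statistics.mean(fps):.1f}, "
          f"1% low ~{low_1pct:.0f}, max {fps_sorted[-1]:.0f}")

if __name__ == "__main__":
    # e.g. python fps_summary.py low.csv high.csv ultra.csv
    for log_path in sys.argv[1:]:
        summarize(log_path)
```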
 
BoobPhysics101 said:
Turn off Ubersampling and/or Depth of Field - Gameplay.




Sup dude? Not on CoG anymore?

Still post there regularly, just checking in here for suggestions for getting my Crossfire Radeons to work.

Like I said, I get that people want to get the best performance, but when people start spouting vitriol solely on the principle that they can't max stuff out, that seems a little ridiculous to me.

I mean, you could run this game on low settings and it would still look better than the vast majority of the games these people probably play on a regular basis, and better than the vast majority of games on consoles.
 

Corky

Nine out of ten orphans can't tell the difference.
EternalGamer said:
So, this game looks "about as good" at not-maxed-out settings. Why the hell are people so obsessed with being able to max out everything and still get 60FPS?

This isn't a case of people wanting more from less, i.e. expecting the game to be less taxing. Rather, it's a case of people wanting their parts to be stressed accordingly; people with high-end hardware aren't getting everything out of it.

When you are playing the game, you won't even notice half that shit anyway.

For certain games, like an RPG such as this, I agree. In others, framerate is essential.

You can only see it when you are flipping back and forth between huge screenshots, and even then it is really difficult.

Again, I agree to a certain extent; do a blind test with some (certain) options adjusted and people will have to guess which is which.
 

K.Jack

Knowledge is power, guard it well
Notebook specs:

i7-2630QM
6970M
8GB DDR3 RAM

I'm rocking a 6970M, which is essentially a desktop 6850.

These settings are working very well for me:

[settings screenshot]


I set Texture Memory Size to Very Large, because the 6970M comes with 2GB GDDR5.

Screens w/ FRAPS counter:

[in-game screenshots with FRAPS counter]
I'm going to test out adding MLAA to see the performance hit.
 