
Witcher 3 PC Performance Thread

Rodin

Member
Wonder how my specific XFX R9 runs it; it's kind of a mess to compare with all the different factory clocks and tweaks. I'll just have to wait to sink my teeth into the game on my own.

Biggest worry is of course my FX CPU

They said it's not a CPU-heavy game, so you should probably be fine, especially if it's an 8320 or 8350.
The game was updated with better LOD, more grass, etc. in the 1.02 build. On the older build (info in the downgrade thread) that was the Ultra setting; those Ultra settings are now the Very High setting in the new build, and the improved LOD/grass used in this benchmark is the new Ultra setting.

So the pics etc. from the downgrade thread would correspond to the Very High setting in this new build.

I think it works out like this.

PS4 = High setting.
Old build Ultra = Very High setting.
New build Ultra = Ultra.
Based on the info from the last page (or the one before), I don't think there's a Very High preset.
 

Agent_4Seven

Tears of Nintendo
I'm preparing for the worst in terms of performance on AMD's hardware.



They said it's not a cpu heavy game
And yet the game requires an 8-core FX CPU.
 

Gbraga

Member
The game was updated with better LOD, more grass, etc. in the 1.02 build. On the older build (info in the downgrade thread) that was the Ultra setting; those Ultra settings are now the Very High setting in the new build, and the improved LOD/grass used in this benchmark is the new Ultra setting.

So the pics etc. from the downgrade thread would correspond to the Very High setting in this new build.

I think it works out like this.

PS4 = High setting.
Old build Ultra = Very High setting.
New build Ultra = Ultra.

That would be nice. Even if there's not much of a difference in the overall looks of the game, it's nice to have actually taxing ultra settings.

I think it's really crazy that a Titan X gets ~60fps at 1080p in this game. That's a $1300 CAD GPU getting only double the FPS of the PS4 version. Sure, some of the effects look better than on PS4, but from the screen comparisons posted earlier it's not that big of a leap. On top of that, I went back to Dragon Age: Inquisition recently, and it compares pretty nicely with The Witcher from what I can tell. So does GTA V, and it performs better from what I can tell as well. I have a 970, so I can max it out and play it fine, but those numbers are a little shocking.

You answered it yourself: if the leap is not that big, set it to PS4 settings and you'll be able to have either much higher framerates or much higher resolutions.

Most likely both.

The game doesn't force you to run it on Ultra, you know.
 

Derp

Member
If someone can post the perfect config to run the game at a steady 60fps with a 970, I will be very thankful.

Of course when the game launches...

:D
I'm sure this thread will explode with people's configs, tweaks, tips, and info, which is why I love these threads. And it won't take long at all, which is awesome.
Watching a dude stream
I read that as darude sandstorm, read it twice again, realised it didn't say darude sandstorm, then went away from the thread and came back later, saw the post again and read it as darude sandstorm again.



The tiredness is real, folks.
 

Redmoon

Member
That's what I used to do before getting a 4K TV. Worked really well
Yeah. I used to use a custom 4K res before DSR came out.

Anyway, I'm on 4K DSR and SLI is still active in CSGO at least (I'm on an rcon connection so I can't do much). I could try a benchmark and see if it stacks up correctly.
 

Naedtrax

Banned
60fps with a Titan X seems off at 1080p... The only game for me that gets anywhere NEAR that low at 1440p on my Titan X is AC: Unity.
 

viveks86

Member
I think it's really crazy that a Titan X gets ~60fps at 1080p in this game. That's a $1300 CAD GPU getting only double the FPS of the PS4 version. Sure, some of the effects look better than on PS4, but from the screen comparisons posted earlier it's not that big of a leap. On top of that, I went back to Dragon Age: Inquisition recently, and it compares pretty nicely with The Witcher from what I can tell. So does GTA V, and it performs better from what I can tell as well. I have a 970, so I can max it out and play it fine, but those numbers are a little shocking.

Agreed, which is why it's a little suspicious: it contradicts Nvidia's own benchmarks, where they took demanding scenes into consideration. It could very well be a case of different builds again. Review builds and retail/public builds tend to be different code branches, because review builds need to be feature complete before launch day while retail/public builds don't (since it's locked till then anyway).
 

Easy_D

never left the stone age
They said it's not a CPU-heavy game, so you should probably be fine, especially if it's an 8320 or 8350.

Based on the info from the last page (or the one before), I don't think there's a Very High preset.

It's a 6300 D:. It's better than the minimum AMD requirement at least, and has two extra cores. Should do noticeably better.
 

Evo X

Member
Yeah. I used to use a custom 4K res before DSR came out.

Anyway, I'm on 4K DSR and SLI is still active in CSGO at least (I'm on an rcon connection so I can't do much). I could try a benchmark and see if it stacks up correctly.

DSR+SLI works just fine.

It's DSR+SLI+GSYNC that doesn't work yet.
 

Gbraga

Member
Agreed, which is why it's a little suspicious: it contradicts Nvidia's own benchmarks, where they took demanding scenes into consideration. It could very well be a case of different builds again. Review builds and retail/public builds tend to be different code branches, because review builds need to be feature complete before launch day while retail/public builds don't (since it's locked till then anyway).

Yeah, Nvidia's recommendations are what make it weird, but the fact that the card doesn't run it that well on Ultra isn't a bad thing by itself. People need to stop getting so attached to setting names.
 

viveks86

Member
Yeah. I used to use a custom 4K res before DSR came out.

Anyway, I'm on 4K DSR and SLI is still active in CSGO at least (I'm on an rcon connection so I can't do much). I could try a benchmark and see if it stacks up correctly.

It will be active, but it will secretly use just one of your cards. You probably didn't notice because CSGO isn't too demanding. Disable one card and you'll see your performance is exactly the same.

DSR+SLI works just fine.

It's DSR+SLI+GSYNC that doesn't work yet.

Nope. Not at 4K. DSR+SLI works fine for lower resolutions such as 1440P.
 

Kaze2212

Member
Agreed, which is why it's a little suspicious: it contradicts Nvidia's own benchmarks, where they took demanding scenes into consideration. It could very well be a case of different builds again. Review builds and retail/public builds tend to be different code branches, because review builds need to be feature complete before launch day while retail/public builds don't (since it's locked till then anyway).

I don't know which build was previewed at the NVIDIA event in Munich, but one guy stated that he was playing it on Ultra with HairWorks disabled on a GTX 970 at around 50-60 fps, IIRC. And the NVIDIA/CDPR guys told him there will still be a performance patch. The NVIDIA event was about a week to a week and a half ago.
 

viveks86

Member
I don't know which build was previewed at the NVIDIA event in Munich, but one guy stated that he was playing it on Ultra with HairWorks disabled on a GTX 970 at around 50-60 fps, IIRC. And the NVIDIA/CDPR guys told him there will still be a performance patch. The NVIDIA event was about a week to a week and a half ago.

Forget the Munich event altogether. It was a complete mess. A glaring case of incompetence from Nvidia/CDPR there. Any technical analysis based on that event is null and void.
 

Sanctuary

Member
Hmm, I already knew I wasn't going to get 60fps on Ultra with my GTX 780, but I think I'm actually going to just lock it at 30fps. This will be the first time I've ever done this, but the game bouncing around between 45 and 60 is a lot more jarring than having it stuck at 30. It's not like it has Dark Souls combat either, so it's not going to really change much in that regard. Next year, whenever Pascal is released, I'll bump it back up to 60.
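The reason a 45-60 swing can feel worse than a locked 30 comes down to frame pacing. Quick back-of-the-envelope math (just an illustration, not anything measured from the game):

```python
# Rough frame-time math: why a fluctuating 45-60fps can feel worse
# than a locked 30fps. (Illustration only, not measured data.)

def frame_time_ms(fps):
    """Milliseconds each frame is on screen at a given framerate."""
    return 1000.0 / fps

locked_30 = frame_time_ms(30)   # constant ~33.3 ms, every single frame
at_45 = frame_time_ms(45)       # ~22.2 ms
at_60 = frame_time_ms(60)       # ~16.7 ms

# Bouncing between 45 and 60 means frame delivery varies by ~5.6 ms,
# while a lock keeps pacing perfectly even.
print(f"locked 30fps: {locked_30:.1f} ms per frame")
print(f"45-60fps: {at_60:.1f} to {at_45:.1f} ms per frame")
```

The constant cadence is what makes the lock feel smoother than the higher but uneven framerate.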
 

Kaze2212

Member
Forget the Munich event altogether. It was a complete mess. A glaring case of incompetence from Nvidia/CDPR there.

Yeah, it is still very likely that they had the old preview version. So I guess we will all find out later today/tomorrow how the game runs. :)
 

Rodin

Member
The site that did the benchmark seems to state that a new Very High level has been added to the game, unless the translation is wrong.
Well, now I'm officially confused xD

Do we know if other websites are preparing benchmarks before the game is out (maybe some of them said something like "TW3 benchmarks on their way" on their home pages)? The Nvidia performance guide would be especially helpful.

It's a 6300 D:. It's better than the minimum AMD requirement at least, and has two extra cores. Should do noticeably better.

Yup, should do the trick :)
 

UnrealEck

Member
I don't know which build was previewed at the NVIDIA event in Munich, but one guy stated that he was playing it on Ultra with HairWorks disabled on a GTX 970 at around 50-60 fps, IIRC. And the NVIDIA/CDPR guys told him there will still be a performance patch. The NVIDIA event was about a week to a week and a half ago.

There is a performance patch. He was implying that performance will decrease with it, contrary to what people would typically expect from a performance patch.

yes I'm kidding
 

elelunicy

Member
It will be active, but it will secretly use just one of your cards. You probably didn't notice because CSGO isn't too demanding. Disable one card and you'll see your performance is exactly the same.



Nope. Not at 4K. DSR+SLI works fine for lower resolutions such as 1440P.

DSR+SLI doesn't work if your base resolution is 4K (i.e. 8K DSR). It definitely works from 4K down to a 1080p monitor.
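For anyone confused by the factors here: a DSR factor multiplies the pixel count of your monitor's native resolution, so the combos being discussed work out like this (quick sketch of the resolution math only, nothing driver-specific):

```python
import math

def dsr_resolution(base_w, base_h, factor):
    """Render resolution for a DSR factor.

    The factor multiplies total pixel count, so each axis
    scales by sqrt(factor).
    """
    scale = math.sqrt(factor)
    return round(base_w * scale), round(base_h * scale)

# 4x DSR on a 1080p monitor renders at 4K, then downsamples to 1080p.
print(dsr_resolution(1920, 1080, 4))   # (3840, 2160)

# 4x DSR on a 4K monitor would mean an 8K render - the combo
# that reportedly breaks with SLI.
print(dsr_resolution(3840, 2160, 4))   # (7680, 4320)
```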
 

OmegaDL50

Member
7970 GHz and i7 standing in line with you

i5 3570k and HD7950 here as well.

Any other HD7000 users here?

I see no results for the HD 7950 and FX 8350 in a variety of settings and screen resolutions. These benchmarks are useless to me.

The R9 280X is a rebadged HD 7970, for reference. The 7950 isn't much worse than it; if you OC your 7950 to the same clocks as a 7970 (i.e. 925MHz core and 1375MHz memory), it's about a 5% difference.
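Quick math on how big that OC actually is (the stock 7950 clocks below are from memory, so treat them as assumptions rather than something from this thread):

```python
# Hypothetical sanity check: how big a clock bump gets an HD 7950
# to HD 7970 reference clocks? Stock 7950 clocks are assumed from
# memory; target clocks are the ones quoted above.
stock_core, stock_mem = 800, 1250    # MHz (assumed reference 7950)
target_core, target_mem = 925, 1375  # MHz (7970 clocks, from the post)

core_bump = (target_core / stock_core - 1) * 100
mem_bump = (target_mem / stock_mem - 1) * 100

print(f"core: +{core_bump:.1f}%, memory: +{mem_bump:.1f}%")
```

So it's a healthy overclock, but well within what those cards were known to handle.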
 

Soi-Fong

Member
Hmm... I'm wondering if I can get a solid 30fps on my 780 Ti OC + i7 2600K @ 4.2GHz...

I'm hoping to at least have Uber settings at 1080p and maybe GameWorks.
 

Kezen

Banned
I see no results for the HD 7950 and FX 8350 in a variety of settings and screen resolutions. These benchmarks are useless to me.

It shows that the game is not "biased" towards Nvidia, so I don't see why AMD performance would be anything but satisfactory.

Don't overestimate what your PC is capable of.

GTX 780 sub-30fps? Bullshit, I don't believe it.
What's shocking or unbelievable?
 

Gbraga

Member
GTX 780 + i5 2500K here. Should I get this on PS4 instead? Serious question.

No.

Unless you really want to play it on PS4 for whatever reason (party chat with your friends, you can't connect your PC to the TV and would prefer to play on a TV or something like that), but if performance is the only thing you're worried about, don't be.
 

Redmoon

Member
It will be active, but it will secretly use just one of your cards. You probably didn't notice because CSGO isn't too demanding. Disable one card and you'll see your performance is exactly the same.



Nope. Not at 4K. DSR+SLI works fine for lower resolutions such as 1440P.

I'll have to check then. I mentioned CSGO because that's what I had running at the moment.
This would mean I could have played GTA V at 90+ FPS maxed, as I was averaging 45 most of the time :(

I just don't want to lose the sharpness of DSR, as my custom res feels blurry and has a hint of ghosting on high-contrast text (like RTSS, for example).
 