
The Witcher 3: Wild Hunt Official System Requirements

What I wonder is, will we have another Enhanced Version post-release with further optimization (like the first two), or have they baked that time in this time round?
 
What I wonder is, will we have another Enhanced Version post-release with further optimization (like the first two), or have they baked that time in this time round?

Well, Witcher 2 got its big FPS boost solely from removing the DRM (fuck SecuROM forever), but I'm certain they'll work hard to improve the game even after release; that's just their style.
 
Sorry for the off-topic, but what CPU cooler and vcore are you running?

[CPU-Z screenshot]


My CPU cooler is a Cooler Master Hyper 212 Evo in the pull configuration.
 
Welp, I have a mildly overclocked FX-6300, 8GB DDR3 and a GTX 660.

I am surprised the minimum requirements are so low given how the game looks.

I'm sure it will look like shit on my rig though.
 
I'm referring to this.
It's not the first time I've seen this shit.
Ubi did the same thing with Unity:



And everyone and their moms knows how an i5 fares against an FX, and even the i7, when we're talking about games.
And lol @ the PII 940.

What that basically says is that you need a quad-core at minimum, but they recommend eight cores.
 
Well, Witcher 2 got its big FPS boost solely from removing the DRM (fuck SecuROM forever), but I'm certain they'll work hard to improve the game even after release; that's just their style.

Ah... I had no idea. Well, that's heartening; I don't think I'm going to be able to wait until after launch for a more optimized experience.
 
I might build a PC when Witcher 3 comes out. By then, hopefully AMD's HBM card will be released, DDR4 will be cheaper, and so will the Intel CPUs.
 
I was waiting on W3 to upgrade my 780.

It seems like Nvidia is really waiting to release those 6GB cards that aren't Titans. Hopefully they'll have them closer to W3's release date. I have no confidence that I'll get 60fps with a 780, even at 1080p. Not blaming anyone, but I want that 60fps experience, and I hope Nvidia will have something at $350 that'll do it by W3's release.
 
Well, there was a poster here, vetted by Bish, who said the game is actually a mess and in development hell. He said it is not at all what CDPR is leading people to believe it is.

Which is...worrisome to say the least.

Practically every big game goes through development hell, so it's hard to tell from one person's perspective how troubled a game's development really is.

They had a delay, so they've taken some measures.
 
Good thing I got a PS4 instead of upgrading.

Something like this is posted in every PC requirements thread, and when the game releases, it always turns out to be false.

Will we get a proper response to 'PR' specs in my lifetime? It's the same debate every month, for, I don't know, 7-8 years now.
 
Something like this is posted in every PC requirements thread, and when the game releases, it always turns out to be false.

Will we get a proper response to 'PR' specs in my lifetime? It's the same debate every month, for, I don't know, 7-8 years now.

Why would that have to be a debate?


I am in the same boat.


I will be playing on PS4 because if I upgraded right now to a flagship GPU, my GPU still wouldn't be powerful enough to run the game as I would want to play it on PC.

Which is at 4K, at 40+FPS, maxed settings.


If I can't get that substantial boost in quality now, I will settle for the decent looking PS4 version, at 900p-1080p, without spending anything on hardware until it gets up to snuff with what I want.
 
How have you managed to make TW2 not run well on a Titan? Are you playing at 4K with ubersampling enabled?

One thing's for sure: you must be doing something wrong. TW2 without ubersampling runs really well on a very broad range of GPUs from 2010 onwards.

Post is full of so much wut

Played TW2 on a 1080p display at high settings with SSAO, übersampling and AA disabled; it had a lot of hitching / micro-freezes for me, usually in combat. With vsync I kept getting irritating dips in frame rate as well; without vsync the game never dipped under 60fps but, like any game, it wasn't as smooth and I had to deal with screen tearing.

With dragon age running the way it does on my rig I'm not very optimistic about future games.

Titan to Titan Black... why? I say this as someone who went from a 780 to a TB: go 980 for gaming. Don't do it unless you have a CGI rendering project you're rendering on the side; that's the only thing a TB will beat a 980 in.

FYI, TBs are $750 new on eBay. If you must get rid of your Titan, do it now, before the Maxwell Titans show up.

I'm not too sure which direction to go, I don't want to lose the VRAM but at the same time really tempted to go for 2 x 970 or 980.
 
Well, there was a poster here, vetted by Bish, who said the game is actually a mess and in development hell. He said it is not at all what CDPR is leading people to believe it is.


Which is...worrisome to say the least.

Yeah, but it's also gone through two delays for polishing, so who knows. I'm sure the game will still have plenty of bugs and glitches; every open-world game I've ever played has had its share.
 
I was waiting on W3 to upgrade my 780.

It seems like Nvidia is really waiting to release those 6GB cards that aren't Titans. Hopefully they'll have them closer to W3's release date. I have no confidence that I'll get 60fps with a 780, even at 1080p. Not blaming anyone, but I want that 60fps experience, and I hope Nvidia will have something at $350 that'll do it by W3's release.

Nvidia will be dragged kicking and screaming into the 6GB consumer ring, just as it was with 3GB and 4GB. I'll be pleasantly surprised if we see x70/x80 parts (or whatever their future equivalents are to be) with 6GB+ VRAM as standard before 2017.
 
Nvidia will be dragged kicking and screaming into the 6GB consumer ring, just as it was with 3GB and 4GB. I'll be pleasantly surprised if we see x70/x80 parts (or whatever their future equivalents are to be) with 6GB+ VRAM as standard before 2017.

I have no idea why nVidia are so reluctant to release consumer cards with decent amounts of memory. It makes no sense.
 
Why would that have to be a debate?
Because this:
I will be playing on PS4 because if I upgraded right now to a flagship GPU, my GPU still wouldn't be powerful enough to run the game as I would want to play it on PC.

Which is at 4K, at 40+FPS, maxed settings.
Is just so completely nonsensical to a lot of us. Why would you require the same game to run with roughly five times the pixel throughput and higher graphical quality on PC?
Why would you rather play a lesser version?
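For what it's worth, the "five times" figure is easy to sanity-check against the PS4 version's 900p-1080p range; a quick back-of-the-envelope sketch (nothing here comes from the thread beyond the resolutions themselves):

```python
# Pixel counts for the resolutions mentioned in the thread.
resolutions = {
    "4K": (3840, 2160),
    "1080p": (1920, 1080),
    "900p": (1600, 900),
}

pixels = {name: w * h for name, (w, h) in resolutions.items()}

# 4K pushes 4.0x the pixels of 1080p and 5.76x the pixels of 900p,
# so "five times the pixel throughput" is a fair midpoint.
print(f'4K vs 1080p: {pixels["4K"] / pixels["1080p"]:.2f}x')
print(f'4K vs 900p:  {pixels["4K"] / pixels["900p"]:.2f}x')
```

And that's before the extra cost of maxed settings on top of the raw resolution bump.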
 
I have no idea why nVidia are so reluctant to release consumer cards with decent amounts of memory. It makes no sense.
It's expensive and it will make zero difference in the benchmarks. Their current GPUs don't have enough grunt to be memory limited. Dual GPU users will feel the pain first though.
 
Hope my i7 930 @ 4.0GHz will do the job. DA: Inquisition had the same minimum CPU specs and I'm able to run that at 60 for the most part.
 
Why would that have to be a debate?


I am in the same boat.


I will be playing on PS4 because if I upgraded right now to a flagship GPU, my GPU still wouldn't be powerful enough to run the game as I would want to play it on PC.

Which is at 4K, at 40+FPS, maxed settings.


If I can't get that substantial boost in quality now, I will settle for the decent looking PS4 version, at 900p-1080p, without spending anything on hardware until it gets up to snuff with what I want.

Wtf is this nonsense? You are still getting a huge amount out of high end PC parts compared to the PS4.
 
Because this:
Is just so completely nonsensical to a lot of us. Why would you require the same game to run with roughly five times the pixel throughput and higher graphical quality on PC?
Why would you rather play a lesser version?

I think the mindset is that, if they can't play the game at an order of magnitude higher quality, an upgrade is not worth the trouble. I guess the person's threshold for value means putting money down on a PC upgrade isn't worth it for the performance gained right now.

Which to me is dumb, because you're still getting a better experience, but to each his own. If I cared at all about having better performance on PC as opposed to console, you can bet your bottom dollar I'd be getting all my third-party games on PC by now.
 
Why would that have to be a debate?


I am in the same boat.


I will be playing on PS4 because if I upgraded right now to a flagship GPU, my GPU still wouldn't be powerful enough to run the game as I would want to play it on PC.

Which is at 4K, at 40+FPS, maxed settings.


If I can't get that substantial boost in quality now, I will settle for the decent looking PS4 version, at 900p-1080p, without spending anything on hardware until it gets up to snuff with what I want.
It's just You, and it's completely illogical, but hey, it's Your opinion and You have full right to say it; that doesn't make it correct.
I mean, just having the ability to run the same games at even the same settings at double the framerate of the PS4 is much better value for me than any 'couch' or 'mobile' advantage a console can give You.
You have a choice of higher settings, or higher framerate, or higher IQ on PC, and You additionally get better versions of games that are backwards compatible to infinity.
 
My CPU is an i7-2600K. I just looked it up and compared: it's almost identical in performance, so hopefully that's good enough for max settings.
 
Why would that have to be a debate?


I am in the same boat.


I will be playing on PS4 because if I upgraded right now to a flagship GPU, my GPU still wouldn't be powerful enough to run the game as I would want to play it on PC.

Which is at 4K, at 40+FPS, maxed settings.


If I can't get that substantial boost in quality now, I will settle for the decent looking PS4 version, at 900p-1080p, without spending anything on hardware until it gets up to snuff with what I want.

But is your current PC as good as or slightly better than PS4 spec? If so, it makes no sense to get the console version. The PC version will not only be cheaper but will allow you to tweak settings to run at a level acceptable to you, rather than what has become the norm for graphically intensive games on consoles (30fps with frequent dips into the 20s).

That's why it makes no sense. The mindset of "it has to run at max settings or I'm getting the console version" is broken logic.


Edit

Just to add to my earlier point: Assassin's Creed Unity on Steam has a minimum GPU of a GTX 680. That is a fair bit better than my GTX 765M. Guess what: it runs fine on my card if you get your settings right. Drop the resolution to 768p or 900p, use low/medium settings, and turn certain things off.

According to notebookcheck.net, the GTX 680 runs Unity on ultra at 30-49 fps. And that's a minimum-spec card.

So anyone panicking because they are on minimum: wait till it's out. I bet it will still run and look great.

Only be concerned if you have to play maxed out for some reason.
 
It's just You, and it's completely illogical, but hey, it's Your opinion and You have full right to say it; that doesn't make it correct.
I mean, just having the ability to run the same games at even the same settings at double the framerate of the PS4 is much better value for me than any 'couch' or 'mobile' advantage a console can give You.
You have a choice of higher settings, or higher framerate, or higher IQ on PC, and You additionally get better versions of games that are backwards compatible to infinity.

What he is saying, for the reasons he says it, makes sense.

He believes he shouldn't upgrade his PC now and should rather do it at a later date when it meets his requirements.

I personally think 4K at 40+ fps is a bit overkill, but hey if this is what his goal is, it makes sense.

I mean, I can certainly understand waiting to upgrade as more and more games (and of course graphics cards) that push the boundaries come out. We are still early in this generation of games. Witcher 3 (unless shit hits the fan with this game, based on the info we get these days) seems to be the first truly next-gen game that will push the boundaries. Maybe waiting 6 months will do wonders, both for his wallet and for the specs of his PC, when the technology meets his requirements.

And to add: no, I don't think ACU pushes any boundaries. It's a game from Ubisoft that has some utterly disgusting issues even on the highest-end PC available.
 
As someone pointed out, the most important factor is the GPU. The differences between desktop and mobile chips aren't as huge as some people make them out to be. Besides the naming scheme, the biggest difference is the frequency they run at (also binning and voltage, but that's not too important). Sure, it won't be as fast as a 4790K clocked to >4GHz, but you'll be fine CPU-wise.

Don't take game specs too literally, they just have to draw a line somewhere.

OK, thanks. Hope I do manage to run it fine, but dat 40 GB, oof.
 
Honestly, I will play on PS4 as well if I can't hit 60fps on reasonable settings, with AMD's 380X still nowhere in sight.

I can tolerate 30fps on consoles but 30fps on PC is way too jank.
 
Well, there was a poster here, vetted by Bish, who said the game is actually a mess and in development hell. He said it is not at all what CDPR is leading people to believe it is.


Which is...worrisome to say the least.

It should be taken with a grain of salt, like any information provided by some random guy on the internet.
 