R9 290 here, feels good.
Same here
> R9 290 here, feels good.
Is my 2600k in trouble?
Same here. Got this card for Christmas. Perfect timing. I am still considering getting it on PS4 though. Not sure yet.
> Is my 2600k in trouble?
What I wonder is: will we get another Enhanced Edition post-release with further optimization (like the first two games?), or have they baked that time in this time around?
> R9 290 here, feels good.
Sorry for going off topic, but what CPU cooler and vcore are you running?
My 2500k is OC'd to 4.0 GHz. I should be OK, right?
I'm referring to this.
It's not the first time I've seen this shit.
Ubi did the same thing with Unity:
And everyone and their mom knows how an i5 fares against an FX, and even the i7, when we are talking about games.
And lol @ PII940
Well, Witcher 2 got its big FPS boost solely from removing the DRM (fuck SecuROM forever), but I'm certain they'll work hard to improve the game even after release; that's just their style.
I have a feeling this game will make Unity look like child's play in terms of glitches and performance issues.
> I have a feeling this game will make Unity look like child's play in terms of glitches and performance issues.
Yeah... Probably not.
You're dreaming.
Well, there was a poster here, vetted by Bish, who said the game is actually a mess and in development hell. He said it is not at all what CDPR is leading people to believe.
Which is... worrisome, to say the least.
Good thing I got a PS4 instead of upgrading.
Something like this gets posted in every PC requirements thread, and when the game releases, it always turns out to be false.
Will we ever get a proper response to 'PR' specs in my lifetime? It's been the same debate every month for, I don't know, 7-8 years now.
How have you managed to make TW2 not run well on a Titan? Are you playing at 4K with ubersampling enabled?
One thing is certain: you must be doing something wrong. TW2 without ubersampling runs really well on a very broad range of GPUs from 2010 onwards.
Post is full of so much wut
TITAN to TITAN BLACK... why? I say this as someone who went from a 780 to a TB: go 980 for gaming. Don't go TB unless you have a CGI rendering project you're rendering on the side; that's the only thing a TB will beat a 980 at.
FYI, TBs are $750 new on eBay. If you must get rid of your Titan, do it now, before Maxwell Titans show up.
I was waiting on W3 to upgrade my 780.
It seems like Nvidia is really waiting to release those 6GB cards that aren't Titans. Hopefully Nvidia will have them closer to W3's release date. I have no confidence that a 780 will get me 60 fps even at 1080p. Not blaming anyone, but I want that 60 fps experience, and I hope Nvidia will have something at $350 that can deliver it by W3's release.
Nvidia will be dragged kicking and screaming into the 6GB consumer ring, just as it was with 3GB and 4GB. I'll be pleasantly surprised if we see x70/x80 parts (or whatever their future equivalents are to be) with 6GB+ VRAM as standard before 2017.
It's expensive and it will make zero difference in the benchmarks. Their current GPUs don't have enough grunt to be memory limited. Dual-GPU users will feel the pain first, though. I have no idea why Nvidia are so reluctant to release consumer cards with decent amounts of memory. It makes no sense.
Why would that have to be a debate?
I am in the same boat.
I will be playing on PS4 because if I upgraded right now to a flagship GPU, my GPU still wouldn't be powerful enough to run the game as I would want to play it on PC.
Which is at 4K, at 40+FPS, maxed settings.
If I can't get that substantial boost in quality now, I will settle for the decent looking PS4 version, at 900p-1080p, without spending anything on hardware until it gets up to snuff with what I want.
Because this:
Is just so completely nonsensical to a lot of us. Why would you require the same game running with 5 times the pixel throughput and higher graphical quality on PC?
Why would you rather play a lesser version?
> Why would that have to be a debate?
> I am in the same boat.
> I will be playing on PS4 because if I upgraded right now to a flagship GPU, my GPU still wouldn't be powerful enough to run the game as I would want to play it on PC.
> Which is at 4K, at 40+FPS, maxed settings.
> If I can't get that substantial boost in quality now, I will settle for the decent looking PS4 version, at 900p-1080p, without spending anything on hardware until it gets up to snuff with what I want.
I'm sorry, you'll have to speak up; I can't hear you over the sound of my stock fan!
> Why would that have to be a debate?
> I am in the same boat.
> I will be playing on PS4 because if I upgraded right now to a flagship GPU, my GPU still wouldn't be powerful enough to run the game as I would want to play it on PC.
> Which is at 4K, at 40+FPS, maxed settings.
> If I can't get that substantial boost in quality now, I will settle for the decent looking PS4 version, at 900p-1080p, without spending anything on hardware until it gets up to snuff with what I want.
It's just you, and it's completely illogical. Hey, it's your opinion and you have every right to state it, but that doesn't make it correct.
I mean, just having the ability to run the same games at the same settings at double the framerate of the PS4 is much better value for me than any 'couch' or 'mobile' advantage a console can give you.
On PC you have a choice of higher settings, higher framerate, or higher IQ, and you additionally get better versions of games that are backwards compatible indefinitely.
As someone pointed out, the most important factor is the GPU. The differences between desktop and mobile chips aren't as huge as some people make them out to be. Besides the naming scheme, the biggest difference is the frequency they run at (also binning and voltage, but that's not too important). Sure, it won't be as fast as a 4790K clocked above 4 GHz, but you'll be fine CPU-wise.
Don't take game specs too literally, they just have to draw a line somewhere.