
Watch Dogs PC specs (x64 only, Quad Core minimum, recommended 8-core and 2GB VRAM)

kharma45

Member
Thank you pal :)

You will have to pardon me being in the dark on most of the release info, since I don't actively look at new CPUs when I have little intention of upgrading a part or already feel adequate with what I have. I do right now with my 3930K, but I am keeping a close eye on Haswell-E and eyeballing an upgrade when that hits alongside DDR4.

No problem. Your 3930K is still the dog's balls, I wouldn't be too worried.

I'm still using an i7 920 overclocked to 3.8GHz as well. It has been very disconcerting that CPU requirements and utilization in games haven't seen a significant improvement over the course of several years now. It appears that we're finally seeing significant changes; at the very least, games will be fully optimized to take advantage of up to 8 threads. Even so, I expect our CPUs to stay moderately strong for the immediate future. Perhaps Haswell-E or another follow-up microarchitecture (Broadwell?) will bring the significant performance improvements to merit a full upgrade.

Broadwell won't, it's just a Haswell die shrink. Skylake will be a proper performance bump for the consumer market, but after the mediocre improvement of Haswell over Ivy Bridge, who knows what it'll actually bring.

But, I want it to look better than consoles :(

It will.
 
I'm just gonna play this on PS4.

That version sounds pretty damn good.

It won't be high-end-PC awesome, but it will be great.
 
Stealth brag?

I'm quite proud of myself and my random chip lottery.

Seriously though, I just want to see if running a 4C/4T cpu faster would help or will it still be somewhat of a bottleneck.

... or, it could be another AC3 and it doesn't matter at all because, you know, ubisoft optimization™.
 

Grief.exe

Member
Wait, Durante said that frame rate is CPU dependent. So why does getting a better GPU always increase frame rate by gobs? What am I missing here?

If what he says is true, then I should be getting better bang for my buck on frame rate since my i7 3770K will be using more cores, but I thought that was for physics, general processing, actors, and AI. I thought the GPU was by far the dominating factor for frame rate.

I'm confused now!

If your CPU can't prepare frames (game logic, physics, draw calls) fast enough for your GPU, the extra frames the GPU could potentially render never get submitted to it.

Frame rate is CPU dependent up to a point, but you run into huge diminishing returns once the CPU is fast enough to keep your GPU fed.
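A rough way to picture it (toy numbers, nothing measured): every frame has to be both prepared by the CPU and drawn by the GPU, so frame time ends up dictated by whichever side is slower.

```python
# Toy model of the CPU/GPU bottleneck. Each frame must be prepared
# by the CPU and rendered by the GPU; assuming the two overlap fully,
# frame time is governed by whichever takes longer per frame.
# All numbers below are illustrative, not real benchmarks.

def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Frames per second given per-frame CPU and GPU work in milliseconds."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# A CPU needing 10 ms per frame can never feed more than 100 FPS.
print(fps(cpu_ms=10, gpu_ms=20))  # 50.0  -> GPU-bound: a faster GPU helps
print(fps(cpu_ms=10, gpu_ms=10))  # 100.0 -> balanced
print(fps(cpu_ms=10, gpu_ms=5))   # 100.0 -> CPU-bound: a faster GPU is wasted
```

Which is why a better GPU "always" helps right up until it doesn't: once the GPU finishes each frame faster than the CPU can submit the next one, extra GPU power buys nothing.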
 

Serandur

Member
No problem. Your 3930K is still the dog's balls, I wouldn't be too worried.



Broadwell won't, it's just a Haswell die shrink. Skylake will be a proper performance bump for the consumer market, but after the mediocre improvement of Haswell over Ivy Bridge, who knows what it'll actually bring.



It will.
I find it so depressing. How long has it been since we've seen real progress on the CPU performance front? The jump to Sandy Bridge? Intel needs competition so badly it hurts.
 
I find it so depressing. How long has it been since we've seen real progress on the CPU performance front? The jump to Sandy Bridge? Intel needs competition so badly it hurts.

Agreed. The mobile market has been infinitely more intriguing (and getting more competitive).
 

Grief.exe

Member
I find it so depressing. How long has it been since we've seen real progress on the CPU performance front? The jump to Sandy Bridge? Intel needs competition so badly it hurts.

No one can currently match the amount of money Intel can put into R&D, and the ones that can don't see a huge profit gain from going ARM to x86.
 

LiquidMetal14

hide your water-based mammals
I find it so depressing. How long has it been since we've seen real progress on the CPU performance front? The jump to Sandy Bridge? Intel needs competition so badly it hurts.

Intel has been innovating, but pricing their best CPUs at a premium. If 8-core Haswell-Es come out at around the same price as the 3930K, then I will be packing my bags and upgrading. I won't pay $1k for a CPU though.
 

antitrop

Member
... We need some kind of alert in the OP that your top of the line, overclocked CPU will still be able to play Watch_Dogs or something.
 

kharma45

Member
I find it so depressing. How long has it been since we've seen real progress on the CPU performance front? The jump to Sandy Bridge? Intel needs competition so badly it hurts.

Yeah, Sandy Bridge was the last major jump, and since then really fuck all has happened bar Intel focusing on cutting power consumption, which in itself they've done a great job with in the mobile space.

AMD I think knows the higher end stuff is a lost cause but I've high hopes for Kaveri. Steamroller cores and a GCN GPU could make a very good APU.

I never thought I'd be concerned about whether my i5-3570K could run something at maximum...

You shouldn't be.
 

dmr87

Member
Hmmm, I should be fairly okay with a 3770k@4.4ghz and a 3GB GTX780 but we'll see.

I'm confused, will my i7 3820 be enough CPU wise?

I just bought a Radeon 7950. Should I be good?

Alright, i have an i5-3570k OC@ 4.6ghz + 670. AM I GOOD?

So I have an i7 2600K OC'd at 4.5GHz, Crossfire 7970s, and 32GB DDR3 RAM. I should be fine? I just don't know about that CPU. I thought we only have up to quad cores so far? And the 2600K has hyperthreading or something?

I'm not sure if I'm ready for it.

Are my Intel i5-4670k and GTX 660 still okay? I have 8 gigs of RAM. Bought this PC like 4 weeks ago. Lol

Am I screwed then...

  • Core i5-2500
  • 16GB Ram
  • x64 Windows 8.1
  • AMD Radeon HD 7800 Series

I wonder if my i7 2700k will hold me back by much.

3570k @ 4.4Ghz
770 4gb

If my 3570k causes me problems I'm going to punch a wall.

Well, I got everything but the 8 core. I'm running a 3570k Quad. I think I'll be fine.

.........Right?

Well fuck, I was going to go for 8 core when I made my build a month back but I was told it would be a waste and never utilised as 4 core current CPUs are already overkill.

I have an ultra capable GPU but my 4670k is preventing me from hitting all the recommended ultra specs :/

I have no idea how to compare CPUs. I have a i7 2700k @ 4.6, will that be enough for WD at 1440p?

8 core recommended? Da fuck outta here.

I wonder if my 3570k will be enough for next gen.

I can't believe my Sandy Bridge i7 3.4GHz will already be obsolete soon... I thought that and a 7950 would last me for the PS4/Xbox One ports, but it doesn't seem like it if things are already starting off like this.

The question now is: will my i7-3770 and GTX 660 run this better than the PS4?

Great. Gonna have a hard time deciding between an i5 and an i7 if the former's days are that numbered.

Bit upset by this news; I only just upgraded to an i5 3570K, with a GTX 680 and 16GB RAM. We shall see.



Sorry about a million quotes, I had to.

I find it so depressing. How long has it been since we've seen real progress on the CPU performance front? The jump to Sandy Bridge? Intel needs competition so badly it hurts.

Agreed, still on a Sandy @ 4.5GHz; haven't seen a reason to upgrade so far. Beastly CPU.
 

Grayman

Member
Seems like we're hearing the first death knells of playing modern games on dualcores.

Though really given how people tell me BF3 ran on dualcores, we've been seeing signs for a while.

I got some good use from my dual core, but around 2010 or so I started wondering if I should have gotten a quad. Most games where I got lower performance than I wanted have always seemed to be CPU bound.

I am looking forward to a new build in the next few cycles. Probably going to go overboard on cores this time.
 

kharma45

Member
... We need some kind of warning in the OP that your top of the line overclocked CPU will still be able to play Watch_Dogs or something.

Hell even stock i5s will be fine.

Thread title should be 'Watch Dogs PC specs (x64 only, Quad Core minimum, recommended 8-core and 2GB VRAM) - Ignore the CPU recommendations'

So many people are getting worried over frankly nothing, but I appreciate where it is coming from. Ubisoft, I believe, are talking through their holes here.
 
intel is too occupied with laptop CPUs and trying to bring down their power demands so they can be good for tablets and mobile. they're not really concerned with their really high-end CPUs. you can see that in the fact that we haven't gotten any 8-core variants of their consumer i7 chips, nor do they even update their hexacore chips on a regular basis.

i wish AMD didn't suck so much at the high end and had CPUs on 22nm so they didn't suck so much power. we'd at least be able to get some 8-core chips that don't draw as much power as a Titan.
 

charsace

Member
I wonder how those people would have survived the 90s or early 00s, when your 2 year old system really was out of date.

They would not have survived. Especially the '90s, where you could have a good card and still get poor performance, or even software mode.
 

ghst

thanks for the laugh
it's weird how recommended specs are actually recommended specs nowadays - even highballing it with the octocore (yeah right) bit.

it's a far cry from the PC gaming i remember where recommended specs purely meant that the game would run slightly less like dogshit than minimum specs.
 

Fantasmo

Member
Hell even stock i5s will be fine.

Thread title should be 'Watch Dogs PC specs (x64 only, Quad Core minimum, recommended 8-core and 2GB VRAM) - Ignore the CPU recommendations'

So many people are getting worried over frankly nothing, but I appreciate where it is coming from. Ubisoft, I believe, are talking through their holes here.
Well, I for one have been worried, because Crysis 3 (obviously) isn't running 60fps at 1080p on my 670 / 3770K, but what actually worried me a lot was Far Cry 3 not being able to do that either with 2xAA.

I freaking bought the thing last year and a handful of games don't do 60@1080p already. Now I see Battlefield 4 people struggling and this thread... shit.. I already feel like I'm obsolete for 60/1080 and I was under the impression I bought a beast then!
 
Well, I for one have been worried, because Crysis 3 (obviously) isn't running 60fps at 1080p on my 670 / 3770K, but what actually worried me a lot was Far Cry 3 not being able to do that either with 2xAA.

I freaking bought the thing last year and a handful of games don't do 60@1080p already. Now I see Battlefield 4 people struggling and this thread... shit.. I already feel like I'm obsolete for 60/1080!

BF4 is running like crap for most people though, from what I see. I only get around 30-36 FPS, while in BF3 at the same settings I hover right near 60.
 

antitrop

Member
BF4 is running like crap for most people though from what I see.
I think that's more because they haven't worked the SLI issues out, yet. If the final game runs like the Beta does, though, it would be... impressive. Impressive that they were able to fuck it up so badly. BF3 ran great.

I doubt that will be the case, though.
 

Eusis

Member


Sorry about a million quotes, I had to.



Agreed, still on a Sandy @ 4.5GHz; haven't seen a reason to upgrade so far. Beastly CPU.
Yeah, unless they're going for 120 FPS stable or something, I can't really see why what they have WOULDN'T be enough. Though with stuff like Crossfired 7970s that actually seems very plausible. One would be more than enough for me for 1080p 60 FPS gaming for the most part, I'd think, but when jumping to much higher resolutions/FPS I'd want something more.

Still, i5 2500k overclocked to 4.3 or 4.4 GHz (thanks BF4 stuttering!) and a 560ti. I could probably handle it buuut this really may be a borderline case where I could end up better on PS4 for whatever reason, whether it's because it simply runs better or there's annoying issues dragging down PC like BF4's stuttering (which STILL isn't fixed with OC'ing but seems to have entered an acceptable range now.)
 

Fantasmo

Member
BF4 is running like crap for most people though, from what I see. I only get around 30-36 FPS, while in BF3 at the same settings I hover right near 60.
That's part of what scares me. Are Far Cry 3 and Battlefield 4 horribly unoptimized or is my 1 year old beast PC already a relic for 60/1080?
 

kharma45

Member
Well, I for one have been worried, because Crysis 3 (obviously) isn't running 60fps at 1080p on my 670 / 3770K, but what actually worried me a lot was Far Cry 3 not being able to do that either with 2xAA.

I freaking bought the thing last year and a handful of games don't do 60@1080p already. Now I see Battlefield 4 people struggling and this thread... shit.. I already feel like I'm obsolete for 60/1080 and I was under the impression I bought a beast then!

I wouldn't judge BF4 by its beta performance.

That's part of what scares me. Are Far Cry 3 and Battlefield 4 horribly unoptimized or is my 1 year old beast PC already a relic for 60/1080?

BF4 is likely unoptimised, FC3 was just demanding in the same way (although nowhere near as bad) something like Metro 2033 was.
 
I think that's more because they haven't worked the SLI issues out yet. If the final game runs like the Beta does, though... it would be... impressive. Impressive that they were able to fuck it up so badly.

I doubt that will be the case, though.

I see people with non-SLI setups getting shit performance too, even with cards better than yours. Go check the official forums; there are so many threads on the issue.
 

antitrop

Member
I see people with non-SLI setups getting shit performance too, even with cards better than yours. Go check the official forums; there are so many threads on the issue.
Either way, I seriously don't expect the performance issues to carry over the final release.

The BF4 Beta is practically unplayable, no way they could ship something like that. My framerate varied between 20 and mid 50s. My best guess is that the BF4 Beta is using old, unoptimized code because they aren't concerned with testing that facet of the game, or that it simply wasn't possible to have the newest code be available for testing.
 

DJIzana

Member
As amazing as that game is supposed to look graphically, I can't invest as much into PC gaming any more. I find it too annoying and too costly to upgrade and maintain. I'll stick with next gen... at least for awhile. :p
 
My 2600K OC'd to 4.4GHz, EVGA GTX 670 FTW 2GB, and 16GB of RAM had better run this game decently. I don't want to upgrade until next year at the earliest.
 

Serandur

Member
Agreed. The mobile market has been infinitely more intriguing (and getting more competitive).
I like new, fast phones as much as the next guy, but I find it interesting how crucial specs seem to be to smartphone/tablet success, whereas the mainstream has stopped caring about PC performance, it seems. My phone has become important, but I don't ever really use it for anything intensive; you don't need much to browse the internet and watch videos effectively. It's odd, but also sad in its own way, I think. We're getting more power-efficient, but we've effectively stagnated at the upper limit of performance.
No one can currently match the amount of money Intel can put into R&D, and the ones that can don't see a huge profit gain from going ARM to x86.

Intel has been innovating, but pricing their best CPUs at a premium. If 8-core Haswell-Es come out at around the same price as the 3930K, then I will be packing my bags and upgrading. I won't pay $1k for a CPU though.

Yeah, Sandy Bridge was the last major jump, and since then really fuck all has happened bar Intel focusing on cutting power consumption, which in itself they've done a great job with in the mobile space.

AMD I think knows the higher end stuff is a lost cause but I've high hopes for Kaveri. Steamroller cores and a GCN GPU could make a very good APU.



You shouldn't be.



Sorry about a million quotes, I had to.



Agreed, still on a Sandy @ 4.5GHz; haven't seen a reason to upgrade so far. Beastly CPU.

intel is too occupied with laptop CPUs and trying to bring down their power demands so they can be good for tablets and mobile. they're not really concerned with their really high-end CPUs. you can see that in the fact that we haven't gotten any 8-core variants of their consumer i7 chips, nor do they even update their hexacore chips on a regular basis.

i wish AMD didn't suck so much at the high end and had CPUs on 22nm so they didn't suck so much power. we'd at least be able to get some 8-core chips that don't draw as much power as a Titan.

So that's it? No more real progress in CPU performance in any single iteration? Of course, laptop CPUs are getting rather impressive, but the upper limit of performance isn't going to move? What can we reasonably expect from Skylake? Is there no chance AMD will catch up at some point and spur some action, or have they pretty much abandoned the market completely?
 

Eusis

Member
So that's it? No more real progress in CPU performance in any single iteration? Of course, laptop CPUs are getting rather impressive, but the upper limit of performance isn't going to move? What can we reasonably expect from Skylake? Is there no chance AMD will catch up at some point and spur some action, or have they pretty much abandoned the market completely?
Haswell-E looks promising, but I do kind of expect that next-gen consoles will help push this forward. Granted, gaming's just a niche relative to the rest of the computer market, but it's a niche that keeps demanding better performance and pushes them to advance further and further rather than settle for whatever works for word processing, and now with 8 core CPUs about to be the standard there'll be more pressure to at least improve on that front.
 
What resolution?

They don't know yet lol.

True but I can take my PS4 with me out of town for the holidays where we spend two weeks with relatives.
True true. I rarely game when on vacation out to relatives. I feel like a dick if I do. And I feel bummed if I'm not playing a game the best I can ATM. Especially if the same game cost the same across versions.

Ps4 version should be fine though. Some extra content and vita remote play is making me have a bit of a hard time choosing tbh lol.
 

JohngPR

Member
Forgot about Vita remote play...not to mention the Share button stuff.

Hmmm...I'm torn between PC/PS4 versions of this game.
 
They don't know yet lol.


True true. I rarely game when on vacation out to relatives. I feel like a dick if I do. And I feel bummed if I'm not playing a game the best I can ATM. Especially if the same game cost the same across versions.

Ps4 version should be fine though. Some extra content and vita remote play is making me have a bit of a hard time choosing tbh lol.

Sounds more like they don't want to tell us lol. If it was 1080p I'm sure they'd have no problem announcing it as such.
 

Nibiru

Banned
Sounds like a lazy port if it needs that as a minimum. Oh, Ubisoft, you say?

I can run it on my rig, but I agree there is no reason at all it should "require all this" other than, like you say, a lazy port. I mean, the game is coming out for 360 and PS3 too, for crying out loud lol.
 

VillageBC

Member
Haswell-E looks promising, but I do kind of expect that next-gen consoles will help push this forward. Granted, gaming's just a niche relative to the rest of the computer market, but it's a niche that keeps demanding better performance and pushes them to advance further and further rather than settle for whatever works for word processing, and now with 8 core CPUs about to be the standard there'll be more pressure to at least improve on that front.

The problem isn't just a lack of competition for Intel; it's that there is nothing, performance-wise, that the mainstream market needs. Word processing, Excel, email, and web browsing don't take much power at all. Hell, phones are likely powerful enough to replace most desktop machines outside of niche markets.

I can't think of anything that would drive a resurgence for mass consumers outside of maybe gaming. Perhaps an attempt to bring Jarvis to every home.
 

Grief.exe

Member
So that's it? No more real progress in CPU performance in any single iteration? Of course, laptop CPUs are getting rather impressive, but the upper limit of performance isn't going to move? What can we reasonably expect from Skylake? Is there no chance AMD will catch up at some point and spur some action, or have they pretty much abandoned the market completely?

Not until 2015 with Skylake.

Haswell/Broadwell are merely stopgap measures.
 