
Watch Dogs specs revealed - 8 core CPU recommended

Leb

Member
It's not about PCs vs. consoles. It's about the absolute pigheadedness of the loud PC "enthusiasts" who keep claiming that old i5s (or even i3s) are good enough to play current gen games, thereby misleading new PC buyers into making bad decisions.

The problem, of course, is that you're talking nonsense. Consider comparing an i5 3570K to an FX-6300, which is a 4 core non-HT Ivy Bridge versus a 6 core Vishera. Note that the FX-6300 is considerably more powerful than the 6 developer-facing Jaguar cores found in the current-gen consoles. Now, look at the benchmarks where the i5 dominates the FX-6300 in every single and multi-threaded benchmark it encounters:

http://www.anandtech.com/bench/product/699?vs=701
 

maneil99

Member
It's not about PCs vs. consoles. It's about the absolute pigheadedness of the loud PC "enthusiasts" who keep claiming that old i5s (or even i3s) are good enough to play current gen games, thereby misleading new PC buyers into making bad decisions.

A 4-core i5 will beat out an 8-core AMD in most games.
 

maneil99

Member
With those high specs anyone would think we would be getting the quality of the E3 2012 trailer.....

The recommended GPU is on par with the consoles, so no. Also, Ubisoft games are always shit-optimized. AC3 and AC4 are garbage in towns; AC4 only runs well because there is nothing going on geometry- and effects-wise.
 

Renekton

Member
The problem, of course, is that you're talking nonsense. Consider comparing an i5 3570K to an FX-6300, which is a 4 core non-HT Ivy Bridge versus a 6 core Vishera. Note that the FX-6300 is considerably more powerful than the 6 developer-facing Jaguar cores found in the current-gen consoles. Now, look at the benchmarks where the i5 dominates the FX-6300 in every single and multi-threaded benchmark it encounters:

http://www.anandtech.com/bench/product/699?vs=701
Yes, they are considerably more powerful than Vishera; however, remember this:

John Carmack on console performance

So essentially we are comparing apples to oranges, because whatever the target spec is on consoles, we need a far stronger spec on PC to compensate for the increased overhead and the lack of HW-specific optimizations.
 

TheD

The Detective
They are not near-identical; x86 is just an ISA.

PC gaming rigs normally don't use:
1. large pool UMA
2. HSA
3. an extra GPU-CPU bus
4. SoC
5. esram
6. GPU compute (it's still left to the CPU)
I think each console has a different OS scheduler and intrinsics as well.

2 and 3 are part of the same thing, 4 just means they are on the same chip, 5 is not needed on PCs and 6 shows a lack of understanding of hardware architecture (GPU compute was around years before the consoles came out).
 

kortez320

Member
Yes, they are considerably more powerful than Vishera; however, remember this:

John Carmack on console performance

So essentially we are comparing apples to oranges, because whatever the target spec is on consoles, we need a far stronger spec on PC to compensate for the increased overhead and the lack of HW-specific optimizations.

Right, but a Vishera core is more than twice as fast as a Jaguar core just in clock speed. That's without getting into IPC improvements and without mentioning Intel.

In other words, PC CPUs are much more than 2x faster than current gen.

And, for that matter, so are PC GPUs.
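
For what it's worth, a quick back-of-the-envelope sketch of the clock-speed claim (the PS4's Jaguar clock is the commonly reported figure rather than an officially confirmed one, and IPC is deliberately ignored, which only widens the gap):

    # Rough per-core clock comparison; IPC differences are ignored on purpose.
    jaguar_ghz = 1.6     # commonly reported PS4 Jaguar clock (assumed, not official)
    vishera_ghz = 4.0    # FX-8350 base clock
    print("Clock ratio: %.1fx" % (vishera_ghz / jaguar_ghz))  # ~2.5x per core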
 

Renekton

Member
2 and 3 are part of the same thing, 4 just means they are on the same chip, 5 is not needed on PCs and 6 shows a lack of understanding of hardware architecture.
Not the same. UMA does not mean you can have shared addressing, and a memory copy may still be required.
 

Leb

Member
Yes, they are considerably more powerful than Vishera; however, remember this:

John Carmack on console performance

So essentially we are comparing apples to oranges, because whatever the target spec is on consoles, we need a far stronger spec on PC to compensate for the increased overhead and the lack of HW-specific optimizations.

Oh, good, someone trotted out the infamous Carmack tweet. By all means, let's ignore the fact that he was, necessarily, talking about the previous generation hardware (which ran on comparatively esoteric architectures) and let's further ignore the substantial performance optimizations gained in the transition from DX9 to DX11. I suppose we could even ignore Mantle, and honestly, no one would fault you for ignoring DX12, since it won't, admittedly, even be out until Q4 2015.
 

Ploid 3.0

Member
This kind of deflates my enthusiasm for PC gaming, though I pretty much only play on PC now. My next upgrade was going to be a motherboard and an Intel i5, since I just got a card. These recommended specs are nuts. I'll be fine, I'm sure, but what next? I will wait a bit before getting the next parts; I want to play WD on my PC with good enough framerates.
 

axb2013

Member
I was going to dissect his argument, but it's already been done. PS4s will melt because CPU cycles will be used via the cloud to help weak i5s run the game.

Seriously, has no one considered that Ubi is doing damage control for launching a brand-new engine? The entry barrier being low enough for a Conroe didn't tip people off, I guess.
 

Leb

Member
I was going to dissect his argument, but it's already been done. PS4s will melt because CPU cycles will be used via the cloud to help weak i5s run the game.

Seriously, has no one considered that Ubi is doing damage control for launching a brand-new engine? The entry barrier being low enough for a Conroe didn't tip people off, I guess.

Err... you're saying that AnvilNext games on the PC exhibit poor CPU utilization? But that Watch_Dogs, meanwhile, is running on the Disrupt engine, and considering that Disrupt was written with next-gen consoles in mind, it's not unreasonable to assume that it was built with high concurrency in mind?

If so, I quite agree.
 

Renekton

Member
Oh, good, someone trotted out the infamous Carmack tweet. By all means, let's ignore the fact that he was, necessarily, talking about the previous generation hardware (which ran on comparatively esoteric architectures) and let's further ignore the substantial performance optimizations gained in the transition from DX9 to DX11. I suppose we could even ignore Mantle, and honestly, no one would fault you for ignoring DX12, since it won't, admittedly, even be out until Q4 2015.
Fair point, plus I probably misread the context of the argument (sorry about that).

To clear things up, console CPUs may be somewhat okay, since devs will work weekends getting games to run on Jaguar's 6 low-power cores. But I do think that using the Jaguar comparison to claim any i5 rig will do fine is not a safe bet. The Carmack tweet shows that we definitely need a lot more headroom. My i7-920 OC is definitely strained by recent games like BF4, so I'm looking at the Haswell refresh or Haswell-E for my next upgrade.
 

TheD

The Detective
Not homogenized either.

For example, PS4's onion and garlic are unique to the architecture and require specific developer considerations.

But they are a core part of the HSA design.
You cannot count these things twice.
 

TheD

The Detective
Onion and Garlic are not common to AMD APUs, only for PS4.


Fine whatever, grumble :mad:

The point stands that the architectures are not the same.

Just because they are not using the same names does not mean they do not have an equivalent!
 

SapientWolf

Trucker Sexologist
Fair point, plus I probably misread the context of the argument (sorry about that).

To clear things up, console CPUs may be somewhat okay, since devs will work weekends getting games to run on Jaguar's 6 low-power cores. But I do think that using the Jaguar comparison to claim any i5 rig will do fine is not a safe bet. The Carmack tweet shows that we definitely need a lot more headroom. My i7-920 OC is definitely strained by recent games like BF4, so I'm looking at the Haswell refresh or Haswell-E for my next upgrade.
Carmack cares more about getting 60fps out of the consoles than Ubi does, and I doubt that you'll need any type of 8 core CPU to get console performance (or better) on Watch Dogs.
 
There's something I don't get. There are two recommended CPUs: a $300 i7-3770K and a $160 FX-8350.

Which makes me wonder: are they planning to use 8 threads, or did they simply take what sounded like a high-end CPU from both Intel and AMD?
 

diaspora

Member
There's something I don't get. There are two recommended CPUs: a $300 i7-3770K and a $160 FX-8350.

Which makes me wonder: are they planning to use 8 threads, or did they simply take what sounded like a high-end CPU from both Intel and AMD?

Honestly, the latter imo.
 

Renekton

Member
There's something I don't get. There are two recommended CPUs: a $300 i7-3770K and a $160 FX-8350.

Which makes me wonder: are they planning to use 8 threads, or did they simply take what sounded like a high-end CPU from both Intel and AMD?
Most likely the latter.

I'm wondering if there are any examples of well-threaded Ubisoft games.
 

emag

Member
There's something I don't get. There are two recommended CPUs: a $300 i7-3770K and a $160 FX-8350.

Which makes me wonder: are they planning to use 8 threads, or did they simply take what sounded like a high-end CPU from both Intel and AMD?

Both. It's very likely that Watch Dogs is heavily multithreaded. Also, Ubisoft doesn't want to piss off AMD by only mentioning Intel CPUs.
 

axb2013

Member
Err... you're saying that AnvilNext games on the PC exhibit poor CPU utilization? But that Watch_Dogs, meanwhile, is running on the Disrupt engine, and considering that Disrupt was written with next-gen consoles in mind, it's not unreasonable to assume that it was built with high concurrency in mind?

If so, I quite agree.

Yes, AFAIK, Disrupt is Ubi's first stab at a multithreaded rendering engine, or the first one to reach consumers if Snowdrop development started prior to Disrupt.

Requirements nowadays are different compared to when the first Splinter Cell launched; there's a lot more PR and legalese involved.

For the typical buyer Ubi caters to, the people who start panicking only once they see fire, feel heat or smell smoke, the sys reqs are there for Ubi to point to if the consumer has issues. The consumer sees the back of the box or the online listing telling him his hardware falls below the ideal configuration but above the minimum, and he adjusts his expectations accordingly. He may even be impressed with the performance, since "his computer isn't powerful", as the info on the back of the box suggested.

If I were launching a brand new engine to thousands of different configurations, I would probably rely on damage control too.

Another thing to remember is that Disrupt is older than the recommended CPU, as in the CPU didn't exist when the engine was conceived.
 
The problem, of course, is that you're talking nonsense. Consider comparing an i5 3570K to an FX-6300, which is a 4 core non-HT Ivy Bridge versus a 6 core Vishera. Note that the FX-6300 is considerably more powerful than the 6 developer-facing Jaguar cores found in the current-gen consoles. Now, look at the benchmarks where the i5 dominates the FX-6300 in every single and multi-threaded benchmark it encounters:

http://www.anandtech.com/bench/product/699?vs=701
A 6-core Vishera is roughly equivalent to a hypothetical 3 core Intel with hyperthreading enabled. AMD calls them modules instead of physical cores, but the basic technology is similar.

Hyperthreading is calculated at about a 1.25x boost per physical "core", or at least it was the last time I spent time working with such things. So just from the numbers (3 modules × 1.25 ≈ 3.75 effective cores vs. the i5's 4 × 1.0 = 4.0), we'd expect the 3570K to win out.

Throw in Intel's manufacturing advantage, and forgetaboutit.
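
Spelled out as a tiny sketch, using the same assumed ~1.25x uplift per module/HT pair (a rule of thumb from the post above, not a measured figure):

    # Effective-core estimate using an assumed ~1.25x uplift per CMT module / HT pair.
    SMT_UPLIFT = 1.25                    # rule of thumb, not a benchmark
    fx6300_effective = 3 * SMT_UPLIFT    # 3 Vishera modules -> ~3.75 "cores"
    i5_3570k_effective = 4 * 1.0         # 4 full Ivy Bridge cores -> 4.0 "cores"
    print(fx6300_effective, i5_3570k_effective)   # 3.75 4.0 -> edge to the i5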
 

GHG

Member
People are overreacting yet again.

4-core Intel CPUs will be fine. The GPU will be more important, just like it always is, especially at higher resolutions.

As far as I remember, Crysis 3's recommended specs were similar and I destroyed that game with my 3570K.

I will downsample the shit out of this.
 

I'm fucked?
Laptop CPUs and GPUs are weak sauce.
 

rezn0r

Member
why am i not surprised at all by this news? it feels like natural progression, seriously

i have every console and a really strong pc, fwiw
 

GHG

Member
why am i not surprised at all by this news? it feels like natural progression, seriously

i have every console and a really strong pc, fwiw

People forget that there was actually a time when pc hardware was rapidly progressing (especially on the CPU side) and a new game would arrive every year that would push the envelope and make your recently purchased new hardware feel ancient.

Not saying we should go back to those times but it would be nice every now and then for a new game to come out and push the bar a little higher.
 

Wag

Member
So if Witcher 3 has steep requirements (it just might) will we say it's unoptimized also?

I say let's wait and see before we pass judgement.
 

Miguel81

Member
Both. It's very likely that Watch Dogs is heavily multithreaded. Also, Ubisoft doesn't want to piss off AMD by only mentioning Intel CPUs.

Of course, you wouldn't want to act like the armchair fucks that live on tech forums. Especially if you want to sell a product.
 

JaseC

gave away the keys to the kingdom.
So if Witcher 3 has steep requirements (it just might) will we say it's unoptimized also?

I say let's wait and see before we pass judgement.

The cynicism stems from the performance woes of AssCreed 3 and 4.
 

DinHerio

Banned
First: Everyone's blaming Ubisoft for downgrading the PC version.
Second: Everyone's blaming Ubisoft for high PC requirements to fully experience the graphics.

Is this GAF?

Got an i7-3820 @ 4.2GHz; I should be fine.

4-core, 8-core, smore-core

Hardcore is what matters.

And we are hard, are we not?!

Bro, I swear, we are so hard, so, so hard man!
 

SapientWolf

Trucker Sexologist
First: Everyone's blaming Ubisoft for downgrading the PC version.
Second: Everyone's blaming Ubisoft for high PC requirements to fully experience the graphics.

Is this GAF?

Got an i7-3820 @ 4.2GHz; I should be fine.



Bro, I swear, we are so hard, so, so hard man!
Most of the pretty graphical effects should be GPU-heavy rather than CPU-heavy, though. If Ubi were recommending Titans, then that would be a whole different can of beans.
 

rezn0r

Member
People forget that there was actually a time when pc hardware was rapidly progressing (especially on the CPU side) and a new game would arrive every year that would push the envelope and make your recently purchased new hardware feel ancient.

Not saying we should go back to those times but it would be nice every now and then for a new game to come out and push the bar a little higher.

don't disagree with you at all - this makes me start to wonder if this is one of those games, or is it Ubi not optimizing things that great? i'm excited to see either way!

i didn't really expect WD to be "that game" on PC but if it is i'll gladly accept it. i was thinking more along the lines of the witcher 3 (reminds me i need to finish #1 and get to #2)

I should really add more RAM to my PC, I only have 4GB.

yes you should
 

DarkFlow

Banned
Again, I don't think they exist yet. I think Intel has played around with 6 physical cores, but I'm pretty sure that model is only a 4 physical core part, with 4 logical cores via hyperthreading. Considering that seems like a really shitty way to make up for the difference, I'm guessing this game is going to run like ass, just like AC3/4 did.
Oh, they make them, up to 15 cores right now. The only thing is it's a Xeon and it costs a shitload. http://ark.intel.com/m/products/752...-37_5M-Cache-2_80-GHz#@product/specifications
 

DinHerio

Banned
Most of the pretty graphical effects should be GPU heavy rather than CPU heavy though. If Ubi was recommending Titans then that would be a whole different can of beans.

That's not right, in my opinion.
I think the reason for the heavy CPU & RAM usage is the new demand for parallelism driven by the current-gen consoles. I would agree if it were a shooter with scripted maps, but Watch Dogs is a "next gen" open-world game. There is a lot for the processor to do in an open world: all the mathematics behind weather, traffic, A.I. and more needs a lot of CPU power.
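
Purely as an illustration of that point (hypothetical system names, nothing taken from the actual Disrupt engine), an open-world frame can fan its independent simulation systems out across worker threads, which is exactly where extra cores start to pay off:

    # Hypothetical open-world tick: independent systems are updated in parallel
    # each frame. (In a real engine these would be native jobs; Python threads
    # are only used here to illustrate the structure.)
    from concurrent.futures import ThreadPoolExecutor

    def tick_weather(dt): pass   # placeholder workloads, not real engine code
    def tick_traffic(dt): pass
    def tick_ai(dt): pass

    def simulate_frame(pool, dt):
        futures = [pool.submit(system, dt) for system in (tick_weather, tick_traffic, tick_ai)]
        for f in futures:
            f.result()           # the frame waits for every system to finish

    with ThreadPoolExecutor(max_workers=3) as pool:
        simulate_frame(pool, 1 / 30.0)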
 

SapientWolf

Trucker Sexologist
That's not right, in my opinion.
I think the reason for the heavy CPU & RAM usage is the new demand for parallelism driven by the current-gen consoles. I would agree if it were a shooter with scripted maps, but Watch Dogs is a "next gen" open-world game. There is a lot for the processor to do in an open world: all the mathematics behind weather, traffic, A.I. and more needs a lot of CPU power.
No one made a thread to complain about AI or traffic downgrades yet. And none of that is even remotely taxing to an enthusiast desktop CPU.
 