Given the minimum specs for the PC version, I really do have to wonder how the WiiU version will run.
Somewhere below the quality a minimum spec PC will run it.
Guess my 660 doesn't have 2 gigs of "video ram" or "vram" or whatever it is, so I'm fucked.
I wouldn't worry too much about the VRAM as you can just drop the texture quality down a bit.
Isn't this old?
I also wouldn't worry too much. If it's recommending an AMD processor, I doubt a good 4-thread Intel CPU will be troubled.
And why are people suggesting they'll buy a console version instead? I don't get it. Is a 30fps version really gonna make you happier?
If you've only got a 1.5GB card, you were bound to start struggling here soon anyways.
Wouldn't say you're fucked, though. You'll just have to turn down settings a bit more than others.
What does this even mean? You're expecting this open-world game to run at 60fps at native monitor res (usually 1080p) on a $400 PC, no problem, so why go with consoles?
For good reason, GAF has always been mainly an Intel fan club. However, things may change; let's see how the game runs on AMD tech.
Looking at the spec list, my rig should run this great, but we will see.
Well, luckily I have an i7 3770 @ 4.4GHz. That said, is that good enough for the highest settings (my 780 is good to go)? I didn't have too much trouble with AC4. We'll see.
Hey, well this is my first AMD chip and my situation is unique because I need a machine that can transcode multiple HD streams alongside running a high end game, so multi threaded performance is more important for me than the average gamer.
After a lot of dual core CPUs were sold over quads on bad advice last generation, I'll always err on the side of more cores where I feel it makes sense.
If your budget allows for an i7 or an i5 with an unlocked multiplier then go for that every time. If you're going for longevity on a budget then traditionally more cores with reasonable performance have fared better than fewer slightly faster cores.
Console games do not even have 8 cores to work with; 2 are locked off for the OS.
The FX-8350 is only a little bit faster than an i5 in most heavily multithreaded loads; if games are made with only 6 heavy threads in mind, it will not stand a chance.
Let's also not forget Amdahl's law.
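As a toy illustration of that point (all the numbers here are made up for the sake of argument, not real benchmarks): Amdahl's law caps the speedup from n cores at 1 / ((1 - p) + p/n), where p is the fraction of the work that can run in parallel. If you assume an Intel core is roughly 1.5x faster per thread than an FX core, the eight slower cores only come out ahead when p is very high:

```python
# Toy Amdahl's law comparison: 4 fast cores vs. 8 slow cores.
# The 1.5x per-core ratio is a hypothetical number, not a measured one.

def speedup(p, n):
    """Amdahl's law: best-case speedup on n cores for parallel fraction p."""
    return 1.0 / ((1.0 - p) + p / n)

INTEL_PER_CORE = 1.5  # assumed relative per-core throughput (hypothetical)
FX_PER_CORE = 1.0

for p in (0.5, 0.8, 0.95):
    i5_like = INTEL_PER_CORE * speedup(p, 4)  # 4 fast cores
    fx_like = FX_PER_CORE * speedup(p, 8)     # 8 slower cores
    print(f"p={p:.2f}  4 fast cores: {i5_like:.2f}  8 slow cores: {fx_like:.2f}")
```

Under these made-up numbers, the four fast cores still win at p = 0.8; only a near-perfectly parallel workload (p = 0.95) favors the extra cores, which is exactly the Amdahl's law objection to the "eight cores will win eventually" argument.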
(lots of graphs and text before all this bit)...There's simply no way to make up for this terrible performance. Sure, some games in the future might be n-threaded to better take advantage of the FX architecture, but there aren't many games right now that do. Not only that, but coding to take advantage of those extra cores is incredibly complex. From the mouth of Lord GabeN:
'If writing in-order code [in terms of difficulty] is a one and writing out-of-order code is a four, then writing multicore code is a 10. That's going to have consequences for a lot of people in our industry. People who were marginally productive before, will now be people that you can't afford to have write engine or game code.'
But even then, one of the few games that is n-threaded is Civilization V. You'd imagine that with those 8 cores at its disposal, the 8350 would perform well, giving credence to the "well when multiplatform games are coded for 8 cores...." argument. Turns out, that's still not the case.
In case you are wondering, that 875K that is outperforming it was released in 2010. In the words of Scott Wasson:
'Either way you cut it, the FX-8350 remains true to what we've seen in our other gaming tests: it's pretty fast in absolute terms, easily improves on the performance of prior AMD chips, and still has a long way to go to catch Sandy Bridge, let alone Ivy.'
All these charts are fine, but newer games could be more optimised for eight-core chips.
Let's wait and see.
Here's a reality check on that 8 core recommendation: http://store.steampowered.com/hwsurvey/cpus/
Less than 0.3% of Steam gamers have 8 or more cores. Unless they completely messed up, 2500k/3570k/4570k etc. will be fine. And if they messed up that bad, some 8 core from AMD isn't going to be any better.
Browsing through that Steam HW survey, the memory section surprised me. Less than 40% of gamers have 8GB or more. Having only 4GB of memory is probably a real problem with Watch Dogs considering their minimum recommendation of 6GB. Surely they aren't pulling another Call of Duty Ghosts and requiring gigabytes of memory for literally nothing.
I really don't understand what you're saying here.
People need to be careful with direct comparisons to consoles. There are cards (like the 5850) that are just as powerful as the Xbone GPU but only have 1GB of video memory. You'll have to turn down texture detail a bit, but otherwise most 1GB GPUs (that were capable in their day) should be fine.
Harder to get around CPU requirements.
Why harder? You can just drop LODs down a bit.
This happens every time a new generation of consoles arrives with multiplatform games that take advantage of the new hardware. High-end PCs become "recommended" for gaming, and within a year or two that type of configuration slides down to the "affordable" price range.
The PC gaming space will quickly stabilize, this time even faster than normal, because neither MS nor Sony aimed as high as they did last time.
Even with properly n-threaded games, the FX line can't always keep up with Intel.
It actually performs well in Crysis 3. CryEngine 3 is really well optimised for 8-core CPUs.
You've got a 780 and 3570k and you're going to buy the console version?
It's part of why I said I'd rather wait on updates that would allow a computer to completely destroy consoles: at that point you may be able to brute force past terrible optimization, even though ideally any game would get a proper reworking so that an i5 2500k properly smokes the console CPUs.
The problem, of course, is that you're talking nonsense. Consider comparing an i5 3570K to an FX-6300, which is a 4-core non-HT Ivy Bridge versus a 6-core Vishera. Note that the FX-6300 is considerably more powerful than the 6 developer-facing Jaguar cores found in the current-gen consoles. Now, look at the benchmarks, where the i5 dominates the FX-6300 in every single- and multi-threaded benchmark it encounters:
http://www.anandtech.com/bench/product/699?vs=701
Crysis 3 was an AMD-sponsored game; Crytek optimized the game well for 8 cores.
Wow that's amazing. Game optimised for 8 cores runs pretty well on 8 core CPU. Pity we'll never see games optimised for 8 core ever again.
I'd rather buy the PS4 version than have a below-par experience on my 780/3570k like nearly every other Ubisoft game has been. I'm fine with 30 fps on a controller, and assuming it's 1080p it will likely look almost the same as the PC version, like AC4, barring some Nvidia tech, hence why I'm not buying this on PC.
Frostbite favors CPUs with moar cores also. EA is going to use it in a lot of games too.
IMO, right now people building a PC on the cheap are stuck in an awkward transitional period where Intel hasn't seen fit to grace the low end with anything but dual cores and AMD is still gigagarbage with single-threaded games. I personally wouldn't buy or recommend either, and would stretch for a low-end i5 at the least.
The difference in FPS between the 4670 and 4770 is literally fuck-all, so I am not sure what you mean.
How many more cores does the 4770K have over the 4670K?
Ugh... not sure if I want to go pc or ps4 on this. I've had my i5 2500k (currently at 4.2) and my gtx 570 since early 2011, and upgraded my gpu to an r9 280x (asus dual fan) late last year when it was released. What do you guys think?
Side note, I have to say that the 2500k is likely one of the best cpus of all time imo.
Edit: Any word on mantle support for WD?
Da fuck? I have a 4670k @ 4.2GHz and a 780, and it runs AC4 fine with just about everything cranked up. So if you don't think your rig can handle this game, well, I don't know what to tell you. Maybe sell it to someone who will utilize it.