
Assassin's Creed Unity PC version system requirements

And there is no way to play this game on a 32-bit OS.

So if you're a 32-bit OS user, it's time to change your OS.

If you're running the minimum specs listed and are running a 32-bit OS, you're doing things wrong. Hell, it's 2014; if you're running a 32-bit OS on any piece of hardware made in the last 5 years, you're doing things wrong.
 

Seanspeed

Banned
If you're running the minimum specs listed and are running a 32-bit OS, you're doing things wrong. Hell, it's 2014; if you're running a 32-bit OS on any piece of hardware made in the last 5 years, you're doing things wrong.
Yea, scalability towards older hardware/OSes can only go so far. Next-gen-only games are obviously going to have higher minimum requirements to some degree, and we have to leave behind these dated things in order to have progress.

Although having a 2500K/GTX 680 or 7970 as the minimum is a bit more of a leap than I think we should expect at this stage. A game had better be downright groundbreaking to require this kind of hardware just to run at the lowest settings.
 

prag16

Banned
rip 1GB 560Ti. Kinda want to wait until next year to upgrade though, so I guess I will just get myself addicted to WoW meanwhile.
I also have a 1GB 560 Ti. I was told I wouldn't be able to run Watch_Dogs properly, but that turned out to be bullshit. The 2550k turned out to be more than fine as well.

Hell, my buddy with a 550 Ti was able to run it okay. The driving was somewhat rough, but otherwise it was entirely playable.

Let's wait for benches. Because as many have said, I smell bullshit.

I'll probably do a GPU upgrade in a year. Though I will say that the GTX970 has me tempted.
 

JordanKZ

Member
This is kind of telling, really. AC3 and AC4 are two of the worst ports I've ever seen; I remember the frame rate literally cutting in half the minute a few trees appeared in AC4. And this was with my 4770K @ 3.8GHz and SLI'd 780 Tis.

I can't wait to see how bad it performs this time.
 

RulkezX

Member
Yea, scalability towards older hardware/OSes can only go so far. Next-gen-only games are obviously going to have higher minimum requirements to some degree, and we have to leave behind these dated things in order to have progress.

Although having a 2500K/GTX 680 or 7970 as the minimum is a bit more of a leap than I think we should expect at this stage. A game had better be downright groundbreaking to require this kind of hardware just to run at the lowest settings.

And here lies the main reason for the reactions, I think.

Unity doesn't look like one of those games that comes along every generation and makes every PC gamer go "oh fuck, time for an upgrade"; it just looks like AC *shrug*
 

hengyu

Member
Don't be silly.
Your GPU is fine.

I have an i5-3570K OC'd to 4.2 GHz, but that and my 670 still had performance problems with Black Flag, where it dipped below 60fps, although it wasn't very frequent. A 470 was the recommended requirement...
 
Hahaha, I assume you're trolling; nobody can be that stupid.

Not sure what you're even laughing at.

- CPU:
Minimum - Intel Core® i5-2500K @ 3.3 GHz or AMD FX-8350 @ 4.0 GHz or above

In the OP, did you not read it?

None of those 'require' an i7.

[Image: gamegpu.ru CPU benchmark chart for Middle-earth: Shadow of Mordor]


So could we stop this 'people who thought their old hardware was enough were wrong' talk until at least one game really needs more CPU power to run smoothly? Maybe Unity will be it, but so far the official requirements of the latest AAA productions haven't been very reliable; The Evil Within's VRAM recommendations, for example, ended up being so wrong it was almost funny.

I expect ACU to be that game, considering it's one of the first non-cross-gen games and the performance of past AC/Ubi games.
 

Dr Dogg

Member
This is kind of telling, really. AC3 and AC4 are two of the worst ports I've ever seen; I remember the frame rate literally cutting in half the minute a few trees appeared in AC4. And this was with my 4770K @ 3.8GHz and SLI'd 780 Tis.

I can't wait to see how bad it performs this time.

Well, double-buffered vsync certainly was an issue, but good old D3DOverrider to the rescue. That said, I know what you mean, and both III and IV have some seriously demanding environment tessellation going on that you can't pare back without changing the global environment quality setting, which pulls back other effects too (I've tried ini tweaks but nothing seemed to work). The worst culprits in IV are around the Iguana hideout and around the warehouses. Running two heavily overclocked 780s at 2560x1440 with 2xTXAA, I'd get pulled down to the mid-40s in those sections. Dialing the AA back to SMAA would keep it in the 50 region, but temporal aliasing be damned, it looks rough. Going to 1920x1080 still pulls down from 60 a teeny bit every now and then, but looks grim on a 2560x1440 monitor. It does scale quite linearly, though, as disabling SLI brings the framerate crashing down.

I expect Unity to be a demanding game again at the top end, but I just bloody wish Ubisoft would allow more options to scale the game specifically to my hardware without having to sacrifice other details via toggles that encapsulate 4, 5 or 6 settings in one.
 

R_Deckard

Member
Your 2500K will still be good to match consoles for the whole gen, perhaps even more with DX12.

PS4/XBO have not inflated PC requirements much at all, when I think about it; so much for PC having a hard time "adapting" to new consoles.
An i7 was listed as recommended as early as 2011 (Dirt 3).
http://www.codemasters.com/uk/dirt_3/pc/faq/95/
I have seen this game running on a VERY powerful PC and it was nowhere near a locked 60; it peaked and troughed like a rollercoaster. I think, like most Ubisoft games, they work on quantity over quality and would need the two-week delay to get into any optimising for the day-one patch.

In fact, after seeing it I wondered if Rogue was set for all gens this year and Unity came a year sooner due to the (relatively) poor sales of WD on last gen, hence the delayed PC version and subsequent X1/PS4 release in a retro package next year.

I think this will have performance issues even after a hulking day-one patch on ALL formats; just expect the PC version to be the worst (again, relatively).
 
Not sure what you're even laughing at.



In the OP, did you not read it?



I expect ACU to be that game, considering it's one of the first non-cross-gen games and the performance of past AC/Ubi games.



Because... we already had games "requiring" i7 CPUs... and working just as well on i3 CPUs?
 

gelf

Member
I think this is the first game that's put my CPU below even the minimum. The others that just said i7 could have just meant a first-gen i7, which is what I have.

Not that I care too much, as it's only Assassin's Creed.
 

Kezen

Banned
I have seen this game running on a VERY powerful PC and it was nowhere near a locked 60; it peaked and troughed like a rollercoaster. I think, like most Ubisoft games, they work on quantity over quality and would need the two-week delay to get into any optimising for the day-one patch.

In fact, after seeing it I wondered if Rogue was set for all gens this year and Unity came a year sooner due to the (relatively) poor sales of WD on last gen, hence the delayed PC version and subsequent X1/PS4 release in a retro package next year.

I think this will have performance issues even after a hulking day-one patch on ALL formats; just expect the PC version to be the worst (again, relatively).

So as long as I'm not forced to play at 900p/30fps it will be just fine. ;)
50fps, 1080p, near maximum settings should be easily achievable on high-end systems.

That should be enough to blow the PS4/XBO SKUs out of the water; even with performance "issues", a smooth experience is absolutely possible. No "demanding" game is easy to run at 60fps, by the way.
And I expect an entry-level GPU (R9 265 / GTX 750 Ti) to run the game better than the new consoles, so all in all the PC should be the best version without breaking a sweat.
 
Given the fact that all previous cases of huge minimum requirements turned out to be hogwash, I see no reason at all not to panic in light of this news.
 

R_Deckard

Member
So as long as I'm not forced to play at 900p/30fps it will be just fine. ;)
50fps, 1080p, near maximum settings should be easily achievable on high-end systems.

That should be enough to blow the PS4/XBO SKUs out of the water.

Yeah, I would not bank on anything yet!
 

Grief.exe

Member
I don't know why people put stock in these things. System requirements really aren't worth arguing about. They are far from fact, making any argument inherently fallacious.

However, this is Ubisoft. Any new game they release is likely to be a terrible, lazy port.
 

Cavalier

Banned
Looks like Ubisoft isn't holding back any longer after gamers whined about Watch Dogs' graphical downgrades, made so the game would run decently on an average-specced PC for its time.

Gotta get used to it. Sooner or later, the i5 will be way outdated as they move on to octa-cores to match the consoles' CPU architecture for ports. But then again, I'm quite happy with my Xbox One, so I won't be bothered to upgrade my PC until a few years down the road. :)
 

Kezen

Banned
Why are they recommending an i7 if an i3 is good enough? Is this still true at 1080p and higher? I've seen only one image, for 720p.

The CPU has nothing to do with resolution. So an i3 is "good enough" for 30fps at 1080p in CPU-limited scenarios.

But as the generation goes along, more cores (physical or otherwise) will make a difference or become necessary for high framerates.

Ah, if only DX12 had released much earlier.
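
To put rough numbers on that (a toy model with invented figures, not measurements): a frame is only done when both the CPU and the GPU have finished their share of the work, and raising the resolution mostly inflates the GPU's share.

# Toy model (invented numbers): frame time is roughly the slower of the
# CPU and GPU, and resolution scales only the GPU's per-frame cost.
def frame_time_ms(cpu_ms, gpu_ms_at_720p, pixel_scale):
    gpu_ms = gpu_ms_at_720p * pixel_scale  # GPU work grows with pixel count
    return max(cpu_ms, gpu_ms)

# A CPU needing 33 ms per frame caps you at ~30 fps whether you render
# 720p or 1080p (1080p pushes 2.25x the pixels of 720p).
print(1000 / frame_time_ms(33.0, 10.0, 1.0))   # ~30 fps at 720p
print(1000 / frame_time_ms(33.0, 10.0, 2.25))  # still ~30 fps at 1080p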
 
There is no way these system requirements are legit. A 2500K and a 660 minimum?

Ask yourself if you think this game is going to be more of a graphical bear than Crysis 3. No? Because a 2600K and a 660 meet the High Performance requirements for Crysis 3.

There is no way AC Unity is pushing Crysis 3 max-settings levels of quality.

I'm going to assume they are asking for these beefy specs because the port is optimized horribly and will require significant overhead.
 
Why are they recommending an i7 if an i3 is good enough? Is this still true at 1080p and higher? I've seen only one image, for 720p.




Because... they don't own multiple PCs to test for sure that it's working? Because minimum and recommended specs mean nothing and change depending on the publisher?

And yes, it's still true at 1080p, because, you know, resolution has very little to do with the CPU...
 

Dario ff

Banned
The power gap between minimum and recommended is way too small to draw any conclusions from. If these requirements mean Low vs High/Max settings (which they likely don't), then there would be nowhere near enough power difference to show it (or there are barely any configurable settings). Hell, even the AMD CPU is the same on both. Saying that meeting the minimum requirements would give you "Low" settings while recommended gives you "High" settings just does not fit the range of options previous games offered.

Worth keeping in mind the Nvidia collaboration as well, so I wouldn't be surprised if the specs were based around enabling some of their features.

Shadow of Mordor, Watch Dogs and The Evil Within all come with i7 recommendations. System reqs will only get higher, and old hardware like the 2500k will only be enough for the low end.
The Evil Within has been shown to have absolutely terrible multi-threading that doesn't even take advantage of i7s (with an i3 near the top, as if it were Dolphin or something), and I've been playing Mordor on my 2500k at 60 FPS with a 760 just fine... (at PS4 settings as well)
 

Grief.exe

Member
Why are they recommending an i7 if an i3 is good enough? Is this still true at 1080p and higher? I've seen only one image, for 720p.

Recommended requirements are generally hyper-inflated for liability reasons. Especially as of late.

As I said earlier, it's essentially impossible to form an argument using recommended system requirements, as they are steeped in bullshit.
 

UrbanRats

Member
Shadow of Mordor, Watch Dogs and The Evil Within all come with i7 recommendations. System reqs will only get higher, and old hardware like the 2500k will only be enough for the low end.

I played Mordor on almost maxed settings at 60 fps (pretty stable, too) with a non-OC'd (3.3GHz) 2500k.
 
That's weird. I played it just fine on my laptop (at 1440x900, mind you) at mid settings with an i7, 4GB of RAM and a 460M. My desktop computer at the time had a GTX 660 and an Athlon II X4 630, and it ran great with everything at maximum except one particular setting which I just can't remember right now.

I've always been in the "AC games run fine on my computer" group, though.

In any case, these system requirements are super weird. There's no way the recommended and minimum processors for AMD are the same, and the gap between graphics cards doesn't make sense either, because, no matter how bad Ubi ports are, you can always disable whichever effects you don't need.

The way I see it, either the minimum requirements or the recommended ones are absolutely made up. Keep in mind this comes from a distributor for Ubisoft, not Ubisoft themselves. I'd take it with a grain of salt, and definitely wait until we can see it from a reliable official source (such as the Ubi shop).

I don't think the problem was with my PC as much as it was with ACIV's terrible optimization. I had to cap the game at 30 FPS to get a reliable result; otherwise it'd constantly alternate between 30 and 60. There was no in-between. :lol
 
I don't think the problem was with my PC as much as it was with ACIV's terrible optimization. I had to cap the game at 30 FPS to get a reliable result; otherwise it'd constantly alternate between 30 and 60. There was no in-between. :lol




That's because of messed up vsync. Just disable it and use triple buffering with Nvidia Inspector or AMD CCC.
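
For anyone wondering why it snaps between exactly 30 and 60: with double-buffered vsync on a 60Hz display, a finished frame can only be shown at a refresh, so a frame that misses one refresh waits for the next, and the rate quantizes to 60/n. A quick sketch of the arithmetic (my own illustration, not anything from the game):

import math

# Double-buffered vsync: a frame is displayed only at the next refresh,
# so the effective rate is refresh_hz divided by the whole number of
# refresh intervals each frame takes.
def vsync_fps(frame_time_ms, refresh_hz=60.0):
    refresh_interval_ms = 1000.0 / refresh_hz
    intervals = math.ceil(frame_time_ms / refresh_interval_ms)
    return refresh_hz / intervals

print(vsync_fps(16.0))  # 60.0 -- the frame fits within one refresh
print(vsync_fps(17.0))  # 30.0 -- barely misses, waits a whole extra refresh
print(vsync_fps(34.0))  # 20.0 -- the next step down the ladder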
 

Dr Dogg

Member
Given the fact that all previous cases of huge minimum requirements turned out to be hogwash, I see no reason at all not to panic in light of this news.

Here's the thing though, right (and I did say this in the Steam thread): what are the minimum requirements actually targeting? You know, what resolution? What framerate? What level of settings? Next to no game ever tells you that, and the few times it gets answered, the answer is either vague or something along the lines of 'it will be the optimal experience the dev team envisioned'.

So why aren't we asking for clearer guidelines in terms of requirements? Something like a basic two-axis comparison chart, with listings like 1280x720@30Hz at minimum settings and then a list of components QA knows will deliver that experience. Then another with 1280x720@60Hz at medium settings with updated components, 1920x1080@60Hz at maximum, and so on for resolution, framerate and rough catch-all settings values (see the mock-up below).

Sure, it's not perfect and probably needs a bit more simplification, like only 3 or 4 sets for comparison, but it would give the potential customer a massive amount more information. All of this assuming the listings aren't incorrect, of course.
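
Purely to illustrate the idea (component picks invented for the example, not taken from any real QA sheet), such a chart might look like:

Target | Example hardware
1280x720 @ 30Hz, minimum settings | i5-750 / GTX 460 / 4GB RAM
1280x720 @ 60Hz, medium settings | i5-2500K / GTX 660 / 8GB RAM
1920x1080 @ 60Hz, maximum settings | i7-4770K / GTX 780 / 8GB RAM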
 

Kezen

Banned
I don't think the problem was with my PC as much as it was with ACIV's terrible optimization. I had to cap the game at 30 FPS to get a reliable result; otherwise it'd constantly alternate between 30 and 60. There was no in-between. :lol

Welcome to double buffering. You could try to force triple buffering with D3DOverrider and find that your experience is much better.
I'm getting 50-60fps at maximum settings (except PhysX on low) at 1080p.

It's not enough to call the port a magnificent effort but my experience has been very positive regardless.

I see no reason why having a superior experience to that of consoles would be difficult on PC, even on low-end GPUs (GTX 750 Ti, R9 265). Obviously, 60fps will always be a hard target to reach.
 
That's because of messed up vsync. Just disable it and use triple buffering with Nvidia Inspector or AMD CCC.

I'll keep that in mind, should I ever replay the game.

Gonna hold off on this one until all the kinks have been ironed out, especially considering the fact I won't have time to play at all once the MCC hits.
 

R_Deckard

Member
There is no way these system requirements are legit. A 2500K and a 660 minimum?

Ask yourself if you think this game is going to be more of a graphical bear than Crysis 3. No? Because a 2600K and a 660 meet the High Performance requirements for Crysis 3.

There is no way AC Unity is pushing Crysis 3 max-settings levels of quality.

I'm going to assume they are asking for these beefy specs because the port is optimized horribly and will require significant overhead.

[Image: apples-to-oranges comparison]
 

chico

Member
Seriously... they just want people to be scared so they buy the more expensive console version. All those recommended specs have been bullshit lately.
 

samn

Member
Welcome to double buffering. You could try to force triple buffering with D3DOverrider and find that your experience is much better.
I'm getting 50-60fps at maximum settings (except PhysX on low) at 1080p.

It's not enough to call the port a magnificent effort but my experience has been very positive regardless.

I see no reason why having a superior experience to that of consoles would be difficult on PC, even on low-end GPUs (GTX 750 Ti, R9 265). Obviously, 60fps will always be a hard target to reach.

The problem with triple buffering is that it adds up to another 33ms of latency.
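
That 33ms figure is just frame-time arithmetic: each extra queued frame delays the newest image by one frame time, which at 30fps is about 33ms (back-of-envelope, assuming a steady frame rate):

# Back-of-envelope: each extra buffered frame adds one frame-time of
# display latency, since the newest image waits behind it in the queue.
def added_latency_ms(fps, extra_buffered_frames=1):
    return extra_buffered_frames * 1000.0 / fps

print(added_latency_ms(30))  # ~33 ms at 30 fps -- the figure quoted above
print(added_latency_ms(60))  # ~17 ms at 60 fps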
 
The CPU has nothing to do with resolution. So an i3 is "good enough" for 30fps at 1080p in CPU-limited scenarios.

But as the generation goes along, more cores (physical or otherwise) will make a difference or become necessary for high framerates.

Ah, if only DX12 had released much earlier.

Because... they don't own multiple PCs to test for sure that it's working? Because minimum and recommended specs mean nothing and change depending on the publisher?

And yes, it's still true at 1080p, because, you know, resolution has very little to do with the CPU...

The power gap between minimum and recommended is way too small to draw any conclusions from. If these requirements mean Low vs High/Max settings (which they likely don't), then there would be nowhere near enough power difference to show it (or there are barely any configurable settings). Hell, even the AMD CPU is the same on both.

Worth keeping in mind the Nvidia collaboration as well, so I wouldn't be surprised if the specs were based around enabling some of their features.


The Evil Within has been shown to have absolutely terrible multi-threading that doesn't even take advantage of i7s (with an i3 near the top, as if it were Dolphin or something), and I've been playing Mordor on my 2500k at 60 FPS with a 670 just fine... (at PS4 settings as well)

[Image: gamegpu.ru CPU benchmark chart for The Evil Within]


Freaky stuff!

Recommended requirements are generally hyper-inflated for liability reasons. Especially as of late.

As I said earlier, it's essentially impossible to form an argument using recommended system requirements, as they are steeped in bullshit.

I played Mordor on almost maxed settings at 60 fps (pretty stable, too) with a non-OC'd (3.3GHz) 2500k.

Ah, that's cool. I was only looking at system reqs, so my 3570k will be enough for some time.


Still, I think from next year on, current-gen-only games will really start using more cores and hyper-threading.
 

Newline

Member
Yes, the 2500k is listed as a minimum requirement, so you should be okay on the lower settings.



Shadow of Mordor, Watch Dogs and The Evil Within all come with i7 recommendations. System reqs will only get higher, and old hardware like the 2500k will only be enough for the low end.
Now clock that i5 2500k at 4.5GHz+, which is a very simple task, and a four-year-old product can play at max. Mine is clocked at 4.5GHz, and Shadow of Mordor, Watch Dogs and The Evil Within didn't even slightly rattle it.
 
[Image: gamegpu.ru CPU benchmark chart for The Evil Within]


Freaky stuff!





Ah, that's cool. I was only looking at system reqs, so my 3570k will be enough for some time.


Still, I think from next year on, current-gen-only games will really start using more cores and hyper-threading.



As I said, recommended or minimum specs mean nothing.
And start using more cores? Good, that means i5s will finally be used to their full potential. More than multithreading, IPC is what matters, and that's why Intel CPUs have the edge. Unless you're using an i3, you won't have trouble going beyond PS4 settings with an i5 CPU and a decent GPU.
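
A rough Amdahl's-law-style sketch of why per-core speed can beat core count (the serial/parallel split here is invented purely for illustration):

# Toy model (invented workload split): the serial part of a frame is paid
# at single-thread speed, so higher IPC helps even when extra cores idle.
def frame_ms(serial_ms, parallel_ms, cores, ipc_factor):
    # ipc_factor > 1.0 means faster per-core execution
    return (serial_ms + parallel_ms / cores) / ipc_factor

print(frame_ms(10, 24, cores=4, ipc_factor=1.3))  # ~12.3 ms: quad-core, higher IPC
print(frame_ms(10, 24, cores=8, ipc_factor=1.0))  # ~13.0 ms: octa-core, lower IPC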
 

chico

Member
What are you talking about? They are both 60 dollars.

Here in Europe console games are way more expensive: €70, while the PC version is €50 (or even €30 if you search key resellers). So it's not far-fetched to say they exaggerate on purpose. Recent games have shown that the official specs are not correct.
 

QaaQer

Member
So the consensus is the specs are bullshit? If so, why would Ubisoft want to scare off potential customers?

edit:
Here in Europe console games are way more expensive: €70, while the PC version is €50 (or even €30 if you search key resellers). So it's not far-fetched to say they exaggerate on purpose. Recent games have shown that the official specs are not correct.

This seems possible I guess.
 
So the consensus is the specs are bullshit? If so, why would Ubisoft want to scare off potential customers?




Why would Bethesda want to scare off potential customers with The Evil Within's 4GB VRAM stuff?
Giving high minimum specs means you avoid getting on people's nerves if the game doesn't work the way they want.
 