
Assassin's Creed Unity PC Version System Requirements

Well, yes, every game ported to PC ends up having a performance thread on GAF, which is quite telling that most ports don't run smoothly at all. Let's not forget you have to run DRM to even get this game running.

[Image: DL4LvUf.jpg]
 

DarkoMaledictus

Tier Whore
Far Cry 4 is cross-gen so it should be fine. AC Unity is the first Ubi game that's next-gen only.

Mordor and Evil Within were also cross-gen, by the way. AC Unity's reqs might be legit.

I guess we will see; Ubi has some serious issues when it comes to PC optimization. I just hope I am wrong!
 

Riky

$MSFT
There should be a standard for what minimum actually means: what resolution and framerate at what settings. I find it hard to believe my FX 6300 and R9 270 won't run the game at all.
 

Seanspeed

Banned
Maybe a key motivation is to avoid tech support from gamers complaining that their product cannot run "well" (i.e. 1080p/60fps).
By 'tech support' you mean optimizing the game to a reasonable degree so that 1080p/60fps is even *possible*?

Cuz I don't understand your point otherwise.

Going by GAF/Reddit threads here (plus my confirmation bias lol), it seems gamers may have outsized expectations of what their i3/GTX 750 Ti should be able to do on current-gen games.
This isn't about i3s and 750 Tis; it's about proper, powerful PC components being necessary to play the game at all.
 
I don't think these are real.

A 680/7970 for minimum doesn't make sense. Did they completely lock out the lower visual settings or something? Something like a 760, 750 Ti, or 7870 not even meeting minimum (not recommended, just the minimum to play) sounds like total BS.
 

Alebelly

Member
There should be a standard for what minimum actually means: what resolution and framerate at what settings. I find it hard to believe my FX 6300 and R9 270 won't run the game at all.

It'll probably run it just fine, outside of the initial game-breaking bugs, followed by a few patches that don't quite fix them.
 

Seanspeed

Banned
Seriously, I can't believe people still have faith in Ubisoft after the stuttering mess that was Watch Dogs on PC.
It's not about 'having faith' in Ubisoft. It's noticing a trend of stupidly overblown 'requirements' with recent next-gen multiplatform releases.

Watch Dogs' stuttering issue was not related to PC specs, as most people encountered it no matter what hardware they had. Watch Dogs' minimum and recommended specs were actually fairly modest as well.
 

Durante

Member
Seriously, I can't believe people still have faith in Ubisoft after the stuttering mess that was Watch Dogs on PC.
Watch Dogs was technically equivalent to consoles on equivalent PCs, and potentially much better on good PCs.

Note that this is a 900p30 game on PS4, and that it runs like this on PC at 1080p (again, 44% more pixels) and higher settings:
http://www.gamegpu.ru/images/stories/Test_GPU/Action/Watch_Dogs/test_new/1920_smaa.jpg
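
For reference, the arithmetic behind that 44% figure (assuming 900p means 1600x900 and 1080p means 1920x1080):
1600 x 900 = 1,440,000 pixels
1920 x 1080 = 2,073,600 pixels
2,073,600 / 1,440,000 = 1.44, i.e. 44% more pixels.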
 

Renekton

Member
By 'tech support' you mean optimizing the game to a reasonable degree so that 1080p/60fps is even *possible*?

Cuz I don't understand your point otherwise.

This isn't about i3s and 750 Tis; it's about proper, powerful PC components being necessary to play the game at all.
Well, I mean actual tech support, like customer service: ACU owners logging tickets that their rigs are not delivering the 1080p/60 promised land. If this spec is true, then it's to cover their butts: "unfortunately, you did not meet minimum reqs and knowingly purchased it."
 
It's not about 'having faith' in Ubisoft. It's noticing a trend of stupidly overblown 'requirements' with recent next-gen multiplatform releases.

Watch Dogs' stuttering issue was not related to PC specs, as most people encountered it no matter what hardware they had. Watch Dogs' minimum and recommended specs were actually fairly modest as well.
Weren't the initial requirements quite heavy? (They later changed them.) Not to mention the uproar over Ultra textures requiring lots of VRAM started with Watch Dogs, AFAIK.

Watch Dogs was technically equivalent to consoles on equivalent PCs, and potentially much better on good PCs.

Note that this is a 900p30 game on PS4, and that it runs like this on PC at 1080p (again, 44% more pixels) and higher settings:
http://www.gamegpu.ru/images/stories/Test_GPU/Action/Watch_Dogs/test_new/1920_smaa.jpg
I was not really into PC gaming when Watch Dogs was released; I bought it for PS4 instead. But looking at this chart, I could easily run it at the highest quality settings at full HD resolution.
 
Absurd, considering the graphics do not look good. Having minimum requirements like this will ensure many people just stick with consoles.
 

Seanspeed

Banned
Well, I mean actual tech support, like customer service: ACU owners logging tickets that their rigs are not delivering the 1080p/60 promised land. If this spec is true, then it's to cover their butts: "unfortunately, you did not meet minimum reqs and knowingly purchased it."
This doesn't happen. Minimum requirements don't mean the minimum to achieve 1080p/60fps, and I've never heard of anybody who believes they do.

I think you're reaching a bit.
 

UrbanRats

Member
Watch Dogs was technically equivalent to consoles on equivalent PCs, and potentially much better on good PCs.

Note that this is a 900p30 game on PS4, and that it runs like this on PC at 1080p (again, 44% more pixels) and higher settings:
http://www.gamegpu.ru/images/stories/Test_GPU/Action/Watch_Dogs/test_new/1920_smaa.jpg

Did it stutter on consoles, as it did on PC? Genuine question.

I had to stop playing because no matter the settings it would stutter a lot, and I know people with even more powerful hardware had this problem (I have a 970).
Though of course Fraps would technically indicate 60fps, the game wasn't fluid at all.
 

~~Hasan~~

Junior Member
Something is wrong with these requirements. A minimum of a GTX 680? Like, seriously? The game is running on an Xbox One and a PS4, for God's sake.

Is the PC version so vastly superior that it justifies these requirements, or is it just a lack of good development and optimization from Ubisoft, like the usual crap they give us?
 
Did it stutter on consoles, as it did on PC? Genuine question.

I had to stop playing because no matter the settings it would stutter a lot, and I know people with even more powerful hardware had this problem (I have a 970).
I had the PS4 version day 1 and there was no stuttering at all. It was a locked 30fps without any noticeable drops.

The only major issue that I remember on consoles was a bug where the game got stuck at the loading screen. It was related to some glitch in Ubisoft's Uplay rewards. It took quite a while to get fixed and was present on all platforms, including PC.
 

Seanspeed

Banned
Weren't the initial requirements quite heavy? (They later changed them.) Not to mention the uproar over Ultra textures requiring lots of VRAM started with Watch Dogs, AFAIK.
The requirements weren't really changed, though. Minimum and recommended were basically the same as the early announced requirements; they just scratched out some of the vague parts (like saying 600 series, 700 series).

Watch Dogs was not a bad PC port necessarily, just a badly programmed game in general.
 

Durante

Member
Watch Dogs was not a bad PC port necessarily, just a badly programmed game in general.
That's my point, and the same is true if you actually take the time to investigate pretty much every single instance of what people call "bad PC ports". Just look at The Evil Within for a more recent and even more extreme example.

Did it stutter on consoles, as it did on PC? Genuine question.
"Stuttering" is a thing that can happen based on dozens of factors in the hardware/software stack on each individual system. A friend played W_D on a 670 and an i7 920 and didn't experience any stuttering.
 

UrbanRats

Member
I had the PS4 version day 1 and there was no stuttering at all. It was a locked 30fps without any noticeable drops.

The only major issue that I remember on consoles was a bug where the game got stuck at the loading screen. It was related to some glitch in Ubisoft's Uplay rewards. It took quite a while to get fixed and was present on all platforms, including PC.

Maybe at 30fps the game wouldn't stutter on PC; I don't know, because my target was 60, and at 30 I would get crazy input lag.
But yeah, at 60 it was really annoying; I uninstalled the game for that reason (and also because I hated what I played of it).

"Stuttering" is a thing that can happen based on dozens of factors in the hardware/software stack on each individual system. A friend played W_D on a 670 and an i7 920 and didn't experience any stuttering.

I don't doubt it, but a console, being a closed system, can be less of a crapshoot sometimes.

I buy 90% of my games on PC, so I'm not even trying to make a big case for consoles being better or anything, but with a game like W_D, I can see it being a reasonable choice if you don't know whether or not you're gonna get stuttering, something that seemed pretty widespread, at least at launch (I haven't been following its development since).
 

Philippo

Member
Yeah, with those requirements I'm sure that both this and the fact that the game is 900p on PS4 come from terrible optimization.
 

Denton

Member
That's my point, and the same is true if you actually take the time to investigate pretty much every single instance of what people call "bad PC ports". Just look at The Evil Within for a more recent and even more extreme example.

"Stuttering" is a thing that can happen based on dozens of factors in the hardware/software stack on each individual system. A friend played W_D on a 670 and an i7 920 and didn't experience any stuttering.
I wouldn't even classify WD as a bad port. Its main problem was busted vsync, but then Ryse has the same issue. Once you enable borderless windowed mode, both games run perfectly stutter-free. Same with AC4.

Shame these devs make stupid mistakes like that, though.
 

Alebelly

Member
These are terrible games from a terrible company. Will continue not throwing them my support.

I don't think that's fair; they do what they do, and those that care have been aware of their shortcomings for years. I still enjoy their games, though less so at this point.
 
Conspiracy theory? When their recommended CPU doesn't change from min to max for AMD? (It's because that's where AMD tops out, for the most part, regarding consumer CPUs.) If you really needed to jump from the i5 to an i7, then you would also need to jump from the FX-8350 to something else to still run it at "recommended". Also, I'm a bit rusty on my CPU knowledge, but I believe that even though the Sandy Bridge i5 is older, it is still better than the 8350 for everything except heavily multithreaded apps that take advantage of the AMD chip's 8 cores. So if the 8350 can run the game at recommended, so can the i5. And Ubisoft is full of shit.

Please, if it takes that much hardware to run the game, it is only because they made a godawful port.

Unless it actually uses more than the 4 threads that the i5 has...
 

Seanspeed

Banned
I don't think that's fair; they do what they do, and those that care have been aware of their shortcomings for years. I still enjoy their games, though less so at this point.
Yea I still like their games, but probably wouldn't ever buy one of their AAA open-world titles Day 1.

Unless it actually uses more than the 4 threads that the i5 has...
Sure, but core performance still seems to matter a lot. So far, a good i5 still beats an 8350 almost everywhere, even with games that take advantage of hyperthreading.
 
Just finished skim reading this thread.

PC gamers sure are a clusterfuck of opinions. How is a developer ever meant to meet your sky-high requirements if you keep moving the goalposts?

First you lot are all "omg consoles suck, netbook CPU this, low-end GPU that, PC MASTER RACE LOLOLOLLLOOLLLLL", and now, because an actual current-gen game that isn't cross-gen with the last consoles actually uses the hardware you PC MASTER RACE LOLOLLOLOL morons claim to have and claim is so much better, you are raging that the requirements are too high?

I'm not interested in any lame attempts to justify your whining by saying Ubisoft does bad ports. This is the first non-cross-gen game they've ported; how about you stfu and wait and see.

 

RVinP

Unconfirmed Member
I was pretty much convinced that having 25GB+ of baked lighting data in the game would actually decrease system requirements. But damn... the requirements are still high.

Edit
PC gamers sure are a clusterfuck of opinions. How is a developer ever meant to meet your sky-high requirements if you keep moving the goalposts?

I'm not interested in any lame attempts to justify your whining by saying Ubisoft does bad ports. This is the first non-cross-gen game they've ported; how about you stfu and wait and see.

What the fish did I just read?

Edit 2
What the fish, really!... I feel super offended and I am still angry like 15-20 mins after reading the above post (and it's still on my mind).
 

Alebelly

Member
Yea I still like their games, but probably wouldn't ever buy one of their AAA open-world titles Day 1.

That's kinda the thing: I think if you're in this thread and posting, then you must be a fan of their games, at least a little bit.

Otherwise, why would you give a fuck.
 
I was going to pre-order, but you know, I'll hold off for a bit.

The previous ones always ran fine for me?

But you'll miss out on the bonus missions, man!

AC3 is an atrocious PC port.

Yeah, I ran AC1, AC2 and AC4 fine, but I had severe problems with AC3. Nice to see it wasn't my imagination.


That's my point, and the same is true if you actually take the time to investigate pretty much every single instance of what people call "bad PC ports". Just look at The Evil Within for a more recent and even more extreme example.

"Stuttering" is a thing that can happen based on dozens of factors in the hardware/software stack on each individual system. A friend played W_D on a 670 and an i7 920 and didn't experience any stuttering.

In the case of WD, stuttering was a real problem that wasn't just the result of some wonky hardware combinations, but of a great need for VRAM. It happened to the majority of users, not a minority. Stuttering was a problem with Ultra textures and almost disappeared at High texture settings. Maybe your friend played with textures at High or Medium (with a 670, I doubt he played at Ultra); that would explain it more easily than thinking his copy was better optimized than everyone else's.



But it's true that people are reacting too fast to this news; just in the last two months we had games where we discovered at release that the hardware requirements weren't as steep as announced.
 

Lulubop

Member
Just finished skim reading this thread.

PC gamers sure are a clusterfuck of opinions. How is a developer ever meant to meet your sky-high requirements if you keep moving the goalposts?

First you lot are all "omg consoles suck, netbook CPU this, low-end GPU that, PC MASTER RACE LOLOLOLLLOOLLLLL", and now, because an actual current-gen game that isn't cross-gen with the last consoles actually uses the hardware you PC MASTER RACE LOLOLLOLOL morons claim to have and claim is so much better, you are raging that the requirements are too high?

I'm not interested in any lame attempts to justify your whining by saying Ubisoft does bad ports. This is the first non-cross-gen game they've ported; how about you stfu and wait and see.


What the fuck is this? Awful.
 

Tablo

Member
GTX 670 am cry, you've served me well lol
Too bad I'm not home often anymore, else I'd have a GTX 980 ready to tear this game a new graphics submenu.
 

KKRT00

Member
Another thread about bloated requirements, and more complaints.
Did people not learn enough from games in the past? Hell, we just had the Mordor and Evil Within debates!

Those specs are not the real game requirements; just ignore them.

-----
This is the first non-cross-gen game they've ported; how about you stfu and wait and see.
Ryse has already been out for more than a week.
 

Bl@de

Member
Another thread about bloated requirements, and more complaints.
Did people not learn enough from games in the past? Hell, we just had the Mordor and Evil Within debates!

Those specs are not the real game requirements; just ignore them.

I doubt that they even play them on PC... they just love to scream.
 

Omega

Banned
Just finished skim reading this thread.

PC gamers sure are a clusterfuck of opinions. How is a developer ever meant to meet your sky-high requirements if you keep moving the goalposts?

First you lot are all "omg consoles suck, netbook CPU this, low-end GPU that, PC MASTER RACE LOLOLOLLLOOLLLLL", and now, because an actual current-gen game that isn't cross-gen with the last consoles actually uses the hardware you PC MASTER RACE LOLOLLOLOL morons claim to have and claim is so much better, you are raging that the requirements are too high?

I'm not interested in any lame attempts to justify your whining by saying Ubisoft does bad ports. This is the first non-cross-gen game they've ported; how about you stfu and wait and see.


Who shit in your cereal this morning?
 
What the fuck is this? Awful.

No, what is awful is all the idiots with nice computers who shit on consoles for having bad specs, now whining because their PC is near the minimum required for a true current-gen game.

I think he wants all tens of millions of us to get together and agree on a consensus opinion next time before posting.

I'm a PC gamer but not one of these "master race" idiots who shit on everything because their $1500+ PC is so fucking awesome.
 

Alebelly

Member
No, what is awful is all the idiots with nice computers who shit on consoles for having bad specs, now whining because their PC is near the minimum required for a true current-gen game.



I'm a PC gamer but not one of these "master race" idiots who shit on everything because their $1500+ PC is so fucking awesome.

Now we're getting somewhere; please continue.
 