
Unofficial response from Assassin's Creed dev on 900p drama. Bombcast 10/14/2014


oh my god
 

Valnen

Member
If the game is as pretty and fun as ours will be, who cares?
That's the thing, Ubisoft. If your game isn't running at my monitor's native resolution of 1920x1080, it won't be that pretty. It will be blurry, because I'm using an LCD computer monitor to run my PS4 games.
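The arithmetic behind that blur complaint, as a minimal sketch (the resolutions are just the standard figures; nothing here comes from Ubisoft):

```python
# Why 900p looks soft on a 1080p panel: the scale factor is non-integer,
# so output pixels can't map 1:1 (or 1:4, etc.) onto source pixels.
src_w, src_h = 1600, 900    # 900p render target
dst_w, dst_h = 1920, 1080   # native panel resolution

print(dst_w / src_w, dst_h / src_h)  # 1.2 and 1.2
# Every 6 output pixels cover 5 source pixels, so the scaler has to
# interpolate between neighbours; on a fixed-pixel LCD that blending
# reads as blur.
```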
 

orochi91

Member
Ubisoft also has said that this new engine was designed for next-gen hardware.
The new hardware is the one with GPU compute that can take over quite a few CPU tasks, according to their own presentation.

One of them is lying; it's either anonymous Ubisoft or public Ubisoft. ;-)

All AAA Ubisoft games will be sub-1080p, I bet.

Seriously, if this engine is what they've come up with to cope with new console tech,
then I can't take them seriously lol
 

GSG Flash

Nobody ruins my family vacation but me...and maybe the boy!
Sounds like a bullshit response; the difference in power between the PS4 and X1 is definitely more than 1 or 2 fps.

My preorder is staying cancelled, Ubi.
 

jayu26

Member
Yuppp...
My favorite part is that there's not a single person on this forum who can identify whether any particular game is 1080 or 900 until digital foundry chimes in, yet it's the most important thing to talk about this gen. Anyhow, the email makes perfect sense and is exactly what I've been assuming since this whole controversy began.
God dammit! I knew it. Curse you awesome eyes of mine that can tell the difference on a well calibrated TV. I want to be like everyone else.
 

KoopaTheCasual

Junior Member
Ubisoft also has said that this new engine was designed for next-gen hardware.
The new hardware is the one with GPU compute that can take over quite a few CPU tasks, according to their own presentation.

One of them is lying; it's either anonymous Ubisoft or public Ubisoft. ;-)
Oh wow, didn't know they publicly stated that in the past. That's even more embarrassing, since CPU bottlenecks should be even less relevant with their fancy pants "next gen engine".

So yea, someone in the company is lying super hard, one way or the other. Either way, Ubisoft looks completely incompetent/disorganized in the end. Sad.
 

ItsTheNew

I believe any game made before 1997 is "essentially cave man art."
Once again, Ubi: just give us a video of Unity running at 1080p on the PS4. If it's a big stuttering mess, you'll avoid all this drama... that is, if you're telling the truth.
 

Etnos

Banned
Yuppp...
My favorite part is that there's not a single person on this forum who can identify whether any particular game is 1080 or 900 until digital foundry chimes in.

pretty much this..

The fuck?

It was a week in and no one was sure what resolution Shadow of Mordor was running at on Xbox One. We all had to wait for the Digital Foundry article so we could actually appreciate how inferior the Xbone version is.
 

vrln

Neo Member
I can't even imagine how the PC port will be.

This is good news for that version. The CPUs in these current-generation consoles are crummy compared to even the worst you can buy for PCs; they are netbook-level stuff. I look forward to this game's PC version showing how far a properly driven (= not the PS4) Radeon 7850 can go when it's not bottlenecked.

I don't really see any point in all this outrage - just a lot of armchair developers. It's clear this game is mostly CPU-bound (as open-world games often are), and in that case resolution parity is to be expected. If the game were a linear "cinematic" experience things would be different, but this isn't Uncharted.
 
Yuppp...
My favorite part is that there's not a single person on this forum who can identify whether any particular game is 1080 or 900 until digital foundry chimes in, yet it's the most important thing to talk about this gen. Anyhow, the email makes perfect sense and is exactly what I've been assuming since this whole controversy began.

Yep. Remember Killzone? Shadow of Mordor 60fps? Also AC4: Black Flag. A lot of people were talking about the 1080p PS4 version until Ubisoft put out the statement that they would be patching in 1080 resolution. lol.
 
Yuppp...
My favorite part is that there's not a single person on this forum who can identify whether any particular game is 1080 or 900 until digital foundry chimes in, yet it's the most important thing to talk about this gen. Anyhow, the email makes perfect sense and is exactly what I've been assuming since this whole controversy began.

You should apply for a job with IGN.
 

sbrew

Banned
He is just saying the PS4 wasn't quite enough to get to 1080p

Probably Xbone runs at ~30-40fps and the PS4 runs 37-47ish at 900, so they just locked it to 30

He said it was a 1 or 2 fps difference, not a 7 fps difference.
 

ymmv

Banned
If the bottleneck is the CPU then this make a lot of sense. The component that is nearly the same in both the PS4 and the XB1 is the CPU. The extra GPU power and memory in the PS4 don't matter if they are not involved in the bottleneck. Sounds reasonable to me.

A CPU bottleneck means the framerate is indeed limited, but since the GPU isn't the limiter, the PS4 should have GPU headroom for a resolution bump.

Ubisoft are lying through their teeth.
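A minimal back-of-the-envelope sketch of that argument; the per-frame timings below are invented purely for illustration:

```python
# Hypothetical frame budget for a CPU-bound game at a 30fps cap.
# None of these timings come from Ubisoft; they only illustrate the logic.
CPU_MS = 33.0        # assumed CPU time per frame (the shared bottleneck)
GPU_MS_900P = 22.0   # assumed PS4 GPU time per frame at 900p

# Fill-bound GPU cost scales roughly with pixel count.
scale = (1920 * 1080) / (1600 * 900)   # = 1.44
gpu_ms_1080p = GPU_MS_900P * scale     # ~31.7 ms, still under 33 ms

frame_ms = max(CPU_MS, gpu_ms_1080p)   # the slowest unit sets frame time
print(f"1080p frame time: {frame_ms:.1f} ms -> {1000 / frame_ms:.1f} fps")
# If the CPU really is the limiter, a 1.44x pixel increase on the GPU
# side doesn't have to cost any framerate -- which is why "the CPU is
# the bottleneck" explains parity between consoles but not, by itself,
# the drop from 1080p to 900p.
```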
 

Chobel

Member
Compressed lighting information. The engine needs to load it, decompress it, and send it to the GPU. Decompressing data in real time can eat up a lot of CPU time, and if you have a massive game world with partially pre-baked lighting, that will take a shitton of space. It's not like textures that you tile; it's unique data. I have no idea if 50% of the CPU time is a reasonable number, but it's possible they decided to use that prebaked GI and were stuck with weak CPUs and not much power left.

OK, but both consoles have special hardware for decompressing textures, so you don't need the CPU for it, or at least not 50% of the CPU.
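For a rough feel of the cost the quoted post describes, here's a tiny, hypothetical benchmark, with zlib standing in for whatever codec the engine actually uses (the payload is fake):

```python
import time
import zlib

# Fake "baked lighting" payload: semi-repetitive so zlib actually compresses.
raw = (b"GI probe block " * 4096) * 64   # ~3.9 MB
blob = zlib.compress(raw, 6)

t0 = time.perf_counter()
for _ in range(50):
    zlib.decompress(blob)
elapsed = time.perf_counter() - t0

mb = len(raw) * 50 / 1e6
print(f"decompressed {mb:.0f} MB in {elapsed:.2f}s ({mb / elapsed:.0f} MB/s)")
# A ~1.6 GHz Jaguar core would manage far less than a desktop CPU, and an
# open-world streamer pays this cost continuously as the world loads in.
# Whether that plausibly adds up to 50% of total CPU time is exactly
# what's in dispute.
```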
 

TGO

Hype Train conductor. Works harder than it steams.
Either it's a fake, or Ubisoft has been hanging round MS too long and caught their "don't know when to shut the fuck up with BS" syndrome.
 

ypo

Member
If there's no difference between 1080p and 900p, surely there's no difference between 900p and 720p. Might as well stick with 720p, or lower, using UbiBuffoon logic.
 

Morts

Member
Everyone calling bullshit sounds like they made up their mind a week ago and won't listen to any new information. Threads like this are also largely useless unless everyone is also posting their software development resumes alongside their arguments.
 

KoopaTheCasual

Junior Member
I don't understand what the CPU strain has to do with resolution.

Am I just really computer illiterate or what's happening here
I think they're taking the popular PR approach of repeating a party line until people buy in. This explanation is no better than the two they previously issued, but already people in this very thread are saying "Yea, that makes sense. I believe them this time."

Just keep repeating it and more people will shrug it off, I guess.
 

SeanR1221

Member
Yuppp...
My favorite part is that there's not a single person on this forum who can identify whether any particular game is 1080 or 900 until digital foundry chimes in, yet it's the most important thing to talk about this gen. Anyhow, the email makes perfect sense and is exactly what I've been assuming since this whole controversy began.

I remember when the patch that changed Black Flag from 900p to 1080p came out. I didn't know what the patch was for and started playing while it downloaded.

Once the download was finished, I quit the game and installed the patch. I immediately noticed it looked different. It seemed a lot crisper.

I went online and checked and sure enough, resolution upgrade.

But lol! No one can tell
 

Bl@de

Member
So they have been in development for 4 years and only got 9fps 9 months ago?

Unbelievable... Or they programmed it for 3 years on a powerhouse PC, started it up on consoles, and cried. And then crunch time begins... Great PC footage shown, console settings lowered and optimized, hand in hand with bullshit Ubisoft blah-blah-blah marketing. You have to love it.
 
CPU on Xbox One is faster.

If you have, e.g., an i5 at 1.75 GHz vs. an i5 at 1.6 GHz, with the 1.6 GHz PC having a GPU half a generation newer and faster, there WILL be a difference in performance in favor of the 1.6 GHz PC, and I'm not even taking the memory advantage into account.
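Translating that analogy to the actual console clocks (1.75 GHz vs 1.6 GHz Jaguar are the publicly reported figures; the 30fps baseline is an assumption):

```python
# If a game is fully CPU-bound, framerate scales roughly with CPU clock.
XB1_CLOCK = 1.75   # GHz, Jaguar (publicly reported)
PS4_CLOCK = 1.60   # GHz, Jaguar (publicly reported)

ps4_fps = 30.0                                 # assumed CPU-bound baseline
xb1_fps = ps4_fps * XB1_CLOCK / PS4_CLOCK      # ~32.8 fps
print(f"XB1 estimate: {xb1_fps:.1f} fps (+{xb1_fps - ps4_fps:.1f} fps)")
# A ~9% clock edge is worth a couple of frames at 30fps -- the same
# order as the "1 or 2 fps" figure in the alleged email. The PS4's GPU
# advantage only shows up once extra work (like resolution) lands on
# the GPU instead of the CPU.
```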
 
If there's no difference between 1080p and 900p, surely there's no difference between 900p and 720p. Might as well stick with 720p, or lower, using UbiBuffoon logic.

Rather just go down to 1p since all resolutions are equivalent. Imagine how well you could light the scene.
 

BWJinxing

Member
So say it's CPU-bound because of baked lighting and package decoding. In that case, on the PS4, why didn't they deploy GPU compute? All this CPU talk with no mention of the GPU. GPU-driven AI, etc., would free up CPU cycles.

I'm walking away with the feeling that it's "good enough for the release date."
 
Everyone calling bullshit sounds like they made up their mind a week ago and won't listen to any new information. Threads like this are also largely useless unless everyone is also posting their software development resumes alongside their arguments.

Like this alleged dev did?
 

Monster Zero

Junior Member
Yep. Remember Killzone? Shadow of Mordor 60fps? Also AC4: Black Flag. A lot of people were talking about the 1080p PS4 version until Ubisoft put out the statement that they would be patching in 1080 resolution. lol.

Let's not forget when Ubisoft uploaded the "PC" footage of Watch Dogs and trolled everyone by saying it was actually the PS4 version. But whatever, let's continue to play big pocket, little pocket.
 

gruenel

Member
Yuppp...
My favorite part is that there's not a single person on this forum who can identify whether any particular game is 1080 or 900 until digital foundry chimes in, yet it's the most important thing to talk about this gen. Anyhow, the email makes perfect sense and is exactly what I've been assuming since this whole controversy began.

Do I know you?
 

orochi91

Member

People on this board are able to see through such blatant lies. But nah, you keep drinking that Kool-Aid.

Everyone calling bullshit sounds like they made up their mind a week ago and won't listen to any new information. Threads like this are also largely useless unless everyone is also posting their software development resumes alongside their arguments.

So I need to be a developer to call this stuff out?

lolno
 
So it will be interesting to see how well the console versions hold 30fps. I'm expecting 30fps with drops, perhaps some heavy ones.



Mordor = Next Gen Gameplay

Unity = Next Gen Graphics

Seems like Monolith made the right trade off
Hahahahaha
They really just insulted themselves, in my opinion. Bragging that you have better graphics than a new LOTR game? Probably with a lower budget, too. Classy, Ubi.
 

SMZC

Member
I have to ask, but do you normally pre-order games in a series you do not find especially fun?

I do not find the AC series fun when speaking strictly from a gameplay standpoint, no, but there are other things in the series that I enjoy. That's why I pre-ordered Unity, and I'd do it again if only Ubisoft's stance on this whole fiasco were different.
 

Asriel

Member
I believe them this time. I really honestly do. He explains that both consoles had trouble reaching 900p, but PS4 could achieve a higher framerate at that resolution than XB1. They still decided to lock it at 30 for consistency. Good call, and perfectly explained IMO.

I think he explained it pretty well also.

The fact that people need to see a crappy 1080p version is eyeroll-worthy. As if their opinions would change after that anyway.
 
Everyone calling bullshit sounds like they made up their mind a week ago and won't listen to any new information. Threads like this are also largely useless unless everyone is also posting their software development resumes alongside their arguments.

What they are saying doesn't make sense. Resolution has nothing to do with the CPU; that's all GPU.

The PS4 has a better GPU than the XBone. The CPUs are identical. That's it.

Unless they reveal that the PS4 has better AA, we're left wondering what happened to the GPU advantage.
 

Chobel

Member
Yes, we have a deal with Microsoft, and yes we don't want people fighting over it, but with all the recent concessions from Microsoft, backing out of CPU reservations not once, but twice, you're talking about a 1 or 2 fps difference between the two consoles

So we're talking full parity here?
 