
Unofficial response from Assassin's Creed dev on 900p drama. Bombcast 10/14/2014

I remember when the patch that changed Black Flag from 900p to 1080p came out. I didn't know what the patch was for and started playing while it downloaded.

Once the patch was finished, I quit the game and installed the patch. I immediately noticed it looked different. It seemed a lot crisper.

I went online and checked and sure enough, resolution upgrade.

But lol! No one can tell

You are one of the few I think.
 
So say it's CPU bound because of baked lighting and package decoding. In that case, on PS4, why didn't they deploy GPU compute? All this CPU talk with no mention of the GPU. GPU-driven AI, etc. would free up cycles for the CPU.

I'm walking away with the feeling that it's good enough for the release date.

Ubi PC releases are usually CPU bound.

They are just incompetent.
 
Factually incorrect.
However hard you find the information to accept, the Xbox One CPU is clocked faster. Accept it.

Ugh...... again:
Same CPUs.
But yes, MS upclocked the CPU a bit, because it was lagging behind the PS4's CPU performance thanks to the Xbone OS overhead (plus Kinect, and more).
Performance-wise (for games), they're ~equal.
 

Jomjom

Banned
Yeah, him calling people bullshitters because they claim 1080p matters is where I stopped reading.

It matters. Simple as that. 900p doesn't look nearly as good on my 1080p display. There's no dispute.
 
With reader's response:

giphy.gif

meanwhile, Ubi's head of PR just back from vacation:

xYyVxC6.gif
 

-griffy-

Banned
Yep. Remember Killzone? Shadow of Mordor 60fps? Also AC4: Black Flag. A lot of people were talking about the 1080p PS4 version until Ubisoft put out the statement that they would be patching in 1080 resolution. lol.

This is some bullshit. Killzone is still rendering a unique 1920x1080 frame in its multiplayer mode; by the very way it works, a proper 1080p image is still being displayed even though half the lines are interpolated, which means it still looks sharper than 900p. AC4 was announced as 900p with a post-release patch before it was even out, so of course people knew it was 900p. No one was gloating about the 1080p version, since we all knew it wasn't 1080p. People on this very board were among the first to deduce that COD: Ghosts was actually running below 1080p in the campaign on PS4, based on nothing but videos and screenshots from websites like IGN, which eventually forced those websites to look into it and a patch to eventually be released.

Plenty of people on this board can tell the difference between 900p and 1080p without some other website needing to confirm it for us.
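The 900p vs 1080p difference the thread keeps arguing about is easy to quantify as a raw pixel count. A minimal sketch (assuming the usual 16:9 meaning of "900p", i.e. 1600x900):

```python
# Raw pixel counts for the two resolutions argued about in this thread.
res_900p = (1600, 900)    # "900p" as commonly rendered (16:9)
res_1080p = (1920, 1080)  # native 1080p

pixels_900p = res_900p[0] * res_900p[1]     # 1,440,000 pixels per frame
pixels_1080p = res_1080p[0] * res_1080p[1]  # 2,073,600 pixels per frame

# 1080p pushes ~44% more pixels per frame than 900p, which is why
# the difference is visible to the pixel counters mentioned above.
extra = pixels_1080p / pixels_900p - 1
print(f"1080p renders {extra:.0%} more pixels than 900p")
```

That 44% gap in per-frame shading work is the whole argument in one number.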
 
Wouldn't it be hilarious if this e-mail is not only legit (in terms of coming from an Ubisoft employee), but it was actually sanctioned by Ubisoft?

I don't buy into crazy conspiracy theories, but if I was a higher-up at Ubisoft, I'd toss around the idea of things like this. Get the word out without marketing spin, make it seem legit.
 

dose

Member
He's just saying the PS4 wasn't quite enough to get to 1080p.

Probably the Xbone runs at ~30-40 fps and the PS4 runs 37-47ish at 900p, so they just locked it to 30.
And that's why he said there's a 2 fps difference? What you said makes no sense.
 
I think he explained it pretty well also.

The fact that people need to see a crappy 1080p version is eyeroll-worthy. As if their opinions would change after that anyway.

No. He's only explaining framerate.

Read any tech article. Look up anything related to computers.

http://www.polygon.com/2014/6/5/5761780/frame-rate-resolution-graphics-primer-ps4-xbox-one

"While increasing the resolution only increases GPU load, increasing the frame rate also increases the CPU load significantly," Thoman told Polygon.

All of this talk is focused around the CPU.
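The Polygon quote is really about per-frame time budgets: raising the frame rate shrinks the deadline the CPU has to finish each frame's simulation work, while raising resolution mostly adds per-pixel GPU work. A rough illustration of the budget arithmetic:

```python
def frame_budget_ms(fps):
    """Time available to simulate and submit one frame, in milliseconds."""
    return 1000.0 / fps

# Doubling the frame rate halves the CPU's per-frame deadline;
# changing resolution doesn't touch this number at all.
print(f"30 fps budget: {frame_budget_ms(30):.1f} ms per frame")
print(f"60 fps budget: {frame_budget_ms(60):.1f} ms per frame")
```

So a game whose AI and simulation barely fit in 33 ms can be CPU-bound at 30 fps regardless of whether the GPU is rendering 900p or 1080p.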

This thread disgusts me. You guys should watch this video:

https://www.youtube.com/watch?v=F0JwSuIyDNk
It's not over resolution. It's over the fact it's artificially capped to make the platforms equal. There's literally no reason why an F1 racer should be governed so it can only go as fast as a Pinto.
 
So they're still only talking about FPS instead of the 900p/PS4 issue?
If the Xbone can do 900p, the PS4 can do more.


lol....


If they only compare the CPUs to each other, they might be right.
Resolution/GPU/RAM is a different story.

Discrete CPU hardware...yes, they are very close to one another. The extra CUs utilized for Compute (GPGPU) on Sony's machine mean that it is capable of offloading more from the CPU in theory (e.g., if some aspects of physics, lighting, and simulations were offloaded from the CPU). It is up to the dev to leverage this extra hardware. Ubisoft have obviously not leveraged it.
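The GPGPU point above is just about moving parallel-friendly work off the CPU. A toy sketch of the scheduling idea (the job names are invented for illustration; a real engine would do this with compute shaders, not Python):

```python
# Hypothetical per-frame job list; names are illustrative only.
jobs = {
    "ai_pathfinding": "cpu",
    "crowd_simulation": "cpu",
    "physics_broadphase": "cpu",
    "animation_skinning": "cpu",
    "draw_call_submission": "cpu",  # must stay on the CPU
}

# Data-parallel jobs are good candidates for GPU compute;
# offloading them frees CPU cycles for the work that has to stay.
offloadable = {"crowd_simulation", "physics_broadphase", "animation_skinning"}
for job in jobs:
    if job in offloadable:
        jobs[job] = "gpu_compute"

cpu_jobs = [j for j, where in jobs.items() if where == "cpu"]
print(cpu_jobs)
```

The trade-off, of course, is that every job moved to compute eats GPU time that could otherwise go toward resolution, which is exactly the tension this thread is arguing about.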
 
His tone makes me wonder if he really is a programmer...

Anyway, trying to identify reasons why resolution is so CPU-bottlenecked in this game: I think maybe they took into account the differences in GPU and memory architecture between the main platforms and started developing the engine based on the lowest common denominator between the two, which is the CPU.
If they then didn't bother with much optimization for each platform and locked in the same resolution and frame rate, it would sort of explain it...
 
This really is about to define next gen like no other game before. Mordor has next-gen systems and gameplay, but not graphics like Unity does.

I'll believe it when I see it. Just because Mordor doesn't have very complex lighting doesn't mean it doesn't look better. The shaders in Mordor are crazy! I haven't seen gameplay in Unity yet that can match it.

I believe 50% of the CPU is dedicated to helping the rendering by processing pre-packaged information, and in our case, much like Unreal 4, baked global illumination lighting. The result is amazing graphically, the depth of field and lighting effects are beyond anything you've seen on the market, and even may surpass Infamous and others. Because of this I think the build is a full 50 gigs, filling the bluray to the edge, and nearly half of that is lighting data.

I very seriously doubt Unity will have lighting that matches Alien. We'll see. For now, Alien is unmatched. I don't know why they mentioned Infamous though.
 

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
Ugh...... again:
Same CPUs.
But yes, MS upclocked the CPU a bit, because it was lagging behind the PS4's CPU performance thanks to the Xbone OS overhead (plus Kinect, and more).
Performance-wise (for games), they're ~equal.

Factually incorrect. The Xbox One CPU is clocked faster.
 

Nabbis

Member
Those next gen graphics.....
XD

Seriously though, since when does 1080p & 60fps make a game have 'next gen graphics'? The graphics have much more to do with the IQ than the resolution.....

Native resolution matters much more for the IQ than the graphical settings you gain by downgrading from 1080p 30fps to 900p 30fps. HBAO alone could make up that difference, and that's not exactly a very noticeable effect imo. If your game's graphics will truly take such a hit from the resolution increase that it looks ugly from anything but the resolution's perspective, then you simply have shit programmers.

So yeah, this 900p thing is a disgrace.
 

vrln

Neo Member
People on this board are able to parse through such blatant lies. But nah, you keep drinking that kool-aid.

Or this forum just has a tendency towards hive-mind/group think infested behaviour just like every other forum. This is just what happens when people are emotionally invested in their console of choice and standard group psychology mechanisms set in. I personally find the whole "MS is paying for not using PS4 power advantage" claims pretty much on the same level as theories like the US never setting foot on the moon, fluoride being used for mind control and so forth.
 
Or this forum just has a tendency towards hive-mind/group think infested behaviour just like every other forum. This is just what happens when people are emotionally invested in their console of choice and standard group psychology mechanisms set in. I personally find the whole "MS is paying for not using PS4 power advantage" claims pretty much on the same level as theories like the US never setting foot on the moon, fluoride being used for mind control and so forth.

Yes, it's not the fact that most of us don't like being fed BS, it's totally just the GAF hive mind persecuting you.
 

Rourkey

Member
Devs would be better off ignoring the resolution question and leaving the resolution arguments to the pixel counters at DF and the related GAF threads; it does them no good discussing it at all.
 

jelly

Member
Unless they reveal that the PS4 has better AA, we're wondering what happened to the GPU advantage.

Ubisoft is giving the PS4 a well-deserved break after great success, to enjoy a cup of tea and put its little feet up. The Xbox One hasn't been doing well and needs to work harder.
 

Ploid 3.0

Member
Hahah Ubisoft, poor Ubisoft.

This is very funny, they are breaking down emotionally it seems. What will they do next? Continue digging a hole for themselves.
 
Or this forum just has a tendency towards hive-mind/group think infested behaviour just like every other forum. This is just what happens when people are emotionally invested in their console of choice and standard group psychology mechanisms set in. I personally find the whole "MS is paying for not using PS4 power advantage" claims pretty much on the same level as theories like the US never setting foot on the moon, fluoride being used for mind control and so forth.

Uh oh.
 
The Xbone version is not currently at 900p/30 in the YouTube video released a few days ago. It seemed like 20~25 fps. Maybe that's why they had to lock the PS4 version at 900p to get it to run at 30 fps.

Going from 20~25 fps to a locked 30 fps would require about the same power difference as there is between the Xbone and PS4.
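The arithmetic behind that guess: taking 20~25 fps up to a locked 30 is roughly a 1.2x~1.5x jump, which is in the same ballpark as the often-cited raw GPU gap between the two consoles. A quick check:

```python
# Speed-up needed to take the observed Xbone range to a locked 30 fps.
observed_low, observed_high = 20.0, 25.0
target = 30.0

speedup_low = target / observed_high   # 1.2x, if it really ran at 25 fps
speedup_high = target / observed_low   # 1.5x, if it really ran at 20 fps

# Often-cited raw shader throughput: PS4 ~1.84 TFLOPS vs Xbone ~1.31 TFLOPS.
gpu_gap = 1.84 / 1.31  # roughly 1.4x

print(f"needed: {speedup_low:.2f}x to {speedup_high:.2f}x; raw GPU gap: {gpu_gap:.2f}x")
```

Which only holds if the bottleneck is actually the GPU; if the game is CPU-bound, as the email claims, the comparison falls apart.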
 
Factually incorrect.
However hard you find the information. The Xbox One CPU is clocked faster. Accept it.

It is clocked faster, but Matt says it is lower performing. That could have to do with the way the memory subsystems interface with the CPU. The PS4 has a memory subsystem that is nearly fully HSA-compliant, and that can be the deciding factor if copy cycles can be saved (between the CPU and GPU).

Matt is a verified 3rd party dev by the way.
 

benny_a

extra source of jiggaflops
I always take off-the-cuff remarks as scientifically proven facts.
The colloquial definition of "off-the-cuff" means not prepared in advance.

Writing a mail and hitting send to write in to a podcast does not seem to fit that definition.
 
the depth of field and lighting effects are beyond anything you've seen on the market, and even may surpass Infamous and others.

So, is it beyond anything we've seen or is it on the same playing field as Infamous? Because if it "may surpass Infamous" then I have a hard time believing it's beyond anything we've seen.

Anyway, if they're using Infamous as the reference when talking about lighting, the no day/night cycle rumors might be true. Sucky.
 

Calabi

Member
I can't wait to see what the graphics look like and how next gen they will be.

But I think Ubisoft should stop speaking now.
 
Q

Queen of Hunting

Unconfirmed Member
I expect nothing less from a game being made by over 900 people.
 
This is some bullshit. Killzone is still rendering a unique 1920x1080 frame in its multiplayer mode; by the very way it works, a proper 1080p image is still being displayed even though half the lines are interpolated, which means it still looks sharper than 900p. AC4 was announced as 900p with a post-release patch before it was even out, so of course people knew it was 900p. No one was gloating about the 1080p version, since we all knew it wasn't 1080p. People on this very board were among the first to deduce that COD: Ghosts was actually running below 1080p in the campaign on PS4, based on nothing but videos and screenshots from websites like IGN, which eventually forced those websites to look into it and a patch to eventually be released.

Plenty of people on this board can tell the difference between 900p and 1080p without some other website needing to confirm it for us.

Sorry "native 1080p" and for AC4, THAT is some bullshit.
 