That's the thing, Ubisoft. If your game isn't running at my monitor's native resolution of 1920x1080, it won't be that pretty. It will be blurry, because I'm using an LCD computer monitor to run my PS4 games.

If the game is as pretty and fun as ours will be, who cares?
I'm sure the people saying "just unload this to this" know exactly how to program a game for a console.
Ubisoft also has said that this new engine was designed for next-gen hardware.
The new hardware is the one with GPU compute that can take over quite a bit of CPU tasks according to their own presentation.
One of them is lying, it's either anonymous Ubisoft or public Ubisoft. ;-)
God dammit! I knew it. Curse you, awesome eyes of mine that can tell the difference on a well-calibrated TV. I want to be like everyone else.

Yuppp...
My favorite part is that there's not a single person on this forum who can identify whether any particular game is 1080 or 900 until digital foundry chimes in, yet it's the most important thing to talk about this gen. Anyhow, the email makes perfect sense and is exactly what I've been assuming since this whole controversy began.
Oh wow, didn't know they publicly stated that in the past. That's even more embarrassing, since CPU bottlenecks should be even less relevant with their fancy-pants "next gen engine".
Meanwhile at Nintendo:
[url]http://a.pomf.se/hcbzwq.webm[/url]
The fuck?
I can't even imagine how the PC port will be.
Yuppp...
My favorite part is that there's not a single person on this forum who can identify whether any particular game is 1080 or 900 until digital foundry chimes in, yet it's the most important thing to talk about this gen. Anyhow, the email makes perfect sense and is exactly what I've been assuming since this whole controversy began.
That is really sad. Why are you proud of that?
He's just saying the PS4 wasn't quite enough to get to 1080p.
Probably the Xbone runs at ~30-40fps and the PS4 runs 37-47ish at 900p, so they just locked both to 30.
If the bottleneck is the CPU then this make a lot of sense. The component that is nearly the same in both the PS4 and the XB1 is the CPU. The extra GPU power and memory in the PS4 don't matter if they are not involved in the bottleneck. Sounds reasonable to me.
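A toy back-of-envelope to illustrate that argument (the frame times here are made up for illustration, not Ubisoft's actual numbers): if the CPU stage takes longer than the GPU stage, a faster GPU doesn't raise the framerate at all.

```python
# Toy frame-time model: a frame can't finish before its slowest stage.
# All millisecond figures are invented for illustration.
def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Framerate when the frame is gated by the slower of CPU and GPU."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# Hypothetical: same CPU load on both consoles, different GPU headroom.
ps4_fps = fps(cpu_ms=33.0, gpu_ms=25.0)   # stronger GPU, still CPU-bound
xb1_fps = fps(cpu_ms=33.0, gpu_ms=30.0)   # weaker GPU, also CPU-bound

print(ps4_fps, xb1_fps)  # both ~30.3 fps: the GPU gap never shows up
```

With these assumed numbers both machines land on the same framerate, which is the "extra GPU power doesn't matter if it's not the bottleneck" point in code form.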
Since when is 1080p/60 fps not next-gen graphics?
If you only compare the CPUs, they might be right.

Sounds like a bullshit response; the difference in power between the PS4 and X1 is definitely more than 1 or 2 fps.
You should apply for a job with IGN.
Compressed lighting information. The engine needs to load it, decompress it, and send it to the GPU. Decompressing data in real time can eat up a lot of CPU time, and if you have a massive game world with partially pre-baked lighting, that will take a shitton of space. It's not like textures that you tile; it's unique data. I have no idea if 50% of the CPU time is a reasonable number, but it's possible they decided to use that prebaked GI and were stuck with weak CPUs and not much power left.
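A minimal sketch of that load-decompress-upload pattern (nothing to do with Ubisoft's actual engine; just stdlib `zlib` on a toy payload): every streamed chunk has to be inflated on the CPU before the GPU can see it, and that cost scales with how much unique data the world streams.

```python
import time
import zlib

# Toy stand-in for a streamed lightmap chunk. Real pre-baked GI data
# would be far less compressible than this repetitive payload.
chunk = bytes(range(256)) * 4096               # ~1 MiB payload
compressed = zlib.compress(chunk, level=9)     # done offline, at bake time

start = time.perf_counter()
for _ in range(100):                           # simulate streaming 100 chunks
    out = zlib.decompress(compressed)          # CPU-side inflate, every chunk
elapsed = time.perf_counter() - start

assert out == chunk                            # decompression is lossless
print(f"decompressed 100 chunks in {elapsed:.3f}s")
```

Because the data is unique rather than tiled, you can't amortize this the way you can with reused textures; each chunk pays the inflate cost on load.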
He said it was a 1 or 2 fps difference not a 7 fps difference.
I think they're taking the popular PR approach of repeating a party line enough until people buy in. This explanation is no better than the two they previously issued, but already people in this very thread are saying "Yeah, that makes sense. I believe them this time."

I don't understand what the CPU strain has to do with resolution.
Am I just really computer illiterate, or what's happening here?
So they have been in development for four years and only got 9fps nine months ago?
The CPU on the Xbox One is faster.
If there's no difference between 1080p and 900p surely there's no difference between 900p and 720p. Might as well stick with 720p, or lower using UbiBuffoon logic.
Everyone calling bullshit sounds like they made up their mind a week ago and won't listen to any new information. Threads like this are also largely useless unless everyone is also posting their software development resumes alongside their arguments.
Yep. Remember Killzone? Shadow of Mordor at 60fps? Also AC4: Black Flag. A lot of people were talking about the 1080p PS4 version until Ubisoft put out the statement that they would be patching in 1080p. lol.
Or, which makes more sense, they simply went for parity.
They're the same and you know that.
The graphics have much more to do with the IQ than the resolution does...
My preorder is staying cancelled, Ubi.
Oh yeah. 900p and 30fps. The pinnacle of next-gen technology.

What he said was stupid, yes, but this is an equally stupid thing to say. Unity looks very much next gen.
snip
The random jab at Mordor is pretty funny.
Hahahahaha

So it will be interesting to see how well the console versions hold 30fps. I'm expecting 30fps with drops, perhaps some heavy ones.
Mordor = Next Gen Gameplay
Unity = Next Gen Graphics
Seems like Monolith made the right trade off
I have to ask, but do you normally pre-order games in a series you do not find especially fun?
I believe them this time. I really honestly do. He explains that both consoles had trouble reaching 900p, but PS4 could achieve a higher framerate at that resolution than XB1. They still decided to lock it at 30 for consistency. Good call, and perfectly explained IMO.
Yes, we have a deal with Microsoft, and yes we don't want people fighting over it, but with all the recent concessions from Microsoft, backing out of CPU reservations not once, but twice, you're talking about a 1 or 2 fps difference between the two consoles