"Which makes.... no sense at all."

He is just saying the PS4 just wasn't quite enough to get to 1080p.
Probably the Xbone runs at ~30-40fps and the PS4 runs 37-47ish at 900p, so they just locked it to 30
(or their engine sucks)
That was a good email. The guy sounded so bitter.
What if they could get the PS4 version running at 45fps, but they didn't want to leave the game with a variable framerate, so it just got locked at 30?
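That's exactly what a hard cap would do: render the frame, then sleep off whatever is left of a fixed budget. A toy sketch of the idea in Python, illustrative only and nothing like Ubisoft's actual engine code:

```python
import time

TARGET_FPS = 30
FRAME_BUDGET = 1.0 / TARGET_FPS  # ~33.3 ms per frame

def run_capped(update, render, frames=300):
    """Run a fixed-cadence loop: do the frame's work, then sleep off the
    leftover budget. A machine pacing ~45fps finishes in ~22 ms and idles
    ~11 ms; a slower machine just runs late and the rate drops."""
    for _ in range(frames):
        start = time.perf_counter()
        update()   # game logic / AI / physics
        render()   # draw the frame
        elapsed = time.perf_counter() - start
        if elapsed < FRAME_BUDGET:
            time.sleep(FRAME_BUDGET - elapsed)  # the "lock": burn the headroom
```

At an uncapped ~45fps pace a frame takes ~22ms, so the cap throws away ~11ms of headroom per frame, but every frame lands on the same 33.3ms cadence instead of bouncing around.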
"Yup, like always."

So when they showed AC:U at E3 they were bullshitting everyone?
Lighting data is ~25 GB? This smells like BS to me.
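A quick back-of-envelope suggests it isn't impossible, though. Every input below is my own guess (lightmap count, resolution, texel format), not a figure from the email:

```python
# Rough size check on baked lighting. All inputs are illustrative guesses,
# not figures from the email.
lightmaps = 3000              # assumed number of baked lightmap textures
texels = 2048 * 2048          # assumed 2K x 2K resolution each
bytes_per_texel = 4           # assumed uncompressed RGBA8

total = lightmaps * texels * bytes_per_texel
print(f"{total / 2**30:.1f} GiB uncompressed")  # -> 46.9 GiB uncompressed
```

Block compression typically buys 4:1 or better, which would pull ~47 GiB raw down into the low tens of gigabytes. So ~25 GB of baked lighting for a city the size of Unity's Paris is at least plausible on paper; whether that's what actually shipped is another question.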
"Nowhere does the developer's comment imply that."

So the dev wants us to believe the PS4 and X1 are equal hardware?
Are you kidding me? What a stupidly unprofessional thing to say. How are Unity's graphics next-gen in any way?

"Mordor has next gen system and gameplay, but not graphics like Unity does."
"This really is about to define a next gen like no other game before."
That's not going to happen with the CPU being the bottleneck in the first place. If the game is CPU bound and running at the same resolution as the Bone version, then they're not utilizing a big portion of the PS4's GPU. Resolution is almost entirely GPU-bound and has very little to do with the CPU.
But Ubisoft still seems to think we're all morons, so their answer isn't too surprising.
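The pixel math backs that up. Per-frame shading cost scales roughly with pixel count, and that load falls almost entirely on the GPU; a rough comparison, ignoring bandwidth and other real-world effects:

```python
# Pixel counts are a rough proxy for per-frame GPU shading cost.
# CPU-side work (AI, physics, draw submission) doesn't scale with them.
pixels_900p = 1600 * 900      # 1,440,000
pixels_1080p = 1920 * 1080    # 2,073,600

print(f"1080p is {pixels_1080p / pixels_900p:.2f}x the pixels of 900p")  # 1.44x
```

1080p pushes ~44% more pixels than 900p, so a game that's genuinely CPU-bound at 900p has GPU throughput sitting unused, which is the point being made above.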
NO, he's basically backing up the fact that the CPU is the limiting factor, and in that department they are pretty much equals, although he does leave the door open for small FPS differences.
So when they showed AC:U at E3 they were bullshitting everyone?
So it will be interesting to see how well the console versions hold 30fps. I'm expecting 30fps with drops, perhaps some heavy ones.
Mordor = Next Gen Gameplay
Unity = Next Gen Graphics
Seems like Monolith made the right trade-off
Nowhere does the developer's comment imply that.
"I'm happy to enlighten you guys because way too much bullshit about 1080p making a difference is being thrown around. If the game is as pretty and fun as ours will be, who cares? Getting this game to 900p was a BITCH. The game is so huge in terms of rendering that it took months to get it to 720p at 30fps. The game was 9fps 9 months ago. We only achieved 900p at 30fps weeks ago. The PS4 couldn't handle 1080p 30fps for our game, whatever people, or Sony and Microsoft say. Yes, we have a deal with Microsoft, and yes we don't want people fighting over it, but with all the recent concessions from Microsoft, backing out of CPU reservations not once, but twice, you're talking about a 1 or 2 fps difference between the two consoles. So yes, locking the framerate is a conscious decision to keep people bullshiting, but that doesn't seem to have worked in the end. Even if Ubi has deals, the dev team members are proud, and want the best performance out of every console out there. What's hard is not getting the game to render at this point, it's making everything else in the game work at the same level of performance we designed from the start for the graphics. By the amount of content and NPCs in the game, from someone who witnessed optimization for lots of Ubisoft games in the past, this is crazily optimized for such a young generation of consoles. This really is about to define a next gen like no other game before. Mordor has next gen system and gameplay, but not graphics like Unity does. The proof comes in that game being cross gen. Our producer (Vincent) saying we're bound with AI by the CPU is right, but not entirely. Consider this, they started this game so early for next gen, MS and Sony wanted to push graphics first, so that's what we did. I believe 50% of the CPU is dedicated to helping the rendering by processing pre-packaged information, and in our case, much like Unreal 4, baked global illumination lighting. The result is amazing graphically, the depth of field and lighting effects are beyond anything you've seen on the market, and even may surpass Infamous and others. Because of this I think the build is a full 50gigs, filling the bluray to the edge, and nearly half of that is lighting data."
Read it in this voice/tone:
https://www.youtube.com/watch?v=0yQzDbv6rS8
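For anyone wondering what "baked global illumination" actually means in that email: the expensive light-transport simulation is run once offline and the results are written to disc (hence the gigabytes of lighting data), so at runtime shading is mostly a cheap lookup. A toy Python illustration of that trade; the function names and numbers are all made up:

```python
def expensive_bounce_lighting(probe: int) -> float:
    # Stand-in for the offline light-transport simulation (the slow part).
    return sum((probe * k % 97) / 97 for k in range(10_000))

# "Build time": bake a result for every probe up front. This table is the
# toy analogue of the gigabytes of lighting data shipped on disc.
baked = {p: expensive_bounce_lighting(p) for p in range(256)}

# "Runtime": shading is now a cheap read of pre-packaged data.
def shade(probe: int) -> float:
    return baked[probe % 256]
```

On that reading, the per-frame CPU work the email describes would be unpacking and feeding that pre-baked data, not recomputing the lighting itself.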
Dat next-gen gameplay.
jk I know the game probably opens up more later on (I would hope, haven't played yet)
"If you're only talking about the framerate, then yes."

Yes. I think that's a reasonable conclusion. It won't stop the parity zealots crying foul though.
"Yes, we have a deal with Microsoft, and yes we don't want people fighting over it, but with all the recent concessions from Microsoft, backing out of CPU reservations not once, but twice, you're talking about a 1 or 2 fps difference between the two consoles."
I know you are joking, but calling SoM "press button to win" when the game you're comparing it to is Assassin's Creed is laughable.
"Oh, I know it's not as simple as it sounds, but they say they've dedicated 50% of the CPU to rendering? I can't really judge that without knowing the details, which I don't, but there's a long history of using GPU power for that kind of work, as I understand it."

I'm sure the people saying "just unload this to this" know exactly how to program a game for a console.
I think the reaction to this PR is eclipsing any real discussion. I wonder if people are reading the quote, regardless of believing it.