
DF Performance Analysis: Diablo 3 at 1080p on Xbox One

pixlexic

Banned
I hope they include the PC version in the final comparison. Something seemed off to me playing this morning coming from the PC version. I just can't tell what, though, without seeing the versions side by side.
 
60 fps most of the time with dips during heavier instances is pretty normal. I don't know any game that's 60 fps all the time. TLoU for example has dips. I mean every game has dips.
 
My guess is they've capped the framerate at 60 on both systems in order to reduce stutter. Unlocked, it's certainly possible the PS4 (and XB1) could hit noticeably higher numbers. I haven't watched the DF analysis yet, but if the PS4 isn't dropping frames at all then that means an unlocked framerate would give it at least 60+ on the low end when action is busy, so it could very well be going MUCH higher than that on occasion.

So in any case, I don't think they gimped the PS4 version at all. We just can't see the performance gap because they capped the framerate, similar to games that run at 1080p/30 on both systems. PS4 might be hitting 45-50 often and XB1 sitting in the 30s, but when you cap them both it will appear to be the same performance.

Yes, fair observation - it might well be that the PS4 could consistently run at 70, for example, while the Xbox can just about make 60 generally speaking... the 30 fps vs. 45 fps - wasn't that the approach on TR, as you say (locked at 30 on Xbox, uncapped on PS4)?

I must admit the res bump on the XBox suddenly has me interested in this game - which is odd as I'm not a gfx junkie, but I guess this may be the straw that breaks the camel's back in terms of interest in the game.
 

DSN2K

Member
Perhaps MS are offering financial benefits to devs for getting games to 1080p, so Blizzard compromised.

Didn't Sony already suggest during launch that they were pushing devs for 1080p?
 

zedge

Member
"Huge" double standards that are prevalent here? care to elaborate?

Last time I checked this is GAF not N4G....
Fooled me.

Anyway, I'm happy with what they have done, to be honest. Without a framerate counter I doubt anyone would notice 60 vs 5x fps when playing. They may still be able to optimize more. The engine is a pig apparently, as there are issues even on high-end PCs.
 

Percy

Banned
"Reaching parity with our partners has been important. But in the end I don't want it to be about a number, because 1080p isn't some mythical, perfect resolution. Framerate to me is significantly more important to gameplay than resolution and the mix of those two which brings the right art style and freedom, whether it's on PlayStation or our platform."

I would say he worded it exactly right so people couldn't say he contradicted himself.

Right before the part you bolded:

Framerate to me is significantly more important to gameplay than resolution

This statement is clearly at odds with what Blizzard said happened here. It's not unreasonable to call this a contradiction.
 
Right before the part you bolded:

Framerate to me is significantly more important to gameplay than resolution

This statement is clearly at odds with what Blizzard said happened here. It's not unreasonable to call this a contradiction.

I don't think so - framerate can be seen as "way more important" without making resolution "completely unimportant" - and this game seems to be a good example...

1080p @ mostly 60 vs. 900p @ always 60 - it's a big gain in pixels for only a small fps tradeoff.

However, if it was a choice between 900p @ 60fps and 1080p @ 45-50fps, then it would go against Phil's statement, as FPS is the most (but not only) important thing.
 

p3n

Member
As many people have already said: if the game only dips down to the 50s in the most taxing areas then that's a worthy compromise for 1080p.

They haven't come close to anything taxing in their tests yet. A 4 player Rift with multiple packs on screen: if the game can somehow stay above 30fps it should be playable.
 
I actually would like to see the max mob density on the consoles. Other than that, it seems Blizzard did a good job.

EDIT:

Seems to be the same as PC, but I'm not too far into the game yet.

Nice. I played this game before building my desktop, so I didn't max it out on my laptop, but going by streams from other PC players, some areas had a shitload of mobs on screen, and the console gameplay shown so far didn't have that many at the same time. I'm not expecting 1:1 with a high-end PC, but I'd like to know just out of curiosity.
 
This statement is clearly at odds with what Blizzard said happened here. It's not unreasonable to call this a contradiction.

The full quote is not at odds with anything. People are taking one phrase out of this quote and using it. Typical. In layman's terms he is saying he wants what is best, and most will agree with him that sacrificing a few frames for 1080p is a good decision. That's why I highlighted what I did. While he did say framerate is most important, getting the right mix is their goal. Geez, so many people demonizing the guy.
 
This statement is clearly at odds with what Blizzard said happened here. It's not unreasonable to call this a contradiction.

You're taking that quote, conveniently, to be an extreme statement. As if you know Phil Spencer himself DEMANDS 60fps locked to enjoy video games. The more likely scenario is that he was referring to games running poorly, and reducing the resolution is worth it if it means the game runs smoothly. For 99% of people that play video games, this is a smooth gameplay experience.
 

stay gold

Member
Right before the part you bolded:

Framerate to me is significantly more important to gameplay than resolution

This statement is clearly at odds with what Blizzard said happened here. It's not unreasonable to call this a contradiction.

If the push to 1080p resulted in an average framerate of say 49fps, then this would be at odds with what Spencer said and a real issue.

If DF are correct and the game is still a 60fps game with the odd small drop here and there, then it runs like 99.9% of 60fps console games: no issue at all.

Phil's statement is basically saying that they need to find the happiest medium between fps and res rather than prioritising resolution over all else. It is not saying that devs must drop resolution if he spots a single framedrop.
 
Like you were in the post I initially quoted you mean?

Well, seems there's no winning with you. If I had bolded the whole quote you would have seen FRAMERATE, but if you think a few frames will make or break the game, then more power to you. G'day.
 

Frostburn

Member
I own D3 and ROS for PC and I own a PS4, and I will be playing it tonight with my wife on the PS4. For anyone thinking of buying the Xbox One version because it's their only next-gen system or because that's where their friends are: DO IT! The game is really fun co-op, and from all the video I've seen so far this morning of the Xbox One version it looks and runs GREAT. You probably won't even see the small dips into the mid-to-high 50s; the game plays amazingly on console. Do yourself a favor and just get it if you were on the fence. You'll enjoy the game, and the whole "1080p but no longer locked at 60fps" thing will never even enter your mind while playing.
 

PSGames

Junior Member
"We did find it challenging early on to get it to 1080p. That's why we made the decision to drop to 900. That's what we demoed and were showing around E3 time. And Microsoft was just like, 'This is unacceptable. You need to figure out a way to get a better resolution.' So we worked with them directly, they gave us a code update to let us get to full 1080p."

From the quote it seems like MS actually rewrote more efficient code for them? Or is he just referring to the SDK update?
 

Percy

Banned
Well, seems there's no winning with you. If I had bolded the whole quote you would have seen FRAMERATE, but if you think a few frames will make or break the game, then more power to you. G'day.

I said nothing about what having a worse framerate than it initially did would actually mean for the game. I merely pointed out that Phil Spencer clearly placed greater importance on framerate than resolution in that statement, while MS' communication with Blizzard clearly placed higher importance on resolution than framerate. That's all I did.

This isn't about me, is it?
 

Krakn3Dfx

Member
The areas they tested for drops aren't anywhere nearly as taxing as some of the coop Nephalem Rift stuff later on, low 50s will probably be the best case scenario once people are really digging into late game Torment content.

I wonder what a side-by-side comparison of this thread and the PS4 Tomb Raider Definitive Edition DF thread would bear out. I remember a lot of people complaining about that game dropping down under 60fps at 1080p and screaming for a 30fps lock option or a drop in resolution.
 

AgentP

Thinks mods influence posters politics. Promoted to QAnon Editor.
I'm honestly a little surprised there's only a 17% max drop in performance from a boost to 1080p

Then you don't understand how frame rates work. If a game is getting 60fps, that only means it is updating the frame buffer at least 60 times a second. So at 900p it could have been hitting 65-70+ fps often. Now they are dropping below 60fps. Unless you have an analysis of the uncapped frame rate before and after, you will never know what the bump cost - you are only seeing the frame rate at 60fps and below.
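
A minimal sketch of that point, with made-up per-frame render times rather than anything measured, just to show what a capped counter would report:

# Hypothetical per-frame render times in ms (illustrative only, not measurements):
# "900p-ish" has headroom above 60fps, "1080p-ish" does not.
render_ms = {
    "900p-ish":  [14.0, 13.5, 15.0, 14.2, 16.0, 13.8],
    "1080p-ish": [16.5, 17.0, 16.8, 18.5, 17.2, 16.9],
}
REFRESH_MS = 1000.0 / 60.0  # 60Hz display: one refresh every ~16.7ms
for label, frames in render_ms.items():
    uncapped_fps = 1000.0 / (sum(frames) / len(frames))
    # Simplified 60fps cap: a frame is never shown sooner than the next refresh,
    # so anything faster than ~16.7ms gets held back to ~16.7ms.
    capped = [max(ms, REFRESH_MS) for ms in frames]
    capped_fps = 1000.0 / (sum(capped) / len(capped))
    print(f"{label}: ~{uncapped_fps:.0f}fps uncapped, counter shows ~{capped_fps:.0f}fps")

Both rows read almost the same once capped (~60 vs ~58) even though the uncapped headroom is completely different, which is exactly why you can't read the cost of the resolution bump off a capped counter.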
 

Jomjom

Banned
Should have left an option to toggle between 900p and 1080p a la Final Fantasy XIV on PS4. You would think a PC dev like Blizzard would be smarter.
 
From the quote it seems like MS actually rewrote more efficient code for them? Or is he just referring to the SDK update?

There's a lot of things that quote could mean. The simplest explanation is that they got in touch with some of the Xbox developer integration people and hammered out the proper way to make various API calls for optimization.

I've spent more than a few final weeks on the phone having those very conversations for programming jobs. In the gaming industry, I'd expect that it's a day of small 0.25% performance gains that add up to something useful.
 
The areas they tested for drops aren't anywhere nearly as taxing as some of the coop Nephalem Rift stuff later on, low 50s will probably be the best case scenario once people are really digging into late game Torment content.

I wonder what a side-by-side comparison of this thread and the PS4 Tomb Raider Definitive Edition DF thread would bear out. I remember a lot of people complaining about that game dropping down under 60fps at 1080p and screaming for a 30fps lock option or a drop in resolution.

That's the real issue. Act II is nothing compared to some of the rifts you'll see on higher torment levels. I've done some on PC that started out with a barrage of enemies so dense that you fill the rift meter right there at the beginning. I can imagine the drops there will be higher.
 

Raist

Banned
So it runs at 60fps with dips to the low 50s? What's wrong with that?! Would anyone even notice if it was 60 or 52? Doesn't The Last of Us dip to below 50? Should have gone 900p. ;)

It doesn't matter if it's 43, 53, 57 or 48. The point is that it's out of sync with the display's refresh rate and thus causes judder.
It's not like comparing driving at 50km/h vs 60.

MK8's recurring drops to 59fps are very noticeable to many people.
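
A rough way to see where the judder comes from - a minimal sketch assuming a 60Hz display and plain double-buffered vsync (not how any specific game actually presents frames):

import math
REFRESH_MS = 1000.0 / 60.0  # 60Hz display
def display_durations(fps, n_frames=60):
    # How many 60Hz refreshes each frame stays on screen when rendering
    # back to back at `fps`, assuming simple double-buffered vsync.
    frame_ms = 1000.0 / fps
    # Refresh index at which each frame becomes visible (first refresh
    # boundary at or after the frame finishes rendering).
    flips = [math.ceil(k * frame_ms / REFRESH_MS - 1e-9) for k in range(1, n_frames + 1)]
    return [b - a for a, b in zip(flips, flips[1:])]
for fps in (60, 59, 52):
    held = display_durations(fps)
    print(f"{fps}fps: {held.count(2)} of {len(held)} frames shown for 2 refreshes {held[:12]}")

At a locked 60 every frame sits on screen for exactly one refresh; at 59 one frame per second gets doubled (the MK8 hitch); at ~52 roughly one frame in six does, and it's that uneven cadence rather than the raw number that reads as judder.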
 

rokkerkory

Member
It doesn't matter if it's 43, 53, 57 or 48. The point is that it's out of sync with the display's refresh rate and thus causes judder.
It's not like comparing driving at 50km/h vs 60.

MK8's recurring drops to 59fps are very noticeable to many people.

What % of the MK8 owners do you think notice?
 
1080p definitely the right decision in this case. Looks pretty damn solid in this video, a few drops but nothing nearly as bad as TLOU:R. https://www.youtube.com/watch?v=lvtMXYHasRQ

Never drops below 52, and the vast majority of the time it's at 60. Good on MS for helping the devs reach their potential. I guess once they teach each studio the ins and outs, the MS engineers won't need to go back.

Good performance, shame I'm not interested in the title :/
 
It doesn't matter if it's 43, 53, 57 or 48. The point is that it's out of sync with the display's refresh rate and thus causes judder.
It's not like comparing driving at 50km/h vs 60.

MK8's recurring drops to 59fps are very noticeable to many people.

MK8 duplicated a frame every 59 frames. That's why it was noticeable.
 

Dunlop

Member
Never drops below 52, and the vast majority of the time it's at 60. Good on MS for helping the devs reach their potential. I guess once they teach each studio the ins and outs, the MS engineers won't need to go back.

Good performance, shame I'm not interested in the title :/

I think maybe that is a big part of it: MS is giving the assistance needed to push the performance of these games up.
 

Seik

Banned
Do you have an XB1?

Not yet, but I intend to buy one in the coming months. I'm just waiting for the library to fill out a bit, and my wallet to get beefier too.

The thing is, I got a couple of friends that will probably play on X1...since it's mostly MP I wasn't sure on which version to pull the trigger on. As it is, I'll probably end up buying both, hahaha!

Not having an X1 doesn't make my opinion worthless, though. I want this version running the best it can as much as I want the PS4 version to, please don't mix me in the console war BS that's been floating in this thread.

EDIT: Already ordered the PS4 version earlier today though, will get it tomorrow.
 
The 360 version can end up in single digit framerates during the crazier parts. The early game frame drops are nothing compared to how crazy the game gets on the highest difficulty. So basically, if this game goes down to ~50 fps during Act II on what I assume is Normal, then I bet it's going to drop quite a bit more on higher difficulties.
 
So one hand of MS tells gamers framerate is more important, while another tells devs that resolution is more important. Communicate with yourself, MS, and make up your mind.
 

Sanpei

Member
A drop from 60fps to 50fps is not noticeable, and it only happens in certain situations... so going with 1080p is a good choice.
 

panda-zebra

Banned
What's the most important thing to keep an eye on in a DF tech performance video?

I noticed the frame rate ticker seems to update the least often: right at the start there are four distinct levels in the rolling graph, but the frame rate counter in the corner just moves from 60 to 58 and back to 60 - I assume that's because it's an average over a given window? The graph seemed to go 60-59-58-57-58-59-60 while the counter showed an average of 58 for a time.

Bearing in mind they're showing us real-time 60fps gameplay via 30fps YouTube, I'd imagine there's some level of smoothing, and any momentary or very infrequent stutter gets masked? Given that, would the Frame Time indicator be the one to keep an eye on to see how consistently a game is running, rather than the Frame Rate?
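
To illustrate what I mean, here's a toy example (made-up numbers, not whatever DF's overlay actually does) of how an averaged FPS counter can hide the exact kind of one-off stutter a frame-time readout would show:

# Toy numbers: one 50ms hitch in roughly a second of otherwise clean 60fps play.
frame_times_ms = [16.7] * 30 + [50.0] + [16.7] * 29
avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)  # counter averaging the whole window
worst_ms = max(frame_times_ms)                                # what a frame-time trace would show
print(f"averaged counter: ~{avg_fps:.0f}fps")    # ~58fps, looks basically fine
print(f"worst single frame: {worst_ms:.0f}ms")   # the one stutter you'd actually feel

The averaged counter still reads ~58fps for that second even though there was a 50ms hitch right in the middle of it, so the frame-time readout does seem like the more honest thing to watch.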

People are suggesting that further progression into the game will offer more taxing and therefore potentially more interesting material for analysis. Hopefully the ability to make use of previous saves from the older game will remove any practical barrier to capturing this in terms of the required time investment on their behalf (this might not be so, I'm not up on the game and how this works, it will be my first time playing when I receive it).

So many questions, sorry if there's too much OT with regards to this specific video in parts of this post.
 

ypo

Member
The DF analysis is done with only a single player. It's only going to get worse in real-world situations.
 
Can't quite get my head round this: is it that 900p vs 1080p is not noticeable, or that 60fps vs 50fps is not noticeable? Because if it's the first, why bother upping the res, and if it's the second, does that mean they should have kept it at 900p so we don't notice the FPS?
 

zedge

Member
It doesn't matter if it's 43, 53, 57 or 48. The point is that it's out of sync with the display's refresh rate and thus causes judder.
It's not like comparing driving at 50km/h vs 60.

MK8's recurring drops to 59fps are very noticeable to many people.
No, it would not cause judder.

Did you watch that video? When it dips it's still smooth. Without the counter you wouldn't even notice.
 

EGM1966

Member
How many titles have worked with MS's engineers now to bump up res? I know Destiny did. Impressive effort from Microsoft if they're really doing a significant portion of the work here. Is that unprecedented? I don't recall it happening before; seems like a sensible way to help dev relations while also making games look better on your console.

Destiny and this are the only two for sure, although I suspect they're probably helping others too. Sony did the same thing with the PS3, iirc, and when developers are slower to get to grips with your hardware it makes sense.

With PS4 there's probably less need (as there was with the 360 last gen), as it's easier to code for.

Pretty sure we'll see this more. Also, as per the other post quoting me that I now can't find, I admit calling it a "Hit Squad" maybe doesn't sound right, but hopefully people get my drift!
 

MMaRsu

Banned
Ya know, I kinda laugh at people who think there's some sort of parity conspiracy theory, but if MS really told them that launching at 900p was unacceptable, then...

Well, I doubt it was ever a conspiracy. At least to me it seemed obvious some devs were being pushed to do things a certain way, maybe even pressured, who knows. I'm sure the Diablo team isn't really happy about this "choice".
 