
XB1 Retail Version of Battlefield 4 Will Still Run at 720p, 60 FPS, EA Rep Confirms

Raistlin

Post Count: 9999
Please explain to me how 48fps would work on a TV that is either 30Hz or 60Hz?
Actually, it could be hacked to work for Active 3D displays (assuming they all use the same frame order... and assuming your display has a 48Hz mode).

But since you'd need to send it via framepacking, it doesn't make sense from a rendering/control perspective. Not to mention the actual number of TVs that support it is limited.
 

sam27368

Banned
He's saying that if they downgrade the PS4 version to 720p for political reasons, then he would boycott. That's understandable.

Supposedly the GTX 660 will be able to run BF4 at 1080p 60fps with decent settings.

Why would a GTX 660 (1.9 TFLOPS) be able to run BF4 at 1080p 60fps while the PS4 would have to run it at 720p 60fps?

But where has anyone said that the PS4 version has been downgraded? That's BS; a dev wouldn't gimp their own hard work over something like this, and a publisher would have to have a good reason to tell 300+ workers that they're gimping their hard work.

From the connotations of the tweets, it seems as though the PS4 version is higher res but they're trying not to rub it in X1 fans' faces. It sounds like PR parity, not dev parity, to me.
 
It would almost definitely be 32-bit colour depth. We could make the case that it wouldn't fit in 32MB depending on the techniques used; for example, with 4xFSAA at 1080p we'd be looking at:

Back Buffer:
1920x1080 [Resolution] * 32 [Bits Per Pixel] * 4[FSAA Depth]
= 265420800 bits = 31.6MB

Depth Buffer:
1920x1080 [Resolution] * 32 [24Bit Z, 8Bit Stencil] * 4 [FSAA Depth]
= 265420800 bits = 31.6MB

Front Buffer:
1920x1080 [Resolution] * 32 [Bits Per Pixel]
= 66355200 bits = 7.91MB

Total Memory Requirements:
31.6 + 31.6 + 7.91 = 71.11MB

It's not quite as simple as you put it.

Without AA you'd be looking at around 24MB.

Crytek has detailed their buffer multiple times in their documentation. Is this calc based upon that?
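
For anyone who wants to check the quoted numbers, here's a minimal sketch of that arithmetic in Python (my own illustration; it assumes simple packed render targets with no compression or tiling, and 1MB = 2^20 bytes):

```python
def buffer_mb(width, height, bits_per_pixel, samples=1):
    """Size of one render target in MB (1 MB = 2**20 bytes)."""
    return width * height * bits_per_pixel * samples / 8 / 2**20

# 1080p with 4xFSAA: 32-bit back buffer, 24/8 depth-stencil, 32-bit front buffer
back  = buffer_mb(1920, 1080, 32, samples=4)   # ~31.6 MB
depth = buffer_mb(1920, 1080, 32, samples=4)   # ~31.6 MB
front = buffer_mb(1920, 1080, 32)              # ~7.9 MB
print(round(back + depth + front, 1))          # ~71.2 MB -- well over 32 MB of eSRAM

# Without AA, all three buffers are ~7.9 MB each:
print(round(3 * buffer_mb(1920, 1080, 32), 1)) # 23.7 MB at 1080p
print(round(3 * buffer_mb(1280, 720, 32), 1))  # 10.5 MB at 720p
```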
 

Skeff

Member
Crytek has detailed their buffer multiple times in their documentation. Is this calc based upon that?

No idea if this is the same as what they're using; it's just an example. They can fit a 1080p framebuffer in 32MB of eSRAM, but it depends on the AA solution used. I also don't know the intricacies of the XB1's eSRAM.

Essentially I'm saying the 32MB of eSRAM could affect the resolution if they were set on doing something in particular, but it can be worked around.
 

Sylonious

Member
But where has anyone said that the PS4 version has been downgraded? That's BS; a dev wouldn't gimp their own hard work over something like this, and a publisher would have to have a good reason to tell 300+ workers that they're gimping their hard work.

From the connotations of the tweets, it seems as though the PS4 version is higher res but they're trying not to rub it in X1 fans' faces. It sounds like PR parity, not dev parity, to me.

They've said before that the PS4 version was running at a higher resolution than 720p. Now they are saying that the XB1 version will run at 720p and that both versions will be at parity. It's not unreasonable for someone to think that they meant they would lower the resolution of the PS4 version.

IF they do that, then he's saying he will boycott.
 

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
It would almost definitely be 32-bit colour depth. We could make the case that it wouldn't fit in 32MB depending on the techniques used; for example, with 4xFSAA at 1080p we'd be looking at:

Back Buffer:
1920x1080 [Resolution] * 32 [Bits Per Pixel] * 4[FSAA Depth]
= 265420800 bits = 31.6MB

Depth Buffer:
1920x1080 [Resolution] * 32 [24Bit Z, 8Bit Stencil] * 4 [FSAA Depth]
= 265420800 bits = 31.6MB

Front Buffer:
1920x1080 [Resolution] * 32 [Bits Per Pixel]
= 66355200 bits = 7.91MB

Total Memory Requirements:
31.6 + 31.6 + 7.91 = 71.11MB

It's not quite as simple as you put it.

Without AA you'd be looking at around 24MB.

I would imagine that Xbox One devs would be looking at using an implementation of FXAA, which doesn't require nearly as much memory.
As an aside, I'm looking forward to finding out how Xbox One games used that eSRAM in a future Gamasutra article.
 

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
Actually, it could be hacked to work for Active 3D displays (assuming they all use the same frame order... and assuming your display has a 48Hz mode).

But since you'd need to send it via framepacking, it doesn't make sense from a rendering/control perspective. Not to mention the actual number of TVs that support it is limited.

Indeed. Quite a niche framerate really.
Triple buffering. I know some games that are 45fps and they used it.

But how does that work on a 60Hz display? If you enable vsync (which all next-gen games should have by law), you're going to end up with a mix of single and duplicate frames, which will look juddery.
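
A quick way to see where the judder comes from, under a simplified vsync model (my own illustration, not how any particular engine paces frames):

```python
# Frame index shown on refresh k for a 45 fps game on a 60 Hz vsynced display,
# assuming each refresh simply repeats the latest completed frame:
shown = [k * 45 // 60 for k in range(12)]
print(shown)  # [0, 0, 1, 2, 3, 3, 4, 5, 6, 6, 7, 8]
# Each frame is held for 2, 1, 1 refreshes in a repeating pattern, i.e. frame
# times of 33.3 ms, 16.7 ms, 16.7 ms -- that uneven cadence is the judder.
```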
 
Simple equation (and I know this is bad as they are not the same, blah blah, but it does show a simple technical constraint DICE will have).

The X1 has (in essence) an HD 7770, and on High for BF3 @ 1080p:
[BF3 1080p benchmark chart]

And a PS4 has (in essence) an HD 7870, and on High for BF3 @ 1080p:
[BF3 1080p benchmark chart]

Now this shows that with no AA the X1 cannot get any higher than an average of 45fps, whereas the PS4 can hit 65fps.

So BF3 <> BF4, but this example gives you an idea of what they are working with... 7 into 5 won't go!
 

Raistlin

Post Count: 9999
I do think 48Hz actually will become standard (UHD supports it). The problem is that there simply won't be enough HDMI 2.0 display adoption out there for this gen.

Maybe next gen (assuming there is one).
 

RoboPlato

I'd be in the dick
Resolution and framerate parity as a target?
Not buying.

All they're saying is that both versions have the same targets, not that they're targeting parity. Expect this a lot in the coming generation. I don't buy forced parity but saying both versions are "targeting" 1080p and stuff like that will be the norm, no matter how they turn out. They won't say bad things about XBO versions but I don't think they'll hold back PS4 either.
 

tirant

Member
2013 and still running at 720p, what the fuck is this shit? Even my Nexus 10 tablet from 2012 runs games at native 2560x1600. I'm so, so disappointed with next gen :(
 

marvin83

Banned
Please explain to me how 48fps would work on a TV that is either 30Hz or 60Hz?

120 is just as divisible by 24 as it is by 30. So it makes sense on 120Hz TVs, too, to set your DVD player to 24.

For example, I have the Toshiba HD-A35 HD DVD player, and you can set it to either 1080p/30 or 1080p/24 (for TVs that support it; not all do). My TV, a Toshiba 42HL167, supports 24 and 60 natively, so I set mine to 1080p/24, because that's what movies are made at (well, 23.976~).
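
The divisibility point, worked out (my own arithmetic, including the 48fps case from earlier in the thread):

```python
print(120 / 24)  # 5.0 -> on a 120 Hz set every film frame is held exactly 5 refreshes
print(60 / 24)   # 2.5 -> on a 60 Hz set frames alternate 3 and 2 refreshes (3:2 pulldown)
print(120 / 48)  # 2.5 -> likewise, 48 fps doesn't divide evenly into 120 Hz either
```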
 

Chobel

Member
But how does that work on a 60Hz display? If you enable vsync (which all next-gen games should have by law), you're going to end up with a mix of single and duplicate frames, which will look juddery.

They did it in God of War 3, and no one seems to have an issue with its framerate.
 

Alx

Member
2013 and still running at 720p, what the fuck is this shit? Even my Nexus 10 tablet from 2012 runs games at native 2560x1600. I'm so, so disappointed with next gen :(

Do the games on your Nexus running at 2560x1600 look better than BF4 at 720p?
 
All they're saying is that both versions have the same targets, not that they're targeting parity. Expect this a lot in the coming generation. I don't buy forced parity but saying both versions are "targeting" 1080p and stuff like that will be the norm, no matter how they turn out. They won't say bad things about XBO versions but I don't think they'll hold back PS4 either.

They say:

After the review of the product, during a Q&A session with the personnel at the DICE booth, we learned that the developer aims to have the same frame rate and resolution for the Xbox One and PS4 retail versions. We modified the contents of the article to match. Final specifications are unknown until officially published by EA DICE.​

Now it's up to DICE to clarify in a no-nonsense way; otherwise I'm out.
 

Durante

Member
You don't know what you're talking about.

Framebuffer size at 1080p @ 16-bit:

1920 x 1080 x colour depth (16-bit)
1920 x 1080 x 16 = 33,177,600 bits
Now divide by 8 for bytes = 4,147,200 bytes
Now turn that into megabytes = ~3.96MB

Two framebuffers is less than 10MB.

Even if they were using 32-bit framebuffers (I've no idea), it would easily fit into the eSRAM.
That's not applicable at all to modern rendering techniques.

Skeff further up is closer but probably not familiar with deferred shading. When you are using deferred shading, you need to store all the input data for your later lighting calculations in your render buffer (usually called a g-buffer). The exact amount of memory you need to do this depends on the engine and what you are going for, but you're unlikely to get by with less than 16 bytes per pixel.
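
To put a rough number on that, here's one hypothetical g-buffer layout at 16 bytes per pixel (illustrative only; real engines, Frostbite included, use their own layouts):

```python
# Hypothetical g-buffer layout: four 32-bit per-pixel surfaces (16 bytes/pixel).
gbuffer_bytes_per_pixel = {
    "albedo (RGBA8)":          4,
    "normals (RGB10A2)":       4,
    "material params (RGBA8)": 4,
    "depth/stencil (D24S8)":   4,
}
bpp = sum(gbuffer_bytes_per_pixel.values())    # 16 bytes per pixel
print(1920 * 1080 * bpp / 2**20)               # ~31.6 MB at 1080p, no AA
# Even this minimal layout nearly fills the 32 MB of eSRAM before MSAA
# or a lighting accumulation target is added.
```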


Call of Duty is a lot less demanding, with a far smaller scale. Same as Killzone. There are reasons for everything, and those reasons don't usually involve conspiracies.
Quite.
 

Skeff

Member
I would imagine that Xbox One devs would be looking at using an implementation of FXAA, which doesn't require nearly as much memory.
As an aside, I'm looking forward to finding out how Xbox One games used that eSRAM in a future Gamasutra article.

At 32 bits per pixel, which should be standard, you're still looking at a minimum of 24MB for 1080p with no AA. That's not ideal when you need at least some form of AA, and you need to use eSRAM for more than just the framebuffer.

To compare: at 720p that minimum of 24MB is reduced to 10.6MB, which would be much easier on the XB1's eSRAM.

That's not applicable at all to modern rendering techniques.

Skeff further up is closer but probably not familiar with deferred shading. When you are using deferred shading, you need to store all the input data for your later lighting calculations in your render buffer (usually called a g-buffer). The exact amount of memory you need to do this depends on the engine and what you are going for, but you're unlikely to get by with less than 16 bytes per pixel.


Quite.

Thanks for the info. I'm not that familiar with deferred shading, as I haven't actually used it yet; only the method I posted.
 

RoboPlato

I'd be in the dick
They say:

After the review of the product, during a Q&A session with the personnel at the DICE booth, we learned that the developer aims to have the same frame rate and resolution for the Xbox One and PS4 retail versions. We modified the contents of the article to match. Final specifications are unknown until officially published by EA DICE.​

Now it's up to DICE to clarify in a no-nonsense way; otherwise I'm out.

They "aim" to have the same framerate and resolution. Not a guarantee that they will. It's very careful, non committal wording that says nothing.
 
Simple equation (and I know this is bad as they are not the same, blah blah, but it does show a simple technical constraint DICE will have).

The X1 has (in essence) an HD 7770, and on High for BF3 @ 1080p:
[BF3 1080p benchmark chart]

And a PS4 has (in essence) an HD 7870, and on High for BF3 @ 1080p:
[BF3 1080p benchmark chart]

Now this shows that with no AA the X1 cannot get any higher than an average of 45fps, whereas the PS4 can hit 65fps.

So BF3 <> BF4, but this example gives you an idea of what they are working with... 7 into 5 won't go!

A PS4 is more like a 7850 (1.76 TFLOPS) than a 7870 (2.56 TFLOPS).
 

gofreak

GAF's Bob Woodward
That's not applicable at all to modern rendering techniques.

Skeff further up is closer but probably not familiar with deferred shading. When you are using deferred shading, you need to store all the input data for your later lighting calculations in your render buffer (usually called a g-buffer). The exact amount of memory you need to do this depends on the engine and what you are going for, but you're unlikely to get by with less than 16 bytes per pixel.

Further to my comment on a previous page, I looked up the more exact figure for BF3's gbuffers @ 1080p/4xMSAA: 158MB.

http://www.slideshare.net/fullscreen/DICEStudio/shiny-pc-graphics-in-battlefield-3/20

You could work from that figure for 1080p/NoAA. Though BF4's may differ.
 

Skeff

Member
Further to my comment on a previous page, I looked up the more exact figure for BF3's gbuffers @ 1080p/4xMSAA: 158MB.

http://www.slideshare.net/fullscreen/DICEStudio/shiny-pc-graphics-in-battlefield-3/20

You could work from that figure for 1080p/NoAA. Though BF4's may differ.

Wow, that's big, definitely bigger than the 32MB of eSRAM. So to get to that with no AA, what would we do? Assume it would simply be a subtraction, as below?

The difference between AA and no AA in my example was 48MB, so without AA the BF3 g-buffer would be around 110MB? That could all be batshit crazy math, though.

EDIT: I think I'm wrong now, I need to read more.
 

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
That's not applicable at all to modern rendering techniques.

Yeah. I really could do with a bit more research on that subject.
At 32 bits per pixel, which should be standard, you're still looking at a minimum of 24MB for 1080p with no AA. That's not ideal when you need at least some form of AA, and you need to use eSRAM for more than just the framebuffer.

To compare: at 720p that minimum of 24MB is reduced to 10.6MB, which would be much easier on the XB1's eSRAM.

Thanks for the info. I'm not that familiar with deferred shading, as I haven't actually used it yet; only the method I posted.

Yeah. It's not ideal for 1080p, but you'd hope that MS chose that amount of eSRAM for a good reason.
 
I think it is entirely possible that if they can't get the XB1 version to run like the PS4 version, they may gimp the PS4 version. It makes sense for EA, considering they want the game to look similar enough on both next-gen consoles to ensure it sells well on each. People on this site making a big fuss about it will have little impact on what they decide to do. We are talking about a game that will sell millions of copies to millions of people who don't care about GAF. GAF is such a small community of gamers compared to the masses of people who will buy this game and never know the PS4 version could have looked better.
 

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
Further to my comment on a previous page, I looked up the more exact figure for BF3's gbuffers @ 1080p/4xMSAA: 158MB.

http://www.slideshare.net/fullscreen/DICEStudio/shiny-pc-graphics-in-battlefield-3/20

You could work from that figure for 1080p/NoAA. Though BF4's may differ.

I thought that 4xMSAA basically uses up 4 times as much memory as no AA. That only requires a 39.5MB g-buffer. So maybe with a form of FXAA and 900p it would all fit in the eSRAM without tiling.
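
Checking the 900p idea by scaling that 158/4 = 39.5MB no-AA estimate by pixel count (my own arithmetic):

```python
mb_1080p = 158 / 4                                   # ~39.5 MB at 1920x1080, no AA
mb_900p  = mb_1080p * (1600 * 900) / (1920 * 1080)   # scale by pixel count
print(round(mb_900p, 1))                             # 27.4 MB -- fits in 32 MB of eSRAM
```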
 
Does he mean that 720p for Xbox One is incorrect? Because he was very specific in that tweet: "everything incorrect or misinformed said on forums"

He said he will "not comment on everything incorrect or misinformed said on forums." He did not say everything said on forums was necessarily incorrect or misinformed.
 

Skeff

Member
I thought that 4xMSAA basically uses up 4 times as much memory as no AA. That only requires a 39.5MB g-buffer. So maybe with a form of FXAA and 900p it would all fit.

It uses up 4x the memory, but it depends on when it's used. I think you're right with the 39.5MB without AA now that I've had a think about it; I believe I heard the Killzone g-buffer is around 40+MB without AA as well.
 

Durante

Member
Further to my comment on a previous page, I looked up the more exact figure for BF3's gbuffers @ 1080p/4xMSAA: 158MB.

http://www.slideshare.net/fullscreen/DICEStudio/shiny-pc-graphics-in-battlefield-3/20

You could work from that figure for 1080p/NoAA. Though BF4's may differ.
That works out to 80 bytes per pixel for 4xAA. Assuming it scales linearly with AA samples (which is an absolute worst-case scenario), that would mean 20 bytes per pixel for no AA, which would result in a ~40MB buffer size at 1080p.
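
The same arithmetic, step by step (my reproduction; nothing here beyond the quoted 158MB figure is DICE's):

```python
pixels = 1920 * 1080
bpp_4x = 158 * 2**20 / pixels    # ~79.9 bytes per pixel with 4xMSAA
bpp_1x = bpp_4x / 4              # ~20 bytes per pixel, assuming linear scaling
print(round(pixels * bpp_1x / 2**20, 1))  # 39.5 MB at 1080p with no AA
```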
 

kriskrosbbk

Member
I think it is entirely possible that if they can't get the XB1 version to run like the PS4 version, they may gimp the PS4 version. It makes sense for EA, considering they want the game to look similar enough on both next-gen consoles to ensure it sells well on each. People on this site making a big fuss about it will have little impact on what they decide to do. We are talking about a game that will sell millions of copies to millions of people who don't care about GAF. GAF is such a small community of gamers compared to the masses of people who will buy this game and never know the PS4 version could have looked better.

I guess you were around when #PS4NoDRM started. It didn't begin in Africa.

NeoGAF FTW.
 

sam27368

Banned
Does he mean that 720p for Xbox One is incorrect? Because he was very specific in that tweet: "everything incorrect or misinformed said on forums"
It's the ridiculous behaviour of some GAFfers (in this thread especially) that must really put devs off posting and communicating with the community. Shame, really. All of the "boycott!" and "screw you DICE" comments must mirror what devs think of this place.
 

Piggus

Member
I think it is entirely possible that if they can't get the XB1 version to run like the PS4 version, they may gimp the PS4 version. It makes sense for EA, considering they want the game to look similar enough on both next-gen consoles to ensure it sells well on each. People on this site making a big fuss about it will have little impact on what they decide to do. We are talking about a game that will sell millions of copies to millions of people who don't care about GAF. GAF is such a small community of gamers compared to the masses of people who will buy this game and never know the PS4 version could have looked better.

Why would that ensure it sells well on each? I'm probably not buying it at all if they gimp the PS4 version just because they're cozy with their lord Microsoft. PS4 owners have a right to be a little pissed off if a dev makes the product worse just to spare the fragile feelings of people who knowingly bought a weaker system. Explain to me how such a decision translates to better sales, especially considering there will be more PS4s in more territories this year.

You're also underestimating the kind of noise GAF is able to make and overestimating the ignorance of the general public.
 