
Why 792p on XBO?

I'm no graphics expert of any kind, but there are quite a few 1080p & 900p titles on X1 that would argue against 792p (on only 2 titles) being a mythical perfect resolution for X1.

IGN Native Resolution List

The idea that it is, I feel, is inaccurate, because games can be vastly different and end up using the eSRAM in very different ways: what works for one game may not work so well for another. As you already point out, there are 1080p 60 & 30fps titles on Xbox One, there are 900p titles, 792p titles, and 720p titles.
 

watership

Member
Why 900p on XBO?
Why 900p on PS4?
Why 792p on XBO?
Why 540p on PS3/360?

Why? Because these consoles aren't powerful enough to do what the developers want. Thus, a sacrifice is made. Resolution or frame rate, take your pick.

/rocketscience

No console is going to be enough to do what a developer wants. You balance available hardware with software and driver tricks to get the best performance you can, while trying to get it to look as good as you can.

The benefit is that more people get to play the final vision of what they achieved, because of these set hardware limits. Consoles are cool that way.
 
So what if the Xbox One had 8GB of GDDR5 RAM, for example, just like the PS4? No eSRAM or DDR3. Could it see a resolution increase from 792p to 900p, for example? If so, what about the ROP/compute deficit and the overall weaker GPU compared to the PS4?
 
Resolution ---- Pixels
1280x720 ---- 921,600
1408x792 ---- 1,115,136 ---- 21.00% more pixels than 720p
1600x900 ---- 1,440,000 ---- 29.13% more pixels than 792p
1920x1080 ---- 2,073,600 ---- 44.00% more pixels than 900p


EDIT:::: Corrected

PS4 is 40-50% faster, so it will always be one step ahead
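
A quick sanity check of the table above (a throwaway Python sketch; nothing here is console-specific, just the pixel arithmetic):

```
# Verify the pixel counts and step-to-step increases listed in the table above.
resolutions = [(1280, 720), (1408, 792), (1600, 900), (1920, 1080)]

prev = None
for w, h in resolutions:
    pixels = w * h
    if prev is None:
        print(f"{w}x{h}: {pixels:,} pixels")
    else:
        step = (pixels / prev - 1) * 100
        print(f"{w}x{h}: {pixels:,} pixels (+{step:.2f}% over the previous step)")
    prev = pixels

# 1280x720:  921,600 pixels
# 1408x792:  1,115,136 pixels (+21.00% over the previous step)
# 1600x900:  1,440,000 pixels (+29.13% over the previous step)
# 1920x1080: 2,073,600 pixels (+44.00% over the previous step)
```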
 

Mitsurugi

Neo Member
What happens if (when) some third party makes a game the PS4 can only run at 720p/30fps? Will the 2017 equivalent of Crysis be 600p on XBO?
 

HoodWinked

Gold Member
I was actually doing my own calculations yesterday about this very thing.

32MB can still fit 4 x 32-bit buffers almost perfectly at 1080p. Crysis 3 on 360 also used 16 bytes per pixel, so a modest number of bytes per pixel can still achieve good results. Unsure about Ryse's configuration.

32MB = 268,435,456 bits

66,355,200 bits per 32-bit buffer @ 1920x1080
46,080,000 bits per 32-bit buffer @ 1600x900

268,435,456 / 66,355,200 = 4.04

Something else I figured out: CoD: Advanced Warfare. I was wondering why it was specifically 882p and not 900p when it was so close. Well, 882p almost perfectly fits 6 x 32-bit buffers.

1568x882 = 44,255,232 bits per 32-bit buffer; 268,435,456 / 44,255,232 = 6.065620806145587
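
To make the arithmetic above explicit, here is a minimal Python sketch of the same calculation (the 32 MB eSRAM figure and the resolutions come from the post; the function name is just for illustration):

```
# How many full-screen 32-bit render targets fit in the Xbox One's 32 MB of eSRAM?
ESRAM_BITS = 32 * 1024 * 1024 * 8  # 268,435,456 bits

def buffers_that_fit(width, height, bits_per_pixel=32):
    """Number of whole-screen buffers of the given format that fit in eSRAM."""
    return ESRAM_BITS / (width * height * bits_per_pixel)

print(buffers_that_fit(1920, 1080))  # ~4.04 -> four 32-bit targets at 1080p
print(buffers_that_fit(1600, 900))   # ~5.83 at 900p
print(buffers_that_fit(1568, 882))   # ~6.07 -> six 32-bit targets at 882p
```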
 
Best guess is there was a rumored 10% Kinect reservation to be removed, and 792P is precisely 10% more than 1280x720.

1408 = a 10% increase over 1280

and

792 = a 10% increase over 720.

This makes the most sense to me.

I know it may not necessarily translate to performance that way, but this is how I always saw the choice of 792p.
That would be the case if the total resolution increase was 10%, but increasing both dimensions by 10% makes for a larger increase: 1.1 x 1.1 = 1.21, i.e. 21% more pixels (more than even 2x 10%).

Also, the resolution was chosen prior to any system reservation being given back to devs, so that can't be the reason either.
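
For what it's worth, the "both dimensions by 10%" point is easy to verify (trivial Python):

```
# Scaling both axes of 1280x720 by 10% gives 1408x792, but the pixel count
# grows by 1.1 * 1.1 - 1 = 21%, not 10%.
base_w, base_h = 1280, 720
scaled_w, scaled_h = round(base_w * 1.1), round(base_h * 1.1)

print(scaled_w, scaled_h)                             # 1408 792
print((scaled_w * scaled_h) / (base_w * base_h) - 1)  # ~0.21
```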
 

twobear

sputum-flecked apoplexy
Best guess is there was a rumored 10% Kinect reservation to be removed, and 792P is precisely 10% more than 1280x720.

1408 = a 10% increase over 1280

and

792 = a 10% increase over 720.

This makes the most sense to me.

I know it may not necessarily translate to performance that way, but this is how I always saw the choice of 792p.

Doesn't make sense. TitanFall is 792p.
 
People aren't reading the updated opening post.



It's got nothing to do with scaling and everything to do with filling the eSRAM.

This doesn't solve anything, really. Even with deferred rendering there's no single buffer that needs that many bytes per pixel at once. It's usually comprised of many smaller buffers, that also don't exactly need to fit in esram all at once.
 

dr_rus

Member
Bingo!

It would be the aspect ratio; if they don't match up, then the devs would either have to put black bars on the screen (a la The Order: 1886) or stretch the image to fit the screen (which would look horrible).
This has nothing to do with aspect ratios and everything to do with optimal ROP utilisation.
 

III-V

Member
Nearly every post here is a tales-from-my-ass-special.

Awesome, and that needs to be pointed out.

Many chime in with opinions, but the problems still get solved.

It's OP topics like this one that keep me coming back here.

Thanks andromeduck and furious.
 

lord pie

Member
Can't you just keep some of that in the embedded RAM? I.e. the specific buffer you might be working on at that point, or the ones that get accessed the most? You wouldn't necessarily need to keep the final frame buffer there, for instance.

On PS4 and PC it doesn't really matter because all the RAM is the same speed, so you can clump it all together as one big buffer, but inevitably I think devs will start splitting it up for Xbox One. A pain at first, but I'm sure it'll become simpler as time goes on.

Yup, that's right. And most games do this.
However, the g-buffer rendering pass is typically very bandwidth heavy. And when generating the g-buffer, you are writing the entire buffer in one go, so it's naturally going to be a very heavy process.

A simplified render path for a deferred renderer may be:

render z-prepass (render depth) -> render g-buffer (render full g-buffer, depth) -> render lighting buffer using g-buffer for materials, etc (render just a lighting buffer, but read the full g-buffer) -> render transparents into lighting buffer, etc -> mess with lighting buffer and apply post processing, etc -> copy to frame buffer

Each of those has different demands on the GPU both in terms of memory, bandwidth, ALU, etc. The g-buffer render is one of the bigger hits for bandwidth, so it makes sense to keep all the g-buffer render targets in ESRAM. And as they all get equal usage, mixing and matching memory locations is probably quite detrimental. It may be that after the g-buffer is written, some of the less important buffers can be copied to DDR (to make room for shadow maps during the lighting pass, etc.).
Certainly things like the frame buffer make no sense at all being in ESRAM.

It's not uncommon for a game to be pushing 300MB of render targets when all is said and done, so juggling what goes where is going to be quite tricky for XB1 devs.
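
A toy Python sketch of the kind of juggling described here. All of the render-target names, sizes, and bandwidth weights below are made up for a hypothetical 1080p deferred renderer; the only real number is the 32 MB eSRAM capacity. The idea is simply to keep the most bandwidth-hungry targets (the g-buffer) resident and spill the rest to DDR3:

```
ESRAM_BYTES = 32 * 1024 * 1024

def mb(n):
    return n * 1024 * 1024

# (name, size in bytes, rough relative bandwidth pressure) -- illustrative only
render_targets = [
    ("depth",          mb(8),  9),
    ("gbuffer_albedo", mb(8),  8),
    ("gbuffer_normal", mb(8),  8),
    ("gbuffer_misc",   mb(8),  6),
    ("lighting",       mb(16), 7),
    ("shadow_map",     mb(16), 5),
    ("final_frame",    mb(8),  1),  # scanned out once; fine in DDR3
]

def budget(targets, capacity=ESRAM_BYTES):
    """Greedily place the most bandwidth-heavy targets in eSRAM, spill the rest."""
    esram, ddr3, used = [], [], 0
    for name, size, weight in sorted(targets, key=lambda t: -t[2]):
        if used + size <= capacity:
            esram.append(name)
            used += size
        else:
            ddr3.append(name)
    return esram, ddr3

esram, ddr3 = budget(render_targets)
print("eSRAM:", esram)  # the full g-buffer stays resident
print("DDR3: ", ddr3)   # lighting/shadow/frame buffers get spilled in this toy setup
```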
 

vypek

Member
Interesting to see the solution. I have to say that as a Computer Science major, I feel pretty ashamed that I don't understand it all too well. Just have a weak grasp :/
 

EndGame82

Banned
I don't think what he said was "hive-minding." It didn't read to me as saying "I think what GAF tells me to think" as it did "thanks to GAF, I'm now noticing and nitpicking things I hadn't been aware of before." Ignorance is bliss, that sort of thing.

Ignorance is bliss is definitely what I was trying to convey. I love reading GAF, so I wasn't trying to get banned after waiting 3 months just to post!

I love reading this stuff, some of you guys really know your shit and it definitely helps me know what to look for and appreciate it more when a game is done well.

Once the veil is lifted, so to speak, it's just tough not to notice all the imperfections in games you otherwise might never have paid attention to.
 
This doesn't solve anything, really. Even with deferred rendering there's no single buffer that needs that many bytes per pixel at once. It's usually comprised of many smaller buffers, that also don't exactly need to fit in esram all at once.

Technically no, but you do want to give the scheduler enough flexibility to saturate resources at runtime. Different stages/passes have very different resource usage profiles: some have tons of high-locality work, some are math heavy, some are lower locality and require paging from main memory, etc. You want to mix all those jobs so you're not constantly bottlenecked by one resource.

This is pretty basic stuff man.

Threads like this are why less and less devs bother coming to GAF and taking it seriously.

Instead of getting an actual explanation, I get some guy making up shit because the numbers match and 8 + 1 + 11 = 9/11

I am a fairly low level game developer but I've never had to code that close to hardware.

Posts like this are why less and less devs bother coming to GAF and taking it seriously.
 

Codeblew

Member
Why 900p on a PS4?

Should have been 720p on both, would give us half a chance to run the games at a native resolution (720p display required of course, but at least they exist).

900p downscaled to a 720p display would look better than native 720p. But 900p upscaled to a 1080p display does not look better than native 1080p. Downscaling = good. Upscaling = bad.
 

Renekton

Member
Interesting to see the solution. I have to say that as a Computer Science major, I feel pretty ashamed that I don't understand it all too well. Just have a weak grasp :/
You're likely not going to learn about that from your major. Except at MIT, where students got a PS3 Cell crash course just to fucking learn multithreading... shudder
 

Selfish_Android

Neo Member
Because the human eye can't see above 792p, this being better than 1080p and 4K because of reasons and the cloud.

On a serious note, I think that will be the standard resolution for Xbox One, and 900p for PS4. Very sad indeed.
 
Interesting to see the solution. I have to say that as a Computer Science major, I feel pretty ashamed that I don't understand it all too well. Just have a weak grasp :/

You're likely not going to learn about that from your major. Except at MIT, where students got a PS3 Cell crash course just to fucking learn multithreading... shudder

Virtually all CS programs at ranked universities cover some cache performance/data locality and multithreading by second/third year.

If you're studying graphics and don't touch on pixel buffers during the rasterization section, there might be a problem...
 

SMOK3Y

Generous Member
You can't see the difference between 720/900p and 1080p, but the difference between 720p and 792p is massive.
/s

Yeah, all the "can't tell the difference" stuff getting thrown around at launch about 720 vs 1080, but a massive difference from 720 to 792, LOL. Even I can tell the difference between 900p on BF4 & 1080p on Ghosts on the PS4...
 