
Why 792p on XBO?

Gurish

Member
As a side note: is it true that now because Kinect is optional the XBONE is gonna be more powerful like MS claims?

How is that so? After all, some consumers will still have the Kinect plugged in and you have to take them into account.
 

Hypron

Member
Fixed for accuracy:

[image: vZA5vfX.jpg]

It took me way too long to find what you changed in the picture haha.
 
Someone inform our new friend how hive minding affects one's future
I'm trying to take my conversion process euphemistically by saying I'm "becoming an even more informed consumer".

For example, I never in a billion years would ever have figured out or would expect to read in gaming media that the 792p thing seems specifically tied to the alleged ESRAM bottleneck on the XB1. We have some honest-to-god objective reasoning for public consumption as to how that might be impacting development. Think of all the fanboy warring saved by bookmarking this OP! You're doing Christ's Work here.

Is this some kind of PSA (...warning?) for all Junior Members? Now how am I supposed to get to sleep? Meanie.
 

EGOMON

Member
I keep hearing that with time devs will learn the hardware and the tools and they will be able to output a higher resolution on the XBO, but won't learning the hardware and maturing dev tools also result in more technically demanding games? Isn't that going to make it even worse?
I'm not a techy person, I was just wondering
 

Bgamer90

Banned
As a side note: is it true that now because Kinect is optional the XBONE is gonna be more powerful like MS claims?

How is that so? After all, some consumers will still have the Kinect plugged in and you have to take them into account.

It's been rumored that future games that don't use motion control will be able to take advantage of the GPU reserve used for Kinect motion tracking. This wouldn't have an impact on people who have Kinect plugged in, since voice will still be there (apparently), but it will more than likely impact the ability to Skype or have yourself in your Twitch stream while playing those games.
 
I keep hearing that with time devs will learn the hardware and the tools and they will be able to output a higher resolution on the XBO, but won't learning the hardware and maturing dev tools also result in more technically demanding games? Isn't that going to make it even worse?
I'm not a techy person, I was just wondering

The games will be as technically demanding as the developers want them to be. Of course as devs get to grips with the Xbone and the dev tools improve, the same thing will happen with PS4 development so it's all a moot point as far as the gap between the two consoles is concerned.
 

Bgamer90

Banned
I keep hearing that with time devs will learn the hardware and the tools and they will be able to output a higher resolution on the XBO, but won't learning the hardware and maturing dev tools also result in more technically demanding games? Isn't that going to make it even worse?
I'm not a techy person, I was just wondering

Games always improve throughout a generation. I mean, we may see a handful of 720p games on the Xbox One in the future but overall I would say to just expect this from here on out (outside of a few exceptions)...

Sports, Racers, Platformers, and Puzzle games will more than likely remain 1080p on the system. The action games like fighters & shooters (especially the "AAA" ones) can be any range though. If 30 FPS then expect 900p-1080p. If 60 FPS then expect 720p-900p.
 

RoboPlato

I'd be in the dick
Good thread. The conclusion makes a lot of sense.

I keep hearing that with time devs will learn the hardware and the tools and they will be able to output a higher resolution on the XBO, but won't learning the hardware and maturing dev tools also result in more technically demanding games? Isn't that going to make it even worse?
I'm not a techy person, I was just wondering

It's tough to give a solid answer since there are so many factors that can impact resolution. Some techniques will get more efficient while others will get replaced with heavier ones. I personally hope that resolution remains a focus (at least for 30fps games) before other tradeoffs since a lot of the minute details and effects that would be added with a resolution drop would be negated without being able to see them clearly.
 
Can't you just keep some of that in the embedded RAM? I.e. the specific buffer you might be working on at that point, or the ones that get accessed the most? You wouldn't necessarily need to keep the final frame buffer there, for instance

On PS4 and PC it doesn't really matter because all RAM is the same speed, so you can clump it all together as one big buffer, but inevitably I think devs will start splitting it up for Xbox One. A pain at first, but I'm sure it'll become simpler as time goes on.

Far more likely we'll start seeing tiled rendering IMO

better locality - doing bulk transfers is usually easier on the memory controllers and avoids having to constantly stream in the pixbuffs -> more bandwidth for textures/assets
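To make the buffer-splitting idea concrete, here's a purely illustrative sketch: a naive greedy packing of the most bandwidth-hungry render targets into a 32 MB fast-memory budget, with the rest left in main RAM. The buffer names, sizes, and access counts are made up, and this is not real XDK code or how an actual engine decides placement.

```python
# Purely conceptual: place the "hottest" render targets (most bytes touched
# per frame) into a 32 MB fast-memory budget, leave the rest in main RAM.
ESRAM_BYTES = 32 * 1024 * 1024

# (name, size in bytes, rough accesses per frame) -- illustrative numbers only
targets = [
    ("gbuffer_albedo",   1408 * 792 * 4, 3),
    ("gbuffer_normals",  1408 * 792 * 4, 3),
    ("depth",            1408 * 792 * 4, 5),
    ("hdr_radiance",     1408 * 792 * 8, 4),
    ("shadow_map",       2048 * 2048 * 4, 2),
    ("final_backbuffer", 1920 * 1080 * 4, 1),
]

def place_buffers(targets, budget=ESRAM_BYTES):
    """Naive greedy split: hottest buffers go to fast memory while they fit."""
    in_fast, in_slow, used = [], [], 0
    for name, size, hits in sorted(targets, key=lambda t: t[1] * t[2], reverse=True):
        if used + size <= budget:
            in_fast.append(name)
            used += size
        else:
            in_slow.append(name)
    return in_fast, in_slow

fast, slow = place_buffers(targets)
print("eSRAM:", fast)
print("DDR3: ", slow)
```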
 

ethomaz

Banned
eSRAM size... that is what made them choose 792p.

BTW, the shame is the PS4 at 900p without any RAM limitation... the Xbone resolution was already expected.
 

Makai

Member
andromeduck, your theory is interesting.

If multiple games use 792p, it points to a common framebuffer usage (popular among various developers) that will fit within the ESRAM.

At the same time, it does not exclude the possibility of other resolutions, as we have seen. As long as there is a 32MB limit, it will always be a compromise between resolution and per-pixel information (ultimately image quality).
Can you explain to me the difference? Is there a way to increase IQ other than AA or super sampling?
 

ethomaz

Banned
would be curious to see the tech behind why if this is the case.
Not much tech I guess... more pixels on screen = better scaling.

500p scales better to 1080p than 400p, for example... scaling works by trying to create additional pixels based on the original, so more is always better. 1079p scaled to 1080p, I'm sure you won't see a difference unless you use 1600% zoom.
 
"Deferred shading uses a G-Buffer that is 20 bytes per pixel and a radiance target that is 8 bytes per pixel for a total of 28 bytes per pixel.
source"

This is not a recipe, games have different gbuffer layouts with different precision requirements.

Infamous's G-buffer is very hefty, but Sucker Punch themselves noted that they could optimize it further but didn't have the need to (though they probably will on future releases, as it also leads to a performance boost due to the lower bandwidth requirements).
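If anyone wants to sanity-check that 28 bytes per pixel figure against the 32 MB of eSRAM, here's a quick back-of-the-envelope sketch (it assumes the numbers quoted above; as noted, real G-buffer layouts vary):

```python
# Back-of-the-envelope check of the 28 bytes/pixel figure quoted above
# against 32 MB (MiB) of eSRAM.
ESRAM = 32 * 1024 * 1024          # 33,554,432 bytes
BYTES_PER_PIXEL = 20 + 8          # G-buffer + radiance target, per the quote

for label, w, h in [("720p", 1280, 720), ("792p", 1408, 792),
                    ("900p", 1600, 900), ("1080p", 1920, 1080)]:
    footprint = w * h * BYTES_PER_PIXEL
    print(f"{label}: {footprint / 2**20:5.1f} MB "
          f"({'fits' if footprint <= ESRAM else 'does not fit'} in 32 MB)")

# 720p:  24.6 MB (fits)        792p:  29.8 MB (fits)
# 900p:  38.5 MB (does not fit) 1080p: 55.4 MB (does not fit)
```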
 
But when games like Titanfall have performance issues at 792p then it doesn't make sense. Why not 720p at that point?

Edit: I guess better scaling makes sense

Because Titanfall's drops are not related to the GPU.

Respawn even implied that the game is so CPU-bound that it impacts the rendering time the GPU has (meaning if they solved the CPU bottleneck, even the resolution could be lifted a little)
 

sinseers

Member
Something I have always wondered. Why do some 360 and PS3 games have 1080p listed as resolutions on the back of the packages? Have no 360 and PS3 games been released in 1080p?
 

Riky

$MSFT
Something I have always wondered. Why do some 360 and PS3 games have 1080p listed as resolutions on the back of the packages? Have no 360 and PS3 games been released in 1080p?

They are scaled to 1080p; all 360 games are, and some PS3 games.


There are a few native 1080p games; Ridge Racer 7 was one, along with a few sports titles and smaller games.
 

dwells

Member
Someone inform our new friend how hive minding affects one's future

I don't think what he said was "hive-minding." It didn't read to me as "I think what GAF tells me to think" so much as "thanks to GAF, I'm now noticing and nitpicking things I hadn't been aware of before." Ignorance is bliss, that sort of thing.
 

stryke

Member
Something I have always wondered. Why do some 360 and PS3 games have 1080p listed as resolutions on the back of the packages? Have no 360 and PS3 games been released in 1080p?

They're just output resolutions that the game supports, not an indication of the internal resolution rendered.
 

ikariX

Banned
That is an odd resolution. But I'm positive in the case of games like Watch_Dogs that's the best UBI could do resolution wise vs. performance.

BTW....this is my first post on Neogaf. Hi everyone
 

dwells

Member
I think it's something like 792p scales to 1080p TVs better, but I'm not sure. (Compared to 720p).

If anything, I would imagine the opposite. 720p scales into 1080p at a nice 2:3 ratio, while 792p is a less pretty 11:15. From what I know about upscaling, it would be easier to optimize for and reduce artifacts with 2:3, especially given that 720p to 1080p upscaling is something that has been extensively studied and worked on for years and years.

To be above baseline HD, I would guess. 720p is heavily associated with last generation.

I think this is the real reason - to be able to claim that they're running at "higher than 720p" and dodge the stigma of 720p, even though any sort of gain in image quality is negligible.

Note that 792p is exactly 10% more resolution in each direction than 720p: (792 - 720) / 720 = 72 / 720 = 0.10 = 10%
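Quick check of those scaling ratios with exact fractions (nothing assumed beyond the vertical resolutions themselves):

```python
# Reduce each source height against 1080 to get the exact scale ratio.
from fractions import Fraction

for src in (720, 792, 900):
    ratio = Fraction(src, 1080)
    print(f"{src}p -> 1080p: {ratio.numerator}:{ratio.denominator}")

# 720p -> 1080p: 2:3
# 792p -> 1080p: 11:15
# 900p -> 1080p: 5:6
```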
 

Hyun Sai

Member
Do you think Titanfall would have run perfectly at 720p?

Was it a pure marketing decision that led to the 792p with performance issues?
 

baconcow

Member
All of these responses say that 792p = 720p + 10%. 10% more of what? There aren't 10% more total pixels, if that's what everybody is trying to say.

792p = 1408 x 792 = 1,115,136 pixels
720p = 1280 x 720 = 921,600 pixels

720p -> 792p represents a total pixel increase of 21%, not 10%.
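The arithmetic, for anyone who wants to verify it:

```python
# 10% more in each axis vs. 21% more pixels overall.
w720, h720 = 1280, 720
w792, h792 = 1408, 792

pixels_720 = w720 * h720          # 921,600
pixels_792 = w792 * h792          # 1,115,136

print(w792 / w720, h792 / h720)   # 1.1, 1.1  -> 10% per axis
print(pixels_792 / pixels_720)    # 1.21      -> 21% more pixels overall
```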
 
I just have an image of the dev team first running the game at 1080p, shitty framerate and all, then simply working down from it until they got a stable frame rate... 792p was it for this game.
 

Yoday

Member
If anything, I would imagine the opposite. 720p scales into 1080p at a nice 2:3 ratio, while 792p is a less pretty 11:15. From what I know about upscaling, it would be easier to optimize for and reduce artifacts with 2:3, especially given that 720p to 1080p upscaling is something that has been extensively studied and worked on for years and years.



I think this is the real reason - to be able to claim that they're running at "higher than 720p" and dodge the stigma of 720p, even though any sort of gain in image quality is negligible.

Note that 792p is exactly 10% more resolution in each direction than 720p: (792 - 720) / 720 = 72 / 720 = 0.10 = 10%
If that was the reason for it, then it would make a whole lot more sense to bump it up 8 more lines to 800p and get out of the 700's completely. I tend to side with the ESRAM limitations theory, as it just makes more sense.
 

alr1ght

bish gets all the credit :)
If that was the reason for it, then it would make a whole lot more sense to bump it up 8 more lines to 800p and get out of the 700's completely. I tend to side with the ESRAM limitations theory, as it just makes more sense.

could be due to aspect ratio

1280x720=1.777777777778
1408x792=1.777777777778
1600x900=1.777777777778
1920x1080=1.777777777778

1424x800=1.78
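A quick check of those ratios, and why an 800-line 16:9 buffer can't have an integer width:

```python
# Which of these resolutions are exactly 16:9?
for w, h in [(1280, 720), (1408, 792), (1600, 900), (1920, 1080), (1424, 800)]:
    exact = (w * 9 == h * 16)
    print(f"{w}x{h}: {w / h:.10f} {'(exact 16:9)' if exact else '(not exact)'}")

# 1424x800 comes out to 1.78 rather than 1.777... because 800 * 16 / 9 is not
# a whole number of pixels (1422.22), so an 800-line 16:9 frame can't have an
# integer width.
```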
 

Gestault

Member
It kills me seeing a first page with so many vapid responses ignoring the entire write-up in the original post. The idea of optimizing for the smaller pocket of faster ram in the XB1 makes perfect sense, since it eliminates the need to make too many changes in a multi-platform optimization pipeline. Cross-gen and multi-platform titles will probably see this more often than not.
 

ZehDon

Member
That is an odd resolution. But I'm positive in the case of games like Watch_Dogs that's the best UBI could do resolution wise vs. performance.
A lot of the issues with some of these early current-gen titles are going to be related to cross-gen versions. Regardless of what the game does, it must operate and function on the PS360WiiU consoles. Since all of them use different architecture compared to the Xbone and PS4, it's reasonable that Ubisoft's teams had to target the lowest common denominator when designing the game. The Xbone and PS4 versions can do more visually, but all of the behind-the-scenes systems still need to scale down. This is also part of the reason why Watch_Dogs running at sub-1080p on the PS4 is such a point of contention - technically, the 360 can run the game, so hitting 1080p shouldn't have been that much of a problem on hardware that's approximately ten times more powerful. There isn't an eSRAM bottleneck there to justify the low resolution.

BTW....this is my first post on Neogaf. Hi everyone

Welcome aboard.
 

Human_me

Member
could be due to aspect ratio

1280x720=1.777777777778
1408x792=1.777777777778
1600x900=1.777777777778
1920x1080=1.777777777778

1424x800=1.78

Bingo!

It would be the aspect ratio: if they don't match up, then the devs would either have to put black bars on the screen (like The Order: 1886) or stretch the image to fit the screen (which would look horrible).
 

ikariX

Banned
A lot of the issues with some of these early current-gen titles are going to be related to cross-gen versions. Regardless of what the game does, it must operate and function on the PS360WiiU consoles. Since all of them use different architecture compared to the Xbone and PS4, it's reasonable that Ubisoft's teams had to target the lowest common denominator when designing the game. The Xbone and PS4 versions can do more visually, but all of the behind-the-scenes systems still need to scale down. This is also part of the reason why Watch_Dogs running at sub-1080p on the PS4 is such a point of contention - technically, the 360 can run the game, so hitting 1080p shouldn't have been that much of a problem on hardware that's approximately ten times more powerful. There isn't an eSRAM bottleneck there to justify the low resolution.



Welcome aboard.

Thanks
 

Ishan

Junior Member
Does anyone with a solid understanding of graphics know why a particular scaling ratio is better? I mean, if it's a fraction it's a fraction, right?
 
Best guess is there was a rumored 10% Kinect reservation to be removed, and 792P is precisely 10% more than 1280x720.

1408 = a 10% increase over 1280

and

792 = a 10% increase over 720.

This makes the most sense to me.

I know it may not necessarily translate to performance that way, but this is how I always saw the choice of 792p.
 

Fuchsdh

Member
Yeah it's funny I've been gaming for 20+ years and never even seen 792p as an option before. Now we have 2 X1 games using it.

Threw me off too seeing that. I immediately thought that it avoids 720p headlines. Who knows!

Where were you last gen where the sub-HD resolutions were all over the map? It's going to be a thing going forward on both consoles just like last time.

Does anyone with a solid understanding of graphics know why a particular scaling ratio is better? I mean, if it's a fraction it's a fraction, right?

There isn't. Scaling something is scaling something. What might actually matter is the type of scaling algorithm - that is, certain scaling percentages might look crisper or something depending on the interpolation, like how you choose different scaling methods in Photoshop when you want to reduce something versus blow it up.

I know nothing about the XB1 or PS4's scaling abilities however. The buffer seems like the most reasonable reason for the odd number.
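As a rough illustration that the interpolation filter is what changes the result, here's a minimal sketch using Pillow (a generic image library; this has nothing to do with the consoles' actual hardware scalers) that upscales the same frame with different filters:

```python
# Upscale the same 1408x792 frame to 1080p with different interpolation
# filters. Generic PC-side example, not how a console scaler works.
from PIL import Image

frame = Image.new("RGB", (1408, 792))   # stand-in for a rendered frame

for name, resample in [("nearest", Image.NEAREST),
                       ("bilinear", Image.BILINEAR),
                       ("lanczos", Image.LANCZOS)]:
    upscaled = frame.resize((1920, 1080), resample=resample)
    upscaled.save(f"upscaled_{name}.png")
```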
 

HTupolev

Member
Best guess is there was a rumored 10% Kinect reservation to be removed, and 792P is precisely 10% more than 1280x720.

1408 = a 10% increase over 1280

and

792 = a 10% increase over 720.

This makes the most sense to me.

I know it may not necessarily translate to performance that way, but this is how I always saw the choice of 792p.
It doesn't translate to performance at all that way. Resolution hardly affects load linearly, but 792p is 21% more pixels than 720p.
 

rothbart

Member
All of these responses say that 792p = 720p + 10%. 10% more of what? There aren't 10% more total pixels, if that's what everybody is trying to say.

792p = 1408 x 792 = 1,115,136 pixels
720p = 1280 x 720 = 921,600 pixels

720p -> 792p represents a total pixel increase of 21%, not 10%.

10% more vertical resolution... 720 x 1.10 = 792
 

Riky

$MSFT
People aren't reading the updated opening post.

EDIT: solved

If we do the math, 792p uses the eSRAM about as perfectly as possible.

(32 MB * 1024 KB/MB * 1024 B/KB) / (1408 * 792 pixels) = 30.09 bytes per pixel


Going for 720p results in 36.4 bytes per pixel
Going for 900p results in 23.3 bytes per pixel
Going for 1080p results in 16.18 bytes per pixel

It's nothing to do with scaling and everything to do with filling the eSRAM.
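For anyone who wants to reproduce those numbers, a couple of lines of Python (assuming 32 MiB of eSRAM, as the post above does):

```python
# Bytes-per-pixel budget: total eSRAM divided by pixel count at each resolution.
ESRAM = 32 * 1024 * 1024   # 33,554,432 bytes

for label, w, h in [("720p", 1280, 720), ("792p", 1408, 792),
                    ("900p", 1600, 900), ("1080p", 1920, 1080)]:
    print(f"{label}: {ESRAM / (w * h):.2f} bytes per pixel")

# 720p: 36.41   792p: 30.09   900p: 23.30   1080p: 16.18
```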
 
Threads like this are why fewer and fewer devs bother coming to GAF and taking it seriously.

Instead of getting an actual explanation, I get some guy making shit up because the numbers match and 8 + 1 + 11 = 9/11
 