Fixed for accuracy:
I'm trying to take my conversion process euphemistically by saying I'm "becoming an even more informed consumer".
Someone inform our new friend how hive minding affects one's future.
Is this some kind of PSA (...warning?) for all Junior Members? Now how am I supposed to get to sleep? Meanie.
As a side note: is it true that, now that Kinect is optional, the XBONE is going to be more powerful, as MS claims?
How is that possible? After all, some consumers will still have the Kinect plugged in, and you have to take them into account.
I keep hearing that with time devs will learn the hardware and the tools and will be able to output higher resolutions on the XBO, but won't learning the hardware and maturing dev tools also result in more technically demanding games? Isn't that going to make things even worse?
I'm not a techy person, I was just wondering.
It always bugged me that games went with 720p when TVs were actually 1360x768, I think.
Overscan. You'd have a higher resolution TV cutting off the edges of a lower resolution image.
This.
It avoids the stigma of 720p.
Can't you just keep some of that in the embedded RAM? I.e. the specific buffer you might be working on at that point, or the ones that get accessed the most? You wouldn't necessarily need to keep the final frame buffer there, for instance.
On PS4 and PC it doesn't really matter because all the RAM is the same speed, so you can clump it all together as one big buffer, but inevitably I think devs will start splitting it up for Xbox One. A pain at first, but I'm sure it'll become simpler as time goes on.
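Just to make that ESRAM juggling concrete, here's a rough, purely illustrative sketch of the kind of budgeting involved; the buffer names, formats, and per-pixel sizes below are made up for the example, not from any actual engine:

```python
# Rough, illustrative budgeting of render targets against a 32 MB ESRAM pool,
# spilling whatever doesn't fit out to main (DDR3) memory.
# Buffer names, formats, and per-pixel sizes are hypothetical examples only.

ESRAM_BYTES = 32 * 1024 * 1024  # 32 MB of embedded RAM
WIDTH, HEIGHT = 1408, 792       # 792p

# (name, bytes per pixel) -- a made-up G-buffer layout
buffers = [
    ("color",        4),  # e.g. RGBA8
    ("depth",        4),  # e.g. D24S8
    ("normals",      8),  # e.g. RGBA16F
    ("hdr_lighting", 8),  # e.g. RGBA16F
    ("motion",       4),  # e.g. RG16
    ("prev_frame",   4),  # e.g. RGBA8, kept around for temporal effects
]

esram, ddr3, used = [], [], 0
for name, bpp in buffers:
    size = WIDTH * HEIGHT * bpp
    if used + size <= ESRAM_BYTES:   # keep the hottest targets in the fast pool
        esram.append(name)
        used += size
    else:                            # everything else lives in DDR3
        ddr3.append(name)

print(f"ESRAM: {esram} ({used / 2**20:.1f} MB used)")
print(f"DDR3:  {ddr3}")
```

The point is just that, with a layout like this hypothetical one, a typical set of render targets at 792p sits close to the 32 MB ceiling, so anything extra has to spill out to the slower DDR3.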
Can you explain to me the difference? Is there a way to increase IQ other than AA or super sampling?
andromeduck, your theory is interesting.
If multiple games use 792p, it points to a common framebuffer usage (popular among various developers) that will fit within the ESRAM.
At the same time, it does not exclude the possibility of other resolutions, as we have seen. As long as there is a 32MB limit, it will always be a compromise between resolution and per-pixel information (ultimately image quality).
From what I gathered in the Titanfall thread, it scales better to 1080p
Not much tech I guess... more pixels on screen = better scaling.
Would be curious to see the tech behind why, if this is the case.
But when games like Titanfall have performance issues at 792p then it doesn't make sense. Why not 720p at that point?
Edit: I guess better scaling makes sense
Something I have always wondered. Why do some 360 and PS3 games have 1080p listed as resolutions on the back of the packages? Have no 360 and PS3 games been released in 1080p?
They are scaled to 1080p; all 360 games are, and some PS3 games.
There are a few native 1080p games; Ridge Racer 7 was one, along with a few sports and smaller titles.
I think it's something like 792p scales to 1080p TVs better, but I'm not sure. (Compared to 720p).
To be above baseline HD, I would guess. 720p is heavily associated with last generation.
This
So they can say "welp, it isn't 720p"
If that was the reason for it, then it would make a whole lot more sense to bump it up 8 more lines to 800p and get out of the 700's completely. I tend to side with the ESRAM limitations theory, as it just makes more sense.
If anything, I would imagine the opposite. 720p scales into 1080p at a nice 2:3 ratio, while 792p is a less pretty 11:15. From what I know about upscaling, it would be easier to optimize for and reduce artifacts with 2:3, especially given that 720p to 1080p upscaling is something that has been extensively studied and worked on for years and years.
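For what it's worth, those scale factors are easy to check (plain Python, nothing console-specific about it):

```python
from fractions import Fraction

# Vertical scale factor from each render height up to a 1080-line panel.
for src in (720, 792, 900):
    ratio = Fraction(src, 1080)          # 720 -> 2/3, 792 -> 11/15, 900 -> 5/6
    print(f"{src} -> 1080: {ratio} (scale up by {1080 / src:.4f}x)")
```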
I think this is the real reason - to be able to claim that they're running at "higher than 720p" and dodge the stigma of 720p, even though any sort of gain in image quality is negligible.
Note that 792p is exactly 10% more resolution in each direction than 720p: (792 - 720) / 720 = 72 / 720 = 0.10 = 10%.
A lot of the issues with some of these early current-gen titles are going to be related to cross-gen versions. Regardless of what the game does, they must operate and function on the PS360WiiU consoles. Since all of them use different architectures compared to the Xbone and PS4, it's reasonable that Ubisoft's teams had to target the lowest common denominator when designing the game. The Xbone and PS4 versions can do more visually, but all of the behind-the-scenes systems still need to scale down. This is also part of the reason why Watch_Dogs running at sub-1080p on the PS4 is such a point of contention - technically, the 360 can run the game, so hitting 1080p shouldn't have been that much of a problem on hardware that's approximately ten times more powerful. There isn't an eSRAM bottleneck to justify the low resolution.
That is an odd resolution. But I'm positive in the case of games like Watch_Dogs that's the best UBI could do resolution-wise vs. performance.
BTW....this is my first post on Neogaf. Hi everyone
Could be due to aspect ratio:
1280x720=1.777777777778
1408x792=1.777777777778
1600x900=1.777777777778
1920x1080=1.777777777778
1424x800=1.78
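If anyone wants to verify, all of those reduce to exactly 16:9 except the last one, which only works out to approximately 16:9 (89:50); quick check in plain Python:

```python
from fractions import Fraction

resolutions = [(1280, 720), (1408, 792), (1600, 900), (1920, 1080), (1424, 800)]
for w, h in resolutions:
    print(f"{w}x{h}: {Fraction(w, h)} ({w / h:.4f})")
```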
Welcome aboard.
Bingo!
It would be the aspect ratio. If they don't match up, then the devs would either have to put black bars on the screen (like The Order: 1886) or stretch the image to fit the screen (which would look horrible).
Yeah it's funny I've been gaming for 20+ years and never even seen 792p as an option before. Now we have 2 X1 games using it.
Threw me off too seeing that. I immediately thought that it avoids 720p headlines. Who knows!
Does anyone with a solid understanding of graphics know why a particular scaling ratio is better? I mean, if it's a fraction it's a fraction, right?
It doesn't translate to performance at all that way. Resolution hardly affects load linearly, but 792p is 21% more pixels than 720p.
Best guess is there was a rumored 10% Kinect reservation to be removed, and 792p is precisely 10% more than 1280x720 in each dimension.
1408 = a 10% increase over 1280
and
792 = a 10% increase over 720.
This makes the most sense to me.
I know it may not necessarily translate to performance that way, but this is how I always saw the choice of 792p.
All of these responses say that 792p = 720p + 10%. 10% more of what? There are not 10% more total pixels, if that's what everybody is trying to say.
792p = 1408 x 792 = 1,115,136 pixels
720p = 1280 x 720 = 921,600 pixels
720p -> 792p represents a total pixel increase of 21%, not 10%.
EDIT: solved
If we do the math, 792p uses the eSRAM about as perfectly as possible.
(32 MB * 1024 KB/MB * 1024 B/KB) / (1408 * 792 pixels) = 30.09 bytes per pixel
Going for 720p results in 36.4 bytes per pixel
Going for 900p results in 23.3 bytes per pixel
Going for 1080p results in 16.18 bytes per pixel
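Here's the same arithmetic spelled out so anyone can plug in other resolutions; the only given is the 32 MB ESRAM figure, the rest is just division:

```python
ESRAM_BYTES = 32 * 1024 * 1024  # 32 MB in bytes

resolutions = {
    "720p":  (1280, 720),
    "792p":  (1408, 792),
    "900p":  (1600, 900),
    "1080p": (1920, 1080),
}

base_pixels = 1280 * 720
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels "
          f"(+{(pixels / base_pixels - 1) * 100:.0f}% vs 720p), "
          f"{ESRAM_BYTES / pixels:.2f} ESRAM bytes per pixel")
```

That gives roughly 36.4, 30.1, 23.3, and 16.2 bytes per pixel respectively, and also shows the 21% total pixel increase from 720p to 792p mentioned above.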