NeoGAF is the best and worst thing to ever happen to my gaming life.
Before GAF: "Damn, this is fun and the graphics are pretty sweet." After GAF: "I'm not buying that game."
Curse you NeoGAF....curse you.
Probably the best res if you can't get a stable 900p at 30fps and want to stay above 720p.
If they want a pixel ratio of exactly 16:9, they would have to be able to divide the height (in this instance 792) by 9 and multiply it by 16 and still get an integer width. Which they can. But that doesn't work with 791p or 793p, so you would have to either increase the resolution to 801p or decrease it to 783p. Maybe 792p was the sweet spot for both games?
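Quick sanity check on that maths (just my own throwaway Python, assuming the constraint really is "width = height * 16/9 must be a whole number"):

# Which heights around 792 give an exact 16:9 integer width?
for h in range(783, 802):
    if (h * 16) % 9 == 0:
        print(f"{h}p -> {h * 16 // 9}x{h}")
# Prints: 783p -> 1392x783, 792p -> 1408x792, 801p -> 1424x801

So 792p (1408x792) really is the only exact-16:9 option between 783p and 801p.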
Doesn't the image still get sent as 720p if the TV reports as 720p?
I meant from a TV that'll downscale from 1080p; my brother has one of those that WILL see and process a 1080p signal but downscales it to the TV's native resolution.
You definitely lose some sharpness whenever you scale, especially at non-integer ratios, since you lose some of that data whenever you blend neighbouring pixels.
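To put a number on that (my arithmetic, nothing from the thread): the 792p-to-1080p upscale is a non-integer ratio, so the two pixel grids almost never line up:

from fractions import Fraction
# 792 -> 1080 vertical scale factor
ratio = Fraction(1080, 792)
print(ratio, float(ratio))  # 15/11, ~1.364
# Every 11 source rows map onto 15 output rows, so most output pixels fall
# between source pixel centres and have to be interpolated - hence the softness.

Compare an integer 2x scale (e.g. 540p -> 1080p), where every source pixel maps cleanly onto a 2x2 block.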
Because it's not 720p.
I wonder when that patch for Titanfall making it 1080p is coming out. I wonder if Respawn is still working on that...
Wait until devs get a hold of the Xbone's Kinect-less power. 1200p confirmed.
Haha, what are you talking about? At the very least it needs to be divisible by 8 so the image can be stretched to 1080p.
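Interestingly, if you check that against the three exact-16:9 heights from earlier (again just a quick check of my own, assuming the divisible-by-8 requirement is real):

# Of the exact-16:9 candidates near 792, which are divisible by 8?
for h in (783, 792, 801):
    print(h, h % 8 == 0)
# 783 False, 792 True, 801 False

792 is the only one that satisfies both constraints.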
Someone inform our new friend how hive minding affects one's future
It makes MS happy that people can't call the Xbox One the Xbox 720.
Is it not to do with the maximum frame buffer size they could fit into the ESRAM?
That might be it. Still strange... but interesting.
I remember Respawn mentioning it when quizzed by Eurogamer about their choice of resolution.
http://www.eurogamer.net/articles/digitalfoundry-2014-titanfall-ships-at-792p
Why 900p on XBO?
Why 900p on PS4?
Why 792p on XBO?
Why 540p on PS3/360?
Why? Because these consoles aren't powerful enough to do what the developers want. Thus, a sacrifice is made. Resolution or frame rate, take your pick.
/rocketscience
I don't know why they use 792p instead of something like 720p with 8x SMAA. 720p should scale better, and the SMAA would greatly smooth out the jaggies without making things blurry like the shitty FXAA some devs like to use.
Resolution adds extra detail. AA basically removes detail from the image to make it look smoother.
No. Real AA adds detail to the rendered image; single-sample post-processing AA could be considered to remove detail.
There's no such thing as 'Real' AA; they are all different kinds of AA, and unless you are doing something like full-screen supersampling you are one way or another interpolating. That's not adding detail.
That's just wrong. Read my article for details.
In short, any AA method which samples the scene at additional locations (so everything which isn't single-sample postprocessing AA) clearly adds detail to the final rendered image. That's not at all limited to supersampling, but also applies to multisampling (and coverage sampling for that matter).
And supersampling is only equivalent to higher resolution rendering if you are using an ordered grid, which is generally not nearly as effective as a sparse grid.
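A toy illustration of the ordered-vs-sparse point (made-up sample positions, not any real hardware's pattern): with four samples per pixel, an ordered 2x2 grid only ever probes two distinct horizontal and two distinct vertical positions, while a rotated/sparse grid probes four of each, so near-horizontal and near-vertical edges get twice as many coverage gradations.

# 4x supersampling patterns inside one pixel (coordinates in [0, 1))
ordered = [(0.25, 0.25), (0.75, 0.25), (0.25, 0.75), (0.75, 0.75)]  # ordered 2x2 grid
sparse = [(0.375, 0.125), (0.875, 0.375), (0.125, 0.625), (0.625, 0.875)]  # rotated grid
for name, pattern in (("ordered", ordered), ("sparse", sparse)):
    xs = {x for x, _ in pattern}
    ys = {y for _, y in pattern}
    print(name, "unique x positions:", len(xs), "unique y positions:", len(ys))
# ordered: 2 and 2; sparse: 4 and 4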
I have a friend who acts like he is a seasoned game programmer because he reads GAF and the mere thought of me possibly not buying a PS4 enrages him! I'm pretty sure he would stab me in my sleep if I ever bought an Xbox One.
However, if you put up two TVs with the same game running from a PS4 and XB1, I assure you he wouldn't be able to correctly pick the PS4 version. He's never even played an XB1 game, but GAF tells him it's bad so it must be!
Those who own an XBO and Titanfall, do you find there is a noticeable difference between 720p and 792p? Genuinely very close to dropping the bucks on an XBO, but as a PS4 owner I'm just wondering how much of a step back it'll actually feel. Does it look like Xbox 360 levels of IQ, or is the AA improved compared to that too? In all honesty though, when I'm playing Peggle 2 and Sunset Overdrive, I probably won't give two shits about the resolution.
I think what you said earlier can generically apply to most large communities, period; especially ones with people who can articulate opinions rather than it coming out like this. You just have to remember that someone, somewhere hates everything you love, whether for good or bad reasons, and to keep that in mind when seeing some of these opinions; never mind that some people can go overboard.
Point taken... went over my head...
I come here for the information and the community but I rarely agree with the vocal minority about games. You have to be strong and form your own opinion.
Actually, that's exactly the bottleneck.
Source?
Ubisoft and Respawn must have found some use for the extra two bytes - probably related to some advanced rendering trick. Anybody want to hazard a guess?
I'm not too familiar with deferred methods.
It avoids the stigma of 720p.
Wait just a minute... this looks like an educated opinion! This guy knows what he's talking about! Quick - let's get him!!
That was my suspicion as to the reason.
Modern g-buffers* in a deferred renderer can get pretty damn huge, especially when they haven't been built with bandwidth in mind (i.e., they just keep getting bigger as people dump more stuff in them).
* ('geometry buffer' - basically writing out things like normals, diffuse color, etc. to a set of intermediate render target buffers)
For comparison, UE4's g-buffer has four 32-bit RGBA render targets and a 64-bit FP16 RGBA target for the emissive contribution. When adding depth + stencil on top of that, you're looking at a similar size per pixel.
Needless to say, smaller is better for bandwidth.
For a more extreme example, Infamous: SS has a very large g-buffer, at well over 40 bytes per pixel. That's over 80MB at FHD...
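Rough back-of-the-envelope on that (my arithmetic; g-buffer layout as in the UE4 example above, plus an assumed 32-bit D24S8 depth/stencil target - the 32MB ESRAM figure is public):

# Approximate g-buffer footprint at common 16:9 resolutions
BYTES_PER_PIXEL = 4 * 4 + 8 + 4   # four RGBA8 targets + FP16 RGBA + depth/stencil = 28
ESRAM = 32 * 1024 * 1024          # Xbox One's 32MB of ESRAM

for w, h in ((1280, 720), (1408, 792), (1600, 900), (1920, 1080)):
    size = w * h * BYTES_PER_PIXEL
    fits = "fits" if size <= ESRAM else "does not fit"
    print(f"{h}p: {size / 2**20:.1f}MB - {fits} in ESRAM")
# 720p: 24.6MB - fits
# 792p: 29.8MB - fits
# 900p: 38.4MB - does not fit
# 1080p: 55.4MB - does not fit

On those assumptions, 792p is just about the largest standard 16:9 frame whose g-buffer squeezes into the 32MB of ESRAM, which lines up neatly with the frame buffer theory earlier in the thread. (And Infamous: SS at 40+ bytes per pixel gives 2,073,600 pixels x 40 bytes, roughly 83MB, matching the "over 80MB" figure above.)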