
Guerrilla Games: Regarding Killzone Shadow Fall and 1080p

Having 1080 vertical pixels is not native, any way you want to spin it, if the horizontal pixel count is not 1920. It's like saying, for example, that Vanquish (1024x720), Dark Souls (1024x720) and Black Ops 2 (880x720) are now magically native 720p, when they are nowhere near the pixel count of a native 720p framebuffer (1280x720 = 921,600; 1024x720 = 737,280; 880x720 = 633,600).
So basically the GG dev is right when he says it's not native 1080p if you consider every part of the pipeline, which includes the total pixel count. But it is native by the definition that it is not stretched or scaled in any form, kinda like what The Order devs are saying.
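To put concrete numbers on the pixel-count comparison above, here is a small illustrative Python sketch; the per-game resolutions are simply the ones claimed in this thread, not independently verified here.

```python
# Pixel-count comparison for the resolutions discussed above.
# Purely illustrative arithmetic; the game resolutions are the ones
# claimed in the thread.

RESOLUTIONS = {
    "native 720p":           (1280, 720),
    "native 1080p":          (1920, 1080),
    "Vanquish":              (1024, 720),
    "Dark Souls":            (1024, 720),
    "Black Ops 2":           (880, 720),
    "KZ:SF MP (rendered)":   (960, 1080),
    "The Order (rendered)":  (1920, 800),
}

NATIVE_1080P = 1920 * 1080  # 2,073,600 pixels

for name, (w, h) in RESOLUTIONS.items():
    count = w * h
    print(f"{name:22s} {w}x{h} = {count:>9,d} pixels "
          f"({count / NATIVE_1080P:.0%} of a native 1080p framebuffer)")
```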
 
I know that, but I was thinking of something else. Let me try again

Both Killzone and Ryse/BF4 (PS4 ver) render half of the pixels needed for native 1080p.
The other half is, in KZ's case, calculated from the last two frames; in Ryse/BF4 the pixels are calculated from the current frame (from nearby pixels).
So in the end all three games output a 1920x1080 frame; all three only render half of the pixels and calculate the other half (but with different algorithms).



But by some, the KZ solution is considered native 1080p and the other two are not.
Why is that?
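As a rough illustration of the difference described in the post above (this is not either engine's actual code, just the idea of "fill the missing pixel from a previous frame" versus "fill it from its neighbours in the current frame"):

```python
# Toy illustration of two ways to fill in a pixel that was not rendered
# this frame. Not Guerrilla's, Crytek's or DICE's actual code.

def fill_temporal(previous_frame, x, y, motion=(0, 0)):
    """Reproject: reuse the value this pixel had last frame, offset by a motion vector."""
    px, py = x - motion[0], y - motion[1]
    return previous_frame[py][px]

def fill_spatial(current_frame, x, y):
    """Interpolate: average the rendered pixels to the left and right."""
    return (current_frame[y][x - 1] + current_frame[y][x + 1]) / 2

# Tiny example: a single row where pixel index 2 was not rendered this frame.
previous = [[10, 20, 30, 40]]
current  = [[12, 22, None, 42]]

print(fill_temporal(previous, x=2, y=0))  # 30   (value carried over from last frame)
print(fill_spatial(current, x=2, y=0))    # 32.0 (blend of the current neighbours)
```

Temporal reuse can recover detail that spatial blending cannot; when the scene moves, though, the old value may no longer be correct, which is where the prediction and accuracy caveats in the replies below come in.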

Because the KZ:SF method is more similar to a dynamic resolution, whereas BF4 and Ryse are a fixed resolution.

In areas of the frame where there is no motion or where their prediction algorithm is successful you will be getting full 1080p resolution.

When it's not as successful, the results will still probably be better than 1600x900 because you're not dealing with upscaling artifacts from non-evenly-divisible resolutions. In KZ:SF's worst-case scenario there is no upscaling in the vertical domain and a simple 2x upscale in the horizontal.

You're trying to make this black and white when it's not.
 
Nope.

Well, maybe, if you agree a puddle and the Atlantic Ocean are both "bodies of water".

Upscaling takes next to no resources and is just blending; i.e. it's not an accurate way to increase the resolution of the image in the framebuffer. In fact it has the same effect as applying a blur filter. You're always going to lose clarity with a vanilla upscale or by applying blur.

GG's method actually reproduces a 1080p image when nothing changes. It drops from there based on how many pixels are in motion and how well they can be predicted. It's resource-intensive and much more accurate.

If you're sitting at a cap point, slowly scanning, and waiting for someone to attack, you're probably getting 95% of an accurate 1080p image. If you're spinning like a top in a hail of animated explosions, probably a lot less.

I'd be interested to know how well this prediction works, what errors we're seeing, and what the worst-case resolution looks like and how often you see it.

this method ALWAYS produces a 1080p image. the question is how accurate it is. the worst case resolution is always 1080p.

dynamitejim said:
Because the KZ:SF method is more similar to a dynamic resolution, whereas BF4 and Ryse are a fixed resolution.

In areas of the frame where there is no motion or where their prediction algorithm is successful you will be getting full 1080p resolution.

When it's not as successful, the results will still probably be better than 1600x900 because you're not dealing with upscaling artifacts from non-evenly-divisible resolutions. In KZ:SF's worst-case scenario there is no upscaling in the vertical domain and a simple 2x upscale in the horizontal.

You're trying to make this black and white when it's not.

the resolution is NOT dynamic. it is always 1080p. it's just a matter of how accurate the predicted pixels are. i don't even know how you came to the conclusion that there is simple 2x upscaling of the horizontal resolution in a worst case scenario. the resolution is ALWAYS 1080p.
 

BibiMaghoo

Member
Clarity of image is too vague.


The image scaling is gone, which is what caused the clarity to diminish on non-native 1080p games.

AA also affects image clarity; one could argue this is more closely related to an AA debate than a resolution one. Though really, this is a completely new situation.


Basically, if people have a problem with this solution, fine, but those saying "but it isn't native 1080p" are likely missing the point.

I get that the final output image is rendered at 1080p, but it is not as clear as a native one. You can see the difference within the same game, between single player and multiplayer. One is a crisper, cleaner image than the other, and this is because of the rendering technique used. No upscaling artifacts are present because no upscaling occurs, but other image distortions appear in its place, ones that would not be there if the game were rendered fully native like the single player. It is an issue of image quality, from my perspective, even if other things such as AA affect that quality. The two base comparisons, with no other effects, do not produce the same image.
 

quest

Not Banned from OT
Because the KZ:SF method is more similar to a dynamic resolution, whereas BF4 and Ryse are a fixed resolution.

In areas of the frame where there is no motion or where their prediction algorithm is successful you will be getting full 1080p resolution.

When it's not as successful, the results will still probably be better than 1600x900 because you're not dealing with upscaling artifacts from non-evenly-divisible resolutions. In KZ:SF's worst-case scenario there is no upscaling in the vertical domain and a simple 2x upscale in the horizontal.

You're trying to make this black and white when it's not.

We will have to disagree on that; give me 1600x900 any day of the week over this fancy interlacing. Slight upscaling is no issue on my plasma, but motion blur just pisses me off to no end.
 

ElTorro

I wanted to dominate the living room. Then I took an ESRAM in the knee.
It looks good. They've sold you. They should not be obligated to divulge their techniques to appease a rabid, and often unpredictable, subset of users.

Nobody says that they should be "obliged". It just seems like a good PR move. After all, enough people are reading sites like Digital Foundry, so enough people care.

Knowing how they render a 1080p image is utterly meaningless in the grand scheme of things, and does nothing to improve one's enjoyment of the game. 1080p does not make a game more fun, longer, better written, etc.

It's interesting to people. You are generalizing your preference, but if you don't care about the technology underlying a game, just ignore it. Enough people care, and I disagree that all of them care for no rational reason. First of all, it is just an interesting topic, plain and simple, and secondly, visual performance is noticeable.

In the case of KZ:SF people noticed that the multiplayer is blurry, and now they know why and can judge upcoming products that use the same technique. Say that some game comes out on different platforms and renders native 1080p on one platform and uses KZ:SF's technique on the other. If you know what that means, it further informs your decision process between the different versions if you have both platforms. It also adds to the decision about which platform to buy, if these differences become prevalent in many games.
 
Crikey - all these arguments about exactly what 1080p is and isn't, and whether Blu-rays are or aren't!

In my view all this controversy is because we don't have a clear definition of what we're talking about. Do we really care about whether pixels are displayed 1:1 on the screen from how the game drew them before upscaling/interpolating, or do we *really* mean "how many pixels on the screen go through a full set of rendering passes each frame"?

I'd argue it's the latter - because people are quite reasonably obsessed with performance and power (both in absolute terms and relative to competitor machines).

So in that context - yes, The Order renders at native 1080p, but who cares? For a whole chunk of that frame it's basically doing no work, pushing out simple black pixels, so it is in no way "1080p" when we're using that as shorthand for a machine's power (i.e. all 1920x1080 pixels going through a full, complex set of rendering passes).

Similarly KZSF is pushing out a clever native 1080p image, but in the way that matters when it comes to performance - a measure of how many "full fat" pixels it's putting out - it's most definitely not 1080p.

I'm not sure if that's muddied the waters or made it clearer - it's clearer in my head than it is in this post that's for sure!
 

sinnergy

Member
Because the KZ:SF method is more similar to a dynamic resolution, whereas BF4 and Ryse are a fixed resolution.

In areas of the frame where there is no motion or where their prediction algorithm is successful you will be getting full 1080p resolution.

When it's not as successful, the results will still probably be better than 1600x900 because you're not dealing with upscaling artifacts from non-evenly-divisible resolutions. In KZ:SF's worst-case scenario there is no upscaling in the vertical domain and a simple 2x upscale in the horizontal.

You're trying to make this black and white when it's not.

Yet the other examples are black... Fact is that there is tampering with the resolution to get the so-called "1080p", yet everyone could see Ryse looked softer, so did BF, and people even questioned Killzone SF's MP...

In the end you can see that you are looking at a non-native image...

(effects at lower res can sometimes not be seen)
 

Gbraga

Member
Interesting read. Personally I don't think people should focus on the definition of native at all, though. The thing is, people thought the multiplayer looked bad and the single player looked good. It doesn't matter which definition of native 1080p you're using; the criticism still stands.

we all know what people expect when you say native 1080p

Pretty much this.

Guerrilla going beyond Remedy's Alan Wake explanations.

It instantly reminded me of Alan Wake too.
 

DigitalDevilSummoner

zero cognitive reasoning abilities
All I have to say is that this is why I like consoles! I mean, some people try to make this a thing about the capabilities of the PS4, and by extension the new gen, but coming up with techniques that squeeze every last drop of power out of the hardware is what consoles are about in my book. The best pixel-per-dollar ratio. Kudos to GG for a detailed and easy-to-follow analysis.
 

Metfanant

Member
Cool, then again the XBO outputs 1920x1080 progressive pixels for each game, so the TV can display it 1:1, which means it is the best quality because it's coupled with the display's own native resolution. Great!

So are all XBO games native 1080p or not?

If not, then neither is The Order native 1080p... and I don't mind if you agree or not anymore :)

And there is a difference between the terms "rendered" and "recorded"/"displayed", because as you see XBO games are rendered only at a lower res than 1080p but are displayed at 1080p (in your opinion this would even be native 1080p)

No...

either you literally have no idea what you're talking about...or you're just trolling...with the Xbone there is scaling occurring...either the console does it, or your TV will do it...the better result will be determined by which component (Xbone or TV) has a better scaler...

The Order is a situation that couldn't be more different than your Xbone example...

Crikey - all these arguments about exactly what 1080p is and isn't, and whether Blu-rays are or aren't!

In my view all this controversy is because we don't have a clear definition of what we're talking about. Do we really care about whether pixels are displayed 1:1 on the screen from how the game drew them before upscaling/interpolating, or do we *really* mean "how many pixels on the screen go through a full set of rendering passes each frame"?

I'd argue it's the latter - because people are quite reasonably obsessed with performance and power (both in absolute terms and relative to competitor machines).

So in that context - yes, The Order renders at native 1080p, but who cares? For a whole chunk of that frame it's basically doing no work, pushing out simple black pixels, so it is in no way "1080p" when we're using that as shorthand for a machine's power (i.e. all 1920x1080 pixels going through a full, complex set of rendering passes).

Similarly KZSF is pushing out a clever native 1080p image, but in the way that matters when it comes to performance - a measure of how many "full fat" pixels it's putting out - it's most definitely not 1080p.

I'm not sure if that's muddied the waters or made it clearer - it's clearer in my head than it is in this post that's for sure!

except...what about how most (probably all) games render certain aspects in lower resolutions (lighting, explosions, shadows etc...) to save resources...does that fit in your description??
 

ItIsOkBro

Member
[image: a plain black wallpaper]

1080p or 0p?
 

Gestault

Member
In the case of KZ:SF people noticed that the single player is blurry, and now they know why and can judge upcoming products that use the same technique. Say that some game comes out on different platforms and renders native 1080p on one platform and uses KZ:SF's technique on the other. If you know what that means, it further informs your decision process between the different versions if you have both platforms. It also adds to the decision about which platform to buy, if these differences become prevalent in many games.

Err, multi-player, right? Single-player was gorgeous and traditional native-1080p.
 

hodgy100

Member
Killzone: SF multiplayer is not native 1080p. Each frame renders at 960x1080 and then uses the two previous frames to calculate the remainder of the framebuffer, outputting a constructed 1920x1080 image. It's a really cool trick and allowed for higher IQ through 1:1 pixel mapping on a 1080p screen rather than upscaling, while enabling a higher framerate. But the game is not rendering a completely new 1920x1080 framebuffer every frame, which is the traditional meaning of "native 1080p".

The Order: 1886 renders its frame at 1920x800; this is also not "native 1080p". The game is not calculating a full 1920x1080 image; the remainder is blank, and no processing takes place on those pixels each frame. This is, however, also a neat trick: it allows for excellent IQ as it provides 1:1 pixel mapping with a 1080p screen, and it freed up the resources for 4xMSAA, which is great for clearing jaggies. But no, the game is not "native 1080p" in the context of a game.

To people using movies as examples and saying "they are 1080p": yeah, they are, in the context of a film; it is just a change in aspect ratio. This reasoning, however, cannot be applied to a video game. The main reason is that the rendering resolution of a game affects its performance. Say we have a game that runs at 1920x1080 but averages out at 25fps; this is unacceptable, because a fluctuating, uneven framerate has a detrimental effect on the gameplay. Rendering the game at 1920x800, however, could increase performance enough to lock the game at 30fps, providing a much better experience without sacrificing IQ. This is not the case with a film: the director may choose any aspect ratio they want without any performance repercussions.

Ryse, BF4 and other non-native games render at sub-1920x1080 resolutions and the image is upscaled; this introduces blur into the image and is a pretty bad hit to IQ on 1080p TVs. This is always done to mitigate performance issues.
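To visualise roughly what "renders at 960x1080 and fills in the rest from previous frames" means, here is a heavily simplified sketch. It is not Guerrilla's actual pipeline (there are no motion vectors or blending here); it only shows the bookkeeping of interleaving freshly rendered columns with columns kept from the previously reconstructed frame:

```python
# Heavily simplified column-interleaved temporal reconstruction.
# The real technique reprojects using motion and blends; this toy version
# just keeps the missing columns from the previous reconstructed frame.

WIDTH, HEIGHT = 8, 2  # stand-ins for 1920 and 1080

def render_half_frame(frame_index):
    """Pretend renderer: produces only every other column (WIDTH // 2 columns)."""
    parity = frame_index % 2  # even frames render even columns, odd frames odd
    return [[(frame_index, x) for x in range(parity, WIDTH, 2)]
            for _ in range(HEIGHT)]

def reconstruct(half_frame, previous_full, frame_index):
    """Interleave newly rendered columns with columns carried over from last frame."""
    parity = frame_index % 2
    full = [row[:] for row in previous_full]      # start from last frame's result
    for y in range(HEIGHT):
        for i, x in enumerate(range(parity, WIDTH, 2)):
            full[y][x] = half_frame[y][i]          # overwrite the freshly rendered half
    return full

# The output is always WIDTH pixels wide, but only half of its columns
# were rendered during the current frame.
full = [[None] * WIDTH for _ in range(HEIGHT)]
for f in range(3):
    full = reconstruct(render_half_frame(f), full, f)
print(full[0])
```

In the real implementation the carried-over samples are reprojected along motion and validated, which is why accuracy drops with fast motion, as discussed throughout the thread.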
1080p or 0p?

It's a 1080p image, but nothing is being rendered. It's a strawman and irrelevant to the discussion at hand.
 

Jack cw

Member
Cool, then again the XBO outputs 1920x1080 progressive pixels for each game, so the TV can display it 1:1, which means it is the best quality because it's coupled with the display's own native resolution. Great!

So are all XBO games native 1080p or not?

If not, then neither is The Order native 1080p... and I don't mind if you agree or not anymore :)

And there is a difference between the terms "rendered" and "recorded"/"displayed", because as you see XBO games are rendered only at a lower res than 1080p but are displayed at 1080p (in your opinion this would even be native 1080p)
Wonderful how you explained everything by yourself.
The framebuffer is the important thing in video games, as the game's native resolution will be upscaled by the hardware to get a compatible signal to the HDMI out of the console. In this case that's a 1080p signal based on an upscaled low-resolution image which is not 1:1 pixel mapped (for example CoD and BF4 with 1280x720 pixels on the Xbone). The result is a blurry, artifacted, foggy and soft image that simply looks bad.

If the framebuffer sends out a true native 1080p image, like with Killzone's SP, then the console's output is also 1080p, but the image isn't scaled internally to match that standard as it's 1:1 pixel mapped. 1600x900 or 1280x720 pixels need to be upscaled to fit your screen and for your display to detect a correct signal; that's why the Xbone outputs "1080p" but the image that is shown is simply a low res native image with bad IQ as it's not 1:1 pixel mapped.

Movies in a wider ratio in Full HD are native, as is The Order. The letterbox is part of the image! This is fact; deny it as you want, it only shows your ignorance and complete lack of knowledge about this topic.
 

kinggroin

Banned
Do we call games 540p if they use quarter res shadows, or alpha blending, or ambient occlusion? Or does it still count as 1080p because the end result is 1920x1080 new pixels being calculated and drawn, which also happens in KZ?




Fixed for you


Massive threshold difference mrklaw. Big difference between improving performance by dropping the lighting resolution or particle resolution or shadow resolution compared to doing it to the entire frame. Let's not be disingenuous here.

One method will still give you a perfectly sharp image but with some low quality effects (which doesn't usually cross that "not 1080p" threshold). The other will just give you a lower quality image altogether (which for many, obviously, crossed that "not 1080p" threshold).
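For reference, "quarter-res shadows" and similar usually means something like the following generic sketch (not any particular engine's code): the effect is drawn into a buffer a fraction of the frame's size and stretched back up when composited, while the main frame itself stays at full resolution.

```python
# Generic sketch of reduced-resolution effect rendering: the main frame is
# full resolution, the effect (particles, shadows, ...) is rendered into a
# smaller buffer and upsampled (nearest-neighbour here) during compositing.
# Not taken from any specific engine.

def render_effect_low_res(width, height, scale):
    """Render a dummy 'effect' into a buffer that is `scale` times smaller per axis."""
    return [[(x + y) % 2 for x in range(width // scale)]
            for y in range(height // scale)]

def composite(frame, effect, scale):
    """Blend the low-res effect over the full-res frame, upsampling it on the fly."""
    return [[base + effect[y // scale][x // scale]   # each effect texel covers scale x scale pixels
             for x, base in enumerate(row)]
            for y, row in enumerate(frame)]

WIDTH, HEIGHT, SCALE = 8, 4, 2                      # e.g. a half-resolution effect buffer
frame = [[0] * WIDTH for _ in range(HEIGHT)]        # dummy full-resolution frame
effect = render_effect_low_res(WIDTH, HEIGHT, SCALE)
print(composite(frame, effect, SCALE)[0])           # full-width output, blocky effect detail
```

The output frame stays full resolution; only the effect's detail is blocky, which is the "threshold difference" being argued here.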
 

TMNT

Banned
Awesome response by GG. Glad they went this route rather than try to force 1080p normally or drop it down to a lower res.

This technique gives MP a fantastic, smooth IQ that looks like 1080p with FXAA, or much like Ryse did with high-quality 900p.


Well done GG.

By 'smooth' do you mean 'blurred'? I prefer smeared but either way, the explanation is interesting and I'd be curious to see if they utilize this method in upcoming titles.
 

zzz79

Banned
A 1080p TV with a resolution of 1920 x 1080 has a total of 2,073,600 pixels.
A 1080p Blu-ray movie with a resolution of 1920 x 1080 has a total of 2,073,600 pixels, regardless of whether it has black bars like movies made for the cinema or no black bars like direct-to-DVD movies.

This equals a true 1:1 pixel mapping, also known as native 1080p.

A game that renders internally at 1920 x 1080 / 1920 x 800 with 280px black bars is native as well, as it maintains the 1:1 pixel mapping with 2,073,600 pixels.

A game that renders internally at 1280 x 720 is not native on a 1080p TV, as it only has 921,600 pixels and thus no 1:1 pixel mapping.

*lol* when wishes come true ...

Still too biased for the truth?

A game rendered at 1920x800 is not native 1080p and that's the point, because it contains only 1920x800 real rendered pixels which are calculated by the CPU/GPU.
The original game image size is less than 1920x1080, therefore it's not native 1080p.

With the game resolution 1920x800 you can:
- either add black bars to get to 1920x1080, but it's not natively rendered even though it hits 1920x1080 progressive pixels in the output signal of the console.
- or scale the game image up to 1920x1080 by adding artificial pixels to hit a 1080p output signal.

In both methods the game is not rendered at native 1080p, so you can't say it's native 1080p just because the output signal of the console is 1080p.

Or, to put it another way: if you call the game rendered at 1920x800 including black bars native 1080p, then the game rendered at 1920x800 upscaled to 1920x1080 should also be called native 1080p ... ;)
 

TyrantII

Member
I know that, but I was thinking of something else. Let me try again

Both Killzone and Ryse/BF4 (PS4 ver) render half of the pixels needed for native 1080p.
The other half is, in KZ's case, calculated from the last two frames; in Ryse/BF4 the pixels are calculated from the current frame (from nearby pixels).
So in the end all three games output a 1920x1080 frame; all three only render half of the pixels and calculate the other half (but with different algorithms).



But by some, the KZ solution is considered native 1080p and the other two are not.
Why is that?

Because the result is vastly more accurate, and can be quantified in a much clearer image.

To put it another way, one method is just upscaling a DVD. The other is a normal h.264 encode of a film for Blu-ray (or some other advanced method).

Both are lossy, but one is orders of magnitude more accurate. And any way you slice it, that matters.
 

MrZekToR

Banned
A 1080p TV with a resolution of 1920 x 1080 has a total of 2,073,600 pixels.
A 1080p Blu-ray movie with a resolution of 1920 x 1080 has a total of 2,073,600 pixels, regardless of whether it has black bars like movies made for the cinema or no black bars like direct-to-DVD movies.

This equals a true 1:1 pixel mapping, also known as native 1080p.

A game that renders internally at 1920 x 1080 / 1920 x 800 with 280px black bars is native as well, as it maintains the 1:1 pixel mapping with 2,073,600 pixels.

A game that renders internally at 1280 x 720 is not native on a 1080p TV, as it only has 921,600 pixels and thus no 1:1 pixel mapping. To add: In order to reach the 1:1 pixel mapping, the console/Blu-ray player/TV has to interpolate the missing pixels based on the information of neighboring pixels. The result is usually a blurry mess.

...or free anti-aliasing?
 

Gbraga

Member
It may be a bit offtopic, but since there are several PS4 owners here, can someone please test if setting your PS4 to 720p or lower increases the framerate? I've been curious about this for some time. I wonder if the PS4 just downsamples the image to 720p or if it actually renders at a lower resolution.
 

hodgy100

Member
haha, great!

native 1080p if the pic was taken from the PS4

scaled 1080p if the pic was taken from the XBO

according to some fanboys here ;)

I agree with your argument, but please stop throwing around heavy words such as "fanboy"; it just escalates the tone and prevents genuine discussion.

It may be a bit offtopic, but since there are several PS4 owners here, can someone please test if setting your PS4 to 720p or lower increases the framerate? I've been curious about this for some time. I wonder if the PS4 just downsamples the image to 720p or if it actually renders at a lower resolution.

I believe it downsamples by default, but it wouldn't surprise me if devs could make a game render at lower resolutions if the PS4 was set to one.
 

Jack cw

Member
Hmm, have you played Ryse on a big screen? (genuine question)

I have, and I like what I see, but let's be honest: Crytek uses a squillion effects like the heaviest motion blur ever and some sort of asymmetrical scaling to hide the 35% lower resolution. Same cheating and obfuscating as Killzone's MP, actually. Regardless of the resolution, the image looks good, but you notice that it's not native.
 

butts

Member
It seems strange to me that all of these calculations are less intensive than just rendering the extra pixels but then again I basically know nothing about this stuff so oh well! Interesting read though.
 

Metfanant

Member
*lol* when wishes come true ...

Still too biased for the truth?

A game rendered at 1920x800 is not native 1080p and that's the point, because it contains only 1920x800 real rendered pixels which are calculated by the CPU/GPU.
The original game image size is less than 1920x1080, therefore it's not native 1080p.

With the game resolution 1920x800 you can:
- either add black bars to get to 1920x1080, but it's not natively rendered even though it hits 1920x1080 progressive pixels in the output signal of the console.
- or scale the game image up to 1920x1080 by adding artificial pixels to hit a 1080p output signal.

In both methods the game is not rendered at native 1080p, so you can't say it's native 1080p just because the output signal of the console is 1080p.

Or, to put it another way: if you call the game rendered at 1920x800 including black bars native 1080p, then the game rendered at 1920x800 upscaled to 1920x1080 should also be called native 1080p ... ;)

when you play The Order, the PS4 will be sending an unmolested 1920x1080 frame to your television....therefore 1080p...
 

dark10x

Digital Foundry pixel pusher
Everyone once again misunderstands why this even matters.

It matters because we are stuck with fixed pixel displays. That's it.

1080p matters as it is the standard resolution across the vast majority of modern TVs. If the rendering resolution does not divide evenly into the display resolution, you will see blurring (or distorted pixels) from upscaling. If a proper 960x540 could actually be output and accepted by a 1080p TV, it could potentially appear much cleaner than 1280x720 due to the fact that you could double pixels evenly in both directions.

It's this whole issue of upscaling that makes the Killzone solution so damn good. It still produces an image which looks like full 1920x1080 without any of the scaling artifacts you get with a lower resolution.

Those of us that actually give a shit about this stuff (and aren't simply console warriors) would understand this. It's all about eliminating upscale blur.

If we were all using displays without fixed pixel grids this whole argument wouldn't really matter.

The SNES produced a significantly lower resolution than the Genesis in most cases but it didn't matter on a CRT as it could handle both without an issue. Emulation of SNES is actually difficult for this reason as it used non-square pixels (256x224 vs 320x224 on Genesis).

In both methods the game is not rendered at native 1080p, so you can't say it's native 1080p just because the output signal of the console is 1080p.
The 1920x800 solution will produce a much better image simply because it avoids upscaling artifacts. THAT'S why 1600x900 is inferior.
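One way to see the divisibility point, and why a letterboxed 1920x800 frame behaves differently from a 1600x900 frame stretched to 1080p, is to look at the scale factor per axis. A small illustrative sketch, using the resolutions discussed in the thread:

```python
from fractions import Fraction

# Scale factor needed to fill a 1920x1080 display from various rendering
# resolutions. An integer factor means pixels can be doubled cleanly; a
# fractional factor means the scaler has to blend neighbouring pixels,
# which is where upscale blur comes from. 1920x800 is handled separately
# because it is letterboxed (padded with bars), not stretched.

DISPLAY_W, DISPLAY_H = 1920, 1080

stretched = {
    "1280x720": (1280, 720),
    "1600x900": (1600, 900),
    "960x540":  (960, 540),
}

for name, (w, h) in stretched.items():
    sx, sy = Fraction(DISPLAY_W, w), Fraction(DISPLAY_H, h)
    clean = sx.denominator == 1 and sy.denominator == 1
    verdict = "integer scale (clean pixel doubling)" if clean else "fractional scale (blended pixels)"
    print(f"{name}: {float(sx):.2f}x horizontal, {float(sy):.2f}x vertical -> {verdict}")

# The letterboxed case: no scaling at all, just padding.
bars = (DISPLAY_H - 800) // 2
print(f"1920x800: 1.00x on both axes, {bars}px black bars top and bottom -> 1:1 pixel mapping")
```

960x540 comes out as a clean 2x in both directions, which is the even-doubling point above; 1600x900 needs a 1.2x blend on both axes.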
 

BigTnaples

Todd Howard's Secret GAF Account
haha, great!

native 1080p if the pic was taken from the PS4

scaled 1080p if the pic was taken from the XBO

according to some fanboys here ;)


Really?


You have done nothing in this thread but prove how little you know about the topic while simultaneously arguing your "points" with a feverish passion, and randomly bringing up Xbox One and calling people fanboys.



Just stop.
 
except...what about how most (probably all) games render certain aspects in lower resolutions (lighting, explosions, shadows etc...) to save resources...does that fit in your description??

Yeah - that muddies the waters (and the IQ :)) doesn't it?

I guess things like lighting/explosions/shadows shouldn't be counted, because:

a) these things have never been rendered at full 1080p resolution, as I understand it
b) these things are broadly "image quality" items rather than "resolution" items

i.e. if you dropped the resolution on b you'd get blocky shadows and lighting changes but the underlying texture surfaces on screen would still have been rendered at 1080p.

The discussion should move on to IQ, but it's still very focussed around "1080p" as that's easier to comprehend. (for example a game with a 1600x900 rendering buffer that also rendered shadows/explosions/lighting at 1600x900 might look better than a 1920x1080 frame buffer with 640x480 shadows/explosions/lighting - I don't know).
 

hodgy100

Member
Everyone once again misunderstands why this even matters.

It matters because we are stuck with fixed pixel displays. That's it.

1080p matters as it is the standard resolution across the vast majority of modern TVs. If the rendering resolution does not divide evenly into the display resolution, you will see blurring (or distorted pixels) from upscaling. If a proper 960x540 could actually be output and accepted by a 1080p TV, it could potentially appear much cleaner than 1280x720 due to the fact that you could double pixels evenly in both directions.

It's this whole issue of upscaling that makes the Killzone solution so damn good. It still produces an image which looks like full 1920x1080 without any of the scaling artifacts you get with a lower resolution.

Those of us that actually give a shit about this stuff (and aren't simply console warriors) would understand this. It's all about eliminating upscale blur.

If we were all using displays without fixed pixel grids this whole argument wouldn't really matter.

The SNES produced a significantly lower resolution than the Genesis in most cases but it didn't matter on a CRT as it could handle both without an issue. Emulation of SNES is actually difficult for this reason as it used non-square pixels (256x224 vs 320x224 on Genesis).

pretty much this, it's an ingenious solution to a problem, and while it's far from perfect it's much better than the alternative (upscaling)
 
I have, and I like what I see, but let's be honest: Crytek uses a squillion effects like the heaviest motion blur ever and some sort of asymmetrical scaling to hide the 35% lower resolution. Same cheating and obfuscating as Killzone's MP, actually. Regardless of the resolution, the image looks good, but you notice that it's not native.

Yeah that's fine but it's not "a low res native image with bad IQ" which is what you said in your previous post :p
 

Ivan

Member
This is the kind of response I expect from technical masters like Guerrilla.

I always liked smart techniques of faking in game development, much more than brute-forcing something that the hardware can't produce (like Crysis 1, for example).
 

BigTnaples

Todd Howard's Secret GAF Account
Everyone once again misunderstands why this even matters.

It matters because we are stuck with fixed pixel displays. That's it.

1080p matters as it is the standard resolution across the vast majority of modern TVs. If the rendering resolution does not divide evenly into the display resolution, you will see blurring (or distorted pixels) from upscaling. If a proper 960x540 could actually be output and accepted by a 1080p TV, it could potentially appear much cleaner than 1280x720 due to the fact that you could double pixels evenly in both directions.

It's this whole issue of upscaling that makes the Killzone solution so damn good. It still produces an image which looks like full 1920x1080 without any of the scaling artifacts you get with a lower resolution.

Those of us that actually give a shit about this stuff (and aren't simply console warriors) would understand this. It's all about eliminating upscale blur.

If we were all using displays without fixed pixel grids this whole argument wouldn't really matter.

The SNES produced a significantly lower resolution than the Genesis in most cases but it didn't matter on a CRT as it could handle both without an issue. Emulation of SNES is actually difficult for this reason as it used non-square pixels (256x224 vs 320x224 on Genesis).


The 1920x800 solution will produce a much better image simply because it avoids upscaling artifacts. THAT'S why 1600x900 is inferior.



This is what I was trying to get across earlier. Thank you.
 

Stallion Free

Cock Encumbered
It's gotta be a bummer to have a 1080p60fps game that is actually 1920x1080 rendered 60 times a second and have that watered down to the degree it has been with the twisting of words.
 

Metfanant

Member
Yeah that's fine but it's not "a low res native image with bad IQ" which is what you said in your previous post :p

well let's be perfectly honest...anyone saying that a 1280x720 image upscaled by a 1080p TV (or the console itself) is a "blurry mess" or "smeared with vaseline" is completely overreacting anyway...

would a non-upscaled, native-res image look better? absolutely...but saying CoD Ghosts on the Xbone is a "blurry mess" just takes things too far...certainly not as sharp as the PS4 version, but far from a blurry mess
 

dark10x

Digital Foundry pixel pusher
pretty much this, it's an ingenious solution to a problem, and while it's far from perfect it's much better than the alternative (upscaling)
You want to blow some minds? Find a nice quality CRT monitor, load up a modern game at 1024x768 with some MSAA and you'll find image quality more pleasing than many an LCD rendering at a much higher resolution. Factor in the motion resolution and you've got some killer looking visuals.
 

kaching

"GAF's biggest wanker"
Reading the explanation, I can't help but wonder how that's actually less resource intensive than "just" rendering a full native 1080p frame, because the resources needed to interpolate everything from these "half" frames (as well as the full frame used for AA) don't exactly seem trivial. But I readily admit that's just a layman's ignorant perspective on the technique.
 
Wonder if they will show off their fully optimized engine at GDC

I know when SF shipped it was only 70% done

I also wonder if their presentation will show some concept art for their new IP
 

zzz79

Banned
Wonderful how you explained everything by yourself.
The framebuffer is the important thing in video games, as the game's native resolution will be upscaled by the hardware to get a compatible signal to the HDMI out of the console. In this case that's a 1080p signal based on an upscaled low-resolution image which is not 1:1 pixel mapped (for example CoD and BF4 with 1280x720 pixels on the Xbone). The result is a blurry, artifacted, foggy and soft image that simply looks bad.

If the framebuffer sends out a true native 1080p image, like with Killzone's SP, then the console's output is also 1080p, but the image isn't scaled internally to match that standard as it's 1:1 pixel mapped. 1600x900 or 1280x720 pixels need to be upscaled to fit your screen and for your display to detect a correct signal; that's why the Xbone outputs "1080p" but the image that is shown is simply a low res native image with bad IQ as it's not 1:1 pixel mapped.

Movies in a wider ratio in Full HD are native, as is The Order. The letterbox is part of the image! This is fact; deny it as you want, it only shows your ignorance and complete lack of knowledge about this topic.


So again, is every XBO game native 1080p or not?!

Not asking you about whether it is blurry, artifacted, foggy and so on ... only the resolution
 

Metfanant

Member
You want to blow some minds? Find a nice quality CRT monitor, load up a modern game at 1024x768 with some MSAA and you'll find image quality more pleasing than many an LCD rendering at a much higher resolution. Factor in the motion resolution and you've got some killer looking visuals.

the downfall of image quality is the obsession with how thin a display can get...

So again, is every XBO game native 1080p or not?!

Not asking you about whether it is blurry, artifacted, foggy and so on ... only the resolution

So again, absolutely 100% NOT native 1080p
 

chadskin

Member
Cool, then again the XBO outputs 1920x1080 progressive pixels for each game, so the TV can display it 1:1, which means it is the best quality because it's coupled with the display's own native resolution. Great!

So are all XBO games native 1080p or not?

If not, then neither is The Order native 1080p... and I don't mind if you agree or not anymore :)

And there is a difference between the terms "rendered" and "recorded"/"displayed", because as you see XBO games are rendered only at a lower res than 1080p but are displayed at 1080p (in your opinion this would even be native 1080p)

People have been trying to explain to you the difference between an internally rendered resolution and the resolution of the output you eventually see on the screen. What the hell, man?

When people talk about the resolution of a game, they mean the internally rendered one, NOT the one the console will output eventually, as all consoles, down to the PS3 and Xbox 360, output in 1080p.


If a game is internally rendered at 720p like MGS Ground Zeroes, this is the actual size of the image in the picture above. Most people don't have a 720p TV but a 1080p TV though, which leads to a problem because, as you can clearly see in the picture, the 1080p image is much bigger than the 720p image.

One option would be to display the 720p image on the 1080p TV in the actual size, meaning there'd be huge black bars all around the image. Not good.

So the technical wizards came up with a mathematical operation to upscale the 720p picture to 1080p for 1080p TVs.
Obviously, a 720p image has less pixel information (by roughly a million, actually) than a 1080p image which means the additional pixels must be "made up". That's done through interpolating the pixels based on their neighboring pixels.

Naturally, upscaling from 720p to 1080p will result in image differences, most notably in the clarity of the image. At a more detailed level, it also results in this:

[image: comparison screenshot]


BF4 XB1 @ 720p upscaled to 1080p, PS4 @ 900p upscaled to 1080p and PC @ native 1080p.
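The "making up" of pixels described above is, at its simplest, linear interpolation between neighbouring samples. A minimal 1D sketch (real scalers use more sophisticated filters, but the softening works the same way), upscaling a 4-pixel row by 1.5x, the same ratio as 720p to 1080p:

```python
# Minimal 1D linear-interpolation upscale: the simplest form of
# "interpolate the missing pixels from their neighbours".

def upscale_row(row, new_len):
    """Resample a row of pixel values to new_len samples via linear interpolation."""
    old_len = len(row)
    out = []
    for i in range(new_len):
        pos = i * (old_len - 1) / (new_len - 1)  # position in input coordinates
        lo = int(pos)
        hi = min(lo + 1, old_len - 1)
        frac = pos - lo
        out.append(row[lo] * (1 - frac) + row[hi] * frac)
    return out

# A hard edge (0 -> 100), upscaled by 1.5x like 720p -> 1080p.
row_720 = [0, 0, 100, 100]
print(upscale_row(row_720, 6))  # the edge smears into in-between values: the blur
```

With a 1.5x factor most output samples fall between two source pixels, so they are blends; that is the loss of clarity visible in the comparison above.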
 

DirtyLarry

Member
Absolute nonsense. Trying to redefine what 1080p means. It's hard enough for people like me to understand these things.

Edit: It makes sense to everyone else so it's me who is speaking nonsense.
Must be one hell of a way to live life. Unless you yourself are an academic, scientist, sociologist or mathematician (the list goes on and on), I would imagine most everything discussed by the scientific community meets your criteria of being "absolute nonsense."

BOT, they should have just shared this method straight away, as I think they might have been pleasantly surprised by the reaction to even coming up with this method to begin with. Sure, you would have had those declaring that it was not native and therefore rubbish, but you would have had others who simply appreciate the fact that they developed this at all. My very basic understanding tells me they developed a pretty complex method to overcome the limitations they were facing.
 

RoboPlato

I'd be in the dick
Really glad they wrote up a clear and detailed explanation of the technique and we finally have a name for it. I still don't like that they have called it, and seemingly will continue to call it, native 1080p, but hopefully anyone else using this method will refer to it as temporally reprojected.
 

dark10x

Digital Foundry pixel pusher
the downfall of image quality is the obsession with how thin a display can get...
Yup.

It's really a travesty. I walk into a store these days and am dismayed by the fact that virtually every display available would be a downgrade from what I'm using (which isn't even perfect).

PC LCD monitors are the worst, though. I have yet to find a single PC monitor that can do any sort of local dimming or whatever is necessary to achieve decent black levels. The best PC monitors still look like dogshit when you turn off the lights. That alone makes it impossible for me to enjoy games on a monitor.
 