
Guerrilla Games: Regarding Killzone Shadow Fall and 1080p

TyrantII

Member
Well, let's be perfectly honest...anyone saying that a 1280x720 image upscaled by a 1080p TV (or the console itself) is a "blurry mess" or "smeared with vaseline" is completely overreacting anyway...

Would a non-upscaled, native-res image look better? Absolutely...but saying CoD Ghosts on the Xbone is a "blurry mess" just takes things too far...certainly not as sharp as the PS4 version, but far from a blurry mess.

Compared to last gen it's a godsend!

That said, it is very noticeable in games with long draw distances, like open-world games. The reduced ability to resolve distant detail really shows the problems with vanilla upscaling.

AC4 is a great example. BF even more so.
 

quest

Not Banned from OT
Everyone once again misunderstands why this even matters.

It matters because we are stuck with fixed pixel displays. That's it.

1080p matters because it is the standard resolution across the vast majority of modern TVs. If the display resolution is not evenly divisible by the render resolution, you will see blurring (or distorted pixels) from upscaling. If a proper 960x540 image could actually be output and accepted by a 1080p TV, it could potentially appear much cleaner than 1280x720, because you could double pixels evenly in both directions.

It's this whole issue of upscaling that makes the Killzone solution so damn good. It still produces an image which looks like full 1920x1080 without any of the scaling artifacts you get with a lower resolution.

Those of us that actually give a shit about this stuff (and aren't simply console warriors) would understand this. It's all about eliminating upscale blur.

If we were all using displays without fixed pixel grids this whole argument wouldn't really matter.

The SNES produced a significantly lower resolution than the Genesis in most cases but it didn't matter on a CRT as it could handle both without an issue. Emulation of SNES is actually difficult for this reason as it used non-square pixels (256x224 vs 320x224 on Genesis).


The 1920x800 solution will produce a much better image simply because it avoids upscaling artifacts. THAT'S why 1600x900 is inferior.

I am hardly a console warrior; hell, I bought a PS4 day one and have no plans to buy an Xbone. And I will disagree with you, because I care more about motion blurring than about a softer image. It is about accepting which trade-off you would rather have. A slight upscale is not an issue on my plasma. Introducing something that causes blur in motion, where the faster the motion the worse the blur and artifacts, is much worse IMO. I can't even tolerate watching fast-moving stuff like sports on LCD TVs.
 

Gestault

Member
Everyone once again misunderstands why this even matters.

It matters because we are stuck with fixed pixel displays. That's it.

1080p matters because it is the standard resolution across the vast majority of modern TVs. If the display resolution is not evenly divisible by the render resolution, you will see blurring (or distorted pixels) from upscaling. If a proper 960x540 image could actually be output and accepted by a 1080p TV, it could potentially appear much cleaner than 1280x720, because you could double pixels evenly in both directions.

It's this whole issue of upscaling that makes the Killzone solution so damn good. It still produces an image which looks like full 1920x1080 without any of the scaling artifacts you get with a lower resolution.

Those of us that actually give a shit about this stuff (and aren't simply console warriors) would understand this. It's all about eliminating upscale blur.

If we were all using displays without fixed pixel grids this whole argument wouldn't really matter.

The SNES produced a significantly lower resolution than the Genesis in most cases but it didn't matter on a CRT as it could handle both without an issue. Emulation of SNES is actually difficult for this reason as it used non-square pixels (256x224 vs 320x224 on Genesis).


The 1920x800 solution will produce a much better image simply because it avoids upscaling artifacts. THAT'S why 1600x900 is inferior.

For the sake of the technology conversation, wouldn't the observable blur we see in this example demonstrate why this solution isn't necessarily desirable? Here we have mild ghosting, aliasing, and interpolation artifacts in motion, on top of the apparent blur that has been complained about since day one, when a properly handled 900p signal avoids all of those. I don't mean to oversimplify this, and I'm not saying you're necessarily wrong, but it seems like you're picking and choosing your data based on a conclusion you already reached.

Yes, I completely understand why one-to-one scaling makes the upscaling process clearer, but you're also talking about it in a situation where that doesn't apply because there is no scaling, and we had very early complaints of blurriness. You're acting like this process doesn't introduce its own problems. I just want to make the point that you're making a simplistic conclusion based on one metric at the expense of all others, including the overall impression of the scene.
 

Jack cw

Member
Yeah that's fine but it's not "a low res native image with bad IQ" which is what you said in your previous post :p

Haha, OK, it was a generalization, I admit that, but the image quality is also dependent on the display you have. The better the tech in your screen, the better the IQ is with non-native material.
 

dark10x

Digital Foundry pixel pusher
I am hardly a console warrior; hell, I bought a PS4 day one and have no plans to buy an Xbone. And I will disagree with you, because I care more about motion blurring than about a softer image. It is about accepting which trade-off you would rather have. A slight upscale is not an issue on my plasma. Introducing something that causes blur in motion, where the faster the motion the worse the blur and artifacts, is much worse IMO. I can't even tolerate watching fast-moving stuff like sports on LCD TVs.
As good as plasmas are, they are still fixed pixel displays and the problems are the same.

Most of the blur you see in Killzone is genuine motion blur, not artifacts from this technique.

Here we have mild ghosting, aliasing, and interpolation artifacts in motion, on top of the apparent blur that has been complained about since day one, when a properly handled 900p signal avoids all of those. I don't mean to oversimplify this, and I'm not saying you're necessarily wrong, but it seems like you're picking and choosing your data based on a conclusion you already reached.
The conclusion is dead simple: fixed pixel displays require input that divides evenly into the pixel grid in order to achieve acceptable image quality.

What Killzone does is not a perfect solution but it is vastly superior to something like 900p.

If we were using CRTs or displays without fixed pixels, however, I would definitely take 900p over the method used in Killzone. I only find the Killzone method desirable as it eliminates upscale blur.
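A rough way to see the divisibility point, just as a back-of-the-envelope sketch (the resolutions below are simply the ones being argued about in this thread, nothing more):

```python
# Which render resolutions map cleanly onto a 1920x1080 panel?
# An integer scale factor lets the scaler just repeat pixels; a fractional
# one forces interpolation, which is where upscale blur comes from.
TARGET_W, TARGET_H = 1920, 1080

for w, h in [(1920, 1080), (1600, 900), (1280, 720), (960, 540), (960, 1080)]:
    sx, sy = TARGET_W / w, TARGET_H / h
    clean = sx.is_integer() and sy.is_integer()
    print(f"{w}x{h}: scale {sx:g} x {sy:g} -> "
          f"{'clean pixel doubling' if clean else 'interpolated (blurs)'}")
```

1600x900 and 1280x720 land on fractional scale factors, which is exactly where the interpolation blur being argued about comes from; 960x540 and 960x1080 divide into the 1080p grid cleanly.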
 

zzz79

Banned
when you play The Order, the PS4 will be sending an unmolested 1920x1080 frame to your television....therefore 1080p...

And where is the word "native"??? Lost?

And what about XBO games? The XBO will also send a 1920x1080 frame to the TV ... therefore 1080p.

Simple as that ... we only forgot that magic word NATIVE ;)
 
Ok, let's see:

A: 1920x800 game pixles + 1920x280 black pixles (no game info) IS native 1080p

B: 1600x900 game pixles + 320x180 artificial pixles (game info) IS NOT native 1080p

Right?

Neither is RENDERED at native 1080p! That's my point ... and I know A would be sharper than B, but still, neither is native 1080p.

BTW, I don't have an XBO or a PS4, so chill out and watch your judgement ;)

I'm not chatting graphics with a guy who can't even spell the word pixel.
 
Reading the explanation, I can't help but wonder how that's actually less resource intensive than "just" rendering a full native 1080p frame, because the resources needed to interpolate everything from these "half" frames (as well as the full frame used for AA) don't exactly seem trivial. But I readily admit that's just a layman's ignorant perspective on the technique.
That is why the resolution/framerate videos from Digital Foundry undersell the graphics: they boil everything down to two figures and people just go off of those. KZ:SF uses so many effects in shadows, particles, lighting, etc. that to just sweep all of those under "nyah nyah, not 1080p" is depressing.
 

Figboy79

Aftershock LA
I had to reread it to fully understand it, but it sounds like a fantastic technique to use over simply upscaling a 720p image to 1080p, so the image looks clearer. I actually haven't even played Shadowfall's MP. It amazes me the work that developers put into these games to entertain us in the best way they can.
 

JLeack

Banned
Absolute nonsense. Trying to redefine what 1080p means. It's hard enough for people like me to understand these things.

Edit: It makes sense to everyone else, so it's me who is speaking nonsense.

Makes sense, but basically they're dancing around the definition of 1080p. They want to call it 1080p, while I want to call it 1080p with an asterisk, or pseudo-1080p.
 

chadskin

Member
A game rendered at 1920x800 is not native 1080p and that's the point, because it contains only 1920x800 real rendered pixels which are calculated by the CPU/GPU.
The original game image size is less than 1920x1080, therefore it's not native 1080p.

With the game rendered at 1920x800 you can:
- either add black bars to get to 1920x1080, but it's not natively rendered even though it fills the 1920x1080 progressive pixels in the console's output signal,
- or scale the game image up to 1920x1080 by adding artificial pixels to hit a 1080p output signal.

With both methods the game is not rendered at native 1080p, so you can't call it native 1080p just because the console's output signal is 1080p.

Or, put another way: if you want to call the game rendered at 1920x800 with black bars native 1080p, then the game rendered at 1920x800 and upscaled to 1920x1080 should also be called native 1080p ... ;)

Obviously, a game rendering at 1920 x 800 or 1920 x 1080 with forced black bars is less demanding for the hardware. No one is debating that. Otherwise RAD wouldn't likely be able to tinker with 4x MSAA. That's not the point, though. It's still true 1:1 pixel mapping, thus native.

Put 1920 x 800 and 1920 x 280 into your calculator and add the two products.
Put 1920 x 1080 in your calculator.

Compare the result. 1:1 pixel mapping either way -> native.
 

quest

Not Banned from OT
As good as plasmas are, they are still fixed pixel displays and the problems are the same.

Most of the blur you see in Killzone is genuine motion blur, not artifacts from this technique.

The single player does not have the motion blurring, so we know the cause of it, and it is this fancy interlacing. It is about which trade-off you want, like I said before. It is like TVs: plasma and LCD both have trade-offs. We just disagree about which trade-off is best.
 

zzz79

Banned
the downfall of image quality is the obsession with how thin a display can get...



So again, absolutely 100% NOT native 1080p

Thanks, finally :)

So neither KZ:MP nor The Order is rendered at native 1080p ... so how come some PS4 fans call them native 1080p?

In both cases the games are not rendered at 1920x1080 but at less ... so does it depend on the fill-up method used to reach 1920x1080 whether it's called native 1080p or not?

Filling up with black bars makes a game count as native 1080p?
Filling up with interlaced pixel lines makes a game count as native 1080p?
Filling up by scaling the image (adding artificial pixels) makes a game not count as native 1080p?

So it depends only on the fill method whether a game is called native 1080p or not?
 
It's essentially a fancy version of 1080i. I quite like it, but it's noticeably blurrier than 1080p.

Most AV geeks agree that 720p is better than 1080i, right? In this case I'd say 1080i is better, but it's still considerably worse than the SP or any other 1080p game.
 

zzz79

Banned
Obviously, a game rendering at 1920 x 800 or 1920 x 1080 with forced black bars is less demanding for the hardware. No one is debating that. Otherwise RAD wouldn't likely be able to tinker with 4x MSAA. That's not the point, though. It's still true 1:1 pixel mapping, thus native.

Put 1920 x 800 and 1920 x 280 into your calculator and add the two products.
Put 1920 x 1080 in your calculator.

Compare the result. 1:1 pixel mapping either way -> native.

So what about the interlaced method, or even the scaling method of adding artificial pixels?

Using the calculator everything ends up at 1920x1080 ... but that's still not native 1080p game rendering :p
 

daveo42

Banned
Do you even need to render each frame at native 1080p if your output is native 1080p? By my logic, they render 60 half-frames, but the historical and current frames are all native and are meshed together into a full native frame.

Does using historical frames really mean that what is output isn't native 1080p? I mean, the other option would have been running MP at 30fps with each frame completely rendered.

I think what they've done here is quite fantastic and brilliant.
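Conceptually, that "meshing" step can be pictured as interleaving two 960x1080 column sets into one 1920x1080 frame. The toy sketch below only illustrates the bookkeeping; Guerrilla's actual pipeline reprojects the previous frame's pixels along motion vectors (and falls back to other data where the history is invalid) rather than copying old columns straight across:

```python
import numpy as np

# Toy sketch of "mesh two half-frames into one full frame" (not Guerrilla's code).
# The real technique warps the previous frame along per-pixel motion vectors
# before blending; here the old columns are reused as-is, purely to show how two
# 960x1080 column sets add up to one native 1920x1080 grid with no upscaling pass.

H, W = 1080, 1920

current_half  = np.random.rand(H, W // 2, 3)   # columns shaded this frame (say, the even ones)
previous_half = np.random.rand(H, W // 2, 3)   # columns carried over from the previous frame

full_frame = np.empty((H, W, 3))
full_frame[:, 0::2] = current_half             # even columns: freshly rendered
full_frame[:, 1::2] = previous_half            # odd columns: history (reprojected in the real thing)

assert full_frame.shape == (1080, 1920, 3)     # every output pixel exists natively; nothing is scaled
```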
 

Jack cw

Member
So again, is every XBO game native 1080p or not ?!

Not asking you about whether it is blurry, artifacted, foggy and so on ... only the resolution

What is your point, mate?
You have no idea what you are talking about, and every time a fellow gaffer corrects your statements, you just twist and turn and jump to another hilarious claim, denying and ignoring any technical facts. Your question has been answered half a dozen times now. Any game has its own internal resolution and that is what matters. A 720p or 900p image is a 720p or 900p image regardless of the console's HDMI output. A native 1080p internal resolution is a 1080p image 1:1 pixel-mapped for 1080p displays.
 

Metfanant

Member
And where is the word "native"??? Lost?

And what about XBO games? The XBO will also send a 1920x1080 frame to the TV ... therefore 1080p.

Simple as that ... we only forgot that magic word NATIVE ;)

NO....the Xbone games you're referring to render at 1280x720 or 1600x900...and are then UPSCALED to 1920x1080....hence not native...

Thanks, finally :)

So neither KZ:MP nor The Order is rendered at native 1080p ... so how come some PS4 fans call them native 1080p?

The Order is native 1080p in a 2.40:1 aspect ratio....Killzone is a different ballgame...not sure I'm comfortable calling it native 1080p...it's more akin to a 1080i image to me...still natively 1920x1080 pixels though...

In both cases the games are not rendered at 1920x1080 but at less ... so does it depend on the fill-up method used to reach 1920x1080 whether it's called native 1080p or not?

Filling up with black bars makes a game count as native 1080p?
Filling up with interlaced pixel lines makes a game count as native 1080p?
Filling up by scaling the image (adding artificial pixels) makes a game not count as native 1080p?

So it depends only on the fill method whether a game is called native 1080p or not?

The Order is 1920x1080...the top and bottom 140 pixels just happen to be black....

Most AV geeks agree that 720p is better than 1080i, right? In this case I'd say 1080i is better, but it's still considerably worse than the SP or any other 1080p game.

720p is going to give better results with fast moving images (sports)...this is why ESPN chooses to broadcast in 720p over 1080i...but 1080i still offers the pixel count and clarity of a 1920x1080 image...
 

wicko

Member
The single player does not have the motion blurring, so we know the cause of it, and it is this fancy interlacing. It is about which trade-off you want, like I said before. It is like TVs: plasma and LCD both have trade-offs. We just disagree about which trade-off is best.

You're still just guessing whether the blurring is caused by motion blur or temporal reprojection. The only way you could be 100% certain is if you had a build of KZ: SF where you could toggle on/off these effects. Motion blur may be intentionally turned off in SP for performance reasons. SP may be using effects that aren't in MP. We already know the SP is rendered differently, yet your reasoning implies they are the same.
 
Nice of them to give us a statement, but in the end the multiplayer looks soft. And on top of that they are using an FXAA-like AA solution that introduces even more blur.

In the future, please give us real native 1080p output.
 

MrZekToR

Banned
Hmm, have you played Ryse on a big screen? (genuine question)

I have. (on a 47" LG 1080P TV)

It looks absolutely fantastic. Even at 720P I reckon it would have looked fantastic too.

Personally, I think all of this 'resolution' fiasco is a total waste of time.

Objects rendered close to the 'camera' (the player's perspective) will look nigh on identical at 720, 900, and 1080p.

I think the only benefit you will see from higher resolution is in the finer details 'in the distance' (away from the camera / player's perspective).

...and those 'in the distance' details (for most games) probably account for less than 10% importance in the overall image.

For instance, trying to locate a sniper / spot an enemy in the distance.

The benefit of 1080P over 720P in this respect - is actually fairly minimal.

But some would have you believe the difference is like night and day. Which is absolutely not the case.

People will state that the graphics on the PS4 are 'significantly' better than the Xbox One's because of 1080P. That's utter BS.

The PS4's graphics are 'significantly' better than those of, say, a machine from the N64 era.

Hell, the PS4's graphics are not even 'significantly' better than what we've seen on Xbox360 and PS3 of recent times.

They are better, of course, but not 'significantly' better.

The differences graphically between the Xbox One and the PS4 are negligible at best.

If people are worried about the crispness of a less than 10% distance detail rendered on screen - go buy a PS4.

I've gone the Xbox One route because it does more of the things I like to enjoy. I like Kinect - it integrates nicely into my AV setup and is very convenient to use.

I'm not worried by frankly 'trivial' details like whether that tree over yonder is rendered with slightly more pixels on one compared to the other. Because it is just that - trivial.

I'm more interested in great games. As long as they look good enough, and play well - then I am more than happy.

Obviously, I've replied to a question you were directing to someone else... just felt like responding to your post.
 

Lonely1

Unconfirmed Member
Everyone once again misunderstands why this even matters.

It matters because we are stuck with fixed pixel displays. That's it.

1080p matters because it is the standard resolution across the vast majority of modern TVs. If the display resolution is not evenly divisible by the render resolution, you will see blurring (or distorted pixels) from upscaling. If a proper 960x540 image could actually be output and accepted by a 1080p TV, it could potentially appear much cleaner than 1280x720, because you could double pixels evenly in both directions.

It's this whole issue of upscaling that makes the Killzone solution so damn good. It still produces an image which looks like full 1920x1080 without any of the scaling artifacts you get with a lower resolution.

Those of us that actually give a shit about this stuff (and aren't simply console warriors) would understand this. It's all about eliminating upscale blur.

If we were all using displays without fixed pixel grids this whole argument wouldn't really matter.

The SNES produced a significantly lower resolution than the Genesis in most cases but it didn't matter on a CRT as it could handle both without an issue. Emulation of SNES is actually difficult for this reason as it used non-square pixels (256x224 vs 320x224 on Genesis).


The 1920x800 solution will produce a much better image simply because it avoids upscaling artifacts. THAT'S why 1600x900 is inferior.

Most of my acquaintances couldn't understand why I spend so much effort on repairing my 44 inch Samsung CRT TV instead of getting a "new, better TV". If only it was 1080p instead of 1080i/720p...
 

chadskin

Member
So what about the interlaced method, or even the scaling method of adding artificial pixels?

Using the calculator everything ends up at 1920x1080 ... but that's still not native 1080p game rendering :p

Oh, sorry, it seems you didn't learn to multiply numbers in school just yet.
Let me help you:

1920 x 800 = 1,536,000 pixels
+
1920 x 280 = 537,600 pixels
== 2,073,600 pixels

1920 x 1080 = 2,073,600 pixels

Both numbers are the same, you see. Which brings me back to my statement approximately two hours ago: No matter how you slice it, it's still 1080p. It's still native. It's still being displayed in a true 1:1 pixel mapping manner on a 1080p TV. This applies to games as well as to movies.

As for your other questions: You don't even seem to understand what people are talking about here in the first place, the very basic principles of what native and upscaled means. I don't see what good it'll do if I spend my time explaining what interlacing is to someone as ignorant as you are.
 

zzz79

Banned
Any game has its own internal resolution and that is what matters. A 720p or 900p image is a 720p or 900p image regardless of the console's HDMI output. A native 1080p internal resolution is a 1080p image 1:1 pixel-mapped for 1080p displays.

I know :)

rendered at 1920x1080 = native 1080p

rendered at 1600x900 is not native 1080p, even if the output signal is 1080p
rendered at 1280x720 is not native 1080p, even if the output signal is 1080p
rendered at 1920x800 is not native 1080p, even if the output signal is 1080p
rendered at 960x1080 is not native 1080p, even if the output signal is 1080p

End of story ... and my point is that some PS4 owners (as you do) don't agree with that, and that makes the whole story ridiculous.

--update--
or if you like it the other way round, then just call everything native 1080p :p
 

TyrantII

Member
It's essentially a fancy version of 1080i. I quite like it, but it's noticeably blurrier than 1080p.

Most AV geeks agree that 720p is better than 1080i, right? In this case I'd say 1080i is better, but it's still considerably worse than the SP or any other 1080p game.

No, no. It's very different from 1080i.

That's part of the problem in explaining it. It's a new method that's easy to misconstrue.

Someone above said it best: it's always 1080p, just without 100% of the accuracy it would have if it were rendered straight. More than anything, it's almost like a sort of lossy encoding that results in IQ better than 960p.
 

BigTnaples

Todd Howard's Secret GAF Account
I know :)

rendered at 1920x1080 = native 1080p

rendered at 1600x900 is not native 1080p, even if the output signal is 1080p
rendered at 1280x720 is not native 1080p, even if the output signal is 1080p
rendered at 1920x800 is not native 1080p, even if the output signal is 1080p
rendered at 960x1080 is not native 1080p, even if the output signal is 1080p

End of story ... and my point is that some PS4 owners (as you do) don't agree with that, and that makes the whole story ridiculous.

--update--
or if you like it the other way round, then just call everything native 1080p :p



Just stop man. Please.
 

Jack cw

Member
As for your other questions: You don't even seem to understand what people are talking about here in the first place, the very basic principles of what native and upscaled means. I don't see what good it'll do if I spend my time explaining what interlacing is to someone as ignorant as you are.


I'm out of here. This is getting to the point where this guy denies facts. I guess it's all said and done, and it's sad that a topic like this had to derail this hard because of childish trolling.
 

RoboPlato

I'd be in the dick
Rereading it, I still don't know why they went with a technique like this that seems so complex. The results look surprisingly good at 60fps, but the game hits that far too rarely to make the drop in image quality worth it. If this technique can improve to the point where there is a bit less blur and it can guarantee 60fps all the time, then it could be very worthwhile for avoiding upscaling artifacts in situations where 1920x1080 isn't achievable.
 

BigTnaples

Todd Howard's Secret GAF Account
I'm pretty sure people have expressed disappointment at the difference in image clarity between SP/MP since the game released.


Did we notice the difference? Yes. But KZSF multi still looks graphically superior to every other multiplayer game and most single player games on either platform at the moment.



While not quite as good as the clarity of SP, it is still better IQ than BF4, Ryse, CoD XBO, or any other sub native 1080p game.

The technique isn't perfect, but it looks damned good, to the point where most people, even absolute graphics whores, were tricked into thinking it was 1080p with FXAA. That's impressive.
 

Metfanant

Member
Rereading it, I still don't know why they went with a technique like this that seems so complex. The results look surprisingly good at 60fps, but the game hits that far too rarely to make the drop in image quality worth it. If this technique can improve to the point where there is a bit less blur and it can guarantee 60fps all the time, then it could be very worthwhile for avoiding upscaling artifacts in situations where 1920x1080 isn't achievable.

As others have said in this and other threads...I think there was a very late decision made to push for 60fps in multiplayer...and this was an attempt at getting there...

Personally I never had a problem with KZ2/3 MP on the PS3 being 30fps....so IMO the full 1920x1080 image @ a locked 30fps would have been the way to go in both modes.....

But I'm the outlier on these forums...30fps doesn't make my eyes bleed, and 720p doesn't look like vaseline...
 
PSA: Please study the following image (it has clearly been upscaled; whether bilinearly or bicubically I leave to you experts) if you like to throw around words like "native 1080p" without actually understanding, for example, anything that was posted in the OP.

[attached image: upscaled screenshot]
 

140.85

Cognitive Dissonance, Distilled
I know :)

rendered at 1920x1080 = native 1080p

rendered at 1600x900 is not native 1080p, even if the output signal is 1080p
rendered at 1280x720 is not native 1080p, even if the output signal is 1080p
rendered at 1920x800 is not native 1080p, even if the output signal is 1080p
rendered at 960x1080 is not native 1080p, even if the output signal is 1080p

End of story ... and my point is that some PS4 owners (as you do) don't agree with that, and that makes the whole story ridiculous.

--update--
or if you like it the other way round, then just call everything native 1080p :p

But again we have a moving target as far as definitions go. What exactly does rendering mean? Is that an objective term that everyone agrees on? I'd say it's not. When I think of native 1080p output what I'm looking for is if the final rendered image sent to the display is mapped 1:1 on the pixels of my 1080p TV.

It's not the developers' fault that a lot of these standards are not broken down and completely separated from the actual rendering side of things. Clearly, rendering final output at a lower resolution and upscaling is not native. Constructing 1920 lines using the current and past frames' information and then sending it to the display counts as native according to at least one objective definition.

If gamers are serious about knowing how a game is performing then I guess it's time to demand a RENDERED and OUTPUT breakdown on every box to clear up confusion. Until then, don't accuse people of lying when they meet objective standards.
 
As good as plasmas are, they are still fixed pixel displays and the problems are the same.

Most of the blur you see in Killzone is genuine motion blur, not artifacts from this technique.


The conclusion is dead simple: fixed pixel displays require input that divides evenly into the pixel grid in order to achieve acceptable image quality.

What Killzone does is not a perfect solution but it is vastly superior to something like 900p.

If we were using CRTs or displays without fixed pixels, however, I would definitely take 900p over the method used in Killzone. I only find the Killzone method desirable as it eliminates upscale blur.

960 x 1080 is a lot fewer pixels than 1600 x 900; heck, it's even fewer than 1408 x 792. Should developers start using that resolution going forward, as opposed to using 900p as was the case with BF4? I guess I would have to compare BF4 using temporal 960 x 1080 vs 1600 x 900 before concluding which is better.
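For reference, the raw per-frame shading counts behind that comparison, simple arithmetic only (it ignores the extra GPU cost of the reprojection and blend passes, which is why the numbers alone don't settle the question):

```python
# Pixels shaded per frame under each approach (raw counts; the temporal
# technique spends additional GPU time reprojecting and blending old pixels,
# which this arithmetic does not capture).
NATIVE = 1920 * 1080
for label, (w, h) in [
    ("1920x1080 full native",                  (1920, 1080)),
    ("1600x900 upscaled (BF4-style)",          (1600, 900)),
    ("1408x792 upscaled",                      (1408, 792)),
    ("960x1080 temporal half-frame (KZ:SF MP)", (960, 1080)),
]:
    print(f"{label}: {w * h:,} pixels ({w * h / NATIVE:.0%} of native)")
```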
 
I don't care too much about resolution, so I wish devs wouldn't even try to defend their resolution choices (framerate is a little different, there I like to at least hear why they chose fixed 30 over whatever).
Anyway, if a defense looks like this short, simple, non-patronizing, and even educational explanation from Guerrilla Games, then I'm all for resolution debates.
 