
Killzone: Shadow Fall Multiplayer Runs at 960x1080 vertically interlaced

I spent time running around in multi yesterday and it really is convincing. It looks like proper 1080p. There is something slightly strange about it in motion but it looks like a post processing effect more than anything. It's worlds beyond even 900p due to lack of upscaling blur.

This is a good point. There is no way the DF guys just figured this out from looking at it. We have some of the best eyes on the internet at peeping pixels and figuring out tech shit here on GAF, and for months not one of them was able to figure this out. So GG clearly shared info with DF (99.9% certain) but they only brought it to light now. Why?

So are we back to resolution doesn't matter now?

All in all, it's amazing tech and I'm sure that with more time 60fps would have been attainable. More devs should be looking at similar solutions.

Don't like the dishonesty coming from GG and Sony though.
 

RoboPlato

I'd be in the dick
Now it's interlaced, too? Funny thing, people made such a fuss about the Xbox One and its 720p games while clamoring that it was easy to see the difference between 720p and 1080p.

And we saw the difference here too, despite being told otherwise. This proves that people can tell the difference more than it proves we can't.
 

m@cross

Member
I guess this would bother me more if KZ were designed as a multiplayer game. Sure, they do promote the multiplayer, but really it is a single-player-first type of thing for me.
 

bombshell

Member
Something bothers me about the new title. The game runs at 1920x1080 interlaced, if anything (not to mention that interlacing usually alternates the horizontal lines, not the vertical lines as it does here).
 

RoboPlato

I'd be in the dick
Something bothers me about the new title. The game runs at 1920x1080 interlaced, if anything (not to mention that interlacing usually alternates the horizontal lines, not the vertical lines as it does here).

I'm not sure who changed it in the first place or what to change it to. Kind of hard to sum this up in a thread title.
 

dark10x

Digital Foundry pixel pusher
So are we back to resolution doesn't matter now?
The more important question is, do you actually understand why resolution mattered in the first place?

The answer is simple; upscaling is ugly.

If we were using displays without fixed resolutions, 1280x720 would actually still appear perfectly clean and attractive. You would lose some detail but the image would be totally acceptable. This resolution does not divide perfectly into 1920x1080, however, and the resulting pixels are distorted which requires a type of bilinear filtering to smooth out the uneven pixels. The end result is ugly.

This is why 720p, 900p, and anything else in between is considered less than optimal. If all TVs were 1440p then 1920x1080 would actually appear rather unsightly as well.

1080p isn't a magic number, it's simply a number that matches the native resolution of most TVs.

The method used in Killzone manages to avoid upscaling blur completely. While there is some loss in detail, the end result is still sharp and relatively clean.

The whole argument in favor of 1080p should revolve around the elimination of upscaling on fixed pixel displays.

Funny thing, people made such a fuss about the Xbox One and its 720p games while clamoring that it was easy to see the difference between 720p and 1080p.
It's very easy to see the difference between native 720p and 1080p. This situation is completely different and sidesteps the problems with using 720p.
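
To make the divisibility point above concrete, here's a minimal sketch (plain Python, nothing to do with any engine code) that checks whether a render resolution maps onto a panel with an integer scale factor. The 1.5x step from 720p to a 1080p panel is what forces the bilinear filtering described above, while 720p onto a 1440p panel is a clean 2x pixel-double:

```python
# Minimal sketch: integer vs. non-integer upscales onto a fixed-pixel panel.
# Illustrative only; the resolutions are the ones discussed in this thread.

def scale_factors(src, panel):
    """Per-axis scale factors from a render resolution to a panel resolution."""
    return panel[0] / src[0], panel[1] / src[1]

panel_1080 = (1920, 1080)
for name, src in [("720p", (1280, 720)), ("900p", (1600, 900)), ("1080p", (1920, 1080))]:
    sx, sy = scale_factors(src, panel_1080)
    clean = sx.is_integer() and sy.is_integer()
    print(f"{name} on a 1080p panel: {sx:.2f}x -> {'sharp integer scale' if clean else 'needs bilinear filtering'}")

# 720p onto a 1440p panel is an exact 2x doubling: every source pixel becomes
# a crisp 2x2 block, so you lose detail but gain no scaling blur.
sx, sy = scale_factors((1280, 720), (2560, 1440))
print(f"720p on a 1440p panel: {sx:.2f}x, integer={sx.is_integer()}")
```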
 

RoboPlato

I'd be in the dick
The more important question is, do you actually understand why resolution mattered in the first place?

The answer is simple; upscaling is ugly.

If we were using displays without fixed resolutions, 1280x720 would actually still appear perfectly clean and attractive. You would lose some detail but the image would be totally acceptable. This resolution does not divide perfectly into 1920x1080, however, and the resulting pixels are distorted which requires a type of bilinear filtering to smooth out the uneven pixels. The end result is ugly.

This is why 720p, 900p, and anything else in between is considered less than optimal. If all TVs were 1440p then 1920x1080 would actually appear rather unsightly as well.

1080p isn't a magic number, it's simply a number that matches the native resolution of most TVs.

The method used in Killzone manages to avoid upscaling blur completely. While there is some loss in detail, the end result is still sharp and relatively clean.

The whole argument in favor of 1080p should revolve around the elimination of upscaling on fixed pixel displays.

We really should make this point more often. A lot of people probably don't understand why 1080p looks so much better on their sets versus other resolutions. Too bad it doesn't seem like we're going to be getting any non-fixed-resolution displays anytime in the near future.
 

dark10x

Digital Foundry pixel pusher
We really should make this point more often. A lot of people probably don't understand why 1080p looks so much better on their sets versus other resolutions. Too bad it doesn't seem like we're going to be getting any non-fixed-resolution displays anytime in the near future.
Yeah, it's kind of impossible if we wish to maintain the thin profile of modern displays.

CRTs worked out that way due to the way the picture was drawn.

It's just a shame that 720p to 1080p was only a 50% increase in resolution. 720p to 1440p provides a clean upscale with square pixels.
 

RoboPlato

I'd be in the dick
Yeah, it's kind of impossible if we wish to maintain the thin profile of modern displays.

CRTs worked out that way due to the way the picture was drawn.

It's just a shame that 720p to 1080p was only a 50% increase in resolution. 720p to 1440p provides a clean upscale with square pixels.

The state of the display industry makes me so sad. No focus on PQ anymore and motion resolution is disgusting these days.
 

RoadHazard

Gold Member
Wait, why is the thread title now saying it's "960x1080 interlaced"? That implies each frame is just 960x540 (1080i is 1920x540, with alternating horizontal lines rendered). But that's not true, is it?

If we're gonna call this "interlaced" (which, again, usually implies vertical interlacing), we should be saying it's "1920x1080 horizontally interlaced" or something. What's in the title now is quite misleading.

EDIT: What "vertical" vs "horizontal" interlacing means depends on how you look at it, I guess. I think of it like you're talking about the direction in which the rendered lines are alternating. So that would make vertical interlacing what you usually mean, and horizontal interlacing what's happening here. But maybe I've got the terminology wrong there.
 

bombshell

Member
Wait, why is the thread title now saying it's "960x1080 interlaced"? That implies each frame is just 960x540 (1080i is 1920x540, with alternating horizontal lines rendered). But that's not true, is it?

If we're gonna call this "interlaced" (which, again, usually implies vertical interlacing), we should be saying it's "1920x1080 horizontally interlaced" or something. What's in the title now is quite misleading.

I made this observation a few posts up, so I agree :)

But the normal kind is horizontally interlaced; here it uses an unusual vertical interlacing.
 

RoadHazard

Gold Member
I made this observation a few posts up, so I agree :)

But the normal kind is horizontally interlaced; here it uses an unusual vertical interlacing.

I updated my post to include an edit about "vertical" vs "horizontal". Maybe I got it wrong; it just made more sense in my brain to call the normal kind of interlacing "vertical", since that's the direction in which the lines are alternating. In the horizontal direction you've just got normally rendered lines (the ones that are there). But if you think of it as rendering half as many horizontal lines per frame, then the opposite makes sense. Oh well.
 

bombshell

Member
I updated my post to include an edit about "vertical" vs "horizontal". Maybe I got it wrong; it just made more sense in my brain to call the normal kind of interlacing "vertical", since that's the direction in which the lines are alternating. In the horizontal direction you've just got normally rendered lines (the ones that are there). But if you think of it as rendering half as many horizontal lines per frame, then the opposite makes sense. Oh well.

Yeah, it can be looked at both ways, but the definition is that you are (normally) interlacing the horizontal lines of the display, whereas here the vertical lines are interlaced.
 

ElTorro

I wanted to dominate the living room. Then I took an ESRAM in the knee.
I agree with you but I still don't like that they called it 1080p native multiple times since that directly implies 1920x1080.

Yes, no question here.

DF not reporting it is very odd though.

Indeed, especially since using information from previous frames has definitely been a topic that they covered:

http://www.eurogamer.net/articles/digitalfoundry-the-making-of-killzone-shadow-fall

DigitalFoundry said:
Adding to the quality is Guerrilla's chosen anti-aliasing solution. Back in the February reveal, FXAA was utilised with the firm hinting at a more refined TMAA solution under development in Sony's Advanced Technology Group. TMAA is now TSSAA - temporal super-sampling anti-aliasing.

"It is still very similar to FXAA. It's not FXAA, it's a similar filter but much higher quality," explains Michal Valient.

"It's still screen-space - TSSAA - we've lost the sub-sampling of depth. We had that sub-sampling as well, reconstructing edges with that but it was costly and didn't really add that much so we removed that," adds van der Leeuw.

"We collect samples from the previous frames and try to blend that into the mix on a basis that is a derivative of FXAA. It's something that blends in a little bit of temporal math, trying to get something from previous frames."

"We're very careful in what we use from the previous frames. If it's a good frame, why throw it away? We basically gather as much of the pixels from the previous frames that you can apply to this one as well and re-project them," adds Valient. "We get more samples per pixels. Normally you'd need to apply super-sampling or multi-sampling to get a similar effect. We try to get the data from the previous frame."

Re-using existing data for other systems is an approach often used by developers - Guerrilla uses this not just for anti-aliasing, but for ambient occlusion too. But this isn't your standard screen-space approach:

"It's called directional occlusion. It's a by-product of our reflection system. We use the calculation for reflections to determine how much of a space is actually visible from a certain point and we re-use the same information for AO," explains Michal Valient. "AO, if you think about it, is the same as reflection but with a much wider lobe. You want to know about the entire hemisphere around any point, how much is visible, so you can re-use that information. I wouldn't say it's SSAO because there's way too many tricks in it to make it look good."
 

bombshell

Member
I left that in on purpose because that's what is actually being rendered before the interlace takes effect. I know it's a bit misleading but I wanted to make it clear that it wasn't rendering 1920x1080.

But it is rendering the full resolution; it's in the definition of interlacing that it alternates the lines every frame.

Now "960x1080 interlaced" reads as if 960 is the full width and the 1080 lines are being interlaced (540 per frame), which is incorrect.
 

thuway

Member
You have a Kuro? That would explain a lot (and I should have figured it out from your avatar). I just have a 32inch Samsung LCD from a few years back. Planning a substantial upgrade later this year.

My unprofessional opinion is that your aging LCD might be exacerbating the artifacts and having a hard time dealing with the insane amount of blending between frames. Putting it nicely: if your television is already having trouble with fast-motion images, this will only amplify the problem.

This is some surreal stuff we have going on.
 

mrklaw

MrArseFace
Here's a shot taken while rotating the camera at a medium steady rate (taken from a capture card rather than using the built-in sharing feature).

Considering the technique being used, I'm impressed with how artifact free the image appears in motion. From what I can see, this type of rendering has the most noticeable impact on thin objects (such as fences) but even then it simply looks as if it is part of the camera blur.


Here's the same area taken with maximum camera rotation speed. Motion blur is in full effect. If you look at the metal flooring you can see increased aliasing with larger steps but the image still looks good. Without the excellent motion blur it would certainly be more obvious.



I'd love to know just how many resources this technique frees up. It could be a real alternative to traditional rendering that would allow for a higher framerate without the massive loss in image quality associated with lowering overall resolution (which really only looks bad as a result of scaling).

Also, just for fun, here's a shot with the PS4 set to output at 720p. The system is downscaling the image so jaggies are minimized compared to what you'd get with a traditional 720p image. When blown up to 1080p it looks dramatically worse than the 960x1080 method they used.

Click on the images to see them at full resolution.

plus the relatively low motion resolution of most modern LCD TVs will help mask artifacting during motion.


I also think using the term 'interlaced' without any qualifying information is as misleading as calling it native 1080p. Interlaced implies that it is two half-res buffers being shown alternately to approximate the target resolution. But with computation taking place on the missing parts, it is clearly more than simple interlacing, and calling it that is a disservice IMO.
 

VanWinkle

Member
But it is rendering the full resolution; it's in the definition of interlacing that it alternates the lines every frame.

Now "960x1080 interlaced" reads as if 960 is the full width and the 1080 lines are being interlaced (540 per frame), which is incorrect.

No, it's only rendering 960x1080 during any given frame. The fact that it takes pixels from a previous frame makes it harder to describe to people, but to say it's rendering at full resolution is extremely misleading.
 

danwarb

Member
But it is rendering the full resolution; it's in the definition of interlacing that it alternates the lines every frame.

Now "960x1080 interlaced" reads as if 960 is the full width and the 1080 lines are being interlaced (540 per frame), which is incorrect.
It's 960x1080, because it renders 960 alternating vertical lines per frame and fills in the gaps using the previous frame/blur. This is still a 60fps (or close enough) game, not a video, and the reduced image quality is clear. The thread title makes the appropriate distinction.

All games output 1080p; many upscale using various solutions. KZ MP is interlaced to gain frames and gives you the illusion of full 1080p while the image is static.
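
A rough sketch of what that reconstruction could look like, assuming the engine alternates which half of the columns it shades each frame and fills the rest from the previous result (the function and names here are invented for illustration, not taken from the game):

```python
import numpy as np

def interleave_columns(fresh_half, previous_full, frame_index):
    """Toy reconstruction: drop 960 freshly shaded columns into last frame's
    1920-wide image. fresh_half is (1080, 960, 3); previous_full is (1080, 1920, 3)."""
    full = previous_full.copy()
    offset = frame_index % 2          # even frames fill even columns, odd frames odd
    full[:, offset::2] = fresh_half   # the other 960 columns keep prior data
    # In the actual game the retained columns are re-projected/blended rather
    # than reused verbatim, which is why static shots look like native 1080p.
    return full
```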
 

RoboPlato

I'd be in the dick
My unprofessional opinion is that your aging LCD might be exacerbating the artifacts and having a hard time dealing with the insane amount of blending between frames. Putting it nicely: if your television is already having trouble with fast-motion images, this will only amplify the problem.

This is some surreal stuff we have going on.

No need to be nice about my TV. I'm no fan of LCD :lol
 
The more important question is, do you actually understand why resolution mattered in the first place?

The answer is simple; upscaling is ugly.

If we were using displays without fixed resolutions, 1280x720 would actually still appear perfectly clean and attractive. You would lose some detail but the image would be totally acceptable. This resolution does not divide perfectly into 1920x1080, however, and the resulting pixels are distorted which requires a type of bilinear filtering to smooth out the uneven pixels. The end result is ugly.

This is why 720p, 900p, and anything else in between is considered less than optimal. If all TVs were 1440p then 1920x1080 would actually appear rather unsightly as well.

1080p isn't a magic number, it's simply a number that matches the native resolution of most TVs.

The method used in Killzone manages to avoid upscaling blur completely. While there is some loss in detail, the end result is still sharp and relatively clean.

The whole argument in favor of 1080p should revolve around the elimination of upscaling on fixed pixel displays.


It's very easy to see the difference between native 720p and 1080p. This situation is completely different and sidesteps the problems with using 720p.

I completely understand why 1080p, its multiples and divisors, and the fixed resolution of our displays matter.

My point was that at the beginning of this thread we got a lot of "no one noticed"... "yes we did, I always knew something was off but I thought it was the AA"... "GG lied"... "No, GG just didn't bring it up and I can understand why"... to "you know what, it still looks great and the difference is near imperceptible." All paraphrasing, but you get my drift.

Straight-up scaling sucks for mathematical reasons we all know. But tricks that allow for better IQ, regardless of resolution, are always welcome. So in the end IQ is MORE than just resolution.
 

RoadHazard

Gold Member
No, it's only rendering 960x1080 during any given frame. The fact that it takes pixels from a previous frame makes it harder to describe to people, but to say it's rendering at full resolution is extremely misleading.

Yes, but the word "interlaced" by definition means that the (normally vertical) resolution is actually half of what is stated each frame. 1080i means "1920x1080 interlaced", which tells you that each frame is actually 1920x540, and only after every two frames has a full frame been displayed. We don't say that 1080i is "1920x540 interlaced", so this thread's title shouldn't say it that way either.
 

RoboPlato

I'd be in the dick
But in return you get sharpening, crushed blacks, and motion interpolation. Obviously worth the trade in!

please kill me

My set has undefeatable dynamic contrast, even though I've turned the option to off. Looked it up and sure enough, the entire line can't turn the setting off, only put it on its lowest setting. It's literally impossible to set brightness/contrast with test patterns because of it.
 

Angel76m

Neo Member
No, it's only rendering 960x1080 during any given frame. The fact that it takes pixels from a previous frame makes it harder to describe to people, but to say it's rendering at full resolution is extremely misleading.

What's misleading is "960x1080"; it is really rendering in 1920x1080. It uses a technique to achieve that by rendering the even and odd halves of full 1080p frames and putting them together. It is never 960x1080. It is 1920x1080 with odd and even information in it, the rest filled with zeros, and in the end it is 1920x1080.
To call it 960x1080 is definitely wrong.

As in the first post, this picture shows how it really renders, and it is a FULL 1920x1080 picture, not 960x1080. It doesn't matter whether there are black lines in it or not. It is a full HD picture at the full size, even though that exact buffer never comes out to the screen.

http://i.imgur.com/mD6aH5a.png
 
I've seen footage of the multiplayer. It looks great to my eyes. Is this another one of those things where, if you pause the screen, look away from the action, and blow up something 300 ft in the background to 50x size, you can clearly see the stuff you guys like to get mad over, but no normal person who is playing the game for the purpose of playing the game would ever notice?
 

Durante

Member
About the current discussion, calling the rendering resolution 960x1080 is clearly the most correct. What happens afterwards with these pixels requires more explanation than "scaling" or "interlacing", but it's rendering 960x1080 new pixels every frame, so that's its rendering resolution.
 

mrklaw

MrArseFace
No, it's only rendering 960x1080 during any given frame. The fact that it takes pixels from a previous frame makes it harder to describe to people, but to say it's rendering at full resolution is extremely misleading.

What is 'rendering'? Isn't it about taking various inputs in terms of environment, lighting, movement etc. and computing the end value of each pixel?

I'd argue that GG are 'rendering' 1920x1080 each frame. Half of that frame is done traditionally, and the other half references previous frames, but there is clearly additional work done on those pixels; they aren't just displayed as-is like normal interlacing would do.

This feels a little like some games using reprojection to get stereo 3D support. It lets them produce a full 3D scene but without having to fully calculate and render both scenes from scratch.


edit: I like Durante's explanation too :)
 

RoboPlato

I'd be in the dick
About the current discussion, calling the rendering resolution 960x1080 is clearly the most correct. What happens afterwards with these pixels requires more explanation than "scaling" or "interlacing", but it's rendering 960x1080 new pixels every frame, so that's its rendering resolution.

I did specify rendering when I requested a title change but it didn't happen. Oh well. This is probably about as close as we can get to an accurate summation in a thread title.
 
No, it's only rendering 960x1080 during any given frame. The fact that it takes pixels from a previous frame makes it harder to describe to people, but to say it's rendering at full resolution is extremely misleading.

I think the real question is whether it looks better than just a stretch scaling. Of all the ways to get a 1080p 30fps engine to 60fps, is this better?

When there isn't a lot of motion going on, IMO yes. When there is a lot of motion going on, is it worse than aggressive motion blur? Er...I don't really like motion blur in general but it beats a constantly blurred image I guess. Vs. 30fps or just a normal resolution drop, I prefer GG's method.

The whole argument in favor of 1080p should revolve around the elimination of upscaling on fixed pixel displays.
Most important point about resolution.
 

RoadHazard

Gold Member
About the current discussion, calling the rendering resolution 960x1080 is clearly the most correct. What happens afterwards with these pixels requires more explanation than "scaling" or "interlacing", but it's rendering 960x1080 new pixels every frame, so that's its rendering resolution.

Yeah, this is certainly also correct, but then the word "interlaced" needs to disappear from the thread title.
 

thuway

Member
So in summary:

A. The image displayed to the player is Full HD 1080p with ghosting.
B. The framebuffer renders every other frame at 960X1080p and stitches them together. Every frame on display is full HD 1080p, but the means to display the image are questionable.
C. The advantages of a sharp Full HD image are present with a blur / artifacting on high resolution edges.

Case closed?
 
 
I wish GG would come out and explain this method. It sounds fascinating.

It makes me wonder how far this technique can be pushed. As resolution gets higher, combing artifacts in interlaced signals become less noticeable. How good would a single player campaign rendering at 1280x1440 look? How close would 1920x2160 look to actual 4K resolution?
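
For a rough sense of the savings, some back-of-envelope arithmetic (assuming the same half-width trick aimed at 1440p and 4K outputs): each of those buffers shades half the fresh pixels of its target per frame, just as 960x1080 is half of 1920x1080.

```python
# Back-of-envelope: fresh pixels shaded per frame vs. the full target resolution.
for half, full in [((960, 1080), (1920, 1080)),
                   ((1280, 1440), (2560, 1440)),
                   ((1920, 2160), (3840, 2160))]:
    ratio = (half[0] * half[1]) / (full[0] * full[1])
    print(f"{half[0]}x{half[1]} vs {full[0]}x{full[1]}: {ratio:.0%} of the pixels per frame")
```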
 

ghst

thanks for the laugh
So in summary:

A. The image displayed to the player is Full HD 1080p with ghosting.
B. The framebuffer renders every other frame at 960X1080p and stitches them together. Every frame on display is full HD 1080p, but the means to display the image are questionable.
C. The advantages of a sharp Full HD image are present with a blur / artifacting on high resolution edges.

Case closed?

if by this you mean it's a stitched-together band-aid for a low rendering resolution which can be passed off as "full HD" by people with a questionable desire to fervently argue semantics? sure.
 

Gestault

Member
So in summary:

A. The image displayed to the player is Full HD 1080p with ghosting.
B. The framebuffer renders every other frame at 960X1080p and stitches them together. Every frame on display is full HD 1080p, but the means to display the image are questionable.
C. The advantages of a sharp Full HD image are present with a blur / artifacting on high resolution edges.

Case closed?

I'd disagree with this, for the same reason that 1080i isn't the same as 1080p in motion. The output is in effect a progressive image of an interlaced signal, so it's a mix of old data and new. "Sharp" doesn't describe the result, if you've had the chance to play the MP mode compared to single-player.
 

flkraven

Member
So in summary:

A. The image displayed to the player is Full HD 1080p with ghosting.
B. The framebuffer renders every other frame at 960X1080p and stitches them together. Every frame on display is full HD 1080p, but the means to display the image are questionable.
C. The advantages of a sharp Full HD image are present with a blur / artifacting on high resolution edges.

Case closed?

Let me try!

A. The image displayed to the player is Full HD 1080P with some blurring.
B. The game renders every frame at 720P, but upscales the images to full HD 1080P (but the means to display the image are questionable)
C. The advantages of a sharp Full HD image are present with a blur (that isn't present on images rendered originally at 1080P)

Sounds the same, eh? Just about every complaint regarding upscaling fits your A, B, C.

So, case closed insomuch as this is no different than rendering at a lower resolution and upscaling. The same or similar sacrifices, which basically boil down to one simple statement: this is not being rendered at 1080p.
 