
Killzone: Shadow Fall Multiplayer Runs at 960x1080 vertically interlaced

Here's a question:

Why aren't Xbox One developers doing this?

Surely they could get closer to "pseudo-1080p" like Killzone does using this interlaced technique?

Could this trick by Guerrilla Games actually be the magic formula to make Xbox One games look better too?

And heck, what about with people using older or low to mid-range video cards on PCs?

Could this be the trick to make everything "LOOK" 1080p quality, just with ghosting and some minor artifacts that are barely noticeable?
 
The first 960x1080 "interlaced" game running at 30FPS in 21:9 with black bars is going to look amazing.

30fps will create more temporal artifacts, given that the delta time between frames is bigger.
But it's kind of an interesting way to achieve pseudo 1080p@60fps.

Wonder if more games will use it now and drop 900p for pseudo 1080p and use the extra performance gains for better interpolation and shading.
 

BigTnaples

Todd Howard's Secret GAF Account
Well, I tried a few bot matches at launch and noticed MP was a bit softer than SP; like most, I chalked it up to different AA techniques.


After playing 150+ hours of BF multiplayer, finishing Ryse, Ghosts, and Tomb Raider, and playing countless other next-gen titles at length, I gave Killzone multi an actual spin today.


Whatever technique they are using, it is amazing. The game looks as beautiful as ever, and although the image looks softer, it is still a superbly clean image.

Of all the titles I have played, Shadow Fall's IQ reminds me of Ryse the most. You can tell it is lower resolution, but it still seems cleaner than most games, even most full 1080p games.


Very impressive tech on display here.
 

dark10x

Digital Foundry pixel pusher
Here's a question:

Why aren't Xbox One developers doing this?

Surely they could get closer to "pseudo-1080p" like Killzone does using this interlaced technique?

Could this trick by Guerrilla Games actually be the magic formula to make Xbox One games look better too?

And heck, what about with people using older or low to mid-range video cards on PCs?

Could this be the trick to make everything "LOOK" 1080p quality, just with ghosting and some minor artifacts that are barely noticeable?
I suppose we don't really know enough about this technique to say. 1080p is proving difficult on XO as a result of memory constraints more than anything else, and I suspect this technique will do little to help there.
 

Gestault

Member
30fps will create more temporal artifacts, given that the delta time between frames is bigger.
But it's kind of an interesting way to achieve pseudo 1080p@60fps.

Wonder if more games will use it now and drop 900p for pseudo 1080p and use the extra performance gains for better interpolation and shading.

Honestly, I think directly scaling one dimension of the frame (most similar to the GT5 process) with reasonable AA results in fewer artifacts and arguably a cleaner image than the process for Shadow Fall's MP. We've also seen Ryse's 900p with custom scaling and AA give some of the better IQ of the launch-window games, at least of those that aren't native resolution.
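For anyone wondering what "scaling one dimension" amounts to in practice: it's just 1D linear interpolation along the horizontal axis, leaving the vertical axis native. A toy NumPy sketch (grayscale, single channel; `upscale_horizontal` is an illustrative name, not anything from GT5's actual pipeline):

```python
import numpy as np

def upscale_horizontal(frame, out_width):
    """Linearly interpolate a frame along the horizontal axis only,
    e.g. 1280x1080 -> 1920x1080, keeping every scanline native."""
    h, in_width = frame.shape
    # Source sample position for each destination column.
    x = np.linspace(0, in_width - 1, out_width)
    x0 = np.floor(x).astype(int)
    x1 = np.minimum(x0 + 1, in_width - 1)
    t = x - x0
    # Blend the two nearest source columns per destination column.
    return frame[:, x0] * (1 - t) + frame[:, x1] * t

# A 1280-wide gradient stretched to 1920 columns; height is untouched.
src = np.tile(np.linspace(0.0, 1.0, 1280), (1080, 1))
dst = upscale_horizontal(src, 1920)
```

Real scalers use fancier kernels (bicubic, lanczos), but the principle is the same: only one axis gets resampled, so vertical detail stays intact.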
 

creyas

Member
Wow. Just tried a round courtesy of the free week and it's pretty amazing. I'd have had no idea this was going on if I didn't know.
 
Honestly, I think directly scaling one dimension of the frame (most similar to the GT5 process) with reasonable AA results in fewer artifacts and arguably a cleaner image than the process for Shadow Fall's MP. We've also seen Ryse's 900p with custom scaling and AA give some of the better IQ of the launch-window games, at least of those that aren't native resolution.

I believe Halo: Reach did something similar, only scaled in the horizontal direction.
But GG's method seems to keep textures crisp compared to upscaling solutions.
 
this is an image in that resolution:

[Image: 6CGIM.png]


...wat

This basically makes no sense...
 

HTupolev

Member
Yeah, it's kind of impossible if we wish to maintain the thin profile of modern displays.

CRTs worked out that way due to the way the picture was drawn.
Eh, sort of. The irony with CRTs is that you're pretty much always getting something that basically amounts to scaling on the horizontal axis, unless you specifically choose a resolution that aligns with your CRT's horizontal pixel count. Aperture grille CRTs can scan any number of lines, though (within their maximum scan speed constraints).

The biggest single contributor to CRTs looking awesome is that smashing electrons into phosphors is awesome and looks awesome.

For all we know TO:1886 is using KZ's technique :p
Nope, one of the devs said otherwise.
 

Gestault

Member
I believe Halo: Reach did something similar, only scaled in the horizontal direction.
But GG's method seems to keep textures crisp compared to upscaling solutions.

Do you have any citation for that? My understanding from the technical write-ups was that it was a native 720p framebuffer, and they used temporal AA as a low-cost anti-aliasing solution (hence mild ghosting).
 

coldfoot

Banned
Honestly, I think directly scaling one dimension of the frame (most similar to the GT5 process) with reasonable AA results in fewer artifacts and arguably a cleaner image than the process for Shadow Fall's MP. We've also seen Ryse's 900p with custom scaling and AA give some of the better IQ of the launch-window games, at least of those that aren't native resolution.
How about rendering only 75% of pixels and interpolating the other 25%, randomly chosen each frame, with no two consecutive frames having the same pixel interpolated? I am really curious to see if that's achievable and how it would look.
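The mask-generation half of that idea is easy to prototype, at least. A toy NumPy sketch that picks 25% of pixels at random each frame while guaranteeing no pixel is interpolated two frames in a row (purely illustrative; says nothing about whether a GPU could do this cheaply):

```python
import numpy as np

def make_masks(height, width, frames, frac=0.25, seed=0):
    """Yield boolean masks (True = interpolated pixel) for successive
    frames. Each frame marks `frac` of all pixels, drawn only from
    pixels that were NOT interpolated in the previous frame."""
    rng = np.random.default_rng(seed)
    prev = np.zeros((height, width), dtype=bool)
    n = int(frac * height * width)
    for _ in range(frames):
        # Pixels interpolated last frame are ineligible this frame.
        eligible = np.flatnonzero(~prev)
        picks = rng.choice(eligible, size=n, replace=False)
        mask = np.zeros(height * width, dtype=bool)
        mask[picks] = True
        mask = mask.reshape(height, width)
        yield mask
        prev = mask

# Sanity check on a small buffer: consecutive masks never overlap.
masks = list(make_masks(8, 8, frames=4))
for a, b in zip(masks, masks[1:]):
    assert not (a & b).any()
```

Since only 25% of pixels are masked per frame, the 75% that were fresh last frame always leave enough eligible pixels to draw from, so the no-repeat constraint is always satisfiable.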
 

HTupolev

Member
Do you have any citation for that? My understanding from the technical write-ups was that it was a native 720p framebuffer, and they used temporal AA as a low-cost anti-aliasing solution (hence mild ghosting).
Reach's native resolution is 1152x720.

As far as TAA in Reach goes? It renders each frame with a diagonal half-pixel offset from the frame before. In areas of the screen where motion isn't detected*, it uses quincunx-pattern blending, resulting in a very smooth albeit somewhat soft look. When ghosting occurs, it's because the motion buffer failed to correctly detect movement (seems like this usually happens with animated dynamic objects; maybe the motion detection doesn't correctly account for animation outside of the basic object velocity vector, I don't know).

*It doesn't use reprojection, it just turns AA off when it expects the blending won't be accurate.
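That blend is simple enough to sketch. A rough NumPy version (grayscale, single channel; this assumes the diagonal half-pixel offset is already baked into the previous frame's sample positions, so each old sample sits at a corner shared by four current pixels — `quincunx_blend` is an illustrative name, not Bungie's actual code):

```python
import numpy as np

def quincunx_blend(curr, prev_offset, motion_mask):
    """Blend the current frame with the previous, half-pixel-offset
    frame in a quincunx pattern. Where `motion_mask` is True, the raw
    current pixel is used instead (AA simply switched off, as in
    Reach, rather than reprojected)."""
    # Each current pixel is surrounded by four corner samples from the
    # offset frame; average its 2x2 neighborhood (edge-clamped).
    padded = np.pad(prev_offset, ((0, 1), (0, 1)), mode="edge")
    corners = (padded[:-1, :-1] + padded[:-1, 1:] +
               padded[1:, :-1] + padded[1:, 1:]) / 4.0
    # Classic quincunx weights: 1/2 on the center, 1/8 per corner.
    blended = 0.5 * curr + 0.5 * corners
    return np.where(motion_mask, curr, blended)
```

The softness people associate with quincunx falls straight out of those weights: half of every output pixel comes from neighboring samples.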
 

VanWinkle

Member
Here's a question:

Why aren't Xbox One developers doing this?

Surely they could get closer to "pseudo-1080p" like Killzone does using this interlaced technique?

Could this trick by Guerrilla Games actually be the magic formula to make Xbox One games look better too?

And heck, what about with people using older or low to mid-range video cards on PCs?

Could this be the trick to make everything "LOOK" 1080p quality, just with ghosting and some minor artifacts that are barely noticeable?

It's definitely a creative thing they did, but I don't think it was actually a good thing. Most of us would agree Killzone's MP is quite blurry. I hope it doesn't become a trend.
 
It's definitely a creative thing they did, but I don't think it was actually a good thing. Most of us would agree Killzone's MP is quite blurry. I hope it doesn't become a trend.

Depends, really. With more refinement and a consistent frame-rate you'd end up with fewer artefacts. At least for MS, it may be the best chance of getting a high pixel density in games that aren't Forza.
 
I love how up until a few days ago many of us were laughing at Microsoft's ridiculous "it outputs on 1080p anyway" PR crap, now we are searching for the most creative ways to spin an otherwise simple fact, a 960x1080 resolution in a video game. I can understand being a fan of a platform, I am a PC fan myself, but I value my personal integrity above all else. Putting a company's interests above it is a slippery slope, people.
 

Gestault

Member
How about rendering only 75% of pixels and interpolating the other 25%, randomly chosen each frame, with no two consecutive frames having the same pixel interpolated? I am really curious to see if that's achievable and how it would look.

In terms of specific "artifacts" from that process, it would probably be similar to the screen-door effect in low quality alpha-layers in other games. That would be more distracting than you'd think. Aiming for native 1080p or finding an unobtrusive scaling algorithm seems like a better use of resources at the game design stage. Creating irregular "holes" in an engine's frame-buffer (from my understanding) would take more processing power than it would gain even before "filling" them because you're basically processing an additional full-screen alpha layer if it's truly irregular.

I'd love to see tests of that process, though.
 

sirap

Member
I love how up until a few days ago many of us were laughing at Microsoft's ridiculous "it outputs on 1080p anyway" PR crap, now we are searching for the most creative ways to spin an otherwise simple fact, a 960x1080 resolution in a video game. I can understand being a fan of a platform, I am a PC fan myself, but I value my personal integrity above all else. Putting a company's interests above it is a slippery slope, people.

There's a big difference between what GG are doing with this vs rendering at 720p and simply scaling the image up. Not saying either is as good as native 1080p, but I'd take the former over the latter any day.
 

benny_a

extra source of jiggaflops
I love how up until a few days ago many of us were laughing at Microsoft's ridiculous "it outputs on 1080p anyway" PR crap, now we are searching for the most creative ways to spin an otherwise simple fact, a 960x1080 resolution in a video game. I can understand being a fan of a platform, I am a PC fan myself, but I value my personal integrity above all else. Putting a company's interests above it is a slippery slope, people.
I think you're a bit late with this strange post. This thread already had the blow-up where people, including myself, shit on GG for not being up front about the game.

Now it's about discussing the tech, the impact and the future of it. Or can't some posters express that they think this is an interesting solution?
 

Gestault

Member
I think you're a bit late with this strange post. This thread already had the blow-up where people, including myself, shit on GG for not being up front about the game.

Now it's about discussing the tech, the impact and the future of it. Or can't some posters express that they think this is an interesting solution?

He's not wrong though, and not even a full-page back we have Thuway himself trying to rationalize how it's actually a 1080p image.
 

benny_a

extra source of jiggaflops
He's not wrong though, and not even a full-page back we have Thuway himself trying to rationalize how it's actually a 1080p image.
That's not the top of my page. All this trying to find a shorthand for this technique is so boring anyway.

This is something new, and trying to fit it into technically incorrect categories just so one can claim to have found the perfect 1080p defense, or the gotcha 720p after all, is just so uninteresting.

But then again, I hate in general that some posters like to come back after a topic has had over 1,500 posts and then claim that "we are searching for the most creative ways to spin". Yeah, really? If all are doing it, then go ahead and quote all those who allegedly laughed at one thing and now spin the other. Because that's what happens on GAF: you quote people who are being hypocritical and intentionally disingenuous, and then they get banned.
 

EL CUCO

Member
It makes me wonder how far this technique can be pushed. As resolution gets higher, combing artifacts in interlaced signals become less noticeable. How good would a single player campaign rendering at 1280x1440 look? How close would 1920x2160 look to actual 4K resolution?

Exactly.
 
About the current discussion, calling the rendering resolution 960x1080 is clearly the most correct. What happens afterwards with these pixels requires more explanation than "scaling" or "interlacing", but it's rendering 960x1080 new pixels every frame, so that's its rendering resolution.

All the pixels are new each frame; it's just that half of them are rasterized while the other half are computed based on color, z, and motion information from the previous and current frames. None of the current terminology accurately applies.
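The column-alternating half of that description can be sketched in a few lines of NumPy. This deliberately ignores the color/z/motion reprojection entirely and just reuses the previous frame's columns verbatim, i.e. the degenerate static-camera case (`reconstruct` is an illustrative name, not anything from GG's pipeline):

```python
import numpy as np

def reconstruct(rendered_half, prev_full, frame_index):
    """Rebuild a full-width frame from a half-width render.

    `rendered_half` holds this frame's freshly rasterized columns
    (e.g. 960 of 1920); the remaining columns are carried over from
    `prev_full`, the previous reconstructed frame. Even frames
    rasterize even columns, odd frames odd columns, so every column
    is refreshed every other frame."""
    full = prev_full.copy()
    offset = frame_index % 2          # which column parity is fresh
    full[:, offset::2] = rendered_half
    return full

# Two frames are enough to refresh every column of a small buffer.
f0 = reconstruct(np.full((4, 3), 1.0), np.zeros((4, 6)), frame_index=0)
f1 = reconstruct(np.full((4, 3), 2.0), f0, frame_index=1)
```

The real trick, per the thread, is that the carried-over columns aren't copied verbatim but re-estimated from depth and motion vectors, which is why the result holds up in motion far better than a naive field merge would.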
 

rayisbeast

Neo Member
So in summary:

A. The image displayed to the player is Full HD 1080p with ghosting.
B. The framebuffer renders every other frame at 960x1080 and stitches them together. Every frame on display is full HD 1080p, but the means to display the image are questionable.
C. The advantages of a sharp Full HD image are present with a blur / artifacting on high resolution edges.

Case closed?

A. It is 1920x1080, but it is not 1080p.
B. Look at A.
C. Not full HD.
 

thelastword

Banned
While I agree with your argument in general, I think it's overstating the case to say they "butchered" the IQ. From what I've seen, the results aren't really any worse than Battlefield's 900p.
That's all good, but Battlefield was never 1080p in the first place or presented as such.

Alienous said:
I'd be totally ok with the tricky resolution shit if they managed to hit 60fps. But the multiplayer struggles with 30fps in a full lobby. These kinds of compromises to reach smoother gameplay are fine, but you should be honest. Shadow Fall tries to push their engine too far and fails because of it.
The baseline for all games this gen should be 1080p 30fps, imo. If you want 60fps at that res, then redo your engine, update it, and begin the development process with 60 in mind, so that 60fps is locked or consistently up there.

What I don't like or agree with is that certain people are suggesting that devs keep doing this in the future to bolster framerate; that's a big no-no. The IQ in MP is immediately worse than SP when you boot it up; it is not native and it is indeed a compromise, yet the purpose of muddying the IQ was still not realized, as you've stated.

Frankly, I like the TR approach: their game is full HD, they were aiming for a baseline of 30fps, and the extra overhead on the PS4 allowed for an average closer to 60; that is just fine. To be very honest, when Guerrilla said MP was going to be 60fps at 1080, I figured they were already closer to 60fps, so all they'd have to do was lower some effects and other graphical perks to better optimize for a consistent 60fps. I never imagined they would compromise on the res/IQ.

Angel76m said:
I think people here mixed something up.

The game is running in full 1080p in SP and MP.
I'm sorry, but what you described is not native 1080p; it's an alternative method to free up resources to bolster framerate. Your TV or GPU may not be doing the upscaling in the traditional fashion, but the blurriness is just as evident in motion. Some of the games which were 960x1080 last gen never scaled nicely on PS3; Transformers, The Incredible Hulk, and Conflict: Denied Ops, to name a few, looked better and sharper at 720p.

If Guerrilla had to lower the res, it would have been better to go for 1280x1080, as that scales nicely to 1080p (GT5 was one of the sharper PS3 games on my TV). 1440x1080 or 1600x1080 are also fine resolutions to upscale from, and are even better because of the extra pixels on the horizontal axis. When you go below 1280 horizontal and cut the horizontal pixel fidelity in half (960), it starts to look muddy. With the PS4 having a better scaler than the PS3, 1280x1080 upscaled would have looked sharper for Killzone MP.
 

Alo81

Low Poly Gynecologist
Let me try!

A. The image displayed to the player is Full HD 1080P with some blurring.
B. The game renders every frame at 720P, but upscales the images to full HD 1080P (but the means to display the image are questionable)
C. The advantages of a sharp Full HD image are present with a blur (that isn't present on images rendered originally at 1080P)

Sounds the same, eh? Just about every complaint regarding upscaling fits your A, B, C.

So case closed, insomuch as this is no different than rendering at a lower resolution and upscaling. The same or similar sacrifices, which basically boil down to one simple statement: this is not being rendered at 1080p.

This is totally wrong. No scaling is being done on Shadow Fall's image.
 
At the end of the day, the results are subpar. I would rather they upscale from a lower native progressive resolution rather than use an interlaced framebuffer with interpolation based on previous frames.
 
Here's a shot taken while rotating the camera at a medium steady rate (taken from a capture card rather than using the built-in sharing feature).

Considering the technique being used, I'm impressed with how artifact free the image appears in motion. From what I can see, this type of rendering has the most noticeable impact on thin objects (such as fences) but even then it simply looks as if it is part of the camera blur.

[Image: ARA.png]


Here's the same area taken with maximum camera rotation speed. Motion blur is in full effect. If you look at the metal flooring you can see increased aliasing with larger steps but the image still looks good. Without the excellent motion blur it would certainly be more obvious.

[Image: BRA.png]

This transition when standing still vs moving is one of the most visually frustrating things I've experienced in recent years of gaming. The headaches were real, and caused by this blur. I passed it off as some sort of near-sighted, intentional blurring to make it "feel more real" or some dumb shit.

It's really, really unacceptable.
 

Jinfash

needs 2 extra inches
So in summary:

A. The image displayed to the player is Full HD 1080p with ghosting.
B. The framebuffer renders every other frame at 960x1080 and stitches them together. Every frame on display is full HD 1080p, but the means to display the image are questionable.
C. The advantages of a sharp Full HD image are present with a blur / artifacting on high resolution edges.

Case closed?

[Image: is74Y4s.gif]
 

VanWinkle

Member
I love how up until a few days ago many of us were laughing at Microsoft's ridiculous "it outputs on 1080p anyway" PR crap, now we are searching for the most creative ways to spin an otherwise simple fact, a 960x1080 resolution in a video game. I can understand being a fan of a platform, I am a PC fan myself, but I value my personal integrity above all else. Putting a company's interests above it is a slippery slope, people.

You're reading into things wrong, perhaps based on your own biases or preconceived notions. There are always people that will defend anything, from any company. Many of us in here, even ones that are fans of the platform, are not in any way defending this resolution.
 

dark10x

Digital Foundry pixel pusher
This transition when standing still vs moving is one of the most visually frustrating things I've experienced in recent years of gaming. The headaches were real, and caused by this blur. I passed it off as some sort of near-sighted, intentional blurring to make it "feel more real" or some dumb shit.

It's really, really unacceptable.
I don't doubt that, but it really doesn't bother me at all. Something to consider, though.
 

Z3M0G

Member
I'm going to throw in a potentially very dumb question...

If a screenshot is the capture of a single frame, why don't we see the empty interlace lines in a screenshot?
 
It makes me wonder how far this technique can be pushed. As resolution gets higher, combing artifacts in interlaced signals become less noticeable. How good would a single player campaign rendering at 1280x1440 look? How close would 1920x2160 look to actual 4K resolution?
... and would it look better than rendering natively at a lower res?
 
I don't doubt that, but it really doesn't bother me at all. Something to consider, though.

Legit, it got to the point where I simply stopped looking anywhere but dead center of the screen when running, and I'd even relax my eyes (not focus) at times. In any other FPS, I'm scanning to the very edges of my field of vision so I can know what's coming. I simply can't do that in KZSF. I *kinda* got used to adjusting how I looked at the game to compensate, but it was a very uncomfortable experience.
 

quest

Not Banned from OT
Legit, it got to the point where I simply stopped looking anywhere but dead center of the screen when running, and I'd even relax my eyes (not focus) at times. In any other FPS, I'm scanning to the very edges of my field of vision so I can know what's coming. I simply can't do that in KZSF. I *kinda* got used to adjusting how I looked at the game to compensate, but it was a very uncomfortable experience.

Just curious, have you ever seen a DLP projection-screen TV? If you have, do you get eye strain and a strobing effect from it?
 
I'd disagree with this, for the same reasoning that 1080i isn't the same as 1080p in motion. The output is in effect a progressive image of an interlaced signal, so it's a mix of old data with new. "Sharp" doesn't describe the result of this, if you've had the chance to play the MP mode compared to single-player.

Most video that is put on Blu-ray or D-VHS has been vertically filtered, and some of it even down-sampled; KZ's MP doesn't seem to get either of these luxuries, and it's being fed to a progressive display that doesn't understand how to de-interlace. I'd say that's the real issue.
There isn't a thing wrong with 1080i on a proper display built to handle it, but when the content being sent isn't prepped for it, and the display it's being sent to isn't designed for it at all, I guess you can have issues.
 

RoboPlato

I'd be in the dick
Legit, it got to the point where I simply stopped looking anywhere but dead center of the screen when running, and I'd even relax my eyes (not focus) at times. In any other FPS, I'm scanning to the very edges of my field of vision so I can know what's coming. I simply can't do that in KZSF. I *kinda* got used to adjusting how I looked at the game to compensate, but it was a very uncomfortable experience.

That was probably the fake motion blur that was on the edges of the screen when sprinting. They must have patched it out or toned it down because I didn't notice it yesterday.
 
You're reading into things wrong, perhaps based on your own biases or preconceived notions. There are always people that will defend anything, from any company. Many of us in here, even ones that are fans of the platform, are not in any way defending this resolution.

I absolutely agree. Those Sony fans who didn't hesitate to call out Sony and GG on the fact that they lied have earned my respect. Others clearly tried to find ways to downplay or obfuscate the issue. Thuway's post in particular is troubling, since it's coming from an insider who had a starring role in Resolutiongate. And I still wonder how not one of the insiders got wind of this for so many months. As I said, troubling.
 
I absolutely agree. Those Sony fans who didn't hesitate to call out Sony and GG on the fact that they lied have earned my respect. Others clearly tried to find ways to downplay or obfuscate the issue. Thuway's post in particular is troubling, since it's coming from an insider who had a starring role in Resolutiongate. And I still wonder how not one of the insiders got wind of this for so many months. As I said, troubling.

No, it sounds like you are troubled by anyone who is trying to accurately represent a complicated technique instead of adopting reductive and misleading labels.

But FWIW, Thuway's description is inaccurate, and so is the current thread title.
 

Artorias

Banned
I absolutely agree. Those Sony fans who didn't hesitate to call out Sony and GG on the fact that they lied have earned my respect. Others clearly tried to find ways to downplay or obfuscate the issue. Thuway's post in particular is troubling, since it's coming from an insider who had a starring role in Resolutiongate. And I still wonder how not one of the insiders got wind of this for so many months. As I said, troubling.

So it's just Thuway's post then? You seemed to be implying it was an issue with many people in the thread...

I'm sure there are defenders, but I'm curious if you even read more than a handful of posts or just made an assumption and ran with it.
 

jaxpunk

Member
I absolutely agree. Those Sony fans who didn't hesitate to call out Sony and GG on the fact that they lied have earned my respect. Others clearly tried to find ways to downplay or obfuscate the issue. Thuway's post in particular is troubling, since it's coming from an insider who had a starring role in Resolutiongate. And I still wonder how not one of the insiders got wind of this for so many months. As I said, troubling.

Well, if we've got a random guy's respect on the internet, I think that pretty much wraps things up then?
 
I absolutely agree. Those Sony fans who didn't hesitate to call out Sony and GG on the fact that they lied have earned my respect. Others clearly tried to find ways to downplay or obfuscate the issue. Thuway's post in particular is troubling, since it's coming from an insider who had a starring role in Resolutiongate. And I still wonder how not one of the insiders got wind of this for so many months. As I said, troubling.

Truly masterful concern trolling.
 