
Killzone: Shadow Fall Multiplayer Runs at 960x1080 vertically interlaced

Alienous

Member
I'd take it if the 16 player framerate was half decent.

And I think GAF just lost its licence to discuss resolution. What a CBOAT move, dudes.
 
People got caught with their pants down over this. A lesson learned. Resolution is overblown for casuals which is why resolution and IQ nitpickers are always fighting uphill. Damn the people eh.
 

Mr Moose

Member
So was this always that resolution or did it come with the updates?
I guess this goes to show nobody plays Killzone Shadow Fall MP lol.
 

Kuro

Member
People got caught with their pants down over this. A lesson learned. Resolution is overblown for casuals which is why resolution and IQ nitpickers are always fighting uphill. Damn the people eh.

Have you ignored all the posts of people complaining about the multiplayer's image quality at launch?
 

GoaThief

Member
Interesting, I have a feeling we will be seeing more of this technique in the future.

I'll take it over a standard lower resolution as jaggies are far less apparent. That said I'd much prefer 1080p.

Has it been in from the start or added via some of the largeish updates?
 
I was actually wondering if Killzone SF uses resolution scaling, since it worked so well for Killzone Mercenary.

I'm not mad really, since it works well and is SUCH a fascinating tech.
 

Alienous

Member
Have you ignored all the posts of people complaining about the multiplayer's image quality at launch?

I don't recall anyone saying it wasn't 1080p, specifically, though I'm sure a minority did. And that certainly wasn't the consensus.
 
Guys, everyone noticed. Just nobody expected Sony to be lying when they said the SP and MP were native 1080p. We'd all just chalked it up to FXAA or something shit.
 

Durante

Member
I think this is a pretty clever compromise to hit a high framerate in MP. How responsive is it?

If viewed as horizontal image compression, the math can produce a better interpolation, retaining more detail than scaling in both X and Y.

Also, not too different from tricks used by Liverpool - Wipeout HD's dynamic framebuffer.
Actually, that's quite significantly different. The dynamic framebuffer is simply upscaled. Here you apparently have a spatial and a temporal sample (or actually a range of temporal samples?) per 2 pixels which are fed into some heuristic to derive the final color values.
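To make that concrete, here's a toy sketch of the kind of per-pixel heuristic being described: for each missing column, compare the temporally reprojected sample against the spatial average of its rendered neighbours, and fall back to the spatial estimate when they disagree (likely disocclusion). The function name, the threshold, and the fallback rule are all made up for illustration; the real resolve pass is unknown.

```python
import numpy as np

def resolve_row(rendered, reprojected, threshold=0.1):
    """Fill the odd columns of one scanline.

    rendered:    full-width row; only even columns hold fresh samples.
    reprojected: full-width row reprojected from the previous frame.
    For each missing (odd) column, accept the temporal sample only if
    it roughly agrees with the spatial average of its neighbours."""
    out = rendered.copy()
    for i in range(1, len(out) - 1, 2):
        spatial = 0.5 * (rendered[i - 1] + rendered[i + 1])
        temporal = reprojected[i]
        out[i] = temporal if abs(temporal - spatial) < threshold else spatial
    return out
```

Per the quote above, there could also be a whole range of temporal samples feeding the heuristic rather than a single one; this sketch only keeps one.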
 

hodgy100

Member
I don't recall anyone saying it wasn't 1080p, specifically, though I'm sure a minority did. And that certainly wasn't the consensus.

Well, we were told it was native 1080p, so people assumed a dodgy post-processing AA method was causing the artefacts. Still, there is a noticeable hit in IQ; pretty much everyone noticed that.
 

nelchaar

Member
It astounds me how many people are "fake" pissed off about this.

The only people that should be upset are the 2000ish Killzone MP players on it every day, and I being one of them, don't care.

For the past couple of months I've been playing the game with the terrible blur, anti-aliasing and weird shadows and always knew something was off. The fact that this phenomenon has a name and a reason behind it does not change anything.

Enough with the resolution wars. Enough with the frame rate wars. Why can't we all just enjoy games nowadays?
 

thuway

Member
This method is surely expensive and without its own pitfalls. The real question is this-

Does this rendering technique yield better image quality than 900p? There must be some logic behind it. Pixel counting just got tougher, and fake 1080p is looking more like real 1080p every day.

Also, Guerilla, you've got some explaining to do.
 
Too bad I have no interest in this game, if I ever looked at it I might have noticed something :p
That screenshot Dictator93 posted is pretty terrible.
I'd be willing to bet there's motion blur making it look worse. Here's another screen captured by the same person:
[Image: 27yGYLK.jpg]


Notice that you can still see the artifacts from the temporal reprojection, but the image as a whole seems sharper--except for the motion blur on the very near rock faces.

I'd also point out that most people (not necessarily you) don't really understand how "bad" a frame can look divorced from its context. Many modern techniques damage the clarity of the frame to improve the overall look in motion. For example, here's a frame out of the other "best looking console game", Ryse:

[Image: pwPnHYJ.jpg]

Tons of motion blur rushing in different directions, heavy film grain, motion blur artifacts all around the character...this frame is a mess. But the sequence it's from looks quite good.
 
Now, question: Should this even matter if nobody noticed in the first place? I'm sure there will be some people who now probably say they noticed this all along, or knew something was up, but come on, that's not really true now, is it? Nobody's going to believe that.

If our local 1080 paparazzi truly thought that something was up, or even worth investigating with the game's resolution, a pixel count would have been done ages ago. That didn't happen, regardless of what people may be saying currently. So, what it all boils down to is that this is just one more example of a game (if this is all accurate and no mistakes were made) in which something that was supposed to be so effortlessly spotted and identified, was missed completely by just about everybody. I distinctly remember posts of people mocking developers and posters who sometimes said that it was tough to notice the difference between certain resolutions, and that it wasn't always as easy to spot as people claimed it was.

We now have an example with Killzone that's very close to a 720p resolution for the MP, an MP that many people, even myself, were impressed with visually, and nobody seemed to notice anything amiss with the resolution despite this. That just about tells you all you really need to know.

Nope, the guy is right. Everybody missed this. Since when has a developer telling people a game was 1080p stopped people from doing a pixel count anyway? I think I'm only being fair in saying that if the game was an Xbox One title, a resolution analysis would have almost certainly taken place no matter what the developer claimed, and I think everybody knows that. But the best part here is that if people thought the game's MP looked great before (lots of people did) then nothing should be different about today.

Don't even know what to say to this
 
Ryse's temporal supersampling takes place over the entire 1600x900 buffer. And I wouldn't be surprised at all if the game is overall doing more work for reprojection than KZSF MP.

Even if they had to bleed half a 1080p buffer into DDR3, that wouldn't be that big of a deal. A single pass over a 960x1080 buffer isn't likely to be very catastrophic for a 68GB/s memory pool.

Ryse's render target was just 1600x900. That is the level of detail their temporal AA solution worked with. Once that image was complete, it was stretched to fill the typical 1920x1080. Killzone didn't have to take that extra step: its render target was the final 1920x1080. My guess as to why the XB1 didn't render directly to 1080p is that there simply wasn't enough room in the ESRAM to hold it.

While your reasoning about using DDR3 sounds logical, it is still extra work on top of what is being done now. I have no idea how much of an FPS hit a game would take for this extra round trip to the slower main memory, but if you have a game that is just hitting 30 FPS now, can you afford to take that hit? After all, there is a reason high-end graphics cards don't use DDR3, and why it is generally assumed that the XB1's frame buffers would be held in ESRAM, not DDR3.

Btw, wouldn't it be two passes? The first pass would copy the current frame buffer from ESRAM to DDR3 in order to save it while calculating the next frame. The second pass would have to read that saved DDR3 buffer and combine it with the next frame created in ESRAM in order to produce and store the final image. And where would that final frame be stored? If it is in ESRAM, then this technique is using a lot of that precious memory. If it is in DDR3, then that is yet another trip to the slower memory.


Everything I just wrote is speculation based on my limited understanding of how this works. I'd love for a real dev to explain where I'm wrong and how it'd work.
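For what it's worth, a back-of-the-envelope estimate suggests the bandwidth cost of that round trip would be small. Assuming a 32-bit colour format, a 60 fps target, and the quoted 68 GB/s DDR3 pool (the pass count is the two-pass guess from above, not a known figure):

```python
# Rough cost of parking half a 1080p buffer (960x1080) in DDR3.
width, height, bytes_per_pixel = 960, 1080, 4
fps = 60
passes = 2  # one write out to DDR3, one read back, per frame

bytes_per_second = width * height * bytes_per_pixel * passes * fps
ddr3_bandwidth = 68e9  # XB1's quoted DDR3 bandwidth, in bytes/s

print(bytes_per_second / 1e9)              # ~0.50 GB/s
print(bytes_per_second / ddr3_bandwidth)   # ~0.7% of the pool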
 
Wasn't the same technique used for either GT5 or MGS4 or both? I vaguely remember reading about a technique where the game was rendered thin then stretched sideways.
 

thuway

Member
I wish a dev would come in here and essentially school us on the benefits and disadvantages of this proposed rendering solution. For the life of me, it looks better than 900p in both Ryse and Battlefield 4, and yet the artifacts, ghosting, and strange aliasing are still omnipresent.

This is something totally new, alien, and abstract. It's the feeling MLAA gave when it first came around and burst through the seams.
 

UnrealEck

Member
I could tell from just videos and screenshots that the multiplayer appeared washy looking, but I just assumed they had toned down various graphical effects to raise performance to try and hit that 60 target.
So isn't this like half the effective pixel count?
 

Durante

Member
I'd be willing to bet there's motion blur making it look worse. Here's another screen captured by the same person:

Notice that you can still see the artifacts from the temporal reprojection, but the image as a whole seems sharper--except for the motion blur on the very near rock faces.
It's hard to say in the case of this particular technique what's motion blur and what is lack of sharpness due to camera motion. Because obviously, frames where the camera doesn't move at all will look "almost 1080p", while presumably the reprojection will fail and produce blur (or worse, artifacts) with fast camera movement.

Wasn't the same technique used for either GT5 or MGS4 or both? I vaguely remember reading about a technique where the game was rendered thin then stretched sideways.
No, those just used anamorphic rendering + scaling. To the best of my knowledge, this particular technique wasn't used before.
 
people on reddit having a field day with this.
Not surprising. A certain subgroup over there has been salivating whilst waiting for a moment to 'attack'. And here it is.
Although going nuts over this whilst deriding 'SonyGaf' for pointing out XBOne deficiencies is rather hypocritical.
 

thuway

Member
It's hard to say in the case of this particular technique what's motion blur and what is lack of sharpness due to camera motion. Because obviously, frames where the camera doesn't move at all will look "almost 1080p", while presumably the reprojection will fail and produce blur (or worse, artifacts) with fast camera movement.

It's so pathetically smart. It essentially renders 1080p at the moment you most likely would notice 1080p and renders a faux 1080p or a blur when it knows you won't be paying attention.
 
People got caught with their pants down over this. A lesson learned. Resolution is overblown for casuals which is why resolution and IQ nitpickers are always fighting uphill. Damn the people eh.

Resolution makes a huge difference and casuals know good IQ when they see it. Of course they don't know any of the details of why something looks good but they do notice.
 
I really hope GG comes out and explains the technique, with pros and cons.
This stuff is rather interesting, and it's always fun to learn about things like this.
 

bombshell

Member
Wasn't the same technique used for either GT5 or MGS4 or both? I vaguely remember reading about a technique where the game was rendered thin then stretched sideways.

The technique used here by GG is not stretching the image. It is rendering half of the pixel columns and interpolating the other half temporally from the previous frames. Then the pixel columns that are rendered versus interpolated are reversed in the next frame.
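The alternating-column scheme described above can be sketched in a few lines. This is a minimal toy: it ignores motion vectors and the reprojection/resolve heuristics entirely (a static camera is assumed, so the "previous output" columns are simply carried over), and every name here is illustrative.

```python
import numpy as np

def temporal_interlace(render_columns, prev_output, frame_idx):
    """One frame of the alternating-column scheme: render half the
    pixel columns (960 of 1920) and carry the other half over from
    the previous frame's output. Which half is rendered flips each
    frame."""
    offset = frame_idx % 2                      # even columns, then odd
    out = prev_output.copy()
    out[:, offset::2] = render_columns(offset)  # the 960-wide render
    return out

# Tiny stand-in scene (8 columns instead of 1920), static camera.
scene = np.arange(32, dtype=float).reshape(4, 8)
render = lambda offset: scene[:, offset::2]     # "renders" half the columns

out = np.zeros_like(scene)
out = temporal_interlace(render, out, 0)        # fresh even columns
out = temporal_interlace(render, out, 1)        # fresh odd columns
# With a static scene, two frames reconstruct the full image exactly.
```

The interesting (and hard) part in the real game is what happens when the camera moves: the carried-over columns must be reprojected along motion vectors, and that is where the visible artifacts come from.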
 

hodgy100

Member
It's so pathetically smart. It essentially renders 1080p at the moment you most likely would notice 1080p and renders a faux 1080p or a blur when it knows you won't be paying attention.

Killzone Mercenary did a kind of similar thing, didn't it? Native resolution when there is little movement, then dynamic resolution to hold the framerate during movement.
 
Not surprising. A certain subgroup over there has been salivating whilst waiting for a moment to 'attack'. And here it is.
Although going nuts over this whilst deriding 'SonyGaf' for pointing out XBOne deficiencies is rather hypocritical.

But if this was an xbox game it would have a thread of 1000 posts by now!
 

Footos22

Member
Been playing MP exclusively for the last week and a half.
Haven't even noticed, and I game quite close to a 50" plasma.

Then again, resolution doesn't bother me at all.
Didn't even think COD looked any different on PS3 than it did on 360.
 

benny_a

extra source of jiggaflops
Killzone Mercenary did a kind of similar thing, didn't it? Native resolution when there is little movement, then dynamic resolution to hold the framerate during movement.
I felt KZ:M looked fine in MP because of the blur, but KZ:SF always looked off in the MP.
 

hodgy100

Member
But if this was an xbox game it would have a thread of 1000 posts by now!

Nice sarcasm :p Good job we're about to hit 1k posts on this :p

I felt KZ:M looked fine in MP because of the blur, but KZ:SF always looked off in the MP.

Oh I agree, slight upscaling is pretty easy to hide during movement, whereas in KZ:SF movement causes these horrible interlacing artefacts for the sake of a "1080p" frame. Plus with the former, the dynamic resolution was explained before release, and it actually seemed like a neat way to deal with the detail/framerate tradeoff.
 

nbnt

is responsible for the well-being of this island.
Whatever GG did here, I wish every sub-1080p game would use it. The difference in motion is negligible. Yes, it looks blurrier than the 1080p singleplayer, but it looks far, far better than anything rendered at 720p, despite being close in pixel count. I'd take its artefacts over the mess 720p creates.
 
Hmmm well GG needs to get to the talking to explain themselves over this because this is kind of ridiculous.

However, what's with the co-signing in here and the fake anger? When did people all of a sudden care so much about Killzone? I get that this obviously is a resolution issue, but I see a lot of dwellers coming out of the woodwork as if they played the game enough to even be mad. People who actually played the game probably noticed. I don't know why DF didn't break this earlier. It seems lame for this to come out well after launch. This should have been called out early, and I'm kind of disappointed by Sony, GG and DF, who probably knew the fine print of it all but didn't report and own up to it.

But if this was an xbox game it would have a thread of 1000 posts by now!

Dead at you not looking at how many posts this thread has already hit. SMH!
 

hodgy100

Member
Hmmm well GG needs to get to the talking to explain themselves over this because this is kind of ridiculous.

However, what's with the co-signing in here and the fake anger? When did people all of a sudden care so much about Killzone? I get that this obviously is a resolution issue, but I see a lot of dwellers coming out of the woodwork as if they played the game enough to even be mad. People who actually played the game probably noticed. I don't know why DF didn't break this earlier. It seems lame for this to come out well after launch. This should have been called out early, and I'm kind of disappointed by Sony, GG and DF, who probably knew the fine print of it all but didn't report and own up to it.

This is basically the crux of it. Everyone originally assumed KZ:SF would be 1080p/30fps in multiplayer; when they said 60fps it was kind of a wow moment, but looking at what they had to sacrifice to do that, I think I'd rather it be 30fps.
 
Killzone Mercenary did a kind of similar thing didnt it? Native resolution when there is little movement and then dynamic resolutions to hold the framerate during movement.
GT3 and GT4 on PS2 had interlaced resolution as well.
This sounds like the PS4 is rendering interlaced images and then, in effect, de-interlacing to get a final 1080p image?
No. There is no interlacing, and the upscaling is limited to very small areas of the screen (and may not be "true" upscaling at all). This isn't the same technique as Killzone Mercenary, or the Gran Turismos. Shadow Fall is apparently the first game to ever use this new method.

It gives better results than the old methods, and more accurately approximates what a native 1080p screen would look like. As a tradeoff, it also creates scattered artifacts that look a little like dithering (but aren't really related to old uses of that either).
 

Lucifon

Junior Member
Resolutiongate, the placebo effect at its best. Tell everyone a game is 1080p when it isn't and no one notices anything wrong. Months later it's found that (god forbid) it isn't actually the p's they were promised and of course people 'always thought something looked off'. I hope one day we'll be able to just enjoy games and appreciate them for how good they actually look, rather than going back and forth over counting pixels. The game looks great, and performs well in multiplayer which is absolutely crucial.
 

remnant

Banned
This is basically the crux of it. Everyone originally assumed KZ:SF would be 1080p/30fps in multiplayer; when they said 60fps it was kind of a wow moment, but looking at what they had to sacrifice to do that, I think I'd rather it be 30fps.

Why? It plays incredibly well and looks fucking amazing.

Seriously, did you really care about Killzone MP resolution 3 days ago? Why is it an issue now? The MP is fantastic to play.
 

Saty

Member
Man, never imagined Sony's 'For the Players' motto would backfire so fast. Shutting down GT5 online servers, the Santa Monica stuff, and now being untruthful/misleading regarding aspects of a game.
 

benny_a

extra source of jiggaflops
Resolutiongate, the placebo effect at its best. Tell everyone a game is 1080p when it isn't and no one notices anything wrong. Months later it's found that (god forbid) it isn't actually the p's they were promised and of course people 'always thought something looked off'.
Except we have hundreds of posts in threads about the multiplayer from the first Gamersyde footage to the multiplayer impressions thread with people arguing that there was a very visible blur in the MP that is not present in the SP.

Shitty for you, we actually have an archive on this forum. If we didn't, you might have had a point, and not just a bad rehash of a post that has been debunked several times.

It gives better results than the old methods, and more accurately approximates what a native 1080p screen would look like. As a tradeoff, it also creates scattered artifacts that look a little like dithering (but aren't really related to old uses of that either).
I would like to see this method applied to the singleplayer footage to see how much of an impact it has on the final image. Because it's only employed in the MP, you can't isolate how much of an impact it has.
 

Sweep14

Member
Am I right in thinking that it's a clever technique from Guerilla, allowing them to keep their SP assets while hitting 60 fps, at the cost of more intensive motion blur?
 

big_z

Member
strider, thief and now this. if infamous gets mediocre reviews sony gaf will be on suicide watch.

I think the whole resolution thing is stupid. Give it a year or two and we'll be seeing 720p/30fps with some AA on nearly every big-budget game, much like how last gen started with 720p and ended with many games slightly above SD.
 
Resolutiongate, the placebo effect at its best. Tell everyone a game is 1080p when it isn't and no one notices anything wrong. Months later it's found that (god forbid) it isn't actually the p's they were promised and of course people 'always thought something looked off'. I hope one day we'll be able to just enjoy games and appreciate them for how good they actually look, rather than going back and forth over counting pixels. The game looks great, and performs well in multiplayer which is absolutely crucial.

People always noticed something was off, and most chalked the blur and lower image quality up to the tradeoff for a higher framerate. We just didn't know how the disparity in image quality was being achieved.
 

Ape

Banned
Man, never imagined Sony's 'For the Players' motto would backfire so fast. Shutting down GT5 online servers, the Santa Monica stuff, and now being untruthful/misleading regarding aspects of a game.

It's still not that big of a deal. The GT5 servers are from a 3 and a half year old game with a sequel out. The SSM stuff is really unfortunate but there's nothing you can do when a project isn't working out and I really hope that this doesn't lead anyone to not purchase the game.
 

Bundy

Banned
Man, never imagined Sony's 'For the Players' motto would backfire so fast. Shutting down GT5 online servers, the Santa Monica stuff, and now being untruthful/misleading regarding aspects of a game.
lmao
This post is hilarious!
Resolutiongate, the placebo effect at its best. Tell everyone a game is 1080p when it isn't and no one notices anything wrong. Months later it's found that (god forbid) it isn't actually the p's they were promised and of course people 'always thought something looked off'. I hope one day we'll be able to just enjoy games and appreciate them for how good they actually look, rather than going back and forth over counting pixels. The game looks great, and performs well in multiplayer which is absolutely crucial.
Placebo effect? Alright!
So the devs can use 720p for all games now? Right?
Because there is nearly no difference at all between 720p and 1080p, right!?
 