I thought I'd repost my explanation of the resolution since it has gotten kinda buried.
Added to OP
Can we add this to the OP?
I hope this is just because the game was a launch game.
Though Titanfall is indeed rendering some more pixels per frame in the traditional meaning, the result on a 1080p display is overall worse IQ than Shadow Fall's approach. That's because, after rendering what pixels it does, Killzone then calculates an approximation for what the rest would've looked like if rendered. Titanfall, on the other hand, simply enlarges all its pixels to cover over the holes. That's why KZ MP, though blurrier than SP, is still sharper than Titanfall.
Oh, it does actually seem to be true: 1408x792 vs apparently 960x1080.
information from previously rendered frames is used to plug the gaps
I'd personally be pretty hesitant to call alternating 960x540 fields "native 1080i."
1080i is 'essentially' 60 complete separate images per second @ 960x540 layered as 1920x1080. This doesn't sound like what they are doing.
Would explain why it looks like blur
This one is a head scratcher because it seems they would still need to render the graphics at 50-60FPS even if they were interleaving the output for some reason. How can they just split up the geometry like that and render every other scan line every other frame? I can see how it outputs that way, but rendering that way seems totally different.
They don't, the output is a 1920x1080 progressive signal.
This is an interesting development. So the solution in use is a strange and esoteric trade-off that presents a faux 1080p. Interesting.
There are definitely ghosting artifacts.
Yes, but only for stuff with high parallax; the concertina further on doesn't have them. Overall it's an interesting choice. There's that ghosting on thin, fast-moving objects, and textures are blurred, but edge aliasing is reduced. Perhaps that's why Guerilla misleadingly referred to this technique as a "different type of AA in multiplayer"?
They don't, the output is a 1920x1080 progressive signal.
They render the fresh "960x1080" component of the current field. This data fills half the lines in the 1920x1080 buffer for the current frame. Then they fill in the gaps in the 1920x1080 buffer by projecting data from the previous frame into the missing pixels.
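The fill process described above can be sketched in a few lines. This is only a toy illustration of the general idea (render half the columns fresh each frame, fill the gaps from the last completed buffer), with made-up names and a tiny buffer; it is not Guerrilla's actual code.

```python
# Toy sketch: each frame natively renders only half the columns of a
# WxH buffer (even columns on even frames, odd on odd), and the gaps
# are filled from the previous frame's completed buffer.

W, H = 8, 4  # tiny stand-in for 1920x1080

def render_half_frame(frame_idx):
    """Pretend renderer: fresh pixel values for only this frame's columns."""
    parity = frame_idx % 2
    # {(x, y): value} — the pixel value is just the frame index here
    return {(x, y): frame_idx for y in range(H) for x in range(parity, W, 2)}

def composite(fresh, previous):
    """Build a full WxH buffer: fresh columns where rendered,
    the previous completed frame's pixels everywhere else."""
    out = [[None] * W for _ in range(H)]
    for y in range(H):
        for x in range(W):
            out[y][x] = fresh.get((x, y), previous[y][x] if previous else 0)
    return out

prev = None
for f in range(3):
    prev = composite(render_half_frame(f), prev)

# After frame 2: even columns hold frame-2 data, odd columns frame-1 data
print(prev[0])  # → [2, 1, 2, 1, 2, 1, 2, 1]
```

Note the output buffer is always full width, which is why the signal going to the display is genuinely 1920x1080 progressive even though each render pass only produced half the columns.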
I hope we get 1080p for 30fps games and 900p for 60fps games. It seems realistic enough to expect.
I hope so too but you know, depends on the game and the dev team.
lol. Though, I'm genuinely excited about those two things in next gen games. I guess my expectations for us getting 1080p all of the time are just really low.
There are four words you're going to hear a lot of when it comes to PS4, and together, they change the game.
better graphics
bigger worlds
Give me 960x1080 or give me death.
There's that ghosting on thin, fast-moving objects, and textures are blurred, but edge aliasing is reduced.
It shouldn't be, not compared to native 1920x1080, since they aren't actually increasing sampling. Maybe it's being turned into noise because of reprojection inaccuracies making things a bit more "stochastic," but any overall reduction in edge garbage should be a result of actual AA and not simply reprojecting things.
Yes, but only for stuff with high parallax; the concertina further on doesn't have them. Overall it's an interesting choice. There's that ghosting on thin, fast-moving objects, and textures are blurred, but edge aliasing is reduced. Perhaps that's why Guerilla misleadingly referred to this technique as a "different type of AA in multiplayer"?
As for people saying this approach is 1080i, it absolutely isn't. Nor is it "1920i", even though that's closer because the render is skipping columns, not rows. But it's not an interlaced output of any type; it does send 1920x1080 pixels in every frame (i.e. there are no fields).
And it's not a horizontal-only upscale either. That approach was used relatively frequently on PS3 last gen: take the pixels you've rendered, and stretch them wider to fill the screen. But Killzone's method doesn't do that; it fills the missing pixels with different information.
And that different information is not just the same pixels from the last frame. If it was, there would be vertical combing artifacts on every single object in motion, not just thin stuff that's crossing quickly, as seen above. So there's some algorithm in place which tries to interpolate things in an intelligent way (and fails on certain stuff in the ~20ms window it has).
All in all, there's no existing jargon to describe what's going on. The only accurate approach is a full description, like "It's 960x1080 temporally reprojected to 1080p." I know that's wordy and unsatisfying, but it's the best way. Talking about interlacing or upscaling is just not correct. (Though it's hard to get away from those words; I believe I may have used "interlaced" somewhere in here myself.)
I think calling it an "internal horizontal interlace with blending" is fair.
"Internal horizontal interlace with reprojection" might be more accurate.
"we can almost certainly assume that this effect is not cheap from a computational perspective."
I always thought the multiplayer looked off for some reason, and I also assumed it was bad FXAA, since we were told it would be 1080p already (although I always had an inkling that that would only mean 1080 vertical pixels so it could be called "1080p"). Is the game still going to be an example of "getting more out of closed hardware"?
The grey lines are dead space where nothing is being rendered on this frame.
So technically the game is rendering at 1920x1080, just not on each frame.
"Internal horizontal interlace with reprojection" might be more accurate.
"Blending" makes more sense when temporal sampling is used for "supersampling."
The fact that no one knew or cared about this shows just how little all this shit matters outside of petty console wars on either side. Always built up into the worst news ever when publicized, but if no one bothers to check, no one knows or cares at all.
The last sentence makes no sense. If you never render a frame natively in 1920x1080 then you are never rendering natively in 1920x1080...
You could make that claim if the game rendered at twice the target framerate and combined the two frames or something like that (which is similar to what some passive 3D sets do).
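That hypothetical (fresh half-frames at double rate, woven into full frames) is easy to sketch. A toy illustration with made-up names, describing the hypothetical above rather than any shipped renderer:

```python
# Toy sketch of the hypothetical: render half-width "fields" at twice
# the target rate, then weave each adjacent pair into one full frame,
# so every column is freshly rendered and no stale data is reused.
W = 8

def half_field(t, parity):
    """Fresh pixels for one column parity, 'rendered' at time t."""
    return {x: t for x in range(parity, W, 2)}

def weave(a, b):
    """Combine two complementary fields into one full-width scanline."""
    cols = {**a, **b}
    return [cols[x] for x in range(W)]

# Two 120 Hz fields (t=0 even columns, t=1 odd) -> one 60 Hz frame
frame = weave(half_field(0, 0), half_field(1, 1))
print(frame)  # → [0, 1, 0, 1, 0, 1, 0, 1]
```

The key difference from what Killzone does: here both fields are newly rendered, so nothing from an old frame ever reaches the screen; the cost is rendering at double the rate.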
If we didn't expect 1080p then why the fuck did we even upgrade from PS360?
More RAM for less constrained game design (hopefully). Still makes me really sad that Bioshock Infinite's map design was downgraded from its E3 trailer.
Okay, perhaps I misunderstood the terminology you're using. By "reprojection", do you mean simply repeating the identical pixels from the previous frame? Because that definitely doesn't seem like what they're doing. If it were, every object in motion should have vertical combing, but they certainly don't.
I don't think that's right; I believe it's always 1920 and it interleaves fields of 540 to make 1080i.
Agreed.
Okay, perhaps I misunderstood the terminology you're using. By "reprojection", do you mean simply repeating the identical pixels from the previous frame? Because that definitely doesn't seem like what they're doing. If it were, every object in motion should have vertical combing, but they certainly don't.
"Reprojection" in this case means they're likely shifting the positions of things based on a motion buffer, so that the content from the previous frame lines up better when placed alongside pixels from the current field. The most naive form of reprojection would indeed be to just repeat the pixels, which as you note would create more combing artifacts than we see.
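A toy 1D sketch (entirely my own construction, not engine code) of that difference: naively repeating last frame's pixels combs on both edges of a moving object, while shifting the previous frame's pixels along a motion buffer fixes the leading edge, leaving only the disoccluded trailing edge to ghost.

```python
# Toy 1D comparison: fill the unrendered (odd) columns of the current
# frame either by naively repeating last frame's pixels, or by
# reprojecting them along a per-pixel motion buffer.
W = 12

def scene(frame):
    """A 3-pixel 'object' (value 1) moving 2 px per frame, starting at x=4."""
    pos = 4 + 2 * frame
    return [1 if pos <= x < pos + 3 else 0 for x in range(W)]

prev, truth = scene(0), scene(1)
# Per-pixel motion buffer: 2 where the object sits now, 0 for background
motion = [2 if truth[x] else 0 for x in range(W)]

# Even columns are freshly rendered; odd columns must be filled in.
naive  = [truth[x] if x % 2 == 0 else prev[x] for x in range(W)]
reproj = [truth[x] if x % 2 == 0 else prev[x - motion[x]] for x in range(W)]

naive_errors  = sum(a != b for a, b in zip(naive,  truth))  # combing on both edges
reproj_errors = sum(a != b for a, b in zip(reproj, truth))  # trailing-edge ghost only
print(naive_errors, reproj_errors)  # → 2 1
```

The one remaining error is the disocclusion case: the background revealed behind the object was hidden last frame, so there is nothing valid to reproject — which lines up with the ghosting on thin, fast-moving objects noted earlier in the thread.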
Or does "reprojection" subsume some kind of interpolation, differing from "blending" only in the number of samples?
So that's basically 1080i but with the interlacing being done vertically instead of horizontally?
The native output is still 1920x1080, every other column is just not being rendered at the exact same time.
There were game reviews that noted specifically how blurry the MP was. People acting like "no one could tell the difference until we were told what the resolution was!" are trying too hard.
http://www.gameplanet.com.au/playst...3575/Killzone-Shadow-Fall-multiplayer-review/
So the mp looks blurry because it's not 1080p?
The neogaf hysteria about 1080p is just ridiculous.
So does this essentially trick the brain into feeling 1080p @ 60 fps?
It's certainly sharp and fluid, it's just that it introduces dithering and other artifacts when the interpolation doesn't quite work.
There's so much action in the game that you tend not to notice.
I tried going back and forth between MP and SP and they both looked just as sharp, but the SP had better picture quality and a lower framerate.