
Guerrilla Games: Regarding Killzone Shadow Fall and 1080p

RoboPlato

I'd be in the dick
I really find it rather amusing that you people are retroactively disappointed knowing this :D
This proves that we can tell the difference even when being told it was native 1080p, and that in certain situations it can even defeat pixel counting. We didn't notice the reprojection artifacts because the effect hadn't been used before.
 

Begaria

Member
After reading that explanation, this is all I could think of:



GREAT SCOTT!

What...what have I done? I don't believe this. Misterxmedia just used my picture in today's post XD

I AM SO SORRY EVERYONE.

For the record, the picture was a joke. Not the explanation from GG.
 

TyrantII

Member
960 x 1080 is a lot fewer pixels than 1600 x 900; heck, it's even fewer than 1408 x 792. Should developers start using that resolution going forward instead of 900p, as was the case with BF4? I guess I would have to compare BF4 using temporal 960 x 1080 vs 1600 x 900 before concluding which is better.
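A quick back-of-the-envelope check of the pixel counts in the quoted comparison (just arithmetic; nothing here comes from GG's actual pipeline):

```python
# Per-frame pixel counts for the resolutions being compared.
def pixels(w, h):
    return w * h

half_1080 = pixels(960, 1080)   # KZ:SF MP renders half the columns each frame
bf4_res   = pixels(1408, 792)   # BF4's PS4 multiplayer resolution
res_900p  = pixels(1600, 900)   # "900p"
full_1080 = pixels(1920, 1080)  # full native 1080p

print(half_1080, bf4_res, res_900p, full_1080)
# 1036800 1115136 1440000 2073600
```

So the temporal half-frame is about 72% of 900p's per-frame shading load, and exactly half of native 1080p.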

Can't forget the two big bullet points:
It's all along the vertical axis (less noticeable to the human eye)
It's a calculation, so some number of them ARE accurate as a pure render.

That's not the case with a simple upscale. You're getting a much more accurate (better clarity) image.
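The interleave described above can be sketched in a few lines of toy Python. This is a static-camera simplification under stated assumptions: each frame natively renders half of the 1920 columns, and the missing columns are filled from the previous reconstructed frame. GG's real implementation reprojects the old columns along motion vectors rather than copying them, and the names here (`reconstruct`, `even_phase`) are invented for illustration:

```python
# Toy interlaced temporal reconstruction: alternate columns come from the
# fresh half-width render, the rest from the previous full frame.
FULL_W, HALF_W, H = 8, 4, 2  # tiny stand-ins for 1920, 960, 1080

def reconstruct(half_frame, prev_full, even_phase):
    """Merge this frame's half-width render with previous-frame columns."""
    full = [row[:] for row in prev_full]  # start from the reprojected history
    for y in range(H):
        for i in range(HALF_W):
            x = 2 * i + (0 if even_phase else 1)  # columns owned this frame
            full[y][x] = half_frame[y][i]
    return full

prev = [[0] * FULL_W for _ in range(H)]  # last reconstructed frame (all "old")
half = [[1] * HALF_W for _ in range(H)]  # freshly rendered columns (all "new")
frame = reconstruct(half, prev, even_phase=True)
print(frame[0])  # [1, 0, 1, 0, 1, 0, 1, 0] - new and old columns interleaved
```

When the camera is still, the "old" columns are effectively as accurate as a pure render, which is TyrantII's second bullet point; when it moves fast, the prediction for those columns can miss, which is the blur people describe.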
 
I'm willing to drop calling it 1080p as long as people quote that entire thing when they mention KZ multiplayer res. It's 1:1 pixel mapping with the rest rendered through prediction using past frames, and IMO it's still sharper than any upscaled game I've played.
 
They're being surprisingly transparent, huh. Even though Shadowfall failed in terms of gameplay design, it seems like they were trying to set a technical precedent for other PS4 games to follow, from the lack of annoying health warnings and splash screens to the native resolution.
 
I am increasingly drawn to the technical side of gaming and found this description of what they are doing fascinating. They said that they will be releasing the slides of their GDC talk, but I wonder if anyone knows whether the actual GDC presentations are made public? I'd love to see them.
 

hodgy100

Member
Please, stop.

Just stop man. Please.

Despite his dubious techniques of getting his point across, his post is 100% correct.

But again we have a moving target as far as definitions go. What exactly does rendering mean? Is that an objective term that everyone agrees on? I'd say it's not. When I think of native 1080p output, what I'm looking for is whether the final rendered image sent to the display is mapped 1:1 onto the pixels of my 1080p TV.

It's not the developers' fault that a lot of these standards are not broken down and completely separated from the actual rendering side of things. Clearly, rendering the final output at a lower resolution and upscaling is not native. Constructing all 1920 vertical lines of pixels using current- and past-frame information and then sending it to the display counts as native by at least one objective definition.

If gamers are serious about knowing how a game is performing then I guess it's time to demand a RENDERED and OUTPUT breakdown on every box to clear up confusion. Until then, don't accuse people of lying when they meet objective standards.

The thing is, that doesn't mean it's 1080p. It means the pixels are mapped 1:1, but by your definition an image like this would also count as 1080p:

[a 1280 x 720 screenshot centered on a 1920 x 1080 black canvas]

But we know for a fact that the game is being rendered at 1280 x 720 and the black borders make up the rest of the image. This is not 1920 x 1080, despite that being the resolution of the image. Likewise, The Order is not rendered at 1080p either, because it is rendered at 1920 x 800, not 1920 x 1080; the rest of the image is made up of black bars, exactly the same as the image above. Therefore, when stating the resolution of a game, it is by far best to stick with its rendering resolution. Anything else allows people to muddy the waters.
 
Despite his dubious techniques of getting his point across, his post is 100% correct.

No it isn't.

The dev could choose to render those black bars in a 2.35/2.40:1 presentation - that would be a 1080p image, no ifs or buts. Of course the decision - at least in film - is not to render the black bars, as it saves on bandwidth.
 

Metfanant

Member
Despite his dubious techniques of getting his point across, his post is 100% correct.



The thing is, that doesn't mean it's 1080p. It means the pixels are mapped 1:1, but by your definition an image like this would also count as 1080p:



But we know for a fact that the game is being rendered at 1280 x 720 and the black borders make up the rest of the image. This is not 1920 x 1080, despite that being the resolution of the image. Likewise, The Order is not rendered at 1080p either, because it is rendered at 1920 x 800, not 1920 x 1080; the rest of the image is made up of black bars, exactly the same as the image above. Therefore, when stating the resolution of a game, it is by far best to stick with its rendering resolution. Anything else allows people to muddy the waters.

EXCEPT....unless your TV has a 1:1 pixel mapping feature (most don't), you can't display Killzone 2/3 on your TV in that fashion...in regards to The Order...the PS4 is rendering and outputting a full 1920x1080 image with 280 lines of black pixels at the top and bottom...

there is a MAJOR difference there...
 
Rereading it, I still don't know why they went with a technique like this that seems so complex. The results look surprisingly good at 60fps, but the game hits that far too rarely to make the drop in image quality worth it. If this technique can improve to the point where there is a bit less blur and it can guarantee 60fps all the time, then it could be very worthwhile for avoiding upscaling artifacts in situations where 1920x1080 isn't achievable.
Agree. As it stands in practice, this solution hurt the game. In an FPS game, slightly soft picture >>>> motion blur.

To be sure the technique has some promise. It looks great when standing still or looking around while aiming down sight because the motion is limited. But when it's time to spin or run, the blur is at best distracting...and for myself and others, headache-inducing (actual real life pain and discomfort).

I do feel deceived. Not because of the resolution, but because the blur-inducing technique used to simulate normal native rendering introduces problems that don't need to exist, purely to check off the 1080p box. I would very much like to have seen what a normal 900p version of KZ's MP looked like. I'm pretty sure I would have preferred it to all the blur and artifacting we see here just to attain a quasi-1080p. I'm not really sure how any of the many apologists can be okay with this motion blur, let alone with this being detailed only now, after most of the sales that were going to happen have been made and only after someone else figured out the truth. How very convenient.

They need to either make their guessing algorithm better to significantly reduce the motion blurring or not use the technique at all. For FPS games, anyway. It's not worth it. Unless your objective in a competitive online shooter is to move slowly and look at the scenery.
 

chadskin

Member
Despite his dubious techniques of getting his point across, his post is 100% correct.

Nope.

But we know for a fact that the game is being rendered at 1280 x 720 and the black borders make up the rest of the image. This is not 1920 x 1080, despite that being the resolution of the image. Likewise, The Order is not rendered at 1080p either, because it is rendered at 1920 x 800, not 1920 x 1080; the rest of the image is made up of black bars, exactly the same as the image above. Therefore, when stating the resolution of a game, it is by far best to stick with its rendering resolution. Anything else allows people to muddy the waters.

The Order is made with the black bars in mind, though, be it as a stylistic device or whatever. It can either be rendered at 1920 x 1080 with forced black bars internally rendered and then output, or at 1920 x 800, which results in black bars enforced by the TV and a 1920 x 1080 output image. Either way: 1:1 pixel mapped.

The 720p image is just a 720p image not upscaled to the 1080p display. Different story.
 

hodgy100

Member
No it isn't.

The dev could choose to render those black bars in a 2.35/2.40:1 presentation - that would be a 1080p image, no ifs or buts. Of course the decision - at least in film - is not to render the black bars, as it saves on bandwidth.

Can you clarify this for me? It seems to be implying that rendering the black bars would use up bandwidth, which is clearly incorrect. The game would not render a 1920 x 1080 scene and then place the black bars over it. Yes, some games do this for cutscenes, but that is not the case for The Order: 1886; the entire game is like that, and it would be completely wasteful.

Nope.

The Order is made with the black bars in mind, though, be it as a stylistic device or whatever. It can either be rendered at 1920 x 1080 with forced black bars internally rendered and then output, or at 1920 x 800, which results in black bars enforced by the TV and a 1920 x 1080 output image. Either way: 1:1 pixel mapped.

And yes, my image is just another image pasted on a 1920 x 1080 black box, but if that was an actual game rendering like that, with the borders being put in, then by your definition it would be rendering at 1080p.
The 720p image is just a 720p image not upscaled to the 1080p display. Different story.

I'm not disputing that it was a stylistic choice; I'd probably go with that aspect ratio too if it meant the performance for 4xAA while keeping 1:1 pixel mapping. But that's beside the point: nothing is calculated for the black borders, the console is only doing work for the 1920 x 800 pixels, and this is why it is not 1080p. The working rendering resolution of the game is 1920 x 800. I don't know how this can be explained any simpler.
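The workload gap being argued over here is easy to put in numbers (simple arithmetic, assuming the letterboxed rows really are never shaded, as hodgy100 describes):

```python
# Letterboxed 1920x800 render vs a full 1920x1080 frame.
rendered = 1920 * 800         # pixels the GPU actually shades per frame
output   = 1920 * 1080        # pixels sent to the display
bars     = output - rendered  # pixels covered by the black bars
rows     = bars // 1920       # black rows: 140 at the top + 140 at the bottom

print(rendered, output, rows)  # 1536000 2073600 280
```

So about 26% of the output frame is black bars that cost (essentially) nothing to produce, which is exactly why the two sides disagree on whether "1080p output" is a meaningful label for it.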
 

Vintage

Member


I love reading about optimization techniques. It's amazing how people can come up with such ideas.

And btw, EVERYTHING in video games is tricks and lies.
 

nelchaar

Member
Sounds like a slippery explanation.

Absolute nonsense. Trying to redefine what 1080p means. It's hard enough for people like me to understand these things.

Edit: It makes sense to everyone else, so it's me who is speaking nonsense.

I think that's what you call 'baffling people with science'

People asked Guerilla for an explanation of what they had done to the multiplayer, calling them dishonest for not detailing their technique, and Guerilla obliged. People then call out Guerilla for over-explaining.

Sounds like some people just want to see Guerilla flayed. Truth of the matter is, they are a highly technical studio that pioneers a lot of techniques. You asked for a detailed explanation, and you got it. You can't then claim that it's "baffling". That's YOUR shortcoming.
 

eso76

Member
This proves that we can tell the difference even when being told it was native 1080p, and that in certain situations it can even defeat pixel counting. We didn't notice the reprojection artifacts because the effect hadn't been used before.

Few people noticed any difference and those who did thought it was just a different AA solution.
No one ever doubted it was native 1080p afaik

Some of you guys should really stop using numbers to understand if a game should or shouldn't look good to you.

Yes, you may be pissed if it negatively impacts IQ, but would you rather throw the performance boost away? It's not like you could have both anyway.

Devs should use all the tricks they can come up with; that's what optimisation is all about.
 
This proves that we can tell the difference even when being told it was native 1080p, and that in certain situations it can even defeat pixel counting. We didn't notice the reprojection artifacts because the effect hadn't been used before.

LOL, not a single person ever said it was sub 1080p native. Not even the "insiders" who always leak resolutions.
 

Chumpion

Member
Notwithstanding their 1080p claims, it's good these techniques are being developed. When VR hits, you can't drop frames any more (unless you want your customers to puke their guts out), so dynamic resolution schemes become very important.
 

RoboPlato

I'd be in the dick
LOL, not a single person ever said it was sub 1080p native. Not even the "insiders" who always leak resolutions.
We noticed a big difference in image quality but tried to attribute it to other things based on what we knew. AA was the most obvious, since they said the method differed between Campaign and MP.
 

thuway

Member
Rereading it, I still don't know why they went with a technique like this that seems so complex. The results look surprisingly good at 60fps, but the game hits that far too rarely to make the drop in image quality worth it. If this technique can improve to the point where there is a bit less blur and it can guarantee 60fps all the time, then it could be very worthwhile for avoiding upscaling artifacts in situations where 1920x1080 isn't achievable.

Well, this is -

A. A launch title
B. First time this technique has ever been used
C. The first of its kind

Lots of room to improve and I expect this will be the only way we can ever get 4k games.
 

RoboPlato

I'd be in the dick
Agree. As it stands in practice, this solution hurt the game. In an FPS game, slightly soft picture >>>> motion blur.

To be sure the technique has some promise. It looks great when standing still or looking around while aiming down sight because the motion is limited. But when it's time to spin or run, the blur is at best distracting...and for myself and others, headache-inducing (actual real life pain and discomfort).

I do feel deceived. Not because of the resolution, but because the blur-inducing technique used to simulate normal native rendering introduces problems that don't need to exist, purely to check off the 1080p box. I would very much like to have seen what a normal 900p version of KZ's MP looked like. I'm pretty sure I would have preferred it to all the blur and artifacting we see here just to attain a quasi-1080p. I'm not really sure how any of the many apologists can be okay with this motion blur, let alone with this being detailed only now, after most of the sales that were going to happen have been made and only after someone else figured out the truth. How very convenient.

They need to either make their guessing algorithm better to significantly reduce the motion blurring or not use the technique at all. For FPS games, anyway. It's not worth it. Unless your objective in a competitive online shooter is to move slowly and look at the scenery.
I think the blur you're referring to is the fake ass motion blur filter they put around the edges of the screen in MP. KZ3 and Mercenary do the same and it looks awful.

Few people noticed any difference and those who did thought it was just a different AA solution.
No one ever doubted it was native 1080p afaik

Some of you guys should really stop using numbers to understand if a game should or shouldn't look good to you.

Yes, you may be pissed if it negatively impacts IQ, but would you rather throw the performance boost away? It's not like you could have both anyway.

Devs should use all the tricks they can come up with; that's what optimisation is all about.
The reason I don't care for it is that it isn't enough of a performance boost to negate the IQ tradeoff. At 60fps it's quite convincing, but the game isn't near that often enough to make it worthwhile.

Well, this is -

A. A launch title
B. First time this technique has ever been used
C. The first of its kind

Lots of room to improve and I expect this will be the only way we can ever get 4k games.
I agree with you. Technique has promise, just not quite there yet.
 
We noticed a big difference in image quality but tried to attribute it to other things based on what we knew. AA was the most obvious, since they said the method differed between Campaign and MP.

I never said people didn't notice a difference. I've played the game and noticed a difference myself. The simple fact is that none of the "resolution warriors" ever said it was sub-native 1080p, yet these same exact people are the ones in all of these resolution threads claiming expertise on the subject.
 

imtehman

Banned
All I have to say is that this is why I like consoles! I mean, some people try to make this about the capabilities of the PS4 and, by extension, the new gen, but coming up with techniques that squeeze every last drop of power out of the hardware is what consoles are about in my book. The best pixel-per-dollar ratio. Kudos to GG for an analysis that is both detailed and easy to follow.

You sure it's squeezing out the hardware and not readjusting to fit the limits of the hardware? Surely they wouldn't have to do this "technique" if the PS4 was capable of running KZ at 1920x1080 @ 60fps.
 

Metfanant

Member
You sure it's squeezing out the hardware and not readjusting to fit the limits of the hardware? Surely they wouldn't have to do this "technique" if the PS4 was capable of running KZ at 1920x1080 @ 60fps.

compare Resistance: Fall of Man to TLoU....then think about the word optimization...then rethink your post...
 

railGUN

Banned
I never said people didn't notice a difference. I've played the game and noticed a difference myself. The simple fact is that none of the "resolution warriors" ever said it was sub-native 1080p, yet these same exact people are the ones in all of these resolution threads claiming expertise on the subject.

Hard to argue when the developer explicitly states it was 1080p. Also, it's a bold claim that "no one" entertained the idea it was sub-1080p, what with the number of posters on this board. I actually find that hard to believe.
 

VanWinkle

Member
I never said people didn't notice a difference. I've played the game and noticed a difference myself. The simple fact is that none of the "resolution warriors" ever said it was sub-native 1080p, yet these same exact people are the ones in all of these resolution threads claiming expertise on the subject.

If a developer straight up says the multiplayer is "native 1080p", and we know that they used a different AA from single player to multiplayer, it's fair to assume that it must be the AA.
 
I think the blur you're referring to is the fake ass motion blur filter they put around the edges of the screen in MP. KZ3 and Mercenary do the same and it looks awful.
Oh? If that's the case I'm okay with the technique. I suppose I need further clarity on whether the blur I see all over the screen when running is intentional (and why the fuck would you do that, Guerilla? It looks awful!) or secondary to this rendering technique. My impression has been that it's the latter.

Any clarity on this point will be appreciated.
 

dark10x

Digital Foundry pixel pusher
It definitely is weird that no insider caught wind of this and that DF didn't bother mentioning it till now.
It looks like 1080p. That's why. There's something strange about it but there's really minimal loss in detail even in motion.

It doesn't look upscaled, basically.
 

RoboPlato

I'd be in the dick
Oh? If that's the case I'm okay with the technique. I suppose I need further clarity on whether the blur I see all over the screen when running is intentional (and why the fuck would you do that, Guerilla? It looks awful!) or secondary to this rendering technique. My impression has been that it's the latter.

Any clarity on this point will be appreciated.
Is it in the center of the screen or is there an oval of clarity in the middle while the edges are blurred?
 
I never said people didn't notice a difference. I've played the game and noticed a difference myself. The simple fact is that none of the "resolution warriors" ever said it was sub-native 1080p, yet these same exact people are the ones in all of these resolution threads claiming expertise on the subject.

yeah man if they were really 'experts' they'd have had detailed knowledge of, and the ability to recognize immediately, a new rendering technique that still produced a pixel-countably 1080p image alongside assertions that the game was native 1080p.

do you even
read
bro
 

MrZekToR

Banned
Just wait until MS unleash their dual Xbox One SLI configuration...

...there's a reason why the machine has HDMI input.

:eek:)
 
I never said people didn't notice a difference. I've played the game and noticed a difference myself. The simple fact is that none of the "resolution warriors" ever said it was sub-native 1080p, yet these same exact people are the ones in all of these resolution threads claiming expertise on the subject.

Because it uses a new technique that nobody's ever seen before, that gives different results that nobody's ever really seen before.

It's also probably why no "insider" ever heard about it. "We're targeting 1080p on this platform and 720p on the other" is easy to understand and something that can easily leak because everybody would understand it. "The tech guys have cooked up this weird new pseudo-1080p mode involving an image that's filled in with algorithms and it only happens in multiplayer and it looks almost exactly like 1080p if the camera isn't moving" isn't.
 

dark10x

Digital Foundry pixel pusher
Because it uses a new technique that nobody's ever seen before, that gives different results that nobody's ever really seen before.

It's also probably why no "insider" ever heard about it. "We're targeting 1080p on this platform and 720p on the other" is easy to understand and something that can easily leak because everybody would understand it. "The tech guys have cooked up this weird new pseudo-1080p mode involving an image that's filled in with algorithms and it only happens in multiplayer and it looks almost exactly like 1080p if the camera isn't moving" isn't.
That's the thing, it still looks like 1080p even when the camera is moving.
 
Can you clarify this for me? It seems to be implying that rendering the black bars will use up bandwidth, and this would be clearly incorrect.
Well, if that's true, I stand corrected.
The game would not render a 1920 x 1080 scene and then place the black bars over it. Yes, some games do this for cutscenes, but this is not the case for The Order: 1886; the entire game is like that, and it would be completely wasteful.
And the entire game may be rendering at 1080p with black bars, rather than at "800p" with the bars then inserted by either the console's scaler or the TV.
 
Sigh. Still amazes me that this turned into such a shit storm.

You guys remember Gamescom last year? IW and MS teamed up to make a special announcement about dedicated servers for COD Ghosts on the XB1. Then we all spent months debating whether the other systems would get them. Then, right before launch, IW announced that all systems would get them.

Then the game came out, and in the first game I play I get hit with a host migration screen. Turns out, they're using a "hybrid" system, where you "might" be playing on a dedi or you "might" be playing P2P, depending on your connection. And by saying it "might" be P2P, what they really mean is that it's pretty much P2P, which I realized after playing a couple more weeks with constant black screens.

Seems to me that this was a hell of a lot bigger of a deal than what flavor of pixel math gets used in one mode of the game, but I don't remember IW eating anywhere near the amount of shit for it that GG is having flung at them right now...

*shrugs*
 
Because it uses a new technique that nobody's ever seen before, that gives different results that nobody's ever really seen before.

It's also probably why no "insider" ever heard about it. "We're targeting 1080p on this platform and 720p on the other" is easy to understand and something that can easily leak because everybody would understand it. "The tech guys have cooked up this weird new pseudo-1080p mode involving an image that's filled in with algorithms and it only happens in multiplayer and it looks almost exactly like 1080p if the camera isn't moving" isn't.


I know it's a new technique, but this throws a monkey wrench into the whole "native 1080p" argument and puts EVERY SINGLE game announced at a native resolution into question. How do we know other games don't use similar graphical techniques?
Should we put a * next to games that use these types of techniques?
 