
Killzone: Shadow Fall Multiplayer Runs at 960x1080 vertically interlaced

hodgy100

Member
Why? It plays incredibly well and looks fucking amazing.

Seriously, did you really care about Killzone MP resolution three days ago? Why is it an issue now? The MP is fantastic to play.

Oh, I agree the game is great fun, but there was a noticeable IQ drop between MP and SP, and it's interesting to find out the cause, even if it's a bit disappointing that we were misled.
 
the multiplayer engine is a complete mess

*not 1080p
*frame rate all over the place
*needless reflections that probably hurt the frame rate
*low res effects
*huge amount of LOD pop-in
*annoying shadow pop-in

DICE beat GG this time around; they ported a PC engine to PS4 quickly, and it looks and performs way better.

DICE has their own issues to deal with, so I don't think that's a very good example. And please add players warping all over the place as another problem with the multiplayer. It's fun to just stand and watch everyone warp around the map.

Nobody needs to believe anything. We have a post history: the "graphics" have been discussed in the context of multiplayer in KZSF threads, and people were rightfully calling it downgraded compared to the singleplayer but came to the wrong conclusion about what method was employed.

Shouldn't have to explain anything to him. How a man can love a company so much I will never understand
 

zzz79

Banned

haha, GREAT!

It's time to blame GG/Sony for lying!
 

RVinP

Unconfirmed Member

That image looks better when it's downscaled/downsampled. I wonder if this technique could be used to (rough sketch below):
*render at more than 1920x1080 resolution, like 2304x1296, 2560x1440, 2688x1512, 2880x1620 or 3840x2160
*use this temporal rendering method to render each frame at half the horizontal resolution, like 1152x1296, 1280x1440, 1344x1512 (2.03MP), 1440x1620 or 1920x2160
*reconstruct the final frame at the full target resolution
*downsample to 1920x1080 on screen
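Rough numbers for the idea above (pixel counts only, with hypothetical targets; this says nothing about real GPU cost, bandwidth, or the reprojection overhead):

```python
# Per-frame pixel cost if only half the columns of each target are rendered,
# KZSF-MP-style, before reconstruction and downsampling.
targets = [(2304, 1296), (2560, 1440), (2688, 1512), (2880, 1620), (3840, 2160)]
native_1080p = 1920 * 1080

for w, h in targets:
    half_w = w // 2                 # only half the columns rendered per frame
    rendered = half_w * h
    print(f"target {w}x{h}: render {half_w}x{h} per frame = "
          f"{rendered / 1e6:.2f} MP ({rendered / native_1080p:.0%} of native 1080p)")
```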
 

Arklite

Member
Well, at least things make sense now. I always thought MP looked grainier because they skimped on AA or added some kind of strange filter. Kind of deflating that full HD is still so challenging for brand-new hardware; Guerrilla really seemed to be way ahead of the game.
 

Saty

Member
Just stating facts. I always thought that the 'For the Players' motto was a double-edged sword, because Sony is a corporation, and as one they're guaranteed to make, at the very least, questionable moves. But I didn't think it would happen this fast.
 
I would like to see this method applied to the singleplayer footage to see how much of an impact it has on the final image. Because it's only employed in the MP, you can't isolate how much of an impact it has.
A lot of the same assets are used in multiplayer as in the campaign--places, soldiers, weapons, etc. I think it's already really clear that this method looks noticeably worse than the same assets running in native 1080p.

A better comparison would be to see the multiplayer footage simply upscaled from 763p, without this new method, because that's the alternative they didn't take. Going by simple logic and by the appearance of regular upscaled games, I bet it would look a lot worse.
 

remnant

Banned
Just stating facts. I always thought that the 'For the Players' motto was a double-edged sword, because Sony is a corporation, and as one they're guaranteed to make, at the very least, questionable moves. But I didn't think it would happen this fast.

If Sony fucking up is releasing a game as awesome as killzone, then they are okay in my books.
 

sTaTIx

Member
Strider, Thief, and now this. If Infamous gets mediocre reviews, Sony GAF will be on suicide watch.

What about Thief? Last I checked, Thief runs and looks far better on PS4 than on Xbox One.

Also, "Sony GAF" will be fine so long as the sales numbers keep going the way they are.
 

benny_a

extra source of jiggaflops
A lot of the same assets are used in multiplayer as in the campaign--places, soldiers, weapons, etc. I think it's already really clear that this method looks noticeably worse than the same assets running in native 1080p.

A better comparison would be to see the multiplayer footage simply upscaled from 763p, without this new method, because that's the alternative they didn't take. Going by simple logic and by the appearance of regular upscaled games, I bet it would look a lot worse.
Either way, one needs a before/after done where everything is the same except the technique discussed here. That way it's isolated and then only the impact of said technique can be analyzed.

SP and MP parts between games can be quite different despite using the same assets. Some games even load a different variation of their engine.
 
That image looks better when it's downscaled/downsampled. I wonder if this technique could be used to:
*render at more than 1920x1080 resolution
*use this temporal rendering method to render each frame at half the horizontal resolution
*reconstruct the final frame at the full target resolution
*downsample to 1920x1080 on screen
Used that way, it would just be another method of AA...and one that generates artifacts scattered across the image, while being inaccurate in other areas. If you're able to render at a higher res than your output, that headroom would be better used on other methods, like MSAA or the regular types of temporal AA.
 

Alienous

Member
It's a matter of personal preference. I'd like devs to avoid upscaling, and this thread is the reason why: it is noticeably blurrier. If someone is fine with that and values other things more, for example framerate or graphics, then I can accept that opinion.

I disagree with the second part of your post. 720p is the standard resolution for last-gen console games; sub-720p and above-720p games are exceptions. First-party studios especially managed to steadily increase graphical fidelity without dropping resolution. I expect 1080p to be the standard resolution for PS4 throughout the whole cycle, but there will of course be exceptions. I also think we won't see a single 720p game on PS4.

Eh. If the Xbox One drops to a 720p standard, it's possible that some PS4 games will do the same in order to achieve better visuals outside of the resolution.
 

FeiRR

Banned
It's so pathetically smart. It essentially renders 1080p at the moment you most likely would notice 1080p and renders a faux 1080p or a blur when it knows you won't be paying attention.

You are close to devs so you should know that making a game look and run as good as possible with limited resources at hand is the most essential part of their work. If you can use tricks, you use them. It's not about a numbers race or console wars (PR takes care of that). It's about shipping an attractive product that people want to buy.

I smile (because I am a mean person) when I see people melt down over that famous Witcher gif or scraps of The Order footage. Because those people will go into meltdown again. Technology never lives up to the hype because hype has no limit.
 

velociraptor

Junior Member
KZSF's MP didn't have IQ as good as the SP's, but I always chalked that up to the FXAA. However, not once did I think to myself, 'this isn't 1080p.' It looks like 1080p.
 

d9b

Banned
Killzone's single player was/is visually spectacular in 1080p. Didn't play MP... If true, this sure is disappointing. Was there any official word from GG on this?
 
Someone with more skills than me needs to make a comedy Banderas gif maybe just showing the left half of the pic for 0.5s then the right half for 0.5s and so on.
 

Timu

Member
I think GG made a bad move by trying to go for 60fps in the MP. The end result was pretty shoddy.

They should have kept it at 1920x1080, lowered some effects, and tried to get a solid 30fps.


instead of this:
[screenshot]
Holy crap, that's noticeable. o_O Wish the plants had better AA.

[screenshot]


Hmm, I just noticed the odd low-res rock texture on the right.
 
Either way, one needs a before/after done where everything is the same except the technique discussed here. That way it's isolated and then only the impact of said technique can be analyzed.

SP and MP parts between games can be quite different despite using the same assets. Some games even load a different variation of their engine.
All true, and it would be interesting to see the difference from a technical standpoint. I just think we know the results in general even without the exercise: this method looks noticeably worse than native 1080p, but noticeably better than any form of simple upscaling. After all, that's exactly what brought us here: people could tell MP was worse than native 1080p singleplayer, but it didn't really look like a true upscale. For example:

KZSF's MP didn't have IQ as good as the SP's, but I always chalked that up to the FXAA. However, not once did I think to myself, 'this isn't 1080p.' It looks like 1080p.
 
It's so pathetically smart. It essentially renders 1080p at the moment you most likely would notice 1080p and renders a faux 1080p or a blur when it knows you won't be paying attention.

Damned good point. Am I right in thinking that most TVs will smear a bit anyway (because they're not properly set up -- see last night's TV setup thread), so you're not that likely to notice it? I guess the effect would be way more pronounced on a monitor?
 

R_Deckard

Member
The discussion was starting to take over the other thread about this article and since it's only one small point and we didn't know this before, it would be best to make another thread.



http://www.eurogamer.net/articles/digitalfoundry-2014-in-theory-1080p30-or-720p60

I was very disappointed to learn this. I always thought that the added blur in MP was due to a poor implementation of FXAA, but in reality it's not rendering many more pixels than 720p. I have no idea why this wasn't mentioned in their previous article about the game's tech, since the author clearly talked to GG about it back then. For me this really kills my opinion of the tech on the multiplayer side of the game: low res and can't hit 60fps regularly. They even claimed full res, too.

EDIT
Explanation of the upscaling process

I think it speaks volumes about the tech GG used and how well implemented it is: I work in tech and design, love games, and was one of the few to jump on the TR X1 lower-res claim before it was public, yet I missed this. No one in any part of the industry saw it; we all just thought a slightly blurry AA solution was, in fact, an interlaced display (for want of a better description).

It is bad that DF never saw this (or never announced it until now, all of a sudden) and that GG never came out and said it a little after the game launched. But then, looking at GAF with all the resolution-gate stuff, it's hardly surprising; I bet they can't believe they got away with it for so long.

I am not angry, just shocked, to be fair.
 
So do people think this method is better than 720p? It's only a slightly higher pixel count than 720p, but it can be mapped per-pixel on 1080p displays.
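For reference, the raw numbers behind "only slightly higher" (my arithmetic, not from the article):

```python
# Pixels actually rendered per frame in KZSF MP vs plain 720p.
kzsf_mp = 960 * 1080   # 1,036,800 pixels
p720 = 1280 * 720      #   921,600 pixels
print(f"{kzsf_mp / p720 - 1:.1%} more pixels per frame")  # 12.5% more
```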
 

sTaTIx

Member
Eh. If the Xbox One drops to a 720p standard, it's possible that some PS4 games will do the same in order to achieve better visuals outside of the resolution.

I highly doubt we'll ever see a 720p game on the PS4. Reason being, the Xbox One will never go below 720p on any game (for PR reasons), and the PS4 still outperforms the Xbox One handily at 900p (see: Battlefield 4).

Basically, the PS4 has shown that it can render a full 56% more pixels and still run at a significantly higher framerate than the Xbone.
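That 56% figure is just the 900p-vs-720p pixel ratio; quick check:

```python
# Battlefield 4: PS4 renders at 900p, Xbox One at 720p.
p900 = 1600 * 900   # 1,440,000 pixels
p720 = 1280 * 720   #   921,600 pixels
print(f"{p900 / p720 - 1:.0%}")  # 56% more pixels on PS4
```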
 
Holy crap, that's noticeable. o_O Wish the plants had better AA.

[screenshot]


Hmm, I just noticed the odd low-res rock texture on the right.
It's not regular aliasing you're seeing on the plants; it's artifacts from the reprojection method as the plants move. As for the "rock texture on the right", if you mean the one bound by the arc, that's actually a low-res reflection in the water.
 
So this is basically just vertically interlaced? That worked fine on old TVs because, the way they worked, the interlacing got ignored and the frames blended fine. I suppose this temporal blending is the modern equivalent of that!

But... it's a slightly weird existential question. At work, whenever we render sequences that are particularly heavy, we tend to render only every other frame and then see how well it blends using a frame-blending algorithm. Some of them work literally flawlessly, where comparing them to a fully rendered sequence shows no difference. Some work terribly and it turns into a mess. The algorithms are complicated, but they work -- mostly -- on a purely pixel basis. There is no additional information; even for pure CG we don't feed them anything extra (like velocity data) that could potentially make their calculations more exact.

I assume, however, that KZ does. It has all the information: if it's rendering what is effectively half a frame, it knows where all the objects are on screen, it knows where the shadows are falling (hohoho), it knows which objects are moving and which are static, how much the camera is moving, etc. What it doesn't have to do is actually sample the pixels, AA them and all that gubbins. As such, I imagine their ability to do this frame-blending stuff well is very, very high (and the proof is in the output, obviously). It also raises the question of what counts as 1080p. After all, the machine is outputting 1080p, and while half of the frame is "made up", it's "made up" using a load of very useful data. It'd be easier to count the things it's not doing than the things it is. Some games -- like KZ: Mercenary -- render certain effects at a lower resolution and overlay them onto the higher-resolution main render. What resolution is that? It's all CG. It's all computer generated. I can understand why 720p upscaled to 1080p can't be called "true" 1080p, but this? I dunno. I think it's more complicated than just "it's only rendering half the pixels".

Edit: This is all based on my understanding of the tech, which may be wrong.
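Here's a minimal sketch of what that could look like in practice: render only every other column each frame, then fill the missing columns by chasing per-pixel motion vectors back into the previous reconstructed frame. To be clear, this is a guess at the general shape of the technique (nearest-neighbour sampling, no occlusion handling), not Guerrilla's actual code:

```python
import numpy as np

H, W = 1080, 1920  # output resolution; only 960x1080 rendered per frame

def reconstruct(half_frame, prev_full, motion, parity):
    """Rebuild a full 1080p frame from freshly rendered columns plus history.

    half_frame: (H, W//2, 3) the 960 columns rendered this frame
    prev_full:  (H, W, 3) previous reconstructed frame
    motion:     (H, W, 2) per-pixel motion in pixels (dy, dx)
    parity:     0 if even columns were rendered this frame, else 1
    """
    full = np.empty((H, W, 3), dtype=half_frame.dtype)
    full[:, parity::2] = half_frame          # drop in the fresh columns

    # For the stale columns, look up where each pixel came from in the
    # previous frame (nearest-neighbour for brevity).
    ys, xs = np.mgrid[0:H, 0:W]
    src_y = np.clip(np.round(ys - motion[..., 0]).astype(int), 0, H - 1)
    src_x = np.clip(np.round(xs - motion[..., 1]).astype(int), 0, W - 1)
    missing = np.arange(W) % 2 != parity     # columns not rendered this frame
    full[:, missing] = prev_full[src_y[:, missing], src_x[:, missing]]

    # Where the motion vectors lie (disocclusions, transparency, particles),
    # this guess is wrong -- hence the shimmer people notice on moving plants.
    return full
```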
 

benny_a

extra source of jiggaflops
All true, and it would be interesting to see the difference from a technical standpoint. I just think we know the results in general even without the exercise: this method looks noticeably worse than native 1080p, but noticeably better than any form of simple upscaling. After all, that's exactly what brought us here: people could tell MP was worse than native 1080p singleplayer, but it didn't really look like a true upscale. For example:
Oh, for sure. This is just to satisfy my own curiosity. GG has done a lot with resolution: their 3D was done the same way WipEout HD's was, with the "stretched horizontal pixels"; KZ:M had dynamic resolution; and now this title has the never-before-seen interpolation.

It would just be fun to see it go up against various methods and how much of an impact they have.

I'm open to exploration in those realms, but I'm also a person who doesn't care that much about resolution past a certain threshold. I think Thief at 900p on Xbone vs. 1080p on PS4 is visually not representative of what a 44% resolution difference would suggest.
 

sTaTIx

Member
So do people think this method is better than 720p? It's only a slightly higher pixel count than 720p, but it can be mapped per-pixel on 1080p displays.

I never played KZ:SF, but considering it took people months to figure out for certain that the game's MP mode runs at non-native 1080p, I would definitely have to say it's superior to upscaled 720p. It may even be superior to upscaled 900p (depending on one's preferences as to the type/amount of scaling artifacts/blur).
 

RVinP

Unconfirmed Member
Used that way, it would just be another method of AA...and one that generates artifacts scattered across the image, while being inaccurate in other areas. If you're able to render at a higher res than your output, that headroom would be better used on other methods, like MSAA or the regular types of temporal AA.

Absolutely.

But when you downscale/downsample, some of the artifacts and inaccuracies would disappear, their footprint shrinking as the amount of downsampling goes up.

If a 2688x1512 resolution were targeted, the pixels to render in real time would be 1344x1512, or 2.03MP (note: 1920x1080 is 2.07MP). I'm also hoping the overhead of the motion compensation/interpolation algorithm plus downsampling wouldn't need too many resources. Then it might just be worth it.
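Checking those numbers:

```python
# 2688x1512 target with half the columns rendered per frame:
print(f"{1344 * 1512 / 1e6:.2f} MP per frame")        # 2.03 MP
print(f"{1920 * 1080 / 1e6:.2f} MP at native 1080p")  # 2.07 MP
```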
 

Alienous

Member
I highly doubt we'll ever see a 720p game on the PS4. Reason being, the Xbox One will never go below 720p on any game (for PR reasons), and the PS4 still outperforms the Xbox One handily at 900p (see: Battlefield 4).

Basically, the PS4 has shown that it can render a full 56% more pixels and still run at a significantly higher framerate than the Xbone.

Framerate and resolution aside (features you can't sell), if the Xbox One has Quantum Break at 720p and the PS4 has The Order: 1886 at nearly 1080p, the Xbox One's graphics are competitive (because resolution won't mean much in that debate). You'll get games that end up looking as good as each other, disregarding resolution.

However, in the event that a Sony game decides to go 720p, it will force an Xbox One game to go sub-HD or look worse. Upholding 1080p doesn't help PlayStation if Xbox sets the standard lower.
 
So do people think this method is better than 720p? only slightly higher pixel count compared to 720p but can be mapped per-pixel on 1080p displays.
Yes, very much better. It requires no upscaling whatsoever in the vertical direction, and it limits horizontal upscaling-type effects to small, selected areas of the screen. (Plus it's still rendering more pixels than 720p to begin with, even if only half as many as native 1080p.)

The real question is whether this method looks better than upscaled 792p or 900p. Those render more pixels overall, but then use coarser methods to spread them over a 1080p screen. Ignoring personal preferences, it seems likely that Guerrilla's method gives better IQ than 792p, and perhaps about equivalent to 900p.

In any case, I'd expect that Guerrilla wouldn't have gone to all the trouble of inventing an entirely new method if they didn't think it gave better results than a simple upscale using equivalent rendering power.
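For what it's worth, the per-frame pixel budgets of the options being compared (counts only; this says nothing about how each looks after scaling or reconstruction):

```python
# Pixels rendered per frame for each option, smallest to largest.
modes = {
    "720p (1280x720)":    1280 * 720,   # 0.92 MP
    "KZSF MP (960x1080)": 960 * 1080,   # 1.04 MP
    "792p (1408x792)":    1408 * 792,   # 1.12 MP
    "900p (1600x900)":    1600 * 900,   # 1.44 MP
    "native 1080p":       1920 * 1080,  # 2.07 MP
}
for name, px in modes.items():
    print(f"{name:>20}: {px / 1e6:.2f} MP")
```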
 

EL CUCO

Member
We've been bamboozled. I want to be mad at GG, but I'm too busy being fascinated by how they got away with it this long.

It's going to be an interesting day. Can't wait to hear the damage control on this one.
 

zoukka

Member
Lol even hardcore gamers can be fooled with resolution, how much do you guys think the average joe gives fucks? A little? Very little? Or not at all :)
 
Lol even hardcore gamers can be fooled with resolution, how much do you guys think the average joe gives fucks? A little? Very little? Or not at all :)

Tbh, if I had a PS4 and saw this in person, I would 100% notice. In fact, I did notice this in the pre-release footage, but then just doubted myself when Sony said it was 1080p.

A lot of people just took the devs and all the media content at their word (which is hilarious given so much of the resolution-gate stuff).

Also, shame on DF for shitting the bed here.
 

sTaTIx

Member
Lol even hardcore gamers can be fooled with resolution, how much do you guys think the average joe gives fucks? A little? Very little? Or not at all :)

Well, if you mean 1080p whores can be bamboozled by an image that's damn near equivalent to 1080p in detail/edge quality, then your point would be correct.

However, no one here would ever mistake a straight upscaled 900p or 720p image for a native 1080p image.
 

hesido

Member
This seems to be one of those cases where we unfairly omit how it looks in motion: due to the temporal nature of the technique, it is unfair to pick on screenshots. Picking on screenshots is part of the reason we can't have nice things (e.g. 60fps): devs prefer to make games that look good in screenshots but don't necessarily move fluidly (30fps games that don't even run at 30fps).

Edit: People seem to notice the blurriness during play too, but the situation shouldn't be as bad as it looks in screenshots.
 

remnant

Banned
Lol even hardcore gamers can be fooled with resolution, how much do you guys think the average joe gives fucks? A little? Very little? Or not at all :)

1. Even the most ridiculous resolution arguments conceded the fact that not everyone cares. It's an issue for those wanting to spend money on the best possible version of the game.

2. Many people, including myself, noticed the disparity between the SP and MP versions. If anything, this goes to show that resolution/fps is something you can notice.
 
We've been bamboozled. I want to be mad at GG, but I'm too busy being fascinated by how they got away with it this long.

It's going to be an interesting day. Can't wait to hear the damage control on this one.

I doubt there will be any. The game has already been out for four months. What damage control needs to be done when Infamous, The Order, and other anticipated games with higher resolutions are coming out? The only people who are going to hold on to this are disgruntled Xbox users who can finally find a game in the Sony arsenal that isn't exactly what it was claimed to be. AC4, BF4, NBA 2K14, and others remain unbothered. At this point it's a non-issue, but moving forward (for Sony exclusives) we will have to keep a keen eye out.
 

Zephyx

Member
The real question is whether this method looks better than upscaled 792p or 900p. Those render more pixels overall, but then use coarser methods to spread them over a 1080p screen. Ignoring personal preferences, it seems likely that Guerrilla's method gives better IQ than 792p, and perhaps about equivalent to 900p.

The main issue I can see here is that even if they're giving the illusion of 1080p, the interpolation algorithm doesn't bode well for framerate. We don't know how costly it is in resources, or whether the code is still in its infancy (a possibly unoptimized algorithm) and that's what is causing performance issues. The tech is indeed amazing, but it appears the trade-off they made wasn't enough to provide the 60fps experience people were looking for. I hope GG can give more insight on this.
 

leng jai

Member
Resolutiongate: the placebo effect at its best. Tell everyone a game is 1080p when it isn't, and no one notices anything wrong. Months later it's found that (god forbid) it isn't actually the p's they were promised, and of course people 'always thought something looked off'. I hope one day we'll be able to just enjoy games and appreciate them for how good they actually look, rather than going back and forth counting pixels. The game looks great and performs well in multiplayer, which is absolutely crucial.

Read the thread for christ's sake.
 

sTaTIx

Member
I wonder if Titanfall could be rendered at this res instead of 792p, since the pixel count isn't that different.

I'm assuming that this temporal interpolation method uses a nice, big chunk of video memory. 32MB of ESRAM would prohibit this method from ever being used in an Xbox One game.

Which is a shame (and ironic), because if there were ever a console that desperately needed a higher-quality scaling technique, it would be the Xbone.
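Purely as a back-of-the-envelope guess at the footprint (the buffer formats here are my assumption; nothing about the real implementation is public):

```python
# A plausible extra cost: a full-res history color buffer plus a per-pixel
# motion vector buffer, on top of the usual render targets.
H, W = 1080, 1920
history_rgba8 = W * H * 4   # previous reconstructed frame, 8-bit RGBA
motion_rg16f = W * H * 4    # two 16-bit float motion components per pixel
extra_mb = (history_rgba8 + motion_rg16f) / 2**20
print(f"~{extra_mb:.1f} MB extra")  # ~15.8 MB against a 32 MB ESRAM budget
```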
 