
Polygon: Why frame rate and resolution matter: A graphics primer

You're trying to be edgy, but good upsampling does in fact do exactly that (try to maintain sharp detail while not introducing artifacts).


Of course, I very strongly disagree with the notion that better upscaling technology can render the difference between native- and non-native-resolution realtime rendering immaterial.

I mean, real life isn't CSI; you can't create information that isn't there, but you can do a lot better than nearest neighbor. But still.
 
I mean, real life isn't CSI; you can't create information that isn't there, but you can do a lot better than nearest neighbor. But still.
Basic upscaling techniques are pretty much all a compromise of blurriness, ringing, and funky patterned artifacts (typically blocking, as with nearest-neighbor). Eliminating blurriness in upscaling isn't even that difficult, but that doesn't mean you have a great image.

Scenes (in both the real world and video games) consist heavily of hard edges on opaque objects. By making the assumption that something that looks like a hard edge is a hard edge, it's entirely possible to create algorithms that actually preserve clean edges as clean edges at a higher resolution.
(In much the same way that SMAA can replace stairsteps with correct-coverage gradients, by assuming that all "stairstep" patterns are evil jaggies.)

Not that even such sophisticated approaches make upscaling perfect. If you want a sharp image, fine, but you're still going to deal with aliasing on small details. More sampling, i.e. higher resolution, can always help there. If you want to reproduce a scene into a high-resolution buffer, it turns out to be handy to have more information about the scene.
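To make that compromise concrete, here's a minimal sketch (assuming Pillow is installed; "frame.png" is a hypothetical input, not anything from the article) comparing the stock resampling filters, each sitting at a different corner of the blocking/blurring/ringing triangle:

```python
# Minimal sketch of the trade-off using Pillow's standard resampling filters.
# "frame.png" stands in for a hypothetical 1280x720 render; paths are illustrative.
from PIL import Image

src = Image.open("frame.png")
target = (1920, 1080)

nearest = src.resize(target, Image.NEAREST)    # sharp but blocky
bilinear = src.resize(target, Image.BILINEAR)  # smooth but blurs hard edges
lanczos = src.resize(target, Image.LANCZOS)    # sharper edges, but can ring

for name, img in (("nearest", nearest), ("bilinear", bilinear), ("lanczos", lanczos)):
    img.save(f"upscaled_{name}.png")
```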
 
That's one change in tone from their PS4/Xbone launch streams.

Hasn't Polygon spent the last year telling us that Frame Rate and Resolution DON'T matter? Wonder what made them change their tune. Did Gies get fired?

This is rich coming from Polygon.

I see, so now it matters.

Jesus. You people are insufferable. More than one person works at Polygon. Give up already.
 
Basic upscaling techniques are pretty much all a compromise of blurriness, ringing, and funky patterned artifacts (typically blocking, as with nearest-neighbor). Eliminating blurriness in upscaling isn't even that difficult, but that doesn't mean you have a great image.

Scenes (in both the real world and video games) consist heavily of hard edges on opaque objects. By making the assumption that something that looks like a hard edge is a hard edge, it's entirely possible to create algorithms that actually preserve clean edges as clean edges at a higher resolution.
(In much the same way that SMAA can replace stairsteps with correct-coverage gradients, by assuming that all "stairstep" patterns are evil jaggies.)

Not that even such sophisticated approaches make upscaling perfect. If you want a sharp image, fine, but you're still going to deal with aliasing on small details. More sampling, i.e. higher resolution, can always help there. If you want to reproduce a scene into a high-resolution buffer, it turns out to be handy to have more information about the scene.

I know about more interesting upscaling techniques; I was just pointing out that you still have no more information than what you started with, so you can never be as good as a native 1080p image, unless the 1080p image is very flat, or its Fourier transform somehow has nothing in its high-frequency domain.
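To put a number on that, here's a quick numpy sketch (purely illustrative) showing that even ideal sinc upsampling of a 720-sample signal to 1080 samples creates exactly nothing above the source's Nyquist limit:

```python
# Ideal band-limited upscale of one "720p scanline" to 1080 samples:
# zero-pad the spectrum, inverse transform, then inspect the new bins.
import numpy as np

rng = np.random.default_rng(0)
low = rng.standard_normal(720)            # stand-in for a 720-sample scanline

spectrum = np.fft.rfft(low)
padded = np.zeros(1080 // 2 + 1, dtype=complex)
padded[: spectrum.size] = spectrum
high = np.fft.irfft(padded, n=1080) * (1080 / 720)   # amplitude-preserving

new_bins = np.fft.rfft(high)[spectrum.size :]        # above the original Nyquist
print(np.max(np.abs(new_bins)))                      # ~1e-13, i.e. numerically zero
```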

Jesus. You people are insufferable. More than one person works at Polygon. Give up already.

As someone explained earlier, there's a position called "editor", whose job is to ensure consistency of quality and message across a site.
Of course, Polygon's opinion piece editor is Ben Kuchera, and the less said about that the better.
 
I know about more interesting upscaling techniques; I was just pointing out that you still have no more information than what you started with, so you can never be as good as a native 1080p image, unless the 1080p image is very flat, or its Fourier transform somehow has nothing in its high-frequency domain.



As someone explained earlier, there's a position called "editor", whose job is to ensure consistency of quality and message across a site.
Of course, Polygon's opinion piece editor is Ben Kuchera, and the less said about that the better.

Ok, but what's being discussed here has nothing to do with the quality of output. And since when is it an editor's duty to ensure "consistency in message"? False. Not every publication operates like the National Review.

I don't even see how what's being said here is in direct opposition to earlier points made about the relevance of resolution/framerate to the masses. At the end of the day, it's mostly people on forums such as GAF that fetishize this stuff. I'm one of them, lol. What, were they supposed to fully endorse the PS4? Is that what you guys are waiting on?

That's right. You should just accept shitty 'journalism', guys. Come on now.

I think a lot of people are confused about the meaning of 'editorial'. Add 'reviews' to that as well.
 
Jesus. You people are insufferable. More than one person works at Polygon. Give up already.

When an article is approved by a company to be posted on its site as an official Polygon article, I don't see why we can't reach the conclusion that it's the company's stance on a particular topic. Unless the article being posted was specifically labeled as an opinion piece / blog post.

Gies could have just tweeted it if it was his own personal opinion and not representative of how the staff at Polygon feels.


Edit: If it was an editorial, that changes things.
 
I know about more interesting upscaling techniques; I was just pointing out that you still have no more information than what you started with, so you can never be as good as a native 1080p image
Sure, but that doesn't mean the image has to be blurry. Something like linear interpolation naturally has that effect, but some approaches (like certain Lanczos implementations) maintain pretty sharp edges (though in those cases you often wind up with a bunch of ringing).
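For the curious, a tiny numpy sketch (illustrative only, not any shipping scaler) of why Lanczos behaves that way: its kernel has negative lobes, and those are what overshoot near hard edges.

```python
import numpy as np

def lanczos_kernel(x, a=3):
    """Lanczos-a reconstruction kernel: sinc(x) * sinc(x/a) for |x| < a, else 0."""
    x = np.asarray(x, dtype=float)
    return np.where(np.abs(x) < a, np.sinc(x) * np.sinc(x / a), 0.0)

xs = np.linspace(-3, 3, 13)        # sample the kernel at half-pixel steps
print(lanczos_kernel(xs).round(3))
# The negative lobe around |x| ~ 1.5 is the overshoot that shows up as
# ringing next to sharp edges.
```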

unless the 1080p image is very flat, or its Fourier transform somehow has nothing in its high-frequency domain.
Blurry scenes will upscale great!

:D
 
I have a 27'' computer monitor, and the difference between 1080p and 720p is massive; when I hook my PS3 up to my monitor, it just looks pixelated and stretched out compared to playing PC games at 1080p.

I can maybe understand that it's harder to tell on a lower-resolution TV or monitor, but I sometimes get a headache playing 720p games on my screen. The standard should always have been 1080p for next-gen, IMO.
 
Last I heard resolution didn't matter to you guys. Just pick a narrative and stick with it Polygon. This is why nobody takes you seriously.
 
When an article is approved by a company to be posted on its site as an official Polygon article, I don't see why we can't reach the conclusion that it's the company's stance on a particular topic. Unless the article being posted was specifically labeled as an opinion piece / blog post.

Gies could have just tweeted it if it was his own personal opinion and not representative of how the staff at Polygon feels.


Edit: If it was an editorial, that changes things.

No, this is not an editorial. But with respect to what's been said by other staffers in reviews and commentary, it's unfair to say this guy can't write about the topic because, blah blah blah, six months ago someone opined that "resolution doesn't matter". That's not how journalism works, since people want to keep throwing the word around.
 
Maybe Polygon employs multiple people. That would theoretically allow the quality of their output to range from "very good" to "literal poop from a butt."
Did they hire someone new who can write above the "literal poop from a butt" level? I'm still waiting for that. :P
 
Does motion resolution factor into this? Most LCDs without any motion compensation enabled have a motion resolution of around 300 lines. So assuming you aren't playing a game with a static screen, you're losing some of that 1080p resolution advantage as long as the image is being smeared by the TV. Doesn't that mitigate some of the loss of resolution if you're rendering at 720p upscaled?
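A rough back-of-the-envelope says it can. Here's a minimal sketch, under stated assumptions (a full-persistence 60 Hz panel, and a panning speed I've picked purely for illustration):

```python
# Sample-and-hold motion blur: while the eye tracks a moving object, each
# frame stays lit for the full frame time, smearing the object across
# roughly (speed in px/s) * (hold time in s) pixels on the retina.

frame_time = 1 / 60   # seconds per frame on a full-persistence 60 Hz LCD
speed = 1800          # assumed panning speed, pixels per second

blur_px = speed * frame_time
print(f"~{blur_px:.0f} px of smear")  # ~30 px: fine 1080p detail is lost in motion
```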
 
Daaamn

[gif]

Jesus, does anyone know if that dude's chin has landed yet?
 
I have a 27'' computer monitor and the difference between 1080p and 720p is massive, when I hook my ps3 up to my monitor it just looks pixelated and stretched out in comparison to playing PC games at a 1080p resolution.

I can maybe understand if you have a smaller resolution tv or monitor its harder to tell, but for me I get a headache sometimes playing 720p games on my screen. The standard should always been 1080p for next-gen IMO.
A 27" 1080p monitor is pretty big. If you're sitting just 1.5 to 2 feet away, its kinda the same thing as having an 80" 1080p TV from about 5-7 feet away or so.

So yes, resolution differences will be drastically noticeable.
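A quick sketch (assuming 16:9 panels at 1920x1080; the distances are the ones above) confirms the two setups land in the same ballpark of angular pixel density:

```python
import math

def pixels_per_degree(diagonal_in, distance_in, h_pixels=1920):
    """Horizontal pixels per degree of visual angle for a 16:9 panel."""
    width = diagonal_in * 16 / math.sqrt(16**2 + 9**2)   # panel width from diagonal
    fov = 2 * math.degrees(math.atan(width / (2 * distance_in)))
    return h_pixels / fov

print(pixels_per_degree(27, 21))  # 27" at ~1.75 ft -> ~33 px/deg
print(pixels_per_degree(80, 72))  # 80" at 6 ft     -> ~37 px/deg
```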

I totally buy that the average consumer can't tell the difference between 720p and 1080p on a standard-size (40-50") TV at normal viewing distance. I know damn well my mother and my sister can't. My mom is still perfectly fine watching SD channels even if she has the HD channel available!

People like to come on here and say, "Well I can tell the difference, therefore the article is bullshit" but forget that the sort of person who is on NeoGAF and is entering threads talking about resolution/framerate is likely NOT to be your average consumer.

And I've mentioned it many times before, but this is also something we train our eyes to notice over time. Different people will have a different sensitivity to it, so even with the same size display at the same distance, two people with equal eyesight may notice (or not notice) the difference in resolution to different degrees. And this sensitivity isn't static. I'm sure many gamers will become more attuned to 1080p gaming with the next-gen consoles, and the more attuned they become, the more noticeable the difference between it and 720p will be to them.
 
Understanding how resolution/frame rate affects performance is one of the reasons I am thankful to have been a PC gamer for so long.

Unlike on consoles, it's really easy to raise the resolution, lower it, and change graphics settings, all with the goal of finding the perfect balance. I sometimes feel sorry for game developers, because that stuff really should be left up to the user, but with consoles they have to be the ones making the call.

I think it really shouldn't be left to the consumer. Native resolution and 60fps are better than anything below that. Make your game with that in mind: no compromises, no problems. Ask Nintendo, or Rage, or Studio Liverpool (RIP) how to pull that off.
 
Who really cares what "generic consumers" are able to see? I don't shop for my console based on what I think the average consumer can detect visually.

To keep bringing up what resolution the "average consumer" can see just sounds like console war bullshit. I can tell the difference. Therefore I want the better picture quality.

Same for framerate. Thanks. The "average consumer" argument is a marketing one, and marketing is why we value graphical fidelity over playability, and why devs cave in all the time. It's hilarious that two of Sony's big titles for PS4 run at unlocked framerates. That is surely not the decision of a competent developer who knows his stuff.
 
Well he's talking about the average consumer. Many average consumers can't even tell the difference between SD and HD, so I wouldn't say he's wrong. But what he said means nothing since gamers aren't exactly average consumers.

My missus is the height of average. When I bought our first 42" 16:9 TV (we had a 68cm 4:3 TV), it took her 4 hours of watching before she noticed it was 'new', lol. But even she can tell the difference between 720p and 1080p on the one TV, not side by side.
 
I totally buy that the average consumer can't tell the difference between 720p and 1080p on a standard-size (40-50") TV at normal viewing distance. I know damn well my mother and my sister can't. My mom is still perfectly fine watching SD channels even if she has the HD channel available!
They don't care, aren't paying attention and are probably watching mostly material that has subpar image quality in the first place, so they don't actively notice the lack of quality. That is different from "can't see". If they have normal eyesight, they will see the difference if you put CGI material like a game on the display and flip it between 1080p and 720p.
 
They don't care, aren't paying attention and are probably watching mostly material that has subpar image quality in the first place, so they don't actively notice the lack of quality. That is different from "can't see". If they have normal eyesight, they will see the difference if you put CGI material like a game on the display and flip it between 1080p and 720p.
Already went over how this isn't necessarily the case.
 
Yeah, for years developers said that the PS3 and 360 versions would look/play the same, and sadly for PS3-only owners like me, most of the multiplats were superior on the 360. It got a bit better in the last few years, but for the most part, the 360 was/is the console to play multiplats on. And that was with two consoles more similar in power than the ones we have right now. The gap is bigger now; I don't get why they say these things.

They surely know better than us (they make the damn games), so it would seem they're just being disingenuous (and smart too; it's not advisable to piss off half of your market while trying to sell them an inferior version). Even though the AC series always ran worse on PS3, Ubi was constantly beating the parity drum. They had (and still have) a marketing deal with Sony, so that would explain it. I dunno if this is the same case.

The comment doesn't make any sense from any perspective. The PS4 has a substantial power gap over the Xbox One; that is never going to be eliminated, even if Microsoft prayed to all the gods on Earth. It's not one that's eliminated by "more experience" with the developer tools; it's not one that's even close to eliminated by unlocking a bit more GPU power; and it's not one that's even remotely eliminated by the span of time (since, by default, that would benefit all parties).

It's just imaginary bullshit meant to try to muddy the waters for people who don't know any better. And developers should fucking know better than to say straight bullshit like that. You ain't coddling anybody but the fanboys on the inferior platform. Telling them sweet lies doesn't change the truth, and doesn't make the situation better.
 
You disappoint me NeoGAF. Three pages?

This thread coulda been a contenda. This thread has it all. It has tech. It has resolutiongate. It has salt. It has Polygon. It's right before E3, and the forum is crawling with junior console warriors.

For NeoGAF, this thread should be like a porterhouse wrapped in bacon and sprinkled with crack cocaine. It should be 50 pages long with mass bannings and dozens of Bish gifs.

What gives?
 
You're trying to be edgy, but good upsampling does in fact do exactly that (try to maintain sharp detail while not introducing artifacts).


Of course, I very strongly disagree with the notion that better upscaling technology can render the difference between native and non-native resolution realtime rendering immaterial.

Whew, good thing I never said that.
 
Stratton actually thought that game devs would purposely hold back on using the console's tech just so future sequels would look better?! WAT?! I can't imagine anyone coming up with an asinine belief like that.
 
Bottom line is the most celebrated game in the industry right now is a 720p title, and it deserves it.

(Mario Kart 8)

When I play Titanfall the framerate really hurts the experience for me, but the resolution isn't something I could ever be thinking about in the heat of battle.

Console developers' priorities should be to make more fun games. And to make them run smoothly. Do that, and then do it at the highest resolution that is feasible.

And everyone will be happy, minus a few system wars fanboys and OCD folks.
 
I just want 1080p and a push for ultra clarity in the graphics. Mario Kart 8 couldn't break a PS4 but it looks prettier than anything on that system due to its ultra clean, high clarity look. I don't always need to see all of these cool looking effects if the artstyle and image clarity is ultra sharp. Crisp, clean textures and good draw distance with little to no blur goes quite a long way nowadays.
 
Bottom line is the most celebrated game in the industry right now is a 720p title, and it deserves it.

(Mario Kart 8)

When I play Titanfall the framerate really hurts the experience for me, but the resolution isn't something I could ever be thinking about in the heat of battle.

Console developers' priorities should be to make more fun games. And to make them run smoothly. Do that, and then do it at the highest resolution that is feasible.

And everyone will be happy, minus a few system wars fanboys and OCD folks.

It's also a 60FPS title, dude.

I just want 1080p and a push for ultra clarity in the graphics. Mario Kart 8 couldn't break a PS4 but it looks prettier than anything on that system due to its ultra clean, high clarity look. I don't always need to see all of these cool looking effects if the artstyle and image clarity is ultra sharp. Crisp, clean textures and good draw distance with little to no blur goes quite a long way nowadays.

Is that a joke post? 720p with no FSAA solution is now "ultra clean" and "ultra sharp". What in the world...
And it doesn't get the praise BECAUSE it's 720p. It would still be more enjoyable at 1080p and a STABLE 60FPS without stutter.

Also I'm not sure about the "most praised" stuff. It surely is a nice game, but, well, it's also Mario Kart...EIGHT. See that number? 8!
 
I just want 1080p and a push for ultra clarity in the graphics. Mario Kart 8 couldn't break a PS4 but it looks prettier than anything on that system due to its ultra clean, high clarity look. I don't always need to see all of these cool looking effects if the artstyle and image clarity is ultra sharp. Crisp, clean textures and good draw distance with little to no blur goes quite a long way nowadays.

[gif]


Are you serious or just breaking balls? I mean, MK has that Nintendo game characteristic (great game overall), but that's one bold statement, mate.
 
I just want 1080p and a push for ultra clarity in the graphics. Mario Kart 8 couldn't break a PS4 but it looks prettier than anything on that system due to its ultra clean, high clarity look. I don't always need to see all of these cool looking effects if the artstyle and image clarity is ultra sharp. Crisp, clean textures and good draw distance with little to no blur goes quite a long way nowadays.

I don't even know how to take this

MK8 looks lovely but it's far from perfect and doesn't come close to any top tier PS4 game. Like most Nintendo games the art style elevates it to heights the hardware can't reach.
 
I just want 1080p and a push for ultra clarity in the graphics. Mario Kart 8 couldn't break a PS4 but it looks prettier than anything on that system due to its ultra clean, high clarity look. I don't always need to see all of these cool looking effects if the artstyle and image clarity is ultra sharp. Crisp, clean textures and good draw distance with little to no blur goes quite a long way nowadays.

Mario Kart looks great, but there's no ultra clean clear image. It doesn't come close to what we've seen on PS4.

Daytona USA ran at 60 in arcades and a blistering 20 on Sega Saturn. It got some pretty rough reviews for that, and rightfully so.

Except the reviews weren't bagging on the frame rate. Frame rates weren't discussed, nor were they an issue. They were bagging on the horrendous pop-up that was jumping out at you. And nothing like the pop-up we might see these days; I'm talking entire sections of track and scenery just popping onto the screen. Back then nobody expected arcade perfection, but that was just a bad port. Later on, when Gran Chaser and Sega Rally appeared, the discussion was about how much less pop-up there was, not frame rates.
 
1080p/60fps is an honest benchmark, but too often people scoff at 30fps when the hardware is so limited. Developers should be more practical about balancing performance and visuals, but people should be more realistic about their $399 machine's 'budget' power. Consistently getting both requires a gaming PC brute-forcing everything.
 
Well, I'm sure there will be some kind of Sackboy/avatar kart racer that's way less fun but 1080p within a year or two. I'm sure you two will be very happy together.

You do realize the discussion is about resolution, not the quality of the game as a whole, right? You can put your stupid little pitchfork down; I've been singing the praises of Mario Kart like everyone else. But if someone's going to say the clarity and clearness is better than what's on PS4 (and probably X1), then they're wrong.
 
You do realize the discussion is about resolution, not the quality of the game as a whole, right? You can put your stupid little pitchfork down; I've been singing the praises of Mario Kart like everyone else. But if someone's going to say the clarity and clearness is better than what's on PS4 (and probably X1), then they're wrong.

No, it's the whole package. The game is indeed a visual delight but that's due to design and animation, not resolution. The latter is being overemphasized these days.
 