You're trying to be edgy, but good upsampling does in fact do exactly that (try to maintain sharp detail while not introducing artifacts).
Of course, I very strongly disagree with the notion that better upscaling technology can render the difference between native and non-native resolution realtime rendering immaterial.
I mean, real life isn't CSI: you can't create information that isn't there, but you can do a lot better than nearest neighbor. But still.
That's quite a change in tone from their PS4/Xbone launch streams.
Hasn't Polygon spent the last year telling us that Frame Rate and Resolution DON'T matter? Wonder what made them change their tune. Did Gies get fired?
This is rich coming from Polygon.
I see, so now it matters.
Basic upscaling techniques are pretty much all a compromise of blurriness, ringing, and funky patterned artifacts (typically blocking, i.e. nearest-neighbor). Eliminating blurriness in upscaling isn't even that difficult, but it doesn't mean you have a great image.
Scenes (in both the real world and video games) consist heavily of hard edges on opaque objects. By making the assumption that something that looks like a hard edge is a hard edge, it's entirely possible to create algorithms that actually preserve clean edges as clean edges at a higher resolution.
(Much in the same way that SMAA can replace stairsteps with correct-coverage gradients, by assuming that all "stairstep" patterns are evil jaggies.)
Not that even such sophisticated approaches make upscaling perfect. If you want a sharp image, fine, but you're still going to deal with aliasing on small details. More sampling, at higher resolution, can always help there. If you want to reproduce a scene into a high-resolution buffer, it turns out it's handy to have more information about the scene.
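The blur-vs-blocking tradeoff from basic upscalers is easy to see on a toy example. A minimal sketch (assuming numpy; the 1-D "edge" signal is just illustrative): nearest-neighbor keeps the step perfectly sharp but produces the blocky structure, while linear interpolation smooths it into a ramp.

```python
import numpy as np

# A 1-D "hard edge": black then white, at low resolution.
low = np.array([0.0, 0.0, 0.0, 1.0, 1.0, 1.0])

# Nearest-neighbor 2x upscale: duplicates samples, keeping the edge
# perfectly sharp but producing blocky "stairstep" structure in 2-D.
nearest = np.repeat(low, 2)

# Linear interpolation 2x upscale: smooths across the edge,
# trading the blocking for blur (intermediate gray values appear).
x_low = np.arange(len(low))
x_high = np.linspace(0, len(low) - 1, 2 * len(low))
linear = np.interp(x_high, x_low, low)

print(nearest)  # edge stays a clean 0 -> 1 step
print(linear)   # edge becomes a 0 -> ... -> 1 ramp
```

The edge-directed approaches described above are essentially ways of getting the sharpness of the first result without the blocking, by detecting where the hard edges are.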
Jesus. You people are insufferable. More than one person works at Polygon. Give up already.
I know about more interesting upscaling techniques; I was just pointing out that you still have no more information than what you started with, so you can never be as good as a native 1080p image, unless the 1080p image is very flat, or its Fourier transform somehow has no high-frequency content.
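The no-new-information point is easy to demonstrate: once high-frequency detail is averaged away by downsampling, no upscaler can recover it, while content with no high-frequency detail survives the round trip untouched. A minimal sketch, assuming numpy and a simple 2x box downsample with nearest-neighbor upsample:

```python
import numpy as np

def down_up(img):
    """2x box downsample, then nearest-neighbor upsample back."""
    small = img.reshape(img.shape[0] // 2, 2, img.shape[1] // 2, 2).mean(axis=(1, 3))
    return np.repeat(np.repeat(small, 2, axis=0), 2, axis=1)

# A 1-pixel checkerboard: all its energy is at the highest spatial frequency.
checker = (np.indices((8, 8)).sum(axis=0) % 2) * 1.0
# A flat image: no high-frequency content at all.
flat = np.full((8, 8), 0.5)

# The checkerboard averages to uniform gray: the detail is unrecoverable.
print(np.allclose(down_up(checker), 0.5))  # True
# The flat image round-trips exactly: nothing was lost.
print(np.allclose(down_up(flat), flat))    # True
```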
As someone explained earlier, there's a position called "editor" whose job is to ensure consistency of quality and message across a site.

Of course, Polygon's opinion piece editor is Ben Kuchera, and the less said about that the better.
That's right. You should just accept shitty 'journalism' guys. Come on now.
Sure, but that doesn't mean the image is blurry. Something like linear interpolation naturally has that effect, but some approaches (like certain Lanczos implementations) maintain pretty sharp edges, though in those cases you often wind up with a bunch of ringing.
Blurry scenes will upscale great!
How does doubling the resolution quadruple the pixels? Quadrupling the resolution would quadruple the pixels.
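The disagreement here hinges on what "resolution" means: doubling the *linear* resolution (both width and height, e.g. 720p to 1440p) does quadruple the total pixel count. A quick check (720p dimensions used as the example):

```python
# "Doubling the resolution" usually means doubling each linear dimension
# (e.g. 1280x720 -> 2560x1440), which quadruples the total pixel count.
w, h = 1280, 720
assert (2 * w) * (2 * h) == 4 * (w * h)  # 2x linear = 4x pixels
print(w * h, (2 * w) * (2 * h))          # 921600 3686400
```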
When a company approves an article to post on its site as an official Polygon article, I don't see why we can't conclude that it represents the company's stance on the topic, unless the article was specifically labeled as an opinion piece or blog post.
Gies could have just tweeted it if it was his own personal opinion and not representative of how the staff at Polygon feels.
Edit: If it was an editorial, that changes things.
Maybe Polygon employs multiple people. That would theoretically allow the quality of their output to range from "very good" to "literal poop from a butt."
Did they hire someone new who can write above the "literal poop from a butt" level? I'm still waiting for that.
Daaamn
A 27" 1080p monitor is pretty big. If you're sitting just 1.5 to 2 feet away, it's kinda the same thing as having an 80" 1080p TV from about 5-7 feet away or so.

I have a 27" computer monitor, and the difference between 1080p and 720p is massive; when I hook my PS3 up to my monitor, it just looks pixelated and stretched out in comparison to playing PC games at 1080p.
I can maybe understand that if you have a lower-resolution TV or monitor it's harder to tell, but for me, I sometimes get a headache playing 720p games on my screen. The standard should always have been 1080p for next-gen, IMO.
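The monitor-vs-TV comparison above comes down to angular resolution: pixels per degree of visual angle, which depends on pixel density and viewing distance. A rough sketch (assuming 16:9 panels viewed head-on; the specific distances are illustrative, picked from the numbers in the post):

```python
import math

def pixels_per_degree(diagonal_in, distance_in, horiz_px=1920, aspect=(16, 9)):
    """Approximate horizontal pixels per degree of visual angle
    for a flat 16:9 panel viewed head-on."""
    aw, ah = aspect
    width_in = diagonal_in * aw / math.hypot(aw, ah)
    px_per_inch = horiz_px / width_in
    # One degree of visual angle spans about 2*d*tan(0.5 deg) at distance d.
    inches_per_degree = 2 * distance_in * math.tan(math.radians(0.5))
    return px_per_inch * inches_per_degree

# 27" 1080p monitor at ~21" (about 1.75 ft) vs. 80" 1080p TV at 6 ft:
monitor = pixels_per_degree(27, 21)
tv = pixels_per_degree(80, 72)
print(round(monitor, 1), round(tv, 1))  # roughly comparable values
```

Both setups land in the same ballpark of pixels per degree, which is why the "big monitor up close equals huge TV across the room" comparison roughly holds.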
Understanding how resolution/frame rate affects performance is one of the reasons I am thankful to have been a PC gamer for so long.
Unlike on consoles, it's really easy to raise the resolution, lower it, or change graphics settings, all with the goal of finding the perfect medium. I sometimes feel sorry for game developers, because that stuff really should be left up to the user, but with consoles they have to be the ones making the call.
Who really cares what "generic consumers" are able to see? I don't shop for my console based on what I think the average consumer can detect visually.
To keep bringing up what resolution the "average consumer" can see just sounds like console war bullshit. I can tell the difference. Therefore I want the better picture quality.
Well he's talking about the average consumer. Many average consumers can't even tell the difference between SD and HD, so I wouldn't say he's wrong. But what he said means nothing since gamers aren't exactly average consumers.
They don't care, aren't paying attention, and are probably watching mostly material that has subpar image quality in the first place, so they don't actively notice the lack of quality. That is different from "can't see". If they have normal eyesight, they will see the difference if you put CGI material like a game on the display and flip it between 1080p and 720p.

I totally buy that the average consumer can't tell the difference between 720p and 1080p on a standard-size (40-50") TV at normal viewing distance. I know damn well my mother and my sister can't. My mom is still perfectly fine watching SD channels even when she has the HD channel available!
Already went over how this isn't necessarily the case.
Yeah, for years developers said that the PS3 and 360 versions would look/play the same, and sadly for PS3-only owners like me, most of the multiplats were superior on the 360. It got a bit better in the last few years, but for the most part the 360 was/is the console to play multiplats on. And that was with two consoles more similar in power than the ones we have right now. The gap is bigger now; I don't get why they say these things.
They surely know better than us; they make the damn games. So it would seem they're just being disingenuous (and smart, too: it's not advisable to piss off half of your market while trying to sell them an inferior version). Even though the AC series always ran worse on PS3, Ubi was constantly beating the parity drum. They had (and still have) a marketing deal with Sony, so that would explain it. I dunno if this is the same case.
Bottom line is the most celebrated game in the industry right now is a 720p title, and it deserves it.
(Mario Kart 8)
When I play Titanfall the framerate really hurts the experience for me, but the resolution isn't something I could ever be thinking about in the heat of battle.
Console developers' priorities should be to make more fun games. And to make them run smoothly. Do that, and then do it at the highest resolution that is feasible.
And everyone will be happy, minus a few system wars fanboys and OCD folks.
I just want 1080p and a push for ultra clarity in the graphics. Mario Kart 8 couldn't break a PS4 but it looks prettier than anything on that system due to its ultra clean, high clarity look. I don't always need to see all of these cool looking effects if the artstyle and image clarity is ultra sharp. Crisp, clean textures and good draw distance with little to no blur goes quite a long way nowadays.
Daytona USA ran at 60 in arcades and a blistering 20 on Sega Saturn. It got some pretty rough reviews for that, and rightfully so.
Mario Kart looks great, but there's no ultra clean clear image. It doesn't come close to what we've seen on PS4.
Well I'm sure there will be some kind of sackboy/avatar cart racer that's way less fun but 1080p within a year or two. I'm sure you two will be very happy together.
I'm always surprised when publishers claim that resolution differences don't matter. If that's the case, then why are most of them sending out screenshots rendered at 8K?
You do realize the discussion is about resolution, not the quality of the game as a whole, right? You can put your stupid little pitchfork down; I've been singing the praises of Mario Kart like everyone else. But if someone's going to say the clarity is better than what's on PS4 (and probably X1), then they're wrong.