I just don't get it. The next consoles should be powerful enough that 1080p should not be an issue, even with tons of effects being applied to all those pixels.
With nearly everyone owning a 1080p TV at this point, I just don't get why you wouldn't want to meet that bar every time. I really hope the console makers demand that baseline resolution. It sounds like it's our only hope. :/
I think it's very smart, and I like that people are thinking more about movie-quality picture etc. and not about hitting some crazy resolutions like in PC games...
I'm kind of confused here, because I thought films were rendered out at 2xxxXwhatever resolution to start with. Like, that Avatar CG...it wasn't rendered out at 720p.
Why/how would games be 'oversampling' vs film in that case?
distastee said...
Brian sneaked in the point I wanted to make: The better comparison is between Games and Animated films - since we are also required to fully render our frames.
However! That Wall-E "production shot" is rendered at 4961x2070 - which means that frame is from Marketing and not from the movie. Almost all of our films are at 1920x____ (the few exceptions are lower res, not higher). The Blu-ray is an accurate representation of the softness in your average film frame.
We do what is essentially MSAA. Then we do a lens distortion that makes the image incredibly soft (amongst other blooms/blurs/etc.) Softness/noise/grain is part of film and something we often embrace. Jaggies we avoid like the plague and thus we Anti-Alias the crap out of our images.
In the end it's still the same conclusion: games oversample vs film. I've always thought that film res was more than enough res. I don't know how you will get gamers to embrace a film aesthetic, but it shouldn't be impossible.
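As a rough sketch of what that pipeline amounts to - render with more samples than the output needs, resolve down, then deliberately soften - here is a toy version in Python. Everything in it (the fake "renderer", the 2x2 factor, the blur and grain amounts) is an illustrative assumption, not Pixar's actual process:

```python
# Toy illustration of: oversample -> resolve down (anti-alias) -> soften ("film look").
import numpy as np

def render_edge(w, h):
    """Toy 'renderer': a hard diagonal edge, the worst case for jaggies."""
    ys, xs = np.mgrid[0:h, 0:w]
    return (xs * 0.35 + ys * 0.65 < 0.5 * (w + h) * 0.45).astype(np.float32)

def box_downsample(img, factor):
    """Average factor x factor blocks -- an ordered-grid supersampling resolve."""
    h, w = img.shape
    return img.reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

def soften(img, grain=0.02, seed=0):
    """Crude 'filmic' pass: a tiny blur plus noise, standing in for lens blur/grain."""
    kernel = np.array([0.25, 0.5, 0.25], dtype=np.float32)
    blurred = np.apply_along_axis(lambda r: np.convolve(r, kernel, mode="same"), 1, img)
    blurred = np.apply_along_axis(lambda c: np.convolve(c, kernel, mode="same"), 0, blurred)
    rng = np.random.default_rng(seed)
    return np.clip(blurred + rng.normal(0.0, grain, img.shape), 0.0, 1.0)

out_w, out_h, ss = 320, 180, 2             # 2x2 oversampling of a 320x180 target
hi = render_edge(out_w * ss, out_h * ss)   # frame rendered with extra samples
aa = box_downsample(hi, ss)                # anti-aliased result at target resolution
final = soften(aa)                         # soft, slightly grainy result
print(hi.shape, aa.shape, final.shape)
```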
If they were going to simply use the current methods of FXAA or MSAA rather than truly try something new, I'd agree, but the whole idea of implementing "filmic" methods into visuals would be to produce something that looks unlike games we have today, using rendering techniques that touch everything from image quality to lighting to the way post-processing works.

Again in response to dark10x: it's unrealistic to ever expect that kind of image quality from a game unless supersampling is involved. What developers are promising is AA + FXAA at best, and I can assure you that while it looks good in screenshots, it doesn't look so good in motion, especially for fine details. Heck, just look at the image problems it can cause in BF3 at 1080p:
(notice the really bad aliasing on the railing)
The problems would be far worse at 720p.
Geez, they just don't get it. 60FPS changes everything.
If they were going to simply use the current methods of FXAA or MSAA rather than truly try something new, I'd agree, but the whole idea of implementing "filmic" methods into visuals would be to produce something that looks unlike games we have today, using rendering techniques that touch everything from image quality to lighting to the way post-processing works, etc.
I have no confidence that this is what we'd see next generation, but I like the idea. My points here are all conceptual; there's no point in getting so defensive.
With current techniques, it's going to look flawed, and I'm not saying there is a magical bullet out there that could solve this without supersampling either, but I don't think we should draw hard conclusions based on what we have today.

The idea is nice and all, but the fact remains: you cannot achieve this without supersampling. Why is the image quality in Avatar so good? It's not all the post-process filters. It's the fact that the source is 4K resolution.
If they're trying to achieve this through post-processing with current rendering techniques, it's going to have flaws.
Any framerate that isn't an integer multiple of 30 will look wonky because NTSC and PAL60 displays have 60 vertical interrupts per second. As is the case with 3:2 pulldown on movies, displaying some frames for longer than others makes the game look less smooth.

BS. If I hook my PC up to my HDTV (32 inch, so it isn't exactly huge) after having played some console games, the difference between the two in IQ is night and day, even if I use little or no AA with the PC.
The FPS is a different matter, and most high-end games on the PC can only reach 60FPS on very powerful machines, so I'd be alright with them aiming lower. Out of curiosity, why do we never see console games aiming for 40 or 50, only 30 (or sub-30, lol) and 60?
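A toy model of that vsync point (my own sketch, not from the thread): a frame can only be swapped at a 60 Hz refresh, so rates that don't divide 60 evenly end up holding some frames longer than others, which is the same judder as 3:2 pulldown.

```python
REFRESH_HZ = 60

def hold_pattern(fps, frames=12):
    """How many 60 Hz refreshes pass before each successive frame gets shown."""
    # Work in integer "ticks" of 1/(fps * REFRESH_HZ) s to avoid float drift:
    # one frame takes REFRESH_HZ ticks, one refresh takes fps ticks.
    holds, next_vsync = [], 0
    for i in range(frames):
        ready = (i + 1) * REFRESH_HZ      # tick at which this frame is finished
        refreshes = 0
        while next_vsync < ready:         # previous frame stays up until then
            next_vsync += fps
            refreshes += 1
        holds.append(refreshes)
    return holds

for fps in (60, 30, 40, 50):
    holds = hold_pattern(fps)
    label = "uniform" if len(set(holds)) == 1 else "uneven -> judder"
    print(f"{fps} fps on a {REFRESH_HZ} Hz display: {holds} ({label})")
```

30 and 60 fps come out as a steady 2 or 1 refreshes per frame; 40 and 50 fps alternate hold times, which is why consoles don't target them.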
But isn't the native res of Avatar something massive like four thousand by something? That would go a long way towards ridding the image of any undesirables when scaled down to 720p.
The resolution Avatar was filmed at is only 2K, a little above 1080p.
Now tell us what you downsampled from
This suggests that you don't know anything about framerates and displays.
A little above 1080p? 4x the pixels is a little?
He just said so. 4x supersampled (2x2, I assume) would mean 2560 x 1440p (aka the native res of his monitor).
Yes, but those games make up a very small percentage of the overall library. You can't simply make that claim and then list off six examples in a library of thousands to prove a point.

D10x -- I should probably check DF, but I thought BF3, Batman AC, AC R, Uncharted 3, Killzone 3, and Infamous 2 all had annoying drops on PS3. Really, most games I've played on PS3 do, especially anything open world. As someone new to PC gaming from PS3, my opinion is that a variable 50-60 is way better than whatever PS3 games are running at. I can't say for 360, as I sold mine a few years ago.
Nirolak said: Here's what the guy from Pixar said: ...which basically means a less-wasteful form of supersampling.

I think it's ending the bold too soon to lose "We do what is essentially MSAA."
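Roughly why MSAA gets described that way, with arbitrary numbers of my own: both approaches store several samples per pixel, but MSAA typically runs the pixel shader once per covered pixel rather than once per sample.

```python
# Illustrative only: stored samples vs. (approximate) shader invocations for
# 4x SSAA and 4x MSAA at 720p. Real GPUs complicate this; the ratio is the point.
def aa_cost(width, height, samples, shades_per_pixel):
    pixels = width * height
    return pixels * samples, pixels * shades_per_pixel  # (stored samples, shader runs)

for name, shades in (("4x SSAA", 4), ("4x MSAA", 1)):
    stored, shaded = aa_cost(1280, 720, 4, shades)
    print(f"{name}: {stored:,} stored samples, ~{shaded:,} shader runs")
```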
How it runs and plays.

Planning to not hit 60? That's fine, developers. I'll just play Nintendo games because they actually give a shit about how a game runs.
2K is 2048 pixels of horizontal resolution, meaning the resolution Avatar was filmed and rendered at (taking the aspect ratio into account) is 2048x1152, or thereabouts. So yes, a little over 1080p. You are thinking of 4K.
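The arithmetic behind the 2K-vs-4K point, spelled out (the heights are my approximations assuming roughly 16:9 frames; exact values vary by format):

```python
resolutions = {
    "1080p": (1920, 1080),
    "2K":    (2048, 1152),
    "4K":    (4096, 2304),
}
base = 1920 * 1080
for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name}: {w}x{h} = {px / 1e6:.2f} MP ({px / base:.2f}x 1080p)")
```

2K works out to roughly 1.14x the pixels of 1080p; "4x the pixels" is 4K territory.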
which basically means a less-wasteful form of supersampling.
FUCK ALL THAT NOISE
1080p 60 FPS 4x AA 16x AF or motherfucking BUST
I'd rather look at the 1680x1050 Oblivion I've been playing recently.

Maybe they mean something like this:

Witcher 2, ubersampling + their post-AA, 720p native, Lanczos-scaled to 1080p.
...
What's the verdict?
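If it helps, the Lanczos step in that recipe is easy to play with offline. A minimal sketch using Pillow, where "witcher2_720p.png" is just a placeholder filename (the ubersampling and post-AA would have happened in-engine before this point, so this only shows the scaler, not the whole pipeline):

```python
from PIL import Image

frame_720p = Image.open("witcher2_720p.png")                  # 1280x720 source frame
frame_1080p = frame_720p.resize((1920, 1080), Image.LANCZOS)  # Lanczos upscale
frame_1080p.save("witcher2_1080p_lanczos.png")
```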
What is wrong with wanting to achieve superior visuals? That's precisely what Sega was doing in the 90s, after all. Filmic need not imply "realism".

What is wrong with games looking like games?
The way console gaming seems to be going, it might as well be true. PC gaming, save us!

I see.
How about 720p, 30fps, FXAA, and 8x AF?
Well, okay, they *might* give you 16x AF.
Wait, did they mention supersampling in the article? I see a lot of people mentioning it, but I don't think that's what they're referring to. Supersampling at 720p would be nice, but I have to imagine that would be a lot more taxing than actually targeting 1080p.
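A back-of-envelope version of that "more taxing" point (shaded pixels only; real cost also depends on bandwidth, post passes, and so on):

```python
native_720p   = 1280 * 720              # 921,600 px
ssaa_2x2_720p = (1280 * 2) * (720 * 2)  # 3,686,400 px shaded, resolved to 720p
native_1080p  = 1920 * 1080             # 2,073,600 px
print(ssaa_2x2_720p / native_1080p)     # ~1.78 -> 2x2 SSAA at 720p shades ~78% more
```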
I can't believe how quickly people on here make useless posts without even thinking about the topic at hand.
"YOUR DOING 30 FRAMESPS AGAIN!?!?! I"M STICKING WITH NINTENDO CUZ THEY KNOW BEST SO THERE!!!1111"
That adds nothing to the discussion and bears no relevance. It's an ignorant statement that basically ignores the entire point of the article.
edit: ha ha, just two posts below this one someone else actually went there. smh
What is wrong with wanting to achieve superior visuals? That's precisely what Sega was doing in the 90s, after all. Filmic need not imply "realism".