
Developers Discuss Benefits Of Targeting 720p/30fps Next Gen, Using Film Aesthetics

Raide

Member
720p@60fps should be the minimum target. I am sure some talented devs will get that to 1080p+ and 60fps.
 

mrklaw

MrArseFace
I'd be up for this.

Of course ideally you'd apply the same logic to 1080p, but perhaps that's unachievable just at the moment.

I don't fully understand the argument that at 1080p you don't see any full-pixel details. Even 1080p is downsampled from film, but you can't call that a lack of detail and claim you'd get the same results from 720p. It's more detail; it just ends up as sub-pixel information on a 1080p display.
 

TheExodu5

Banned
Again in response to dark10x: it's unrealistic to ever expect that kind of image quality from a game unless supersampling is involved. What developers are promising is AA + FXAA at best, and I can assure you that while it looks good in screenshots, it doesn't look so good in motion, especially for fine details. Heck, just look at the image problems it can cause with BF3 at 1080p:

[Image: park.png]


(notice the really bad aliasing on the railing)

The problems would be far worse at 720p.
 

Nakimushi

Banned
I think it's very smart. I like that people are thinking more about movie-quality picture and less about hitting crazy resolutions like in PC games.
 

Nirolak

Mrgrgr
I just don't get it. The next consoles should be powerful enough that 1080p should not be an issue, even with tons of effects being applied to all those pixels.

With nearly everyone owning a 1080p TV at this point, I just don't get why you wouldn't want to meet that bar every time. I really hope the console makers demand that baseline resolution. It sounds like it's our only hope. :/

Reading through this, they seem to endorse using filters and soft image aesthetics to make the difference between different resolutions less noticeable.

That's what they mean by the lack of pixel detail with films.
 

danmaku

Member
As much as I hate the idea that games should look like movies, that's clearly what most people want. Enjoy your 30 fps future!
 

Sethos

Banned
I can tell you right now, I refuse to buy ANY games that run 30FPS next generation - If the entire console generation is based on 30FPS I won't take part. They really need to bump up that shit.
 

Nirolak

Mrgrgr
I'm kind of confused here, because I thought films were rendered out at 2xxxXwhatever resolution to start with. Like, that Avatar CG...it wasn't rendered out at 720p.

Why/how would games be 'oversampling' vs film in that case?

Here's what the guy from Pixar said:

distastee said...

Brian sneaked in the point I wanted to make: The better comparison is between Games and Animated films - since we are also required to fully render our frames.

However! That Wall-E "production shot" is rendered at 4961x2070 - which means that frame is from Marketing and not from the movie. Almost all of our films are at 1920x____ (the few exceptions are lower res, not higher) The Blu-Ray is an accurate representation of the softness in your average film frame.

We do what is essentially MSAA. Then we do a lens distortion that makes the image incredibly soft (amongst other blooms/blurs/etc.) Softness/noise/grain is part of film and something we often embrace. Jaggies we avoid like the plague and thus we Anti-Alias the crap out of our images.


In the end it's still the same conclusion: games oversample vs film. I've always thought that film res was more than enough res. I don't know how you will get gamers to embrace a film aesthetic, but it shouldn't be impossible.
 

dark10x

Digital Foundry pixel pusher
Again in response to dark10x: it's unrealistic to ever expect that kind of image quality from a game unless supersampling is involved. What developers are promising is AA + FXAA at best, and I can assure you that while it looks good in screenshots, it doesn't look so good in motion, especially for fine details. Heck, just look at the image problems it can cause with BF3 at 1080p:

(notice the really bad aliasing on the railing)

The problems would be far worse at 720p.
If they were going to simply use the current methods of FXAA or MSAA rather than truly try something new, I'd agree, but the whole idea of implementing "filmic" methods into visuals would be to produce something that looks unlike games we have today using rendering techniques that touch everything from image quality, to lighting, to the way post processing works, etc.

I have no confidence that this is what we'd see next generation, but I like the idea. My points are all conceptual here; there's no point in getting so defensive.
 

Stallion Free

Cock Encumbered
Look, I loved the way that Resistance 3 used color-grading in a very filmic way, but these devs are fucking lame if they think it's fine to roll with 720p/30 fps as a target again. Shit looks dreadful upscaled, and 30 fps really messes with the gameplay.
 

TheExodu5

Banned
If they were going to simply use the current methods of FXAA or MSAA rather than truly try something new, I'd agree, but the whole idea of implementing "filmic" methods into visuals would be to produce something that looks unlike games we have today using rendering techniques that touch everything from image quality, to lighting, to the way post processing works, etc.

I have no confidence that this is what we'd see next generation, but I like the idea. My points are all conceptual here; there's no point in getting so defensive.

The idea is nice and all, but the fact remains: you cannot achieve this without supersampling. Why is the image quality in Avatar so good? It's not all the post-process filters. It's the fact that the source is 4K resolution.

If they're trying to achieve this through post-processing alongside current rendering techniques (i.e. not ray tracing), it's going to have flaws.
 

Dennis

Banned
Here are two screenshots from Deus Ex: HR that illustrate that 1280x720 resolution is good enough to supply a clean, high-IQ image

[Deus Ex: HR screenshots]

with 4x SSAA
 

dark10x

Digital Foundry pixel pusher
The idea is nice and all, but the fact remains: you cannot achieve this without supersampling. Why is the image quality in Avatar so good? It's not all the post-process filters. It's the fact that the source is 4K resolution.

If they're trying to achieve this through post-processing with current rendering techniques, it's going to have flaws.
With current techniques it's going to look flawed, yes, and I'm not saying there's a magic bullet out there that could solve this without supersampling either, but I don't think we should draw hard conclusions based on what we have today.
 

TheExodu5

Banned
Here is a screenshot from Deus Ex: HR that illustrates that 1280x720 resolution is good enough to supply a clean, high-IQ image

[Deus Ex: HR screenshot]

with 4x SSAA

Damn that is a nice shot. :eek:

May I make a request? Can you take some 720p supersampled shots and scale them to 1080p? Can you take the same shots in native 1080p with 4x AA? That would be a nice way to examine the pros and cons of this idea.
 

Orayn

Member
BS. If I hook my PC up to my HDTV (32 inch, so it isn't exactly huge) after having played some console games, the difference between the two in IQ is night and day, even if I use little or no AA with the PC.

The FPS is a different matter, and most high-end games on the PC can only reach 60FPS on very powerful machines, so I'd be alright with them aiming lower. Out of curiosity, why do we never see console games aiming for 40 or 50, only 30 (or sub-30, lol) and 60?
Any framerate that doesn't divide evenly into 60 will look wonky, because NTSC and PAL60 displays have 60 vertical interrupts per second. As with 3:2 pulldown on movies, displaying some frames for longer than others makes the game look less smooth.
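For what it's worth, that constraint is easy to demonstrate with a small sketch (hypothetical code, names my own): on a 60 Hz display each frame must be held for a whole number of refresh intervals, so only rates that divide 60 evenly pace uniformly.

```python
# On a 60 Hz display each rendered frame is shown for a whole number of
# 1/60 s refresh intervals. Rates that divide 60 evenly repeat every frame
# the same number of times; others must alternate, which reads as judder.
def refresh_pattern(fps, hz=60, frames=10):
    """How many refreshes each of `frames` consecutive frames is shown for."""
    pattern = []
    shown = 0
    for i in range(1, frames + 1):
        target = round(i * hz / fps)  # refreshes that should have elapsed
        pattern.append(target - shown)
        shown = target
    return pattern

print(refresh_pattern(30))  # [2, 2, 2, 2, 2, 2, 2, 2, 2, 2] -> even pacing
print(refresh_pattern(60))  # [1, 1, 1, 1, 1, 1, 1, 1, 1, 1] -> even pacing
print(refresh_pattern(40))  # mix of 1s and 2s -> judder
print(refresh_pattern(50))  # mostly 1s with an occasional 2 -> judder
```

The same logic explains why 50 fps output judders on a 60 Hz panel even though it would be perfectly smooth on an old 50 Hz display.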
 

jett

D-Member
I honestly don't quite get what Nvidia is trying to say. The only way you'd manage to get film/CG-like IQ would be to render at a much higher resolution than your intended framebuffer, downsample, and then filter the image. Filters alone are really not going to cut it.

But isn't the native res of Avatar something massive like four thousand by something? That would go a long way towards ridding the image of any undesirables when scaled down to 720p.

The resolution Avatar was filmed at is only 2K, a little above 1080p.
 

Fredrik

Member
I'd rather take jaggy 720p/60fps locked on every single game.
I can't believe that a 25-year-old NES still beats the crap out of current consoles (and possibly next gen) when it comes to framerates.
 

Jtrizzy

Member
D10x--I should probably check DF, but I thought BF3, Batman AC, AC R, Uncharted 3, Killzone 3, Infamous 2 all had annoying drops on PS3. Really most games I've played on PS3, especially anything open world. As someone new to pc gaming from PS3, my opinion is that variable 50-60 is way better than whatever PS3 games are running at. I can't say for 360, as I sold mine a few years ago.
 

TheExodu5

Banned
The resolution Avatar was filmed at is only 2K, a little above 1080p.

A little above 1080p? 4x the pixels is a little?

edit: I'm an idiot. 2K refers to horizontal pixels (2048 x 1556). Still, let's not pretend Avatar uses conventional triangle-based rendering as games do.

Now tell us what you downsampled from :p

He just said so. 4x supersampled (2x2, I assume) would mean 2560 x 1440p (aka the native res of his monitor).
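For concreteness, here is a quick back-of-the-envelope check of the pixel counts being argued over (using the 2048 x 1556 figure quoted above):

```python
# Back-of-the-envelope check of the pixel counts discussed in the thread.
cine_2k = 2048 * 1556        # the 2K frame size quoted for Avatar
full_hd = 1920 * 1080        # 1080p
hd_720 = 1280 * 720          # 720p
supersampled = 2560 * 1440   # 2x2 ordered-grid supersample of 720p

print(round(cine_2k / full_hd, 2))  # 1.54 -> ~1.5x the pixels of 1080p, not 4x
print(supersampled / hd_720)        # 4.0  -> exactly 4x the pixels of 720p
```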
 

NBtoaster

Member
Maybe they mean something like this:

Witcher 2, ubersampling + their post-AA, 720p native, Lanczos-scaled to 1080p.

[Image: wmplayer2012-01-1204-ssjgg.png]


What's the verdict? Acceptable on 1080p screens? It doesn't actually look terrible when playing at 720p (scaled to 900p by the GPU). Though I'm surprised by how badly it still runs.
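For reference, the Lanczos scaling mentioned here is just a windowed-sinc filter. A minimal pure-Python sketch of a 1D Lanczos resample (illustrative only; real scalers work in 2D and are far faster):

```python
import math

def lanczos_kernel(x, a=3):
    """Lanczos-3 kernel: sinc(x) * sinc(x/a) for |x| < a, else 0."""
    if x == 0:
        return 1.0
    if abs(x) >= a:
        return 0.0
    px = math.pi * x
    return a * math.sin(px) * math.sin(px / a) / (px * px)

def resample_row(row, new_len, a=3):
    """Resample one row of samples to new_len using Lanczos weights."""
    scale = len(row) / new_len
    out = []
    for j in range(new_len):
        center = (j + 0.5) * scale - 0.5       # source position of output j
        first = math.floor(center) - a + 1     # leftmost of the 2a taps
        acc = wsum = 0.0
        for i in range(first, first + 2 * a):
            w = lanczos_kernel(center - i, a)
            acc += w * row[min(max(i, 0), len(row) - 1)]  # clamp at edges
            wsum += w
        out.append(acc / wsum)                 # normalize the weights
    return out

# Scale a 720-sample row up to 1080 samples: the same 2:3 ratio per axis
# as a 1280x720 -> 1920x1080 upscale.
row = [0.0] * 360 + [1.0] * 360                # a hard edge
print(len(resample_row(row, 1080)))            # 1080
```

The negative lobes of the kernel are what keep edges from going mushy the way bilinear upscaling does, which is why it's a popular choice for exactly this 720p-to-1080p case.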
 

SmokyDave

Member
If they 'target' 720p/30, is it fair to assume we're looking at the same res and framerate as this gen? I'm fairly sure they were 'targeting' 720p/30 as a minimum this gen. Didn't happen, though.

It really is going to be a mild bump next-gen, isn't it.
 
720p I can deal with, but we've got to shoot for 60 fps. There's a reason Jackson and Cameron are starting to push for filming at higher frame rates.

And if we take Pixar at their word, no, you don't have to supersample to get 'movie quality' aesthetics. They render at 1920 x ____ with MSAA and post-processing.

I don't remember anyone bitching about the IQ of Wall-E.

720p... well, that's going to look great on my native 720p HMZs.
 

mrklaw

MrArseFace
A little above 1080p? 4x the pixels is a little?



He just said so. 4x supersampled (2x2, I assume) would mean 2560 x 1440p (aka the native res of his monitor).

2x the pixels

And it's not just downsampling. Avatar and movie CG do a lot more per pixel to achieve the quality they get. Plus probably a lot less texturing and more polygons, so more things are modelled.

It'd be an interesting experiment to take something like Deus Ex or Crysis 2's world models, run them through a Pixar RenderMan pipeline, apply proper surfaces to them, and see how they come out.
 

dark10x

Digital Foundry pixel pusher
D10x--I should probably check DF, but I thought BF3, Batman AC, AC R, Uncharted 3, Killzone 3, Infamous 2 all had annoying drops on PS3. Really most games I've played on PS3, especially anything open world. As someone new to pc gaming from PS3, my opinion is that variable 50-60 is way better than whatever PS3 games are running at. I can't say for 360, as I sold mine a few years ago.
Yes, but those games make up a very small percentage of the overall library. You can't simply make that claim and then list off six examples in a library of thousands to prove a point.

Also, variable framerates are awful. 50 fps judders like crazy on a 60 Hz (or higher) display while it would look fine on an old 50 Hz display. When you separate framerate from refresh rate in such a fashion it results in a jerky image.
 

Emitan

Member
Planning to not hit 60? That's fine, developers. I'll just play Nintendo games because they actually give a shit about how a game runs.
 

TheExodu5

Banned
2K is 2048 lines of horizontal resolution, meaning the resolution Avatar was filmed and rendered at (taking the aspect ratio into account) is 2048 x 1156, or thereabouts. So yes, a little over 1080p. :p You are thinking of 4K.

Yeah I jumped the gun a bit on that one. :p

which basically means a less-wasteful form of supersampling.

Less wasteful, sure, but the results of supersampling are leagues beyond MSAA. Images may not show it, but a game shimmers a lot in motion with just MSAA. Supersampling results in incredible IQ which does not shimmer at all in motion.

Though now you can achieve near supersampled results with MSAA + SGSSAA at this point.

Not sure what SGSSAA involves exactly, but it is relatively costly.
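The supersample-then-downsample step everyone keeps referring to can be sketched in a few lines (a toy grayscale example of my own; real SSAA resolves also apply a filter, but averaging is the core idea):

```python
# Ordered-grid supersampling: render at 2x2 the target resolution,
# then average each 2x2 block of samples down to one output pixel.
def downsample_2x2(img):
    """img: list of rows of grayscale samples; dimensions must be even."""
    h, w = len(img), len(img[0])
    return [
        [
            (img[2 * y][2 * x] + img[2 * y][2 * x + 1] +
             img[2 * y + 1][2 * x] + img[2 * y + 1][2 * x + 1]) / 4.0
            for x in range(w // 2)
        ]
        for y in range(h // 2)
    ]

# A hard black/white edge at "4x4 render res" becomes a softened 2x2 result:
hi_res = [[0, 0, 1, 1],
          [0, 0, 1, 1],
          [0, 1, 1, 1],
          [0, 1, 1, 1]]
print(downsample_2x2(hi_res))  # [[0.0, 1.0], [0.5, 1.0]]
```

The 0.5 in the output is the anti-aliased edge pixel: information from four rendered samples survives in one displayed pixel, which is why supersampled images stay stable in motion where MSAA-only images shimmer.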
 

RoboPlato

I'd be in the dick
Wait, did they mention supersampling in the article? I see a lot of people mentioning it, but I don't think that's what they're referring to. Supersampling at 720p would be nice, but I have to imagine that would be a lot more taxing than actually targeting 1080p.
 

CrunchinJelly

formerly cjelly
What is wrong with games looking like games?

Since Sega left the industry (near enough), it seems a bunch of bone-headed tech people have taken over gaming.

Companies like Sega prided themselves on the 60fps experience.
 

linkboy

Member
While a 1080p standard would be nice next gen, I want stable frame-rates more. Nothing pisses me off more than games with bad frame-rates (screw you, PS3 Skyrim, which I know isn't caused by resolution).

If 720p allows developers to set stable frame-rates, I'm all for it.
 

dark10x

Digital Foundry pixel pusher
I can't believe how quickly people on here make useless posts without even thinking about the topic at hand.

"YOUR DOING 30 FRAMESPS AGAIN!?!?! I"M STICKING WITH NINTENDO CUZ THEY KNOW BEST SO THERE!!!1111"

That adds nothing to the discussion and bears no relevance. It's an ignorant statement that basically ignores the entire point of the article.

edit: ha ha, just two posts below this one someone else actually went there. smh

What is wrong with games looking like games?
What is wrong with wanting to achieve superior visuals? That's precisely what Sega was doing in the '90s, after all. Filmic need not imply "realism".
 

Wiktor

Member
720p is too low. Even on a smaller screen it gets blurry. Watching a movie is different because you're a passive viewer; you don't move around to explore and look wherever you want.
Especially in games with lots of tiny details, 1080p is a minimum. BF3 and Arkham City both look really bad at 720p compared to 1080p.
 

TheExodu5

Banned
Wait, did they mention supersampling in the article? I see a lot of people mentioning it but I don't think that's what they're referring too. Supersampling at 720p would be nice but I have to imagine that would be a lot more taxing than actually targeting 1080p.

No they didn't. We're just throwing supersampled examples out there to show what they 'might' hope to achieve. It's the bar that's being set.
 

Emitan

Member
I can't believe how quickly people on here make useless posts without even thinking about the topic at hand.

"YOUR DOING 30 FRAMESPS AGAIN!?!?! I"M STICKING WITH NINTENDO CUZ THEY KNOW BEST SO THERE!!!1111"

That adds nothing to the discussion and bears no relevance. It's an ignorant statement that basically ignores the entire point of the article.

edit: ha ha, just two posts below this one someone else actually went there. smh


What is wrong with wanting to achieve superior visuals? That's precisely what Sega was doing in the '90s, after all. Filmic need not imply "realism".

I don't find aiming for mediocrity from the start something to be celebrated.
 