
Developers Discuss Benefits Of Targeting 720p/30fps Next Gen, Using Film Aesthetics

Nirolak

Mrgrgr
I hope people weren't expecting 1080p next generation.

Well, and if they were, I hope they're fine with filters and upscaling.

GamesIndustry.biz said:
An interesting discussion kicked off on the blog of NVIDIA's Timothy Lottes recently, where the creator of FXAA (an anti-aliasing technique that intends to give games a more filmic look) compared in-game rendering at 1080p with the style of visuals we see from Blu-ray movies.

"The industry status quo is to push ultra-high display resolution, ultra-high texture resolution, and ultra sharpness," Lottes concluded.

"In my opinion, a more interesting next-generation metric is can an engine on an ultra high-end PC rendering at 720p look as real as a DVD quality movie? Note, high-end PC at 720p can have upwards of a few 1000s of texture fetches and upwards of 100,000 flops per pixel per frame at 720p at 30Hz."

Comparing screengrabs of a game (Skyrim running with a super-sampled anti-aliasing hack) with the Robert Downey Jr Iron Man movie, the NVIDIA man reckons that even at native 1080p with no MSAA, game rendering is still effectively super-sampling compared to the quality we see in theatrical presentations, and maybe game developers could pursue a more filmic look using fewer pixels in concert with other processing techniques. Lottes noted that there is little or no single pixel-width detail in 1080p Blu-ray movies, as we can see in spades in ultra-precision PC presentation, suggesting that the same level of detail can be resolved in gaming without recourse to a 1080p framebuffer - or else utilising 1080p with a lot of filtering that gives the illusion of a lower resolution.

The notion was endorsed by many games developers, with DICE's rendering architect Johan Andersson saying that "It's not about the amount of pixels, it is about the quality of the pixels and how the overall (moving!) picture looks like. Less aliasing = less noise for your brain to interpret = more pleasing and easier to see visuals."

"Something else that bothers me in most games these days is how much contrast there is in the textures," added Prey 2 lead graphics programmer Brian Karis.

"Having physically based material guidelines help but the artists seem to try everything they can to create higher contrast. The result in my opinion is crunchy, noisy and often nasty looking images. I'd call the status quo ultra sharpness, ultra contrast."
GamesIndustry.biz said:
"We do what is essentially MSAA. Then we do a lens distortion that makes the image incredibly soft (amongst other blooms/blurs/etc). Softness/noise/grain is part of film and something we often embrace. Jaggies we avoid like the plague and thus we anti-alias the crap out of our images," added Pixar's Chris Horne, adding an interesting CG movie perspective to the discussion - computer generated animation is probably the closest equivalent gaming has in Hollywood.

"In the end it's still the same conclusion: games oversample vs film. I've always thought that film res was more than enough res. I don't know how you will get gamers to embrace a film aesthetic, but it shouldn't be impossible."
GamesIndustry.biz said:
To achieve a resolution equivalent to today's full HD Blu-ray movies, it therefore follows that the processing requirements could fall quite dramatically from what was previously thought. You don't need MSAA, you don't need super-sampling. While the image may fall foul of pixel-counting in that it doesn't hit native 1080p, precious rendering resources can still be deployed elsewhere.
Source: http://www.gamesindustry.biz/articles/digitalfoundry-tech-focus-does-pixel-count-matter?page=1
The Blog: http://timothylottes.blogspot.com/2012/01/games-vs-film.html
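
To put Lottes' per-pixel numbers in rough context, here's a quick back-of-the-envelope sketch (my own arithmetic in Python, not something from the article; the 100,000 flops-per-pixel figure is his ballpark for an ultra high-end PC, not a measurement):

# Rough shading budgets implied by Lottes' ~100,000 flops per pixel per frame figure.
def shading_budget(width, height, fps, flops_per_pixel=100_000):
    pixels = width * height
    flops_per_frame = pixels * flops_per_pixel
    return pixels, flops_per_frame, flops_per_frame * fps

for label, (w, h, fps) in {"720p30": (1280, 720, 30), "1080p60": (1920, 1080, 60)}.items():
    pixels, per_frame, per_second = shading_budget(w, h, fps)
    print(f"{label}: {pixels:,} pixels, {per_frame / 1e9:.1f} Gflops/frame, {per_second / 1e12:.2f} Tflops/s")

# 720p30:  921,600 pixels,    92.2 Gflops/frame,  2.76 Tflops/s
# 1080p60: 2,073,600 pixels, 207.4 Gflops/frame, 12.44 Tflops/s

That's roughly a 4.5x gap in shading throughput between 720p30 and 1080p60 at the same per-pixel cost, which is the headroom Lottes is suggesting could go into richer per-pixel work instead of raw resolution.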
 

BY2K

Membero Americo
Well, they can't even achieve that half of the time, so I guess it's a nice goal for next-gen.
 

Nirolak

Mrgrgr
Jtrizzy said:
So there is hope for 60 fps next gen?

They say both 720p and 30fps to get the most benefits.

BY2K said:
Well, they can't even achieve that half of the time, so I guess it's a nice goal for next-gen.

Yeah, and I feel their habits this generation suggest they will.

We can actually make 1080p games this generation, and 720p isn't an especially hard bar, but we almost always go for better graphics.
 
Jtrizzy said:
So there is hope for 60 fps next gen?
More like inconsistent 30 to maximize the quality of the pixels.

I keep saying temper your expectations guys. They'll sacrifice IQ, resolution, and framerate to get even the slightest bit more performance out of the systems.
 
GamesIndustry.biz said:
or else utilising 1080p with a lot of filtering that gives the illusion of a lower resolution.
Unless you're going full BIT.TRIP, hell no.
Jtrizzy said:
So there is hope for 60 fps next gen?
Sorry, we've got to wait for James Cameron and Peter Jackson to convince people that film can be greater than 24 fps.
 

jett

D-Member
nonononononono.gif
 
So next gen is all about the full film effect with 24 fps huh.

Are you reading this game devs/console makers? Fuck sub 720p graphics and fuck the blurry mess FXAA creates. You do not want your games to look like or play as a film. Might as well go watch an actual film. Better acting, better story, actual nudity.
 

Shepard

Member
As a mainly PC gamer, 720p still gives me a relatively blurry image no matter how good the game looks, and that's only going to get worse with TVs getting bigger and bigger. They should be aiming higher, imo. I'm okay with a constant 30fps instead of 60, though.
 

Jtrizzy

Member
But better fidelity--textures, artifacts, draw distances, etc.

Locked 30>>>Unstable 60

I'd much prefer a variable 50 to 60 over even a truly locked 30, and who's to believe it would be locked anyway, based on this gen? None of the current crop are locked at 30.

I hate to keep bringing up Skyrim as I was in the other thread, but I'm playing it right now in the upper 50's with dips in towns (vanilla, no mods), and when you compare that to the PS3 where the dips are down to the low 20's, it's staggering. (My comparisons were done early in the PS3 version, well before the save bug did whatever it does)
 

NBtoaster

Member
Don't have a problem with 720p next gen. Just hope MS brings back their policy of needing some form of AA in games to ship (and stick with it).
 

Linkup

Member
I wonder how that will go for the Wii U.

I think devs already had their way with the Wii, putting any turd on it they pleased, so I'm sure they can go 480p 24fps on Wii U and Nintendo wouldn't blink an eye.

Also, why target 30fps when 24fps is the film standard? 720p and 24fps, here we come.
 

dark10x

Digital Foundry pixel pusher
I appreciate the line of thinking here. If this were the path some developers chose to follow, I think we could see some really incredible visuals far beyond anything available now. I'd like to see a fundamental shift in the way rendering is handled and I'd be willing to give up resolution to achieve it.

Here is a 1280x720 shot of Avatar:

2736_19_large.jpg


This is an incredibly smooth, detailed image. On a higher resolution display this would still look phenomenal and very clean. You can't pick out individual pixels or any other image quality blemishes (obviously). Even if we don't technically reach the same level of geometry in a scene like that, we could fake a lot of it and still produce a reasonable facsimile.

Super high bitrate 1280x720 video looks infinitely better than a console game running at 1280x720, doesn't it? In fact, if you pause a 1080p Blu-ray and examine an individual frame, you aren't going to see the same razor sharp pixel definition that a current realtime game might produce. It will appear soft around the edges, though still sharp. This article is suggesting that developers put the more powerful hardware in future game machines to use in order to produce a look closer to this, rather than the 1280x720 games of the past. Put that processing power into image quality in a different way than simply bumping up resolution. With proper filmic motion blur, it would look absolutely stunning even at 30 fps.

This wouldn't work for every game, but I'd love to see developers tackle this in the future.
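
On the motion blur point, the classic offline approach is accumulation: render several sub-frame samples while a virtual shutter is "open" and average them. A minimal sketch (my own illustration, not anything from the article; render(t) is a hypothetical callback that returns an image, e.g. a numpy array, for time t):

import numpy as np

def motion_blurred_frame(render, frame_index, fps=30, shutter=0.5, subsamples=8):
    # 180-degree shutter: the "film" is exposed for the first half of each 1/fps frame.
    frame_start = frame_index / fps
    shutter_open = shutter / fps
    times = frame_start + np.linspace(0.0, shutter_open, subsamples)
    # Average the sub-frame renders to smear motion across the exposure window.
    return np.mean([render(t) for t in times], axis=0)

Real-time engines approximate the same thing far more cheaply with velocity buffers, but the goal is identical: smear each pixel over the exposure time so 30 fps motion reads as continuous instead of strobed.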

They promised it this generation but they didn't always achieve it. There are tons of games that run sub-HD and then upscale to 720p.
Not as many as you seem to think. Just some high profile titles that put the spotlight on it.

Jtrizzy said:
None of the current crop are locked at 30.
That's one hell of a generalization. There are loads of locked 30 fps games available on consoles (99% solid, at least).
 

Orayn

Member
Linkup said:
I think devs already had their way with the Wii, putting any turd on it they pleased, so I'm sure they can go 480p 24fps on Wii U and Nintendo wouldn't blink an eye.

Also, why target 30fps when 24fps is the film standard? 720p and 24fps, here we come.
Ten bucks says they tout 24->30 FPS judder as a new feature.
 
BS. If I hook my PC up to my HDTV (32 inch, so it isn't exactly huge) after having played some console games, the difference between the two in IQ is night and day, even if I use little or no AA with the PC.

The FPS is a different matter, and most high-end games on the PC can only reach 60FPS on very powerful machines, so I'd be alright with them aiming lower. Out of curiosity, why do we never see console games aiming for 40 or 50, only 30 (or sub-30, lol) and 60?
 
Almost feels like this is a matter of taste, upon reading this thread. Personally I'm going with "fuck you, 1080p/50-60 because I don't want to buy a new system that just makes better use of technology that already exists."

Yes, a locked 30 is all well and good, but better hardware ought to be taken advantage of.
 

dark10x

Digital Foundry pixel pusher
Linkup said:
I think devs already had their way with the Wii, putting any turd on it they pleased, so I'm sure they can go 480p 24fps on Wii U and Nintendo wouldn't blink an eye.

Also, why target 30fps when 24fps is the film standard? 720p and 24fps, here we come.
This suggests that you don't know anything about framerates and displays.

Displays typically refresh at multiples of 30 (60 Hz, 120 Hz, 240 Hz). A 30 fps video on a 60 Hz display gets an even one frame for every two refreshes, which provides smooth, even motion.

24 fps is a practice used by film. Most displays do not support the mode necessary to display 24 fps without significant judder and, as a result, 24 fps at home will not look as smooth as 24 fps in a theater (unless your TV properly handles 24 fps content).
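
A quick way to see the cadence difference is to count how many 60 Hz refreshes each source frame gets held for. A small sketch (my own illustration):

def refresh_cadence(fps, refresh_hz=60, frames=8):
    # Refreshes spent on source frame i = refreshes elapsed between its ideal
    # start time (i/fps) and the next frame's ((i+1)/fps), done in integer math.
    return [((i + 1) * refresh_hz) // fps - (i * refresh_hz) // fps for i in range(frames)]

print("30 fps on 60 Hz:", refresh_cadence(30))  # [2, 2, 2, 2, 2, 2, 2, 2] -> even motion
print("24 fps on 60 Hz:", refresh_cadence(24))  # [2, 3, 2, 3, 2, 3, 2, 3] -> 3:2 pulldown judder
print("40 fps on 60 Hz:", refresh_cadence(40))  # [1, 2, 1, 2, 1, 2, 1, 2] -> also uneven

The uneven hold times are the judder; it's also why console games target 30 or 60 on a 60 Hz set rather than something like 40 or 50.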

BS. If I hook my PC up to my HDTV (32 inch, so it isn't exactly huge) after having played some console games, the difference between the two in IQ is night and day, even if I use little or no AA with the PC.
You're comparing current console games and PC games to one another. What they are talking about is something entirely different. It would look nothing like the 1280x720 console games of today.

I feel as if a lot of people here aren't able to see beyond what they already know.
 
I'm a firm believer in quality, not quantity.

I love 60fps games and know some franchises (Forza, for example) will always target 60fps. But 30fps is just fine for many titles and would allow developers to do much more on screen. The video of Battlefield 3 on ultra in the linked thread looks absolutely stunning, and I expect that and more from the next generation of consoles.
 

RoboPlato

I'd be in the dick
:(

I was really hoping 1080p would be standard next gen. I'm fine with 30fps as long as it's locked, but 1080p would make a big difference. I'm actually surprised that they're talking about sticking with 720p. 1080p isn't hard for most PCs from the past few years to achieve, and I would assume that if the next gen is coming at the end of next year, it would be able to handle that resolution while still adding a lot of more advanced effects. A resolution upgrade and some new effects would have a bigger impact than just more effects at the same resolution.
 

TheExodu5

Banned
dark10x said:
Here is a 1280x720 shot of Avatar:

2736_19_large.jpg

This is an incredibly smooth, detailed image. On a higher resolution display this would still look phenomenal and very clean. You can't pick out individual pixels or any other image quality blemishes (obviously).

Sure, you can come away with good IQ, but look at how much detail is lost in that picture.

dark10x said:
24 fps is a practice used by film. Most displays do not support the mode necessary to display 24 fps without significant judder and, as a result, 24 fps at home will not look as smooth as 24 fps in a theater (unless your TV properly handles 24 fps content).

24p with 3:2 pulldown!
 

Deadbeat

Banned
I completely agree with Dark10x on this low resolution and low framerate idea.

See, if console devs are purposely gimping their render output to make incredibly high quality graphics, I will get all the benefit of that on PC. I mean look at PC screens of MW3/Conviction/AC and so forth. Nice high res crisp graphics. I am 100% for this.
 
I just don't get it. The next consoles should be powerful enough that 1080p should not be an issue, even with tons of effects being applied to all those pixels.

With nearly everyone owning a 1080p TV at this point, I just don't get why you wouldn't want to meet that bar every time. I really hope the console makers demand that baseline resolution. It sounds like it's our only hope. :/
 
dark10x said:
Here is a 1280x720 shot of Avatar:

2736_19_large.jpg

This is an incredibly smooth, detailed image. On a higher resolution display this would still look phenomenal and very clean. You can't pick out individual pixels or any other image quality blemishes (obviously). Even if we don't technically reach the same level of geometry in a scene like that, we could fake a lot of it and still produce a reasonable facsimile.

But isn't the native res of Avatar something massive like four thousand by something? That would go a long way towards ridding the image of any undesirables when scaled down to 720p.
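
Taking that "four thousand by something" guess at face value, the effective oversampling from a downscale to 720p is easy to ballpark (my own arithmetic; the 4096x2304 source figure is just an assumption for illustration, not Avatar's actual master resolution):

source_w, source_h = 4096, 2304   # assumed high-res source, 16:9
target_w, target_h = 1280, 720

samples_per_pixel = (source_w * source_h) / (target_w * target_h)
print(f"~{samples_per_pixel:.1f} source samples per 720p pixel")  # ~10.2

So a downscale like that behaves like heavy supersampling, which is a big part of why a 720p film grab looks so clean.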
 

Derrick01

Banned
I guess I'll see some of you holdouts on PC! I had a laugh that there were some who actually thought they'd go for 1080p/60fps next gen and not more graphics stuff. It will never, ever happen.
 

ghst

thanks for the laugh
dark10x said:
Here is a 1280x720 shot of Avatar:

2736_19_large.jpg


This is an incredibly smooth, detailed image. On a higher resolution display this would still look phenomenal and very clean. You can't pick out individual pixels or any other image quality blemishes (obviously). Even if we don't technically reach the same level of geometry in a scene like that, we could fake a lot of it and still produce a reasonable facsimile.
i can't work out if i'm missing something or if this post is really so boneheaded as to imply that avatar was rendered in 720p.
 

TheExodu5

Banned
Also, for the record, this is already pretty much achievable on PC with supersampling, in case anyone wants to check the results.

And from having tried it myself, I can tell you that native 1080p with 4x AA looks better than 2x2 supersampled 720p to my eyes.

Sure, the 720p image looks really nice and clean, but when blown up to a 1080p display, it's obvious the native 1080p image has so much more detail.
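
For anyone who wants to reproduce the comparison, 2x2 ordered-grid supersampling is just rendering at four times the pixel count and box-filtering down. A minimal sketch (my own, using numpy; render_at(w, h) is a hypothetical helper returning an (h, w, 3) float image):

import numpy as np

def supersample_2x2(render_at, out_w=1280, out_h=720):
    hi = render_at(out_w * 2, out_h * 2)        # render at 2560x1440
    hi = hi.reshape(out_h, 2, out_w, 2, 3)      # group each 2x2 block of pixels
    return hi.mean(axis=(1, 3))                 # box filter down to 1280x720

Note that 2560x1440 is about 3.7M shaded samples versus about 2.1M for native 1080p, so the supersampled 720p image isn't even cheaper to shade; it just spends the work on smoothness instead of extra resolved detail, which lines up with the comparison above.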
 

RoboPlato

I'd be in the dick
I just don't get it. The next consoles should be powerful enough that 1080p should not be an issue, even with tons of effects being applied to all those pixels.

With nearly everyone owning a 1080p TV at this point, I just don't get why you wouldn't want to meet that bar every time. I really hope the console makers demand that baseline resolution. It sounds like it's our only hope. :/

I agree. Especially since the general public actually recognizes 1080p as some metric of quality.
 

dark10x

Digital Foundry pixel pusher
You call it an incredibly smooth, detailed image. I call it blurry as fuck.
In motion, though, it would look absolutely amazing. Films aren't designed to be viewed as still shots.

TheExodu5 said:
Sure, you can come away with good IQ, but look at how much detail is lost in that picture.
The overall image is impressive enough that I wouldn't mind losing detail.

ghst said:
i can't work out if i'm missing something or if this post is really so boneheaded as to imply that avatar was rendered in 720p.
You're missing something.

I'm sure as hell not suggesting that Avatar was rendered at 720p. No, what I'm telling you is that these developers are talking about achieving that kind of look (the image as a whole) within games. Producing an image that looks more like a film than a game as we know it. It's not something that truly exists yet and it's not just about anti-aliasing. They're talking about everything within a scene.
 

gofreak

GAF's Bob Woodward
I'm kind of confused here, because I thought films were rendered out at 2xxxXwhatever resolution to start with. Like, that Avatar CG...it wasn't rendered out at 720p.

Why/how would games be 'oversampling' vs film in that case?
 

Grokbu

Member
I personally feel that, in most cases, the feeling and responsiveness of 60FPS is more important than technical image quality.
 