
Developers Discuss Benefits Of Targeting 720p/30fps Next Gen, Using Film Aesthetics

Eccocid

Member
Meh... I'm OK with current game engines and their performance and games, etc.
I just don't want any moar loading times!!!!
 

Nirolak

Mrgrgr
I saw that but just assumed that I completely missed something there because shitting up a perfectly fine picture with a vaseline filter couldn't be their point.

Disregarding the difference in effects, the vaseline picture looks objectively, factually, worse. Right? Am I going crazy here?

Actually, their proposal is that the top picture here looks better.

The top picture is what Pixar actually put on their blu-ray release for example, as opposed to the bottom shot being an "in progress" image.

I'm not sure I agree with them, at least in stills.

blurzsjy7.png
 

TheExodu5

Banned
Unfortunately this is quite true. Every monitor I've owned has produced absolutely awful results when native 1280x720 is fed to it. With the right display, however, 1280x720 can still look excellent.

Upscaled 720p never looks very good on a 1080p display, no matter how good the scaling, if you sit close enough.

Since I sit 2.5' away from my 32" TV screen, I am strongly against 720p as the standard.
 

Dorfdad

Gold Member
I saw that but just assumed that I completely missed something there because shitting up a perfectly fine picture with a vaseline filter couldn't be their point.

Disregarding the difference in effects, the vaseline picture looks objectively, factually, worse.


edit: it would only make sense if the vaseline is the thing that allows them to go for those extra effects.

Looks worse. Sharper, yes, but brighter and with no shadows at all. The background is a blurry mess.
 

Corky

Nine out of ten orphans can't tell the difference.
Wait all this talk about downsampling this or that...

Rendering an image in real time at 3840x2160 and then downsampling it to 1280x720 is just a matter of scaling in the end, is it not? I mean, the real-time downsampling approach would be just as demanding as a game running natively at 3840x2160. Or am I missing something here?
 
Personally I think the blurry image looks better. But on the other hand, I spent a few days trying to play games without my glasses and started to wonder why they looked so horrible and why I had trouble playing them. Turns out I had gotten so used to my glasses that, even sitting mere inches away from my monitor, the image was too blurry for me to play.
 

Nirolak

Mrgrgr
Wait all this talk about downsampling this or that...

Rendering an image in real time at 3840x2160 and then downsampling it to 1280x720 is just a matter of scaling in the end, is it not? I mean, the real-time downsampling approach would be just as demanding as a game running natively at 3840x2160. Or am I missing something here?

The downsampling here is just to make the image softer and more blurred, not to reduce the performance cost.

You could also do this in the reverse direction though, which would save performance, and is another part of the proposal.
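To put rough numbers on it (my own back-of-envelope Python sketch, not something from the article): shading cost scales with the number of rendered pixels, so rendering at 3840x2160 and displaying at 1280x720 costs roughly as much as native 4K, and the soft look comes from the comparatively cheap averaging step at the end.

[code]
import numpy as np

# Back-of-envelope illustration: shading work scales with rendered pixels, so
# "render 4K, display 720p" costs roughly as much as native 4K; the soft look
# comes from the cheap averaging (box filter) at the end.
hi_w, hi_h = 3840, 2160
lo_w, lo_h = 1280, 720
print(f"shading work ratio: {(hi_w * hi_h) / (lo_w * lo_h):.0f}x")  # ~9x native 720p

hi_res = np.random.rand(hi_h, hi_w, 3)  # stand-in for a 4K render
# 3x3 box-filter downsample to 720p -- this averaging is what softens the image
soft_720 = hi_res.reshape(lo_h, 3, lo_w, 3, 3).mean(axis=(1, 3))
print(soft_720.shape)  # (720, 1280, 3)
[/code]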
 

TheExodu5

Banned
Wait all this talk about downsampling this or that...

Rendering an image in real time at 3840x2160 and then downsampling it to 1280x720 is just a matter of scaling in the end, is it not? I mean, the real-time downsampling approach would be just as demanding as a game running natively at 3840x2160. Or am I missing something here?

We're not suggesting they should supersample. We're just setting the bar for what they might hope to achieve. We can actually produce supersampled images (as Denis has provided) to approximate the end results.

As for those Wall-E images...the comparison isn't a good one, since the images are actually different. The shadowing is completely different in both screens.

Either way, not a fan of the top image at all. I don't like it when a film is trying to simulate poor eyesight. The top image makes me feel like I need glasses.
 

Eccocid

Member
The shadows aren't the point of the comparison. That's just an artifact of it being in development.

You don't need all that detail, especially when there is even a slight camera movement (if they are applying motion blur, etc.).
The one below looks like a promo still image where you can show off the detail.
 

Stallion Free

Cock Encumbered
Arguing in favor of 720p for a filmic look is like going to see TDK or MI4 and saying afterwards: "I wish they had filmed the whole movie with normal cameras" rather than "I wish they had filmed the whole movie with IMAX cameras."

And the whole proposal is kind of ridiculous in light of cameras filming at higher and higher resolutions. Girl with the Dragon Tattoo looked ridiculously crisp on my theater's 4K digital projector, and it was filmed with Red cameras. It looked amazing.
 

sleepykyo

Member
Actually, their proposal is that the top picture here looks better.

The top picture is what Pixar actually put on their blu-ray release for example, as opposed to the bottom shot being an "in progress" image.

I'm not sure I agree with them, at least in stills.

blurzsjy7.png

Pixar makes amazing films. Films and games are different beasts though.

Whatever the target frame rate is, someone will always decide to miss it. At least when 60 is missed by a third, you're still at 40fps. With 30fps as the target, miss it by a third and you're at 20fps, which isn't even cinematic.
 

TheExodu5

Banned
The surface looks so much better in that second shot. You can see and feel the texture of the paint chipping. In the top image, it looks like a painted texture instead, and loses all of its three-dimensional quality.
 

jett

D-Member
Actually, their proposal is that the top picture here looks better.

The top picture is what Pixar actually put on their blu-ray release for example, as opposed to the bottom shot being an "in progress" image.

I'm not sure I agree with them, at least in stills.

blurzsjy7.png

There's a case to be made for intentionally blurring out CG and/or adding a layer of film grain. Take a look at this promotional 4K shot from Avatar. It has no film grain or blur of any kind (depth of field aside); it looks razor sharp. And it also looks super fake and unpleasant to the eyes.

This has nothing to do with video games though.
 

Haunted

Member
Personally I think the blurry image looks better. But on the other hand, I spent a few days trying to play games without my glasses and started to wonder why they looked so horrible and why I had trouble playing them. Turns out I had gotten so used to my glasses that, even sitting mere inches away from my monitor, the image was too blurry for me to play.
But do you think the blurry image looks better because it's blurry or because of its better shadowing and colour correction?

I'd say it looks better despite being more blurry. Give me the crisp picture with the shadowing and colouring of the top one for the best of both worlds.
 

pottuvoi

Banned
Though now you can achieve near supersampled results with MSAA + SGSSAA at this point.

Not sure what SGSSAA involves exactly, but it is relatively costly.
Actually, when you turn SGSSAA on you turn MSAA into SSAA: instead of running the shader once per pixel, it runs the shader once per sample (4x runs the shader 4 times within each pixel).

The good thing about SGSSAA is that it also uses proper sample locations, unlike 2x2 and similar modes which use an ordered grid.
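To put rough numbers on that (an illustrative Python sketch, not how the driver actually implements it): MSAA keeps shading at once per pixel while SGSSAA shades every sample, and the sample positions below contrast an ordered 2x2 grid with a sparse rotated pattern.

[code]
# Illustrative only: shading-cost difference between 4x MSAA (shade once per pixel,
# coverage tested per sample) and 4x SGSSAA (shade once per sample), plus
# ordered-grid vs sparse/rotated sample positions within a pixel.
W, H, SAMPLES = 1280, 720, 4

ordered_grid = [(0.25, 0.25), (0.75, 0.25), (0.25, 0.75), (0.75, 0.75)]          # naive 2x2 SSAA
sparse_grid  = [(0.375, 0.125), (0.875, 0.375), (0.125, 0.625), (0.625, 0.875)]  # rotated pattern

msaa_invocations   = W * H            # pixel shader runs once per pixel
sgssaa_invocations = W * H * SAMPLES  # pixel shader runs once per sample -> ~4x the shading cost
print(msaa_invocations, sgssaa_invocations)
[/code]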
 

TheExodu5

Banned
There's a case to be made for intentionally blurring out CG and/or adding a layer of film grain. Take a look at this promotional 4K shot from Avatar. It has no film grain or blur of any kind (depth of field aside); it looks razor sharp. And it also looks super fake and unpleasant to the eyes.

This has nothing to do with video games though.

The reason that shot looks bad is not really because it's super high resolution. It's because there's not enough detail in the image to warrant that resolution. The texturing on his skin is not detailed enough and makes the image look fake. Blurring and film grain simply make it seem like the detail might be there, when in reality it's not. If the detail were there, the high-resolution shot would look good.
 

C.Dark.DN

Banned
But do you think the blurry image looks better because it's blurry or because of its better shadowing and colour correction?

I'd say it looks better despite being more blurry. Give me the crisp picture with the shadowing and colouring of the top one for the best of both worlds.
That will be what Wall-E looks like when it gets a 4K home release.
 
I dearly wish people would realize that games are not films. When you're reacting or controlling instead of just sitting there staring, a higher framerate helps.
 

Wonko_C

Member
Actually, their proposal is that the top picture here looks better.

The top picture is what Pixar actually put on their blu-ray release for example, as opposed to the bottom shot being an "in progress" image.

I'm not sure I agree with them, at least in stills.

blurzsjy7.png


Wait, what...

http://www.youtube.com/watch?v=BAbVVwCxtog#t=06s

What is going on in their brains that makes them think that shortsightedness is better? I have myopia and this isn't funny.
 
But do you think the blurry image looks better because it's blurry or because of its better shadowing and colour correction?

I'd say it looks better despite being more blurry. Give me the crisp picture with the shadowing and colouring of the top one for the best of both worlds.

Right, I get that there are extra effects in the blurry shot that make the overall image better.

But look specifically around the outside of Wall-E's eyes: something about them in the clear image looks a bit off to me.

Although I agree with Exodu5's point that the texture and detail of the paint look so much better in the clear shot, I still think there is something about the softer image that is easier on the eyes and makes it more pleasant to look at. But again, a still image doesn't represent playing a video game.
 

sleepykyo

Member
60 FPS is better for pointer games -- mouse or something like Move -- so I'm not sure this applies to a lot of games. LA Noire 2, sure, but not as a standard.

CoD, Street Fighter, Soul Calibur, Madden. It applies to games where responsive controls are important.
 

zoukka

Member
You can bet your ass that Pixar has shown clips of the movie to people and for some reason they have preferred the less sharp image. We would really need to see the comparison in motion.
 

Thraktor

Member
It all depends on the type of game. A game like Wipeout benefits from a sharper picture and higher framerate, but games that are more focussed on narrative would be better off with 720p and 30fps with the graphical power focussed elsewhere.

If they make a sequel to LA Noire on next-gen consoles, it's the sort of project that would really suit a more "filmic" rendering style. Render it in 720p, entirely in black and white, and at 24fps. Simulate the 180° shutter rule used in filmmaking by rendering each frame as if it had an exposure time of 1/48th of a second (this would actually make it look smoother than most 30fps games). Use a realistic 35mm cinecam-style DoF. Add some slight vignetting to the image and simulate some very light film grain. The result would look far better than any attempt to get the game running at 1080p and 60fps.
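A toy Python sketch of that 1/48th-second exposure idea (my own illustration, not anyone's shipping technique; render_scene is a hypothetical stand-in): several sub-frame renders spread across the open-shutter interval are averaged to approximate film-style motion blur.

[code]
import numpy as np

FPS = 24
FRAME_TIME = 1.0 / FPS          # 1/24 s between frames
SHUTTER = 0.5 * FRAME_TIME      # 180-degree shutter -> 1/48 s exposure
SUBFRAMES = 8                   # temporal samples per frame (quality/cost trade-off)

def render_scene(t, w=64, h=36):
    """Hypothetical stand-in renderer: a bright block moving horizontally over time."""
    img = np.zeros((h, w))
    x = int(t * 200) % w
    img[h // 2 - 2:h // 2 + 2, max(0, x - 2):x + 2] = 1.0
    return img

def render_frame(i):
    t0 = i * FRAME_TIME
    # average sub-frame renders spread across the open-shutter interval
    times = [t0 + SHUTTER * s / (SUBFRAMES - 1) for s in range(SUBFRAMES)]
    return np.mean([render_scene(t) for t in times], axis=0)  # blurred along the motion

blurred = render_frame(3)
print(blurred.shape)  # (36, 64)
[/code]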

Another aspect of the game to take into account would be to restrict the camera, as far as possible, to angles and movements that would actually be feasible with a film camera. For example, when you get into a car, the camera simply swings around the back of the car and you start to drive off. In films (especially older films) this would have to be a cut, as a different camera setup would be used to get car shots than normal outdoor scenes. Of course, this sort of cinematic camera-work has to be implemented in a way that doesn't get in the way of gameplay, but there are games that manage it, such as Eternal Darkness.
 

SapientWolf

Trucker Sexologist
Actually, their proposal is that the top picture here looks better.

The top picture is what Pixar actually put on their blu-ray release for example, as opposed to the bottom shot being an "in progress" image.

I'm not sure I agree with them, at least in stills.

blurzsjy7.png
Godammit Pixar. Godammit.
 

Dennis

Banned
Please explain how 60 fps isn't better for all games.

Because running at 60 fps is an indication that they had performance to spare that could have been used to make the game look better instead.......at 30 fps.

This 60 fps fetish...........

20 fps is all you need to find good places to take screenshots
trollololo
 

M3d10n

Member
I honestly don't quite get what Nvidia is trying to say. The only way you'd manage to get film/CG-like IQ would be to render at a much higher resolution than your intended framebuffer and downsample, and then filter the image. Only using filters is really not going to cut it.

There's only so far one can get bathing a non-antialiased 720p image with filters, that's true. But I think they are talking more about exploring different ways of rendering that reduce aliasing in 720p rather than trying to go for 1080p.

CG often uses a form of antialiasing that is different from MSAA or SSAA: instead of sampling sub-pixels, they sample using a physically based model (like light rays going through a lens) which also grabs samples from outside the pixel's footprint, resulting in a smoother image.

There are some proposed techniques for decoupling sampling from shading, so effects that require multiple samples (not only antialiasing, but focal and motion blur) can be done efficiently. DirectX 11 GPUs support "scattering", which means they can write data (e.g. fragment colors) to arbitrary positions in a buffer. This means a render pipeline can store and access samples in layouts other than a flat bitmap buffer, so you could essentially achieve physically based supersampling effects without actually rendering at 2x/3x/4x the resolution.

But DX11 scatter is far from lightweight and the difference in performance between using it at 720p and 1080p can be substantial.
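As an illustrative Python sketch of that kind of physically based sampling (using a hypothetical trace() call, not any real API): each sample jitters its film position, lens coordinate, and time, so antialiasing, depth of field, and motion blur all fall out of the same sample set.

[code]
import random

def trace(x, y, lens_u, lens_v, t):
    """Toy stand-in for evaluating the renderer at one sample."""
    return (x + y + lens_u + lens_v + t) % 1.0

def sample_pixel(px, py, n_samples=16, filter_radius=1.5):
    color = 0.0
    for _ in range(n_samples):
        sx = px + random.uniform(-filter_radius, filter_radius)  # may land outside the pixel
        sy = py + random.uniform(-filter_radius, filter_radius)
        lens_u, lens_v = random.random(), random.random()        # aperture position -> DoF
        t = random.random()                                      # shutter time -> motion blur
        color += trace(sx, sy, lens_u, lens_v, t)
    return color / n_samples

print(sample_pixel(640, 360))
[/code]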
 
It all depends on the type of game. A game like Wipeout benefits from a sharper picture and higher framerate, but games that are more focussed on narrative would be better off with 720p and 30fps with the graphical power focussed elsewhere.

If they make a sequel to LA Noire on next-gen consoles, it's the sort of project that would really suit a more "filmic" rendering style. Render it in 720p, entirely in black and white, and at 24fps. Simulate the 180° shutter rule used in filmmaking by rendering each frame as if it had an exposure time of 1/48th of a second (this would actually make it look smoother than most 30fps games). Use a realistic 35mm cinecam-style DoF. Add some slight vignetting to the image and simulate some very light film grain. The result would look far better than any attempt to get the game running at 1080p and 60fps.

Another aspect of the game to take into account would be to restrict the camera, as far as possible, to angles and movements that would actually be feasible with a film camera. For example, when you get into a car, the camera simply swings around the back of the car and you start to drive off. In films (especially older films) this would have to be a cut, as a different camera setup would be used to get car shots than normal outdoor scenes. Of course, this sort of cinematic camera-work has to be implemented in a way that doesn't get in the way of gameplay, but there are games that manage it, such as Eternal Darkness.

LA Noire looks better in B&W at 1080p/24 than it does in B&W at 720p/30, in my opinion anyway. There's no reason to limit resolution in order to achieve a 'cinematic' look. My Blu-rays may not have much single-pixel detail at 1080p, but they still have more detail at 1080p than they do at 720p. That's ignoring that I just saw a film shot in 4K the other week and it looked stunning.

I didn't go 'OH MAN, TOO MUCH RESOLUTION'.

Because running at 60 fps is an indication that they had performance to spare that could have been used to make the game look better instead.......at 30 fps.

This 60 fps fetish...........
60 fps better approximates how we really see; even higher would be preferable. But when it comes to gaming, that fetish is as much about how the game plays as it is about how the game looks. 60 fps feels more responsive. Games control better at 60 fps. That's no small detail.

1080p vs 720p looks better, but it doesn't *play* better. 60 fps vs 30 fps looks and plays better.
 

TheExodu5

Banned
Right, I get that there are extra effects in the blurry shot that make the overall image better.

But look specifically around the outside of Wall-E's eyes: something about them in the clear image looks a bit off to me.

Although I agree with Exodu5's point that the texture and detail of the paint look so much better in the clear shot, I still think there is something about the softer image that is easier on the eyes and makes it more pleasant to look at. But again, a still image doesn't represent playing a video game.

The lighting of his eyes is different in both shots. Can't really compare them. The glass is too clear in the second shot.
 

SapientWolf

Trucker Sexologist
CoD, Street Fighter, Soul Calibur, Madden. It applies to games where responsive controls are important.
It's more about visual fluidity than responsive controls. Playing something like MvC3 at 30fps would be like trying to play tennis with a strobe light.
 
That might be an artifact of my resizing.

Check if you still feel this way with the big one: http://0.tqn.com/d/kidstvmovies/1/0/I/H/walle008.jpg
You are right it probably is the browser resizing the image improperly. I am at work and the monitor is a bit small so I can't do a proper comparison.

But I do think that the detail lost in the texture of the flaking paint is the most noticeable/significant difference between the two images. The blur completely destroyed that.
 

XiaNaphryz

LATIN, MATRIPEDICABUS, DO YOU SPEAK IT
It seems developers prefer to use the extra power of a new generation of consoles for textures, shaders and maybe resolution; 60 fps as a standard on consoles is never going to happen, and with every new console the cycle will start again.

so the question is, with next-gen consoles:

a) Do you want your games to look like current-gen games but to run at silky smooth 60 fps (and 720p) or

b) Do you want your games to look better than current-gen games at the expense of fps and resolution?

I choose A but I might be in the minority here
A lot of people want games to look better at higher resolution AND to run at 60 fps.

Actually, their proposal is that the top picture here looks better.

The top picture is what Pixar actually put on their blu-ray release for example, as opposed to the bottom shot being an "in progress" image.

I'm not sure I agree with them, at least in stills.

blurzsjy7.png
I'm not sure that's quite the same argument - because CG stuff is done with essentially virtual cameras and virtual sets, you have to do more work to emulate a "film look" that may be the intended result which requires certain elements to be tweaked and balanced (grain, color, gamma, image stability, resolution, contrast, depth of field, motion blur). This has been the case on pretty much every CG animated film or effects shot I've seen. Not every animated feature may go for a photorealistic take on film look, but some degree of it is almost always agreed upon so the film has a consistent look to it.

Also, to the people talking about film being 24 fps: the effective flash rate is actually double or triple that, because projectors show each frame two or three times before moving on to the next one, which is why you don't notice flickering despite the low frame rate.
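The arithmetic behind the double/triple-flash point, for what it's worth (illustrative only):

[code]
# 2- or 3-blade shutter shows each 24 fps frame two or three times
fps = 24
for blades in (2, 3):
    print(fps * blades, "flashes per second")  # 48 or 72 Hz, fast enough to hide flicker
[/code]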
 

mckmas8808

Mckmaster uses MasterCard to buy Slave drives
I would totally settle for 720p/30 if the picture quality could be increased as much as they say it can.

I agree!

Give me this at 720p and 30 fps.

gran-turismo-5-008.JPG


Now it doesn't have to be GT6 or GT7 in particular, but if they can make the shadows look this good next-gen at 720p and 30 fps I'd LOVE it!
 

XiaNaphryz

LATIN, MATRIPEDICABUS, DO YOU SPEAK IT
I agree!

Give me this at 720p and 30 fps.

gran-turismo-5-008.JPG


Now it doesn't have to be GT6 or GT7 in particular, but if they can make the shadows look this good next-gen at 720p and 30 fps I'd LOVE it!
A racing sim at 30 fps? :/

That's one genre where I don't mind sacrificing visuals for framerate if I absolutely have to. I notice the difference there a lot, especially on my wheel setup.
 

Reallink

Member
2 years in next gen will be about applying sub HD filters to sub HD output for that super sub HD SD aesthetic. Resistance 3 and Alan Wake will look 4k by comparison.
 

Durante

Member
I have been following Timothy Lottes' blog for a while, and I've also followed this discussion with interest.

I think there is some point to it when it comes to oversampling: flickering is one of the most immersion-breaking and persistent motion artifacts in games, and it's often caused by an over-abundance of pseudo-detail (that is, noise).

However, I vehemently disagree when it comes to framerate. Even if you were capable of perfectly simulating film-quality motion blur, games are an inherently interactive medium. A high framerate is necessary to ensure fast response times, and not just smooth motion.
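Rough numbers for the response-time point (back-of-envelope only, ignoring display and input-device latency): every pipeline stage that takes one frame costs a full frame time, and that time halves at 60 fps.

[code]
for fps in (30, 60):
    frame_ms = 1000 / fps
    print(f"{fps} fps: {frame_ms:.1f} ms per frame of pipeline delay")
# 30 fps: 33.3 ms, 60 fps: 16.7 ms -> responses arrive roughly twice as fast at 60 fps
[/code]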
 

Kintaro

Worships the porcelain goddess
So basically, I can skip next generation of console hardware as well since my current PC will still run their games better?

Saves me money!
 

jarosh

Member
and yet: games are not movies. there's no reason to think that the paramount goal of a game developer should be to recreate the look of film no matter the cost, no matter the sacrifices.

games benefit from plenty of things that are inherently undesirable or unneeded in film, considering the medium's non-interactivity. among them are higher resolutions and more frames per second. film is a guided, directed experience with no room for user interaction, so there's no need for added detail information beyond the director's intention. you can't "lose" a movie and you can't stop in place and look around and take in as much information as you want/need. yet in games you are often free to observe your immediate environment from all angles, you move the camera and often even are allowed to get as close to an object or character as you want.

why should it be desirable for a game like skyrim, for example, to have a softer, lower resolution look locked at 30 fps, just so a more film-like "experience" can be achieved? wouldn't you rather have a higher resolution image with the potential to see as much detail as possible when you stand atop a mountain and gaze far into the distance? wouldn't you rather see more frames per second when you run from a rabid beast, making it easier to react to your surroundings while trying to escape and attempting to spot a hiding place or a safe spot to launch a counter-attack? let's not even get into the added benefit of more pixels (=more information) for user interfaces, huds etc...

none of the above would be necessary if skyrim were a movie. the director and cinematographer would plan out precisely what you, the viewer, get to see during every second, every frame of the movie. any battle scene would be carefully choreographed. there is no inherent need for added information, be it through a higher resolution or a higher framerate (which isn't to say that movies won't benefit from either!), because there's no margin of error, no user input, no possibility of alternate outcomes.

film is both storytelling and directed experience. unless we wanna give up any and all interactivity in games, approximating the look of film while compromising progress in the areas that make games unique shouldn't be desirable. or are "cinematic" experiences all we want out of games anymore?
 