
Developers Discuss Benefits Of Targeting 720p/30fps Next Gen, Using Film Aesthetics

We're talking about 720p with absolutely zero spatial or temporal aliasing. 720p with none of the rendering artefacts we just accept as the norm in games today, even in supersampled PC games. Basically, go grab a Pixar BD and see if you like the image quality there.
Sometimes Brain's posts wind me up with his constant PC preaching but this needs to be QFT.

They're not talking about 720p with a simple blur applied; they're talking about Pixar level IQ at 720p.

720p versions of Pixar films still look incredible when upscaled by my 1080p TV. Taking screengrabs of games, blurring them and upscaling to 1080p is nothing like what's being talked about. That's an apples-to-oranges comparison.

As BS says, go grab a Pixar BD, watch it and then come back.
 

SapientWolf

Trucker Sexologist
Sometimes Brain's posts wind me up with his constant PC preaching but this needs to be QFT.

They're not talking about 720p with a simple blur applied; they're talking about Pixar level IQ at 720p.

720p versions of Pixar films still look incredible when upscaled by my 1080p TV. Taking screengrabs of games, blurring them and upscaling to 1080p is nothing like what's being talked about. That's an apples-to-oranges comparison.

As BS says, go grab a Pixar BD, watch it and then come back.
Pixar uses micropolygons. That level of IQ isn't achievable with consumer level hardware.
 

XGoldenboyX

Member
[image: 2736_19_large.jpg]

I believe we are close

[image: KZ7.jpg]
 
[IMG]http://2.bp.blogspot.com/_nIpPUtoMNy4/S-Y3tGgdwbI/AAAAAAAABIU/EpS8hYx89AI/s1600/2736_19_large.jpg[/IMG]

I believe we are close

[IMG]http://www.gamegavel.com/reviews/wp-content/uploads/2011/03/KZ7.jpg[/IMG]

Killzone 3 is probably the only game this gen I've played that used really noticeable dithering. At times it was an ugly-looking game.
 

Septimius

Junior Member
30 fps is survivable. I know many cry for movies to move to 60 fps, though. The problem with 'leaning on' 30 fps is that it will be very, very noticeable when it drops below 30. And when you're capping at 30, you want to make the most out of those 30 frames you do render, so you're likely to get drops. That's no fun.
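To put some rough numbers on that (assuming a plain 60 Hz display with v-sync; the figures are purely illustrative):

[CODE]
# Rough illustration (Python) of why missing a 30 fps target hurts so much with
# v-sync on a 60 Hz display. Numbers are assumptions for illustration only.
REFRESH_HZ = 60
SLOT_MS = 1000 / REFRESH_HZ            # 16.7 ms per refresh slot

def displayed_fps(frame_time_ms):
    """Frame rate actually shown if every frame takes frame_time_ms to render."""
    slots_needed = -(-frame_time_ms // SLOT_MS)   # ceiling division
    return REFRESH_HZ / slots_needed

for ms in (33.0, 34.0, 40.0, 50.0):
    print(f"{ms:4.1f} ms/frame -> {displayed_fps(ms):.1f} fps on screen")
# 33 ms still gives 30 fps, but 34 ms snaps straight down to 20 fps.
[/CODE]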
 

sentry65

Member
I made some renders and cropped in on them to show examples of what the developers are talking about:

[image: 1080pnoaa.jpg]
[image: 720phighaaupscaled.jpg]
[image: 1080phighaa.jpg]


1080p without AA is going to have shader noise in the reflections and highlights, on top of bad edges.
A high-quality 720p image is blurrier, but looks more realistic without all the artifacts.

1080p with AA, of course, just looks great.
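If anyone wants to fake a comparison like this from a single high-resolution render, something along these lines works. It's only a rough Pillow sketch; "render_4k.png" is a placeholder for whatever aliasing-free 4K source image you have on hand:

[CODE]
# Rough sketch: simulate the three cases above from one high-resolution render.
# Assumes Pillow is installed; "render_4k.png" is a placeholder for any
# 3840x2160, aliasing-free source image.
from PIL import Image

src = Image.open("render_4k.png")

# "720p with high AA": downscaling with a good filter averages many source
# samples per output pixel, which is effectively supersampling...
aa_720 = src.resize((1280, 720), Image.LANCZOS)
# ...then upscale to the display resolution, like a console scaler would.
aa_720.resize((1920, 1080), Image.LANCZOS).save("720p_high_aa_upscaled.png")

# "1080p no AA": nearest-neighbour keeps one source sample per output pixel,
# so edge and shader aliasing survive, roughly like an unfiltered render.
src.resize((1920, 1080), Image.NEAREST).save("1080p_no_aa.png")

# "1080p with high AA": the same supersampling idea at the native resolution.
src.resize((1920, 1080), Image.LANCZOS).save("1080p_high_aa.png")
[/CODE]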
 

charsace

Member
720p can be made to look really great with high AA/AF, great shaders and other tech applied. Maybe this weekend I will fire up Blender or messiahstudio and render a scene at 800x600 and post some pics to show you guys.
 

Wonko_C

Member
I made some renders and cropped in on them to show examples of what the developers are talking about:

[image: 1080pnoaa.jpg]
[image: 720phighaaupscaled.jpg]
[image: 1080phighaa.jpg]


1080p without AA is going to have shader noise in the reflections and highlights, on top of bad edges.
A high-quality 720p image is blurrier, but looks more realistic without all the artifacts.

1080p with AA, of course, just looks great.

Oddly enough, I prefer the sharp 1080p no AA image to the blurry upscaled 720p AA image. But hey, I love playing 320x240 2D games with the sprites unfiltered. 1080p with AA hits the sweet spot, though.
 

mr_nothin

Banned
I don't get this...
You start with the higher resolution and then use filters and post-processing to dirty up the image. You don't start out with the lower resolution.

You want to start out with the cleanest image possible, to give you some headroom, and then you start to scratch up/dirty up/blur up the image. You want to start out with as much information in the image as possible and then fuck it up... you don't want to start out with a fucked-up image and try to clean it up, then dirty it up, lol.

1080p + FXAA (to soften up the image) looks better than 720p upscaled.

It's the same philosophy filmmakers and photographers use. That's why we shoot RAW on our cameras. With this guy's thinking, it's like wanting to shoot on a compressed DSLR instead of a 2K or 4K uncompressed film/R3D camera that can record RAW. If image quality was your only concern, which one would you choose?
 

TUROK

Member
I don't get this...
You start with the higher resolution and then use filters and post-processing to dirty up the image. You don't start out with the lower resolution.

You want to start out with the cleanest image possible, to give you some headroom, and then you start to scratch up/dirty up/blur up the image. You want to start out with as much information in the image as possible and then fuck it up... you don't want to start out with a fucked-up image and try to clean it up, then dirty it up, lol.

1080p + FXAA (to soften up the image) looks better than 720p upscaled.

It's the same philosophy filmmakers and photographers use. That's why we shoot RAW on our cameras. With this guy's thinking, it's like wanting to shoot on a compressed DSLR instead of a 2K or 4K uncompressed film/R3D camera that can record RAW. If image quality was your only concern, which one would you choose?
As has been stated many times before, using a 720p resolution enables developers to get more out of the hardware.

Here, let me use an example. Let's say you have a computer capable of running Battlefield 3 at 1080p, Ultra settings, and 60 FPS. Now imagine that the game had been developed for that computer, and had its polygon count increased, its particle count and complexity increased, its lighting precision increased, and so on and so forth, and these improvements made it so the game would only run at a decent framerate at 720p. You would have a VASTLY SUPERIOR looking game just by spending the processing power that went to resolution and framerate on other things.

It's not just about image clarity, but also about improving the graphical assets.
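Back-of-the-envelope, and assuming shading cost scales roughly with pixels drawn per second (a simplification, since vertex, CPU and bandwidth costs don't all scale that way):

[CODE]
# Rough pixel-budget comparison; assumes per-pixel shading dominates the cost.
budget_1080p60 = 1920 * 1080 * 60   # ~124.4 million pixels shaded per second
budget_720p30  = 1280 *  720 * 30   # ~27.6 million pixels shaded per second

print(budget_1080p60 / budget_720p30)   # 4.5 -> ~4.5x more GPU time per pixel
[/CODE]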
 

kevm3

Member
I'm fine with 720p if it means much better lighting and physics. The 30 fps thing, I'm not so sure about. If you're going to keep that res, you had better make the framerate better than what we're getting this generation. I'm not a huge resolution junkie when it comes to games. I'd much rather they max out 720p before having to fill the screen with more pixels. I'd much rather see much better animation, lighting and physics before super high res and sharp textures.
 

mr_nothin

Banned
As has been stated many times before, using a 720p resolution enables developers to get more out of the hardware.

Here, let me use an example. Let's say you have a computer capable of running Battlefield 3 at 1080p, Ultra settings, and 60 FPS. Now imagine that the game had been developed for that computer, and had its polygon count increased, its particle count and complexity increased, its lighting precision increased, and so on and so forth, and these improvements made it so the game would only run at a decent framerate at 720p. You would have a VASTLY SUPERIOR looking game just by spending the processing power that went to resolution and framerate on other things.

It's not just about image clarity, but also about improving the graphical assets.

What?
BF3 wouldn't look as good at 720p @ 30 fps on Ultra settings. Assets would look a little better, but overall it wouldn't look as good.

What you're talking about is having a standard... period. What I'm talking about is not having that standard be 720p. I've already applied the same type of logic when going from last gen to this gen: "Imagine what they could have done at 480i instead of putting all that power towards upping the resolution." I'm all for having a standard and everything... but 720p/30fps isn't the standard that we should be shooting for.
 

TUROK

Member
What?
BF3 wouldn't look as good at 720p @ 30 fps on Ultra settings. Assets would look a little better, but overall it wouldn't look as good.
Ugh, you're not getting it. If you have an i7, a GTX 580, and develop a game for that specific hardware, a game that will only run at a decent framerate at 720p, it would look a lot better than Battlefield 3 does on the same hardware, despite BF3's resolution and framerate advantages.

Whether you think that's a standard that we should or should not have is another issue.
 

sentry65

Member
I really wish people would forget this whole idea of a "standard" because it's going to be different for every game.

Games will run at whatever the developers decide is the best compromise.
 

mr_nothin

Banned
Ugh, you're not getting it. If you have an i7, a GTX 580, and develop a game for that specific hardware, a game that will only run at a decent framerate at 720p, it would look a lot better than Battlefield 3 does on the same hardware, despite BF3's resolution and framerate advantages.

Whether you think that's a standard that we should or should not have is another issue.

Of course, that's the same philosophy that consoles have. Developing for a closed system will always produce more efficient visuals, but that's not what I'm getting from the OP. They are trying to target a specific resolution and framerate, not a specific graphics card. PC devs already develop around certain graphics cards and CPUs, though. For the Ultra preset on BF3, they dev'd it with the 580/6970 in mind. Of course they can't specifically program the game, down to the metal, just for a 580, because the 580 won't be around for too long.
 

TUROK

Member
Of course, that's the same philosophy that consoles have. Developing for a closed system will always produce more efficient visuals, but that's not what I'm getting from the OP. They are trying to target a specific resolution and framerate, not a specific graphics card. PC devs already develop around certain graphics cards and CPUs, though. For the Ultra preset on BF3, they dev'd it with the 580/6970 in mind. Of course they can't specifically program the game, down to the metal, just for a 580, because the 580 won't be around for too long.

But that's exactly what they're talking about: next gen consoles. There will never be a standard for PCs, because as hardware improves you can improve resolution and framerate just by upgrading, so capping either of those doesn't make sense. But by having such a standard on consoles, they can maximize IQ and asset fidelity at a reasonably decent resolution.
 
I should point out there is a follow-up response posted here: http://www.neogaf.com/forum/showthread.php?p=34265264

My prior comment, "IMO a more interesting next-generation metric is can an engine on an ultra-high-end PC rendering at 720p look as real as a DVD quality movie?" is a rhetorical question asking if it is possible for a real-time engine to start to approach the lower bound of a DVD movie in realism.

To make this clear, I'm not suggesting that games should compromise interactive experience just to get visual quality. If I was going to develop a title for next generation consoles I would output 1080p and run frame-locked to 60Hz with no dropped frames, period. I still believe developers will be able to start to reach the quality of film for next generation consoles and current generation PCs, and I'm intending to develop or prove out some of the underlying techniques and technology which gets us there.
 
I'm a professional rendering artist in the industry, and here are a couple of my thoughts:

The general public, even most hardcore PC gamers, don't really know what they're talking about, because they're basing their opinions on current game renderings, current AA tech, current realtime shaders, and current effects. 720p PS3 games look more realistic than the PS2 HD 1080p remastered games and can do things the PS2 games can't in terms of enemies on screen, etc.

What ultimately is going to happen is that it's going to depend on the game, the director, and the artists, just like it always has. You'll get games like Shadow of the Colossus that opt for a new level of realism and accurate physics at the expense of resolution and framerate. And you'll get games like God of War with higher frame rates and resolution, but simpler rendering.

Yet a game like UT99 or CS is infinitely more playable than the 'realistic' Killzone 2 or Uncharted 2, because the horrendous input lag on both of those games makes it feel like you are wading through a sea of jelly and/or are extremely drunk.

For any game where you actually have to be in control and react to things (so basically anything but slow RTS games and turn-based games) you NEED 60 fps to make it feel right.

Nothing can possibly make up for a game that doesn't control right, nothing.
I'll take 1998 graphics with no discernible input lag over 2011 graphics with 100-200 ms input lag, and so will anyone who actually wants to play videogames instead of only looking at videogames.
No matter how good a game (subjectively) looks, input lag is NOT OK.

If you think that initial first-hour impression of a game looking a bit better is worth sacrificing the gameplay feel, then you have your priorities mixed up.
I started out playing Battlefield 3 MP on my PC on Ultra, as it gave me an average of about 60 fps (with drops to the high 30s); I was impressed by the graphics for about an hour or two.

By the time I hit the last rank in the game I had gradually lowered my settings to never drop under 60 fps, because the feeling of increased input lag whenever the framerate dropped lingered on LONG after the graphics wow factor was gone and was ruining the feel of being in direct control of the game.
I was already playing it on a CRT monitor (so I already had 20-60 ms less input lag at all times than if I had been playing on an LCD monitor, possibly with the image being scaled as well) with a wired mouse, and even on that ideal setup the input lag when it got down to 30 fps was ruining the feel of the game.

I can't even imagine wanting to play the game at 30 fps on an LCD TV that has to upscale and process the image (adding another 100+ ms of latency).

I really think that most people who are OK with it just don't know any better; if they knew what 1:1, no-delay input without anything messing with your timing felt like, they would not be OK with it at all.
 

McLovin

Member
I don't mind 1080p at 30 fps for next gen. 720p is too low for next gen; it's a good target for the Wii U if most of the Wii owners get it. Even hardcore Wii fans keep saying their Mario and Zelda games won't benefit from going HD; as long as they go for a good art style, it shouldn't be a problem. Xbox/PS4 is another story. 720p/30fps is a good current-gen standard. I'm not worried about racing games, since games like GT5 and Forza already run at 60.
 

op_ivy

Fallen Xbot (cannot continue gaining levels in this class)
I made some renders and cropped in on them to show examples of what the developers are talking about:

[image: 1080pnoaa.jpg]
[image: 720phighaaupscaled.jpg]
[image: 1080phighaa.jpg]


1080p without AA is going to have shader noise in the reflections and highlights, on top of bad edges.
A high-quality 720p image is blurrier, but looks more realistic without all the artifacts.

1080p with AA, of course, just looks great.

Thank you for this. 720p with more detailed graphics AND good AA is superior, IMO.
Also, I only own a 720p set!

30 fps with good blur also looks great, and in most cases would be preferable to 60 fps if it's "cheaper" and allows for other graphical improvements as well. It looked pretty great in the 360's PGR3 at launch, which was surprising, since that is one genre I felt always needed 60 fps; now I think arcade games can get away with 30 and blur. I would still greatly prefer 60, though, in more "technical" sim-style games, à la FM and GT.
 

TUROK

Member
Yet a game like UT99 or CS is infinitely more playable than the 'realistic' Killzone 2 or Uncharted 2, because the horrendous input lag on both of those games makes it feel like you are wading through a sea of jelly and/or are extremely drunk.

For any game where you actually have to be in control and react to things (so basically anything but slow RTS games and turn-based games) you NEED 60 fps to make it feel right.

Nothing can possibly make up for a game that doesn't control right, nothing.
I'll take 1998 graphics with no discernible input lag over 2011 graphics with 100-200 ms input lag, and so will anyone who actually wants to play videogames instead of only looking at videogames.
No matter how good a game (subjectively) looks, input lag is NOT OK.

If you think that initial first-hour impression of a game looking a bit better is worth sacrificing the gameplay feel, then you have your priorities mixed up.
I started out playing Battlefield 3 MP on my PC on Ultra, as it gave me an average of about 60 fps (with drops to the high 30s); I was impressed by the graphics for about an hour or two.

By the time I hit the last rank in the game I had gradually lowered my settings to never drop under 60 fps, because the feeling of increased input lag whenever the framerate dropped lingered on LONG after the graphics wow factor was gone and was ruining the feel of being in direct control of the game.
I was already playing it on a CRT monitor (so I already had 20-60 ms less input lag at all times than if I had been playing on an LCD monitor, possibly with the image being scaled as well) with a wired mouse, and even on that ideal setup the input lag when it got down to 30 fps was ruining the feel of the game.

I can't even imagine wanting to play the game at 30 fps on an LCD TV that has to upscale and process the image (adding another 100+ ms of latency).

I really think that most people who are OK with it just don't know any better; if they knew what 1:1, no-delay input without anything messing with your timing felt like, they would not be OK with it at all.

http://www.eurogamer.net/articles/digitalfoundry-nfs-hot-pursuit-face-off?page=2

Need for Speed: Hot Pursuit runs at 30 FPS on consoles and has 83 ms response time.
 
http://www.eurogamer.net/articles/digitalfoundry-nfs-hot-pursuit-face-off?page=2

Need for Speed: Hot Pursuit runs at 30 FPS on consoles and has 83 ms response time.

Criterion are one of the only devs who put maximum importance on input lag, and they are wizards at reducing the number of frames their engine takes to put a result on screen.
Burnout Paradise was also the best game in its class for input lag.

83 ms is still significantly more than 50 ms, and those horrible three-frames-of-input-lag LCD screens that everyone uses (most people have even worse monitors) make it even more important to reduce input lag from the game engine and framerate as much as possible.

In a perfect world where you could clone the Criterion engineers and make them work on every game in the world, it would be just about scraping the 'close enough' side of things.

I think you make a very poor argument, though, as we both know the reality is that almost all 30 fps games have a large amount of input lag (usually 100-166 ms).

You also seem to forget that, in a feeble attempt to minimize the input lag damage, those games then also don't use any form of v-sync.
And then the ugly little problem arises that most devs aren't even capable of making their game actually run at a consistent 30 fps at all, and you get the triple whammy of:
-input lag out the wazoo
-stuttering
-screen tearing like crazy (which, in my subjective opinion, just about negates any effect 'better' graphics can have; it just looks like shit when every second or third frame tears)

This gen is awful for gameplay responsiveness, image quality and immersion (due to tearing and stuttering).
Priorities are messed up; if they can't get 60 fps (and 1080p, as long as we are stuck with LCD TVs) in most real-time (not turn-based) games next gen, then I'm sticking to my PC only for the first time ever.

I barely play my PS3 or Xbox because the games aren't responsive, a problem I didn't have on the PS2 back when games were played on CRT TVs and the vast majority of combat/shooting/action/platform/racing games were 60 fps.
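Just to put the kind of latency chain I mean in one place (every figure below is an illustrative assumption, not a measurement of any particular game or TV):

[CODE]
# Rough end-to-end latency budget for a 30 fps console game on a typical LCD TV.
# Every number is an illustrative assumption, not a measurement.
FRAME_MS = 1000 / 30

chain_ms = {
    "input sampled at the start of the next frame (average)": 0.5 * FRAME_MS,
    "engine/GPU pipeline (3 frames in flight)":               3.0 * FRAME_MS,
    "TV scaling and post-processing":                         33.0,
    "LCD pixel response":                                      8.0,
}

total = sum(chain_ms.values())
for stage, ms in chain_ms.items():
    print(f"{stage:55s} {ms:6.1f} ms")
print(f"{'total':55s} {total:6.1f} ms  (~{total / FRAME_MS:.1f} frames at 30 fps)")
[/CODE]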
 

NIN90

Member
You haven't played games until you've played at 120+ FPS @ 120 Hz. I do it with some older games (CoD2), and I feel like my face is deeply buried in Christina Hendricks' chest every time I do it.
 

mhayze

Member
Hopefully next gen consoles have fast enough CPUs that they can run the game engine at 60 fps (or 120 - fat chance) even if GPU output is limited to 30 (yes, this is possible). This will give it the responsiveness of a locked 60 fps engine with the visual quality improvements usually reserved for a locked 30. I know there are technical difficulties, but so was having one general-purpose core + 6 vector units in the current gen.
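Something like the textbook fixed-timestep pattern, as a very rough sketch (the generic idea only, not how any actual console scheduler works):

[CODE]
# Minimal fixed-timestep sketch: input/simulation at 60 Hz, rendering whenever
# the GPU finishes (here faked at ~30 Hz). Generic pattern, nothing engine-specific.
import time

SIM_DT = 1.0 / 60.0

def poll_input():              # placeholder: read controller state
    pass

def simulate(dt):              # placeholder: physics and game logic
    pass

def render():                  # placeholder: pretend the GPU takes ~33 ms
    time.sleep(1.0 / 30.0)

def run(seconds=1.0):
    accumulator = 0.0
    previous = time.perf_counter()
    end = previous + seconds
    while time.perf_counter() < end:
        now = time.perf_counter()
        accumulator += now - previous
        previous = now
        # Step input and simulation in fixed 60 Hz increments, even though the
        # renderer only delivers a frame every other step or so.
        while accumulator >= SIM_DT:
            poll_input()
            simulate(SIM_DT)
            accumulator -= SIM_DT
        render()

run()
[/CODE]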

The other nice thing about a smaller frame buffer is having more RAM leftover for other things.
 

Kapura

Banned
Hopefully next gen consoles have fast enough CPUs that they can run the game engine at 60 fps (or 120 - fat chance) even if GPU output is limited to 30 (yes, this is possible). This will give it the responsiveness of a locked 60 fps engine with the visual quality improvements usually reserved for a locked 30. I know there are technical difficulties, but so was having one general-purpose core + 6 vector units in the current gen.

The other nice thing about a smaller frame buffer is having more RAM leftover for other things.

Doesn't cheating the system in such a way sometimes result in visual ghosting? I'm out of my depth in a lot of graphical discussions, but I thought that upscaling 30 fps to 60 fps usually resulted in artefacts.
 

Durante

Member
The other nice thing about a smaller frame buffer is having more RAM leftover for other things.
When we're talking about 2GB of RAM (at the very least), is the memory usage of the framebuffer really an issue? Even with 4xAA, a 1080p framebuffer should be around 30 MB.
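Quick sanity check, assuming RGBA8 colour plus a 32-bit depth/stencil buffer per sample (real GPUs also add compression metadata, so treat it as a ballpark):

[CODE]
# 1080p framebuffer with 4x MSAA, assuming RGBA8 colour + D24S8 depth/stencil.
width, height, samples = 1920, 1080, 4
bytes_per_sample = 4 + 4                 # 4 bytes colour + 4 bytes depth/stencil

total = width * height * samples * bytes_per_sample
print(total / 2**20)                     # ~63 MiB with depth, ~32 MiB colour only
[/CODE]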


Doesn't cheating the system in such a way sometimes result in visual ghosting? I'm out of my depth in a lot of graphical discussions, but I thought that upscaling 30 fps to 60 fps usually resulted in artefacts.
That's not what he is talking about. The game would only output 30FPS, but calculate inputs and physics at 60. I don't really see the point though, since you can't react to something you don't see. Also, I know that this is already being done, at least for input and physics.
 

SapientWolf

Trucker Sexologist
Hopefully next gen consoles have fast enough CPUs that they can run the game engine at 60 fps (or 120 - fat chance) even if GPU output is limited to 30 (yes, this is possible). This will give it the responsiveness of a locked 60 fps engine with the visual quality improvements usually reserved for a locked 30. I know there are technical difficulties, but so was having one general-purpose core + 6 vector units in the current gen.

The other nice thing about a smaller frame buffer is having more RAM leftover for other things.
Framebuffer update rates don't have to be the same as input polling and game logic update rates. Even 60 Hz is extremely slow for that sort of thing.
 

XiaNaphryz

LATIN, MATRIPEDICABUS, DO YOU SPEAK IT
I should point out there is a follow-up response posted here: http://www.neogaf.com/forum/showthread.php?p=34265264

My prior comment, "IMO a more interesting next-generation metric is can an engine on an ultra-high-end PC rendering at 720p look as real as a DVD quality movie?" is a rhetorical question asking if it is possible for a real-time engine to start to approach the lower bound of a DVD movie in realism.

To make this clear, I'm not suggesting that games should compromise interactive experience just to get visual quality. If I was going to develop a title for next generation consoles I would output 1080p and run frame-locked to 60Hz with no dropped frames, period. I still believe developers will be able to start to reach the quality of film for next generation consoles and current generation PCs, and I'm intending to develop or prove out some of the underlying techniques and technology which gets us there.
I think he assumed people would take this as obvious. Obviously not, it appears. :lol
 

TUROK

Member
I think you make a very poor argument, though, as we both know the reality is that almost all 30 fps games have a large amount of input lag (usually 100-166 ms).

You also seem to forget that, in a feeble attempt to minimize the input lag damage, those games then also don't use any form of v-sync.
And then the ugly little problem arises that most devs aren't even capable of making their game actually run at a consistent 30 fps at all, and you get the triple whammy of:
-input lag out the wazoo
-stuttering
-screen tearing like crazy (which, in my subjective opinion, just about negates any effect 'better' graphics can have; it just looks like shit when every second or third frame tears)

This gen is awful for gameplay responsiveness, image quality and immersion (due to tearing and stuttering).
Priorities are messed up; if they can't get 60 fps (and 1080p, as long as we are stuck with LCD TVs) in most real-time (not turn-based) games next gen, then I'm sticking to my PC only for the first time ever.

I barely play my PS3 or Xbox because the games aren't responsive, a problem I didn't have on the PS2 back when games were played on CRT TVs and the vast majority of combat/shooting/action/platform/racing games were 60 fps.
Jesus Christ... The number of assumptions you made off one of my posts is staggering.

Not sure which games you're talking about either, but a lot of the AAA console releases have some form of v-sync. Halo: Reach, Uncharted 2 and 3, and Killzone 2 and 3 have v-sync, and Gears of War 2 and 3 have partial v-sync, in which they drop it once too many frames are dropped.
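For reference, 'partial v-sync' is roughly this idea (a generic sketch of the concept, not how Gears actually implements it):

[CODE]
# Rough sketch of the "partial v-sync" idea: stay synced while frames hit the
# 33 ms budget, allow tearing once several frames in a row run long, so the
# game doesn't drop straight from 30 fps to 20. Generic concept only.
TARGET_MS = 1000 / 30
MISS_LIMIT = 3                 # consecutive late frames tolerated before tearing

def keep_vsync(recent_frame_times_ms):
    """True = keep v-sync for the next frame, False = allow tearing."""
    late = 0
    for ms in reversed(recent_frame_times_ms):
        if ms <= TARGET_MS:
            break
        late += 1
    return late < MISS_LIMIT

print(keep_vsync([32.0, 33.0, 31.5]))        # True: holding 30 fps, stay synced
print(keep_vsync([34.0, 36.0, 35.0, 37.0]))  # False: several late frames, tear
[/CODE]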
 