
Can 1080p/60fps be a standard next generation?

maniac-kun said:
theres not a single game i play below 1080p on my pc
It's always by my own choice, yeah. I play Bejeweled and some other titles (Super Meat Boy, Dragon Age) in windowed mode (1600x900 or 1280x720) to multitask more easily. Gaming on the side, as it were.


Durante said:
It's better than going from 1600*1200 to 480i back in the CRT days though!
I must admit that I had a 1280x1024 (5:4! How did I survive!) monitor for the longest time. :lol
 
60fps for ALL racing/fighting games!

I can live with 720p, 30fps, good AA, good motion blur + most effects/polygons/shaders/etc possible for all other genres. I'm not sure how many developers will sacrifice that for 1080p.
 
jonremedy said:
You're welcome to do this, but I guarantee you it will play pretty horribly...

Well, I will test and try out various settings next time, as I haven't gamed at that low a consistent frame rate for a long time (Quake/Unreal 1 days). Maybe 25FPS could be the ticket - but my original point is that a frame rate lower than the high 60s does help the world appear more cinematic in its motion. It's just a matter of personal preference how low you're happy for it to go.

I hope game developers are not forced to stick to 60FPS next gen.
 
About the sub-HD debate: going by Beyond3D's list of rendering resolutions on the 360, taking into account 233 retail games (including "high profile" and exclusive titles), we have 46 sub-HD games (I've considered everything under 1280x720 to be sub-HD, even Halo Reach's 1152x720).

So, that makes for about 19% of games in that list alone.

It means that more than 80% of the games are 720p or above.

The fact that some exclusives are indeed sub-hd doesn't really change anything, since even there, most of them are 720p. We could argue about actual quality of the games, but it's irrelevant to the point.

http://forum.beyond3d.com/showthread.php?t=46241
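For what it's worth, the percentages above check out. A quick Python sketch, using only the counts quoted in the post (not a recount of the Beyond3D list itself):

```python
# Sub-HD share on the 360, using the counts quoted above:
# 46 sub-HD games out of 233 listed retail titles.
total_games = 233
sub_hd_games = 46

sub_hd_pct = 100 * sub_hd_games / total_games
print(f"sub-HD: {sub_hd_pct:.1f}%")               # ~19.7%
print(f"720p or above: {100 - sub_hd_pct:.1f}%")  # ~80.3%
```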
 
metareferential said:
About the sub-hd debate: going by beyond3d list of rendering resolutions on the 360, taking into account 233 retail games (including "high profiles" and exclusive titles), we have 46 sub-hd games (I've considered everything under 1280x720 to be sub-hd, even Halo Reach's 1152x720).

So, that makes for about 19% of games in that list alone.

It means that more than 80% of the games are 720p or above.

The fact that some exclusives are indeed sub-hd doesn't really change anything, since even there, most of them are 720p. We could argue about actual quality of the games, but it's irrelevant to the point.

http://forum.beyond3d.com/showthread.php?t=46241
If you included every 360 game ever then that number would probably be greater than 90%.

Seriously, this sub HD myth needs to end now as it's getting tiresome hearing people bring it up.
 
Mr_Brit said:
If you included every 360 game ever then that number would probably be greater than 90%.

Seriously, this sub HD myth needs to end now as it's getting tiresome hearing people bring it up.

Yes, that's the point.

On the PS3 side, to get a fuller picture, we have 64 sub-HD titles among 225 listed retail games.

Taking into account every game on both systems, the percentage of hd titles is even greater, since most of the sub-hd games are multiplatform.
 
Mr_Brit said:
If you included every 360 game ever then that number would probably be greater than 90%.

Seriously, this sub HD myth needs to end now as it's getting tiresome hearing people bring it up.

In some ways it amazes me people don't care about 1080p more.

I mean if you spent the money on an expensive HDTV or computer monitor, you'd want to run everything in the native resolution right?

If you bought a sweet 2560x1600 LCD monitor, you'd run every game on your PC in that resolution, no doubt.

But people who buy even more expensive 1080p HDTVs, generally couldn't care less about running in the 1080p native resolution, and are perfectly satisfied with lower resolution content. So why should the people making the games care that much? I think ~720p is good enough for most people, and that won't be changing for a long time.
 
brain_stew said:
Exactly! :lol

"Small minority" implies it's an almost insignificant number of games, and it most certainly is not. It's still technically a "minority" (i.e. less than 50%), but it's certainly not an insignificant ~10% of new releases as was implied. It's a significant amount now, is actually more common in high-profile releases (at least on the 360 side), and the prevalence of sub-HD releases is increasing all the time.

http://forum.beyond3d.com/showthread.php?t=46241

*throws data into excel*


retail PS3 (223 games total):

native resolution exactly 1280x720p: 137 games

native resolution above 720p: 21 games
(of which 7 use a lower horizontal resolution than 1280px, e.g. 960x1080p)

native resolution below 720p: 65 games
(including all "almost" 720p games, e.g. the ridiculous 1280x718 and horizontally scaled 1080p games that have a lower total pixel count than 1280x720p, even though the vertical resolution is often more important for our visual perception anyway)


That's 29% sub720p games, of a sample that was specifically chosen to be measured for the native resolution (= games someone cares about and games that were easily noticed to have a different resolution). So in reality it should be less.



PSN (29 games total):

native resolution exactly 1280x720p: 5 games

native resolution above 720p: 22 games
(of which only 6 are not native 1080p, even including WO:HD)

native resolution below 720p: 2 games


That's 7% sub720p games (while over half are actually native, full 1080p)



w/o going through the lists, I think it's also safe to say that the majority of the PS3 exclusive, well at least first party, titles are 720p or above (thus the titles you HAVE to play on that platform).




retail Xbox360 (231 games total):

native resolution exactly 1280x720p: 168 games

native resolution above 720p: 9 games
(Though Tekken 6 is actually sub HD if you play with motion blur)
Interestingly there aren't really high profile 1080p games here though.

native resolution below 720p: 54 games
(of course including all "almost" 720p games)


That's 23% sub720p games, and again, of a sample that was specifically chosen to be measured for the native resolution (= games someone cares about and games that were easily noticed to have a different resolution).



XBLA (18 games total):

native resolution exactly 1280x720p: 12 games

native resolution above 720p: 1 game

native resolution below 720p: 5 games


That's 28% sub720p games. For some reason PSN "destroys" XBLA in that test sample. There aren't many DD games looked at by the Beyond3D pixel counters, though, I guess because they're often low profile and probably above 720p most of the time anyway, so there are no "scandals". Kind of a shame, as XBLA offers trials for every game and many are available on PSN too (trials = full game w/o activation key, therefore the best test subject, considering pixel counters sometimes use screenshots or old demos if no one has access to the retail game)
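*throws data into Python instead of Excel* - the four sub-720p percentages in the tally above can be reproduced from the raw counts given (all numbers are the ones in the post, nothing recounted):

```python
# Sub-720p share per sample, using the (sub-720p, total) counts tallied above.
samples = {
    "retail PS3":     (65, 223),
    "PSN":            (2, 29),
    "retail Xbox360": (54, 231),
    "XBLA":           (5, 18),
}
for name, (sub, total) in samples.items():
    print(f"{name}: {100 * sub / total:.0f}% sub-720p")
```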
 
Minsc said:
In some ways it amazes me people don't care about 1080p more.

I mean if you spent the money on an expensive HDTV or computer monitor, you'd want to run everything in the native resolution right?

If you bought a sweet 2560x1600 LCD monitor, you'd run every game on your PC in that resolution, no doubt.

But people who buy even more expensive 1080p HDTVs, generally couldn't care less about running in the 1080p native resolution, and are perfectly satisfied with lower resolution content. So why should the people making the games care that much? I think ~720p is good enough for most people, and that won't be changing for a long time.

People buy HDTVs to watch TV shows or DVDs; they've bought them to replace an old SDTV.

Most countries don't have HD broadcasting. So they end up buying full-HD TVs just because that's what the market has to offer. Every TV is full HD now.

People buy an HDTV and a PS3, and they use composite connections. That's how it goes.
 
Minsc said:
In some ways it amazes me people don't care about 1080p more.

I mean if you spent the money on an expensive HDTV or computer monitor, you'd want to run everything in the native resolution right?

If you bought a sweet 2560x1600 LCD monitor, you'd run every game on your PC in that resolution, no doubt.

But people who buy even more expensive 1080p HDTVs, generally couldn't care less about running in the 1080p native resolution, and are perfectly satisfied with lower resolution content. So why should the people making the games care that much? I think ~720p is good enough for most people, and that won't be changing for a long time.

Like people can even spell native resolution or even know that 1080p is just 1920x1080...
 
It will be the generation after the next.

I think the console manufacturers are going to create cheaper machines than this generation, given how it all played out, sales wise.
 
I don't know why, but anything at 60fps looks fake and cartoony to me, automatically. I vastly prefer 30fps because it looks more "real" to my eyes.
 
Purple Drank said:
I dont know why, but anything at 60fps looks fake and cartoony to me, automatically. I vastly prefer 30 fps because it looks more "real" to my eyes.
my-brain-is-full-of-fuck.jpg
 
RPGCrazied said:
vsync locked would be nice as well.
I wish console devs would start to give the fucking option, at least.

Let the user choose, don't make the decision for him.


Purple Drank said:
I dont know why, but anything at 60fps looks fake and cartoony to me, automatically. I vastly prefer 30 fps because it looks more "real" to my eyes.
See, the only result of this post is that no one is ever going to take you seriously again in the context of framerate discussion. This is not another interesting viewpoint people have to consider in the future, you're just wrong and have no credibility on the matter. Sorry.
 
Even with movies, people haven't adopted Blu-ray as a standard. I know people who have a Blu-ray player plugged into an SDTV, and they still buy more DVDs than Blu-rays.
I just hope stable performance becomes a standard.
 
We'll be lucky to have standard 720p/30fps next gen. No, seriously: a lot more games will be 1080p/60fps next gen, more so than this gen. Just not every game.
 
Purple Drank said:
I dont know why, but anything at 60fps looks fake and cartoony to me, automatically. I vastly prefer 30 fps because it looks more "real" to my eyes.

I agree to a certain extent. I think right now it's because of the technology used to render (I mean, just one example is the slightly sickening "liquid-rolling" low-quality AF-effect we often have in 60fps, an effect which is reduced with a lower framerate). When technology further matures and most of the limits on image quality on all platforms are somewhat overcome I think 60fps will be the preferred choice for both look and feel in all situations.
 
In the 1990s, all Sega and Namco arcade games after 1993 were 60fps, so I consider current gaming a step back in that regard.
 
Purple Drank said:
I dont know why, but anything at 60fps looks fake and cartoony to me, automatically. I vastly prefer 30 fps because it looks more "real" to my eyes.

If the world judders like a bitch when you turn your head in real life, then it's time to go visit the optician.
 
camineet said:
in the 1990s all Sega and Namco arcade games after 1993 were 60fps, so i concider current gaming a step back in that regard.

60fps was an arcade staple almost from the beginning. It's pretty much what made them such a special experience. So smooth.
 
Mr_Brit said:
1080P/30FPS>>>>>720P/60FPS.
So sad that console players have to deal with such limitations and compromises in 2010. :(


It depends on the game. Couldn't imagine a good one on one fighter or a sim racing game running at 30fps.
 
I will never understand the 30>60 people. Now that I have a decent PC, I hate dipping below 60.

30 is very playable but when you get used to 60 it looks bad. The "cinematic" argument doesn't do it for me. Seeing a smooth world that doesn't stutter and take me out of the experience is much more cinematic.
 
Another point is that all these PC players are going to have to forget about playing every game at 1080p/60FPS/8xMSAA when the next generation comes around and games like Crysis are made to look tame in comparison.

Edit: Just to clarify I'm a PC/console player, not some stupid console warrior clamoring for sub HD and sub 30FPS.
 
Mr_Brit said:
1080P/30FPS>>>>>720P/60FPS.

And that is probably what it is going to happen.

Devs aiming at 1080p (just like with 720p this gen), along with graphical bells and whistles, which means 30fps. And for the most demanding games, they can resort to 720p (or generally sub-1080p), screen tearing, crazy framerates xD

Having the same exact assets and geometry though, I don't know if I'll choose 1080 and slower framerate over 720p and better framerate. The good thing is that where we have a choice (pc gaming), we have more options than 720p vs. 1080p.
 
Mr_Brit said:
Another point is that all these PC players are going to have to forget about playing every game at 1080P/60FPS/8xMSAA when the next generation comes around when games like Crysis will be made to look tame in comparison.

Edit: Just to clarify I'm a PC/console player, not some stupid console warrior clamoring for sub HD and sub 30FPS.
It's true that PCs usually have to play catch up for the first (sometimes two) years of a new console generation, but given how this generation is looking to be one of the longest lasting so far (I mean, 2012 or even 2013 until new consoles are introduced isn't out of the question here) with rumours being that we'll see a much smaller leap compared to previous generational jumps, it's very possible that PC tech will be there day 1 with the new consoles.

But even if not, until then (which is at least 2 years, maybe 3+), the gap between the HD consoles and PC is only going to widen more and more. That's the beauty of PC tech. It never stops, it's constantly advancing.
 
Haunted said:
That's the beauty of PC tech. It never stops, it's constantly advancing.

That is why I don't believe the "smaller tech gap" theory.

In 2013, even cheap hardware will still give us a leap much like going from PS2 to PS3.
 
metareferential said:
That is why I don't believe the "smaller tech gap" theory.

In 2013 cheap hardware will still be much like going from ps2 to ps3.
Exactly, even a conservative console will be a massive leap from current-gen consoles. Something with the power of a 6870/6950 should be small, cheap and low-power enough by 2013 on a 22nm process to be included in a console, and would be several times more powerful than the 360 and PS3.
 
I was wrong about one thing, COD games actually use 2xAA.

That being said, I looked at the resolution of 360 games that B3D has compiled and this list is mostly that:

360resolutions.png


78% of Xbox 360 games in that list (which by no means is comprehensive) are 720p or above.

12 games (5%) are sub-HD AND have no AA, those are the games with awful picture quality.
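A rough cross-check of those two figures - the list total isn't stated in the post, so it's inferred here from the 5% number rather than taken from the Beyond3D thread:

```python
# Back out the list size from "12 games (5%)", then apply the 78% figure.
sub_hd_no_aa = 12
list_total = round(sub_hd_no_aa / 0.05)   # ~240 games in the list
hd_or_above = round(0.78 * list_total)    # ~187 games at 720p or above
print(list_total, hd_or_above)
```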
 
60fps should definitely be a standard for online MP first-person games.

Thinking about what the other guy said about PC gaming: why can't we choose in the options menu? Surely they could let you choose between two modes, i.e. graphics or framerate.
 
CurseoftheGods said:
I expected better from charelquin. :'(

Exaggerated for comic effect but my point is serious: you can't give this generation "credit" for the full resolution leap from 480i to 720p when the most graphics-intensive console games specifically don't aim for or achieve a 720p resolution. The response from developers to the abnormally large jump in maximum resolution this generation has not been to find ways to squeeze out performance at that resolution; it's to drop pixels.

The defensiveness about this is always really weird to me. I mean: I still own an SDTV. I'm absolutely not a resolution whore of any kind. :lol

Basileus777 said:
Can they really push stereoscopic 3d without prioritizing framerates? How are you going to make a game 3D (and not at 15 fps) if it runs at 30 fps in 2d mode without resorting to crazy drops in resolution?

The problem here is that if you push a game up to 60 rendering fps to target 3D, you're still going to actually be outputting 30 fps to the user's eyes (unless you're suggesting that people can play their 3D-targeted games in 2D and thereby "cheat" out higher framerates.)

REMEMBER CITADEL said:
As for it being a significant percentage, that's also not only vague (define "significant percentage"), but not necessarily true either. It depends on whether you look at all games (as you should), exclusives, console exclusives, in which period of time and so on.

It is very rare for one of these discussions to actually focus on "all games" in any other context, though. When we have an "upcoming releases" thread it isn't all titles, it's notable titles of likely quality (and, by and large, of relatively broad appeal.) When we talk about what makes a given console worth owning, it's these high-profile games that are under discussion. It's specifically these high-profile, graphically impressive games that are most likely to drop resolution, because they'll have the most ambitious graphical aspirations and therefore the most reason to sacrifice in one area to make them happen.

The Experiment said:
I think the console manufacturers are going to create cheaper machines than this generation, given how it all played out, sales wise.

Almost certainly true, and one reason that a resolution bump will be less valuable to almost all developers than trying to eke out more graphical wow.

Mr_Brit said:
Another point is that all these PC players are going to have to forget about playing every game at 1080P/60FPS/8xMSAA when the next generation comes around when games like Crysis will be made to look tame in comparison.

I actually find that pretty unlikely. This batch of consoles was less powerful relative to then-current PCs than any previous generation, and there are quite a few reasons to expect that trend to continue.
 
charlequin said:
The problem here is that if you push a game up to 60 rendering fps to target 3D, you're still going to actually be outputting 30 fps to the user's eyes (unless you're suggesting that people can play their 3D-targeted games in 2D and thereby "cheat" out higher framerates.)
This is true, but at least if you target a game at 3D you absolutely cannot have any kind of screen tearing. (And framerate instability also becomes a much worse issue)

charlequin said:
I actually find that pretty unlikely. This batch of consoles was less powerful relative to then-current PCs than any previous generation, and there are quite a few reasons to expect that trend to continue.
I agree. Even without discussing cost issues, the physical reality is that modern-day high-end PC GPUs alone have a higher power budget than any future console could afford.
 
metareferential said:
That is why I don't believe the "smaller tech gap" theory.

In 2013 cheap hardware will still be much like going from ps2 to ps3.

I've kind of wondered this as well. I mean, FFS, if we wait till say fall 2012, DX11 is gonna be old hat. They aren't even gonna be making DX10 hardware, or even early DX11 hardware. Same thing with processors. Even the cheap shit is gonna be at least 2-4 gens beyond what we have now. That's all that will be made then.

I still slightly believe in the cheaper theory, but only because even a cheaper console by then is gonna be a fucking massive advancement.
 
Brettison said:
They aren't even gonna be making DX10 hardware or even early DX11 hardware. Same thing with processors.
Well, they certainly weren't making DirectX 7 hardware when Nintendo released the Wii.
 
The problem is not the percentage of published games that run at sub-HD or sub-30fps. The problem is that many of the games that do run at sub-HD, with tearing, or at sub-30fps are some of the most popular and acclaimed games this generation: all the CoDs, Halo 3, GT5, MGS4, GTA4, etc.
 
charlequin said:
The problem here is that if you push a game up to 60 rendering fps to target 3D, you're still going to actually be outputting 30 fps to the user's eyes (unless you're suggesting that people can play their 3D-targeted games in 2D and thereby "cheat" out higher framerates.)

That brings up a possible solution to my needs :] *

There are a lot of games that turn details down to achieve a 'steady' framerate in 3D. I'd love to be able to access that mode to achieve a high non-3D framerate.

Someone make this happen.

p.s. Does anyone have the link for the small rotating box that renders at different framerates? The left side could be xx-fps and the right side yyy-fps.

Just to recap. Framerate is king, many console gamers defer to the expertise of others to determine resolution, LCDs are stupid/CRTs - bring them back.

*Obviously I am talking about the peculiarities of the console space. I didn't feel the need to spell it out, but the "PC Gaming" crowd impresses me daily.
 
Mindlog said:
There are a lot of games that turn details down to achieve a 'steady' framerate in 3D. I'd love to be able to access that mode to achieve a high non-3D framerate.
There's this thing called "settings" in PC games. It allows you to sacrifice details to improve framerate. You can even choose which details to drop!
(Not that you'd need to to get 60 FPS on current console games)

What I'm trying to say is that I'm pretty sure configurable graphics are one of those "pain in the ass" things that console gamers dislike about PC gaming.
 
I have no idea why the same thread keeps popping up on GAF every other day.

Anyway, a little common sense will tell you that 1080p WILL be standard next gen.
720p IS standard this gen (then you do have a MINORITY of sub-standard res games) and the gap between 720p and 1080p is a lot smaller than last gen's 640x480 to 720p.
Plus the advancements in tech should be greater since next gen is taking like twice the time to come out.

60fps on the other hand won't, and will never be standard, simply because 30fps are ok for a number of games and some devs will always choose to sacrifice 60fps for better graphics.

The only chances of 60fps becoming standard are:

A) everyone using some sort of 'natural motion' technique like the one used in that Force Unleashed 2 demo (basically making a 30fps game look 60fps with little or no impact on performance), or even consoles including a dedicated chip for that. Doubtful, but not entirely out of the question imho.

B) If 3d becomes standard then you could probably get a majority of 30 fps 3d games which could maybe run at 60fps in 2d mode.
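As an aside on the "the gap between 720p and 1080p is a lot smaller" claim, the raw pixel counts bear it out (a simple check, nothing more):

```python
# Pixel counts for the three resolutions discussed in the thread.
sd  = 640 * 480     # 307,200 px
hd  = 1280 * 720    # 921,600 px
fhd = 1920 * 1080   # 2,073,600 px

print(hd / sd)      # 3.0x  -> last gen's jump to 720p
print(fhd / hd)     # 2.25x -> the jump from 720p to 1080p
```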
 
1080p doesn't mean much unless you have a really big screen.

Next-generation GPUs should be able to do 720p with good amounts of AA & AF and throw a bunch of geometry and lighting around.

I'd love to see an ATI 6650/7650 or such in the Xbox 720.
 
Purple Drank said:
I dont know why, but anything at 60fps looks fake and cartoony to me, automatically. I vastly prefer 30 fps because it looks more "real" to my eyes.
O.o

Check your sensors asap! Welcome to planet Earth, btw.
 
charlequin said:
I actually find that pretty unlikely. This batch of consoles was less powerful relative to then-current PCs than any previous generation and there are quite a few reasons to expect that trend to continue.
If the speculation that the 360's successor will use some reworked Fusion chip as a central APU pans out, the technology sitting in it will be akin to that available in mid-range contemporary laptops.
 
As the console faithful are so quick to point out: with closed architecture, developers can wrangle every bit of power out of some hardware. Of course, the other edge of that sword is there's only so much power, and that means compromise. Fancy shaders, or 60 fps? Advanced particles, or 1080p?

History dictates that AAA game development leans towards the special effects over resolution or frame rate almost every time.

So no, there is practically no chance that 60fps becomes the standard in the next or any console generation. As long as game devs must choose between frame rate and special effects, and they always will on closed architecture, they will choose special effects.
 
eso76 said:
i have no idea why the same thread keeps popping up on gaf every other day.

Anyway, a little common sense will tell you that 1080p WILL be standard next gen.
720p IS standard this gen (then you do have a MINORITY of sub-standard res games) and the gap between 720p and 1080p is a lot smaller than last gen's 640x480 to 720p.
Plus the advancements in tech should be greater since next gen is taking like twice the time to come out.

60fps on the other hand won't, and will never be standard, simply because 30fps are ok for a number of games and some devs will always choose to sacrifice 60fps for better graphics.

The only chances of 60fps becoming standard are:

A) everyone using some sort of 'natural motion' technique like the one used in that force unleashed 2 demo (basically making a 30fps game look 60fps with little or no impact on performances) or even consoles including some dedicated chip for that. Doubtful but not entirely out of the question imho.

B) If 3d becomes standard then you could probably get a majority of 30 fps 3d games which could maybe run at 60fps in 2d mode.


Don't know how that one works..........

As for your point about S3D making 60fps standard, well, I'm not seeing it. S3D on HDTVs is limited to 720p, so performance between the two modes shouldn't be too dissimilar unless an engine is severely vertex limited and with parallel rasterisers becoming the standard in modern GPUs, well, I don't see that being the case very often.

Techniques like Crytek's S3D implementation could become much more useful and prevalent as well, further negating the difference, and perhaps even making S3D more performant than the standard 1080p mode.


So no, S3D being a requirement isn't going to bring about a standard 60fps requirement.


nubbe said:
1080p don't mean much unless you have a really big screen.

Next generation GPU's should be able to do 720p with good amounts of AA & AF and throw a bunch of geometry and lightning around.

I'd love to see ATI 6650/7650 or such in Xbox 720

23" is considered "really big" these days? Because the difference on even a screen that small is night and day from several feet with rendered 3D content.
 
brain_stew said:
Don't know how that one works..........

As for your point about S3D making 60fps standard, well, I'm not seeing it. S3D on HDTVs is limited to 720p, so performance between the two modes shouldn't be too dissimilar unless an engine is severely vertex limited and with parallel rasterisers becoming the standard in modern GPUs, well, I don't see that being the case very often.

Techniques like Crytek's S3D implementation could become much more useful and prevalent as well, further negating the difference, and perhaps even making the S3D the more performant than the standard 1080p mode.


So no, S3D being a requirement isn't going to bring about a standard 60fps requirement.
Another problem is what would happen if one manufacturer went for 1080p minimum and the other went for 720p. It would either mean a huge visual gulf between the two consoles, with the 720p box getting better visuals, or it would mean that the 720p manufacturer could sell a substantially cheaper machine with equal power but at 720p instead of 1080p which in the eyes of most consumers would provide equal graphics for less cost.

If 1080p is to become standard then it needs to be pushed by both Microsoft and Sony.
 