
360, ps3, and sub-HD resolutions

neight said:
It's funny how no one complained how most ps2 games in the last gen ran at the sub-SD resolution of 512x448.
it's like the gaming version of the uncanny valley... as things get closer and closer, the differences are talked about more and more...
 
Yoboman said:
Err....duh?
I meant sub-SD. Horizontal res of 512 is 128 pixels shy of 640.
wazoo said:
because CRTs are more tolerant of non-fixed resolutions and do not need upscaling.

CRT HD FTW.
Scaling does not add detail. You need resolution for that.
 
neight said:
It's funny how no one complained how most ps2 games in the last gen ran at the sub-HD resolution of 512x448.

The difference is that Microsoft touted their system as an HD system. And with that, we've come to expect that the baseline for all their games would at least be 720p.
 
I suppose there's the good kind of sub-HD resolution, and the bad kind.

Good kind: freeing up more power per pixel to do things other, higher-resolution games aren't.

Bad kind: cutting processing load to reach performance targets that could be achieved with more effort/time etc.

I don't mind it, as long as there's a very obvious visual case for it (i.e. the game looks noticeably better 'per pixel' than higher-res games). I agree, though, that thus far it's more the bad kind we're seeing.
 
rod said:
it seems having a game running at native 720 these days is a fricken luxury, but isn't it what we were promised?
 
Bearillusion said:
I know GRID is 30 but GT5p is 60. Not Sony's problem it isn't on 360.

So how do your examples disprove what I said? Look, in order to get a game running at 60 fps, you need everything to be calculated within 16 ms. Double that if you're at 30 fps. A big problem is the 10 megs of memory used for the frame buffer on the 360. It's just too small. If you want to use a frame buffer of 1280 x 720 AND use MSAA, you need to use predicated tiling. The problem is predicated tiling takes time which really hurts you when you're trying to cram everything into 16 ms. That is why you have to lower the resolution on the 360 in order to allow it to fit into the frame buffer and have MSAA while trying to avoid predicated tiling. That is a bottleneck on the 360.
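To put rough numbers on the eDRAM point (a back-of-envelope sketch, assuming a 32-bit color buffer plus a 32-bit depth/stencil buffer and nothing else resident; real layouts vary):

```c
#include <stdio.h>

/* Back-of-envelope check of the 360's 10 MB eDRAM budget. Assumes
   4 bytes of color + 4 bytes of depth/stencil per pixel, with both
   buffers multiplied by the MSAA sample count. */
static double edram_mb(int w, int h, int msaa)
{
    return (double)w * h * (4 + 4) * msaa / (1024.0 * 1024.0);
}

int main(void)
{
    const double EDRAM_MB = 10.0;
    struct { int w, h, msaa; } cfg[] = {
        {1280, 720, 1},   /* 720p, no AA       */
        {1280, 720, 2},   /* 720p, 2xMSAA      */
        {1024, 600, 2},   /* CoD4-style sub-HD */
    };
    for (int i = 0; i < 3; i++) {
        double mb = edram_mb(cfg[i].w, cfg[i].h, cfg[i].msaa);
        printf("%4dx%-4d %dxMSAA: %5.2f MB -> %s\n",
               cfg[i].w, cfg[i].h, cfg[i].msaa, mb,
               mb <= EDRAM_MB ? "fits, no tiling" : "needs predicated tiling");
    }
    return 0;
}
```

Note how 1024x600 with 2xMSAA squeaks in just under 10 MB while 720p with 2xMSAA doesn't. That's the whole tradeoff in one line of arithmetic.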
 
Time constraints, limited budget and perhaps the skill/platform experience of the development team.

There are plenty of games on both platforms that are a testament to what developers can get out of the systems when they put their minds to it and have the financial backing.
 
I think people are looking at this from a completely skewed point of view. Resolution is only one piece of the puzzle when it comes to resource allocation; geometry, lighting, physics etc. all take away from overall system power, and all require compromises.
As gamers this gen, we have been collectively asking for more "realistic" lighting, and we got Halo 3 because of it, which frankly is one of the most impressive games I have ever seen once you ignore the jaggies (VGA gaming helps there). We asked for better physics, so we got Euphoria, and games like GTA4 benefit from it. We asked for 60 fps, so we got NG2, etc. None of those games were "HD" in res, but they contained elements that define the "next gen" standards. See a pattern here?

For better or worse, we have had the first "HD" console generation released before the technology had reached the average consumer, and it has resulted in a situation where - like always - PR has blinded people to the fact that "true" HD console gaming is a generation away. I see this gen like the first "3D" gen (Saturn/PS/N64): looking back, those consoles were released too soon, and while they produced some great games, 3D gaming only came of age in the PS2/DC/GC/Xbox era. I expect the same here.

So yeah, enjoy this gen for what it is, but don't expect true HD gaming to become a reality until the next generation of consoles, whenever they hit.
 
nextgeneration said:
No, what I'm saying is that Microsoft pushed Sony to release their system earlier than they wanted to. Had Microsoft released their system 1 or 2 years later, and Sony released their system another year after Microsoft, I think things would be different.

No, not much different. Had it been 2 years later, I can see a difference, but not one year; though graphics tech could have advanced in that time, these consoles had been in their planning stages for longer. Could Sony have gone with a more powerful CPU if they had delayed it another year? Remember, the technology they planned to put into these consoles didn't exist yet.

What was the best looking PC game back in 2005? I think these consoles hold up very well, despite their low resolutions (720p isn't that high by PC standards).


rod said:
It has everything to do with the developer. They set out to make a game with this feature and that feature, and something has to take a hit, whether it's texture detail, framerate, or resolution.

Sometimes these decisions pay off, and sometimes they don't.

I'd argue that in the case of CoD4, they pushed a high framerate at the expense of resolution. I think it was a move that ultimately paid off, as it looks great on my projector and 62" DLP. Had it been native 720p and locked at 30fps, I would not have been as impressed, personally.

Halo 3 has probably the best lighting outside of Crysis, but I think their tradeoff (incredible lighting at a lower resolution, WITHOUT the anti-aliasing that makes CoD4's lower resolution more bearable... or bearable at all, really) was a bad decision. Had they gone with simpler lighting on par with, say, CoD4, they could have still had a great looking game with other more useful effects, like advanced object-based motion blur (I'm talking about the more advanced algorithms that blur all fast-moving objects, rather than a cheaper screen-based method) and higher polycounts (human characters could have used them in their faces). You can see instances of the excellent lighting at some points, in some levels, but for the most part the low resolution was far more apparent. They traded something you notice 30% of the time (heavenly lighting) for something you notice all of the time (resolution). Bad decision making right there.

Bearillusion said:
GRID is running at 720p and looks fantastic. GT5p runs higher than 720p and looks fantastic.

GT5p runs at 720p. Not "higher than".
 
duckroll said:
Here's a better question. With both consoles outputting everything at 720p anyway, how many people on GAF can actually tell if a game is "sub-HD" without a pixel counter TELLING everyone that it is?


This. I remember for GTAIV, everyone was like "ok, feels a bit blurry on the PS3, dithery on the 360, and that's about it". Then the 640p thing dropped and some people were like "LOLZ I KNEW IT"
Ridiculous.
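To be fair, the counting itself isn't voodoo. A toy version of the idea, assuming nearest-neighbor scaling (real counters work from edges in actual screenshots, which is messier):

```c
#include <stdio.h>

#define NATIVE 600   /* pretend native height, e.g. 600 lines */
#define OUTPUT 720   /* the 720p frame it gets scaled into    */

int main(void)
{
    int native[NATIVE], scaled[OUTPUT];

    /* Use the line index as a stand-in pixel value so every
       native scanline is distinct from its neighbors. */
    for (int y = 0; y < NATIVE; y++)
        native[y] = y;

    /* Nearest-neighbor upscale to the output height. */
    for (int y = 0; y < OUTPUT; y++)
        scaled[y] = native[y * NATIVE / OUTPUT];

    /* Count distinct runs: each run came from one native line,
       so the run count recovers the native height. */
    int runs = 1;
    for (int y = 1; y < OUTPUT; y++)
        if (scaled[y] != scaled[y - 1])
            runs++;

    printf("%d output lines -> %d runs -> native height ~%d\n",
           OUTPUT, runs, runs);
    return 0;
}
```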

FightyF said:
GT5p runs at 720p. Not "higher than".

1280x1080
 
GAF is to blame.

Well, not GAF specifically, but gamer expectations and internet forums. We bitch and bitch and bitch about every single piece of media released for every game, and tear each one to shreds if it doesn't live up to our unrealistic expectations. So devs cheat, trying to fool us into thinking we're getting what we want in pre-release media, hyping us up, and securing those pre-orders before dropping the reality bomb on us.
 
Yoboman said:
Err....duh?
Sub-SD - 512x512 (or 512x448) was common because the difference between 640 and 512 horizontal pixels is imperceptible on 99% of SDTV displays. Heck, it's tough to spot the difference on any CRT display, including VGA monitors.

And it wasn't anything exclusive to PS2 - there were GC and even XBox games doing the same. The reasons were just different - last gen resolution was lowered to gain memory, not performance.
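To put rough numbers on the memory point (assuming 32-bit double-buffered color plus a 32-bit Z buffer in the GS's 4 MB of embedded VRAM; plenty of real games used 16-bit buffers instead, so this is purely illustrative):

```c
#include <stdio.h>

/* How much of the PS2 GS's 4 MB of VRAM the framebuffers eat at
   640 vs 512 wide, leaving the rest for textures and effects. */
static double buf_kb(int w, int h) { return w * h * 4.0 / 1024.0; }

int main(void)
{
    const double VRAM_KB = 4096.0;
    const int widths[] = { 640, 512 }, h = 448;

    for (int i = 0; i < 2; i++) {
        int w = widths[i];
        double used = 2 * buf_kb(w, h)   /* front + back buffer */
                    +     buf_kb(w, h);  /* Z buffer            */
        printf("%dx%d: %4.0f KB in buffers, %4.0f KB left for textures\n",
               w, h, used, VRAM_KB - used);
    }
    return 0;
}
```

Dropping from 640 to 512 wide nearly doubles the VRAM left over, which is exactly the kind of headroom I mean.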
 
Fafalada said:
The reasons were just different - last gen resolution was lowered to gain memory, not performance.

What, memory is not part of performance now? How much memory a game engine has available to it affects tons of performance issues. :P
 
wazoo said:
because CRTs are more tolerant of non-fixed resolutions and do not need upscaling.

CRT HD FTW.

I love my CRT HDTV. There's not much that runs through it that doesn't look good, regardless of resolution.
 
duckroll said:
What, memory is not part of performance now? How much memory a game engine has available to it affects tons of performance issues. :P

just as important as the memory itself is the memory bandwidth, and if you're bottlenecked there, a resolution drop will often help.
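A toy estimate of what that buys (the per-pixel byte counts and overdraw factor are illustrative guesses, not measured figures):

```c
#include <stdio.h>

/* Rough framebuffer traffic: 4 bytes color + 4 bytes Z, read and
   written once per pixel touched, times an assumed overdraw factor. */
static double gbps(int w, int h, int fps, double overdraw)
{
    return (double)w * h * overdraw * (4 + 4) * 2 * fps / 1e9;
}

int main(void)
{
    printf("1280x720 @60, 3x overdraw: %.2f GB/s\n", gbps(1280, 720, 60, 3.0));
    printf("1024x600 @60, 3x overdraw: %.2f GB/s\n", gbps(1024, 600, 60, 3.0));
    return 0;
}
```

Toy numbers, but the traffic scales linearly with pixel count, so a ~33% pixel cut is real headroom if bandwidth is the wall.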
 
IMHO, the timeframe these developers and console manufacturers have been working in is the key. HDTV has only become popular in the last 2 years, and the games coming out now probably started development 3-4 years ago. There was no reason to build a game around a trend like HDTV that wasn't popular when the game was initially set up. I believe from now on games will be better at using the hardware and will run with more AA and at higher resolutions. But still, we have to remember the majority of homes around the world have not jumped on the HD bandwagon yet.
PCs have had high resolutions for 5 years, so PC games have been built for them.
 
Marty Chinn said:
So how do your examples disprove what I said?

I wasn't trying to disprove your point about the resources needed for 30fps vs 60fps. Most devs don't even target 60fps, as they settle for 30fps. But to say that the 360 and PS3 aren't capable of pumping out awesome looking games at a true 720p is false when you look at Gears and Uncharted.
 
FightyF said:
No, not much different. Had it been 2 years later, I can see a difference, but not one year; though graphics tech could have advanced in that time, these consoles had been in their planning stages for longer. Could Sony have gone with a more powerful CPU if they had delayed it another year? Remember, the technology they planned to put into these consoles didn't exist yet.

Sony would not have benefited so much. They could have gotten a better GPU, maybe. But even then, their problem is Cell programming, and when devs do not even use more than one SPE, how do you benefit from getting 8-10-12 instead of 7?
 
rod said:
i mean, if you compare the raw specs of a 360 or a ps3, while they don't match a top of the line gaming pc these days obviously, they shouldn't be having problems with something so fucking simple as 1280x720 and/or 2xAA.
So, uh, developed any graphics systems for current-gen consoles recently? That shit ain't simple.
 
rod said:
i just expected so much more from these consoles
Me too. For consoles that do nothing but harp on graphical prowess, I haven't seen one damn game that has made me say wow. Graphically speaking.

I'm sure Metal Gear 4 will do so, if it runs at a steady 60fps. Team Ico will make me believe as well.
 
duckroll said:
How much memory a game engine has available to it affects tons of performance issues. :P
That's true - but there was a difference I was trying to point out. In a 360/PS3 game, you drop resolution to run more stably at 30fps or whatever.
In a PS2/GC game, you'd drop resolution to be able to fit more effect buffers etc. into VRAM (as opposed to not having those effects at all, or having them at lower quality), without considering performance much (or at all).
 
Thanks for the great replies guys, especially Marty and Fighty. That 10MB of RAM article kind of explains the AA situation with the 360. What's the deal with the PS3 though, does it deal with AA in a different way?


Also, do you guys think that maybe in the future we should have graphics options in games? I'm not talking about super complex features like in PC games, as I doubt your common household casual gamer would know what AA or AF are, but to be honest, in some of these games I would've loved to adjust my settings, even if just to see how they ran with custom settings instead of being FORCED to render at 600p upscaled to 720p.

That brings up another question. When you change the resolution in a PC game, I'm guessing it is in no way, shape, or form upscaling, but rather re-rendering the image? Or am I mistaken? If so, would that all BE POSSIBLE on home consoles?
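My understanding, could be wrong: on PC the render target itself changes size, so the scene really is re-rendered at the new resolution, while a console's hardware scaler can only blend pixels that already exist. A toy sketch of the difference (the "scene" function and all the numbers are made up):

```c
#include <stdio.h>
#include <stdlib.h>

/* Stand-in for a renderer: the exact scene value at any coordinate. */
static float scene(float u, float v) { return u * u + v; }

/* Native rendering: one fresh scene evaluation per output pixel. */
static void render(float *dst, int w, int h)
{
    for (int y = 0; y < h; y++)
        for (int x = 0; x < w; x++)
            dst[y * w + x] = scene((float)x / w, (float)y / h);
}

/* Upscaling: nearest-neighbor resample of an already-rendered,
   lower-resolution buffer. No new detail can appear. */
static void upscale(float *dst, int dw, int dh,
                    const float *src, int sw, int sh)
{
    for (int y = 0; y < dh; y++)
        for (int x = 0; x < dw; x++)
            dst[y * dw + x] = src[(y * sh / dh) * sw + (x * sw / dw)];
}

int main(void)
{
    enum { SW = 320, SH = 180, DW = 1280, DH = 720 };
    float *low = malloc(sizeof(float) * SW * SH);
    float *up  = malloc(sizeof(float) * DW * DH);
    float *nat = malloc(sizeof(float) * DW * DH);

    render(low, SW, SH);              /* render small...       */
    upscale(up, DW, DH, low, SW, SH); /* ...then stretch it up */
    render(nat, DW, DH);              /* vs. rendering big     */

    int i = 500 * DW + 777;           /* compare one output pixel */
    printf("upscaled: %f  re-rendered: %f\n", up[i], nat[i]);

    free(low); free(up); free(nat);
    return 0;
}
```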
 
zenbot said:
So, uh, developed any graphics systems for current-gen consoles recently? That shit ain't simple.


That's exactly why I'm asking on GAF. I know there are a lot of people here who know their shit, a lot more than I know, so they could shed some light on why these corners ARE being cut.
 
If you want technical details you should probably lurk/post on Beyond3D. I'm sure there are people here who know their shit but there are even more people here who pretend to know their shit. That's no good.
 
FightyF said:
Could Sony have gone with a more powerful CPU if they had delayed it another year? Remember, the technology they planned to put into these consoles didn't exist yet.
If Sony had targeted Q1 2007 from the onset, we'd have seen a completely different GPU inside. There was no need for an extra year, really (partly because they had already delayed once in the first place).
If the 360 had targeted 3-6 months later, we'd probably have a RROD-free console (and possibly some adjustments in clock speeds).

That said - I'm talking about planning those release dates from the start - not delaying in the last minute.
 
Fafalada said:
That's true - but there was a difference I was trying to point out. In a 360/PS3 game, you drop resolution to run more stably at 30fps or whatever.
In a PS2/GC game, you'd drop resolution to be able to fit more effect buffers etc. into VRAM (as opposed to not having those effects at all, or having them at lower quality), without considering performance much (or at all).

That's bull. It's already been explained several times in this thread alone that most resolution drops are being made simply to fit more effects and AA.
 
nextgeneration said:
The difference is that Microsoft touted their system as an HD system. And with that, we've come to expect that the baseline for all their games would at least be 720p.
And Sony didn't? Sony emphasised HD resolutions about a billion times more than Microsoft did. Look how that turned out.
 
Sir Fragula said:
And Sony didn't? Sony emphasised HD resolutions about a billion times more than Microsoft did. Look how that turned out.

Yeah, but like, a team which "feels at home" working with MS, which developed on 360 alone, and which had already developed a game on the same engine (DoA4) did not manage a true HD res. Capcom at least managed that with DMC4, a multiplatform title, and the 360 version also has 2xAA (?) and runs at 60fps. Nobody is forgetting the talk about the PS3 being HD and all, but I would not have expected the same from NG2.
 
duckroll said:
That's bull. It's already been explained several times in this thread alone that most resolution drops are being made simply to fit more effects and AA.
My point was there's a difference when you choose between
"lowres + do something" and "hi-res + do something but a bit slower".
And
"lowres + do something" and "hi-res + can't do something at all" - the latter being indicative of the choices you usually had last gen.
 
Sir Fragula said:
And Sony didn't? Sony emphasised HD resolutions about a billion times more than Microsoft did. Look how that turned out.

Sony's big games (Uncharted, GT5P, Ratchet, Motorstorm, Resistance) are 720p. MS's big games (Halo 3, NG2) are not.
 
The hardware isn't the problem, it's the developers (and consumers) deliberately choosing polygons, textures and effects over resolution.
 
Fafalada said:
My point was there's a difference when you choose between
"lowres + do something" and "hi-res + do something but a bit slower".
And
"lowres + do something" and "hi-res + can't do something at all" - the latter being indicative of the choices you usually had last gen.

Let's see:

Marty Chinn said:
So how do your examples disprove what I said? Look, in order to get a game running at 60 fps, you need everything to be calculated within 16 ms. Double that if you're at 30 fps. A big problem is the 10 megs of memory used for the frame buffer on the 360. It's just too small. If you want to use a frame buffer of 1280 x 720 AND use MSAA, you need to use predicated tiling. The problem is predicated tiling takes time which really hurts you when you're trying to cram everything into 16 ms. That is why you have to lower the resolution on the 360 in order to allow it to fit into the frame buffer and have MSAA while trying to avoid predicated tiling. That is a bottleneck on the 360.

Sounds like sometimes it IS a case of "hires and can't do something at all" to me.
 
Scotch said:
The hardware isn't the problem, it's the developers (and consumers) deliberately choosing bullshots over resolution.
Fixed. Even if your game runs at 640*480, if you have great models and textures you can make a 16000*9000 bullshot, scale it down, and tell the media it uses in-game assets. Now that most screenshots released no longer show the actual in-game IQ, developers have little incentive to increase it.
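That trick is just brute-force supersampling: average a block of rendered samples per final pixel and the jaggies melt into smooth gradients. A 1D toy version (sizes and edge position are arbitrary):

```c
#include <stdio.h>

#define SRC 32       /* stand-in for the giant render */
#define FACTOR 4     /* how much we shrink it         */

int main(void)
{
    float src[SRC], dst[SRC / FACTOR];

    /* A hard, jaggy edge, deliberately not block-aligned. */
    for (int x = 0; x < SRC; x++)
        src[x] = (x < 14) ? 0.0f : 1.0f;

    /* Box-filter downscale: average FACTOR samples per output
       pixel. This is exactly what supersampling AA does. */
    for (int x = 0; x < SRC / FACTOR; x++) {
        float sum = 0.0f;
        for (int k = 0; k < FACTOR; k++)
            sum += src[x * FACTOR + k];
        dst[x] = sum / FACTOR;
    }

    for (int x = 0; x < SRC / FACTOR; x++)
        printf("%.2f ", dst[x]);   /* edge comes out smoothly shaded */
    printf("\n");
    return 0;
}
```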

duckroll said:
Sounds like sometimes it IS a case of "hires and can't do something at all" to me.
Your quote is exactly what Faf said: do something, but slower.
 
wazoo said:
Sony would not have benefited so much. They could have gotten a better GPU, maybe. But even then, their problem is Cell programming, and when devs do not even use more than one SPE, how do you benefit from getting 8-10-12 instead of 7?
I guess you could make the argument that if Sony had gone for a more powerful GPU then underutilising CELL would have mattered less because devs wouldn't be forced to leverage it in graphics tasks just to bolster RSX's deficiencies and help it keep up?
 
duckroll said:
Here's a better question. With both consoles outputting everything at 720p anyway, how many people on GAF can actually tell if a game is "sub-HD" without a pixel counter TELLING everyone that it is?
It's one of the reasons I stopped playing CoD4. I could tell something was wrong with it, even before I knew it was running below 720p. The game is a pixelated mess when upscaled to 1080p, and I had serious trouble picking out players at a distance in multiplayer because of it.
 
The only game I could really tell wasn't native 720p was Haze; it's a blurry mess. CoD4, Halo 3, etc. are very hard to tell they ain't native 720p games.
 
Isn't it just par for the course? Some devs just do things better than other devs, but there will always be devs who pull tricks to get what needs to be done, done. It won't matter what the console generation is. How many Wii games are actually 480p?

If you're a true resolution whore, you're playing on a PC anyways, where you laugh in the faces of console gamers complaining about resolutions.

"720p? What's that? Sounds much lower than 1900x1200...."
 
duckroll said:
Sounds like sometimes it IS a case of "hires and can't do something at all" to me.

Marty Chinn said:
The problem is predicated tiling takes time which really hurts you when you're trying to cram everything into 16 ms
It sounds like a performance issue to me ;)

The times where it's really a case of "can't do it at all" are with software architectures that are completely incompatible with the concept of tiling for whatever reason, where re-engineering is just not a viable solution. But there's no reason for exclusive, let alone 1st party, titles to fall into that category.
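For anyone wondering what that looks like from the engine's side, a very loose sketch (all names made up; the real API differs):

```c
#include <stdio.h>

/* The shape of predicated tiling: when the render target + MSAA
   exceeds eDRAM, the frame is split into tiles and the geometry
   submission is replayed once per tile, with draws outside the
   tile predicated away. The replay is the time cost; engines
   built around one monolithic pass over the whole frame are the
   "incompatible architectures" mentioned above. */

typedef struct { int x0, y0, x1, y1; } Rect;

/* Hypothetical stand-in for replaying the frame's command buffer. */
static void replay_scene(Rect tile)
{
    printf("  replay geometry for tile (%d,%d)-(%d,%d)\n",
           tile.x0, tile.y0, tile.x1, tile.y1);
}

int main(void)
{
    const int W = 1280, H = 720;
    const int TILES = 2;  /* 720p + 2xMSAA ~14 MB -> two ~7 MB tiles */

    for (int t = 0; t < TILES; t++) {
        Rect tile = { 0, t * H / TILES, W, (t + 1) * H / TILES };
        replay_scene(tile);                        /* extra time spent here */
        printf("  resolve tile %d out of eDRAM\n", t);
    }
    return 0;
}
```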
 
proposition said:
I guess you could make the argument that if Sony had gone for a more powerful GPU then underutilising CELL would have mattered less because devs wouldn't be forced to leverage it in graphics tasks just to bolster RSX's deficiencies and help it keep up?

You are right, but still, the main problem of the PS3 is the under-utilization of the Cell, something that would not have changed one year later (it would be even worse, because we would be one year behind in the mastering of the Cell). And I'm not sure they would have pushed for a better GPU, since Sony was all about the Cell. One year later, Cell yields would have been better, so the 8th SPE would have been OK at least, maybe a second Cell in the wildest dreams.
 
I think people had insane expectations. Seriously, we've known the consoles' specs since way before they were released. Did you expect some kind of miracle when both have 512 MB of RAM?

We should all feel satisfied, because I'm not sure I can point to any PC game looking as good / running as smoothly as the top dogs (Uncharted, Gears, Ratchet, CoD, etc.) with such "low" tech specs.

I personally like what I see; I think there's a HUGE step up from last gen, and that's what is important. I don't fucking care if the game runs @ 645.37p instead of 720p if it pleases my eye. And I'll take this "cheating" over framerate or tearing issues anytime (AC, I'm looking at you).


edit : spelling lulz
 
I can easily tell which games are running at sub-720p and which aren't. If the game doesn't have anti-aliasing it is even more obvious. The reason this issue isn't spoken about much, despite being massively widespread, is that the majority of people who own PS3s and Xbox 360s are still using 480i television sets, with a small number using high definition sets but ignorant of the fact that their picture output is still set to 480i. Simply put, you really can't tell what is sub-720p if your picture is only outputting at 480i. Only a small minority will ever experience what a sub-720p resolution looks like, and the developers know this.
 