
360, PS3, and sub-HD resolutions

Ynos Yrros said:
You're saying that at 480p the graphics would have been amazing, and they obviously wouldn't have been. You lose tons of detail when playing in SD.

The lighting and rendering would have been insane. You could essentially have games like Toy Story, or any raytraced stuff. You'd still have high-res texture work, so you wouldn't lose all that detail.

And I was just correcting you with that second point.


Still, I don't see how that has anything to do with what was being discussed.

Read the OP.

He said that games would look insane if they were 480p; I said it's BS, because only at high resolution can games show all the fine detail.

I'm talking about the computational power that 720p rendering eats up. With that freed, you could have really advanced shaders. I've mentioned it in another thread: per-object motion blur, rather than screen-based motion blur. Games could truly look like CG, and run at 60 fps.
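Roughly the difference, in throwaway Python/numpy (a CPU-side sketch, nothing like a real GPU shader; the per-pixel velocity buffer is something the engine would have to write out, so take the names as made up):

import numpy as np

def screen_space_blur(frame, camera_velocity, samples=8):
    # Blur every pixel along ONE global camera-motion vector (vx, vy).
    out = np.zeros(frame.shape, dtype=np.float32)
    for i in range(samples):
        t = i / (samples - 1) - 0.5                  # -0.5..0.5 across the shutter
        dx, dy = (np.asarray(camera_velocity) * t).astype(int)
        out += np.roll(frame, (dy, dx), axis=(0, 1))
    return (out / samples).astype(frame.dtype)

def per_object_blur(frame, velocity_buffer, samples=8):
    # Blur each pixel along its OWN motion vector (an h x w x 2 buffer),
    # so a fast-moving car smears while the static track stays sharp.
    h, w = frame.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    out = np.zeros(frame.shape, dtype=np.float32)
    for i in range(samples):
        t = i / (samples - 1) - 0.5
        sx = np.clip((xs + velocity_buffer[..., 0] * t).astype(int), 0, w - 1)
        sy = np.clip((ys + velocity_buffer[..., 1] * t).astype(int), 0, h - 1)
        out += frame[sy, sx]
    return (out / samples).astype(frame.dtype)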

If HDTV technology weren't out right now, we wouldn't even perceive detail loss to begin with, considering we'd have seen nothing better. Read the post before replying to it, for Pete's sake. Or rather, for your own sake.

He even said that GT5P looks as good on SDTV as it does on HDTV, when I couldn't think of a game that loses more of its graphical fidelity at lower resolutions.

Firstly, there are many, many more detailed games. Secondly, I mentioned it was like a supersampled 640 by 480 game.

If PD had to make GT5 on the PS3, for only 480p...they would have a photorealistic game. That is the point. You're used to watching TV on your 480 set, right? Games could rival that.

Start thinking, dude, before replying with partisan rhetoric; it doesn't serve the thread or the topic at hand.
 
Xyphie said:
Don't care. Good art direction, framerate and making good use of what resources you have is more important than more shaders and pixels for me. One of the best-looking games this generation is running in 480p on my Wii with a great frame rate, great image quality and little loading. It's going to look great 5 years from now; GeoW/KZ2 won't.
It also has colors.

2 pages before the Wii comes in.
 
I don't think it's an issue. Resolution is just an element of the balance of what makes a game look good on a TV. If developers feel they get a better result graphically by pushing more effects, more detail on screen with a better framerate and rendering at native sub-720p with internal upscaling, then it's their choice.
And just like someone else said, it's a bit ridiculous that people complain just because someone else has counted the pixels for them :lol when they couldn't tell the difference by themselves. It completely proves that developers are doing the right thing.
 
Xyphie said:
Don't care. Good art direction, framerate and making good use of what resources you have is more important than more shaders and pixels for me. One of the best-looking games this generation is running in 480p on my Wii with a great frame rate, great image quality and little loading. It's going to look great 5 years from now; GeoW/KZ2 won't.
It also has colors.

Yeah...if you think that Gears or KZ2 lacks good art direction or a solid framerate, or that neither makes good use of its resources, you're not thinking straight.
 
I bet Team Ico's next game will (hopefully) be 600p with no AA and look godly. It will still be a jump from their previous game.
 
FightyF said:
The lighting and rendering would have been insane. You could essentially have games like Toy Story, or any raytraced stuff. You'd still have high-res texture work, so you wouldn't lose all that detail.

And I was just correcting you with that second point.
Read the OP.
I'm talking about the computational power that 720p rendering eats up. With that freed, you could have really advanced shaders. I've mentioned it in another thread: per-object motion blur, rather than screen-based motion blur. Games could truly look like CG, and run at 60 fps.
Firstly, there are many, many more detailed games. Secondly, I mentioned it was like a supersampled 640 by 480 game.
If PD had to make GT5 on the PS3, for only 480p...they would have a photorealistic game. That is the point. You're used to watching TV on your 480 set, right? Games could rival that.
Start thinking, dude, before replying with partisan rhetoric; it doesn't serve the thread or the topic at hand.
You do realize that all the CGI content has to be RENDERED in high resolutions? Same with TV, it's recorded in high resolution. I like how you try to change the topic as quickly as possible every time you are wrong.

GT5P loses tons of fidelity on SDTV; you can't see cars in the distance nearly as well as on an HDTV. I won't even go into discussing the game's detail with you, because you are obviously trying to derail the discussion again.

Content rendered in 640x480 has very little detail and there is no way it could look insane. Playing Uncharted on SDTV makes it lose 90% of its fine detail.
 
The biggest role resolution plays is in graphical fidelity, specifically aliasing, at least where consoles are concerned. The higher the resolution, the less anti-aliasing you have to do.
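Rough numbers on that trade-off (counting AA as brute-force supersampling, which overstates what real MSAA costs, so treat it as a ceiling):

# Back-of-the-envelope sample budgets per frame
for name, w, h, aa in [("1280x720, no AA", 1280, 720, 1),
                       ("1280x720, 2xAA", 1280, 720, 2),
                       ("640x480, 4xAA", 640, 480, 4)]:
    print(f"{name}: {w * h * aa:,} samples")
# 1280x720, no AA: 921,600 samples
# 1280x720, 2xAA: 1,843,200 samples
# 640x480, 4xAA: 1,228,800 samples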

But the biggest issue, if anything, is all the PR mumbo-jumbo we were fed during 04/05 for both consoles (MS's '2xAA standard' or 'killzone2e32005.gif'). The thing is, we readily believed both MS and Sony as to the graphical prowess of the machines because, hey, that's almost where PCs were and it wasn't that hard to believe.

Now we know better, of course.
 
Neither console is as powerful as everyone would like to think. With their video tech, I'm astonished they can do what they have already done.
 
There are tons of amazing looking games this gen that look way better than anything put out last gen.

Far Cry 2
Mirror's Edge
Gears of War 2
Uncharted
Prince of Persia
BG&E 2
Killzone 2
Lost Planet
Resident Evil 5
GRID
TimeShift
Kameo
Banjo
I could go on and on..

I'm totally satisfied with the visual leap this gen and we're less than half way through. Some developers can bring it and some can't. That's all there is to it. It's not the hardware that's at fault here.
 
Why do most PS3 games have no AA compared to their Xbox 360 counterparts, which have at least 2xAA according to that list?
 
Ynos Yrros said:
You do realize that all the CGI content has to be RENDERED in high resolutions? Same with TV, it's recorded in high resolution. I like how you try to change the topic as quickly as possible every time you are wrong.


I never said you should render content at 640 by 480.


I said if games had to be displayed at 480p. BIG DIFFERENCE.

An ideal setup would be to render the game at 800 by 600, and then output at 480p.
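Mechanically it's just this (a toy numpy sketch with a clean 2x ratio for simplicity; 800x600 down to 480p would need a fractional-ratio filter, but the principle is the same, and render_frame is a hypothetical engine call):

import numpy as np

def box_downscale_2x(hi_res):
    # Average each 2x2 block into one output pixel (a simple box filter).
    # E.g. a 960x1280 internal frame becomes a 480x640 output frame,
    # and the averaging buys you "free" 4x antialiasing.
    h, w, c = hi_res.shape
    return (hi_res.reshape(h // 2, 2, w // 2, 2, c)
                  .mean(axis=(1, 3))
                  .astype(hi_res.dtype))

# hi = render_frame(960, 1280)   # hypothetical engine call
# lo = box_downscale_2x(hi)      # 480x640, noticeably cleaner edges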

GT5P loses tons of fidelity on SDTV; you can't see cars in the distance nearly as well as on an HDTV. I won't even go into discussing the game's detail with you, because you are obviously trying to derail the discussion again.

You're the one who changed the topic, and you're trying to do it again with this post. I was talking about the extra processing power that could be used for making better visuals. Not more detailed visuals.

You attempted to make it a detail issue, when under my hypothetical situation, where there would be no HDTVs at all, there wouldn't be any perceived detail loss to begin with.

Content rendered in 640x480 has very little detail and there is no way it could look insane. Playing Uncharted on SDTV makes it lose 90% of its fine detail.

Heh, you've completely missed the point of my post. My point was that you could have a much better rendering pipeline, better lighting, much better shadows (like the soft self-shadows in Halo 3), way more advanced shader effects (again, like per-object motion blur), and have games that rival stuff you normally watch on your 4:3 television.

I'll repeat it again since you apparently have reading comprehension issues: if PD had to make GT5 for 480p and 480p only, it would look completely photorealistic. All that extra processing power could go towards better reflections, 3D grass, tire marks (heheh, even games on the N64 had that), more detail in the tracks, etc.
 
FightyF said:

I never said you should render content at 640 by 480.


I said if games had to be displayed at 480p. BIG DIFFERENCE.

An ideal setup would be to render the game at 800 by 600, and then output at 480p.

If you can render at 800x600, why wouldn't you just display at 800x600? VGA cables exist. And once people start using monitors to play games instead of TVs, they would go higher, to whatever resolutions monitors support, which brings us back to the exact same problem.

After all, if HDTVs did NOT exist, do you even think for a moment that the hardware in the Xbox 360 and PS3 would be remotely as advanced?
 
duckroll said:
If you can render at 800x600, why wouldn't you just display at 800x600? VGA cables exist. And once people start using monitors to play games instead of TVs, they would go higher, to whatever resolutions monitors support, which brings us back to the exact same problem.

It can be safely argued that most gamers would stick with their TV sets.

After all, if HDTVs did NOT exist, do you even think for a moment that the hardware in the Xbox 360 and PS3 would be remotely as advanced?

You wouldn't see a costly BR drive in the PS3, and I think there would have been 215MB of RAM for both...but the CPU and GPU technologies were a natural evolution in hardware technology. Some aspects of them were tailored for HD, but not enough to claim that without HD the consoles wouldn't have been even remotely as advanced.
 
FightyF said:
You wouldn't see a costly BR drive in the PS3, and I think there would have been 215MB of RAM for both...but the CPU and GPU technologies were a natural evolution in hardware technology. Some aspects of them were tailored for HD, but not enough to claim that without HD the consoles wouldn't have been even remotely as advanced.

You mean like the Wii?
 
This is a disappointing generation of consoles in terms of being part of the "HD" era. The 360 and the PS3 don't have the RAM for high-resolution textures, or for AA, which is another memory monster.

As for console games matching PC games in graphics, they were lucky to be pulling that off at the start of this generation, let alone now. Then again, consoles have the better game lineups now anyway, so it's a redundant argument really.
 
The argument starts to be silly, because HDTVs are here, DVB programming is coming, HD DVD (BR) is here, there are almost no CRTs left to sell in shops (at least where I live), and this is independent of the VG industry, so the consoles had to make the jump.
 
kbear said:
There are tons of amazing looking games this gen that look way better than anything put out last gen.

Far Cry 2
Mirror's Edge
Gears of War 2
Uncharted
Prince of Persia
BG&E 2
Killzone 2
Lost Planet
Resident Evil 5
GRID
TimeShift
Kameo
Banjo
I could go on and on..

I'm totally satisfied with the visual leap this gen and we're less than half way through. Some developers can bring it and some can't. That's all there is to it. It's not the hardware that's at fault here.
Funny how every game you mentioned runs at 30 fps :lol
 
rod said:
That's exactly why I'm asking on GAF. I know there are a lot of people here who know their shit, a lot more than I know, so they could shed some light on why these corners ARE being cut.

Pssst, wrong forum. If you want real, honest, intelligent discussion on this subject, go to Beyond3D.

I'm sure Metal Gear 4 will do so, if it runs at a steady 60fps.

You better smoke some weed before you play MGS4 then. The game doesn't run at 60fps, just 30.
 
FightyF said:
That's what I said. Secondly, Sony claimed that their games would be running at 1080p, which they aren't.

Where'd they say this? Obviously they pushed native 1080p support as a major feature, but I don't remember hearing anything about having all of their games running at 1080p.
 
Ynos Yrros said:
This discussion makes no sense whatsoever. No one but Microsoft promised a standard resolution and anti-aliasing level in games; that's what the topic is about.

The topic is about developers having difficulty getting their titles up to HD resolutions. You want to give Sony a pass even though the CEO of their Entertainment division said that the PS3 would run 32:9 1080p at 120fps in the 4th fucking dimension. And that's not a joke, he actually said that.

The discussion is only relevant because you want to turn this into some kind of "But Sony didn't say" thing, which is clearly bullshit. They did say they would hit 1080p and then some. They said it over and over and over again. And so far, they've come woefully short of the benchmark they set for themselves. Microsoft too.

But it's not like it's any different this time around. The only thing that has changed is that the focus now is on resolution. Console manufacturers have always made these grandiose, ridiculous claims that their hardware was going to be the second coming of Christ in digital form. And they always fail to deliver on their lofty promises.

It is interesting to see what developers are doing to squeeze more out of each piece of hardware, so I'm always interested in what's being rendered natively, but so far as I can tell, lowered resolutions aren't the clear indication of one console's superiority over the other that some people want them to be.

Also, Blast Processing.
 
duckroll said:
You mean like the Wii?

The Wii's hardware was not a large jump over the GameCube's. Traditionally we see a computational jump from each generation to the next. There is no strict order of magnitude, but it's very safe to say that the Wii did not make this jump over the previous generation. For all we know, Hollywood is just an enhanced Flipper GPU that runs 1.5 times faster and not much else. Judging by the looks of most Wii games out, I'd say that's pretty accurate.

It's pretty much common knowledge that the Wii wasn't a generational leap over the last gen, so why do you even bring up the Wii?
 
It is easy to back yourself into a corner and unfortunately have to make a sacrifice. Technically challenged devs are not always the reason for a res drop. It could be that the game design and required draws are too extreme for the architecture of the engine.

Take GTA as an example: yeah, the assets could have looked better, but only by sacrificing the view distance. Decisions are made early on and then there is a knock-on effect. They could increase the res, but at the expense of something else.
 
urk said:
The topic is about developers having difficulty getting their titles up to HD resolutions. You want to give Sony a pass even though the CEO of their Entertainment division said that the PS3 would run 32:9 1080p at 120fps in the 4th fucking dimension. And that's not a joke, he actually said that.

The discussion is only relevant because you want to turn this into some kind of "But Sony didn't say" thing, which is clearly bullshit. They did say they would hit 1080p and then some.

I agree the thread is not about this (who said what, who broke what 'promises'), but there is a big difference between saying your machine can render at a certain resolution, and saying all games will run at a certain minimum resolution. Sony did not make the latter claim, as far as I know. They made a big deal out of the PS3 being able to render at 1080p, when the 360 wasn't pushing that point, and I think they were very careful to avoid ever saying all PS3 games would run at any given resolution. It's like when people spec a chip and say it can do however many gigaflops. That's what it's capable of, but it's not a claim about how much power software running on it will use, or is guaranteed to use at a minimum.

I agree we should stop crucifying the platform holders over this, though. It's long since been a reality that sub-HD resolutions are used, so we might as well focus on that reality and why it happens rather than on statements made about it 3 years ago.
 
FightyF said:
The Wii's hardware was not a large jump over the GameCube's. Traditionally we see a computational jump from each generation to the next. There is no strict order of magnitude, but it's very safe to say that the Wii did not make this jump over the previous generation. For all we know, Hollywood is just an enhanced Flipper GPU that runs 1.5 times faster and not much else. Judging by the looks of most Wii games out, I'd say that's pretty accurate.

It's pretty much common knowledge that the Wii wasn't a generational leap over the last gen, so why do you even bring up the Wii?

Because that's a modern console that targets 480p. If HDTVs did not exist (which is a pretty pointless assumption to begin with), who's to say MS wouldn't have put out an Xbox 1.5 and Sony continued on with the PS2? Realistically, I don't see any reason why they would invest so heavily in this next gen otherwise. There was a big jump because HDTVs were getting more popular and they wanted in on that market of people who were willing to spend more money.
 
I don't care if a game is running at sub-HD resolutions. I never understood the whole "596", "620.." etc. pixel trolling. As long as the game is good and it looks good, who cares?
 
Not really. There was a big jump from PS1 to PS2, 20x to 50x in peak or something like that, and the jump to PS3 is most likely of the same order. There is no jump with the Wii (1.5x is less than 2x, i.e. nothing).
 
gofreak said:
I agree we should stop crucifying the platform holders over this, though. It's long since been a reality that sub HD resolutions have been used, so we might as well focus on that reality and why it happens rather than statements about it made 3 years ago.

Yup. I just find it ultra bizarre that so many are willing to duke it out over what was clearly marketing bullshit from both sides. Hardware manufacturers are still crafting their messages as if they believe the majority of consumers can be snowed by technical terminology and hyperbole. Obviously it still works to a certain degree, but I don't think Sony or Microsoft were prepared for 4x zoom and pixel counters to out the reality behind what's going on under the hood.
 
wazoo said:
not really, there was a big jump from ps1 to ps2, 2àx to 50x in peak or something like that, and the jump to ps3 is most likely in the same order. There is no jump with the wii (50% is less than 2, nothing).

There's a big jump from N64 to GC too. What's your point?
 
I'm somewhat over the graphics craze, I play games to enjoy myself... eye ejaculations are simply icing on the cake :)
 
I'm more concerned with the fact that every major sequel this gen (aside from Galaxy) has disappointed the hell out of me.

This resolution nonsense has zero effect on my enjoyment of this gen. =/
 
CoG said:
PS3 is fill rate limited.

Actually, if the game uses HDR it's a bit of a trick getting AA to work with it, IIRC, so that's why you see, for example, Oblivion without AA in the PS3 version, though at a higher resolution compared to the 360 version.
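Part of the trick, at least as it was commonly explained at the time: hardware MSAA resolves (averages) the samples before tone mapping, and averaging linear HDR values and then compressing them gives the wrong answer at high-contrast edges (plus RSX reportedly couldn't multisample FP16 render targets at all). Toy numbers in Python:

import numpy as np

def tonemap(x):
    return x / (1.0 + x)   # simple Reinhard curve, HDR -> 0..1

# Four AA samples in one edge pixel: half bright sky, half dark geometry
samples = np.array([16.0, 16.0, 0.1, 0.1])

print(tonemap(samples.mean()))   # resolve first (hardware AA): ~0.89, edge blows out
print(tonemap(samples).mean())   # tonemap first (what you want): ~0.52, smooth edge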
 
Forsete said:
Actually, if the game uses HDR it's a bit of a trick getting AA to work with it, IIRC, so that's why you see, for example, Oblivion without AA in the PS3 version, though at a higher resolution compared to the 360 version.

Both are 720p 2xAA since they released the GOTY editions.
 
duckroll said:
Here's a better question. With both consoles outputting everything at 720p anyway, how many people on GAF can actually tell if a game is "sub-HD" without a pixel counter TELLING everyone that it is?

ng2rez.jpg
 
Sir Fragula said:
And Sony didn't? Sony emphasised HD resolutions about a billion times more than Microsoft did. Look how that turned out.

Yes, Sony did also. I neglected to mention that in my post. ;) Count me in as one of the people who are underwhelmed by the power of the PS3/360; I've been disappointed with the tech for both.
 
Simple answer.

Both consoles are nowhere near as powerful as fanboys/companies would like you to believe. Nuff said, case closed.

Also... as long as I get great gameplay... I do not give a **** actually. If I wanna be nerdy about resolutions/polygons/... I have my PC. Consoles are about easy insta-fun gameplay (as demonstrated by the Wii, IMHO). NG2 plays superbly... I am happy.
 
Forsete said:
They added AA? Why didn't they patch that into my original version? :(

2xAA was always in the 360 version: 600p 2xAA, upgraded to 720p 2xAA for the GOTY. And I think the PS3 one was always 720p 2xAA if I remember right, or maybe it's that horrible QAA that devs keep using on the PS3.
 
urk said:
Yup. I just find it ultra bizarre that so many are willing to duke it out over what was clearly marketing bullshit from both sides. Hardware manufacturers are still crafting their messages as if they believe the majority of consumers can be snowed by technical terminology and hyperbole. Obviously it still works to a certain degree, but I don't think Sony or Microsoft were prepared for 4x zoom and pixel counters to out the reality behind what's going on under the hood.
It's nothing more than fanboy fodder in most cases. I know you remember, but we had this same discussion a few weeks ago when the GTA and MGO resolutions came out. "Looks good, plays good" are my only concerns. How the developer achieves that doesn't matter to me.
 
FightyF said:
No, not much different. Had it been 2 years later, I could see a difference, but not one year; even though graphics tech could have advanced in that time, these consoles had been in the planning stages for longer. Could Sony have gone with a more powerful CPU if they'd delayed it another year? Remember, the technology they plan to put into these consoles doesn't exist yet when planning starts.

What was the best-looking PC game back in 2005? I think these consoles hold up very well, despite their low resolutions (720p isn't that high on the PC).

Yes, one year probably wouldn't have made much of a difference. Let me change it to 2-3 years. ;) What I'm trying to get across is that Microsoft accelerated the start of this generation prematurely and, in doing so, compromised the power of both the PS3 and 360.
 
duckroll said:
Here's a better question. With both consoles outputting everything at 720p anyway, how many people on GAF can actually tell if a game is "sub-HD" without a pixel counter TELLING everyone that it is?


Amen, duckroll. No one can. Honestly, until it was revealed to be sub-HD I just figured Halo 3 looked a little bit jaggy. I didn't give it a second thought.

I have to say I am a little bit curious about the programming and technical side of the decision to drop the resolution by 20-30% depending on the game, and what specific advantages it buys. I remember reading that for Halo 3 it was the way they'd set up the lighting: you could have had Halo 3 with Halo 2-quality lighting at 720p, or Halo 3 with high-quality HDR lighting at whatever the hell it actually ran at. So yeah, I'm simply curious what the trade-off is whenever this is done. Is it to get the lighting to work? Is it simply to get a small framerate boost? I mean, in NG2, if dropping the res made the game run at 60fps say 90% of the time instead of 70% of the time, it's worth it.

The only time this would ever bug me is if they dropped it so low it scaled poorly, OR it was done simply to avoid pushing the game back a few months, a few months that would have allowed it to run at full resolution and full framerate. But even then, if the game is fun and still looks alright, the number of pixels on screen is the last thing I worry about.
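For what it's worth, the raw pixel math on that 20-30% figure, using the resolution usually cited for Halo 3 (quick Python):

full_720p = 1280 * 720   # 921,600 pixels per frame
halo3 = 1152 * 640       # Halo 3's widely reported native res

print(f"{1 - halo3 / full_720p:.0%} fewer pixels to shade per frame")   # -> 20%

That 20% per frame is roughly the budget that, per the Halo 3 example above, went to the HDR lighting instead.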

The only game I ever noticed had some sort of resolution issue that turned out to be true was PGR3, back at the 360 launch.


I suspect the same thing will happen in the next wave of consoles, where the minimum standard will be 1080p; you'll see all sorts of games running at slightly lower resolutions.
 
nextgeneration said:
Yes, one year probably wouldn't have made much of a difference. Let me change it to 2-3 years. ;) What I'm trying to get across is that Microsoft accelerated the start of this generation prematurely and, in doing so, compromised the power of both the PS3 and 360.

Are you pouting because Microsoft made you cum prematurely?
 
Team Ninja is not very talented. Period. They never have been. Not surprised they took a shortcut like that. The average player will not be able to pick out the lower resolution.

What surprises me is that so few gaffers seem to be able to notice. When I first saw Halo 3 in person, I was not aware of the real resolution but immediately noticed it was not up to res. It was up there with a digital connection to a huge TV, and the image just slithers and crawls all up and down. It's written in every single pixel; how can you not see it?

It doesn't have to look nasty, of course. Look at COD4, one of the cleanest-looking games around... And I had to go to the B3D thread after playing for a few hours because I knew something funny was going on and I suspected the resolution was cheated.

Perhaps there is still not enough widespread distrust of the PR-supplied, supersampled "screenshots" that we all look at. Does it really take that much brainpower to figure out that a perfectly crisp and smooth screenshot is not truly representative of the real game?

As long as they can sell you with bullshots and get a pass on the actual game, they will continue to cut corners.
 
nextgeneration said:
Yes, one year probably wouldn't have made much of a difference. Let me change it to 2-3 years. ;) What I'm trying to get across is that Microsoft accelerated the start of this generation prematurely and, in doing so, compromised the power of both the PS3 and 360.


What if they try it again?
 