
The 360 & AA (i.e. 'free AA' or whatever it's called)

Raistlin

Post Count: 9999
vdo said:
Do you have any thoughts as to why Sony did not use any EDRAM for PS3? They seemed to have good success in PS2 and if they are looking at it for PS4, it seems strange that they did not implement it in PS3.

The problem is that they do not have experience designing T&L (i.e. shaders) into their GPU architecture.

For that reason, they decided to go with nVidia in order to have current T&L handling as well as a more dev-friendly graphics architecture. There's already a ton of bitching about CELL. Just imagine if their GPU was custom too :p


Now that they've established some level of long-term collaboration ... the hope is the PS4 may end up being the best of both worlds.
 

Nostromo

Member
Shogmaster said:
The amounts of ROPs, the bandwidth available, and the pixel shaders abilities obviously don't allow for easy 1080p games with all the popular trimmings this gen. Even your game is struggling to keep 30fps @ 720p since you are going for prettiest visuals possible. I'd argue that even 720p is a struggle this gen with all the shader goodies.
You're going completely off the mark; read what I wrote again.
I'm telling you that the number of ROPs (unless you have a lot of transparent stuff on screen) is not so important when your most expensive pixels require a fill rate of 1/10 of a pixel per clock cycle!
 

Raistlin

Post Count: 9999
Nostromo said:
We certainly don't need edram in the future unless the rendering pipeline changes substantially.

Really?



I was under the impression both nVidia (now, with Sony) and ATI have been looking into doing this for the future?
 
Brimstone said:
No.



The eDRAM for the 360 is there to help alleviate bandwidth pressure from having only a single 128-bit bus shared between the GPU and CPU.


The PS3 doesn't have this problem because CELL and RSX each have their own memory bus.

RSX has a substantial pixel-shading edge. None of its transistor budget is allocated to eDRAM; instead, more processing area is focused on pixel shaders.

They are both very important. Notice I said "one of the most important", not the most important.
 
Nostromo said:
You're going completely off the mark; read what I wrote again.
I'm telling you that the number of ROPs (unless you have a lot of transparent stuff on screen) is not so important when your most expensive pixels require a fill rate of 1/10 of a pixel per clock cycle!
So you are saying 8 ROPs is plenty for 1080p and the limiting factors are pixel shaders and rendering bandwidth for these consoles?
 

Kaako

Felium Defensor
Bungie has spoken:

You Owe me 80p!

One item making the interwebs rounds this week was the scandalous revelation that Halo 3 runs at “640p” which isn’t even technically a resolution. However, the interweb detectives did notice that Halo 3’s vertical resolution, when captured from a frame buffer, is indeed 640 pixels. So what gives? Did we short change you 80 pixels?

Naturally it’s more complicated than that. In fact, you could argue we gave you 1280 pixels of vertical resolution, since Halo 3 uses not one, but two frame buffers – both of which render at 1152x640 pixels. The reason we chose this slightly unorthodox resolution and this very complex use of two buffers is simple enough to see – lighting. We wanted to preserve as much dynamic range as possible – so we use one for the high dynamic range and one for the low dynamic range values. Both are combined to create the finished on screen image.

This ability to display a full range of HDR, combined with our advanced lighting, material and postprocessing engine, gives our scenes, large and small, a compelling, convincing and ultimately “real” feeling, and at a steady and smooth frame rate, which in the end was far more important to us than the ability to display a few extra pixels. Making this decision simpler still is the fact that the 360 scales the “almost-720p” image effortlessly all the way up to 1080p if you so desire.

In fact, if you do a comparison shot between the native 1152x640 image and the scaled 1280x720, it’s practically impossible to discern the difference. We would ignore it entirely were it not for the internet’s[GAF] propensity for drama where none exists. In fact the reason we haven’t mentioned this before in weekly updates, is the simple fact that it would have distracted conversation away from more important aspects of the game, and given tinfoil hats some new gristle to chew on as they catalogued their toenail clippings.

http://www.bungie.net/News/content.aspx?type=topnews&cid=12821

There you go folks, and thank you, Bungie, for stepping up and telling us the reason for your decision to go 640p native.
I fixed the quote a lil bit btw :D
Any thoughts?
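Bungie doesn't spell out how the two 1152x640 buffers are actually combined, but the general idea of splitting a high-dynamic-range value across two low-precision buffers and recombining them can be sketched like this (the encoding below is purely illustrative, not Bungie's real format):

```python
import numpy as np

# Hypothetical sketch: split HDR luminance across two 8-bit buffers,
# one holding the low range [0, 1] and one the overflow above 1.0,
# then recombine. The exact encoding Bungie used is not public.

def split_hdr(hdr, max_value=8.0):
    """Encode an HDR image (non-negative floats) into two 8-bit buffers."""
    low = np.clip(hdr, 0.0, 1.0)               # low dynamic range [0, 1]
    high = np.clip(hdr, 1.0, max_value) - 1.0  # overflow above 1.0
    buf_low = (low * 255).astype(np.uint8)
    buf_high = (high / (max_value - 1.0) * 255).astype(np.uint8)
    return buf_low, buf_high

def combine(buf_low, buf_high, max_value=8.0):
    """Recombine the two 8-bit buffers into a single HDR value."""
    return buf_low / 255.0 + buf_high / 255.0 * (max_value - 1.0)

scene = np.array([0.25, 1.0, 4.0])  # example luminance values
lo, hi = split_hdr(scene)
recovered = combine(lo, hi)         # close to scene, up to 8-bit quantization
```

The point of the two buffers is exactly what the quote says: neither 8-bit buffer alone can hold both the dark detail and the bright overflow, but together they preserve the full range.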
 
We would ignore it entirely were it not for the internet’s propensity for drama where none exists. In fact the reason we haven’t mentioned this before in weekly updates, is the simple fact that it would have distracted conversation away from more important aspects of the game, and given tinfoil hats some new gristle to chew on as they catalogued their toenail clippings.

LOL @ Bungie
 

65536

Banned
a Master Ninja said:
Wow, I just bought a 1080p tv specifically for use with my Xbox 360. What a waste...
Well you should have known that 99% of 360 games run natively at 720p anyway, so this doesn't really change things. On the plus side, if the game is running at 640p and being scaled up, the higher the resolution the better, as you're less likely to get artefacts scaling to 1080p than 720p.

This really makes me wish that the 360 had a couple more display options though, and I have no idea if they would be possible or not. Ideally, the system would output 720p if a game runs natively at that resolution, and 1080p for everything else. (I find that many televisions do a better job scaling 720p to 1080p than the 360 itself)

And it would be nice to have an option for CRT owners using the VGA lead to disable scaling entirely. Currently, I'm using a CRT monitor hooked up to the 360, and I would much rather see Halo 3 running at "true" 640p rather than having to see a scaled up version. (while the monitor does up to 1920x1440 from a PC, it doesn't like the 1920x1080 signal from the 360 for some reason so I'm running at 1366x768 with Halo currently and the scaling artefacts are very obvious)
 
andrewfee said:
This really makes me wish that the 360 had a couple more display options though, and I have no idea if they would be possible or not. Ideally, the system would output 720p if a game runs natively at that resolution, and 1080p for everything else. (I find that many televisions do a better job scaling 720p to 1080p than the 360 itself)

OK, but with a 1080p TV won't you get a better picture running at 1080p 100% of the time, even when a game's native resolution is 720p?

I could have sworn I read recently an article explaining precisely why. I'll try and dig it up later tonight.
 

65536

Banned
gregor7777 said:
OK, but with a 1080p TV won't you get a better picture running at 1080p 100% of the time, even when a game's native resolution is 720p?

I could have sworn I read recently an article explaining precisely why. I'll try and dig it up later tonight.
The 1080p TVs I've owned/used did a better job scaling 720p to 1080p than having the 360 handle it, so for games that do run in 720p, I would prefer to have the 360 output that.

It would also apply to CRTs which look significantly better when you send them an unscaled image vs a scaled up one. If you do need to have scaling going on (eg Halo 3, PGR) then the higher the resolution the better.

Would be interested in reading the article if you can find it anyway though.

One item making the interwebs rounds this week was the scandalous revelation that Halo 3 runs at “640p” which isn’t even technically a resolution. However, the interweb detectives did notice that Halo 3’s vertical resolution, when captured from a frame buffer, is indeed 640 pixels. So what gives? Did we short change you 80 pixels?
Just a slight correction for this though - they didn't short change us 80 pixels, they short-changed us 184,320 pixels. :p

Great of them to come out and say that rather than pretend it didn't happen though.
 

Kaako

Felium Defensor
Metalmurphy said:
So basically it's exactly what every one was saying already.

Pretty much, but now we hear it from the "horse's mouth," so to speak. I also wanted to know if there was any other reason besides framerate stability, and it seems that HDR was one of them. I'm glad that Bungie decided to go for stability in this case.

This also shuts up any of the people still in denial about the native resolution.
 

Raistlin

Post Count: 9999
Kaako said:
Any thoughts?

My thoughts are that they are full of shit if they think that when scaled, it's indistinguishable from 720p. The high level of aliasing is evidence of that ... unless, of course, it's being caused by something else.
 
Onix said:
My thoughts are that they are full of shit if they think that when scaled, it's indistinguishable from 720p. The high level of aliasing is evidence of that ... unless, of course, it's being caused by something else.

-2xAA
 
Kaako said:
Any thoughts?
XBots in denial, pwned by Bungie. Drama queens auto-pwned by definition. Devkit screenshot argument dead in water. Numeracy vindicated. Monday morning quarterbacks going down in flames, as per usual.

...although really, who gives a flying fuck?
 
Onix said:
My thoughts are that they are full of shit if they think that when scaled, it's indistinguishable from 720p. The high level of aliasing is evidence of that ... unless, of course, it's being caused by something else.

They didn't say that. Read it again. They said:

if you do a comparison shot between the native 1152x640 image and the scaled 1280x720, it’s practically impossible to discern the difference.

Thus they're saying no matter how you look at it, it looks like it's 1152x640. They didn't make any claim about scaled to 720 vs native 720. So it's even more lame.
 

vdo

Member
Bungie said:
This ability to display a full range of HDR, combined with our advanced lighting, material and postprocessing engine, gives our scenes, large and small, a compelling, convincing and ultimately “real” feeling

This explains why, a while back, when Stinkles posted a photomode shot of MC holding a gun, the lighting and texture on the gun looked so real to me. I remember thinking that it was a smooth surface (instead of the strong normal-mapping/bumpy texture look), and yet something about it looked better than more detailed-looking textures usually do. It just looked like real gun metal. I knew there was something more to the way that was being rendered than what I had typically been seeing in games - it's nice to have a deeper explanation.
 

Arkham

The Amiga Brotherhood
gregor7777 said:
Remind me again, which console ships with only a composite cable?

Remind me again which console is still compatible with its predecessor's cables?

Someone who needed a component cable probably had one for their PS2. I did. I very much appreciated the fact that I could reuse the cable without being forced to re-purchase effectively the same thing.

HDMI cables are 10 bucks. The quality certainly varies less (if at all) than with component cables.
 

camineet

Banned
Shogmaster said:
The amounts of ROPs, the bandwidth available, and the pixel shaders abilities obviously don't allow for easy 1080p games with all the popular trimmings this gen. Even your game is struggling to keep 30fps @ 720p since you are going for prettiest visuals possible. I'd argue that even 720p is a struggle this gen with all the shader goodies.

I tend to agree with you completely on that. Even if that's not the whole story, that's how I see things. It's really simple to understand that the number of ROPs times the GPU clock speed does not produce enough raw pixel fillrate to meet the requirements of this generation. I don't care what anyone else says. Yeah, there are other things to consider, but it still boils down to raw power. I'm certain that if the Xbox 360 and PS3 both had 256-bit buses, at least 50 GB/sec of bandwidth to GDDR3, and 16 ROPs instead of 8, framerates would be fine (60fps more often than not) at 720p, and perhaps even more graphics effects could be pulled off, or more anti-aliasing. 1080p would actually be feasible as long as detail / effects were kept within reason. I am 100% convinced that Xenos and RSX were kept to the absolute minimum performance, the most barely acceptable performance to allow for HD resolutions while letting everything else suffer. I'm convinced that both could've done with more fillrate and more bandwidth. I doubt I'd find a developer who would disagree, even though, as I said, there's more to it than just fillrate & bandwidth.
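The back-of-the-envelope arithmetic behind this debate is easy to run. A sketch using the publicly quoted Xenos figures (8 ROPs at 500 MHz) and an assumed overdraw factor (the 4x below is an illustrative guess, not a measured number):

```python
# Rough fillrate check for the ROPs-times-clock argument.
# Clock speed and ROP count are the publicly quoted Xenos figures;
# the overdraw factor is an assumption for illustration only.

def peak_fillrate(rops, clock_hz):
    """Theoretical peak pixel fillrate in pixels per second."""
    return rops * clock_hz

def required_fillrate(width, height, fps, overdraw=4):
    """Pixels per second needed if each pixel is written 'overdraw' times."""
    return width * height * fps * overdraw

xenos = peak_fillrate(8, 500e6)                 # 4.0 Gpixel/s peak
need_720p60 = required_fillrate(1280, 720, 60)  # ~0.22 Gpixel/s at 4x overdraw
need_1080p60 = required_fillrate(1920, 1080, 60)
```

On these rough numbers the theoretical peak comfortably exceeds the raw pixel requirement, which is exactly why the point is contested earlier in the thread: Nostromo argues the real limits are shader throughput and bandwidth, not ROP count.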
 

Arkham

The Amiga Brotherhood
gregor7777 said:
OK, but with a 1080p TV won't you get a better picture running at 1080p 100% of the time, even when a game's native resolution is 720p?

I think that's why the 360's very capable scaler is used. Smart move on MS's part. It's a very underrated element in the machine's success.
 

Brimstone

my reputation is Shadowruined
SolidSnakex said:
You're actually going to take shots at a game not running at 1080p after this news?



Killzone 2 has a graphic engine using a "deferred render" approach and the PS3 RSX has more pixel shading power than the XB360 GPU Xenos.


I wouldn't mind the option to choose either 720p or 1080p with Killzone 2. If they have to turn off AA for 1080p, that's fine; at least they'd be giving customers a choice on what they think is best.
 

maus

Member
Kaako said:
Bungie has spoken:

Any thoughts?
Very annoyed. Sacrificing 80 vertical pixels may have been a good move in order to incorporate more effects; however, the big issue is the lack of AA. Without AA, THE ENTIRE IMAGE IS DEGRADED... severely, in my opinion.

Dynamic shadows are still in their early stages imo and are very blocky and unconvincing. Just look at the opening cinematic; when the shot cuts to a close-up of Johnson, the self-shadowing just looks glitchy and ugly. People thinking it looks good are kidding themselves (I'm referring specifically to the dynamic shadows, not the HDR or lighting). If Bungie could have toned down the "Advanced lighting engine" and had enough power to do AA, color me annoyed.

People with the "SHUT UP IT'S HALO 3, IT'S AMAZING" responses are killing me. The fact is that it is Halo 3 and Bungie made it look like a pixelated mess. This isn't some 3rd party experiment, this is the biggest damn game on the system.
 

Raistlin

Post Count: 9999
Marty Chinn said:
They didn't say that. Read it again. They said:



Thus they're saying no matter how you look at it, it looks like it's 1152x640. They didn't make any claim about scaled to 720 vs native 720. So it's even more lame.

Holy crap ... I didn't notice they did it backwards :lol

wtf are they even trying to say?
 

maus

Member
Nostromo said:
They didn't sacrifice 80 vertical pixels, they sacrificed 27% of a 720p image.

Yeah, couldn't come up with the wording, thanks.

I'M JUST SO MAD MY GLASSES ARE STEAMING UP. Seriously though, it annoys me how smug that Bungie response is, especially that last sentence. They are raking in fucking millions and they can't devote more man hours into making the final installment to the Halo franchise look at least technically on par with other current releases.
 

Raistlin

Post Count: 9999
marvelharvey said:
I'm so happy I just spent $5000 on a new TV, when my previous HDTV would have sufficed.


You're like the 5th person to say this here ... wtf? Is Halo 3 the only game you're ever going to play? :lol
 

JeStaH

Member
maus said:
Yeah couldn't come up worth the wording, thanks.

I'M JUST SO MAD MY GLASSES ARE STEAMING UP. Seriously though, it annoys me how smug that Bungie response is, especially that last sentence. They are raking in fucking millions and they can't devote more man hours into making the final installment to the Halo franchise look at least technically on par with other current releases.

Wow, are you seriously that worked up about it? Deep breath... I thought gaming was supposed to be fun?

Honestly how many people would have known a thing if it was never mentioned?

They sacrificed 27% of a 720p image? Do you think it will cost them 27% of their sales, or that reviewers will dock 27% off their scores? Do you think they upped their advertising budget to compensate? Maybe bribed more reviewers with 27% more swag than normal?

Halo 3 is here, it will sell a ton. People will log loads of hours playing it. As a consolation to those who must always rain on other people's parades: you'll at least be able to say, "but it runs at 640p" .. :lol
 

Raistlin

Post Count: 9999
JeStaH said:
Wow, are you seriously that worked up about it? Deep breath... I thought gaming was supposed to be fun?

People can enjoy a game and still critique its graphics.

Honestly how many people would have known a thing if it was never mentioned?

A lot. People knew something was up, because it has some of the worst aliasing on the system ... that's why this thread was created.

They sacrificed 27% of a 720p image? Do you think it will cost them 27% of their sales, or that reviewers will dock 27% off their scores? Do you think they upped their advertising budget to compensate? Maybe bribed more reviewers with 27% more swag than normal?

Halo 3 is here, it will sell a ton. People will log loads of hours playing it. As a consolation to those who must always rain on other people's parades: you'll at least be able to say, "but it runs at 640p" .. :lol

So if a game sells really well and/or is good ... it should just get a 10 for everything, and no one should ever discuss any parts of the title that aren't perfect?

Seriously?
 
Onix said:
You're like the 5th person to say this here ... wtf? Is Halo 3 the only game your ever going to play? :lol
No, but as alluded to earlier in this thread, the astounding sales success of this 640p game could lead to more developers going the same route... and this is what scares me the most.
 

JeStaH

Member
Onix said:
People can enjoy a game and still critique its graphics.



A lot. People knew something was up, because it has some of the worst aliasing on the system ... that's why this thread was created.



So if a game sells really well and/or is good ... it should just get a 10 for everything, and no one should ever discuss any parts of the title that aren't perfect?

Seriously?


I'm just pointing out how ridiculous it is to put so much emphasis on a few pixels. This game is not the only example of how pixel count does not equal how good a game is. Let's be honest, most of the people here are just the usual suspects who love to pile on.

The game seems to be well received by the critics and the people who are actually playing it seem to be really enjoying it.

The one guy is so mad that his glasses are steaming? I just thought this thread needed a "let's look in the mirror for a moment and remember why we play games" ... because that shit is fun!
 

Draft

Member
JeStaH said:
I'm just pointing out how ridiculous it is to put so much emphasis on a few pixels. This game is not the only example of how pixel count does not equal how good a game is. Let's be honest, most of the people here are just the usual suspects who love to pile on.

The game seems to be well received by the critics and the people who are actually playing it seem to be really enjoying it.

The one guy is so mad that his glasses are steaming? I just thought this thread needed a "let's look in the mirror for a moment and remember why we play games" ... because that shit is fun!
It's not a "few" pixels. It's 27% of a 720p image, which is like... a LOT of pixels.

A LOT.
 

Raistlin

Post Count: 9999
marvelharvey said:
No, but as alluded to earlier in this thread, the astounding sales success of this 640p game could lead to more developers going the same route... and this is what scares me the most.

I suppose, but it probably won't be that much of an issue for most devs. Halo 3 was guaranteed to sell ... they could have used the original Wolfenstein 3D engine and it would have been a hit.


For most games, no such guarantees exist. To differentiate themselves, many devs attempt to do some decent graphics, amongst other things (and no, that isn't saying Halo 3 has crap graphics).
 

65536

Banned
gregor7777 said:
http://www.hometheaterhifi.com/volume_14_1/feature-article-1080p-3-2007-part-1.html

From my understanding of that article, if you're on a 1080p set, you want all of your content to display as 1080p (upscaled or native) for the best picture.

I may be wrong though.

Excellent read, either way.
Thanks for the link. I have seen that before actually. In most cases, they're right, you do just want to set the device to 1080p and leave it at that.

The 360 is a 720p native source though, so that means you're not going to be losing any information, or having the image scaled twice if you have it set to 720p (with games that render at 1280x720 at least) which means you can either set the 360 to 1080p and have it upscale the image, or leave it at 720p and have the TV upscale it to 1080p. From my experience, most decent TVs have a sharper image when you let them handle the upscaling rather than have the 360 do it. You'll also avoid any potential performance issues (tearing in Dead Rising, for example) if you leave it in 720p.

In the case of something like Halo 3 which is rendered at 1152x640 though, the 360 would be scaling to 1280x720 and the TV would then scale that to 1080p. Scaling twice is something you want to avoid at all costs. (which is why they recommend you just set everything to 1080p) Now, the 360 can't just output 1152x640 as that's not a proper video signal so it will look best when the 360 is set to 1080p as that means things should only be scaled once.

That's why it would be best to have an option on the 360 that would output 720p native game at 720p, and everything else at 1080p. (and personally I'd like to see an option to disable the scaler entirely for people using monitors over VGA, and possibly DVI/HDMI) It would mean you get a sharper image with 720p native games, and scaled ones look as good as they can.
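The double-scaling penalty described above can be illustrated with a toy 1-D example. Nearest-neighbour resampling is an oversimplification (real scalers filter), but it shows how resampling twice diverges from resampling once:

```python
# Toy illustration of why scaling twice (640 -> 720 -> 1080) differs from
# scaling once (640 -> 1080), using nearest-neighbour resampling of a ramp.
# Nearest-neighbour is an assumption for simplicity; real hardware scalers
# filter, but the double-resampling error compounds the same way.

def resample(src, out_len):
    """Nearest-neighbour resample a 1-D signal to out_len samples."""
    n = len(src)
    return [src[i * n // out_len] for i in range(out_len)]

ramp = list(range(640))                       # a smooth 640-sample gradient
once = resample(ramp, 1080)                   # single pass (console outputs 1080p)
twice = resample(resample(ramp, 720), 1080)   # console to 720p, then TV to 1080p
mismatches = sum(a != b for a, b in zip(once, twice))  # > 0: the passes disagree
```

The nonzero mismatch count is the point: once the image has been resampled to 720p, the second resample works from already-approximated data, so the chain ends up further from the ideal single-step result.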


Nostromo said:
They didn't sacrifice 80 vertical pixels, they sacrificed 27% of a 720p image.
20%, not 27%.

1280x720 = 921,600px, 1152x640 = 737,280px
921,600 - 737,280 = 184,320px
921,600 x 0.2 = 184,320px.
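The correction checks out in a couple of lines:

```python
# Verifying the pixel-count arithmetic: Halo 3's 1152x640 render target
# has exactly 20% fewer pixels than native 720p, not 27%.
full = 1280 * 720         # 921,600 pixels at 720p
halo = 1152 * 640         # 737,280 pixels as rendered
missing = full - halo     # 184,320 pixels short-changed
fraction = missing / full # 0.2, i.e. exactly 20%
```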

maus said:
Very annoyed. Sacrificing 80 vertical pixels may have been a good move in order to incorporate more effects; however, the big issue is the lack of AA. Without AA, THE ENTIRE IMAGE IS DEGRADED... severely, in my opinion.

Dynamic shadows are still in their early stages imo and are very blocky and unconvincing. Just look at the opening cinematic; when the shot cuts to a close-up of Johnson, the self-shadowing just looks glitchy and ugly. People thinking it looks good are kidding themselves (I'm referring specifically to the dynamic shadows, not the HDR or lighting). If Bungie could have toned down the "Advanced lighting engine" and had enough power to do AA, color me annoyed.

People with the "SHUT UP IT'S HALO 3, IT'S AMAZING" responses are killing me. The fact is that it is Halo 3 and Bungie made it look like a pixelated mess. This isn't some 3rd party experiment, this is the biggest damn game on the system.
By lighting, they mean the way that objects are lit in the environment, not how the shadows are being drawn. The lighting in Halo 3 is amazing, but yes, the shimmering pixels you get due to the scaling is distracting. You really need to have the signal 1:1 mapped to your display with Halo 3, and preferably you'll be running in 1080p.

If you're currently using a HDTV with a 1366x768 native resolution and using 720p (or worse, 1080i) over component, switching to a VGA lead and outputting 1360x768 should look a lot cleaner.

marvelharvey said:
I'm so happy I just spent $5000 on a new TV, when my previous HDTV would have sufficed.
Uh, you didn't think Halo 3 was going to be 1080p native, did you?
 

Raistlin

Post Count: 9999
Draft said:
It's not a "few" pixels. It's 27% of a 720p image, which is like... a LOT of pixels.

A LOT.

It isn't just that it's a lot. It's that it's an 'odd' resolution from a mathematical perspective, causing some noticeable artifacting when scaled. When no AA is thrown into the mix, it produces a pretty aliased result.
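A quick way to see the 'oddness': the horizontal and vertical scale factors to 720p are two different non-integer fractions, so almost no output pixel lands exactly on a source pixel. The sketch below is just that ratio arithmetic:

```python
from fractions import Fraction

# Why 1152x640 -> 1280x720 is an awkward scale: the horizontal and vertical
# ratios are small, different, non-integer fractions, so nearly every output
# pixel must be interpolated from between source pixels.
h_scale = Fraction(1280, 1152)   # 10/9 horizontally
v_scale = Fraction(720, 640)     # 9/8 vertically

# Output columns that map exactly onto a source column (x / h_scale is an
# integer only when x is a multiple of 10): just 128 of the 1280 columns.
aligned = [x for x in range(1280) if (x / h_scale).denominator == 1]
```

With only one column in ten (and one row in nine) aligning exactly, every other pixel is a blend of neighbours, which is what makes the scaled, unantialiased edges shimmer.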







JeStaH said:
I'm just pointing out how ridiculous it is to put so much emphasis on a few pixels. This game is not the only example of how pixel count does not equal how good a game is. Let's be honest, most of the people here are just the usual suspects who love to pile on.

I disagree. It's not just that it's a few pixels (though many will argue it's a significant amount) ... it's the artifacts from using this specific resolution, and without AA, that are the problem.

Like it or not, the consensus is that this title has amongst the worst aliasing on the system (if not the worst) ... certainly the worst of a major title.

The game seems to be well received by the critics and the people who are actually playing it seem to be really enjoying it.

That has never been the argument. We're talking the graphics specifically.

The one guy is so mad that his glasses are steaming? I just though this thread needed a let's look in the mirror for a moment and remember why we play games.. Because that shit is fun!

Hey ... I'm not agreeing with that guy :lol
 
andrewfee said:
The 1080p TVs I've owned/used did a better job scaling 720p to 1080p than having the 360 handle it, so for games that do run in 720p, I would prefer to have the 360 output that.

Have you done a lag test using Guitar Hero 2? The 360 scaler creates at most half a frame of lag, where TV scalers typically add a couple of frames of lag or more. In fact, if there's a set that scales to 1080p or 1366x768 without adding a couple of frames of lag or more, I (and a ton of people at AVS forum) would love to know what it is.

One great thing about the 360 is that it can scale to native resolutions like 1360x768 via VGA and HDMI, where the PS3 can't support such resolutions since for some bizarre reason it can't support PC/DVI output, it can only do HDMI.

The 360 scaler is pretty incredible, can you show any photos of your set demonstrating how it scales to 1080p worse than your set does?
 

maus

Member
JeStaH said:
Wow, are you seriously that worked up about it? Deep breath... I thought gaming was supposed to be fun?

Honestly how many people would have known a thing if it was never mentioned?

They sacrificed 27% of a 720p image? Do you think it will cost them 27% of their sales, or that reviewers will dock 27% off their scores? Do you think they upped their advertising budget to compensate? Maybe bribed more reviewers with 27% more swag than normal?

Halo 3 is here, it will sell a ton. People will log loads of hours playing it. As a consolation to those who must always rain on other people's parades: you'll at least be able to say, "but it runs at 640p" .. :lol
Way to practically prove my point: "thank god Bungie doesn't lose anything! Consumers are stupid anyway, they won't notice!" I've talked to plenty of players, mainstream Halo dopes included, who notice the game looks like crap compared to their Gears of War.

A lot of things about the release of Halo 3 smell rotten. The lack of features that were included in the past (vertical split-screen), the crap quality of the "high-def revisions" of the previous Halo cinematics in the incredibly overpriced special edition, and now all this. I like Bungie and their community interaction, but when they take jabs at some of the more hardcore gamers it pisses me off.

Blizzard Entertainment manages to have great community relations without all the bigheadedness (though they aren't perfect either; no one is).

andrewfee said:
If you're currently using a HDTV with a 1366x768 native resolution and using 720p (or worse, 1080i) over component, switching to a VGA lead and outputting 1360x768 should look a lot cleaner.
This is exactly how I'm running Halo 3. Can't imagine what it looks like over component.
 

Raistlin

Post Count: 9999
gregor7777 said:
http://www.hometheaterhifi.com/volume_14_1/feature-article-1080p-3-2007-part-1.html

From my understanding of that article, if you're on a 1080p set, you want all of your content to display as 1080p (upscaled or native) for the best picture.

I may be wrong though.

Excellent read, either way.

Unless you have a 9"+ CRT front projector, your 1080p TV is going to be fixed-pixel. That means, no matter what you throw at it (if it's compatible), it ends up being displayed at 1080p.


For content that isn't 1080p, the question then becomes: which device do you want deinterlacing/scaling the image to get it to 1080p? The TV, or the source device (assuming it can deinterlace/scale)?

The answer, unfortunately, isn't simple. It comes down to the particular TV's abilities, the device's abilities ... and the content itself.
 

Valcrist

Member
JeStaH said:
I'm just pointing out how ridiculous it is to put so much emphasis on a few pixels. This game is not the only example of how pixel count does not equal how good a game is. Let's be honest, most of the people here are just the usual suspects who love to pile on.

The game seems to be well received by the critics and the people who are actually playing it seem to be really enjoying it.

The one guy is so mad that his glasses are steaming? I just thought this thread needed a "let's look in the mirror for a moment and remember why we play games" ... because that shit is fun!

It's understandable to be irked when people criticize faults, but this is a gaming discussion forum. No one is saying the game is defined by its graphics, but it's just as silly to use the argument of "Hey, it's fun guys! That absolves the game of its faults."
Every game has faults, which may or may not be representative of its whole package.
The point of this thread was to look at Halo from a purely graphical standpoint. I'm sure a lot of people just want to know: is this Bungie's maxed-out capability on the 360 with their engine, or was the resolution a sacrifice for something else?
 

JeStaH

Member
Draft said:
It's not a "few" pixels. It's 27% of a 720p image, which is like... a LOT of pixels.

A LOT.

Right, and did that take away from your enjoyment of the game before you found out? Does it make the game a LOT less fun after you found out?

I always think back to videophiles who love to point out edge enhancement in everything. When I found out what it was, I began looking for it in DVDs etc. When you focus on just that one negative thing, you miss out on the big picture.
 
andrewfee said:
Uh, you didn't think Halo 3 was going to be 1080p native, did you?
Actually, I did think this was going to be 1080p. Up until a few days ago, I hadn't read a single word about Halo 3, I avoid almost every thread about a game I'm yet to purchase, through fear of something being spoiled.
 

Raistlin

Post Count: 9999
JeStaH said:
Right, and did that take away from your enjoyment of the game before you found out? Does it make the game a LOT less fun after you found out?

I always think back to videophiles who love to point out edge enhancement in everything. When I found out what it was, I began looking for it in DVDs etc. When you focus on just that one negative thing, you miss out on the big picture.

How do you define enjoyment? Generally, graphics have at least some impact on that. Obviously how much varies from person to person.


If this game has effectively the worst aliasing of any 360 title, then it wouldn't exactly surprise me that some people are a little annoyed. They can still enjoy the title, but that doesn't mean they don't notice it.



marvelharvey said:
Actually, I did think this was going to be 1080p. Up until a few days ago, I hadn't read a single word about Halo 3, I avoid almost every thread about a game I'm yet to purchase, through fear of something being spoiled.

If you admittedly knew nothing about the title, exactly what would make you think the game would be native 1080p?

The staggering catalog of native 1080p 360 titles?
 