SuperSonic1305
Member
Wow just wow. This is a new low for gaming journalism.
Man, I thought we were past this crap.
Incidentally this applies even more to FPS. Just as anti-aliasing helps recreate that "infinite resolution" (really more about the source material: a DVD won't look as sharp as a proper Blu-ray, though it will usually upscale nicely), motion blur recreates the way film blurs each frame into the next, which helps film look smoother than it actually is. Games rendering at the same FPS without motion blur will look noticeably less smooth.
They are just regurgitating this chart. If you notice, it's from the AVS forum and is talking about movies, i.e. things that are "rendered" at infinite resolution because they're real life. You are not going to see a huge difference between 720p and 1080p in anything recorded with perfect image quality.
BUT you can't use the same chart for video games. Video games are rendered in real time at their native resolution, and that resolution makes a huge difference in image quality. The fact that these guys work for IGN and can't see the difference is stunning.
Wow just wow. This is a new low for gaming journalism.
The sad part is, it's not journalism at all.
You have a bunch of these ready to go, don't you?
Guys. It's a podcast about the Xbox. It will be OK, I promise.
You didn't read what I said. The difference between a Blu-ray and a broadcast is a lot more than just the resolution, so you cannot compare them. You can't compare them by playing a Blu-ray on a 720p TV either.
I think we should move past this painting of all games press because of a certain publication or a certain author making a comment you think is dumb.
Even with movies I'd disagree a fair bit. Spout all the math and science you want: 480p/SDTV content such as DVDs and standard cable broadcasts only looks good from about 30 feet back from my 55 inch TV. Even from that distance, going from SDTV to HDTV still looks demonstrably better. The colors pop more and everything just has more definition to it.
Games are a bit of a different story due to the framerates typically being higher. Even on my computer monitor, running something at 720p versus 1080p, the difference is immediate even if I stand 20 feet away.
Another way to look at it is this- watch content exclusively in 1080p for a month and then switch down to 720p. It WILL look fuzzier no matter what the content is.
Many would take the above to mean that I fully support the switch up to 4K content, but in fact I am emphatically against it. Nothing runs in 4K right now outside of photographs (and even then a 4K image is only about 8.2 megapixels, which your cellphone can generate) and theatrical prints that display on a 300 foot screen (or bigger). 99% of your content will simply be pixel doubled both length and width wise, basically wasting 75% of the on-screen pixels for nothing.
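The pixel arithmetic in that paragraph is easy to sanity-check. A quick Python sketch, assuming 4K means consumer UHD (3840x2160, not the 4096-wide DCI cinema format):

```python
# Check the 4K megapixel and pixel-doubling claims, assuming UHD (3840x2160).
uhd_px = 3840 * 2160
print(f"4K UHD: {uhd_px / 1e6:.1f} megapixels")

# Pixel-doubling 1080p in both dimensions fills a UHD screen exactly
# (1920*2 x 1080*2), so 3 of every 4 output pixels carry no new information.
hd_px = 1920 * 1080
print(f"Pixels wasted when doubling 1080p up to UHD: {1 - hd_px / uhd_px:.0%}")
```

UHD comes out at about 8.3 megapixels (the post's 8.2 figure is in the same ballpark), and the 75% waste figure for straight pixel doubling is exact.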
In short - more resolution is better except when it isn't.
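For what it's worth, the viewing-distance figures being argued over in this thread can be ballparked with the standard one-arcminute visual acuity rule of thumb (roughly 20/20 vision). A rough Python sketch; the function name and the 16:9 assumption are mine, and this is the rule-of-thumb math, not necessarily how the AVS chart was built:

```python
import math

# ~1 arcminute is the resolving power usually assumed for 20/20 vision.
ARCMIN = math.radians(1 / 60)

def max_benefit_distance_inches(diagonal_in, horiz_px):
    """Farthest distance (inches) at which one pixel still subtends
    at least 1 arcminute, i.e. extra resolution is still visible."""
    # Assume a 16:9 panel to derive the physical pixel size.
    width_in = diagonal_in * 16 / math.hypot(16, 9)
    pixel_in = width_in / horiz_px
    return pixel_in / math.tan(ARCMIN)

for name, px in [("720p", 1280), ("1080p", 1920), ("4K UHD", 3840)]:
    d = max_benefit_distance_inches(55, px)
    print(f'55" {name}: resolution fully visible within ~{d / 12:.1f} ft')
```

On a 55" panel this puts the full benefit of 1080p at roughly 7 feet and 720p at roughly 10-11 feet, which is why the charts say the difference collapses at couch distances for film, while someone sitting 2-3 feet from a monitor sees it instantly.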
you're gonna see that difference whether you're on a 15 inch monitor or 50.
What on earth is he smoking?
I assume he is talking about the information in this thread that can validate this claim, if you include the vitally important variable of viewing distance, which was completely ignored in their statements.