
Resolutiongate: the aftermath. Musings from a Call of Duty Dude Bro.........

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
Yes, easily.

Interesting. The science says that at an 8 ft distance you're right on the edge of not being able to resolve all the detail at 720p, and well beyond the distance at which you could resolve all the information at 1080p.

I wonder if the sharpening filter on the Xbox One is adding so much aliasing that it's immediately noticeable.
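For what it's worth, here's the back-of-the-envelope maths behind that claim (a rough sketch only, assuming a 42" 16:9 panel at 8 ft and the usual ~1 arcminute benchmark for 20/20 vision; real acuity varies from person to person):

import math

# Angular size of one pixel at a given viewing distance, compared against the
# ~1 arcminute detail limit usually quoted for 20/20 vision.
def pixel_arcminutes(diagonal_in, horizontal_px, distance_in):
    width_in = diagonal_in * 16 / math.hypot(16, 9)  # panel width from the diagonal
    pixel_in = width_in / horizontal_px              # width of a single pixel
    return math.degrees(math.atan2(pixel_in, distance_in)) * 60

distance_in = 8 * 12  # 8 ft in inches
for label, px in [("720p", 1280), ("1080p", 1920)]:
    print(label, round(pixel_arcminutes(42, px, distance_in), 2), "arcmin per pixel")

# Roughly 1.0 arcmin per pixel at 720p (right at the limit) and 0.68 at 1080p
# (below it), which is where the "8 ft on a 42-inch set" figure comes from.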
 

HTupolev

Member
Are you saying that at an 8 ft distance you can tell the difference between the resolutions on Xbox One and PS4 on a 42-inch set?
Absolutely.

At that distance, the difference in raw sharpness will begin to become smaller for most people, though it'll probably still be fairly visible. And differences in shimmering caused by undersampled details (lighting highlights, thin objects like power lines, etc) might still be completely visible at HUGE distances.

For instance, Halo 3 runs at 640p and has some very sharp, shimmery specular highlights. Even when I'm 50 feet from my 37" TV, a distance at which the game is nigh-unplayably hard to see, the aliasing caused by those highlights is very visible and obvious.
 

RaikuHebi

Banned
[image: JpGYX.gif]

Finally. It's been a good damn while since I've seen a new one.
 
Interesting aside to bolster what Bruiserbear is saying...


We've known each other a long time now. We were having a conversation when this whole resolution-gate thing started out and I made some hyperbolic comment (me?? WHAT?) that anyone in the media that can't tell the difference between 720p and 1080p shouldn't be in the media. Bruiser shot back saying that he wasn't sure that he would be able to tell the difference and I should go easy on them.


So he went into this thing thinking there wasn't gonna be a difference... if you're wondering if he has an agenda.
 
For a game made mainly for current gen, Ghosts is almost shocking in 1080p.
There are a few parts of the SP where you'd think you were playing something like Killzone.
MP-wise it's not a crazy graphics improvement, but just how clear it is makes a big difference in picking enemies out.
 

madmackem

Member
Anyone who says the Xbox One COD looks good needs an eye check-up. The over-sharpening the Xbox is doing to the image is unreal; I really would rather be playing the 360 version.
 

badb0y

Member
Of course 1080p looks better than 720p and it's noticeable. Heck going from 1080p to 1600p was noticeable when I made the switch last year.

The one thing that I learned from the resolutiongate fiasco is that mainstream gaming journalism outlets employ people with zero technical knowledge.
 
Anyone who says the Xbox One COD looks good needs an eye check-up. The over-sharpening the Xbox is doing to the image is unreal; I really would rather be playing the 360 version.

You'll find lots of these people at Xbox fan sites. Almost everyone is saying the differences don't matter or they don't care. Quite a different tune from what they were saying in January/February 2013. It's like they're rooting for their favorite sports team.
 

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
Anyone who says the Xbox One COD looks good needs an eye check-up. The over-sharpening the Xbox is doing to the image is unreal; I really would rather be playing the 360 version.

Yeah. I'm starting to think the biggest difference on the Xbox One version is due to the over-sharpening, which really makes it stand out and look ugly. I wonder how close they would look if the Xbox One version was simply upscaled from 720p without that horrible sharpening.
 

MercuryLS

Banned
Guess they haven't done the PS4 patch to lower the 60+ framerate to a steady level then?

Shame.

The patches they've released seem to have helped, but I've still seen a few bouts of judder in matches. It's far less prevalent now than it was before, though. It happens in a few matches, but I've had many matches where there were zero framerate issues. It seems like an easy fix; I don't get why they haven't done it yet.
 

badb0y

Member
The patches they've released seem to have helped, but I've still seen a few bouts of judder in matches. It's far less prevalent now than it was before, though. It happens in a few matches, but I've had many matches where there were zero framerate issues. It seems like an easy fix; I don't get why they haven't done it yet.

Because Infinity Ward.
 

Cynn

Member
Like I said, mate, just going by another poster.

You could be right as well. I will definitely know for sure by the end of next week, as my new TV will be 120Hz.

120-240Hz HDTVs just do frame doubling, and it introduces a LOT of input lag. You will end up turning it off very quickly, just like I did.
 

werks

Banned
Interesting. The science says that at an 8 ft distance you're right on the edge of not being able to resolve all the detail at 720p, and well beyond the distance at which you could resolve all the information at 1080p.

I wonder if the sharpening filter on the Xbox One is adding so much aliasing that it's immediately noticeable.

Pretty sure that science is for movies and not games. The difference between a 720p movie and a 1080p movie is nowhere near the same as in a game.
 

Massa

Member
Interesting. The science says that at an 8 ft distance you're right on the edge of not being able to resolve all the detail at 720p, and well beyond the distance at which you could resolve all the information at 1080p.

You can't be serious. People already explained to you why this is wrong multiple times before.
 
Doubtful that someone who bought both an X1 and a PS4 in launch week and spent $120 buying the same game on both is your typical Call of Duty bro. You're on the extreme end. It'd be interesting to see what more regular gamers think, assuming they even notice a difference.
 

Zinthar

Member
Doubtful that someone who bought both an X1 and a PS4 in launch week and spent $120 buying the same game on both is your typical Call of Duty bro. You're on the extreme end. It'd be interesting to see what more regular gamers think, assuming they even notice a difference.

Call of Duty is pretty popular -- enough so that the audience isn't completely limited to guys drinking Natty Light and listening to Jack Johnson while humping your dead body in the game.
 

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
You can't be serious. People already explained to you why this is wrong multiple times before.

Instead of doing a drive-by post, explain to me why the current science of visual acuity applies to TV and movies but magically doesn't apply to games.
 

Moofers

Member
Great post, OP. I've only played Ghosts on PS3 (where I found the graphics rendered it almost completely unplayable) and PS4. I haven't yet played the Xbox version, but I'm loving the PS4 version so far.
Unplayable? Your standards must have you playing very few games.
 

Kinyou

Member
I would argue that figuring out the Cell was much more of a handicap than ESRAM is... and CoD on the PS3 never doubled its resolution... to the contrary, the more powerful PS3 continued to run at a lower res... while you're expecting the LESS powerful Xbone to double its res to catch the more powerful PS4 while maintaining all of the same assets?... hmmm...
I'm not sure if Ghosts should really be used to draw any conclusions. Even the PC version runs like shit.

I remember most people here concluding that IW is incompetent after the PS4 frame rate issues showed up
 
Two things:

1) Playing CoD doesn't make one a dudebro. Pretty sure posting long insights of a technical nature on NeoGAF is very anti-dudebro behaviour.

2) The question isn't so much about can you tell the difference in a Pepsi challenge; it's about whether you notice and care in isolation. Does it matter? I play PC games at 900p and this has never bothered me at all.

If you're getting both, get multiplats on PS4. Otherwise, get the platform you like the first-party IPs for. It's really that simple. I don't know why the community is still harping on about this.
 

Y2Kev

TLG Fan Caretaker Est. 2009
Interesting. The science says that at an 8 ft distance you're right on the edge of not being able to resolve all the detail at 720p, and well beyond the distance at which you could resolve all the information at 1080p.

I wonder if the sharpening filter on the Xbox One is adding so much aliasing that it's immediately noticeable.

Now I know for a fact that people have explained to you why the "science" you're talking about is different for gaming in the past. Why are you not taking it into account?
 
Interesting aside to bolster what Bruiserbear is saying...


We've known each other a long time now. We were having a conversation when this whole resolution-gate thing started out and I made some hyperbolic comment (me?? WHAT?) that anyone in the media that can't tell the difference between 720p and 1080p shouldn't be in the media. Bruiser shot back saying that he wasn't sure that he would be able to tell the difference and I should go easy on them.


So he went into this thing thinking there wasn't gonna be a difference... if you're wondering if he has an agenda.

Thanks Pete. I had actually forgotten about our little exchange, but yeah, I was honestly not sure how big the difference would be in my own home on my own TV. It was pretty big.

Guess they haven't done the PS4 patch to lower the 60+ framerate to a steady level then?

Shame.

I'm completely speculating right now, but based on my observations, I think they may have patched the framerate limit to 60 in the past week, which is one of the reasons I've noticed an improvement in recent days. However, there are also actual frame drops, and I believe that's what I'm still seeing occasionally. I say this because I don't seem to notice framerate fluctuations randomly anymore. They only seem to occur when shit gets crazy, and I would assume the changes in framerate I'm noticing then are drops, not increases, as they only happen when a lot is going on (i.e., a helicopter in the air firing on enemies, a grenade going off, and gunfire), and it's not that common to see all those things happening at the same time.

Doubtful that someone who bought both an X1 and a PS4 in launch week and spent $120 buying the same game on both is your typical Call of Duty bro. You're on the extreme end. It'd be interesting to see what more regular gamers think, assuming they even notice a difference.

I was only sharing MY experience, but I do think anyone who has a critical eye for games will notice this stuff. However, I would absolutely expect that your normal dude who just buys COD every year probably wouldn't immediately notice these differences.

That's why I haven't been disagreeing with the press when they say "most people won't notice", because they're probably right. But who do they think their main audience is? The people who watch Sessler each week on YouTube are probably pretty hardcore gamers; the same goes for Polygon readers. So when they tell their audience it doesn't matter, it comes off as rather odd.
 

SeanR1221

Member
Interesting aside to bolster what Bruiserbear is saying...


We've known each other a long time now. We were having a conversation when this whole resolution-gate thing started out and I made some hyperbolic comment (me?? WHAT?) that anyone in the media that can't tell the difference between 720p and 1080p shouldn't be in the media. Bruiser shot back saying that he wasn't sure that he would be able to tell the difference and I should go easy on them.


So he went into this thing thinking there wasn't gonna be a difference... if you're wondering if he has an agenda.

Interesting.

I know for me, I was playing Assassin's Creed pre-1080p patch. I quit, installed the patch, and said wow... and that was 900p to 1080p. I can only imagine how big of a difference 720p to 1080p is.

(6 feet away from a 55")
 

badb0y

Member
Two things:

1) Playing CoD doesn't make one a dudebro. Pretty sure posting long insights of a technical nature on NeoGAF is very anti-dudebro behaviour.

2) The question isn't so much about can you tell the difference in a Pepsi challenge; it's about whether you notice and care in isolation. Does it matter? I play PC games at 900p and this has never bothered me at all.

If you're getting both, get multiplats on PS4. Otherwise, get the platform you like the first-party IPs for. It's really that simple. I don't know why the community is still harping on about this.
On a 1080p monitor?
 

MercuryLS

Banned
Unplayable? Your standards must have you playing very few games.

I played a bit of the PS3 version too; the game is incredibly ugly even compared to past CODs. Blurry textures that take forever to load, low res, Vaseline-smeared IQ. Just terrible.
 
On a 1080p monitor?

Yup. Sometimes I play at 720p as well. I'd rather have all the fancy effects on.

Funny thing is that I have pretty bad astigmatism, so if there's bad jaggies on a console, I just don't wear my glasses. Built-in antialiasing. :D

Still notice them on Forza 5. Ironically, that game is at 1080p. -_-
 

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
Now I know for a fact that people have explained to you why the "science" you're talking about is different for gaming in the past. Why are you not taking it into account?

No, they have not explained why the science does not apply to video games. They've implied that the reason is that "you won't find jagged edges or scaling artifacts in a film", but the Snellen test does not use film or video to determine the limits of visual acuity. It uses high-contrast shapes and lines, which is much more relevant to video game presentation than movies or films.
 

Ishan

Junior Member
I think bringing competitive aiming aspects into it complicates it. I haven't played COD for more than 2 minutes, so I can't comment on that specifically, but for example, I used to play CS at 640x480 and didn't upgrade that even though my computer could easily run higher resolutions. I was used to it and preferred that mode.
 
Instead of doing a drive-by post, explain to me why the current science of visual acuity applies to TV and movies but magically doesn't apply to games.
You seem to think that "the current science of visual acuity" puts hard-and-fast rules on what you can see when. But all the charts and so on you've seen are statistical averages, not absolute thresholds. As just one example, they're usually going to be using 20/20 vision as the standard model. But people can and do sometimes have vision better than 20/20; this is especially common in younger folks, such as you'd find playing videogames.

And as for some gaming content having different detectability than photographic imagery, that's definitely true, due to rendering techniques. (HTupolev gives a specific example above.) I suggest leaving your sarcastic "magical" comments aside, and learning more about a topic before you start making absolutist statements.
 

alr1ght

bish gets all the credit :)
Instead of doing a drive-by post, explain to me why the current science of visual acuity applies to TV and movies but magically doesn't apply to games.

Live action sources have no aliasing. They have essentially infinite polygons and inherent anti-aliasing.

Computer generated movies are massively supersampled (probably 8k or more) with loads of AA so you see no polygon edges.

Videogames are rendered at a relatively small resolution and don't benefit from supersampling. It's easier to pick out aliased edges, shimmering, and all the other things that plague image quality.

This image is 1080p, yet it looks amazing. Why? It's massively downsampled, so it doesn't suffer from image quality issues. Nothing rendered at 1080p native will ever even come close to this image quality (the image quality, that is, not the models/lighting).
[image: ibrz8lvDx5bIla.jpg]


I'm sure you will ignore it, though, as you have been trying to justify to yourself that resolution doesn't matter.
 

HTupolev

Member
Instead of doing a drive-by post, explain to me why the current science of visual acuity applies to TV and movies but magically doesn't apply to games.
Sampling.

Antialiasing/postprocessing/stuff aside, in a game, the colour of a pixel is based only on the colour of the point at the very center of that pixel.

When you're working with a camera, this isn't the case. Whether it's film or digital, each "grain" or "pixel" in the camera gets bombarded by a bajillion photons while a frame is being captured. The result is that the colour of a grain/pixel is actually an averaging of the colour over the on-screen area that ultimately gets taken up by the grain/pixel.
Obviously CGI doesn't have the benefit of tons of photons, but it can still approximate the result by supersampling. Rather than render the image at raw 1080p the way a game will, a CGI studio might render the image at a MUCH higher resolution; that way, the colour of a pixel in the final 1080p image can be created by averaging the colours of all the pixels in the high-resolution render that fall within that pixel.

By sampling sufficiently, CGI and live-action film avoid aliasing.
If there's a thin object in a game and the camera is moving, that thin object can actually shimmer in and out of existence from one frame to the next as pixel centers fall on and off the object. But when an image is perfectly supersampled, that object is always accounted for; even if the pixel center doesn't land on it, subpixels will, and so the object will always be accurately accounted for.
Similar logic applies to jaggies, and many other kinds of visual details.
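To picture the shimmer concretely, here's a toy sketch (purely illustrative, not how any real renderer is structured): a line thinner than a pixel can vanish entirely under point sampling depending on where the pixel centers land, while averaging a grid of subsamples always records its partial coverage.

# Toy comparison of point sampling vs supersampling for a thin "power line".
# Scene: a vertical line 0.3 units wide in a 10-unit-wide view, rendered into
# 10 pixels, so the line is narrower than a single pixel.
def scene(x, line_x=4.1, line_w=0.3):
    return 1.0 if line_x <= x < line_x + line_w else 0.0  # 1 = line, 0 = sky

def render(res=10, subsamples=1, view_width=10.0):
    image = []
    for px in range(res):
        # spread the sample positions evenly across this pixel's footprint
        hits = [scene((px + (s + 0.5) / subsamples) * view_width / res)
                for s in range(subsamples)]
        image.append(sum(hits) / subsamples)
    return image

print(render(subsamples=1))   # point sampled: every pixel reads 0 -- the line is gone
print(render(subsamples=16))  # supersampled: the covering pixel gets a fractional value

# Nudge line_x a little each frame and the point-sampled pixel pops between 0 and 1
# (that's the shimmer), while the supersampled result keeps a steady ~0.3 of coverage.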

//===============

Also, the acuity charts that get thrown around on GAF aren't that great. There are various types of acuity and various ways to measure it.
 

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
By sampling sufficiently, CGI and live-action film avoid aliasing.
If there's a thin object in a game and the camera is moving, that thin object can actually shimmer in and out of existence from one frame to the next as pixel centers fall on and off the object. But when an image is perfectly supersampled, that object is always accounted for; even if the pixel center doesn't land on it, subpixels will, and so the object will always be accurately accounted for.
Similar logic applies to jaggies, and many other kinds of visual details.

I think you may be on to something there. I'm going to give this a long hard think.
 

nib95

Banned
Great thread, OP. It's common sense and scientifically evident that there is a noticeable difference between 1080p and 720p. I agree that it can even affect gameplay, which is all the more reason that some of the media response has been not only perplexing but downright outrageous.

Unless you game on a smaller screen at an unusually long viewing distance, if you have normal eyesight, you should notice the differences. Enough to break the lower resolution version of the game? Absolutely not. But it's enough to make one version of the game tangibly better looking and more enjoyable to play. That is, after all, one of the main reasons we're buying these expensive new consoles: better graphics and tech.
 

nib95

Banned
Live action sources have no aliasing. They have essentially infinite polygons and inherent anti-aliasing.

Computer generated movies are massively supersampled (probably 8k or more) with loads of AA so you see no polygon edges.

Videogames are rendered at a relatively small resolution and don't benefit from supersampling. It's easier to pick out aliased edges, shimmering, and all the other things that plague image quality.

This image is 1080p, yet it looks amazing. Why? It's massively downsampled, so it doesn't suffer from image quality issues. Nothing rendered at 1080p native will ever even come close to this image quality (the image quality, that is, not the models/lighting).
[image: ibrz8lvDx5bIla.jpg]


I'm sure you will ignore it, though, as you have been trying to justify to yourself that resolution doesn't matter.

Great post dude.
 

SeanR1221

Member
Live action sources have no aliasing. They have essentially infinite polygons and inherent anti-aliasing.

Computer generated movies are massively supersampled (probably 8k or more) with loads of AA so you see no polygon edges.

Videogames are rendered at a relatively small resolution and don't benefit from supersampling. It's easier to pick out aliased edges, shimmering, and all the other things that plague image quality.

This image is 1080p, yet it looks amazing. Why? It's massively downsampled, so it doesn't suffer from image quality issues. Nothing rendered at 1080p native will ever even come close to this image quality (the image quality, that is, not the models/lighting).
[image: ibrz8lvDx5bIla.jpg]


I'm sure you will ignore it, though, as you have been trying to justify to yourself that resolution doesn't matter.

Thank you for this post
 
Live action sources have no aliasing. They have essentially infinite polygons and inherent anti-aliasing.

Computer generated movies are massively supersampled (probably 8k or more) with loads of AA so you see no polygon edges.

Videogames are rendered at a relatively small resolution and don't benefit from supersampling. It's easier to pick out aliased edges, shimmering, and all the other things that plague image quality.

This image is 1080p, yet it looks amazing. Why? It's massively downsampled, so it doesn't suffer from image quality issues. Nothing rendered at 1080p native will ever even come close to this image quality (the image quality, that is, not the models/lighting).
[image: http://i.minus.com/ibrz8lvDx5bIla.jpg]

I'm sure you will ignore it, though, as you have been trying to justify to yourself that resolution doesn't matter.

[quote="HTupolev, post: 91144955"]Sampling.

Antialiasing/postprocessing/stuff aside, in a game, the colour of a pixel is based only on the colour of the point at the very center of that pixel.

When you're working with a camera, this isn't the case. Whether it's film or digital, each "grain" or "pixel" in the camera gets bombarded by a bajillion photons while a frame is being captured. The result is that the colour of a grain/pixel is actually an averaging of the colour over the on-screen area that ultimately gets taken up by the grain/pixel.
Obviously CGI doesn't have the benefit of tons of photons, but it can still approximate the result by supersampling. Rather than render the image at raw 1080p the way a game will, a CGI studio might render the image at a MUCH higher resolution; that way, the colour of a pixel in the final 1080p image can be created by averaging the colours of all the pixels in the high-resolution render that fall within that pixel.

By sampling sufficiently, CGI and live-action film avoid aliasing.
If there's a thin object in a game and the camera is moving, that thin object can actually shimmer in and out of existence from one frame to the next as pixel centers fall on and off the object. But when an image is perfectly supersampled, that object is always accounted for; even if the pixel center doesn't land on it, subpixels will, and so the object will always be accurately accounted for.
Similar logic applies to jaggies, and many other kinds of visual details.

//===============

Also, the acuity charts that get thrown around on GAF aren't that great. There are various types of acuity and various ways to measure it.

And these posts explain why Okami HD is GORGEOUS.

Thanks guys.
 

RoboPlato

I'd be in the dick
The point you make about it having an effect on gameplay by making it easier to see distant enemies is one I've mentioned quite a few times. It has a tangible benefit to gameplay, especially as games get more detailed.

I also think that side by side a lot more people will be able to tell the difference than expected. A good number of people that say they can't tell the difference probably just think they can't.
 
No, they have not explained why the science does not apply to video games. They've implied that the reason is that "you won't find jagged edges or scaling artifacts in a film", but the Snellen test does not use film or video to determine the limits of visual acuity. It uses high-contrast shapes and lines, which is much more relevant to video game presentation than movies or films.

I can assure you the difference is noticeable, and I have a lot of respect for real science. You're either misinterpreting something or getting some really bad information somewhere.
 
I have to disagree. The Xbox One version just works beautifully, and frame rate is KING when it comes to COD games.

I've noticed far too many framerate drops on the PS4 version. They should lower the resolution to 900p and get the framerate right.

LOL...


Lowering the resolution would cause even MORE issues.

Why?

Because the jumpiness in CoD on PS4 comes from it running at 60-90fps... it's TOO fast.
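A rough way to picture why (a toy sketch only, nothing to do with how Infinity Ward's engine actually paces frames): on a 60Hz screen, frames that finish anywhere between ~11 and ~17ms apart get snapped to fixed scan-out times, so the amount of game time the viewer sees advance per refresh jumps around instead of being a steady 16.7ms, and that uneven motion reads as judder.

import random

# Toy model: frames rendered at an uncapped 60-90 fps and shown on a 60 Hz
# display (latest finished frame wins each refresh). Track how much game time
# advances per displayed refresh.
random.seed(1)
refresh = 1 / 60.0
game_time, next_vsync, last_shown = 0.0, refresh, 0.0
steps = []
while next_vsync < 0.25:                             # simulate a quarter second
    game_time += random.uniform(1 / 90.0, 1 / 60.0)  # one rendered frame
    if game_time >= next_vsync:                      # this frame gets scanned out
        steps.append(game_time - last_shown)         # motion step seen this refresh
        last_shown = game_time
        next_vsync += refresh

print([round(s * 1000, 1) for s in steps])
# The steps bounce between roughly one and two frames' worth (~12-30 ms) rather
# than a steady 16.7 ms; that unevenness is the judder a 60 fps cap would smooth out.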
 
Absolutely.

At that distance, the difference in raw sharpness will begin to become smaller for most people, though it'll probably still be fairly visible. And differences in shimmering caused by undersampled details (lighting highlights, thin objects like power lines, etc) might still be completely visible at HUGE distances.

For instance, Halo 3 runs at 640p and has some very sharp, shimmery specular highlights. Even when I'm 50 feet from my 37" TV, a distance at which the game is nigh-unplayably hard to see, the aliasing caused by those highlights is very visible and obvious.

I find when people start quoting "distance from screen" stuff, noticeable detail, etc., they are really only referencing still images. You are absolutely correct: the real difference isn't simply in screenshots, it's in motion, where the aliasing/shimmering is infinitely more pronounced.
 