First of all, this is going to be long. It's also probably going to say a lot of things you already know. GAF is generally full of tech-savvy folks, and this thread has more of that than usual. So I know you already know about resolution and pixels and frame rates. Please don't take it as patronizing when I spell it all out. I'm just trying to "show my work", as it were.
Second, I want to apologize again for anything I said in this thread that anyone believes was disingenuous. I admit now that I wasn't as informed as I thought I was. I made a lot of assumptions based on past experience and things I read without doing my own research and introspection.
Last thing before I begin: I'm going to go ahead and say I still don't think there is evidence of a conspiracy or coverup in the games press. Speculate all you wish, but I think the real issue here is just a lack of real reporting. I'm sorry to have contributed to that for the brief time I did. Whether or not you believe the games press is trustworthy and has your best interests in mind (and I believe that in general they do), the fact that you feel like you can't trust a member of the press or "games journalism" as a whole is important, and it's not fair for me or anyone else to downplay that. If you think something in the industry stinks, it's clear that the industry needs to address it. If, after receiving an explanation, you still think something is fishy, that is your prerogative. All I or any other member of the press can do is present our case. We can't make you believe.
So, all of that said... I have made some interesting observations about resolution and performance. I'm sure they're observations that have been made before, so I don't want this to look like I'm taking credit for any groundbreaking research or anything. Much of it was sparked by posts in this thread that brought up interesting points. But it was important for me to find these answers myself rather than relying on anyone or anything else. My methods have attempted to use math and science in the best way I know how, but I'm not an A/V expert, computer scientist, or developer. These methods aren't perfect, but I think they can demonstrate the difference in resolution and performance between the PlayStation 4 and the Xbox One that the average person can expect. Here it goes...
Let's start with some simple undeniable facts.
1. The Xbox One and PlayStation 4 are very similar in terms of the actual hardware they use in their machines. They are both essentially custom x86 PCs. There are differences, however, not all of which are known to the general public.
2. The Xbox One and PlayStation 4 can both output native 1080p (1920x1080 pixels with progressive scan), which is considered full HD.
3. The Xbox One version of Battlefield 4 runs at 720p upscaled to 1080p on a supported display. The PlayStation 4 version of the game runs at 900p. (This resolution isn't a commonly-referenced one, but it's 1600x900 pixels.)
4. The Xbox One version of Call of Duty: Ghosts runs at 720p upscaled to 1080p on a supported display. The PlayStation 4 version of the same game runs natively at 1080p.
Resolution measures the number of pixels in a video stream or display. It is written as horizontal by vertical: the number of columns of pixels times the number of rows. Figuring out the exact number of pixels in a video stream or display is as simple as multiplication. Therefore, the number of pixels in a 1080p video (1920x1080) is 2,073,600. The number of pixels in a 720p video (1280x720) is only 921,600.
And here's where the problem begins. If you weren't an expert, you might assume that the difference between 720p and 1080p is only 360 lines, or about 33%. A significant difference, sure, but how large, really? The total pixel count tells a different story: 720p has roughly 55% fewer pixels than 1080p. Put another way, 1080p has 2.25 times the pixels of 720p, more than double.
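For anyone who wants to check my work, here's the arithmetic as a quick Python sketch (nothing here beyond the numbers already discussed above):

```python
# Sanity-checking the pixel math above (plain Python, no libraries).

res_1080p = 1920 * 1080   # 2,073,600 pixels
res_900p = 1600 * 900     # 1,440,000 pixels
res_720p = 1280 * 720     #   921,600 pixels

# 720p has roughly 55% fewer pixels than 1080p...
deficit = (res_1080p - res_720p) / res_1080p
print(f"720p has {deficit:.1%} fewer pixels than 1080p")  # 55.6%

# ...which is another way of saying 1080p has 2.25x the pixels of 720p.
ratio = res_1080p / res_720p
print(f"1080p has {ratio:.2f}x the pixels of 720p")       # 2.25x
```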
"Double" is a number people tend to understand. Saying something is twice as fast, twice as big, or twice as good has weight. That's why people seem to care so much that the PlayStation 4 outputs more than twice the number of pixels the Xbox One does for a particular game. It's not insignificant. On paper, anyway.
As far as upscaling goes, it's little more than resizing a smaller image to fit a bigger display. That means that in order to show a 720p video on a 1080p display, the device has to blow up that image. It's not perfect, and upscaling can introduce visual artifacts or make existing visual flaws more apparent. As an example, if you took a 100x100 pixel image (like your GAF avatar) and used an image editor to blow it up to 200x200, you could still make out what the image is, but it would be more blurry.
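To make that concrete, here's a toy nearest-neighbor upscaler in Python. Real scalers (including, presumably, whatever the Xbox One uses) apply smarter filtering that trades blockiness for blur, but the underlying principle is the same: no new detail is created, existing pixels are just stretched.

```python
# A toy nearest-neighbor upscaler: the simplest form of what a scaler does.
# Each pixel value is just repeated to fill a bigger grid.

def upscale_nearest(image, factor):
    """Blow up a 2D grid of pixel values by an integer factor."""
    return [
        [image[y // factor][x // factor]
         for x in range(len(image[0]) * factor)]
        for y in range(len(image) * factor)
    ]

tiny = [[1, 2],
        [3, 4]]

for row in upscale_nearest(tiny, 2):
    print(row)
# Each source pixel becomes a 2x2 block; no new detail appears:
# [1, 1, 2, 2]
# [1, 1, 2, 2]
# [3, 3, 4, 4]
# [3, 3, 4, 4]
```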
So far, I haven't told you anything you don't already know or you couldn't work out yourself with simple math. But here's the question: what do we do with the knowledge that the PS4 version of Ghosts has double the pixels of the Xbox One version? We could just put the discussion to rest and say that the PS4 is twice as powerful as the Xbox One. But having seen how similar the specs of both systems are, that doesn't seem to add up. So where's the discrepancy?
(As a side note, I believe that very question "Why doesn't this add up?" is what has led several press folks to post the op-ed pieces they have. My concern after looking into this issue more is that they didn't do enough research and testing.)
The approach that I've taken with this issue even up until last night sounds like a reasonable one at first: don't make any snap judgments or false conclusions without having all the facts. And since we don't have either system in our hands, we can't put them through their full paces. While I still believe that it's not wise to shoot first and ask questions later, the biggest flaw in that approach is that we don't have to completely withhold analysis right now. Even though we don't have the full picture, we can gather the information we do have, run real-world tests, and figure out what we actually know.
So we're starting to get to the crux of the matter, but there's still one more factor to consider: many in the games press who are suggesting there's not much visual difference between multiplatform Xbox One and PS4 games are doing so based on comparison captures from Battlefield 4, NOT Ghosts. The gap between those two versions is much smaller: about 520,000 pixels (720p vs. 900p) instead of more than 1.1 million (720p vs. 1080p). So the only evidence that Kyle Orland and others have that 720p is "not that different" from 1080p doesn't even take 1080p footage or screenshots into account. That's problem one.
Here's the second problem, and after looking into it, I'm honestly baffled and embarrassed: for all the suggestions that most people probably won't notice the difference between 720p and 1080p, I can't find any evidence that anyone claiming the difference is negligible has actually tested it. It's all based on assumptions and bad math. Even I fell prey to this. What's more confusing is that this isn't a difficult test to perform. So I did.
Based on the latest information I was able to find, I believe that it's safe to assume that the average TV in America is about 46" and the average viewing distance is somewhere around 12-14 feet. (Of course, your situation is probably not identical. That's why these are averages.) There are a handful of viewing distance charts you can find online that purport to show the distance at which you'll be able to tell the difference between resolutions at a given screen size. Kyle Orland's article includes one, and according to that one, the average TV owner should only "need" 720p.
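As far as I can tell, charts like the one in Kyle's article are derived from a simple visual-acuity model: a person with 20/20 vision resolves roughly one arcminute of detail. Here's that math as a Python sketch, using the average screen size and distance above (the one-arcminute threshold is my assumption about how those charts are built, not something any of them spell out):

```python
# Sketch of the acuity math those viewing-distance charts appear to use:
# 20/20 vision resolves roughly 1 arcminute of detail (my assumption).
import math

DIAGONAL_IN = 46.0       # average US TV size cited above, in inches
DISTANCE_IN = 13 * 12    # ~13 feet, the middle of the 12-14 ft range

# Width of a 16:9 screen from its diagonal: 16 / sqrt(16^2 + 9^2) of it
width_in = DIAGONAL_IN * 16 / math.hypot(16, 9)

results = {}
for name, columns in [("1080p", 1920), ("720p", 1280)]:
    pixel_in = width_in / columns
    # Angle a single pixel subtends at the viewer's eye, in arcminutes
    results[name] = math.degrees(math.atan2(pixel_in, DISTANCE_IN)) * 60
    print(f"{name}: one pixel subtends {results[name]:.2f} arcminutes")
# Both come out under 1 arcminute, which is how a chart like that
# concludes the average setup "only needs" 720p.
```

Note that this model treats acuity as a hard cutoff, which is exactly the assumption I wanted to test against real viewing.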
Based on the numbers I pulled, that 720p figure seemed a little suspect to me, so I decided to test it as closely as possible with my own setup. I tried several games at different resolutions on my PC, sitting at an equivalent viewing distance scaled to my screen size. Again, not perfect, but the math is correct, so it should be pretty close. What I found was that at the US average distance and screen size, I could pretty easily tell the difference between 1080p and 720p. However, it's not quite as clear-cut as the numbers would suggest. Even though 1080p has more than twice the pixels, would I call the games in 1080p "twice as good"? No. It's completely subjective and impossible to put a number on, but jaggies were much more apparent at the lower resolution, and things just seemed a little muddy.
One important thing to consider here is that this obviously wasn't tested with Xbox One's upscaler because I don't have access to that console. No one does. We don't know how good or bad it is. But no matter its quality, I now believe based on my own tests that the difference between 720p and 1080p should be noticeable almost immediately to attuned eyes, and even to the untrained or non-gamer if they know what to look for.
So now that I've talked resolution absolutely to death, I'm going to wipe the slate clean and say this: the biggest issue here isn't really about resolution at all. The resolution is an indicator of larger problems. Some others in this thread have hinted at it. I'm going to attempt to demonstrate the difference in a more concrete way. This is where it gets dicey, but bear with me...
Let's go back to Ghosts for this part. Infinity Ward have made it very clear that the decisions they've made in resolution on each platform are so they could hit the target frame rate of 60 frames per second that the Call of Duty series is known for. As far as I am aware, we don't have known-good high-quality captures of the game on either next-gen system, so for the sake of this argument let's just assume that the game runs at the exact same video quality on each machine, with the only difference being resolution.
We know that developers have a wide range of resolution options available when outputting their games. This is made pretty clear by BF4 targeting 900p on PS4. I am not a developer, but my assumption here is that if Infinity Ward could get Ghosts running at a reliable 60 frames per second on Xbox One at 1080p or 900p or any other resolution, they would have done it. But what they chose was 720p. Based on what we covered about total pixel count above, this choice suggests (though it does not prove) that in practice the PlayStation 4 can push 2.25 times the pixels of the Xbox One at the same frame rate.
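Under those assumptions, the gap can be framed as raw pixel throughput. A quick Python sketch (this obviously ignores everything about a frame other than resolution, and assumes both versions actually hold 60 fps):

```python
# Back-of-the-envelope pixel throughput for Ghosts, assuming both versions
# hold their 60 fps target and each console was pushed as far as it could go
# (my assumption, as stated above).
FPS = 60
ps4_throughput = 1920 * 1080 * FPS   # native 1080p on PS4
xb1_throughput = 1280 * 720 * FPS    # native 720p on Xbox One

ratio = ps4_throughput / xb1_throughput
print(f"PS4: {ps4_throughput / 1e6:.1f} megapixels/sec")  # 124.4
print(f"XB1: {xb1_throughput / 1e6:.1f} megapixels/sec")  # 55.3
print(f"Ratio: {ratio:.2f}x")                             # 2.25x
```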
Since we can't run real benchmarks on both of the systems right now, and we won't be able to even when we get them in our hands, we don't have any way of knowing that number exactly. But if we assume that IW pushed Ghosts to the maximum resolution each console could handle while maintaining an average 60 frames per second, the best comparison I can draw between the two consoles' graphics performance would roughly equate to the difference between a GeForce GTX 680 (PS4) and a GTX 560 (Xbox One). The former is a high-end $500 card that's a little over a year old; the latter came out almost three years ago and costs around $100 today. That's not nearly as close as I once believed.
If Kotaku's article is to be believed (and I don't have any reason to question it), developers are torn on the issue. Some say it's just a matter of the Xbox One SDKs not being as robust right now, that they might catch up in the future. Others suggest the problems are more ingrained and systemic and they might never get better. We don't know which is true. We can only make decisions based on the information we have today. For some consumers, by the time the Xbox One is released, it could be too late to make an informed decision.
So, based on the information I've found through research and my own personal testing, I have to believe that the PS4 is a more powerful machine by far, and that it's the machine that anyone who cares about visual fidelity should go for. The price difference and Microsoft's hostile attitude towards consumers this generation have made the choice easier for many people. But even if those weren't a factor, even if these machines cost the exact same and existed in a vacuum separate from their respective company's rhetoric, the choice seems pretty clear. Unless you are swayed by Xbox One's exclusive titles, its controller, or its online community, the PS4 is the clear choice.
Don't get me wrong here: I want the Xbox One to be a good console. Competition is great for both sides, and I don't wish ill on anyone who has chosen to buy the Xbox One. Furthermore, I want to play Titanfall and Dead Rising 3 and Halo at 1080p, 60 frames a second. I hope its performance improves. But I don't feel like I can bank on that today.
I was wrong to argue that the performance difference was negligible. I was wrong to dismiss those who felt like they were being betrayed. I was wrong to not question the opinionated conclusions of others in the press who apparently haven't tested this issue themselves in a real-world scenario.
Truce?