
Digital Foundry to stop counting pixels in 2022.

Termite

Member
I don't think they meant to, but they're kind of admitting that their site's whole purpose is a lot less relevant in 2022.

They started in 2004, I remember reading them in 2007 on Eurogamer analyzing Crysis and Assassin's Creed etc and it was fascinating. And they got super popular back in 2013 when the PS4 and Xbox One launched - because it really mattered which console was more powerful and there were clear resolution differences noticeable to the naked eye. I remember their Battlefield comparison so clearly and thinking "Yep, PS4 for me." Now, it doesn't matter to purchasing decisions at all, it's just an arguing point for nerds.

Perhaps why there's more game review editorializing in their content these days as well.
 

GHG

Member
Just as things were about to get spicy.

At least other technical outlets will still do their job.
 

SlimySnake

Member
Damn.. Is that actually true? How low was it?
They updated their article with some new pixel counts. Initially Alex said it was below 720p, but some time since the article and video went up, they updated the article with the 533p figure saying it significantly goes below 533p at times.

At the entry level, it's incredible to see Xbox Series S deliver this at all but it does so fairly effectively, albeit with some very chunky artefacts. Here, the reconstruction target is 1080p, but 1555x648 looks to be the max native rendering resolution in letterboxed content with some pixel counts significantly below 533p too.

I have no idea why they are shying away from numbers below 533p when they said they would still continue to count Switch resolutions that hit this low of a pixel count. Who says "below 533p" anyway? Why can't you just give an actual minimum pixel count?

 

GHG

Member
They updated their article with some new pixel counts. Initially Alex said it was below 720p, but some time since the article and video went up, they updated the article with the 533p figure saying it significantly goes below 533p at times.



I have no idea why they are shying away from numbers below 533p when they said they would still continue to count Switch resolutions that hit this low of a pixel count. Who says "below 533p" anyway? Why can't you just give an actual minimum pixel count?


To be fair, in light of this information I'd stop counting pixels too.
 

Elios83

Member
I don't get how they're supposed to pull this off.
Comparisons without actual metrics based on what? Perceptions? Feelings? 400% zooms?
 

Deerock71

Member
 
But the quote literally says it looks chunky. I think if the game's pixels start to look chunky, it's time to pull out the old Protractor.
It's going to be Peanut Butter from here on out. Either Smooth or Chunky. No need to count the nuts anymore. One is crunchy the other isn't.
 

dcmk7

Member
They updated their article with some new pixel counts. Initially Alex said it was below 720p, but some time since the article and video went up, they updated the article with the 533p figure saying it significantly goes below 533p at times.



I have no idea why they are shying away from numbers below 533p when they said they would still continue to count Switch resolutions that hit this low of a pixel count. Who says "below 533p" anyway? Why can't you just give an actual minimum pixel count?

It doesn't make much sense not to share the number unless they are a bit unsure... which, to be honest, why would they be?

I know we shouldn't form opinions on performance from what is a technical demo, but considering the extensive optimisations that were necessary just to get the thing running on Series S, I was expecting something around 720p. Not ~300-400p levels.
 

ReBurn

Gold Member
If they move from a technical to a more subjective analysis then there is no point in watching their content.
As the newer image reconstruction methods gain wider use on consoles with the move to current gen only games it's going to become harder for them to count pixels, anyway. We're going to have to rely on 2,000% zoom more than finding countable edges.
 

SlimySnake

Member
I don't get how they're supposed to pull this off.
Comparisons without actual metrics based on what? Perceptions? Feelings? 400% zooms?
Average FPS. If the game's visuals look identical then there is really no need to do anything else. Nowadays they are given graphics settings by the devs themselves that prove the graphics settings between the two consoles are identical. So at that point, you simply have to look at the game side by side and see if the resolution makes a difference. If not, then go straight into framerate analysis.

I remember when the PS4 Pro ports first came out, both Richard and Alex looked at games and said that if you don't zoom in, you can't tell the difference between native 4K and 4KCB in Watch Dogs. Cerny had invited Richard to look at two TVs displaying Days Gone side by side, and only after staring at the two was he able to tell the difference. So if an analyst has to look that hard, then what's the point? They literally have to use cutscene cuts to even do the pixel counts. They can't just take a screenshot during gameplay and pixel count it, because 4KCB resolves the entire 4K frame budget; but on cuts during cutscenes, the first frame shows some artifacts, which is what they have been using this whole time to pixel count. Now who is going to notice 1 out of 30 or 60 frames a second?
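As a toy illustration of what manual pixel counting does (hypothetical code, assuming simple nearest-neighbour upscaling): each native pixel becomes a run of identical output pixels along a scanline, so counting the runs on a clean edge recovers the native sample count. Real frames with TAA, checkerboarding or reconstruction blur these runs away, which is exactly why counting has become hard.

```python
def estimate_native_width(row):
    """Estimate native horizontal resolution from one upscaled scanline.

    Assumes nearest-neighbour-style upscaling, where each native pixel
    becomes a run of identical output pixels. Counting those runs is a
    toy version of manual pixel counting; modern reconstruction blends
    neighbouring samples, destroying the countable runs.
    """
    runs = 1
    for prev, cur in zip(row, row[1:]):
        if cur != prev:
            runs += 1
    return runs

# A 720-pixel-wide native line upscaled to a 1080-wide output:
native = list(range(720))
row = [native[i * 720 // 1080] for i in range(1080)]
print(estimate_native_width(row))  # -> 720
```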
 
This means that they can spend more time talking about a game's technical achievements and artistic choices rather than counting and comparing everything. That sounds good to me.
 

SlimySnake

Member
It doesn't make much sense to not share the number unless they are a bit unsure.. which to be honest why would they be.

I know we shouldn't form opinions on performance from what is a technical demo, but considering the extensive optimisations that were necessary for Series S, to even get the thing to run. Was expecting something around 720p. Not ~300/400 levels.
Well, the TFLOPs difference between the XSX and XSS is 3:1, right? But the resolution difference in pretty much every game has been 4:1: 4K vs 1080p. So if the XSX is 1080p, or 2.1 million pixels, the XSS would be around 0.5 million pixels, or roughly 540p. However, we have seen several games downgrade graphics settings on the XSS aside from dropping the resolution, so clearly the difference in GPU performance is more than 4:1. Guardians is 1080p on the XSS with severely pared-back foliage and other detail, while it's native 4K on the XSX. I am not surprised it is dipping below 0.5 million pixels.
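The back-of-envelope math above can be sketched like this (a 16:9 frame is assumed throughout):

```python
import math

def total_pixels(width, height):
    return width * height

def height_for_pixels(pixels, aspect=16 / 9):
    # For a fixed aspect ratio, width = aspect * height,
    # so pixels = aspect * height^2.
    return math.sqrt(pixels / aspect)

xsx = total_pixels(1920, 1080)        # XSX at 1080p: 2,073,600 pixels
xss = xsx / 4                         # the 4:1 resolution ratio seen in most games
print(round(height_for_pixels(xss)))  # -> 540 (i.e. ~540p)
```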

The question is just what "significantly" means when you are already under 533p. And why, after seeing such a low pixel count, do they believe it is sensible to stop counting pixels? Especially when the XSS is posting record sales. Shouldn't consumers know that this thing might be dipping as low as 480p? Especially since the upscaling technique introduces chunky artifacts?

I remember Yoshi and other UE games dropping to 360p-480p on the Switch in handheld mode, and I thought that was unacceptable. Clearly John did as well; that's why he carved out an exception for Switch pixel counting.
 

ACESHIGH

Member
I remember when 900p vs 1080p was seen as a night and day difference by DF and fanboys. Funny how the narrative has changed. I do agree that with all these reconstruction techniques it gets tricky, but raw pixel count should still be a valid metric.
 

DeepEnigma

Gold Member
They updated their article with some new pixel counts. Initially Alex said it was below 720p, but some time since the article and video went up, they updated the article with the 533p figure saying it significantly goes below 533p at times.



I have no idea why they are shying away from numbers below 533p when they said they would still continue to count Switch resolutions that hit this low of a pixel count. Who says "below 533p" anyway? Why can't you just give an actual minimum pixel count?

 

ZehDon

Member
Actually quantifying image clarity, and final image quality, without using the resolution metric will be an interesting challenge to see resolved.
 

Iamborghini

Member
I think it’s a good idea.

Sometimes the temporally upscaled 4K is better than native TAA 4K. (Looking at you, Control.) At the end of the day, what matters is the final output, the temporal stability and the crispness. So I'm ok with that.
 

Elios83

Member
Average FPS. If the game's visuals look identical then there is really no need to do anything else. Nowadays they are given graphics settings by the devs themselves that prove the graphics settings between the two consoles are identical. So at that point, you simply have to look at the game side by side and see if the resolution makes a difference. If not, then go straight into framerate analysis.

I remember when the PS4 Pro ports first came out, both Richard and Alex looked at games and said that if you don't zoom in, you can't tell the difference between native 4K and 4KCB in Watch Dogs. Cerny had invited Richard to look at two TVs displaying Days Gone side by side, and only after staring at the two was he able to tell the difference. So if an analyst has to look that hard, then what's the point? They literally have to use cutscene cuts to even do the pixel counts. They can't just take a screenshot during gameplay and pixel count it, because 4KCB resolves the entire 4K frame budget; but on cuts during cutscenes, the first frame shows some artifacts, which is what they have been using this whole time to pixel count. Now who is going to notice 1 out of 30 or 60 frames a second?

We all know that games on the new consoles like PS5 and XSX look the same.
But objective metrics are still needed in a technical analysis otherwise they can be labeled as simple opinions.
Also, if Switch games will continue to be pixel counted because of low resolution, so should XSS games.

Frame rate also needs the context of the resolution the game is running on.
And it also means that they should do like VGtech where pretty much identical sequences are analyzed with objective statistics provided for the whole sequences.
But so far they haven't done that, and they have usually provided general claims with a lot of VRR advocacy when talking about frame rate.

Also, this whole thing seems to go against their interests, which is getting hits from people looking for console war ammo, and I doubt they really want to lose that audience.
I think we don't know the whole story of what they're planning yet...
 

Ezekiel_

Member
If they want to improve their face-off videos, they should figure out a way to have the input from one controller transmitted to three consoles, so we could have like-for-like scene comparisons with a matching frametime graph.
 

chonga

Member
Wrong to cease counting.

It is right to highlight that the final output matters most, but how you get there is what matters when your whole purpose is to post a technical analysis.

Consider that two versions of a game look identical, but one has a much lower native resolution than the other. That's a technical feat for the lower-resolution version. And that is interesting from a technical standpoint, which is the whole point of the videos.
 

Buggy Loop

Member
With dynamic scaling, TAA, upcoming AI upscale/downscale/sidescale/etc, it’s almost useless now to count pixels.

But,

Fucking hammer on devs for frame times
 

NickFire

Member
Jokes aside, isn't this going to fan the console war drama way more than keeping the pixel counts? Even with objective metrics, look what happens when they compare games.
 

DenchDeckard

Gold Member
To be fair, the consoles are so close that you're fine with whichever one you get a game on. Once PS5 has VRR it's over, as nothing will matter if you have a VRR display.
 

Sosokrates

Founder of western console warring.
There's nothing wrong with saying the XSS goes to a native 533p or whatever resolution it is and explaining that it uses whatever reconstruction technique to achieve a higher pixel count. The problem arises when someone is like, "XSS is 533p lolz"

It ain't. The player does not see 533p; it looks more like 900-1080p.

Same goes for games like Returnal, which has a native res of 1080p but looks more like 1440p thanks to reconstruction techniques.
 

bender

Candy Corn Aficionado
Being more subjective isn't why I watch their content. Presenting facts and letting people decide what is important to them seems like the most beneficial approach. Worrying about platform warriors is ridiculous.
 

ZywyPL

Gold Member
No surprise given how they complained the whole year about how hard it got to count the pixels when all the edges are so blended/blurred due to all the recent image processing tech.

Still, native 4K is in a class of its own, with only DLSS almost on par with it; other techniques fall short, the lower res easily noticeable in that soft/blurred image. No need to pixel count to know that something's wrong.

Best part about pixel wars is how DF, NX and VG Tech all get different res results.

Given how popular dynamic-res has become, no wonder.
 

ckaneo

Member
Being more subjective isn't why I watch their content. Presenting facts and letting people decide what is important to them seems like the most beneficial approach. Worrying about platform warriors is ridiculous.
The issue here is that due to reconstruction and dynamic resolution, what they present are hardly facts. They are good estimates.
 

Riky

My little VRR pleasure pearl goes vrrrooommm.
There's nothing wrong with saying the XSS goes to a native 533p or whatever resolution it is and explaining that it uses whatever reconstruction technique to achieve a higher pixel count. The problem arises when someone is like, "XSS is 533p lolz"

It ain't. The player does not see 533p; it looks more like 900-1080p.

Same goes for games like Returnal, which has a native res of 1080p but looks more like 1440p thanks to reconstruction techniques.

The average pixel count would be more useful than giving highs and lows: what does the player actually see through a normal run of play?
The lows can be so brief that you blink and miss them, same with the highs.
Just an average figure would be fine and then detail what techniques are used to improve the image and how it looks.
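A frame-weighted average over sampled dynamic-resolution measurements would look something like this (the sample numbers are made up for illustration):

```python
def average_height(samples):
    """Frame-weighted average of per-frame resolution samples.

    `samples` is a list of (vertical_resolution, frame_count) pairs
    measured across a run of play. Hypothetical numbers below -- the
    point is that brief dips barely move the average figure.
    """
    total_frames = sum(n for _, n in samples)
    return sum(h * n for h, n in samples) / total_frames

run = [(1080, 500), (900, 300), (648, 180), (533, 20)]
print(round(average_height(run)))  # -> 937
```

Here the 533p dip lasts only 20 of 1,000 frames, so the average lands near 937p, which is much closer to what a player actually perceives than the raw minimum.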
 

ckaneo

Member
But would they feel this way if the Series X had shown a big difference like they were expecting?
It's way too early in the generation to say anything about the differences between the two.

Besides, spec-wise this is a close generation, maybe the closest a generation has ever been, so why exactly would they expect a big difference?
 

Edgelord79

Gold Member
It's the right thing to do, but their timing is very convenient. Resolution mattered so much to them from 2017 to 2020 (during the mid-gen years) and now suddenly it doesn't matter?
The onset of machine learning and supersampling techniques like DLSS is making pixel counting and native resolution irrelevant.
 