nosferatu93
Banned
Time to unsubscribe
Retire? Nah, it will evolve with Richard's new techniques of comparison.
No, but it ain't the only reason.
They updated their article with some new pixel counts. Initially Alex said it was below 720p, but some time after the article and video went up, they updated it with the 533p figure, saying it goes significantly below 533p at times.

Damn.. Is that actually true? How low was it?
At the entry level, it's incredible to see Xbox Series S deliver this at all but it does so fairly effectively, albeit with some very chunky artefacts. Here, the reconstruction target is 1080p, but 1555x648 looks to be the max native rendering resolution in letterboxed content with some pixel counts significantly below 533p too.
They updated their article with some new pixel counts. Initially Alex said it was below 720p, but some time after the article and video went up, they updated it with the 533p figure, saying it goes significantly below 533p at times.
I have no idea why they're shying away from numbers below 533p when they said they would still continue to count Switch resolutions that hit this low a pixel count. Who says "below 533p" anyway? Why can't you just give an actual minimum pixel count?
But the quote literally says it looks chunky. I think if the game's pixels start to look chunky, it's time to pull out the old Protractor.

To be fair, in light of this information I'd stop counting pixels too.
It's going to be Peanut Butter from here on out. Either Smooth or Chunky. No need to count the nuts anymore. One is crunchy, the other isn't.

But the quote literally says it looks chunky. I think if the game's pixels start to look chunky, it's time to pull out the old Protractor.
It doesn't make much sense to not share the number unless they are a bit unsure... which, to be honest, why would they be?

They updated their article with some new pixel counts. Initially Alex said it was below 720p, but some time after the article and video went up, they updated it with the 533p figure, saying it goes significantly below 533p at times.
I have no idea why they're shying away from numbers below 533p when they said they would still continue to count Switch resolutions that hit this low a pixel count. Who says "below 533p" anyway? Why can't you just give an actual minimum pixel count?
As the newer image-reconstruction methods gain wider use on consoles with the move to current-gen-only games, it's going to become harder for them to count pixels anyway. We're going to have to rely on 2,000% zoom more than finding countable edges.

If they move from a technical to a more subjective analysis, then there is no point in watching their content.
Average FPS. If the game's visuals look identical, then there is really no need to do anything else. Nowadays they are given graphics settings by the devs themselves that prove the settings between the two consoles are identical. So at that point, you simply have to look at the game side by side and see if the resolution makes a difference. If not, then go straight into frame-rate analysis.

I don't get how they're supposed to pull this off.
Comparisons without actual metrics based on what? Perceptions? Feelings? 400% zooms?
Close topic...that was great...LOL

Life... console wars always finds a way
This means that they can spend more time talking about a game's technical achievements and artistic choices rather than counting and comparing everything. That sounds good to me.
Well, the TFLOPS difference between the XSX and XSS is 3:1, right? But the resolution difference in pretty much every game has been 4:1: 4K vs 1080p. So if the XSX is 1080p, or 2.1 million pixels, the XSS would be around 0.5 million pixels, or 560p. However, we have seen several games downgrade graphics settings on the XSS aside from dropping the resolution, so clearly the difference in GPU performance is more than 4:1. Guardians is 1080p on the XSS with severely pared-back foliage and other detail, while it's native 4K on the XSX. I am not surprised it is dipping below 0.5 million pixels.

It doesn't make much sense to not share the number unless they are a bit unsure... which, to be honest, why would they be?
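The back-of-the-envelope arithmetic above can be sketched in a few lines (a hypothetical Python illustration using the poster's own estimates, not measured figures):

```python
# Sketch of the ratio arithmetic in the post above. The 4:1 pixel
# cut and 16:9 aspect ratio are the poster's assumptions.

def pixels(width: int, height: int) -> int:
    """Total pixels in a frame."""
    return width * height

def height_for_pixels(total: float, aspect: float = 16 / 9) -> float:
    """Vertical resolution of a frame of the given aspect ratio
    containing `total` pixels: total = (aspect * h) * h."""
    return (total / aspect) ** 0.5

xsx_4k = pixels(3840, 2160)      # ~8.3M pixels at native 4K
xsx_1080p = pixels(1920, 1080)   # ~2.07M pixels at 1080p

# A 4:1 pixel cut from 1080p:
xss_estimate = xsx_1080p / 4     # ~0.52M pixels
print(round(height_for_pixels(xss_estimate)))  # 540
```

Running the numbers gives ~540p for a quarter of 1080p's pixels, a touch under the post's "560p" guess but in the same ballpark.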
I know we shouldn't form opinions on performance from what is a technical demo, but considering the extensive optimisations that were necessary just to get the thing to run on Series S, I was expecting something around 720p, not ~300/400p levels.
They updated their article with some new pixel counts. Initially Alex said it was below 720p, but some time after the article and video went up, they updated it with the 533p figure, saying it goes significantly below 533p at times.
I have no idea why they're shying away from numbers below 533p when they said they would still continue to count Switch resolutions that hit this low a pixel count. Who says "below 533p" anyway? Why can't you just give an actual minimum pixel count?
Richard's new year resolution
Cracks me up.
Average FPS. If the game's visuals look identical, then there is really no need to do anything else. Nowadays they are given graphics settings by the devs themselves that prove the settings between the two consoles are identical. So at that point, you simply have to look at the game side by side and see if the resolution makes a difference. If not, then go straight into frame-rate analysis.
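The "straight into frame-rate analysis" step has a well-understood mechanical basis. A sketch of the general idea (toy data standing in for real captured video, not any outlet's actual tooling): capture the console's output at a fixed rate, and a game running below that rate will show duplicate frames, so counting unique frames over the window gives the average fps:

```python
# Hedged sketch of frame-rate analysis from a fixed-rate capture.
# A 60 Hz capture of a game running below 60 fps contains duplicate
# frames; unique frames per second of capture = average game fps.

CAPTURE_HZ = 60

def average_fps(frames: list) -> float:
    """Average game fps over a capture: unique frames / capture time."""
    # Count frame transitions; each transition marks a new game frame.
    unique = 1 + sum(1 for a, b in zip(frames, frames[1:]) if a != b)
    return unique * CAPTURE_HZ / len(frames)

# A game holding 30 fps shows every frame twice in a 60 Hz capture:
capture = [f for f in range(30) for _ in range(2)]  # 60 capture frames, 30 unique
print(average_fps(capture))  # 30.0
```

Real tools compare actual pixel data between consecutive captured frames rather than toy integers, but the counting logic is the same.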
I remember when the PS4 Pro ports first came out, both Richard and Alex looked at games and said that if you don't zoom in, you can't tell the difference between native 4K and 4K checkerboarding in Watch Dogs. Cerny had invited Richard to look at two TVs displaying Days Gone side by side, and only after staring at the two was he able to tell the difference. So if an analyst has to look that hard, then what's the point?

They literally have to use cutscene cuts to even do the pixel counts. They can't just take a screenshot during gameplay and pixel count it, because 4K checkerboarding resolves the entire 4K frame budget; but on cuts during cutscenes, the first frame shows some artifacts, which is what they have been using this whole time to pixel count. Now who is going to notice 1 out of 30 or 60 frames a second?
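For what it's worth, the edge-based counting being described can be sketched roughly (an illustrative formula, not any outlet's actual tooling): a nearest-neighbour-style upscale from native height h to output height H turns a one-pixel-per-pixel diagonal into stair-steps about H/h output pixels wide, so counting steps across a measured span recovers h:

```python
# Illustrative sketch of edge-based pixel counting. Assumes a clean
# stair-stepped diagonal, which is exactly what reconstruction hides
# outside of the first frame after a camera cut.

def native_height(output_height: int, steps: int, span_px: int) -> float:
    """Estimate native vertical resolution from a stair-stepped edge.

    steps:   number of stair-steps counted along the edge
    span_px: output pixels the counted steps cover
    """
    step_width = span_px / steps       # ~ output_height / native_height
    return output_height / step_width

# Example: on a 2160p frame, 27 steps spanning 108 output pixels
# means 4-pixel-wide steps, implying a 540p native image.
print(native_height(2160, 27, 108))  # 540.0
```

This is also why the thread's "2,000% zoom" complaint matters: once reconstruction smooths the steps away, there is no clean edge left to count.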
Best part about pixel wars is how DF, NX and VG Tech all get different res results.
The issue here is that, due to reconstruction and dynamic resolution ranges, what they present are hardly facts. They are good estimates.

Being more subjective isn't why I watch their content. Presenting facts and letting people decide what is important to them seems like the most beneficial thing. Worrying about platform warriors is ridiculous.
There's nothing wrong with saying the XSS goes to a native 533p or whatever resolution it is, and explaining that it uses whatever reconstruction technique to achieve a higher pixel count. The problem arises when someone is like, "XSS is 533p lolz".
It ain't. The player does not see 533p; it looks more like 900-1080p.
Same goes for games like Returnal, which has a native res of 1080p but looks more like 1440p thanks to reconstruction techniques.
It's way too early in the generation to say anything about the differences between the two.

But would they feel this way if Series X had shown a big difference like they were expecting?
The onset of machine learning and supersampling techniques like DLSS is making pixel counting and native resolution irrelevant.

It's the right thing to do, but their timing is very convenient. Resolution mattered so much to them from 2017 to 2020 (during the mid-gen years), and now suddenly for that generation it doesn't matter?