
How can we tell what resolution a game is rendering?

Bishop89

Member
I'm not sure if this is sarcastic, but if not:

What your TV tells you is the resolution of the signal going into it. However, even if the signal exiting the console is 1080p, the game can be rendering at a lower resolution. That results, generally speaking, in a blurrier game even though your TV claims it's the same type of input.

Thanks for educating me.
Time to smack my tv for lying to me all these years!
 
Ok, this is pretty damn cool and not that hard to do by myself. Thanks Liabe Brave and all the other counters :)

The Quaz51 (whoever started this) pixel count theorem should be a thing.
 

shandy706

Member
What about the latest build of Ryse? Isn't the example used from the old build (before they redid the LOD, lighting, etc.)?

We know it's said to be 900p... do the new videos show this?

Is Crytek some kind of Wizard that makes it difficult to tell?
 

sono

Gold Member
Very interesting thread.

I also learned about dynamic resolution, which sounds like a good technique.
 

AndyD

aka andydumi
Writing a program that does this automatically doesn't seem impossible at all. While the program would be stupid enough to have issues with antialiased edges and motion blur, it could also count literally every single line in the picture; if you had a way to detect whether a line was antialiased or blurred, you could check the most reliable lines in the picture and guess based on those.

I have to learn the PNG file format anyhow so I'll check this out.

It would not be impossible but tedious to check the whole image. Far easier would be something to count pre-defined areas. Essentially we can tell it where to count just as we identify areas to manually count. That would eliminate a huge workload that is redundant at best and wasted at worst. And if you feed it a good 5-7 areas to work on, you make sure you get more accurate readings.
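Something like this rough Python sketch is the shape of what I mean (Pillow and NumPy do the image reading for you; the file name, the boxes, and the assumption of a clean, high-contrast, monotone edge in each box are all made up for illustration, this is not a working tool):

import numpy as np
from PIL import Image   # handles the PNG format, so no parsing by hand

def count_steps(crop, threshold=128):
    """Count staircase steps on a near-horizontal edge in a grayscale crop."""
    mask = np.asarray(crop.convert("L")) > threshold
    edge_rows = mask.argmax(axis=0)                    # first bright row per column
    steps = np.count_nonzero(np.diff(edge_rows)) + 1   # distinct plateaus = steps
    span = int(edge_rows.max() - edge_rows.min()) + 1  # vertical extent in output px
    return steps, span

frame = Image.open("grab.png")                          # hypothetical direct-feed grab
boxes = [(860, 400, 960, 414), (1200, 655, 1320, 668)]  # made-up pre-defined areas
for box in boxes:
    steps, px = count_steps(frame.crop(box))
    print(f"{box}: {steps} steps over {px} px -> ~{1080 * steps / px:.0f}p")

Antialiased or blurred edges would obviously need something smarter than that simple threshold, which is exactly the hard part you mentioned.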
 

EvB

Member
Very interesting thread.

I also learned about dynamic resolution, which sounds like a good technique.

It is, and hopefully it's something we'll see more of to maintain a decent framerate.

Both Rage and Wipeout use it, if you want to see how it's used to maintain frame rate.
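The basic loop behind it is simple enough to sketch in Python (a toy illustration of the general idea only, not how either game actually implements it):

TARGET_MS = 16.7   # frame budget for ~60fps

def next_render_scale(scale, last_frame_ms, floor=0.7, ceiling=1.0):
    # over budget: render fewer pixels next frame
    if last_frame_ms > TARGET_MS:
        return max(floor, scale - 0.05)
    # comfortable headroom: creep resolution back up
    if last_frame_ms < TARGET_MS * 0.9:
        return min(ceiling, scale + 0.01)
    return scale

# e.g. at a scale of ~0.83, a 1920x1080 target is rendered at about 1600x900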
 
It would not be impossible but tedious to check the whole image. Far easier would be something to count pre-defined areas. Essentially we can tell it where to count just as we identify areas to manually count. That would eliminate a huge workload that is redundant at best and wasted at worst. And if you feed it a good 5-7 areas to work on, you make sure you get more accurate readings.

I'm not very technically minded, but that was my general thought when I asked about it originally.

In essence, just create a program that mimics what we do.
 

Muffdraul

Member
The only thing that matters to me is: Can I see the difference with my own two eyes?

I bought my first HDTV in 2008, a 46" Bravia. Playing Xbox 360 games on it was never an issue; the 360 outputs everything at 1080p, which matches the Bravia's screen, and it always looked great to my eyes.

The PS3 was a different story. Depending on the game, sometimes it would output 720p. When it did, it would look a bit crappy, because the Bravia's built-in upscaler wasn't the best. Clearly inferior to the PS3 upscaler, which always did a great job. Even if I had to force it.

That Bravia died about a year ago, and I replaced it with a brand new one. It's only 42", but the built-in upscaler must have improved, because now when I play a 720p PS3 game on it, it looks beautiful. Maybe if I tried really hard and went out of my way to do some kind of direct comparison I could see a difference. But why bother? It looks fine to me, so it's a dead issue for me.
 
The only thing that matters to me is: Can I see the difference with my own two eyes? ...Maybe if I tried really hard and went out of my way to do some kind of direct comparison I could see a difference. But why bother? It looks fine to me, so it's a dead issue for me.
Of course you're correct. There are plenty of situations where resolution changes won't matter, either for technical reasons (display size, viewing distance, etc.) or because the user simply doesn't mind. Pixel counting is only about revealing numerical facts. Interpretation of those facts, or whether you care at all, are separate issues.

What about the latest build of Ryse. Isn't the example used from the old build (before they redid the LOD, Lighting, etc.)

We know it's said to be 900p....do the new videos show this?
The newest footage I know of is the Story trailer from a few days ago. Here are a few direct-feed 1080p shots from it, released by Microsoft on the official Xbox site. I've outlined the elements I'll count in red.

iSpDVRL.jpg

qd960pY.jpg


Here are the two test elements for the first shot, zoomed in 6x. Remember that we use near-vertical lines to count resolution width, and near-horizontal lines to count resolution height.
L5avLrL.jpg
9fchp1v.jpg

On the first test, I count 15 steps. Resized back down to its original size, this crop is 15 pixels wide. On the second test, I count 19 steps, and the original size is 19 pixels tall. So these tests indicate native rendering at the same resolution as the originating screenshot, which is 1920x1080.

To double-check, let's count elements from the other shot. Because they're from shallower angles, I've only zoomed in 4x this time.
8ByQLSH.jpg
BgYZDSS.jpg

The first test is a bit low-contrast since the shield isn't in the plane of focus, but I count 16 steps; the original crop is 16 pixels wide. On the second test I count 31 steps, and the original resolution is 31 pixels high. So these tests also confirm native rendering at 1920x1080.
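If you want to sanity-check counts like these yourself, the rule is just a ratio. As a throwaway Python helper (my own naming, nothing official):

def native_axis(output_px, steps, crop_px):
    # steps counted along an edge, over a crop that's crop_px at original size
    return output_px * steps / crop_px

print(native_axis(1920, 15, 15))   # 1920.0 -> the width test above reads as native
print(native_axis(1080, 19, 19))   # 1080.0 -> the height test above reads as native

When steps and pixels match, the ratio is 1:1 and the render is native; when the steps come up short, the same formula spits out the lower native resolution.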

Since we know from Microsoft and Crytek themselves that Ryse renders at 900p, these results are anomalous. Barring error by me--and please feel free to reconfirm all my counts--it appears that one of two things is going on. The first possibility is that the situation is exactly like the Combat Vidoc: this output doesn't come from the One, but is a target render from a PC, at a higher resolution than the game will end up at.

However, the trailer doesn't have the "not representative" warning on it, suggesting that this is really supposed to be One output. Given that, there's another possibility I think more plausible. Note that all the screenshots Microsoft released are from cinematics, not gameplay segments. So perhaps the trailer does come from One, but the cinematics in Ryse aren't real-time but instead prerendered, at a higher resolution than the gameplay (and possibly with extra effects enabled). This is not an unusual approach, and it would also explain why some of the scenes in the trailer seem sharper than others.

Of course, the best way to test this would be to pixel count some of the gameplay segments in the trailer to compare. Unfortunately, there's a problem. The version embedded on the Xbox site is only 360p, and there's no version at all on the official Xbox Youtube channel. There are supposed 1080p versions unofficially on Youtube, but they're very blurry either from upscaling or simply compression artifacts. Despite multiple attempts on several grabs, I found counting impossible. (You're welcome to try yourself, of course.)

So for now, it appears Crytek and Microsoft have still not released any footage of Ryse provably running at its actual resolution. At the very least, we can say that gameplay will definitely not look as good as the cutscene shots above (though those shots might appear in the game as video).
 

Binabik15

Member
If you're right that would be really sly
and dickish
to hide behind screens that are actually rendered at 1080p and hide the upscaled stuff you'll actually see during gameplay in low-res videos, making it impossible to judge how well it works.
 
If you're right that would be really sly
and dickish
to hide behind screens that are actually rendered at 1080p and hide the upscaled stuff you'll actually see during gameplay in low-res videos, making it impossible to judge how well it works.

Did GTA V's graphics match the trailers exactly?
 

velociraptor

Junior Member
Of course you're correct. There are plenty of situations where resolution changes won't matter ....
....

So for now, it appears Crytek and Microsoft have still not released any footage of Ryse provably running at its actual resolution ....
....
I don't think we've had any real direct-feed footage from Crytek thus far. I'd like to see what the game actually looks like running on the hardware.
 

BONKERS

Member
Ryse could be a number of things.

Good upscaling. Good AA + Good Upscaling (they did mention a few times that the scaling has really improved a lot). Or it could just be rendered natively at 1080p on a PC.

Some of the first Ryse shots released were at resolutions beyond 1080p.

And the first footage shown of the game had some aliasing issues. (Plus the old specular lighting model was pretty harsh too, which created aliasing as well.)

Aliasing issues are next to non-existent in the recent shots/footage.
 

p3tran

Banned
Ryse could be a number of things.

Good upscaling. Good AA + Good Upscaling ....
....

I really hope you are wrong on this, because if that (up)scaler invalidates our pixel counters, then this war won't be fair! What's next then? Women? Children?
:D
 
Ryse could be a number of things.

Good upscaling. Good AA + Good Upscaling (they did mention a few times that the scaling has really improved a lot).
Good upscaling alone will not defeat properly applied pixel counting. There are no upscaling algorithms that deliver better results than Lanczos or bicubic, and there haven't been for years. Until the development of a revolutionary new method closer to ideal sinc--and probably not even then--upscaling will not obscure itself.
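You can try this for yourself easily enough; Pillow exposes both filters, and the staircase steps survive either one (the file names here are placeholders for any 900p grab):

from PIL import Image

img = Image.open("frame_1600x900.png")   # placeholder 900p grab
img.resize((1920, 1080), Image.LANCZOS).save("up_lanczos.png")
img.resize((1920, 1080), Image.BICUBIC).save("up_bicubic.png")

Count the upscaled output and you'll still land back on 900.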

As for AA, yes that makes pixel counting harder. But there are two things to keep in mind. First, the Ryse images counted above already have powerful AA, better than any current-gen game I've seen. Yet that AA didn't defeat pixel counting.

Second, even potential future achievements won't do quite what you think they will. The very best AA imaginable might well make pixel counting impossible...but the way it'd do that is by blending edges into a fog of low contrast. So the result would not be an image indistinguishable from native rendering. Instead, it'd be an image indistinguishable from native rendering that has had a blur filter applied. Antialiasing and blur are synonyms.

It is the case that as display pixel densities go up--4k, 8k, 16k, etc.--the perceivable difference will go down, as individual scaling effects will subtend smaller angles of the visual field. But issues will remain as viewers approach the display. And just by dint of how they work, digital displays can't escape the possibility of encountering subpixel geometry, even at very high pixel density. So scaling artifacts, and thus pixel counting, will always be theoretically possible. You can take heart, though, that they'll become trivial to viewer experience.
 

BONKERS

Member
Man, I wonder what kind of AA Ryse is using anyway.


The AA wasn't that good in the first shown shots and footage of the game. (IMO anyway. But it still was a little better than current gen stuff.)


In their papers for the game, they developed and used SMAA T1X for the first footage, which still left a lot of specular aliasing issues. So they may still be using SMAA T1X, but with the redone shading (specular issues are far less of a problem now and look a lot better), that could be why it looks better.
 
Ryse could be a number of things.

Good upscaling. Good AA + Good Upscaling ....
....

When this game is released, I wonder how it will look.
 
Late reply, but: I wouldn't know, GTA4 killed my desire to play another GTA game so hard that I haven't even watched any footage of 5 besides the initial unveil trailer.

GTA V is a lot clearer and looks more natural. It doesn't look like a painting and it doesn't look all blurry like GTA IV.
 

Paz

Member
Damn, that's a lot of effort you have to go through because devs/pubs don't like to release information (or remotely truthful media assets), but it's great to see pixel counting is alive and well in the next generation :p

I wonder if Ryse's enormous file size gives more weight to the theory that the cutscenes are not real-time but rendered from in-engine with bells & whistles, since, as you pointed out, that would explain why the screens & movies don't match the resolution Crytek announced the game runs at. Either that, or it's just the typical media assets not actually being from the game as it runs on the target platform.

Surprised nobody has started a "Ryse footage & screens not from XBox One?" thread yet :p
 

kevlar35

Banned
Can I ask this question seriously? If it's necessary to go to this extent to tell the difference, does the difference really matter?
 

Ashes

Banned
Can I ask this question seriously? If it's necessary to go to this extent to tell the difference, does the difference really matter?

I think you're missing the point of the thread.

This thread is about determining the native resolution. That's it.

Arguments about whether one can tell the difference or not are separate from working out--to the best of our ability--the native resolution.

And you don't need pixel counting to know there is a difference between something last gen [below HD] and this gen at 1080p HD.
 

Durante

Member
I personally would be extremely surprised if those Ryse shots up there accurately depicted what you get over your HDMI cable from XB1 during gameplay.

So is every game so far confirmed to be running at 1080p on the PS4?
No. A few are, but we really won't know for sure until we measure all of them running on a retail system.
 

Sixfortyfive

He who pursues two rabbits gets two rabbits.
Can I ask this question seriously? If it's necessary to go to this extent to tell the difference, does the difference really matter?
I do a lot of video recording and livestreaming. It makes the most sense from both a quality and resource management perspective to record in the native resolution when possible, even if it takes a minute to discern what that resolution actually is.
 

televator

Member
Coincidentally, I've been trying to figure out what the native res is for Telltale's The Walking Dead on console. It really annoys me that no professional review site even bothered with such a detail...

I'm not sure I fully understand this pixel counting process. Even if I did, I know that for really reliable results you need proper source grabs and not just random shots off the internet.
 

Izayoi

Banned
Liabe Brave, I just wanted to say that I really appreciate this thread. I've learned a lot, and it's great having a place on GAF to go for this kind of information. Thanks!
 

chemicals

Member
Are you kidding OP? You're a gaffer, you should be able to tell resolutions and frame rates just by looking at the screen.
 

thejazzking

Neo Member
Thanks for educating me.
Time to smack my tv for lying to me all these years!

This. This is why I came to GAF.

Fed up with people always thinking they're right and angrily defending a position with zero knowledge of the subject matter, as in most other forums/comment sections.

I like it here.
 

EvB

Member
This. This is why I came to GAF.

Fed up with people always thinking they're right and angrily defending a position with zero knowledge of the subject matter, as in most other forums/comment sections.

You aren't reading the right threads.
 
Liabe Brave, I just wanted to say that I really appreciate this thread. I've learned a lot, and it's great having a place on GAF to go for this kind of information. Thanks!
You're quite welcome. Please keep in mind that I'm definitely not the last word when it comes to this stuff--technical GAFfers like dark10x and Durante (among others) are better resources, I'm sure. And though they post much less often, quaz51 and Al Strong are incredibly skilled and well-versed in pixel counting. If they were to disagree with my counts, then I'm probably wrong.

Are you kidding OP? You're a gaffer, you should be able to tell resolutions and frame rates just by looking at the screen.
I know you're joking, but of course laymen and not just GAFfers can indeed tell some differences in framerate and resolution just by looking at the screen. What people usually can't tell is the difference between, say, 1440x1080 and 1920x1080. Pixel counting is for finding the precise numbers; that doesn't mean only pixel counting can discern any difference.
 
I apologize for bumping my own topic, double-posting, and resurrecting a necrothread all in one go. But some of my previous posts may be in error, and I didn't want those statements to stand without a proper caveat.

My pixel counting of the most recent Ryse screenshots determined they were 1920x1080. Given the mismatch with the known rendering resolution, I suggested that the game's cinematics might be prerendered, which is a common approach. But my idea was always less plausible than it sounds. No matter how common CGI cutscenes may be in the industry generally, they are wildly at odds with the "everything realtime" philosophy of Crytek specifically. The CEO's Twitter handle is RealtimeCevat, for gosh sakes!

And then my nagging doubt was vastly exacerbated by an older July shot from one of the scenes I counted. Here it is, followed by the October shot I used before.

dPLhk5u.jpg

SE3WPx0.jpg

The scene now looks drastically better...except that it got a lot blurrier. This immediately suggests that my pixel count was wrong; the blur could be Crytek going from 1080p in the old build to upscaled 900p in the new shot.

But it's not cut-and-dried. My further recounts couldn't find the mismatch between steps and pixels that's the hallmark of upscaling, and my attempt to evaluate the AA pattern was also fruitless. Now, as I've said before, I'm not an expert and failure is always an option. But when a technical powerhouse like Durante implies that the newer shots are implausible as real console output, you listen. If my counts weren't mistaken, is there any other reason a shot could be 1080p but blurry? Yes: AA. Perhaps Crytek is using a very aggressive version of their AA solution in order to produce a more cinematic image? After all, while the newer shot is blurry in comparison to the original, on its own it's no softer than most live-action film imagery.

I also found another possible culprit: compression. Both shots look high-quality, but here are the respective Lab-color b channels of the images.

mpzssqg.png

QSjUziz.png

There's no comparison--the color information of the newer shot is radically less detailed. (This isn't as apparent in the original images because of perceptual limitations in the human visual system.) Could the difference be due to upscaling? I tested by taking the old shot, shrinking it down, then blowing it back up again to simulate upscale. Because this process is more forgiving than what happens with game rendering, rather than downscale to 900p I went all the way to qHD (540p) before coming back up. Here's a comparison of the b channels again. (I swear to you this is an animated GIF--pay close attention and you'll see.)

NnjoQxj.gif

I think it's obvious no amount of scaling could ever create the gulf in color quality between the old and new shots. That almost certainly means some very coarse compression has been applied to the image. Could that make the whole thing blurrier, and not just the color data? I can't answer from first principles, and I don't know how to test it empirically without knowing what type/amount of compression was applied in the first place.
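For anyone who wants to reproduce the two tests, here's roughly the procedure as a Python sketch (scikit-image and Pillow assumed; the file names are stand-ins for the July and October shots, and Lanczos is my choice of filter, not necessarily what any official toolchain used):

import numpy as np
from PIL import Image
from skimage.color import rgb2lab

def b_channel(path):
    rgb = np.asarray(Image.open(path).convert("RGB")) / 255.0
    return rgb2lab(rgb)[:, :, 2]          # the Lab "b" (blue-yellow) plane

old_b = b_channel("ryse_july.png")        # stand-in file names
new_b = b_channel("ryse_october.jpg")

# simulate a harsh upscale: round-trip the old shot through qHD
old = Image.open("ryse_july.png")
down = old.resize((960, 540), Image.LANCZOS)
back = down.resize(old.size, Image.LANCZOS)
back.save("ryse_july_qhd_roundtrip.png")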

So the possible conclusions are either:

1. The new Ryse screenshots are actually rendered at 900p, and upscale is making them soft.
2. The shots are rendered at 1080p and then a combination of AA and compression is making them soft.

I don't see knockout evidence for either statement, and I urge anyone with pixel-counting or compression-algorithm expertise to weigh in. But at the very least, my initial stance firmly siding with the second choice is unfounded. There's no solid proof these shots of Ryse were rendered at 1080p.
 

oVerde

Banned
Great update. And it seems a mystery indeed.

Amazing if some upscaler wizardry comes to fruition; that would be a massive crow banquet.
 
It's quite unsatisfying to leave a question unanswered, even if the reason may simply be lack of ability on my part. Perhaps those soft, blurry edges in Ryse that I nevertheless count as native 1080p are simply defeating me, no matter that I claimed AA didn't do so. Since newer direct-feed gameplay sources have been added for Ryse, I thought I'd take another crack at it.

As my attempt began, one thing became abundantly clear: though the video and screenshots are high-quality, it's still an incredible hassle to find countable edges. Blur and noise both disrupt straight lines, and are abundantly on show in Ryse. There's very aggressive depth-of-field strongly blurring background and foreground elements; post-processing edge AA and temporal frame-blending AA to smooth everything; extremely strong motion blur effects, both for moving objects and the world when moving the camera; smoke, dust, and bloom scattering light...and that's even before the entire image is doused with film grain and upscaled! The end result is rather a lot of frames that look something like this:

24bQj.png

(That shot's made even more indistinct by the desaturated, low-contrast color story.) But seeing if pixel counting can cut through this soup is the whole point, so let's go. Here's a shot from the York footage, with the counting element outlined and then blown up 16x below it.

oVqc9.png


vLUu8.png


Correcting for the zoom, this crop is 12 pixels high. That's the easy part; how many geometry steps are we seeing? Bearing in mind that I've cropped the AA off the top right corner, but the bottom left corner is only AA, I count 10 steps here. The original shot was 1080p, so 1080*(10/12) is 900p. Even with everything working against us, we seem to have accurately determined Ryse's known upscale.
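To spell the arithmetic out (same numbers as the count above):

zoom = 16
crop_px = 12    # crop height after dividing the 16x zoom back out
steps = 10      # plateaus counted on the edge
print(1080 * steps / crop_px)   # 900.0 -> matches the known 900p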

The most questionable premise, obviously, is how many steps there are. To make it clearer exactly how I determined my count, look at the below animation. (Altering your distance from the screen may change how accurate you think my count is.)

WBL03.gif


Let's do another shot, a little clearer perhaps. Again, the original on top with the counting element outlined and then zoomed in below.

zhbKW.png


v79gt.png


Here the (corrected) pixel height is 12. But this time the steps are clearer due to highlighting against a dim backdrop. I see 10, giving us 900p again by the same math as above.

Let's do it one more time for good measure, in a third locale. But this time instead of using the video I have a direct-feed screenshot from Gamersyde. In it I've outlined two elements, and then given the closeups.

LkF2e.png


XZG1O.png

ifCPd.png


The first zoom is that same count again, 10 steps and 12 pixels. The second, smaller zoom is 5 steps and 6 pixels, but the math still gives the same result: 1080*(5/6)=900. So my conclusion is that all these videos and screenshots are native 900p blown up to 1080p.

I urge you to check my work; I have no final say in this matter. But given that several counts, on different sources, all come up with the same answer, I'm more confident here than in my last Ryse analysis.

Plus, just plain watching the palace video with its larger draw distance (and thus more subpixel detail) amply shows there's an upscale. All the techniques Crytek employ do a great job of masking it, better than I would've expected. And of course you can't put a number to it this way...but an upscale of some sort is clear. For example, this close-cropped (but unscaled by me) animation of Marius shows it. At one-third speed the shadows scrabble all over, even though shadow maps in Ryse are generally high-resolution. And near the end, distant architecture at the top left shudders and shimmers in an obvious telltale of upscaling (even through the DOF blur).

MdeUZ.gif
 

KKRT00

Member
I got this from a scene from your gif, from the shield.

iEjQu28f3WAzo.png


Looks exactly like your shots from earlier footage.
 
With your vague description I couldn't find the exact frame you're talking about, but here's the same set of pixels from an extremely similar frame. You're right that this looks much like the counts I used in the prior set, one of which is repeated afterward for comparison.

qJjhS.png


OhVy5.jpg


And just like with my previous crop, my count on your example would come out to native 1080p. But that doesn't prove what I think you want it to: that Ryse has always been 900p and Crytek are simply so skilled that pixel counting founders against their might.

The reason it's not a slam dunk is simply that no single edge can show this, especially given the image quality factors I already mentioned above. The DOF, edge and temporal AA, motion blur, bloom, etc. make most edges very, very soft indeed. Notice, though, that my first and most challenging example isn't much sharper. The others are gradually easier, with the last being quite blatant to my eyes. (That's partially helped by moving from a video framegrab to a screenshot. You yourself pointed out that video grabs will have temporal AA, whereas framebuffer exports won't.)

My point is, in a game using so many blur and noise effects, some edges will always be too soft to count accurately. But that won't apply to all edges, all the time. For a striking example, here's a wider view of the scene your crop comes from, with that element outlined in red.

t0u4W.png


Quite near your element is geometry with much more ragged stepping, outlined in blue. Here are zoomed crops of those elements:

Jj045.png


KtVQN.png


The first is 10 steps over 12 pixels, the second 5 steps over 6 pixels. Both these calculate to 900p. So the very same shot you want to use as evidence that counting doesn't work, shows that counting works just fine.

Now compare to one of the older screenshots I counted as 1080p.

NBsYG.jpg


All the edges in this shot are soft, not just the ones I counted. I realized later that this overall softness could possibly be due to upscaling, which is why I followed up with doubts about my initial 1080p result. But now that we have definitely upscaled footage, that idea is also thrown into doubt, because the palace media--even though quite soft in some areas--shows obvious jaggies and scaling artifacts in others, often in the same frame. (And this is despite the fact that the old shot is a framebuffer export missing temporal AA, versus your new shot being a blended video grab.) These contradictory observations compel me to withdraw my initial certainty about 1080p. But your one soft edge elsewhere seems to me too weak to suddenly compel certainty in the other direction.

As an addendum, I want to be clear that my lack of certainty doesn't really extend to the first Ryse count I did, of the initial Combat Vidoc. Here's your "uncountable" edge followed by one from that footage:

qJjhS.png


D1qfb.png


As you can see, the AA back then spilled away from the geometry across many more pixels than even the most blurred edge in the new footage. I therefore still believe that old footage was running on a spec-targeted PC. The presence of other effects like water caustics that have since been removed bolsters the case.
 

Hermii

Member
I doubt I can tell much difference between 1080p and 900p upscaled on the Xbox One, which apparently has a really good upscaler. And no, MS didn't pay me to say this.
 

Timu

Member
The AA wasn't that good in the first shown shots and footage of the game. (IMO anyway. But it still was a little better than current gen stuff.)


In their papers for the game, they developed and used SMAA T1X for the first footage, which still left a lot of specular aliasing issues. So they may still be using SMAA T1X, but with the redone shading (specular issues are far less of a problem now and look a lot better), that could be why it looks better.
Ah, I see, but it's definitely one of the better AA solutions on consoles from what I've seen.
 
I doubt I can tell much difference between 1080p and 900p upscaled on the Xbox One, which apparently has a really good upscaler. And no, MS didn't pay me to say this.
The One upscaler is almost certainly no better than--indeed, no different at all from--the PS4 upscaler, since they both use AMD hardware. Even if they're unalike, absolutely no upscaler in the world can do anything to eliminate artifacts. That's why you get shimmery stuff like this:

MdeUZ.gif


That said, while upscaling can't hide artifacts, other effects can. And Crytek has done a very good job adding those effects to mitigate issues and keep perceived quality quite high. There's still cause for concern--most devs don't have Crytek's technical muscle, and games at 720p are far harder to polish than Ryse at 900p--but if the effect isn't bothering you, that's good! I made this thread to explain how pixel counting gives us numbers, not to gainsay anyone's enjoyment.
 
I always thought pixel counting was a joke used to make fun of people who took resolution stuff too seriously, I didn't know it was actually a thing.


The more you know.
 

KKRT00

Member
@Liabe Brave

My point is: how many edges have you checked in the earlier footage?
And there is a difference between a 300kb JPEG and a 2mb PNG.

---
Btw, in the Palace gameplay there is exactly the same scene as in the Story trailer. I think that's a good place for a comparison.

--------
That shimmer is not from upscaling, but from sub-pixel aliasing, and it will be visible in every game with only post-AA.
 

p3tran

Banned
aw man! I have a feeling this scaler shit from Microsoft is going to cause trouble!

where is that guy who said he would try to write a pixel-counting app? it looks like we may be needing it after all :D
 