
Question regarding checkerboarding?

KORNdoggy

Member
I thought I had this understood, but I'm not sure now.

If a game like Final Fantasy 15 is "checkerboard 1800p", does that mean the game is natively rendering at 1800p and the checkerboarding is boosting the resolution to a full 4K output (albeit fake)?

Or is the checkerboarding boosting a lower native res to 1800p, and on a 4K screen that output is simply upscaled to fit your screen?

Basically, I'm confused as to what Horizon: Zero Dawn's native resolution is. DF say it's a checkerboarded 2160p title. Now, we all know this thing isn't natively rendering at that, but then what is it rendering at natively? Does a game have to render at 1920x2160 to reach full 4K checkerboarded resolution?

Cheers.
 
Checkerboard rendering means it's rendering every other pixel in a checkerboard pattern (with other fancy image processing and sampling effects) to get to the desired resolution.

So if a game is checkerboard 1800p, then it's checkerboard rendering up to 1800p and then that 1800p image is upscaled to 4K.

You'll find that this isn't necessarily a bad thing in many circumstances because even playing a game at like 1440p still looks like a solid upgrade on a 4K TV.
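To put rough numbers on that chain, here's a quick arithmetic sketch (Python; the names are mine, purely illustrative):

```python
# Resolution chain for a "checkerboard 1800p" game, as described above.
# Pure arithmetic - nothing here comes from an SDK or official source.

cb_buffer = (3200, 1800)                 # the 1800p image the game builds
output_4k = (3840, 2160)                 # what the TV finally receives

total_1800p = cb_buffer[0] * cb_buffer[1]
shaded_per_frame = total_1800p // 2      # checkerboard: every other pixel

print(f"1800p buffer:     {total_1800p:,} pixels")                  # 5,760,000
print(f"shaded per frame: {shaded_per_frame:,} pixels")             # 2,880,000
print(f"4K output:        {output_4k[0] * output_4k[1]:,} pixels")  # 8,294,400
```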
 

Rourkey

Member
I'm interested to know this as well, and also why this gets a free pass, but when an Xbox exec pointed out XB1 games are upscaled to 1080p, everyone lost their shit!

(I thought it was a stupid statement at the time as well; I just wondered why no one calls this out like they did back then.)
 

dr_rus

Member
The second one. 1800p checkerboard means that the game is using checkerboard rendering to reach the stated resolution. 1800p is 1600x900 with checkerboarding, which results in a 3200x1800 final buffer resolution - a bit shy of 4K's 3840x2160. Only half of these 3200x1800 = 5,760,000 pixels are actually shaded each frame though, so when comparing its h/w requirements, this mode is close to a native resolution of 2560x1080 or 1920x1440 - somewhat short of the typical PC 2560x1440 resolution.
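(For anyone who wants to verify that comparison, the per-frame shaded pixel counts work out like this - a quick sketch, labels mine:)

```python
# Per-frame shaded pixel counts behind the comparison above.
pixel_counts = {
    "1800p CB (half of 3200x1800)": 3200 * 1800 // 2,  # 2,880,000
    "2560x1080 native":             2560 * 1080,       # 2,764,800
    "1920x1440 native":             1920 * 1440,       # 2,764,800
    "2560x1440 native":             2560 * 1440,       # 3,686,400
}
for label, count in pixel_counts.items():
    print(f"{label:30s} {count:>10,}")
```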
 
The second one. 1800p checkerboard means that the game is using checkerboard rendering to reach the stated resolution. 1800p is 1600x900 with checkerboarding, which results in a 3200x1800 final buffer resolution - a bit shy of 4K's 3840x2160. Only half of these 3200x1800 = 5,760,000 pixels are actually shaded each frame though, so when comparing its h/w requirements, this mode is close to a native resolution of 2560x1080 or 1920x1440 - somewhat short of the typical PC 2560x1440 resolution.

Could you explain the difference here?
 
I'm interested to know this as well, and also why this gets a free pass, but when an Xbox exec pointed out XB1 games are upscaled to 1080p, everyone lost their shit!

(I thought it was a stupid statement at the time as well; I just wondered why no one calls this out like they did back then.)

Upscaling to 1080p was frustrating for people on the Xbox One because 1080p TV saturation was high, and people felt that Microsoft should have made a console powerful enough to hit that resolution consistently.

4K is still in its infancy. It takes a lot of power to natively render modern games in 4K. A console like the PS4 Pro will have to cut corners somehow. But also note that games upscaled to 4K usually look better than games upscaled to 1080p do.

Could you explain the difference here?

Think of a checkerboard. If you render every black square, you're only "rendering" half of the squares. Now imagine that on one frame you're showing all of the black squares, and on the next frame you're showing all the white squares.

Checkerboard rendering is more complicated than that so it's hard to put an exact output resolution on it, but that's the basic idea. It tries to make higher resolutions cheaper to achieve with minimal costs to image quality. It succeeds in most cases.
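Here's that black/white alternation as a toy numpy sketch (purely illustrative; real CBR fills the other half by reprojection rather than just showing stale squares):

```python
import numpy as np

# Which pixels get shaded on even vs odd frames, on a tiny 4x8 grid.
h, w = 4, 8
ys, xs = np.indices((h, w))
even_frame = (ys + xs) % 2 == 0   # the "black squares"
odd_frame  = (ys + xs) % 2 == 1   # the "white squares"

print(even_frame.astype(int))
print(odd_frame.astype(int))
assert (even_frame | odd_frame).all()   # the two frames cover every pixel
```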
 

KORNdoggy

Member
Checkerboard rendering means it's rendering every other pixel in a checkerboard pattern (with other fancy image processing and sampling effects) to get to the desired resolution.

So if a game is checkerboard 1800p, then it's checkerboard rendering up to 1800p and then that 1800p image is upscaled to 4K.

You'll find that this isn't necessarily a bad thing in many circumstances because even playing a game at like 1440p still looks like a solid upgrade on a 4K TV.

So does that mean Horizon is rendering at a higher native resolution than, say, Final Fantasy is achieving WITH checkerboarding?
 
So does that mean Horizon is rendering at a higher native resolution than, say, Final Fantasy is achieving WITH checkerboarding?

Yes. It's not really possible to pin down an exact resolution that checkerboard-rendered games are actually "rendering" internally, but yes, if you compare a checkerboarded 1800p to a checkerboarded 2160p, the latter is "higher resolution."
 
1800p is 1600x900 with checkerboarding, which results in a 3200x1800 final buffer resolution - a bit shy of 4K's 3840x2160. Only half of these 3200x1800 = 5,760,000 pixels are actually shaded each frame though....
Though it seems all current implementations do this, it's not necessarily true. You could have a CBR setup where the reprojection is run before the lighting pass. The savings would then be minimal, but the quality would be extremely difficult to differentiate from native rendering. The reason no one does this yet is likely because a lot of shading operations are redundant (i.e. the same object will end up looking pretty much the same from frame-to-frame). This is especially true for higher spatial and temporal resolutions.

If you want more info, OP, you might find the thread I made about CBR helpful. I believe the most accurate answer about Horizon Zero Dawn is that it's rendering at 3840x2160, just half the pixels are rendered in a slightly less accurate way.
 

ethomaz

Banned
1800p checkerboard means the game is natively rendered at 1600x1800 and gets checkerboarded to 3200x1800... after that you get upscaling to 3840x2160.

The second one. 1800p checkerboard means that the game is using checkerboard rendering to reach the stated resolution. 1800p is 1600x900 with checkerboarding, which results in a 3200x1800 final buffer resolution - a bit shy of 4K's 3840x2160. Only half of these 3200x1800 = 5,760,000 pixels are actually shaded each frame though, so when comparing its h/w requirements, this mode is close to a native resolution of 2560x1080 or 1920x1440 - somewhat short of the typical PC 2560x1440 resolution.
The actual rendered resolution for checkerboard 1800p is 1600x1800... 2,880,000 pixels... the result is an image at 3200x1800.

It is close to 1440p (2,764,800 pixels) in terms of power (performance) to render the image, but the quality is close to 1800p (5,760,000 pixels), with some artifacts due to using old frame info to fill half of the image.

I'm interested to know this as well, and also why this gets a free pass, but when an Xbox exec pointed out XB1 games are upscaled to 1080p, everyone lost their shit!
Upscaling to 1080p is way worse than native 1080p.
1800p checkerboard is way better than 1080p (better than 1440p if you're being fair).

It is really easy to understand why one gets a free pass and the other doesn't.
 

TyrantII

Member
I'm interested to know this as well, and also why this gets a free pass, but when an Xbox exec pointed out XB1 games are upscaled to 1080p, everyone lost their shit!

(I thought it was a stupid statement at the time as well; I just wondered why no one calls this out like they did back then.)

"Dumb" upscaling isn't 1:1 scaling, which is a big problem. It's sort of like being lossy, where you lose information in the process. It's why a 720P image scaled to 1080P looks worse than it did at 720P (vasaline smear) as pixles are averaged in a dumb fashion and baked into the image.

Using resolution that are factors of 1080P and can evenly be scaled produces 1:1 scaling. Further checkerboard scaling is using temporal fuckery (like Killzone shadow fall) did to best calculate pixles from prior frames instead of rendering them outright, saving a ton of processing power.

It's not perfect, but the difference is vasaline smear vs a crisp good looking image.
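A one-dimensional toy example of that difference (my own illustration, assuming plain integer duplication vs linear filtering):

```python
import numpy as np

# A hard-edged 4-pixel row, scaled by 2x (clean) and by 1.5x (blended).
row = np.array([0.0, 1.0, 0.0, 1.0])

clean_2x = np.repeat(row, 2)   # integer factor: each pixel maps 1:1 to two
print(clean_2x)                # [0. 0. 1. 1. 0. 0. 1. 1.] - edges stay hard

# Non-integer factor: output samples fall between source pixels, so values
# get averaged - the "vaseline" that's baked into the upscaled image.
positions = np.linspace(0, len(row) - 1, 6)
smeared_1_5x = np.interp(positions, np.arange(len(row)), row)
print(smeared_1_5x)            # fractional in-between values appear
```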
 

gofreak

GAF's Bob Woodward
I'm interested to know this as well, and also why this gets a free pass, but when an Xbox exec pointed out XB1 games are upscaled to 1080p, everyone lost their shit!

The reason is: it's an improvement over 'the common standard' (1080p), whereas in the context of the XB1 comment, it was an excuse for a box not meeting the performance of a then-cheaper competitor releasing at the same time, and not so reliably meeting the 'common standard' expected of the new generation at the time.
 
1800p checkerboard means the game is natively rendered at 1600x1800 and gets checkerboarded to 3200x1800... after that you get upscaling to 3840x2160.
That's a misleading way to state it; language like this is why people like the OP have a hard time understanding. An 1800c game natively renders the same number of pixels as 1600x1800, but not in that effective configuration. The target buffer is sparse 3200x1800. And saying that it "gets checkerboarded" implies that these 1600x1800 pixels are used to derive the rest, like upscaling. But that's not what's going on. Instead, the difference is made up by previously shaded pixels from indexed polys reprojected and error-corrected.

Nothing you said was strictly untrue, but it didn't make the actual method very clear.
 
I'm interested to know this as well, and also why this gets a free pass, but when an Xbox exec pointed out XB1 games are upscaled to 1080p, everyone lost their shit!
Because CBR is not upscaling. It's immensely better.

Further checkerboard scaling uses temporal fuckery....
CBR is not any kind of scaling. The temporal fuckery is a completely different and much more accurate approach. (It's more advanced than what Killzone did too.)
 

ethomaz

Banned
That's a misleading way to state it; language like this is why people like the OP have a hard time understanding. An 1800c game natively renders the same number of pixels as 1600x1800, but not in that effective configuration. The target buffer is sparse 3200x1800. And saying that it "gets checkerboarded" implies that these 1600x1800 pixels are used to derive the rest, like upscaling. But that's not what's going on. Instead, the difference is made up by previously shaded pixels from indexed polys reprojected and error-corrected.

Nothing you said was strictly untrue, but it didn't make the actual method very clear.
Yeap... you are right, I could have been more specific, but that is what made most people here confused, I guess. I tried to cover it with a simple comparison in a reply.

"It is close to 1440p (2764800 pixels) in terms of power (performance) to render the image but the quality is close to 1800p (5760000 pixels) with some artifacts due to using old frame info to fill half of the image."

I guess it is still not the best way to explain it, because the quality is close to the full picture (3200x1800) thanks to using old rendered info to fill half of the actual image... different from upscaling, which guesses each pixel based on the nearest pixels ("guessing" is a bit oversimplified too, because the result comes from complex algorithms).
 
1800c is rendering 1600x1800 (twice 900p) pixels in a 3200x1800 (four times 900p) framebuffer.

2160c is rendering 1920x2160 (twice 1080p) pixels in a 3840x2160 (four times 1080p) framebuffer.

For the actual rendered pixels, the multiplication is just so you know how many pixels are actually rendered; they are rendered sparsely, so it's not like rendering that block directly and then expanding it to the final framebuffer.
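In sketch form (assuming the simple (x+y) parity pattern; the real sample placement may differ):

```python
import numpy as np

# The 1600x1800 figure is a count, not a layout: the shaded pixels sit
# sparsely across the full 3200x1800 buffer.
h, w = 1800, 3200
ys, xs = np.indices((h, w))
shaded_this_frame = (ys + xs) % 2 == 0

assert shaded_this_frame.sum() == 1600 * 1800   # same count, sparse layout
```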
 

dr_rus

Member
Could you explain the difference here?
The difference is in the naming =) Checkerboard means that you're looking at the checkerboard pixel grid, with "holes" in it, where half of the pixels are "missing". The resulting spatial resolution of such a grid is the same as if there were no "holes". These "holes" are filled with data taken (in a smart way) from a previously rendered frame where the other half of the pixels were missing. So you're rendering only half of the resulting spatial resolution each frame, which halves the h/w requirements for such rendering.
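Here's that hole-filling idea as a naive sketch, assuming zero motion so the "smart way" collapses to just reusing last frame's pixel (real implementations reproject along motion vectors and error-correct):

```python
import numpy as np

h, w = 4, 8
truth = np.arange(h * w, dtype=float).reshape(h, w)   # stand-in "true" image
ys, xs = np.indices((h, w))

prev_mask = (ys + xs) % 2 == 0      # half shaded last frame
curr_mask = ~prev_mask              # the other half shaded this frame

prev_frame = np.where(prev_mask, truth, np.nan)   # NaN marks the "holes"
curr_frame = np.where(curr_mask, truth, np.nan)

resolved = np.where(curr_mask, curr_frame, prev_frame)  # fill holes from prev
assert not np.isnan(resolved).any()  # full grid: every hole got a value
```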

Though it seems all current implementations do this, it's not necessarily true. You could have a CBR setup where the reprojection is run before the lighting pass. The savings would then be minimal, but the quality would be extremely difficult to differentiate from native rendering. The reason no one does this yet is likely because a lot of shading operations are redundant (i.e. the same object will end up looking pretty much the same from frame-to-frame). This is especially true for higher spatial and temporal resolutions.
What would you reproject before the lighting pass? Not all games use deferred lighting with a dedicated lighting pass either.

1800p checkerboard means the game is natively rendered at 1600x1800 and gets checkerboarded to 3200x1800... after that you get upscaling to 3840x2160.

No, the game is being rendered to a 3200x1800 buffer in a checkerboard pattern. There is no 1600x1800 resolution anywhere here, although I see why it's easier to describe it this way, since 3200x1800 with checkerboarding has the same h/w requirement as 1600x1800 "native".

Technically, it would be more accurate to talk about rendering to 1600x900 with MSAA 2x with forced shading for 3200x1800 CB, and 1920x1080 with MSAA 2x with forced shading for 3840x2160 CB. "Forced shading" here means that all MSAA subsamples are shaded, instead of the typical MSAA behavior where only the subsamples which fall into different triangles are shaded.
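To see why two forced-shaded samples per pixel line up with a checkerboard, here's a small index sketch (the diagonal sample placement is assumed for illustration; it's not the exact hardware pattern):

```python
# Each base pixel contributes two diagonal subsamples in the 2x-res grid.
base_w, base_h = 4, 3            # stand-in for 1600x900
samples = []
for py in range(base_h):
    for px in range(base_w):
        samples.append((2 * px,     2 * py))      # first subsample
        samples.append((2 * px + 1, 2 * py + 1))  # second, diagonal one

# All samples land on one parity of the (2w x 2h) checkerboard, and the
# count is exactly half the pixels of the doubled resolution:
assert all((x + y) % 2 == 0 for x, y in samples)
assert len(samples) == (2 * base_w) * (2 * base_h) // 2
print(f"{len(samples)} samples for a {2 * base_w}x{2 * base_h} buffer")
```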
 

onQ123

Member
I thought I had this understood, but I'm not sure now.

If a game like Final Fantasy 15 is "checkerboard 1800p", does that mean the game is natively rendering at 1800p and the checkerboarding is boosting the resolution to a full 4K output (albeit fake)?

Or is the checkerboarding boosting a lower native res to 1800p, and on a 4K screen that output is simply upscaled to fit your screen?

Basically, I'm confused as to what Horizon: Zero Dawn's native resolution is. DF say it's a checkerboarded 2160p title. Now, we all know this thing isn't natively rendering at that, but then what is it rendering at natively? Does a game have to render at 1920x2160 to reach full 4K checkerboarded resolution?

Cheers.



The biggest problem is that people have a misunderstanding of what "native" means to begin with, so if you were to tell them that the 1800p resolution is also the native resolution, just achieved with a different rendering technique (checkerboard rendering), they wouldn't understand it & would just go off running their mouths.
 

truth411

Member
By my understanding, say for example Horizon Zero Dawn is 3840x2160p via checkerboard rendering. So it's effectively rendering the image at 3840i/2160p - rendering half the pixels of a full 3840x2160 frame at a time.
 
1800 checkerboard is basically the game running at 900p and uses checkerboard processing to get 1800p
No, it isn't at all. For one thing, 1800c shades twice as many pixels as 900p. For another, "uses checkerboard processing" is a vague phrase that is avoiding the very question the OP was asking.

What would you reproject before the lighting pass? Not all games use deferred lighting with a dedicated lighting pass either.
You'd reproject the unlit shaded pixels from the previous frame. And yes, some engines would not be suited for this. The point was that the current particular methods aren't the only possible implementations.

By my understanding, say for example Horizon Zero Dawn is 3840x2160p via checkerboard rendering. So it's effectively rendering the image at 3840i/2160p.
No, that's a confusing way to think about it too; interlacing has a particular meaning, and CBR doesn't use it. The more complete and longer descriptions already posted in the thread are far more accurate. This is a situation where trying to simplify the explanation too far eliminates the specifics that actually make CBR what it is.
 

dr_rus

Member
You'd reproject the unlit shaded pixels from the previous frame. And yes, some engines would not be suited for this. The point was that the current particular methods aren't the only possible implementations.

For that you'd have to store them between frames, which would mean both a bandwidth and a memory size impact - and the benefit of this is highly questionable in the end. Checkerboard artifacts are mostly related to motion vector calculation failures in general, not to incorrect lighting approximation specifically.
 
The difference is in the naming =) Checkerboard means that you're looking at the checkerboard pixel grid, with "holes" in it, where half of the pixels are "missing". The resulting spatial resolution of such a grid is the same as if there were no "holes". These "holes" are filled with data taken (in a smart way) from a previously rendered frame where the other half of the pixels were missing. So you're rendering only half of the resulting spatial resolution each frame, which halves the h/w requirements for such rendering.

Thanks, I feel like I get it now. I was a bit confused by your resolutions, but now I realize you were talking about both 1800c and 2160c. :)

I'm really hoping this tech can make its way into a driver-level feature for PC GPUs. Could do wonders for 4K gaming.
 
No, it isn't at all. For one thing, 1800c shades twice as many pixels as 900p. For another, "uses checkerboard processing" is a vague phrase that is avoiding the very question the OP was asking.

Plus, well, isn't 900p technically a quarter of 1800p? Same way 1080p is a quarter of 2160p/4K.

It's more, in terms of basic pixel count per frame, the equivalent of 1600x1800 or 3200x900, depending on which dimension you want to cut in half. Within 1/15th of a second (so 2 frames out of thirty a second), you have two of these 'checkered' frames played one after the other, with some extrapolation of the values from the previous to fill out the holes of the next. Thus within that 1/15th of a second, your eyes - and the console hardware - piece together a 3200x1800 image based on two separate images, each with the equivalent of 1600x1800.
 

ethomaz

Banned
No, the game is being rendered to a 3200x1800 buffer in a checkerboard pattern. There is no 1600x1800 resolution anywhere here, although I see why it's easier to describe it this way, since 3200x1800 with checkerboarding has the same h/w requirement as 1600x1800 "native".

Technically, it would be more accurate to talk about rendering to 1600x900 with MSAA 2x with forced shading for 3200x1800 CB, and 1920x1080 with MSAA 2x with forced shading for 3840x2160 CB. "Forced shading" here means that all MSAA subsamples are shaded, instead of the typical MSAA behavior where only the subsamples which fall into different triangles are shaded.
Yeap, if you describe it in a more correct way... then it is what you said, but I tried to simplify.

1800cb is rendered at 1600x900 with MSAA 2x, with the end result being a 1600x1800 frame... that frame is completed with old render info to reach the 3200x1800 frame... after that, upscaling to 4K is used.
 

dr_rus

Member
Thanks, I feel like I get it now. I was a bit confused by your resolutions, but now I realize you were talking about both 1800c and 2160c. :)
Eh, not really - I was talking about an equivalent "native" (as in non-checkerboard) resolution when it comes to assessing the h/w requirement. 1800p checkerboard is somewhat short of 2560x1440 here, as it's close to either 2560x1080 or 1920x1440. If you take 3840x2160 checkerboard, then it has almost the same h/w requirement as a non-checkerboard 2560x1600 resolution. Probably not the best way to compare these, though.

I'm really hoping this tech can make its way into a driver-level feature for PC GPUs. Could do wonders for 4K gaming.

Highly doubtful that it's possible to implement this as a general driver-level override, especially in newer APIs like DX12 and VK. With some s/w-transparent h/w support, though, I can see this being an option.
 
For that you'd have to store them between frames, which would mean both a bandwidth and a memory size impact - and the benefit of this is highly questionable in the end. Checkerboard artifacts are mostly related to motion vector calculation failures in general, not to incorrect lighting approximation specifically.
Artifacts are due to reprojection problems, yes, but general pixel value accuracy can also be off with CBR. You're probably right that on a 4K display (the common target for CBR), these variations would not be noticeable enough to justify the effort.

But current CBR does seem to be using that storage/bandwidth already. At least UI and possibly particles do not get reprojected. So the final display buffer doesn't appear to be serving as the source for the following frame.
 

Lister

Banned
Highly doubtful that it's possible to implement this as a general driver-level override, especially in newer APIs like DX12 and VK. With some s/w-transparent h/w support, though, I can see this being an option.

Some games support sparse rendering on PC already, but yeah, I'm not sure if this can be implemented on a driver level - it may have to be done purposefully on the engine side. Even with some hardware support on the Pro, for example, developers still need to implement the technology into their rendering pipelines.

As for quality: So far, and I would love to be proved wrong, I've seen mixed results.

On PC, playing Watch Dogs 2 with sparse rendering on, the frame took on a slightly softer appearance, but otherwise still looked great. However, the reduction in my GPU workload wasn't night and day at my 3440x1440 21:9 monitor, though it was a nice boost. On the other hand, the last Tomb Raider and Resident Evil 7 on PS4 Pro look significantly softer than even a 1440p native PC screen. So implementations and the final quality appear to vary from game to game, and even from scene to scene.
 

ethomaz

Banned
BTW from Cerny... this is the concept.

[Image: checkerboard_rendering-native-4k-ps4-pro.jpg]

For the 1800p example:

- Render 1600x900 + MSAA 2x
- Results in a 1600x1800 image
- Using old frame data, reconstruct to 3200x1800
- Upscale to 3840x2160
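Those four steps as a bare-bones pipeline sketch (the function names are mine, and each stage stands in for a lot of real GPU work):

```python
# The four steps above as stand-in functions (illustrative only).

def render_msaa2x_forced(scene):
    """Raster at 1600x900 with 2 forced-shaded samples per pixel."""
    return "1600x1800 worth of shaded samples"

def checkerboard_reconstruct(samples, prev_frame):
    """Interleave new samples with reprojected data from the last frame."""
    return "3200x1800 frame"

def upscale_to_display(frame):
    """Final scaler pass out to the TV's resolution."""
    return "3840x2160 output"

prev_frame = None                    # no history on the very first frame
samples = render_msaa2x_forced("scene")
frame = checkerboard_reconstruct(samples, prev_frame)
print(upscale_to_display(frame))
```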
 

ethomaz

Banned
Not exactly, the chain is:

1600x1800->checkerboard->3200x1800->upscale->3840x2160
He is saying that because when you render a 1600x900 image with MSAA 2x, the result is a 1600x1800 image.

The resolution used is really 1600x900, but the renderer is working with 2x 1600x900 if you add MSAA 2x.
 
Within 1/15th of a second (so 2 frames out of thirty a second), you have two of these 'checkered' frames played one after the other, with some extrapolation of the values from the previous to fill out the holes of the next. Thus within that 1/15th of a second, your eyes - and the console hardware - piece together a 3200x1800 image based on two separate images, each with the equivalent of 1600x1800.
Your eyes have nothing to do with it, and there's not half as many pixels in each frame. At 1800c resolution, every single frame sent to the display--even if the game is 60fps--has a full 3200x1800 pixels. Half of them were rendered using a highly complex indexed-reprojection-with-correction method that's somewhat cheaper and less accurate. But ~5.76m unique pixels are sent every frame, just like with native 1800p.
 
He is saying that because when you render a 1600x900 image with MSAA 2x, the result is a 1600x1800 image.

The resolution used is really 1600x900, but the renderer is working with 2x 1600x900 if you add MSAA 2x.

Since when is checkerboarding related to MSAA???

PS4 Pro is meant to work with double the pixels compared to the base PS4.
 
Your eyes have nothing to do with it, and there's not half as many pixels in each frame. At 1800c resolution, every single frame sent to the display--even if the game is 60fps--has a full 3200x1800 pixels. Half of them were rendered using a highly complex indexed-reprojection-with-correction method that's somewhat cheaper and less accurate. But ~5.76m unique pixels are sent every frame, just like with native 1800p.

Ah, okay, apologies for my misunderstanding. So what I imagined happens more internally, inside the machine itself, before a player sees anything, with each delivered frame being the result of the calculated projection of what a full 1800p image should look like, via the checkerboard method?
 

Lord Error

Insane For Sony
Some games support sparse rendering on PC already, but yeah, I'm not sure if this can be implemented on a driver level - it may have to be done purposefully on the engine side. Even with some hardware support on the Pro, for example, developers still need to implement the technology into their rendering pipelines.
Yes, the engine needs to calculate motion vectors and feed them into the algorithm for CB, and this cannot be done on a driver level unless some kind of standard is established that everyone would follow.
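A minimal sketch of what "feed motion vectors into the algorithm" means in practice (zero error correction, purely illustrative):

```python
import numpy as np

# Each unshaded pixel looks up where its surface was last frame, using
# per-pixel motion vectors supplied by the engine. No error correction.
h, w = 4, 8
prev_frame = np.arange(h * w, dtype=float).reshape(h, w)
motion = np.zeros((h, w, 2), dtype=np.intp)   # (dy, dx) per pixel, from engine

ys, xs = np.indices((h, w))
holes = (ys + xs) % 2 == 1                    # pixels not shaded this frame

src_y = np.clip(ys - motion[..., 0], 0, h - 1)
src_x = np.clip(xs - motion[..., 1], 0, w - 1)
reprojected = prev_frame[src_y, src_x]

fresh = np.full((h, w), -1.0)                 # stand-in for newly shaded pixels
frame = np.where(holes, reprojected, fresh)
print(frame)
```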


On PC, playing WatchDogs 2 with sparse rendering on, the frame took a slightly softer appearance, but otherwise still looked great. However, the reduction on my GPU workload wasn't night and day at my 3440x1440p 21:9 monitor, though it was a ncie boost. On the other hand, the last Tomb Raider and Resident Evil 7 on Ps4 Pro look significantly softer from even a 1440p native PC screen. So implementations and the final quality appear to vary from game to game, and even from scene to scene.
That's because on your PC the game is rendering at the same target resolution of your monitor. On PS4Pro, ROTR is rendering at 1800p and seeing that on a 2160p screen will have upscaling artifacts. Overall I think that game still looks better than 1440p game (with same AA, which I think was FXAA in ROTR) upscaled to 4K.
RE7 on PS4 is not using checkerboarding at all, and is just rendering at native 2240x1260 resolution, so naturally that will look softer than 1440p. Games on Pro that actually render in 2160cb resolution that matches the 4K display (like Horizon does) will have the look and sharpness that you were seeing on WD2 on PC.
 
On the other hand, the last Tomb Raider and Resident Evil 7 on PS4 Pro look significantly softer than even a 1440p native PC screen.
Resident Evil VII does not use CBR at all. And I don't think I've ever seen a comparison where 1440p PC Rise of the Tomb Raider was significantly sharper. Do you have the comparison you're talking about to hand?

He is saying that because when you render a 1600x900 image with MSAA 2x, the result is a 1600x1800 image.

The resolution used is really 1600x900, but the renderer is working with 2x 1600x900 if you add MSAA 2x.
Be careful how you speak about this. It's not just "rendering with MSAA 2x" or "adding MSAA 2x". Doing that normally does not increase the number of pixels. You specifically have to force shading of the extra samples--which is not part of the MSAA process--to get a proper source for CBR.

Furthermore, MSAA is typically only used for edges. If you do that, the result isn't CBR but what Sony has called "geometry rendering". This isn't as high-quality, since it doesn't affect textures and surfaces.

You have to force extra sampling over the entire image to actually have CBR, and by that point I think it's misleading to call it any kind of "MSAA". (It uses the same sampling grid, but then so could a scaling operation, and that's certainly not a type of MSAA.)
 

ethomaz

Banned
Be careful how you speak about this. It's not just "rendering with MSAA 2x" or "adding MSAA 2x". Doing that normally does not increase the number of pixels. You specifically have to force shading of the extra samples--which is not part of the MSAA process--to get a proper source for CBR.

Furthermore, MSAA is typically only used for edges. If you do that, the result isn't CBR but what Sony has called "geometry rendering". This isn't as high-quality, since it doesn't affect textures and surfaces.

You have to force extra sampling over the entire image to actually have CBR, and by that point I think it's misleading to call it any kind of "MSAA". (It uses the same sampling grid, but then so could a scaling operation, and that's certainly not a type of MSAA.)
It is how they render it internally.

While in MSAA as it's mostly used for anti-aliasing you apply it to select samples... in CB you apply MSAA 2x to the full 1600x900 internal render... the result is a 1600x1800 internal render.

The MSAA is the same... the difference is the target/use.
 
Eh, not really - I was talking about an equivalent "native" (as in non-checkerboard) resolution when it comes to assessing the h/w requirement. 1800p checkerboard is somewhat short of 2560x1440 here, as it's close to either 2560x1080 or 1920x1440. If you take 3840x2160 checkerboard, then it has almost the same h/w requirement as a non-checkerboard 2560x1600 resolution. Probably not the best way to compare these, though.

Ah, ok. I'm a bit confused again, but it's starting to make sense. I suppose this is a technique best explained with images or video.

It's a shame that it would be difficult to implement on a driver-level. But hopefully it's something that'll gain traction.

Here's where I'm confused:

You said 1800c is 1600x900 with checkerboarding. That's a quarter of the resolution of 1800p - half the pixels on both axes. But if you're following a checkerboard pattern, then you'd be dealing with half the resolution at a time.

Then you say the h/w requirement is closer to half the resolution: 3200x1800 / 2 = 2,880,000. 1920x1440, as you mentioned, is approximately equivalent. This makes sense following the checkerboard pattern.

But where does the 1600x900 come from? What were you trying to say with it?
 

Lister

Banned
Yes, the engine needs to calculate motion vectors and feed them into the algorithm for CB, and this cannot be done on a driver level unless some kind of standard is established that everyone would follow.



That's because on your PC the game is rendering at the same target resolution as your monitor. On PS4 Pro, ROTR is rendering at 1800p, and seeing that on a 2160p screen will show upscaling artifacts. Overall I think that game still looks better than a 1440p game (with the same AA, which I think was FXAA in ROTR) upscaled to 4K.
RE7 on PS4 is not using checkerboarding at all, and is just rendering at a native 2240x1260 resolution, so naturally that will look softer than 1440p. Games on Pro that actually render at a 2160cb resolution matching the 4K display (like Horizon does) will have the look and sharpness you were seeing in WD2 on PC.

Well, again, apparently things vary a lot. I didn't know RE7 was rendering at lower than 1440p resolution; that explains it. But we have side-by-side comparisons of a 1440p render on PC looking sharper than the 1800p checkerboard rendering of Tomb Raider.

I recall some of the textures on the Pro being lower quality than the PC ones, so I guess that might be a contributing factor, but it still looked softer than the 1440p PC shot overall.

EDIT: So I had the two confused. It was 1440p on PC vs the Pro in Resident Evil 7, which makes sense, since apparently it's rendering below 1440p. And the Tomb Raider comparison was vs 4K on PC, not 1440p, which naturally is going to look sharper than 1800p on the Pro.

Therefore, going by my Watch Dogs 2 experience, checkerboarding/sparse rendering at the native panel resolution just makes sense.
 
So what I imagined happens more internally, inside the machine itself, before a player sees anything, with each delivered frame being the result of the calculated projection of what a full 1800p image should look like, via the checkerboard method?
Yes, that's right. A CBR game outputs the same-size frames, on the same schedule, as native rendering. The difference is that up to half the pixels may be slightly different values. But some of that half will actually be exactly right, and some more will be so close to correct that the errors are unnoticeable.

That's why CBR results can be so good overall (though they're also variable, and can be less accurate at times).

I suppose this is a technique best explained with images or video.
I took a stab at doing just that a little while ago in my thread about CBR. You might check that out and see if it helps any.

...we have side-by-side comparisons of a 1440p render on PC looking sharper than the 1800p checkerboard rendering of Tomb Raider.
Can you point me to those? I'd be interested in seeing them.
 
Yes, that's right. A CBR game outputs the same-size frames, on the same schedule, as native rendering. The difference is that up to half the pixels may be slightly different values. But some of that half will actually be exactly right, and some more will be so close to correct that the errors are unnoticeable.

That's why CBR results can be so good overall (though they're also variable, and can be less accurate at times).

Thanks again for clarifying.
 

Lord Error

Insane For Sony
Well, again, apparently things vary a lot. I didn't know RE7 was rendering at lower than 1440p resolution; that explains it. But we have side-by-side comparisons of a 1440p render on PC looking sharper than the 1800p checkerboard rendering of Tomb Raider.
I haven't seen that, but if that was the case, it could be due to differences in texture resolution, and especially AA quality being better on the PC screen. It could also be that some things in the scene look better and some look worse.

I have seen a comparison between the native 4K ROTR PC shot and whatever the PS4 Pro was outputting - it was the up-close shot of Lara holding some kind of blue torch - and to me the Pro shot looked just a bit softer, but nothing major. Actually, on second thought, I think ROTR on Pro might even render at 2160cb, but I'm not sure.
 

Lister

Banned
I haven't seen that, but if that was the case, it could be due to differences in texture resolution, and especially AA quality being better on the PC screen. It could also be that some things in the scene look better and some look worse.

I have seen a comparison between the native 4K ROTR PC shot and whatever the PS4 Pro was outputting - it was the up-close shot of Lara holding some kind of blue torch - and to me the Pro shot looked just a bit softer, but nothing major.

Check out my edit, but yeah, it was PC @ 4K, and yes, there is also a texture quality discrepancy.

[Image: RFg5bMa.jpg]


Every game should have a sparse rendering option, IMHO.
 

ethomaz

Banned
Check out my edit, but yeah, it was PC @ 4K, and yes, there is also a texture quality discrepancy.

Every game should have a sparse rendering option, IMHO.
Ohhhh, sorry, I missed the edit even though I posted after, lol (I had it in the quote already).

Yeap... 4K with better textures will look sharper than 1800p checkerboard... there is no comparison at all.
 