
2160p reconstructed vs. 1800p native?

Native 4K is not an option. So my preference is:

  • 2160p reconstructed with whatever method the developers come up with - more P's the better, right?

    Votes: 33 41.3%
  • 1800p native all the way, at least I know the heritage of all my pixels

    Votes: 14 17.5%
  • 1080p is fine as long as there's so much RT that Alex Battaglia is permanently blinded

    Votes: 12 15.0%
  • I have a kick-ass gaming PC - nothing else to contribute, just wanted to point that out

    Votes: 21 26.3%

  • Total voters
    80

nowhat

Member
While console wars can be entertaining and all, I'd like to point out that there is currently a very active thread about a "benchmark" in photo mode. I admit to participating as well with some one-liners - couldn't help it, such threads are like a laser pointer to a cat. But still, seriously. We are arguing over frame rates in photo mode. I think a brief intermission would be in order (and yes, I would feel the same had the results gone the other way) - why not discuss tech in general, especially in a topic where we can all be equally right and wrong, since this is arguably very much subjective?

The next (current?) generation of consoles has barely launched, and developers will surely learn to get more performance out of them in the coming years. But I think it's safe to say by now that no, not all games will be native 4K/60fps, RT or not. And a solid frame rate (even if it's just 30fps) is always preferable, so if something has to give, it will be resolution. Dynamic resolution scaling (which is often done along the horizontal axis) is of course an option, but in the end, it's the vertical resolution that counts the most. Two points to keep in mind for this debate:
  1. While the internal rendering resolution in a game may be pretty much anything, the console will upscale (or downscale) that to match its output resolution. Since we're talking about new and shiny tech, the output resolution will be 2160p.
  2. All reconstruction methods are equal, but some are more equal than others - some implementations (it's not just the technique used, the devil is in the details) can produce fantastic results, while others *cough*RDR2*cough* less so. For the purpose of this discussion, let's assume the reconstruction is one of the better implementations.
So, this is something I see being tossed around frequently, that native 1800p would be better than any "fake 4K" method if native 4K is not feasible. And... I'm not sure that would really be the case? The console still has to upscale it, which means it will lose some sharpness in the process. On the other hand, all of the reconstruction techniques have some issues. Checkerboarding doesn't handle transparencies or lines at certain angles well, temporal injection (what Insomniac calls their method) can be remarkably artifact-free, but at the same time it can also look quite soft and CGI-like, and so on.
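For anyone curious, the basic spatial idea behind checkerboarding can be sketched in a few lines of Python. This is purely illustrative - real implementations reproject the previous frame with motion vectors and do temporal filtering, which is exactly where the transparency and angled-line artifacts come from; this toy just interleaves two frames that each carry half the samples:

```python
import numpy as np

def checker_mask(h, w):
    # True on "even" checker cells, False on "odd" ones
    yy, xx = np.mgrid[0:h, 0:w]
    return (xx + yy) % 2 == 0

def reconstruct(frame_a, frame_b):
    """Combine frame_a (even cells) and frame_b (odd cells) into one image."""
    mask = checker_mask(*frame_a.shape)
    return np.where(mask, frame_a, frame_b)

# Each "frame" shades only half the full pixel grid...
full = np.random.rand(8, 8)
even = np.where(checker_mask(8, 8), full, 0.0)
odd = np.where(~checker_mask(8, 8), full, 0.0)
# ...and in this artifact-free toy case the reconstruction is exact.
assert np.array_equal(reconstruct(even, odd), full)
```

In a real game the two half-sample frames come from *different* moments in time, so the "odd" samples have to be reprojected - that's why fast motion and transparencies are the hard cases.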

I guess there are no right answers here, which kinda is the point of this thread. My vote goes to reconstruction, because our lord and saviour Cerny has championed it; more seriously, I've been impressed by the results in many games on a Pro. But that's just my opinion - what's yours?
 
I feel like the choice wouldn't realistically be between 1800p and 4K CBR - it would be between 1440p and 4K CBR.

I think 4k cbr looks great when it is implemented well, so I would always vote for that. (edit: vs 1440. 1800p is probably just as good as 4k cbr)
 
Reconstruction techniques are almost indistinguishable from native 4K these days. Alex and DF have a hard time telling whether an image is native 4K unless they're told. Look at Insomniac.

So pushing native 4k or 1800p may be a waste of resources when we can barely tell.
 
Horizon Zero Dawn used half-4K CB and I thought it looked great, so it *should be far less demanding than native 1800p*. I wish every console dev used that - and leave an option to select it on PC while you're at it.

*speculation , don't quote me on that
 

x@3f*oo_e!

Member
1800p is more pixels (work) than 2160p checkerboard - the equivalent is actually closer to 1440p raw (roughly 50% of the pixels overall).
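The back-of-the-envelope math checks out, assuming 16:9 resolutions and that checkerboard shades half the 4K sample grid per frame:

```python
# Per-frame pixel (sample) counts for the options being compared.
# Assumes 16:9 aspect ratio; 2160p checkerboard shades half the 4K grid.
counts = {
    "2160p native": 3840 * 2160,             # 8,294,400
    "1800p native": 3200 * 1800,             # 5,760,000
    "2160p checkerboard": 3840 * 2160 // 2,  # 4,147,200 (exactly 50% of 4K)
    "1440p native": 2560 * 1440,             # 3,686,400
}
for name, px in counts.items():
    print(f"{name}: {px:>9,} pixels ({px / counts['2160p native']:.0%} of native 4K)")
```

So 1800p native is roughly 39% more shading work per frame than 2160p checkerboard, and 1440p native (44% of 4K) is the closest raw match to checkerboard's 50%.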

Checkerboarding (with temporal reconstruction) gets artifacts on fast camera movement - but I actually don't mind that.

Probably native for fast-paced games, checkerboard for slower-paced ones
 

Kuranghi

Member
Cap to 30fps and get native 4K :goog_tongue:

I just spent all day working out how to scanline-sync games with RTSS, so now I can whack everything up to the highest settings, set native 4K res and play at 30Hz, but with no input lag, no tearing and perfect frame pacing.

Anything less than 2160p just isn't the same on a big 4K panel. You just gotta suck it up and cap your framerate.

That being said, there are many other factors - like the game's art style, texture detail and how LODs work - that might make native 2160p a bit pointless, or the engine is a crusty piece even at 4K; at that point I say render at a lower res to get more frames and apply temporal reconstruction/anti-aliasing + sharpening.
 

nowhat

Member
Whatever Horizon Zero Dawn used on the PS4 Pro looked great.
I agree, for the most part. During gameplay it looks terrific, unless you pause the image at very specific points with lots of movement (and if you find yourself doing that, you're just wanking over minute details which you'll never notice in real life - get back to actually playing the game already!). But CBR is quite evident in cutscenes - check out Aloy's hair or anyone with fur clothing. They look fine otherwise, but when positioned in front of a background that is out of focus (as is the case in many cutscenes, there's a rather heavy bokeh effect in place), the checkerboarding artifacts are very much visible. It doesn't ruin the game by any means, and such a situation is difficult for any reconstruction method, but in those instances it clearly shows that it's not native resolution. Again, though, when it comes to actually playing the game it's as good as native for me, no complaints.
 
1440p+ with temporal injection is the best non-native 4K imagery I saw last gen, so that. That and 60fps would be my preferred target for the PS5, unless it's a simple game and native 4K is doable - or a developer chooses 30fps for game-logic complexity reasons.
 

Rikkori

Member
It depends a lot on game and art style, it's not just about the technique. It's really difficult to spot the difference even between 5K & 50% 4K reconstructed in something like The Division 2, or just between 1800p and 5K in Kingdom Come Deliverance (AA or no AA). On the other hand in games like RDR 2 even 4K is barely enough and only so long as you keep TAA to medium, but otherwise it screams for 5K+.
 

yugoluke

Member
I cannot tell the difference any more. I will always take reconstruction if 60fps is the result. Additionally, I am certain that DLSS or equivalent techniques will be the focus in the future. The native resolution conversation will not matter once supersampling techniques become more ubiquitous.

AI/Machine learning will make this conversation redundant.
 

Mmnow

Member
The right answer is always going to depend on the game, the equipment it's played on, distance from the telly and the sensitivity of the player.

Which is why the only really right answer is for developers to give console players a degree of control over graphical settings.
 

EverydayBeast

thinks Halo Infinite is a new graphical benchmark
What’s Nintendo’s excuse for not using 4K god dang it? The pressure is on them to release a traditional console.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
TAAU or AMD's SuperResolution
I'm one of those dweebs who doesn't mind TAAU as long as it's not at 30fps... 80fps+ or bust.

<---- Plays at 1440p DSR from as high as I can maintain frames.
 