
DF asks: Does Resolution Matter?

Methos#1975

Member
I don’t get why console devs don’t just go with 1080p; consoles are made for the living room, so the normal viewing distance is a couple of meters. I really couldn’t care less whether games run at 1080p or 4K at that range.
Nah. Sorry, but resolution does matter, and many of us with our setups can see a perceptible difference. For example, I immediately noticed the resolution drop with the boosted FPS option implemented in FO4 today and turned it off so I could go back to the higher resolution. I'm also that guy who uses quality mode over performance mode in most games, such as Valhalla; the loss in detail and sharpness is imo not worth 60fps when 30fps is still smooth.
 

Kamina

Golden Boy
The higher the numbers, the less it matters.
Same with FPS. 60 is dope, 120 is still a noticeable improvement. 240 is probably overkill.

Eventually, when we reach 4k/120/Raytracing as standard in the not too distant future, we may focus on refining graphical realism again.
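
A quick back-of-the-envelope shows why the returns diminish: each doubling of the refresh rate saves a smaller slice of frame time. A minimal sketch in Python (plain arithmetic, nothing assumed beyond the refresh rates themselves):

```python
# Frame time saved by each jump in refresh rate: every doubling
# buys half as much as the previous one.
rates = [30, 60, 120, 240]
for lo, hi in zip(rates, rates[1:]):
    saved_ms = 1000 / lo - 1000 / hi
    print(f"{lo} -> {hi} fps: frame time drops by {saved_ms:.1f} ms")

# 30 -> 60 fps:   16.7 ms
# 60 -> 120 fps:   8.3 ms
# 120 -> 240 fps:  4.2 ms
```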
 

Hunnybun

Member
This is my personal opinion, but I can't tell the difference between 1440p and above at a normal seating distance.

I spent a lot of time messing with Forza Horizon 4 on my PC. I couldn't tell a difference between 1440/1800/2160 at my seating distance (1-2.5m from a 50" 4K tv) and I've left it at 1440 so I can run it Ultra 60. Now if I went up close to my TV and stared at a static image then yeah I would notice. Hell I can easily play at 1080p from that distance and it looks fine. When it comes to consoles I can't tell a difference between checkerboard and native either.

I play games, not stare at 400% static zooms.

I would imagine this is the case for the vast majority of people, if they were honest. ESPECIALLY once they're actually playing the damn game and absorbed in the action.

I sit about 6 ft from my 65" OLED and it's incredibly hard to notice differences like that.

Miles Morales: I think I can notice a difference between native 4k and whatever tricks they're pulling in the Performance RT mode. Note: I think. I couldn't swear to it. It's genuinely hard to tell. I'm not sure I could pass a blind test tbh.

Demon's Souls: the difference seems to be slightly bigger here. There seems to be a slight sharpening. Nothing major and nowhere near worth the loss of frames, but just about noticeable. I think I could pass a blind test on this.

Horizon ZD: I can tell the difference between 1080p and 4k cb. In most scenes it's a relatively minor difference, though, amazingly enough. It would actually be interesting to see just how beautiful a game could be made for next gen targeting 1080p60. Maybe the trade off would actually be worth it?!

I can't honestly tell a difference between 4k native and checkerboard, either.
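
For context on those comparisons, here are the raw pixel counts involved. This is a simplified sketch: real checkerboard implementations vary, but a common approximation is that CB 4K shades roughly half the pixels of native 4K per frame.

```python
# Pixels shaded per frame at common resolutions, relative to native 4K.
# "4K CB" is approximated as half of native 4K's pixel count.
modes = {
    "1080p":     1920 * 1080,
    "1440p":     2560 * 1440,
    "4K CB":     3840 * 2160 // 2,
    "4K native": 3840 * 2160,
}
native = modes["4K native"]
for name, px in modes.items():
    print(f"{name:>9}: {px / 1e6:.1f} MPix ({px / native:.0%} of native 4K)")
```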
 

Chessmate

Banned
Of course resolution doesn't matter. Because this is where Xbox will outperform Playstation this gen. Obviously, it wasn't much of a problem when PS4 and PS3 outperformed Xbox One and 360 resolution-wise. 🤗

Anyway, Linneman rightly says: IF you can't hold your target framerate, lower the resolution. That's common sense. If one of the consoles can hold a higher resolution AND a steady framerate, it's obviously the more capable machine - and "the better version." But it's telling that people need to spin this.
 

rofif

Can’t Git Gud
I chose a 4K monitor over a 144Hz 1080p one, and it does matter for me. I play mostly SP games, and Dark Souls is locked to 60fps anyway. When I saw how much sharper Dark Souls 3 looks at 4K, I was hooked. It also helps that 4K IPS panels were better quality than 1440p ones a year ago.

Also, with DLSS, Death Stranding looks better than native 4K. No aliasing, no pixel crawling. Amazing.

And I play locked at 58fps to stay in that sweet 40-60 FreeSync range for low lag and no tearing.
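
For anyone wondering why 58 and not 60: capping slightly below the VRR ceiling keeps every frame inside the variable window, so vsync (or tearing) never takes over. A minimal sketch of the arithmetic, using the 40-60 Hz range from the post:

```python
# Why cap at 58fps on a 40-60Hz FreeSync display: a little headroom
# below the ceiling means timing jitter never pushes a frame past 60Hz.
vrr_min, vrr_max = 40, 60  # panel's FreeSync range (from the post)
cap = 58

frame_time_ms = 1000 / cap                 # ~17.24 ms per frame
headroom_ms = 1000 / cap - 1000 / vrr_max  # ~0.57 ms below 60Hz

print(f"frame time at {cap}fps: {frame_time_ms:.2f} ms")
print(f"headroom below the {vrr_max}Hz ceiling: {headroom_ms:.2f} ms")
print(f"cap inside VRR range: {vrr_min <= cap <= vrr_max}")
```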
 

bender

What time is it?
We probably need two generations in a row where the target resolution is static (4K -> 4K), as the technology leaps between console generations are diminishing. We'll need PS6/Xbox Next before native 4K is realistic. I'd rather get a few more bells and whistles at 60FPS with upscaling than native 4K. The best thing about games late last generation, and early on so far, is that they are starting to offer performance/quality options.
 
There are so many factors coming into play. Mainly it depends on the size of your display and your sitting distance. But we should also not forget that TV displays vary a lot in quality, meaning the same resolution can look much better on one display than on another. And then there's the fact that it also depends on how good your eyesight is.
On some displays it was even hard to tell the difference between 720p and 1080p if you didn't sit directly in front of them, so with today's much higher resolutions it becomes even harder to tell the difference, unless you have a huge television and sit close to it. I think 8K will only make sense for really huge TV sets.
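
One way to put numbers on this is the common 1-arcminute rule of thumb for 20/20 vision: past the distance where a single pixel subtends less than one arcminute, extra resolution is invisible. A rough sketch under that assumption (the real threshold varies by person and content):

```python
import math

# Rule-of-thumb "retina distance": beyond this viewing distance, a
# viewer with 20/20 vision (~1 arcminute resolving power) can no
# longer separate individual pixels.
def max_useful_distance_m(diagonal_in, horizontal_px, aspect=16 / 9):
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)
    pixel_in = width_in / horizontal_px
    one_arcmin_rad = math.radians(1 / 60)
    return pixel_in / one_arcmin_rad * 0.0254  # inches -> meters

for res, px in [("1080p", 1920), ("4K", 3840)]:
    d = max_useful_distance_m(55, px)
    print(f'55" {res}: pixels blend together beyond ~{d:.1f} m')
# 55" 1080p: ~2.2 m; 55" 4K: ~1.1 m
```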
 
The irony of DF at times... I'm sorry, but screen res was a talking point even in the CRT days. How many of us Saturn and PS owners loved it when games like VF2 and Tekken 2 used high-res modes?
 
XSX has been winning the resolution battle so it doesn't matter now?
Well, they've got to pander to the masses. It makes me laugh that now they want to talk about an average over screen res; well, that should also apply to frame rates, unless it's the Xbox dropping a few frames, that is.
 

Panajev2001a

GAF's Pleasant Genius
I'm over here at 240p
Are you far away in the shimmering distance shouting “you can’t see me”? Technically accurate ;).

[GIF: John Cena "you can't see me" (Saturday Night Live)]
 

Elog

Member
They are spot on. The incremental value of resolution once you hit around 1600p is minimal at best with a TV+couch set-up, given a 5-8 ft viewing distance and a 55-65" TV - the eye cannot really discriminate the resolution increase, and the amount of extra silicon you need to go from 1600p to 2160p is close to 100% (quick math below).

With a monitor there is still good value in going beyond 1600p since we sit so much closer but for a console where almost everyone has the TV+couch set-up? Plain waste of silicon - use it for stable frames and post-processing instead.
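
The pixel math behind that claim, taking 1600p as 2560x1600 and equating pixel count with shading work (a simplification, since not all GPU cost scales with resolution):

```python
# Going from 1600p to 2160p roughly doubles the pixels shaded per frame.
px_1600p = 2560 * 1600   # ~4.10 Mpix
px_2160p = 3840 * 2160   # ~8.29 Mpix
print(f"pixel increase: {px_2160p / px_1600p - 1:.0%}")  # ~102%
```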
 

Panajev2001a

GAF's Pleasant Genius
Well, they've got to pander to the masses. It makes me laugh that now they want to talk about an average over screen res; well, that should also apply to frame rates, unless it's the Xbox dropping a few frames, that is.
I think DF is straight down the DLSS hole, and that's how I read their updated stance on native resolution. No, not just because nVIDIA is often a sponsor; John has been talking about modern upsampling/reconstruction techniques and how modern games at “native” res are a mix of layers that may or may not be at much lower resolution than the native one... DF was the one that had to be told the UE5 demo was not native 4K.
 

regawdless

Banned
Resolution is important and makes a difference - just like fps, raytracing, etc.

It's all about finding the best mix for every game.

I'm generally playing at 1440p and with good AA, it's clean enough. 4k is obviously better, but often not worth the fps hit for me.
 
I think DF is straight down the DLSS hole, and that's how I read their updated stance on native resolution. No, not just because nVIDIA is often a sponsor; John has been talking about modern upsampling/reconstruction techniques and how modern games at “native” res are a mix of layers that may or may not be at much lower resolution than the native one... DF was the one that had to be told the UE5 demo was not native 4K.
It wasn't just DF that said the Unreal tech demo wasn't native, but Epic's VP of Engineering who came out and said it.
I take it going down the line it must mean Series X will have a screen res advantage ;)
 

Panajev2001a

GAF's Pleasant Genius
It wasn't just DF that said the Unreal tech demo wasn't native, but Epic's VP of Engineering who came out and said it.
I take it going down the line it must mean Series X will have a screen res advantage ;)
No, DF had to be told it was not native... initially they believed it was native (at least that's what DF itself said). The point they're making is that games with good AA are difficult to distinguish from native-resolution games.

Not sure how you drag the XSX into it... 🤷‍♂️.
 
Personally I think that meeting or exceeding native display resolution is important...as long as you do that it'll never look bad. 480p looks good on a 480p display, it looks bad on a 4K display. In the same vein 1080p looks good on a 1080p display, but looks bad on a 4K display. There are benefits but people put way too much stock in raw pixel count. Beyond 1080p I'll always choose framerate over resolution.
 

TonyK

Member
I suppose I'm in the minority that prefers 4K30fps over 2K60fps. I tried playing at 2K60 in several games on PS5, but they look too PS4 in that mode for me. Maybe it's because I play close to my 65" TV, but for me the drop to 2K is more noticeable than the benefit of playing at 60fps.

The latest example is Yakuza: Like a Dragon, PS5 version. Everyone here says it should be played at 2K60fps because there's no image difference between quality and performance, and 30fps is more or less unplayable. I bought the game and discovered that at 30fps it's perfectly playable (same as all the games at 30fps we've played for years and years), and in 4K it looks noticeably sharper than the blurry, PS4-ish 2K.

I wish all games would continue this trend of offering both possibilities: 4K30 and 2K60.
 

martino

Member
They put things in context with history and technical limitations, to make it obvious to those losing sight of it why resolution mattered then and matters less and less now...
With this video they eliminate any "doubts" around this narrative. Now, trying to spin it again will only further expose who the real "shills" full of bad faith are.
 
No, DF had to be told it was not native... initially they believed it was native (at least that's what DF itself said). The point they're making is that games with good AA are difficult to distinguish from native-resolution games.

Not sure how you drag the XSX into it... 🤷‍♂️.
When you're watching a video it may be harder to tell. I was being sarcastic about the Series X. But I do wonder whether, going forward, the system will have a screen res advantage.
 

Shmunter

Member
I suppose I'm in the minority that prefers 4K30fps over 2K60fps. I tried playing at 2K60 in several games on PS5, but they look too PS4 in that mode for me. Maybe it's because I play close to my 65" TV, but for me the drop to 2K is more noticeable than the benefit of playing at 60fps.

The latest example is Yakuza: Like a Dragon, PS5 version. Everyone here says it should be played at 2K60fps because there's no image difference between quality and performance, and 30fps is more or less unplayable. I bought the game and discovered that at 30fps it's perfectly playable (same as all the games at 30fps we've played for years and years), and in 4K it looks noticeably sharper than the blurry, PS4-ish 2K.

I wish all games would continue this trend of offering both possibilities: 4K30 and 2K60.
Once in motion, though, there is the temporal resolution factor. Our brains process information over time; at 60fps our brain literally sees more detail. Of course there is a pivot point between res and framerate. Go too far in one direction and the pivot breaks.
 

ZywyPL

Banned
Linneman: Pixel count only became important with flat panels

And (sadly) this still holds up to this very day: anything a flat screen receives that isn't at its native res is just blurry, more or less, which is the exact opposite of why we reach for higher resolutions in the first place... 1440/1620/1800p are fine as long as they are being upscaled to 4K before they reach the display. On paper 1800p seems like a tiny gap from full 4K, but that slight blur is just there, let alone anything below.


Depends on the TV size. 1080p is good on 43" TVs. On 55" TVs, below 4K is noticeable. On a 65" TV, sub-4K becomes even more noticeable.

Depends on the display's native res - 1080p on 55" will be super crisp if the panel itself is also FullHD, whereas 1200-1440p, which is technically higher and should logically look better, will actually be a blurry mess if those same 55" are native 4K. That's just how flat-panel display tech works, and it's one of its biggest drawbacks.
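
The underlying issue is the scale factor: 1080p maps onto a 2160-line panel at an exact 2x, so each source pixel becomes a clean block of panel pixels, while 1440p lands at a fractional 1.5x and forces interpolation, hence the blur. A quick check, assuming simple spatial scaling (real TV scalers are more sophisticated):

```python
# Vertical scale factor from source resolution to a 2160-line 4K panel.
panel_lines = 2160
for name, lines in [("1080p", 1080), ("1440p", 1440), ("1800p", 1800)]:
    factor = panel_lines / lines
    kind = ("integer -> clean pixel mapping" if factor.is_integer()
            else "fractional -> needs interpolation")
    print(f"{name}: x{factor:g} ({kind})")
```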
 
Even with great reconstruction, anything that constantly sits around 1080p is too low imo for the big TVs we have nowadays (DRS is fine to get through a tough moment - better to sacrifice resolution than frame rate). I bet anyone could tell the difference between 1080p PS4 and 1440p PS4 Pro SM:MM shots without doing a side by side. I cropped them but didn't zoom or otherwise edit them.

[Screenshots: cropped 1080p PS4 vs 1440p PS4 Pro Spider-Man: Miles Morales comparison shots]


If you toggle between 1440p and 4K in something like Yakuza: Like a Dragon, you should clearly be able to tell the difference, but I think 1440p is good enough image quality, and if it means we get more 60fps games, I'm all for it.
 
In all the time I played One X/Series X on a 4k monitor not once did I think "Man, this blows away my 1440p PC." I've recently swapped out the 4k for a 1440p120hz on the Xbox and feel I've lost nothing, even in 60hz shitter games like Destiny 2.

60hz is the 480p of refresh rates. I really don't see any scenario where I'd want to move past 1440p without pairing it with at least 240hz.
 

Cherrypepsi

Member
More is more!
Native 4K is so nice and clean, but it comes at a cost that I'd rather spend on other things.
For example Ray Tracing in Cyberpunk, or 240hz in competitive games.
I am completely fine with DRS, DLSS, upscaling and image reconstruction.
 
I don’t get why console devs don’t just go with 1080p; consoles are made for the living room, so the normal viewing distance is a couple of meters. I really couldn’t care less whether games run at 1080p or 4K at that range.
Agree, I'm sitting 3-4 meters from my TV. Impossible to see the difference between 1080p and 4K.

Rather have better graphics with 1080p/60fps as the standard.
 

cireza

Member
With shit framerate in Outriders and Mortal Shell?
The hardware is built so that developers reduce the resolution, and the Mortal Shell guys kept the same 4K and went 30fps. When they try using the hardware as intended, maybe they'll reach the intended results?
 
It does matter. 4K has a much clearer and crisper image than 1080p. This is especially important if you're playing on a big, nice TV. Image quality, at least for me, is way more important than performance. You can get used to almost any performance issue after a while, unless it's a competitive game, where it will detract from the enjoyment.
 

Armorian

Banned
The hardware is built so that developers reduce the resolution, and the Mortal Shell guys kept the same 4K and went 30fps. When they try using the hardware as intended, maybe they'll reach the intended results?

But MS never forced anything on devs (like target framerates and resolutions), so you get relatively good versions of some games and shit versions of other titles. Developers clearly don't give a fuck about this console; they just shit out quick and dirty versions of their games.

The XSS is crippled by its low RAM amount and memory bandwidth; if you only have an XSS, prepare for some rough games.
 

Knightime_X

Member
nano jaggies (4K): unnecessary
micro jaggies (1440p): a non-issue to me
mini jaggies (1080p): highly tolerable, especially with at least 2x AA applied
mega jaggies (720p): difficult, but tolerable with 8x AA
pixel boulders (480p): horrific on a non-CRT monitor

Applies exclusively to polygon graphics, not 2D sprites, in my case.
 

assurdum

Banned
We already know their narrative. Resolution doesn't matter, but every time they see native 4K clarity on a particular platform, they suddenly change their mind, put up a bolded title about such an enhancement and call it transformative. The only real question is why they persist with this argument.
 

SkylineRKR

Member
Resolution matters. I find 1080p to look considerably worse than 4K and even 1440p. I would be fine with 1440p, however, since my room doesn't allow for more than 55 inches anyway and I'm not that far from my TV.
 

Allandor

Member
My problem is the increase in "artifacted graphics" over the years. More and more artifacts in the picture are accepted (like shimmering, ghosting, ...) that are produced by reconstruction techniques. It works for some games, but a choice between reconstruction and native resolution would be great.

With resolution it is always the higher the better (if performance is stable). But it is not the most important thing about the graphics. The content is important.
 