
DF asks: Does Resolution Matter?

assurdum

Banned
Of course resolution doesn't matter. Because this is where Xbox will outperform PlayStation this gen. Obviously, it wasn't much of a problem when PS4 and PS3 outperformed Xbox One and 360 resolution-wise. 🤗

Anyway, Linneman rightly says: IF you can't hold your target framerate, lower the resolution. That's common sense. If one of the consoles can hold a higher resolution AND a steady framerate, it's obviously the more capable machine - and "the better version." But it's telling that people need to spin this.
I could swear AC and Dirt 5 are higher resolution on PS5 ... but we can say the same thing about you. Of course it matters to you. DLSS is a thing, but you're the only one who keeps insisting it counts for more; obviously it wasn't much of a problem in the Xbox One era.
Anyway, it's a matter of logic: if you can't see the difference, steady performance obviously counts for more.
 

Vae_Victis

Banned
It's all about screen size and sitting distance, isn't it?

1440p upscaled to 4K on a 55" set looks noticeably worse than native 4K, but if you sit further back you can't tell.
Correct. You need to figure out how "large" a pixel on the screen is inside your field of view. And it is obviously going to be smaller the smaller the screen is, and the further away from you it is.

Past a certain point it becomes very difficult to distinguish between similar colours, and at some point it becomes physically impossible to distinguish two distinct points at all. Your visual acuity also plays a role in this.
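The "how large is a pixel in your field of view" idea can be made concrete. Here's a rough sketch; the function name and the ~60 pixels-per-degree rule of thumb for 20/20 vision are my additions, not from the thread:

```python
import math

def pixels_per_degree(diagonal_in, horizontal_px, distance_m):
    """Approximate pixels per degree of visual angle for a 16:9 screen seen head-on."""
    width_m = diagonal_in * 0.0254 * 16 / math.hypot(16, 9)  # screen width in metres
    pixel_m = width_m / horizontal_px                         # width of one pixel
    # visual angle subtended by a single pixel
    angle_deg = math.degrees(2 * math.atan(pixel_m / (2 * distance_m)))
    return 1 / angle_deg

# 55" screen viewed from 2 m; ~60 ppd is a common 20/20-acuity rule of thumb
print(round(pixels_per_degree(55, 3840, 2.0)))  # 4K: ~110 ppd
print(round(pixels_per_degree(55, 2560, 2.0)))  # 1440p: ~73 ppd
```

By this estimate, even 1440p on a 55" set at 2 m sits above the 60 ppd threshold, which fits the "sit further back and you can't tell" observation; halving the distance halves the ppd.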
 

Chessmate

Banned
I could swear AC and Dirt 5 are higher resolution on PS5 ... but we can say the same thing about you. Of course it matters to you. DLSS is a thing, but you're the only one who keeps insisting it counts for more; obviously it wasn't much of a problem in the Xbox One era.

Valhalla does have a slightly higher resolution on PS5 (i.e. it doesn't drop as low as on XSX). But performance on XSX is better, which is why DF recommended lowering the resolution a bit more on PS5. That didn't happen, so XSX has the "better version", as laid out in this thread.

To be honest, I don't care much about these things. Just find it funny how you guys change your preferences after every tech analysis...
 

crozier

Member
I’m in the market for a new TV (32”) and I’m seriously considering a 1080p set because I prefer performance mode and have a boatload of PS4 games I have yet to play.

Am I missing out?
 

Armorian

Banned
I’m in the market for a new TV (32”) and I’m seriously considering a 1080p set because I prefer performance mode and have a boatload of PS4 games I have yet to play.

Am I missing out?

With a relatively good TV you're not. You'll get a native-res or downsampled image, so it's a win.
 

Tunned

Member
I don't know if it's the C9 55" OLED that is amazing at upscaling or whatever, but games at 1440p look amazingly crisp on it. I have perfect eyesight and I cannot tell the difference when switching between 1440p and 4K in a game, from a 1.5-2m viewing distance.
I guess maybe side by side I would see the difference, but I don't care. I'm just happy I didn't waste money on an RTX 3080; my RTX 2070 takes on everything like a champ at 1440p.
 
Depends on the TV size. 1080p is fine on 43" TVs. On 55" TVs, below 4K is noticeable, and on a 65" TV sub-4K becomes even more noticeable.
It also depends on the game and how motion blur and AA are used: Spider-Man at 1080p still looks nice and clear, but Crash 4 looks blurry even at 1080p.
 

Rudius

Member
For me on a regular screen, 1080p is good enough with great AA. 1440p, checkerboard 4K or dynamic res is better and nice to have, but I would trade those for better graphics. Native 4K is a waste and should only be the aim of less demanding games, like last gen ports, indies and sport games. 60 fps is very important though, and may determine if I will buy a game or not, when I'm on the fence.

In VR it's a different story. 1080p is not near good enough, even with the best AA possible (supersampling), like on some Pro or PS5 enhanced PSVR games. I suspect even native 4K will not be good enough, but only serviceable, like 720p-900p on the TV. 8K and even 16K should be noticeable improvements, but we can only hope to achieve that in the near future with the use of eye-tracked foveated rendering.
 

Rudius

Member
In console gaming history, it seemed to matter from Nov 15, 2013 to Nov 6, 2017.
Early last generation the anti-aliasing methods were much worse, leading to shimmer and artifacts even at high resolutions. When you have close-to-perfect AA like in recent games, it matters less. It's like comparing a movie in 720p vs 1080p to a game at those resolutions; in the movie it matters less because reality has perfect AA.
 

assurdum

Banned
Valhalla does have a slightly higher resolution on PS5 (i.e. it doesn't drop as low as on XSX). But performance on XSX is better, which is why DF recommended lowering the resolution a bit more on PS5. That didn't happen, so XSX has the "better version", as laid out in this thread.

To be honest, I don't care much about these things. Just find it funny how you guys change your preferences after every tech analysis...
"Slightly"? The gap between 1080p and 1368p is wider than the gap between 1800p and 2160p; ironic, but that's what the math says. Anyway, the only scene with very noticeably worse fps was a bugged cutscene that DF used ad infinitum. But sure: when it was 1800p vs 4K, the fps difference didn't count for them, and the precious 4K absolutely shouldn't be touched; yet developers should hurry to fix PS5 in Valhalla, because 1368p isn't worthy.
In any case, search the forum all you want: the people who preferred higher res were quite a minority. I'm not sure where you get this supposed higher-res preference in the past; there were even many dedicated threads about it.
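For what it's worth, the ratio claim can be checked with a quick calculation (assuming both resolutions share the same 16:9 aspect ratio, so pixel count scales with the square of the vertical resolution):

```python
def pixel_ratio(high, low):
    """Relative pixel count between two same-aspect-ratio resolutions,
    given their vertical line counts."""
    return (high / low) ** 2

print(pixel_ratio(1368, 1080))  # ~1.60: roughly 60% more pixels
print(pixel_ratio(2160, 1800))  # 1.44: 44% more pixels
```

So in pixel terms the 1080p-to-1368p gap is indeed wider than the 1800p-to-2160p gap, as the poster claims.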
 

Topher

Gold Member
Of course resolution doesn't matter. Because this is where Xbox will outperform PlayStation this gen. Obviously, it wasn't much of a problem when PS4 and PS3 outperformed Xbox One and 360 resolution-wise. 🤗

Anyway, Linneman rightly says: IF you can't hold your target framerate, lower the resolution. That's common sense. If one of the consoles can hold a higher resolution AND a steady framerate, it's obviously the more capable machine - and "the better version." But it's telling that people need to spin this.


Digital Foundry is the one "spinning" this. You realize that, right?

BTW, PS3 did not outperform 360 resolution-wise. PS3 typically had inferior versions of multiplatform games. Linneman explicitly references how the PS3 versions were distracting in comparison. He talks of how it became less of a problem with PS4/X1 but still noticeable. And now with 4K he needs a PC to magnify the images and compare. Seems to be a pattern.

But having said all that, if you need pixel count as a scorecard in the console war, nothing is stopping you.

You Do You GIF
 

Fredrik

Member
Remember GIF by memecandy


And that was with a console that some people called underpowered yet it was still very impressive.
That was in 30fps though. That's not okay in 2021 unless it's a super slow game like The Medium, and barely even then. I want 1080p at 60 or more fps.
 

Lokaum D+

Member
Yes it does, the 4K image is so f* good.

1080p@144Hz only for competitive shooters; for everything else I prefer 1440p/4K @ 60fps.
 

cireza

Member
Developers clearly don't give a fuck about this console, they just shit out quick and dirty version of their games.
That's on them, not on the console. I think it's a perfectly fine console to target 1080p@60fps; if developers keep targeting higher resolutions, then it's their problem if the game runs like shit in the end. MS can't enforce anything.

Developers aren't really used to making an effort when porting their games anymore. Everything is multi-platform nowadays. You make a game with a target in mind, port it quickly to every other platform by cutting features, and call it a day. And some people will even tell everyone how incredible the shit port is for running on last-gen hardware like the Switch.

"Look at this, it is Doom on Switch! You might even be able to distinguish something when you don't rotate the camera! Geniuses!"

"Oh no, our game cannot run on this shitty hardware." "No problem, let's make a cloud version."

You will never see the effort that went into the Saturn Quake port in any current-day port.
 

RoboFu

One of the green rats
Not as much as good color. I recently bought a Samsung G7, a 1440p VA monitor, supposedly one of the best. But I am taking it back for two big issues.

1. The curve is stupid. Curved monitors are stupid; it's horribly distorted.
2. VA is nowhere near as good a screen as an IPS panel. It's like night and day. If you haven't used IPS for PCs then it shouldn't bother you, but if you have mainly used IPS panels then VA looks like shit.
 

jroc74

Phone reception is more important to me than human rights
Of course resolution doesn't matter. Because this is where Xbox will outperform PlayStation this gen. Obviously, it wasn't much of a problem when PS4 and PS3 outperformed Xbox One and 360 resolution-wise. 🤗

Anyway, Linneman rightly says: IF you can't hold your target framerate, lower the resolution. That's common sense. If one of the consoles can hold a higher resolution AND a steady framerate, it's obviously the more capable machine - and "the better version." But it's telling that people need to spin this.
It's so easy to tell who is either:

Not watching the clip
Or has no idea how to comprehend what was said.

Folks from DF explicitly said: it mattered more when the target resolution was lower.
 

jroc74

Phone reception is more important to me than human rights
Look, let's take the PS5 and Sony out of the equation. Maybe that will make it easier for some folks.

In the clip, they used the Series S vs the Series X as an example, and how a game looked damn near identical on both as far as resolution goes.
 

Kuranghi

Member
It just depends on your screen size and distance to it (I'm assuming the TV is basically calibrated for accuracy). Higher resolution/better AA makes a huge difference for me on a 65" screen at ~2m; even 1800p is soft in comparison to native res. If you don't have a similar size-to-viewing-distance ratio, you might not notice the difference as much.

John said he can't see the difference between 1800p and 2160p on his LG OLED; I would like to know what his TV settings are and how close he is to the TV. I find it a big difference in clarity and really enjoy the extra detail. I also see a difference between 4K and 8K/16K (Dolphin emulation) in terms of aliasing and clarity; if I didn't, I wouldn't ever worry about it and would just play the games. Whether it's a meaningful difference is up to you, but you must start from the right place, i.e.:

* TV set up for an accurate image
* Screen to viewing distance ratio is appropriate for 4K

If you don't see the difference, then I don't see why you even want to talk about it with those of us who do; just go and enjoy the games (at higher framerates to boot). I get more enjoyment from a game when there is less aliasing/more clarity, as it lets me be more immersed in the world, (mostly) more so than if it's 60 fps at a lower resolution with more aliasing. Some games don't benefit from a higher resolution due to how the engine works, so it can be a fool's errand to try to get rid of all aliasing in those titles; it's pretty situational.
 

Three

Member
Clarity is one of the most important things, if not the most important.

Edit: I don't have a problem with current tricks, and IMO games should target a lower-than-4K res; these consoles don't have the juice to do next gen and 4K. The refreshes probably will.

I just don't like the way he said it, like it's no big deal (clarity, that is) when it's huge.
Agreed when there is nothing else to gain, but since we are talking about tradeoffs, I would rather have stable fps (even 30) and much better visuals. I find increasing resolution for a small increase in clarity to be a huge waste of computing power.
 

Hunnybun

Member
It just depends on your screen size and distance to it (I'm assuming the TV is basically calibrated for accuracy). Higher resolution/better AA makes a huge difference for me on a 65" screen at ~2m; even 1800p is soft in comparison to native res. If you don't have a similar size-to-viewing-distance ratio, you might not notice the difference as much.

John said he can't see the difference between 1800p and 2160p on his LG OLED; I would like to know what his TV settings are and how close he is to the TV. I find it a big difference in clarity and really enjoy the extra detail. I also see a difference between 4K and 8K/16K (Dolphin emulation) in terms of aliasing and clarity; if I didn't, I wouldn't ever worry about it and would just play the games. Whether it's a meaningful difference is up to you, but you must start from the right place, i.e.:

* TV set up for an accurate image
* Screen to viewing distance ratio is appropriate for 4K

If you don't see the difference, then I don't see why you even want to talk about it with those of us who do; just go and enjoy the games (at higher framerates to boot). I get more enjoyment from a game when there is less aliasing/more clarity, as it lets me be more immersed in the world, (mostly) more so than if it's 60 fps at a lower resolution with more aliasing. Some games don't benefit from a higher resolution due to how the engine works, so it can be a fool's errand to try to get rid of all aliasing in those titles; it's pretty situational.

You might just have unusually good eyesight.

I have a similar set up to you and I really struggle to differentiate 1440p and 4k. It's there but it's a struggle. I genuinely can't tell the difference between native and reconstructed 4k.

Otoh 30fps now looks so ludicrously jerky to me that I find it almost laughable that anyone would suffer it for a bit of clarity.

Horses for courses I guess. But I do suspect that your sensitivity to resolution is more at the extreme end than mine to frame rate. E.g. I've never had any problems with my eyes.
 

Kuranghi

Member
Not as much as good color. I recently bought a Samsung G7, a 1440p VA monitor, supposedly one of the best. But I am taking it back for two big issues.

1. The curve is stupid. Curved monitors are stupid; it's horribly distorted.
2. VA is nowhere near as good a screen as an IPS panel. It's like night and day. If you haven't used IPS for PCs then it shouldn't bother you, but if you have mainly used IPS panels then VA looks like shit.

But what about the contrast difference between VA and IPS? The colours might look nicer on an IPS screen but the contrast is ~1000:1 when you could have ~5000:1 (or higher but mostly on high end TVs) on a really good VA. Will you be able to see the VA panel side by side with the IPS? I think you'll be surprised how much better the black level/depth of the image is on the VA vs IPS.
 

Kuranghi

Member
You might just have unusually good eyesight.

I have a similar set up to you and I really struggle to differentiate 1440p and 4k. It's there but it's a struggle. I genuinely can't tell the difference between native and reconstructed 4k.

Otoh 30fps now looks so ludicrously jerky to me that I find it almost laughable that anyone would suffer it for a bit of clarity.

Horses for courses I guess. But I do suspect that your sensitivity to resolution is more at the extreme end than mine to frame rate. E.g. I've never had any problems with my eyes.

I wear glasses, so I guess that means my eyesight might be clearer than most people's (with them on, of course). The 30fps motion-clarity thing is handled by the fact that my TV is an LCD with a longish response time, so 30fps doesn't look nearly as jerky as it does on an OLED or a fast-response-time monitor. If I were on an IPS/TN/OLED panel, I would probably favour framerate more to alleviate that problem.
 

Hunnybun

Member
I wear glasses, so I guess that means my eyesight might be clearer than most people's (with them on, of course). The 30fps motion-clarity thing is handled by the fact that my TV is an LCD with a longish response time, so 30fps doesn't look nearly as jerky as it does on an OLED or a fast-response-time monitor. If I were on an IPS/TN/OLED panel, I would probably favour framerate more to alleviate that problem.

Fair enough. Tbh I noticed the jerkiness on my old X900E when switching to 4k modes almost as much as on the OLED.

I'm happy as long as we continue to get the options, tbh. What I will say is that the resolution debate does seem to proceed as if trade-offs don't exist. A lot of people who defend high native resolutions seem to think it's sufficient that they can tell the difference. Of course you should be able to tell the difference: we're talking about roughly double the computing power here! That alone doesn't justify the cost.

Let's take the new R&C game as an example. DF said it was running at native 4K. I know there's some doubt about that, because they subsequently released a blog post that said dynamic resolution, but given that they have headroom to release a 60fps mode, it's probable that it's only minimally dynamic in order to retain a consistent frame rate.

Ok, so let's say it's pushing roughly 4x the pixels of the PS4 game. And let's say the PS5 is about 8x as powerful as the PS4. Very roughly, then, the other improvements to fidelity are being achieved with 2x the GPU power (I know it's more complicated than that, but just as an illustration). And those other improvements look like a massive, generational leap! With 2x the GPU power.

So we can infer with some confidence that 2x GPU power, holding other things constant, can provide a really significant boost to fidelity. That's roughly the same factor as going from 1440p to 4K. So it should follow that going from 1440p to 4K would have to deliver a HUGE increase in clarity to even begin to be worth sacrificing a near-generational leap in visuals.
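The back-of-the-envelope math in that post, using its own rough assumptions (8x total uplift, 4x pixels; these are the poster's estimates, not measured figures), works out like this:

```python
# The post's rough assumptions, not measured figures:
total_uplift = 8.0   # PS5 assumed ~8x PS4 GPU power
pixel_cost = 4.0     # 1080p -> 2160p is 4x the pixels
print(total_uplift / pixel_cost)  # 2.0: the "generational leap" came from ~2x

# Dropping from 4K to 1440p would free up a similar factor:
print((3840 * 2160) / (2560 * 1440))  # 2.25
```

That ~2.25x freed-up budget is the same order as the 2x that, on these assumptions, bought the visible generational leap, which is the crux of the argument.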
 

Topher

Gold Member
Yeah, there definitely seems to be a change in narrative now that Sony isn't the resolution king anymore. But I do agree that 1800p compared to 4K is less noticeable than 720p compared to 1080p.

Over the past 15 years, Sony has been "resolution king" for about four. Either way, Linneman talks about how resolutions have progressively declined in importance since the PS3/360 years.

john lewis lol GIF
 

RoboFu

One of the green rats
But what about the contrast difference between VA and IPS? The colours might look nicer on an IPS screen but the contrast is ~1000:1 when you could have ~5000:1 (or higher but mostly on high end TVs) on a really good VA. Will you be able to see the VA panel side by side with the IPS? I think you'll be surprised how much better the black level/depth of the image is on the VA vs IPS.

How good is contrast when it's just crushing your color range? I'd rather be able to see every pixel and every color than have slightly better yet crushed, darker blacks.
 
But what about the contrast difference between VA and IPS? The colours might look nicer on an IPS screen but the contrast is ~1000:1 when you could have ~5000:1 (or higher but mostly on high end TVs) on a really good VA. Will you be able to see the VA panel side by side with the IPS? I think you'll be surprised how much better the black level/depth of the image is on the VA vs IPS.
Sure... but VA panels have nasty black-level smearing. As soon as you find yourself in a dark scene, the entire screen turns into a blurry, ghosty mess in motion. It's really distracting, far more so than black not being quite as black as it could be, IMO, and no amount of overdrive will fix it; it's inherent to the technology.
 
I’m in the market for a new TV (32”) and I’m seriously considering a 1080p set because I prefer performance mode and have a boatload of PS4 games I have yet to play.

Am I missing out?
Hard to tell. 4K definitely has more fine detail (kind of like when "retina displays" appeared on phones), and God of War 2018 looks amazing at 60fps in resolution mode on the PS5, but on the other hand I only played it at 1080p on the PS4 Pro to get the boost in performance it offered.

But if you look at the PC side, gaming monitors are usually 1080p or 1440p so that frame rates can be kept high enough (as well as margins: you have to pay for all this gamer-specific design).

I think on a 32-inch TV you may as well go 1080p; depending on how far you sit, it may not matter much... but I would go 4K if it were for me.
 

SlimySnake

Flashless at the Golden Globes
Anyone who has played Batman: Arkham Knight and DriveClub on a large 4K screen can attest to the need for higher resolutions. There is shimmering and aliasing EVERYWHERE in those games, especially apparent in the open-world sections of Arkham Knight. It makes the games look like shit, quite frankly.

That said, native 4K is a waste of resources. 1440p with temporal AA is good enough. I'm playing Mass Effect Andromeda right now, and 1440p with TAA looks almost on par with native 4K with FXAA. Playing RDR2 at native 4K was like a religious experience on the X1X, but tbh I would've rather they did 1440p 60fps with TAA.

I think 1080p for current-gen or next-gen games is far too low. I have seen what it does to SOTC and GoW, and I have seen 1080p games running on my PC. The shimmering is really bad and you just don't get to see the great-looking assets.

Use 1440p as base and focus on increasing foliage, detail on character models, lighting and visual effects.
 

Heisenberg007

Gold Journalism
Graphical Fidelity* > Frame Rates* > Resolution.

I know it'll likely start a console war (please don't, because that's the only big example I can think of right now), but look at Demon's Souls and Halo Infinite. Demon's Souls was running at 1440p, while Halo was running at native 4K.

Demon's Souls looks infinitely better than Halo. Graphical fidelity, better particle effects, and higher-quality textures will always make a game look much better than one with native 4K but low fidelity.

* It could also be FR > GF > Resolution, based on personal preferences. But I feel resolution always comes third for most people.
 