
DF asks: Does Resolution Matter?

Topher

Gold Member
Video timestamped to the resolution discussion:


I didn't see this posted yet so....

I jotted down some notes, but the overall conclusion I take from this is that too much is made of resolution, especially the focus on the lower bound of DRS (there's a rough sketch of how DRS behaves after the notes).

Notes:

Leadbetter questions the assumed relationship between image quality and pixel counts. They are not the same thing.

Focusing on the lowest bound of DRS gives an inaccurate picture.

Linneman: Pixel count only became important with flat panels. In the early days of DF, pixel counts were being compared in the 720p range, and the differences were significant. The perceptual difference back then was massive compared to, say, 4K vs 1800p today.

XB1 and PS4: sub-1080p was noticeable, but less so than in the prior generation thanks to better AA.

Once we get to 4K at a normal distance, it becomes much more difficult to perceive pixel differences. To see the difference, footage must be captured and compared frame by frame on a PC. Pixel counts are not as important anymore.

Tom: Mortal Shell. Even XSX vs XSS versions look similar side by side.

Leadbetter: Clarity is the only gain from higher native resolution.

Linneman: Performance drops are infinitely more noticeable than a slip in DRS.

That's a quick summary, but I think it gets the main points.
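
To make the DRS point concrete, here's a rough sketch (my own illustration, not from the video) of how a dynamic-resolution loop behaves: the engine nudges render resolution up and down against a frame-time budget, so the advertised lower bound is a worst-case floor it only hits in the heaviest scenes, not where the game spends most of its time. All names and numbers below are made up for illustration.

```python
import math

# Simplified DRS feedback loop: adjust render scale against a frame-time
# budget. Real engines predict GPU load and filter the signal, but the
# shape is the same. All values here are illustrative.
TARGET_MS = 16.6                   # 60 fps budget
MIN_SCALE, MAX_SCALE = 0.5, 1.0    # e.g. 1920x1080 .. 3840x2160 output

def update_scale(scale, gpu_ms):
    if gpu_ms > TARGET_MS:
        # Over budget: frame time scales roughly with pixel count,
        # so adjust the per-axis scale by the square root.
        scale *= math.sqrt(TARGET_MS / gpu_ms)
    else:
        scale *= 1.02              # headroom: creep back up slowly
    return max(MIN_SCALE, min(MAX_SCALE, scale))

scale = 1.0
for gpu_ms in [14.0, 15.5, 19.0, 22.0, 17.0, 15.0, 14.5]:  # a heavy spike
    scale = update_scale(scale, gpu_ms)
    print(f"render at {int(3840 * scale)}x{int(2160 * scale)}")
```

The floor (MIN_SCALE) only matters during the spike; the rest of the time the game sits near the top of the window, which is exactly the point about obsessing over the lower bound.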
 
Last edited:

Esca

Member
Leadbetter: Clarity is the only gain from higher native resolution.
Clarity is one of, if not the, most important things.

Edit: I don't have a problem with the current tricks, and imo games should target a lower-than-4K res; these consoles don't have the juice to do next-gen visuals and 4K at once. The mid-gen refreshes probably will.

I just don't like the way he said it, like clarity is no big deal, when it is huge.
 
Last edited:

jroc74

Phone reception is more important to me than human rights
Now I sit and wait for the folks that disagree.

DF, NXGamer basically said the same thing, just in different ways.

This video: "there isn't a linear relationship between resolution and image quality." It explains that resolution scaling and reconstruction are now so good... and uses Series S vs Series X as an example.

NXGamer: "it's not the resolution but the quality of the resolution that matters"

This is DF just reiterating something they stated a few years ago, and explaining why... yet again...

The first 2 posts: Achkutally......

High res but lower-quality textures and shadows, streaming issues, pop-in, etc. ... high res matters more?
 
Last edited:
This is my personal opinion, but I can't tell the difference between 1440p and above at a normal seating distance.

I spent a lot of time messing with Forza Horizon 4 on my PC. I couldn't tell a difference between 1440p/1800p/2160p at my seating distance (1-2.5m from a 50" 4K TV), so I've left it at 1440p to run Ultra at 60. Now, if I went up close to my TV and stared at a static image, then yeah, I would notice. Hell, I can easily play at 1080p from that distance and it looks fine. When it comes to consoles, I can't tell a difference between checkerboard and native either (toy illustration of why below).

I play games, not stare at 400% static zooms.
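
Since checkerboard keeps coming up in these comparisons, here's a toy illustration (mine, not DF's) of why reconstruction gets so close to native: each frame renders only half the pixels in a checkerboard pattern, and the gaps are filled from the previous frame, so two frames together cover the full grid. This sketch ignores motion vectors, which real implementations use to reproject the old half when things move.

```python
import numpy as np

# Toy checkerboard reconstruction on a static scene. Real implementations
# reproject the previous frame with per-pixel motion vectors; for a still
# image the merge is exact, which is the intuition for why it can look
# so close to native.
H, W = 4, 8
full = np.arange(H * W).reshape(H, W)        # stand-in "ground truth" image
mask = np.indices((H, W)).sum(axis=0) % 2    # checkerboard of 0s and 1s

prev = np.where(mask == 1, full, 0)          # frame N-1 renders odd cells
curr = np.where(mask == 0, full, 0)          # frame N renders even cells
merged = np.where(mask == 0, curr, prev)     # display: combine both halves

assert (merged == full).all()                # static scene: perfect merge
```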
 

Thirty7ven

Banned
Was finding it weird how they were downplaying it, since that hasn’t been the theme ever since the 1X came out. Then I was wondering why, as I heard them go on... about the Series S, and then it was like Ooooooooohhhhhh. I see, the Series S, of course! Stupid little me. Duh
 
Last edited:
Yes, resolution matters. But we are reaching diminishing returns here.

480i to 720p was a massive jump in clarity.

720p to 1080p was large and very noticeable.

1080p to 1440p is quite noticeable, and we are seeing that many gamers are quite happy to sit and stay at this resolution. Now they prefer higher refresh rates.

1440p to 4K is a welcome refinement, but it is also fairly dependent on screen size and how close you sit.

4K to 8K, well... I'm sure there is a difference, especially with massive TVs in the 85+ inch range. And for VR I can easily imagine 8K+, along with foveated rendering.


But as far as sitting in front of a flat TV goes, we are nearly at a point where resolution increases won't matter like they used to. Console gaming should focus on a locked 60fps in most games. If you can lock 60fps and have a variable resolution between 1440p and 4K, I think that's the "gud enuff" range for the vast majority of people.
 

GreenAlien

Member
Once we get to 4K at a normal distance, it becomes much more difficult to perceive pixel differences. To see the difference, footage must be captured and compared frame by frame on a PC. Pixel counts are not as important anymore.
Doesn't that depend entirely on how big the screen is?
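
Largely, yes: what actually matters is pixels per degree of your field of view, which folds screen size, resolution, and viewing distance into one number. A rough back-of-the-envelope (the ~60 pixels-per-degree figure for 20/20 acuity is a common rule of thumb, not something from the video):

```python
import math

def pixels_per_degree(h_pixels, screen_width_m, distance_m):
    # Angular pixel density at the viewer's eye.
    fov_deg = math.degrees(2 * math.atan(screen_width_m / (2 * distance_m)))
    return h_pixels / fov_deg

# A 50" 16:9 TV is ~1.11 m wide; 20/20 vision resolves roughly 60 px/degree.
for name, px in [("4K", 3840), ("1440p", 2560), ("1080p", 1920)]:
    print(name, round(pixels_per_degree(px, 1.11, 2.0)), "px/deg at 2 m")
```

All three land at or above ~60 px/degree from 2 m on a 50" set, which lines up with posters here not spotting the difference from the couch; a bigger screen or a shorter distance pushes the lower resolutions under the threshold.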
 

Topher

Gold Member
He's right that clarity is the difference with 4K. But that's quite noticeable, to be fair. I can easily tell when a game isn't native 4K because it doesn't look as clear everywhere, not just in some scenarios.

But for me, at least, lower frame rates are also noticeable everywhere. Last gen became such a chore for me to play on console as everything was 30fps. I'm thankful this gen nearly every game has a performance mode. Guess it comes down to personal preferences. I'll sacrifice that bit of clarity for frame rate. Glad we have options.
 
This is just my experience. I game on a monitor, so I really don't notice a difference between 1800p and 4K. 1440p to 4K is noticeable, but I really have to look hard to see the difference. Most of the time I stick with 1440p due to the rock-solid framerate it gives me. If I increase my settings to 4K, the more unstable framerate isn't worth the slight increase in clarity (on my monitor). With more modern games it means gaming at 30fps instead of 60fps.

However, if I had a massive TV my opinion could change. But that's not the case for me at the moment. I'm actually more interested in upgrading to a 4K HDR monitor than buying a new TV. A dual-monitor setup would be great for my MMO gaming.
 

Mohonky

Member
Still gaming @ 2560x1080. When I bought the setup, I would have had to double my GPU and monitor budget to go a resolution higher, so I didn't. Having gamed at my mate's @ 4K on his console/TV, I still prefer the 21:9 res even if it's lower. Ideally I would go higher-res 21:9, but for me, aspect ratio > res.
 

Shmunter

Member
IQ is the great equaliser. Shimmer and jaggies are distracting. Thankfully those have been pretty much eliminated these days.

Add post-processing like fog, DOF, motion blur, etc., and the only way to determine resolution is during specific moments, like the first frame after a scene change.

The best looking games produce a naturally organic picture, not sharp lines and clinical unnatural rendering.

As long as you’re high enough, going higher is a waste of resources.
 

Tchu-Espresso

likes mayo on everthing and can't dance
Kudos to Richard for pointing out the elephant in the room: At normal viewing distances, the differences in resolution at the upper end become very hard to perceive.

Devs should focus on improving visual fidelity instead of just resolution but unfortunately that’s harder on the development budget.
 
Last edited:
Leadbetter: Clarity is the only gain from higher native resolution.

Linneman: Performance drops are infinitely more noticeable than a slip in DRS.

That's a quick summary, but I think it gets the main points.
These are very reasonable things to say.

They should make a video showing each of these points and what they could, or do, mean in practice.

We are far from the sub-720p/1080p games that pretty much constantly ran below 30fps (or barely touched 60 when they targeted it). Now we have actual next-gen titles that run at 1440p+ at a locked, or very close to locked, 60fps, with cartridge-like load times... And they look great.

I mean, beyond much-improved RT performance (we could have all lighting, shadows, indirect lighting, and reflections ray traced all the time), what could we ask for that would truly improve the quality of the graphics in games released on these consoles?
 

Astral Dog

Member
It matters a lot, but honestly the push to fill those 4K pixels on your new expensive TV just isn't worth it imo. A 1080p TV would still produce beautiful visuals, with even more impressive graphics and framerates. 4K is a marketing number, though.

The Switch suffers from SD sub-native-res games and an overall lack of power when playing on a TV. Again, 1080p would be just fine, but the push for 4K TVs to become the standard will leave the poor Switch behind again, and everybody will complain.

When companies push 8K to become the standard, it will still be a waste imo, and will make game downloads and install sizes even more inconvenient.
 

Topher

Gold Member
It's amazing how this was in the clip.

(at least somebody was watching)

[awkward Bill Murray GIF]
[Jesse Pinkman Reaction GIF by Breaking Bad]
 

01011001

Banned
There are a lot of tricks to make the image quality better regardless of resolution.

you can’t fake framerates though, so personally I’ll take higher frames.

technically you absolutely can fake framerates... it just gives you unbearable input lag ;)

and then there is what VR headsets do: they shift the image a bit whenever you move your head, at a higher refresh rate than most games run at. this lessens the perceived delay in the game's response to your movement and can help with motion sickness.
PSVR does that for many games: the game will typically run at 60fps, but the headset updates your view at 90Hz or 120Hz (depending on the game). (rough sketch below)
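
The general technique is usually called reprojection (or "timewarp"); here's a minimal sketch of the rotation-only idea, with made-up names and numbers, not any real SDK's API:

```python
# The game renders at 60 Hz, but the display refreshes at 120 Hz; on the
# refreshes with no new frame, the compositor re-presents the last frame
# shifted by however far your head has turned since it was rendered.
RENDER_HZ, DISPLAY_HZ = 60, 120

def reproject(frame_yaw_deg, current_yaw_deg, ppd=20):
    # Rotation-only reprojection: convert the yaw change since render
    # time into a horizontal pixel shift (ppd = pixels per degree).
    return (current_yaw_deg - frame_yaw_deg) * ppd

head_yaw = 0.0
frame_yaw = 0.0
for tick in range(8):                          # eight 120 Hz refreshes
    head_yaw += 0.5                            # head keeps turning
    if tick % (DISPLAY_HZ // RENDER_HZ) == 0:  # a new 60 Hz frame lands
        frame_yaw = head_yaw
    print(f"refresh {tick}: shift image {reproject(frame_yaw, head_yaw):.0f}px")
```

Every other refresh shows the old frame nudged over by the head-pose delta, which is why the view feels like 120Hz even though the game only produces 60 new images a second.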
 
Last edited:

dottme

Member
NXGamer is kind of right, I think. More is better, but the ROI is quite limited above a certain resolution. I would prefer they focus on pushing more stable frame rates and more detail than on doing 4K.
But for a console manufacturer, it's easier to say: look, we do true 4K, so we're better.
 

Kagey K

Banned
Once actual next-gen games come out we will start seeing the differences; this period of limbo between generations proves very little about where next gen is actually going.

Assuming we ever cross this hurdle. Right now everyone seems scared to actually take the leap.
 
Last edited:

Mokus

Member
1080p pixel density is quite acceptable overall. At close range artifacts are almost unnoticeable; the issue is when things are in the distance and every object edge and polygon edge shimmers. This is most noticeable in open-world games. For me, the shimmering at 1080p is still too distracting.

1440p pixel density is where things start to look good most of the time, even on distant objects. There is still shimmering, but an edge has to be very close to horizontal or vertical, and at a certain distance. To my eyes, 1440p is where going higher stops being crucial. Jumping from 1440p to 4K is noticeable, but considering that on consoles the choice is going to be 1440p/60fps vs 4K/30fps, I'll take the higher frame rate without much thought (and the 4K is often reconstructed anyway).
 

Sejan

Member
Honestly, I think a lot of games would be better on the PS5/XSX at 1080p with a perfectly smooth frame rate and better effects/render distance/AA/etc. Console gamers often sit far enough from the screen that the difference in resolution makes little true difference. I think for the average user, 4K is little more than a meaningless buzzword. I know that I'd likely sacrifice the resolution for smoother gameplay if given the choice.
 

ZehDon

Gold Member
Yeah, for all the talk of diminishing returns in the past, I think we're actually starting to see it, for once, with resolution. 900p on the Xbone vs 1080p on the PS4 felt like a world of difference, especially on a decent-sized TV. Jumping to the Series X, I find the difference between 1800p and 4K hard to spot on my 4K TV, frankly.

In PC land, I found the jump from 1080p 16:9 to 1440p 21:9 to be massive because of the increased horizontal FOV. And so Cyberpunk at 1440p ultrawide looks vastly more impressive than Cyberpunk at 4K 16:9. I think offering the console gamer a choice between 4K and 60fps is the right way to go, because even at 60fps the resolutions are usually pretty acceptable.
 