
4K is overrated compared to 1440p.

rofif

Banned
Nah. 1440p looks like ancient shit without any AA help. It's only good as an internal res for DLSS to upscale to 4K; then it looks better than bare 4K, but not always better than 4K with good TAA.
1440p is such a small step up from 1080p that it's clearly just a transitional resolution.
It's barely OK at 27", and when we go bigger, like a 42" or 48" OLED, bare 1440p is not enough.

DLSS is the saviour here, or at least good AA, which is also needed at 4K; 4K itself is not good enough. 4K wasn't even enough without AA at 27"…
 
Last edited:
Monitors don't have very good scaling, so anything below native resolution gives a clearly worse picture unless you use DLSS or FSR. On TVs, however, 1440p upscaled still looks very sharp to my eyes (at least on my 55-inch Sony 4K LCD), so you can play new games at 1440p with a better framerate, and older games at 4K.
 

Fahdis

Member
Depends. If you can push higher graphical settings at 1440p than you can at 4K, then it can absolutely look better.

Please do tell me how, with both 1440p and 4K set to Ultra (let's not talk about Hz or FPS), the game looks better at 1440p? I think we are beyond the resolution argument, because resolution is king. Stick to graphical fidelity.
 

Klik

Member
I have 27" 1440p 165hz monitor and i tried 4k 60hz.

In my opinion, at 27"-28" pixel density is really high on 1440p that makes 4k kinda pointless and waste of resources.
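
For reference, the pixel-density math behind that (a quick sketch; the diagonals are nominal 16:9 sizes):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal pixel count over diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(f'27" 1440p: {ppi(2560, 1440, 27):.0f} PPI')  # ~109 PPI
print(f'27" 4K:    {ppi(3840, 2160, 27):.0f} PPI')  # ~163 PPI
```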

Even with the next Nvidia GPU, like the RTX 4080, I think 1440p/144Hz is the perfect choice.
 

Fahdis

Member
To play at 4K at first, but the FPS hit was too much.

What game(s) in particular are we talking about? You could literally downgrade to a 3070 Ti and have games play at 2K with high refresh rates (maybe not as fast as a 3090). The juice from your 3090 is going to waste. You can do whatever you like, but the logic for not downgrading isn't working for me at this point. I guess since you bought it, you're stuck with it.
 
Last edited:

OverHeat

« generous god »
What game(s) in particular are we talking about? You could literally downgrade to a 3070 Ti and have games play at 2K with high refresh rates (maybe not as fast as a 3090). The juice from your 3090 is going to waste. You can do whatever you like, but the logic isn't working for me. I guess since you bought it, you're stuck with it.
Cyberpunk, Halo Infinite, Hitman, etc.
 

RoboFu

One of the green rats
Please do tell me how, with both 1440p and 4K set to Ultra (let's not talk about Hz or FPS), the game looks better at 1440p? I think we are beyond the resolution argument, because resolution is king. Stick to graphical fidelity.
Read my post again
 

Fahdis

Member
Cyberpunk, Halo Infinite, Hitman, etc.

Yeah, Cyberpunk is definitely not going to be playable at the frame rates we want at 4K until the 4000 series; I can understand you there. 3090s chug on this game too. Halo Infinite and Hitman are well optimized for PC; you could have a good time with a 3080 as well.
 

8BiTw0LF

Banned
1440p is the sweet spot for 27", scientifically speaking. For work (graphics), 4K is the sweet spot for 27"-38".

Everything above 38" should be 8K to maintain retina status.
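
The usual "retina" criterion behind claims like this is roughly one pixel per arcminute of vision (about 60 pixels per degree). A rough sketch of the viewing distance beyond which a panel clears that bar, assuming nominal 16:9 diagonals:

```python
import math

def retina_distance_in(ppi: float, arcmin: float = 1.0) -> float:
    """Distance (inches) at which one pixel subtends `arcmin` arcminutes."""
    return (1.0 / ppi) / math.tan(math.radians(arcmin / 60.0))

for label, w, h, diag in [('27" 1440p', 2560, 1440, 27.0),
                          ('27" 4K', 3840, 2160, 27.0),
                          ('38" 4K', 3840, 2160, 38.0)]:
    ppi = math.hypot(w, h) / diag
    print(f"{label}: retina beyond ~{retina_distance_in(ppi):.0f} in")
    # ~32 in, ~21 in, ~30 in respectively
```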
 

Fahdis

Member
Read my post again

I don't think you read mine.

4K vs. 1440p
Both on Ultra settings
4K will have FPS issues (without DLSS or FSR) and 1440p will perform better.
The point I was trying to make was that 4K > 1440p in graphical fidelity, depending on your screen size.
1440p > 4K in FPS, but it will never look better than 4K assets. On OLEDs it is night and day.
 

rofif

Banned
There was a brief moment when I had both 27" 4K and 1080p IPS monitors.
4K at 27" with 200% scaling gives exactly the same icon size as 1080p at 100% on 27", so I did some comparisons. The 1080p Mirror's Edge shot is DOWNSCALED from 4K, just to show the raw panel difference in pure screen pixels.
So even a perfect, MSAA-clean 4K downscaled image will not overcome the physical pixel grid.
1440p would be a bit better than 1080p, but not by much. Shame I didn't grab comparisons.
Anyway, 4K at 48" is not as bad as I expected. I was really worried coming from 27" 4K, but I naturally lean back more in my chair, and TAA/DLSS techniques have gotten much better. Raw 4K would be an eyesore.
As for displaying 1080p on 4K? No problem. Nvidia has an integer scaler in the NVCP. The PS5 is doing something else: if I take a screenshot of Bloodborne it looks pixelated, as it is captured at 4K integer scaled, but it is not pixelated in real time, so I expect Sony is doing bilinear scaling.
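
The two scaling behaviours are easy to reproduce offline. A minimal sketch using Pillow; the filenames are placeholders:

```python
# Nearest-neighbour reproduces integer scaling: each 1080p pixel becomes
# a sharp 2x2 block at 4K. Bilinear blends neighbouring pixels, as the
# PS5 output appears to, which softens edges instead of showing blocks.
from PIL import Image

src = Image.open("bloodborne_1080p.png")  # hypothetical 1920x1080 capture
src.resize((3840, 2160), Image.NEAREST).save("scaled_integer.png")
src.resize((3840, 2160), Image.BILINEAR).save("scaled_bilinear.png")
```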

[comparison screenshots]
 

RoboFu

One of the green rats
I don't think you read mine.

4K vs. 1440p
Both on Ultra settings
4K will have FPS issues (without DLSS or FSR) and 1440p will perform better.
The point I was trying to make was that 4K > 1440p in graphical fidelity, depending on your screen size.
1440p > 4K in FPS, but it will never look better than 4K assets. On OLEDs it is night and day.
It's the "both on Ultra settings" part that goes against what I said.
 

Spukc

always chasing the next thrill
Bought a 1440p 240Hz monitor to go along with my almost-4K ultrawide monitor and LG CX OLED. And seriously, on a 27" monitor with a 3090, playing at a high framerate is a game changer. It looks crisp, but the main draw is the smoothness of it all; 240 fps Doom Eternal is pure sex!
1440p is overrated compared to ultrawide 3440x1440.
Everything is better in 21:9, including movies.
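
The movie point is mostly geometry: 2.39:1 "scope" films almost exactly fill 21:9, while a 16:9 screen loses about a quarter of its height to letterbox bars. A quick check:

```python
FILM = 2.39  # typical anamorphic widescreen aspect ratio

for name, ar in [("16:9", 16 / 9), ("21:9 (3440x1440)", 3440 / 1440)]:
    # For a film wider than the screen, the picture uses ar/FILM of the height.
    print(f"{name}: picture fills {min(ar / FILM, 1.0):.0%} of screen height")
    # 16:9 -> 74%, 21:9 -> 100%
```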
 

VFXVeteran

Banned
Bought a 1440p 240Hz monitor to go along with my almost-4K ultrawide monitor and LG CX OLED. And seriously, on a 27" monitor with a 3090, playing at a high framerate is a game changer. It looks crisp, but the main draw is the smoothness of it all; 240 fps Doom Eternal is pure sex!
Any game will look good at the screen's native resolution (in this case 1440p) because there is a 1:1 framebuffer-pixel-to-screen-pixel alignment. It's only when you use a higher-resolution display and the game has to upscale (internally) to that higher native res that things start to look worse. For example, running a game at 1080p on a 4K monitor/TV will look like crap compared to a 1080p game running on a 1080p TV.
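
The alignment argument is just arithmetic on the per-axis scale factor: fractional factors force the scaler to blend neighbouring pixels, while clean integer factors (the case rofif's integer-scaling post above covers) keep a sharp block mapping. A quick sketch:

```python
panels = {"1440p panel": 1440, "4K panel": 2160}
renders = {"1080p": 1080, "1440p": 1440}

for pname, ph in panels.items():
    for rname, rh in renders.items():
        if rh >= ph:
            continue  # native or downscale; no upscaling blur
        factor = ph / rh
        kind = ("integer -> sharp blocks" if factor.is_integer()
                else "fractional -> blended pixels")
        print(f"{rname} on {pname}: {factor:.3f}x per axis ({kind})")
# 1080p on 1440p panel: 1.333x (fractional)
# 1080p on 4K panel:    2.000x (integer)
# 1440p on 4K panel:    1.500x (fractional)
```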
 
Last edited:

Spukc

always chasing the next thrill
ekhm... cough cough...

N O P E NO BORDERS.

Cool hack tho if you get the LG OLED 42 C2, but not the same...
I will prolly end up with a C3 if LG manages to produce 32-inch OLED screens at scale; 4K 32-inch is the sweet spot IMO.
But disabling screen real estate for fake 21:9? Nah.

I would legit buy a 4:3 OLED monitor.
 
Last edited:
Native resolution is irrelevant at this point; reconstructing to 4K from a lower resolution is what makes the most sense now (assuming we are talking about a 4K TV).

There was a brief moment when I had both 27" 4K and 1080p IPS monitors.
4K at 27" with 200% scaling gives exactly the same icon size as 1080p at 100% on 27", so I did some comparisons. The 1080p Mirror's Edge shot is DOWNSCALED from 4K, just to show the raw panel difference in pure screen pixels.
So even a perfect, MSAA-clean 4K downscaled image will not overcome the physical pixel grid.
1440p would be a bit better than 1080p, but not by much. Shame I didn't grab comparisons.
Anyway, 4K at 48" is not as bad as I expected. I was really worried coming from 27" 4K, but I naturally lean back more in my chair, and TAA/DLSS techniques have gotten much better. Raw 4K would be an eyesore.
As for displaying 1080p on 4K? No problem. Nvidia has an integer scaler in the NVCP. The PS5 is doing something else: if I take a screenshot of Bloodborne it looks pixelated, as it is captured at 4K integer scaled, but it is not pixelated in real time, so I expect Sony is doing bilinear scaling.

[comparison screenshots]
Notepad in 4K is truly next-level stuff, a completely different experience, especially if you are hugging the screen.
 
Last edited:

rofif

Banned
N O P E NO BORDERS.

Cool hack tho if you get the LG OLED 42 C2, but not the same...
I will prolly end up with a C3 if LG manages to produce 32-inch OLED screens at scale; 4K 32-inch is the sweet spot IMO.
But disabling screen real estate for fake 21:9? Nah.

I would legit buy a 4:3 OLED monitor.
Yeah, I don't like it either. I prefer to use the full 16:9 aspect ratio.
But 21:9 on an OLED works, and you cannot see the disabled parts at night at all.
 

JohnnyFootball

GerAlt-Right. Ciriously.
Bought a 1440p 240Hz monitor to go along with my almost-4K ultrawide monitor and LG CX OLED. And seriously, on a 27" monitor with a 3090, playing at a high framerate is a game changer. It looks crisp, but the main draw is the smoothness of it all; 240 fps Doom Eternal is pure sex!
I found it to be quite the opposite. I bought a 240Hz monitor myself (after owning an LG C1 with my 3080 hooked up to it), but very few games aside from much older ones could hit 240 fps. In the few games that could (Quake 2 XP), I found the experience on my C1 at 120 fps more enjoyable due to the better contrast. Both experiences were very responsive.

Let me be clear, I still have fun playing on my Gigabyte 240Hz, mind you; my OLED setup is used more for controller games, while my 240Hz display is for keyboard-and-mouse games. The contrast difference is striking, and that is a factor for me.
 
Last edited:

Ironbunny

Member
I'd take 32" 4k 144hz over 27" 1440p 240hz any day. 1440p is okay but noway as crisp as 4k. 4k on a 27" is a waste though.
 

Hoddi

Member
I have a dual-monitor setup with 27" 1440p/165Hz and 4K/144Hz displays. I found the upgrade from 1440p to 4K a much larger improvement than from 1080p to 1440p. It was a huge bump in clarity even at just 27".
 

Leonidas

Member
As someone who upgraded from 1440p to 4K five years ago, I just can't see myself giving up the 2.25x pixel density for desktop use.
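
The 2.25x figure is just the pixel-count ratio:

```python
uhd = 3840 * 2160  # 8,294,400 pixels
qhd = 2560 * 1440  # 3,686,400 pixels
print(uhd / qhd)   # 2.25
```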

4K240 monitors launch this year. That will probably be my next monitor upgrade.
 

Topher

Gold Member
As someone who upgraded from 1440p to 4K five years ago, I just can't see myself giving up the 2.25x pixel density for desktop use.

4K240 monitors launch this year. That will probably be my next monitor upgrade.

Where do these high refresh rates become important? I typically adjust settings for 80-100Hz. Anything above that and I'm not seeing improvement. Just my eyes, maybe. Or are there other factors for competitive multiplayer (input latency?) that don't really apply to me, since I only play single-player games? Just trying to understand.
 
Last edited:

MadPanda

Banned
You haven't explained why 4K is overrated; you merely stated you have a preference for high refresh rates. I don't see the point in playing single-player games at more than 60 fps, so I'll always prefer 4K/60 for SP over 1440p/120 or whatever else.
 

Roberts

Member
There is a game called The Vanishing of Ethan Carter where you can switch resolution instantly, so it's a good way to test the difference between 1440p and 4K. On a 65-inch OLED the difference is super noticeable.
 
Where do these high refresh rates become important? I typically adjust settings for 80-100Hz. Anything above that and I'm not seeing improvement. Just my eyes, maybe. Or are there other factors for competitive multiplayer (input latency?) that don't really apply to me, since I only play single-player games? Just trying to understand.

Yeah, 80 is the minimum for me, and anything above 90~100 is difficult to discern. Of course, that's playing on a VA panel (AOC Q27G2U) with typically slow VA pixel transitions, so that probably affects things. I'm sure IPS, and especially OLED with its better motion clarity, would make higher rates more discernible.

E: actually, I think it's important to say too that the control device matters. A mouse needs a higher refresh rate, but a controller feels fine at 60 to me.
 
Last edited:

Filben

Member
4K requires too much hardware power for its benefits over 1440p, for my taste; it's fairly inefficient. You only get diminishing returns with it.

While 1440p over 1080p adds many more details, the jump from 1440p to 2160p isn't that big.

Viewing distance and eyesight also play a big role. Also, you still need AA at 4K, because pixel count alone won't remove jagged edges and/or shimmering.
 

DukeNukem00

Banned
Where do these high refresh rates become important? I typically adjust settings for 80-100Hz. Anything above that and I'm not seeing improvement. Just my eyes, maybe. Or are there other factors for competitive multiplayer (input latency?) that don't really apply to me, since I only play single-player games? Just trying to understand.

High refresh rate was never about visuals. It's about the way you interact with the game: input latency, feel, responsiveness. Especially when you play on PC with a 1:1 input tool like a mouse, every action you perform is silky smooth, and every game responds to your commands as if you're playing Quake 3.
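
The responsiveness gain is easy to put numbers on; the worst-case wait for the next frame shrinks fast as refresh climbs:

```python
# Time between frames at each refresh rate (one frame of display latency).
for hz in (60, 120, 165, 240):
    print(f"{hz:3d} Hz -> {1000 / hz:5.1f} ms per frame")
# 60 Hz -> 16.7 ms, 120 Hz -> 8.3 ms, 165 Hz -> 6.1 ms, 240 Hz -> 4.2 ms
```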
 

jaysius

Banned
1080p still looks great if you have a high-tier 1080p TV with quantum dots, sit 7-10 feet away, AND have it properly calibrated with no crushed blacks and no shitty super-contrast features enabled.

Placebo is a helluva drug.

Also, if you buy a $500 TV you're going to get shit 4K upscaling, and shit at any resolution.
 
Last edited:

Topher

Gold Member
High refresh rate was never about visuals. It's about the way you interact with the game: input latency, feel, responsiveness. Especially when you play on PC with a 1:1 input tool like a mouse, every action you perform is silky smooth, and every game responds to your commands as if you're playing Quake 3.

It is also about visuals though. You don't even have to play a game to see the visual difference.

30 fps vs 60 fps: [embedded comparison video]

But I think I understand your point about input types and how they can play a part in conjunction with frame rate.
 

Bo_Hazem

Banned
Bought a 1440p 240Hz monitor to go along with my almost-4K ultrawide monitor and LG CX OLED. And seriously, on a 27" monitor with a 3090, playing at a high framerate is a game changer. It looks crisp, but the main draw is the smoothness of it all; 240 fps Doom Eternal is pure sex!

You should avoid the internet when you're still not sober, mate.
 