
[DF] What Graphics Card Do You Really Need for 4K PC Gaming?

jonnyXx

Member
...because when you've spent $1500 on a GPU, lowering settings kind of defeats the point of spending $1500 on a GPU.
I think ultra settings are not worth it and you can sometimes gain a lot of performance with minimal visual loss by turning some settings down. Having said that, it fucking sucks that an $1800+ (CDN) GPU is still not enough to brute force ultra settings. That's a lot of money.
 

ZywyPL

Banned
I'm such a huge fan of cheating 4K. Checkerboard rendering, DLSS, etc. Even better when a game has a resolution scaler and you can run the game at 90%, for example. It really helps with my 2080.

I know people want that native image, but for me it honestly isn't worth it. I'm glad that on PC you have options to play games how you like.

I'm happy with my 2080. In some games I can get 4K60, but in most games I'd rather keep the graphical fidelity at the expense of some image quality, and I'm OK with that.

I'm in the opposite camp, I prefer to tone down the settings to maintain native resolution at all costs - a blurry/soft image is easily noticeable when you have the display right in front of your face, as opposed to consoles/TVs, and it just kills the entire image quality too much for me.

I've seen DLSS 2.0 in Control though, and I see no reason not to use the tech whenever possible. It really is a game-changer in generating computer graphics, and I hope the tech becomes more and more utilized in the upcoming years.
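(For a rough sense of the pixel math behind "cheated" 4K, here is a minimal, purely illustrative sketch in Python. The per-axis DLSS factors used below are the commonly cited Quality/Performance internal resolutions, and real GPU cost does not track pixel count exactly.)

```python
# Back-of-the-envelope pixel math for "cheated" 4K: resolution scaling and
# DLSS-style internal resolutions. Percentages are per-axis scale factors,
# so a 90% scaler only shades ~81% of the pixels of native 4K.
NATIVE_4K = (3840, 2160)

def rendered_pixels(scale, base=NATIVE_4K):
    """Pixels actually shaded at a given per-axis scale factor."""
    w, h = base
    return int(w * scale) * int(h * scale)

native = rendered_pixels(1.0)
for label, scale in [("native 4K", 1.00),
                     ("90% resolution scale", 0.90),
                     ("DLSS Quality (~67% per axis)", 2 / 3),
                     ("DLSS Performance (50% per axis)", 0.50)]:
    px = rendered_pixels(scale)
    print(f"{label:32s} {px / 1e6:5.2f} MP  ({px / native:.0%} of native)")
```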
 

MoreJRPG

Suffers from extreme PDS
The sad part is that people expect $500 consoles to push 4K/30 (or even 60fps) while much pricier PCs struggle to do so.

Apples to oranges. Series X will routinely hit 4K/60fps, PS5 should as well for the most part aside from first party.
 
For me, max settings is the whole reason I do PC gaming. I want the best the game can give. If you don't care, that's fine, but I am waiting for cards that can run 4K 60fps max settings, and until that happens I'm staying at 1080p 60fps... max settings.
There is never going to be a card that will run all current games at that level of performance for any significant length of time. Cards get faster, but games and players get more demanding too. If a 2080 Ti is unable to hit 60 FPS at 4k in games designed for current gen consoles, what do you think is going to happen once next gen is out? Even if you're going to spend $1500-2000 on a 3080 Ti, you're gonna be running into games that will cause you trouble sooner rather than later.
 

Pizdetz

Banned
I think everything is pointing to the reality that the graphical jump of "next gen" is going to be rather small compared to before.
Xbox is even saying they want to run games at 120 FPS, and high framerate requires enormous horsepower, especially at high resolution.
I doubt we'll see much better looking games than Gears 5 for quite some time.
 

Thaedolus

Gold Member
My take:

4k is overrated and unnecessarily taxing

Framerate is much more important

Every game has different graphical settings which may or may not really affect what you see significantly but can be very taxing, so whether or not something can be run with everything on “high” or “ultra” probably doesn’t matter.

Build a rig that can do what you want it to do instead of worrying about hitting some arbitrary benchmark. 1440p is fine by me. 1080p is too if I want to output to my comfy couch. Whatever settings I need to hit >60Hz at those resolutions probably still look great, because I still play my NES on a CRT and that's fine too.
 

johntown

Banned
Then you're a damn fool, because the difference between high and ultra is very thin, with a decent performance bump for toggling (like a proper PC player, imagine that, having decent choices that affect performance) properly betwixt the two.

I can run most games at 4K60 on my 1080 Ti and they look beautiful even with reduced settings. Even more so on my main monitor, which is 1440p/144Hz.

I was like you until I realized I chased upgrades too much for minimal gains.
I'm not saying that it won't look good, but what you consider looking good and what I do are obviously two different things. Maybe it is my OCD, or maybe I can actually tell the difference? Your statement is not universal either: in some games you can notice a big difference between high and ultra, and in others you cannot notice much. It depends on the game.

If we want to get super technical here, unless you have a large TV as your PC screen it might even be hard to really notice much of a difference between 1080p and 4K.

When I go to 4K I want it to be worth it, not just "well, it's good enough."
 

Ulysses 31

Member
Apples to oranges. Series X will routinely hit 4K/60fps, PS5 should as well for the most part aside from first party.
Yeah right, the UE5 demo hit neither 4K nor 60fps. If PS5/XSX games hit 4K/60fps it will be because graphical details will be scaled back considerably compared to max PC settings.
 

johntown

Banned
There is never going to be a card that will run all current games at that level of performance for any significant length of time. Cards get faster, but games and players get more demanding too. If a 2080 Ti is unable to hit 60 FPS at 4k in games designed for current gen consoles, what do you think is going to happen once next gen is out? Even if you're going to spend $1500-2000 on a 3080 Ti, you're gonna be running into games that will cause you trouble sooner rather than later.
Your argument is actually valid and I agree. I want to be able to run the majority of games at 4K 60 max settings. I realize that newer games will always continue to push that. If what NVIDIA claims is true (a 40% boost, I believe), then once the 3080 Ti is released that will be enough for me to make the jump to 4K.
 

GametimeUK

Member
I'm in the opposite camp, I prefer to tone down the settings to maintain native resolution at all costs - a blurry/soft image is easily noticeable when you have the display right in front of your face, as opposed to consoles/TVs, and it just kills the entire image quality too much for me.

I've seen DLSS 2.0 in Control though, and I see no reason not to use the tech whenever possible. It really is a game-changer in generating computer graphics, and I hope the tech becomes more and more utilized in the upcoming years.

Yes, of course. I forgot to mention that when I'm playing at 4K and lowering the resolution or cheating it, I'm actually playing on a TV at a distance, as opposed to a monitor. On my monitor I stick to native res (which is 1440p), but with my GPU and G-Sync I can pretty much just set it and forget it. I still think I'd rather scale back to 1080p at minimum. But these options are the beauty of PC, right? We are all more sensitive to certain visual features than others.

And yes DLSS is a massive game changer and I can't wait to see more games implement it. I've been praying for it to be used in Hitman and upgraded in Tomb Raider. :(
 

Allandor

Member
The sad part is that people expect $500 consoles to push 4K/30 (or even 60fps) while much pricier PCs struggle to do so.
The difference is, console games are more or less optimized for that system. So the developer has 4K/60 (or something else) as a target in mind and will work toward making it possible.
On the PC, you can choose from many, many options for how you want to render the game. E.g. ultra textures do not add much image quality but cost memory (and bandwidth).

But raising resolution alone is the worst thing you can do to improve image quality (worst from an efficiency standpoint). Then again, of all the options, simply scaling resolution is the easiest one for the developer. Asset quality, art style and other things are way more important for improving quality.

In the past we had the same discussion with 1080p, and with every other resolution bump.
Half-Life at 1600x1200 was once great. You can even run it at 8K without a problem, but the image quality is no longer great at all ;)
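(To put some numbers on that, here is a quick pixel-count comparison in Python. This is just arithmetic, not a performance model; GPU cost does not scale perfectly with pixel count, but it is a decent first-order guide.)

```python
# Raw pixel counts across the resolutions mentioned above. Shading cost grows
# roughly with pixel count, which is why each resolution bump is so expensive
# relative to the image-quality gain it buys.
resolutions = {
    "1600x1200": (1600, 1200),
    "1080p":     (1920, 1080),
    "1440p":     (2560, 1440),
    "4K":        (3840, 2160),
    "8K":        (7680, 4320),
}

base_px = 1920 * 1080
for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name:10s} {px / 1e6:6.2f} MP  ({px / base_px:.2f}x 1080p)")
```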
 

diffusionx

Gold Member
I have a PC with a 1080 Ti hooked up to a 4K TV, and no, I can't hit 4K/60fps consistently, except in older games (The Witcher 3 gets there, for example). Thing is, I think the PS4 Pro and Xbox One X showed that there is a lot of room between 1080p and 4K where you can get great looking games. I'm honestly not sure I could tell the difference between "1800p" and true 4K. Games like Horizon Zero Dawn with CBR look fantastic. And now, with DLSS 2.0 having come so far, I don't think native 4K will be a priority for PC GPUs.

PC games have started to give people more flexibility for those in-between resolutions, but it should be the norm.

The difference is, console games are more or less optimized for that system. So the developer has 4K/60 (or something else) as a target in mind and will work toward making it possible.
On the PC, you can choose from many, many options for how you want to render the game. E.g. ultra textures do not add much image quality but cost memory (and bandwidth).

We see this in RDR2. Xbox One X runs the game at full 4K and 30fps. But a few settings are actually lower than the PC low setting to get there.

If PC gamers are willing to turn down settings, they can get whatever settings they want, they just have to accept reality instead of just saying the game is "bloated" or "unoptimized."
 

T-Cake

Member
If PS5/XSX games hit 4K/60fps it will be because graphical details will be scaled back considerably compared to max PC settings.

And without Digital Foundry pointing out the differences, I bet hardly anyone would be any the wiser!
 

Ulysses 31

Member
Seriously, can you tell the difference between HBAO+ and SSAO, or between 8x anisotropic filtering and 16x? 😂
If I stand still and scrutinize between settings sure, but I rarely do that. I mainly look at the fps performance each setting gives me.
 

T-Cake

Member
If I stand still and scrutinize between settings sure, but I rarely do that. I mainly look at the fps performance each setting gives me.

Yes, though I find fiddling with all those settings rarely makes the FPS budge (game dependent, of course). It's only when I change resolution that something happens. 😆
 

Tqaulity

Member
Stop.
Obsessing.
Over.
Max.
Settings.

Seriously. People need to stop caring about what some text in a menu says. Bumping select options down can often massively improve performance without any noticeable degradation in image quality. For those of us still on mid-range Pascal cards like the GTX 1070 this is a must, but even for those running a 2080 Ti, the benefits of ditching your ultra fetish are there.
PREACH! I'm so tired of this as well. Like ultra/max is the only way to play a game :messenger_face_steam:
 

Tqaulity

Member
The sad part is that people expect $500 consoles to push 4K/30 (or even 60fps) while much pricier PCs struggle to do so.
Because... consoles will not target the unoptimized ultra PC settings for the most part. Of course you can hit 4K/60fps on a PC GPU as well if you just turn down some settings. An RTX 2070 would do just fine for 4K at less-than-ultra settings. Also, remember that the Xbox One X (significantly weaker than the next gen consoles) has its fair share of native 4K games at 30-60fps already. So yeah, not at all far-fetched.
 

MetalRain

Member
I really like the 4K image. 1440p is fine and you will probably reach 60fps in most games, but I'd rather have 4K with reduced settings.
 

Ellery

Member
When the next big GPU launches (RTX 3080, Big Navi RDNA2), people will praise it as a TRUE 4K GAMING GPU. And it will be, at that time (maybe not Cyberpunk 2077 at 4K with RTX maxed out).
Less than a year later, with new next gen big AAA games coming out, it will struggle again.
The cycle repeats. At some point we will have it, but people also thought the R9 290X, Titan, Fury X and 1080 Ti were 4K gaming GPUs. Nope, they are 1080p indie game GPUs now. (I am mostly kidding. Mostly.)
 

Tqaulity

Member
Yeah right, the UE5 demo hit neither 4K nor 60fps. If PS5/XSX games hit 4K/60fps it will be because graphical details will be scaled back considerably compared to max PC settings.
Wow, OK:
A. The UE5 demo was doing things well beyond what we see in any game today. It would be very taxing on any consumer hardware today.
B. It is an engine demo... not a game. What would be the benefit of targeting 60fps for a demo that most people will never play? The added responsiveness is irrelevant since it's not really a gameplay demo and most people are just watching it.
C. It was a demo to show off specific rendering techniques and was not fully optimized. It takes years to build a game on current hardware, never mind next gen hardware. Epic worked on that on the brand new PS5 hardware for only a few months... the fact that it ran that well given the amount of time put in is amazing.

Sigh... why people get so obsessed with the number of p's in their image is just beyond me. 1440p is still a very high resolution, and the image quality and detail looked better than anything out today running at 4K. If Epic had never said the demo was running at 1440p, nobody would ever know it was below 4K, especially when you're looking at the video streams online.
 

Ulysses 31

Member
Wow, OK:
A. The UE5 demo was doing things well beyond what we see in any game today. It would be very taxing on any consumer hardware today.
B. It is an engine demo... not a game. What would be the benefit of targeting 60fps for a demo that most people will never play? The added responsiveness is irrelevant since it's not really a gameplay demo and most people are just watching it.
C. It was a demo to show off specific rendering techniques and was not fully optimized. It takes years to build a game on current hardware, never mind next gen hardware. Epic worked on that on the brand new PS5 hardware for only a few months... the fact that it ran that well given the amount of time put in is amazing.

Sigh... why people get so obsessed with the number of p's in their image is just beyond me. 1440p is still a very high resolution, and the image quality and detail looked better than anything out today running at 4K. If Epic had never said the demo was running at 1440p, nobody would ever know it was below 4K, especially when you're looking at the video streams online.
A. Of course. I was responding to the "routinely hit 4K/60fps" comment, and to the point that taking 30fps in exchange for prettier graphics will be a common occurrence.
B. To show what graphical fidelity could be at 60fps in that engine?
C. I've made no judgement about the demo, so I don't know why you're implying that I was not content with it.

I'm all for lowering resolution if it means better lighting, fps etc. I find the push for true 4K unwise, since it will eat up most of the extra power these new consoles have over the current ones, so I assume I'm not one of the people you're talking about.
 

-YFC-

Member
Well, none really. Cards that can play every game at max settings at 4K60 don't exist right now. As for 1440p and 1080p, any high-end card released in the last 3 years will suffice.
 

jonnyXx

Member
I cringe every time I see comments about peeps buying 4K screens and then running games at sub-60 fps that fluctuates all over the place. And then the same peeps go online to cry about how weak their graphics cards are... there are millions of benchmarks online showing exactly how much fps you'll get from almost any game at any res.

Why not get a 2560x1440 screen instead and enjoy fantastic smoothness - enjoy the way the games are meant to be played? 2560x1440 at ~100+ fps with VRR >>>>>>>>>> 4K at sub-60 fps. Even average cards do great at WQHD res. Don't limit your enjoyment with 4K screens, fellers, wtf.
This is genuinely good advice, and I would personally give the same. I think 4K is overkill for a monitor.

For myself though, I haven't owned a PC monitor in around 10 years. I hooked up my PC to TV and 7.1 sound system and never looked back. It meant that I was at a disadvantage in competitive games but I enjoy the home theater experience so much more than a desk experience.

There are no native 1440p TVs, so I'm stuck with 4K. My TV supports 1440p@120Hz, but since it's not the native resolution you can notice the scaling. I use that for competitive games and 4K for everything else.
 

GenericUser

Member
My friend, you're rather behind the times. Many titles on Xbox One X already push 4K 30/60. The next gen consoles will comfortably be able to do so.

The point is, they push current gen graphics to 4K/60fps. I doubt next gen consoles will be able to do the same in games that feature next gen graphics. I personally prefer the visual fidelity of a UE5 engine game over raw resolution, but that's just me. And again, I'd love to be proven wrong.
 
In any game where I can't run max settings at 4K/60, I turn AA down and that's usually enough. At 4K, aliasing is already much less noticeable, so even the lowest AA settings are usually enough to hide any harsh edges, and this frees up a lot of performance in many cases. Full AA only really makes a big difference in screenshots.

If that doesn't cut it, I start looking for effects that have the least impact on how the game looks in motion. If Ultra shadows look only a tiny bit better than High but require 5-10% more GPU, I'll go with High. If Ultra reflections look near-identical to High, down they go; you won't see the difference during gameplay. I'll work my way through all the settings that way until I'm sure I'll have a solid 60fps playthrough. Well worth the 10-15 minutes it takes, especially for a 40+ hour game.
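(To make that loop concrete, here is a toy sketch of the same idea in Python. The settings, millisecond savings and "visual loss" scores are invented purely for illustration; in practice you would read the real numbers off the game's own benchmark or an FPS overlay.)

```python
# A toy sketch of the tuning loop described above: start from max settings,
# then step down the options with the worst cost-to-visual-impact ratio until
# the projected frame time fits a 60fps budget. All numbers are made up.
FRAME_BUDGET_MS = 1000 / 60  # ~16.7 ms per frame for 60fps

# (setting, estimated ms saved by dropping one tier, perceived visual loss 0-1)
candidates = [
    ("anti-aliasing",   2.5, 0.05),  # cheap win at 4K, aliasing barely visible
    ("shadows",         1.8, 0.10),
    ("reflections",     1.2, 0.10),
    ("volumetrics",     1.5, 0.15),
    ("texture quality", 0.2, 0.30),  # mostly costs VRAM, not frame time
]

def tune(current_frame_ms):
    # Drop the settings with the lowest visual loss per millisecond saved first.
    for name, saved_ms, loss in sorted(candidates, key=lambda c: c[2] / c[1]):
        if current_frame_ms <= FRAME_BUDGET_MS:
            break
        current_frame_ms -= saved_ms
        print(f"lower {name}: projected {current_frame_ms:.1f} ms/frame")
    return current_frame_ms

tune(current_frame_ms=21.0)  # e.g. a game running at ~48fps fully maxed out
```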
 
He is probably talking about 4K60 at ultra details, and your GPU isn't enough for that in very heavy games like RDR2 or AC Odyssey.

Of course, with a mix of ultra/high/medium details you are perfectly right.
Yeah, of course not, but honestly the only things I really cheap out on are anti-aliasing and anisotropic filtering, and I have those on TAA and 4x respectively. Maybe one setting here or there that's high instead of ultra. No significant sacrifices in visual quality to achieve 4K 60fps. Red Dead I'm running at 80+ frames with mostly ultra settings. Odyssey I can push well over 60 as well. Obviously you can't max out render scaling, and I can't imagine any card can achieve that yet, but these things are negligible IMO. The 3000 series will bring us closer to it, but I doubt any reasonably priced card is gonna be able to max everything out. Point being, I believe a 2080 Ti will go further than people think when it comes to achieving 4K 60 next gen without much sacrifice. At the risk of sounding like a toolbag, I'm just gonna buy the 3080 Ti, or whatever the Titan version of that card is, to future proof as much as possible.
 

ZywyPL

Banned
My take:

4k is overrated and unnecessarily taxing

Framerate is much more important

Every game has different graphical settings which may or may not really affect what you see significantly but can be very taxing, so whether or not something can be run with everything on “high” or “ultra” probably doesn’t matter.

Build a rig that can do what you want it to do instead of worrying about hitting some arbitrary benchmark. 1440p is fine by me. 1080p is too if I want to output to my comfy couch. Whatever settings I need to hit >60Hz at those resolutions probably still look great, because I still play my NES on a CRT and that's fine too.

We hear it all the time, don't we? 1050p is overrated, 900p is more than enough and is much less demanding. 1080p is overrated, 1050p is more than enough and is less demanding. 1440p is overrated, 1080p is more than enough and is less demanding. 4K is... History repeats itself over and over again; a few years from now we will hear how 8K/10K is unnecessary and 4K/5K is the sweet spot.

You see, the reason everything looks so damn awesome on 5-6" smartphones at 1080p, hell, even a mere 720p, is because the PPI is so damn high, while PC displays aren't even halfway there, let alone 55-65" TVs.
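(The standard PPI formula makes that point concrete. The screen sizes below are just typical examples, not anyone's actual setup.)

```python
# Why 1080p looks razor sharp on a phone but soft on a big screen: pixel
# density (pixels per inch along the diagonal).
from math import hypot

def ppi(width_px, height_px, diagonal_inches):
    """Pixels per inch for a given resolution and screen diagonal."""
    return hypot(width_px, height_px) / diagonal_inches

print(f'6"  phone,   1080p: {ppi(1920, 1080, 6):6.0f} PPI')
print(f'27" monitor, 1440p: {ppi(2560, 1440, 27):6.0f} PPI')
print(f'27" monitor, 4K:    {ppi(3840, 2160, 27):6.0f} PPI')
print(f'65" TV,      4K:    {ppi(3840, 2160, 65):6.0f} PPI')
```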

I'll admit, things have changed a lot with the 3D era and its legacy of high refresh rate monitors; then, on top of that, Nvidia changed the game forever with G-Sync, and high framerate has been desired more than ever on PC ever since. But still - graphics cannot progress without resolution. The more pixels displayed, the more detail is shown, and vice versa: the more detail there is, the higher the resolution needed to show it. Asset quality and resolution are tied together whether we like it or not.
 