
4K gaming: “It’s useless,” says Playdead founder.

Airola

Member
Well now, that depends on the game, doesn't it? Regardless, developers shouldn't be prioritizing anything over getting 60FPS as a baseline. It's not 1998 anymore. 30FPS is simply unacceptable in this day and age.

Awww, I'm sowwy, did I hurt your feefees by pointing out that 30FPS is fucking awful, that I can almost count the frames as they go by, and that it's genuinely uncomfortable and headache-inducing to play at such a crap framerate?
To be perfectly frank, I don't really care whether or not someone on the internet I've never met, and am never going to meet, respects me. 30FPS is a slideshow.

You haven't lived until you have played Castle Master on Commodore 64.



Game was (IS) awesome. No regrets.
 

Ovek

Member
I would rather have the option of running a next gen console at 1440p at a higher frame rate and take advantage of my TV's 120Hz panel.
 

Darak

Member
I prefer 4K's extra resolution over blurry temporal AA, chromatic aberration, object motion blur, and the plethora of other post-processing garbage every game nowadays abuses to oblivion, making the final image worse instead of better, in the name of being 'film-like'. In fact, the performance penalty of 4K is much more bearable when the game has a simpler (and, thus, better) post-processing pipeline.
 

Shifty

Member
It seems to me like we might be reaching the average perceptual threshold with resolutions approaching 4K, similar to the way the average perceptual threshold for framerate starts to become vague around the ~90Hz mark.

Sure, it depends on the individual, as well as the size of your screen, pixel density, the distance you sit from it, etc., but there's going to come a point where getting more crisp doesn't actually make a justifiable difference relative to the steep performance cost (roughly proportional to pixel count) that comes with increasing resolution.
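A rough back-of-the-envelope sketch of that threshold (assuming the commonly cited ~60 pixels-per-degree acuity figure, which is itself debatable):

import math

def pixels_per_degree(width_px, diagonal_in, distance_in, aspect=16 / 9):
    """Approximate angular pixel density for a flat 16:9 screen viewed head-on."""
    screen_width_in = diagonal_in * aspect / math.hypot(aspect, 1)
    px_per_inch = width_px / screen_width_in
    # One degree of visual angle spans roughly this many inches at the given distance.
    inches_per_degree = 2 * distance_in * math.tan(math.radians(0.5))
    return px_per_inch * inches_per_degree

# 55" TV viewed from 8 feet: 1080p already lands near the ~60 ppd mark.
for width in (1920, 3840):
    print(width, round(pixels_per_degree(width, 55, 96), 1))

By that measure 4K only clearly pays off on big screens at close range, which is exactly the "depends on your setup" caveat.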

I prefer 4K's extra resolution over blurry temporal AA, chromatic aberration, object motion blur, and the plethora of other post-processing garbage every game nowadays abuses to oblivion, making the final image worse instead of better, in the name of being 'film-like'. In fact, the performance penalty of 4K is much more bearable when the game has a simpler (and, thus, better) post-processing pipeline.
You speak as if chromatic aberration, object motion blur and the various other types of excessive post-processing are mutually exclusive to 4K and can be discarded once we hit that resolution.

TAA is the only one that argument really applies to. Everything else is just art style / aesthetic.
 

Journey

Banned
I would say something negative, but God damn, Limbo and Inside were such great experiences, atmospherically beautiful too, without the need for advanced tech.
 
Last edited:

JohnnyFootball

GerAlt-Right. Ciriously.
Same.

Some games still look good even in full HD, but others, like Borderlands 3, look absolutely horrible on a big 4K screen.

Also, people are fixated on 4K... 1800p is basically the same thing with less of a hit on performance.

On PC you can create a custom resolution to find the sweet spot.
Borderlands 3 was a fun game, but performance-wise it was absolutely atrocious and horrific. I'm really surprised Gearbox hasn't gotten more heat for that.

I agree about 1800p. It's very close.

Checkerboarding is also a godsend in how close it can look to 4K.

Deus Ex Mankind Divided is one of the Pro (never got an X patch) titles that uses checkerboarding and it is amazing how close it looks to 4K.
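For anyone curious how it works, the core trick is roughly this (a toy sketch only; the real PS4 Pro technique also reprojects with motion vectors and ID buffers rather than copying stale pixels):

import numpy as np

def checkerboard_reconstruct(rendered, previous, frame_index):
    """Shade only half the pixels (a checkerboard) this frame; fill the holes
    from the previous reconstructed frame."""
    h, w = rendered.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    fresh = (xs + ys + frame_index) % 2 == 0  # parity alternates each frame
    out = previous.copy()
    out[fresh] = rendered[fresh]
    return out

Alternating the parity each frame means every pixel gets refreshed every two frames, so per-frame shading cost is roughly halved while the output stays close to native.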
 

Darak

Member
You speak as if chromatic aberration, object motion blur and the various other types of excessive post-processing are mutually exclusive to 4K and can be discarded once we hit that resolution.

TAA is the only one that argument really applies to. Everything else is just art style / aesthetic.

Every one of those techniques weighs heavily on fillrate and takes a greater toll as you increase resolution. A heavy post-processing pipeline is one of the reasons current GPUs struggle with 4K in most games. If those effects weren't so abused, 4K would be more feasible on current hardware. Obviously lower resolutions also benefit from simpler post-processing, but the wins are not as pronounced.
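The raw numbers make the point; any full-screen pass scales roughly with pixel count (quick sanity check, actual per-effect costs vary):

base = 1920 * 1080
for name, (w, h) in {"1080p": (1920, 1080), "1440p": (2560, 1440),
                     "1800p": (3200, 1800), "2160p": (3840, 2160)}.items():
    print(f"{name}: {w * h:>9,} px, {w * h / base:.2f}x the 1080p fill cost")

So a post chain that was tuned to be "free" at 1080p quietly costs 4x at 2160p (and note 1800p gets you most of the pixels for ~2.8x).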

Also, the aesthetic benefits of those techniques are arguable. In my humble opinion, they are not worth the cost. I usually turn them off if I can.
 

Fbh

Member
I wouldn't call it useless, but it has always felt like a feature that should be left to high-end PC hardware.
For consoles or low-budget PCs there's a lot of other stuff I'd rather have devs focus on, like performance, draw distances, overall world detail, physics, etc. The Outer Worlds console versions are a good example, IMO. Given the hardware, the Xbox One X version could easily be the best one, but in pushing for the highest possible resolution it ends up with worse performance and less foliage than the PS4 Pro version.





good luck playing at 1080p + TAA )))

[screenshot comparison: sPlVd3t.png, XRlLg4V.png]
Good thing, then, that I don't sit an inch away from my TV to look directly at tree branches.
 
Last edited:

zcaa0g

Banned
4K for Horizon Zero Dawn, God of War, Spider-Man, Forza 7, Forza Horizon 4, etc. on consoles was the first time ever for me that console games finally had the graphics quality and fidelity of a PC game, so from my viewpoint, the 4K difference is huge. Yes, I realize the Pro and X brought a new level of power overall, but 4K was the game changer.
 
Last edited:

Azelover

Titanic was called the Ship of Dreams, and it was. It really was.
The increase in visual fidelity alone is a gimmick to sell hardware.

However, that's different from improving hardware in ways that will be beneficial to gameplay. For example, the original NES was graphically inferior to home computers at the time, but it was better at side-scrolling. And that translated to gameplay.

We already have a pretty crisp resolution that is good enough for most gameplay experiences. 4K brings nothing to the table. It's a total gimmick.
 

carsar

Member
Good thing, then, that I don't sit an inch away from my TV to look directly at tree branches.
I see the ugly branches even at 4 meters away from a 50" TV, with normal vision.
1080p TAA is the reason games can't look realistic: it hides too many details and makes branches look too rough.
 
Last edited:

Aceofspades

Banned
I am inclined to agree that pushing "native" 4K shouldn't be a priority; hell, even DF still mistakes some Pro games for native 4K after zooming in 100x to count pixels.
Reconstruction methods have gotten really good this gen.
 

pawel86ck

Banned
Nvidia recently added a high-quality sharpening filter for upscaled content to the NV control panel, and I'm surprised how much better an upscaled picture looks now. So people don't need 4K content on their 4K panel in order to see benefits over a full-HD panel.
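As far as I can tell it's essentially unsharp masking (a minimal sketch of the general idea, not Nvidia's actual filter, whose internals aren't public):

import numpy as np
from scipy.ndimage import gaussian_filter

def sharpen(image, amount=0.5, radius=1.0):
    """Unsharp mask: boost the difference between the image and a blurred copy."""
    blurred = gaussian_filter(image, sigma=(radius, radius, 0))  # blur H and W, not channels
    return np.clip(image + amount * (image - blurred), 0.0, 1.0)

On an upscaled frame this re-emphasizes the edges the scaler smeared, which is why it makes sub-native content look closer to native.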
 
Last edited:
I wouldn't say it's useless, but framerate is definitely far more important in a game.

You get dumb arguments online like, "So you'd game at 240/360/480p then?! lol you're wrong!", which strip out any nuance, never present the mirror option of playing games at 10/15/25fps to balance the argument, and show ignorance of older games and platforms.

There is a middle ground for most, but I'll take 1080p and 60-240fps over 4K for modern games any day of the week.
 
You haven't lived until you have played Castle Master on Commodore 64.



Game was (IS) awesome. No regrets.


I still have a lot of time for stuff like this, and The Sentinel, etc. I was even playing some Stunt Car Racer and Midwinter on the Amiga recently too.

I'm not sure how that would fly today, though, or any visual IF game that takes 10-20 seconds to paint the screen. :messenger_tears_of_joy:
 

Shifty

Member
Every one of those techniques weighs heavily on fillrate and takes a greater toll as you increase resolution. A heavy post-processing pipeline is one of the reasons current GPUs struggle with 4K in most games. If those effects weren't so abused, 4K would be more feasible on current hardware. Obviously lower resolutions also benefit from simpler post-processing, but the wins are not as pronounced.
Very true, though I view that more as the industry trying to push ahead of the hardware and needing to invent novel workarounds (e.g. checkerboarding, DLSS) to compensate. As much as I wish devs would scale things back, there's been a huge focus on bigger, better, shinier, etc. ever since the 360/PS3 gen.

We're also now seeing other more fundamental stuff like physically-based pipelines that render multiple G-buffer textures and combine them to create a final image, which adds extra overhead on top of screen-space post processing.
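To put numbers on that, a deferred setup carries several full-resolution buffers before a single post effect even runs (layout and formats here are illustrative; every engine packs these differently):

gbuffer = {
    "albedo (RGBA8)": 4,                 # bytes per pixel
    "normals (RGB10A2)": 4,
    "roughness/metal/AO (RGBA8)": 4,
    "depth/stencil (D24S8)": 4,
    "HDR light accum (RGBA16F)": 8,
}
pixels = 3840 * 2160
for name, bpp in gbuffer.items():
    print(f"{name}: {pixels * bpp / 2**20:.0f} MiB")
print(f"total: {pixels * sum(gbuffer.values()) / 2**20:.0f} MiB touched per frame")

Every one of those gets written and then re-read at full resolution, which is bandwidth the post-processing stack has to share.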

Also, the aesthetic benefits of those techniques are arguable. In my humble opinion, they are not worth the cost. I usually turn them off if I can.
I don't disagree; chromatic aberration and poorly-implemented motion blur can go in the garbage where they belong.
 

RobRSG

Member
Cool, but did he launch Inside for PS3 and X360? Or should I suppose 720p is useless as well?
 
Depends on the size of the screen. Would I trade my 24-inch 1080p 144Hz monitor for a 24-inch 4K 60Hz monitor? No, that would be fucking stupid. But if you're going for a larger screen, then you probably want 4K.
 
Last edited:

UltimaKilo

Gold Member
Up until last year, I would have agreed that 4K was useless, because I had a 40” TV for years and years. Once I upgraded to 55” 4K, I can’t go back.

You should blame the pace of hardware development, because it feels like having to choose between native 4K and 60fps come 2020 is ridiculous.

If we can't even get that, VR is even further away: 150-170 degree FOV, 4K per eye, at 240Hz is 5-7 years away.
 

Solomeena

Banned
Because it comes at the expense of more important things, and it's not as if 1080/1440p are blurry messes that we are unable to decipher

Are you seriously arguing that 1080p and 1440p are good enough?! That is like fucking arguing that a 250GB HDD was enough storage or that 1MB of RAM was good enough. Fuck that.
 

DeepEnigma

Gold Member
1440p with the budget leftover for better physics, lighting, volumetrics, etc., at 60FPS. Thanks.
 
Last edited:

Bkdk

Member
Sure, it's not gonna be a striking difference like 480p to 1080p, but still, the higher the definition the better.
 
4K is nice but I prefer framerate. 1440p is what I game at, and my 1080 Ti can hit 100fps-plus in most of the games I play. Going from console to 144fps on a 144Hz monitor was an eye-opener, to say the least.

On PC, anyway, 4K at the framerates I prefer is just too expensive.
 
Last edited:
Wtf?

I'd rather play at 4K, medium settings, RTX off, 60fps than 1080p, ultra, RTX on, 60fps in Control and FFXIV.

Control looks cool with RTX on but I still prefer clarity from increased resolution

FFXIV has TONS of shimmering and the best way to get rid of it is brute forcing it with pixels

It is worth it for people who deem it worth it. ITT lots of people are talking out their asses about an issue where I've actually tested and measured the value of 4K gaming.
 
Last edited:

Jtibh

Banned
This reminds me of an actor back when HD TVs first came out.
He said he stopped acting because his face wasn't HD-ready and he didn't want people to see all his facial flaws.
 
Show this man the difference between RDR2 on PS4 and on PC, and let's see if after that he can say the same thing with a straight face...
That has nothing to do with resolution. You're talking about graphical fidelity. For all we know, someone could be playing at a lower resolution on PC than on PS4.
 
4K is a cop-out. Any game looks good in 4K with HDR on full.

I would rather they stick with 1080p, give me 120fps+ and bring back some fucking physics, destruction and swarms of enemies.

Last gen, over ten years ago(!), had Ninety-Nine Nights and Battlefield: Bad Company 2. One had swarms of enemies, one had amazing destruction.

Nothing this gen has come close, yet we're chasing 4K? What's the point?
 

petran79

Banned
Not even TV broadcasters and Blu-ray manufacturers have caught on with 4K. In fact, people still use analog TVs with digital TV boxes via SCART.
 

Bolivar687

Banned
I'll take "4k" over weird peripheral experimentation but it did feel silly to chase that with half generational upgrades that don't even quite get there as it is.
 

VFXVeteran

Banned
Agree to disagree with this. 4K is nothing but good for playing a game; there is nothing wrong with higher fidelity. Now, on the creative aspect he is correct: that should be governed by his own wishes/desires.
 
Technology nowadays dictates that it makes sense to push 1440p as the target. 2160p is still far away for good cost/performance without sacrifices.
 

Jigsaah

Gold Member
My take is it depends on the game. The Limbo developer, whose flagship titles are basically simple shapes and greyscale... yeah, resolution doesn't make a difference there. However, I will never forget when Fortnite did its 4K upgrade on the X. I was BLOWN AWAY. Shadow of War!? 30fps pre-X enhancement... post-X enhancement... amazing.
 

ROMhack

Member
That's what one would say when they make games like Limbo and Inside, which had and have crappy graphics. They weren't "art styles", they were bad graphics. Enjoyed the games for the most part, though.

The art direction in those games was amazing: minimalistic, with great use of shadows that added plenty to the atmosphere. The animations were great, too.
 
Last edited: