
Why do game developers neglect 1080p on current-gen consoles?

1080p sets aren't the norm anymore. Most TVs on offer are 4K, and I barely see 1080p sets in the wild. They're probably still out there, but most households seem to have upgraded. Or downgraded, in some cases: some had top-of-the-line 1080p sets and now have a shitty 4K.


Yeah no one wants or expects a Switch Pro.
And many of us have perfectly fine 1080p sets a few years old. When CRTs were a thing, I remember holding on to them until they died. I also don't remember them being much larger than 32" very often. We made do; people are spoiled today. Why would you throw out something that you spent hundreds on a few years after buying it? Makes no sense. The PS5 looks amazing at 1080p. I'm sure 4K looks better, but it also means lower fps.

I just want them to give more settings, like on PC. It's like they think console users are stupid or something, so they don't add sliders.
 

Three

Member
The thing is, it has a fidelity mode for 4K resolution and 30fps. Why doesn't this mode offer 60 or 120fps at 1080p? Why are these modes fixed?
Can't they detect the display and scale accordingly? A simple programming if statement would do the trick.

if (display > 1080p) {
    fpsCap = 30;
} else {
    fpsCap = 60;
}

See, even I can come up with the code for it.

Fidelity should be 60fps at 1080p. As it is now, I have to use Performance RT and lose some settings.
It's not as easy as that, because some things in fidelity mode still might not fit the 16ms frametime budget even at 1080p. It would also mean QA has an extra mode to test. I would appreciate and respect any devs that put in the time to offer a 1080p mode, though.
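To illustrate why a naive resolution check isn't enough: not all frame cost scales with pixel count. A rough sketch of the budget math (the numbers and the cost model are made up for illustration, not from any real game):

```python
# Rough frame-time model: CPU work is resolution-independent, while
# GPU work is assumed to scale with pixel count. Numbers are invented.

def estimated_frame_ms(cpu_ms, gpu_ms_at_4k, width, height):
    """Estimate total frame time at a given render resolution."""
    pixels_4k = 3840 * 2160
    scale = (width * height) / pixels_4k
    return cpu_ms + gpu_ms_at_4k * scale

# Suppose fidelity mode at 4K measures 8ms CPU + 25ms GPU (33ms -> ~30fps).
ms_4k = estimated_frame_ms(8.0, 25.0, 3840, 2160)    # 33.0
ms_1080p = estimated_frame_ms(8.0, 25.0, 1920, 1080) # 8 + 25 * 0.25 = 14.25

BUDGET_60FPS = 1000 / 60  # ~16.7ms per frame at 60fps

print(f"4K:    {ms_4k:.2f} ms (fits 60fps? {ms_4k <= BUDGET_60FPS})")
print(f"1080p: {ms_1080p:.2f} ms (fits 60fps? {ms_1080p <= BUDGET_60FPS})")
```

In this made-up case, 1080p fidelity would fit 60fps. But bump the CPU cost to 17ms (physics, crowds, ray setup) and no resolution drop can rescue the 60fps target, which is one reason modes are fixed, profiled, and QA'd rather than derived from an if statement.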
 
Because 1080p is old and blurry on 4K screens. We're trying to go forward, not back to 2013.
Does 1080p content really look blurry on a 4K screen? Doesn't upscaling get rid of it? I ask because one thing that kept me from even getting a 1440p monitor for PC at the time was the cost and not having a good enough GPU (since solved with a 3060 Ti), but I didn't want a blurry mess. At least at 1080p native it still looks great.

Comparing 1080p native and 4K, is it really that different?

I play a lot of Switch games, PS2, PC over Steam Link, PS3 and PS4 games; will these look like crap on a 4K TV?
If it is that different, would it be better to stick with 1080p if you play mostly 1080p content, or are there certain TV models that do proper upscaling?
 
Does 1080p content really look blurry on a 4K screen? Doesn't upscaling get rid of it? I ask because one thing that kept me from even getting a 1440p monitor for PC at the time was the cost and not having a good enough GPU (since solved with a 3060 Ti), but I didn't want a blurry mess. At least at 1080p native it still looks great.

Comparing 1080p native and 4K, is it really that different?

I play a lot of Switch games, PS2, PC over Steam Link, PS3 and PS4 games; will these look like crap on a 4K TV?
If it is that different, would it be better to stick with 1080p if you play mostly 1080p content, or are there certain TV models that do proper upscaling?
The display doesn’t do any of the upscaling. It’s completely up to the developer as to what the final image will look like.
 

Hoddi

Member
Does 1080p content really look blurry on a 4K screen? Doesn't upscaling get rid of it? I ask because one thing that kept me from even getting a 1440p monitor for PC at the time was the cost and not having a good enough GPU (since solved with a 3060 Ti), but I didn't want a blurry mess. At least at 1080p native it still looks great.

Comparing 1080p native and 4K, is it really that different?

I play a lot of Switch games, PS2, PC over Steam Link, PS3 and PS4 games; will these look like crap on a 4K TV?
If it is that different, would it be better to stick with 1080p if you play mostly 1080p content, or are there certain TV models that do proper upscaling?
Stick with your TV. The difference between native 1080p and 1080p upscaled to 4K is massive, and you're gonna notice it on your Switch/PS4.
 

chasimus

Member
Because everyone has a financial interest in pushing 4K.
What makes me sad about this comment is that studios/developers gave up on 3D right when 4K TVs started coming out. Sony and LG had great tech that lowered the resolution to 1080p to deal with the stereoscopic aspect, and you didn't have to buy those $200 battery-powered glasses to watch it. And the picture quality was great!
 

Pagusas

Elden Member
1. TAA and temporal anti-aliasing in general work best when they have more pixels to work from; below 1440p, TAA starts falling apart. The reason this is the first generation where aliasing has barely shown its face is all the temporal magic happening plus the resolution bump.

2. Lowest-common-denominator consumers should never be the ones setting the baseline. If we followed that mode of thinking, progress would always be at a crawl.

3. Dynamic resolution will become more and more common; it has already proven to be a great tool for mid-gen and next-gen visual bumps when new hardware comes out, and devs need to target a certain resolution when optimizing a game. 1440p is the sweet spot right now.
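For anyone curious what point 3 looks like in practice, here's a minimal sketch of a dynamic-resolution controller (my own illustration with made-up numbers; real engines use fancier feedback loops): each frame it measures the previous frame's GPU time and nudges the render scale to stay inside the frame budget.

```python
# Minimal dynamic-resolution controller sketch.
# Each frame, adjust the render scale so GPU time tracks the budget.

BUDGET_MS = 1000 / 60          # 60fps target, ~16.7ms per frame
MIN_SCALE, MAX_SCALE = 0.5, 1.0

def next_render_scale(scale, last_gpu_ms):
    """Proportionally adjust the render scale toward the frame budget.

    GPU cost is assumed to scale with pixel count, i.e. with scale**2,
    so we correct the rendered area and take a square root.
    """
    correction = BUDGET_MS / last_gpu_ms      # area ratio that would fit
    new_scale = scale * correction ** 0.5
    return max(MIN_SCALE, min(MAX_SCALE, new_scale))

# Example: full-scale rendering took 20ms -> resolution drops a bit.
s = next_render_scale(1.0, 20.0)
print(f"scale after slow frame: {s:.3f}")
# A fast frame lets the resolution climb back up (clamped at 1.0).
print(f"scale after fast frame: {next_render_scale(s, 12.0):.3f}")
```

The clamp matters: it is what keeps the output between the advertised floor (say, 1080p on a 4K target) and native resolution, so the image never degrades past what the devs signed off on.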
 

FeldMonster

Member
As someone who has three perfectly fine 5-year-old 1080p TVs (a 55" set I bought, a 50" set inherited from my late pops, and my gf's TV), I see no reason to upgrade. Unlike some on here, I am scraping by. I work 40-60 hours a week, have a house with my gf, and together we total less than 75k in household income. Bills add up, especially now with inflation and gas prices.
It was painful enough buying the PS5 at MSRP (after 1.4 years of trying).
That being said, I can't justify spending the loot on 4K when I have two 1080p sets, one of which isn't even being used, as we don't want a TV in the bedroom (the bedroom is for sleeping, changing, and sex; we have a living room and a dedicated 40ft x 50ft game room).

So having just gotten the PS5, I played GTA5, and wooo, it looks amazing. Even better than PC in some ways (yeah, I know, bad AF filtering, but the colors are more vibrant and I'm not seeing any jaggies; on my 3060 Ti with a 27" monitor the jaggies stand out, and I also don't see RTX settings in GTA5 on PC).

The thing is, it has a fidelity mode for 4K resolution and 30fps. Why doesn't this mode offer 60 or 120fps at 1080p? Why are these modes fixed?
Can't they detect the display and scale accordingly? A simple programming if statement would do the trick.

if (display > 1080p) {
    fpsCap = 30;
} else {
    fpsCap = 60;
}

See, even I can come up with the code for it.

Fidelity should be 60fps at 1080p. As it is now, I have to use Performance RT and lose some settings.

I am sure other games are like this too.
If not, they need PC-like sliders.

And yes, many of us aren't super rich, or we have families, and we don't throw out perfectly fine TV sets.
If you are just scraping by, why did you buy a brand-new $500 console and an even more expensive brand-new graphics card? Especially when you can play the vast majority of the same games on last-gen hardware?

Why should your (self-admittedly low) financial level dictate the maximum graphical quality for everybody else? Rather selfish, no?
 

Dream-Knife

Member
Why can't console games have a drop down menu like on PC and you just pick whatever resolution you want? Or a resolution scaler.
 

The_Mike

Member
I'm surprised that people want to settle for 60fps.

I mean, yeah, it is better than 30fps, but 120fps is in a league of its own, and devs should really prioritize 1440p/120fps over stuttering 30fps.
 
20 months ago: Switch games are blurry, low res, look terrible, the Switch is trash, sub-4K, Nintendo is lazy.
Today: Why aren't $500+ consoles outputting at 1080p?

Actual reason: 1080p on a $500+ system is criminal, to say the least. If we're going back to the mid-PS3 days, with no Sixaxis or other unique features but with 20fps and crazy graphics, I will never buy another PlayStation again. If you want 1080p on the current consoles, go buy a 1080p TV and play on that; both the Xbox and the PS5 can downscale. If we're going back to 1080p, games need to hit a locked 60fps or 120fps easily, but going by how Elden Ring performs, resolution is the least of the current consoles' problems.
 
it’s a fact that even in the US only 50% of households own a 4K television, so for people without a 4K television,
Maybe. But what percentage of people who own a PS5 don't have a 4K TV? I bet you it's a lot less than 50%.

Do they even make new TVs that aren't 4K? I see $300 TVs at Walmart, and even those are all 4K.
 

Dream-Knife

Member
20 months ago: Switch games are blurry, low res, look terrible, the Switch is trash, sub-4K, Nintendo is lazy.
Today: Why aren't $500+ consoles outputting at 1080p?

Actual reason: 1080p on a $500+ system is criminal, to say the least. If we're going back to the mid-PS3 days, with no Sixaxis or other unique features but with 20fps and crazy graphics, I will never buy another PlayStation again. If you want 1080p on the current consoles, go buy a 1080p TV and play on that; both the Xbox and the PS5 can downscale. If we're going back to 1080p, games need to hit a locked 60fps or 120fps easily, but going by how Elden Ring performs, resolution is the least of the current consoles' problems.
Elden Ring is likely just bad code. It stutters from time to time on any rig and isn't a demanding game.
 

metaverse

Gold Member
I agree. 4K native is too demanding even for high-end GPUs. I'd rather see further advances in using machine learning to upscale. One of the biggest selling points of an Nvidia card over AMD is DLSS.
 
Can’t believe how many here are not happy with 1080p anymore. As someone who grew up in the 80s, HD is totally sufficient for me.

Of course it would be nice to have 1440p/60fps, but in order to achieve this, developers scale down other graphical options, including RT. Consoles are just not yet strong enough to handle more, and for me graphical effects are far more important than a higher resolution. Sure, if you have a crappy 4K TV where non-4K material looks blurry, I can understand, but when I bought my 4K TV the most important thing for me was how older material would look, and I am still super happy with the choice I made at the time.
 
Can’t believe how many here are not happy with 1080p anymore. As someone who grew up in the 80s, HD is totally sufficient for me.

Of course it would be nice to have 1440p/60fps, but in order to achieve this, developers scale down other graphical options, including RT. Consoles are just not yet strong enough to handle more, and for me graphical effects are far more important than a higher resolution. Sure, if you have a crappy 4K TV where non-4K material looks blurry, I can understand, but when I bought my 4K TV the most important thing for me was how older material would look, and I am still super happy with the choice I made at the time.
What type of TV was it? My biggest concern with even thinking about 4K is Switch, retro, and PS4 games. I don't want them to be a blurry mess. Brand, model, cost, anything to look for?
 
Elden Ring is likely just bad code. It stutters from time to time on any rig and isn't a demanding game.
Yeah, the game still suffers from the problems that plagued the Dark Souls games on PC; all of those problems are in the PC port, and the console versions won't hit 60fps even in performance mode, so it's for sure a code and engine problem.
 

Kimahri

Member
I am a little bit curious why so few games these days offer a 1080p option. Most games have a graphics mode targeting up to native 4K at 30fps and a performance mode targeting 1440p at 60fps.
However, I personally would rather have the option of higher graphics settings at 1080p with a 60fps target.

I actually do own a 4K TV, but my Sony TV does a very good job outputting even 720p material, so I would be totally comfortable with 1080p.

However, I am even more curious about the decision to push for resolutions over 1080p on consoles, since it’s a fact that even in the US only 50% of households own a 4K television. People without one can either play on a PC screen, which is normally smaller, or, worse, a lot of the extra performance of the current-gen consoles is wasted.

So is there any particular reason why game developers push so hard for higher resolutions? It’s not like game developers are getting paid by the TV manufacturers to boost TV sales…
I'm more sensitive to IQ than fps, so I go resolution all the way unless that mode is bad.

Also, to the people mentioning Switch: I game on Switch a lot, but god damn is it tiresome on my eyes with those shitty resolutions on a big TV.
 
What type of TV was it? My biggest concern with even thinking about 4K is Switch, retro, and PS4 games. I don't want them to be a blurry mess. Brand, model, cost, anything to look for?
I bought a Sony Bravia, which completely convinced me. At the time I went to an electronics store and asked them to show me content of different quality so I could see how it looks upscaled, and back then Sony seemed to be the best when it comes to upscaling.
 

tvdaXD

Member
Because 4K is a marketing thing, the big number/shiny badge they can put on the box. There's a reason many movies use 2K digital intermediates... Many 4K Blu-rays aren't actual 4K but just a simple upscale of the 2K render :")
 

RafterXL

Member
Because we aren't in the year 2000 anymore. You can't buy 1080p TVs, and very few people with next-gen systems still own them. And, frankly, the rest of us shouldn't have to look at a muddled mess because a few of you refuse to upgrade; the larger TVs get, the greater the need for higher resolution.
You have the wrong tv.
There is no such thing as the right TV. Even Sony, which has the best upscaling in the business, can't make 720p games not look like shit on a 65-85 inch 4K screen.

Can’t believe how many here are not happy with 1080p anymore. As someone who grew up in the 80s, HD is totally sufficient for me.

Of course it would be nice to have 1440p/60fps, but in order to achieve this, developers scale down other graphical options, including RT. Consoles are just not yet strong enough to handle more, and for me graphical effects are far more important than a higher resolution. Sure, if you have a crappy 4K TV where non-4K material looks blurry, I can understand, but when I bought my 4K TV the most important thing for me was how older material would look, and I am still super happy with the choice I made at the time.
As someone who grew up in the 80s, I completely disagree. I stopped using a 1080p monitor on my computer nearly 15 years ago. My first 4K TV would be over 10 years old now. There is zero reason to hang on to old, outdated tech. The fact that we are still talking about 1080p in 2022 is ridiculous.
 
There is no such thing as the right tv. Even Sony, which has been the best upscaling in the business, can't make 720p games not look like shit on a 65-85 inch 4k
Then you’re sitting too close if it’s a Sony and it looks like “shit.”

But this is one reason why I buy 50-55” sets.

Edit: BTW, I’m not saying 720p can’t look like shit; a game with TAA, for example, can be super blurry, like Crash 4 or Xenoblade 2.

But with something like Mario Odyssey, if it’s looking like shit, you’re sat too close or you’ve got too large a TV for the room.
 
Can’t believe how many here are not happy with 1080p anymore. As someone who grew up in the 80s, HD is totally sufficient.
Right. If 1080p doesn’t look decent, the developer is ruining it with blurry AA or some kind of post-processing.

Crazy how console gamers thought the PS3 looked amazing at one point and now 1080p looks “pretty bad” lol. Maybe it looks bad if you bought a huge TV for your two-bedroom apartment and are trying to blind yourself.

I think people just crap on anything but the latest thing, despite having been happy with it not too long ago.
 
1440p is a worse offender because, despite possibly being the de facto standard for native resolution this gen, the PS4 Pro and PS5 don't actually output it, instead either upscaling to 4K or downscaling to 1080p. If that weren't the case, I would buy a 1440p monitor here and now. Most people play on a TV, but with COVID I was personally stuck for months playing on a monitor. If the monitor were 1440p, I reckon I would be royally fucked with artifacts.

I have no issues on the 1080p front (albeit 1080p performance modes are indeed nice), other than that fonts/text should be tested as part of the approval system. Elden Ring is quite hideous on that front: the text is smaller than in the other Souls games, and it uses Garamond, which is hard to read on a screen. But it's hardly the only offender.

Going forward I think it'll happen more; the solution is accessibility settings, so everyone with a smaller TV, sitting farther from the set, or with less than the "ideal" resolution can mitigate the issue somewhat.
 

GymWolf

Gold Member
Get those cataracts taken care of.
I had laser surgery like 15 years ago; my sight is around 14/10 (or 24/20 if you're American). I most probably have better sight than 99.8% of the world's population.

Go back to playing on a shitty tablet to make your shitty-looking games look better than they really are.

But thanks for the advice anyway :messenger_blowing_kiss:
 
I had laser surgery like 15 years ago; my sight is around 14/10 (or 24/20 if you're American). I most probably have better sight than 99.8% of the world's population.

But thanks for the advice.
So if you played Returnal, were you just sat there thinking how bad it looked?
 

GymWolf

Gold Member
So if you played Returnal, were you just sat there thinking how bad it looked?
No, because it is 1080p upscaled to 4K, and that's already better than native 1080p.

The difference between Returnal on PS5 (which also has decent graphics) and a Switch game at 720p is still gigantic.

The uneven framerate in Returnal was much more distracting, tbh.
 
No, because it is upscaled 1080p, and that's already better than native 1080p.

The difference between Returnal (which also has decent graphics) and a Switch game at 720p is still gigantic.
Returnal pretty much looks like 1080p plus TAA with its simple Unreal Engine 4 upscale technique. If you think the image looks any better than Luigi's Mansion 3 at 1080p, you're either lying or, yeah, cataracts.
 

Ceadeus

Member
It would be difficult to go back to 1080p. Graphical fidelity has improved so much that it would be hard to make out the details at 1080p. While that resolution is fine on a 15-inch laptop, I just can't go back to it on a TV.

1440p is when it starts to look crisp enough, imo. I really liked how Valhalla looked on the XSS when I first got it. You can find cheap enough 4K TVs; you don't have to have the latest and greatest.

Mine is like 6 years old, and while it doesn't have features like HDR10, 120Hz, or VRR, the image is crisp.
 

adamsapple

Gold Member
Native pixel count will be a useless measure in the next couple of years as reconstruction gets even better.

The Matrix demo reconstructs from a lower resolution and yet has amazing IQ.

Once games start being developed solely for next gen and with engines like Unreal 5, expect a lot more games that 'natively' run at 1080p but use reconstruction to achieve much better IQ.
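For intuition on what "reconstructs from a lower resolution" means, here's a toy checkerboard sketch (my own illustration, not the Matrix demo's actual technique): each frame shades only half the pixels, and a merge step rebuilds the full grid, which is roughly why reconstructed output can approach native quality at a fraction of the shading cost.

```python
# Toy "reconstruction" sketch: shade alternating pixels each frame in a
# checkerboard pattern, then merge two consecutive frames into one image.
# Real techniques (TAAU, DLSS, FSR 2) also reproject and weight samples;
# this only shows the sample-sharing idea.

def checkerboard_merge(frame_even, frame_odd, width, height):
    """Merge two half-sampled frames; each covers alternating pixels."""
    out = [[0] * width for _ in range(height)]
    for y in range(height):
        for x in range(width):
            if (x + y) % 2 == 0:
                out[y][x] = frame_even[y][x]   # shaded on the even frame
            else:
                out[y][x] = frame_odd[y][x]    # shaded on the odd frame
    return out

# Each frame shaded only "its" half of the pixels; the merge recovers
# the full grid.
even = [[1, 0], [0, 1]]   # samples where (x + y) is even
odd = [[0, 2], [2, 0]]    # samples where (x + y) is odd
print(checkerboard_merge(even, odd, 2, 2))  # [[1, 2], [2, 1]]
```

The catch, and the reason real reconstruction is hard, is motion: when the camera or objects move, last frame's samples land in the wrong place and must be reprojected or rejected, which is where the clever (and occasionally artifact-prone) parts live.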
 

Jeeves

Member
I've owned a 4K TV for over a year now because that's pretty much all they sell these days. I've still yet to view anything in 4K on it and don't feel the need to. 1080p still looks as good as it ever did.

I can understand a game targeting 4K or 1080p, but using anything in between as a performance target seems really strange to me...
 

GymWolf

Gold Member
Returnal pretty much looks like 1080p plus TAA with its simple Unreal Engine 4 upscale technique. If you think the image looks any better than Luigi's Mansion 3 at 1080p, you're either lying or, yeah, cataracts.
I didn't play LM3, but I saw some gameplay, and it's probably the best-looking game on Switch.

For Returnal's upscaling tech, there is a dedicated DF video; I don't remember what they use.
 
I didn't play LM3, but I saw some gameplay, and it's probably the best-looking game on Switch.

For Returnal's upscaling tech, there is a dedicated DF video; I don't remember what they use.
Even DF said Returnal's image quality wasn't great, but I still like the look of the game overall. It looks great when the action kicks up. It uses Unreal Engine 4, so it just uses that upscaling TAA method, which isn't nearly as good as some others', like Insomniac's.

I'm not saying 720p is great, btw; it's just not so bad that I can't quickly get used to it, except with blurry TAA.

Only a few times on Switch did I think, yeah, this image quality is distractingly bad. Like Yoshi, which is 540p, or Xenoblade 2, which was so blurry I didn't want to keep playing lol. Witcher 3 on Switch had poor image quality, but I played it just for the novelty of it being on Switch.
 