
Why are several PS4 Pro games arbitrarily locking 1080p users out of downsampling?

Is there a chance at all that Sony isn't aware of the community's need for this?
0%. They know 95% of households don't have 4k TVs. And they know supersampling is useful because they apply it in regular PS4s when connected to 720p tvs.

Maybe they don't want their ~3 million PS4 pro users to have significantly better iq than their ~63 million OG PS4 users? I don't know. But it's pissing me off.

I still have my launch PS4. I'm tempted to sell my Pro for around 300, add 200 more and buy the XboneX. Keep the best OG console for its exclusives and the best refresh for multiplatforms. And I will save money because I won't even need to buy a 4k TV. Win-win strategy.
 
Fuck, if PS4 didn't have the advantage of better first-party games and exclusives I'd ditch my Pro for an Xbox One X in a heartbeat. It sucks buying new games and not seeing supersampling as an option.
 

Clear

CliffyB's Cock Holster
Wow.

But still, you simply should not trust developers not to fuck up, especially if there is a way to easily avoid the potential issue at the system level.

Logically though, the value of downsampling depends on the quality of the image in the 4k framebuffer. Going from a pristine native 4k image is not the same as going from one that's already being scaled up from a res between 1080p and 4k.

A 4k frame reconstructed via checkerboarding, temporal injection and/or additional re-scaling methods is inevitably going to contain some sort of artifacting, which may well be amplified by an additional hardware-level rescale because with every additional process on the data more and more fudging is going on.

Potentially ending up with an over-processed 1080p image with any/all performance costs from the 4k internal upscale lumped in! That doesn't sound like a guaranteed win to me.

Objectively, upscaling then downscaling is not smart pipelining.
 
There are very few native 4k PS4 pro games. IQ looks infinitely better in each and every supersampled game compared to native 1080p. So, whatever artifacting there is, imho it looks much better than just rendering at 1080p.
 

Durante

Member
A 4k frame reconstructed via checkerboarding, temporal injection and/or additional re-scaling methods is inevitably going to contain some sort of artifacting, which may well be amplified by an additional hardware-level rescale because with every additional process on the data more and more fudging is going on.
There's absolutely no "fudging" when doing a simple bilinear downsampling from 3840x2160 to 1920x1080. It's a 4:1 mapping.
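For illustration, a 2:1 bilinear reduction from 3840x2160 to 1920x1080 really does boil down to a 4:1 box average: each output pixel is the mean of one 2x2 block of source pixels. A rough NumPy sketch of that mapping (illustrative only, not the Pro's actual hardware scaler):

Code:
import numpy as np

def downsample_2160p_to_1080p(frame):
    """Average each 2x2 block of a 2160x3840 frame into one 1080p pixel."""
    h, w = frame.shape[:2]
    assert (h, w) == (2160, 3840), "expects a native 4K frame"
    # For an exact 2:1 reduction, bilinear filtering degenerates to this
    # 4:1 box average, so no pixel data is invented or "fudged".
    blocks = frame.reshape(h // 2, 2, w // 2, 2, -1).astype(np.float32)
    return blocks.mean(axis=(1, 3)).astype(frame.dtype)

# e.g. a synthetic 4K RGB frame:
frame_4k = np.random.randint(0, 256, (2160, 3840, 3), dtype=np.uint8)
print(downsample_2160p_to_1080p(frame_4k).shape)  # (1080, 1920, 3)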
 

Clear

CliffyB's Cock Holster
There are very few native 4k PS4 pro games. IQ looks infinitely better in each and every supersampled game compared to native 1080p. So, whatever artifacting there is, imho it looks much better than just rendering at 1080p.

Are there any examples of a sub-native-4k render (say 1440p or even 1800p) being used to create supersampled 1080p output? How does that compare to using the same horsepower to anti-alias or filter the image on a native 1080p frame?

My point is that I understand the benefits of supersampling from native 4k, but there logically has to be a path of diminishing returns based on the ratio of source to output frame size. By the same token, utilizing power to post-process a native 1080p frame has to curve upwards towards some point of convergence quality-wise.

What I'm getting at is that if the 4k base frame is arrived at indirectly, at a processing cost decided on an ad-hoc basis by the developer of the running app, a system-level downscale is not an ideal solution. The downsampling also needs to be handled on an ad-hoc basis, which is why we are in the situation we're in.

Durante said:
There's absolutely no "fudging" when doing a simple bilinear downsampling from 3840x2160 to 1920x1080. It's a 4:1 mapping.

A 4:1 remap of data that may have been scaled up already from as little as 1440p original image. If you can see a discrepancy between a native 2160p frame and one that has been created via temporal injection/CBD or whatever, that same discrepancy is going to be reflected in the 1080p frame surely? Yes, you'll gain stability from less sub-pixel jitter, but what about the rest of the image?
 

Almost every game runs at less than 4k: Battlefield 1, Titanfall 2, Uncharted 4... Every time there is the option, I choose the supersampled version because of how much crisper the image is.

Yup, they even deleted the feedback requests that were getting tons of votes.
Arrogant Sony is back.
 

Clear

CliffyB's Cock Holster
Almost every game runs at less than 4k: Battlefield 1, Titanfall 2, Uncharted 4... Every time there is the option, I choose the supersampled version because of how much crisper the image is.

I don't doubt you. My point is simply that these processes aren't magical; a larger array of pixels is just being transformed into a smaller one. The question is what produces the best final result given the same finite power to operate upon that data.
 
For the time being, every PS4 pro game looks way better supersampled compared to native 1080p.
 

Vashetti

Banned
CoD WW2 doesn't downsample either, at least in the beta.

That can't be right, the beta has pristine image quality.

If true, the TAA in use is simply superb.

The PS4 version uses a dynamic resolution with the native resolution ranging from 960x1080 to 1920x1080. The PS4 Pro version also uses a dynamic resolution with the native resolution ranging from 1440x1620 to 2880x1620. The PS4 Pro downsamples from this resolution when outputting at 1080p. Both versions also appear to feature a temporal reconstruction technique that improves the resolution for parts of the frame that are similar to previous frames.

https://www.youtube.com/watch?v=mBo0bS3qkdk
 

Fafalada

Fafracer forever
Clear said:
The downsampling also needs to be handled on an ad-hoc basis, which is why we are in the situation we're in.
The system already has video options - downsampling could sit there (not unlike how people mostly do it on PC, where it's also ad-hoc). They could simply re-enable the "4k" option when connected to 1080p displays (rename it to something else if it sounds confusing), let people handle it themselves from there, and keep it transparent for games that already support 4k modes.
 

Durante

Member
A 4:1 remap of data that may have been scaled up already from as little as 1440p original image. If you can see a discrepancy between a native 2160p frame and one that has been created via temporal injection/CBD or whatever, that same discrepancy is going to be reflected in the 1080p frame surely? Yes, you'll gain stability from less sub-pixel jitter, but what about the rest of the image?
It won't be as good as downsampling from native 4k of course, but I have a very hard time imagining any scenario (in a 3D rendered game, not something like pixel art) where the final result of a system-level downsampling feature would not be far preferable to a native 1080p render.
 

KageMaru

Member
A 4:1 remap of data that may have been scaled up already from as little as 1440p original image. If you can see a discrepancy between a native 2160p frame and one that has been created via temporal injection/CBD or whatever, that same discrepancy is going to be reflected in the 1080p frame surely? Yes, you'll gain stability from less sub-pixel jitter, but what about the rest of the image?

Would it be possible to skip the upscaling step when rendering at a sub-2160p resolution? For example can't a 1440p game just downsample to 1080p instead of upscaling to 2160p and downsampling from there?
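For a rough sense of the two paths described above (using Pillow purely as a stand-in; the Pro's scaler and each game's own resolve obviously work differently, and the file name here is hypothetical):

Code:
from PIL import Image

def direct_downsample(img_1440p):
    # 2560x1440 -> 1920x1080 in a single filtered resize.
    return img_1440p.resize((1920, 1080), Image.LANCZOS)

def via_4k_round_trip(img_1440p):
    # 2560x1440 -> 3840x2160 (upscale) -> 1920x1080 (downscale).
    upscaled = img_1440p.resize((3840, 2160), Image.BILINEAR)
    return upscaled.resize((1920, 1080), Image.LANCZOS)

src = Image.open("screenshot_1440p.png")  # hypothetical 2560x1440 capture
direct = direct_downsample(src)
round_trip = via_4k_round_trip(src)
# The extra interpolation pass in the round trip tends to soften fine
# detail slightly compared with resizing straight to 1080p.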
 

Draper

Member
I don't want to toss in a competitive angle into this, but I was planning to get the white PS Pro for my Panny Plasma, but it's like, what's the point really? They don't seem to give much of a fuck for 1080 players.

Isn't downscaling system wide for the Xbox One X?
 
It won't be as good as downsampling from native 4k of course, but I have a very hard time imagining any scenario (in a 3D rendered game, not something like pixel art) where the final result of a system-level downsampling feature would not be far preferable to a native 1080p render.

Would it be possible to skip the upscaling step when rendering at a sub-2160p resolution? For example can't a 1440p game just downsample to 1080p instead of upscaling to 2160p and downsampling from there?
You're missing his point - he's bringing up that as the internal resolution goes down, there should be a point where just applying better AA and extra effects to a 1080p image would provide better IQ than supersampling. Exactly where that point is is unknown, but surely downsampling, say, a 1200p image would be less efficient than just applying some better AA to a 1080p one.
 

TGO

Hype Train conductor. Works harder than it steams.
Isn't that technically false advertising? 1080p downsampling is one of its features, and having "Pro Enhanced" written on the back, then finding out it does fuck all on your Pro because it's disabled, is a bit misleading.
It's not listed as 4K TV only, so... "Pro Enhanced" definitely implies it's being Enhanced when playing on a Pro.
 

Lord Error

Insane For Sony
It won't be as good as downsampling from native 4k of course, but I have a very hard time imagining any scenario (in a 3D rendered game, not something like pixel art) where the final result of a system-level downsampling feature would not be far preferable to a native 1080p render.
I can think of one kind of example. If you have the OS trick the game into thinking it's connected to a 4K screen, and then the OS does the downsampling from 4K, you will be getting slightly worse image quality in many Pro games that render sub-4K and already do downsampling in their own software.

What I mean is: 1440p downsampled to 1080p (like UC4 or UC:TLL) will look crisper than if, in the same games, 1440p was first upscaled to 4K (by the OS tricking the game) and then the OS scaled that unnecessarily blurred image back to 1080p.

System-level downsampling makes more sense where all games render at 4K, but in the case of the Pro it's not as clear-cut, and games like UC4 would then need another patch to avoid this extra upscaling step for that option to make sense. So you'd end up with many high-end games having slightly worse IQ, for the benefit of a few fringe games that don't support downsampling in their own software. Not a very good show IMO. However, I think some kind of game-by-game selective option to force 4K output at the system level would make sense on Pro - just as long as it's not global.
 
Isn't that technically false advertising? 1080p downsampling is one of its features, and having "Pro Enhanced" written on the back, then finding out it does fuck all on your Pro because it's disabled, is a bit misleading.
It's not listed as 4K TV only, so... "Pro Enhanced" definitely implies it's being Enhanced when playing on a Pro.

I mean, it's pretty much on-par with their "Dynamic 4K" spiel.
 
Could it be that the built-in scaler 'chip' cannot deal with anything above 1080p, as it's just the same as the one in the OG PS4? So the only way to do it is with dedicated GPU time, hence requiring manual intervention.

/wild guess
 

Neith

Banned
0%. They know 95% of households don't have 4k TVs. And they know supersampling is useful because they apply it in regular PS4s when connected to 720p tvs.

Maybe they don't want their ~3 million PS4 pro users to have significantly better iq than their ~63 million OG PS4 users? I don't know. But it's pissing me off.

I still have my launch PS4. I'm tempted to sell my Pro for around 300, add 200 more and buy the XboneX. Keep the best OG console for its exclusives and the best refresh for multiplatforms. And I will save money because I won't even need to buy a 4k TV. Win-win strategy.

That is not a win-win strategy, it is utter madness, especially with all the great Sony exclusives still to release that 100% will have supersampling. And then when Sony finally implements it system-wide in ten years....

Could it be that the built-in scaler 'chip' cannot deal with anything above 1080p, as it's just the same as the one in the OG PS4? So the only way to do it is with dedicated GPU time, hence requiring manual intervention.

/wild guess

Highly doubt it. Seems like a somewhat random, lazy dev thing, to be frank.
 
Could it be that the built-in scaler 'chip' cannot deal with anything above 1080p, as it's just the same as the one in the OG PS4? So the only way to do it is with dedicated GPU time, hence requiring manual intervention.

/wild guess

My guess is Sony's tools just suck and it's harder for devs to add than it should be.

This is a company that fucked up texture filtering for half the generation for crying out loud.
 

Neith

Banned
I don't want to toss in a competitive angle into this, but I was planning to get the white PS Pro for my Panny Plasma, but it's like, what's the point really? They don't seem to give much of a fuck for 1080 players.

Isn't downscaling system wide for the Xbox One X?

What's the point? Nearly all Sony games and many indies, as well as boost mode, are quite effective in their Pro enhancements. People are pissed, but make no mistake, no sane Pro user wants to go back to the OG PS4 lmao.
 

Clear

CliffyB's Cock Holster
Would it be possible to skip the upscaling step when rendering at a sub-2160p resolution? For example can't a 1440p game just downsample to 1080p instead of upscaling to 2160p and downsampling from there?

Exactly what I was getting at: that intermediate step in creating the 4k frame is likely doing work that, although worthwhile for 4k output, might not be optimal if the result is subsequently downscaled to 1080p.

The fundamental thing is that there's no technical reason whatsoever for the 1080p output mode not to always utilize downsampling if that produces the best results, because if the hardware identifies itself as capable of doing the work to produce a 4k frame, the actual output resolution is immaterial.

Point being, if there is a 4k mode and a Pro-only 1080p mode, then the only reason not to implement downsampling as part of its 1080p renderpath (regardless of what display is hooked up) is if the devs feel that they can make better use of the GPU bandwidth by other means.

Yes, you could argue it's the devs simply deciding not to bother, but considering that it should be a relatively simple thing to implement compared to other nips and tucks that are commonly considered "enhancements", it seems strange that it isn't used more where titles offer Pro-only visual output at 1080p in addition to the 4k mode.
 

KageMaru

Member
You're missing his point - he's bringing up that as the internal resolution goes down, there should be a point where just applying better AA and extra effects to a 1080p image would provide better IQ than supersampling. Exactly where that point is is unknown, but surely downsampling, say, a 1200p image would be less efficient than just applying some better AA to a 1080p one.

I'm not missing his point. He specifically mentioned upscaling up then down. I was wondering why they would upscale up instead of just downsampling to 1080p. Also it's less likely for a studio to implement better AA just for one specific resolution, so I think it's a fair question.
 

RAWRferal

Member
At what point can PS4 Pro owners start actually taking some sort of action against Sony for the lack of downsampling on titles considering it was a much touted feature before launch?

I don't mean to come across as self-entitled, but this is way beyond a joke now. People have speculated about a system-wide firmware implementation for months now, but updates have come and gone with no change and if it was planned, surely Sony would have said something by now.

I know they never explicitly guaranteed anything, but at the same time I know myself and many other users are feeling pretty pissed off about this now, especially as there seems to be no logical explanation for its omission in many titles, where performance would not be compromised.

I can't think of any reason apart from pushing people onto 4k TVs. I realise this makes me sound like some sort of tin foil hat conspiracy theorist, but at this point I've run out of any other reason why this would be happening.
 
As a 1080p Pro owner, I've been continuously disappointed with this lack of systemwide support. I would not recommend the Pro to someone with a 1080p screen; save your pennies folks.

Sony has once again decided that they'll stop being consumer friendly, and so will inevitably be usurped by Xbox for their troubles.
 
I'm not missing his point. He specifically mentioned upscaling up then down. I was wondering why they would upscale up instead of just downsampling to 1080p. Also it's less likely for a studio to implement better AA just for one specific resolution, so I think it's a fair question.
...frankly I'm not sure why I included your post. Carry on.
 

Durante

Member
You're missing his point - he's bringing up that as the internal resolution goes down, there should be a point where just applying better AA and extra effects to a 1080p image would provide better IQ than supersampling. Exactly where that point is is unknown, but surely downsampling, say, a 1200p image would be less efficient than just applying some better AA to a 1080p one.
I think you are also missing the point, at least the one I was trying to argue.
That point is that it's well established in this thread by now that many developers don't care enough to implement specific 1080-TV-with-PS4-Pro modes. Therefore, a system-level option is always better than having those games render at just 1080p.

So you'd end up with many high-end games having slightly worse IQ, for the benefit of a few fringe games that don't support downsampling in their own software.
Why? High-end games which care enough to have a specific mode can continue to use that specific mode. The system-level mode is for everything else (which, at least judging by this thread, are more than just "a few fringe games").
 

Fafalada

Fafracer forever
Lord Error said:
What I mean is: 1440p downsampled to 1080p (like UC4 or UC:TLL) will look crisper than if, in the same games, 1440p was first upscaled to 4K (by the OS tricking the game) and then the OS scaled that unnecessarily blurred image back to 1080p.
Er... OS tricking the game doesn't force an upscale to 4k.
That would only happen if a game specifically implements the upscale in software. Which would fall under 'very' special cases that probably benefit from the upscale in some way - custom resolve that does something at native 4k perhaps, in which case you'd not lose out quality wise anyway.
 

Flandy

Member
Somewhat unrelated question
Does Pro use good upscaling so 1080p scales perfectly into 4k? I'm wondering if a non Pro 1080p game such as Persona 5 looks exactly the same on my 4k TV as it would on any 1080p TV or if the image is blurred somewhat from the upscale

No. DF have tested it.

If you're on a 4K TV, you can set the Pro to 1080p and the games will downsample. Connect to a 1080p TV? No go.

Set a standard PS4 to 720p, the games are still rendering at 900p/1080p.

Link to the DF test?
 

Lord Error

Insane For Sony
Er... OS tricking the game doesn't force an upscale to 4k.
That would only happen if a game specifically implements the upscale in software. Which would fall under 'very' special cases that probably benefit from the upscale in some way - custom resolve that does something at native 4k perhaps, in which case you'd not lose out quality wise anyway.

Why? High-end games which care enough to have a specific mode can continue to use that specific mode.
My understanding is that all games on Pro that render at, say, 1440p or 1800p do in fact upscale that to 4K - somewhere - before sending that 4K signal to the detected 4K TV. If this upscale step happens somewhere in the OS, then I can see how this could theoretically work. What I don't get is how games not patched to be aware of this option would behave. Would they have to be tricked into thinking they are displaying on a 4K TV, or what?

The system-level mode is for everything else (which, at least judging by this thread, are more than just "a few fringe games").
I honestly think the issue is overblown, as very few high-profile games suffer from this problem. A 'must downsample' TRC would be more beneficial IMO than a potentially confusing system-level option for this. The number of these games right now is way smaller than the number of games that render at sub-4K, so I really think that if there's any chance of this option introducing any issue to games that render at sub-4K, the option should be something you activate on a per-game basis.

Somewhat unrelated question
Does Pro use good upscaling so 1080p scales perfectly into 4k? I'm wondering if a non Pro 1080p game such as Persona 5 looks exactly the same on my 4k TV as it would on any 1080p TV or if the image is blurred somewhat from the upscale
There is no way to upscale a 1080p image to a 4K TV so that it looks the same as it does on a 1080p TV. The pixel density is not the same, so even when you do a straight nearest-neighbour pixel duplication, the result just doesn't look the same.
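As an aside, a straight nearest-neighbour 1080p-to-2160p upscale is just a 2x2 duplication of every pixel, so the pixel values themselves don't change; the difference described above comes from the panel, not the scaling math. A quick NumPy sketch of that duplication, for illustration only:

Code:
import numpy as np

def nearest_neighbour_2x(frame_1080p):
    """Duplicate every pixel into a 2x2 block: 1920x1080 -> 3840x2160."""
    return np.repeat(np.repeat(frame_1080p, 2, axis=0), 2, axis=1)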
 

sn0man

Member
It didn't get a full article, I believe it's buried in their GT Sport beta test article.

Paging dark10x, we're still waiting on that promised Pro downsampling article!

Wouldn't it be most damning, and consequently most likely to get Sony in gear, if they waited till they had an Xbox One X to show the issues glaringly?
 