Is there a chance at all that Sony isn't aware of the community's need for this?

Lol, they know.
Is there a chance at all that Sony isn't aware of the community's need for this?

0%. They know 95% of households don't have 4K TVs. And they know supersampling is useful, because they apply it on regular PS4s connected to 720p TVs.
Wow.
Still, you simply should not trust developers not to fuck this up, especially when there is a way to easily avoid the potential issue at the system level.
There are very few native 4K PS4 Pro games. IQ looks infinitely better in each and every supersampled game compared to native 1080p. So, whatever artifacting there is, IMHO it looks much better than just rendering at 1080p.

Logically, though, the value of downsampling depends on the quality of the image in the 4K framebuffer. Going from a pristine native 4K image is not the same as going from one that's already been scaled up from a resolution between 1080p and 4K.
A 4K frame reconstructed via checkerboarding, temporal injection and/or other rescaling methods is inevitably going to contain some sort of artifacting, which may well be amplified by an additional hardware-level rescale, because every extra pass over the data adds more fudging.

You could end up with an over-processed 1080p image, with any/all performance costs of the internal 4K upscale lumped in! That doesn't sound like a guaranteed win to me.
Objectively, upscaling then downscaling is not smart pipelining.
Is there a chance at all that Sony isn't aware of the community's need for this?

This is a company that knows that people have, for years, wanted to be able to friggen change their PSN account names... and it still hasn't happened.
A 4k frame reconstructed via checkerboarding, temporal injection and/or additional re-scaling methods is inevitably going to contain some sort of artifacting, which may well be amplified by an additional hardware-level rescale because with every additional process on the data more and more fudging is going on.

There's absolutely no "fudging" when doing a simple bilinear downsampling from 3840x2160 to 1920x1080. It's a 4:1 mapping.
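That point can be made concrete: at exactly 2x per axis, a bilinear downsample degenerates into a plain average of each 2x2 block. A toy Python sketch (made-up luminance values, not the console's actual scaler):

```python
# Toy sketch of a 2x-per-axis bilinear downsample (3840x2160 -> 1920x1080
# is exactly this ratio). Values are illustrative luminance samples.
def box_downsample(img):
    # At a 4:1 pixel ratio the bilinear kernel lands exactly on 2x2
    # blocks, so the downsample is a plain average of four source pixels.
    h, w = len(img), len(img[0])
    out = []
    for y in range(0, h, 2):
        row = []
        for x in range(0, w, 2):
            row.append((img[y][x] + img[y][x + 1]
                        + img[y + 1][x] + img[y + 1][x + 1]) / 4.0)
        out.append(row)
    return out

frame = [[0, 1, 2, 3],
         [4, 5, 6, 7],
         [8, 9, 10, 11],
         [12, 13, 14, 15]]
print(box_downsample(frame))  # [[2.5, 4.5], [10.5, 12.5]]
```

Every output pixel is the exact mean of four source pixels and no filter kernel ever straddles a pixel boundary - that's the sense in which a 4:1 mapping involves no "fudging".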
You sure?

Yes.
Yes.
The 4K option is greyed out and can't be selected; only the 60/30fps toggle works.
Yes.

The 4K-option is greyed out and can't be selected, only the 60/30fps toggle works.

Not even ashamed of teasing this directly in the game menus.
There are very few native 4K PS4 Pro games. IQ looks infinitely better in each and every supersampled game compared to native 1080p. So, whatever artifacting there is, IMHO it looks much better than just rendering at 1080p.
Durante said: There's absolutely no "fudging" when doing a simple bilinear downsampling from 3840x2160 to 1920x1080. It's a 4:1 mapping.
Lol they know

Yup, they even deleted the feedback requests that were getting tons of votes.
Are there any examples of a sub-native-4K buffer (say 1440p or even 1800p) being used to create supersampled 1080p output? How does that compare to using the same horsepower to anti-alias or filter a native 1080p frame?

My point is that I understand the benefits of supersampling from native 4K, but logically there has to be a path of diminishing returns based on the ratio of source to output frame size. By the same token, using that power to post-process a native 1080p frame has to curve upwards towards some point of convergence, quality-wise.

What I'm getting at is that if the 4K base frame is arrived at indirectly, at a processing cost decided on an ad-hoc basis by the developer of the running app, a system-level downscale is not an ideal solution. The downsampling also needs to be handled on an ad-hoc basis, which is why we are in the situation we're in.

It's a 4:1 remap of data that may already have been scaled up from an original image as low as 1440p. If you can see a discrepancy between a native 2160p frame and one created via temporal injection/CBD or whatever, surely that same discrepancy is going to be reflected in the 1080p frame? Yes, you'll gain stability from less sub-pixel jitter, but what about the rest of the image?
Yup, they even deleted the feedback requests that were getting tons of votes.

Arrogant Sony is back.
Almost every game runs at less than 4k: Battlefield 1, Titanfall 2, Uncharted 4... Every time there is the option, I choose the supersampled version because of how much crisper the image is.
For the time being, every PS4 Pro game looks way better supersampled compared to native 1080p.

I don't doubt you. My point is simply that these processes aren't magical; a larger array of pixels is just being transformed into a smaller one. The question is what produces the best final result given the same finite power to operate on that data.
CoD WW2 doesn't downsample either, at least in the beta.
That can't be right, the beta has pristine image quality.
If true, the TAA in use is simply superb.
The PS4 version uses a dynamic resolution with the native resolution ranging from 960x1080 to 1920x1080. The PS4 Pro version also uses a dynamic resolution with the native resolution ranging from 1440x1620 to 2880x1620. The PS4 Pro downsamples from this resolution when outputting at 1080p. Both versions also appear to feature a temporal reconstruction technique that improves the resolution for parts of the frame that are similar to previous frames.
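As a quick sanity check on those figures, here are the supersampling ratios each bound implies when the output target is 1080p (pure arithmetic on the numbers quoted above; the labels are mine):

```python
# Rendered pixels per 1080p output pixel, using the resolution bounds
# quoted above.
TARGET = 1920 * 1080

bounds = {
    "PS4 min (960x1080)":  960 * 1080,
    "PS4 max (1920x1080)": 1920 * 1080,
    "Pro min (1440x1620)": 1440 * 1620,
    "Pro max (2880x1620)": 2880 * 1620,
}
for label, pixels in bounds.items():
    print(f"{label}: {pixels / TARGET:.3f}x a 1080p frame")
```

So even at its lowest bound the Pro version carries slightly more than one rendered pixel per output pixel (1.125x), and at its best bound 2.25x - which is why a downsample to 1080p would still buy something.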
Clear said: The downsampling also needs to be handled on an ad-hoc basis, which is why we are in the situation we're in.

The system already has video options - downsampling could sit there (not unlike how people mostly do it on PC, where it's also ad hoc). They could simply re-enable the "4K" option when connected to 1080p displays (renaming it if that sounds confusing), let people handle it themselves from there, and keep it transparent for games that already support 4K modes.
A 4:1 remap of data that may have been scaled up already from as little as 1440p original image. If you can see a discrepancy between a native 2160p frame and one that has been created via temporal injection/CBD or whatever, that same discrepancy is going to be reflected in the 1080p frame surely? Yes, you'll gain stability from less sub-pixel jitter, but what about the rest of the image?

It won't be as good as downsampling from native 4K, of course, but I have a very hard time imagining any scenario (in a 3D-rendered game, not something like pixel art) where the final result of a system-level downsampling feature would not be far preferable to a native 1080p render.
Must have misunderstood in the video then, carry on.
Would it be possible to skip the upscaling step when rendering at a sub-2160p resolution? For example, can't a 1440p game just downsample to 1080p instead of upscaling to 2160p and downsampling from there?

You're missing his point - he's bringing up that as the internal resolution goes down, there should be a point where just applying better AA and extra effects to a 1080p image would provide better IQ than supersampling. Exactly where that point is is unknown, but surely downsampling, say, a 1200p image would be less efficient than just applying some better AA to a 1080p one.
No. DF have tested it.
If you're on a 4K TV, you can set the Pro to 1080p and the games will downsample. Connect to a 1080p TV? No go.
Set a standard PS4 to 720p, the games are still rendering at 900p/1080p.
Yes.
The 4K-option is greyed out and can't be selected, only the 60/30fps toggle works.
The fact that Sony-developed titles are fucking over 1080p owners is the biggest slap in the face.
It won't be as good as downsampling from native 4k of course, but I have a very hard time imagining any scenario (in a 3D rendered game, not something like pixel art) where the final result of a system-level downsampling feature would not be far preferable to a native 1080p render.

I can think of one kind of example. If the OS tricks the game into thinking it's connected to a 4K screen, and then the OS does the downsampling from 4K, you will get slightly worse image quality in many Pro games that render sub-4K and already do downsampling in their own software.
Isn't that technically false advertising? 1080p downsampling is one of its features, and having "Pro Enhanced" written on the back, then finding out it does fuck all on your Pro because it's disabled, is a bit misleading.
It's not listed as 4K-TV-only, so... "Pro Enhanced" definitely implies it's being enhanced when playing on a Pro.
0%. They know 95% of households don't have 4k TVs. And they know supersampling is useful because they apply it in regular PS4s when connected to 720p tvs.
Maybe they don't want their ~3 million PS4 Pro users to have significantly better IQ than their ~63 million OG PS4 users? I don't know. But it's pissing me off.

I still have my launch PS4. I'm tempted to sell my Pro for around 300, add 200 more and buy the XboneX. Keep the best OG console for its exclusives and the best refresh for multiplatforms. And I'll save money because I won't even need to buy a 4K TV. Win-win strategy.
Could it be that the built-in scaler 'chip' can't deal with anything above 1080p, as it's just the same as the one in the OG PS4? Then the only way to do it would be with dedicated GPU time, hence requiring manual intervention.
/wild guess
I don't want to toss a competitive angle into this, but I was planning to get the white PS4 Pro for my Panny plasma, and it's like, what's the point really? They don't seem to give much of a fuck about 1080p players.
Isn't downscaling system-wide on the Xbox One X?
Would it be possible to skip the upscaling step when rendering at a sub-2160p resolution? For example can't a 1440p game just downsample to 1080p instead of upscaling to 2160p and downsampling from there?
You're missing his point - he's bringing up that as the internal resolution goes down, there should be a point where just applying better AA and extra effects to a 1080p image would provide better IQ than supersampling. Exactly where that point is is unknown, but surely downsampling, say, a 1200p image would be less efficient than just applying some better AA to a 1080p one.
I'm not missing his point. He specifically mentioned upscaling up then down. I was wondering why they would upscale instead of just downsampling to 1080p. Also, it's less likely for a studio to implement better AA just for one specific resolution, so I think it's a fair question.

...Frankly, I'm not sure why I included your post. Carry on.
You're missing his point - he's bringing up that as the internal resolution goes down, there should be a point where just applying better AA and extra effects to a 1080p image would provide better IQ than supersampling. Exactly where that point is is unknown, but surely downsampling, say, a 1200p image would be less efficient than just applying some better AA to a 1080p one.

I think you are also missing the point, at least the one I was trying to argue.
So you'd end up with many high-end games having slightly worse IQ, for the benefit of a few fringe games that don't support downsampling in their own software.

Why? High-end games which care enough to have a specific mode can continue to use that specific mode. The system-level mode is for everything else (which, at least judging by this thread, is more than just "a few fringe games").
Lord Error said: What I mean is: 1440p downsampled to 1080p (like UC4 or UC:TLL) will look crisper than if, in the same games, 1440p was first upscaled to 4K (by the OS tricking the game) and then the OS scaled that unnecessarily blurred image back down to 1080p.

Er... the OS tricking the game doesn't force an upscale to 4K.
No. DF have tested it.
If you're on a 4K TV, you can set the Pro to 1080p and the games will downsample. Connect to a 1080p TV? No go.
Set a standard PS4 to 720p, the games are still rendering at 900p/1080p.
Link to the DF test?
Er... OS tricking the game doesn't force an upscale to 4k.
That would only happen if a game specifically implements the upscale in software. Which would fall under 'very' special cases that probably benefit from the upscale in some way - custom resolve that does something at native 4k perhaps, in which case you'd not lose out quality wise anyway.
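For what it's worth, the round trip Lord Error worries about is easy to demonstrate in miniature. A toy 1D sketch with made-up values (a linear upscale plus a box downscale standing in for whatever filtering the hardware would actually use):

```python
# Toy 1D sketch: linearly upscaling a signal and then box-filtering it
# back down is not a no-op - the round trip softens hard edges.
# Illustrative only; this is not the console's actual scaler.

def lerp_upscale(sig, factor):
    # Linear interpolation between neighbouring samples.
    out = []
    for i in range(len(sig) - 1):
        for k in range(factor):
            t = k / factor
            out.append(sig[i] * (1 - t) + sig[i + 1] * t)
    out.append(sig[-1])
    return out

def box_downscale(sig, factor):
    # Average consecutive groups of `factor` samples.
    usable = len(sig) - len(sig) % factor
    return [sum(sig[i:i + factor]) / factor for i in range(0, usable, factor)]

edge = [0, 0, 0, 100, 100, 100]  # a hard edge in the source signal
round_trip = box_downscale(lerp_upscale(edge, 2), 2)
print(round_trip)  # [0.0, 0.0, 25.0, 100.0, 100.0]
```

The hard 0-to-100 edge comes back with a 25 in the middle, i.e. smeared - which is why skipping an intermediate upscale matters in the cases where it actually happens.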
Why? High-end games which care enough to have a specific mode can continue to use that specific mode.

My understanding is that all games on Pro that render at, say, 1440p or 1800p do in fact upscale that to 4K - somewhere - before sending that 4K signal to the detected 4K TV. If this upscale step happens somewhere in the OS, then I can see how this could theoretically work. What I don't get is how games not patched to be aware of this option would behave. Would they have to be tricked into thinking they are displaying on a 4K TV, or what?
The system-level mode is for everything else (which, at least judging by this thread, are more than just "a few fringe games").

I honestly think the issue is overblown, as very few high-profile games suffer from this problem. A "must downsample" TRC would be more beneficial IMO than a potentially confusing system-level option. The number of these games right now is way smaller than the number of games that render at sub-4K, so I really think that if there's any chance of this option introducing issues into games that render at sub-4K, it should be something you activate on a per-game basis.
Somewhat unrelated question: does the Pro use good upscaling, so that 1080p scales perfectly to 4K? I'm wondering if a non-Pro 1080p game such as Persona 5 looks exactly the same on my 4K TV as it would on any 1080p TV, or if the image is somewhat blurred by the upscale.

There is no way to upscale 1080p to a 4K TV and have it look the same as it does on a 1080p TV. The pixel density is not the same, so even when you do straight nearest-neighbour pixel duplication, the result just doesn't look the same.
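One nuance: 1920x1080 to 3840x2160 is an exact 2x integer scale per axis, so nearest-neighbour duplication preserves every source value; any perceived difference comes from panel density and the TV's own processing rather than from lost data. A toy sketch of the mapping (made-up pixel values):

```python
# 1080p -> 4K is an exact 2x integer scale: each source pixel maps to a
# 2x2 block, so nearest-neighbour duplication loses no information.
def nn_upscale_2x(img):
    out = []
    for row in img:
        doubled = [p for p in row for _ in (0, 1)]  # duplicate horizontally
        out.append(doubled)
        out.append(list(doubled))                   # duplicate vertically
    return out

tiny = [[1, 2],
        [3, 4]]
print(nn_upscale_2x(tiny))
# [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```

Whether the TV then adds sharpening or other processing on top of the 4K signal is a separate question, and that is where visible differences can creep in.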
It didn't get a full article, I believe it's buried in their GT Sport beta test article.
Paging dark10x, we're still waiting on that promised Pro downsampling article!