Obscuritas
And he's damn right.
I honestly thought this would be the gen where we'd get that golden standard of 1080p60 but we barely get 1080p30

Yes, this please.
Marketing 4K is also a problem for Microsoft. If the game's graphics look the same but run at 4K resolution rather than 900p, how do you market that to get people to buy in? Run a commercial that they're watching on their 1080p television?
Waste of resources. Put the horsepower to use to make the games look gorgeous at 1080p/60fps.
I honestly thought this would be the gen where we'd get that golden standard of 1080p60 but we barely get 1080p30
I disagree. Sitting about 8 feet from a 65-inch 4K set, games look amazing in 4K. 1080p is a huge decrease in IQ. Witcher 3 is straight up ridiculous at 4K maxed, even if it is only 30fps. I don't really feel that 60fps is necessary for non-fast-paced single-player games.
Some of us were laughing at the idea when we saw the CPUs going into the PS4/X1. The GPUs could definitely get there, but not with those weak-ass CPUs.
That's because the hardware for proper 4K isn't there yet.
Yeah. How much better does 1440p look? I only have a 1080p monitor so I have no idea. Perhaps a middle ground of higher IQ plus better effects.
Devs can't even do good IQ at 1080p with all that antialiasing blurring, what makes you think they could do good IQ at 4K?
That's pretty much what I'm expecting. Flashy 1080p graphics with heavy anti-aliasing for AAA games and 4K for games with more simple graphics or remasters.

Especially AAA developers. I could imagine some indies who don't really have a budget for scalable graphics considering 4K, to put the power to use on something.
Or if there's room to spare, something like 2K would still look great on a 4K screen.

You know that 2K is 1080p, right?
Oh hey, Andrew Lauritzen is in that convo.
Completely agreed.
1080p/60 w/ bells and whistles.
That 4x more pixels creates a cleaner image? To the point where AA is hardly even necessary?
Do you have a 4K monitor or set? Try running a game at 4K on high or maxed settings, even without any AA, and tell me the IQ is not dramatically improved.
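To put a number on the "4x more pixels" claim, here's a quick Python sanity check (the resolution figures are just the usual consumer values, nothing from this thread):

```python
# Pixel counts for common render resolutions, relative to 1080p.
resolutions = {
    "720p":   (1280, 720),
    "900p":   (1600, 900),
    "1080p":  (1920, 1080),
    "1440p":  (2560, 1440),
    "4K/UHD": (3840, 2160),
}

base = 1920 * 1080  # 1080p pixel count

for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name:>7}: {pixels:>9,} pixels ({pixels / base:.2f}x 1080p)")
```

UHD works out to exactly 4x the pixels of 1080p, which is where the "hardly need AA" intuition comes from: each jaggy covers a quarter of the screen area it did before.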
A 4K image wouldn't require as aggressive of an AA solution as a 1080p image. There probably wouldn't be any point in using temporal AA. Just simple FXAA or SMAA would cut it, heck even no AA solution would still look okay. Sure there wouldn't be as much geometry and world detail at 4K but just the resolution bump alone makes every texture stand out so much more. I haven't made up my mind yet about 4K gaming but I have tried 1440p gaming and that's pretty sick.
If devs can improve on the method Quantum Break used (which I think is what they were talking about?), then that's fine.
I'd rather they go 1080/60 full AA and effects that blow my socks off. All the bells and whistles.
nOoblet16 said:
A 1080p image with TSSAA can probably provide a sharper, or at least as sharp, image (with much less power) than a 4K image smeared with FXAA blur.

It's not even close. If you disabled AA at 4K altogether you'd get less temporal stability, but it would still look better in stills.
A 1080p image with TSSAA can probably provide a sharper, or at least as sharp, image (with much less power) than a 4K image smeared with FXAA blur. Keep in mind TSSAA is effectively supersampling of the temporal kind, so the difference in jaggies won't be much; it will have artifacts (which can be reduced with good techniques, like Doom's), but at the same time FXAA will have its blur.
There's also the fact that with the saved performance you can push for 60fps at 1080p, and 60fps provides additional temporal resolution on top of that. At 30fps you don't get this.
In short, considering how much power is required to run current-gen games at 4K, it's not worth going for, ESPECIALLY on a console like Scorpio (which won't even be as powerful as a 980 Ti), when you can go for superior rendering techniques and double the framerate at 1080p with extremely good TSSAA.
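The framerate side of that trade-off is easy to put numbers on: to a first approximation, shading work scales with pixels drawn per second. A quick sketch (the function name is mine, not from the thread):

```python
# Rough pixel-throughput comparison behind the "1080p60 instead of
# 4K30" argument: shading cost scales roughly with pixels per second.
def pixels_per_second(width, height, fps):
    return width * height * fps

uhd_30 = pixels_per_second(3840, 2160, 30)  # 4K at 30 fps
fhd_60 = pixels_per_second(1920, 1080, 60)  # 1080p at 60 fps

print(uhd_30 / fhd_60)  # → 2.0
```

So even after doubling the framerate, 1080p60 shades only half the pixels of 4K30, leaving real headroom for better effects or AA. (This ignores per-frame fixed costs and the CPU, so it's only a first-order estimate.)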
I don't think 4K is good for gaming right now, tbh. There are so many things devs don't do already, that 4K just feels like it'd make that worse.
I like the idea of reconstructing 4K from other data, like Killzone did, and another game more recently that I forgot the name of: using previous frame buffers and temporal data to augment a lower-resolution new frame into a new one.
You'd get "native" 4K rendering but with much lower overhead.
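A minimal sketch of that idea, assuming the interleaved approach Killzone Shadow Fall's multiplayer is known for (shade half the columns each frame, pull the other half from the previous frame); the function name and toy frames are mine, and real engines also reproject the old pixels along motion vectors, which is omitted here:

```python
import numpy as np

def reconstruct(prev_full, half_frame, even_columns):
    """Merge newly shaded columns into the previous reconstructed frame.

    prev_full:    (H, W) previous full-resolution frame
    half_frame:   (H, W//2) columns shaded this frame
    even_columns: True if half_frame holds columns 0, 2, 4, ...
    """
    out = prev_full.copy()
    start = 0 if even_columns else 1
    out[:, start::2] = half_frame  # overwrite only the fresh columns
    return out

# Toy 4x8 "frames": alternate which half of the columns gets shaded.
frame0 = np.zeros((4, 8))
frame1 = reconstruct(frame0, np.ones((4, 4)), even_columns=True)
frame2 = reconstruct(frame1, np.full((4, 4), 2.0), even_columns=False)
# After two frames every column has been refreshed, at half the
# shading cost per frame -- the "much lower overhead" in question.
```

The catch, as with any temporal technique, is artifacts when the previous frame's pixels no longer match (fast motion, disocclusion), which is exactly what the motion-vector reprojection step exists to reduce.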
Sorry, but what? Should I assume from the Zelda avatar that you aren't a PC gamer? Because there are a bunch of video cards out now that do an exceptional job at 4K: 980 Ti, 1070, 1080, Fury X... even the 390X can do well at 4K given proper optimization.
Having played a lot at 4K, I'd disagree with the sentiment that it's a waste of resources. The visual clarity is a massive jump in detail, especially at high settings. I think it would almost be a waste of resources to only do 1080p; 1440p at 60fps is probably the sweet spot for Scorpio.
It's a trade-off between how much detail you can show at a given resolution and how much you can render.

Just a question.
By this argument, why don't devs make 720p games with even better graphics?
How many here have actually gamed on 4K (UHD technically, but I'll use 4K moving forward) with power sufficient to play the game properly? I'm talking, GTX 1080 or a minimum of Crossfired R9 290s/ SLI'ed 970s.
There is absolutely a difference, and in my opinion a very large one. I will readily admit I am not like most gamers - I have tried gaming at 120Hz and prefer to play at 4K/60Hz instead. I'd say I am a true resolution junkie.
That said, I game on both my 28" 4K monitor, and my 65" 4K television. I sit 3 feet away from my monitor, and about 8-10 feet from my television. I truly believe that for 2D and VR gaming, 4K is the future and in the case of VR, will be essential.
Now to the point of this post. While I love 4K gaming, I don't think a console with 6TF will cut it at all. The GTX 1080 has ~9TF and it isn't perfect at it, but it is satisfactory. I want to see a truly capable 4K gaming console, and obviously I'd love to see it soon - however that is cost prohibitive and will not work. I'm hoping in the next 4-5 years we'll see hardware, from GPUs to VR HMDs, able to sufficiently support such an experience.