
Doom's lead graphics programmer thinks 4K isn't a good use of Xbox Scorpio's power

10k

Banned
Yes this please.

Also problematic for Microsoft is marketing 4K. If the game's graphics look the same but running at 4K resolution rather than 900p, how do you market that to people to get them to buy in? Run a commercial that they're viewing on their 1080p television?

Waste of resources. Put the horsepower to use to make the games look gorgeous at 1080p/60fps.
I honestly thought this would be the gen where we'd get that golden standard of 1080p60 but we barely get 1080p30 :(
 

LCGeek

formerly sane
He's right; spending it only on increasing the resolution is a waste. It's akin to using overly intensive AA for little benefit.

I honestly thought this would be the gen where we'd get that golden standard of 1080p60 but we barely get 1080p30 :(

Some of us were laughing at the idea when we saw the CPUs going into PS4/X1. The GPUs could definitely get there, but not with those weak-ass CPUs.
 

ZOONAMI

Junior Member
I disagree. Sitting about 8 feet from a 65-inch 4K set, games look amazing in 4K. 1080p is a huge decrease in IQ. Witcher 3 is straight-up ridiculous at 4K maxed, even if it is only 30fps. I don't really feel that 60fps is necessary for non-fast-paced single-player games.
 

darkinstinct

...lacks reading comprehension.
I disagree. Sitting about 8 feet from a 65-inch 4K set, games look amazing in 4K. 1080p is a huge decrease in IQ. Witcher 3 is straight-up ridiculous at 4K maxed, even if it is only 30fps. I don't really feel that 60fps is necessary for non-fast-paced single-player games.

Devs can't even do good IQ at 1080p with all that anti-aliasing blurring; what makes you think they could do good IQ at 4K?
 
Agreed. I'm going to own a 4K TV by the time Scorpio launches (and I'll be getting one of those as well) and I would prefer upscaled 1080p.

Good to know devs are (still) thinking about the best way to use the limited resources at their disposal, even if those resources aren't as limited as they once were.
 
It makes sense from Microsoft's standpoint. Games tend to be limited by the relatively weak CPUs in the consoles, so on the PC side, throwing more CPU power at something that has to run on consoles isn't going to give you much of an improvement. On the other hand, there's always something you could be doing with more GPU, even if that just means more AA, a higher framerate, or higher resolutions. So with the Scorpio, the CPU upgrades aren't going to be significant if the same game is also on the OG One, so those GPU upgrades will likely just go to making things sharper and smoother.
 

Shpeshal Nick

aka Collingwood
If devs can improve on the method Quantum Break used (which I think is what they were talking about?), then that's fine.

I'd rather they go 1080/60 full AA and effects that blow my socks off. All the bells and whistles.
 

hesido

Member
Absofrigginlutely.

4 times more intricate shaders, allowing realtime everything at 1080p or more of the same stuff at 4K. I know which one I'd choose.

However, I'm hopeful that the 4x-MSAA-at-1080p trick mentioned on Twitter (by whom I think is sebbbi from Beyond3D) catches on, because it's by far the most reasonable way to do 4K.

Edit: Sebbbi is a genius and I'm a big fan.
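The "4 times" figure in that tradeoff is just pixel arithmetic; a quick sanity check (plain Python, illustrative only):

```python
# Pixel counts for common render resolutions.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

pixels = {name: w * h for name, (w, h) in resolutions.items()}
base = pixels["1080p"]

for name, p in pixels.items():
    print(f"{name}: {p:,} pixels ({p / base:.2f}x 1080p)")

# 4K shades exactly 4x the pixels of 1080p, so at a fixed GPU budget
# each 4K pixel gets roughly a quarter of the shader work.
```

Note that 1440p lands at about 1.78x the pixels of 1080p, which is why several posters here see it as the middle ground.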
 
Some of us were laughing at the idea when we saw the CPUs going into PS4/X1. The GPUs could definitely get there, but not with those weak-ass CPUs.

As far as specs go, that's something I hate about both Sony's and MS's consoles. One big reason I'm not mad about the mid-gen upgrades is that we won't be shackled to the same issues for 7 goddamn years.
 
I've been saying this for a long time. Things like GUIs won't take much to render at 4K, but the rest of the image is better rendered at a lower resolution with some form of AA and then upscaled. Only graphically undemanding games will end up native 4K; the rest will use the extra power for better shaders. I'd still say there are benefits to going above 1080p for the many people who have larger screens and sit close to them. 1440p seems like a decent improvement in definition without wasting a lot of performance.

I would wager that we'll see more 60 FPS titles given the two-tier generation, but especially for multiplayer it might not happen because the older consoles can't reach that level, and the player pools aren't supposed to be separated. It seems like devs want to make 60 FPS multiplayer these days, and the higher tier models would certainly allow for that, but I don't see how it would work for older consoles without too large sacrifices.
 
that's because the hardware for proper 4K isn't there yet.

Sorry, but what? Should I assume from the Zelda avatar that you aren't a PC gamer? Because there are a bunch of video cards out now that do an exceptional job at 4K: 980 Ti, 1070, 1080, Fury X... even the 390X could do well at 4K given proper optimization.

Having played a lot at 4K, I'd disagree with the sentiment that it's a waste of resources. The visual clarity is a massive jump in detail, especially at high settings. I think it would almost be a waste of resources to only do 1080p; 1440p@60fps is probably the sweet spot for Scorpio.
 
I think 90% of the people claiming they'd rather see 1080p have never actually seen a game running at native 4K. There's a massive jump in image quality. I actually tested this theory on my 65" 4K HDR TV last night running Battlefront (took a framerate hit, obviously)... but 4x the pixel count just makes the game much more immersive. There are no jaggies, even with AA off, and the image is just clean.
 

Calm Mind

Member
[image: "he's right, you know" meme]


http://www.gameinformer.com/b/news/...d-scorpio-says-microsofts-shannon-lofits.aspx

^^What is the point?
 

ZOONAMI

Junior Member
Devs can't even do good IQ at 1080p with all that anti-aliasing blurring; what makes you think they could do good IQ at 4K?

That 4x more pixels creates a cleaner image? To the point where aa is hardly even necessary?

Do you have a 4K monitor or set? Try running a game at 4K on high or maxed settings, even without any AA, and tell me the IQ is not dramatically improved.
 

RowdyReverb

Member
Especially AAA developers. I could imagine some indies who don't really have a budget for scalable graphics considering 4K to put the power to use on something.
That's pretty much what I'm expecting: flashy 1080p graphics with heavy anti-aliasing for AAA games, and 4K for games with simpler graphics or remasters.
 
Devs can't even do good IQ at 1080p with all that anti-aliasing blurring; what makes you think they could do good IQ at 4K?

A 4K image wouldn't require as aggressive of an AA solution as a 1080p image. There probably wouldn't be any point in using temporal AA. Just simple FXAA or SMAA would cut it, heck even no AA solution would still look okay. Sure there wouldn't be as much geometry and world detail at 4K but just the resolution bump alone makes every texture stand out so much more. I haven't made up my mind yet about 4K gaming but I have tried 1440p gaming and that's pretty sick.
 

darkinstinct

...lacks reading comprehension.
That 4x more pixels creates a cleaner image? To the point where aa is hardly even necessary?

Do you have a 4K monitor or set? Try running a game at 4K on high or maxed settings, even without any AA, and tell me the IQ is not dramatically improved.

4k does not in the slightest eliminate the need for proper FSAA.
 

BigTnaples

Todd Howard's Secret GAF Account
I disagree. Sitting about 8 feet from a 65-inch 4K set, games look amazing in 4K. 1080p is a huge decrease in IQ. Witcher 3 is straight-up ridiculous at 4K maxed, even if it is only 30fps. I don't really feel that 60fps is necessary for non-fast-paced single-player games.



Yep. 4K to me is a bigger leap than SD to HD.


The difference is astounding.
 

nOoblet16

Member
A 4K image wouldn't require as aggressive of an AA solution as a 1080p image. There probably wouldn't be any point in using temporal AA. Just simple FXAA or SMAA would cut it, heck even no AA solution would still look okay. Sure there wouldn't be as much geometry and world detail at 4K but just the resolution bump alone makes every texture stand out so much more. I haven't made up my mind yet about 4K gaming but I have tried 1440p gaming and that's pretty sick.

A 1080p image with TSSAA can probably provide a sharper, or at least as sharp, image (with much less power) than a 4K image smeared with FXAA blur. Keep in mind TSSAA is effectively supersampling of the temporal kind, so the difference in jaggies won't be much; it will have artifacts (which can be reduced with good techniques, as in Doom), but FXAA will have the blur.

There's also the fact that with the performance headroom you can push for 60fps at 1080p, and 60fps further provides additional temporal resolution. At 30fps you don't get this.

In short, considering how much power is required to run current-gen games at 4K, it's not worth going for it, ESPECIALLY for a console like Scorpio (which won't even be as powerful as a 980 Ti), when you can go for superior rendering techniques and double the framerate at 1080p with extremely good TSSAA techniques.
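The per-second shading budget behind that argument can be made concrete (rough fill-rate arithmetic, ignoring everything that doesn't scale with resolution):

```python
def pixels_per_second(width, height, fps):
    """Raw pixels shaded per second at a given resolution and framerate."""
    return width * height * fps

budget_4k30 = pixels_per_second(3840, 2160, 30)     # native 4K at 30fps
budget_1080p60 = pixels_per_second(1920, 1080, 60)  # 1080p at 60fps

# 4K30 shades twice as many pixels per second as 1080p60, so targeting
# 1080p60 leaves roughly half the raw shading budget free for richer
# shaders or a high-quality temporal AA resolve.
print(budget_4k30 / budget_1080p60)  # 2.0
```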
 
Scaling techniques and AA will never look as good as a native 4K image. Also, TFLOPS don't equate 1:1 with pixel count either: running a game with the same assets (polycount, objects, etc.) isn't ALWAYS going to require 4x the power to run at 4x the resolution, and I'm pretty positive that a semi-custom GPU DESIGNED to run native 4K will do a better job of it than something in your desktop with similar specs.
 
People thinking FXAA looks blurry at 4K have probably never seen FXAA at 4K. It looks great. Blurry at 1080p, but perfectly fine at much higher resolutions.
 
I disagree. Sitting about 8 feet from a 65-inch 4K set, games look amazing in 4K. 1080p is a huge decrease in IQ. Witcher 3 is straight-up ridiculous at 4K maxed, even if it is only 30fps. I don't really feel that 60fps is necessary for non-fast-paced single-player games.

Of course maxed out games look amazing at 4K. Scorpio wouldn't be able to max out Witcher 3 at 4K though.
 
If devs can improve on the method Quantum Break used (which I think is what they were talking about?), then that's fine.

I'd rather they go 1080/60 full AA and effects that blow my socks off. All the bells and whistles.

You're right, it's the same concept Quantum Break used to reconstruct its image.
Thiago is right that brute-forcing 4K would be stupid.
 

Fafalada

Fafracer forever
nOoblet16 said:
A 1080P image with TSSAA can probably provide sharper or atleast as sharp image (with much less power) than a 4K image smeared with FXAA blur.
It's not even close. If you disabled AA at 4K altogether you'd get less temporal stability, but it would still look better in stills.
And most TSSAA implementations only work as well as they do because of the dreadful temporal resolution of modern display tech, which is IMO the real problem; i.e., if pictures weren't already a smeared mess in motion, temporal AA would have to be a lot less aggressive to avoid ruining the image (a lesson that VR displays teach).
 

K.Jack

Knowledge is power, guard it well
Both Sony and Microsoft need a shift in their corporate thinking, from 4k, to the incredible shit you can push at 1080p, with a 5+ teraflop system.

Regular ass graphics at 4K/30fps would be so boring.
 

Stallion Free

Cock Encumbered
6TF can't do Doom 2016 at 4K maxed locked at 60. 9TF can't currently. So yeah, I don't see 4K being a common use of the power with the cuts the devs will have to make elsewhere.
 

KingBroly

Banned
I don't think 4K is good for gaming right now, tbh. There are so many things devs don't do already, that 4K just feels like it'd make that worse.
 

Fbh

Member
As an owner of a 4K TV I can sort of see his point.

4K is nice and all, but right now I'd take a console that can run all current games at 60fps over one that can run them at 4K but at 30fps. And even if we assume 60fps becomes the standard, I'd rather have devs pushing graphics over resolution.
 
A 1080p image with TSSAA can probably provide a sharper, or at least as sharp, image (with much less power) than a 4K image smeared with FXAA blur. Keep in mind TSSAA is effectively supersampling of the temporal kind, so the difference in jaggies won't be much; it will have artifacts (which can be reduced with good techniques, as in Doom), but FXAA will have the blur.

There's also the fact that with the performance headroom you can push for 60fps at 1080p, and 60fps further provides additional temporal resolution. At 30fps you don't get this.

In short, considering how much power is required to run current-gen games at 4K, it's not worth going for it, ESPECIALLY for a console like Scorpio (which won't even be as powerful as a 980 Ti), when you can go for superior rendering techniques and double the framerate at 1080p with extremely good TSSAA techniques.

I wasn't making an argument about what to do with Scorpio's power, but rather whether 4K results in good IQ. Like I said, I haven't made my mind up about 4K gaming. I'm not sure it's worth the performance cost on consoles.

TSSAA comes with its own set of problems. It causes blurring during motion, something that Doom hides with its motion blur (and which is mitigated to some extent by its high frame rate), and something Uncharted 4 doesn't handle nearly as effectively (probably due to its lower frame rate). There could be games on Scorpio that lack the CPU power to hit 60fps and might instead opt for 4K at 30fps. In that case, a simple post-processing filter might be a good choice, because it works the same regardless of frame rate and of how quickly you turn the camera (or how objects in the world move). Witcher 2 used a sharpening filter to mitigate the blurriness of its AA to great effect, and The Division uses one in conjunction with its TAA.
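As an aside, the sharpening pass mentioned above is usually just a small convolution; here is a minimal grayscale sketch (a generic unsharp-style filter, not the actual Witcher 2 or Division implementation):

```python
def sharpen(image, amount=0.5):
    """Unsharp-style sharpen on a grayscale 2D list.

    Adds `amount` times the difference between each pixel and the
    average of its 4 neighbours (a simple Laplacian), which counters
    the slight blur a temporal AA resolve can introduce.
    Border pixels are left unchanged for simplicity.
    """
    h, w = len(image), len(image[0])
    out = [row[:] for row in image]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            neighbours = (image[y - 1][x] + image[y + 1][x] +
                          image[y][x - 1] + image[y][x + 1]) / 4.0
            out[y][x] = image[y][x] + amount * (image[y][x] - neighbours)
    return out
```

Because it operates on a single frame, its result is identical whether the game runs at 30fps or 60fps, which is the frame-rate-independence point made above.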
 

mrklaw

MrArseFace
I like the idea of recreating 4K from other data, like Killzone did, and another game more recently that I forgot the name of: using previous frame buffers and temporal data to augment a lower-resolution new frame to make a new one.

You'd get 'native' 4K rendering but with much lower overhead
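A toy sketch of that reconstruction idea (hypothetical grayscale "frames"; real implementations also reproject with motion vectors and reject stale samples, which this skips):

```python
def checkerboard_reconstruct(current_half, previous_full, frame_index):
    """Toy checkerboard reconstruction.

    current_half:  dict mapping (x, y) -> shaded value, containing only
                   the pixels rendered this frame (one checkerboard phase).
    previous_full: the last fully reconstructed frame (2D list).
    frame_index:   even frames shade pixels where (x + y) % 2 == 0,
                   odd frames shade the rest.
    """
    height = len(previous_full)
    width = len(previous_full[0])
    out = [[0.0] * width for _ in range(height)]
    for y in range(height):
        for x in range(width):
            if (x + y) % 2 == frame_index % 2:
                out[y][x] = current_half[(x, y)]   # freshly shaded pixel
            else:
                out[y][x] = previous_full[y][x]    # reused from last frame
    return out
```

Each frame only shades half the pixels, so the overhead is roughly half that of native rendering while the output buffer stays full resolution.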
 
It's a good thing so many people are agreeing, because even with Neo- and Scorpio-level power, most games aren't going to be able to run at native 4K.
 

Mit-

Member
Native 4K games on Scorpio would look worse than 1080p Xbox One games. They'd have a higher resolution but a lot less everything else.

Therefore no one is going to render native 4K unless they have really basic graphics.
 

RoboPlato

I'd be in the dick
4K reconstruction is something I've been thinking about ever since I saw Rainbow Six Siege using it to hit 1080p. Wouldn't be surprised if Scorpio and Neo incorporate some of that tech if it's ready. I think that would be a great idea.
 
I like the idea of recreating 4K from other data, like Killzone did, and another game more recently that I forgot the name of: using previous frame buffers and temporal data to augment a lower-resolution new frame to make a new one.

You'd get 'native' 4K rendering but with much lower overhead

Killzone used a 1080i-style approach to get "native" 1080p, which caused lots of blurring: each output frame combined half of the previous frame with a newly rendered half. I hope that isn't what we get in the future. I'd rather they shoot for native 1080p or 1440p than 2160i (if devs are considering that route).
 

Hypron

Member
Sorry, but what? Should I assume from the Zelda avatar that you aren't a PC gamer? Because there are a bunch of video cards out now that do an exceptional job at 4K: 980 Ti, 1070, 1080, Fury X... even the 390X could do well at 4K given proper optimization.

Having played a lot at 4K, I'd disagree with the sentiment that it's a waste of resources. The visual clarity is a massive jump in detail, especially at high settings. I think it would almost be a waste of resources to only do 1080p; 1440p@60fps is probably the sweet spot for Scorpio.

Even the 1080 isn't powerful enough to consistently play recent games at 4K60 (without reducing settings too much), which is probably what he was referring to.
 

coolbrys

Member
How many here have actually gamed on 4K (UHD technically, but I'll use 4K moving forward) with power sufficient to play the game properly? I'm talking, GTX 1080 or a minimum of Crossfired R9 290s/ SLI'ed 970s.

It is absolutely a difference, and in my opinion, a very large one. I will readily admit I am not like most gamers - I have tried gaming at 120hz and prefer to play at 4K60hz instead. I'd say I am a true resolution junkie.

That said, I game on both my 28" 4K monitor, and my 65" 4K television. I sit 3 feet away from my monitor, and about 8-10 feet from my television. I truly believe that for 2D and VR gaming, 4K is the future and in the case of VR, will be essential.

Now to the point of this post. While I love 4K gaming, I don't think a console with 6TF will cut it at all. The GTX 1080 has ~9TF and it isn't perfect at it, but it is satisfactory. I want to see a truly capable 4K gaming console, and obviously I'd love to see it soon; however, that is cost-prohibitive and won't work yet. I'm hoping that in the next 4-5 years we'll see hardware, from GPUs to VR HMDs, able to sufficiently support such an experience.
 

jett

D-Member
Anyone who prioritizes 4K over a higher framerate is gonna be damn low in my list of devs to give a fuck about.
 

Nirolak

Mrgrgr
Just a question.

By this argument, why don't devs make 720p games with even better graphics?
It's a trade off with how much detail you can show at a given resolution versus how much you can render.

There's a point where you need 4K to see the extra detail being rendered, but we're still pretty far from that point.
 
How many here have actually gamed on 4K (UHD technically, but I'll use 4K moving forward) with power sufficient to play the game properly? I'm talking, GTX 1080 or a minimum of Crossfired R9 290s/ SLI'ed 970s.

It is absolutely a difference, and in my opinion, a very large one. I will readily admit I am not like most gamers - I have tried gaming at 120hz and prefer to play at 4K60hz instead. I'd say I am a true resolution junkie.

That said, I game on both my 28" 4K monitor, and my 65" 4K television. I sit 3 feet away from my monitor, and about 8-10 feet from my television. I truly believe that for 2D and VR gaming, 4K is the future and in the case of VR, will be essential.

Now to the point of this post. While I love 4K gaming, I don't think a console with 6TF will cut it at all. The GTX 1080 has ~9TF and it isn't perfect at it, but it is satisfactory. I want to see a truly capable 4K gaming console, and obviously I'd love to see it soon; however, that is cost-prohibitive and won't work yet. I'm hoping that in the next 4-5 years we'll see hardware, from GPUs to VR HMDs, able to sufficiently support such an experience.

I'm with you on this. My TV does 1080@120hz and 4K@60hz and I prefer 4K over anything else.
 

Smokey

Member
He's entitled to his opinion.

Doesn't mean he's right.

I'm feeling like people saying 1080p is enough or diminishing returns have never gamed at 4k. The difference is astonishing.
 