
Titanfall 2 on XOX can go above 4K with Dynamic Superscaling

Inuhanyou

Believes Dragon Quest is a franchise managed by Sony
This is what a 6 TFLOPS GPU can do when it isn't bandwidth limited. I'd be surprised if it ever drops below 1440p.

Just so people get a sense of how big a jump Xbox One X is over Xbox One, compared to PS4/Pro. Pro has double the CUs but is very bandwidth limited.

Granted, Xbox One X is running at console settings, but that makes sense.
 

Head.spawn

Junior Member
What excites me most about this: I'd heard how incredible the Titanfall 2 campaign was, but I was holding out hope that it would be improved on Xbox One X. Now hearing stuff like this, holy shit lol.

I probably won't be doing too much Titanfall 2 MP, but that campaign? Oh boy.

The campaign is pretty much perfect. The story is interesting enough to keep you going, gameplay is tight, the gameplay variety is completely unexpected, the pacing is on point without an extra inch of padding, you know the gunplay/mobility is great already, and I'll be damned, it's actually fun throughout.

Any FPS fan should give it a shot.

That's coming from someone whose reaction, when they announced they were adding a campaign, was that shoving a campaign into Titanfall instead of focusing purely on MP was going to be the dumbest idea ever.

I'm sure high-quality textures are already in the console versions of the game, but how much RAM you have determines how they're used.

And seriously, I can play old af games in 4K and they look insanely better than at 1080p. Hell, look at the Dolphin emulation screenshots.

Nintendo didn't use a lot of detailed textures back then for the most part; that's why a lot of GameCube games still hold up. In the instances where they did, those definitely stick out. Another thing to consider is that most people using Cemu/Dolphin to experience this are often looking at these titles rendered at 4K and downsampled to 1080p, instead of on an actual 4K display. Downsampled, textures are going to look about the same... on an actual 4K set, though, those textures are being stretched/scaled upwards.

Compare RE4 at 4K via Dolphin vs the RE4 HD Remake at 4K (even though they did a mediocre job with the textures). The difference is very noticeable. The Resident Evil 4 HD Project thread shows that actually decent textures (compared to the original and HD release) can make an even bigger difference in titles that rely on detailed textures.
 
This is what's good about this new console: it can give already-released titles additional sales momentum. I was always interested in Titanfall 2's campaign, and this just makes it a sure buy.
 

Crevox

Member
This makes me wonder why Nvidia's DSR technology is limited to a max of 4x, and why more games don't include resolution scalers. They seem extremely effective at managing framerate and boosting image quality.
 

CJY

Banned
Maybe they're just lazy, right? But seriously, stop focusing on just the teraflops. It's always been said that it isn't GPU teraflops alone that separate these two consoles performance-wise; there are other fairly important and significant factors too. Then again, I suspect you could be joking? :p




Are you talking about the secret sauce?
 
Can the 1X devkit do VRR output yet?!? C'mon DKo5, plug that sucker into a FreeSync monitor and give it a 90+ FPS target!

Battlefield 1 has, what, 90-ish physics entities at any one time. It loads one level at a time, has low geometry density, almost no physics "clutter" objects, and no AI in MP.

Destiny has to be able to stream between areas, has triple-digit physics entities (players, AI, clutter objects, weapon projectiles), and also has AI that needs to make relatively smart decisions, so the timeslice for the AI has to be big enough to support that.

You really can't directly compare any two games on how they did or didn't get to a framerate.
Yet the original Destiny managed to run more or less the same game on last-gen consoles, just with crappier graphics and more pop-in/loading problems. Whereas Battlefield pretty much doubled its player count and frame rate along with the improved graphics.

Are you trying to tell me that even the 1X's Jaguar can't handle doubling up game logic that a now 12-year-old Xbox 360 managed?

Also, haven't they claimed to have moved their 'physics host' onto their server infrastructure? So really they should have a lighter CPU load than the original game.

I guess we'll have a better idea of just how badly CPU-limited Destiny 2 is once the PC version is out. I'm not sure I'm buying what they're selling.
 

Leyasu

Banned
Battlefield 1 has, what, 90-ish physics entities at any one time. It loads one level at a time, has low geometry density, almost no physics "clutter" objects, and no AI in MP.

Destiny has to be able to stream between areas, has triple-digit physics entities (players, AI, clutter objects, weapon projectiles), and also has AI that needs to make relatively smart decisions, so the timeslice for the AI has to be big enough to support that.

You really can't directly compare any two games on how they did or didn't get to a framerate.

I have always thought that you worked for Bungie.
 
There are quite a few things, in order of importance:
- RAM Pool
- RAM BW
- GPU Clock
- CPU Clock
- CPU 'customisations'

And Microsoft did say they made a number of GPU customizations as well, in addition to stating the following.

http://www.eurogamer.net/articles/digitalfoundry-2017-the-scorpio-engine-in-depth

According to Goossen, some performance optimisations from the upcoming AMD Vega architecture factor into the Scorpio Engine's design, but other features that made it into PS4 Pro - for example, double-rate FP16 processing - do not.

There may not be any big-ticket items with fancy names attached, but every little bit counts. Especially stuff like this.

"Those are kind of the big items but we also leveraged the fact that we understand the AMD architecture really, really well now and how well it does on our games," adds Goossen "So we were able to go through and examine a lot of the internal queues and buffers and caches and FIFOs that make up this very deep pipeline that if you can find the right areas that are causing bottlenecks, for very small area we could increase those sizes and get effective wins. This was a very big focus of ours to go through and you basically really leverage that understanding of having those years of looking at performance on the Xbox One."

Another major factor is all the game profiling Microsoft performed and used to a large extent to design Xbox One X. There's no way they wouldn't have profiled the hell out of games like Titanfall 1 and 2. So if people won't accept any single aspect, or even multiple aspects, of Xbox One X's hardware as being remotely capable of what's described in the OP, do understand that games like Titanfall 2 are what Xbox One X was designed and built for.

If a game is at all based on an existing, very popular Xbox One title, best believe Microsoft profiled the shit out of it to make sure future iterations of that game or its engine perform as well as they possibly can on Xbox One X.
 

Colbert

Banned
Yet the original Destiny managed to run more or less the same game on last-gen consoles, just with crappier graphics and more pop-in/loading problems. Whereas Battlefield pretty much doubled its player count and frame rate along with the improved graphics.

Are you trying to tell me that even the 1X's Jaguar can't handle doubling up game logic that a now 12-year-old Xbox 360 managed?

Also, haven't they claimed to have moved their 'physics host' onto their server infrastructure? So really they should have a lighter CPU load than the original game.

I guess we'll have a better idea of just how badly CPU-limited Destiny 2 is once the PC version is out. I'm not sure I'm buying what they're selling.

I'm not buying it either. Just a bunch of lame excuses for a decision based on "console politics"...
 

KageMaru

Member
Yet the original Destiny managed to run more or less the same game on last-gen consoles, just with crappier graphics and more pop-in/loading problems. Whereas Battlefield pretty much doubled its player count and frame rate along with the improved graphics.

Are you trying to tell me that even the 1X's Jaguar can't handle doubling up game logic that a now 12-year-old Xbox 360 managed?

Also, haven't they claimed to have moved their 'physics host' onto their server infrastructure? So really they should have a lighter CPU load than the original game.

I guess we'll have a better idea of just how badly CPU-limited Destiny 2 is once the PC version is out. I'm not sure I'm buying what they're selling.

What makes you think the 360 could manage Destiny 2, or were you referring to Destiny 1 still?

Also I don't recall the physics host moving to the server.
 

VeeP

Member
Titanfall 2 is well optimized, good to see they can really push the console side of their tech for more powerful systems.

XBOX is X BOX ONE X so that makes the most sense, not this XOX nonsense :)

Xbox is actually one word. Even if you look at Xbox.com, it's not X Box or XBox, it's Xbox, one word.

So X BOX doesn't really make much sense.

Xbox One X = XOX
 

DKo5

Respawn Entertainment
Lots of speculation about what I said. Time to get way too real for this random conversation and lemme break it down a bit. Sorry for the long post, I like typing and really love this kind of tech. Also, none of this is work I did - we have an amazing engineering team that manages to pull off absolutely mindblowing stuff. All credit goes to them!

Before the advent of dynamic resolution scaling, you had to set an output resolution that hopefully kept performance at your target - regardless of what action was happening on screen. In a game like Titanfall this sucked because you could go from a handful of pilots on screen running around (low GPU resource usage) to having ten giant Titans exploding while dropships dropped off a dozen AI (too much GPU usage). That is to say, the game is extremely variable in what is happening.

In Titanfall 2 we added dynamic resolution scaling, which can dynamically lower the resolution the game internally renders at before scaling it to fit the output resolution, in order to maintain 60 Hz output. We do have some parts of our render pipeline that cannot be scaled internally, like our UI and post-processing (color correction, bloom, etc.), so the output resolution still has to be set based on some performance cliffs we can fall off of. This is why the current console versions of Titanfall 2 don't just output to 4K already and let dynamic resolution scaling take over - they'd constantly be scaling WAAAAY down and it'd be fugly. I tried. We do have a lower bound so that any bugs or ULTRA intense action don't drop the resolution to 240x135 or whatever.
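To make that a bit more concrete, here's a rough sketch of the kind of feedback loop I'm describing. To be clear, this is made-up illustration code, not our actual implementation - the struct name, the smoothing constant, and the 0.6 floor are all hypothetical:

```cpp
// Hypothetical dynamic resolution controller (illustration only).
// Each frame, compare measured GPU time against the 60 Hz budget and
// nudge the per-axis render scale, clamped to a floor so a spike or a
// bug never drops us to something absurd like 240x135.
#include <algorithm>
#include <cmath>

struct DynamicResController {
    double targetMs = 1000.0 / 60.0; // 16.67 ms frame budget for 60 Hz
    double scale    = 1.0;           // per-axis fraction of output resolution
    double minScale = 0.6;           // lower bound (hypothetical value)
    double maxScale = 1.0;           // plain DRS never exceeds output res

    // Call once per frame with the previous frame's measured GPU time.
    void update(double gpuMs) {
        // GPU cost is roughly proportional to pixel count, i.e. scale^2,
        // so the ideal correction is the square root of the time ratio.
        double ratio = std::sqrt(targetMs / gpuMs);
        // Blend toward the ideal scale so the resolution doesn't oscillate.
        scale = 0.9 * scale + 0.1 * (scale * ratio);
        scale = std::clamp(scale, minScale, maxScale);
    }

    // Internal render target size for a given output resolution.
    void renderRes(int outW, int outH, int& w, int& h) const {
        w = static_cast<int>(outW * scale);
        h = static_cast<int>(outH * scale);
    }
};
```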

So! Now we can scale down to maintain that sweet, sweet 60 Hz, and with the extra spicy temporal anti-aliasing we cooked up, it's actually not that bad of an IQ trade-off for the gameplay benefits of smooth framerates. What this doesn't account for is our ability to supersample. In the shipped version of Titanfall 2, if you load it up on a PS4 Pro (which has an output resolution >1080p) on a 1080p display, we're already downsampling to fit the output resolution, providing a crisper image than you'd get with a straight 1080p output. This means you don't need a 4K TV to see the benefits of the higher resolution output on your 1080p display. Similarly, if you plugged your X1 or PS4 into a 720p TV, you'd be getting a higher quality image than straight 720p.

What dynamic supersampling does is the same as the downscaling to maintain 60 Hz, but in the opposite direction. We internally render everything HIGHER than the output resolution when possible and then downsample it to fit the output resolution - much like how the PS4 Pro version looks better on a 1080p display than it would if we rendered at just 1080p. The end result is that we can have the GPU cooking at 100% utilization regardless of the action on screen, since we're scaling down and up to maintain 60 Hz.

All that said - Titanfall 2 running on the X1X devkit at my desk does NOT reliably run at any one internal resolution. The output resolution is 4K, but it's rarely sitting at just 4K. It'll dip and rise constantly, the same way a PC game running uncapped will never sit at just one framerate. What we've essentially (in theory) done is gone from a locked resolution and variable framerate to a "locked" framerate and variable resolution. This does not mean the game is always 60 Hz, as there are points in the game where we are not GPU limited (or we are GPU limited, but the lower bound isn't low enough to maintain 60 Hz - try getting a dozen Scorch Titans all throwing incendiary traps in one spot :p). In instances where we are CPU bound, we actually increase resolution until we become GPU bound. Keep that GPU pumping!
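And dynamic supersampling is basically that same made-up controller with the ceiling raised above 1.0x - when we're CPU bound and the GPU has headroom, the scale drifts upward, we render above the output resolution, and then downsample. A toy driver, again with purely hypothetical numbers:

```cpp
// Toy driver for the hypothetical DynamicResController sketched above,
// with the ceiling raised so the controller can supersample.
#include <cstdio>

int main() {
    DynamicResController drs;
    drs.maxScale = 1.6; // allow up to ~1.6x per axis (hypothetical cap)

    const int outW = 3840, outH = 2160;                     // 4K output
    const double fakeGpuMs[] = { 20.0, 16.7, 12.0, 10.5 };  // heavy -> light frames

    for (double gpuMs : fakeGpuMs) {
        drs.update(gpuMs);
        int w, h;
        drs.renderRes(outW, outH, w, h);
        std::printf("gpu %.1f ms -> scale %.2fx -> %dx%d internal\n",
                    gpuMs, drs.scale, w, h);
    }
    // At 1.5x per axis that's 5760x3240 internally - 2.25x the pixels
    // of native 4K, which is where the "6K" number comes from.
}
```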

None of this is new tech we're adding to Titanfall 2 as a game. It's already present in the PC version you can play right now; we're just getting it to work on console for the X1X launch. Someone could easily showcase the end result by taking screenshots of the PC game with dynamic supersampling on/off and their framerate target set really low, like 5, to ensure the scaling goes as high as possible. I do not know if it'll make its way to other console SKUs, but it might be possible? Would be interesting to see how the PS4 Pro fares, for sure.

As for the "6K" comment - given the previous explanations - I was playing some
REDACTED
on Wargames and happened to see the scale factor was at ~1.5x per axis while shooting some grunts: 3840x2160 x 1.5 = 5760x3240. As I originally said - no guarantees on internal resolution at any time - but it was pretty amazing to see how high it was going. Good times.

*Whew*

Oh, and "true 4K" - whatever that means. Titanfall 2 on X1X will output at 4K and render at a multitude of resolutions depending on action. There is no such thing as needing "4K assets" to make this happen. If you're playing on high-end PC at 4K you're experiencing an extremely similar game as I'm describing. We will be playing with some detail knobs for X1X (much like how PS4 Pro has some higher details - but nothing major), but it is not "Ultra" PC settings. Specifically, since it was mentioned in here, our ambient occlusion isn't console friendly. As for those asking for higher framerates instead - consoles are limited to 60 hz because of TV displays. If you want higher framerate, play on PC! I love my 1440p 144hz GSync monitors.

OK - back to my corner.
 

peppers

Member
DKo5 doing the lord's work. Thanks for the explanation!

The game should look spectacular on the new hardware, great stuff.
 

nekkid

It doesn't matter who we are, what matters is our plan.
I was going to replay this, but I think I'll hold off until November now. Can't wait.
 
Hold on....

Higher than native 4K? So I can look forward to playing MP come November at a resolution actually surpassing 4K, and I assume a locked 60fps?
 

RomeoDog

Banned
The 1X is incredibly good at stuff that definitely has no perceived benefit for most people. A game at 1440p or native 4K is practically the same. If the 1X could multiply framerate you'd have a textbook example of better; resolution doesn't mean much now.
 

Izuna

Banned
The 1X is incredibly good at stuff that definitely has no perceived benefit for most people. A game at 1440p or native 4K is practically the same. If the 1X could multiply framerate you'd have a textbook example of better; resolution doesn't mean much now.

1440p and 4K are the same?

Seriously?
 