
Titanfall 2 on XOX can go above 4K with Dynamic Superscaling

Izuna

Banned
Lmao, give people time to adjust to the fact that an Xbox is now the most powerful lol

During the 360/PS3 era, resolution didn't matter until we got 1080p stuff

leading up to the reveal of the consoles, it stopped mattering again

since One release, it matters, a shit ton

now, we're back to 1440p and 4K looking identical

~~

Would be good for Respawn to include this same update on the PS4 Pro if possible, however. The "DYNAMIC SUPERSCALING" (haha) I mean

I'll be more than content if the game maintains 4K/60fps though. I mean, can we even see higher than 4K on our 4K TVs?

Can you see more than 1080p on your 1080p screens?
 

Vashetti

Banned
I'm starting to lose faith in some of these developers, because I look at games like BF1, Lawbreakers (PS4 Pro), and other FPS games and wonder what Bungie is on about, not being able to run Destiny 2

It's almost as if games have different engines and GPU and CPU requirements.
 

RomeoDog

Banned
1440p and 4K are the same?

Seriously?
I have a 4K TV and a Pro. I've not been impressed, and none of my friends who live with 1080p have been impressed by it.

It's a nice feature but it's really not a game changer. The X1 will flop if it can't hit 60fps at PlayStation Pro resolutions. Increased resolution matters so little when the games are already at a higher res on PC.

I think Phil Spencer's dream that the X1 will win because it's "true 4K" is gonna fall flat. I expect the buyer's remorse to be huge.

When a game gets new textures and 4K assets, that's when the X1 will shine. More RAM and more capable hardware is there, after all.

But saying we have more resolution is not enough to be excited.
 

thill1985

Banned
Damn!! This is nice, and Mike Ybarra says they aren't even done optimizing the dev kits, and some devs claim they got their dev kits just last month. More games will hit native 4K as devs get familiar with the new hardware. We've still got 5 months until launch; there's enough time.

There was also the claim from Jez Corden a while back that suggested something like 80% of the dev kits' GPU power was available with the remainder to be unlocked with dev kit updates in June. I wonder if any checkerboard 4k games will end up being native come November. It sounds like nearly all devs outside of Turn 10 got their kits only very recently. Even RARE said they only got theirs a month ago or something like that.
 

thill1985

Banned
I have a 4K TV and a Pro. I've not been impressed, and none of my friends who live with 1080p have been impressed by it.

It's a nice feature but it's really not a game changer. The X1 will flop if it can't hit 60fps at PlayStation Pro resolutions. Increased resolution matters so little when the games are already at a higher res on PC.

I think Phil Spencer's dream that the X1 will win because it's "true 4K" is gonna fall flat. I expect the buyer's remorse to be huge.

When a game gets new textures and 4K assets, that's when the X1 will shine. More RAM and more capable hardware is there, after all.

But saying we have more resolution is not enough to be excited.

I'd be surprised if every single title that has a PC version and gets patched for XOX doesn't end up with 4k textures standard. There is basically no good reason not to include them and it requires no new authoring. Is your TV HDR capable? I've read loads of folks saying that 4k + HDR is a game changer visually.

Btw, Spencer never suggested XOX would make Xbox win the console sales war.
 

Izuna

Banned
I have a 4K TV and a Pro. I've not been impressed, and none of my friends who live with 1080p have been impressed by it.

It's a nice feature but it's really not a game changer. The X1 will flop if it can't hit 60fps at PlayStation Pro resolutions. Increased resolution matters so little when the games are already at a higher res on PC.

I think Phil Spencer's dream that the X1 will win because it's "true 4K" is gonna fall flat. I expect the buyer's remorse to be huge.

When a game gets new textures and 4K assets, that's when the X1 will shine. More RAM and more capable hardware is there, after all.

But saying we have more resolution is not enough to be excited.

I'm sorry but this is just bullshit.

Here's an imgur album I just made from my 1080p laptop: http://imgur.com/a/u6WWg

I turned the textures to low and car LoD to Very Low, everything else is at Ultra. I also turned off any AA so you can get a direct comparison.

4K to 1080p with OGSSAA is incredible and if your friends can't see it they're as blind as you are.

This whole "4K Assets" thing is ever so slightly getting on my nerves. A dev just mocked that sentiment in this very thread, btw.
 

gamz

Member
I have a 4K TV and a Pro. I've not been impressed, and none of my friends who live with 1080p have been impressed by it.

It's a nice feature but it's really not a game changer. The X1 will flop if it can't hit 60fps at PlayStation Pro resolutions. Increased resolution matters so little when the games are already at a higher res on PC.

I think Phil Spencer's dream that the X1 will win because it's "true 4K" is gonna fall flat. I expect the buyer's remorse to be huge.

When a game gets new textures and 4K assets, that's when the X1 will shine. More RAM and more capable hardware is there, after all.

But saying we have more resolution is not enough to be excited.

You have a crappy 4K TV then. No fucking way you can't be impressed.
 
What makes you think the 360 could manage Destiny 2 or were you referring to Destiny 1 still?
Well obviously the 360 comment refers to the original. From what I've seen Destiny 2 isn't exactly a quantum leap from the first game. Maybe I'm wrong there and they'll deliver something super impressive that's truly only possible at 30 fps on a console but I have yet to see much evidence to that end.
Also I don't recall the physics host moving to the server.
Source: https://www.bungie.net/en/News/Article/45919/7_This-Week-At-Bungie--05252017
Matt from Bungie said:
Every activity in Destiny 2 is hosted by one of our servers. That means you will never again suffer a host migration during your Raid attempt or Trials match. This differs from Destiny 1, where these hosting duties were performed by player consoles and only script and mission logic ran in the data center. To understand the foundation on which we're building, check out this Destiny 1 presentation from GDC. Using the terms from this talk, in Destiny 2, both the Mission Host and Physics Host will run in our data centers.


As for those asking for higher framerates instead - consoles are limited to 60 hz because of TV displays. If you want higher framerate, play on PC! I love my 1440p 144hz GSync monitors.
http://www.eurogamer.net/articles/digitalfoundry-2017-project-scorpio-supports-freesync-and-hdmi-vrr

As far as I know that hasn't been contradicted. There is no reason TVs need to be limited to 60hz in the near future, and there's no reason not to plug a console into a monitor w/ FreeSync. But I'm not the one with the devkit and documentation :( I'm just hoping it's all true and that devs will actually go the extra step of supporting it.

And "sweet sweet 60hz" is overselling it. These days it's adequate. Whereas 30hz should probably be prosecuted as a war crime.
 

arhra

Member
Would be good for Respawn to include this same update on the PS4 Pro if possible, however. The "DYNAMIC SUPERSCALING" (haha) I mean

The question I'd have regarding that would be whether they'd have enough free memory to even allow it to go much higher on the Pro. I assume that they're pushing the usable 5GB in the base consoles to the limit, and the Pro only allocates an additional half-gig for games, which I'd expect is mostly eaten by increased rendertarget sizes, including the framebuffer, so there simply may not be much space to allow it to go higher.

The X1, on the other hand, gives them 3-4GB more to play with (depending on whether they've moved to the updated XDK yet), so unless/until they start filling that space with higher-res textures, they can let the framebuffer grow to truly ludicrous sizes without running into any problems.

Honestly I kinda hope that they cap the resolution and start doing something else with the extra space before the patch ships. Rendering at 6k and downsampling is a nice party trick, but dynamic 4k with their TAA should be more than adequate image quality, and better textures all the time would probably be a more perceivable jump in quality than occasionally rendering at higher resolution.
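To put rough numbers on the memory angle, here's a back-of-envelope sketch. The per-target formats below are assumptions for illustration only (Titanfall 2's actual render-target layout isn't public), but they show why letting the framebuffer grow toward 6K eats memory fast:

```python
# Back-of-envelope render-target memory at different internal resolutions.
# Assumed layout (hypothetical, NOT Titanfall 2's real pipeline): one HDR
# color target (RGBA16F, 8 bytes/px), a depth/stencil target (4 bytes/px),
# and two G-buffer targets (4 bytes/px each).

BYTES_PER_PIXEL = 8 + 4 + 4 + 4  # color + depth + 2x G-buffer

def rt_memory_mb(width: int, height: int) -> float:
    """Approximate render-target memory in MiB for one frame's targets."""
    return width * height * BYTES_PER_PIXEL / (1024 ** 2)

for name, (w, h) in {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K": (3840, 2160),
    "6K-ish (1.5x 4K)": (5760, 3240),
}.items():
    print(f"{name:>17}: {rt_memory_mb(w, h):7.1f} MiB")
```

Under these assumptions the targets alone go from roughly 40 MiB at 1080p to over 350 MiB at the 5760x3240 figure, which is why an extra half-gig on the Pro could plausibly be consumed by render targets alone.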
 

Izuna

Banned

well well well, if the physics host is moved to the data centres... I can't believe them at all anymore.

We'll likely have to test the PC version with similar CPUs and call them out.

Microsoft lol



it's being used against them
 

KageMaru

Member
Well obviously the 360 comment refers to the original. From what I've seen Destiny 2 isn't exactly a quantum leap from the first game. Maybe I'm wrong there and they'll deliver something super impressive that's truly only possible at 30 fps on a console but I have yet to see much evidence to that end.

Source: http://www.eurogamer.net/articles/digitalfoundry-2017-project-scorpio-supports-freesync-and-hdmi-vrr




http://www.eurogamer.net/articles/digitalfoundry-2017-project-scorpio-supports-freesync-and-hdmi-vrr
.

Thanks for the link. I don't believe this means all of the physics run on the servers though. It would be interesting to find out what specifically is being handled by the physics host.

Also it's clear that the game was designed for 30fps on consoles; Bungie wouldn't hold back the performance for the hell of it. Unless we're able to see performance profiles for how the game is taxing the CPU, though, it's unlikely we'll know exactly what the bottleneck is.

Microsoft lol

The term was used in PC gaming long before MS started using it. There are higher resolution assets, but you don't need them for a game when running in 4K.
 

ghibli99

Member
Since this is turning into a "What do we call it?" thread, I found myself calling it the "1X" at work today. Easy to say and everyone knew what I was referring to.

On-topic, I can't wait to see what this looks like... and to see how the existing library benefits with and without patches.
 
Lots of speculation about what I said. Time to get way too real for this random conversation and lemme break it down a bit. Sorry for the long post, I like typing and really love this kind of tech. Also, none of this is work I did - we have an amazing engineering team that manages to pull off absolutely mindblowing stuff. All credit goes to them!

Before the advent of dynamic resolution scaling, you had to set an output resolution that hopefully kept performance at your target - regardless of what action was happening on screen. In a game like Titanfall this sucked because you could go from a handful of pilots on screen running around (low GPU resource usage) to having ten giant Titans exploding while dropships dropped off a dozen AI (too much GPU usage). That is to say, the game is extremely variable in what is happening.

In Titanfall 2 we added dynamic resolution scaling, which can dynamically lower the resolution the game internally renders at before scaling it to fit the output resolution in order to maintain 60 hz output. We do have some parts of our render pipeline that cannot be scaled internally, like our UI and post-processing (color correction, bloom, etc.), so the output resolution actually still has to be set based on some performance cliffs we can fall off of. This is why the current console versions of Titanfall 2 don't just output to 4K already and then let dynamic resolution scaling take over - they'd constantly be scaling WAAAAY down and it'd be fugly. I tried. We do have a lower bound so that any bugs or ULTRA intense action don't drop the resolution to 240x135 or whatever.

So! Now we can scale down to maintain that sweet sweet 60 hz, and with the extra spicy temporal anti-aliasing we cooked up it's actually not that bad of an IQ trade-off for the gameplay benefits of smooth framerates. What this doesn't account for is our ability to supersample. In the shipped version of Titanfall 2, if you load it up on a PS4 Pro (which has an output resolution >1080p) on a 1080p display, we're already downsampling to fit the output resolution, providing a crisper image than you'd get with a straight 1080p output. This means you don't need a 4K TV to see the benefits of the higher resolution output on your 1080p display. Similarly, if you plugged your X1 or PS4 into a 720p TV you'd be getting a higher quality image than straight 720p.

What dynamic supersampling does is the same as the downscaling to maintain 60 hz, but in the opposite direction. We internally render everything HIGHER than the output resolution when possible and then downsample it to fit the output resolution - much like how the PS4 Pro version looks better on a 1080p than it would if we rendered at just 1080p. The end result of this is that we can have the GPU cooking at 100% utilization regardless of action on screen since we're scaling down and up to maintain 60 hz.

All that said - Titanfall 2 running on the X1X devkit at my desk does NOT reliably run at any one internal resolution. The output resolution is 4K, but it's rarely sitting on just 4K. It'll dip and rise constantly, the same way a PC game running uncapped will never sit at just one framerate. What we've essentially (in theory) done is gone from a locked resolution and variable framerate, to a "locked" framerate and variable resolution. This does not mean the game is always 60 hz, as there are points in the game where we are not GPU limited (or we are GPU limited, but the lower bound isn't low enough to maintain 60 hz - try getting a dozen Scorch Titans all throwing incendiary traps in one spot :p). In instances where we are CPU bound, we actually increase resolution until we become GPU bound. Keep that GPU pumping!

None of this is new tech we're adding to Titanfall 2 as a game. It is already present in the PC version you can play right now, we're just getting it to work on console for the X1X launch. Someone could easily showcase the end result of this by taking screenshots of the PC game with dynamic supersampling on/off and their framerate target set really low, like 5, to ensure the scaling goes as high as possible. I do not know if it'll make its way to other console SKUs, but it might be possible? Would be interesting to see how the PS4 Pro fares, for sure.

As for the "6K" comment - given the previous explanations - I was playing some
REDACTED
on Wargames and happened to see the scale factor was at ~1.5X while shooting some grunts. 3840x2160 x 1.5 = 5760x3240. As I originally said - no guarantees on internal resolution at any time - but it was pretty amazing to see how high it was going. Good times.

*Whew*

Oh, and "true 4K" - whatever that means. Titanfall 2 on X1X will output at 4K and render at a multitude of resolutions depending on action. There is no such thing as needing "4K assets" to make this happen. If you're playing on a high-end PC at 4K you're experiencing an extremely similar game to the one I'm describing. We will be playing with some detail knobs for X1X (much like how PS4 Pro has some higher details - but nothing major), but it is not "Ultra" PC settings. Specifically, since it was mentioned in here, our ambient occlusion isn't console friendly. As for those asking for higher framerates instead - consoles are limited to 60 hz because of TV displays. If you want higher framerate, play on PC! I love my 1440p 144hz GSync monitors.

OK - back to my corner.
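The scaling loop described in that post can be sketched roughly like this. This is a hypothetical proportional controller, not the real engine logic (which isn't public): scale down when the GPU misses the 60 Hz budget, scale up into supersampling territory when there's headroom, clamped to a lower bound so intense scenes never collapse to something unusable.

```python
# Hypothetical dynamic resolution/supersampling controller (illustrative
# sketch only; Respawn's actual implementation is not public).

TARGET_MS = 1000.0 / 60.0   # 16.67 ms frame budget for 60 Hz output
MIN_SCALE = 0.5             # lower bound on the per-axis scale factor
MAX_SCALE = 1.6             # allow supersampling above output resolution
OUTPUT = (3840, 2160)       # output resolution is fixed; internal res varies

def next_scale(scale: float, gpu_ms: float) -> float:
    """Nudge the per-axis scale factor toward the frame-time budget."""
    # GPU cost grows roughly with pixel count (scale squared), so adjust
    # the axis scale by the square root of the budget/actual ratio.
    adjusted = scale * (TARGET_MS / gpu_ms) ** 0.5
    return max(MIN_SCALE, min(MAX_SCALE, adjusted))

def internal_resolution(scale: float) -> tuple:
    return (round(OUTPUT[0] * scale), round(OUTPUT[1] * scale))

# At a 1.5x axis scale the internal resolution is 5760x3240 -- the "6K"
# figure from the post. A heavy 25 ms frame pulls the scale down.
print(internal_resolution(1.5))            # (5760, 3240)
print(next_scale(1.0, gpu_ms=25.0) < 1.0)  # True: over budget -> scale down
```

Note the square-root adjustment: since pixel count (and thus, very roughly, GPU time) scales with the square of the axis scale factor, correcting by the square root of the time ratio aims for the budget in one step rather than overshooting.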

This post is an embarrassment of awesome. I'll be picking segments from this post for the next 12 months or so to make myself sound smarter.

Some really nice and interesting information in here.

- So this technology, Dynamic Supersampling, is already at use in the PC version of the game, and based on current work being done on Titanfall 2 for Xbox One X, it appears to be in use for Xbox One X also. It is not suddenly new tech being brought to Titanfall 2.

- Don't know if it will come to any other console SKUs besides X1X, such as PS4 Pro, but it may be possible, and you're curious as to how PS4 Pro would fare using the same tech.

- Xbox One X version of the game is not Ultra settings.

- If people are playing Titanfall 2 on a high end PC at 4K, you're experiencing an extremely similar game to the Xbox One X version of Titanfall 2 being described.

- Respawn will play around with specific detail settings for Xbox One X, much like how PS4 Pro benefits from some higher details, but nothing major. Probably overthinking it, but I'm curious whether the fact that this part was in parentheses and specifically mentions PS4 Pro implies that something a little more significant might be changed or increased for the Xbox One X version that otherwise wasn't done on PS4 Pro.

- Xbox One X version of the game will render at all kinds of resolutions. It's rarely sitting on just 4K. It'll dip below and rise above it constantly depending on the action happening.

- Locked 60fps is the goal, but the game won't always be 60fps, as there are certain scenarios where, even if GPU bound, there's just way too much happening.

- Xbox One X version during gameplay observed to be using a resolution scale factor of 1.5x, which represents a resolution of 5760x3240. Even with no guarantee of any particular resolution, you were still amazed at how high it was going.

Question if it can be answered. Why must the GPU constantly be running at 100%? What is the benefit of that to Titanfall 2 running on Xbox One X, or any console for that matter, since I assume this makes sense for every console, and not just Xbox One X. Is there an advantage to the GPU always being 100% busy? Does it get things done faster? If so, why does it get things done faster if the GPU is 100% busy, as the layman who doesn't get this stuff would assume that there's nothing, or not as much, left to tackle a newer task since everything is already occupied with other tasks.

Is it for memory bandwidth reasons, computation reasons? RAM capacity reasons? All of the above? Does it need to be pushed 100% all the time somehow because it takes the GPU a certain amount of time to really build up and perform at max performance, and so to maintain that performance, it's best to never allow the GPU to be anything other than 100% loaded?
 

Izuna

Banned
^^^

The point is: why not make it use 100% of the GPU if you can? The extra resources are used for the superscaling.
 

gamz

Member
Some crazy powerful shit they achieved judging by devs. Really can't wait to see it in person running and playing games.
 

Hawk269

Member
The 1X is incredibly good at stuff that has practically no perceived benefit for most people. A game at 1440p or native 4K is practically the same. If the X1 could multiply framerate you'd have a textbook example of better; resolution doesn't mean much now.

That comment sounds like a person that looks like your avatar and is stoned out of their mind. 1440p and 4k is practically the same? Really?
 

leeh

Member
Question if it can be answered. Why must the GPU constantly be running at 100%? What is the benefit of that to Titanfall 2 running on Xbox One X, or any console for that matter, since I assume this makes sense for every console, and not just Xbox One X. Is there an advantage to the GPU always being 100% busy? Does it get things done faster? If so, why does it get things done faster if the GPU is 100% busy, as the layman who doesn't get this stuff would assume that there's nothing, or not as much, left to tackle a newer task since everything is already occupied with other tasks.

Is it for memory bandwidth reasons, computation reasons? RAM capacity reasons? All of the above? Does it need to be pushed 100% all the time somehow because it takes the GPU a certain amount of time to really build up and perform at max performance, and so to maintain that performance, it's best to never allow the GPU to be anything other than 100% loaded?
If you bought a new Ferrari, why wouldn't you always drive it flat out?
 
SenjutsuSage is worried about his console running out of chakra.

Not really, I have confirmation Xbox One X has Kyuubi Sage Mode. :)



If you bought a new Ferrari, why wouldn't you always drive it flat out?

So I can live to see tomorrow. :p

But if I think about Xbox One X, or any system for that matter, like an assembly line tasked to complete a variety of tasks, it probably makes sense that you'd always want it running at full capacity, with no stalls, since a stall at any stage in the process likely slows down other things that depend on every stage staying busy. But there's still some aspect of it confusing me. I'd probably need to go back and read an AnandTech article or something on why everything not being busy isn't a good thing.
 

thill1985

Banned
What I want to know is who started this "4K Assets" crap.

You do not seem to understand what the term refers to. It refers to the authored resolution of the art assets, like textures. It has nothing to do with the rendering resolution, but it can absolutely make the game look much better and much cleaner in terms of image quality - not because the rendered image quality improves, but because the imagery being presented is clearer and crisper itself. And the Respawn guy did not shoot that down. He said it has nothing to do with rendering resolution (which is true).
 

Ushay

Member
You have a crappy 4K TV then. No fucking way you can't be impressed.

He isn't wrong, Gamz, but the fact is these games will be getting more than simple resolution bumps. Developers doing flat-out res increases alone would be either lazy or unwilling to use the hardware, and looking at the list so far, support is excellent across the board.

4k assets (where appropriate)
UHD textures, thanks to the bigger and faster RAM
Greater visual fidelity and effects
HDR

Most of these should be expected for 1X versions of games.
 

Hawk269

Member
This all sounds great. I never bought TF2 because I was really busy and had other priorities. Was a big fan of the first game, but just never got around to it. Will definitely be buying it since I will be getting the XBX.

I am still at a loss that some people think this console is not a huge leap. While I agree the CPU is not great, we don't know all the details of what they did. Sure, at the end of the day it is a Jaguar. But what I do know is that a lot of people are underestimating the other things that MS has done. Saying that because the XBX is 6TF and the Pro is 4.2, it does not make sense that it would be that much better, is not giving all the other things credit.

Memory bandwidth, 12GB of RAM (9 of it now usable for games alone), CPU optimizations, a GPU that runs at the clock speed it does, which helps with a host of other things, and the added features they put into the GPU are what give this thing a big edge.
 
Niiiiiiice. I can't get my head around the scale difference in OP's post, that's huge. DKo5's post was fascinating, too, thanks for the write up.

(TF2 is amazing, too, this is an A+ perfect excuse for y'all to play it if you haven't already.)
 

thelastword

Banned
It's cool that they have a dynamic scaler on XBONEX similar to PC, but I really don't see what the hoopla is, this is how scaling works.. At times when the load on the GPU is not maxed out, resolution will increase.....

I'm pretty sure if the same feature is implemented on PRO, it will go just as high in some scenarios.....


There's still some mystery surrounding the explanation from the dev and the original bit which made the OP...for e.g..

He said the XBONEX will output at 4k: So is that native or checkerboard rendering? As it stands, the 1060/480 can barely hit 30fps at 4k with dips to 20fps....with better CPU's, yes at ultra settings, so how far below those settings is the XBONEX version?


If this is at 6k NON CB at times and supersampled, whilst at other times it can fall way below that resolution, then IQ will be variable/inconsistent and easily noticeable, say from 6k to 1440p.. It's all well and good, but that would be a downside to image consistency.

The real question is at what resolution the XBONEX is consistently at 60fps whilst you play. Is it 1620p, is it 1800p......Knowing that is more important for drawing reference to the power divide between the systems.......more so than issuing a hyperbolic comment that it runs at 6k at times, because that could very well be 1% of the time......

The other question is this: will the Pro be patched with this dynamic superscaling? Obviously I want to see the results on XBONEX and how well image consistency is maintained there...but if this is a viable approach and holds its own, then it should surface on other platforms as well....

Having said all that, I still believe real world results and findings from DF and NXGamer will be most interesting come November. I understand that XB fans are excited for XBONEX, but reading the first few pages tells me that some fans are really shooting the power difference out of the stratosphere. We have to look at what hardware is in the XBONEX, compare it with similar hardware, and get back down to earth....

We've already gone from Ark at 4k to Ark at 1080p, then PC2 at native 4k to not. It's a nice piece of kit coming from the XB1, guys, but at least temper your expectations within realistic parameters....I know a lot of you guys will love XBONEX, and if TF2 was only 1620p at 60fps I'm sure you guys would still be fine with that, but don't create too much hype that's not very realistic for when you actually play or when the patch goes live...I very much doubt there will be consistent 6k-to-4k downsampling at 60fps often enough to really matter in the grand scheme of things....The hardware is not that strong.

I can't wait for November to see real world results though...
 

Izuna

Banned
You do not seem to understand what the term refers to. It refers to the authored resolution of the art assets, like textures. It has nothing to do with the rendering resolution, but it can absolutely make the game look much better and much cleaner in terms of image quality - not because the rendered image quality improves, but because the imagery being presented is clearer and crisper itself. And the Respawn guy did not shoot that down. He said it has nothing to do with rendering resolution (which is true).

See my post above of why "4K textures" is not required for 4K to be a big IQ benefit.

Frankly, this whole "4K Assets" thing is massively misunderstood.

--

Lol I was wondering when thelastword would post

And yes, thelastword, they are targeting "real" 4k60
 

ganaconda

Member
See my post above of why "4K textures" is not required for 4K to be a big IQ benefit.

Frankly, this whole "4K Assets" thing is massively misunderstood.

--

Lol I was wondering when thelastword would post

And yes, thelastword, they are targeting "real" 4k60

What are you talking about? You do realize 4K textures mean more detail in the texture? It definitely makes a big difference unless you have a shitty artist who just stretches a lower-res texture. Nothing in your post debunked anything. The only thing I agree with is that "4K assets" doesn't really mean anything besides 4K textures, and I'm not really sure what that term means when somebody throws it out there in addition to 4K textures.
 

Izuna

Banned
What are you talking about? You do realize 4K textures mean more detail in the texture? It definitely makes a big difference unless you have a shitty artist who just stretches a lower-res texture. Nothing in your post debunked anything. The only thing I agree with is that "4K assets" doesn't really mean anything besides 4K textures, and I'm not really sure what that term means when somebody throws it out there in addition to 4K textures.

Relevantly, people are saying that without "4K Assets", 4K resolution has little impact over 1080p. Which is why I am touching on the subject. If FH3 on Very Low textures shows a big improvement, why wouldn't Titanfall 2 benefit?

This is why the dev put the term in quotes. Whatever people think it means, it is already in the game.
 
It's cool that they have a dynamic scaler on XBONEX similar to PC, but I really don't see what the hoopla is, this is how scaling works.. At times when the load on the GPU is not maxed out, resolution will increase.....

I'm pretty sure if the same feature is implemented on PRO, it will go just as high in some scenarios.....
I think you misunderstood him. He said that this feature is already in every single version released of TF2, even the Pro.


There's still some mystery surrounding the explanation from the dev and the original bit which made the OP...for e.g..

He said the XBONEX will output at 4k: So is that native or checkerboard rendering? As it stands, the 1060/480 can barely hit 30fps at 4k with dips to 20fps....with better CPU's, yes at ultra settings, so how far below those settings is the XBONEX version?
Native, they don't have CB implemented in the engine, just dynamic scaling. But it will likely fall beneath 4k at some points as well.

They are at least using the same settings the Pro had. One thing they already mentioned that's not in is the ultra settings AO, which is very costly.


If this is at 6k NON CB at times and supersampled, whilst at other times it can fall way below that resolution, then IQ will be variable/inconsistent and easily noticeable, say from 6k to 1440p.. It's all well and good, but that would be a downside to image consistency.
Hopefully the drops will only happen at the most intense moments, which is usually when you perceive the difference least.

The real question is at what resolution the XBONEX is consistently at 60fps whilst you play. Is it 1620p, is it 1800p......Knowing that is more important for drawing reference to the power divide between the systems.......more so than issuing a hyperbolic comment that it runs at 6k at times, because that could very well be 1% of the time......
No one is making that claim, except that from what DF checked, the game on Pro never goes above 1440p (which wasn't all that bad either, given the resolution on the base PS4 and Xbone)

The other question is this: will the Pro be patched with this dynamic superscaling? Obviously I want to see the results on XBONEX and how well image consistency is maintained there...but if this is a viable approach and holds its own, then it should surface on other platforms as well....
The Pro already has dynamic scaling, as do the PS4 and Xbone. On Xbone the game usually ranges just shy of 900p, but drops all the way to 480p when things get very intense. The Pro ranges from almost 1080p to 720p. Dunno about the lowest res on Pro, but the max was 1440p according to DF.

We've already gone from Ark at 4k to Ark at 1080p, then PC2 at native 4k to not. It's a nice piece of kit coming from the XB1, guys, but at least temper your expectations within realistic parameters....I know a lot of you guys will love XBONEX, and if TF2 was only 1620p at 60fps I'm sure you guys would still be fine with that, but don't create too much hype that's not very realistic for when you actually play or when the patch goes live...I very much doubt there will be consistent 6k-to-4k downsampling at 60fps often enough to really matter in the grand scheme of things....The hardware is not that strong.
The Ark dev did say they could go 4k30 but instead went 1080p60 with better settings. Either way, both are kind of incredible given how badly the game usually runs on everything.
 

ganaconda

Member
Relevantly, people are saying that without "4K Assets", 4K resolution has little impact over 1080p. Which is why I am touching on the subject. If FH3 on Very Low textures shows a big improvement, why wouldn't Titanfall 2 benefit?

This is why the dev put the term in quotes. Whatever people think it means, it is already in the game.

Ok, that's fine, and I agree with that for sure, but that doesn't mean that adding 4K textures doesn't add even more benefit. If people are saying that you need 4K textures to get benefit out of 4K resolution, that's definitely wrong, but saying having 4K textures makes no difference is wrong too.

Not sure what you mean by already in the game. Titanfall 2 may already have 4K textures, I have no idea. However, it's not something that automagically goes into a game. Artists need to author 4K textures and the game needs to actually use them for them to be in the game.
 

Marmelade

Member
I think you misunderstood him. He said that this feature is already in every single version released of TF2, even the Pro.


There's still some mystery surrounding the explanation from the dev and the original bit which made the OP...for e.g..

No, just on PC

"None of this is new tech we're adding to Titanfall 2 as a game. It is already present in the PC version you can play right now, we're just getting it to work on console for the X1X launch. Someone could easily showcase the end result of this by taking screenshots of the PC game with dynamic supersampling on/off and their framerate target set really low, like 5, to ensure the scaling goes as high as possible. I do not know if it'll make its way to other console SKUs, but it might be possible? Would be interesting to see how the PS4 Pro fares, for sure."
 
During the 360/PS3 era, resolution didn't matter until we got 1080p stuff

leading up to the reveal of the consoles, it stopped mattering again

since One release, it matters, a shit ton

now, we're back to 1440p and 4K looking identical

Pretty much. Gonna be a hard road for the remainder of this gen for a certain group of people.
 

Izuna

Banned
Ok, that's fine, and I agree with that for sure, but that doesn't mean that adding 4K textures doesn't add even more benefit. If people are saying that you need 4K textures to get benefit out of 4K resolution, that's definitely wrong, but saying having 4K textures makes no difference is wrong too.

Not sure what you mean by already in the game. Titanfall 2 may already have 4K textures, I have no idea. However, it's not something that automagically goes into a game. Artists need to author 4K textures and the game needs to actually use them for them to be in the game.

I haven't and never will say that higher resolution textures don't make a difference.

4096x4096 textures, i.e. "4K textures" (8K would be more noticeable), were in use before we thought we could do native 4K. Most textures are authored at very high resolutions but are compressed as they are moved to video memory, which is one reason for banding etc.

It's of course far more complicated than that, but more resources mean less compression and more high-quality textures at the same time (less noticeable LoD).
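For a sense of the memory math behind "4K textures": here's a quick sketch of a 4096x4096 texture stored raw versus block-compressed, with a full mip chain. The format sizes (RGBA8 at 4 bytes/px, BC7 at 1 byte/px, BC1 at 0.5 bytes/px) and the ~1.33x mip multiplier are standard GPU facts; which formats Titanfall 2 actually uses is not public.

```python
# Approximate VRAM footprint of a square texture, raw vs. block-compressed.
# Bytes per pixel: uncompressed RGBA8 = 4.0, BC7 = 1.0, BC1 = 0.5.

def texture_mib(size: int, bytes_per_pixel: float, mips: bool = True) -> float:
    """MiB for a size x size texture; a full mip chain adds ~1/3 on top."""
    base = size * size * bytes_per_pixel
    total = base * 4 / 3 if mips else base
    return total / (1024 ** 2)

print(f"4096^2 RGBA8: {texture_mib(4096, 4.0):6.1f} MiB")  # raw
print(f"4096^2 BC7:   {texture_mib(4096, 1.0):6.1f} MiB")  # high-quality BC
print(f"4096^2 BC1:   {texture_mib(4096, 0.5):6.1f} MiB")  # aggressive BC
```

The 4x-8x savings from block compression is exactly the trade-off mentioned above: it's what makes high-resolution textures fit in video memory at all, at the cost of artifacts like banding.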
 