
Titanfall 2 on XOX can go above 4K with Dynamic Superscaling

ganaconda

Member
I have never said, and never will say, that higher-resolution textures don't make a difference.

I don't think you know how texture resolutions work, however. 4096x4096 textures ("4K textures") were in use before we thought we could do native 4K output. Most textures are authored at very high resolutions but are compressed as they are moved to video memory, which is one reason for banding etc.

It's of course far more complicated than that, but more resources mean less compression and more high-quality textures at the same time (less noticeable LOD).
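To put rough numbers on the memory side of this, here's a hypothetical Python sketch. The per-texel costs are the standard ones for uncompressed RGBA8 and the BC1/BC7 block-compressed formats; everything else is illustrative:

```python
def texture_vram_bytes(size, bytes_per_texel, mips=True):
    """Approximate VRAM footprint of a square texture.

    A full mip chain (for LOD) adds roughly one third on top of the base level.
    """
    base = size * size * bytes_per_texel
    return base * 4 // 3 if mips else base

MB = 1024 * 1024
# Per-texel costs: RGBA8 = 4 bytes, BC7 = 1 byte, BC1 = 0.5 bytes.
print(f"4096x4096 RGBA8: {texture_vram_bytes(4096, 4.0) / MB:.1f} MB")  # ~85.3 MB
print(f"4096x4096 BC7:   {texture_vram_bytes(4096, 1.0) / MB:.1f} MB")  # ~21.3 MB
print(f"4096x4096 BC1:   {texture_vram_bytes(4096, 0.5) / MB:.1f} MB")  # ~10.7 MB
```

A few dozen uncompressed 4K textures would already eat gigabytes, which is why everything ships block-compressed, and why a bigger memory pool lets you either compress less aggressively or keep more high-res textures resident.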

Yes, I know that 4096x4096 textures are 4K textures. I'm a developer, and while not a graphics developer, I've worked on large-scale game tech and am very familiar with this (I delved a bit into graphics development, but that wasn't my main focus). And yes, they have existed for a while now on PC. I won't say specifically what I worked on (PM me if you care), but let's just say the vehicles in our game could use up to 4K textures, while the rest of the scene typically used lower-res textures. So yeah, there are a lot of variables involved, including much of what you said about compression (actually having higher-res textures but compressing them because of limited graphics memory), but also mixing higher-res and lower-res textures depending on the importance of a given object or the terrain/environment. The difference in detail could be immense inside the vehicles in our game, going from 1K to 2K to 4K textures.


What you are saying about compression is completely true, but the point you made at the end is completely relevant as well. If you don't have to compress them because you have enough memory to store them at their raw resolution, you will see a great benefit. Also just because certain aspects of a game use 4K textures...doesn't mean that all aspects do.
 

thelastword

Banned
I think you misunderstood him. He said that this feature is already in every single version released of TF2, even the Pro.
No it's not; the 4k mode on PS4 Pro is 1440p... I wouldn't even call it a 4k mode as such... There wasn't much done with the Pro patch tbh...

It makes no sense that there's a dynamic scaler on the Pro and it never goes above or below 1440p, yet it can go 6k on XBONEX... That's because it's not implemented on the Pro.
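For reference, a dynamic resolution scaler is just a feedback loop: each frame the engine compares GPU frame time against the target budget and nudges the render resolution up or down inside fixed bounds. A hypothetical Python sketch (the bounds and damping constant here are made up for illustration, not Respawn's values):

```python
TARGET_MS = 16.7           # frame budget for 60fps
MIN_H, MAX_H = 1440, 3160  # hypothetical vertical-resolution bounds

def next_height(current_h, gpu_ms):
    """Pick the next frame's render height from the last frame's GPU time."""
    # Cost scales roughly with pixel count (height squared), so convert the
    # time ratio into a height ratio with a square root.
    ratio = (TARGET_MS / gpu_ms) ** 0.5
    # Damp the correction so the resolution doesn't oscillate wildly.
    h = current_h * (1.0 + 0.25 * (ratio - 1.0))
    return int(min(MAX_H, max(MIN_H, h)))

print(next_height(2160, 20.0))  # over budget: drops below 2160
print(next_height(2160, 12.0))  # headroom: climbs above 2160
```

A real implementation also quantizes the steps and scales width along with height to keep the aspect ratio, but the core loop is this simple: if the scaler is active, a sustained surplus or deficit in GPU time should move the output resolution.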


Native; they don't have CB (checkerboard rendering) implemented in the engine, just dynamic scaling. But it will likely fall beneath 4k at some points as well.

They are at least using the same settings the Pro had. One thing they already mentioned that's not in is the ultra-settings AO, which is very costly.
If this is native 4k on the XBONEX, then surely they will have to revisit the Pro version, because it was under-utilized, but we shall see the real results in November... 4k still does not make sense for the console, seeing as a fully clocked RX 480 with an i7 only gets 30fps and below; surely AO is not responsible for an extra 30fps? On top of the CPU deficit on the XBONEX...

The Ark dev did say they could go 4k30 but instead went 1080p60 with better settings; either way, both are kind of incredible given how badly the game usually runs on everything.
Even at 1080p on the Pro, the settings on ARK are not close to the best settings, so don't go thinking they will be on XBONEX... and don't go thinking that you will get anything close to a locked 60fps either... Perhaps you should look at the last few DF analyses on that game... I would trust those Ark devs about as much as I trust Albert Penello tbh...
 

Izuna

Banned
Yes, I know that 4096x4096 textures are 4K textures. I'm a developer, and while not a graphics developer, I've worked on large-scale game tech and am very familiar with this (I delved a bit into graphics development, but that wasn't my main focus). And yes, they have existed for a while now on PC. I won't say specifically what I worked on (PM me if you care), but let's just say the vehicles in our game could use up to 4K textures, while the rest of the scene typically used lower-res textures. So yeah, there are a lot of variables involved, including much of what you said about compression (actually having higher-res textures but compressing them because of limited graphics memory), but also mixing higher-res and lower-res textures depending on the importance of a given object or the terrain/environment. The difference in detail could be immense inside the vehicles in our game, going from 1K to 2K to 4K textures.


What you are saying about compression is completely true, but the point you made at the end is completely relevant as well. If you don't have to compress them because you have enough memory to store them at their raw resolution, you will see a great benefit. Also just because certain aspects of a game use 4K textures...doesn't mean that all aspects do.

I want to apologise; I didn't intend to sound arrogant or insulting with my tone. It's 3AM for me and I misplaced my manners.

I just feel like undermining resolution increases (which of course have an effect on VRAM) and saying that better textures are a requirement is an unfortunate sentiment that too many people on this forum seem to be echoing.

Otherwise, thanks for the clarity. As a non-developer (outside of the casual mobile space) and barely a hobbyist, I'd like to learn as much about graphics as I can. :)
 
*Rolls up sleeve, slides pencil between ear, puts apple on desk for teacher.*

-Raises hand tentatively and humbly toward the sky-

Ok, so while we are on the topic of 4K assets and textures, here are a few questions to whomever is knowledgeable in this area. Hopefully I can get this straightened out once and for all. I would really like to understand how this all works.

(Attached images: 1799-5-1477869191.jpg and 1799-6-1477869191.jpg)


Both of these textures are @ 4096px.

I've personally seen higher-quality textures at both higher and lower resolutions (and vice versa with lower-quality ones), and I've seen first hand that, at least to me, screen/game resolution has a direct relationship with the quality of textures and the detail that is or isn't resolved.

Now, I've only dabbled in a bit of Skyrim modding and so far this is how I understand it,

Texture quality means a whole lot when you start cranking the screen/game resolution up, because you start to reveal how low-quality some of the textures are or could be. At lower screen/game resolutions you can get away with lower-quality textures, because they don't have to scale so high and will only resolve so much detail to the viewer. At higher screen/game resolutions, higher-quality textures will resolve more detail, which at the same time can show how nice a texture looks if it was high quality to begin with, or how bad it looks if it was low quality. Am I wrong or off?

My other question is,

For console only games that aren't/weren't expected to ever go/be seen beyond a certain resolution, would devs bother to use really high quality textures if the detail would never really resolve? If so, why? Wouldn't they try to use textures that would resolve just the right amount of detail for the resolution they are targeting? Again, speaking specifically of console only games that don't have PC counterparts.

Third question. lol.

How does texture scaling work on games that are multiplatform? And are the same exact textures also included with each version that is shipped?

Fourth and final question.

What is the relationship between, or the difference between, texture quality and texture resolution?
 
No it's not, the 4k mode on PS4.PRO is 1440p...I wouldn't even call it a 4k mode as such...There wasn't much done with the PRO patch tbh...

It makes no sense that's there's dynamic scaler on PRO and it never goes above or below 1440p, yet it can go 6k on XBONEX...That's because it's not implemented on the PRO.
But it goes below 1440p. It may not be often, but it does (at least that's what DF says).


Even at 1080p on PRO, the settings on ARK are not close to the best settings, so don't go thinking it will be on XBONEX.....and don't go thinking that you will get anything close to a locked 60fps either...Perhaps you should look at the last few DF analyses on that game...I would treat those Ark devs as much as I trust Albert Penello tbh...
But the developer explicitly said that they couldn't get higher settings on the Pro because there's no power or memory for that, while promising that the xbonex version would deliver better settings.

https://youtu.be/TNINqHxWQB0?t=3m23s

I know that the performance isn't the best, but I think the very least we can expect is for it to look and run better on xbonex, even if they don't deliver 60fps. He says it's already running and looking better on xbonex.
 
4K assets are definitely a requirement for 4K to have a bigger impact. Not just higher res textures, but also finer detail in models, shaders etc. Forza 7 demonstrates this perfectly imo. It's the first time I've seen detail that doesn't seem possible at 1080p.
There is untapped potential there. IQ improvements are fine and good but also pretty boring.
 

thelastword

Banned
But it goes below 1440p. It may not be often, but it does (at least that's what DF says).
No, DF never says it goes below or goes above, it's at 1440P, they could not find anywhere where it drops or goes beyond 1440p...That's not dynamic...

But the developer explicitly said that they couldn't get higher settings on the Pro because there's no power or memory for that, while promising that the xbonex version would deliver better settings.

https://youtu.be/TNINqHxWQB0?t=3m23s

I know that the performance isn't the best, but I think the very least we can expect is for it to look and run better on xbonex, even if they don't deliver 60fps. He says it's already running and looking better on xbonex.
I don't deny it will look or run a bit better; I'm simply saying people thought this would run at 4k on XBONEX. Even if they opt for 1080p, this won't be close to 60fps solid, but I guess we shall see when the patch goes live...

These guys propped the hell out of the Pro version too, and generally they should do more work on their engine/game than propping up the next platform they launch on... If only graphics fidelity and performance were the only problems this game was having; it has tonnes of bugs as well... physics- and graphics-related. We'll see how it improves come November, but I don't think this is a game you should put high hopes in...
 

TronLight

Everybody is Mikkelsexual
My other question is,

For console only games that aren't/weren't expected to ever go/be seen beyond a certain resolution, would devs bother to use really high quality textures if the detail would never really resolve? If so, why? Wouldn't they try to use textures that would resolve just the right amount of detail for the resolution they are targeting? Again, speaking specifically of console only games that don't have PC counterparts.

Depends on the dev, I would guess, and the amount of memory they need. Also, you would need to start checking each texture (and games have thousands of textures) to see if it looks good or is pixelated, etc... It's faster to just say "You have 10MB of memory to texture this object, go nuts".

Third question. lol.

How does texture scaling work on games that are multiplatform? And are the same exact textures also included with each version that is shipped?

What do you mean by texture scaling in this instance? And probably. If PS4 and One are using the same texture resolution, they're just going to pack the same textures for both. Even if PS4 is using a 1k texture on an object, and One a 512 texture on the same object, I would guess they would just pack the 1k texture too, unless they need to save space on the disk.

Fourth and final question.

What is the relationship with or the difference between texture quality and texture resolution?

There are many factors, like compression (just like pictures, music and video), and also texel density (if you use a 4k texture on a surface that's, say, 4km on a side, you'll be getting roughly 1 texel per meter. So if you're a character walking on that surface, every square meter is essentially a solid color. If you use the same 4k texture on a surface that's 1m², you get a really detailed surface).
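The texel-density point can be put into numbers with a quick sketch (hypothetical Python; the surface sizes and screen width are made up for illustration):

```python
def texel_density(texture_px, surface_m):
    """Texels per meter when a square texture is stretched over a square surface."""
    return texture_px / surface_m

terrain = texel_density(4096, 2000)  # 2 km-wide terrain -> ~2 texels per meter
prop    = texel_density(4096, 1)     # 1 m prop -> 4096 texels per meter

def texels_per_screen_pixel(density, visible_m, screen_px):
    """Texels each screen pixel covers when a patch of surface fills the view."""
    return density * visible_m / screen_px

# A 2 m-wide patch of each surface filling a 3840px-wide screen:
print(texels_per_screen_pixel(terrain, 2, 3840))  # far below 1: looks blurry up close
print(texels_per_screen_pixel(prop, 2, 3840))     # above 1: detail holds up
```

Once there's less than about one texel per screen pixel, raising the screen resolution can't reveal any more texture detail, which also bears on the earlier question about matching texture quality to the target resolution.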

Tried to answer you to the best of my knowledge.
 
No it's not; the 4k mode on PS4 Pro is 1440p... I wouldn't even call it a 4k mode as such... There wasn't much done with the Pro patch tbh...

It makes no sense that there's a dynamic scaler on the Pro and it never goes above or below 1440p, yet it can go 6k on XBONEX... That's because it's not implemented on the Pro.


If this is native 4k on the XBONEX, then surely they will have to revisit the Pro version, because it was under-utilized, but we shall see the real results in November... 4k still does not make sense for the console, seeing as a fully clocked RX 480 with an i7 only gets 30fps and below; surely AO is not responsible for an extra 30fps? On top of the CPU deficit on the XBONEX...

Even at 1080p on the Pro, the settings on ARK are not close to the best settings, so don't go thinking they will be on XBONEX... and don't go thinking that you will get anything close to a locked 60fps either... Perhaps you should look at the last few DF analyses on that game... I would trust those Ark devs about as much as I trust Albert Penello tbh...

It's a lot easier to type XB1X
 

MaulerX

Member
No it's not. "XBOX" is one word/term. You can't just arbitrarily split the x from the box. So it's

X box
O ne
X


I find it funny that all of a sudden people feel the need to "explain" why "You can't just arbitrarily split the x from the box" but... where were those people when everyone started calling it "Xbone"? If it's ok to say Xbone then it's perfectly fine to say XBOX. The hypocrisy has to stop.
 
No, DF never says it goes below or goes above, it's at 1440P, they could not find anywhere where it drops or goes beyond 1440p...That's not dynamic...

I don't deny it will look or run a bit better; I'm simply saying people thought this would run at 4k on XBONEX. Even if they opt for 1080p, this won't be close to 60fps solid, but I guess we shall see when the patch goes live...

These guys propped the hell out of the Pro version too, and generally they should do more work on their engine/game than propping up the next platform they launch on... If only graphics fidelity and performance were the only problems this game was having; it has tonnes of bugs as well... physics- and graphics-related. We'll see how it improves come November, but I don't think this is a game you should put high hopes in...

An RX480 has nearly 100GB/s less memory bandwidth than Xbox One X's GPU, does it not? Definitely 100 less on the 4GB version.

Also not sure how you can say they heavily underutilized PS4 Pro without understanding all the constraints the system might have in their game. You seem to be dropping some pretty major criticisms of this dev and the work they've done. Maybe, just maybe, the game will run quite well on Xbox One X, better than you think. The differences between Pro and X are bigger than people appreciate, I think. Doesn't mean games won't look fantastic and in many instances hard to pick out which is which, but I fully expect Xbox One X to demonstrate much better performance in certain games because of the way Microsoft profiled games and designed the system around particular game engines.

Titanfall would have been near the very top of that list.

On mobile, quoted wrong post.
 

KageMaru

Member
*Rolls up sleeve, slides pencil between ear, puts apple on desk for teacher.*

-Raises hand tentatively and humbly toward the sky-

Ok, so while we are on the topic of 4K assets and textures, here are a few questions to whomever is knowledgeable in this area. Hopefully I can get this straightened out once and for all. I would really like to understand how this all works.

[IMG]https://staticdelivery.nexusmods.com/mods/1704/images/1799-5-1477869191.jpg[/IMG]
[IMG]https://staticdelivery.nexusmods.com/mods/1704/images/1799-6-1477869191.jpg[/IMG]

[URL="http://www.nexusmods.com/skyrimspecialedition/mods/1799/?"]Both of these textures are @ 4096px.[/URL]

I've personally seen higher-quality textures at both higher and lower resolutions (and vice versa with lower-quality ones), and I've seen first hand that, at least to me, screen/game resolution has a direct relationship with the quality of textures and the detail that is or isn't resolved.

Now, I've only dabbled in a bit of Skyrim modding and so far this is how I understand it,

Texture quality means a whole lot when you start cranking the screen/game resolution up, because you start to reveal how low-quality some of the textures are or could be. At lower screen/game resolutions you can get away with lower-quality textures, because they don't have to scale so high and will only resolve so much detail to the viewer. At higher screen/game resolutions, higher-quality textures will resolve more detail, which at the same time can show how nice a texture looks if it was high quality to begin with, or how bad it looks if it was low quality. Am I wrong or off?

My other question is,

For console only games that aren't/weren't expected to ever go/be seen beyond a certain resolution, would devs bother to use really high quality textures if the detail would never really resolve? If so, why? Wouldn't they try to use textures that would resolve just the right amount of detail for the resolution they are targeting? Again, speaking specifically of console only games that don't have PC counterparts.

Third question. lol.

How does texture scaling work on games that are multiplatform? And are the same exact textures also included with each version that is shipped?

Fourth and final question.

What is the relationship between, or the difference between, texture quality and texture resolution?

On top of what TronLight has said: regarding both textures being the same resolution, the creator of the textures did say this:

Problem is, it quickly became evident these were obviously the most rudimentary kind of upscaled textures from a low resolution source

So while both may be the same resolution, the source art for the textures will make a huge difference in the results.

I don't deny it will look or run a bit better; I'm simply saying people thought this would run at 4k on XBONEX. Even if they opt for 1080p, this won't be close to 60fps solid, but I guess we shall see when the patch goes live...

These guys propped the hell out of the Pro version too, and generally they should do more work on their engine/game than propping up the next platform they launch on... If only graphics fidelity and performance were the only problems this game was having; it has tonnes of bugs as well... physics- and graphics-related. We'll see how it improves come November, but I don't think this is a game you should put high hopes in...

No, your comment was directly referencing 1080p.

Even at 1080p on the Pro, the settings on ARK are not close to the best settings, so don't go thinking they will be on XBONEX... and don't go thinking that you will get anything close to a locked 60fps either... Perhaps you should look at the last few DF analyses on that game... I would trust those Ark devs about as much as I trust Albert Penello tbh...

Also, I don't think anyone in their right mind was expecting Ark to be 4K. You can hang on to this notion that people expect or think every game on the X will or should be 4K, but most logical people honestly don't think that.

I agree that it's not going to run at 60fps, but it's stupid to think that it can't look noticeably better at 1080p.
 
How is it a waste of resources? They are already at 60fps and have a ton of GPU resources to spare, so this is how they are choosing to use it...

It seems like something where most players won't be able to observe a difference. Thus a waste. Something to dump resources into when there is an excess, because you don't want to push performance in ways that matter.

'Already 60 FPS' isn't that remarkable when there are very readily observable differences between 60 FPS and higher framerates.

So pushing dynamic supersampling beyond 4K is indeed a waste of resources when there are readily accessible, observable differences elsewhere. However, of course, it's a limitation of being tied to other 60 FPS platforms that they want to retain parity with.

In the end it just serves as another reminder that these new systems, and especially the XB1X are constrained by their previous iterations.

I certainly understand that it's probably the best that they can do with the remaining resources, but I suspect it won't have much observable impact on the IQ.
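To put numbers on why beyond-4K supersampling hits diminishing returns, compare how many rendered samples feed each display pixel at a few internal resolutions (a rough sketch; the beyond-4K figures are illustrative, not Titanfall 2's actual scaler steps):

```python
def samples_per_display_pixel(render_w, render_h, display_w=3840, display_h=2160):
    """Average rendered samples contributing to each 4K display pixel."""
    return (render_w * render_h) / (display_w * display_h)

for w, h in [(1920, 1080), (3840, 2160), (4480, 2520), (5760, 3240)]:
    print(f"{w}x{h}: {samples_per_display_pixel(w, h):.2f} samples per display pixel")
```

Going from 1080p to 4K quadruples the information per display pixel (0.25 to 1.0); going from 4K to a 6K-ish 5760x3240 only reaches 2.25, and once you're past one sample per pixel the extra samples mostly smooth edges rather than resolve new detail.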
 

Theorry

Member
It seems like something where most players won't be able to observe a difference. Thus a waste. Something to dump resources into when there is an excess, because you don't want to push performance in ways that matter.

'Already 60 FPS' isn't that remarkable when there are very readily observable differences between 60 FPS and higher framerates.

So pushing dynamic supersampling beyond 4K is indeed a waste of resources when there are readily accessible, observable differences elsewhere. However, of course, it's a limitation of being tied to other 60 FPS platforms that they want to retain parity with.

In the end it just serves as another reminder that these new systems, and especially the XB1X are constrained by their previous iterations.

You know consoles are mostly played on TVs with a refresh rate of 60Hz, right?
 

Trup1aya

Member
It seems like something where most players won't be able to observe a difference. Thus a waste. Something to dump resources into when there is an excess, because you don't want to push performance in ways that matter.

'Already 60 FPS' isn't that remarkable when there are very readily observable differences between 60 FPS and higher framerates.

So pushing dynamic supersampling beyond 4K is indeed a waste of resources when there are readily accessible, observable differences elsewhere. However, of course, it's a limitation of being tied to other 60 FPS platforms that they want to retain parity with.

In the end it just serves as another reminder that these new systems, and especially the XB1X are constrained by their previous iterations.

I certainly understand that it's probably the best that they can do with the remaining resources, but I suspect it won't have much observable impact on the IQ.

What, in your mind, would be a better use of resources?

The Xbox One X doesn't have the hardware to reliably push beyond 60fps, consumers won't be buying it expecting framerates above 60fps, and most console gamers have 60Hz TVs, so I don't see how attempting to improve a framerate that consumers are already satisfied with is a better use of resources.

People will be buying the XOX because they want better, sharper visuals, and that's exactly what this will provide.

It's almost like there is one piece of hardware that's causing a bottleneck that has nothing to do with the visual capabilities of the game.

Perhaps, but every other shooter on both jaguar powered consoles is 60fps...
 

EvB

Member
I don't get the cpu argument for consoles. To me it seems like more games than ever are 60fps on consoles now.

I don't get it either, especially when we had numerous remasters/ports from last gen machines to current gen machines that jumped to 60FPS.

When the last gen machines actually had very impressive CPUs
 

FZW

Member
Didn't I post that just before the dev pondered the same thing? I'm not talking about the performance being the same, but the functionality.

Oh. I see.

It's not out of the realm of possibility. I don't know, though. Based on dev quotes here, they seem to view the Pro as a 1080p supersampling machine and the X as a 4K one. But again, who knows. Maybe the X will benefit the Pro more?

This was my understanding as well

I don't get the cpu argument for consoles. To me it seems like more games than ever are 60fps on consoles now.

I think it stems from a lack of understanding of how console development is done. It's just a convenient argument for ppl to use to downplay the X1X as much as possible. Just because Destiny's engine is CPU-bound doesn't mean every shooter engine can't achieve 60fps on these consoles. There are techniques devs can use to ease the strain on CPUs and offload some of those tasks to the GPU. With all the GPU overhead these consoles have, it's definitely an option.
 

Colbert

Banned
http://www.neogaf.com/forum/showpost.php?p=228518656&postcount=3545

dat huge IQ boost from 1080p to 1440p, but everything the XOX is doing is a waste of resources

Some people just can't get over the fact that another console is more powerful than their own favored manufacturer's console. What is the benefit of downplaying technical achievements of the competition if the person commenting is obviously not interested in the other offerings anyway?
Ah yes, the internet
 

Night.Ninja

Banned
I assume he means it is hitting this on the dev kit, correct? The dev kit has 44 CUs compared to the retail unit's 40 CUs. At the clock speed they are running it at, those extra 4 CUs are most likely what is pushing it that far. I would love to hear what it hits on the consumer version, unless this test they are doing is running on the consumer version.

(Attached image: image0e310.png)
Never heard this one before; wonder if we will see it again when the Xbox X releases.
 

thelastword

Banned
It's a lot easier to type XB1X
It's the bone between the X's ;)

An RX480 has nearly 100GB/s less memory bandwidth than Xbox One X's GPU, does it not? Definitely 100 less on the 4GB version.

Also not sure how you can say they heavily underutilized PS4 Pro without understanding all the constraints the system might have in their game. You seem to be dropping some pretty major criticisms of this dev and the work they've done. Maybe, just maybe, the game will run quite well on Xbox One X, better than you think. The differences between Pro and X are bigger than people appreciate, I think. Doesn't mean games won't look fantastic and in many instances hard to pick out which is which, but I fully expect Xbox One X to demonstrate much better performance in certain games because of the way Microsoft profiled games and designed the system around particular game engines.

Titanfall would have been near the very top of that list.

On mobile, quoted wrong post.
A PC GPU is set up differently from a console GPU; it's the same RX 480 running Forza Apex at 4k 60fps at ultra settings...

Let's be honest here, I'm not laying into the dev. I'm simply saying the Pro patch was not the best the Pro could do, in light of what the dev says he's pumping out of the XBONEX (i.e. the disparity is too great if 4k is the target)... It could be that the time the dev had to work on the Pro was limited, but now they have more time to allocate to Scorpio; the Scorpio is still at least 5 months away, so they can take their time and tweak things better...

Moreover, even the dev said he'd be interested to see how the dynamic scaling works on the Pro and said he hopes it makes it to the platform. So if they have extra time now and they can push something more on these mid-gen systems, it makes sense that they update the Pro as well, no?

As for differences between the Pro and XBONEX, I guess we shall see in November; if they do update the Pro with the dynamic scaling, it will be an interesting comparison. My only hope is that the scaling does not vary wildly and butcher a consistent image like Halo 5 does on XB1... As far as I've seen, the best implementations of dynamic scaling this gen have been Wolfenstein, Metro Redux and Mordor... These games stay at their optimum resolution the majority of the time, so image quality is highly consistent.

No, your comment was directly referencing 1080p.

Also, I don't think anyone in their right mind was expecting Ark to be 4K. You can hang on to this notion that people expect or think every game on the X will or should be 4K, but most logical people honestly don't think that.

I agree that it's not going to run at 60fps, but it's stupid to think that it can't look noticeably better at 1080p.
Yes, but my comment was in reference to people before who said that it would be 4k... I was simply telling that poster that here we are going from 4k to 1080p, but knowing how ARK runs, he should not even expect a solid 60fps at 1080p...

As for settings being better, I agreed that they would be on XBONEX, but again, don't expect Epic/Ultra settings or anything close... Then again, this is a knock more on the engine/game than on the platform Ark runs on... even behemoth PCs struggle ;)
 
What, in your mind, would be a better use of resources?

The Xbox One X doesn't have the hardware to reliably push beyond 60fps, consumers won't be buying it expecting framerates above 60fps, and most console gamers have 60Hz TVs, so I don't see how attempting to improve a framerate that consumers are already satisfied with is a better use of resources.

People will be buying the XOX because they want better, sharper visuals, and that's exactly what this will provide.

If anything, Titanfall's textures are pretty muddy in places. Better textures would be a nice boost to a 4K version of the game.

Obviously, I'm aware that that means producing better textures. But if it were a new platform, that would be a given; they'd push performance to meet the new hardware. Because it has to run on the lower-end system too, and the One X will represent such a small portion of the userbase, it's not worthwhile.

I'm not surprised that the outcome is underwhelming, but it's a shame. I think in general, once you've hit 4K/60, you should be looking at texture resolutions, model quality, shadows, lighting, etc. Dumping more of your resources into boosting the resolution seems like a waste, proportionally. I get that the better alternative isn't worthwhile to the developers, but that's why this iterative hardware kind of sucks.
You know consoles are mostly played on TVs with a refresh rate of 60Hz, right?

Most consoles aren't played on a 4K TV either, and even those with a 4K TV are unlikely to tell the difference between 4K and beyond-4K supersampling.
 