
DigitalFoundry: RotTR Xbox One X vs PS4 Pro First Look Graphics Comparison

HeWhoWalks

Gold Member
The poster I replied to also mentioned the PS4, and it's publicly available information. If you have better information, please show us.

Eh, most places put the original PS4's memory bandwidth at 176 GB/s and the X1X's at 326 GB/s. If those figures are right, he'd indeed be correct that the latter's bandwidth isn't double the base PS4's.
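Quick sanity check on those figures, for anyone who wants the arithmetic spelled out (Python, nothing more authoritative than that):

Code:
# Commonly cited theoretical memory bandwidth figures (GB/s)
ps4_bw = 176
x1x_bw = 326

print(x1x_bw / ps4_bw)   # ~1.85x, so not quite double
print(2 * ps4_bw)        # doubling the PS4 would need 352 GB/s, above 326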
 

c0de

Member
Eh, most places put the original PS4's memory bandwidth at 176 GB/s and the X1X's at 326 GB/s. If those figures are right, he'd indeed be correct that the latter's bandwidth isn't double the base PS4's.

And again, it's of no use to compare theoretical maximums. At all. They simply don't hold up in real applications.
 

Duderino

Member
What this tells me is that people immediately assume "sharper = better", but in the world of post-processing that's not always true. I've been saying from the very start that the missing post-fx on Xbox One X would have an impact on the situation, and we know that's quite true.

Also, let's entertain the idea that new textures were added to XOX - some people would perceive that as a slight against the PC, or a suggestion that XOX is somehow more powerful. That's just silly. Any additional assets would be a new thing added after the fact. It should be obvious to anyone that those assets could be added to the PC version without issue. Missing features on PC are always down to the developer, not a limitation of the platform, which everyone should understand.

In the end, though, we know that new assets were not added.

Another thing to keep in mind: surfacing methods have been evolving over the past few years thanks to a handful of smart people in the industry. One of the benefits is that perceived detail is becoming less a function of the texture resolutions the hardware can push (see the sketch below). Not many studios have adopted this way of working yet, since it requires a significant amount of retooling of materials and artist workflows, but I do expect we'll be seeing a lot more of it as time goes on, and not just on characters.
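For anyone who wants a concrete picture of what I mean, here's a minimal Python sketch of one such approach, a tiled detail map layered over a low-res base. The function names and numbers are illustrative, not from any particular engine:

Code:
import numpy as np

def sample(tex, uv):
    # Stand-in for a GPU texture fetch: nearest-neighbor with wrap-around.
    h, w = tex.shape[:2]
    return tex[int(uv[1] * h) % h, int(uv[0] * w) % w]

def shade(base_tex, detail_tex, uv, tiling=16.0):
    base = sample(base_tex, uv)                    # low-res unique color
    detail = sample(detail_tex, (uv[0] * tiling,
                                 uv[1] * tiling))  # small map, tiled many times
    # The apparent surface frequency now comes from the tiling rate,
    # not from how large base_tex is.
    return base * (0.5 + 0.5 * detail)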
 

HeWhoWalks

Gold Member
And again, it's of no use to compare theoretical maximums. At all. They simply don't hold up in real applications.

That wasn't the basis for my reply. I'm just reminding you, since you asked for an alternative source, that those are the memory bandwidth figures reported at most sites for both machines.
 

c0de

Member
When you have to rely on Wccftech to prove a point, you were wrong from the beginning lol

Oh come on, you know better, ethomaz. Do you want to tell us that they forged the slide? It was even brought up by DF in their dedicated article on the AF issues, which were especially present on PS4. I don't believe you missed that.
This whole "shoot the messenger" thing needs to stop.
 

Space_nut

Member
There's more than half again as much RAM available for games on Xbox One X, and the available memory bandwidth is more than double, even without 100% perfect utilization. The GPU's higher clock speed will let it perform even better than the teraflop difference suggests, because other parts of the chip are sped up by that clock in ways that widen the gap between the Pro GPU and the Xbox One X GPU.

Microsoft used that extra year pretty wisely.

Yup, as stated by a dev I spoke to on Twitter:

"All fixed function hardware throughput scale linearly based on clock rate. So higher clock is definitely better than simply having more CUs.

Clock rate also scales up fill rate and geometry rate and cache bandwidth, etc. More CUs only scale up flops and texturing. Clock rate also obviously scales up everything inside CUs."

It's not just flops that improve with a higher clock rate :)
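To put rough numbers on that, a back-of-the-envelope sketch using the publicly reported specs and the standard GCN figure of 64 lanes x 2 ops per clock per CU:

Code:
def tflops(cus, clock_mhz):
    # Theoretical GCN compute: CUs x 64 lanes x 2 ops (FMA) x clock
    return cus * 64 * 2 * clock_mhz * 1e6 / 1e12

print(tflops(36, 911))    # PS4 Pro: ~4.2 TF (36 CUs @ 911 MHz)
print(tflops(40, 1172))   # Xbox One X: ~6.0 TF (40 CUs @ 1172 MHz)

# Fixed-function rates (fill, geometry, caches) scale with clock alone:
print(1172 / 911)         # ~1.29x per-unit speedup from the clock bump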

 

ethomaz

Banned
And again, it's of no use to compare theoretical maximums. At all. They simply don't hold up in real applications.
And what is the real-application bandwidth??? Do you have the same application running on both for comparison???

I'm curious about that info.
 

c0de

Member
That wasn't the basis for my reply. I'm just reminding you, since you asked, that those are the memory bandwidth figures reported for both machines.

These are the memory speeds as specced, like with many things in technology. The real difference is not in the numbers but in what they mean.
 

HeWhoWalks

Gold Member
What is the benefit of comparing the memory speeds of the OG PS4 and the X1X?

Someone stated that the X1X's theoretical speed isn't double the base PS4's (to suggest that the gap between the X1X and the Pro isn't that big if the gap over the base isn't that big).
 

ethomaz

Banned
Oh come on, you know better, ethomaz. Do you want to tell us that they forged the slide? It was even brought up by DF in their dedicated article on the AF issues, which were especially present on PS4. I don't believe you missed that.
This whole "shoot the messenger" thing needs to stop.
I am sharing what is publicly available. Everyone can take from it what they want, but we're already seeing how that goes from the latest reply.
You can argue as much as you want; the point is that the Onion and Garlic buses both access the PS4's main RAM, which carries a certain toll. That was even confirmed by Sony in the slide in the Wccftech article, which Digital Foundry also embedded.

Nowhere does the graph say 140 GB/s is the max... the graph shows an example where the more bandwidth the CPU uses, the less the GPU has.

The same goes for the XB1X... MS's own app reached 285 GB/s in the labs, but that doesn't mean other apps can reach more, or even get close to the theoretical max (they probably will get close with app optimizations and new SDKs in the future).
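To illustrate the contention point with a toy model (the penalty factor here is made up purely for illustration, it is not from Sony's slide):

Code:
def gpu_bw(total=176.0, cpu_gbs=0.0, penalty=2.5):
    # Hypothetical: every GB/s the CPU pulls costs the GPU `penalty` GB/s
    # of achievable bandwidth due to arbitration overhead.
    return total - cpu_gbs * penalty

for cpu in (0, 5, 10, 20):
    print(cpu, "GB/s CPU ->", gpu_bw(cpu_gbs=cpu), "GB/s left for the GPU")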
 

c0de

Member
And what is the real-application bandwidth??? Do you have the same application running on both for comparison???

I'm curious about that info.
I am sharing what is publicly available. Everyone can take from it what they want, but we're already seeing how that goes from the latest reply.
You can argue as much as you want; the point is that the Onion and Garlic buses both access the PS4's main RAM, which carries a certain toll. That was even confirmed by Sony in the slide in the Wccftech article, which Digital Foundry also embedded.
 

pixelbox

Member
Oh come on, you know better, ethomaz. Do you want to tell us that they forged the slide? It was even brought up by DF in their dedicated article on the AF issues, which were especially present on PS4. I don't believe you missed that.
This whole "shoot the messenger" thing needs to stop.
Either way you look at it, the extra bandwidth was intended for 4K production with HQ texture packs. Should a developer decide to use CB (checkerboard rendering), maybe we'll see some results. As far as I'm aware, the Pro has some hardware/software to produce CB, which may alleviate computing strain.

Then you have to factor in the small gains from FP16 calculations, which may have some impact. All in all, it'll come down to money, time, and motivation. So we'll never know.
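For reference, the pixel math behind CB is simple (the halving is the standard checkerboard claim; real savings vary by implementation):

Code:
native_4k = 3840 * 2160   # ~8.3M pixels shaded per frame
cb_4k = native_4k // 2    # checkerboard shades half, reconstructs the rest
print(native_4k, cb_4k)   # 8294400 vs 4147200

# FP16 at best doubles ALU throughput via packed math, and only for the
# shader work that can actually tolerate half precision.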
 

KageMaru

Member

Lol perfect.

Nowhere does the graph say 140 GB/s is the max... the graph shows an example where the more bandwidth the CPU uses, the less the GPU has.

The same goes for the XB1X... MS's own app reached 285 GB/s in the labs, but that doesn't mean other apps can reach more, or even the theoretical max.

Not really the thread for this but if the limit was higher, don't you think that would be mentioned? What would be the point of either Sony or MS withholding this information?

Edit:

Either way you look at it, the extra bandwidth was intended for 4K production with HQ texture packs. Should a developer decide to use CB (checkerboard rendering), maybe we'll see some results. As far as I'm aware, the Pro has some hardware/software to produce CB, which may alleviate computing strain.

Then you have to factor in the small gains from FP16 calculations, which may have some impact. All in all, it'll come down to money, time, and motivation. So we'll never know.

I'm lost, what do you mean by the bolded?
 

c0de

Member
Nowhere does the graph say 140 GB/s is the max... the graph shows an example where the more bandwidth the CPU uses, the less the GPU has.

Dem goalposts... Did you even get what the different colors mean? The maximum is shown with almost no CPU access to RAM at all (to the point where you can't even see it in the graph).
 

c0de

Member
Either way you look at it, the extra bandwidth was intended for 4K production with HQ texture packs. Should a developer decide to use CB (checkerboard rendering), maybe we'll see some results. As far as I'm aware, the Pro has some hardware/software to produce CB, which may alleviate computing strain.

Then you have to factor in the small gains from FP16 calculations, which may have some impact. All in all, it'll come down to money, time, and motivation. So we'll never know.

You have to consider a lot of things, and sure, you're right that consoles are made for different purposes and specced that way. But that wasn't the point of my reply to Liabe Brave, of course 😉
 

ethomaz

Banned
Not really the thread for this but if the limit was higher, don't you think that would be mentioned? What would be the point of either Sony or MS withholding this information?
Nobody is withholding this information... lol

Sony showed an example of how CPU bandwidth usage affects GPU bandwidth usage (the same happens on the XB1X too), and MS showed an example where they could reach 285 GB/s in the labs.

They are different apps used to show different things... maybe if you asked Sony to make an app that pushes bandwidth as close to the max as possible (like MS did), or asked MS to make an app showing how CPU bandwidth usage affects GPU bandwidth usage (like Sony did), you would get different results.

App (game) optimizations and future SDKs will push this number closer to the theoretical max... devs will get better with the SDK, and the SDK itself will improve... it all comes down to optimizations that use more of the hardware.

That is exactly why I asked for any max-bandwidth test done by Sony, but it seems like some guys want to see what they want, using tests with different methodologies to prove their own point.
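For what it's worth, the lab figure vs. the paper spec quoted in this thread works out to:

Code:
theoretical = 326.0            # GB/s, X1X spec
measured = 285.0               # GB/s, the in-lab figure cited above
print(measured / theoretical)  # ~0.87, i.e. about 87% of theoretical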
 

pixelbox

Member
Lol perfect.



Not really the thread for this but if the limit was higher, don't you think that would be mentioned? What would be the point of either Sony or MS withholding this information?

Edit:



I'm lost, what do you mean by the bolded?
Meaning the developers would have to be motivated with money to use the extra horsepower, and even then they'd have to assess how much effort to spend on current projects. The Pro has double the GPU and most developers barely use it, despite Sony being the market leader.
 

Colbert

Banned
What is the benefit of the thread at all after all that came out of it?

The benefit of this thread is:
We got an answer/indication on the question of whether the X1X has the edge over the PS4 Pro.
We got an answer/indication of how close the X1X can get to PC results.

Edit.
We also know now that RotTR has an excellent dynamic mud system in place!
 

c0de

Member
The benefit of this thread is:
We got an answer/indication on the question of whether the X1X has the edge over the PS4 Pro.
We got an answer/indication of how close the X1X can get to PC results.

Edit.
We also know now that RotTR has an excellent dynamic mud system in place!

That's hardly a benefit, more a result.
 

pixelbox

Member
The benefit of this thread is:
We got an answer/indication on the question of whether the X1X has the edge over the PS4 Pro.
We got an answer/indication of how close the X1X can get to PC results.

Edit.
We also know now that RotTR has an excellent dynamic mud system in place!
We knew this, but to what degree was the question.
 

Darklor01

Might need to stop sniffing glue
You won, it's a result! Beneficial to all of us who wanted to know that!

Didn't we already know that, though? I mean, all the stats point to the system being more powerful in every capacity I can think of, maybe with the exception of the FP16 stuff (I'm not an expert, no idea). I'm not sure that comparing one game or 100 games would disprove it.
 

Colbert

Banned
Didn't we already know that, though? I mean, all the stats point to the system being more powerful in every capacity I can think of, maybe with the exception of the FP16 stuff (I'm not an expert, no idea). I'm not sure that comparing one game or 100 games would disprove it.

It's always better to see the assumptions get a real manifestation in a game you can compare across several devices. This was the first one.

Just recapping, bro.

My bad, I misunderstood!
 

Darklor01

Might need to stop sniffing glue
It's always better to see the assumptions get a real manifestation in a game you can compare across several devices. This was the first one.

I guess. I understand your point and all that, and it is nice to see. This is why DF does the comparisons I suppose.
 

KageMaru

Member
Nobody is withholding this information... lol

Sony showed an example of how CPU bandwidth usage affects GPU bandwidth usage (the same happens on the XB1X too), and MS showed an example where they could reach 285 GB/s in the labs.

They are different apps used to show different things... maybe if you asked Sony to make an app that pushes bandwidth as close to the max as possible (like MS did), or asked MS to make an app showing how CPU bandwidth usage affects GPU bandwidth usage (like Sony did), you would get different results.

App (game) optimizations and future SDKs will push this number closer to the theoretical max... devs will get better with the SDK, and the SDK itself will improve... it all comes down to optimizations that use more of the hardware.

That is exactly why I asked for any max-bandwidth test done by Sony, but it seems like some guys want to see what they want, using tests with different methodologies to prove their own point.

I could be wrong, but I'm not sure that's how it works. I'm not saying the two tests are like for like, but there's more than just programming or SDK inefficiency preventing hardware from reaching its theoretical performance numbers.

Meaning the developers would have to be motivated with money to use the extra horsepower, and even then they'd have to assess how much effort to spend on current projects. The Pro has double the GPU and most developers barely use it, despite Sony being the market leader.

This isn't really true. While certain studios and projects will be limited by schedules or resources, the majority of games are likely going to turn out better on the 1X.
 

pixelbox

Member
I could be wrong, but I'm not sure that's how it works. I'm not saying the two tests are like for like, but there's more than just programming or SDK inefficiency preventing hardware from reaching its theoretical performance numbers.



This isn't really true. While certain studios and projects will be limited by schedules or resources, the majority of games are likely going to turn out better on the 1X.
Oh, no doubt, naturally. But as with the Pro, the effort, I fear, will be the issue.
 

Space_nut

Member
Just recapping, bro. Even then, is it worth slightly better effects? The Pro, as powerful as it is compared to the PS4, really isn't worth the extra cost IMO.

You're right about the Pro, but saying "slightly" is a bit harsh. It's mostly just a PS4 with the GPU doubled, and that's mostly what we see in games: base PS4 assets, but at a higher res and with some extra effects.

What MS did with Xbox One X allows devs to do a whole lot. Look at RotTR: not just native 4K, but higher-res textures and extra effects.

AC Origins already has double the LOD draw distance, increased geometry in environments, more detailed textures, and so much more.
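"Double the LOD draw distance" roughly means the distances at which models swap to simpler meshes get pushed out twice as far. A minimal sketch, with made-up threshold numbers:

Code:
def lod_level(distance, thresholds=(50, 120, 300)):
    # 0 = full-detail mesh; higher index = simpler mesh
    for lod, limit in enumerate(thresholds):
        if distance < limit:
            return lod
    return len(thresholds)

base = (50, 120, 300)             # hypothetical Pro thresholds (meters)
x1x = tuple(2 * t for t in base)  # doubled draw distance
print(lod_level(80, base), lod_level(80, x1x))  # 1 vs 0: full detail farther out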
 

Thebonehead

Banned
Just recapping, bro. Even then, is it worth slightly better effects? The Pro, as powerful as it is compared to the PS4, really isn't worth the extra cost IMO.


Pixelbox has the benefit of hindsight, having owned a Pro for some time now, to be able to come to that conclusion. People are allowed to change their minds without being castigated for it.

Some people on here may feel the same way about the 1X in the future.

I think those coming from the OG Xbox One or standard PS4 will see the jump more than those coming from the Pro.

Oh, no doubt, naturally. But as with the Pro, the effort, I fear, will be the issue.

Microsoft, when they get their act together, provide some fabulous development tools. From multiple developer accounts, it appears they have pulled out all the stops and made a decent SDK for the 1X, one that is extremely easy to develop for. Hence the accounts of having games up and running at 4K within a few hours, etc.
 
Random ass post looking at Lara in leather. PC with DoF disabled, SMAA.



There might even be 4 stages of roughed-up, and it may depend on the outfit. I tried the red expedition jacket and couldn't get Lara to show any cuts or marks; maybe the idea is that the tough jacket keeps her from getting marked?
 