
Next-Gen PS5 & XSX |OT| Console tEch threaD


Sinthor

Gold Member
PS5's 36 CUs are 100%
Xbox's 52 CUs are n%

so Xbox has n = 52*100/36 ≈ 144.4% of the CUs the PS5 has (somehow when I did the calculation in my head before, I used 54 CUs instead of 52, hence the 150% instead of ~144%).

As for the "elegant solutions" mystique and the rest of the Cerny sweet talk, yeah, OK.
Both consoles are customized. Why should Sony's undescribed solution be more "elegant" than Microsoft's? If anything, from what both companies have shown so far, the more "elegant" approach seems to be Microsoft's.
The "allocation" was just to make the example easier to understand. Would you prefer I break it down into cycles of individual CUs?



No, I wrote that where the PS5 allocates the same 28 CUs to rendering/shading/etc., the Xbox allocates 36, thus already over-compensating for the MHz difference even when the PS5 runs at full speed.
Then, while the PS5 has 8 CUs left over, the Xbox still has 52-36 = 16, not 12; twice as many, just as I wrote above.

On top of that, if you want to get more complicated, I'd ask, since this sounds like a "worst case game" scenario in Cerny's vocabulary, what the real PS5 clock speeds would be in that case.

Anyway...

Soon enough there will be no more mysteries; that's one thing that's for sure.

On a serious note, if final devkits aren't out, are we SURE that there will be a November release? Frankly, it wouldn't surprise me if either console was delayed, at least in the U.S. With the way things are now and how I fear they will be on our way to November...wouldn't surprise me at all to have delays. Would be the least of our issues if things keep winding up the way they are right now.
 
Lower the cones of silence.


Hahaha, love Get Smart!
 

Mr Moose

Member
PS5's 36 CUs are 100%
Xbox's 52 CUs are n%

so Xbox has n = 52*100/36 ≈ 144.4% of the CUs the PS5 has (somehow when I did the calculation in my head before, I used 54 CUs instead of 52, hence the 150% instead of ~144%).


As for the "elegant solutions" mystique and the rest of the Cerny sweet talk, yeah, OK.
Both consoles are customized. Why should Sony's undescribed solution be more "elegant" than Microsoft's? If anything, from what both companies have shown so far, the more "elegant" approach seems to be Microsoft's (form factor, thermal footprint, Hz, power levels, etc.).
The "allocation" was just to make the example easier to understand. Would you prefer I break it down into cycles of individual CUs?



No, I wrote that where the PS5 allocates the same 28 CUs to rendering/shading/etc., the Xbox allocates 36, thus already over-compensating for the MHz difference even when the PS5 runs at full speed.
Then, while the PS5 has 8 CUs left over, the Xbox still has 52-36 = 16, not 12; twice as many, just as I wrote above.

On top of that, if you want to get more complicated, I'd ask, since this sounds like a "worst case game" scenario in Cerny's vocabulary, what the real PS5 clock speeds would be in that case.

Anyway...

Soon enough there will be no more mysteries; that's one thing that's for sure.
You said the PS5 has 50% less; it's ~30% fewer CUs, and the Series X has ~44% more.
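Put as plain arithmetic (only the public 36 vs 52 CU counts; nothing about clocks or architecture):

```python
# Plain arithmetic on the publicly stated CU counts; says nothing about
# clocks, architecture, or real-world performance.
PS5_CUS, XSX_CUS = 36, 52

more = XSX_CUS / PS5_CUS - 1    # with PS5 as the 100% baseline
fewer = 1 - PS5_CUS / XSX_CUS   # with XSX as the 100% baseline

print(f"XSX has {more:.1%} more CUs than PS5")    # ~44.4% more
print(f"PS5 has {fewer:.1%} fewer CUs than XSX")  # ~30.8% fewer (not 50%)
```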
 

Neo Blaster

Member
Please note that you are assuming the RT solution is the same on the two machines. We do not know that. In the past Sony has done several customisations even at the CU level. We have to wait to see what their RT solution is until we have a Hot Chips-style breakdown of the PS5.

I am not sure most people even know that half the CUs in the PS4 Pro were different from all the other CUs AMD has made. Sony never revealed what change they implemented, but you can see on the silicon that each CU in one block has a larger mm² footprint.

On the base PS4 model, Sony added shader capacity to the CU block compared to the basic AMD layout the Xbox had.

And so on.

Edit: Spelling
Someone please correct me if I'm wrong, but didn't Cerny say BVH is done outside the CUs?
 

Elog

Member
This brings up an interesting thought. I wonder if the Series X was designed (assuming so) to be less custom, or less different from the norm, so as not to break cross-compatibility with the PC first and foremost, and with the Series S, since that machine was built to be cross-compatible from the start, whereas the PS5 doesn't have those constraints.

That is what I expect. In fairness, I think it's OK to think of the XSX as a very capable PC platform.

PlayStation has always had more customisations since they run their own API, and I expect the same this time. The biggest one seems to be the geometry engine, which is completely custom (and rumoured to be so well liked by AMD that they are thinking of implementing it in future cards).

I also expect a different set-up around rasterisers/primitives, since Sony expanded the analogous capacity per CU compared to the standard AMD solution on PS4 and PS4 Pro.

The two big unknowns are cache and RT, where very little information is available. There is some weirdness around the RT solution: the GitHub leaks from AMD did not contain any RT benchmark data from Sony, despite the fact that we know the platform has hardware RT. That would suggest the RT solution is not part of the CUs... If that is true, the PS5 RT solution might very well act independently of the CUs and potentially operate directly on the output of the GE. That would be a big difference from the XSX (in a positive sense). Please note that the last bit is speculative; it could of course also be much worse :)
 

Thirty7ven

Banned
Are there any tech insights into how MS extracted 400% more RT performance out of an APU compared to a full-fledged PC card (2080 Ti)?

They haven't, and Nvidia hasn't released the numbers. We have the Minecraft "benchmark", where the XSX ran at just over 30 FPS at 1080p. The 2080 Ti was pushing just under 60 (58).
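Taking those two figures at face value (one demo at different stages of optimisation, so a crude comparison rather than a benchmark):

```python
# One data point from the Minecraft RT demos; different build states, so this
# is a crude comparison, not a benchmark.
xsx_fps, rtx_2080ti_fps = 30, 58   # both figures at 1080p, as quoted above

ratio = rtx_2080ti_fps / xsx_fps
print(f"2080 Ti was roughly {ratio:.1f}x faster in that demo")  # ~1.9x
# i.e. nothing in the public data supports a "400% more RT performance" claim.
```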

This shouldn’t discourage anyone unless they have dreams of seeing a game like Halo with path tracing.

More so the XSX, but the PS5 too is delivering above expectations with RT. The widespread adoption of RT effects next gen will be great because, as is always the case with great devs, they will get smarter and smarter with it and use it in more and more effective ways.

It’s great news for PC too, as RT will no longer be a thing that only a few select games use, and you won’t need a Nvidia card.
 

jose4gg

Member
No, I wrote that where the PS5 allocates the same 28 CUs to rendering/shading/etc., the Xbox allocates 36, thus already over-compensating for the MHz difference even when the PS5 runs at full speed.
Then, while the PS5 has 8 CUs left over, the Xbox still has 52-36 = 16, not 12; twice as many, just as I wrote above.


Soon enough there will be no more mysteries; that's one thing that's for sure.

Let's do the math with that then...


Ray tracing in RDNA2 is alleged to come from modified TMUs (4 per CU).
Using the CU split you proposed for RT:

PS5:

4 x 8 x 2.23 GHz = ~71.4 billion ray intersections per second

XSX:
4 x 16 x 1.825 GHz = ~116.8 billion ray intersections per second

Difference:
~37-40% (yes, in your configuration roughly 40% would separate the consoles in RT, but what happens with the other components?) 🤔

Now let's look at this configuration in the other departments and see how well it performs against the PS5.

Texture fillrate is based on 4 texture units (TMUs) per CU.
Using the remaining CUs:

PS5:

4 x 28 x 2.23 GHz = ~249.8 billion texels per second

XSX:
4 x 36 x 1.825 GHz = ~262.8 billion texels per second


The real difference, based on the whole CU count, should be 18-20% (as expected),

but in this configuration, the one you proposed,

the PS5 closes the gap to about 5%.

Basically, your example gives the XSX a bigger advantage in RT but takes power away from the other parts, because the split isn't proportional...

Let's say the PS5 uses 30% of its CUs for RT => ~11 CUs
The XSX uses 30% too => ~16 CUs

The difference will still be ~18%

In your example you assigned 22% of the PS5's CUs to RT while assigning 30% of the XSX's... That's where everything falls apart...
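As a quick sanity check on that arithmetic (same assumptions as above: 4 intersections/texels per CU per clock from the TMUs and the hypothetical 8/28 vs 16/36 split; none of it confirmed hardware behaviour):

```python
# Back-of-the-envelope only: assumes 4 ray intersections / 4 texels per CU per
# clock (the "modified TMU" assumption above) and the hypothetical 8/28 vs
# 16/36 CU split. None of this is confirmed hardware behaviour.
def gops(per_cu, cus, clock_ghz):
    """Billions of operations per second for a group of CUs."""
    return per_cu * cus * clock_ghz  # GHz already gives billions per second

PS5_CLK, XSX_CLK = 2.23, 1.825  # GHz (PS5 figure is its peak clock)

rt_ps5,  rt_xsx  = gops(4, 8,  PS5_CLK), gops(4, 16, XSX_CLK)  # ~71.4 vs ~116.8
tex_ps5, tex_xsx = gops(4, 28, PS5_CLK), gops(4, 36, XSX_CLK)  # ~249.8 vs ~262.8

print(f"RT gap:        XSX {rt_xsx / rt_ps5 - 1:+.0%}")    # ~+64% (PS5 ~39% behind)
print(f"Texture gap:   XSX {tex_xsx / tex_ps5 - 1:+.0%}")  # ~+5%
print(f"Whole-GPU gap: XSX {gops(4, 52, XSX_CLK) / gops(4, 36, PS5_CLK) - 1:+.0%}")  # ~+18%
```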
 
Last edited:

Mr Moose

Member
Someone please correct me if I'm wrong, but didn't Cerny say BVH is done outside the CUs?



There's a specific set of formats you can use; they're variations on the same BVH concept. Then in your shader program you use a new instruction that asks the Intersection Engine to check a ray against the BVH.

While the Intersection Engine is processing the requested ray-triangle or ray-box intersections, the shaders are free to do other work.
I have no idea what any of this means.
 

THE:MILKMAN

Member
You have a strange notion of elegance. Microsoft's solution is hardly elegant, but rather too direct and explicit, like a sledgehammer. Large GPU and wider memory bandwidth. Oh, yeah, very elegant.

I wouldn't describe it as a sledgehammer approach. They said rather clearly that they were going for the power crown (as silly as we might find that) and they look to have achieved it. On top of this they smartly incorporated xCloud tech, so they get to reuse the SoC in the Azure servers.

It's a different route from Sony's, but valid nevertheless.
 
Let's do the math with that then...


Ray tracing in RDNA2 is alleged to come from modified TMUs (4 per CU).
Using the CU split you proposed for RT:

PS5:

4 x 8 x 2.23 GHz = ~71.4 billion ray intersections per second

XSX:
4 x 16 x 1.825 GHz = ~116.8 billion ray intersections per second

Difference:
~37-40% (yes, in your configuration roughly 40% would separate the consoles in RT, but what happens with the other components?) 🤔

Now let's look at this configuration in the other departments and see how well it performs against the PS5.

Texture fillrate is based on 4 texture units (TMUs) per CU.
Using the remaining CUs:

PS5:

4 x 28 x 2.23 GHz = ~249.8 billion texels per second

XSX:
4 x 36 x 1.825 GHz = ~262.8 billion texels per second


The real difference, based on the whole CU count, should be 18-20% (as expected),

but in this configuration, the one you proposed,

the PS5 closes the gap to about 5%.

Basically, your example gives the XSX a bigger advantage in RT but takes power away from the other parts, because the split isn't proportional...

Let's say the PS5 uses 30% of its CUs for RT => ~11 CUs
The XSX uses 30% too => ~16 CUs

The difference will still be ~18%
I thought of a real-case scenario, as I wrote in my first post: a next-gen 3rd-party game.
I assumed 28 PS5 CUs and 36 Xbox CUs to ensure visual parity (over-compensating for the Xbox, as I wrote and as your calculations show, hence the ~5% gap),
then I counted what's left over for ray tracing effects (roughly a 40% difference, again according to your calculations).

Don't forget I was just replying to someone who told me "good luck with ray tracing",

so it was supposed to be a simpleton example.
If you want to get trickier than that, you could try to include the things we do know: that the Xbox has machine-learning inference acceleration (call it DLSS for simplicity's sake), that it has variable rate shading, and that at full load the PlayStation 5 downclocks.
Microsoft claims a 3x-10x benefit for MLIA and 10-30% for VRS. Those factors should be included to see how many of the 36 CUs I arbitrarily "locked" for visual parity are really needed; probably fewer.
For Sony we know squat about any of the above; we don't even know the downclock percentage.
So... as I said, pretty soon there will be no more "elegant mysteries". I am bored of those.
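To put rough numbers on that (a hypothetical sketch: Microsoft's 10-30% VRS claim is the only input, MLIA is left out because there's nothing concrete to plug in, and the 36-CU "parity" figure is just the arbitrary assumption from my example, not a measurement):

```python
# Hypothetical back-of-the-envelope: if VRS trims a given share of the shading
# work, how many of the 36 "parity" CUs from the example above are still
# needed, and how many are freed up? Microsoft's claimed 10-30% VRS range is
# the only input; MLIA is ignored because there is nothing concrete to plug in.
PARITY_CUS = 36   # the arbitrary XSX allocation from the example, not a measurement
TOTAL_CUS = 52

for vrs_saving in (0.10, 0.20, 0.30):
    needed = PARITY_CUS * (1 - vrs_saving)
    spare = TOTAL_CUS - needed
    print(f"VRS saving {vrs_saving:.0%}: ~{needed:.0f} CUs for parity, ~{spare:.0f} left over")
# e.g. at 20% it would be ~29 CUs for parity and ~23 left over (vs. 16 in the split above).
```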
 

Gamernyc78

Banned
Not sure he said exactly that but he clearly suggested it, i.e. that the GE creates a complete output with culling, priorities etc (and hence BVH) that can be used independently by other parts of the GPU such as primitives and RT.

Ray tracing implementation is definitely different on both, with Sony going beyond. I just sit back and let the ppl that keep taking L's keep taking them. Keep thinking the higher CU count makes the next box have better RT, lol. Sony thinks outside the box, ppl.
 
Ray tracing implementation is definitely different on both, with Sony going beyond. I just sit back and let the ppl that keep taking L's keep taking them. Keep thinking the higher CU count makes the next box have better RT, lol. Sony thinks outside the box, ppl.

I wouldn't say that Sony's RT is better just yet, but it is possible that they are doing it differently than Microsoft is. However, I have my doubts that this is true, and it is most likely the same RT solution from AMD.

My guess.
 

jose4gg

Member
I thought of a real-case scenario, as I wrote in my first post: a next-gen 3rd-party game.
I assumed 28 PS5 CUs and 36 Xbox CUs to ensure visual parity (over-compensating for the Xbox, as I wrote and as your calculations show, hence the ~5% gap),
then I counted what's left over for ray tracing effects (roughly a 40% difference, again according to your calculations).

Don't forget I was just replying to someone who told me "good luck with ray tracing",

so it was supposed to be a simpleton example.
If you want to get trickier than that, you could try to include the things we do know: that the Xbox has machine-learning inference acceleration (call it DLSS for simplicity's sake), that it has variable rate shading, and that at full load the PlayStation 5 downclocks.
Microsoft claims a 3x-10x benefit for MLIA and 10-30% for VRS. Those factors should be included to see how many of the 36 CUs I arbitrarily "locked" for visual parity are really needed; probably fewer.
For Sony we know squat about any of the above; we don't even know the downclock percentage.
So... as I said, pretty soon there will be no more "elegant mysteries". I am bored of those.

I understand what you said: it will end up being 4K or whatever resolution, the same on both consoles, and the same for framerate/frame drops, which is possible; then yeah, you use almost all the power left in the console for something like RT and its internal resolution, etc.

Now, I won't go into detail about the VRS stuff or DLSS, because we don't yet know how to compare those things to the PS5 and its customizations... I'll even keep my thoughts on rasterization going faster because of clock speed to myself, but you can make an argument that you'd need roughly 20% fewer CUs on the PS5 to rasterize the same thing an XSX can rasterize...

Again, these are architecture differences, AND design changes a developer can decide to make, just as a dev can lower the resolution on the PS5 to match/reduce the performance gap in RT.
 
I wouldn't say that Sony's RT is better just yet, but it is possible that they are doing it differently than Microsoft is. However, I have my doubts that this is true, and it is most likely the same RT solution from AMD.

My guess.
Hey, maybe Sony found a way to push these calculations to the enhanced... joypad instead of doing it the RDNA2 way. Who knows?
That's why a PS5 teardown and hands-on is wanted, and missing.


Again, these are architecture differences, AND design changes a developer can decide to make, just as a dev can lower the resolution on the PS5 to match/reduce the performance gap in RT.
My simpleton example was an apples-to-apples example, and as you saw, it checks out.
 
Last edited:

cragarmi

Member




I have no idea what any of this means.
I believe Cerny is implying they're running the BVH work independently of RT, decoupling the two. If so, then advantage Sony.
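Reading that quote, it sounds like a kind of inline ray query: the shader kicks off a BVH test and overlaps other work while the Intersection Engine resolves it. A loose software analogy of that idea (illustrative only, assuming nothing beyond the quote; the function and object names here are made up):

```python
# Loose software analogy of the quoted description: a shader asks a separate
# "intersection engine" to test a ray against the BVH, keeps doing other work,
# and only later collects the result. Purely illustrative; the real mechanism
# is a GPU shader instruction plus fixed-function hardware, not Python threads.
from concurrent.futures import ThreadPoolExecutor

def intersect(ray, bvh_nodes):
    """Stand-in for the fixed-function ray/box and ray/triangle tests."""
    return any(node == ray for node in bvh_nodes)   # dummy "hit" check

engine = ThreadPoolExecutor(max_workers=1)          # the "intersection engine"
pending = engine.submit(intersect, "ray0", ["a", "ray0", "b"])

other_work = sum(range(1000))   # ...shader is free to do other work meanwhile...

hit = pending.result()          # collect the intersection result when needed
print(hit, other_work)
engine.shutdown()
```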
 

Darius87

Member
PS5's 36 CUs are 100%
Xbox's 52 CUs are n%


so Xbox has n = 52*100/36 ≈ 144.4% of the CUs the PS5 has (somehow when I did the calculation in my head before, I used 54 CUs instead of 52, hence the 150% instead of ~144%).

and PS5 has 50% less than Xbox
If you say the PS5 has less than the Xbox, then my calculation applies, not yours; and anyway the larger amount always has to be the 100% baseline. Your "50% less" sounds like the PS5 has 26 CUs.

As for the "elegant solutions" mystique and the rest of the Cerny sweet talk, yeah, OK.
Both consoles are customized. Why should Sony's undescribed solution be more "elegant" than Microsoft's? If anything, from what both companies have shown so far, the more "elegant" approach seems to be Microsoft's (form factor, thermal footprint, Hz, power levels, etc.).
We're talking about RT customizations, not the whole system. There's nothing indicating that MS's RT is customized in any way; everything points to RDNA 2.0 RT, and I know that because I've read the RDNA 2.0 hybrid-RT whitepaper and it's the same.
I only suggested "elegant" in the sense that the PS5's RT might be customized; many rumors suggest that's the case, and I also suspect it has to do with the PS5's high GPU clocks, which might give some advantages over the XSX.
The "allocation" was just to make the example easier to understand. Would you prefer I break it down into cycles of individual CUs?
That's a very bad example; it makes it sound like the CUs can't do anything but RT. Like I said, it's parallel, and there's no such thing as over-compensating for Hz; more CUs won't magically overclock themselves.
 
I wouldn't say that Sony's RT is better just yet, but it is possible that they are doing it differently than Microsoft is. However, I have my doubts that this is true, and it is most likely the same RT solution from AMD.

My guess.

One company displayed multiple next-gen games with RT. The other gave us Craig & CGI footage. You have a funny way of saying one is not better than the other.

How can it be the same solution when the customisation & APIs are not the same? Insomniac is clearly not using what Craig was using. Xbox said it was going to patch in RT later. Behave.

You are giving the benefit of the doubt to a constant no-show. What are we doing here?
 
Last edited:
The GPUs are customizable.

You tell me what they did with it, instead of joking that the DualSense aids in ray tracing.

Can you not add anything stupid to the actual speculation here?
I wrote a simple example above: speculation with the fewest parameters possible.
If people have to retort with "unknown elegance" "facts", then the speculation period is over, IMO.
If RT is not calculated in the CUs, as some people have translated Cerny's words, then that only leaves the joypad for the secret sauce, no? ;P
 
Last edited:
I wrote a simple example above: speculation with the fewest parameters possible.
If people have to retort with "unknown elegance" "facts", then the speculation period is over, IMO.
If RT is not calculated in the CUs, as some people have translated Cerny's words, then that only leaves the joypad for the secret sauce, no? ;P

No, bringing in the DualSense just makes you look bad. Nobody here is suggesting that.

The only thing I've heard here is that Sony might have asked AMD to customize the ray tracing hardware a bit. That's about it. Before dedicated ray tracing hardware was confirmed for the GPU, people were speculating about an external ray tracing solution.

Hence it's all a lot more logical than saying there's a dead demonic baby in the power supply doing the ray tracing for the console with the blood of innocent nuns, or saying the controller is doing it, which no one is.



Now wasn't that fun?
 
Last edited:

geordiemp

Member
I thought of a real-case scenario, as I wrote in my first post: a next-gen 3rd-party game.
I assumed 28 PS5 CUs and 36 Xbox CUs to ensure visual parity (over-compensating for the Xbox, as I wrote and as your calculations show, hence the ~5% gap),
then I counted what's left over for ray tracing effects (roughly a 40% difference, again according to your calculations).

Don't forget I was just replying to someone who told me "good luck with ray tracing",

so it was supposed to be a simpleton example.
If you want to get trickier than that, you could try to include the things we do know: that the Xbox has machine-learning inference acceleration (call it DLSS for simplicity's sake), that it has variable rate shading, and that at full load the PlayStation 5 downclocks.
Microsoft claims a 3x-10x benefit for MLIA and 10-30% for VRS. Those factors should be included to see how many of the 36 CUs I arbitrarily "locked" for visual parity are really needed; probably fewer.
For Sony we know squat about any of the above; we don't even know the downclock percentage.
So... as I said, pretty soon there will be no more "elegant mysteries". I am bored of those.

What if 3rd-party games look the same on both systems? What will you do then?

It's a very real possibility. And first party from Sony will be a world apart?
 
Last edited:
What if 3rd-party games look the same on both systems? What will you do then?

It's a very real possibility. And first party from Sony will be a world apart?
Well, the math says "no" to your possibility.

And about your first-party retort: I don't really care to drag the s/w side into a h/w talk yet again. My example was about 3rd parties anyway, games common to both systems.
But since you brought it up, if Sony's first-party games end up being 30fps this generation, I will probably buy far fewer of them this time around.



Hence it's all a lot more logical than saying there's a dead demonic baby in the power supply doing the ray tracing for the console with the blood of innocent nuns, or saying the controller is doing it, which no one is.



Now wasn't that fun?
I'd buy a console that has a dead demonic baby in the power supply doing the ray tracing with the blood of innocent nuns,
but I'd need a teardown of said power supply before I reach for my wallet, fo' sure :messenger_beaming:
 
Last edited:

geordiemp

Member
Well, the math says "no" to your possibility.

And about your first-party retort: I don't really care to drag the s/w side into a h/w talk yet again. My example was about 3rd parties anyway, games common to both systems.
But since you brought it up, if Sony's first-party games end up being 30fps this generation, I will probably buy far fewer of them this time around.

Well, the patents say yes. On simple TF numbers, do you think the XB1X is more powerful than Lockhart? No, it's not.

What if I told you that the PS5's geometry engine/shaders, where that 15% is supposed to be made up, compress the data between the first stage (vertices) and the second stage (pixels), and only on the PS5?

What if that gain is creeping up on your 15%? What will you crow about then? Want to know what it overcomes, from a patent released last week...



Remember all the gossip about the PS5's unique new geometry engine and the RDNA3 crap... well... add some fatter DX12 abstraction APIs into the soup... sleep well.

The games will show how close they are.
 
Last edited:
Well, the math says "no" to your possibility.

And about your first-party retort: I don't really care to drag the s/w side into a h/w talk yet again. My example was about 3rd parties anyway, games common to both systems.
But since you brought it up, if Sony's first-party games end up being 30fps this generation, I will probably buy far fewer of them this time around.




I'd buy a console that has a dead demonic baby in the power supply doing the ray tracing with the blood of innocent nuns,
but I'd need a teardown of said power supply before I reach for my wallet, fo' sure :messenger_beaming:

And you will see it get taken apart.

What's the issue?

 
Last edited:

BGs

Industry Professional

Thanks to the anonymous member who gave me a 1-month Gold subscription. Eternally grateful. I don't know how I'm going to be able to make up for it. You have set the bar for Bo_Hazem very high. If this continues, next time we will have to start putting up the Swiss account to receive donations (just kidding). Anyway, thank you very much.

Does anyone know who it was? :messenger_dizzy:
 
Last edited:

cragarmi

Member

Thanks to the anonymous member who gave me a 1-year Gold subscription. It's too much. Eternally grateful. I don't know how I'm going to be able to make up for it. You have set the bar for Bo_Hazem very high. If this continues, next time we will have to start putting up the Swiss account to receive donations (just kidding). Anyway, thank you very much.

Does anyone know who it was? :messenger_dizzy:
Plot twist, it was Bo! 😆
 

Thanks to the anonymous member who gave me a 1-year Gold subscription. It's too much. Eternally grateful. I don't know how I'm going to be able to make up for it. You have set the bar for Bo_Hazem very high. If this continues, next time we will have to start putting up the Swiss account to receive donations (just kidding). Anyway, thank you very much.

Does anyone know who it was? :messenger_dizzy:

It's a bribe. They want information from you.



One day they will contact you and say, "Yo BGs, remember that Gold you got? It came from me; now you owe me some tasty inside information."
 

FunkMiller

Member
On a serious note, if final devkits aren't out, are we SURE that there will be a November release? Frankly, it wouldn't surprise me if either console was delayed, at least in the U.S. With the way things are now and how I fear they will be on our way to November...wouldn't surprise me at all to have delays. Would be the least of our issues if things keep winding up the way they are right now.

I think console delays get more and more likely with every day that passes without solid information. Covid is still having a massive impact on distribution channels (I'm seeing this at work a lot). Because of that, it could very well be that the consoles slip until spring. There's no point putting your product on sale if there's no way to get it into stores.
 