
PS5 Die Shot has been revealed

Why no one pays attention to the Great Cerny PS5 Scriptures.
"They connect through the custom I/O unit just like our SSD does."
"So they can take full advantage of the decompression I/O coprocessors and all the other features I was talking about."
"Here's the catch though that commercial drive has to be at least as fast as ours. Games that rely on the speed of our SSD need to work flawlessly with M.2 drive."

By going through the custom I/O unit, it acts as additional storage, so external storage isn't limited.
The only thing that's needed is for the external storage to match the speed of the PS5's SSD for games.

blueisdumb tried to say the same thing, but this guy had a good take on it as well.


And this guy.

That's the point!?
It's a console solution, different from PC where everything is about multitasking.

The person even replied with the right answer: if you saturate the read or write speed of both at the same time, you become limited by the PCIe interface.

That was my point. Nothing more and nothing less. A console scenario would be transferring games. Right now I have internal storage, the SSD expansion card at the same speed, and an external SSD. So when you play a game and transfer something at the same time, you might run into lower speeds than when just playing and doing nothing else. Maybe installation times are also limited. It won't affect most people in most scenarios, but it's there.
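A rough back-of-envelope of that PCIe ceiling, in Python, assuming the thread's premise that both drives end up behind one PCIe 4.0 x4 link (~7.9 GB/s usable) and the advertised raw speeds; real overheads will differ:

# Can the internal and an expansion SSD both run flat out over one shared PCIe 4.0 x4 link?
PCIE4_X4_GBPS = 7.9     # assumed usable bandwidth of a PCIe 4.0 x4 link
INTERNAL_GBPS = 5.5     # PS5 internal SSD, raw read speed
EXTERNAL_GBPS = 5.5     # an M.2 drive matching the internal spec

combined = INTERNAL_GBPS + EXTERNAL_GBPS
print(f"combined demand {combined} GB/s vs link {PCIE4_X4_GBPS} GB/s")
# combined demand 11.0 GB/s vs link 7.9 GB/s -> a simultaneous full-speed transfer
# would be capped by the shared interface rather than by either drive.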


And yes, it makes more sense for Sony to run the expansion M.2 through their custom controller for those decompression benefits. Xbox Series is doing it differently: that stuff is done on the SSD itself and the SoC. Each has its own separate x2 interface.

Stop being defensive when people analyze a hardware solution just out of their own curiosity and troll a bit along the way.


This could also explain why there is no support for expansion SSDs yet. It's harder and more custom to implement.



Might even be a complete hard limit, but I'm not sure about that. That would really be a bad solution.
 
Last edited:

Panajev2001a

GAF's Pleasant Genius
Stop being defensive when people analyze a hardware solution just out of their own curiosity and troll a bit along the way.
Well, at least you are honest :).

Seriously, I think both XSX and PS5 are ultimately routing through their I/O block as they need to handle mapping and decompression, but PS5 has the SSD controller also going in the middle as it needs to manage the six priority levels developers expect from the solution (even though NVMe drives tend to only support two, so it burns some extra bandwidth on top of the 5.5 GB/s minimum).

It might just mean that you cannot install games half on one disk and half on the other, or maybe that you can only replace the internal storage rather than extend it. Regardless, in any scenario where two disks are connected, any secondary data transfer would likely be sliced so it does not impact the I/O required by the game being played or the foreground activity: the solution is fast enough, and background activity infrequent and latency-tolerant enough, that this kind of strategy shouldn't become a problem.
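Purely to illustrate the slicing idea, a minimal Python sketch with made-up numbers, not a description of what the PS5 firmware actually does:

# Give the foreground game a guaranteed slice of the link and let a background
# copy use whatever is left over. All figures here are assumptions.
LINK_GBPS = 7.9                      # assumed usable bandwidth of the shared link

def background_budget(game_demand_gbps, link_gbps=LINK_GBPS):
    """Bandwidth left for a background transfer once the game has been served."""
    return max(0.0, link_gbps - game_demand_gbps)

print(background_budget(3.0))        # 4.9 -> the copy slows down, the game doesn't
print(background_budget(7.9))        # 0.0 -> the copy stalls only if the game needs everything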

You could have discovered that a scenario that is not likely to happen could be less than ideal, possibly.
 
Last edited:
both SSDs (internal/external) are connected with the same speed to the main SoC
This statement is not entirely correct. Data from the internal SSD and the M.2 NVMe SSD is controlled by the PS5's custom flash controller. The M.2 NVMe flash controller works in bridge mode, transmitting data to the PS5 flash controller as quickly as possible (so you need a little higher speed to cover the delays). The custom flash controller is already preparing the data in the right form for transmission to the PS5 I/O unit.
 

geordiemp

Member
FWIW this might've been in reference to the Vega architecture, not RDNA. Some of Sony's earlier PS5 designs used Vega GPUs as stand-ins until RDNA silicon was ready. Even setting aside further pontificating on the significance of this, the impact of more dual CUs in the SA as questioned in these patents could've been more RDNA 1-specific; several efficiency changes in the design of the architecture have come with RDNA 2.

None of this necessarily negates any potential customizations for optimization Microsoft may've done with their setup at the CU level, which therefore wouldn't even have been covered in a presentation or visible in die x-rays. I would think that such a big and well-funded company, designing an APU that's a fit for both their gaming consoles and Azure blades (and designing a companion system with a more "normalized" CU setup), would have been aware of the issues this patent touched on and taken them into consideration, making some lower-level adjustments as best as able to mitigate the issues raised here.

Don't think so; the early PS5 designs were likely 5700 XT. Why would they write a patent and go to all that trouble for an architecture which is unused in gaming?
 

M1chl

Currently Gif and Meme Champion
Oh boi. External Storage will be limited, because it has to be accessed at the same 4 PCIe lanes as internal storage.
Won't matter for games, as you only play one game, but other tasks will be affected.



Xbox Series has 2 lanes for internal and 2 lanes for external.

It seems to be one set of lanes on the top of the board and a second on the bottom of the board?
 
Don't think so; the early PS5 designs were likely 5700 XT. Why would they write a patent and go to all that trouble for an architecture which is unused in gaming?
For testing purposes? A few articles (including a Japanese one last year) made reference to Vega GPUs being used in some prototypes. In any case, 5700 XT is still RDNA 1, and RDNA 2 changes many things; if Sony were aware of the limitations mentioned, you bet your ass AMD were well aware of them too. Why would they not look to fix them for RDNA 2? Because the evidence would seem to indicate they have (one big sign is that the GPUs are larger; 40 CUs was the limit for RDNA 1, and that includes 5700 XT).

I'm also curious as to whether there may've been any undesired complications between Sony and AMD from writing and publishing this type of research. In a way it can be seen as unfavorably critical of a technology partner, and that can cause some strain on partnerships. I'm pretty sure patents like the one you're referencing were done with the best of intentions and mutually agreed conclusions of research between Sony and AMD engineers, but the way I see some people try using these as an attack vector towards other RDNA 2-based designs would, ironically for them, effectively imply Sony as a technology partner being punitively critical and almost belittling the designs of another tech partner in the form of AMD.

They don't seem to know how those kinds of implications (that they aren't even aware they are suggesting when they use these sorts of patents as "gotcha!"s) don't reflect too kindly on their platform or device of choice insofar as the backstage politics within the corporate world of these tech partnerships is concerned o.0

Oh boi. External Storage will be limited, because it has to be accessed at the same 4 PCIe lanes as internal storage.
Won't matter for games, as you only play one game, but other tasks will be affected.



Xbox Series has 2 lanes for internal and 2 lanes for external.


Were people really expecting simultaneous use of the internal SSD and an external SSD to be interfaced in parallel? That's almost 13 GB/s the controller would have to account for, that was never going to happen 😆

I'm more concerned about what their SSD storage update might do for thermals and fan noise tbh. Hopefully nothing too serious.
 
Last edited:

geordiemp

Member
For testing purposes? A few articles (including a Japanese one last year) made reference to Vega GPUs being used in some prototypes. In any case, 5700 XT is still RDNA 1, and RDNA 2 changes many things; if Sony were aware of the limitations mentioned, you bet your ass AMD were well aware of them too. Why would they not look to fix them for RDNA 2? Because the evidence would seem to indicate they have (one big sign is that the GPUs are larger; 40 CUs was the limit for RDNA 1, and that includes 5700 XT).

I'm also curious as to whether there may've been any undesired complications between Sony and AMD from writing and publishing this type of research. In a way it can be seen as unfavorably critical of a technology partner, and that can cause some strain on partnerships. I'm pretty sure patents like the one you're referencing were done with the best of intentions and mutually agreed conclusions of research between Sony and AMD engineers, but the way I see some people try using these as an attack vector towards other RDNA 2-based designs would, ironically for them, effectively imply Sony as a technology partner being punitively critical and almost belittling the designs of another tech partner in the form of AMD.

They don't seem to know how those kinds of implications (that they aren't even aware they are suggesting when they use these sorts of patents as "gotcha!"s) don't reflect too kindly on their platform or device of choice insofar as the backstage politics within the corporate world of these tech partnerships is concerned o.0



Were people really expecting simultaneous use of the internal SSD and an external SSD to be interfaced in parallel? That's almost 13 GB/s the controller would have to account for, that was never going to happen 😆

I'm more concerned about what their SSD storage update might do for thermals and fan noise tbh. Hopefully nothing too serious.

First, there was no CU limit for RDNA 1; there was just one design, 10 CUs x 4 shader arrays, with different binning / disabled parts.

Secondly, the patent was not just about optimising the LDS and parameter cache; it was about maintaining the vertices so that post-processing was more efficient.

Finally, yes, everyone will know the limitations of adding more and more CUs to a shader array sharing resources. That's why the 6700, 6800 and 6900 have 10 CUs per array, as does PS5.

XSX has more, and you know why that is; it also has server-grade CPU parts and was clearly made with 20 GB of RAM in mind.

Lastly, have you seen evidence where some post-processing runs better on one layout than the other - the Valhalla torch, for example; actually there are a lot, and this method fits nicely with what we see. Or it's magic, your choice.

My take is that Sony seem to have spent a lot of effort optimising caches in PS5. We won't see that from looking for simple repeated patterns in the upper metallisation layers (speaking as someone who has stripped stuff like a 68020 layer by layer before).
 
Last edited:

RockOn

Member
It's not a gimmick, it's the future norm; all TVs will eventually be HDMI 2.1, it'll be standard.

Every console ever has had framerate dips; completely locked games are very rare. Dropping a small number of frames isn't actually a big deal; it's how your TV handles it that is the real problem, as it displays 60 Hz regardless, and that's what leads to screen tearing and stuttering.
I wonder if lower than INT16 is even usable for an upscaling technique based on machine learning. Pixel color calculations are either at FP32 or FP16 up to the moment the framebuffer translates into color space coding where it needs to be from 14 to 24 bits per pixel (or more if HDR is enabled).

Quad-rate INT8 may be useful for a lot of neural network inferencing, but I'd bet DLSS2 mostly uses the tensor cores' maximum precision of FP16, with some operations resorting to the full FP32 shader processors.
So while "ML hardware" in the form of 4x INT8 and 8x INT4 throughputs are nice to have for ex. character behavior or procedural generation of assets, I don't know if those are especially useful for resolution upscaling.
Seeing as PS5 is RDNA 2, and AMD enhanced the CUs as a standard design feature (it's not something Microsoft added), PS5 will most likely support mixed-precision INT4 & INT8. So PS5 will be able to do machine learning just fine. This also seems to be backed up by AMD's DLSS-like feature being cross-platform (due to come out soon).
 

RockOn

Member
This statement is not entirely correct. Data from the internal SSD and the M.2 NVMe SSD is controlled by the PS5's custom flash controller. The M.2 NVMe flash controller works in bridge mode, transmitting data to the PS5 flash controller as quickly as possible (so you need a little higher speed to cover the delays). The custom flash controller is already preparing the data in the right form for transmission to the PS5 I/O unit.
Cerny explained this. He says the PS5's custom 12-channel flash controller has to take control of the 3rd-party SSD controller. That's why the spec for a suitable SSD needs to be 7 GB/s, which is higher than the 5.5 GB/s. PS5 has 6 data priority levels compared to the 2 of most 3rd-party PC drives. That's probably why there has been a delay in making the internal SSD expansion usable (there are not many 7 GB/s drives on the market). More are starting to appear now.
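If those figures are roughly right, the implied headroom is easy to work out (a quick sketch, assuming the 7 GB/s and 5.5 GB/s numbers above):

REQUIRED_M2_GBPS = 7.0    # commonly cited minimum for an expansion drive
INTERNAL_GBPS = 5.5       # PS5 internal SSD raw read speed
headroom = REQUIRED_M2_GBPS / INTERNAL_GBPS - 1.0
print(f"{headroom:.0%}")  # ~27% extra raw speed to absorb the bridging/arbitration overhead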
 
Last edited:

ToTTenTranz

Banned
I think it's interesting that the external SSD needs to go through Sony's custom controller, considering every NVMe drive has its own controller that probably can't be bypassed.
I guess that unless the custom controller is simply acting as a multiplexer for the 4x PCIe lanes whenever it's using the external drive, then it's probably adding some latency to the mix.

Seeing as PS5 is RDNA 2, and AMD enhanced the CUs as a standard design feature (it's not something Microsoft added), PS5 will most likely support mixed-precision INT4 & INT8. So PS5 will be able to do machine learning just fine. This also seems to be backed up by AMD's DLSS-like feature being cross-platform (due to come out soon).
Yes, I agree. I didn't suggest that the PS5's GPU has no 4x INT8 and 8x INT4 throughput. I just wondered if those capabilities are relevant for any kind of DLSS-like implementation.
 
Last edited:

Panajev2001a

GAF's Pleasant Genius
I think it's interesting that the external SSD needs to go through Sony's custom controller, considering every NVMe drive has its own controller that probably can't be bypassed.
I guess that unless the custom controller is simply acting as a multiplexer for the 4x PCIe lanes whenever it's using the external drive, then it's probably adding some latency to the mix.

This is needed as Sony needs to arbitrate the two priority levels in the NVMe specs (burning some bandwidth) to match the expectation of 6 priority levels the internal SSD gives to the rest of the system. They mentioned this scenario in the Road to PS5 talk.
 

RockOn

Member
I think it's interesting that the external SSD needs to go through Sony's custom controller, considering every NVMe drive has its own controller that probably can't be bypassed.
I guess that unless the custom controller is simply acting as a multiplexer for the 4x PCIe lanes whenever it's using the external drive, then it's probably adding some latency to the mix.


Yes, I agree. I didn't suggest that the PS5's GPU has no 4x INT8 and 8x INT4 throughput. I just wondered if those capabilities are relevant for any kind of DLSS-like implementation.
Cool. I think Cerny said that the custom controller has to take over the 3rd-party controller to take advantage of the PS5 features (that also bumps up the min spec required), which allows for any latency (I agree with you).

I didn't mean to suggest that you said PS5 has no INT4/INT8. I believe mixed-precision INT4/INT8 etc. will be perfect for AMD's DLSS-like image supersampling, as it's all based on machine learning.
 

Omni_Manhatten

Neo Member
Cool. I think Cerny said that the custom controller has to take over the 3rd-party controller to take advantage of the PS5 features (that also bumps up the min spec required), which allows for any latency (I agree with you).

I didn't mean to suggest that you said PS5 has no INT4/INT8. I believe mixed-precision INT4/INT8 etc. will be perfect for AMD's DLSS-like image supersampling, as it's all based on machine learning.
MS added integer 8/4 support for the Xbox. Sony would have had to also add it. We know MS waited for full RDNA 2 and they added HW for Int8/4 and SFS streaming. MS uses the Azure chips because a majority of their AI work with Nvidia was in Azure. The cross platform benefits are also not shared by Sony. MS and Sony have their own AI Azure deal that will be about Sony and its online service. Helping tie in their special features like game help etc with its next gen online services.
 

RockOn

Member
MS added integer 8/4 support for the Xbox. Sony would have had to also add it. We know MS waited for full RDNA 2 and they added HW for Int8/4 and SFS streaming. MS uses the Azure chips because a majority of their AI work with Nvidia was in Azure. The cross platform benefits are also not shared by Sony. MS and Sony have their own AI Azure deal that will be about Sony and its online service. Helping tie in their special features like game help etc with its next gen online services.
Microsoft never added it; it's a standard feature AMD added when evolving RDNA to RDNA 2.

[image]

As the image shows, mixed integer started life in RDNA 1.1, and it's the same in RDNA 2.
 

MastaKiiLA

Member
Were people really expecting simultaneous use of the internal SSD and an external SSD to be interfaced in parallel? That's almost 13 GB/s the controller would have to account for, that was never going to happen 😆

I'm more concerned about what their SSD storage update might do for thermals and fan noise tbh. Hopefully nothing too serious.
Yeah. Sony didn't include a RAID controller in the PS5. There's no need for it.

I'm not too worried about thermals and fan noise. That should be largely drive-dependent. With SMART, Sony should be throttling fan speeds based on the temp sensor readout. So cooler drives should impact fan speed less.
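Purely hypothetical sketch of what drive-temperature-based fan control could look like (thresholds and duty values are made up, not Sony's):

def fan_duty(drive_temp_c):
    """Map an SSD temperature reading to a fan duty cycle between 0.0 and 1.0."""
    if drive_temp_c <= 50:
        return 0.30                    # quiet baseline for a cool drive
    if drive_temp_c >= 80:
        return 1.00                    # full speed near throttling territory
    return 0.30 + 0.70 * (drive_temp_c - 50) / 30   # linear ramp in between

print(fan_duty(45), fan_duty(65), fan_duty(85))      # 0.3 0.65 1.0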
 

ToTTenTranz

Banned
I believe mixed-precision INT4/INT8 etc. will be perfect for AMD's DLSS-like image supersampling, as it's all based on machine learning.
In my previous post I was arguing against it.
I understand that low-precision variables are enough for many cases of neural network inference, but in the case of DLSS it's probably still calculating pixel color values. If they downgraded the pixel color values down to 8bit then it would probably look really bad.

Furthermore, I don't know of any document or statement on DLSS that suggests it's using INT8 throughput. For all I know it's using FP16 Tensor FLOPs with FP32 accumulate (highest possible precision on the Tensor cores) which still has an enormous throughput on Turing and Ampere alike.
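A toy illustration of that precision point in Python (this says nothing about what DLSS actually does internally; it only shows how coarse an 8-bit step is compared with FP16 for colour-range values):

def quantize_to_int8(x):
    """Round a colour value in [0.0, 1.0] to its nearest 8-bit code and back."""
    code = round(x * 255)
    return code, code / 255.0

for x in (0.5, 0.123456, 0.707107):
    code, back = quantize_to_int8(x)
    print(f"{x:.6f} -> code {code:3d} -> {back:.6f} (error {abs(x - back):.6f})")
# The worst-case step is 1/255 ~ 0.0039, whereas FP16 resolves values in this
# range to roughly 0.0005 or finer.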


MS added integer 8/4 support for the Xbox. Sony would have had to also add it.
Microsoft didn't add INT8/4 support for RDNA2, AMD did.
There seem to be some custom optimizations made for the Series SoCs specifically for ML loads because Microsoft claimed as such, but it's not the 4/8x rates for INT8/4 because those are already present in Navi 21.


We know MS waited for full RDNA 2 and they added HW for Int8/4 and SFS streaming.
And Sony didn't?
Both consoles released 1 week apart from each other. Where did this idea that "Sony didn't wait for RDNA2" come from, other than pure speculation on the forum/twitter-sphere?
All that Microsoft's marketing material has been claiming is support for DX12 features, which the PS5 obviously hasn't any.
 
Last edited:

Omni_Manhatten

Neo Member
Cool. I think Cerny said that the custom controller has to take over the 3rd-party controller to take advantage of the PS5 features (that also bumps up the min spec required), which allows for any latency (I agree with you).

I didn't mean to suggest that you said PS5 has no INT4/INT8. I believe mixed-precision INT4/INT8 etc. will be perfect for AMD's DLSS-like image supersampling, as it's all based on machine learning.
MS added integer 8/4 support for the Xbox. Sony would have had to also add it. We know MS waited for full RDNA 2 and they added HW for Int8/4 and SFS streaming. MS uses the Azure chips because a majority of their AI work with Nvidia was in Azure. The cross platform benefits are
In my previous post I was arguing against it.
I understand that low-precision variables are enough for many cases of neural network inference, but in the case of DLSS it's probably still calculating pixel color values. If they downgraded the pixel color values down to 8bit then it would probably look really bad.

Furthermore, I don't know of any document or statement on DLSS that suggests it's using INT8 throughput. For all I know it's using FP16 Tensor FLOPs with FP32 accumulate (highest possible precision on the Tensor cores) which still has an enormous throughput on Turing and Ampere alike.



Microsoft didn't add INT8/4 support for RDNA2, AMD did.
There seem to be some custom optimizations made for the Series SoCs specifically for ML loads because Microsoft claimed as such, but it's not the 4/8x rates for INT8/4 because those are already present in Navi 21.



And Sony didn't?
Both consoles released 1 week apart from each other. Where did this idea that "Sony didn't wait for RDNA2" come from, other than pure speculation on the forum/twitter-sphere?
All that Microsoft's marketing material has been claiming is support for DX12 features, which the PS5 obviously hasn't any.
Have you even seen the Xbox reveal?
"We knew that many inference algorithms need only 8-bit and 4-bit integer positions for weights and the math operations involving those weights comprise the bulk of the performance overhead for those algorithms," says Andrew Goossen. "So we added special hardware support for this specific scenario. The result is that Series X offers 49 TOPS for 8-bit integer operations and 97 TOPS for 4-bit integer operations. Note that the weights are integers, so those are TOPS and not TFLOPs. The net result is that Series X offers unparalleled intelligence for machine learning."

They were literally touting the work they did to add this HW. Perhaps it is RDNA 2 that got it from MS, like SFS, and VRS Tier 2 which is also patented by MS.
 

ToTTenTranz

Banned
"We knew that many inference algorithms need only 8-bit and 4-bit integer positions for weights and the math operations involving those weights comprise the bulk of the performance overhead for those algorithms," says Andrew Goossen. "So we added special hardware support for this specific scenario. The result is that Series X offers 49 TOPS for 8-bit integer operations and 97 TOPS for 4-bit integer operations. Note that the weights are integers, so those are TOPS and not TFLOPs. The net result is that Series X offers unparalleled intelligence for machine learning."

They were literally touting the work they did to add this HW. Perhaps it is RDNA 2 that got it from MS, like SFS, and VRS Tier 2 which is also patented by MS.


None of what you quoted suggests Microsoft added quad-rate INT8 throughput to RDNA2's ALUs, because they didn't. As shown in this post above, those capabilities in the RDNA ALUs were already available on the Navi 14 cards that released a year before the SeriesX came out.
We don't know what "special hardware support" means, and for all I know it can be (and probably is) hardware acceleration for a handful of ML-specific instructions that may or may not be targeted at Azure server loads.

Of course that sampler feedback and VRS Tier 2 are patented by Microsoft. They're DirectX 12 features. Who else was supposed to apply for patents on DirectX12 features, other than the company that develops and supports DirectX?
 

RockOn

Member
None of what you quoted suggests Microsoft added quad-rate INT8 throughput to RDNA2's ALUs, because they didn't. As shown in this post above, those capabilities in the RDNA ALUs were already available on the Navi 14 cards that released a year before the SeriesX came out.
We don't know what "special hardware support" means, and for all I know it can be (and probably is) hardware acceleration for a handful of ML-specific instructions that may or may not be targeted at Azure server loads.

Of course that sampler feedback and VRS Tier 2 are patented by Microsoft. They're DirectX 12 features. Who else was supposed to apply for patents on DirectX12 features, other than the company that develops and supports DirectX?
I think some are missing the point. There are basic features of RDNA 2, e.g. hardware ray tracing, VRS, ML, mixed-precision integer etc., made by AMD. PS5 is custom RDNA 2 and can use all those features even though it doesn't use DirectX12U. DirectX12U is only an API and a means to control hardware features through software. PS5 will use its own APIs, probably an updated version of the GNM/GNMX APIs. And as for SFS & mesh shading, PS5 has its own (PS5 has a custom I/O block that takes care of streaming data & compressing/decompressing, all in hardware using 0 CPU cycles), and culling geometry/meshes etc. will be done with the PS5's custom Geometry Engine (which Cerny explained is more advanced).

So just because Microsoft calls a feature by some fancy buzzword like Sampler Feedback Streaming doesn't mean it's not present in PS5 under another name.
 

Omni_Manhatten

Neo Member
I think some are missing the point. There are basic features of RDNA 2, e.g. hardware ray tracing, VRS, ML, mixed-precision integer etc., made by AMD. PS5 is custom RDNA 2 and can use all those features even though it doesn't use DirectX12U. DirectX12U is only an API and a means to control hardware features through software. PS5 will use its own APIs, probably an updated version of the GNM/GNMX APIs. And as for SFS & mesh shading, PS5 has its own (PS5 has a custom I/O block that takes care of streaming data & compressing/decompressing, all in hardware using 0 CPU cycles), and culling geometry/meshes etc. will be done with the PS5's custom Geometry Engine (which Cerny explained is more advanced).

So just because Microsoft calls a feature by some fancy buzzword like Sampler Feedback Streaming doesn't mean it's not present in PS5 under another name.
I thought we were discussing the accelerated features and INT8/4 support. If there is any evidence that the PS5 didn't skip that feature, by all means let us know. Full RDNA 2 refers to the full support of accelerated feature sets. Even RDNA 1 supported most of these features. Where do you see that Sony has full RDNA 2? That would include supported accelerated features, like Sampler Feedback Streaming and VRS Tier 2 (which MS has a patent for). But I suppose only Sony gets to use those GE patents they have? So Sony can use any tool, but MS is not allowed to have the same GE as PS5? I think you guys need to rethink your approach to how AMD produces these APUs with Sony and MS. Both have custom features, and it's stupid to say the PS5 supports everything but just doesn't use DX12U. They took different approaches but it's all really the same? Oh boy.
 
And Sony didn't?
Both consoles released 1 week apart from each other. Where did this idea that "Sony didn't wait for RDNA2" come from, other than pure speculation on the forum/twitter-sphere?
All that Microsoft's marketing material has been claiming is support for DX12 features, which the PS5 obviously hasn't any.

It's commonly accepted that Sony started PS5 development earlier than Microsoft did Series development; the fact that Microsoft apparently were seriously considering what to do with the Xbox division from 2014-2016, and the budget for the division was scaled back going from that timeframe to even 2017, would lend support to this idea. That isn't to say they didn't start planning or prototyping until after this period; they'd have started much earlier. However, their rate of development on Series was probably slower during this time than Sony's rate of development on PlayStation 5.

As far as the "full RDNA 2, not full RDNA 2" stuff is concerned, at the end of the day it's kind of a moot point because unfortunately warriors on both sides have leveraged it as attack vectors against the other brand. Some Xbox people have obviously used it to denigrate PS5's design and supposed weaknesses, while some PlayStation fans have used it to implicitly denigrate Xbox by hyping up RDNA 3 features in PS5 and stressing "custom design choices" as if to insinuate one system is a "true" console and the other is "just a PC" (this also feeds back into the "smart engineering vs. brute force" angle that formed during the December - March period from a year ago).

Unfortunately, even prominent people in the gaming media played directly into this, again from both sides; you had Xbox channels like Dealer, Dee Batch, Colt (even now those same three channels are doing the same stuff, they tend to downtalk PS5 spec-wise regularly) etc. doing their stuff, and PlayStation channels like MBG (or tech-centric channels that would talk about consoles like Moore's Law Is Dead and Red Gaming Tech...the former of the two has been particularly bad at it while attempting to act as a neutral party but their own DMs (in trying to expose RGT) showed their own biases, let alone the phrasing they tended to use when discussing both platforms in relation to each other) doing it on the PlayStation side. I'm sure the "traditional" online media types did this as well, but I noticed the most fanboyism forming around these type of channels.

All of that just poisoned genuine discourse IMO, and that includes even around places like gaming forums, such as ours. So really there's no point in trying to say "this system's full RDNA 2 (even though die shots already show both systems have a mixture of RDNA 2 AND RDNA 1 silicon, and are missing RDNA 2 staples like IC)" or "oh yeah? Well THIS system's RDNA 3! (or has features that might be part of RDNA 3 I can't actually get into specifics of because I don't actually really know what I'm talking about)", etc. anymore. Because the talking points that've formed around that stuff really don't mean shit.
 

RockOn

Member
It's commonly accepted that Sony started PS5 development earlier than Microsoft did Series development; the fact that Microsoft apparently were seriously considering what to do with the Xbox division from 2014-2016, and the budget for the division was scaled back going from that timeframe to even 2017, would lend support to this idea. That isn't to say they didn't start planning or prototyping until after this period; they'd of started much earlier. However, their rate of development on Series was probably slower during this time than Sony's rate of development on PlayStation 5.

As far as the "full RDNA 2, not full RDNA 2" stuff is concerned, at the end of the day it's kind of a moot point because unfortunately warriors on both sides have leveraged it as attack vectors against the other brand. Some Xbox people have obviously used it to denigrate PS5's design and supposed weaknesses, while some PlayStation fans have used it to implicitly denigrate Xbox by hyping up RDNA 3 features in PS5 and stressing "custom design choices" as if to insinuate one system is a "true" console and the other is "just a PC" (this also feeds back into the "smart engineering vs. brute force" angle that formed during the December - March period from a year ago).

Unfortunately, even prominent people in the gaming media played directly into this, again from both sides; you had Xbox channels like Dealer, Dee Batch, Colt (even now those same three channels are doing the same stuff, they tend to downtalk PS5 spec-wise regularly) etc. doing their stuff, and PlayStation channels like MBG (or tech-centric channels that would talk about consoles like Moore's Law Is Dead and Red Gaming Tech...the former of the two has been particularly bad at it while attempting to act as a neutral party but their own DMs (in trying to expose RGT) showed their own biases, let alone the phrasing they tended to use when discussing both platforms in relation to each other) doing it on the PlayStation side. I'm sure the "traditional" online media types did this as well, but I noticed the most fanboyism forming around these type of channels.

All of that just poisoned genuine discourse IMO, that includes even around places like gaming forums, such as ours. So really there's no point in trying to say "this system's full RDNA 2 (even though die shots already show both systems have a mixture of RDNA 2 AND RDNA 1 silicon, and are missing RDNA 2 staples like IC)" or "oh yeah? Well THIS system's RDNA 3! (or has features that might be part of RDNA 3 I can't actually get into specifics of because I don't actually really know what I'm talking about)", etc. anymore. Because the talking points that've formed around that stuff really doesn't mean shit.
The RDNA 3 features claim was taken out of context by clueless Xbox YouTubers (Colt/Dealer/Dee Batch/Rand/madz). What was said was that some custom PS5 features may get used in RDNA 3 by AMD (Cerny even hinted at it in the Road to PS5).
 
Last edited:
The RDNA 3 features claim was taken out of context by clueless Xbox YouTubers (Colt/Dealer/Dee Batch/Rand/madz). What was said was that some custom PS5 features may get used in RDNA 3 by AMD (Cerny even hinted at it in the Road to PS5).
Except Cerny didn't hint at that; go back and rewatch Road to PS5. He clearly gives a timeline for "later this year" in reference to those GPUs, which logically meant Fall/Winter 2020 and going into early 2021.

Basically, his comment was always in reference to RDNA 2 GPUs, and we can see some of that cross-pollination between aspects of their design and PS5's design in a few areas. However, no one at the time watching AMD's RDNA 2 GPU showcase/reveal were viewing it with an open mind, so they missed the parallels.

Now does that 100% rule out any possibility of certain PS5 elements showing up in RDNA 3? No. But seeing that all previous RDNA features tend to scale upward with the successive generation anyway, by that logic then yes we could technically say a PS5 feature or two will show up in RDNA 3, if it's already present in RDNA 2, because RDNA 2 design features generally carry forward to RDNA 3.
 

Omni_Manhatten

Neo Member
Except Cerny didn't hint at that; go back and rewatch Road to PS5. He clearly gives a timeline for "later this year" in reference to those GPUs, which logically meant Fall/Winter 2020 and going into early 2021.

Basically, his comment was always in reference to RDNA 2 GPUs, and we can see some of that cross-pollination between aspects of their design and PS5's design in a few areas. However, no one at the time watching AMD's RDNA 2 GPU showcase/reveal were viewing it with an open mind, so they missed the parallels.

Now does that 100% rule out any possibility of certain PS5 elements showing up in RDNA 3? No. But seeing that all previous RDNA features tend to scale upward with the successive generation anyway, by that logic then yes we could technically say a PS5 feature or two will show up in RDNA 3, if it's already present in RDNA 2, because RDNA 2 design features generally carry forward to RDNA 3.
The issue is that RDNA 2 is a microarchitecture. The manufacturers then customise the architecture to fit their custom needs. So the PS5 was never going to be RDNA 3; it's a custom RDNA 2 variant. Its tech would still be RDNA 2 tech because it's built off that architecture. Saying something is RDNA 3 to make a point and belittle the work that went into the Series X, just for fake specs, makes no sense. Cache scrubbers going to be RDNA 3? OK, that's fine, but is that RDNA 3 or still just a benefit of custom RDNA 2? So when people joke about someone claiming RDNA 3 features, don't look stupid trying to correct them. RDNA 3 will be an architecture available to any company that wants to pay for it. When it's ready, custom variants of that architecture will be made, be it PS5 Pro or whatever. If Sony wants to keep that tech for themselves, it isn't RDNA 3, it's custom. So again, the whole RDNA 3 thing was always a farce, and yes, it was absolutely claimed by Sony fanboys for months to belittle Xbox's claims of full RDNA 2.
 

rnlval

Member
Microsoft never added it; it's a standard feature AMD added when evolving RDNA to RDNA 2.

[image]

As the image shows, mixed integer started life in RDNA 1.1, and it's the same in RDNA 2.
NAVI 12 has mixed data types RPM

From https://github.com/llvm/llvm-project/commit/9ee272f13d88f090817235ef4f91e56bb2a153d6

// GFX1010: "target-features"="+16-bit-insts,+ci-insts,+dl-insts,+dpp,+flat-address-space,+gfx10-insts,+gfx8-insts,+gfx9-insts,+s-memrealtime"
// GFX1011: "target-features"="+16-bit-insts,+ci-insts,+dl-insts,+dot1-insts,+dot2-insts,+dot5-insts,+dot6-insts,+dpp,+flat-address-space,+gfx10-insts,+gfx8-insts,+gfx9-insts,+s-memrealtime"
// GFX1012: "target-features"="+16-bit-insts,+ci-insts,+dl-insts,+dot1-insts,+dot2-insts,+dot5-insts,+dot6-insts,+dpp,+flat-address-space,+gfx10-insts,+gfx8-insts,+gfx9-insts,+s-memrealtime"
// GFX1030: "target-features"="+16-bit-insts,+ci-insts,+dl-insts,+dot1-insts,+dot2-insts,+dot5-insts,+dot6-insts,+dpp,+flat-address-space,+gfx10-3-insts,+gfx10-insts,+gfx8-insts,+gfx9-insts,+s-memrealtime"

GFX1010 = NAVI 10
GFX1011 = NAVI 12
GFX1012 = NAVI 14
GFX1030 = NAVI 21

[image]
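For anyone wondering what those dot*-insts flags actually buy you: conceptually they are packed dot products, e.g. four signed 8-bit lanes per 32-bit operand multiplied and accumulated in one operation. A rough Python model of the idea (not the exact semantics of any specific AMD instruction):

def unpack_i8x4(word):
    """Pull four signed 8-bit lanes out of a 32-bit word (lane 0 in the low byte)."""
    lanes = []
    for i in range(4):
        byte = (word >> (8 * i)) & 0xFF
        lanes.append(byte - 256 if byte >= 128 else byte)
    return lanes

def dot4_i8(a_word, b_word, acc=0):
    """Conceptual 4x INT8 dot product with a 32-bit accumulator."""
    return acc + sum(a * b for a, b in zip(unpack_i8x4(a_word), unpack_i8x4(b_word)))

a = 1 | (2 << 8) | (3 << 16) | (4 << 24)
b = 10 | (20 << 8) | (30 << 16) | (40 << 24)
print(dot4_i8(a, b))   # 1*10 + 2*20 + 3*30 + 4*40 = 300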
 

rnlval

Member
I think some are missing the point. There are basic features of RDNA 2, e.g. hardware ray tracing, VRS, ML, mixed-precision integer etc., made by AMD. PS5 is custom RDNA 2 and can use all those features even though it doesn't use DirectX12U. DirectX12U is only an API and a means to control hardware features through software. PS5 will use its own APIs, probably an updated version of the GNM/GNMX APIs. And as for SFS & mesh shading, PS5 has its own (PS5 has a custom I/O block that takes care of streaming data & compressing/decompressing, all in hardware using 0 CPU cycles), and culling geometry/meshes etc. will be done with the PS5's custom Geometry Engine (which Cerny explained is more advanced).

So just because Microsoft calls a feature by some fancy buzzword like Sampler Feedback Streaming doesn't mean it's not present in PS5 under another name.
Mix FP not just mix integer.
 

rnlval

Member
It's commonly accepted that Sony started PS5 development earlier than Microsoft did Series development; the fact that Microsoft apparently were seriously considering what to do with the Xbox division from 2014-2016, and the budget for the division was scaled back going from that timeframe to even 2017, would lend support to this idea. That isn't to say they didn't start planning or prototyping until after this period; they'd of started much earlier. However, their rate of development on Series was probably slower during this time than Sony's rate of development on PlayStation 5.

As far as the "full RDNA 2, not full RDNA 2" stuff is concerned, at the end of the day it's kind of a moot point because unfortunately warriors on both sides have leveraged it as attack vectors against the other brand. Some Xbox people have obviously used it to denigrate PS5's design and supposed weaknesses, while some PlayStation fans have used it to implicitly denigrate Xbox by hyping up RDNA 3 features in PS5 and stressing "custom design choices" as if to insinuate one system is a "true" console and the other is "just a PC" (this also feeds back into the "smart engineering vs. brute force" angle that formed during the December - March period from a year ago).

Unfortunately, even prominent people in the gaming media played directly into this, again from both sides; you had Xbox channels like Dealer, Dee Batch, Colt (even now those same three channels are doing the same stuff, they tend to downtalk PS5 spec-wise regularly) etc. doing their stuff, and PlayStation channels like MBG (or tech-centric channels that would talk about consoles like Moore's Law Is Dead and Red Gaming Tech...the former of the two has been particularly bad at it while attempting to act as a neutral party but their own DMs (in trying to expose RGT) showed their own biases, let alone the phrasing they tended to use when discussing both platforms in relation to each other) doing it on the PlayStation side. I'm sure the "traditional" online media types did this as well, but I noticed the most fanboyism forming around these type of channels.

All of that just poisoned genuine discourse IMO, that includes even around places like gaming forums, such as ours. So really there's no point in trying to say "this system's full RDNA 2 (even though die shots already show both systems have a mixture of RDNA 2 AND RDNA 1 silicon, and are missing RDNA 2 staples like IC)" or "oh yeah? Well THIS system's RDNA 3! (or has features that might be part of RDNA 3 I can't actually get into specifics of because I don't actually really know what I'm talking about)", etc. anymore. Because the talking points that've formed around that stuff really doesn't mean shit.
PC RDNA 2's hardware feature set was being driven by NVIDIA's market and design leadership, e.g. AMD had to follow NVIDIA's BVH raytracing use-case tech demo.
"Full" RDNA 2 features already exist in Turing RTX. Turing RTX still has extra hardware like Tensor cores and BVH traversal, which run on the normal shaders with RDNA 2.
 

rnlval

Member
ML is more about software than hardware.

Higher precision or low-precision integer support helps to speed up the process, but the key is smartly coded software that makes it good... the logic is what makes ML a thing.

That said, there are a lot of conflicting reports about whether PS5 supports INT4/INT8 or not.
ML can be run on any DX11 GPU. The argument is about process efficiency via hardware changes or additional hardware.
 

ethomaz

Banned
ML can be run on any DX11 GPU. The argument is about process efficiency via hardware changes or additional hardware.
To be exact, it's about the GPU supporting packed math INT4/INT8 via hardware.
Seems like RDNA was supposed to support it but it was broken (a silicon bug, not fixable)... so AMD just fixed it in the RDNA-derived cards.
RDNA 2 has it fixed too.

So the difference was whether the chip in PS5 supported packed math INT4/INT8... for Series X we have confirmation.
 

rnlval

Member
To be exact, it's about the GPU supporting packed math INT4/INT8 via hardware.
Seems like RDNA was supposed to support it but it was broken (a silicon bug, not fixable)... so AMD just fixed it in the RDNA-derived cards.
RDNA 2 has it fixed too.

So the difference was whether the chip in PS5 supported packed math INT4/INT8... for Series X we have confirmation.

RX 6800 series was confirmed quad-rate INT8 or octa-rate INT4 processing via the LLVM project. We know GFX1030/NAVI 21 is a strict superset of GFX1011/NAVI 12.

Any DX11 GPU can process lesser data types for ML but it's without double rate or quad-rate or octa-rate processing. Extra hardware features lead to a larger chip area.

NAVI 10 baseline IP has double rate INT16 and FP16. The double rate INT16 feature can process INT4 and INT 8 at a double rate processing.

Microsoft has claimed Series X offers 49 TOPS for 8-bit integer operations and 97 TOPS for 4-bit integer operations.

NVIDIA's RTX Tensor cores are more suitable for machine learning hence AMD's CDNA has Tensor cores.
 
Last edited:

ethomaz

Banned
RX 6800 series was confirmed quad-rate INT8 or octa-rate INT4 processing via the LLVM project. We know GFX1030/NAVI 21 is a strict superset of GFX1011/NAVI 12.

Any DX11 GPU can process lesser data types for ML but it's without double rate or quad-rate or octa-rate processing. Extra hardware features lead to a larger chip area.

NAVI 10 baseline IP has double rate INT16 and FP16. The double rate INT16 feature can process INT4 and INT 8 at a double rate processing.

Microsoft has claimed Series X offers 49 TOPS for 8-bit integer operations and 97 TOPS for 4-bit integer operations.
That is my point.... the RDNA baseline has broken quad-rate INT8 and octa-rate INT4.... it has the silicon support but it is broken... the newer cards based on RDNA already have it fixed (it's just Navi 10 that will continue with it broken).

For example, Navi 12 and later chips already have it fixed... that includes any Navi 2x chip.

The chances that PS5 doesn't have packed INT4/INT8 math are really near 0.
But you know some fanbase will tell you there is no confirmation.

PS5 probably has 41.2 TOPS for 8-bit integer and 82.4 TOPS for 4-bit integer.
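For reference, those TOPS figures fall straight out of the usual arithmetic (assuming 64 lanes per CU, 2 ops per lane per clock, and the 4x/8x packed-math rates being discussed):

def tops(cus, ghz, rate):
    """Tera-ops per second: CUs x 64 lanes x 2 ops (FMA) x clock x packed-math rate."""
    return cus * 64 * 2 * ghz * rate / 1000.0

print(round(tops(52, 1.825, 4), 1), round(tops(52, 1.825, 8), 1))   # ~48.6 / ~97.2 (Series X)
print(round(tops(36, 2.23, 4), 1), round(tops(36, 2.23, 8), 1))     # ~41.1 / ~82.2 (PS5 at max clock)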
 
Last edited:

rnlval

Member
That is my point.... the RDNA baseline has broken quad-rate INT8 and octa-rate INT4.... it has the silicon support but it is broken... the newer cards based on RDNA already have it fixed (it's just Navi 10 that will continue with it broken).

For example, Navi 12 and later chips already have it fixed... that includes any Navi 2x chip.

The chances that PS5 doesn't have packed INT4/INT8 math are really near 0.
But you know some fanbase will tell you there is no confirmation.

PS5 probably has 41.2 TOPS for 8-bit integer and 82.4 TOPS for 4-bit integer.

[image]


Rosario Leonardi is Principal Graphics Engineer at Sony Interactive Entertainment https://uk.linkedin.com/in/🐱💻-rosario-leonardi-ab10046

XSX's extra hardware features are useless if the middleware software is not up to scratch like NVIDIA RTX's.
 
Last edited:

ethomaz

Banned
[image]


Rosario Leonardi is Principal Graphics Engineer at Sony Interactive Entertainment https://uk.linkedin.com/in/🐱💻-rosario-leonardi-ab10046
There is no added silicon for ML, just what you have on RDNA, which uses the CUs for ML; that is what you have on Series X.
Only nVidia has ML-dedicated silicon.

nVidia = dedicated ML units
AMD = ML in the CUs through RPM, with INT4/INT8/INT16 support in all RDNA / RDNA 2 cards (Navi 10 having a bug).

Like I said, unless Sony asked for RPM to be removed (which makes no sense because they use RPM INT16 for BC), the chances of not having RPM INT4/INT8 are zero.

All RDNA cards have CUs with RPM INT16, INT8 and INT4 support.
 
Last edited:

rnlval

Member
There is no added silicon for ML, just what you have on RDNA, which uses the CUs for ML; that is what you have on Series X.
Only nVidia has ML-dedicated silicon.

nVidia = dedicated ML units
AMD = ML in the CUs through RPM, with INT4/INT8/INT16 support in all RDNA / RDNA 2 cards (Navi 10 having a bug).
[image]


NAVI 10 lacking multi-precision quad-rate and octa-rate processing is optional, i.e. it's not a bug. NAVI 10's die size is 251 mm2, and AMD restricts chip-area growth for x700-class SKUs, i.e. HD 7800 is 212 mm2, RX 480 is 232 mm2.

Xbox Series S RDNA 2 GPU doesn't have octa-rate INT4 and quad-rate INT8 ML hardware features.
 
Last edited:

ethomaz

Banned
[image]


NAVI 10 lacking multi-precision quad-rate and octa-rate processing is optional, i.e. it's not a bug. NAVI 10's die size is 251 mm2, and AMD restricts chip-area growth for x700-class SKUs, i.e. HD 7800 is 212 mm2, RX 480 is 232 mm2.
It is a bug lol
Navi 10 has the silicon for INT8/INT4 RPM... it is just bugged... Navi 10 and Navi 12 have the same CU size.
Yes, Series S supports RPM INT8 and INT4.

AMD internally calls the fixed RPM INT8/INT4 RDNA "RDNA 1.1"... only Navi 10 used the bugged version.

BTW you just need to do a bit of research to understand that the PS5 guys were talking about dedicated ML units.

All RDNA cards use CUs for ML.
 
Last edited:

rnlval

Member
It is a bug lol
Navi 10 has the silicon for INT8/INT4 RPM... it is just bugged... Navi 10 and Navi 12 have the same CU size.
Yes, Series S supports RPM INT8 and INT4.

AMD internally calls the fixed RPM INT8/INT4 RDNA "RDNA 1.1"... only Navi 10 used the bugged version.

BTW you just need to do a bit of research to understand that the PS5 guys were talking about dedicated ML units.




David Cage, CEO and founder of Quantic Dream,

It is always challenging to compare hardware, as they always have advantages and disadvantages. It is not just a matter of CPU or frequency; it is more about the consistency of the components and the possibilities of advanced features.

The CPU of the two consoles uses the same processor (slightly faster on Xbox Series X), the GPU of the Xbox also seems more powerful, as it is 16% faster than the PS5 GPU, with a bandwidth that is 25% faster. The transfer speed from the SSD is twice as fast on PS5.

The shader cores of the Xbox are also more suitable to machine learning, which could be an advantage if Microsoft succeeds in implementing an equivalent to Nvidia’s DLSS (an advanced neural network solution for AI).

Overall, I think that the pure analysis of the hardware shows an advantage for Microsoft, but experience tells us that hardware is only part of the equation: Sony showed in the past that their consoles could deliver the best-looking games because their architecture and software were usually very consistent and efficient.
--------------


Notice the context is about shader cores (hardware) more suitable to ML and NOT about ML software
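For what it's worth, the percentages in the quote line up roughly with the commonly cited public specs (a quick sanity check; the XSX bandwidth figure is for its fast 10 GB pool):

XSX_TFLOPS, PS5_TFLOPS = 12.15, 10.28   # FP32 compute
XSX_BW, PS5_BW = 560, 448               # GB/s
XSX_SSD, PS5_SSD = 2.4, 5.5             # GB/s raw SSD throughput

print(f"GPU: {XSX_TFLOPS / PS5_TFLOPS - 1:.0%}")    # ~18% (Cage says 16%)
print(f"Bandwidth: {XSX_BW / PS5_BW - 1:.0%}")      # 25%
print(f"SSD: {PS5_SSD / XSX_SSD:.1f}x")             # ~2.3x ("twice as fast")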
 
Last edited:

ethomaz

Banned
David Cage, CEO and founder of Quantic Dream,

It is always challenging to compare hardware, as they always have advantages and disadvantages. It is not just a matter of CPU or frequency; it is more about the consistency of the components and the possibilities of advanced features.

The CPU of the two consoles uses the same processor (slightly faster on Xbox Series X), the GPU of the Xbox also seems more powerful, as it is 16% faster than the PS5 GPU, with a bandwidth that is 25% faster. The transfer speed from the SSD is twice as fast on PS5.

The shader cores of the Xbox are also more suitable to machine learning, which could be an advantage if Microsoft succeeds in implementing an equivalent to Nvidia’s DLSS (an advanced neural network solution for AI).

Overall, I think that the pure analysis of the hardware shows an advantage for Microsoft, but experience tells us that hardware is only part of the equation: Sony showed in the past that their consoles could deliver the best-looking games because their architecture and software were usually very consistent and efficient.
--------------


Notice the context is about shader cores (hardware) more suitable to ML and NOT about ML software
Series X has more shader cores... yes.

It is unbelievable how Xbox fans twist things lol, your two examples are a classic show... that's why the group is known for FUD.
 
Last edited:

rnlval

Member
Series X has more shader cores... yes.

It is unbelievable how Xbox fans twist things lol, your two examples are a classic show... that's why the group is known for FUD.
Microsoft's engineers specifically highlighted these capabilities.

[...] we have gone even further introducing additional next-generation innovation such as hardware accelerated Machine Learning capabilities for better NPC intelligence, more lifelike animation, and improved visual quality via techniques such as ML powered super resolution
 

rnlval

Member
Series X has more shader cores... yes.

It is unbelievable how Xbox fans twist things lol, your two examples are a classic show... that's why the group is known for FUD.
Note that I have asked multiple Sony engineers about PS5 GPU's octa-rate INT4 and quad-rate INT8 hardware capability and I'm waiting for the replies
 

ethomaz

Banned
Note that I have asked multiple Sony engineers about PS5 GPU's octa-rate INT4 and quad-rate INT8 hardware capability and I'm waiting for the replies
We will have your "answer" pretty soon.
FidelityFX SuperResolution is about to use these INT4/INT8/INT16 RPM at max... let's see if PS5 won't support it.
 

rnlval

Member
We will have your "answer" pretty soon.
FidelityFX SuperResolution is about to use these INT4/INT8/INT16 RPM at max... let's see if PS5 won't support it.
That's the wrong question to ask, since INT4/INT8 workloads can run at the same rate as FP32 or double-rate FP16. The difference is the time to complete the workload, i.e. it wouldn't break the software.

This is why I specifically ask about octa-rate INT4 and quad-rate INT8 processing capability. I already know ML workloads can run on any DX11 GPU.


Series X has more shader cores... yes.
That's your interpretation.

It is unbelievable how Xbox fans twist things lol, your two examples are a classic show... that's why the group is known for FUD.
Sony fans had their smugness about a unified CPU L3 cache, and that was debunked, in addition to the reduced FPU capability.
 
Last edited:

ethomaz

Banned
That's a wrong question to ask since INT4/INT8 workloads can run at the same rate as FP32 or double rate FP16. The difference is time to complete the workload i.e. it wouldn't break the software.

This is why I specifically ask for octa-rate INT4 and quad-rate INT8 processing capability. I already know ML workloads can run any DX11 GPU.



That's your interpretation.


Sony fans had their smugness about a unified CPU L3 cache, and that was debunked, in addition to the reduced FPU capability.
Again INT4/INT8 RPM.
 

rnlval

Member
Again INT4/INT8 RPM.
To be determined, just like the PS5 CPU's unified L3 cache smugness.

You can't assume the PS5 GPU has "full RDNA 2" when the WGP design does seem closer to RDNA 1, along with the render backends; hence every major hardware feature after NAVI 10 needs to be verified, e.g. hardware-accelerated raytracing was confirmed for PS5 only after a social media storm.


For the XSX GPU, MS has confirmed (being transparent) "full RDNA 2" in relation to the hardware features that underpin DX12U, plus quad-rate INT8 and octa-rate INT4 hardware ML features.
 