
PS5 may support Zen 3 feature

...and yet it hasn't been confirmed by Sony. You'd think if this were a thing, it would have been confirmed by now. It's not like NDAs are holding them up anymore 🤷‍♂️

I honestly thought we were beyond the spec threads especially this close to release.

Some people are still stuck on winning a power narrative even if they told themselves they were over it earlier in the year, simply by conveniently shifting the definition of "power" to an ever-widening scale of applicable metrics. Now it's Zen 3 unified cache. Tomorrow, it could be a special block of ReRAM on the Toshiba NAND modules.

It's like a roulette wheel on The Price is Right; you never really win big, but if you're even halfway competent you won't flat-out lose everything on a bad spin, either. Take the 50/50 and see what sticks.
 
No, they wouldn't. I'm not saying it's true, I don't know, but I do know nobody cares about Zen 3 or RDNA 3 outside of GAFfers and hardcore tech enthusiasts. People care about games and the way they look, so that's what Sony shows. Maybe 1% of their user base cares about the names of their hardware parts and APIs. Yet Demon's Souls gameplay demo #2 already has 2.5 million views.

That's what we should care about too. Nobody here understands how a console works. People here can't even understand how the frequency works on PS5, so there's no chance they understand RDNA or the way consoles work, which is a lot more complex. Hint: it's not about teraflops. There are tons of parameters affecting performance, and cache is a major one.

If I'm not mistaken, Matt Hargett, a former PlayStation engineer, also hinted that cache was a more important parameter than a small frequency difference. It may or may not mean he knows something... It was on his Twitter.

The unified L3 cache on the CPU is almost certain. Matt circled around his NDA, but he did hint at it in one of his tweets, indicating that the latency gains would be more important than a small frequency delta. The geometry engine is also highly customized and, as has been rumoured, one of the main rendering drivers of the system. Nothing really new in that video, but it presents a nice summary of some of the particulars of the system.
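To see why a latency/cache win can beat a small frequency delta, here's a toy average-memory-access-time (AMAT) model. Every number in it is invented purely for illustration; nothing here is a confirmed PS5 or Zen figure:

```python
# Toy AMAT model: AMAT = hit_rate * L3_latency + miss_rate * DRAM_latency.
# All latencies and hit rates below are made up purely for illustration.
def amat_ns(l3_hit_rate, l3_lat_ns=10.0, dram_lat_ns=80.0):
    """Average memory access time in nanoseconds."""
    return l3_hit_rate * l3_lat_ns + (1.0 - l3_hit_rate) * dram_lat_ns

split   = amat_ns(0.60)  # split 2x4MB L3: assume more misses go to DRAM
unified = amat_ns(0.70)  # unified 8MB L3: assume a better hit rate

# Under these invented numbers, the hit-rate gain cuts AMAT by ~18%,
# while a 100 MHz bump on a ~3.5 GHz core only buys ~3% more compute.
```

The point isn't the exact figures; it's that memory-bound code responds to cache behaviour far more than to a small clock delta.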
 
The unified L3 cache on the CPU is almost certain. Matt circled around his NDA, but he did hint at it in one of his tweets, indicating that the latency gains would be more important than a small frequency delta. The geometry engine is also highly customized and, as has been rumoured, one of the main rendering drivers of the system. Nothing really new in that video, but it presents a nice summary of some of the particulars of the system.

Okay, but here's the issue: the NDAs over Zen 3 would more or less be up by now, since AMD has already publicly revealed Zen 3, and Zen 3 processors will be coming to market very, very soon.

Hint. Speculate. Rumor. Etc., etc. That's my main rub with a lot of this PS5 speculation this late in the game: by now we usually have a good enough suite of confirmed features. We did for PS4 this gen, so I'm inclined to believe that anything of note Sony wanted to speak up on, they already did in March. And if a unified L3$ on the CPU is such an important feature (it certainly helps), they could've at least hinted at it back in Road to PS5. They could basically confirm it now, even with a general blanket statement similar to what MS has done with RDNA 2 support on their systems.

The fact that neither of these things has happened suggests to me these rumors aren't getting any further than speculation, and in fact Sony's strategy seems to be what I guessed it would be: if you can't boast of certain hardware features outright, just let the games do the talking. They've been doing that since the UE5 demo back in May, and they've been very on-point in this regard, arguably more than Microsoft (for a myriad of reasons). But pairing that with the lack of official confirmation of certain hardware features upfront just shows me where they likely stand.

That all said, when you've got games as good-looking as R&C Rift Apart, DS Remake, Miles Morales, etc., you don't even need to expose any potential weaknesses by getting into the spec game. Not to say MS doesn't have some lookers on their end too, but most of them are 2021 onward, and most of them are 3P games. The only confirmed 1P game on MS's front in league with the PS5 stuff I just mentioned coming this year (to the Series consoles) is the Gears 5 update (unless I'm forgetting something?).

You'd figure that would be enough for some folks, but nope, they're still as obsessed with the power narrative as they were in March or even last December; the definition of "power" has just mutated several times since late March, apparently 🤷‍♂️
 

Krisprolls

Banned
Personally, I'm perfectly fine with letting the games do the talking. DS Remake, R&C, and Miles Morales look fantastic; I don't need to know the tech behind them.

But some people (including DF) said there would be a surprise, so everybody here is speculating.
 

BrentonB

Member
Mark Cerny specifically mentioned that upcoming AMD hardware would also have several features in common with the PS5, a result of Sony and AMD's collaboration. Perhaps this is part of what he was referring to?
 

Lethal01

Member
Posters on this website are obsessed with a phantom "secret sauce" in PS5. We've known the specs for months. Accept the console for what it is or buy XSX or a gaming PC. The era of FUD and misinformation is tiresome and must end.

We know the general specs; we don't know every part of the device. Don't be silly.
There is no secret sauce, we just don't really have the details of how the system is built.
 

Lethal01

Member
How long until we ban RGT? Really gonna need some proof for this shit. Why would he have any inside information?

He literally does interviews with people working with AMD and Sony, as well as other people in the field. It's a guarantee he has sources.
 
RGT said the GPU does not have an additional large SRAM; he said his sources told him that is not the case. It's not a large cache like Infinity Cache. He said it's about coherency and fast access to cache. Timestamped.




  • Geometry Engine, according to his sources, is totally custom. Sony basically designed it. "According to developers, it culls geometry even earlier and with more control than a mesh shader."
  • VRS is part and parcel of Geometry Engine.
 

assurdum

Banned
I'm really curious to know why so many people who don't believe RGT are so invested in wasting their time in threads about his guesses, just to troll. I don't get it.
 

rnlval

Member
The unified L3 cache on the CPU is almost certain. Matt circled around his NDA, but he did hint at it in one of his tweets, indicating that the latency gains would be more important than a small frequency delta. The geometry engine is also highly customized and, as has been rumoured, one of the main rendering drivers of the system. Nothing really new in that video, but it presents a nice summary of some of the particulars of the system.



Matt didn't confirm PS5's Zen 2 unified L3 cache.

PC Zen 2 APUs don't have a separate I/O die, which results in slightly lower latency and counters the smaller L3 cache.
 

assurdum

Banned
...and yet it hasn't been confirmed by Sony. You'd think if this were a thing, it would have been confirmed by now. It's not like NDAs are holding them up anymore 🤷‍♂️



Some people are still stuck on winning a power narrative even if they told themselves they were over it earlier in the year, simply by conveniently shifting the definition of "power" to an ever-widening scale of applicable metrics. Now it's Zen 3 unified cache. Tomorrow, it could be a special block of ReRAM on the Toshiba NAND modules.

It's like a roulette wheel on The Price is Right; you never really win big, but if you're even halfway competent you won't flat-out lose everything on a bad spin, either. Take the 50/50 and see what sticks.
I'm genuinely asking if some of you are getting the point of his videos. This video is about guesses and rumours, not just secret sauce. I'd like to know where he tries to shift the definition of power between the two consoles when he never once implied that PS5 hardware is more powerful than the Series X.
 

rnlval

Member
Good news if true. How much better is Zen 3 over 2 because of this unified cache?
Zen 3 includes AVX improvements in addition to the lower-latency unified L3 cache of its 8-core CCX.

From https://www.sisoftware.co.uk/2020/1...-benchmarks-cpu-8-core-16-thread-performance/

SiSoftware's Executive Summary:
Zen3 (X, 8-core) is ~25-40% faster than Zen2 (X, 8-core) across all kinds of algorithms. We have to give it 10/10 overall!

Despite no major architectural changes over Zen2 (except larger 8-core single CCX layout and thus unified L3 cache), Zen3 manages to be quite a bit faster across legacy and heavily vectorised SIMD algorithms: it beats the competition even with AVX512 and more cores (e.g. 10-core SKL-X). Even streaming algorithms (memory-bound) improve over 20%. We certainly did not expect performance to be this good.

In effect, it is like getting 50% more cores – 8-core Zen3 performs like a 12-core Zen2 (e.g. 3900X) – and thus even a 10-core 10900K cannot compete. Considering you can just “pop it” into an existing AM4 mainboard (requires a BIOS update to support it) it is a massive upgrade from say, original Zen1/Zen+.

---------


AMD didn't elaborate on Zen 3's AVX improvements.
 

MastaKiiLA

Member
Is there an RGT video where he states that the CPU uses 8MB of shared L2 cache? The link in the OP is not to an RGT video.

RGT has been one of the best sources on this forum, as his AMD sources are legit. His speculation is very pointed; if he made a claim as specific as 8MB of shared L2 cache, I'd believe him. He's not some fake insider who spouts off vagaries. That said, I'm having difficulty finding any such claim from him.

EDIT: I think some of you are mistaking RGT for MLiD. Moore's Law is Dead is the YTer who recently made some inflammatory comments, and he uses RGT as a source on some of his stuff. But they're not the same person, and RGT is the more trusted of the pair.
 

rnlval

Member
Is there an RGT video where he states that the CPU uses 8MB of shared L2 cache? The link in the OP is not to an RGT video.

RGT has been one of the best sources on this forum, as his AMD sources are legit. His speculation is very pointed; if he made a claim as specific as 8MB of shared L2 cache, I'd believe him. He's not some fake insider who spouts off vagaries. That said, I'm having difficulty finding any such claim from him.

EDIT: I think some of you are mistaking RGT for MLiD. Moore's Law is Dead is the YTer who recently made some inflammatory comments, and he uses RGT as a source on some of his stuff. But they're not the same person, and RGT is the more trusted of the pair.
The shared L2 cache is not correct. The debate is about shared L3 cache.

"To be, or not to be ". William Shakespeare's play Hamlet, Act 3, Scene 1
 

rnlval

Member
RGT said the GPU does not have an additional large SRAM; he said his sources told him that is not the case. It's not a large cache like Infinity Cache. He said it's about coherency and fast access to cache. Timestamped.




  • Geometry Engine, according to his sources, is totally custom. Sony basically designed it. "According to developers, it culls geometry even earlier and with more control than a mesh shader."
  • VRS is part and parcel of Geometry Engine.

FYI, the earliest geometry cull is on the CPU side, when the CPU processes the geometry's control points. Sloppy programming sends unseen geometry to the GPU and relies on the GPU to cull it.
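As a toy sketch of what CPU-side culling means (my own illustration, not engine or console code), here's a bounding-sphere test against frustum planes that drops objects before any draw call is submitted:

```python
# Coarse CPU-side visibility cull: test each object's bounding sphere
# against the view-frustum planes and only submit the survivors to the GPU.
from dataclasses import dataclass

@dataclass
class Plane:
    # Plane n.p + d = 0, with the normal n pointing into the frustum.
    nx: float
    ny: float
    nz: float
    d: float

def sphere_visible(center, radius, planes):
    """False if the bounding sphere lies fully outside any frustum plane."""
    cx, cy, cz = center
    for p in planes:
        if p.nx * cx + p.ny * cy + p.nz * cz + p.d < -radius:
            return False  # completely behind this plane: cull on the CPU
    return True           # potentially visible: worth a draw call

# Single "near" plane facing +z: objects behind the camera get culled.
planes = [Plane(0.0, 0.0, 1.0, 0.0)]
objects = [((0.0, 0.0, 5.0), 1.0),    # in front -> kept
           ((0.0, 0.0, -5.0), 1.0)]   # behind -> never sent to the GPU
draw_list = [o for o in objects if sphere_visible(o[0], o[1], planes)]
```

A real engine would use all six frustum planes and tighter bounds, but the principle is the same: geometry culled here never costs GPU time at all.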
 

jaysius

Banned
Yes, Sony has put all kinds of expensive tech they didn't hype up, and AMD didn't officially announce a week before launch because?? SECRETS!!! This is amazing.

I can't wait until next week when we hear that Sony has SECRETLY put a nuclear reactor in the PS5s.
 

assurdum

Banned
Yes, Sony has put all kinds of expensive tech they didn't hype up, and AMD didn't officially announce a week before launch because?? SECRETS!!! This is amazing.

I can't wait until next week when we hear that Sony has SECRETLY put a nuclear reactor in the PS5s.
There isn't anything expensive about it. Jeez, why do people keep talking about stuff they barely get? The most annoying thing is seeing posts like this where idiocy is elevated to a smart attitude.
 

bitbydeath

Member
Yes, Sony has put all kinds of expensive tech they didn't hype up, and AMD didn't officially announce a week before launch because?? SECRETS!!! This is amazing.

I can't wait until next week when we hear that Sony has SECRETLY put a nuclear reactor in the PS5s.

They did hype it up though.
Road To PS5 said:
If you see a similar discrete GPU available as a PC card at roughly the same time as we release our console that means our collaboration with AMD succeeded.

We don’t know what feature the above is referring to.
 
I'm not sure which is bigger BS: Team Green beating their chests that 12 TFLOPS means their console will output games at native 4K/60fps across the board, or that the Velocity Architecture will give them even bigger SSD speeds than PS5, or Team Blue coming up with this "Zen 3 feature set or RDNA 3 feature set in PS5" malarkey.
 
FYI, the earliest geometry cull is on the CPU side, when the CPU processes the geometry's control points. Sloppy programming sends unseen geometry to the GPU and relies on the GPU to cull it.

Is that still the case with an APU? With the CPU and GPU sharing the same RAM, and probably shared cache as well?
 

rnlval

Member
Is that still the case with an APU? With the CPU and GPU sharing the same RAM, and probably shared cache as well?
Under AMD Fusion:
1. The CPU and GPU can share pointers for implied data transfers.
2. The CPU-GPU Fusion link is the direct path for data exchange between the CPU and GPU.

Certain workloads shouldn't have CPU interference, e.g. ROP operations with the burst read/write modes; the CPU should stay out of the GPU's rasterization.

Certain workloads can be shared between the CPU and GPU, e.g. geometry-control-point processing.


Xbox-Block.jpg




Intel's L3 cache shared between the CPU and GPU, as designed for Skylake with the Iris Pro IGP

bmuak.jpg
 
I'm genuinely asking if some of you are getting the point of his videos. This video is about guesses and rumours, not just secret sauce. I'd like to know where he tries to shift the definition of power between the two consoles when he never once implied that PS5 hardware is more powerful than the Series X.

The guesses and rumors usually pertain to "custom features" that, by the nature of us not knowing much about them (due to lack of documentation), are essentially secret sauce. And the implication of such things is almost always that whatever features them will somehow leapfrog whatever seemingly doesn't. These things are never directly stated, but that's kind of not the point.

And I didn't mean that the definition of "power" has been transferred in a literal sense. It's more that the interpretation of "power" has gone from a strict, literal, traditional definition (i.e. teraflops) to a looser, relative, non-traditional one. The primary motivation for tying the power narrative to the traditional aspect of system design (the GPU) was to assert that whoever won that narrative had the "better" system. What we've observed since late March is a certain contingent of people shifting the interpretation of power onto other aspects of a system's design, such as the SSD, and framing their talking points around these new elements, outside the traditional metrics, as the new deciding factors of which design is supposedly superior.

However, it can't be helped that almost any time these metrics get changed up, the change tends to follow a pattern of benefiting one particular platform at the expense of the other, and that pattern has held consistent for months. Moreover, the same desires that motivated the traditional framing of "power" (i.e. asserting the superiority of whichever platform had the stronger GPU) are the EXACT same desires fueling these shifts toward more contextual, relative terms fitting other metrics of measure, usually in preference of a particular platform/brand, to push narratives in a direction that fits these new metrics of focus.

As far as I'm concerned, these latest rounds of PS5 RDNA 3 rumors are just more of the same, only a bit more refined. The biggest irony is that some of these same people (like RGT in this video) assert that Sony's strategy isn't to harp on technical features anymore and to "let the games do the talking"... so why are people like RGT so adamant about doing the complete opposite? Likely because the power narrative is still important for these types to win for their preferred platform. Conjuring new speculation that puts their preferred platform in a more "advanced" light satisfies that desire, and it can be done without forcing their preference to the forefront (which could amplify into a bias more easily read as negative toward whatever isn't their preferred platform/brand, ruining the image of neutrality they wish to maintain).

They did hype it up though.


We don’t know what feature the above is referring to.

We might, actually.

Pretty much all of the GPUs AMD showed off can have boost clocks of over 2 GHz. Cerny described PS5's GPU clock as a "continuous boost mode" at Road to PS5. AMD is introducing SmartShift with RDNA 2, including a variant that mimics what Sony is doing on PS5 WRT variable frequency between the CPU and GPU.

Care to take a guess what the feature is? Let's be logical about it: the feature is very likely the variable frequency through stuff like SmartShift and the other thing AMD talked about on Wednesday (I forgot the name) which allows for dynamic power sharing between their CPUs and discrete GPUs.

The signs are literally right in people's faces, but they're drunk on some fantastical unicorn hype about RDNA 3 features, even though RDNA 3 won't even arrive until early 2022 🤷‍♂️
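For what it's worth, the dynamic power-sharing idea is easy to sketch. This is a deliberately naive toy model with invented numbers, not AMD's actual SmartShift algorithm (which isn't public):

```python
# Toy power-budget shifting: a fixed total budget is split between CPU and
# GPU each frame in proportion to load, with floors so neither side starves.
TOTAL_W = 200.0                    # hypothetical shared power budget (watts)
MIN_CPU_W, MIN_GPU_W = 40.0, 80.0  # invented per-side minimums

def split_budget(cpu_load, gpu_load):
    """Return (cpu_watts, gpu_watts) for loads in [0, 1]."""
    total_load = (cpu_load + gpu_load) or 1.0
    cpu_w = TOTAL_W * cpu_load / total_load
    # Clamp so neither side drops below its floor.
    cpu_w = max(MIN_CPU_W, min(cpu_w, TOTAL_W - MIN_GPU_W))
    return cpu_w, TOTAL_W - cpu_w

# GPU-bound frame: most of the budget flows to the GPU side.
cpu_w, gpu_w = split_budget(cpu_load=0.2, gpu_load=0.9)
```

The real schemes move frequency/voltage rather than raw watts and react over milliseconds, but the budget-shifting principle is the same.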

Is there an RGT video where he states that the CPUs user 8MB of shared L2 cache? The link in the OP is not to an RGT video.

RGT has been one of the best sources on this forum, as his AMD sources are legit. His speculation is very pointed, where if he made a claim as specific as 8MB of share L2 cache, I'd believe him. He's not some fake insider who spouts off vagueries. That said, I'm having difficulty finding any such claim from him.

EDIT: I think some of you are mistaking RGT for MLiD. Moore's Law is Dead is the YTer who recently made some inflammatory comments, and he uses RGT as a source on some of his stuff. But they're not the same person, and RGT is the more trusted of the pair.

I agree that RGT is a much more credible lad than MLiD, and Moore's has more or less made some of the dumbest guesses about aspects of the next-gen console designs I've seen around. However, that doesn't mean RGT is impervious to bad speculation or bad sources on certain things.

What I find perplexing in all of this is: why are we relying on random secret devs to parlay these supposed features to us this late, right before the systems launch, and even then the claims are as vague as they've ever been? It simply doesn't stack up well in my book.

In the latest vid he's seemingly referring to Sony in-house features that could've been plucked by AMD for RDNA 3. Seeing that Sony likely shot first in finalizing their spec, wouldn't that have given AMD enough time to include some of these same things in... RDNA 2? If they're so good, why leave that performance on the table when they could take it to Nvidia even more, sooner? I don't think they would be so chaotic as to tell Sony, "Hey! We're gonna hold off on putting any of your features in our GPUs for a couple of years, 'kay?" If Sony's partnership with AMD is so strong, surely Sony would've helped implement these features into AMD's GPU designs sooner rather than later, correct?

So the timeline on that front is completely botched, and these latest rounds of speculation take a massive credibility hit because of it. In fact, take a look at bitbydeath's quote from Cerny's Road to PS5 talk. Cerny clearly referred to GPUs coming to market around the time of PS5's launch, not two years down the line. So what feature does Sony have that we've also seen in AMD's RDNA 2 GPU lineup, similar enough to indicate a successful collaboration?

Welp, it's variable frequency. AMD has its own variable-frequency setup for its Zen 3 CPUs and RDNA 2 GPUs, which is part of the reason their GPUs can hit such high boost clocks. That's very likely what Cerny was hinting at when you look at the timeline and the likelihood of things. However, even guys like RGT are so far up in the clouds with these "exotic, fantastical features" that they can't see the forest for the trees on this one.
 

mckmas8808

Mckmaster uses MasterCard to buy Slave drives
I'm pretty sure Sony would tell us if they had any support for RDNA 3 features. But if a youtuber says so I guess it's true!

Are you sure Sony will tell us? They haven't really said much before the PS5 launch.
 
Under AMD Fusion:
1. The CPU and GPU can share pointers for implied data transfers.
2. The CPU-GPU Fusion link is the direct path for data exchange between the CPU and GPU.

Certain workloads shouldn't have CPU interference, e.g. ROP operations with the burst read/write modes; the CPU should stay out of the GPU's rasterization.

Certain workloads can be shared between the CPU and GPU, e.g. geometry-control-point processing.


Xbox-Block.jpg




Intel's L3 cache shared between the CPU and GPU, as designed for Skylake with the Iris Pro IGP

bmuak.jpg


Oh. That bodes well for PS5's geometry engine, then: the APU's tight integration between CPU and GPU, plus Cerny's brand-new geometry engine.
 

longdi

Banned
...and yet it hasn't been confirmed by Sony. You'd think if this were a thing, it would have been confirmed by now. It's not like NDAs are holding them up anymore 🤷‍♂️

Some people are still stuck on winning a power narrative even if they told themselves they were over it earlier in the year, simply by conveniently shifting the definition of "power" to an ever-widening scale of applicable metrics. Now it's Zen 3 unified cache. Tomorrow, it could be a special block of ReRAM on the Toshiba NAND modules.

It's like a roulette wheel on The Price is Right; you never really win big, but if you're even halfway competent you won't flat-out lose everything on a bad spin, either. Take the 50/50 and see what sticks.

AFAIK the Sony fanbase are the masters of secret-sauce peddling.
I have no idea why it's always them, every gen.
Sony drips hints about their 'custom' hardware, and this group goes wild with secret sauces... 🤷‍♀️
 
Uh why exactly?

He is full of shit, the same as this Roberto guy, UK Fox Gamer, or whatever all those wannabe insiders are called that know nothing at all and are just guessing.
The only reason he is mentioned here is because fanboys love to eat his shit up.
He even said that PS5 has RDNA 3.0 features lol.
Don't believe his lies.
 
AFAIK the Sony fanbase are the masters of secret-sauce peddling.
I have no idea why it's always them, every gen.
Sony drips hints about their 'custom' hardware, and this group goes wild with secret sauces... 🤷‍♀️

To be perfectly fair, some PS dudes are definitely on the secret sauce high this time around, but some Xbox people like Misterxmedia were equally crazy about it with the XBO. The whole "hidden dGPU in the power brick" stuff didn't become a meme for nothing ;)

We're just witnessing the shoe on the other foot this time around, that's all. I mean the real answer for this feature Sony were hinting at is right in front of everyone's faces but it almost feels like these latest RDNA 3 rumors are misdirects.
 

longdi

Banned
To be perfectly fair, some PS dudes are definitely on the secret sauce high this time around, but some Xbox people like Misterxmedia were equally crazy about it with the XBO. The whole "hidden dGPU in the power brick" stuff didn't become a meme for nothing ;)

We're just witnessing the shoe on the other foot this time around, that's all. I mean the real answer for this feature Sony were hinting at is right in front of everyone's faces but it almost feels like these latest RDNA 3 rumors are misdirects.

I think Misterxmedia was an outlier, or was sent to do damage control for the XBO.

Whereas every gen there are Sony fans speculating about great secrets engineered by the SCE team.
 

rnlval

Member
He is full of shit, the same as this Roberto guy, UK Fox Gamer, or whatever all those wannabe insiders are called that know nothing at all and are just guessing.
The only reason he is mentioned here is because fanboys love to eat his shit up.
He even said that PS5 has RDNA 3.0 features lol.
Don't believe his lies.

"RDNA 3.0 features" mean little when the RX 6800 has 96 ROPs, 6 Shader Engines, and a very fast 128 MB Infinity Cache designed for 4K frame buffers.

Remember, XBO's 32 MB eSRAM can handle a 1600x900 frame buffer without delta color compression.

128 MB can contain a 4K-resolution frame buffer with delta color compression.
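The back-of-envelope numbers check out for plain 32-bit colour targets (my own arithmetic; actual compression ratios vary with content):

```python
# Uncompressed frame buffer size at 4 bytes per pixel (RGBA8 / 32-bit colour).
def framebuffer_mib(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel / (1024 ** 2)

fb_900p = framebuffer_mib(1600, 900)   # ~5.5 MiB: fits XBO's 32 MB eSRAM
fb_4k   = framebuffer_mib(3840, 2160)  # ~31.6 MiB uncompressed
# With roughly 2:1 delta colour compression, several 4K render targets
# could sit in a 128 MB cache at once.
```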
 


About the Zen 3 unified cache and its source:

According to RedGamingTech, who is pretty well known, at around 21:25 in his video he states that PS5 uses Zen 2 cores but has the 8MB unified L3 cache you'd find on the new Zen 3 processors, according to his sources. A bit further on he states that there is some form of Infinity Cache on the GPU, but it's not the same as what you'd find on the desktop cards.

So if it's true, the CPU in the PS5 could be a bit better than the XSX's despite the 100MHz difference.

You mean well known for talking about stuff he thinks might be the case, without any evidence or sources?
 

TBiddy

Member
I think Misterxmedia was an outlier, or was sent to do damage control for the XBO.

Whereas every gen there are Sony fans speculating about great secrets engineered by the SCE team.

blueisviolet on Twitter is already one of the... special cases in that regard (on team green, that is).
 

geordiemp

Member
Well, looking at the PS5 APU being highly rectangular, it's most probable the CPU clusters are together, and as such it would be perfectly logical for PS5 to have a common CPU L3 cache, even if it's only 8 MB in size. If the upgrade was available at no extra die-space cost, then it's clear Sony would take it.

For the XSX layout the option isn't there anyway, as the CPU halves are effectively miles apart in silicon terms.

Also, the XSX is designed with a server-class CPU, so MS was clearly primarily interested in running 4 games with the CPU design.


UXwsr88.jpg
 
SonyGAF trying its hardest to downplay fewer CUs, lower CPU frequency, lower RAM bandwidth, and less SSD space for the same price as the XSX by inventing BS facts to give PS5 a positive spin.

Did I mention slow PSN servers?
 