
|OT| Next-Gen PS5 & XSX |OT| Speculation/Analysis/Leaks Thread

Hendrick's

Member
Jan 7, 2014
7,415
10,484
785
"Running natively", says Microsoft, means taking full advantage of the console. Some rumors suggest they've actually had to use GCN to allow for running natively.
Either way, it does not in any way indicate the full power of the console, and is a shit argument.
 

DaGwaphics

Member
Dec 29, 2019
2,377
3,333
460
Could be, but it seems a long, long way to go from 2.4 to 1.9, and why would you take such a CU hit, from 80 to 56? I must have been smoking next to those wafers lol
I don't know. Maybe just for market differentiation. Why does Intel bin 4-core chips from the same wafer as 10-core chips?

But there absolutely could be a third chip in the AMD lineup. They certainly used to do more than two.
 

Epic Sax CEO

Member
Nov 4, 2019
95
176
245
Hopefully on the 28th we get more clarification on RDNA2 in these consoles.
He is a well-known leaker with quite a few so-called tech YouTubers hovering around like flies on shit... but I agree it's FUD without any explanation or detail and should be ignored. We have the AMD RDNA2 reveal soon and lots to discuss; all will become clearer (a bit, anyway).
We don't need to wait; we already know that what he wrote there is absurd, completely wrong, and even impossible.
Just remember that this year Hot Chips already happened.
 
Last edited:
  • Like
Reactions: Riky

raul3d

Member
Mar 30, 2020
55
234
200
I've never seen a benchmark that showed any real difference between DRAM and DRAM-less SSDs. Gaming doesn't fit the usage pattern that maximizes DRAM use: basically, the data being read needs to be cached already to make a difference. This is easy to do in productivity work, where you may use the same files over and over again over a period of time. Gaming tends to exhaust the cache, eliminating the benefit.
I don't think the SSD's DRAM is used to cache files.
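For what it's worth, the DRAM on consumer SSDs mainly holds the FTL (flash translation layer) logical-to-physical mapping table rather than file data; a DRAM-less drive keeps only a small slice of that table in controller SRAM, so a mapping miss costs an extra NAND read. A toy model of the effect (every size and figure here is an illustrative assumption, not the spec of any real drive):

```python
import random
from collections import OrderedDict

LOGICAL_PAGES = 100_000_000   # 4K pages on the toy drive (assumed)
MAP_COVERAGE = 1_024          # logical pages covered per mapping-table page (assumed)
SRAM_SLOTS = 1_000            # mapping pages a DRAM-less controller can cache (assumed)

def extra_nand_reads(page_accesses, cached_slots):
    """Count mapping-table misses (each one an extra NAND read) under LRU caching."""
    cache = OrderedDict()
    misses = 0
    for page in page_accesses:
        map_page = page // MAP_COVERAGE
        if map_page in cache:
            cache.move_to_end(map_page)
        else:
            misses += 1
            cache[map_page] = True
            if len(cache) > cached_slots:
                cache.popitem(last=False)
    return misses

random.seed(0)
random_reads = [random.randrange(LOGICAL_PAGES) for _ in range(50_000)]
sequential_reads = list(range(50_000))

# Random 4K reads thrash the tiny SRAM map cache; sequential streaming barely notices.
print(extra_nand_reads(random_reads, SRAM_SLOTS))
print(extra_nand_reads(sequential_reads, SRAM_SLOTS))
```

With the full table cached (a large `cached_slots`, standing in for on-drive DRAM), the random-read misses drop to just the compulsory first touches, which is roughly why DRAM-less drives suffer most on sustained random reads and almost not at all on sequential ones.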
 

DaGwaphics

Member
Dec 29, 2019
2,377
3,333
460
And the different CUs per shader array (14 vs 10). You can't grow a shader array size in a bin.
If N21 Lite has 14 CUs per shader array, this is probably just Arden and never hits the desktop market at all. If that is a desktop chip, then yes, that would indicate a different piece of silicon (obviously). As you can tell, I don't follow the AMD chatter that closely.
 
Last edited:

geordiemp

Member
Sep 5, 2013
10,409
15,314
960
UK
Is this similar to the PS5 dropping down to 5TF claims?
Because it's vague and can therefore be interpreted any way, and therefore it's FUD.

If that tweaker means Infinity Cache is the main part of RDNA2 in his opinion, then both consoles will be lacking here, as 96 MB of L2 is damn huge.

What is RDNA2 but a lot of logic building blocks? Consoles won't have them all for sure, as some are very expensive in die space. Same with Zen 2: consoles won't have a 32 MB CPU cache; does that make them Zen 1.9?

Consoles will have some RDNA2 parts the PC does not have. Ho hum...

People get hung up over naming and marketing terms.
 
Last edited:
Mar 27, 2020
7,274
21,806
690
Because it's vague and can therefore be interpreted any way, and therefore it's FUD.

If that tweaker means Infinity Cache is the main part of RDNA2 in his opinion, then both consoles will be lacking here, as 96 MB is damn huge.

What is RDNA2 but a lot of logic building blocks? Consoles won't have them all for sure, as some are very expensive in die space. Same with Zen 2: consoles won't have a 32 MB CPU cache; does that make them Zen 1.9?

People get hung up over naming and marketing terms.
I definitely hate it when they give different names to the same features. It really makes it more difficult to determine which ones are unique to the platform.
 
  • Like
Reactions: geordiemp

OSC

Member
Jun 16, 2018
2,690
2,015
515
You should ask Cerny. How the hell do I know. :messenger_tears_of_joy:

I was just stating what can be seen in PC benchmarks. You can find several that compare the WD SN750 to the WD SN550 in gaming with nearly identical results (and in some cases the SN550 outperforms the SN750, since the path from the NAND through the DRAM can add latency when the cache misses).
TLC is very low performance; often a chunk of a TLC drive is run in SLC mode to increase performance. It wouldn't surprise me if RAM also improves the performance of TLC SSDs further too.
 
Mar 27, 2020
7,274
21,806
690
And how do you explain the PS5 GPU reaching 2.23GHz if not RDNA2?
Because it has RDNA2 CUs? This is something that Cerny confirmed earlier this year.

Just pointing out that GitHub looks really strange when you look at what's really in these systems. It's almost as if someone wanted to leak old specs on purpose instead of the newest ones.
 

Epic Sax CEO

Member
Nov 4, 2019
95
176
245
Is this similar to the PS5 dropping down to 5TF claims?
Worse, because we have more concrete and detailed information.
The "front end" is new; this is no "hybrid" mix of generation parts.
The RT function is an integral and basic part of the RDNA2 CU; one can't exist without the other.
It's thanks to Microsoft's presentation that we know a bit about Big Navi and its new features.
It's a mystery what he is trying to pull off with this stunt.
 
Last edited:
Mar 27, 2020
7,274
21,806
690
Worse, because we have more concrete and detailed information.
The "front end" is new; this is no "hybrid" mix of generation parts.
The RT function is an integral and basic part of the RDNA2 CU; one can't exist without the other.
It's thanks to Microsoft's presentation that we know a bit about Big Navi and its new features.
It's a mistake, what he is trying to pull off with this stunt.
So I guess the same applies to the PS5 when it comes to RDNA2.
 

ethomaz

is mad becasue DF didn't do a video on a video of a video of a video on PS5
Mar 19, 2013
33,481
21,969
1,175
38
Brazil
Because it has RDNA2 CUs? This is something that Cerny confirmed earlier this year.

Just pointing out that GitHub looks really strange when you look at what's really in these systems. It's almost as someone wanted to leak old specs on purpose instead of the newest ones.
GitHub never looked right.

Sony confirmed that they were already working at over 2GHz for the APU in 2018, but the GitHub tests show lower clocks in 2019.

And the GitHub data was pulled by a big Xbox fanboy... that makes the data suspicious, because how can a nobody Xbox fanboy have the power to delete the GitHub data? Unless he was the owner of the data on GitHub.
 
Last edited:

OSC

Member
Jun 16, 2018
2,690
2,015
515
Because it's vague and can therefore be interpreted any way, and therefore it's FUD.

If that tweaker means Infinity Cache is the main part of RDNA2 in his opinion, then both consoles will be lacking here, as 96 MB of L2 is damn huge.

What is RDNA2 but a lot of logic building blocks? Consoles won't have them all for sure, as some are very expensive in die space. Same with Zen 2: consoles won't have a 32 MB CPU cache; does that make them Zen 1.9?

Consoles will have some RDNA2 parts the PC does not have. Ho hum...

People get hung up over naming and marketing terms.
96MB is for 80 CUs; 32MB or maybe even 24MB might be enough to do the same job for 36 CUs.
 
  • Like
Reactions: Bo_Hazem

geordiemp

Member
Sep 5, 2013
10,409
15,314
960
UK
96MB is for 80 CUs; 32MB or maybe even 24MB might be enough to do the same job for 36 CUs.
The AMD driver leaks list 128 MB for 80 CU and 96 MB for 40 CU, but that 40 CU part is on a 192-bit bus.

Remember, PC always goes overkill: PC Zen 2 CPUs have 32 MB of cache, while on console it's 8 MB.

I would be ecstatic with 16 MB of L2 on PS5 with a 256-bit RDNA2 bus. The cache is the big unknown and will make a huge difference with every step above 4 MB.
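A quick sanity check on the scaling argument: the figures floating around the thread (128 MB for 80 CU, 96 MB for 40 CU — rumors, not confirmed specs) clearly don't scale linearly with CU count, which fits the idea that the cache is sized against bus width/bandwidth rather than CUs. A sketch of the naive CU-linear scaling for comparison:

```python
# Rumored cache sizes from the driver-leak chatter (treat as unconfirmed).
rumored_mb = {80: 128, 40: 96}

def cu_linear_mb(cus, ref_cus=80, ref_mb=128):
    """Naive linear scaling by CU count, for comparison with the rumors."""
    return ref_mb * cus / ref_cus

# 40 CUs would get 64 MB if the cache tracked CUs linearly, but the rumor
# says 96 MB, so the sizing presumably tracks something else (e.g. bus width).
print(cu_linear_mb(40))   # 64.0
print(cu_linear_mb(36))   # 57.6 -- the naive figure for a 36 CU console GPU
```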
 
Last edited:
Mar 27, 2020
7,274
21,806
690
GitHub never looked right.

Sony confirmed that they were already working at over 2GHz for the APU in 2018, but the GitHub tests show lower clocks in 2019.

And the GitHub data was pulled by a big Xbox fanboy... that makes the data suspicious, because how can a nobody Xbox fanboy have the power to delete the GitHub data? Unless he was the owner of the data on GitHub.
Just a theory.

It's possible that GitHub was real. But it's also possible that someone manipulated the data to put what they wanted into it. And then they deleted it because they didn't want anyone to review it and see that it was manipulated.

Could explain why it looks off.
 

ethomaz

is mad becasue DF didn't do a video on a video of a video of a video on PS5
Mar 19, 2013
33,481
21,969
1,175
38
Brazil
Just a theory.

It's possible that GitHub was real. But it's also possible that someone manipulated the data to put what they wanted into it. And then they deleted it because they didn't want anyone to review it and see that it was manipulated.

Could explain why it looks off.
He deleted it because he wanted people on ERA to think it was AMD that deleted it... to make it look real.

He achieved that goal, because people took the deletion as AMD sending ninjas over an internal leak, but in reality AMD never gave a shit about the data.
 
Last edited:

OSC

Member
Jun 16, 2018
2,690
2,015
515
The AMD driver leaks list 128 MB for 80 CU and 96 MB for 40 CU, but that 40 CU part is on a 192-bit bus.

Remember, PC always goes overkill: PC Zen 2 CPUs have 32 MB of cache, while on console it's 8 MB.

I would be ecstatic with 16 MB of L2 on PS5 with a 256-bit RDNA2 bus. The cache is the big unknown and will make a huge difference with every step above 4 MB.
Hmmm, do you have a source for that?
Maybe 128 for the 80CU, and then a cut-down 70CU version will also have something like 96MB, and then the 40CU GPU will have 64 megs. Or insert whatever is appropriate. -eastmen, Beyond3D
 
  • Like
Reactions: Bo_Hazem

geordiemp

Member
Sep 5, 2013
10,409
15,314
960
UK
Hmmm, you have a source for that?
The spec listing is in the Apple macOS beta drivers:

/System/Library/Extensions/AMDRadeonX6000HWServices.kext/Contents/PlugIns/AMDRadeonX6000HWLibs.kext/Contents/MacOS/AMDRadeonX6000HWLibs.

The L2 sizes are estimates based on that data.

That's all we've got, and all the tech YouTubers may also be using the Apple macOS data lol for their "sources" :messenger_beaming: .

Note AMD has patented Infinity Cache, so it is coming together.

So it's real information, but is it correct and up to date? Who knows, but it's a decent source at least.

It is a speculation thread, and we use the credible info that we have.
 
Last edited:
  • Thoughtful
Reactions: Bo_Hazem

OSC

Member
Jun 16, 2018
2,690
2,015
515
It's in the Apple macOS beta drivers:

/System/Library/Extensions/AMDRadeonX6000HWServices.kext/Contents/PlugIns/AMDRadeonX6000HWLibs.kext/Contents/MacOS/AMDRadeonX6000HWLibs.

That's all we've got, and all the tech YouTubers may also be using the Apple macOS data lol for their "sources" :messenger_beaming: .

So it's real information, but is it correct and up to date? Who knows, but it's a decent source at least.
Does it say 40CU? Because they were talking about a possible 70CU card with 96MB of Infinity Cache on Beyond3D.
 
  • Like
Reactions: Bo_Hazem

Bo_Hazem

Gold Dealer
Feb 10, 2020
8,290
44,671
870
34
Salalah, Oman
I'm not liking what I'm seeing here; maybe because it's 1080p, it looks horrendous:




Here it looks much better, but I'm not sure what hardware it's running on; most likely PC:




But still, the draw distance looks like current gen, dirty in the far distance. I'm a big Assassin's Creed fan; I've played all of them except Rogue, as it came to PS4 pretty late. I usually preorder the gold edition, but this time I might wait, as I'm not liking what I'm seeing.
 
Last edited:

geordiemp

Member
Sep 5, 2013
10,409
15,314
960
UK
does it say 40CU? Because they were talking about a 70CU 96MB infinity cache card possibility on beyond3d.
70 CU is not listed in the Apple beta, but that does not mean it is not possible; it just means it's not on the Apple beta driver list.

The main ones were:

Navi 21 Lite - 56 CU, 1.9 GHz
Navi 21 - 80 CU, 2.2 GHz
Navi 22 - 40 CU, 2.5 GHz

The rest were Navi 10-14 and some of the smaller variants.
 
Last edited:
Oct 26, 2018
13,471
18,315
695
GitHub never looked right.

Sony confirmed that they were already working at over 2GHz for the APU in 2018, but the GitHub tests show lower clocks in 2019.

And the GitHub data was pulled by a big Xbox fanboy... that makes the data suspicious, because how can a nobody Xbox fanboy have the power to delete the GitHub data? Unless he was the owner of the data on GitHub.
The GitHub data was Oberon: 36 CUs, 2 GHz, 9.2 TF.

PS5 is 36 CUs, 2.23 GHz, 10.3 TF.

Looks pretty spot-on to me. The githubbers just didn't find a matching schematic with a bumped-up GPU at over 2 GHz. If PS5 were 2 GHz, there's your 9.2 TF.

And it was no shotgun guess. There were no leakers or fake insiders who claimed PS5 would have the same number of CUs as PS4 Pro. So for them to go all-in for over a year on it being a 36 CU system is impressive and correct. The typical gamer would assume specs go up, not stay the same.

Every other so-called leaker had PS5 around 12-ish TF, with spec configs similar to Series X, where each system is around 12 TF and 50+ CUs.

I don't think there was even one leaker who said PS5 would have CUs even in the 40s. So for them to claim it's 36, like the 2016 PS4 Pro, is no lucky guess or BS like you're trying to make it.
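The TF figures being traded here all come from the standard shader-throughput formula (CUs × 64 stream processors × 2 FMA ops × clock), so the Oberon numbers can be checked directly:

```python
# FP32 compute: CUs * 64 shaders per CU * 2 ops per fused multiply-add * clock in GHz.
def tflops(cus, clock_ghz):
    return cus * 64 * 2 * clock_ghz / 1000

print(round(tflops(36, 2.0), 2))    # 9.22  -- the GitHub/Oberon figure
print(round(tflops(36, 2.23), 2))   # 10.28 -- the announced PS5 spec
print(round(tflops(52, 1.825), 2))  # 12.15 -- Series X, for comparison
```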
 
Last edited:
Mar 27, 2020
7,274
21,806
690
The GitHub data was Oberon: 36 CUs, 2 GHz, 9.2 TF.

PS5 is 36 CUs, 2.23 GHz, 10.3 TF.

Looks pretty spot-on to me. The githubbers just didn't find a matching schematic with a bumped-up GPU at over 2 GHz. If PS5 were 2 GHz, there's your 9.2 TF.

And it was no shotgun guess. There were no leakers or fake insiders who claimed PS5 would have the same number of CUs as PS4 Pro. So for them to go all-in for over a year on it being a 36 CU system is impressive and correct. The typical gamer would assume specs go up, not stay the same.

Every other so-called leaker had PS5 around 12-ish TF, with spec configs similar to Series X, where each system is around 12 TF and 50+ CUs.

I don't think there was even one leaker who said PS5 would have CUs even in the 40s. So for them to claim it's 36, like the 2016 PS4 Pro, is no lucky guess or BS like you're trying to make it.
What about the part where GitHub said the PS5 would be RDNA1?

You can't just flip a switch and go from RDNA1 to RDNA2.
 
Last edited:

ethomaz

is mad becasue DF didn't do a video on a video of a video of a video on PS5
Mar 19, 2013
33,481
21,969
1,175
38
Brazil
That diagram is a bit "wrong".
There's no separate "RT Unit"; the RT calculations are done inside the Texture Unit.
Show me the real diagram, please...
Because that is the only way for you to know it is not right.

And yes, the diagram shows it inside the TMU.
 
Last edited:

Nowcry

Member
Aug 7, 2015
192
1,077
640
BTW, somebody replied to me on Twitter with the diagram of Series X and RDNA1.
I'm not sure how reliable it is, but I can say the RDNA diagram looks real, because it matches the AMD diagrams and docs.

A lot is pointing to this not being FUD.

As a general rule, when it is FUD, it generally disappears afterwards. It is very rare that so many elaborate plans come out from so many different sites that support that statement.

Personally, it fits a lot with the clocks of the SX and with some of its decisions.
 

martino

Member
Apr 25, 2013
5,020
2,753
675
France
There are lots of better videos (at 4K), but the compression is not great (still better than yeoyong's 1080p, and him ranting when locking onto an enemy and expecting he can play stealth :D).
All the gameplay is available from this guy:
 
Last edited:
Oct 26, 2018
13,471
18,315
695
What about the part where GitHub was saying the PS5 would be RDNA1?

You can't just flip a switch and go from RDNA1 to RDNA2.
Don't know. I don't remember them saying anything to do with RDNA 1, RDNA 2, or a mishmash of this and that to make it "RDNA 1.5".

The big data-mining thing they did a long time ago (I think it started in April 2019) was Oberon: 36 CU, 9.2 TF, and maybe (I forget) them, or someone else tagging along, saying that's around 5700 or 5700 XT performance. I don't remember if the 5700 talk had to do with the initial Oberon talk or was something totally separate and later on.

Not only was their 36 CU correct, but the GPU at 2 GHz was close. It was only off by about 200 MHz.

At the time, a lot of gamers were claiming 2 GHz is way too hot for a console, but it turns out it's 2.23 GHz. So their claim of 2 GHz was better than just about everyone else's too.
 
Last edited:

Nowcry

Member
Aug 7, 2015
192
1,077
640
What about the part where GitHub was saying the PS5 would be RDNA1?

You can't just flip a switch and go from RDNA1 to RDNA2.
It might be possible that RDNA1 and RDNA2 are quite compatible; that's why there are hybrid GPUs halfway in between.

It could just be that, when the PS5 team realized they needed higher clocks, at some iteration they changed the CUs from RDNA1 to RDNA2 and realized they had gained a lot.

The architecture was clear, but it is possible that the update from RDNA1 CUs to RDNA2 CUs was not as complex as we thought.

Or:

Maybe the RDNA1 CU was going to be directly the original architecture of the RDNA2 CU but with RT. However, Cerny decided to modify it, and in the end AMD will adopt that architecture for RDNA2.

That is precisely why MS had to settle for the previous diagram, since it came out of a collaboration.
 
Last edited:

Bo_Hazem

Gold Dealer
Feb 10, 2020
8,290
44,671
870
34
Salalah, Oman
There are lots of better videos (at 4K), but the compression is not great (still better than yeoyong's 1080p, and him ranting when locking onto an enemy and expecting he can play stealth :D).
All the gameplay is available from this guy:
Man, this is horrible as well: the animation, the low-res LODs/errors, and some kind of frame drops/screen tearing too. It's not even close to the official trailer. But I'll wait for better footage and for these weird animations to be polished, which I doubt will happen this late.
 
Last edited:
Mar 27, 2020
7,274
21,806
690
Don't know. I don't remember them saying anything to do with RDNA 1, RDNA 2, or a mishmash of this and that to make it "RDNA 1.5".

The big data-mining thing they did a long time ago (I think it started in April 2019) was Oberon: 36 CU, 9.2 TF, and maybe (I forget) them, or someone else tagging along, saying that's around 5700 or 5700 XT performance. I don't remember if the 5700 talk had to do with the initial Oberon talk or was something totally separate and later on.

Not only was their 36 CU correct, but the GPU at 2 GHz was close. It was only off by about 200 MHz.

At the time, a lot of gamers were claiming 2 GHz is way too hot for a console, but it turns out it's 2.23 GHz. So their claim of 2 GHz was better than just about everyone else's too.
The clock speeds alone indicated that it was RDNA2; it was said that RDNA1 couldn't support them. Also, in the GitHub leak, ray tracing was missing from Oberon, but that's an RDNA2 feature.
 

J_Gamer.exe

Member
May 4, 2020
423
2,358
350
"cold hard reality" is that theoretical numbers are not representative of real world performance. We'll have to wait and see if that power gap ever shows up.


I told everyone once they were freaking out about the PS5 size that it was about the same size in volume as XSX lmao. Instead everyone went for that graphic that took the widest measurement from each axis and pretended the ps5 was a box
PS5 teardown in depth(Japanese)
https://www.4gamer.net/games/990/G999027/20201016035/

Some intesting point
-Sony used latest CAE technology to design the airflow inside PS5
-Chimney effect is at the measurement error level
-SSD slot can accept <8mm tall heatsink
-2 exhaust holes are provided for SSD slot, take the heat away by negative pressure
-not recommended to use heatsink that are high enough to touch the metal cover

Also ps5 DE is 0.95 x the volume of XSX (smaller but different shape.). Ps5 disk is 5 % bigger by volume.
So some crow to be eaten by some people. Sony actually fit the components into a similar volume space as Xbox.

Some seemed to had forgotten that by working out the volume the PS5 is not a box like the SX, therefore the curves etc make the actual volume much smaller than people thought.

🤣
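The volume claim is easy to sanity-check from the commonly cited external dimensions (approximate figures; note that a bounding box overstates the PS5's true volume because of its curved shell):

```python
# Bounding-box volumes from commonly cited console dimensions in mm (approximate).
def box_litres(width_mm, height_mm, depth_mm):
    return width_mm * height_mm * depth_mm / 1_000_000  # mm^3 -> litres

xsx = box_litres(151, 301, 151)     # Series X is nearly a true box: ~6.86 L
ps5_de = box_litres(92, 390, 260)   # PS5 Digital Edition bounding box: ~9.33 L

print(round(xsx, 2), round(ps5_de, 2))
# The teardown's ~0.95x figure is against the XSX's actual volume, so the
# PS5's curves shave roughly a third off its bounding box.
```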
 
Oct 26, 2018
13,471
18,315
695
The clock speeds alone indicated that it was RDNA2; it was said that RDNA1 couldn't support them. Also, in the GitHub leak, ray tracing was missing from Oberon, but that's an RDNA2 feature.
Maybe it's a hybrid.

I don't know anything about gpus and RT, but consoles have had mishmashes of weird configs in history as well.

Old-ass consoles used to have a mix of 8-bit, 16-bit, and balls-deep Atari Jaguar 64-bit chips.

The Intellivision, I think, was technically a 16-bit system; the TG-16 had, I think, an 8-bit-class CPU and a 16-bit GPU; the Neo Geo was a 16-bit CPU with crazy effects; and the Atari Jag did its own thing.

And every system had its own custom effects, like scaling, rotating, or none at all.
 
Last edited:
Mar 27, 2020
7,274
21,806
690
Maybe it's a hybrid.

I don't know anything about gpus and RT, but consoles have had mishmashes of weird configs in history as well.

Old-ass consoles used to have a mix of 8-bit, 16-bit, and balls-deep Atari Jaguar 64-bit chips.

The Intellivision, I think, was technically a 16-bit system; the TG-16 had, I think, an 8-bit-class CPU and a 16-bit GPU; the Neo Geo was a 16-bit CPU with crazy effects; and the Atari Jag did its own thing.

And every system had its own custom effects, like scaling, rotating, or none at all.
Actually the CUs themselves are RDNA2 ones and that's where the RT hardware is. It's definitely not RDNA1 CUs with RT hardware bolted on.
 

THE:MILKMAN

Member
Mar 31, 2006
6,142
4,105
1,535
Midlands, UK
BTW somebody replied to me on Twitter with the diagram of Series X and RDNA1.
I'm not sure how reliable it is, but I can say the RDNA diagram looks real, because it matches the AMD diagrams and docs.

This is mostly beyond my knowledge base, to be honest, but the one thing that sticks out for me compared to the Hot Chips info is the 4MB L2. It is 5MB in the Hot Chips slides.
 

ethomaz

is mad becasue DF didn't do a video on a video of a video of a video on PS5
Mar 19, 2013
33,481
21,969
1,175
38
Brazil
Somebody send me the timestamp of the L2 cache being 5MB on Series X.


So I believe he made up the Xbox diagram.
 
Last edited:
  • Like
Reactions: Bo_Hazem

ethomaz

is mad becasue DF didn't do a video on a video of a video of a video on PS5
Mar 19, 2013
33,481
21,969
1,175
38
Brazil
It's the PS5 that only has 4MB, right? But I understand it doesn't need as much due to the lower CU count.
I don't think we have PS5 L2 data.

Per the AMD RDNA doc, the L2 cache is made of slices (4 per memory channel), and each slice can be 128-512KB.

That means 512KB to 8MB total; each vendor's product chooses the amount.
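One reading that reconciles the 512KB-8MB range with the slice sizes (assuming 4 L2 slices per 64-bit memory channel, which is my inference rather than something stated in the thread):

```python
# Total L2 = channels * 4 slices per channel * slice size, under the
# ASSUMED layout of 4 slices per 64-bit memory channel.
def l2_total_kb(bus_bits, slice_kb):
    channels = bus_bits // 64
    return channels * 4 * slice_kb

print(l2_total_kb(64, 128))    # 512 KB: smallest configuration
print(l2_total_kb(256, 512))   # 8192 KB = 8 MB: largest configuration
print(l2_total_kb(320, 256))   # 5120 KB = 5 MB, matching the Series X Hot Chips figure
```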
 
Last edited:
  • Like
Reactions: Bo_Hazem

Epic Sax CEO

Member
Nov 4, 2019
95
176
245
Show me the real diagram, please...
Because that is the only way for you to know it is not right.

And yes, the diagram shows it inside the TMU.
That's what I'm talking about!
People are pulling theories and "leaks" out of their asses while ignoring the concrete and official information that we already have.

RDNA2 does ray calculations with the TMU - https://images.anandtech.com/doci/15994/202008180220211.jpg



We have the official information, and yet people insist on giving attention to these mercenaries farming clicks.
 

THE:MILKMAN

Member
Mar 31, 2006
6,142
4,105
1,535
Midlands, UK
Somebody send me the timestamp of the L2 cache being 5MB on Series X.


So I believe he made up the Xbox diagram.


It's the PS5 that only has 4MB, right? But I understand it doesn't need as much due to the lower CU count.
I don't think we have info about the PS5...? The XSX has 5MB L2 and RDNA1 has 4MB. That is my understanding.