
Next-Gen PS5 & XSX |OT| Console tEch threaD


R600

Banned
Still equivalent to 10TF of GCN for rasterization performance, nowhere close to the 13TF found in devkits
This theory has a bunch of holes in it and requires a bunch of assumptions to make it work:
  1. Is it real?
  2. It's not a Chinese gaming chip?
  3. How is it linked to PS5 other than PS4 chips? What's to say this isn't cashing in on that to make it believable?
  4. April dev kits weren't Navi because? Didn't Gonzalo appear before that?
  5. It performs like 10TF GCN even though devkits report 13TF of unknown arch
  6. Why would Sony use a Latin last name for the APU code name?
Too many holes and assumptions made to be treated as anything other than wild speculation.

We are talking of navi not turing
8TF Navi would perform similar to 6TF Turing.
Your theory sucks ass mate because NaviXT sucks up 225W with 8GB of GDDR6.

So tell me, how will they fit it in console first, along with additional 8GB of RAM (~16W), RT hardware (?W) and 8 core Zen2 (~40W)? Because they had to downclock Pitcairn for PS4, which was :

A) smaller PC chip at the time, compared to NaviXT (212mm² v 251mm²)

and

B) COMFORTABLY less hot (180W for Pitcairn v 225W for 9.75TF Navi XT)

Ok so now I expect..."Sony/MS magicians will optimize Navi GPUs and go with 7nm EUV node, even though nothing points to that even being remotely the case".
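Quick back-of-envelope in Python with those wattages (the RT and "everything else" lines are pure placeholder guesses, not known figures):

components_w = {
    "navi_xt_gpu_with_8gb_gddr6": 225,  # desktop NaviXT board power, from this post
    "extra_8gb_gddr6": 16,              # from this post
    "zen2_8_core": 40,                  # from this post
    "rt_hardware": 10,                  # unknown - placeholder guess
    "ssd_io_fans_psu_losses": 30,       # placeholder guess
}
print(sum(components_w.values()))  # ~321 W at desktop clocks, far above a typical console envelope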
 
Last edited:
I believe it will be a killer move if they do that.
There are only pros to this move:

- No holidays stock issues
- Two big selling periods in the first year
It would be a reckless move, because hardware component prices (7nm wafers, GDDR6, NAND chips) need to drop (Phil said that, not me). That's why both companies will release them in late 2020. BF/Xmas period is always the best.

Sony has also confirmed that they're not releasing the PS5 in early 2020.

I believe there is no issue installing Windows on a machine with the PS4's APU because it is fully compatible.
No, it's not IBM PC compatible in any shape or form (for example, legacy BIOS functions are totally missing and the southbridge is rather exotic compared to PC chipsets).

Don't take my word for it, watch this presentation (PS4 hackers):

What if that were to happen but it was only 8.5-9 Tflops, 399? Game over for Xbox division?
$349-399 would make sense for specs like that.

$499? Hell no!

I suspect going near 400mm2 7nm will be a no no. They probably don't want to trigger EPA regulations/tariffs or something as far as CE devices go. Times have changed from PS3 days.
PS4 Pro and XB1X increased the gaming power consumption (up to 155-175W) and yet, the idle power consumption is lower (70 vs 50 watts) compared to OG models.

Modern chips with modern lithography support more advanced power gating and aggressive clock scaling (CPU/GPU/DRAM clocks), so I don't see this being an issue.
 
That is why I asked... seems weird to see 3DMark benchmarks for PS4.

But maybe, for testing only, before they had the full PS4 hardware they used the APU in a PC with Windows and that's how they did the benchmarks.
I believe there is no issue installing Windows on a machine with the PS4's APU because it is fully compatible.

Of course the final PS4 machine can perform better than that 3DMark Frankenstein test.

Yeah, well, all past APISAK console leaks rely on the 3DMark database, and the PS4 one was a perfect fit (as can be read in the DF article). So somehow it must be possible.

I'm pretty sure if it were as easy as putting the SoC into a PC you could just install Windows on your PS4/5, and I'm not sure why that shouldn't be the case tbh. AMD provided Windows drivers for the Subor SoC; why wouldn't they do that for their biggest customer?

Another possibility would be that Futuremark is providing the support themselves. After all, the console SoC results are hidden from the website view.
 

SonGoku

Member
EPA regulations/tariffs or something as far as CE devices go
Not familiar with those, can you elaborate why they would regulate a 380-400 mm2 chip?
I thought the Apisak etc work looked reliable - but doesn't make it guaranteed right
Many leaks banked on PS4/XB-style leak designs to make them look more believable
One of the first PS5 leaks used the PS4 PDF presentation style lol
But the simple explanation is: we have 13TF dev kits that exceed even the ~8TF RDNA>GCN TF conversion because/if dev kits are/were running more powerful hardware to ease development before optimization.
Sure we can come up with all sorts of explanations, doesn't make it any less of a stretch
Your theory sucks ass mate because NaviXT sucks up 225W with 8GB of GDDR6
And running 36CUs at 1.8GHz won't be power hungry because? let alone 18Gbps chips
So tell me, how will they fit it in console first, along with additional 8GB of RAM (~16W), RT hardware (?W) and 8 core Zen2 (~40W)? Because they had to downclock Pitcairn, which was :
Easy with a ~250W total system TDP and the hobbit method
56CUs undervolted to maintain 1540MHz stable (compared to 1950MHz 5700 XT)
380-390 mm2 SoC in late 2020, shrinking to 320-340 mm2 in late 2021
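For reference, the peak-FLOPS arithmetic behind those configs, as a sketch; the 1.25x RDNA-over-GCN perf-per-TF factor is AMD's own claim, not a measured console number:

def fp32_tflops(cus, clock_ghz, shaders_per_cu=64):
    # peak FP32 = CUs * shaders per CU * 2 ops per clock (FMA) * clock in GHz / 1000
    return cus * shaders_per_cu * 2 * clock_ghz / 1000

wide_and_slow = fp32_tflops(56, 1.54)    # ~11.0 TF (the 56CU "hobbit method" config)
narrow_and_fast = fp32_tflops(36, 1.80)  # ~8.3 TF (the 36CU @ 1.8GHz config)
print(round(wide_and_slow, 2), round(narrow_and_fast, 2))

# rough GCN-equivalent using AMD's claimed ~1.25x perf-per-TF uplift for RDNA
# (a marketing figure, not a console measurement)
print(round(wide_and_slow * 1.25, 1))    # ~13.8 "GCN TF"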
 
Last edited:

R600

Banned
Your 60CU + RT CANNOT fit when the 40CU part is already 251mm². I don't know why you ignore the PC > console conversion from last gen, when PS4 had the smaller 212mm² PC chip in a 348mm² die.

You want them to increase the die by 15% on a more expensive process with worse yields and then to shrink it to the original PS4 level in a year's time.

Look, you are not getting an 11-13TF part because that is a >2080 Ti beater. You just aren't gonna get that in a console.
 

vpance

Member
It would be a reckless move, because hardware component prices (7nm wafers, GDDR6, NAND chips) need to drop (Phil said that, not me). That's why both companies will release them in late 2020. BF/Xmas period is always the best.

Sony has also confirmed that they're not releasing the PS5 in early 2020.


No, it's not IBM PC compatible in any shape or form (for example, legacy BIOS functions are totally missing and the southbridge is rather exotic compared to PC chipsets).

Don't take my word for it, watch this presentation (PS4 hackers):


$349-399 would make sense for specs like that.

$499? Hell no!


PS4 Pro and XB1X increased the gaming power consumption (up to 155-175W) and yet, the idle power consumption is lower (70 vs 50 watts) compared to OG models.

Modern chips with modern lithography support more advanced power gating and aggressive clock scaling (CPU/GPU/DRAM clocks), so I don't see this being an issue.

Not knowing any of these regulations, Pro and X seem very reasonable at sub-200W. But where do you see a theoretical 400mm2 7nm PS5 power consumption-wise? If it's well above 200W, which it seems it would be, then it's unlikely. That's why going EUV will be the key.
 

bitbydeath

Member
Yeah, well, all past APISAK console leaks rely on the 3DMark database, and the PS4 one was a perfect fit (as can be read in the DF article). So somehow it must be possible.

It wasn’t discovered til after the consoles so anyone could have created them. If these were added nearly two years out from release then there is no way the data is accurate or real.
 
Not knowing any of these regulations, Pro and X seem very reasonable at sub-200W. But where do you see a theoretical 400mm2 7nm PS5 power consumption-wise? If it's well above 200W, which it seems it would be, then it's unlikely. That's why going EUV will be the key.
You can read the regulation here:

http://efficientgaming.eu/fileadmin..._Games_Console_ACR_2018_Final_report_V1.0.pdf (page 24)

TL;DR: they don't regulate gaming power usage, but navigation/media playback/idle modes.

Imagine if they enforced double-digit wattage numbers for gaming usage. Consoles would be rather underpowered.

I mean, what's next? Banning expensive AAA games because they don't promote "green computing" (unlike low-power pixelized indies)? It would be ridiculous!
 

SonGoku

Member
What if that were to happen but it was only 8.5-9 Tflops, 399? Game over for Xbox division?
I'll bite for $299
PS4 had the smaller 212mm² PC chip in a 348mm² die.
I'm suggesting a 380-390 mm2 SoC, big difference
64CU btw
You want them to increase the die by 15% on a more expensive process with worse yields and then to shrink it to the original PS4 level in a year's time.
The difference is there was no 6nm equivalent on the horizon back when PS4 launched
6nm (7nm with EUV layers) is essentially a free 15% shrink with minimal investment (cross-compatible designs)
Look, you are not getting an 11-13TF part because that is a >2080 Ti beater.
11TF would be roughly equal to the RTX2080, nowhere close to the Ti
But where do you see a theoretical 400mm2 7nm PS5 power consumption-wise? If it's well above 200W
Big chip will net best perf/watt so around ~250W total
why going EUV will be the key.
Indeed but in case 7nm EUV isn't ready the fallback option is big die on 7nm and 6nm shrink a year later.
 
Last edited:

vpance

Member
Not familiar with those, can you elaborate why they would regulate a 380-400 mm2 chip?

Many leaks banked on PS4/XB-style leak designs to make them look more believable
One of the first PS5 leaks used the PS4 PDF presentation style lol

Sure we can come up with all sorts of explanations, doesn't make it any less of a stretch

And running 36CUs at 1.8GHz won't be power hungry because? let alone 18Gbps chips

Easy with a ~250W total system TDP and the hobbit method
56CUs undervolted to maintain 1540MHz stable (compared to 1950MHz 5700 XT)
380-390 mm2 SoC in late 2020, shrinking to 320-340 mm2 in late 2021

Sad thing is 250W is probably way too much. I don't know the exact power regulations but I remember posts from Rigby going on and on about them. I think the EU has even stricter limits. They can't just launch a 300W console or something without significant financial penalties.
 
Sad thing is 250W is probably way too much. I don't know the exact power regulations but I remember posts from Rigby going on and on about them. I think the EU has even stricter limits. They can't just launch a 300W console or something without significant financial penalties.
There is no such limit, otherwise 9900k and high-end 300W GPUs would be banned as well. What matters is having decent power/clock scaling, since most of the time a computer/console is idle, not operating at full throttle.

We're not talking about vacuum cleaners, or light bulbs.
 

SonGoku

Member
Sad thing is 250W is probably way too much. I don't know the exact power regulations but I remember posts from Rigby going on and on about them. I think the EU has even stricter limits. They can't just launch a 300W console or something without significant financial penalties.
From N Negotiator's post: they don't regulate gaming power usage, but navigation/media playback/idle modes.
It would be around 250W, BUT with the hobbit method it'll go lower than that

Interesting:
However, this may not be the last attempt that Zhongshan Subor makes to enter the Chinese games console market, according to a statement by the company's CEO, Wu Song: "While the Shanghai office has been closed, the project is still ongoing and we will have a new announcement to make regarding its progress in the next few months."
Subor evolved into Gonzalo perhaps?
 

R600

Banned
Shit is getting boring... It has been a long time since we had any legit info
We had one today. No way any Chinese console is packing 8 core Zen2 and Navi10 with 20K+ on 3DMark. Just not happening. It's a completely different matter that some expect 2080 perf in a console when last time around we got 60% less FLOPS than AMD gave in the PC space.
 

vpance

Member
You can read the regulation here:

http://efficientgaming.eu/fileadmin..._Games_Console_ACR_2018_Final_report_V1.0.pdf (page 24)

TL;DR: they don't regulate gaming power usage, but navigation/media playback/idle modes.

Imagine if they enforced double-digit wattage numbers for gaming usage. Consoles would be rather underpowered.

I mean, what's next? Banning expensive AAA games because they don't promote "green computing" (unlike low-power pixelized indies)? It would be ridiculous!

Ok yeah, that statement sounds familiar. I forgot it only applies to media and standby.

Still, the cost of engineering a 250W-ish console that's also not unreasonably loud could be expensive. Probably better to wait for (here comes the magic word) E U V 🤗
 

pawel86ck

Banned
Guys, do you think AMD Navi 20 and Nvidia RTX 3800 GPUs will launch before next-gen consoles? 1080/Vega 64 performance may look good today (compared to current cards), but 1.5 years from now everything may change.
 

SonGoku

Member
We had one today. No way any Chinese console is packing 8 core Zen2 and Navi10 with 20K+ on 3DMark.
Why not? Subor was targeting 4TF + a 4-core Zen
YOU FOOL, THIS ISNT EVEN MY FINAL FORM YET
 
Last edited:

SlimySnake

Flashless at the Golden Globes
Your 60CU + RT CANNOT fit when the 40CU part is already 251mm². I don't know why you ignore the PC > console conversion from last gen, when PS4 had the smaller 212mm² PC chip in a 348mm² die.

You want them to increase the die by 15% on a more expensive process with worse yields and then to shrink it to the original PS4 level in a year's time.

Look, you are not getting an 11-13TF part because that is a >2080 Ti beater. You just aren't gonna get that in a console.
Everything shrinks when you go to 7nm. Yes, the PS4 had a 212mm2 GPU with a 70mm2 Jaguar in a 350mm2 die, but those I/O parts, memory controllers etc. all get a major size reduction as well, 4x compared to 28nm. So 350 - (212 + 70) = ~70mm2 is reserved for everything else on the 28nm die, and 70/4 = 17.5mm2 is how much all that extra stuff should take up on a 7nm die.

70mm2 Zen 2 chip + 17.5mm2 controllers + 251mm2 = 338mm2. The Scarlett die is in the 380-400mm2 range. That's a whole lot of space for extra CUs AND RT cores.
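The same arithmetic spelled out; the 4x uncore shrink is the assumption doing the work here:

ps4_die, ps4_gpu, ps4_cpu = 350, 212, 70   # mm^2, approximate 28nm figures from the post
ps4_other = ps4_die - (ps4_gpu + ps4_cpu)  # ~68 mm^2 of I/O, memory controllers, etc.
other_at_7nm = ps4_other / 4               # the assumed 4x area shrink
total = 70 + other_at_7nm + 251            # Zen 2 + shrunk uncore + Navi 10
print(round(total))                        # ~338 mm^2 before any extra CUs or RT hardware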
 
OK, so here is the thing: I hope Sony's custom Navi 8TF (if it ends up being the case) acts like an Nvidia card. Nvidia cards have low TF but make the most out of it and have better performance than AMD's 13TF chips, for example. If that's true then I'll settle for that.
 

LordOfChaos

Member
We had one today. No way any Chinese console is packing 8 core Zen2 and Navi10 with 20K+ on 3DMark. Just not happening.

Why?

It's not their talent that's in question here, it's what they ask for from AMD's semi-custom division. If they were happy to make one, why not another?

They were also on 4Tflops on 14nm with the previous attempt. The extra density afforded by 7nm plus Navi's IPC gains would seem awfully close.

Being on 14nm the first time they did pay for a beefy die

[image]
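A rough sketch of why that scaling isn't far-fetched; the CU counts and clocks below are assumed purely for illustration, not leaked specs:

def fp32_tflops(cus, clock_ghz):
    return cus * 64 * 2 * clock_ghz / 1000

print(round(fp32_tflops(24, 1.3), 1))  # ~4.0 TF: an assumed 24 CU @ ~1.3 GHz config on 14nm
print(round(fp32_tflops(40, 1.8), 1))  # ~9.2 TF: hypothetical 7nm config with more CUs and higher clocks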
 
Last edited:

vpance

Member
Big chip will net best perf/watt so around ~250W total

Indeed but in case 7nm EUV isn't ready the fallback option is big die on 7nm and 6nm shrink a year later.

It's probably best to avoid having to do a redesigned model so quickly after launch. It's like double the work. But in that case, $500 no doubt. I'd rather they didn't because I'd certainly end up buying both.

The other option is they do a Pro again 3 years in, but they stick with a smaller 7nm die at launch.
 

R600

Banned
Why?

It's not their talent that's in question here, it's what they ask for from AMD's semi-custom division. If they were happy to make one, why not another?

They were also on 4Tflops on 14nm with the previous attempt. The extra density afforded by 7nm plus Navi's IPC gains would seem awfully close.

Being on 14nm the first time they did pay for a beefy die

[image]
Because I am pretty sure you would hear about their next console with an 8-core Zen 2 and Navi 10 chip on the die already. In fact, I am pretty sure they would yell about it from their balcony.

They are not in the game against MS and Sony; they would come out with that info far before the chip ends up in QS, especially seeing as they went bankrupt.
 

R600

Banned
Everything shrinks when you go to 7nm. Yes, the PS4 had a 212mm2 GPU with a 70mm2 Jaguar in a 350mm2 die, but those I/O parts, memory controllers etc. all get a major size reduction as well, 4x compared to 28nm. So 350 - (212 + 70) = ~70mm2 is reserved for everything else on the 28nm die, and 70/4 = 17.5mm2 is how much all that extra stuff should take up on a 7nm die.

70mm2 Zen 2 chip + 17.5mm2 controllers + 251mm2 = 338mm2. The Scarlett die is in the 380-400mm2 range. That's a whole lot of space for extra CUs AND RT cores.
Scarlett die is not 380-400. It can be anything from 338-400 and you are forgetting RT hardware + a wider bus on Scarlett. So no, not a lot of space, just a lot of space for wishful thinking :p
 

SlimySnake

Flashless at the Golden Globes
5700 DCUs weigh 3.37 mm2 each
Add 12DCU (24 CUs for a total of 64CUs or 32DCUs)
12DCUs + 15% RT = 46.5 mm2
12DCUs + 10% RT = 44.48 mm2

Pascal or Turing?
Yep, fits perfectly within the 380-400mm2 range.

The only question is the wattage, and I don't think they can get a 56-60CU GPU in a console with the 5700 XT TDP already pushing 150W. Even for a 200W console (50W higher than the base PS4) you need the APU to be under 150W. With the Zen2 taking up 45W by itself, the GPU needs to be under 110W.

If a 40 CU part is taking up 150W at 1.9GHz, how can a 56-60 CU part be less than that even at 1.5GHz?
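For what it's worth, a crude scaling sketch shows how a wider-but-slower part can come out ahead; the voltages here are made-up placeholders, purely illustrative:

def scaled_power(p_ref, cu_ref, f_ref, v_ref, cu, f, v):
    # crude dynamic-power model: P ~ CU count * clock * voltage^2
    return p_ref * (cu / cu_ref) * (f / f_ref) * (v / v_ref) ** 2

# reference point: 40 CU @ 1.9 GHz, ~150 W (the 5700 XT figure above), assumed ~1.20 V
# target: 56 CU @ 1.54 GHz, undervolted to an assumed 0.95 V
print(round(scaled_power(150, 40, 1.9, 1.20, 56, 1.54, 0.95)))  # ~107 W under these assumptions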
 

LordOfChaos

Member
Because I am pretty sure you would hear about their next console with an 8-core Zen 2 and Navi 10 chip on the die already. In fact, I am pretty sure they would yell about it from their balcony.

They are not in the game against MS and Sony; they would come out with that info far before the chip ends up in QS, especially seeing as they went bankrupt.

I've said multiple times now that I'm not suggesting it's this same company, but there's plenty out there in China and plenty of speculative VC.
 

SonGoku

Member
Scarlett die is not 380-400. It can be anything from 338-400 and you are forgetting RT hardware + a wider bus on Scarlett. So no, not a lot of space, just a lot of space for wishful thinking :p
10% extra RT silicon doesn't take that much space and a 320-384 bit bus is tiny on 7nm
Die size estimates
75mm2 for CPU
45mm2 for 10 GDDR6 controllers (20GB GDDR6)
8.8mm2 for ROPs
140mm2 for buses, caches, ACEs, geometry processors, etc. (overestimating this part as the 5700 seems to have lots of "empty" areas)

3.37mm2 is the DCU size

  • 118.6mm2 for 64CUs + RT silicon making CUs 10% bigger compared to Navi10
  • 124mm2 for 64CUs + RT silicon making CUs 15% bigger compared to Navi10
Total size 387-393mm2
Tensor/RT silicon takes 22% of the space on 12nm for Nvidia cards; 7nm has a 3.2X density increase (or 0.31X area) over 12nm. So 10% or less is a safe bet for RT silicon on 7nm

396 mm2 with 24GB GDDR6 (12 GDDR6 memory controllers)
For 64CUs APU I'd expect anywhere between 380-390mm2
If a 40 CU part is taking up 150W at 1.9GHz, how can a 56-60 CU part be less than that even at 1.5GHz?
400MHz is a huge difference for TDP in GPUs, especially when hitting diminishing returns
Also with better yields a lower voltage will be required to maintain lower stable clocks
Even for a 200W console
Yeah ~250W seems more realistic
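Adding that up; every number here is the estimate above, not a spec:

DCU_MM2 = 3.37                   # Navi 10 dual-CU area from the estimate above
fixed_mm2 = 75 + 45 + 8.8 + 140  # CPU + 10x GDDR6 controllers + ROPs + "everything else"
for rt_overhead in (0.10, 0.15): # RT assumed to fatten the CUs by 10-15%
    cu_mm2 = 32 * DCU_MM2 * (1 + rt_overhead)  # 32 DCUs = 64 CUs
    print(f"+{rt_overhead:.0%} RT: {cu_mm2:.1f} mm2 of CUs, ~{fixed_mm2 + cu_mm2:.0f} mm2 SoC")
# -> ~387 mm2 and ~393 mm2, the 387-393 range above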
 
Last edited:

MadAnon

Member
SonGoku, I still don't understand how you managed to put in so many CUs.

I can't find any realistic estimates of the Ryzen 3000 sizes, but...
The last-gen 12nm 8c/16t Ryzen 2700X had a 213mm² die. 7nm would bring it down to ~124mm². There could be some small architectural upgrades, but I don't see a 7nm Ryzen 7 going anywhere near 100mm².

Add that to the 251mm² 40CU 5700 XT and you are way over 350mm². Add RT cores and you are at the estimated size of that APU in the MS presentation. I don't see where you add 24 more CUs.
 

R600

Banned
Interesting point by one user in reply to Komachi.


[image]

(Should check entire tweet chain)

Seems there is a definite difference in codename numbering between Sony and MS.
 
Last edited:

CyberPanda

Banned
Interesting point by one user in reply to Komachi.


[image]

(Should check entire tweet chain)

Seems there is a definite difference in codename numbering between Sony and MS.

Image is not loading.

EDIT: Nevermind, it's fixed.
 
Last edited:

R600

Banned
SonGoku,

One question regarding this theory of yours... Why didn't AMD just release a 64CU version of Navi @ 1600MHz and ~300mm² and literally slaughter Nvidia in the price segment? They could even take the performance crown ffs with a 13TF+ Navi.
 
A semicustom part in the same vein as that is what I suggested, not that exact part or company

I mean, what's the explanation for this APU running a Windows benchmark when Playstation is FreeBSD based? The closest plausible explanation if you squint would be that they also run Linux on it for testing and ran it through Wine, but then you'd expect such a performance drop, and to even land between the Vegas...

And why would they need to do that anyways, FreeBSD is what they know and customize and run on PS hardware for years.


There's so much about that you need to squint really hard to make it seem plausible, vs it just being a semicustom part for another obscure Windows box.

I wonder when we find out that APISAK is a Twitter bot run by the green or blue team?

2018 isn't exactly a long-standing Twitter account.

Easier to poison the well when you own the well ;)
 
Last edited:

SonGoku

Member
SonGoku,

One question regarding this theory of yours... Why didn't AMD just release a 64CU version of Navi @ 1600MHz and ~300mm² and literally slaughter Nvidia in the price segment? They could even take the performance crown ffs with a 13TF+ Navi.
Because of yields and the design not being ready yet, and I'm thinking AMD is designing RDNA2 (big Navi) on 7nm EUV.
Also what makes sense for a console doesn't apply to the discrete GPU market
AMD will want to clock big Navi as high as possible, well beyond the perf/watt sweet spot, to profit from the big chip
 
Last edited:

R600

Banned
Because of yields and the design not being ready yet, and I'm thinking AMD is designing RDNA2 (big Navi) on 7nm EUV.
But a 50% difference in CUs is small, as you said; surely a ~$600 card price would overcome the small price you pay for worse yields.

Because seriously, 20 additional CUs is nothing, since MS and Sony will be putting out 400mm² with SSDs in consoles for $499.
 

SonGoku

Member
Your 140mm2 part is a made-up estimate based on what?
It's from Proelite's estimates (he knows his stuff); for the record, that 140mm2 is overcompensating for empty spaces that won't necessarily be on a console chip
But a 50% difference in CUs is small, as you said; surely a ~$600 card price would overcome the small price you pay for worse yields.
I can only speculate that the design isn't ready yet, and maybe AMD doesn't consider it worth the investment to port their 7nm EUV design to 7nm
Waiting for faster GDDR6 chips etc.
 
Last edited:
Do we know yet if a whole dual CU has to be disabled on Navi?

Or is it possible to just disable a single CU (half of a dual CU)?

Hoping it's the latter because Navi is supposed to be scalable.
 

SonGoku

Member
Because I am pretty sure you would hear about their next console with an 8-core Zen 2 and Navi 10 chip on the die already. In fact, I am pretty sure they would yell about it from their balcony.
Not saying Gonzalo is certainly them, it could be another dime-a-dozen Chinese company, but this statement is certainly suspect:
However, this may not be the last attempt that Zhongshan Subor makes to enter the Chinese games console market, according to a statement by the company's CEO, Wu Song: "While the Shanghai office has been closed, the project is still ongoing and we will have a new announcement to make regarding its progress in the next few months."
 
Last edited:
With the Zen2 taking up 45W by itself
45W? Why?

According to this, quad-core Zen+ CCX at 7nm consumes 10W:


Maybe a bit more for Zen 2 enhancements (256-bit FPU etc.), but certainly within a 30W power envelope (same as Jaguar at 28nm).

I think people are overestimating the CPU power/die budget of consoles, just because we're getting Zen this time around.

Either way, it's the GPU that needs to go as wide as possible CU-wise and maintain moderate clocks for optimal power efficiency (aka sweet spot).

Off-the-shelf Navi 5700 doesn't fit the bill.

I can't find any realistic estimates of the Ryzen 3000 sizes, but...
The last-gen 12nm 8c/16t Ryzen 2700X had a 213mm² die. 7nm would bring it down to ~124mm². There could be some small architectural upgrades, but I don't see a 7nm Ryzen 7 going anywhere near 100mm².
Where did you get those numbers from?

12/14/16nm (it's the same thing die size-wise) to 7nm offers 3.3x density scaling. It's 70-80mm2 at worst (no L3 cutbacks).
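Quick sanity check on that shrink:

zen_plus_2700x_mm2 = 213                 # 12nm die size from the quoted post
print(round(zen_plus_2700x_mm2 / 3.3))   # ~65 mm^2 ideal shrink at 3.3x density
# SRAM and I/O scale worse than logic, hence the 70-80 mm^2 "at worst" figure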

SonGoku,

One question regarding this theory of yours... Why didn't AMD just release a 64CU version of Navi @ 1600MHz and ~300mm² and literally slaughter Nvidia in the price segment? They could even take the performance crown ffs with a 13TF+ Navi.
Why didn't Nvidia release Ampere GPUs this year?

Apple has been using 7nm chips since 2018, earlier than anyone else. Why?


Ask those questions and you'll get your answer.

I wonder when we find out that APISAK is a Twitter bot run by the green or blue team?

2018 isn't exactly a long-standing Twitter account.

Easier to poison the well when you own the well ;)
I love your cynicism, haha!

False flag operations are indeed a real thing.
 

R600

Banned
Not saying Gonzalo is certainly them, it could be another dime-a-dozen Chinese company, but this statement is certainly suspect:
I'm going with Occam's razor on this one and maintain that a bankrupt firm is not releasing a not-yet-announced console with the best-performing APU you can currently find, especially since its codename tells us it's much more likely to be Sony's console and not from a no-name Chinese manufacturer that AMD never mentioned on their quarterly call as a new semi-custom customer.
 