
Next-Gen PS5 & XSX |OT| Console tEch threaD

Status
Not open for further replies.

xool

Member
"SCEA 2019 Report e1 " Are we supposed to believe that someone copy pasted the text of an official SCEA document?

Is there any other way to search for leaks? Or is this simply it?
Keep pressing F5 on 4chan ?

RuthenicCookie strikes again? Reads awfully similar.


What happens when internet access to porn is blocked I guess. People find another outlet.

Is Subere a name I'm supposed to recognize ?
 

Darius87

Member
Navi XT (9.75TF) is 14% faster than Vega 64 (12.56TF), so I think an 8.3TF Navi would be closer to 12TF Vega than the 10.5TF one in Stadia.
Yeah, that was just IPC improvement, not overall performance, which is hard to tell without real testing in games rather than just believing AMD's slides.
There was a reason Google said that with the cloud they can deliver more performance than if they went with a closed box and its thermal limits. That's why everything above 10TF (let alone 14+) is a pipe dream.
I disagree, I think we'll see something like 10-11 TFLOPS. To pass 10 TFLOPS it only needs 48 CUs at 1680MHz or 52 CUs at 1525MHz, which seems likely considering that the next-gen consoles come out after more than a year, which will be time to reduce power draw below 200W. I don't know about the die size limits, though.
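As a sanity check on those CU/clock figures, here's the usual GCN/Navi FP32 throughput formula (64 shaders per CU, 2 FLOPs per clock via FMA); the CU counts and clocks are the post's guesses, not confirmed specs:

```python
# FP32 TFLOPS = CUs * 64 shaders/CU * 2 FLOPs/clock * clock(GHz) / 1000
def tflops(cus: int, mhz: float) -> float:
    return cus * 64 * 2 * (mhz / 1000) / 1000

print(round(tflops(48, 1680), 2))  # 10.32 TF
print(round(tflops(52, 1525), 2))  # 10.15 TF
```

Both hypothetical configurations do indeed land just past the 10TF mark, matching the post's arithmetic.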
 

CrustyBritches

Gold Member
[Image: igorslab-tdp.png]

One RX 5700 draws 122W, so a dual GPU should draw 244W? It's still too much for a console, but maybe the PS5 GPU will be even more power efficient.
That 5700 Pro will have around 190W average gaming power consumption. You need to add an additional 32-48W for the extra memory most people who follow this line are expecting.
 

pawel86ck

Banned
That 5700 Pro will have around 190W average gaming power consumption. You need to add an additional 32-48W for the extra memory most people who follow this line are expecting.
The standard RX 5700 is a 7.95TF GPU, so a dual RX 5700 would offer 15.9TF; they could downclock it to 14TF and end up with a somewhat acceptable TDP.
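A rough sketch of why a downclock helps so much: dynamic power scales roughly with frequency times voltage squared, so a ~12% clock cut plus a small voltage drop shaves off far more than 12% of the power. The 244W baseline and the 5% voltage drop below are illustrative assumptions, not measured values:

```python
# Dynamic power rule of thumb: P ~ f * V^2 (leakage ignored for simplicity)
def scaled_power(p_watts: float, f_ratio: float, v_ratio: float) -> float:
    return p_watts * f_ratio * v_ratio ** 2

f_ratio = 14 / 15.9  # 15.9 TF downclocked to 14 TF (~12% lower clock)
print(round(scaled_power(244, f_ratio, 0.95), 1))  # ~193.9 W
```

So under these assumptions the hypothetical dual-GPU setup drops from ~244W toward ~194W, still high for a console but a meaningful saving for a 12% performance cut.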
 
Last edited:

R600

Banned
Imagine if 32/10/18 for Gonzalo means 32 CUs (since its codename is Navi Lite) with a 1.0GHz base and 1.8GHz boost clock, and it ends up in a console? That would result in a 7.4TF GPU and, with RT cores, a GPU TDP of roughly ~110W. Throw in 16GB of GDDR6 and a ~30W Zen 2, and we are at a 180W console.

This is obviously the most pessimistic prediction, but considering the Navi numbers and last gen's difference between console GPUs and top PC parts (2x the TF for PS4, 3x for Xbox One), this would fit in well.
 
Last edited:

ethomaz

Banned
That's a $700 console at minimum 😀, but maybe Sony will sell it at a big loss. 2012 was a really bad time for Sony, so I can understand why they wanted to play it safe back then, but now maybe they want to deliver a premium product once again like in the PS2-PS3 era.

A 14TF Navi-architecture GPU should not only crush Radeon VII performance but maybe even match the 2080 Ti. Can you guys imagine a God of War sequel built for a console like that? 😀


[Image: igorslab-tdp.png]

One RX 5700 draws 122W, so a dual GPU should draw 244W? It's still too much for a console, but maybe the PS5 GPU will be even more power efficient.
I will eat a hat if RX 5700 power consumption under load is really that low :D
 
Last edited:

vpance

Member
Mass production, yes.

But I'm sure there are a lot of wafers produced already with engineering samples.

Sony/MS probably get a new wave of chips every few months.

Wafer and chip shots rarely, if ever, get leaked for these anyway, so on that side of things we will never know when they truly start.
 


CrustyBritches

Gold Member
That article is addressing Nvidia's marketing numbers for Navi 10 and saying that the TDP (GPU-only) numbers don't match up. "Miraculous multiplication" was Google's translation. I don't think most people read the article.

It's best not to look at GPU-only power. Better to take TBP and add 32-48W for RAM (+16-24GB GDDR6 on top of the 8GB of memory). Then add 30-40W for the CPU, and then the rest of the system (SSD, BD drive, etc.). I'm guessing one of these systems will pull ~200W, and the other ~180W. That's at the wall, so total system power consumption under 200W. The PS4 Pro was ~155W and the X1X ~175W at the wall.
 

ethomaz

Banned
That article is addressing Nvidia's marketing numbers for Navi 10 and saying that the TDP (GPU-only) numbers don't match up. "Miraculous multiplication" was Google's translation. I don't think most people read the article.

It's best not to look at GPU-only power. Better to take TBP and add 32-48W for RAM (+16-24GB GDDR6 on top of the 8GB of memory). Then add 30-40W for the CPU, and then the rest of the system (SSD, BD drive, etc.). I'm guessing one of these systems will pull ~200W, and the other ~180W. That's at the wall, so total system power consumption under 200W. The PS4 Pro was ~155W and the X1X ~175W at the wall.
TBP already includes RAM... it is Typical Board Power... it includes everything on the GPU board.
But AMD cards exceed that TBP most of the time... maybe "Typical" means the power draw is mostly at idle.

RX 5700 TBP: 180W
RX 5700 XT TBP: 225W
 
Last edited:

CrustyBritches

Gold Member
TBP already includes RAM... it is Typical Board Power... it includes everything on the GPU board.
So the next-gen systems are only including 8GB RAM?

He says 8GB GDDR6 = 16W. For 24-32GB total (what most people who predict 56-64 CUs expect), you'd need to add 16-24GB on top of the 8GB included in the 5700 Pro TBP.
 
Last edited:

ethomaz

Banned
So the next-gen systems are only including 8GB RAM?

He says 8GB GDDR6 = 16W. For 24-32GB total (what most people who predict 56-64 CUs expect), you'd need to add 16-24GB on top of the 8GB included in the 5700 Pro TBP.
I misunderstood your comment.

GPU and console boards are different, and so are the cooling systems... that changes the power draw for the board.
 
Last edited:

R600

Banned
I think the mid-gen releases have distorted the picture a bit.

For example, the PS4 was about 8x the PS3 GPU in terms of pure TF, after 8 years.

The Xbox One was 5x the TF compared to the 360, after 8 years.

If there were no mid-gen console releases, 8-9TF with RT would be more than expected, considering node shrinking has gotten slower, mm² vs mm² of die has gotten more expensive, and graphics technology, especially in pure TFs, has hit a bit of a ceiling.

Basically, if they shrunk GCN to 7nm they could have gotten something like 10TF, which would be 5.5x more than the PS4. Since the Navi arch has an advantage TF for TF, pure TF won't tell the whole story.
 

R600

Banned
That article is addressing Nvidia's marketing numbers for Navi 10 and saying that the TDP (GPU-only) numbers don't match up. "Miraculous multiplication" was Google's translation. I don't think most people read the article.

It's best not to look at GPU-only power. Better to take TBP and add 32-48W for RAM (+16-24GB GDDR6 on top of the 8GB of memory). Then add 30-40W for the CPU, and then the rest of the system (SSD, BD drive, etc.). I'm guessing one of these systems will pull ~200W, and the other ~180W. That's at the wall, so total system power consumption under 200W. The PS4 Pro was ~155W and the X1X ~175W at the wall.
When you put it like that, people should be happy if they get ~5700 with 8TF, let alone 10 lol.
 

ethomaz

Banned
The article talks about an Nvidia chart, and according to Igor, Nvidia has provided (intentionally or not) the wrong TDP for AMD GPUs.
Why does he think it is wrong?

Radeon VII TBP: 300W
Radeon VII consumption: ~300W

I found his estimate pretty biased and unlikely to happen in a real scenario.
The other slide from the presentation is even optimistic about the RX 5700, but we will see soon.
 
Last edited:

CrustyBritches

Gold Member
When you put it like that, people should be happy if they get a ~5700 with 8TF, let alone 10, lol.
These are numbers I've pulled from articles, not my own. Of course I'm on the conservative side with estimates, assuming the consoles will be compact, quiet, power efficient, and affordable, with a BoM not exceeding the retail MSRP.

Like in that Tom's Hardware Germany article: he says 12W for the 2060's memory and 16W for the others. That has to be talking about VRAM, since the 2060 has 6GB and the others have 8GB. So for 24GB GDDR6, by his measurements, it would be 48W.

Then somebody will get a Ryzen 3000 and underclock it, and we'll get consumption for our CPU. Then the same for the 5700 XT and Pro: we'll get undervolt and underclock results to get the best-case perf/watt for Navi 10. Those will be real numbers and give a good indication of what's possible in a console form factor.

The last part depends on how much power Sony and MS want these things to pull. If they allow 220W or more, things would be drastically different than with ~175W like the Xbox One X.
 
Last edited:

quest

Not Banned from OT
When you put it like that, people should be happy if they get a ~5700 with 8TF, let alone 10, lol.
I honestly don't see 4x the OG PS4 with half-assed RT as next generation. Sure, the CPU will be great, and an SSD RAM drive, but that is about the minimum upgrade they could give while cheaping out. The RT is a complete waste right now if Nvidia can't do it yet with any efficiency; those transistors should go to more CUs. I'll be there day 1 because of Game Pass, but it's a disappointment compared to what could have been. That would be two underpowered generations in a row. These should be 10-plus, but really 12TF, if they are to be 7-year generations plus 2 years of cross-generation. If they want to bring back 5-year generations, then I could see cheaping out some with only a 4x leap in raw power.
 

R600

Banned
These are numbers I've pulled from articles, not my own. Of course I'm on the conservative side with estimates, assuming the consoles will be compact, quiet, power efficient, and affordable, with a BoM not exceeding the retail MSRP.

Like in that Tom's Hardware Germany article: he says 12W for the 2060's memory and 16W for the others. That has to be talking about VRAM, since the 2060 has 6GB and the others have 8GB. So for 24GB GDDR6, by his measurements, it would be 48W.

Then somebody will get a Ryzen 3000 and underclock it, and we'll get consumption for our CPU. Then the same for the 5700 XT and Pro: we'll get undervolt and underclock results to get the best-case perf/watt for Navi 10. Those will be real numbers and give a good indication of what's possible in a console form factor.

The last part depends on how much power Sony and MS want these things to pull. If they allow 220W or more, things would be drastically different than with ~175W like the Xbox One X.
I agree with this. I am pretty sure there won't be more than 16GB of GDDR6 in the consoles. I already said a 320-bit bus for Scarlett (going by the reveal video), and 256-bit for the PS5 (going by the PCB leak).

So that is around 30W for memory.

A Zen 2700 with a 3.2GHz base clock is 45W.
An 8-core Zen 2 in consoles on the 7nm node running at 3GHz should be roughly 30W.

To this we add the Blu-ray drive, SSD, controllers, etc., and we are safely around 70W without the GPU.

So, 70W.

For a 180W console we are left with 110W for the GPU. For a 200W console we have 130W for the GPU. For a 220W console we have 150W for the GPU (~8.5TF IMO).
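The arithmetic above can be laid out explicitly; all the wattages are the post's own estimates, not measurements:

```python
# Non-GPU power budget (estimates from the post above)
budget = {
    "16GB GDDR6": 30,
    "8-core Zen 2 @ ~3GHz": 30,
    "Blu-ray, SSD, controllers, etc.": 10,
}
non_gpu = sum(budget.values())  # 70 W before the GPU
for wall in (180, 200, 220):
    print(f"{wall}W console -> {wall - non_gpu}W left for the GPU")
```

That leaves 110W, 130W, and 150W GPU budgets for the three wall-power targets, matching the figures in the post.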
 

LordOfChaos

Member
Liquid metal makes consoles quieter and cooler?

Yeah, that's not going to be in a mass-production system. It eats other metals, so you have to surround the chip with an epoxy; it shifts around in shipping; it may even leak while sitting idle, over time. That's the realm of high-end overclockers, not mass-production systems.

It's enough for me if they don't use shitty thermal paste.
 
Last edited:

xPikYx

Member
I don't really care about resolution (I still play at 1080p and I think it's a good resolution); I think 4K is ridiculous per se, imagine 8K. But the next gen absolutely has to come with hardware RT acceleration, because that means next gen to me, no matter what.
 

Imtjnotu

Member
I agree with this. I am pretty sure there won't be more than 16GB of GDDR6 in the consoles. I already said a 320-bit bus for Scarlett (going by the reveal video), and 256-bit for the PS5 (going by the PCB leak).

So that is around 30W for memory.

A Zen 2700 with a 3.2GHz base clock is 45W.
An 8-core Zen 2 in consoles on the 7nm node running at 3GHz should be roughly 30W.

To this we add the Blu-ray drive, SSD, controllers, etc., and we are safely around 70W without the GPU.

So, 70W.

For a 180W console we are left with 110W for the GPU. For a 200W console we have 130W for the GPU. For a 220W console we have 150W for the GPU (~8.5TF IMO).
When did the PS5 pcb leak?!
 

R600

Banned
When did the PS5 pcb leak?!


Along with Gonzalo, this is the most believable leak to date. Coincidentally, Gonzalo was ES1 in January and QS in April, and having dev kits with the real SoC manufactured by the end of May would make perfect sense; the timeline fits completely.

Now, obviously this could be a VERY good fake, but given that this user was obviously from Asia (he said they were celebrating "cake day on the 21st" when the GMT date was still the 20th) and he had an electric circuit as his avatar before he deleted his account, he could very well be someone from the production line leaking this.

18Gbps Samsung chips just went into full production in January, btw.
 
Last edited:

Chronos24

Member
Yup. Even Phil didn't say a word about TFs, and we know how much he loves them. Lol

Sony and MS will follow AMD and just talk about how efficient, silent, powerful, etc., their systems are.
Silent and powerful is huge, honestly. Especially SILENT! How easily we forget just how loud the OG PS4 and Pro were/are.
 

LordOfChaos

Member


Along with Gonzalo, this is the most believable leak to date. Coincidentally, Gonzalo was ES1 in January and QS in April, and having dev kits with the real SoC manufactured by the end of May would make perfect sense; the timeline fits completely.

Now, obviously this could be a VERY good fake, but given that this user was obviously from Asia (he said they were celebrating "cake day on the 21st" when the GMT date was still the 20th) and he had an electric circuit as his avatar before he deleted his account, he could very well be someone from the production line leaking this.

18Gbps Samsung chips just went into full production in January, btw.

I still find it funny that the Phison SSD controller has "PS5" openly in the code name, lol. Sure, it's probably "Phison (gen) 5", but it lines up so well.
 

Fake

Member
Speaking of the DualShock, I don't think the DualShock 5 will be a generational leap over the DS4. Maybe some improvements, or, like Microsoft did, an Elite version of the regular DS4.
 

CrustyBritches

Gold Member
For 180W console we are left with 110W for GPU. For 20W console we have 130W for GPU. 220W console and we have 150W for GPU (~8.5TF IMO).
I came across this site when researching console power consumption, and you might find it interesting... Efficientgaming.eu. It's an organization that sets guidelines for console power consumption, with members including Sony, MS, and Nintendo. There's a lot of interesting info in there, but the whole premise is that game consoles ought to be as energy efficient as possible. They have annual voluntary compliance reports from Sony and MS you can read, where they even break down their own internal testing for different modes (media, gaming, etc.) on the PS4 Pro and Xbox One X.

I agree with this. I am pretty sure there won't be more than 16GB of GDDR6 in the consoles. I already said a 320-bit bus for Scarlett (going by the reveal video), and 256-bit for the PS5 (going by the PCB leak).
Agreed. I was thinking some DDR4 for the PS5, but now that I've read more on the SSD and its possible implementation, it's feasible they don't need the extra memory.
---
This question is for anybody: can somebody break down the way a 320-bit bus would work, the possible memory layout based on the shots, and what type of bandwidth we're talking about? It's easier for me to ask right now; I don't have a lot of time for research atm.
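On the bandwidth part of the question, the standard GDDR6 formula is bus width in bits divided by 8, times the per-pin data rate; a 320-bit bus means ten 32-bit chips (e.g. 10 x 1GB or 10 x 2GB). The 14Gbps and 16Gbps rates below are just commonly quoted GDDR6 speed grades, not confirmed console specs:

```python
# GB/s = (bus width in bits / 8) * per-pin data rate in Gbps
def gddr6_bandwidth(bus_bits: int, gbps: float) -> float:
    return bus_bits / 8 * gbps

print(gddr6_bandwidth(320, 14))  # 560.0 GB/s
print(gddr6_bandwidth(256, 16))  # 512.0 GB/s
```

So a 320-bit bus at 14Gbps would land around 560 GB/s, while a 256-bit bus at 16Gbps gives 512 GB/s.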
 
Last edited:

R600

Banned
I came across this site when researching console power consumption, and you might find it interesting... Efficientgaming.eu. It's an organization that sets guidelines for console power consumption, with members including Sony, MS, and Nintendo. There's a lot of interesting info in there, but the whole premise is that game consoles ought to be as energy efficient as possible. They have annual voluntary compliance reports from Sony and MS you can read, where they even break down their own internal testing for different modes (media, gaming, etc.) on the PS4 Pro and Xbox One X.
Jesus, thanks, that's an awesome find, mate.

Well there it is... 😎
But but the OQA leak... and the OQA leak... OQA leak... OQA leak...
The PCB leak is 100x more legit than this thing. 14.2TF Navi? Lol, wanna bet that you won't have a 14.2TF Navi as the highest performing GPU from AMD in PCs next year, let alone consoles?
 
Last edited:
This question is for anybody: can somebody break down the way a 320-bit bus would work, the possible memory layout based on the shots, and what type of bandwidth we're talking about? It's easier for me to ask right now; I don't have a lot of time for research atm.

here you go:

[Image: xboxscarlettlayoutamjxc.png]


Funnily enough, 2 x 4-core CCXs and 56 CUs (8 disabled) fit nicely in Proelite's measurements.
 
Last edited:

Ovech-King

Gold Member
These consoles will have more power than the current 2080 Ti, and yes, it will be $500. Here's why: you need to get familiar with the concept of a high-volume purchase rebate. To give you a parallel from my own job, I work for a Canadian tile distributor; home owners (the gamers) end up paying 4 times our cost on tiles, and our main retail-store customers (Sony, Microsoft) pay about 2 times our cost.

This being said, the more the retail stores buy in volume, the bigger the discount they get, and the price gets closer to cost. It's called a good business relationship.

Moral of the story: when Sony and Microsoft buy MILLIONS of Samsung and AMD parts, they don't pay much more than cost, and yes, they will sell at a small loss, but you'll be surprised how powerful the hardware they can sell you will be while keeping the loss minimal at $500.

A quick example is the Xbox One X, which sells as a whole for the same price as its GPU-only counterpart on PC for an end user like you and me.
 

SonGoku

Member
Hypothetically, if a next-gen system added multiple vapor chamber coolers, would that bring the heat down, or is this just not physically possible within the arrangement? If it were possible, it would be a cheap solution to getting more performance into the systems without risking overheating. But I'm no expert.
We're mostly presuming it's an APU, so one contact point = one big vapor chamber would make sense. They could add some cooling to the flip side of the board, but it's generally a minimal gain for the cost, only done on high-end GPUs.
Would a 3-fan case design be better? Those types of aftermarket GPU coolers perform better than vapor chambers.
If the PS5 SoC at 315mm² vs the Scarlett SoC at 385mm² is true, then I guess it's bad news for Sony.
The only way that's happening is with 7nm EUV; maybe that was Sony's plan all along.
315mm² on 7nm EUV is roughly equal to 400mm² on plain 7nm.
56-64 enabled CUs and 12-14TF are possible on that node.

So I wouldn't worry; all things point to Sony going with either a big die on plain 7nm or a conservative die on 7nm EUV.
 
Last edited:

R600

Banned
Just one point: MS and Sony buy the design and licence from AMD; they pay TSMC for chip manufacturing. Same for the other parts of the console. AMD has to fit everything on a board (memory included) and sell at a solid profit margin; Sony/MS fit everything at a discount and sell at a slight loss. That does not mean, however, that they can go for a 2080 Ti, because if they could, they wouldn't have gone for mid-range chips last time around.

If Sony went with 7nm EUV to get a smaller and cheaper chip with more performance, certainly MS would too. These companies know the node roadmaps very well; that's why every time they came out with new consoles at the same time, they went for the same node.
Last edited:

SonGoku

Member
These consoles will have more power than the current 2080 Ti
Nah mate, don't get overexcited.
With plain 7nm, 11TF is a reasonable expectation and 12TF is the absolute limit. So expect around RTX 2080 levels, give or take.

For RTX 2080 Ti levels, 13-14TF would be needed on 7nm EUV.
 
Last edited:

Imtjnotu

Member


Along with Gonzalo, this is the most believable leak to date. Coincidentally, Gonzalo was ES1 in January and QS in April, and having dev kits with the real SoC manufactured by the end of May would make perfect sense; the timeline fits completely.

Now, obviously this could be a VERY good fake, but given that this user was obviously from Asia (he said they were celebrating "cake day on the 21st" when the GMT date was still the 20th) and he had an electric circuit as his avatar before he deleted his account, he could very well be someone from the production line leaking this.

18Gbps Samsung chips just went into full production in January, btw.

But for 16GB, can you use a 256-bit bus? Seems like an instant bottleneck.
 

Ovech-King

Gold Member
Nah mate, don't get overexcited.
With plain 7nm, 11TF is a reasonable expectation and 12TF is the absolute limit. So expect around RTX 2080 levels, give or take.

For RTX 2080 Ti levels, 13-14TF would be needed on 7nm EUV.

448GB/s of equivalent bandwidth, for a release more than 12 months out that has to remain relevant for a few years... I just can't get on board with that, Goku, I'm sorry.

It's not a bottleneck, because with 16Gbps chips you are looking at 512 GB/s: 448 for the GPU, 64 for the Zen 2. The exact sweet spot.

Why would you go through the hassle of 16 modules and then use such a narrow bus? 512-bit bus or bust :p

No really, 16 modules makes no sense; neither cost-, nor power-, nor design-wise.
 

R600

Banned
Why would you go through the hassle of 16 modules and then use such a narrow bus? 512-bit bus or bust :p

No really, 16 modules makes no sense; neither cost-, nor power-, nor design-wise.
They are 2GB chips, therefore you would actually only need 8 of them for 16GB.

You would go with a 256-bit bus because you would still be able to get the necessary bandwidth, with less die space due to the narrower bus.
 
Last edited: