
Next-Gen PS5 & XSX |OT| Console tEch threaD

Status
Not open for further replies.


ethomaz

Banned
What is wrong with you people? Most people in this thread are clueless about electronic parts.

Zen 2 is Zen 2, whether it's the U, H or X version. The Ryzen 3600 is just an 8-core chiplet with 2 cores disabled.

The 4800U is the same CPU as the 3700X or 3800X, but with only 8MB of L3 cache and lower clocks, inside an APU.

The console will have 8 cores/16 threads, 16MB or 8MB of L3 cache, locked at 3-3.5GHz during heavy load (gaming).
The console CPU will be the mobile part, not the desktop one.

That, with a smaller cache.
 
Last edited:

Gavin Stevens

Formerly 'o'dium'
18TF. Oh, me boyos, we are in cloud cuckoo land now. Hold on to your butts!



An 18TF monster for $399 in a normal form-factor case!

Cerny isn’t just good, he’s a GOD, and he’s done work that no man, nay, no TEAM on the planet, with all their research, could manage. Nearly 6TF faster than a £1000 GPU, in a box built to be nearly half the size of a PS4, but Cerny has done it...

Man, what a guy...



Also, does anybody else’s iPhone autocorrect Cerny to Corny? Because now I can’t get this guy out of my head.



M U N S T E R
U
N
S
T
E
R
 
Last edited:

Niked

Member
DEVKIT 18TF...NOT RETAIL
 

R600

Banned
Consoles will almost certainly pack 8MB of L3. We saw it in the Flute benchmark already (PS5, sporting Oberon A0). Roughly half of a Zen 2 die is L3 cache, so going from 32MB on a ~75mm² die to 8MB would probably yield a chip at least 25mm² smaller.

I guess what you see in the 4800H is what we will see in consoles. Something between a 3600 and a 3700X.
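The die-area estimate above works out roughly like this (a quick sanity check; the 75mm² chiplet size and the "half the die is L3" share are the post's rough claims, not measured values):

```python
die_mm2 = 75           # Zen 2 chiplet size as claimed in the post (approximate)
l3_share = 0.5         # post's claim: roughly half the die is L3 cache
full_l3_mb, cut_l3_mb = 32, 8

l3_area = die_mm2 * l3_share                    # ~37.5 mm^2 of L3
saved = l3_area * (1 - cut_l3_mb / full_l3_mb)  # area freed by dropping 24 of 32 MB
print(saved)  # 28.125
```

So cutting the cache from 32MB to 8MB frees roughly 28mm², consistent with the "at least 25mm² smaller" estimate.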
 
[attached screenshot of the leaked info]


This is the info he was sent...again, salt required.

This is an attempt to discredit the Github leak, and a very poor one.

You can argue that the Github leak was not the final PS5 chip. That's fine.

However, claiming that the chips being tested were Navi chips that magically used the same CU counts and clock speeds as the PS4 and PS4 Pro to stress-test PC/mobile parts is ludicrous. Why would 911MHz ever be used to stress-test a Navi chip? Why would anyone want those configurations in PC/mobile parts when AMD has already debuted those chips? Claiming that it's just a coincidence that the clock speeds and CUs match is absurd, no matter what you think of the final PS5 chip.

There are no existing 18TF GPUs, nor are any in the pipeline. Why would you send out dev kits with a different TF count than the final hardware? Sony and MS don't do this. They may up the RAM and storage, but the extra TF are a waste, if even possible, because you aren't designing for that.

Claiming that Sony is running the testing in its own labs makes absolutely no sense. So AMD is designing chips and then sending them offsite to protect the info? What? AMD already knows what's in the chips and their expected performance. What does this accomplish? It prevents AMD from knowing that their chips aren't achieving their theoretical max performance?

Also, you are sending dev kits out. Isn't that a higher likelihood of a leak than keeping them at AMD?

This is pure fanboy drivel and a poor attempt to hand-wave the Github data.
 

CyberPanda

Banned
#munster
 

Handy Fake

Member
As much as I don't believe it, I would say that they'd possibly upclock the dev kits simply for development before optimisation.
 
As much as I don't believe it, I would say that they'd possibly upclock the dev kits simply for development before optimisation.

From what I recall, neither Sony nor MS has ever had dev kits that were different from the final hardware. The point of a dev kit is to give you a platform to target. Why would it be that high?

Also, what AMD chip can be overclocked to far north of 2.25GHz and survive?
 

ethomaz

Banned
MS dev kits have more TFs than the final hardware... the X dev kit was 6.8TF, if I’m not wrong.
Sony dev kits usually just have more RAM.
 
Last edited:

Handy Fake

Member
Oh, I'm not disputing what you say, I'm just theorising.
What I meant, and put badly, is that they'd possibly use higher clocks so unoptimised code still runs smoothly.
 

GustavoLT

Member
Are upcoming current-gen games like The Last of Us Part II, Ghost of Tsushima, etc. going to be upgradeable on PS5, or am I going to need to buy the PS5-specific version!?
 

Gavin Stevens

Formerly 'o'dium'
£1000 RETAIL PRICE? If that is what you're referring to, then think what Sony/Microsoft get these for in bulk.
But I agree, this is not that likely. 50% chance at best.

That is true, and I will give you that one. They can buy things a lot cheaper than we can, as well as save in other areas.

But that’s JUST the GPU, without factoring in anything else. Last time I checked, a PS4 controller costs $18 just to make, and a PS4 costs like $380. But that was a while back.
 

Tiago07

Member
Doing the math, 18TF is possible to reach, but very difficult I guess.

64 CUs × 64 shaders/CU × 2 ops/clock × 2200 MHz = 18.02 TFLOPS

It's possible to reach, but not credible.
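That arithmetic can be sanity-checked in a couple of lines of Python (the peak-FLOPS formula of shaders × 2 ops per clock × clock is the standard one for AMD GPUs; the 64 CU / 2200 MHz figures are still just the post's speculation):

```python
def tflops(cus: int, clock_mhz: float, shaders_per_cu: int = 64) -> float:
    """Theoretical peak FP32 TFLOPS: shaders * 2 ops/clock (FMA) * clock."""
    return cus * shaders_per_cu * 2 * clock_mhz * 1e6 / 1e12

# 64 CUs at 2200 MHz -> the 18 TF figure from the post.
print(round(tflops(64, 2200), 2))  # 18.02
```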
 

Gamernyc78

Banned
We still don't know yet about that 12TF, whether it's RDNA; early kits should have a Vega 64 GPU.
We all agree that XSX is 2.??x Scorpio; I read some people bumping those TF numbers like crazy. If that were the case, Phil Spencer would have said: XSX is nearly 3x Scorpio.

If that's the case, be prepared for disappointment perf-wise.

Either that or $449.

Yup, I've always said for months: 450-500.
 

DJ12

Member
Lol at an 18TF dev kit. BS. Next.
That's just Foxy's interpretation of what's said.

The fake pastebin says Sony have tested it to 18+ TFLOPS, not that that's what it sits at in dev kits.

It's well written; my guess is it's the same faker that posted as YamikaG, probably upset it only made it to GAF and no one else ran with it.

Everyone knows if you've got some FUD to spread, get it on Pastebin, then it's fact!!! Lol
 

Disco_

Member
Navi was the same as well. Ariel (Sony's first iteration of a Navi GPU) in 2016 was numbered 1000, meaning it was the first Navi GPU in AMD's numbering, and it was for Sony. So Sony and AMD have been working on Navi for the past 3 to 4 years.

Sony likely didn't have an actual chip design until 2018, late 2017 at the earliest. Someone on B3D explained the process.
 

Gudji

Member
I had thought the launch PS3 did PS2 games with actual hardware and PS1 games with emulation. The only reason the PS2 had hardware BC with PS1 was that the I/O controller for the CD/DVD drive used the original PS1 processor. But since the PS3 has its own I/O controller, I am sure that part was not included; I think the launch PS3 only had the CPU/EE of the PS2, and not the PS2's I/O controller (which would be the PS1 processor), so that would have to be emulated. From my understanding, below is the chip they had in the launch PS3 for PS2 BC.

https://en.wikipedia.org/wiki/File:Scph79001_eegeram.jpg


Holy moly, the PS5 SoC is hugeee, like 500mm². :messenger_relieved:
 
Last edited:

DJ12

Member
Not true at all, pal. Most MS dev kits since the 360 start with top-of-the-line Nvidia cards in them and gradually get closer to actual hardware as the CPUs/GPUs or APUs become ready.

Not a problem when you mandate DirectX.

Sony normally start weaker and build up, but MS go balls to the wall. Two different approaches to the same problem, i.e. the lack of actual hardware.
MS dev kits have more TFs than the final hardware... the X dev kit was 6.8TF, if I’m not wrong.
Sony dev kits usually just have more RAM.
Also this.
That's a slight increase which doesn't matter during development.
Lol ok, you move those goalposts, lad.
 
Last edited:

pawel86ck

Banned
80 CUs (dual RX 5700 XT) would be 5120 SP. In order to hit 18TF on an 80 CU GPU you need ~1758MHz.

However, the retail unit should have 72 CUs active, and with lower clocks on top of that.

72 CUs (4608 SP) × 1520 MHz × 2 = 14 TF
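Both of those figures check out with the usual peak-FLOPS formula (assuming RDNA's 64 shaders per CU; the CU counts and clocks are the post's speculation, not confirmed specs):

```python
def tflops(shaders: int, clock_mhz: float) -> float:
    # Peak FP32: shaders * 2 ops per clock (FMA) * clock speed.
    return shaders * 2 * clock_mhz * 1e6 / 1e12

# Clock needed for 18 TF on 80 CUs (80 * 64 = 5120 shaders):
needed_mhz = 18e12 / (5120 * 2) / 1e6
print(round(needed_mhz))             # 1758

# 72 CUs (4608 shaders) at 1520 MHz:
print(round(tflops(4608, 1520), 1))  # 14.0
```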
 

Gamernyc78

Banned
Yes, it's doubtful, but in an era where $1000+ smartphones have queues of buyers, I don't see this price as impossible anymore.

Not impossible, but a stupid move for the masses. Sony already learned from the PS3 era and won't commit that mistake again; let's use common sense. Only two options exist: either sell for $500 and under, or come out with two SKUs, one high-end and one affordable.
 
Last edited:
We should stop the madness at some point: there will never be a dual GPU. It's expensive and inefficient.
 