
[WIRED] Exclusive with Mark Cerny, PS5 specs detailed

Snake29

RSI Employee of the Year
Better for your wallet?



In regards to PC gaming: I will never buy an AMD GPU/CPU over an Intel/Nvidia combo.


Why try to convince me of something I already know? Still, I have no reason to buy Intel anymore for what I do with it (not only gaming).

My next CPU will again be... Ryzen.
 

sendit

Member
Why try to convince me of something I already know? Still, I have no reason to buy Intel anymore for what I do with it (not only gaming).

My next CPU will again be... Ryzen.

Not trying to convince you at all. But saying they're better for your wallet without taking individual use cases into account is a pretty ignorant statement to make.
 
Make of the videos below what you will; he makes compelling arguments (I/O, memory, scalability).
It's not good to follow tech too closely, it will give you a freakin' headache /s. The Zen 2 video might be interesting also.
Wish it were E3 2020 already...



This guy is clueless. I used to be more on the side of 'he knows what he's talking about'. Then in December 2018 he made a video claiming to have the base/boost clocks of every single Ryzen 3000 series SKU and, hilariously, prices. I checked through the SKUs and some of the base clocks were ridiculously high (4.2 and 4.3GHz on a 16-core chip!). I said those figures pointed to someone who doesn't really know CPUs, as there was a 1% chance of such clockspeeds. He didn't reply.

Lo and behold, two months after that video, at CES, Lisa Su revealed the Ryzen 3000 series was only at the engineering-sample stage, so clocks weren't finalised, and release was ages away, not until 'mid-2019'. Confirmation that Adored made that entire video up, or didn't do a sanity check on his sources.
 

thelastword

Banned
Anybody expecting PS5 to be 8TF is not to be taken seriously... it's just low-level trolling at this point. If PS5 is announced as 20TF tomorrow, they will say XB2 is 24TF and will be more powerful... it's like a Brad Sams fused with a Leadbetter joint.

Better for your wallet?



In regards to PC gaming. I will never buy a AMD GPU/CPU over an Intel/Nvidia combo.
You're comparing an i5 Intel CPU to an R7 Ryzen in pricing; of course Intel will be cheaper, and everybody knows most games favor clockspeed over cores... I mean gee, I wonder why that is? Because Intel has had the monopoly for quite a bit. So Ryzen is behind a few frames in games and Intel is clocked at 5.2GHz, but what are the temps, and what type of cooler is that Intel CPU using? You think it's the crap that comes in the box with those CPUs? How does that Intel CPU fare while streaming games compared to Ryzen, and how does it do in workloads? Intel wins in gaming, but the difference in frames is not that large considering the much larger difference in clockspeed, which speaks to the quality of Ryzen. How do you think Ryzen would perform if it could reach 5.2GHz? I guess you will find out soon, because Snake doesn't have to change his motherboard if he wants to upgrade to Ryzen 3000, which is actually bringing those clocks with higher IPC to boot. Even now, I'm pretty sure Snake is running a much cooler system with his Ryzen at 4.3GHz than a guy doing 5.2GHz on an Intel. With Ryzen he can game and stream like a champ, he can zip through workloads, his emulator will love the cores, and games will be gearing towards cores, since consoles will be 8c/16t beasts next gen with AMD CPUs. Yeah, he's right: it's better for his wallet, now and in the future.

FYI, the R5 Ryzen 2600 is $164 on Amazon and the R5 Ryzen 2600X is $179, both with decent coolers, unlike Intel. I'm pretty sure you can get these cheaper elsewhere too. The Intel 9600K is $264 with only 6 cores/6 threads. Compare the Intel 9700K to the Ryzen 2700X; they're both 7-class processors, and Intel's 9700K is $409 compared to the Ryzen 2700X at $292. Number of cores? You don't want to go there. The 9900K? Forget about it at that crazy price. And if you want to push these Intel CPUs to 5.2GHz, you need to invest in decent cooling, which the box does not provide: extra money on Intel's side. There's a reason everybody is shifting over to Ryzen and it's outselling Intel CPUs: it's a better value with more cores. You can even buy the 1600 series for cheap and upgrade to Ryzen 3000 with all the high clockspeeds Intel has been banking on; that changes in July. In the here and now, it's a win-win for AMD: the best value CPUs, the best performance per dollar.
 

ethomaz

Banned
So I took a wild guess that 18432MB would be a perfect multiple of 18 / 36 / 72 / 256 / 512 & so on



So if the GPU RAM dump is real we are looking at a GPU with either 36 CUs matched up to 512MB each or 72 CUs matched up to 256MB each



18432 is also a perfect match for the PS4 GPU's teraflops: 1.8432


Looks like they are trying to get perfect BC with enhancements, like going from 1080p to 4K or from 30fps to 60fps or 120fps.
RAM chips and CUs are not directly related.

18GB means 12 chips with 12Gb density, but people actually forget the system RAM, so it is probably 12 chips with 16Gb density.

In any case, 12 GDDR6 chips means a 384-bit bus, which at roughly 15.3 Gbps per pin gives us about 733GB/s.
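For anyone who wants to sanity-check that figure, the math is just bus width times per-pin data rate. A quick sketch in Python (the ~15.3 Gbps rate is an assumption backed out of the 733GB/s number, not a confirmed spec):

```python
# GDDR6 peak bandwidth: bus width (bits) * per-pin data rate (Gbps) / 8 bits-per-byte
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits * data_rate_gbps / 8

# 12 chips x 32-bit interface each = 384-bit bus
print(bandwidth_gb_s(384, 14.0))   # 672.0 GB/s at 14 Gbps
print(bandwidth_gb_s(384, 16.0))   # 768.0 GB/s at 16 Gbps
# the quoted ~733 GB/s implies pins running around 733 * 8 / 384 = ~15.3 Gbps
```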
 

onQ123

Member
RAM chips and CUs are not directly related.

18GB means 12 chips with 12Gb density, but people actually forget the system RAM, so it is probably 12 chips with 16Gb density.

In any case, 12 GDDR6 chips means a 384-bit bus, which at roughly 15.3 Gbps per pin gives us about 733GB/s.



Who makes 12Gb chips? Samsung makes 8Gb (1GB) & 16Gb (2GB) GDDR6 chips, & 4Gb (512MB) & 8Gb (1GB) GDDR5 chips.





18GB could come from nine 2GB chips, but that would be a 288-bit bus & it wouldn't be that fast.
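The chip-count arithmetic is easy to check too: each GDDR5/GDDR6 chip exposes a 32-bit interface, so bus width and capacity follow directly from chip count and per-chip density. A small sketch (these are just the configurations being discussed in the thread, not confirmed specs):

```python
# Each GDDR6 chip exposes a 32-bit interface; density is per-chip capacity in gigabits.
def memory_config(num_chips: int, density_gbit: int) -> tuple:
    bus_width_bits = num_chips * 32
    capacity_gbyte = num_chips * density_gbit / 8   # gigabits -> gigabytes
    return bus_width_bits, capacity_gbyte

print(memory_config(12, 12))  # (384, 18.0) -> 18GB on a 384-bit bus
print(memory_config(12, 16))  # (384, 24.0) -> 24GB if system RAM is counted too
print(memory_config(9, 16))   # (288, 18.0) -> nine 2GB chips: 18GB but only 288-bit
```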
 

Snake29

RSI Employee of the Year
Anybody expecting PS5 to be 8TF is not to be taken seriously... it's just low-level trolling at this point...

You're comparing an i5 Intel CPU to an R7 Ryzen in pricing; of course Intel will be cheaper...

FYI, the R5 Ryzen 2600 is $164 on Amazon and the R5 Ryzen 2600X is $179, both with decent coolers, unlike Intel...

This is what I'm running atm:

PC specs:

--------------------------------------------------------

CPU: AMD Ryzen 2700X, 8C/16T (Cores/Threads)

CPU Cooling: NZXT Kraken X62 liquid cooling

Motherboard: MSI X370 Gaming Pro Carbon

RAM: G.Skill Trident Z F4-3200C14D 16GB DDR4 @ 3200MHz

GPU: MSI GTX 1080 Gaming X 8GB (2100MHz with Afterburner OC)

Storage:

- Windows 10: Samsung 960 Evo 256GB NVMe SSD

- Star Citizen: Intel Optane 900P 280GB PCI-E SSD

- Second Samsung 960 Evo 256GB NVMe SSD

- 2 other regular SSDs

--------------------------------------------------------


Gameplay capture HD:

- Western Digital My Passport 2TB USB 3.1


Screenshots:

- For screenshots I use MSI Afterburner to capture in 4K, 8K, and panorama.


Gameplay Videos:

- I capture gameplay videos with Nvidia Shadowplay @4K/60fps.

Just a copy/paste from my Flickr album
 

ethomaz

Banned
Who makes 12Gb chips? Samsung makes 8Gb (1GB) & 16Gb (2GB) GDDR6 chips, & 4Gb (512MB) & 8Gb (1GB) GDDR5 chips.





18GB could come from nine 2GB chips, but that would be a 288-bit bus & it wouldn't be that fast.
Anybody can make it; it's in the GDDR6 specification.

GDDR6 chips can come in 8Gb, 12Gb, or 16Gb densities.

The devkit is 384-bit with 12 GDDR6 chips... I believe people are forgetting the system RAM, so it is probably 12 chips of 16Gb.
 

onQ123

Member
Anybody can make it; it's in the GDDR6 specification.

GDDR6 chips can come in 8Gb, 12Gb, or 16Gb densities.

The devkit is 384-bit with 12 GDDR6 chips... I believe people are forgetting the system RAM, so it is probably 12 chips of 16Gb.

I'm on Samsung's website & they only have 8Gb & 16Gb chips, no 12Gb.


https://www.samsung.com/semiconductor/dram/gddr6/





I think it comes back to OsirisBlack's rumor of 18GB of GDDR5 for the devkits, because 18 GDDR5X chips at 10-12 Gbps will give you around 733GB/s on a 576-bit bus.
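That GDDR5X configuration is at least arithmetically consistent. A quick check (taking ~10.2 Gbps per pin as the assumed rate within the 10-12 Gbps range quoted):

```python
# 18 GDDR5X chips, each with a 32-bit interface
chips = 18
bus_width_bits = chips * 32              # 576-bit bus
rate_gbps = 10.2                         # assumed effective per-pin data rate
bandwidth_gb_s = bus_width_bits * rate_gbps / 8
print(bus_width_bits, round(bandwidth_gb_s, 1))  # 576 734.4 -> roughly the 733GB/s figure
```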
 

ethomaz

Banned

onQ123

Member
Because they were never asked about 12Gb?

It is in the definition of GDDR6... it can come in three densities: 8Gb, 12Gb, or 16Gb. Anybody can manufacture any of these densities.

If Sony asks Samsung for 250 million 12Gb GDDR6 chips a year, they will deliver.


I know they have 12Gb GDDR6 chips & they also have 12Gb GDDR5X chips, but the 12Gb chips are not in mass production as far as I know.
 

Shin

Banned
Public website vs. internal supply network; given the market these things belong to, it's not surprising.
Also not listed, I believe, yet who's the customer for this mass order from early 2018? https://www.anandtech.com/show/12338/samsung-starts-mass-production-of-gddr6-memory
I vaguely remember checking up on PS4 chips before or right after the configuration was known, and IIRC they weren't on their site either.

Case in point: a public website doesn't always tell the whole story; there's a lot on the business side of things you never hear or read about.
 

ethomaz

Banned
I know they have 12Gb GDDR6 chips & they also have 12Gb GDDR5X chips, but the 12Gb chips are not in mass production as far as I know.
Sony planned the PS4 around 8GB of GDDR5 in a chip density that didn't exist when they announced it... so even after launch they had to take a bit of a loss, using double the number of chips to reach those 8GB until the bigger density was out... after that, their PS4 cost dropped a lot.

I'm not saying this is that case, but 12Gb GDDR6 is possible... it just needs a high-volume customer to ask Samsung or another memory manufacturer to produce it.

I still believe that if the devkit is real, then it is 24GB of system RAM + VRAM... the devkit shows 18GB for VRAM because that is what games will be able to use on the PS5, so devkits don't let devs use what is reserved for the system.
 

onQ123

Member
Public website vs. internal supply network; given the market these things belong to, it's not surprising.
Also not listed, I believe, yet who's the customer for this mass order from early 2018? https://www.anandtech.com/show/12338/samsung-starts-mass-production-of-gddr6-memory
I vaguely remember checking up on PS4 chips before or right after the configuration was known, and IIRC they weren't on their site either.

Case in point: a public website doesn't always tell the whole story; there's a lot on the business side of things you never hear or read about.

I know GDDR6 chips are in mass production; I'm talking about the 12Gb chips that you said would make up the 18GB.
 

ethomaz

Banned
I know GDDR6 chips are in mass production; I'm talking about the 12Gb chips that you said would make up the 18GB.
It doesn't need to be in mass production now... there is a year or more until then.

Sony can have samples for the devkits, for example.

To be fair, even the CPU (APU) in the devkit is probably an actual Ryzen 7 paired with a custom Vega 64 (or something like it)... the PS5's APU is not in mass production yet.
 

demigod

Member
Anybody expecting PS5 to be 8TF is not to be taken seriously... it's just low-level trolling at this point...

You're comparing an i5 Intel CPU to an R7 Ryzen in pricing; of course Intel will be cheaper...

FYI, the R5 Ryzen 2600 is $164 on Amazon and the R5 Ryzen 2600X is $179, both with decent coolers, unlike Intel...

I like how he comes back to LOL my post but won't bet me. Sometimes people don't believe the BS they're spouting.

Being an AMD, Intel, Nvidia fanboy is just stupid. Go with the best bang for your buck.
 
I always wondered how devkits are matched to the specs of the launch console. It must be using a 2nd-gen Ryzen CPU, as 3rd gen was still at the engineering-sample stage until recently.

So interestingly, you'd assume they'd be using a slightly downclocked 2700X plus a Vega 64; TDP is not a worry for a devkit, obviously :)
 

ethomaz

Banned

Sony's timing was perfect imo.

It drove conversation, even to the point of Xbox “insiders” feeling pressure and pouring cold water on possible leaks as the new devkits reached developers.

But why only the architecture and not detailed specs? Because Sony probably hadn't set the final specs yet... number of CUs, clocks, amount of RAM can all still be changed, and this devkit is probably a slower version of the actual final one.
 

ethomaz

Banned
Wasn't this confirmed before by the Wired journo?

Yes, but some thought they were jumping to conclusions lol

To be fair, the best way to counter a leak is just to release the info officially.
Plus Sony created the buzzword before MS.

IMO that shows confidence in what they are doing... they have a clear plan for how they will announce and release their next-gen platform.
 

Aceofspades

Banned

Shin

Banned
Is that you or are you blatantly stealing/copying someone's post word for word?

 

Shin

Banned
^Yikes. Still nothing about the PS5 OS. Hope they make it similar to the PS3's.
Should be FreeBSD 12; I don't see why they would need to change from it.
It's updated frequently and supports (off the top of my head) practically everything they need for the upcoming gen.
IMO our best bet is leaks via developers that are getting, or got, the devkit; Sony will most likely stay silent until PSX (if that's a go).
 

Fake

Member
Should be FreeBSD 12; I don't see why they would need to change from it.
It's updated frequently and supports (off the top of my head) practically everything they need for the upcoming gen.
The early rumors about PS5 talked about some memory-hungry OS, if I remember right.
 

CrustyBritches

Gold Member
The thing about the Wired reveal and the 3rd-party devkit leak is that Sony didn't actually release detailed info on anything. No specs were detailed... at all.

I think it was done to generate some hype and shit on the Xbox SAD reveal, possibly just as a favor to Wired.

The most substantial info still comes from the Gonzalo ID string, which is probably PS5 or Anaconda, the former imo.
 

Fake

Member
The thing about the Wired reveal and the 3rd-party devkit leak is that Sony didn't actually release detailed info on anything. No specs were detailed... at all.

I think it was done to generate some hype and shit on the Xbox SAD reveal, possibly just as a favor to Wired.

The most substantial info still comes from the Gonzalo ID string, which is probably PS5 or Anaconda, the former imo.
True. Revealing the info was in many respects a win-win for Sony at the time.
 

Shin

Banned
Probably for the UI being 4K/8K native; in terms of footprint, nothing else comes to mind among Linux distributions that would offer overhead as low as or lower than FreeBSD's.
Could be wrong, that's just off the top of my head. I took a couple of peeks last year to check its update interval, what's being added, etc., but it's not something I'm knowledgeable about.
 

bitbydeath

Member
The thing about the Wired reveal and the 3rd-party devkit leak is that Sony didn't actually release detailed info on anything. No specs were detailed... at all.

I think it was done to generate some hype and shit on the Xbox SAD reveal, possibly just as a favor to Wired.

The most substantial info still comes from the Gonzalo ID string, which is probably PS5 or Anaconda, the former imo.

Specs probably aren’t considered final until the rumoured PSX blowout.
 