
Next-Gen PS5 & XSX |OT| Console tEch threaD


CrustyBritches

Gold Member
What are we looking at here? I'm not entirely sure.

The CPU seems right for what other leaks have said, but the storage is a 7200RPM HDD. We know this isn't true, as there is going to be a dedicated SSD in there. As for the RAM - yes, the latency is higher than would be expected. Also, am I reading it right that it's using sixteen 1GB modules? Honestly, I'd expect larger densities, 2GB or 4GB per module. More channels are more efficient, but there is space and cost to consider.

Maybe this is an early devkit or prototype? I believe Sony has in the past sent devkit revisions out as new features have been introduced and specs have been locked down.*

(* - well, except that Shadow Fall was built for a 4GB machine)
I've been wondering about this too in relation to the unconfirmed OQA PCB leak. For the sake of not shitting up the thread, I'll put this in a quote.
This is an ES for the chip with PCI ID 13E9, as stated by Apisak in his tweet from June 10, 2019:
I guess 13F8 may be the Dev Kit with different PCI ID
There is also a new benchmark of Gonzalo.
Tests in the same benchmark as the ES 2G (13E9).
(Not 100%, but I think those are Gonzalo tests, even though they hide the code)
"ES 2G" = 2nd Gen(Gen1) engineering sample of a gaming APU. Gonzalo with 13F8 pci-id was already in "Z" status, or QS by April 10, 2019. QS = essentially final version of chip. Chiphell and Komachi had been following 13E9 as far back as January, 2019.
---
I wouldn't focus too much on the HDD, as it's probably just part of the test bed they're using. As for the memory module sizes, yes, I expected a different density based on the May 20, 2019 OQA PCB leak. It states, "16 Samsung K4ZAF325BM-HC18 in clamshell configuration". "325BM" maps to 16Gb modules on Samsung's product selector, which would indicate 32GB of GDDR6 in clamshell, and you'd expect 16GB in retail consisting of 8x 16Gb modules.
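To make the module math explicit, here's a quick back-of-envelope sketch (the counts and densities come from the leaks quoted above; the helper function is purely illustrative):

```python
# Back-of-envelope GDDR6 capacity math (module counts/densities from the
# leaks quoted above; the helper itself is just illustrative).
def pool_size_gb(num_modules: int, density_gbit: int) -> float:
    """Total capacity in GB for num_modules chips of density_gbit gigabits each."""
    return num_modules * density_gbit / 8  # 8 bits per byte

print(pool_size_gb(16, 8))   # Flute leak: 16x 8Gb (1GB) modules -> 16.0 GB
print(pool_size_gb(16, 16))  # OQA PCB leak: 16x 16Gb in clamshell -> 32.0 GB
print(pool_size_gb(8, 16))   # plausible retail cut: 8x 16Gb -> 16.0 GB
```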
---
In all likelihood, it's as Apisak said: the PCI ID 13F8 with "Z" status is the dev kit, and if the Reddit PCB leak is true (yet to be proven), then it was already being reviewed by QA in May. I don't know enough about the minutiae of this process to say whether Flute could be using 16x 8Gb modules in clamshell, as opposed to half of the 16Gb modules of the alleged dev kit.
---
We know Gonzalo (13F8) went from ES1 to QS during the January-to-April window. On August 12, 2019, Komachi tweeted concerning Oberon clocks and stated, "For Oberon : ........../....GPU_ariel refs/.......". My guess is Oberon = retail chip.
 
Different environment, different time of day, different weather. Therefore different lighting and visuals. Not really an apples-to-apples comparison.
Stiffer animation, the lighting is completely off (especially on her hands), no SSAO, less detail. Didn't Naughty Dog do this with Uncharted 4 also? While TLOU2 still looks good, it doesn't look as good as the trailers do, and that's probably because they realized during production that it wasn't possible to keep a steady framerate. Plus, they will want to make it look its best with additional visuals for PS5.
 

Insane Metal

Gold Member
16GB of RAM would be very poor, honestly. They want to run games at 4K (even 8K content), and then you don't have room for ultra HD textures? That would be shameful. Minimum 24GB, better 32GB. We are talking about machines with a lifetime of 5+ years.
32GB is overkill and unnecessarily expensive. 20-24GB is the sweet spot.
 

Imtjnotu

Member
16GB of RAM would be very poor, honestly. They want to run games at 4K (even 8K content), and then you don't have room for ultra HD textures? That would be shameful. Minimum 24GB, better 32GB. We are talking about machines with a lifetime of 5+ years.
I'm all for that RAM, but I don't feel like spending $600 on a console again.
 

MikonJuice

Member
People talking about power and ram... and I'm here, cheering for the chance to use all my play2 and playone games that are on discs for... decades...

Sakura Wars and Dragon Quest VIII to be more specific.
 

TeamGhobad

Banned
Wasn't 20GB confirmed for the next Xbox? People read the serial numbers off the RAM and it was a mix of 2GB and 1GB chips, and it added up to 20?
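For what it's worth, plenty of chip mixes do sum to 20GB; a purely illustrative enumeration, with no claim about what was actually on the board:

```python
# Which mixes of 2GB and 1GB chips could sum to 20GB? Purely illustrative
# arithmetic; no claim about what the actual board photos showed.
for total_chips in range(10, 21):
    for two_gb in range(total_chips + 1):
        one_gb = total_chips - two_gb
        if 2 * two_gb + one_gb == 20:
            print(f"{two_gb}x 2GB + {one_gb}x 1GB = 20GB ({total_chips} chips)")
# e.g. 10x 2GB (10 chips), 8x 2GB + 4x 1GB (12 chips), 4x 2GB + 12x 1GB (16 chips)
```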
 

psorcerer

Banned
Again, a lot of fun in this thread!
1. Over 16GB in a console... have you no pity for the PC guys? They will need 32GB and more to keep up.
2. Powerful CPU here, powerful CPU there. There is no need for big CPU power in a console. Its only purpose is to run poor, underoptimized game logic code. If you need computational power, use GPGPU; if you cannot use it, you don't need power, you just need to run your crappy underoptimized code. The only real use for a good CPU in a new console is fast asset decompression when streaming from the SSD.
3. The fast SSD is there for a reason, and not to "reduce loading times". Fuck me sideways, using a fast side bus just to load everything into memory because you cannot stream properly? Go work at McDonald's.
The SSD is there for streaming: use small amounts of RAM and constantly load assets. This way, with good flash bandwidth (>2GB/sec), you can get to an almost infinite set of resources per frame.
For example, Killzone Shadow Fall used ~1.5GB of non-streaming assets and a ~500MB streaming pool (with 1.6GB of streamable assets), essentially doubling the available assets, and all of that with a 30-40MB/sec drive.
If you have 50x the bandwidth, you can theoretically have a 50x asset pool: 80GB of streamable assets instead of 1.6GB. And if done right, it behaves almost exactly like another 80GB of read-only RAM pages.
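A rough sketch of that scaling argument, assuming the Shadow Fall figures above and a 2GB/s next-gen drive (the exact numbers are placeholders):

```python
# Streaming-pool scaling: with a constant in-RAM pool size, the amount of
# streamable content you can cycle through scales with drive bandwidth.
KZ_SF_BANDWIDTH_MBS = 35      # ~30-40MB/s HDD, per the numbers above
KZ_SF_STREAMABLE_GB = 1.6     # streamable assets behind the ~500MB pool

NEXTGEN_BANDWIDTH_MBS = 2000  # the assumed >2GB/s next-gen flash

scale = NEXTGEN_BANDWIDTH_MBS / KZ_SF_BANDWIDTH_MBS
print(f"bandwidth scale: ~{scale:.0f}x")                         # ~57x
print(f"streamable pool: ~{KZ_SF_STREAMABLE_GB * scale:.0f}GB")  # ~91GB

# Per-frame budget: at 30fps, a 2GB/s drive can swap in ~67MB every frame.
print(f"per-frame budget: ~{NEXTGEN_BANDWIDTH_MBS / 30:.0f}MB")
```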
 

Insane Metal

Gold Member
I have a feeling MS are going to push their clocks higher to compete with PS5 and then end up with another RRoD fiasco.
I don't think so. Higher clocks on Zen 2 increase power consumption steeply (power scales roughly with voltage squared times frequency, and higher clocks need more voltage). MS will make some very customized changes if they want a better CPU, but much higher clocks probably won't be the case.
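A minimal sketch of why, assuming the usual dynamic-power approximation P ~ C * V^2 * f with made-up voltage points (not real Zen 2 figures):

```python
# Dynamic power scales roughly as P = C * V^2 * f, and higher clocks
# typically need higher voltage, so power climbs much faster than frequency.
def relative_power(f_base: float, v_base: float, f_new: float, v_new: float) -> float:
    """Power at the new V/f point relative to the base point."""
    return (v_new ** 2 * f_new) / (v_base ** 2 * f_base)

# Hypothetical V/f points for illustration only (not real Zen 2 numbers):
print(f"{relative_power(3.2, 1.00, 3.4, 1.10):.2f}x power")  # ~1.29x
# A ~6% clock bump costing ~29% more power is exactly the kind of thing
# that blows a fixed console thermal budget.
```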
 
MS might push the CPU to 3.4 GHz vs only 3.2 GHz on the PS5:


CPUs tend to have minuscule differences (OG PS4 vs XB1, Pro vs X). Nothing to brag about.
 
16GB total would be disappointing. You'd be reserving some 3-4GB just for the OS, giving the devs just 12-13GB to work with.

That 12GB would then have to be split between CPU and GPU. Just 6GB each.

Probably sounds ridiculous to say "just 6GB each", but there's a lot you can do with 2GB extra.

Sticking with my 16+4 prediction. The PS4 Pro has 1GB of DDR3 RAM just for the OS, so it would be sensible for the PS5 to take the same approach.
 
16GB total would be disappointing. You'd be reserving some 3-4GB just for the OS, giving the devs just 12-13GB to work with.

That 12GB would then have to be split between CPU and GPU. Just 6GB each.

Probably sounds ridiculous to say "just 6GB each", but there's a lot you can do with 2GB extra.

Sticking with my 16+4 prediction. The PS4 Pro has 1GB of DDR3 RAM just for the OS, so it would be sensible for the PS5 to take the same approach.
I don't agree with the 16GB rumors, but the bolded part is not true.

AMD APUs offer hUMA, which means that both the CPU and the GPU have a unified address space and common memory pointers to access the same data sets (a requirement for HSA/GPGPU algos):


On the other hand, the Xbox 360 offered plain UMA. You still had to split the pool there.

hUMA is the reason current-gen consoles still chug along with only 5GB of free RAM, while PCs need much more.
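To put rough numbers on the two layouts being argued here (the OS reservations are this thread's guesses, not confirmed specs):

```python
# Developer-visible memory under the two rumored layouts. All reservation
# figures are this thread's speculation, nothing confirmed.

# Layout A: 16GB GDDR6 total, OS carves ~3.5GB out of the shared pool.
print(f"Layout A: ~{16 - 3.5}GB for games")  # ~12.5GB

# Layout B ("16+4"): 16GB GDDR6 for games plus a 4GB DDR4 sidecar for the
# OS, echoing the PS4 Pro's 1GB DDR3 sidecar.
print(f"Layout B: ~{16 - 0}GB for games")    # 16GB

# Either way, hUMA means no hard CPU/GPU split inside the game pool: both
# processors share one address space, so "6GB each" isn't how it works.
```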
 

TeamGhobad

Banned
  • Custom CPU based on AMD’s Zen 2 architecture: 12 cores; 3.4 GHz clock
  • Custom GPU also from AMD – “Arcturus”: L4 1 GB @ 1.3 TB/s; 14.366 teraflops of power (80 CUs)
  • 28 GB GDDR6 RAM (4 GB for the OS) @ 672 GB/s
  • 500 GB NVMe SSD @ 6 GB/s cache + 1 TB HDD
  • Hardware-based real-time ray tracing
  • The Xbox Scarlett “secret sauce”
any truth to this at all?
 

Insane Metal

Gold Member
  • Custom CPU based on AMD’s Zen 2 architecture: 12 cores; 3.4 GHz clock
  • Custom GPU also from AMD – “Arcturus”: L4 1 GB @ 1.3 TB/s; 14.366 teraflops of power (80 CUs)
  • 28 GB GDDR6 RAM (4 GB for the OS) @ 672 GB/s
  • 500 GB NVMe SSD @ 6 GB/s cache + 1 TB HDD
  • Hardware-based real-time ray tracing
  • The Xbox Scarlett “secret sauce”
any truth to this at all?
Nope
 

MadAnon

Member
  • Custom CPU based on AMD’s Zen 2 architecture: 12 cores; 3.4 GHz clock
  • Custom GPU also from AMD – “Arcturus”: L4 1 GB @ 1.3 TB/s; 14.366 teraflops of power (80 CUs)
  • 28 GB GDDR6 RAM (4 GB for the OS) @ 672 GB/s
  • 500 GB NVMe SSD @ 6 GB/s cache + 1 TB HDD
  • Hardware-based real-time ray tracing
  • The Xbox Scarlett “secret sauce”
any truth to this at all?
No, because Arcturus has nothing to do with traditional GPUs. It's a Vega-based GPU purely for compute, like the Instinct line.
 

Imtjnotu

Member
  • Custom CPU based on AMD’s Zen 2 architecture: 12 cores; 3.4 GHz clock
  • Custom GPU also from AMD – “Arcturus”: L4 1 GB @ 1.3 TB/s; 14.366 teraflops of power (80 CUs)
  • 28 GB GDDR6 RAM (4 GB for the OS) @ 672 GB/s
  • 500 GB NVMe SSD @ 6 GB/s cache + 1 TB HDD
  • Hardware-based real-time ray tracing
  • The Xbox Scarlett “secret sauce”
any truth to this at all?
All for the low, low price of $999. I'd buy this lol
 

Handy Fake

Member
  • Custom CPU based on AMD’s Zen 2 architecture: 12 cores; 3.4 GHz clock
  • Custom GPU also from AMD – “Arcturus”: L4 1 GB @ 1.3 TB/s; 14.366 teraflops of power (80 CUs)
  • 28 GB GDDR6 RAM (4 GB for the OS) @ 672 GB/s
  • 500 GB NVMe SSD @ 6 GB/s cache + 1 TB HDD
  • Hardware-based real-time ray tracing
  • The Xbox Scarlett “secret sauce”
any truth to this at all?
Can I presume you got this from "misterXmedia"? ;)
 

TLZ

Banned
  • Custom CPU based on AMD’s Zen 2 architecture: 12 cores; 3.4 GHz clock
  • Custom GPU also from AMD – “Arcturus”: L4 1 GB @ 1.3 TB/s; 14.366 teraflops of power (80 CUs)
  • 28 GB GDDR6 RAM (4 GB for the OS) @ 672 GB/s
  • 500 GB NVMe SSD @ 6 GB/s cache + 1 TB HDD
  • Hardware-based real-time ray tracing
  • The Xbox Scarlett “secret sauce”
any truth to this at all?
Tales from whose ass is this?
 

SlimySnake

Flashless at the Golden Globes
  • Custom CPU based on AMD’s Zen 2 architecture: 12 cores; 3.4 GHz clock
  • Custom GPU also from AMD – “Arcturus”: L4 1 GB @ 1.3 TB/s; 14.366 teraflops of power (80 CUs)
  • 28 GB GDDR6 RAM (4 GB for the OS) @ 672 GB/s
  • 500 GB NVMe SSD @ 6 GB/s cache + 1 TB HDD
  • Hardware-based real-time ray tracing
  • The Xbox Scarlett “secret sauce”
any truth to this at all?
Pretty much everything about this is wrong. Avatar bet. You can bookmark and quote me on this.

The only thing true is the hardware-based real-time ray tracing. The SSD will be 1TB minimum. The GPU isn't Arcturus; that's not even a desktop GPU. The CPU is going to be 8 cores. The RAM might be accurate, but I don't see why they would use 4GB of GDDR6 for the OS when DDR4 will do just fine.
 

Xdrive05

Member
I'm in the dedicated OS RAM camp. It's probably cheaper for them to have a 4GB DDR4 side piece and thus get away with "only" 16GB of GDDR6 to run the games.

Either way, it seems like the solid state will be bussed such that it will be available as a second-tier RAM cache to mitigate limitations years down the road. In fact, if they do forgo a dedicated OS chip, that will be their excuse why: "12GB is plenty, we have 64GB as SSD RAM cache!"
 
No PC player will switch to a console for an exclusive game...
This is old, but I know many who got the OG Xbox for Halo. I'm pretty sure we can find PC gamers who got PS4s for Bloodborne, Horizon: Zero Dawn, Uncharted 4, or God of War (one or several of them). Same for the Switch; in fact, I know of one, though he probably also got it because there was some cool factor around that console's release.
 

Imtjnotu

Member
This is old, but I know many who got the OG Xbox for Halo. I'm pretty sure we can find PC gamers who got PS4s for Bloodborne, Horizon: Zero Dawn, Uncharted 4, or God of War (one or several of them). Same for the Switch; in fact, I know of one, though he probably also got it because there was some cool factor around that console's release.
I bought my Switch just to play Super Smash. No other reason. There are plenty of gamers who buy consoles for exclusive games. Just let his delusional opinion be.
 

CrustyBritches

Gold Member
Not making any guesses on CU count. I don't see PS5 GPU clocks being higher than 2GHz, and I doubt MS will stick with 1.6GHz.
RRoD was caused by the switch to lead-free solder, IIRC.

Can you give a scenario, using estimates for clocks and CUs, that would result in MS choosing to "push their clocks higher to compete with PS5 and then end up with another RRoD fiasco"?

If you believe that 2GHz is the ceiling for clocks and Scarlett has the same number of CUs, why would that clock speed damage Scarlett and not the PS5? If Scarlett has more CUs, then it could run a lower clock speed to get to the same performance, which should be even less of a risk.
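To put numbers on that trade-off, here's the standard CUs-times-clocks FLOPS formula with hypothetical configurations (neither is a claimed spec):

```python
# Standard GCN/RDNA throughput: TFLOPS = CUs * 64 lanes * 2 ops (FMA) * GHz / 1000
def tflops(cus: int, ghz: float) -> float:
    return cus * 64 * 2 * ghz / 1000

# Hypothetical configs for illustration (not claimed specs):
print(f"{tflops(36, 2.00):.1f} TF")  # narrow and fast: 36 CUs @ 2.0GHz -> 9.2 TF
print(f"{tflops(48, 1.50):.1f} TF")  # wide and slow:   48 CUs @ 1.5GHz -> 9.2 TF
# Same throughput, but the wider chip clocks 25% lower, which means lower
# voltage and a much easier thermal budget -- hence "less of a risk".
```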
 

demigod

Member
RRoD was caused by the switch to lead-free solder, IIRC.

Can you give a scenario, using estimates for clocks and CUs, that would result in MS choosing to "push their clocks higher to compete with PS5 and then end up with another RRoD fiasco"?

If you believe that 2GHz is the ceiling for clocks and Scarlett has the same number of CUs, why would that clock speed damage Scarlett and not the PS5? If Scarlett has more CUs, then it could run a lower clock speed to get to the same performance, which should be even less of a risk.

You might want to refer to the leaks on why the PS5 is estimated to be ahead right now, instead of asking these silly questions.
 

CrustyBritches

Gold Member
demigod

No worries. These new AMD chips have all sorts of thermal sensors in them, and they'll simply throttle before causing any harm to the chip. A lot of the "fun" has been taken out of CPU overclocking because AMD has basically tied performance to the quality of your cooling, and their boost systems handle the frequency scaling. Admittedly, I still don't understand the intricacies, but GamersNexus did some awesome vids on this subject...




Based on the X1X, MS can get to a 20K+ Fire Strike score pretty easily. Not to say the PS5 isn't more powerful, just that I don't see easy-breezy covergirl for Sony and crash-and-burn for MS at this level of performance. A 5700 XT ballz-out will hit over 225W, and the cheapo blower can handle it "OK". These consoles shouldn't be over 200W.
 
Crazy that we pretty much had the final specs for the PS4/X1 seventeen months away from launch, aside from the RAM sizes

 

Mass Shift

Member
Crazy that we pretty much had the final specs for the PS4/X1 seventeen months away from launch, aside from the RAM sizes


I think we have the PS5, minus memory. So it's almost like a repeat.
 