
Next-Gen PS5 & XSX |OT| Console tEch threaD

Status
Not open for further replies.

ethomaz

Banned
It's a fucking pastebin lol
It's a mix and match of different rumors...

Xbox die size: 392mm² reera analysis

PS5 die size: 319mm² reddit rumor

Clock speed in current devkits:

Xbox (Dante): 1.1 GHz (not final; goal is 1.6) Arcturus rumor
PS5 (Gonzalo): 1.8 GHz (was 1 GHz; now pretty much the final version) Gonzalo rumor...

Another pastebin:
PS5 Kugoson :D
 

DanielsM

Banned
"I think it will absolutely be the most powerful immersive console on the market"

Why isn't he just "absolutely certain"? (Not that I care about the power difference, but the marketing never stops.) I'm not even sure what "immersive" is; it's more of a subjective term.

Why not just release the specs if he is "absolutely" positive? If the specs are not complete, how can he be "absolutely" sure of anything? Of course, he's the same guy talking up the power of da cloud, saying how everything can be offloaded. All this is is marketing... very piss-poor marketing, but marketing.

 
Last edited:
There is no more time for PC-based devkits anymore.

If you want a next-year release, you need to have the APU inside the devkits already.

For what purpose?

I already told you that devs didn't have the real Scorpio until 8 months before launch. Plenty of time to optimise their games for the true hardware. All they need now is an approximate spec.

The consoles are just PC architecture now anyway, and devs are already building for that. Lower some settings for the consoles, do some optimisation, and you're there.
 

CrustyBritches

Gold Member
Just a thought, but these threads should have been posted in here.
We should get this official thread as big as possible. It was annoying before when we had to argue across 2-3 threads, saying the same thing over and over. This way shows the strength of the forum too... the ability to hopefully have ten thousand+ posts and maybe get up to a million views. How fucking awesome would that be?

/rant
 
Last edited:

FranXico

Member
Just a thought, but these threads should have been posted in here.
We should get this official thread as big as possible. It was annoying before when we had to argue across 2-3 threads, saying the same thing over and over. This way shows the strength of the forum too... the ability to hopefully have ten thousand+ posts and maybe get up to a million views. How fucking awesome would that be?

/rant
Some of those have actually been posted here first, but they got buried under the WARZ noise.
 

SonGoku

Member
Yet you do not have the confidence to take my Navi 10 bet, which is 8TF and 9TF GPUs? You did the opposite, actually: you forced me right onto the razor's edge for prediction, and I had to take 1080/2060/Vega 64 against the PS5 while you flaunt the idea of RTX 2080-level GPUs???
That's a fair call-out, but keep in mind that was before we knew any official Navi 10 info.
For the record my baseline expectation is 11TF, so not quite RTX 2080 but definitely >RTX 2070. I'll take that bet now.

What I'm most confident of is a 56CU minimum.
Navi info of course.
No Navi info pointed towards 36CU consoles...
I just don't think you are being realistic. The 5700 XT is already a 225W TBP GPU, so it is unrealistic to expect a 56-64CU console APU. Even a lower-clocked one
The 5700 XT's default voltage is meant to sustain 1950MHz.
A 56CU APU undervolted to a stable 1540MHz, or a 64CU at 1470MHz, would consume much less.

(a bigger chip is more expensive to make).
But within console budgets.
The RTX 2060 is 445mm² at $350.
Besides, I much prefer to be conservative and end up with better than expected.
That's fair. I'll just say this: even if they target 9TF they'll use a bigger chip.
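The undervolting argument leans on the rule of thumb that dynamic power scales roughly with voltage squared times frequency (P ∝ C·V²·f). A minimal sketch, where the voltage figures are illustrative assumptions, not real Navi numbers:

```python
def relative_power(v, f, v_ref, f_ref):
    """Dynamic power relative to a reference voltage/clock operating point.

    Based on P ~ C * V^2 * f; capacitance C cancels out in the ratio.
    """
    return (v / v_ref) ** 2 * (f / f_ref)

# Hypothetical 5700 XT-like reference: 1.2 V at 1950 MHz.
# Hypothetical undervolted console-like point: 1.0 V at 1540 MHz.
ratio = relative_power(1.0, 1540, 1.2, 1950)
print(f"Relative dynamic power: {ratio:.2f}")  # ~0.55 of the reference
```

With those assumed numbers, dropping voltage ~17% and clock ~21% roughly halves dynamic power, which is why a wider, slower chip can fit a console budget.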
 
Last edited:

CrustyBritches

Gold Member
Suggestions on how to accommodate it in the OP while keeping it nice and organized?
We need to have a news ticker sub-banner that shows the latest "marked" posts.

Like this:
the-news-ticker-in-times-square-in-new-york-on-tuesday-december-20-HF7X1H.jpg

for a topic. Then you click the little button for the mark and it takes you to the post. I'm going to make a mockup and see what Evilore thinks.
 
Last edited:

THE:MILKMAN

Member
That's a fair call-out, but keep in mind that was before we knew any official Navi 10 info.
For the record my baseline expectation is 11TF, so not quite RTX 2080 but definitely >RTX 2070. I'll take that bet now.

What I'm most confident of is a 56CU minimum.

No Navi info pointed towards 36CU consoles...

The 5700 XT's default voltage is meant to sustain 1950MHz.
A 56CU APU undervolted to a stable 1540MHz, or a 64CU at 1470MHz, would consume much less.


But within console budgets.
The RTX 2060 is 445mm² at $350.

That's fair. I'll just say this: even if they target 9TF they'll use a bigger chip.

Okay, let's see how it all pans out. If I could just ask one thing though... what estimated TDPs would your proposed 56-64CU APUs be?
 

SonGoku

Member
We need to have a news ticker sub-banner that shows the latest "marked" posts.

Like this:
the-news-ticker-in-times-square-in-new-york-on-tuesday-december-20-HF7X1H.jpg

for a topic. Then you click the little button for the mark and it takes you to the post. I'm going to make a mockup and see what Evilore thinks.
Added to the OP, let me know what you think.
Anyone feel free to @ me to add relevant info to the OP.

The one thing I dislike about threadmarks is the gross, dismissive, elitist attitude displayed by reera users when someone posts something they think is already covered in their threadmarks, as some undisputed truth.
Okay, let's see how it all pans out. If I could just ask one thing though... what estimated TDPs would your proposed 56-64CU APUs be?
200W total for the console, though I'd like 300W+.
I asked this before: what's stopping consoles from going with 300W? Cooling tech and cases today are advanced enough to handle it without breaking the bank, while retaining console size and decent noise.
 
Last edited:
The more I think about it, the more likely a $399 PS5 sounds. It's very hard to imagine Sony going on stage and saying "We are selling a console for 500 dollars" without some backlash. Unless they have plans. Even if Sony and MS make their consoles $499, that makes it easy for Nintendo to pull a Wii and outsell both consoles completely.

I think the max is 10 TFLOPs.
 

SonGoku

Member
PS5 pastebin rumor
CPU: 7nm Ryzen, 8 cores / 16 threads
GPU: 7nm Navi, ~14TF, powerful and power efficient, much better bandwidth overall.
24GB GDDR6 + 4GB DDR4 for the OS (32GB dev kits)
2TB HDD + some sort of NAND flash
DualShock 5: some sort of camera inside for VR, more analog precision for FPS games, something similar to the Steam analog trackpad
$499 ($100 loss per console at the beginning)

Don't mind me, reposting to link to the OP
 

DanielsM

Banned
The more I think about it, the more likely a $399 PS5 sounds. It's very hard to imagine Sony going on stage and saying "We are selling a console for 500 dollars" without some backlash. Unless they have plans. Even if Sony and MS make their consoles $499, that makes it easy for Nintendo to pull a Wii and outsell both consoles completely.

I think the max is 10 TFLOPs.

Since the console will be fully backwards compatible on day one, and most of the games will be on both for 3-4 years, I'm not sure sales have to be great. Eliminate the PS4 Slim, sell the PS4 Pro as the low end and the PS5 as the high end.

The PS4/PS4 Pro will be very relevant for many years yet.
 
Last edited:

CrustyBritches

Gold Member
The one thing I dislike about threadmarks is the gross, dismissive, elitist attitude displayed by reera users when someone posts something they think is already covered in their threadmarks, as some undisputed truth.

100% agree.

I was thinking something like a news ticker on the main page for news articles.

Quick mockup:
NeoGAF_ThreadNewsTicker.jpg


You press the button and it takes you right to the post with the article. The way thread titles are handled here, it could be problematic to update unless we had mod support. Thoughts?
 
Last edited:

SonGoku

Member
Scarlet/Anaconda (Snek), Arcturus rumor
wHcrp5L.png

CPU: 8-core Ryzen at 3.3 GHz with 1GB of L4 cache (probably from main memory)
GPU: 11.468 TF (64CUs at 1400 MHz)
RAM: 22GB (probably GDDR6) on a 352-bit bus

Don't mind me, reposting to link to the OP
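The 11.468 TF figure in the rumor follows from the standard GCN/RDNA peak-FLOPs formula: CUs × 64 shaders per CU × 2 ops per clock (FMA) × clock speed. A quick sanity check of the rumored 64CU/1400MHz configuration:

```python
def gpu_tflops(cus, clock_mhz, shaders_per_cu=64, ops_per_clock=2):
    """Peak FP32 TFLOPs = CUs x shaders/CU x 2 ops per clock (FMA) x clock."""
    return cus * shaders_per_cu * ops_per_clock * clock_mhz * 1e6 / 1e12

print(gpu_tflops(64, 1400))  # -> 11.4688, matching the rumor's 11.468 TF
print(gpu_tflops(36, 911))   # -> ~4.2, the PS4 Pro's GPU for comparison
```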
 

SonGoku

Member
4oAziR6.png

Za21Rqo.png

K6HmOxF.png

0oaCtUj.png
12 chips
4 more are on the bottom, shown in the first picture showcasing the SoC
So 24GB GDDR6, assuming 16Gbit (2GB) chips
24GB of GDDR6 on a 384-bit bus
14 Gbps -> 672 GB/s (chips spotted in the video)
16 Gbps -> 768 GB/s
18 Gbps -> 864 GB/s
20 Gbps -> 960 GB/s

In 2016 Microsoft showcased a render of the Xbox One X's board while the system was in development, and the number of chips matched up with the number of chips the retail system has. Keep in mind things may be subject to change.
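The bandwidth table above is just bus width times per-pin data rate. A small sketch reproducing it (the 384-bit bus and the listed data rates come from the post; the rest is arithmetic):

```python
def peak_bandwidth_gb_s(bus_bits, gbps_per_pin):
    """Peak bandwidth in GB/s = (bus width in bits / 8 bits per byte) x per-pin data rate."""
    return bus_bits / 8 * gbps_per_pin

for rate in (14, 16, 18, 20):
    print(f"{rate} Gbps -> {peak_bandwidth_gb_s(384, rate):.0f} GB/s")
# 14 Gbps -> 672 GB/s up through 20 Gbps -> 960 GB/s, matching the table
```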
 
Last edited:

TeamGhobad

Banned
The 14Gbps model was confirmed; someone zoomed in on the Samsung serial number. That's a lot of RAM on a big bus.
If Sony decides to split the pool and use HBM2/3, what advantage will it have over the nexbox?

Aren't the top two X1X? The lower two are Scarlett.

Green was Xbox One X or Lockhart; blue is Anaconda.
 
Last edited:
I just read Albert Penello's latest tweet in response to the PS5 devkits being more powerful than the current Scarlett devkits. And that got me thinking.

What would the industry fallout be if Sony's new console happens to be the most technically competent console next-gen?

15hTlf5.jpg
 
Last edited:

Imtjnotu

Member
I just read Albert Penello's latest tweet in response to the PS5 devkits being more powerful than the current Scarlett devkits. And that got me thinking.

What would the industry fallout be if Sony's new console happens to be the most technically competent console next-gen?

15hTlf5.jpg
All this says to me right now is:

-DEV KITS FOR NEXT BOX AND PS5 ARE INDEED OUT.
-PS5 KITS ARE STRONGER.
-DAMAGE CONTROL for now.
 
Last edited:

THE:MILKMAN

Member
I think there is a lot of confusion between devs telling Andrew Reiner about written target, or even final, specs and the state of current physical devkits.

At least that is how I'm making sense of the back and forth....
 

TBiddy

Member
"I think it will absolutely be the most powerful immersive console on the market"

Why isn't he just "absolutely certain"? (Not that I care about the power difference, but the marketing never stops.) I'm not even sure what "immersive" is; it's more of a subjective term.

Hey champ. How would Matt Booty know anything about the specs of the PS5, besides industrial espionage? You think he has direct access to Cerny's team, or a special leaker at AMD that provides him the specs of the PS5?

Jesus christ.
 

ethomaz

Banned
Somebody replied to me about a dual-GPU Mac that looks similar to a chiplet... I opened the page last week, my notebook restarted over the weekend, and I lost the link lol

Too busy to look again... I was interested.
 

ethomaz

Banned
What does RT hardware look like? Is it on the SoC or a different SoC? Is it more of a CPU or a GPU? Any tips?
Do you mean the RT hardware in Nvidia cards? Well, it is part of the SoC... they are blocks just like CUs are blocks... by Nvidia's own words it increases the chip size by 22%.
 

SonGoku

Member
Somebody replied to me about a dual-GPU Mac that looks similar to a chiplet... I opened the page last week, my notebook restarted over the weekend, and I lost the link lol

Too busy to look again... I was interested.
Yeah, it's the dual Vega, what about it?
Previous dual GPUs from AMD and Nvidia looked like chiplets.
 

ethomaz

Banned
Yeah, it's the dual Vega, what about it?
Previous dual GPUs from AMD and Nvidia looked like chiplets.
It's just that chiplets haven't worked with GPUs yet.
But that dual GPU seems like a step in that direction.

I meant for next-gen consoles. They said they are doing hardware raytracing.
Nobody knows.

The AnandTech guy believes that both MS and Sony have their own hardware implementation for RT, different from what AMD will put in future Navi... but those are just guesstimates.

I believe the RT units will be put inside each DCU... so the current Navi could have 20 RT units (again, big assumptions here lol).
 
Last edited:

pawel86ck

Banned
The reason the X pulled so far ahead is memory bandwidth, not some silver gaming bullet only MS possesses

The PS4 low-level API was already more efficient with CPU draw calls than DX

If anything it would increase it lol
Hybrid RT is a resource hog
Are you suggesting the PS4 has something similar to the DX12 command processor built into the Xbox X GPU? If not, then how can the PS4 be faster at processing draw calls? The Xbox X has real hardware to offload the CPU.
 

SonGoku

Member
Are you suggesting the PS4 has something similar to the DX12 command processor built into the Xbox X GPU? If not, then how can the PS4 be faster at processing draw calls? The Xbox X has real hardware to offload the CPU.
Read closely: there's no additional chip/silicon, it's just a buzzword for coding to the metal, which the PS4 API already has.
 
We know both MS and Sony have stated they're using Navi.

With each taking a branch from the Navi master when their projects started. Work is done on their own branch but never goes back into the Navi master branch. So each has their own "special sauce," to use a well-known term.

The start/restart dates of these projects could have a significant gap between them, making the foundations of each look noticeably different from each other.

Now that the interesting back story is out of the way, on to the point of this post.

The high clock speeds of Navi disclosed in leaks and AMD's Navi presentation seemed to take the MS camp by surprise.

Is this because the foundations of the MS project were very different (an earlier start date, so not including those foundation enhancements?), or are they using Navi in a different configuration where it's not possible to push the clock so high? Monolithic vs chiplet+I/O die overhead, for example.
 

pawel86ck

Banned
Read closely there's no additional chip/silicon its just a buzzword for coding to the metal which the ps4 ip already has
MS has modified the Xbox X GPU in such a way that high-frequency API calls are implemented DIRECTLY in the GPU's command processor. It's DX12 built in at the hardware level, not just a software method like DX12 on PC. In my previous comment I linked you interesting quotes from the Digital Foundry article in regards to that.
 