
|OT| Next-Gen PS5 & XSX |OT| Speculation/Analysis/Leaks Thread

LordOfChaos

Member
Mar 31, 2014
11,301
6,310
915
It would be a shame if 3rd party games are (ironically) held back by PCs
Maybe that would be for the best. In recent console generations it's always been the other way around: games are built to the console baseline, and PCs only get higher resolutions/textures/filtering to play with. PC hardware adapts far faster than consoles do, so it's preferable for the onus to be on PC hardware to catch up than for PCs to become the new anchor for seven years.

Theoretically, at least. Given we're just talking about storage, the difference should just be in load screens or slightly longer periods of streaming in assets while you're playing. And as for VRAM: when the PS4 launched with 8GB of unified GDDR5 (~5.5GB available to games), contemporary GPUs also had only 2-4GB, and it took a few more years into the generation before those VRAM limits started getting hit, by which point PC hardware had moved on anyway. So possibly, same old.
 
  • Thoughtful
Reactions: SonGoku

SonGoku

Gold Member
Aug 16, 2018
4,754
6,572
595


Why can't consoles go with this form factor to push 300W+ ?
It wouldn't be bigger than past phat consoles
 
  • Like
Reactions: CyberPanda

LordOfChaos

Member
Mar 31, 2014
11,301
6,310
915
300W consoles would cost $699 minimum, that's why.
Why do you think it would push up cost? Pushing the existing chips past their most efficient point, losing perf/watt but gaining total performance, would only cost whatever extra cooling is needed. Plus, being able to use chips that draw more power might make binning a net neutral, versus excluding chips that could clock that high.

I was always selfishly into the idea of ditching the optical drives and going balls out with heatsinks and fans so the same size and nearly the same BoM cost consoles could dissipate way more watts.
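The perf/watt trade-off above can be sketched with rough numbers. This is illustrative only: it assumes dynamic power scales roughly as voltage² × frequency, and that near the top of the voltage/frequency curve a clock bump needs a proportional voltage bump — not real silicon data.

```python
# Rough sketch of why pushing clocks past the sweet spot costs perf/watt.
# Assumes dynamic power ~ V^2 * f, and that a +10% clock needs ~+10% voltage
# near the limit. Illustrative numbers only, not measurements.
def relative_power(clock_scale: float, voltage_scale: float) -> float:
    return voltage_scale ** 2 * clock_scale

perf_gain = 1.10                          # +10% clock ~ +10% performance
power_gain = relative_power(1.10, 1.10)   # ~1.33x power draw
perf_per_watt = perf_gain / power_gain    # ~0.83x -- efficiency lost

print(f"power: {power_gain:.2f}x, perf/watt: {perf_per_watt:.2f}x")
```

Under those assumptions, a 10% clock push costs roughly a third more power, which is why it's mostly a cooling problem rather than a silicon-cost problem.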
 

Negotiator

Member
Jun 28, 2011
4,676
1,480
885
Why do you think it would push up cost? Pushing the existing chips past their most efficient point, losing perf/watt but gaining total performance, would only cost whatever extra cooling is needed. Plus, being able to use chips that draw more power might make binning a net neutral, versus excluding chips that could clock that high.

I was always selfishly into the idea of ditching the optical drives and going balls out with heatsinks and fans so the same size and nearly the same BoM cost consoles could dissipate way more watts.
In the photo above I see a (former) high-end discrete GPU (GTX 980), not an APU.

If you're going to push the thermal envelope to such a high degree, then APUs no longer make sense.

200-250W+ concentrated on a single BGA chip is just asking for trouble (failing solder joints, which led us to RROD/YLOD even with only ~100W GPU chips).

RSX was only 81W and the PS3 had a pretty beefy cooling system (3+2 heatpipes):




The Xbox 360 had it worse with RROD due to a combination of factors (worse cooling system + eDRAM generating extra heat):


Even if they didn't fail, 300W consoles would be insanely loud compared to 150-200W consoles.

It might be a good idea if you want to indirectly push people to the cloud, where they have industrial-grade cooling systems and you no longer have to worry about hardware ownership/heat/noise/electricity consumption. :)
 
  • Like
Reactions: Samsomite

SonGoku

Gold Member
Aug 16, 2018
4,754
6,572
595
300W consoles would cost $699 minimum, that's why.
Why though? All it would take is decent cooling
In the photo above I see a (former) high-end discrete GPU (GTX 980), not an APU.
I meant case and cooling
200-250W+ concentrated on a single BGA chip is just asking for trouble (failing solder joints, which led to us to RROD/YLOD while only having ~100W GPU chips).
300W+ single-die GPUs seem to be doing just fine, so why would an APU be any different?
And if it were really limiting, an MCM chiplet design could be used to fuse separate CPU/GPU dies
RSX was only 81W and the PS3 had a pretty beefy cooling system (3+2 heatpipes):
Technology has come a long way since then
Even if they didn't fail, 300W consoles would be insanely loud compared to 150-200W consoles.
With decent cooling it shouldn't be
 
  • Like
Reactions: CyberPanda

SonGoku

Gold Member
Aug 16, 2018
4,754
6,572
595
I was always selfishly into the idea of ditching the optical drives and going balls out with heatsinks and fans so the same size and nearly the same BoM cost consoles could dissipate way more watts.
Yeah, and include an external USB BD drive
 

Ar¢tos

Member
Oct 24, 2017
4,548
7,043
740
Krynn
Weirdly, I'm more excited for AMD Navi info than the next Xbox reveal. Probably because I don't expect any real specs from MS this early, just basic info like Sony gave, and with more detailed Navi info it's easier to extrapolate console specs, since we know what limitations the console form factor imposes.
 
  • Like
Reactions: TLZ

CyberPanda

Banned
Mar 4, 2019
11,955
20,078
1,190


The highest SKU launching will be named RX 5700 XT, 40CU, 9.5TFLOPS, 1900MHz max clocks, with 1750MHz being the typical gaming clock. Power delivery is through 2x 6-pin connectors.
It's legit. It appears Navi indeed sacrificed compute to gain more pixel pushing power, just like Digital Foundry predicted/anticipated. A Vega 64 is 12.5 TFLOPS, yet an RX 5700 is 8.5 TFLOPS at typical gaming clocks, and it's faster than a Vega 64.
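For reference, the TFLOPS figures being thrown around all come from the same back-of-envelope formula: CUs × 64 shaders per CU × 2 ops per clock (FMA) × clock speed. A quick sketch using the clocks quoted here (note the leaked 8.5/9.5 TF figures imply slightly lower average clocks than the listed ones, which is typical of leak-era numbers):

```python
# Peak FP32 throughput estimate for GCN/RDNA-style GPUs:
# TFLOPS = CUs * 64 shaders/CU * 2 ops (FMA) * clock.
def tflops(cus: int, clock_mhz: float) -> float:
    return cus * 64 * 2 * clock_mhz * 1e6 / 1e12

print(f"{tflops(64, 1536):.1f}")  # Vega 64 at ~1536MHz boost -> ~12.6 TF
print(f"{tflops(40, 1750):.1f}")  # 40CU at the leaked 1750MHz game clock -> ~9.0 TF
print(f"{tflops(40, 1900):.1f}")  # 40CU at the leaked 1900MHz max clock -> ~9.7 TF
```

Which is the point of the DF comparison: peak TFLOPS says little about delivered game performance across architectures.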
 

Aceofspades

is a 9 year old console warrior
Mar 31, 2015
3,518
6,577
805

What I get from this is that Vega is an unoptimized PoS arch. Navi is bringing AMD closer to Nvidia.
 
  • Like
Reactions: mckmas8808 and TLZ

TeamGhobad

Member
Oct 15, 2018
3,930
4,242
485

pawel86ck

Banned
Jan 27, 2018
2,126
3,232
400
Was this posted?


The image.

What exactly am I looking at here? Does this picture suggest RT acceleration will be done through CPU customisation? If that's the case, then that leak from yesterday (with a 14TF Arcturus GPU and a 12-core CPU) also mentioned an RT implementation via collaboration with Intel, and even a huge 1GB ultra-fast cache that could be used for such data transfer.
 

Sycomunkee

Member
Jan 21, 2019
3,205
3,789
435
My new expectation:
  • 9.75TF PS5 (4k)
  • 4.6 - 5TF Xbox S (1440p)
  • 9.75 - 10TF Xbox X (4k)
Which is not far off from my conservative prediction.
 
  • LOL
Reactions: CyberPanda

LordOfChaos

Member
Mar 31, 2014
11,301
6,310
915

Here's your news. The 12TF dream is dead! ;_;
Navi 12 is 40CUs; Navi 10 could be 48, 56, or up to 64 in theory. And then there's whatever 'Navi 10 lite' is.


Navi 10 will give us an idea of how much of RDNA's 1.25x IPC improvement applies in the real world versus Vega, at least. Though even then, there's all this talk about this being a midway part, with "real" next-gen RDNA coming later in the year ¯\_(ツ)_/¯
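If Navi 10 really does stretch to 48-64 CUs, the usual CUs × 128 ops × clock estimate gives a rough range. Carrying over the leaked 1750MHz typical game clock to the bigger configs is pure assumption on my part — larger dies would likely clock differently:

```python
# Speculative TFLOPS range for larger Navi parts, assuming the leaked
# 1750MHz typical game clock holds across configs (it probably wouldn't).
def navi_tflops(cus: int, clock_mhz: float = 1750) -> float:
    return cus * 64 * 2 * clock_mhz * 1e6 / 1e12

for cus in (48, 56, 64):
    print(f"{cus}CU -> {navi_tflops(cus):.1f} TF")
```

So even a hypothetical 56CU part at those clocks would clear 12TF on paper.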

It always did look surprising to me that "next gen" was on their roadmap for years, coming shortly after Navi, and then they rebranded Navi to RDNA. Maybe the old roadmaps are still right, but this was enough of an uplift to call it something new, while the full turnover comes later.

The Xbox One X was practically the same as the 580. There is no proof that they would add CUs
Not adding anything; the rumor was for Navi 10 'lite', and this is Navi 12
 
  • Like
Reactions: Fake and luca_29_bg