
Next-Gen PS5 & XSX |OT| Console tEch threaD


Aceofspades

Banned
I hope any difference between the two is either slight or none, indeed.

Do you remember the army of MS insiders who kept claiming that the PS5 was comparable to "Lockhart" because MS would use all its might and "unlimited" money to create the most powerful machine ever? I have been saying that no one should underestimate Sony, because Sony has never gimped its hardware; they are not Nintendo or Sega.
 
Not really sure why the price of the 2080ti is being brought up.
Because nVidia (and sadly AMD as well) has conditioned PCMR folks to believe that these prices are "normal". We can also thank famous PC YouTubers/streamers for normalizing the ever increasing prices.

Why not price gouge gullible consumers? Every cynical CEO would do the same.

Thank god consoles follow a different business model.


People may argue that console games are more expensive than PC ones, but that's not the case anymore (Nintendo is the only exception).

Digital sales and services have made console gaming more affordable than ever before. If they can subsidize semi-custom APU R&D as well, then more power to them!

Back in the PS2 era we didn't have PS+, PS Now, Game Pass etc.

Just a quick addition to this, as I don't know how/cannot edit my post.
Memory bandwidth is particularly important. The PS5/Scarlett could have 80 compute units and be capable of 20 TFLOPS, but if it's only got a 256-bit bus with 448GB/s of bandwidth, then it's just not gonna be fed with enough data for that extra compute to matter. Heck, even a 384-bit bus would struggle with that much compute.
There is concern online about rumours of a Navi 12 die, which has been theorised to be larger than Navi 10, because it may only have a 256-bit bus. People reckon that bus width would be insufficient to keep a larger Navi die fed with enough data, and therefore think it sits between Navi 10 and 14. Others think it might be a 2-stack HBM2E solution with a 2048-bit bus, as that also matches up with the datamined specs, but that's another story entirely.
Either way, memory bandwidth is hugely important for the consoles.
$399 -> 256-bit 16GB GDDR6
$499 -> 384-bit 24GB GDDR6

We're not gonna see $399 consoles this time around.
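To put rough numbers on the bus-width talk above, here's a quick back-of-envelope sketch in Python. The 14 Gbps GDDR6 data rate (the speed behind that 448GB/s figure), the 3.2 Gbps-per-pin HBM2E rate and the 20 TFLOPS GPU are just the numbers floated in this thread, not confirmed specs.

```python
# Back-of-envelope memory bandwidth maths for the configs being discussed.
# Data rates are assumptions (14 Gbps GDDR6, 3.2 Gbps/pin HBM2E);
# real next-gen parts may differ.

def gddr6_bandwidth_gbs(bus_width_bits, data_rate_gbps=14):
    """Peak bandwidth in GB/s: bus pins * per-pin rate / 8 bits per byte."""
    return bus_width_bits * data_rate_gbps / 8

def hbm2e_bandwidth_gbs(stacks, data_rate_gbps=3.2, bus_per_stack_bits=1024):
    """Each HBM2E stack exposes a 1024-bit interface."""
    return stacks * bus_per_stack_bits * data_rate_gbps / 8

for bus in (256, 320, 384):
    print(f"{bus}-bit GDDR6 @ 14 Gbps: {gddr6_bandwidth_gbs(bus):.0f} GB/s")

print(f"2-stack HBM2E @ 3.2 Gbps (2048-bit): {hbm2e_bandwidth_gbs(2):.0f} GB/s")

# Crude 'can the compute be fed?' ratio for the hypothetical 20 TFLOPS GPU:
# GB/s divided by GFLOP/s gives bytes of bandwidth per FLOP.
tflops = 20
for bus in (256, 384):
    ratio = gddr6_bandwidth_gbs(bus) / (tflops * 1000)
    print(f"{bus}-bit vs {tflops} TFLOPS: {ratio:.3f} bytes/FLOP")
```

So the oft-quoted 448GB/s is simply 256 bits × 14 Gbps ÷ 8, and a 384-bit bus at the same data rate lands at 672GB/s.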
 

JLMC469

Banned
I just wanna go back to the PS2/360/PS3 days. This gen has been the worst IMO. Creativity has been at a standstill for years.
 
Maybe Microsoft will be willing to take the hit to undercut Sony. Maybe.
And if the consoles have indeed comparable power, but one costs $100 more than the other...
If one company is willing to take a $200 loss ($600 BoM), then that will decide the fate of the next generation.
 

Aceofspades

Banned
If one company is willing to take a $200 loss ($600 BoM), then that will decide the fate of the next generation.

Forget power or price, PS5 will sell more than Xbox no matter what.
The fact that the PS5 is more powerful at a similar price point will be the icing on the cake, and the sales gap will just be wider.
 
Because nVidia (and sadly AMD as well) has conditioned PCMR folks to believe that these prices are "normal". We can also thank famous PC YouTubers/streamers for normalizing the ever increasing prices.

Why not price gouge gullible consumers? Every cynical CEO would do the same.

Thank god consoles follow a different business model.


Yes, exactly. But it should still be fairly obvious that retail graphics card prices have no bearing on the cost of silicon to console manufacturers.
It should be obvious that questions like "How can a console perform the same as a graphics card that costs $700 or more?" are meaningless.

People may argue that console games are more expensive than PC ones, but that's not the case anymore (Nintendo is the only exception).

Digital sales and services have made console gaming more affordable than ever before. If they can subsidize semi-custom APU R&D as well, then more power to them!

Back in the PS2 era we didn't have PS+, PS Now, Game Pass etc.
Indeed, Game Pass, PS+ and so on can subsidise costs for upcoming generations. I imagine they might want a slower transition to next gen than last gen, as it would keep existing revenue streams going to help alleviate the added costs. Then again, who can argue with 1 million units sold day one? It will be very interesting.
It's also interesting how hyped xCloud is, given that Sony have had their own streaming service for years now, and it's no joke. I look forward to seeing how PS Now and xCloud (and indeed Stadia) develop and compete against one another in the future.

$399 -> 256-bit 16GB GDDR6
$499 -> 384-bit 24GB GDDR6

We're not gonna see $399 consoles this time around.
It will be interesting to see what the configurations are in the end. The PS4 had twice as many async compute engines as the 7850 it was based on, and the PS4 Pro had FP16 (rapid packed math) from Vega despite being based on Polaris 10. I am quietly looking forward to what customisations each console manufacturer has brought to their device. I'm also very interested in what the memory bandwidth and volume will be, because that will set the standard for PC graphics cards for the next 7 years.
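As a side note for anyone sanity-checking the TFLOPS figures thrown around in this thread: peak FP32 throughput on these GCN/RDNA-style parts is just shader lanes × 2 FLOPs per clock × clock speed. A minimal sketch using the well-known PS4 and PS4 Pro figures (next-gen numbers are left out because they would be pure guesswork):

```python
# Peak FP32 throughput for a GCN/RDNA-style GPU:
#   CUs * 64 shader lanes per CU * 2 FLOPs per lane per clock (FMA) * clock (GHz)
def peak_tflops(cus, clock_ghz, lanes_per_cu=64, flops_per_lane=2):
    return cus * lanes_per_cu * flops_per_lane * clock_ghz / 1000

ps4 = peak_tflops(18, 0.800)        # 18 CUs @ 800 MHz -> ~1.84 TFLOPS FP32
ps4_pro = peak_tflops(36, 0.911)    # 36 CUs @ 911 MHz -> ~4.2 TFLOPS FP32
ps4_pro_fp16 = 2 * ps4_pro          # rapid packed math doubles the FP16 rate -> ~8.4 TFLOPS

print(f"PS4:     {ps4:.2f} TFLOPS FP32")
print(f"PS4 Pro: {ps4_pro:.2f} TFLOPS FP32, {ps4_pro_fp16:.1f} TFLOPS FP16 (RPM)")
```

That doubling is all rapid packed math buys you on paper: two FP16 operations packed into each FP32 lane per clock.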
 
I don't know how/cannot edit my post.
NeoMembers can't edit posts.
$499 -> 384-bit 24GB GDDR6
24GB of RAM is not happening.
Maybe Microsoft will be willing to take the hit to undercut Sony. Maybe.
And if the consoles have indeed comparable power, but one costs $100 more than the other...
If they do that, I think it will be even worse: Sony being $100 more expensive because it's more powerful, with more/better exclusives... it's just Sony.
 

bitbydeath

Gold Member
Stolen from "that other place". PS5 is slighty more powerful, slightly.


With both hitting 4K, the differences will be minimal.

Aside from Sony's first party, who are magicians.
 

DeepEnigma

Gold Member
Stolen from "that other place". PS5 is slighty more powerful, slightly.



Man, what webs these "insiders" weave.

To be fair, he had to say that at the time, as the thread went to hell and was unapproachable, haha. So he needed to say something to bring the thread back to a calm state, and he said it based on his own feeling.
 

Insane Metal

Gold Member
No, I can't really elaborate or go into more detail, but... yeah, my friend, who has been making games since the Dreamcast era and who is developing software for both next-gen consoles, said PS5 has the edge.

I am about to go to the airport soon. I will talk to you guys later.

Boom! 🤯

----

I expect both consoles to be very close, like vanilla XBone vs. vanilla PS4, but closer. I'll take my chances saying PS5 has the GPU edge while the NXB has a CPU edge.
 

Evilms

Banned
[image]
 

SlimySnake

Flashless at the Golden Globes
He was offered a job elsewhere and took the offer. Nothing sinister there.
Sure, that's possible, but if I was Phil and I thought my $500-600 console would beat Sony's cheaper console easily, and I had made a comment about setting a benchmark for consoles based on my lead architect's designs, I'd fire that guy too.

The only way Sony or MS ends up with a more powerful console is if one of them decides to increase clock speeds. I firmly believe they are both using the same CPU and GPU. And if both are using vapor chamber cooling, then how can one increase clock speeds more than the other? Just what the hell did Cerny do to get this slight advantage?

HBM2 comes to mind. It doesn't consume as much power, and its controllers and bus take up a lot less space on the die.
CPU cache. The Flute leak suggests Sony cut the L3 cache down from 32MB to 8MB, drastically reducing the size of the CPU complex: from 70mm² to 40mm².
That gives us somewhere between 30-50mm² of extra space. A smaller die means less power needed and a cooler chip. Maybe that's why Sony is able to push the clocks to 2.0 GHz like the Oberon leak suggests.
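Purely to put that speculation into numbers, here's what the rumoured figures from this post look like when plugged into the same peak-FLOPS arithmetic as in the earlier sketch. The 36 CU count is an illustrative assumption, not something from the leaks quoted here.

```python
# Pure speculation: rumoured numbers from this post fed into simple arithmetic.
# The 36 CU count is a placeholder assumption for illustration only.
area_freed_mm2 = 70 - 40             # claimed CPU-complex shrink from the L3 cut
print(f"Claimed area freed by the smaller L3: ~{area_freed_mm2} mm^2")

def peak_tflops(cus, clock_ghz):
    return cus * 64 * 2 * clock_ghz / 1000   # GCN/RDNA-style FP32 peak

for clock_ghz in (1.8, 2.0):                 # 2.0 GHz is the rumoured Oberon clock
    print(f"36 CUs @ {clock_ghz} GHz -> {peak_tflops(36, clock_ghz):.1f} TFLOPS (hypothetical)")
```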

The recent devkit patent leak also suggests a pretty bold design for the console, with lots of vents, so maybe they are willing to push clocks that high because the design of the console gives them more headroom. Their heatsink patent was pretty wild too: something about cooling from both sides of the heatsink.

Cerny might have gone a bit crazy with out-of-the-box thinking, whereas their lead architect followed the simple but effective route of increasing power and TFLOPS like they did with the X1X. All speculation of course, but it's possible. If Cerny left right now, you would see people freak out too.
 

bitbydeath

Gold Member
People talking about X being more powerful than Y based on what? Dev kits? This is stupid.

Klee said there were still two major revisions to come in terms of dev kits. It's supposed to be based on the spec sheet devs get from MS and Sony that outlines everything to expect of next-gen.

This isn't the 900p days, though, where resolution differences were instantly noticeable, so a slight difference means nothing third-party-wise.
 

TLZ

Banned
Am I the only one that doesn't give two rats about BC? Like, seriously, I've already played the games I want.
Well, it's not all about you, is it? There are many others who like it, and this is also the best route for preservation. If the PS5 can also go the extra mile and do better upscaling (or whatever it's called) plus other options, then even better.
 

TLZ

Banned
Thank you, I was waiting for someone to bring this up :) Common misconception (at least for PS4). More education:

Most people assume the PS4 GPU was a scaled-down 7870 desktop, which released in March 2012 and was only the 2nd fastest card AMD made (the fastest was the 7970). In fact, the PS4 was actually based on the mobile GPU variant called the 7970M, which initially released on the PC in April 2012. That card was the absolute fastest mobile card AMD produced, and it was in fact based on the full 7870 desktop silicon. If you look at the specs you will see that it is indeed virtually identical to the known PS4 GPU, save for the PS4 GPU being underclocked further, from 850 to 800MHz. Now even though the PS4 did not release until Nov 2013, that 7970M card was still the fastest mobile card AMD manufactured at the time of the PS4 release! In fact, if you look at the history, AMD did not make another faster mobile GPU until the R9 M295X, which didn't release until Nov 2014. That was essentially the mobile version of the 7970 desktop card, but it came too late to be included in the PS4 (and was also way too power-hungry to be in a console).

So contrary to popular belief, Sony included the absolute best and fastest card available at that time considering AMD's roadmap, the PS4 release schedule, and thermal considerations for the console. In fact, if you know about computer hardware and thermals, then you will know why the PS4 HAD to include the mobile version. In short, AMD GPUs at that time were not very power efficient. GPUs in a console typically need to be < 100W TDP in order to be practical given the thermal and cooling limitations of a console. The desktop 7970 GPU released with a TDP of 250W, which is absolutely not practical for a console. Even the desktop 7870 GPU had a TDP of 175W, which again is not practical given that the total TDP for the console is less than 150W. However, AMD was able to make a mobile variant of the 7870 with a TDP of only 75W, perfect for a console. Again, this was the absolute fastest card that was feasible up until the PS4 released in Nov 2013.
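To illustrate that power-budget argument with the numbers from the post above (the ~150W total and ~100W GPU figures are the poster's rules of thumb, not official specs):

```python
# The TDP argument from the post, as a simple filter. The budget numbers
# are the rough rules of thumb quoted above, not official specs.
CONSOLE_TOTAL_TDP_W = 150   # rough total power budget for a console
GPU_BUDGET_W = 100          # rough ceiling for the GPU's share

gpu_tdp_w = {
    "HD 7970 (desktop)": 250,
    "HD 7870 (desktop)": 175,
    "HD 7970M (mobile)": 75,
}

for gpu, tdp in gpu_tdp_w.items():
    verdict = "fits" if tdp <= GPU_BUDGET_W else "does not fit"
    print(f"{gpu}: {tdp} W -> {verdict} a ~{GPU_BUDGET_W} W console GPU budget "
          f"(total budget ~{CONSOLE_TOTAL_TDP_W} W)")
```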

People always want to talk about the PS4 being so "underpowered" at launch, but if you know the facts and what it takes to build a console from a hardware standpoint, Sony did the best they could given what AMD had available. But the focus for the PS4 (and Xbox One) wasn't on pure power, it was on developer ease and convenience. The real change was the move to x86 architecture on the CPU, and again Jaguar was the only thing AMD had available that could "fit" in a console form factor at that time (the AMD desktop parts were way too big and power-hungry).

However, AMD is NOT the same company today that it was in 2012/2013. They have a wide range of highly competitive CPUs and will be releasing a wide range of highly competitive GPUs in the coming year. Not just in raw performance, but more importantly in terms of efficiency.

Also, the priorities of the console manufacturers are different this time around. While the PS4 and Xbox One were mostly about removing barriers for developers, both Sony and Microsoft have made it clear that PS5 and Scarlett will be about pushing the boundaries of gaming. Having announced a GPU with 8K, 120fps, and ray-tracing capabilities says that they will definitely be significantly more powerful than the current RX 5700. If you look back, the PS4 and Xbox One GPUs didn't really offer anything "new" at the time (except for the ACE units on the PS4). There were no marketing buzzwords to highlight advanced features, and in fact all Sony could say was that it was a "supercharged" PC.

Microsoft focused on software and UX features for Xbox One as part of their defocus on gaming initially. They paid the price and are hell bent on making sure they are technically competitive from day one with Scarlett. The difference in power between the two will be extremely small, probably smaller than any other previous generation.

So again if you think that the next consoles will ship with AMD Navi GPUs equivalent to the current RX 5700, then you are surely mistaken and you will see in due time!
So you can say that their partnership with console makers Sony and MS paid dividends and helped propel their research and therefore tech this time around.
 
Correct me if I'm wrong; I'm not a native English speaker, so excuse me if I got it wrong.
Klee said there were still two major revisions to come in terms of dev kits.
That makes my point: why are they arguing over which is better than the other? They also do that to mislead the competition in terms of leaks.
It's supposed to be based on the spec sheet devs get from MS and Sony that outlines everything to expect of next-gen.
Isn't that what a dev kit is for?
This isn't the 900p days, though, where resolution differences were instantly noticeable, so a slight difference means nothing third-party-wise.
We are not talking resolution anymore, since both of them are 8K capable; we are talking details.
 

CrustyBritches

Gold Member
Lol, people acting like Sony and MS actually made significant contributions to AMD's designs. Zen was designed under Jim Keller, while Navi was made under Raja Koduri. Keller left for Tesla in 2015, while Raja left for Intel in 2017, and as far as I know, both are currently working for Intel.

Surely this is a sign Zen and Navi would do poorly, with their chief architects leaving AMD in such an untimely manner. /s
 

bitbydeath

Gold Member
Correct me if I'm wrong; I'm not a native English speaker, so excuse me if I got it wrong.

That makes my point: why are they arguing over which is better than the other? They also do that to mislead the competition in terms of leaks.

Isn't that what a dev kit is for?

The paper is supposed to be based on the final retail build so devs know what to aim for long-term, as opposed to building against a dev kit which will later be outdated.

We are not talking resolution anymore, since both of them are 8K capable; we are talking details.

Maybe it's just me, but I'm still expecting a very minimal difference even when it comes to details.
 

Mass Shift

Member
Ehhhh, not convinced yet. Just too many assumptions being made. Based on a leaker who says he "was told" that the PS5 was slightly better than Scarlett.

And based on dev kits. 🤷
 
The paper is supposed to be based on the final retail build so devs know what to aim for long-term, as opposed to building against a dev kit which will later be outdated.
LOL WHAT?!?!
I'm sorry, it doesn't work like that at all. You know why early-gen games look worse compared to late last-cycle games? That's why.
Maybe it's just me, but I'm still expecting a very minimal difference even when it comes to details.
The answer is: it depends. We just have to wait and see.
 

Fake

Member
Lol, people acting like Sony and MS actually made significant contributions to AMD's designs.
Dunno if this part of your post was sarcasm, but the new Surface has a 'Ryzen Surface Edition'. AMD can indeed get mutual benefit from cooperation like this.
The new Microsoft Surface Laptop 3 extends the long-standing collaboration between Microsoft and AMD from the world of Xbox console gaming to the PC. Just as we have done with our Xbox collaboration, AMD and Microsoft set out several years ago with a shared vision to bring the best of both companies together to revolutionize the laptop.
They probably help each other.
 

CrustyBritches

Gold Member
Dunno if this part of your post was sarcasm, but the new Surface has a 'Ryzen Surface Edition'. AMD can indeed get mutual benefit from cooperation like this.
They're simply custom SoC contracts using AMD IP blocks. What Sony and MS do is no different than what Zhongshan Subor did with the Subor Z+.

Let's not ignore the example of the Chief Architects of the very tech MS and Sony are using leaving AMD for Intel, and assume the conspiracy theory that that's a sign of impending doom is legitimate. Can we agree on this?

Ehhhh, not convinced yet. Just too many assumptions being made. Based on a leaker who says he "was told" that the PS5 was slightly better than Scarlett.

And based on dev kits. 🤷
From the same site that has contradicting insiders: "Klee", "Matt", "Klobrille", "Benji". Komachi and Apisak are the only ones who have produced anything with actual data tied to it. Well, you could argue that "Benji" was talking about Vega-based dev kits, which could be accurate in that case. *edit* First 169 results for Fire Strike, Ryzen 2600+Vega 64 are "20K+". 550 results over 20K for Ryzen 3600+Vega 64.
 

Fake

Member
They're simply custom SoC contracts using AMD IP blocks. What Sony and MS do is no different than what Zhongshan Subor did with the Subor Z+.

Let's not ignore the example of the Chief Architects of the very tech MS and Sony are using leaving AMD for Intel, and assume the conspiracy theory that that's a sign of impending doom is legitimate. Can we agree on this?
The cooperation can go beyond the CPU and GPU, mind you. Sony, for example, has some of the best colour calibration around (thanks to the Sony camera division), and they make heavy use of it in the PlayStation division for HDR purposes. I'm still looking for the AMD blog post where they spoke about the benefits of the cooperation with Sony on the PS4 and what it could mean for the notebook division.
 

TLZ

Banned
Lol, people acting like Sony and MS actually made significant contributions to AMD's designs. Zen was designed under Jim Keller, while Navi was made under Raja Koduri. Keller left for Tesla in 2015, while Raja left for Intel in 2017, and as far as I know, both are currently working for Intel.

Surely this is a sign Zen and Navi would do poorly, with their chief architects leaving AMD in such an untimely manner. /s
Didn't Lisa Su say onstage that they worked with both Sony and MS for a long time on Navi? As in, it was a joint research effort?
 
Didn't Lisa Su say onstage that they worked with both Sony and MS for a long time on Navi? As in, it was a joint research effort?
I think what CrustyBritches meant to say is that AMD takes notes from their partners, but AMD is going to do what it sees fit money-wise. AMD is not going to make something specific for a specific partner; they take notes from different partners and make what suits everyone.
 

Lunatic_Gamer

Gold Member
Thank you, I was waiting for someone to bring this up :) Common misconception (at least for PS4). More education:

Most people assume the PS4 GPU was a scaled-down 7870 desktop, which released in March 2012 and was only the 2nd fastest card AMD made (the fastest was the 7970). In fact, the PS4 was actually based on the mobile GPU variant called the 7970M, which initially released on the PC in April 2012. That card was the absolute fastest mobile card AMD produced, and it was in fact based on the full 7870 desktop silicon. If you look at the specs you will see that it is indeed virtually identical to the known PS4 GPU, save for the PS4 GPU being underclocked further, from 850 to 800MHz. Now even though the PS4 did not release until Nov 2013, that 7970M card was still the fastest mobile card AMD manufactured at the time of the PS4 release! In fact, if you look at the history, AMD did not make another faster mobile GPU until the R9 M295X, which didn't release until Nov 2014. That was essentially the mobile version of the 7970 desktop card, but it came too late to be included in the PS4 (and was also way too power-hungry to be in a console).

So contrary to popular belief, Sony included the absolute best and fastest card available at that time considering AMD's roadmap, the PS4 release schedule, and thermal considerations for the console. In fact, if you know about computer hardware and thermals, then you will know why the PS4 HAD to include the mobile version. In short, AMD GPUs at that time were not very power efficient. GPUs in a console typically need to be < 100W TDP in order to be practical given the thermal and cooling limitations of a console. The desktop 7970 GPU released with a TDP of 250W, which is absolutely not practical for a console. Even the desktop 7870 GPU had a TDP of 175W, which again is not practical given that the total TDP for the console is less than 150W. However, AMD was able to make a mobile variant of the 7870 with a TDP of only 75W, perfect for a console. Again, this was the absolute fastest card that was feasible up until the PS4 released in Nov 2013.

People always want to talk about the PS4 being so "underpowered" at launch, but if you know the facts and what it takes to build a console from a hardware standpoint, Sony did the best they could given what AMD had available. But the focus for the PS4 (and Xbox One) wasn't on pure power, it was on developer ease and convenience. The real change was the move to x86 architecture on the CPU, and again Jaguar was the only thing AMD had available that could "fit" in a console form factor at that time (the AMD desktop parts were way too big and power-hungry).

However, AMD is NOT the same company today that it was in 2012/2013. They have a wide range of highly competitive CPUs and will be releasing a wide range of highly competitive GPUs in the coming year. Not just in raw performance, but more importantly in terms of efficiency.

Also, the priorities of the console manufacturers are different this time around. While the PS4 and Xbox One were mostly about removing barriers for developers, both Sony and Microsoft have made it clear that PS5 and Scarlett will be about pushing the boundaries of gaming. Having announced a GPU with 8K, 120fps, and ray-tracing capabilities says that they will definitely be significantly more powerful than the current RX 5700. If you look back, the PS4 and Xbox One GPUs didn't really offer anything "new" at the time (except for the ACE units on the PS4). There were no marketing buzzwords to highlight advanced features, and in fact all Sony could say was that it was a "supercharged" PC.

Microsoft focused on software and UX features for Xbox One as part of their defocus on gaming initially. They paid the price and are hell bent on making sure they are technically competitive from day one with Scarlett. The difference in power between the two will be extremely small, probably smaller than any other previous generation.

So again if you think that the next consoles will ship with AMD Navi GPUs equivalent to the current RX 5700, then you are surely mistaken and you will see in due time!

A wise man once said, "Knowledge is worth more than gold." Great post, by the way.
 

CrustyBritches

Gold Member
Didn't Lisa Su say onstage that they worked with both Sony and MS for a long time on Navi? As in, it was a joint research effort?
The CPU and GPU they're using were designed under chief architects who left AMD and are now working for Intel, along with John Sell, the chief architect of Xbox Scarlett, for fuck's sake. Does that mean impending doom for AMD and Xbox? Doesn't that sound like a dumb conspiracy theory?

Reality:
VentureBeat: I wanted to tap your brain on semiconductor design in general. Why did you decide to make the move to Intel at this time?


Sell:
Well, Intel presented me with a really exciting opportunity. I was not, honestly, out looking. I’m very excited, very bullish on Microsoft and the things I was doing there. It is true that my work on the coming Xbox was mostly done. To be leading something as important as security at a company as important as Intel is a pretty exciting opportunity.
 
Sony was the reason GCN 1.1 got 8 ACEs vs only 2 ACEs on GCN 1.0. Some people speculate it's because Cerny tried to replicate Cell's 8-SPU structure. Sony wanted a spiritual successor to Cell, and it looks like they got what they wanted.

MS also contributed quite a few customizations, from Move Engines to eSRAM. I don't think you'll find this stuff in bog-standard APUs.

Regarding next-gen secret sauce:


Maybe Sony contributed the RDNA2/RT secret sauce, we'll see.

There's a reason Sony/MS APUs have their company logo on top of the APU die, instead of the AMD logo. Nobody has noticed that?

On the other hand, Tegra X1 has an nVidia logo, not a Nintendo one. Huge difference.
 

demigod

Member
They're simply custom SoC contracts using AMD IP blocks. What Sony and MS do is no different than what Zhongshan Subor did with the Subor Z+.

Let's not ignore the example of the Chief Architects of the very tech MS and Sony are using leaving AMD for Intel, and assume the conspiracy theory that that's a sign of impending doom is legitimate. Can we agree on this?

Raja leaving AMD was the greatest thing that happened to AMD.
 