
Next-Gen PS5 & XSX |OT| Console tEch threaD

Status
Not open for further replies.

LiquidRex

Member
Honestly, I saw this coming a long time ago. Coreteks, RGT and other PC-centric channels all predicted/leaked that RDNA 2 GPUs were going to hit some serious clock speeds, based on what was revealed about the PS5's GPU.

In almost all his recent videos about RDNA 2, RGT mentioned that the discrete RDNA 2 GPUs would hit "console level clock speeds"; that was the quote he used from his "sources", and it was obvious they were referring to the PS5. However, he also mentioned that AMD were having issues running the clocks "too high", not because of overheating but because of a "logic breakdown", a drawback of the RDNA architecture. I don't know exactly what that means, but Mark Cerny said the same thing about why the PS5's GPU wasn't clocked higher than 2.23 GHz. Maybe AMD have resolved the issue? Only time will tell.
Redgamingtech knocks another one out of the park. 💪👍
 

Nickolaidas

Member
Redgamingtech knocks another one out of the park. 💪👍
I'm curious ... if RDNA2 allows such high clocking, why didn't MS do the same with the Series X, which would produce even MORE TFs at higher clock speeds, especially since it's 'safe' to do so? Couldn't they invest in a solid cooling solution? And if that high clocking on the PS5 is 'normal' for RDNA2, why DOES it need a custom-made high-quality cooling solution?

Not a tech guy, I'm just wondering.
 

Gediminas

Banned
I'm curious ... if RDNA2 allows such high clocking, why didn't MS do the same with the Series X, which would produce even MORE TFs at higher clock speeds, especially since it's 'safe' to do so? Couldn't they invest in a solid cooling solution? And if that high clocking on the PS5 is 'normal' for RDNA2, why DOES it need a custom-made high-quality cooling solution?

Not a tech guy, I'm just wondering.
Have you seen the size of the PS5? :messenger_tears_of_joy:

Now imagine an Xbox with 2.23 GHz plus a higher-clocked CPU, RAM and SSD :messenger_tears_of_joy:
 
Last edited:

LiquidRex

Member
I'm curious ... if RDNA2 allows such high clocking, why didn't MS do the same with the Series X, which would produce even MORE TFs at higher clock speeds, especially since it's 'safe' to do so? Couldn't they invest in a solid cooling solution? And if that high clocking on the PS5 is 'normal' for RDNA2, why DOES it need a custom-made high-quality cooling solution?

Not a tech guy, I'm just wondering.
From my understanding, and I'm a newb... Sony chose to go narrow and fast (fewer CUs but a higher clock) for throughput/next to no bottlenecks.
Microsoft have gone the traditional PC brute-force route... wide and slow (more CUs but a lower clock).

I think at the moment there's more to come with regards to the RDNA features in the PS5 compared to the XSX... What I mean is Microsoft have confirmed most of the RDNA features in the XSX, yet Sony have been mostly silent on the subject... maybe due to NDAs 🤔 If so, we'll be waiting until after the 28th of October for Sony to reveal all of the console's RDNA 2 features and beyond. 🤔
 
From my understanding, and I'm a newb... Sony chose to go narrow and fast (fewer CUs but a higher clock) for throughput/next to no bottlenecks.
Microsoft have gone the traditional PC brute-force route... wide and slow (more CUs but a lower clock).

I think at the moment there's more to come with regards to the RDNA features in the PS5 compared to the XSX... What I mean is Microsoft have confirmed most of the RDNA features in the XSX, yet Sony have been mostly silent on the subject... maybe due to NDAs 🤔 If so, we'll be waiting until after the 28th of October for Sony to reveal all of the console's RDNA 2 features and beyond. 🤔

We started this year arguing over whether the PS5 would even be RDNA 1. Now imagine what's going to happen if what you said turns out to be true.
 
And the way I see it, it had space for even more cookies.
Cooling systems like the XSX's are good only up to a point. I think the frequency is simply calculated from the capabilities of the system. Earlier I heard that the GPU was clocked at 1623 MHz for the same known CU count, and I tend to believe it. The XSX's cooling system is quite effective, but it is unlikely to be designed to quickly remove heat from the chip, as in the PS5; rather, it dissipates heat.
 

Aceofspades

Banned
We started this year arguing over whether the PS5 would even be RDNA 1. Now imagine what's going to happen if what you said turns out to be true.

If anything, the recent AMD leak proves that the PS5 is using the same tech as AMD's latest cards. Something Cerny alluded to in his presentation when he said something like: if you see similar tech in cards releasing at the same time, it doesn't mean we simply incorporated PC tech, it means our collaboration with AMD has succeeded... etc.
 

geordiemp

Member
Cooling systems like the XSX's are good only up to a point. I think the frequency is simply calculated from the capabilities of the system. Earlier I heard that the GPU was clocked at 1623 MHz for the same known CU count, and I tend to believe it. The XSX's cooling system is quite effective, but it is unlikely to be designed to quickly remove heat from the chip, as in the PS5; rather, it dissipates heat.

Or there is another reason than just saving on cooling. But it is possible. Just speculating on possibilities...

It could also be a number of other things: N7 is a node, and RDNA 2 is AMD's logic blocks on that node. It MAY be that Microsoft used fewer EUV critical layers to keep die cost down, and so pushed frequency less. Not every N7 chip will use the same process steps.

Or the propagation logic could be different, as the XSX has a much larger shader array.

Also, none of the rumoured RDNA 2 cards has a CU count similar to the XSX's... it's either PS5-sized or twice PS5-sized. Hmm.

Could be all or none of these. Most likely MS designed a server card and a low-power card with two boards, and just kept the same design for the XSX, but the propagation limits and process steps could be different.
 
Last edited:

Godfavor

Member
This is no longer true with RDNA 1.0 and Turing tflops. The 2070 is 9.1 tflops at average game clocks and the 5700 XT is 9.3 tflops at average game clocks. Both offer nearly identical performance.

Nvidia is no longer underreporting their tflops numbers. But their tflops are not offering 1:1 performance anymore. The 2080 is 11 tflops, but the 3080 is 30 tflops and offers only 2x the performance. Adding more shader cores is clearly not giving linear performance increases.

Hm, I believe big Navi will be closer in performance to a 3090 at 4K, but at 8K the 3090 will be better.

There might be issues utilizing all those cores on the 3090, at 4K at least.
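Taking the post's figures at face value (11 TF 2080, 30 TF 3080, roughly 2x the performance; none of these independently verified here), the per-teraflop scaling works out as:

```python
# If 30 TF delivers only 2x the performance of 11 TF, each Ampere
# teraflop is doing a fraction of the work of a Turing teraflop.
tf_2080, tf_3080 = 11, 30
perf_ratio = 2.0  # the post's "only 2x more performance" claim

per_tf = perf_ratio / (tf_3080 / tf_2080)
print(round(per_tf, 2))  # -> 0.73
```

On those numbers, each Ampere teraflop delivers roughly 73% of the gaming work of a Turing teraflop, which is the "not 1:1 anymore" point the post is making.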
 
The magic sauce that improves the PS5, but Sony want people to believe it is weaker, is back again?

Sony wants people to believe the PS5 is weaker than it actually is?

I don't get it.

Also, on the sauce stuff: both consoles have their own.



"We are so honoured and proud to be part of Sony's next-generation PlayStation," Su told Jim Cramer. "This has been a really long-term partnership with them. We love gaming. We think gaming is a really good secular growth market. What we have done with Sony is really architect something for their application, for their special sauce. It's a great honor for us. We're really excited about what the next generation PlayStation will do. And happy to be a part of it."
 
Last edited:

LiquidRex

Member
That talk derived from the Vega architecture (compute focused) vs the Pascal architecture, engineered primarily to run games...

So Pascal had compressed textures and lower colour fidelity in many games, but at 1080p and below you could not see it clearly. Nvidia traded image quality for lower TFLOPS and lower power draw. AMD did not compress anything, so although its Vega and Polaris cards had higher TF, they drew more power because the IQ and colours were better...

Vega was compute focused, yet too many games were not, so in the majority of games Nvidia won. Not to mention certain features which rather crippled AMD performance, like tessellation and GameWorks. Polaris was AMD's first step down the pure gaming route: at 5.1 and 6.1 TF, the RX 570 and 580 were great cards against the 1050 Ti and 1060. Yet if AMD had scaled Polaris up, the power draw would have skyrocketed at 14nm... So they went with Vega and HBM for the high end, but that was not a gaming architecture; much of the time 50% of Vega's raw compute power sat idle. The cards had lots of power, just never utilized, due to the non-gaming architecture, or rather devs not prioritizing compute-focused games...

Which brings us to RDNA 1: an improvement on Polaris and a partial departure from GCN at a lower node. RDNA 1 was customized as a gaming GPU and it's ultra fast... On IPC, RDNA 1 beat Turing easily if you look at performance per watt. In essence, if AMD had developed a larger chip than the 251 mm² 5700 XT, it would most certainly have been as fast as or faster than the 2080 Ti. A 251 mm² 5700 XT beats a 2070 and is on par with a 545 mm² 2070 Super. This means that if the 5700 XT were only 400 mm², the chip would be packed enough to beat the 2080 Ti on RDNA 1, yet the 2080 Ti is a 754 mm² chip...

In short, RDNA has better IPC and performance per watt than Turing by a large margin. Now the RDNA 2 team have taken all of this to a new level: performance per watt is said to be up 60% over RDNA 1, the IPC gains are about 20% over RDNA 1, the clock speeds are insanely high, the CUs have improved massively in instructions per CU, and the node is super efficient, hence the high clock speeds... And people are forgetting AMD has not stopped there; they implemented their own form of HBCC straight on the board with 128 MB of cache. AMD are not messing around: they are pushing things this gen and going for a complete kill with MGPUs on RDNA 3 in the future...

Now, people are impatient, they can't wait. AMD is not going to let Sony announce a feature set of the architecture yet, at least not one both companies benefited from through their collaboration, even if it was funded by Sony for the PS5. That's how collaborations work: I scratch your back, you scratch mine. AMD has to sell GPUs too, and they have always said their GPUs would hit shelves before the consoles... After the 28th of October I'm sure all the breakdowns people want will be a go... Sometimes it's good to keep something in the oven longer, keep quiet and work out your kinks... Cerny said there were some logic issues with clocking the PS5 over 2.23 GHz; I'm almost certain that has been resolved, that's what time gives you... So you will see AMD GPUs clock pretty high, above the PS5's. Not sure it would make sense for Sony to boost the PS5 clock even more; I guess it depends on how formidable their cooling solution is, but it's a possibility. I still think there are some nice perks we have not heard about the PS5, hardware related and software related (the OS especially)... There is still time to announce all of that, more than enough time... I told folk AMD would hit it out of the park; they are coming for both the CPU and GPU markets this holiday... The best kit you can buy will be an AMD CPU + AMD GPU, the most forward-moving pieces of architecture in both realms later this year... People who were wishing for an Intel + NV combo for consoles have no idea how power hungry, expensive and limited that would have been...
It's possible Sony have increased the clock; however, they would have to change the PS5 technical spec sheet... They are advertising the console as 10.28 TF, and with preorders out, can they legally change the specs of the console? 🤔
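The die-size argument in the quoted post can be sanity-checked with a quick ratio (the area figures are the poster's, not verified here):

```python
# 5700 XT (Navi 10, RDNA 1) vs 2070 Super (TU104, Turing),
# per the quoted post: similar gaming performance, very different die sizes.
area_5700xt_mm2 = 251
area_2070s_mm2 = 545

ratio = area_2070s_mm2 / area_5700xt_mm2
print(round(ratio, 2))  # -> 2.17
```

On those numbers, Turing needs roughly 2.17x the silicon for comparable gaming performance, which is the crux of the post's scaling argument.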
 

Godfavor

Member
Exactly, we don't know how high RDNA 2 clocks can go before it's considered an overclock. It could very well reach 2.5 GHz without an overclock, and if that's the case, one more narrative can be put to rest.

The clock-rate-to-power-consumption ratio is somewhat limited around 2.23 GHz per Cerny's talk, as a 2% reduction in frequency gives a 10% power reduction.

That does not tell us how the GPU feeds off this clock rate and translates it into performance, though. But having high clocks in PC GPUs is good. I suspect the chip will lose efficiency past the 2.1 to 2.2 GHz range.
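As a purely illustrative sketch (the exponent is fitted to the two quoted numbers and nothing more): if power near the ceiling scales roughly as clock^k, Cerny's "drop frequency 2%, save 10% power" figure implies a steep exponent:

```python
import math

# Fit power ~ frequency^k to the quoted trade-off:
# 0.98^k = 0.90  =>  k = log(0.90) / log(0.98)
k = math.log(0.90) / math.log(0.98)
print(round(k, 1))  # -> 5.2
```

That steepness is why shaving a sliver off the clock buys so much thermal headroom at the top of the curve.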
 
Exactly, we don't know how high RDNA 2 clocks can go before it's considered an overclock. It could very well reach 2.5 GHz without an overclock, and if that's the case, one more narrative can be put to rest.

It seems like they comfortably clock high, and the PS5's issue was probably more heat related than an inability to clock much higher due to limitations in the chip logic. It is, after all, a small box compared to a PC tower.

How the tables have turned here once reality caught up.

The "full fat RDNA2" XSX doesn't seem to resemble RDNA 2 very much. I am still looking for the 56 CU RDNA 2 GPU clocked at less than 2000 MHz.
 

kyliethicc

Member
I'm curious ... if RDNA2 allows such high clocking, why didn't MS do the same with the Series X, which would produce even MORE TFs at higher clock speeds, especially since it's 'safe' to do so? Couldn't they invest in a solid cooling solution? And if that high clocking on the PS5 is 'normal' for RDNA2, why DOES it need a custom-made high-quality cooling solution?

Not a tech guy, I'm just wondering.
They clearly just wanted 12 TFLOPS. Phil even said the goal was to double the One X's 6 TF.

They picked 56 CUs for their server chip size to make sure they could run 4 Xbox One games at once on the same chip for xCloud servers (14×4=56). This way they could disable 2 of every 14 CUs (times 4) to replicate the Xbox One and run 4 games at once per chip for xCloud.

So once they arrived at 56 CUs for their chip, they disabled 4 CUs on the console dies for yields.
And then they had 52 active CUs. Do the math to get to the magic 12 TFLOPs and that's how they got to 1.825 GHz.

Notice how the Series S has the exact clock speed, 1.565 GHz, needed for 20 CUs to get to 4.0 TFLOPs?
And how the One X has the exact random clock, 1.172 GHz, to get to exactly 6.0 TF with 40 CUs?

They could have run the One X higher, the Series X higher, and the Series S higher... but they don't care.
They figure out the CU count and set the clock speed just to hit the magic TFLOP goal.

It's not about performance or gameplay. It's about the FLOPS. Marketing.
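The "do the math" step above can be sketched with the standard FP32 throughput formula for AMD GPUs (CUs × 64 shaders per CU × 2 ops per clock × clock speed), plugging in the figures from the post:

```python
# FP32 throughput for GCN/RDNA-class GPUs:
# TFLOPS = CUs * 64 shaders/CU * 2 ops/clock * clock (GHz) / 1000
def tflops(cus: int, ghz: float) -> float:
    return cus * 64 * 2 * ghz / 1000

print(round(tflops(52, 1.825), 2))  # Series X -> 12.15
print(round(tflops(20, 1.565), 2))  # Series S -> 4.01
print(round(tflops(40, 1.172), 2))  # One X    -> 6.0
print(round(tflops(36, 2.230), 2))  # PS5      -> 10.28
```

The Series S lands at 4.01 rather than a flat 4.0, but the pattern of round-number targets the post describes holds across the lineup.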
 
Last edited:

Rea

Member
They clearly just wanted 12 TFLOPS. Phil even said the goal was to double the One X's 6 TF.

They picked 56 CUs for their server chip size to make sure they could run at least 4 Xbox One games at once on the same chip for xCloud servers (14×4=56).

This way they could have 4× 14 CUs and disable 2 of every 14 (times 4) to replicate the One and run 4 games at once per chip for xCloud.

So once they arrived at 56 CUs for their chip, they disabled 4 CUs on the console dies for yields.

And then they had 52 active CUs. Do the math to get to the magic 12 TFLOPs and that's how they got to 1.825 GHz.

Notice how the Series S has the exact clock speed, 1.565 GHz, needed for 20 CUs to get to 4.0 TFLOPs?
And how the One X has the exact random clock, 1.172 GHz, to get to exactly 6.0 TF with 40 CUs?

They could have run the One X higher, the Series X higher, and the Series S higher... but they don't care.
They figure out the CU count and set the clock speed just to hit the magic TFLOP goal.

It's not about performance or gameplay. It's about the FLOPS. Marketing.
If it is real, it will be a waste for Microsoft to leave so much performance on the table just for the sake of TFLOPS and marketing POWA bullshit.
 

kyliethicc

Member
If it is real, it will be a waste for Microsoft to leave so much performance on the table just for the sake of TFLOPS and marketing POWA bullshit.
Well, it's undeniable they could have run the GPUs in the Series X/S and the One X at higher clocks. They would just have had to up the power supply, and maybe make bigger boxes.

Xbox clearly picked the clocks for FLOPs, but also didn't want any extra performance, because it could make their consoles bigger or louder as well.
 
Last edited:

Rea

Member
Well, it's undeniable they could have run the GPUs in the Series X/S and the One X at higher clocks. They would just have had to up the power supply, and maybe make bigger boxes.

Xbox clearly picked the clocks for FLOPs, but also didn't want any extra performance, because it could make their consoles bigger or louder as well.
Yeah, and it would probably cost more than $499 for the SX too.
 

You see what happens when you reduce those Mailbox 📮 numbers. 😜😜😜
This is what I expect going into next gen: games will be around the same size as they are today, but for different reasons. Duplicate data on disk will be replaced by higher-quality assets and textures in next-gen games. At least, that's how I think it will play out once we start seeing sequels to current-gen games that are the same size as their predecessors despite being next-gen-only titles.
 
D

Deleted member 775630

Unconfirmed Member
The design of the Xbox Series X hardly shows elegance over efficiency. They literally made a plain boring box to put all the cookies they had made inside. And the way I see it, it had space for even more cookies. If they wanted to put in more cookies, they would.
The design might not be your cup of tea, but creating a more powerful system and putting it in a box that is, volume wise, 35% smaller than the PS5's is amazing engineering.
 

LiquidRex

Member
What's this I'm seeing about the PS5 not shipping with an HDMI 2.1 cable while the XSX will? 🤔

Anyone heard any different... or is it pure bollocks? 🤔
 
Last edited:

Godfavor

Member
Well it's undeniable they could have run the GPUs from the Series X/S and the One X at higher clocks. They just would have had to up the power supply, maybe make bigger boxes.

Xbox clearly pick the clocks for FLOPs, but also don't want any extra performance because it could make their consoles bigger, or louder, as well.

They could just add a boost mode like the PS5's with a BIOS update, without changing the power supply. It would make the system more efficient, but certainly louder.
 

hemo memo

Gold Member
What's this I'm seeing about the PS5 not shipping with an HDMI 2.1 cable while the XSX will? 🤔

Anyone heard any different... or is it pure bollocks? 🤔

On one hand, FUD is spreading at a rapid pace; on the other, Sony is all over the place, and going cheap wouldn't be surprising.
Never mind.
 
Last edited:

TrippleA345

Member
Edit: No market has any supply to pick up. Thanks to @LokusAbriss

I just found out that you can still buy the Xbox Series S and the Xbox Series X in Germany at some of the biggest retailers. I stopped checking after these two:

mediamarkt
saturn

 
Last edited:
They could just add a boost mode like the PS5's with a BIOS update, without changing the power supply.
No, they can't. Their chip design is completely different. If they did turbo boost, they would be at great risk of burning out the console due to power surges. Or do you think turbo boost comes out of thin air? It requires a higher supply voltage and an increased power package.
 

ZehDon

Member
I'm curious ... if RDNA2 allows such high clocking, why didn't MS do the same with the Series X, which would produce even MORE TFs at higher clock speeds, especially since it's 'safe' to do so? Couldn't they invest in a solid cooling solution? And if that high clocking on the PS5 is 'normal' for RDNA2, why DOES it need a custom-made high-quality cooling solution?

Not a tech guy, I'm just wondering.
Of all the companies in the game, Microsoft learned the hardest lesson about cooling. The Xbox 360's RROD was a billion-dollar problem stemming from thermal issues (solder-joint failures caused by the internal heat of the components, if memory serves). They'll push the hardware, but only up to a point. I suspect cooler hardware with a wider, slower pipe will be Microsoft's strategy moving forward, because it ensures heat stays under control. The PS5 needs close to 40% more case volume to reach the GPU clock speeds it has, and it's still about 20% behind the Xbox Series X in theoretical performance. Cerny knows his stuff, no question, but Microsoft have seen what happens when you get it wrong.
 