
IMO, MS locking the Series X at sustained clock speeds is a mistake that should be corrected

longdi

Banned
I think the SX design has more to offer, but by locking it at only 1.8 GHz, it seems like MS is selling it short.
Look at the RX 6800, the closest GPU to the SX.

The RX 6800 runs Metro at 1440p ultra at around 2.25 GHz and takes only 230 W. It generates 85 fps and beats the 2080 Ti FE.
From reviews, a 7% overclock on the 6800 gives about a 3% fps improvement.
Working backwards, 1.8 GHz to 2.25 GHz is a 25% increase. So if we capped the RX 6800 at 1.8 GHz, it would lose 10-12% in perf, so Metro at 1440p might lose ~10 fps. 🤷‍♀️

Let's see if any reviewers test the 6800/6800 XT with capped clocks. Please share if you come across them.
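As a sanity check, the working-backwards estimate above can be sketched in a few lines. All inputs are the post's own figures, and the 3/7 elasticity is an assumption derived from the "7% OC gives ~3% fps" observation, not measured data:

```python
# Back-of-envelope clock-scaling estimate using the post's assumptions.
elasticity = 3 / 7                    # "7% overclock -> ~3% fps"
clock_gain = 2250 / 1825 - 1          # ~23% clock uplift from 1825 to 2250 MHz
perf_gain = elasticity * clock_gain   # expected fps uplift from that clock gain
fps_at_2250 = 85.0                    # Metro 1440p ultra, RX 6800 stock (post's figure)
fps_at_1825 = fps_at_2250 / (1 + perf_gain)
print(round(perf_gain * 100, 1), round(fps_at_1825, 1))  # → 10.0 77.3
```

So a Series-X-style 1825 MHz cap costs roughly 10% on this model, in line with the 10-12% guess above.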

[Attached images: RX 6800 benchmark charts and a clock-speed comparison]
 

longdi

Banned
What's worse is the Series S clock: 1.565 GHz. I mean, really? That is a low clock speed for RDNA 1, let alone RDNA 2. I don't understand why they are being so conservative.

The Series S is too small; locking its clocks is an understandable choice.
The Series X should be allowed to breathe more. It can be done.
 

slade

Member
They designed a box whose internals are pretty much squished together. To run at those kinds of speeds, they would need a much more expensive cooling system.

Considering how far behind they already are SDK-wise, any improvement to the clocks will probably have to wait until the mid-generation refresh.

They weren't ready to launch this year.
 

LordOfChaos

Member
Not sure if you've considered the power side - it's just going to be a fact that clocking 52 CUs as high as Sony clocks its 36 would come with significantly more power draw, and Sony had to do the whole variable clock-sharing deal to get there.

Interestingly, both get to about the same power draw figures in the end. I don't think it's being conservative so much as two different approaches to get to a similar spot.
 

SlimySnake

Flashless at the Golden Globes
No offense, but 1.825 GHz is pretty crazy for a console. We had a max of 1.172 GHz last gen.

MS wasn't conservative at all. They pushed both the CUs and the clocks, and they went with a super expensive vapor-chamber cooler to achieve that. You seem to be forgetting that MS and Sony aren't just cooling a GPU like AMD is; there is a CPU in there with a big I/O block and a RAM bus that need to be cooled too.

Where MS screwed up was not thinking outside the box like Sony did. Sony went with a much cheaper and more traditional cooler/heatsink. It made their console look like a planet, but they were willing to look ugly and big in order to save a buck. They also did a lot of work with liquid metal cooling that, tbh, neither MS nor any other GPU maker has bothered to implement. Sony was able to get to 2.23 GHz because they had fewer CUs AND because they went with liquid metal cooling and a traditional, super cheap heatsink that cost them only a few dollars, according to Bloomberg. Penello was super surprised to hear that the cooling was only costing Sony a few dollars and refused to believe it. It likely cost MS almost $30 to do vapor-chamber cooling.

I think in the future everyone will go with variable clocks, but it's going to be impossible to do variable clocks on a 52 CU GPU with the cooling they have available now. What they have is good enough for the cooling they have. They simply cannot go above that.
 
It's not like they are underclocking the CPU or GPU (though, as with all hardware, that capability is built in by the manufacturer), so I don't see what the problem is. You are suggesting Microsoft - already superior in frequency - should force its CPU and GPU to run faster than factory settings, even though, to my knowledge, they already outperform the PS5 without doing so?
 

longdi

Banned
Not sure if you've considered the power side - it's just going to be a fact that clocking 52 CUs as high as Sony clocks its 36 would come with significantly more power draw, and Sony had to do the whole variable clock-sharing deal to get there.

Interestingly, both get to about the same power draw figures in the end. I don't think it's being conservative so much as two different approaches to get to a similar spot.

Well, the RX 6800 at 2.2 GHz takes only 230 W. The Zen 2 mobile APU is very watt-friendly, probably another 30 W max. Then you add in the BR drive, wireless, SSD, everything. I don't think the SX will exceed 280-300 W.
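The system-level guess above can be written as a quick budget. Every line item is a rough assumption taken from the post, not a measurement:

```python
# Hedged back-of-envelope Series X power budget using the post's figures.
budget_w = {
    "gpu_at_2.2ghz": 230,  # RX 6800 board power cited above
    "zen2_cpu": 30,        # mobile Zen 2 APU estimate from the post
    "drive_wifi_ssd": 20,  # BR drive, wireless, SSD, fans (rough guess)
}
total_w = sum(budget_w.values())
print(total_w)  # → 280
```

Which lands inside the 280-300 W range the post arrives at.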
 

mckmas8808

Mckmaster uses MasterCard to buy Slave drives
No offense, but 1.825 GHz is pretty crazy for a console. We had a max of 1.172 GHz last gen.

MS wasn't conservative at all. They pushed both the CUs and the clocks, and they went with a super expensive vapor-chamber cooler to achieve that. You seem to be forgetting that MS and Sony aren't just cooling a GPU like AMD is; there is a CPU in there with a big I/O block and a RAM bus that need to be cooled too.

Where MS screwed up was not thinking outside the box like Sony did. Sony went with a much cheaper and more traditional cooler/heatsink. It made their console look like a planet, but they were willing to look ugly and big in order to save a buck. They also did a lot of work with liquid metal cooling that, tbh, neither MS nor any other GPU maker has bothered to implement. Sony was able to get to 2.23 GHz because they had fewer CUs AND because they went with liquid metal cooling and a traditional, super cheap heatsink that cost them only a few dollars, according to Bloomberg. Penello was super surprised to hear that the cooling was only costing Sony a few dollars and refused to believe it. It likely cost MS almost $30 to do vapor-chamber cooling.

I think in the future everyone will go with variable clocks, but it's going to be impossible to do variable clocks on a 52 CU GPU with the cooling they have available now. What they have is good enough for the cooling they have. They simply cannot go above that.

Yep to all of that. The PS5 was designed by pure engineers. The XSX to me seems like it was designed with a few too many execs in the room.
 

James Sawyer Ford

Gold Member
Nah, MS was too conservative. The SX still has room to stretch, but I guess they calculated that a locked 1.8 GHz is enough to win.
It just bugs me to see the higher potential wasted.

You are delusional.

They designed around a certain power footprint and cooling solution. They also designed around a certain chip yield at a given frequency.

They can't just magically update the clocks.

Sorry, you are stuck with a console that is, at best, a wash with the PS5 you have bashed for the past few years. And what we are seeing now shows that PS5 is more performant.
 

LordOfChaos

Member
Well, the RX 6800 at 2.2 GHz takes only 230 W. The Zen 2 mobile APU is very watt-friendly, probably another 30 W max. Then you add in the BR drive, wireless, SSD, everything. I don't think the SX will exceed 280-300 W.

300W is the entire SX power supply - efficiency, provisioning for capacitor aging, etc. It draws 150-200W, same as the PS5 does. 1/3rd to 50% more power draw is no trivial thing in consoles already pushing the size boundaries.
 

Xenon

Member
Yep to all of that. The PS5 was designed by pure engineers. The XSX to me seems like it was designed with a few too many execs in the room.


I think you're confusing this gen with last one. Please point out to me a feature in the Series X that would be driven by an executive rather than an engineer. It may turn out that Sony's design was better. But I'm definitely not going to judge that on a few multiplats during a pandemic.
 

longdi

Banned
You are delusional.

They designed around a certain power footprint and cooling solution. They also designed around a certain chip yield at a given frequency.

They can't just magically update the clocks.

Sorry, you are stuck with a console that is, at best, a wash with the PS5 you have bashed for the past few years. And what we are seeing now shows that PS5 is more performant.

This is not about catching up, but about widening the gap.
It seems MS chose compactness and quietness over even more performance.

The SX dimensions are crazy small, 30×30×30 cm; some mITX cases are larger.
But IMO a 35×35×35 cm SX wouldn't take up much more space.
 

longdi

Banned
300W is the entire SX power supply - efficiency, provisioning for capacitor aging, etc. It draws 150-200W, same as the PS5 does. 1/3rd to 50% more power draw is no trivial thing in consoles already pushing the size boundaries.

Yep, it seems they could have gone a few cm larger with a 350 W PSU. 1.8 GHz is too conservative; I won't be surprised if the laptop RX 6000M can boost higher.
 

SlimySnake

Flashless at the Golden Globes
Yep to all of that. The PS5 was designed by pure engineers. The XSX to me seems like it was designed with a few too many execs in the room.
That's a good point. I remember Penello saying that he was there when they came up with the Anaconda and Lockhart sandwich strategy. TBH, I have no idea why someone like Penello is even involved in GPU design decisions. He's not an engineer, and a PR guy like him needs to stay the fuck away from those brainstorming sessions. Phil is the only exec that needs to be in Jason Ronald's ear.

The funny thing is that if they hadn't come up with the sandwich strategy, they would have been forced to create a $399 box and to think outside the box to hit 10 TFLOPS like Sony did. But they thought the Series S was their trump card and figured Sony would just settle for 8 TFLOPS. I wouldn't be surprised if they had the GitHub leaks way back in 2018, were 100% sure Sony would go with that, and just stopped innovating.
 

James Sawyer Ford

Gold Member
This is not about catching up, but about widening the gap.
It seems MS chose compactness and quietness over even more performance.

The SX dimensions are crazy small, 30×30×30 cm; some mITX cases are larger.
But IMO a 35×35×35 cm SX wouldn't take up much more space.

They can't widen a completely theoretical paper gap. You'd have consoles going over their power limit and chips starting to die because they weren't rated for the higher frequency and power.

But as Shu would say, let them dream.

This week has been a great vindication against all the ridiculous FUD your ilk have been spreading in the lead-up to launch.
 

James Sawyer Ford

Gold Member
That's a good point. I remember Penello saying that he was there when they came up with the Anaconda and Lockhart sandwich strategy. TBH, I have no idea why someone like Penello is even involved in GPU design decisions. He's not an engineer, and a PR guy like him needs to stay the fuck away from those brainstorming sessions. Phil is the only exec that needs to be in Jason Ronald's ear.

The funny thing is that if they hadn't come up with the sandwich strategy, they would have been forced to create a $399 box and to think outside the box to hit 10 TFLOPS like Sony did. But they thought the Series S was their trump card and figured Sony would just settle for 8 TFLOPS. I wouldn't be surprised if they had the GitHub leaks way back in 2018, were 100% sure Sony would go with that, and just stopped innovating.

This brings up another point...

The XSS is a complete disaster. It is not just a simple "downrez". Games are flat out performing worse at 30 fps vs 60.

This is a stillborn console IMHO, and looks like a horrible value compared to PS5 DE.

What the hell was Phil thinking? They are going to have to carry this baggage around for the entire gen...ugh. Just awful.
 

mckmas8808

Mckmaster uses MasterCard to buy Slave drives
I think you're confusing this gen with last one. Please point out to me a feature in the Series X that would be driven by an executive rather than an engineer. It may turn out that Sony's design was better. But I'm definitely not going to judge that on a few multiplats during a pandemic.

The size of the console and how the internals are just smashed together inside. Plus how MS went with a split RAM solution. At some point, an exec told the engineers that they couldn't do 560 GB/s across the full 16 GB. No engineer would want to do it that way.
 

Neo_game

Member
The SX PSU is 315 W. According to NXGamer, a console's power usage has to stay under 80% of that, so 252 W is the limit, and it probably never reaches it. I think Gears 5 on SX already reached a max of 211 W. It's a stupid idea to increase the clocks via a firmware update now. The SX's problem is the API and software; it has nothing to do with the hardware or the power of the console.
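The 80%-of-PSU rule quoted from NXGamer works out like this. The rule itself is the post's claim, not an official spec, and the Gears 5 peak is the figure quoted above:

```python
# Sketch of the "stay under ~80% of the rated PSU" rule of thumb.
psu_rated_w = 315
sustained_limit_w = round(psu_rated_w * 0.8)  # ~252 W sustained ceiling
measured_peak_w = 211                         # Gears 5 peak quoted in the post
headroom_w = sustained_limit_w - measured_peak_w
print(sustained_limit_w, headroom_w)  # → 252 41
```

So on this accounting there is only ~40 W of headroom left before the ceiling, which is the post's argument against a clock bump.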
 
I'm kind of wondering if that leak claiming the CUs in the XSX are actually based on RDNA 1 is true. RDNA 2 seems able to handle higher clocks, so why would MS go so low with theirs? It's either that, or the larger chip is so big/expensive that they want to maximize yields by accepting chips that can't be clocked much higher. Either way, don't expect them to change clocks anytime soon.
 

SlimySnake

Flashless at the Golden Globes
This brings up another point...

The XSS is a complete disaster. It is not just a simple "downrez". Games are flat out performing worse at 30 fps vs 60.

This is a stillborn console IMHO, and looks like a horrible value compared to PS5 DE.

What the hell was Phil thinking? They are going to have to carry this baggage around for the entire gen...ugh. Just awful.
The funny thing is that it might actually be performing worse than the X1X, which runs at 30 fps and definitely higher than the 1080p-1180p resolution the XSS drops to during combat sections. I told everyone that a 4 TFLOPS RDNA 2 GPU would not be the same as a 6 TFLOPS Polaris GPU, but no one believed me. It also has less VRAM and lower bandwidth than the X1X.

Phil just wanted to have his cake and eat it too. He let himself get talked into a stupid strategy by people like Penello who are not engineers. A 4 TFLOPS GPU simply was not going to be good enough; that's just not how games scale. Wait until we get to next-gen games - they are going to scale even more poorly. Watch Dogs with ray tracing is literally 900p with worse shadows, world detail, and textures. Aside from ray tracing, it actually looks worse than the base PS4 version of the game.
 

James Sawyer Ford

Gold Member
The funny thing is that it might actually be performing worse than the X1X, which runs at 30 fps and definitely higher than the 1080p-1180p resolution the XSS drops to during combat sections. I told everyone that a 4 TFLOPS RDNA 2 GPU would not be the same as a 6 TFLOPS Polaris GPU, but no one believed me. It also has less VRAM and lower bandwidth than the X1X.

Phil just wanted to have his cake and eat it too. He let himself get talked into a stupid strategy by people like Penello who are not engineers. A 4 TFLOPS GPU simply was not going to be good enough; that's just not how games scale. Wait until we get to next-gen games - they are going to scale even more poorly. Watch Dogs with ray tracing is literally 900p with worse shadows, world detail, and textures. Aside from ray tracing, it actually looks worse than the base PS4 version of the game.

It's amazing to me how much MS marketing gets away with

"World's most powerful console"

"Introducing the XSS...a console targeting 1440p"

What a joke. Their strategy just reeks of insecurity, and it's going to be a complete afterthought...no developer is going to spend time trying to make the best version they can on XSS.

Is $100 savings vs PS5 really going to sway a segment of the consumer base? Not really
 

Md Ray

Member
MS should have also gone with the narrow-and-fast approach.

A 44 CU GPU at 2156 MHz would have landed them at the same compute power as their current setup. The ROPs and rasterization rate would have been 18% faster too as a result.

EDIT: I mean, 18% faster than their current 52 CU @ 1825 MHz setup.

Fun fact: the One X devkits are equipped with 44 GCN CUs, and One X retail units have 4 of them disabled.
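The equivalence claimed above can be checked with the usual CUs × 64 shaders × 2 FLOPs/clock arithmetic (a standard RDNA figure, applied here as a sketch; the 44 CU setup is the post's hypothetical, not a real SKU):

```python
# Compute throughput scales with CUs x clock for a fixed shader layout.
def tflops(cus, clock_mhz):
    # 64 shaders per CU, 2 FLOPs per shader per clock (FMA)
    return cus * 64 * 2 * clock_mhz / 1e6

xsx = tflops(52, 1825)     # actual Series X configuration
narrow = tflops(44, 2156)  # hypothetical narrow-and-fast setup
print(round(xsx, 2), round(narrow, 2))    # → 12.15 12.14
print(round((2156 / 1825 - 1) * 100, 1))  # → 18.1 (% faster ROPs/raster)
```

Same ~12.1 TFLOPS either way, with the clock-bound front end (ROPs, rasterization) about 18% faster in the narrow configuration.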
 

rnlval

Member
I think the SX design has more to offer, but by locking it at only 1.8 GHz, it seems like MS is selling it short.
Look at the RX 6800, the closest GPU to the SX.

The RX 6800 runs Metro at 1440p ultra at around 2.25 GHz and takes only 230 W. It generates 85 fps and beats the 2080 Ti FE.
From reviews, a 7% overclock on the 6800 gives about a 3% fps improvement.
Working backwards, 1.8 GHz to 2.25 GHz is a 25% increase. So if we capped the RX 6800 at 1.8 GHz, it would lose 10-12% in perf, so Metro at 1440p might lose ~10 fps. 🤷‍♀️

Let's see if any reviewers test the 6800/6800 XT with capped clocks. Please share if you come across them.

From https://www.techpowerup.com/review/amd-radeon-rx-6800/32.html

The RX 6800 reference card has the following clock-speed behavior:

Average: 2,205 MHz, which yields about 16.934 TFLOPS
Median: 2,218 MHz

From the same page:

Average power consumption for gaming: 164 W
Peak power consumption for gaming: 215 W

This does not include Rage Mode.


From https://www.techpowerup.com/review/amd-radeon-rx-6800/40.html

Maximum Overclock Comparison:

Card | Avg. GPU Clock | Max. Memory Clock | Performance
AMD RX 6800 | 2472 MHz | 2140 MHz | 197.2 FPS
AMD RX 6800 XT | 2322 MHz | 2140 MHz | 219.8 FPS
AMD RX 6800 XT @ Max Power Limit | 2572 MHz | 2140 MHz | 235.5 FPS

At a 2472 MHz average overclock, it yields about 18.98 TFLOPS.
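Those TFLOPS figures fall out of the same CUs × 64 shaders × 2 ops-per-clock arithmetic, assuming the RX 6800's 60 CUs:

```python
# RX 6800 compute throughput at a given clock (60 CUs, 64 shaders/CU, 2 FLOPs/clock).
def rx6800_tflops(clock_mhz, cus=60):
    return cus * 64 * 2 * clock_mhz / 1e6

print(round(rx6800_tflops(2205), 3))  # → 16.934 (average stock clock)
print(round(rx6800_tflops(2472), 3))  # → 18.985 (average overclock)
```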


MS should have given developers the option of a variable clock speed, e.g. trading reduced CPU power allocation for a higher GPU clock.
 

mckmas8808

Mckmaster uses MasterCard to buy Slave drives
The funny thing is that it might actually be performing worse than the X1X, which runs at 30 fps and definitely higher than the 1080p-1180p resolution the XSS drops to during combat sections. I told everyone that a 4 TFLOPS RDNA 2 GPU would not be the same as a 6 TFLOPS Polaris GPU, but no one believed me. It also has less VRAM and lower bandwidth than the X1X.

Phil just wanted to have his cake and eat it too. He let himself get talked into a stupid strategy by people like Penello who are not engineers. A 4 TFLOPS GPU simply was not going to be good enough; that's just not how games scale. Wait until we get to next-gen games - they are going to scale even more poorly. Watch Dogs with ray tracing is literally 900p with worse shadows, world detail, and textures. Aside from ray tracing, it actually looks worse than the base PS4 version of the game.

Nothing wrong with having cake and wanting to eat it. I think you mean to say, "Phil wanted to eat his cake and have it too." You can eat your slice of the cake and still have the rest; once you eat it, it's gone. [/random rant] But I agree with you. The XSS is a pure exec move that doesn't really make sense. Only a bean-counting suit would actually want to sell the XSS, especially at launch.
 

GHG

Member
Oh longdi longdi ... Not so long ago:

36 CU was right. 9 TFLOPS was right.
Sony had to overclock it to get 10 TFLOPS.

Mark failed.
Sony had to overclock at the last minute.

Same shit as Xbox One. Things keep flipping between gens.
That is down to Mark Cerny. DF had a chat with him, and they still can't get convincing answers on the PS5's boost clocks. So much so that Richard ended with "overclocking, overclocking!" and even did GPU tests showing an OC is an OC, and more CUs > fewer CUs at a high OC.


So what you're saying is that you now want Microsoft to do a "last minute overclock"?
 

Riky

$MSFT
That clock was very deliberate, to deliver 12 TFLOPS. I wouldn't be bothered about a couple of cross-gen launch games in the grand scheme of things.

None of these games are next-gen; none are using the RDNA 2 performance features or even the Velocity Architecture. There is a lot more to come over the next seven years.
 

Tajaz2426

Psychology PhD from Wikipedia University
From the most powerful console ever to "guys the differences are really small, don't worry about it!" Good stuff.
I think it's just icing on the cake for PlayStation folks. They were never expecting the PS5 to do better, just to be close to the Xbox. They got what they wanted and more. If I weren't mainly a PC gamer, I would use the PS5 for the controller alone.
 

Md Ray

Member
Oh longdi longdi ... Not so long ago:

So what you're saying is that you now want Microsoft to do a "last minute overclock"?
He was hell-bent on the idea that the PS5's GPU frequency was a last-minute OC, judging by the 6800 non-XT's game clock, which was advertised at around 1800-ish MHz (and because that was similar to the XSX's).

And now we're seeing even the 6800 XT comfortably hitting over 2300 MHz at stock, no less with 2x the CUs.

It's nice to see these guys getting proven wrong left and right.
 