
Next-Gen PS5 & XSX |OT| Console tEch threaD

Status
Not open for further replies.

kyliethicc

Member
This is all you, ain't it.
Even if it's not, I am going to point the finger at you.

You know those little sticks you used to find water with when you were a kid? I am using them now, and they are pointing at you.

N4stJWf.jpg
 

jimbojim

Banned
Yes.. such FUD:



I explained outright that variable frequencies allow a system to push farther than the same system with fixed clocks...

Never denied it... in fact it's at the VERY CORE OF THE FACT I'VE BEEN REPEATING:

They are variable because the same chip COULDN'T maintain max clocks at all times. It's saying the exact same thing, and it isn't FUD:

The PS5 would be slower if they fixed the clocks as they'd have to fix the clocks at a lower speed than they can achieve with variable clocks.

You guys just knee jerk assume everything is anti-PS5 FUD because you are seriously kind of mentally ill.
For some games/instructions, PS5 can run both CPU+GPU at full clocks. Call that “X”.

Sony creates fans/heat sink to cool for X so console is reasonably quiet and doesn’t draw too much power.

So what if a developer wants to run instructions or creates a game that is X+1?

You let them down clock to stay within the thermal limits of the cooling solution and max draw.

It lets a game do X+1 while they can design the console's fans and heatsink around X.
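The X / X+1 idea above can be sketched as a tiny budget check. This is a hypothetical model, not real PS5 behavior: the wattage budget, the 2.23 GHz cap, and the linear power-vs-clock assumption are all illustrative.

```python
# Hypothetical sketch of the "design cooling for X, let X+1 downclock" idea.
# The budget, the clock cap, and the linear power-vs-clock model are all
# invented for illustration.

POWER_BUDGET_W = 200.0  # "X": what the fans/heatsink were designed to cool

def clock_for_workload(draw_at_max_w, max_clock_ghz=2.23):
    """Highest clock that keeps a workload inside the cooling budget.

    draw_at_max_w: the workload's power draw at max clock (watts).
    Assuming draw scales roughly linearly with clock, the downclock
    needed is just the ratio of budget to demand.
    """
    if draw_at_max_w <= POWER_BUDGET_W:
        return max_clock_ghz                                  # the "X" case
    return max_clock_ghz * POWER_BUDGET_W / draw_at_max_w     # the "X+1" case

print(clock_for_workload(180.0))   # fits the budget: full 2.23 GHz
print(clock_for_workload(220.0))   # over budget: a modest downclock
```

Note how a workload only 10% over budget costs only a ~9% clock cut, rather than forcing the whole console to be designed around the worst case.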

Some people should read this crap over and over to see why variable clocks are better than fixed clocks. It was mentioned before, why not repeat it over and over:

"
Sony did tell us how their design works. The thing you're missing is that the PS5 approach is not just letting clocks be variable, like uncapping a framerate. That would indeed have no effect on the lowest dips in frequency. But they've also changed the trigger for throttling from temperature to silicon activity. And that actually changes how much power can be supplied to the chip without issues. This is because the patterns of GPU power needs aren't straightforward.

Here's a depiction of the change. (This is not real data, just for illustrative purposes of the general principle.) The blue line represents power draw over time, for profiled game code. The solid orange line represents the minimum power supply that would need to be used for this profile. Indeed, actual power draw must stay well below the rated capacity. Power supplies function best when actually drawing ~80% of their rating. And when designing a console the architects, working solely from current code, will build in a buffer zone to accommodate ever more demanding scenarios projected for years down the line.
standardpowersokvg.png


You'd think the tallest peaks, highlighted in yellow, would be when the craziest visuals are happening onscreen in the game: many characters, destruction, smoke, lights, etc. But in fact, that's often not the case. Such impressive scenes are so complicated, the calculations necessary to render them bump into each other and stall briefly. Every transistor on the GPU may need to be in action, but some have to wait on work, so they don't all flip with every tick of the clock. So those scenes, highlighted in pink, don't contain the greatest spikes. (Though note that their sustained need is indeed higher.)

Instead, the yellow peaks are when there's work that's complex enough to spread over the whole chip, but just simple enough that it can flow smoothly without tripping over itself. Unbounded framerates can skyrocket, or background processes cycle over and over without meaningful effect. The useful work could be done with a lot less energy, but because clockspeed is fixed, the scenes blitz as fast as possible, spiking power draw.

Sony's approach is to sense for these abnormal spikes in activity, when utilization explodes, and preemptively reduce clockspeed. As mentioned, even at the lower speed, these blitz events are still capable of doing the necessary work. The user sees no quality loss. But now behind the scenes, the events are no longer overworking the GPU for no visible advantage.
choppedpower2hjss.png
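The clipping step just described can be mimicked with a toy trace. Everything here is invented for illustration: the cap, the activity numbers, and the simple linear power model.

```python
# Toy version of the activity-triggered clipping described above: when
# per-cycle activity explodes (a "blitz" scene), reduce the clock so power
# stays under the cap. The cap, activities, and power model are invented.

POWER_CAP_W = 200.0
MAX_CLOCK = 2.23  # GHz

def power_w(activity, clock_ghz):
    # activity: fraction of the chip's transistors flipping each cycle
    return 120.0 * activity * clock_ghz

def clipped_clock(activity):
    draw = power_w(activity, MAX_CLOCK)
    if draw <= POWER_CAP_W:
        return MAX_CLOCK
    return MAX_CLOCK * POWER_CAP_W / draw   # preemptive downclock

# a heavy-but-stalling scene (0.6) vs a simple blitz scene (0.95)
for activity in (0.6, 0.95):
    c = clipped_clock(activity)
    print(f"activity={activity:.2f}  clock={c:.2f} GHz  "
          f"power={power_w(activity, c):.0f} W")
```

The blitz scene ends up pinned exactly at the cap: it loses clockspeed it wasn't converting into visible work anyway.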



But now we have lots of new headroom between our highest spikes and the power supply buffer zone. How can we easily use that? Simply by raising the clockspeed until the highest peaks are back at the limit. Since total power draw is a function of number of transistors flipped, times how fast they're flipping, the power drawn rises across the board. But now, the non-peak parts of your code have more oomph. There's literally more computing power to throw at the useful work. You can increase visible quality for the user in all the non-blitz scenes, which is the vast majority of the game.

raisedpowerc3keg.png
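The "reclaim the headroom" arithmetic works out like this in a toy example. The peak wattages are invented; the point is only the ratio.

```python
# With the blitz spikes clipped, the tallest remaining peak sits below the
# supply limit, so every scene's clock can be raised by the same ratio until
# that peak is back at the limit. All wattages are illustrative.

SUPPLY_LIMIT_W = 200.0

peaks_before = [200.0, 150.0, 130.0]   # tallest peak was a blitz spike
peaks_clipped = [160.0, 150.0, 130.0]  # spike tamed by the clipping step

# to first order, power scales with clock, so the available clock multiplier
# is the headroom ratio under the new tallest peak
multiplier = SUPPLY_LIMIT_W / max(peaks_clipped)
print(f"clock multiplier: {multiplier:.2f}x")
print([round(p * multiplier, 1) for p in peaks_clipped])
```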


Look what that's done. The heaviest, most impressive scenarios are now closer to the ceiling, meaning these most crucial events are leaving fewer resources untapped. The variability of power draw has gone down, meaning it's easier to predictively design a cooling solution that remains quiet more often. You're probably even able to reduce the future proofing buffer zone, and raise speed even more (though I haven't shown that here). Whatever unexpected spikes do occur, they won't endanger power stability (and fear of them won't push the efficiency of all work down in the design phase, only reduce the spikes themselves). All this without any need to change the power supply, GPU silicon, or spend time optimizing the game code.

Keep in mind that these pictures are for clarity; specifics about exactly how much extra power is made available, how often and how far clockspeed may dip, etc. aren't derivable from them. But I think the general idea comes through strongly. It shows why, even though PS5's GPU couldn't be set to 2 GHz with fixed clocks, that doesn't necessarily mean it must still fall below 2 GHz at times under the variable scheme. Sony's approach changes the power profile's shape, making different goals achievable.

I'll end with this (slowly) animated version of the above.

variablepowerudkmp.gif




 

IntentionalPun

Ask me about my wife's perfect butthole
What this means, he gave examples, is that running gameplay the PS5 will be at 2.23 and 3.5 GHz most of the time; the downclocks will occur on the CPU for AVX 256, and when drawing simple triangles in map screens, or game code written like that, like FurMark.

THAT IS WHAT HE MEANT BY A FIXED STRATEGY NOT WORKING: what if AVX + FurMark is run? You need to downclock, and fixed is not good. But who cares?

Some think "fixed strategy not working" must mean it can't sustain 2 GHz. That's just crap and shows NO UNDERSTANDING of anything, and no basic reading skills. Cerny made it so simple, yet "I just don't get it.."

If you wish to interpret it differently, it's up to you. Most either do not understand the above or seem to read it upside down; it's fucking mindboggling.
Why in the world do you think this contradicts any of my posts, in any way?
 

geordiemp

Member
That dude insulted me a month ago and nothing was done, so I figured it was OK. The dude in general can barely type a post without being condescending.

Why in the world do you think this contradicts any of my posts, in any way?

Because you're talking sustained clocks. Posters get frustrated because we are repeating the same stuff all the time to the same question.

Yes, you cannot sustain clocks correctly with AVX and with a map screen, as it's repeating small triangles and that creates heat. In that context it's correct, but you know every poster does not use that context.... and you know exactly why as well.

Yes, 2 GHz would not work running AVX and FurMark lol; it would probably eventually shut down the XSX too at 1.8 GHz.... it's not healthy code.

So with variable clocks, 2.23 GHz and 3.5 GHz will be met when it matters: playing a game with the CPU and GPU doing stuff in a frame. The CPU does calcs, the GPU draws stuff and complex geometry that changes around as you're playing....
 
Last edited:

kyliethicc

Member
Some people should read this crap over and over to see why variable clocks are better than fixed clocks. It was mentioned before, why not repeat it over and over:

I agree with what you’re saying, and I think I understand the PS5 variable approach to clock speeds. Seems smart.

Why did you quote me exactly? Did I say something that was incorrect? Because I might have lol.

You seem to know much more on this than I do, so I’m genuinely asking here, not trying to argue.
 

IntentionalPun

Ask me about my wife's perfect butthole
Some people should read this crap over and over to see why variable clocks are better than fixed clocks. It was mentioned before, why not repeat it over and over:

Why did you quote me? I never said fixed clocks were better.. in fact I think variable clocks are.

Sony did tell us how their design works. The thing you're missing is that the PS5 approach is not just letting clocks be variable, like uncapping a framerate. That would indeed have no effect on the lowest dips in frequency. But they've also changed the trigger for throttling from temperature to silicon activity. And that actually changes how much power can be supplied to the chip without issues. This is because the patterns of GPU power needs aren't straightforward.

Oh I missed that huh?


IntentionalPun said:
Typical method: React to thermals, give more power to the processor if the thermals are good, lower it when it heats up.
Sony method: Do this based on workload, basically PREDICTING ahead of time that a piece of code would be likely to cause thermal issues and do it within a specific power budget (another way to avoid thermal issues.)
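The contrast in that quote can be sketched as two throttle policies. Both are toy models: the temperature threshold, power budget, and activity-based power formula are invented for illustration, not real firmware behavior.

```python
# The quoted "typical method" vs "Sony method", as two toy step functions.
# Temperatures, budgets, and the power model are invented for illustration.

def reactive_step(temp_c, clock_ghz, max_clock=2.23):
    """Typical method: adjust clocks only AFTER thermals move (lagging)."""
    if temp_c > 85.0:
        return clock_ghz * 0.9              # heat already built up: back off
    return min(clock_ghz * 1.02, max_clock) # cool enough: creep back up

def predictive_step(activity, max_clock=2.23, budget_w=200.0):
    """Quoted Sony method: clamp the clock to a power budget derived from
    the workload itself, before any heat accumulates."""
    draw = 120.0 * activity * max_clock     # toy activity-based power model
    if draw <= budget_w:
        return max_clock
    return max_clock * budget_w / draw
```

The key design difference the quote is pointing at: the reactive loop needs temperature to actually rise before it acts, while the budget-based one is deterministic per workload, which is what makes behavior consistent across hot and cold consoles.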
 

chilichote

Member
The only way I can see that happening is if Microsoft thinks the PS5 DE is 499$.

I guess this is why they both are reluctant to give us the price. It would be a disaster if the PS5 DE is priced the same as the Xbox Series S.
If you click on the link to the tweet, you can see that he has corrected the price for the XSeS to $299 in the comments.
 

IntentionalPun

Ask me about my wife's perfect butthole
You keep doing the same sustained stuff, so I will try again.

Yes, you cannot sustain clocks correctly with AVX and with a map screen, as it's repeating small triangles and that creates heat. Cerny explained that. And the above does not matter for normal gaming. Yes, 2 GHz would not work running AVX and FurMark lol; it would probably eventually shut down the XSX too at 1.8 GHz.... it's not healthy code.

Yes, clocks will drop for FurMark as well and the XSX will heat up like a bitch, but it's not game code, is it?

So with variable clocks, 2.23 GHz and 3.5 GHz will be met when it matters: playing a game with the CPU and GPU doing stuff in a frame.... That is why the variable clocks and SmartShift are good for gameplay.

So... what you are saying is..

Variable clocks allow you to push beyond what you could do on a fixed clock setup. Because you could never "fix clocks" as high as a variable clock can go.

That's...literally....all...I've...been...saying.

Why are you so offended by the word "sustained"? That's literally the point of fixing clocks.. you fix them at the performance you believe you can actually sustain without frying the entire thing.

With variable clocks you can go higher than you could with a sustained clock setup, as a sustained clock setup is leaving performance on the table in scenarios where increasing clocks would benefit the code, but not cause thermal issues.

It's like you hear the word sustained and think I'm some Xbot promoting "sustained performance" over "variable performance" when I never said that at all..
 

jimbojim

Banned
If you click on the link to the tweet, you can see that he has corrected the price for the XSeS to $299 in the comments.

I was just speculating on what would happen if it's $399. But I still maintain that it would be very bad for the Series S if it's priced the same as the PS5 DE. I don't think that will happen, though, because Microsoft will eat the cost if they have to. Unless they truly don't care about the competition.


Edit: Misquoted IntentionalPun, no idea how it messed up.
 
Last edited:

geordiemp

Member
So... what you are saying is..

Variable clocks allow you to push beyond what you could do on a fixed clock setup. Because you could never "fix clocks" as high as a variable clock can go.

That's...literally....all...I've...been...saying.

Why are you so offended by the word "sustained"? That's literally the point of fixing clocks.. you fix them at the performance you believe you can actually sustain without frying the entire thing.

With variable clocks you can go higher than you could with a sustained clock setup, as a sustained clock setup is leaving performance on the table in scenarios where increasing clocks would benefit the code, but not cause thermal issues.

It's like you hear the word sustained and think I'm some Xbot promoting "sustained performance" over "variable performance" when I never said that at all..

OK, I apologise. I have answered that question maybe 50 times on GAF, so I do get a bit short on the subject.

Yes, a sustained boost or the PS5 will fall down with FurMark or map screens' simple geometry looping fast, and it is good and healthy that it does that, IMO.
 
Last edited:

JonnyMP3

Member
Threads read so funny when someone's ignored.
Being ill, my brain couldn't handle the pain anymore. But even in my weakened state I understand that SmartShift is more about the power envelope and thermal draw than about clocks.
 

jimbojim

Banned
I agree with what you’re saying, and I think I understand the PS5 variable approach to clock speeds. Seems smart.

Why did you quote me exactly? Did I say something that was incorrect? Because I might have lol.

You seem to know much more on this than I do, so I’m genuinely asking here, not trying to argue.

Why did you quote me? I never said fixed clocks were better.. in fact I think variable clocks are.



Oh I missed that huh?


No, no. Not at all. Neither of you did anything wrong. My post was an addition to yours, a reaction, since your posts referred to that crap about fixed clocks.
kyliethicc

Really, I'm not more educated than you. Maybe you're better than me. I'm just trying to understand some things. Also, like a bunch of others are doing, repeating what others said across the net. :D
 
Last edited:

chilichote

Member
I was just speculating on what would happen if it's $399. But I still maintain that it would be very bad for the Series S if it's priced the same as the PS5 DE. I don't think that will happen, though, because Microsoft will eat the cost if they have to. Unless they truly don't care about the competition.
Sorry, only saw the conversation about that tweet^^

I don't believe it's only €299 - it has more power than the One X. So maybe €399 is right, but if the PS5 DE is €399 too, then the XSeS is DOA^^
 

IntentionalPun

Ask me about my wife's perfect butthole
OK, I apologise. I have answered that question maybe 50 times on GAF, so I do get a bit short on the subject.

Yes, a sustained boost or the PS5 will fall down with FurMark or map screens' simple geometry looping fast, and it is good and healthy that it does that.

There are a number of best case scenarios where Sony's system will just be an insane improvement.

The frame timing convo earlier for instance; it doesn't make what I said WRONG though.

Fact: Lowering CPU speed will make whatever code is running on that CPU execute slower and take longer to complete (what I said earlier, and what you and others said 'no' to).

Another fact: IF that code can still finish within the drawing of a frame, then it taking longer will not actually affect game performance, and that extra bit of power/clock can be given to the GPU for it to do its part of the job of rendering said frame.
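Those two facts reduce to simple frame-budget arithmetic. Only the 3.5 GHz cap comes from the thread; the workload size, the 3.2 GHz downclock, and the 60 fps target are made-up numbers for the sketch.

```python
# The two facts above in arithmetic form: a lower CPU clock makes the same
# work take longer, but if it still fits inside the frame, nothing visible
# changes. The workload size and the reduced clock are invented.

FRAME_BUDGET_MS = 1000.0 / 60.0          # ~16.7 ms per 60 fps frame

def cpu_time_ms(megacycles, clock_ghz):
    return megacycles / (clock_ghz * 1000.0)   # Mcycles / (Mcycles per ms)

work = 40_000.0                          # CPU megacycles this frame (invented)
t_full = cpu_time_ms(work, 3.5)          # at full clock: ~11.4 ms
t_down = cpu_time_ms(work, 3.2)          # downclocked: 12.5 ms -- slower...

print(f"{t_full:.1f} ms vs {t_down:.1f} ms")
print("still in frame:", t_down < FRAME_BUDGET_MS)   # ...but no frame dropped
```

The slack between 12.5 ms and 16.7 ms is exactly the case where the downclock is invisible to the player, and the freed power can shift to the GPU.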

It's completely fair for some of you to ADD to what I've said and claim there's more to it.. but honestly I do not believe I said anything actually WRONG in this thread hence my frustration lol
 
Last edited:
Sorry, only saw the conversation about that tweet^^

I don't believe it's only 299€ - it has more power than OneX. So maybe 399€ is right but if the PS5DE is 399€ too, than XSeS is DOA^^

Yeah that's why I think Microsoft will absorb the cost if the two are priced the same.

I know they keep saying that their objective isn't to sell as many systems as possible and I know they said at times that Sony isn't their competitor.

But does anyone really think they will let the XSS compete with the PS5 at the same price point?
 
Last edited:

geordiemp

Member
There are a number of best case scenarios where Sony's system will just be an insane improvement.

The frame timing convo earlier for instance; it doesn't make what I said WRONG though.

Fact: Lowering CPU speed will make whatever code is running on that CPU execute slower and take longer to complete (what I said earlier, and what you and others said 'no' to).

Another fact: IF that code can still finish within the drawing of a frame, then it taking longer will not actually affect game performance, and that extra bit of power/clock can be given to the GPU for it to do its part of the job of rendering said frame.

It's completely fair for some of you to ADD to what I've said and claim there's more to it.. but honestly I do not believe I said anything actually WRONG in this thread hence my frustration lol

The CPU is a 0.1 GHz difference, and DX12 is more abstract, so I would not bet any horses on DX12 being faster at all.

The real question is: why do the PS5 and XSX have the same CPU GHz, but the XSX GPU is at frequencies below RDNA1 PC parts? RDNA2 is 50% better perf/watt, and AMD have supposedly addressed the limitations of RDNA1, so I would expect all RDNA2 parts to be at 2.1 GHz or thereabouts, or close to the PS5 if it had PS5 cooling, accounting for die size...

The mystery goes both ways.... and PC RDNA2 parts will make understanding better when they surface.
 
Last edited:

IntentionalPun

Ask me about my wife's perfect butthole
The CPU is a 0.1 GHz difference, and DX12 is more abstract, so I would not bet any horses on DX12.

The real question is: why do the PS5 and XSX have the same CPU GHz, but the XSX GPU is at frequencies below RDNA1 PC parts? RDNA2 is 50% better perf/watt, and AMD have supposedly addressed the limitations of RDNA1, so I would expect all RDNA2 parts to be at 2.1 GHz or thereabouts, or close to the PS5 if it had PS5 cooling, accounting for die size...
With so many more CUs, don't its power requirements and heat output go up pretty quickly as they raise the GPU clock?
 

jose4gg

Member
Why did you quote me? I never said fixed clocks were better.. in fact I think variable clocks are.

Do this based on workload, basically PREDICTING ahead of time that a piece of code would be likely to cause thermal issues and do it within a specific power budget (another way to avoid thermal issues.)

The point here is that Sony is not predicting thermal issues; Sony is predicting which part of the power budget is not being used. This isn't about power issues, it's about putting the power where it's needed. They still need to make the cooling system able to handle the amount of power they are distributing between the components.

Being able to do this just to "cool" the machine would basically be "smart throttling", and that is not the paradigm...

If that were the paradigm, then when they shift power to the CPU it would require something like the GPU running at less than 2 GHz, which isn't what they said would happen even in the worst case they mentioned.
 
Last edited:
Some people should read this crap over and over to see why variable clocks are better than fixed clocks. It was mentioned before, why not repeat it over and over:

[quoted post with the power-draw graphs trimmed; it appears in full earlier in the thread]
Look at this muh'fucka over here just makin' graphs n'shit.
 

geordiemp

Member
With so many more CUs doesn't it's power requirements and heat output go up pretty quickly as they raise the GPU clock?

Bigger die, yes. And let's be real, PS5 with a 2% clock cut giving a 10% power reduction is beyond the power knee. Sony clearly have some novel cooling above what MS are doing.
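That 2%-clock / 10%-power figure is plausible past the knee because dynamic power goes roughly as f·V², and voltage has to climb steeply with frequency near the top of the curve. Below is a toy V(f) curve chosen purely so the numbers land near that figure; the curve itself is invented, not AMD's actual voltage table.

```python
# Toy illustration of why a ~2% clock cut can save ~10% power past the
# voltage/frequency knee: dynamic power ~ f * V^2, and the (invented) V(f)
# curve below rises steeply near the max clock.

F_MAX = 2.23  # GHz

def dynamic_power(f_ghz):
    v = 1.0 + 2.5 * (f_ghz / F_MAX - 0.9)   # steep made-up voltage curve
    return f_ghz * v * v                     # arbitrary units

saving = 1.0 - dynamic_power(F_MAX * 0.98) / dynamic_power(F_MAX)
print(f"2% clock cut -> {saving:.0%} power saved")
```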

And Sony released 2 cooling patents....so...
 
Last edited: