
Balancing clocks for max power draw is a different animal than variable clock throttling

PaintTinJr

Member
The Xbox 360 launched a year before the PS3; I doubt they knew the specs.

They bumped the Xbox One at the last minute and nothing bad happened.
Fair point, except that the gap between 360 and X1 was much bigger (R&D time) than X1X to XsX.
The X1 compared to the PS4 was massively adrift in performance by architecture design and specs, so bumping the APU wasn't going to impact stability of GDDR5, because it didn't have lots of fast memory.

The X1 also cost £100 more and had a big empty-box (external PSU) form factor with a more expensive cooling solution to match, which the XsX doesn't have.
And lastly, the 1850MHz clock of the XsX is already near top tier compared to current PC GPUs that use the old-hat conventional solution. At these limits, pushing harder will disproportionately increase power draw and stress the cooling solution.

Unlike the PS5, the XsX CU count suggests they haven't gone for an even distribution of transistors in the APU to allow clocking/cooling it differently. If Xbox clocks higher, then I would expect it to be so small a spec-sheet bump that it wouldn't be worth the risk - even if a higher clock would give them more performance on narrower workloads.
 

PaintTinJr

Member
…..
I don't think it works like that. A game optimized for an RTX 2070 doesn't somehow fail to make use of the additional SM on a 2080Ti.

Which AAA game optimised for any PC (graphics card) are we talking about? Cerny described the issues of keeping a 48 CU GPU constantly busy with a console (meaningful?) game workload compared to a 36 CU one. From my 3 decades of experience with PC gaming, you typically get the choice to turn on options that can use the excess hardware resources in your PC. But that isn't optimising (IMHO).

On console the designer picks a target frame-rate against a gameplay feedback loop and inverse kinematic animation timing, then targets a visual fidelity at a given resolution. Then, when they fall short, they have to optimise to reach their target design priorities.
 

sullydabricks

Neo Member
So he's saying hot environments will impact the clocks? Because I thought it was generally accepted that wouldn't be the case. Granted, this same guy has also been a source for other PS5 info I've seen quite a few people unquestioningly take as truthful; what he's saying in this instance doesn't sound particularly great for the PS5's variable frequency strategy, so I'd be interested to see if those same types still take his word or somehow find ways to argue against him this time xD.

My personal opinion? Variable frequency was always going to be a bit of a pain no matter how they went about it. The approach just demands more micromanagement on the developer's side: needing to know how much power their code will likely draw, to keep clock ranges within a spec that won't produce excess heat and thus force a reduction in power and, in turn, clock frequency... yeah, that's going to complicate things no matter how you look at it.

It's being suggested that Sony have some logic on the APU to "automatically" handle power load shifting within 2 ms, but how exactly does that part of the design even work? How does it determine when to adjust power? Do devs need to write triggers in their code to signal "hey, this is probably going to draw a lot of power, so start reducing the power load on Event X, okay?" Because that would require the developer's direct input, and it wouldn't be automatic in the sense of requiring no dev input, the way it's been suggested.

If that silicon is doing the detection automatically, I guess it could be using a type of sensor with a microprocessor or microcontroller unit built in. It has to be able to detect the currents, so I guess it'd need to be integrated into the PSU, but that would complicate the PSU design in both engineering and cost. And I can't imagine that type of sensor capability (if it requires no developer input) comes cheap, not if they want quality.
Why can't a developer just lock their own frequency? Develop the game so nothing draws over that frequency. I don't know the first thing about how it works, just a thought. 🤷‍♂️
 

mckmas8808

Mckmaster uses MasterCard to buy Slave drives
I mean.. if the clocks are variable, the clocks are variable.

Sony's solution is just different; while he has a point he's being a bit silly with the language here.

You just don't understand how games are made. That's the truth. Nothing he said is silly.
 

Jon Neu

Banned
The X1 compared to the PS4 was massively adrift in performance by architecture design and specs, so bumping the APU wasn't going to impact stability of GDDR5, because it didn't have lots of fast memory

That's irrelevant. MS didn't know the specs of the PS4 either.

The point is that you can bump the power of your console if it’s well designed.


And lastly, the 1850MHz clock of the XsX is already near top tier compared to current PC GPUs that use the old-hat conventional solution. At these limits, pushing harder will disproportionately increase power draw and stress the cooling solution

It still has room. They could easily make the console reach 13 teraflops if they wanted.

But they just don’t need to do so. They are already clearly more powerful.
 

MastaKiiLA

Member
Even though it's true that supplying power might not be the highest priority in designing a system, it still is a huge influence indirectly. Why? Because of heat.
Consoles are rarely designed to have a power supply over 300W. Not only do you want to avoid power-hungry systems; the amount of power used directly influences the size and acoustics of the console. That's why I find it a bit odd to say that decisions made for the PS5 are not dictated by the power supply. That's probably one of the first things set in stone, and everything else grows from that, based on the capability of the hardware manufacturer - in this case, AMD.

Specifically in the case of the PS5, power is really central to its design. They went with a (relatively) constant power delivery and variable clocks on the different components, as opposed to the fixed clocks and variable power design of the XSX and pretty much every other console that ever existed.

As for the amount of power needed for a game console not being complicated, Cerny disagrees with that in the Road to PS5 presentation. Short version is that engineers have to 'guess' how much power a system will use.
Power management and power delivery aren't the same. My post was in response to the notion that Sony "gimped" their PU because they changed clock speeds at a later date. The former is laughable, and the latter is impossible to know if you weren't privy to the design process. That information would be closely guarded since MS and Sony are both competing in the same arena.

As for your reply, I don't know if that's conflating power management with power delivery. I think the PS4 turbine is a known thing, and would be affected by some of the design choices that Cerny described them wanting to fix. That's power management, but it wouldn't be on the PU side; it'd be on the controller side, with how load balancing is performed onboard. If we want to use Cerny's words here, then it jibes with his efficiency mantra: making the use of power more efficient at their targeted performance range. That would work with the notion that power is important, but it's still trumped by what their initial targets were for performance, and what the tolerances of the cooling package are. I don't think it was referring to the PU in any specific way, other than it's the source that the system is modulating.
 

Dontero

Banned
Mark Cerny looked at the statistics of people's homes and realized one truth: that people in their homes have issues with power delivery. The power delivered to them is really varied, and they have huge issues with how they use their home hardware.

How many of you had to turn off the fridge to play PS2?
How many of you had to think about switching off the lights in the house to charge your pads?
HOW MANY kids, instead of just enjoying their games, had to think about their use of electricity!

Thankfully Mark Cerny realized the truth, and thanks to a truly innovative approach to console design he made it possible for console power requirements to be completely stable. With this unique technology, kids will no longer have to calculate how much power is currently being used in the house and switch off extra equipment just in case the PS5 requires 280W instead of 250W.

Now the PS5 will require 250W no matter what!
Sony is really stepping up their game!

In a normal console like the PS2 or PS3, when a game hit a really power-hungry segment, the console's power requirement would jump from 250W to almost 300W!!! Imagine how troublesome that is!

With Sony's patented technology, when a PS5 game reaches a power-hungry segment it will slow down its framerate and lower its resolution, so that your fridge can stay on while you play games.

No more tyranny of varied power delivery!

Moreover, Sony and Cerny gave developers tools to make extra sure the console will not exceed its default power requirement! There will even be profiles for them to use! Just one click and the game lowers framerate and resolution so that power draw stays at 250W.

How insane is that? Sony and Cerny literally gave people light, because you can now play PS5 and have the house lights on AT THE SAME TIME! WOOOSH!


edit:



This is false and I object! It is not derailing but trolling. I hereby demand that the mods change the reason for the ban in this thread from derailing to trolling, and remove the part about it being unfunny, since people found it funny, which is confirmed by the reactions and reposts. Otherwise the mods will be declared to have absolutely "no humor".

Moreover, my trolling was meant to show the absolute insanity of PR statements about power management, as if this were a huge change and something users will care about or even notice in their daily lives.
 
Fair point, except that the gap between 360 and X1 was much bigger (R&D time) than X1X to XsX.
The X1 compared to the PS4 was massively adrift in performance by architecture design and specs, so bumping the APU wasn't going to impact stability of GDDR5, because it didn't have lots of fast memory.

The X1 also cost £100 more and had a big empty-box (external PSU) form factor with a more expensive cooling solution to match, which the XsX doesn't have.
And lastly, the 1850MHz clock of the XsX is already near top tier compared to current PC GPUs that use the old-hat conventional solution. At these limits, pushing harder will disproportionately increase power draw and stress the cooling solution.

Unlike the PS5, the XsX CU count suggests they haven't gone for an even distribution of transistors in the APU to allow clocking/cooling it differently. If Xbox clocks higher, then I would expect it to be so small a spec-sheet bump that it wouldn't be worth the risk - even if a higher clock would give them more performance on narrower workloads.

Slight correction; it's 1825, not 1850. They could theoretically clock higher, but like MS themselves have said it's not all about the TFs. They had a target, they hit it, they happen to incidentally have the higher TF out of the two, and they aimed for their target for maximum balance and efficiency of the design.

It's kind of like both MS and Sony chose a path of balance and optimization in their design rather than just brute force, but happened to settle upon different configurations and TF numbers to achieve that.

Why can't a developer just lock their own frequency? Develop the game so nothing draws over that frequency. I don't know the first thing about how it works, just a thought. 🤷‍♂️

Mainly because the developer would need to know exactly what effect their code is having on system power draw, or else some of their code might end up running gimped (or not running at all) if the processor components need more power to execute it but aren't being granted that power because the game is forbidding it.

Plus, given how this probably works, the power management is likely handled by a utility in the BIOS/UEFI-style environment the OS has access to, which likely won't permit game code to have direct control of that feature, as it could cause stability and security issues. I'm assuming that would be the case, anyway.
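
If I had to sketch the shape of that guess (and this is purely my illustration - every name and number below is invented, not anything Sony or AMD has confirmed), a controller like that wouldn't even need current sensors in the PSU. It could estimate power each control window (a couple of milliseconds, going by what's been suggested) from on-die activity counters and a calibrated model of the silicon, with zero developer input:

```c
#include <stdint.h>
#include <stdio.h>

/* Hypothetical per-window activity counters read from the silicon. */
typedef struct {
    uint32_t cpu_ops;  /* instructions retired this window */
    uint32_t gpu_ops;  /* shader/texture work this window  */
    uint32_t mem_ops;  /* memory-controller transactions   */
} activity_t;

/* Invented calibration: per-op energy weights plus static leakage. */
static double modelled_watts(const activity_t *a, double ghz)
{
    return (0.8e-6 * a->cpu_ops + 1.1e-6 * a->gpu_ops +
            0.5e-6 * a->mem_ops) * ghz * 1e3 + 40.0;
}

/* Each window: step the clock down only while the modelled draw
 * exceeds the fixed budget the cooling solution was sized for.  */
static double next_clock(const activity_t *a, double ghz)
{
    const double budget_w = 200.0;
    while (ghz > 0.8 && modelled_watts(a, ghz) > budget_w)
        ghz -= 0.025;
    return ghz;
}

int main(void)
{
    activity_t light = { 20000, 30000, 10000 };
    activity_t heavy = { 40000, 70000, 30000 };
    printf("light workload: %.3f GHz\n", next_clock(&light, 2.23));
    printf("heavy workload: %.3f GHz\n", next_clock(&heavy, 2.23));
    return 0;
}
```

Because the input is a model of the workload rather than a thermometer or a PSU sensor, every console would compute the same clock for the same code - which is at least consistent with the "100% predictable" claim.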
 
Mark Cerny looked at the statistics of people's homes and realized one truth: that people in their homes have issues with power delivery. The power delivered to them is really varied, and they have huge issues with how they use their home hardware.

How many of you had to turn off the fridge to play PS2?
How many of you had to think about switching off the lights in the house to charge your pads?
HOW MANY kids, instead of just enjoying their games, had to think about their use of electricity!

Thankfully Mark Cerny realized the truth, and thanks to a truly innovative approach to console design he made it possible for console power requirements to be completely stable. With this unique technology, kids will no longer have to calculate how much power is currently being used in the house and switch off extra equipment just in case the PS5 requires 280W instead of 250W.

Now the PS5 will require 250W no matter what!
Sony is really stepping up their game!

In a normal console like the PS2 or PS3, when a game hit a really power-hungry segment, the console's power requirement would jump from 250W to almost 300W!!! Imagine how troublesome that is!

With Sony's patented technology, when a PS5 game reaches a power-hungry segment it will slow down its framerate and lower its resolution, so that your fridge can stay on while you play games.

No more tyranny of varied power delivery!

Moreover, Sony and Cerny gave developers tools to make extra sure the console will not exceed its default power requirement! There will even be profiles for them to use! Just one click and the game lowers framerate and resolution so that power draw stays at 250W.

How insane is that? Sony and Cerny literally gave people light, because you can now play PS5 and have the house lights on AT THE SAME TIME! WOOOSH!

That sounds nice, but the XSX is almost 2TF more powerful at sustained performance, and smaller in size to boot.
 

EverydayBeast

thinks Halo Infinite is a new graphical benchmark
Sony just gave PlayStation a giant shell from day one; the new shell, graphics card, etc. will get the PS5 to 2027.
 

mckmas8808

Mckmaster uses MasterCard to buy Slave drives
What we need is an actual detailed hardware breakdown like we've had with the Series X.

Also we could use CURRENT HW/SW Sony engineers giving talks about the nuances of the console.

Very weird that we haven't had either yet... it's not that far from launch.

Too many questions. Feels like smoke and mirrors/bait and switch at this point.

This is how you know we've hit pure joke territory. I honestly can't tell if this is a troll post or a smart/amazing hyperbole joke post poking fun at people having bad reading comprehension and a lack of awareness.
 
This is how you know we've hit pure joke territory. I honestly can't tell if this is a troll post or a smart/amazing hyperbole joke post poking fun at people having bad reading comprehension and a lack of awareness.
I honestly can't tell you either. The FUNNIEST thing about this is how regular-ass joes actually think they understand anything about hardware engineering. We hear these general breakdowns, we pick up on the terms, and now we think we're experts. So much so that we have regular rando people, who now believe they KNOW stuff, calling out people like Tim Sweeney: "This guy doesn't know what he's talking about" :messenger_neutral:
 

Dolomite

Member
I just want to know how loud this console will be. If it's a low hum, idgaf honestly.
If it's an F-15 takeoff then I may wait for a slim version
 

J_Gamer.exe

Member
You've got to laugh at this thread.

People completely misunderstanding the tweet and what it actually meant.

Another of Sony's next-gen breakthroughs, this time with the clocks: it means getting more out of what you have - the GPU, for instance - than with a fixed clock.

Most game code isn't as taxing as the much rarer worst-case game code that a fixed-clock system has to be limited by, so the hardware could actually be running at a higher clock most of the time.

The XSX with the same setup could probably run at 2GHz most of the time and only need to drop when those worst-case scenarios come around.

Therefore the PS5 clocks are superior overall and give Sony an advantage in other areas of the GPU that rely on clock speed.

Devs have said the performance is actually close - closer than a simplistic number comparison suggests. Add in cache scrubbers for higher CU occupancy and that's another factor.
 

PaintTinJr

Member
That's irrelevant. MS didn't know the specs of the PS4 either.

The point is that you can bump the power of your console if it’s well designed.
They most certainly did know the PS4 specs - other than the last-minute bump to 8GB of GDDR5 - hence why they thought their 8GB of DDR3 + ESRAM was enough to offset having a lesser GPU. And no one suggested that the X1 wasn't well constructed for cooling, or that the XsX is deficient at cooling its current specs.

It still has room. They could easily make the console reach 13 teraflops if they wanted.
But they just don’t need to do so. They are already clearly more powerful.
You don't know that, and it would need a clock 125MHz higher than it is now, which - considering the async memory setup was caused by signal integrity issues, according to Xbox engineers - would seem like no small feat.

What is equally possible - as the scenario you claim - is that Xbox thought they knew the PS5 specs back at the GitHub leak and extrapolated 9.2TF based on the highest PC GPU boost clocks (~2GHz), at a time when they probably didn't have async memory and the XsX was probably at 11.2TF - with the 2TF spec difference they wanted.

Then, as things got closer to reveal, finding out that the PS5 had higher clocks and was 10TF caused them to push their own clocks higher at the expense of losing a unified 20GB @ 560GB/s setup. If that is the case, then they've already used up more than their headroom for a bump.

If the XsS is real, and if it has a unified 10GB @ 560GB/s, that would seem like a big indicator that the XsX was previously lower than 12TF and had 20GB @ 560GB/s. It would certainly explain why those on the side of team Xbox FUDing the PS5 as a 9.2TF machine are pushing that narrative long after the Road to PS5 GDC presentation.
 

Genx3

Member
Power management and power delivery aren't the same. My post was in response to the notion that Sony "gimped" their PU because they changed clock speeds at a later date. The former is laughable, and the latter is impossible to know if you weren't privy to the design process. That information would be closely guarded since MS and Sony are both competing in the same arena.

As for your reply, I don't know if that's conflating power management with power delivery. I think the PS4 turbine is a known thing, and would be affected by some of the design choices that Cerny described them wanting to fix. That's power management, but it wouldn't be on the PU side; it'd be on the controller side, with how load balancing is performed onboard. If we want to use Cerny's words here, then it jibes with his efficiency mantra: making the use of power more efficient at their targeted performance range. That would work with the notion that power is important, but it's still trumped by what their initial targets were for performance, and what the tolerances of the cooling package are. I don't think it was referring to the PU in any specific way, other than it's the source that the system is modulating.

SmartShift is literally power management. It will allow either the CPU or the GPU more power depending on load.
Why would you think we're debating something else?
The power supply has to be able to handle the full load that the console can produce.
I stated in an earlier post that the reason Sony likely didn't use a bigger power supply was heat.
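
For anyone trying to picture the shifting part: here's a minimal sketch of load-proportional sharing, with invented numbers (an illustration of the idea, not AMD's actual SmartShift algorithm). The total package power never changes; only the CPU/GPU split moves:

```c
/* Fixed total package power (invented figure). */
#define TOTAL_W 200.0

typedef struct { double cpu_w, gpu_w; } split_t;

/* Hand each side power in proportion to its current demand, so a
 * GPU-bound scene borrows watts an under-used CPU isn't spending. */
split_t smartshift_split(double cpu_demand, double gpu_demand)
{
    double total = cpu_demand + gpu_demand;
    if (total <= 0.0) {                 /* idle: split evenly */
        split_t even = { TOTAL_W / 2, TOTAL_W / 2 };
        return even;
    }
    split_t s = { TOTAL_W * cpu_demand / total,
                  TOTAL_W * gpu_demand / total };
    return s;
}

/* smartshift_split(1.0, 3.0) -> 50 W CPU, 150 W GPU
 * smartshift_split(2.0, 2.0) -> 100 W each          */
```

The power supply only ever sees TOTAL_W - it gets sized once, for that full load, which is exactly why heat becomes the limiting factor.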
 

MastaKiiLA

Member
SmartShift is literally power management. It will allow either the CPU or the GPU more power depending on load.
Why would you think we're debating something else?
The power supply has to be able to handle the full load that the console can produce.
I stated in an earlier post that the reason Sony likely didn't use a bigger power supply was heat.
Your choice of words did not suggest this at all. You don't gimp something that is built for a solution. That suggests shrinking a PU, which you would have no knowledge of, and couldn't possibly speculate on. You don't know what their projected operating tolerances were to begin with, to know what targets they would shoot for on clock speed. Thus, you wouldn't know what their projected power requirements would be, so saying they gimped the PU is just baseless speculation.

Speculation shouldn't get delivered with that level of certainty. What we can say with certainty is that unless we can prove they missed a target, the power management system is built for the system they had in mind. I haven't seen anyone produce anything even close to evidence anywhere on the hardware side. The pseudo-science being discussed here works for good banter, but no one actually knows for sure. It's Schrödinger's console at this point.
 

Genx3

Member
Your choice of words did not suggest this at all. You don't gimp something that is built for a solution. That suggests shrinking a PU, which you would have no knowledge of, and couldn't possibly speculate on. You don't know what their projected operating tolerances were to begin with, to know what targets they would shoot for on clock speed. Thus, you wouldn't know what their projected power requirements would be, so saying they gimped the PU is just baseless speculation.

Speculation shouldn't get delivered with that level of certainty. What we can say with certainty is that unless we can prove they missed a target, the power management system is built for the system they had in mind. I haven't seen anyone produce anything even close to evidence anywhere on the hardware side. The pseudo-science being discussed here works for good banter, but no one actually knows for sure. It's Schrödinger's console at this point.

I'll admit I posted with a level of certainty that maybe I shouldn't have. It is really my opinion.
I apologize if I offended you by it.

About the bolded:
The name of the thread is:
Balancing clocks for max power draw is a different animal than variable clock throttling

It's literally the topic of discussion.
Just because I didn't state it in my post doesn't mean it's not understood. Everyone here is discussing the topic at hand.
Should I copy and paste the topic into each of my posts?
 

Jon Neu

Banned
and the XsX was probably at 11.2TF - with the 2TF spec difference they wanted.

Then, as things got closer to reveal, finding out that the PS5 had higher clocks and was 10TF caused them to push their own clocks higher at the expense of losing a unified 20GB @ 560GB/s setup. If that is the case, then they've already used up more than their headroom for a bump.

Shit, impressive movie.

Is Neil Cuckmann the director?
 

PaintTinJr

Member
Shit, impressive movie.

Is Neil Cuckmann the director?
Why the juvenile response? Why assume I'm a fan of TLoU?

I thought we were speculating with the technical info available, but your reaction suggests that you now don't believe they have headroom to bump the XsX - which, given your earlier assertion that the XsX is the most powerful console and didn't need a bump, all seems a little strange.

Here's a question related to your
MS could also give another power up to the console right now if they want it, but there’s no need because they have already won the power battle.

and my hypothesis of the XsX previously being 11.2TF (1650MHz clock of the Titan X IIRC):

If the XsX doesn't need a clock bump but could get one, then why wouldn't they do it, if it could take their headline game, Minecraft RT, from a 30fps cap at 1080p up to a 60fps cap? Their "optimised for XsX" push is supposed to be about 60fps and 120fps IIRC, so surely a clock bump would help there, no?
 

Jon Neu

Banned
Why the juvenile response? Why assume I'm a fan of TLoU?

I thought we were speculating with the technical info available, but your reaction suggests that you now don't believe they have headroom to bump the XsX - which, given your earlier assertion that the XsX is the most powerful console and didn't need a bump, all seems a little strange.

Here's a question related to your


and my hypothesis of the XsX previously being 11.2TF (1650MHz clock of the Titan X IIRC):

If the XsX doesn't need a clock bump but could get one, then why wouldn't they do it, if it could take their headline game, Minecraft RT, from a 30fps cap at 1080p up to a 60fps cap? Their "optimised for XsX" push is supposed to be about 60fps and 120fps IIRC, so surely a clock bump would help there, no?

I just find it funny how we have gone from the PS5 being the overclocked one to it's actually the XsX that overclocked after seeing the PS5 specs.

It's impressive. The greatest subverting of expectations ever. Sony always leading.

But no, MS simply doesn't need to do it after seeing that they are already clearly more powerful.
 

jigglet

Banned
One of the first things Cerny said was that boost/variable clocks in the PS5 have a totally different meaning compared to their traditional use in PCs. If he started with this and went to great lengths to caveat the entire presentation with it... clearly they were expecting it would cause confusion? Why did they not simply come up with a new name for it? Sony is not shy about coming up with new marketing names; surely this would have been the time for it? Honestly this feels like a problem they've made for themselves.
 

ZywyPL

Banned
They bumped the Xbox One at the last minute and nothing bad happened.

Nothing good either ;p

One of the first things Cerny said was that boost/variable clocks in the PS5 have a totally different meaning compared to their traditional use in PCs. If he started with this and went to great lengths to caveat the entire presentation with it... clearly they were expecting it would cause confusion? Why did they not simply come up with a new name for it? Sony is not shy about coming up with new marketing names; surely this would have been the time for it? Honestly this feels like a problem they've made for themselves.

Well of course it is different: on PC the CPU and GPU work completely independently, whereas PS5/AMD SmartShift is a mobile technology where the CPU and GPU are tied together, sharing the power draw. IMO Cerny/Sony should've skipped that part, said the CPU runs at this speed and the GPU at that speed, and never even mentioned the variable clocks - no one would ever be able to tell if they were dropping or not, or by how much. So yeah, they just created the confusion themselves.
 

ToadMan

Member
Yeah, after reading the additional info in the DF follow-up interview to the Road to PS5, it was interesting that the power control unit in the APU uses a simulation model of the silicon, so that the specifics of each individual APU aren't important to the deterministic clock-and-power-versus-workload behaviour.

I was surprised to see Cerny mention that CPU, GPU and memory controller activity all play a part in the clock rate chosen by the power control unit. So in theory, an algorithm re-written to be more compute-intensive than memory-intensive might see a clock boost (and performance boost) if it reduced power draw, or vice versa.
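
To make that concrete with a toy version (my own sketch - the weights, cap and budget are invented, not from the interview): if the model charges memory traffic more energy per op than ALU work, a rewrite of the same task that trades memory ops for compute ops models out cheaper, and the power control unit can afford it a higher clock.

```c
/* Invented energy model: memory transactions cost more than ALU ops. */
static double watts(double alu_ops, double mem_ops, double ghz)
{
    return (0.6 * alu_ops + 1.4 * mem_ops) * ghz + 45.0;
}

/* Highest clock (up to an invented 2.23 GHz cap) that still fits a
 * fixed power budget for the given workload mix.                    */
static double clock_for(double alu_ops, double mem_ops, double budget_w)
{
    double ghz = 2.23;
    while (ghz > 0.8 && watts(alu_ops, mem_ops, ghz) > budget_w)
        ghz -= 0.01;
    return ghz;
}

/* clock_for(40, 60, 180) ~= 1.25 GHz   memory-heavy version
 * clock_for(70, 25, 180) ~= 1.75 GHz   compute-heavy rewrite */
```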

After considering the usage model of game consoles, the design has to be this way to maintain quality of service.


For example, if someone is playing a 60fps multiplayer game - Street Fighter, say - the logic of the game is based on the frame rate: hitboxes, network latency correction, control response and so on.

It would be a poor experience if someone playing in a different ambient environment (or just a “loser” in the silicon lottery) were to find they’re playing at a disadvantage because their system is modifying clocks while their opponent isn’t getting that.

In fact such a scenario would spell disaster for the brand, so they can’t allow varying clocks or power to impact performance in an unpredictable way.

With traditional console design, an overhead of TDP and computing power is left to ensure a stable frame rate. Couple that with fixed clocks and the experience is the same on all units. That takes a lot of testing, of course, and leaves performance on the table - and all of that is down to the developers to work out.


Using Sony’s approach - an SoC “model” determining the load to manage clocks and smartshift for power allocation - means that each unit reacts in the same way to the code it’s running regardless of ambient conditions or a particular piece of silicon.


It also means devs code to max out that model - they don't have to leave performance "on the table" because the system is managed algorithmically at runtime; performance and response are predictable and identical for every console.

Now having said that, there are risks. If Sony haven’t done a good job, the SoC model will be too conservative and there won’t be any dev “tricks” to extract more performance - they’re handcuffed by the system. The code can only be as good as the SoC model Sony implement.

If Sony go the other way and don’t have good QA, then their model may run too close to the limit of the hardware and in some environments lead to instability or failures.

So a lot remains to be seen with this solution. But the tech is interesting, and perhaps revolutionary if it does lead to performance and/or development gains over traditional console design.

I’m not sure how easy it will be to modify that SoC model post release either. Any changes to the model would be like changing the hardware - games could be affected in weird ways.

So yeah - there are a lot of implications and at the moment not much in the way of data to go on.
 

jigglet

Banned
Nothing good either ;p



Well of course it is different: on PC the CPU and GPU work completely independently, whereas PS5/AMD SmartShift is a mobile technology where the CPU and GPU are tied together, sharing the power draw. IMO Cerny/Sony should've skipped that part, said the CPU runs at this speed and the GPU at that speed, and never even mentioned the variable clocks - no one would ever be able to tell if they were dropping or not, or by how much. So yeah, they just created the confusion themselves.

The funny thing is, even off the top of my head I could come up with half a dozen marketing names for it. I don't know why they allowed themselves to step into this shit.
 

PaintTinJr

Member
After considering the usage model of game consoles, the design has to be this way to maintain quality of service.


For example, if someone is playing a 60fps multiplayer game - Street Fighter, say - the logic of the game is based on the frame rate: hitboxes, network latency correction, control response and so on.

It would be a poor experience if someone playing in a different ambient environment (or just a “loser” in the silicon lottery) were to find they’re playing at a disadvantage because their system is modifying clocks while their opponent isn’t getting that.

In fact such a scenario would spell disaster for the brand, so they can’t allow varying clocks or power to impact performance in an unpredictable way.

With traditional console design, an overhead of TDP and computing power is left to ensure a stable frame rate. Couple that with fixed clocks and the experience is the same on all units. That takes a lot of testing, of course, and leaves performance on the table - and all of that is down to the developers to work out.


Using Sony’s approach - an SoC “model” determining the load to manage clocks and smartshift for power allocation - means that each unit reacts in the same way to the code it’s running regardless of ambient conditions or a particular piece of silicon.


It also means devs code to max out that model - they don't have to leave performance "on the table" because the system is managed algorithmically at runtime; performance and response are predictable and identical for every console.

Now having said that, there are risks. If Sony haven’t done a good job, the SoC model will be too conservative and there won’t be any dev “tricks” to extract more performance - they’re handcuffed by the system. The code can only be as good as the SoC model Sony implement.

If Sony go the other way and don’t have good QA, then their model may run too close to the limit of the hardware and in some environments lead to instability or failures.

So a lot remains to be seen with this solution. But the tech is interesting, and perhaps revolutionary if it does lead to performance and/or development gains over traditional console design.

I’m not sure how easy it will be to modify that SoC model post release either. Any changes to the model would be like changing the hardware - games could be affected in weird ways.

So yeah - there are a lot of implications and at the moment not much in the way of data to go on.
I'm not necessarily disagreeing with you, but between the GDC talk and the DF follow-up interview Cerny has actually told us a lot more. Leaving performance on the table would mean something completely different when he was talking about code that saves power - because if the SoC sees power saved in busy workloads, it responds by boosting. So if you could use 128-bit instructions versus 256-bit-wide ones to do an identical task, the narrower algorithm should boost to higher clocks, from what I understood.
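
The nearest PC analogy I can give (x86 intrinsics, nothing to do with the PS5's actual instruction set) is the same array sum written at two SIMD widths - the 256-bit version lights up twice the ALU width per instruction, and on many CPUs that extra power cost is exactly what pulls sustained clocks down:

```c
#include <immintrin.h>

/* Identical task at two widths: out[i] = a[i] + b[i], n divisible by 8. */

void add_128(const float *a, const float *b, float *out, int n)
{
    for (int i = 0; i < n; i += 4) {   /* 128-bit: 4 floats per op */
        __m128 v = _mm_add_ps(_mm_loadu_ps(a + i), _mm_loadu_ps(b + i));
        _mm_storeu_ps(out + i, v);
    }
}

void add_256(const float *a, const float *b, float *out, int n)
{
    for (int i = 0; i < n; i += 8) {   /* 256-bit: 8 floats per op */
        __m256 v = _mm256_add_ps(_mm256_loadu_ps(a + i),
                                 _mm256_loadu_ps(b + i));
        _mm256_storeu_ps(out + i, v);
    }
}
```

On a fixed power budget the narrow version leaves headroom the SoC can spend on clock, so which one is faster end-to-end depends on the workload - which is presumably why Cerny framed it as something devs can optimise for.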

From the GDC talk we know that it was the quietness of the cooling solution that determined the maximum power draw, so whatever PlayStation determines as quiet enough is where the SoC model is set. Even if some systems slightly fail QA they will just be louder, AFAIK, and as the 3-year slim refresh arrives, they could change the SoC model slightly for increased performance at the cost of making older models sound a bit louder.

On first load-up, there may even be a situation in which the GPU is synthetically stressed - to lower clocks - until enough data has arrived in the GPU cache to commence work, and then relies on cache scrubbing for new data - because at the lowest clocks memory gets (relatively) closer, and at the highest clocks processing of every aspect gets faster, as Cerny explained in the GDC talk.

Despite the PS5 sounding straightforward to develop for and get great performance from, I expect optimisation on the PS5 to rival that of any of the esoteric PlayStations of the past, and major improvements through the generation.
 

GymWolf

Member
Mark Cerny looked at the statistics of people's homes and realized one truth: that people in their homes have issues with power delivery. The power delivered to them is really varied, and they have huge issues with how they use their home hardware.

How many of you had to turn off the fridge to play PS2?
How many of you had to think about switching off the lights in the house to charge your pads?
HOW MANY kids, instead of just enjoying their games, had to think about their use of electricity!

Thankfully Mark Cerny realized the truth, and thanks to a truly innovative approach to console design he made it possible for console power requirements to be completely stable. With this unique technology, kids will no longer have to calculate how much power is currently being used in the house and switch off extra equipment just in case the PS5 requires 280W instead of 250W.

Now the PS5 will require 250W no matter what!
Sony is really stepping up their game!

In a normal console like the PS2 or PS3, when a game hit a really power-hungry segment, the console's power requirement would jump from 250W to almost 300W!!! Imagine how troublesome that is!

With Sony's patented technology, when a PS5 game reaches a power-hungry segment it will slow down its framerate and lower its resolution, so that your fridge can stay on while you play games.

No more tyranny of varied power delivery!

Moreover, Sony and Cerny gave developers tools to make extra sure the console will not exceed its default power requirement! There will even be profiles for them to use! Just one click and the game lowers framerate and resolution so that power draw stays at 250W.

How insane is that? Sony and Cerny literally gave people light, because you can now play PS5 and have the house lights on AT THE SAME TIME! WOOOSH!
I love this post :ROFLMAO:
 

ZywyPL

Banned
The funny thing is, even off the top of my head I could come up with half a dozen marketing names for it. I don't know why they allowed themselves to step into this shit.

Actually, Velocity Architecture would be so spot-on, but that's already taken by MS ;p
 

RaySoft

Member
So he's saying hot environments will impact the clocks? Because I thought it was generally accepted that wouldn't be the case. Granted, this same guy has also been a source for other PS5 info I've seen quite a few people unquestioningly take as truthful; what he's saying in this instance doesn't sound particularly great for the PS5's variable frequency strategy, so I'd be interested to see if those same types still take his word or somehow find ways to argue against him this time xD.

My personal opinion? Variable frequency was always going to be a bit of a pain no matter how they went about it. The approach just demands more micromanagement on the developer's side: needing to know how much power their code will likely draw, to keep clock ranges within a spec that won't produce excess heat and thus force a reduction in power and, in turn, clock frequency... yeah, that's going to complicate things no matter how you look at it.

It's being suggested that Sony have some logic on the APU to "automatically" handle power load shifting within 2 ms, but how exactly does that part of the design even work? How does it determine when to adjust power? Do devs need to write triggers in their code to signal "hey, this is probably going to draw a lot of power, so start reducing the power load on Event X, okay?" Because that would require the developer's direct input, and it wouldn't be automatic in the sense of requiring no dev input, the way it's been suggested.

If that silicon is doing the detection automatically, I guess it could be using a type of sensor with a microprocessor or microcontroller unit built in. It has to be able to detect the currents, so I guess it'd need to be integrated into the PSU, but that would complicate the PSU design in both engineering and cost. And I can't imagine that type of sensor capability (if it requires no developer input) comes cheap, not if they want quality.
First he defines what some people think is the case with the PS5:
“Variable clocks” is a term used when top clock speeds get throttled due to heat. That can be unpredictable depending on the user’s local weather, etc. That can be difficult to optimize games for.
Then he states how the PS5 is actually doing it:
100% Predictably balancing clocks for max power draw is a different animal.
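
The distinction in code form, if it helps (numbers invented, purely to show which variable drives the clock):

```c
/* Thermal throttling: the clock is a function of die temperature,
 * which depends on the user's room - two identical consoles differ. */
double throttled_clock_ghz(double die_temp_c)
{
    return die_temp_c > 95.0 ? 1.6 : 2.0;
}

/* Power-balanced clocks: the clock is a pure function of the
 * workload's modelled draw - same code, same clock, in any room.  */
double balanced_clock_ghz(double modelled_watts)
{
    return modelled_watts > 200.0 ? 1.9 : 2.23;
}
```

Temperature never appears in the second function; that's the whole "different animal".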
 