Several developers speaking to Digital Foundry have stated that their current PS5 work sees them throttling back the CPU in order to ensure a sustained 2.23GHz clock on the graphics core.
"The CPU and GPU each have a power budget; of course, the GPU power budget is the larger of the two," adds Cerny. "If the CPU doesn't use its power budget - for example, if it is capped at 3.5GHz - then the unused portion of the budget goes to the GPU."
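The budget shifting Cerny describes is easy to picture with a toy model. This is just the arithmetic of the quote, not Sony's actual scheme, and all the wattages are invented for illustration:

```python
# Toy illustration (not Sony's actual algorithm): a fixed total power
# budget split between CPU and GPU, where any power the CPU doesn't
# use is handed to the GPU, SmartShift-style. All numbers are made up.

TOTAL_BUDGET_W = 200.0   # hypothetical total APU power budget
CPU_BUDGET_W = 60.0      # hypothetical CPU share
GPU_BUDGET_W = TOTAL_BUDGET_W - CPU_BUDGET_W

def gpu_power_available(cpu_draw_w: float) -> float:
    """GPU gets its own budget plus whatever the CPU leaves unused."""
    cpu_used = min(cpu_draw_w, CPU_BUDGET_W)
    unused_cpu = CPU_BUDGET_W - cpu_used
    return GPU_BUDGET_W + unused_cpu

# A lightly loaded CPU (e.g. capped at 3.5GHz) frees up power for the GPU:
print(gpu_power_available(40.0))  # 160.0
print(gpu_power_available(60.0))  # 140.0
```

The point of the quote is just that the split isn't static: an idle CPU makes the GPU's effective budget bigger.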
Y'all know what it truly means but I won't say it because I got banned last time I did.
They hated Jesus because he told them the truth.
“Variable clocks” is a term used when top clock speeds get throttled due to heat. That can be unpredictable depending on the user’s local weather, etc. That can be difficult to optimize games for. 100% Predictably balancing clocks for max power draw is a different animal.
Yeah, I am honestly really curious how it will hold up in terms of temperature and how often it will actually sit at those boost clocks. My game room is upstairs and can get pretty warm during the summer, so am I just going to have to accept poorer frame rates as the system will be unable to keep a stable clock? Or will my console stay at those clocks and get all toasty? And then they talked about having a "Master APU" or whatever they called it that they are using to profile the clock variation of all chips but how will they ensure that every chip will hold up to those clocks exactly the same way?
I feel like they would have been better using locked clocks like every other console
The tweet is stating the complete opposite of what you are writing.
You are our own Jesus and Sony's engineers are all a bunch of deceiving Judases.
I do my best to spread compassion, love, and most importantly, truth.
Eh, it's just poorly worded. He's just saying the PS5 aims to balance clocks for max power draw, and that it's very different from the traditional method of clocks throttling based on temps.
Isn't he essentially saying the term "variable clocks" shouldn't really be applied to the PS5, because traditionally it has been associated with throttling due to temperatures and is unpredictable, whereas the PS5 is the "different animal" in a good way?
So he's saying hot environments will impact the clocks? Because I thought it was generally accepted that wouldn't be the case. Granted, this same guy has also been a source for other PS5 info that I've seen quite a few people take as unquestionably truthful; what he's saying in this instance doesn't sound particularly great for the PS5's variable-frequency strategy, so I'd be interested to see whether those same types still take his word or somehow find ways to argue against it this time xD.
My personal opinion? Variable frequency was always going to be a bit of a pain no matter how they went about it. The approach just demands more micromanagement on the side of the developer. They need to know roughly what power consumption their code will draw, to keep clocks within a spec that won't result in excess power use and excess heat, which would force a reduction in power and thus a reduction in clock frequency. That's going to complicate things no matter how you look at it.
It's being suggested that Sony have some logic on the APU to "automatically" handle power load shifting within 2ms, but how exactly does that part of the design even work? How does it determine when to adjust power? Do devs need to write triggers in their code to signal "hey, this is probably going to draw a lot of power, so start reducing the power load on Event X, okay?" Because that would require the developer's direct input, and it wouldn't be automatic in the sense of requiring no dev input, the way it's been suggested.
If that silicon is doing the detection automatically, I guess it could be using some kind of sensor with a microprocessor or microcontroller unit built in. It would have to be able to detect the currents, so I'd guess it would need to be integrated into the PSU, but that would complicate the PSU design in both engineering and cost. And I can't imagine that type of sensing capability (if it requires no developer input) comes cheap, not if they want quality.
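For what it's worth, a control loop like the one being speculated about here wouldn't necessarily need current sensors in the PSU or developer-written triggers. A crude sketch of the idea, entirely my own guess with a made-up power model and made-up constants:

```python
# Rough sketch of activity-based frequency capping -- my own guess at
# the concept, not Sony's implementation. Power is *estimated* from
# workload activity counters, and frequency is stepped down until the
# estimate fits the budget. No temperature or current sensing needed,
# and no input from the game developer.

POWER_BUDGET_W = 140.0                       # hypothetical GPU budget
FREQ_STEPS_GHZ = [2.23, 2.1, 2.0, 1.9, 1.8]  # hypothetical bins

def estimated_power(freq_ghz: float, activity: float) -> float:
    """Crude dynamic-power model: P ~ activity * f^3
    (cubic because raising f also drags voltage up with it)."""
    return 15.0 * activity * freq_ghz ** 3

def pick_frequency(activity: float) -> float:
    """Highest frequency whose estimated power fits the budget."""
    for f in FREQ_STEPS_GHZ:
        if estimated_power(f, activity) <= POWER_BUDGET_W:
            return f
    return FREQ_STEPS_GHZ[-1]  # floor at the lowest step

print(pick_frequency(0.5))  # light workload  -> 2.23 (full clock)
print(pick_frequency(0.9))  # heavy workload -> 2.1 (stepped down)
```

Because the frequency is a pure function of the measured activity, the same code would pick the same clock on every console, which is presumably what the "deterministic" claim is about.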
Someone didn't read this correctly.
I mean.. if the clocks are variable, the clocks are variable.
Sony's solution is just different; while he has a point he's being a bit silly with the language here.
He shouldn't have said this because, looking at the responses, even the most verbose idiot on this forum needs to work on his comprehension.
can't retort
I understand the confusion.
Please explain to us who have no idea.
"It's a completely different paradigm," says Cerny. "Rather than running at constant frequency and letting the power vary based on the workload, we run at essentially constant power and let the frequency vary based on the workload."
An internal monitor analyses workloads on both CPU and GPU and adjusts frequencies to match. While it's true that every piece of silicon has slightly different temperature and power characteristics, the monitor bases its determinations on the behaviour of what Cerny calls a 'model SoC' (system on chip) - a standard reference point for every PlayStation 5 that will be produced.
Source

"Rather than look at the actual temperature of the silicon die, we look at the activities that the GPU and CPU are performing and set the frequencies on that basis - which makes everything deterministic and repeatable," Cerny explains in his presentation. "While we're at it, we also use AMD's SmartShift technology and send any unused power from the CPU to the GPU so it can squeeze out a few more pixels."
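The "deterministic and repeatable" part is the crux of the distinction people are arguing about. A minimal illustration of the difference, with made-up numbers and a deliberately crude thermal model of my own, not Sony's:

```python
# Why activity-based clocking is deterministic while temperature-based
# throttling is not. My own illustration with invented numbers.

THROTTLE_TEMP_C = 85.0

def temp_based_clock(workload: float, ambient_c: float) -> float:
    """Traditional boost: throttle when the die gets hot. The result
    depends on ambient temperature, so two consoles running the same
    game in different rooms can end up at different clocks."""
    die_temp = ambient_c + 50.0 * workload  # crude thermal model
    return 2.23 if die_temp < THROTTLE_TEMP_C else 2.0

def activity_based_clock(workload: float) -> float:
    """PS5-style: frequency is a pure function of the workload (judged
    against a common 'model SoC'), so every console behaves the same."""
    return 2.23 if workload < 0.8 else 2.1

# Same heavy workload, different rooms:
print(temp_based_clock(0.9, 20.0))  # 2.23 in a cool room
print(temp_based_clock(0.9, 45.0))  # 2.0 in a hot room
print(activity_based_clock(0.9))    # 2.1 everywhere
```

On this reading, the earlier worry about playing in a warm upstairs room affects how hot and loud the console gets, not what clocks the game sees.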
(Don't say that, because it's a paradigm shift and the most revolutionary thing ever (outside of the SSD, of course).)

Still amused at the people who were trying to convince themselves that variable frequencies between CPU and GPU were better than having fixed ones.
It's a compromise, not an advantage.
I always understood that the PS5 could run at full speed 24/7 if it needed to. But the variable part was that when it didn't need the power, it would clock down to save electricity, and that would be 95% of the time.
Basically makes no difference.

I wonder if someone would have an advantage playing a game in Alaska because of the cooler ambient temperature, versus someone playing in Brazil with a more humid and hot atmosphere. Imagine if variable clocks separated the gaming community.
Yes, you would, unless you love wasting energy. Why would you want your GPU and CPU to run at their maximum clock speed if whatever you're doing doesn't require it? I could force the CPU and GPU in my PC to boost at all times, but why would I want to?

As we all know, the PS5 can't handle maxing out its CPU and GPU clocks 100% of the time. If it could, it would, and you wouldn't need variable rates.