B_Boss
Then where is the 60FPS performance?
For starters, Destiny 2 will run at 4K/60 as far as we know, straight from Luke Smith himself.
but that's not boosting? that's the opposite

It is 10.28Tf most of the time, but in rare cases where either the CPU or GPU would throttle due to workload, one chip can be downclocked so the struggling chip can maintain its max clockspeed.
That's what I understand from Cerny's talk, but feel free to correct me if I'm mistaken.
Boosting is just a name they gave to the increased clockspeed.
so its a 10TF machine but when the going gets tough it won't be? I find it curious they went with a design like this tbh. it seems like an odd choice
But people decided to completely misinterpret Cerny's explanation and call it a 9Tf machine that occasionally boosts to 10Tf instead.
Cerny literally explained all this.
Apparently both CPU and GPU will never be maxing at the same time, so when one struggles it leaves room for the other to be downclocked without a negative impact.
that's at the moment and this is only the start of the gen. same as when PS4 and XO launched, they weren't taxing the GPUs or CPUs early in the gen. I worry about how this will affect my PS5 later in the gen, with how they chose to make it work. I will also say that later in the gen SSD streaming will be used better as well, so that may help move things about quicker, but it may be bottlenecked by the variable CPU and GPU
Seems like a more efficient approach.
That's what I understand though, so again, if I'm wrong, someone can correct me on this.
No idea, that's something time will tell. The way I see it, ultimately they would have the same or similar results as they would have achieved with fixed clockspeeds at 10Tf.
well with fixed clocks you know what you are going to get, I guess. in a CPU-heavy game graphics will take a hit, and vice versa, in a GPU-heavy game the CPU will take a hit.
They just designed it in an opposite way with power being fixed, instead of clockspeeds.
That's how Cerny explained it, anyway.
Cerny also said that before they implemented SmartShift it was hard for the CPU and GPU to maintain 3GHz and 2GHz clocks respectively, so it does bring some valid concerns.

True, but he also said that with the current design, clockspeeds could be even much higher, but they capped them to achieve predictable performance.
Yeah, I suppose. The downside would probably be that your chips are locked, so you can't give the other a little boost, so there's less efficiency.
but if the chips are fast enough as is, why would they need boosting? it does feel as if the chips have been overclocked slightly and that's what they are doing with this "boost" mode. just seems a strange setup. I mean on PC we have maintained clock speeds and I don't see any boosting or throttling of either CPU or GPU to help performance. it's all new till we see on the ground what it does
They are overclocked afaik, that's why they use the term "boost".
nobody is interpreting it wrong, the message they came out with is fuzzy, which leads to speculation about how this will work. it's as simple as that. we are not talking about Xbox, we are talking about PS5. we know what the Xbox is and we are discussing how the PS5 will work

No. Why do people have so many problems with this? Watch the video of the talk. The PS5 has a PSU that runs at constant power. This constant power is built around the max (capped) frequencies. If they weren't capped then they would both go over budget, and scaling the cooling solution would make no difference. Games will be built and developed for those specs. However, at times, when the game isn't pushing one of the components to its threshold (due to need), the technology will hold the unused power in reserve so it can be reallocated to the other component (if needed). If anything, when the going gets tough for a few seconds, it makes it likelier they can 'steal' some unused power to level out that part of the game.
Take GTA V as an example, driving along a highway. You've got draw distance and all the other stuff going on. But hang on, the AI simulation isn't maxing out the CPU, so there's an extra 3 percent in the power budget. OK, well let's use that 3% reserve to boost the GPU even higher and maintain that steady 60fps. Suddenly you go through a tunnel. The traffic density drops, the GPU doesn't need to work as hard so neither component needs to be running at max, so it will drop them for efficiency. It's like looking at a wall on a PC game and getting 224fps then getting back into the action and it's running at a more realistic 120. Isn't it better to use v-sync at 120 and not have your GPU burning extra frames when they're not needed?
People are interpreting this wrongly (because they want to). The Xbox Series X is more powerful, it's that simple. I don't know why we get bogged down so much in these little details. I mean, when the DF videos come out, everyone is just setting themselves up to fail when they're then asked to explain why a '9TF GPU' is on par with the extra horsepower of the XsX.
We also need to see how this translates, because I also remember a very similar point being made about the PS4 with offloaded async compute and how it was going to push capabilities beyond the limitations of the Jaguar CPU as a result. Didn't really materialise in real life though, did it? There was nothing first party that clearly outstripped the Jaguar CPU, even on the Pro, which had 50% more GPU power for the offloading.
yeah that makes sense in terms of cooling tbh, hope it's better than my PS4 which sounds like it's ready to take off sometimes lol. maybe this solution will allow it to run near silent
But because they went with fixed power consumption and variable clockspeeds, they could come up with that cooling solution people were talking about to counter the heating issues.
It's not fuzzy at all though. The videos and articles all explain it simply, then they get run through various interpretations on forums like this or reddit subs with positive and negative spin. It's all there.

to the point where Digital Foundry were not sure of it?
They have an Xbox bias though.
do they really though? is this like Epic's bias towards PS5?
No, both CPU and GPU can run at max clock at the same time.
You're wrong. Neither the CPU nor the GPU is enhanced beyond its top power level. They are enhanced to reach their top power/max clocks. This interview made it clear, as expected, that boosting clocks is a scene-specific flexibility given to developers, and even the person interviewed said that this should be done incidentally and not sustained. So, like others, I would like to know what the PS5's sustained performance number is.
Thanks for clarifying.
Cerny said,
"The CPU and GPU each have a power budget, of course the GPU power budget is the larger of the two," adds Cerny. "If the CPU doesn't use its power budget - for example, if it is capped at 3.5GHz - then the unused portion of the budget goes to the GPU. That's what AMD calls SmartShift. There's enough power that both CPU and GPU can potentially run at their limits of 3.5GHz and 2.23GHz, it isn't the case that the developer has to choose to run one of them slower."
On the contrary, if you check post #57 you’ll see that I was right.
PlayStation 5’s Boost Clock Design Opens Up a Lot of Opportunities, Says Developer
You can throttle to the max but we prefer if you don't? What in the fuck?
Why are you so obsessed with those numbers?? As long as games run smooth and have great graphics, that's all that matters. Who the hell cares about TFLOPS, clock speeds, multipliers and all that bull crap? Just enjoy whatever platform you like with whatever games are available. Unless you're a developer.

Boost clock is just more performance, and more is always better. Developers can always optimize their code for the clock in the worst possible scenario, and everything above that clock will be a bonus (a little bit better framerate or higher resolution).
The problem however is that we don't know how far the PS5 GPU will downclock, and most likely we will never know (Digital Foundry has no tools to determine the PS5 GPU clock). I hope the PS5 GPU will stay at 10TF even in the worst possible scenario, but anything below 2170MHz would already give sub-10TF numbers.
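For reference, that 2170MHz threshold falls out of the standard back-of-envelope FP32 formula for a 36-CU GPU (36 CUs is the PS5's published count): TFLOPS = CUs × 64 shader ALUs × 2 ops per clock (FMA) × clock.

```python
# Back-of-envelope FP32 TFLOPS for a 36-CU RDNA-class GPU:
# CUs x 64 shader ALUs x 2 ops per clock (FMA) x clock in GHz.
def tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000.0


print(tflops(36, 2.23))  # ~10.28 TF at the max clock
print(tflops(36, 2.17))  # ~10.00 TF -- below this the figure dips under 10
```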
Sometimes even when the GPU is running at its full max clock of 2.23GHz, a set of instructions can flip many transistors at the same time. That's where the game has a power spike while running at 100% utilization. When that happens, without AMD SmartShift the GPU would have to downclock. But because of SmartShift, the CPU can give up part of its power budget to cover the GPU's power spike and compensate for the downclocking.

Nothing that post or that article points out contradicts what I said.
You have max clocks and sustained clocks. Like mentioned before that means the frequencies are variable but capped at 3.5 GHz for the CPU and 2.23 GHz for the GPU. That is the top level.
Beyond the top level: what you said implies that variable frequencies allow the CPU and GPU to go beyond 3.5GHz and 2.23GHz. They won't and they can't.
In other words, The CPU's 100% is 3.5Ghz and the GPU's 100% is 2.23 Ghz.
Any variation below that will have the CPU run at [100%-X%] and the GPU run at [100%-Y%], where X and Y can differ per scene.
That's why it's so important to know what the PS5's sustained/base clocks are if developers don't choose to vary them.
Who cares about TFLOPS? I always thought every PS fan on this site cared about TFLOPS, at least that's how it looked before people learned MS has the upper hand here.
That makes the clockspeeds even more sustainable than the way I understood it then, correct?
No matter how many times people explain this it still sounds like the most retarded fanboy nonsense

I would love to see NeoGAF with only Microsoft using variable frequencies. Would be so much fun to read thousands and thousands of posts why it's utter crap.
Yes, higher resolution and framerate and lower detail. Or a larger RAM pool.
TFLOPS don't paint the whole picture, but make no mistake: no matter what Cerny says, TFLOPS are still important, especially if someone wants to estimate performance on the same architecture. With more TFLOPS Epic would increase resolution and framerate in their tech demo.
Yes, the power consumption spike is not because of CPU/GPU utilization. It is because of code that makes many or most transistors flip at the same time. If developers can optimize and minimize power spikes, both CPU and GPU can run at max speed for sustained periods. Also, power spikes happen only for short periods of time.
The Epic demo doesn't run at 4K/60 because it isn't optimized yet, not because of hardware limitations. Epic confirms there is enough budget to run at 4K 60 after optimization. Research plz..
That shuts down the 9Tf narrative then.
I will attach the link from the Digital Foundry interview with Mark Cerny, in case you haven't read it.
This developer really worded this badly when saying "We prefer if you didn't, but if there's like a fringe case where you're just off that tiny bit of performance you need, we will let you squeeze a little bit extra.", which actually means that the console normally doesn't run at those max clocks and only in special cases would you actually use that. Pretty sure we are going to see a difference in games when DF does their analysis.

And some wonder why "9Tf" results in bans.
Where has Epic confirmed there's enough GPU budget to run UE5 at 4K 60fps?
I must be missing where people are trash-talking the Series X? I admit I don't frequent every thread, but I've gotten used to people trolling the PS5 to the point that the mods need to ban people for constantly spreading false info. The most down-talk I've seen about the Series X is that theoretically, if a dev took advantage of the double throughput on PS5, something would have to be cut down on with Series X. For the most part I'd expect first party devs to be the most likely to take full advantage of that, though, meaning you can't even compare the game on both platforms. Is anyone denying more TFLOPs means the X can push out more effects? Is anyone lying about the numbers the Series X does? I'm not saying it doesn't exist, but thousands of posts? Really? I just don't see it.
Thousands may be a tad low, actually. There’s a reason this place is known as Sony Gaf to many. It’s more balanced than...... other forums.... but the fan bias still exists. For reference go in the next gen discussion thread. But be warned... once you go in, you will never leave...
Take this bit of rope, put both hands on both ends and your foot in the middle. When you pull one end, the other end moves down. Pull that end, and the first goes down. Carefully evaluate whether you need more GPU or CPU power in certain situations, but remember you can never go over a certain point.

Read the comments above, developers do not have to "carefully evaluate" anything.
And just for that second sentence you will probably be banned.
I'm not saying anything about 9TF or anything. I don't really care. I'm purely pointing out that this developer worded it in such a way that in most cases the PS5 doesn't run at max clocks.