
Next-Gen PS5 & XSX |OT| Console tEch threaD


buenoblue

Member
Incorrect.

You've misunderstood both SmartShift and Sony's power monitoring system.

Smart shift adjusting clocks when not needed and sending more power to the CPU and GPU is all well and good, but my question still stands: what happens when full clocks on the CPU and GPU are needed? Surly demanding games will need full clocks on both CPU and GPU. Surly you must see that games will need this, yet the PS5 and smart shift will not allow this for sustained lengths of time. Surly this is a detriment to the system.
 
Could someone with an approved Beyond3D account grab that photo and post it on this forum?

Thank you.

Good discussion over there.
 
OK. You can feel an arrow rushing by your head. Through your fingertips?

Hyperbole much? 😁

Rumbly triggers are good, but they're not that good!

The triggers aren't the Xbox rumble triggers. They have variable tension and are much more advanced.

I'd assume he is referring to the audio or the haptic feedback (not the triggers). Maybe the haptics do a little fly-by vibration. Hard to judge without feeling it, but I could see something like that working and increasing immersion when mixed with some good 3D audio.
 

Njocky

Banned
Smart shift adjusting clocks when not needed and sending more power to the CPU and GPU is all well and good, but my question still stands: what happens when full clocks on the CPU and GPU are needed? [...] yet the PS5 and smart shift will not allow this for sustained lengths of time. Surly this is a detriment to the system.
Your question is 100% pertinent and legitimate, and none of the people who ridiculed it with memes bothered to provide an answer (far from uncommon in this thread). AMD SmartShift as described on AMD's website does not allow for this, yet Cerny has implied that the PS5 can. At least one developer has also publicly stated that Sony discourages running both processors at their max rate.
So your question was far from laughable for anyone not on a console defense mission.
 

BigLee74

Member
He's clearly talking about 3D audio with headphones; how the heck did you read it as "hear arrows through fingertips"? That would be some kind of voodoo-level sorcery :messenger_fearful:

Relax. Just read the tweet. It's tagged DualSense. If he's not talking about the controller, then it's on him, because there's nothing wrong with my reading comprehension skills! 😂 (maybe)
 

Mr Moose

Member
"near or at" is very different to both running at full speed all the time.

a game running at a variable 57-60fps and only being 60fps 5% of the time is "near or at" 60fps. Not what you'd call running at 60fps all the time though.
OK this is the last time I will talk about it with you...
You said "The gpu cannot reach that clock speed unless it is taking power away from the cpu, at which time the cpu isn’t at its max clock speed. "
the boost clock system should still see both components running near to or at peak frequency most of the time.
Or at
peak frequency most of the time means you are wrong, there's also the other quotes too but its pointless going around in circles.
What does at mean? Seems a few days ago people knew what it meant when Ubisoft said it, now it just means "or" like "and" did?
You say it can't, Cerny says it can.
End of discussion.
 

HAL-01

Member
Smart shift adjusting clocks when not needed and sending more power to the CPU and GPU is all well and good, but my question still stands: what happens when full clocks on the CPU and GPU are needed? [...] yet the PS5 and smart shift will not allow this for sustained lengths of time. Surly this is a detriment to the system.
Are you trying to spell "surely"?

And the console provides ample power to run both the CPU and GPU at max clocks. In the rare case of a particularly heavy load, the console is designed to downclock itself minimally in order to stay within the power budget.

In the same case of a heavy load that would otherwise cause a spike in power consumption, a "fixed clocks" system would handle it in a much less graceful manner. Heavy loads like this put stress on both systems, and it's in developers' hands to avoid such situations as much as they can.
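To put rough numbers on that "downclock minimally" point: if dynamic power scales roughly with the cube of frequency (P ≈ C·V²·f, with voltage tracking frequency), shedding 10% of power only costs a few percent of clock. A back-of-the-envelope sketch in Python; the cube-law exponent is a common approximation, not an official PS5 figure:

```python
# Rough sketch: how much clock must drop to absorb a power spike,
# assuming dynamic power ~ f^3 (P = C * V^2 * f, with V tracking f).
# The cube-law exponent is an approximation, not an official figure.

def freq_scale_for_power_cut(power_ratio: float, exponent: float = 3.0) -> float:
    """Frequency multiplier needed to land at `power_ratio` of current power."""
    return power_ratio ** (1.0 / exponent)

# A workload spikes 10% over budget, so we need to shed 10% of power:
scale = freq_scale_for_power_cut(0.90)
print(f"frequency multiplier: {scale:.3f}")          # ~0.966 -> a 3-4% downclock
print(f"2230 MHz drops to ~{2230 * scale:.0f} MHz")  # ~2153 MHz
```

That is the "graceful" part: a small clock dip buys back the whole spike, instead of a hard throttle or a shutdown.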
 

DinoD

Member

RE8 to feature 4K and Ray tracing on PS5 along with all the haptic trigger stuff with weapons etc

This is impossible. It just can't be. The PS5 is struggling to run RE8 in 1080p.

I know this from my trusted source: a cousin who works for Nintendo, who has a friend over at Capcom who works as a cleaner and who was able to interpret the notes an RE Engine programmer left at her desk.
 

PaintTinJr

Member
Smart shift adjusting clocks when not needed and sending more power to the CPU and GPU is all well and good, but my question still stands: what happens when full clocks on the CPU and GPU are needed? [...] yet the PS5 and smart shift will not allow this for sustained lengths of time. Surly this is a detriment to the system.
Much like my last response to MrFunSocks in this thread, your problem is that you don't acknowledge your question is bogus to begin with.

That isn't how workloads on a fixed-clock system use the watts supplied to the system - no synthetic workload could demand that of any system, never mind a real workload, as Cerny pointed out in his answers to DF in the post-Road to PS5 interview - so using it as some measure to suggest variable clocks in the PS5 aren't vastly superior to the status quo is all sorts of wrong.
 

HAL-01

Member
Your question is 100% pertinent and legitimate, and none of the people who ridiculed it with memes bothered to provide an answer [...] so your question was far from laughable for anyone not on a console defense mission.
Oh, but what we were laughing about is his confident assertion that "downclocking when needed is bad" and that "the PS5 downclocks itself to not sound like a jet engine", both laughable statements considering that with the least amount of research you'd know the variable clocks are not tied to thermals or cooling whatsoever.
 

DinoD

Member
All I'll say is... 3D audio is so immersive that if you paired VR with the PS5 and played, say, Resident Evil, and you're not wearing incontinence pants, you're in for a very bad day. 😛

I'm not even going to bother applying to get that camera adapter if RE8 is not PSVR-compatible on PS5.
 

FranXico

Member
OK, this is the last time I will talk about it with you... [...] You say it can't, Cerny says it can.
End of discussion.
I would even add that people are really failing to understand how feedback control systems work.

The frequencies might even be adjusted several times within a frame render. It does not work like throttling at all.
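As a toy illustration of the feedback loop being described, here is a Python sketch where the clock is re-evaluated several times within a single ~16.7 ms frame. Every number in it (the 200 W budget, the power model, the activity trace, the step sizes) is invented for illustration; none of it is Sony's actual controller:

```python
# Toy feedback loop: re-evaluate the clock several times within one frame.
# All constants (budget, model coefficients, activity trace) are invented.

F_MAX = 2230           # MHz cap
POWER_BUDGET = 200.0   # watts, hypothetical

def estimated_power(freq_mhz: float, activity: float) -> float:
    """Crude model: power grows ~cubically with clock, linearly with activity."""
    return activity * 250.0 * (freq_mhz / F_MAX) ** 3

def control_step(freq: float, activity: float) -> float:
    """Nudge the clock down if over budget, or creep back toward the cap."""
    if estimated_power(freq, activity) > POWER_BUDGET:
        return freq * 0.98            # small corrective downclock
    return min(F_MAX, freq * 1.01)    # recover toward the cap

freq = F_MAX
for activity in [0.6, 0.7, 0.95, 1.0, 0.9, 0.5, 0.4, 0.6]:  # ~2 ms slices
    freq = control_step(freq, activity)
    print(f"activity {activity:.2f} -> {freq:.0f} MHz")
```

The clock dips only while the heavy slices last and recovers within the same frame, which is exactly why it doesn't behave like thermal throttling.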
 

Elog

Member
Your question is 100% pertinent and legitimate, and none of the people who ridiculed it with memes bothered to provide an answer [...] so your question was far from laughable for anyone not on a console defense mission.

I will try to be nice ;)

Summary: SmartShift adds capacity for a piece of silicon - not the opposite - and both the GPU and CPU can run at max frequency most of the time, since SmartShift unifies the power budget.

First basics:

1) Every piece of silicon has hard limits for voltage, power and thermals.

2) Silicon is crazy under-utilised even when you think it is running at 100% (your monitoring tool says 100% GPU or CPU usage). In reality, roughly 50% of the silicon is idle (or thereabouts) under heavy load.

3) Frequency determines how fast the silicon can execute tasks.

4) Tasks are instruction streams that a piece of software throws at the silicon.

5) Frequency does not cost power - tasks do. Each task costs a certain amount of power, and the higher the frequency the silicon is running at, the higher the power cost of that specific task (roughly an exponential curve).

6) There are large power consumption differences between the various tasks a piece of silicon can do - some tasks/instructions cost much more than others.

7) Power consumption is what drives temperature/thermals.

When you design a console you are cost-conscious, so you make assumptions about what tasks will be thrown at the silicon. From there you choose your power supply and your cooling solution. You do not have crazy headroom here, due to cost.

The most common sign of GPU throttling is FPS drops: the GPU simply cannot keep up with the tasks thrown at it and they queue up. More severe throttling occurs if you hit a thermal, voltage or power limit. This can happen in a console, since you do not have crazy headroom, if someone finds a way to code really close to the metal for a section of code (e.g. 70% of the transistors switching for a short time - the console will most likely struggle to handle that due to voltage, power and/or thermal limitations).

The first conclusion with regard to the PS5 is that both the GPU and the CPU should be able to run at max frequency most of the time if need be. The budget is not defined by frequency but by power, and power consumption is defined by the instruction streams thrown at the silicon at any given frequency. You can ofc bomb the silicon out with certain instruction streams, but that is no different from a fixed-frequency environment.

In a fixed-power environment such as the XSX, you have a power amount/budget allocated to the GPU. If you throw instruction streams at the GPU such that it hits that power limit, you will see FPS stutter. SmartShift gives the APU one more potential out: if the CPU is under-utilised at that point, power can be redistributed to the GPU by downclocking the CPU so it consumes less power, i.e. SmartShift broadens the range of power that can potentially be given to the GPU under load. It is a benefit.

In other words: the XSX would benefit from having SmartShift as well, since it could handle power spikes better instead of throttling.
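A minimal sketch of that unified-budget idea in Python. The wattages and the fixed split are invented for illustration; they are not PS5 or XSX specs:

```python
# Shared (SmartShift-style) power budget vs. fixed per-component budgets.
# All wattages here are invented for illustration, not console specs.

TOTAL_BUDGET = 200.0                       # hypothetical APU budget, watts
FIXED_SPLIT = {"cpu": 60.0, "gpu": 140.0}  # hypothetical fixed allocation

def fixed_gpu_headroom(cpu_draw: float, gpu_draw: float) -> float:
    """With fixed budgets, GPU headroom ignores idle CPU watts."""
    return FIXED_SPLIT["gpu"] - gpu_draw

def shared_gpu_headroom(cpu_draw: float, gpu_draw: float) -> float:
    """With a unified budget, unused CPU power is available to the GPU."""
    return TOTAL_BUDGET - (cpu_draw + gpu_draw)

# GPU-heavy scene: the CPU only draws 35 W of its 60 W allocation.
cpu, gpu = 35.0, 140.0
print(fixed_gpu_headroom(cpu, gpu))   # 0.0  -> GPU pinned at its cap
print(shared_gpu_headroom(cpu, gpu))  # 25.0 -> 25 W can shift to the GPU
```

Same total watts in both cases; the unified budget just stops the idle CPU allocation from going to waste.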
 

Dodkrake

Banned
I have a question, AMD SmartShift currently only allows CPU resources to assist GPU, correct? Is there any chance PS5 allows a more advanced version to go both ways?

That's what it looks like it will do, but I'm not sure.

Nope. It's CPU > GPU, as demonstrated before.
 
In other words: the XSX would benefit from having SmartShift as well, since it could handle power spikes better instead of throttling.

Excellent job explaining this. So basically everyone is focusing on frequency when the real factor at play is power delivery. Never really thought about the role of power, but it makes sense: all the best graphics cards for overclocking have extra power delivery as well. Interesting stuff.
 

Md Ray

Member
No worries at all. So if it has zero disadvantages, why didn't Microsoft implement it too? They have access to it too. They worked closely with AMD too. Wouldn't it stand to reason that the Series S/X would use it if there were literally no drawbacks, only benefits? Honest-to-god serious question: why would MS not use it if it is an advantage in literally every scenario?

In a situation where both the CPU and GPU are being stressed to the max - full utilisation trying to run a big online open-world multiplayer game at 4K 60fps, for example - would the guaranteed constant clock speeds not give more power than the variable ones, where Cerny even said himself that they will have drops?

So why did MS choose not to implement it, since it's an AMD feature that is available to them? They worked closely with AMD on these new chipsets, so if it only offers advantages and would mean an even more powerful console, then why do you think they didn't choose to use it?
A topic on Variable Frequency & Power:

Bear with me here, long post incoming. It will be worth the read. And hopefully it'll clear up some misunderstanding regarding the variable frequency of the PS5. I'll do my best to explain with the limited info we have available. And pardon my English, as it's not my native language.

Instead of asking "Why didn't MS implement variable frequency?"

A better question is: why did Sony choose NOT to implement fixed frequency?

It's simply because the fixed frequency strategy didn't allow Sony to reach where they wanted to be in terms of frequency. What does that mean? Their goal from the get-go for PS5 was to set the GPU frequency as high as possible.

As Cerny mentions:
Mark Cerny said:
In general, I like running the GPU at higher frequency.
But with the fixed frequency strategy, he couldn't go as high as he wanted to, more on this below.
Mark Cerny said:
Running a GPU at 2 GHz was looking like an unreachable target with the old fixed frequency strategy.

So what's this "old fixed frequency strategy"?
It is to run the GPU at a constant frequency at all times and let power vary based on the GPU workload. Btw, GPU power draw varies A LOT from game to game and even scene to scene in a game. Anyway, this is the exact same strategy that PS4, PS4 Pro, and Xbox consoles from XB1 to Series X/S use.

And why was 2 GHz unreachable?
Cerny's above "2 GHz was looking unreachable" comment doesn't mean the chip was having thermal issues, nor that it was incapable of reaching 2 GHz. In fact, the chip had the potential to go beyond 2 GHz comfortably. It was simply that, with the old strategy of letting power vary, they were unable to achieve their goal of crossing 2 GHz; in other words, the power being supplied to the chip was likely insufficient. Now, with the variable frequency strategy, "a completely different paradigm" as Cerny calls it, they're able to tap into the full potential of the GPU in terms of its frequency. More on this below.

Mark Cerny said:
It's a completely different paradigm: rather than running at constant frequency and letting power vary based on the workload, we run at essentially constant power and let the frequency vary based on the workload.

So what is this different paradigm and how are they achieving higher frequency with this variable frequency strategy?

To this Cerny answers:
Mark Cerny said:
We supply a generous amount of electrical power and then increase the frequency of GPU and CPU until they reach the capabilities of the system's cooling solution.

So what was previously an unreachable 2 GHz is now well over 2 GHz.
Mark Cerny said:
With this new paradigm, we're able to run way over that. In fact, we have to cap the GPU frequency at 2.23 GHz so that we can guarantee that the on-chip logic operates properly.
So the goal of 2.2+ GHz is now achieved, and it was done, in Cerny's words, by "supplying a generous amount of electrical power", i.e. constant power. They had to cap it at 2.23 GHz to guarantee that the on-chip logic operates properly. And SmartShift is used as a supplement to this strategy, letting the SoC squeeze out a bit more performance/pixels. SmartShift isn't the primary reason for variable frequency, btw.
Mark Cerny said:
While we're at it we also use AMD's "SmartShift" Technology and send any unused power from the CPU to the GPU so it can squeeze out a few more pixels.


Why does Cerny like running the GPU at a higher frequency?
Suppose Sony had gone with the same 56 CUs as MS, with 4 disabled (52 CUs active), for the PS5, with the same goal of 10.3 TF. They would have to set the frequency at 1544 MHz to achieve that. They would surely hit their 10.3 TF goal, but the performance would be noticeably different between 36 CUs at 2230 MHz and 52 CUs at 1544 MHz. Cerny even gave this example, but with 36 vs 48 CUs.
[Image: Cerny's Road to PS5 slide comparing a 36 CU and a 48 CU configuration]

Mark Cerny said:
If you just calculate teraflops you get the same number, but actually, the performance is noticeably different because teraflops is defined as the computational capability of the vector ALU.

That's just one part of the GPU, there are a lot of other units and those other units all run faster when the GPU frequency is higher. At 33% higher frequency rasterization goes 33% faster, processing the command buffer goes that much faster, the L2 and other caches have that much higher bandwidth, and so on.

About the only downside is that system memory is 33% further away in terms of cycles. But the large number of benefits more than counterbalance that.

At Hot Chips, MS revealed their "GPU Evolution" slide for Series X, comparing its GPU all the way back to the original Xbox One GPU.

[Image: MS's Hot Chips "GPU Evolution" slide, from the original Xbox One GPU to Series X]


For the GPU evolution slide, they focus on 4 metrics here to show the evolution.
  1. Computational power
  2. Memory bandwidth
  3. Rasterization rate and
  4. Pixel fillrate.
Here's how a notional PS5 GPU with a 52 CU @ 1544 MHz config would perform:
10.3 TFLOPS, 448 GB/sec, 6.18 Gtri/sec, 98.8 Gpix/sec

Here's what the actual PS5 GPU looks like with the current 36 CU @ 2230 MHz config:
10.3 TFLOPS, 448 GB/sec, 8.92 Gtri/sec, 142.7 Gpix/sec

You reach your TF goal, but look at rasterization and pixel fillrate: they take a massive hit. With 36 CUs the clock is 44% higher, so all those other units run 44% faster than they would with 52 CUs. Not only does rasterization go up; pixel fillrate goes up too, along with processing of the command buffer, and the L2 and other caches get 44% more bandwidth. Not to mention it would cost more money to go with a larger GPU. Please note I'm not downplaying MS here, just pointing out the strategy Sony has taken this time around. Whatever strategy MS followed works best for their machine, and I'm not downplaying that one bit.

Hope this post was able to explain Cerny's variable frequency approach a bit more, and why they went with it. There is obviously more to talk about, and there's still not much detail and info on things like actual power consumption, etc.
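Md Ray's two metric rows fall straight out of the standard throughput formulas, so here they are as a small Python check. The 64 shaders per CU and 2 FLOPs per shader per clock are standard RDNA figures; the 4 triangles per clock and 64 ROPs are the commonly cited RDNA 2 numbers, assumed here for the comparison:

```python
# Reproducing the two GPU-metric rows from standard throughput formulas.
# Shaders/CU, FLOPs/clock, raster rate, and ROP count are commonly cited
# RDNA 2 figures, assumed here for the comparison.

def gpu_metrics(cus: int, clock_ghz: float, rops: int = 64, tris_per_clk: int = 4):
    tflops = cus * 64 * 2 * clock_ghz / 1000.0  # 64 shaders/CU, 2 FLOPs/clock
    gtris = tris_per_clk * clock_ghz            # rasterization rate, Gtri/s
    gpix = rops * clock_ghz                     # pixel fillrate, Gpix/s
    return tflops, gtris, gpix

for label, cus, clk in [("36 CU @ 2.23 GHz ", 36, 2.230),
                        ("52 CU @ 1.544 GHz", 52, 1.544)]:
    tf, tri, pix = gpu_metrics(cus, clk)
    print(f"{label}: {tf:.1f} TF, {tri:.2f} Gtri/s, {pix:.1f} Gpix/s")
    # 36 CU @ 2.23 GHz : 10.3 TF, 8.92 Gtri/s, 142.7 Gpix/s
    # 52 CU @ 1.544 GHz: 10.3 TF, 6.18 Gtri/s, 98.8 Gpix/s
```

Identical teraflops, but the narrow-and-fast configuration wins every clock-bound metric by the full 2230/1544 ≈ 1.44x ratio.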
 
You guys are killing it with this PS5 GPU info. Seriously, those two posts gave me much more than Cerny's presentation. I can actually pretend I understand the design choices now.

Do we have any way of knowing how much power they are giving to the GPU on each system? Or is that something we might get once someone rips them apart and takes measurements? If the PS5 is giving more power to the GPU, could that close the GPU gap even more, especially with SmartShift?

Edit: Also, just looked it up, and the PS5 is rated 350W vs the XSX at 315W, so it seems the PS5 already has a power advantage without SmartShift.
 

AeneaGames

Member
OK? That doesn't address what I said. People want to know how often the GPU or CPU won't have their full power, meaning that 10.2 TF isn't always available.

As a Sony fan, I couldn't care less whether it drops at all in a game, or how often. I care about how the game looks, performs and plays.

The only people interested in such things are Xbox fans, so they can console-war on Twitter.
 

geordiemp

Member
Smart shift adjusting clocks when not needed and sending more power to the CPU and GPU is all well and good, but my question still stands: what happens when full clocks on the CPU and GPU are needed? [...] yet the PS5 and smart shift will not allow this for sustained lengths of time. Surly this is a detriment to the system.

That's not how frames work on either console. Sustained rubbish again, I see; people need to think in nanoseconds within a frame. Neither console is designed to have every transistor switching in any given nanosecond - that is worse than a FurMark-type power virus, and it would downclock the PS5 but also thermally throttle the XSX and shut it down.

When the CPU is busy, the GPU is not; and even when the GPU is busy and the CPU is chilling out, not every core is going full hammer.

If you watched the AMD reveal, RDNA2 GPUs are designed with pervasive fine-grained clock gating, which means frequency control at a small level, most likely per DCU. That is the RDNA2 advancement for fast clocks, amongst other things listed.


[Image: AMD RDNA 2 feature slide listing pervasive fine-grained clock gating]


The PS5 is doing the advanced RDNA2 clock management extended to include the CPU - call it whatever you want - and if you factor in race-to-idle you will get close to understanding what is being explained by Cerny, and by AMD for the 6800.

However, I suspect the word "sustained" means you don't want to understand, or can't comprehend. I am unsure, but your posts sound like MrFunSocks 2.0.
 

LucasBR

Member
I was wondering about the PS Plus Collection: will this "service" always be "free", or is there a chance it becomes another paid service from Sony?
 

LokusAbriss

Member
A topic on Variable Frequency & Power:

Bear with me here, long post incoming. [...] Hope this post was able to explain Cerny's variable frequency approach a bit more, and why they went with it.
All that seems pretty meaningless when we see the same performance in multiplats for XSX and PS5. For now, they're just two different approaches by Sony and Microsoft.

The PS5 seems to have an advantage in throughput and loading, but obviously not more. Even so, first-party titles will look better on PS5 again, because of the really strong and talented studios under Sony.

But to argue that either console has any huge advantage or disadvantage in comparison seems very unreasonable.

All in all, the base performance of both consoles could have been better. The gap between next-gen and PC will be immense in the coming years.
 

geordiemp

Member
I was wondering about the PS Plus Collection: will this "service" always be "free", or is there a chance it becomes another paid service from Sony?

Sony said it's free for Plus members...

Talking about possible future changes is the same kind of discussion as Microsoft doubling Game Pass to pay for all the new studios.

We can only sensibly discuss the situation today, not worry about tomorrow.
 

sircaw

Banned
A topic on Variable Frequency & Power:

Bear with me here, long post incoming. [...] Hope this post was able to explain Cerny's variable frequency approach a bit more, and why they went with it.

Move over geordiemp, you have been replaced. NOVICE. :lollipop_disappointed:

Md Ray, my man, let's talk technical stuff later, much later :messenger_smiling_hearts:
 

Njocky

Banned
I will try to be nice ;)

Summary: SmartShift adds capacity for a piece of silicon - not the opposite - and both the GPU and CPU can run at max frequency most of the time, since SmartShift unifies the power budget. [...] In other words: the XSX would benefit from having SmartShift as well, since it could handle power spikes better instead of throttling.

Seems like being "nice" isn't that hard if people would try.
Thanks for the breakdown, although not much of what you said, if any, is new to me. I have pre-ordered a PS5, so I obviously paid a lot of attention to the official communication on what's inside the machine.
  • The PS5 has variable CLOCKS, not just a transferable power budget.
  • What drives the clock fluctuation (whether it's workload or thermals) is less important than the fact that the available processing power at any given time is unpredictable to developers. Your assessment of fixed vs variable obscures that fact, but Cerny acknowledges and addresses it in his post-GDC interview with Eurogamer; no need to revisit this. Fixed clock profiles in PS5 dev kits are the result of this realisation ("it's useful for optimisation"): https://www.eurogamer.net/articles/digitalfoundry-2020-playstation-5-the-mark-cerny-tech-deep-dive
  • A legitimate question that was asked by someone else and ridiculed is: what happens if a component is not running at its maximum clock (because it doesn't have the power to) at a time when it needs to in order to cope with the task at hand?
  • We don't necessarily need your answer to that, because we have Cerny's. To paraphrase him, that would be an extreme, very rare edge-case scenario that the PS5 would handle more gracefully than any other console (by dropping frame rates or graphical quality instead of going into emergency shutdown).
  • It is easier to avoid "bombing out" (as you put it) a fixed target as opposed to a moving one. And that doesn't even take into account that it would take more load to bomb out either the XSX's CPU or GPU, as they are both more powerful than the PS5's.
  • If the XSX desperately needed AMD SmartShift, so would AMD's most powerful graphics card, the RX 6800 XT, and ironically both have fixed clocks and no SmartShift. Just because the PS5 has something doesn't make it a requisite, or even good. It allowed Sony to reach a goal. If that goal was to manage noise and thermals, early indications are that the XSX achieves that just fine with fixed clocks. If the goal was to lower gamers' electricity bills, I'll go ahead and give the PS5 a win on that, based on the intent alone (still have to see it in practice).
 
I was thinking about the rumors from months ago that the PS5 was using a more customized ray tracing solution that wasn't as taxing as AMD's. Could this be one of those surprises DF was teasing?
We all know just by looking at the specs of both machines that each one has its own advantages on that front, but it will be interesting to see how DX12 and whatever Sony is using compare in RT performance.
 

reksveks

Member
I was thinking about the rumors from months ago that the PS5 was using a more customized ray tracing solution that wasn't as taxing as AMD's. [...]

What was the company that had been linked to Sony before? I think it was doing a Lumen-like solution to GI.
 
  • If the XSX desperately needed AMD SmartShift, so would AMD's most powerful graphics card, the RX 6800 XT [...]
GPU != APU.
An APU has the CPU and GPU on the same silicon, and therefore the power budget is shared.
 

thelastword

Banned
I have a question, AMD SmartShift currently only allows CPU resources to assist GPU, correct? Is there any chance PS5 allows a more advanced version to go both ways?
Well, SmartShift is about shifting power back and forth between the CPU and GPU as needed. It's more or less power delivery to the component that needs the extra power at that moment. In essence, it's set up to minimize allocated power sitting unused on either side... where there is demand, of course...
Listen to this...




RE8 to feature 4K and Ray tracing on PS5 along with all the haptic trigger stuff with weapons etc
I guess the RE team are optimization gods... They went from 1080p, struggling to maintain 30fps with no RT, to 4K 60fps with RT... It seems Sony placed an extra GPU in all PS5 dev kits these last few weeks... Now the PS5 can finally deliver RE8 in 4K...
 