
[Digital Foundry] PS5 uncovered

Panajev2001a

GAF's Pleasant Genius
That comparison actually made the PS5's stupidly high clocks look really bad - after Cerny's claim that higher clocks actually give better results compared to the same computing power achieved with lower clocks and more CUs, it turns out that even less computing power gives virtually the same results, given the same CU count. They should've really stuck to 2GHz or even 1.9GHz and called it a day. And to think Cerny was talking about effective utilization of CUs...

Comparing higher clocks vs. more CUs assumes each design is comparable (same TFLOPS target), but also that neither has bottlenecks that mute its differentiating point... hence why the comparison DF chose is a bit iffy, but then again maybe they could not choose a better / more proper one (it reminds me a bit of how they benchmarked PS Vita memory cards using the PC USB backup tool).

There are diminishing returns in the efficiency gains you get from raising clocks, especially in an open system (Cerny correctly noted that you risk increasing the memory latency as seen by the GPU just by raising the CPU clocks). I do expect a closed-box, fixed-spec system built around it to make better use of the extra potential.
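As a rough illustration of why the "same TFLOPS, different shape" comparison is tricky, here is the standard back-of-the-envelope maths, a sketch assuming the usual RDNA figures of 64 shader lanes per CU and 2 FLOPs per lane per clock (only the headline console numbers below are public):

# Rough FP32 throughput estimate for an RDNA-style GPU: CUs x 64 lanes x 2 FLOPs (FMA) x clock.
def tflops(cus, clock_ghz, lanes_per_cu=64, flops_per_lane=2):
    return cus * lanes_per_cu * flops_per_lane * clock_ghz / 1000.0

print(tflops(36, 2.230))  # ~10.3 TFLOPS - fewer CUs pushed to a higher clock (PS5-style)
print(tflops(52, 1.825))  # ~12.2 TFLOPS - more CUs at a lower, fixed clock (XSX-style)

The same TFLOPS number can therefore come from very different clock/CU mixes, which is exactly where the efficiency argument (and its bottlenecks) lives.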
 
Last edited:

sinnergy

Member
That comparison actually made the PS5's stupidly high clocks look really bad - after Cerny's claim that higher clocks actually give better results compared to the same computing power achieved with lower clocks and more CUs, it turns out that even less computing power gives virtually the same results, given the same CU count. They should've really stuck to 2GHz or even 1.9GHz and called it a day. And to think Cerny was talking about effective utilization of CUs...
For their box it's true, no lies there. They maximized their system; they have 36 CUs used to maximum efficiency. But MS played it all close to the chest and outclassed and out-designed them with an out-of-the-box form factor. It is what it is. Not saying that we won't see amazing games from Sony, because we will; we have seen what their studios can do with the PS4.

But hardware-wise, MS outclassed them.
 

Kenpachii

Member
:LOL:, sure, we have the "pack" shitting on the PS5 and calling it a crappy design out of genuine concern :rolleyes:.
It is not enough to have a well-made console; the other must be shit... people must not see any positive in it. Then there's the coordinated effort to ensure no positive PS5 thread is left without "intervention", or else people may start believing in such things, and that must not be allowed.

Shit design is shit design, mate. That box is throttling on the clocks they gave it. Why? Shit design. And all of this could easily be solved.

@NXGamer can you explain why you said the GPU wouldn't "throttle" by more than 50 or 60 MHz? Or is it just not worth your time?




The bolded is wrong, dude. I'm not sure why you keep thinking this. @SonGoku just summarized the DF article (it's better than the video, so PLEASE go read it). These points below (written by DF, by the way) are what you need to keep in mind.



  • The CPU and GPU each have a power budget. If the CPU doesn't use its power budget - for example, if it is capped at 3.5GHz - then the unused portion of the budget goes to the GPU.
  • There's enough power that both CPU and GPU can potentially run at their limits of 3.5GHz and 2.23GHz, it isn't the case that the developer has to choose to run one of them slower.
  • GPU will spend most of its time at or near its top frequency in situations where the whole frame is being used productively in PS5 games. The same is true for the CPU, based on examination of situations where it has high utilization throughout the frame, we have concluded that the CPU will spend most of its time at its peak frequency.
  • With race to idle out of the equation and both CPU and GPU fully used, the boost clock system should still see both components running near to or at peak frequency most of the time.
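To make the first two bullets concrete, here is a purely illustrative toy model of a shared power budget with per-component caps, in the spirit of what DF describes; the split, wattages and behaviour are invented for the example, nothing here is the actual PS5 logic:

# Toy model only: one total power budget, nominal CPU/GPU shares, and SmartShift-style
# reallocation of whatever the CPU leaves unused. All wattage numbers are made up.
CPU_SHARE_W = 60.0    # invented nominal CPU share of the budget
GPU_SHARE_W = 140.0   # invented nominal GPU share of the budget

def shift_power(cpu_demand_w):
    """Return (power the CPU draws, power budget available to the GPU)."""
    cpu_used = min(cpu_demand_w, CPU_SHARE_W)             # CPU is capped at its share (and at 3.5GHz)
    gpu_budget = GPU_SHARE_W + (CPU_SHARE_W - cpu_used)   # unused CPU budget flows to the GPU
    return cpu_used, gpu_budget

print(shift_power(40.0))  # light CPU frame -> GPU can draw the leftover 20 W on top of its share
print(shift_power(60.0))  # CPU at its full share -> GPU still has its full nominal budget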

The NXGamer video was laughably bad, as I stated in the topic that covered his video. I wouldn't take anything the guy says as worth much.

About this part
  • There's enough power that both CPU and GPU can potentially run at their limits of 3.5GHz and 2.23GHz, it isn't the case that the developer has to choose to run one of them slower

Pretty sure the DF guy said it will not be able to push both, because that would mean fixed hardware clocks, which Cerny was also specific it doesn't have.

If that's not the case, then Cerny did a mighty bad job of explaining what his box does.
 
Last edited:

Panajev2001a

GAF's Pleasant Genius
Shit design is shit design, mate. That box is throttling on the clocks they gave it. Why? Shit design. And all of this could easily be solved.

:LOL:, I think your discord group needs to change its trolling tack a bit.
Still, you are right that Sony needs to take this into account too and not get frustrated. They keep explaining their design choices, which is good for people who are interested in such things, but it is a waste of time against the "is it really HW raytracing?" / HW-accelerated RT concern-troll brigade.
 
Last edited:

Panajev2001a

GAF's Pleasant Genius
For their box it's true, no lies there. They maximized their system; they have 36 CUs used to maximum efficiency. But MS played it all close to the chest and outclassed and out-designed them with an out-of-the-box form factor. It is what it is. Not saying that we won't see amazing games from Sony, because we will; we have seen what their studios can do with the PS4.

But hardware-wise, MS outclassed them.

I do not think they outclassed and out-designed them. They chose different targets and likely price points: one included higher TFLOPS, the other did not and invested those transistors elsewhere, using higher clocks to compensate. This is why they did not go with more than 36 active CUs: their transistor budget went elsewhere, and I do not think it was a bad choice in this case, but we shall see.
 

mckmas8808

Mckmaster uses MasterCard to buy Slave drives
For their box it's true, no lies there. They maximized their system; they have 36 CUs used to maximum efficiency. But MS played it all close to the chest and outclassed and out-designed them with an out-of-the-box form factor. It is what it is. Not saying that we won't see amazing games from Sony, because we will; we have seen what their studios can do with the PS4.

But hardware-wise, MS outclassed them.

You sure about that? The consoles aren't even out yet, mate, lol. Is teraflops the only thing you care about?

The NXGamer video was laughably bad, as I stated in the topic that covered his video. I wouldn't take anything the guy says as worth much.

About this part
  • There's enough power that both CPU and GPU can potentially run at their limits of 3.5GHz and 2.23GHz, it isn't the case that the developer has to choose to run one of them slower

Pretty sure the DF guy said it will not be able to push both, because that would mean fixed hardware clocks, which Cerny was also specific it doesn't have.

If that's not the case, then Cerny did a mighty bad job of explaining what his box does.

You sure DF said that? Where did they say that? Are you sure you heard or read it correctly?
 
The game was not designed for 8-core CPUs, mate. Barely any game is. Next gen will peg those CPUs to oblivion, especially with the increased density of everything: AI, physics, etc. That stuff was already going to happen this generation, until devs realized the CPUs simply couldn't handle it. There's also a big chance the GPU will always sit at 100% usage, since dynamic resolution at 4K will always push it to its max.

People can try to sugarcoat the bad design all day long, but what they should do is give Sony lots of shit so they can still make changes: redesign their box to get stable clocks, or even go so far as to redesign the entire box and slam in the same GPU Microsoft has.

I’m really quite certain you don’t have a clue what you are even talking about. This post is the text equivalent of this:

giphy-downsized-large.gif
 
Last edited:

Vroadstar

Member
Shit design is shit design, mate. That box is throttling on the clocks they gave it. Why? Shit design. And all of this could easily be solved.

The NXGamer video was laughably bad, as I stated in the topic that covered his video. I wouldn't take anything the guy says as worth much.

This sounds exactly like that "intervention" by X fans on videos about the PS5; I guess that plan is already in motion.

And you're a system architect too, ugh. The lengths some X fans are going to are so embarrassing.
 
Last edited:

Kenpachii

Member
You sure about that? The consoles aren't even out yet, mate, lol. Is teraflops the only thing you care about?



You sure DF said that? Where did they say that? Are you sure you heard or read it correctly?

It was somewhere around the Doom part where he was playing it, if I remember right. I could have heard him wrong though, as I was doing some admin work while listening to it, but that quote you just posted makes it sound like those clocks are fixed and will always be able to run at max frequencies whenever they feel like it. You either can run at full clocks or you can't; there is no in-between, basically.

That the CPU and GPU clocks jump up and down when needed isn't very interesting; that's just nice for your energy bill at the end of the day, and that's about it.

:LOL:, I think your discord group needs to change its trolling tack a bit.
Still, you are right that Sony needs to take this into account too and not get frustrated. They keep explaining their design choices, which is good for people who are interested in such things, but it is a waste of time against the "is it really HW raytracing?" / HW-accelerated RT concern-troll brigade.

Debunk what I say, or stop quoting me. This is the third time now in a topic that you've started shitting on my posts with no arguments other than your flower-power feelings logic, which I couldn't give two shits about. If you've got information I don't know, share it and hit my post with facts. I will even appreciate it. As of now you are just wasting my time.

I’m really quite certain you don’t have a clue what you are even talking about. This post is the text equivalent of this:

giphy-downsized-large.gif

Nice argument. Got any actual factual information or tech detail, instead of posts that say nothing other than "you're wrong"?

This sounds exactly like that "intervention" by X fans on videos about the PS5; I guess that plan is already in motion.

And you're a system architect too, ugh. The lengths some X fans are going to are so embarrassing.

Hey look, another one pops up. No argument, just straight-up shitposting. Nice job, mate.
 
Last edited:
It was somewhere around the Doom part where he was playing it, if I remember right. I could have heard him wrong though, as I was doing some admin work while listening to it, but that quote you just posted makes it sound like those clocks are fixed and will always be able to run at max frequencies whenever they feel like it. You either can run at full clocks or you can't; there is no in-between, basically.

That the CPU and GPU clocks jump up and down when needed isn't very interesting; that's just nice for your energy bill at the end of the day, and that's about it.



Debunk what I say, or stop quoting me. This is the third time now in a topic that you've started shitting on my posts with no arguments other than your flower-power feelings logic, which I couldn't give two shits about. If you've got information I don't know, share it and hit my post with facts. I will even appreciate it. As of now you are just wasting my time.



Nice argument. Got any actual factual information or tech detail, instead of posts that say nothing other than "you're wrong"?

lol, actual factual information. Not one thing you said in that entire post made sense or proved a point. Then you say the PS5 is a shit design, yet you literally cannot prove why. Why is the PS5 a "shit design"? The burden of proof is on you, Lead System Architect. Give a good answer or I'm reporting you for console warring and trolling.
 

Eliciel

Member
Girls and guys, hold up here for a moment!
You know, the big problem that we are all going to have is the following:

At the end of the day we all troll each other out of the love that we have for each other, the love we have for games and this industry.
I like being sarcastic and so on, yeah, but let's be very precise, honest and serious for a moment, and please hear me out; you will find yourself in here as well, I am certain:


If any of these companies fails to deliver something spectacular this generation, and by spectacular I mean great gaming experiences, it can get very f'ugly for single-player game experiences in a short-to-medium time frame.

My single-player experience derives almost exclusively from console gaming, and I am not going to exclude my Switch here. Switch games are freaking awesome.
At the end of the day, I started my console career with a SNES and moved over to a PS and an N64, and I was actually a games kid, not a console fan. Now I am a grown-up and have to pay my bills and pay for whatever I want to do. I play Xbox occasionally, but since its games have been available on PC, I am just playing on my new rig; it is THAT simple. I am just being as convenient as possible...
The current state of play is that the best single-player experiences I have witnessed came 40% from PS, 40% from Switch and 20% from Xbox. If I am to lose my best single-player experiences to a screw-up by any of these companies, be it Sony with their console, Xbox with their game studios, or even Switch (maybe they kill Mario, who knows), it will hurt badly, and to be honest my money is too important to me to spend on a console or a game that is really going to be sub-par to anything we expect.

At the end of the day the games will decide, and if those games don't deliver a good experience I am going to be very sad. The graphics I am fine with, if they are a good evolution from what we have seen on the PS4 Pro and Xbox One X. I do not really need the next generation of incredible 4K/120Hz graphics; personally, what I need is a good plot, good gameplay, a good and innovative idea, and I need heart and love. I need you to make me feel I spent my time on a product that tried to do something real here.

I love games as long as they are good.
That's all I wanted to say on a more serious note.
 
Last edited:

Panajev2001a

GAF's Pleasant Genius
It was somewhere around the Doom part where he was playing it, if I remember right. I could have heard him wrong though, as I was doing some admin work while listening to it, but that quote you just posted makes it sound like those clocks are fixed and will always be able to run at max frequencies whenever they feel like it. You either can run at full clocks or you can't; there is no in-between, basically.

That the CPU and GPU clocks jump up and down when needed isn't very interesting; that's just nice for your energy bill at the end of the day, and that's about it.



Debunk what I say, or stop quoting me. This is the third time now in a topic that you've started shitting on my posts with no arguments other than your flower-power feelings logic, which I couldn't give two shits about. If you've got information I don't know, share it and hit my post with facts. I will even appreciate it. As of now you are just wasting my time.



Nice argument. Got any actual factual information or tech detail, instead of posts that say nothing other than "you're wrong"?



Hey look, another one pops up. No argument, just straight-up shitposting. Nice job, mate.

:LOL: you are concern trolling, ignoring arguments made left and right while providing no data of your own beyond a hyper-cynical look at the data Cerny provides, and expecting others to do the homework for you while you just shit on things. You keep doing you... just do not throw a fit if people do not buy into what you are saying and poke holes in it.
 
A very small group of people seem to think a faster SSD, fewer CUs at a higher clock rate and Tempest audio will make up the difference in CPU, GPU and bandwidth speed.
The CPUs are virtually identical on both machines (a 3% difference). Considering that on PS5 the CPU won't have the audio and the I/O to process at all (which won't be the case on XSX, contrary to what they want you to believe), the PS5 CPU may well be more potent in actual games.

And XSX bandwidth has its constraints. But yeah, XSX has a bit more TFLOPS power, there is no denying that.
 
Girls and guys, hold up here for a moment!
You know, the big problem that we are all going to have is the following:

At the end of the day we all troll each other out of the love that we have for each other, the love we have for games and this industry.
I like being sarcastic and so on, yeah, but let's be very precise, honest and serious for a moment, and please hear me out; you will find yourself in here as well, I am certain:


If any of these companies fails to deliver something spectacular this generation, and by spectacular I mean great gaming experiences, it can get very f'ugly for single-player game experiences in a short-to-medium time frame.

My single-player experience derives almost exclusively from console gaming, and I am not going to exclude my Switch here. Switch games are freaking awesome.
At the end of the day, I started my console career with a SNES and moved over to a PS and an N64, and I was actually a games kid, not a console fan. Now I am a grown-up and have to pay my bills and pay for whatever I want to do. I play Xbox occasionally, but since its games have been available on PC, I am just playing on my new rig; it is THAT simple. I am just being as convenient as possible...
The current state of play is that the best single-player experiences I have witnessed came 40% from PS, 40% from Switch and 20% from Xbox. If I am to lose my best single-player experiences to a screw-up by any of these companies, be it Sony with their console, Xbox with their game studios, or even Switch (maybe they kill Mario, who knows), it will hurt badly, and to be honest my money is too important to me to spend on a console or a game that is really going to be sub-par to anything we expect.

At the end of the day the games will decide, and if those games don't deliver a good experience I am going to be very sad. The graphics I am fine with, if they are a good evolution from what we have seen on the PS4 Pro and Xbox One X. I do not really need the next generation of incredible 4K/120Hz graphics; personally, what I need is a good plot, good gameplay, a good and innovative idea, and I need heart and love. I need you to make me feel I spent my time on a product that tried to do something real here.

I love games as long as they are good.
That's all I wanted to say on a more serious note.

Well, we already know Sony is going to bring the fire when it comes to first-party games, especially single-player experiences. This smorgasbord of AAA GotY titles Sony keeps dropping is the culmination of decades of work from all of their major studios. They put in the time and effort and are being rewarded.

MS didn't learn that lesson until just recently. They took the loyalty of third parties for granted during the 360 era and were stunned when some of their biggest relationships turned blue: Call of Duty going PS for marketing and exclusives, Destiny going PS for marketing and exclusives, 2K basketball going PS for marketing, etc.

Now MS is trying to play catch-up with the spending spree they just had buying all of these studios, but just like DC did when they tried to jump the gun from Batman v Superman straight into Justice League, they are going to learn that it takes time to get to the same level of quality as Sony's first party. Possibly years. Can they do it? Certainly, but MS had better realize now that it's going to take a few broken eggs to make that omelette.
 
You sure about that? The consoles aren't even out yet, mate, lol. Is teraflops the only thing you care about?



You sure DF said that? Where did they say that? Are you sure you heard or read it correctly?
https://www.neogaf.com/threads/digital-foundry-ps5-uncovered.1534691/post-257644442

I guess what we can take from that is that as soon as a game is actually made with the new CPUs in mind, the GPU can't run full clocks.

Hell, even current-gen games like Assassin's Creed Odyssey: on PC at 60 FPS, that game can make a 9900K sweat. Vastly different CPU usage compared to 30 FPS.
 

makaveli60

Member
@NXGamer can you explain why you said the GPU wouldn't "throttle" by more than 50 or 60 MHz? Or is it just not worth your time?




The bolded is wrong, dude. I'm not sure why you keep thinking this. @SonGoku just summarized the DF article (it's better than the video, so PLEASE go read it). These points below (written by DF, by the way) are what you need to keep in mind.



  • The CPU and GPU each have a power budget. If the CPU doesn't use its power budget - for example, if it is capped at 3.5GHz - then the unused portion of the budget goes to the GPU.
  • There's enough power that both CPU and GPU can potentially run at their limits of 3.5GHz and 2.23GHz, it isn't the case that the developer has to choose to run one of them slower.
  • GPU will spend most of its time at or near its top frequency in situations where the whole frame is being used productively in PS5 games. The same is true for the CPU, based on examination of situations where it has high utilization throughout the frame, we have concluded that the CPU will spend most of its time at its peak frequency.
  • With race to idle out of the equation and both CPU and GPU fully used, the boost clock system should still see both components running near to or at peak frequency most of the time.
  • There's enough power that both CPU and GPU can potentially run at their limits of 3.5GHz and 2.23GHz, it isn't the case that the developer has to choose to run one of them slower.
If this point is true, then yes, you are right and I'm wrong. I definitely hope this is the case. However, based on what I gathered in this thread, this point is simply not true: the CPU and the GPU can't run at their max frequencies at the same time. Several people have said this. What's the truth then? We need the exact words from Cerny's mouth to decide, but I suppose he was vague, hence the confusion. And vague PR talk rarely means good news. Again, I hope I'm wrong.
 
Last edited:

slade

Member
From what I understand, the variable clocks on PS5 are there to address cooling concerns and keep fan noise down. Will it work? Is it a good solution? Like everything else, we will have to wait for the games and the analysis afterwards to really say one way or another. What I find hilarious is that Xbox fans who have maybe built their own PC once, or even better, the 'Chemical Engineer'... as if, he probably knows the window cleaner at Chem-U-Tech or something... think they know better than the hardware engineers designing the PS5. I've rarely seen so much hubris, even within this industry.
 

mckmas8808

Mckmaster uses MasterCard to buy Slave drives
  • There's enough power that both CPU and GPU can potentially run at their limits of 3.5GHz and 2.23GHz, it isn't the case that the developer has to choose to run one of them slower.
If this point is true, then yes, you are right and I'm wrong. I definitely hope this is the case. However, based on what I gathered in this thread, this point is simply not true: the CPU and the GPU can't run at their max frequencies at the same time. Several people have said this. What's the truth then? We need the exact words from Cerny's mouth to decide, but I suppose he was vague, hence the confusion. And vague PR talk rarely means good news. Again, I hope I'm wrong.

It's really not about either me or you being right. The man that made the PS5 is saying it's true. I think some of the Xbox fanboys are confusing you. I suggest you don't listen to them, because they have a clear agenda.
 

Bojanglez

The Amiga Brotherhood
This article just confirms what everybody already knew. The PS5 is a great system, the XSX is just better
... in terms of graphics.

When I ask my son which order he prefers each of our consoles in at the moment, his answer is inversely proportional to the TFs.
 
  • There's enough power that both CPU and GPU can potentially run at their limits of 3.5GHz and 2.23GHz, it isn't the case that the developer has to choose to run one of them slower.
If this point is true, then yes, you are right and I'm wrong. I definitely hope this is the case. However, based on what I gathered in this thread, this point is simply not true: the CPU and the GPU can't run at their max frequencies at the same time. Several people have said this. What's the truth then? We need the exact words from Cerny's mouth to decide, but I suppose he was vague, hence the confusion. And vague PR talk rarely means good news. Again, I hope I'm wrong.

The best way to describe it is like this: the PS5 has enough power budget to run both the CPU and GPU at full power, but as Richard explained in the video from the Cerny interview, there are going to be times when the workload isn't going to require the CPU or GPU to run full blast. SmartShift allows power to move back and forth between them to give one or the other extra juice when needed. The issue is that Xbox fanboys are trying to confuse people with FUD that it's an either/or situation: that either the CPU is up and the GPU is down, or vice versa. Cerny literally said that is not the case.
 

StreetsofBeige

Gold Member
The best way to describe it is like this: the PS5 has enough power budget to run both the CPU and GPU at full power, but as Richard explained in the video from the Cerny interview, there are going to be times when the workload isn't going to require the CPU or GPU to run full blast. SmartShift allows power to move back and forth between them to give one or the other extra juice when needed. The issue is that Xbox fanboys are trying to confuse people with FUD that it's an either/or situation: that either the CPU is up and the GPU is down, or vice versa. Cerny literally said that is not the case.
So if the PS5 has enough power to run both the CPU and GPU "at full power", what's the point of see-sawing back and forth if both are already running at max?
 

sinnergy

Member
You sure about that? The consoles aren't even out yet, mate, lol. Is teraflops the only thing you care about?



You sure DF said that? Where did they say that? Are you sure you heard or read it correctly?
I looked at the overall simplicity of both designs; MS won hands down. It is what it is.

Even the way they laid out the inner workings and PCBs so efficiently is a work of art: "less, but better".

On top of this they also have a 2 TF difference. I call that an outstanding achievement.
 
Last edited:
Ok, here's my basic question. If the PS5 can run both the GPU and CPU at full clocks no problem, and only reduces clocks when the load is light to conserve power, why did he say that when there is a demanding scene any unused CPU power can be sent to the GPU to get more out of it? I mean, it is capped at 2.23GHz, so one can only assume that in a demanding scene, if the CPU is running flat out, the GPU is obviously not running at 2.23GHz.
 
So if the PS5 has enough power to run both the CPU and GPU "at full power", what's the point of see-sawing back and forth if both are already running at max?

I have you on ignore, but I'll answer this question. More than likely the fluctuation accounts for the increase in heat and the cooling potential of the system. When Richard asked that exact question, Cerny said we would soon see in a system teardown how the cooling comes into play, and that people would be satisfied with what they've come up with.
 

makaveli60

Member
The best way to describe it is like this: the PS5 has enough power budget to run both the CPU and GPU at full power, but as Richard explained in the video from the Cerny interview, there are going to be times when the workload isn't going to require the CPU or GPU to run full blast. SmartShift allows power to move back and forth between them to give one or the other extra juice when needed. The issue is that Xbox fanboys are trying to confuse people with FUD that it's an either/or situation: that either the CPU is up and the GPU is down, or vice versa. Cerny literally said that is not the case.
Then my question is the same as StreetsofBeige's.
If they can run at their max frequencies at the same time, then why vary the frequencies at all? What's the point? Somehow this is not coming together.
 

mckmas8808

Mckmaster uses MasterCard to buy Slave drives
I REALLY don't understand why Richard Leadbetter made the DF video so freaking terrible compared to their article. The article is 1000x better, and he WROTE the article! Like, check out this part about the SSD.

Richard quoting Mark Cerny said:
With latency of just a few milliseconds, data can be requested and delivered within the processing time of a single frame, or at worst for the next frame. This is in stark contrast to a hard drive, where the same process can typically take up to 250ms. What this means is that data can be handled by the console in a very different way - a more efficient way. "I'm still working on games. I was a producer on Marvel's Spider-Man, Death Stranding and The Last Guardian," says Mark Cerny. "My work was on a mixture of creative and technical issues - so I pick up a lot of insight as to how systems are working in practice."

One of the biggest issues is how long it takes to retrieve data from the hard drive and what this means for developers. "Let's say an enemy is going to yell something as it dies, which can be issued as an urgent cut-in-front-of-everybody else request, but it's still very possible that it takes 250 milliseconds to get the data back due to all of the other game and operating requests in the pipeline," Cerny explains. "That 250 milliseconds is a problem because if the enemy is going to yell something as it dies, it needs to happen pretty much instantaneously; this kind of issue is what forces a lot of data into RAM on PlayStation 4 and its generation."

Do people here at GAF truly understand what this quote even means? Think about it for a second... data off the SSD can be BOTH requested AND delivered within one or two frames! NO... not 1 or 2 seconds, but 1 or 2 frames! With GPU cache scrubbers in the mix (I'm still not sure whether the scrubbers are hardware-based or not), this completely changes the way data can and will be used relative to this generation.

Let's assume a dev has data compressed with Kraken coming off the SSD at 18 GB per second and they are making a 30 fps game. That's 600 MB of data that can be both requested and delivered every frame! Marvel's Spider-Man on PS4 had 1.5 MB per frame being streamed from the HDD. The difference between 600 MB per frame and 1.5 MB per frame is orders and orders of magnitude! In theory, the quality of the textures in PS5 games should be darn near movie-like given those numbers for a 30 fps game. Next-gen games will 100% guaranteed look like this in real time....
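Spelling out that arithmetic (a throwaway sketch; the 18 GB/s figure is the best-case compressed number from the post, 5.5 GB/s is the raw SSD figure, and the 1.5 MB/frame is the Spider-Man number quoted above):

# Per-frame streaming budget: sustained throughput divided by frame rate.
def mb_per_frame(throughput_gb_s, fps):
    return throughput_gb_s * 1000.0 / fps  # treating 1 GB as 1000 MB for round numbers

print(mb_per_frame(18.0, 30))  # ~600 MB per frame at the best-case compressed rate
print(mb_per_frame(5.5, 30))   # ~183 MB per frame at the raw 5.5 GB/s rate
# versus the ~1.5 MB per frame Spider-Man streamed from the PS4 HDD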

 
The CPUs are virtually identical on both machines (a 3% difference). Considering that on PS5 the CPU won't have the audio and the I/O to process at all (which won't be the case on XSX, contrary to what they want you to believe), the PS5 CPU may well be more potent in actual games.

And XSX bandwidth has its constraints. But yeah, XSX has a bit more TFLOPS power, there is no denying that.

We don't know for sure whether the 3.5GHz speed on the CPU is with SMT disabled. If it is, then we are talking about more than just a 3% difference in CPU power. Given that 3.5GHz is a best case for Sony with the variable clock, it could be a lot more. Tell us more about the 'contrary to what they want you to believe' conspiracy comment, will you? Microsoft has stated that a custom decompression block reduces the texture-decompression I/O CPU usage to about a tenth of a Zen 2 core. It sounds like Sony has done a bit better at offloading I/O from the CPU, but I doubt there will be much difference between them, as both have a custom approach.

The biggest differentiator, from what I can see, is going to be ray tracing. The PS5 appears a little inadequate due to the lower CU count, and as a result this may impact the visual fidelity of ray tracing on the PS5.
 

StreetsofBeige

Gold Member
I have you on ignore, but I'll answer this question. More than likely the fluctuation accounts for the increase in heat and the cooling potential of the system. When Richard asked that exact question, Cerny said we would soon see in a system teardown how the cooling comes into play, and that people would be satisfied with what they've come up with.
You have me on ignore, yet you replied to me 5 minutes later. Ya, ok.

As for your reply that heating issues can make the system decrease clocks if things get hectic, that's exactly what many of us have been saying the whole time.

The system cannot maintain max speeds of 3.5 and 2.23 when it runs into tough operations leading to heat issues for a prolonged time. So that's why it downclocks.

And that's where we all get the vague "most of the time" statements, because nobody knows how often it will happen, when it will happen and how much it will happen.
 

mckmas8808

Mckmaster uses MasterCard to buy Slave drives
Then my question is the same as StreetsofBeige's.
If they can run at their max frequencies at the same time, then why vary the frequencies at all? What's the point? Somehow this is not coming together.

Stop listening to StreetsofBeige, first of all. Second, Cerny has already answered this question. The answer is to better cool the PS5 console with more predictability. Instead of having a fan run faster to cool the system down, the clocks vary to a slight degree to remain within the power budget and keep the console cool. Again, lowering the GPU from 2.23 GHz to 2.17 GHz saves roughly 10% of power, i.e. roughly 10% less heat (this will help the PS5 NOT run hotter).
 
Then my question is the same as StreetsofBeige's.
If they can run at their max frequencies at the same time, then why vary the frequencies at all? What's the point? Somehow this is not coming together.

Mark Cerny:
In short, the idea is that developers may learn to optimise in a different way, by achieving identical results from the GPU but doing it faster via increased clocks delivered by optimising for power consumption. "The CPU and GPU each have a power budget, of course the GPU power budget is the larger of the two," adds Cerny. "If the CPU doesn't use its power budget - for example, if it is capped at 3.5GHz - then the unused portion of the budget goes to the GPU. That's what AMD calls SmartShift. There's enough power that both CPU and GPU can potentially run at their limits of 3.5GHz and 2.23GHz, it isn't the case that the developer has to choose to run one of them slower."

"There's another phenomenon here, which is called 'race to idle'. Let's imagine we are running at 30Hz, and we're using 28 milliseconds out of our 33 millisecond budget, so the GPU is idle for five milliseconds. The power control logic will detect that low power is being consumed - after all, the GPU is not doing much for that five milliseconds - and conclude that the frequency should be increased. But that's a pointless bump in frequency," explains Mark Cerny.

PS5 caps its CPU and GPU clocks at 3.5GHz and 2.23GHz respectively, but how stable are the frequencies? At this point, the clocks may be faster, but the GPU has no work to do. Any frequency bump is totally pointless. "The net result is that the GPU doesn't do any more work, instead it processes its assigned work more quickly and then is idle for longer, just waiting for v-sync or the like. We use 'race to idle' to describe this pointless increase in a GPU's frequency," explains Cerny. "If you construct a variable frequency system, what you're going to see based on this phenomenon (and there's an equivalent on the CPU side) is that the frequencies are usually just pegged at the maximum! That's not meaningful, though; in order to make a meaningful statement about the GPU frequency, we need to find a location in the game where the GPU is fully utilised for 33.3 milliseconds out of a 33.3 millisecond frame.

"So, when I made the statement that the GPU will spend most of its time at or near its top frequency, that is with 'race to idle' taken out of the equation - we were looking at PlayStation 5 games in situations where the whole frame was being used productively. The same is true for the CPU, based on examination of situations where it has high utilisation throughout the frame, we have concluded that the CPU will spend most of its time at its peak frequency."

Put simply, with race to idle out of the equation and both CPU and GPU fully used, the boost clock system should still see both components running near to or at peak frequency most of the time. Cerny also stresses that power consumption and clock speeds don't have a linear relationship. Dropping frequency by 10 per cent reduces power consumption by around 27 per cent. "In general, a 10 per cent power reduction is just a few per cent reduction in frequency," Cerny emphasises.
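Cerny's numbers line up with the usual rough rule of thumb that dynamic power scales with frequency times voltage squared, and voltage tracks frequency, so power goes roughly with the cube of frequency. A quick sketch of that relationship (the cubic model is a simplification for illustration, not a statement about the actual PS5 power curve):

# Rough P ~ f^3 model: relative power as a function of relative frequency.
def relative_power(freq_ratio):
    return freq_ratio ** 3

print(1 - relative_power(0.90))         # ~0.27 -> dropping frequency 10% saves ~27% power
print(1 - (1 - 0.10) ** (1.0 / 3.0))    # ~0.035 -> a 10% power cut needs only ~3.5% less clock
print(1 - relative_power(2.17 / 2.23))  # ~0.08 -> the 2.23 -> 2.17 GHz example, in the same ballpark as the ~10% quoted upthread

That steep curve is the whole point of the design: small frequency concessions buy back a lot of power and heat headroom.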
 
Last edited:

StreetsofBeige

Gold Member
Stop listening to StreetsofBeige, first of all. Second, Cerny has already answered this question. The answer is to better cool the PS5 console with more predictability. Instead of having a fan run faster to cool the system down, the clocks vary to a slight degree to remain within the power budget and keep the console cool. Again, lowering the GPU from 2.23 GHz to 2.17 GHz saves roughly 10% of power, i.e. roughly 10% less heat (this will help the PS5 NOT run hotter).
In other words, the PS5 has trouble running at the full 3.5 and 2.23 at all times. If there weren't issues, they'd just leave it maxed out at all times like every other console.

So it has to downclock to cut back on power and heat. That's exactly what all of us noticed two weeks ago.
 

GymWolf

Member
Exactly. I really hope they know what they are doing with this PS5, but day after day it seems to me more and more that it could easily be a disaster in the end. And I really hope I'm not right, because I want the best possible PS5. If that means delaying and redesigning the thing, then so be it, but unfortunately, after announcing all of this, there is almost no chance of that.
That article where they talk about the console being too hot is 99% bullshit, don't worry about that.
 

makaveli60

Member
You have me on ignore, yet you replied to me 5 minutes later. Ya, ok.

As for your reply that heating issues can make the system decrease clocks if things get hectic, that's exactly what many of us have been saying the whole time.

The system cannot maintain max speeds of 3.5 and 2.23 when it runs into tough operations leading to heat issues for a prolonged time. So that's why it downclocks.

And that's where we all get the vague "most of the time" statements, because nobody knows how often it will happen, when it will happen and how much it will happen.
Yeah, I think this is it.
So here is how I understand all this:
Cerny wanted to boost the system somehow. He achieved this by introducing a constant power draw. A constant power draw means constant fan speed and noise. This constant power draw allows the clocks to be boosted to 3.5 GHz and 2.23 GHz respectively. The system can theoretically do both at the same time, but if both ran at max clocks all the time the system would overheat because of the constant fan speed. To solve this problem he introduced variable clock rates. This doesn't sound too bad in theory, but it will make game designers' jobs more difficult. I already mentioned an example: dynamic open-world games would suffer from this, because what happens in such a game cannot be controlled all the time by the developers. If, let's say, there are too many characters at a certain moment when everything is blowing up, it can result in tanking the framerate. Well, this also happens with existing consoles. So maybe it's not as bad as I imagined :) I just hope this doesn't mean developers will "downgrade" their games because they want to avoid such cases.
 
Yeah, I think this is it.
So here is how I understand all this:
Cerny wanted to boost the system somehow. He achieved this by introducing a constant power draw. A constant power draw means constant fan speed and noise. This constant power draw allows the clocks to be boosted to 3.5 GHz and 2.23 GHz respectively. The system can theoretically do both at the same time, but if both ran at max clocks all the time the system would overheat because of the constant fan speed. To solve this problem he introduced variable clock rates. This doesn't sound too bad in theory, but it will make game designers' jobs more difficult. I already mentioned an example: dynamic open-world games would suffer from this, because what happens in such a game cannot be controlled all the time by the developers. If, let's say, there are too many characters at a certain moment when everything is blowing up, it can result in tanking the framerate. Well, this also happens with existing consoles. So maybe it's not as bad as I imagined :) I just hope this doesn't mean developers will "downgrade" their games because they want to avoid such cases.

If you haven't, I really recommend rewatching the PS5 deep dive. They have introduced an algorithm into the chip that allows the PS5 to detect workloads and shift power on the fly between the GPU and CPU to compensate for whatever is going on in the game. When you combine that with the speed of the assets being streamed through the I/O from the SSD, the situations you are worried about are almost literally not going to happen. Cerny discussed this at length during the talk.
 

StreetsofBeige

Gold Member
In short, the idea is that developers may learn to optimise in a different way, by achieving identical results from the GPU but doing it faster via increased clocks delivered by optimising for power consumption. "The CPU and GPU each have a power budget, of course the GPU power budget is the larger of the two," adds Cerny. "If the CPU doesn't use its power budget - for example, if it is capped at 3.5GHz - then the unused portion of the budget goes to the GPU. That's what AMD calls SmartShift. There's enough power that both CPU and GPU can potentially run at their limits of 3.5GHz and 2.23GHz, it isn't the case that the developer has to choose to run one of them slower."
That bolded paragraph doesn't even make sense.

On one hand Cerny is talking about shifting power from one to the other if the CPU doesn't use its full potential, but then he says devs don't have to worry about the CPU or GPU running slower - which sounds like they're "potentially" running at the max, the full 3.5 and 2.23.

So if the system is already maxing out at 3.5 and 2.23, what's the point of shifting?

You know what the reason for shifting is? Because one of the two is underclocking. And if the other one isn't using its full power, it can be SmartShifted over. That might be fine for games that never max out the CPU/GPU. But for those games that do require it, both the CPU and GPU will "try" to max out as best they can, but they won't be able to, and will go into cool-down mode, because a sustained 3.5 and 2.23 is not doable.
 
Last edited:

Kenpachii

Member
lol, actual factual information. Not one thing you said in that entire post made sense or proved a point. Then you say the PS5 is a shit design, yet you literally cannot prove why. Why is the PS5 a "shit design"? The burden of proof is on you, Lead System Architect. Give a good answer or I'm reporting you for console warring and trolling.

This is what I said, in the short version.

The game was not designed for 8-core CPUs, mate. Barely any game is. Next gen will peg those CPUs to oblivion, especially with the increased density of everything: AI, physics, etc. That stuff was already going to happen this generation, until devs realized the CPUs simply couldn't handle it. There's also a big chance the GPU will always sit at 100% usage, since dynamic resolution at 4K will always push it to its max.

People can try to sugarcoat the bad design all day long, but what they should do is give Sony lots of shit so they can still make changes: redesign their box to get stable clocks, or even go so far as to redesign the entire box and slam in the same GPU Microsoft has.

This is what I mean by it, in the longer version.

FF15 is not designed for 8 cores.

Barely any game is; I can count on one hand the games that use 8 cores at close to 100% usage at times.

Next gen will hit those cores hard, the same as every gen does. Why? Complexity goes up, and with stuff like physics most likely sitting on the CPU cores, plus improved AI, you can bet those cores will get some work. As somebody with a 9900K: AC Odyssey works it to the point that it barely sustains 60 fps in towns at ultra settings, even at 5GHz. Now imagine next-generation open-world games, like the actual next Witcher 4; you think standing in town will not hit those cores hard? Or any open-world game for that matter? You think that Ryzen CPU has endless performance? Yeah, you can forget about it. The games I play on my 9900K already need far more performance than a 9900K can deliver. The cores will be slammed with extra work next gen; that's just the reality.

Then we can also talk about how code will hit those 8 cores and 16 threads. Will code still cripple the first core to shit? How's that going to work out? Why do you think Microsoft came forward with SMT off and on; will Sony use that, and are the clocks based on SMT off? So even if only 2 cores are used, will it still need to cripple the GPU when clocks go up? Then even a simplistic game could cripple the GPU as a result, the same as what happens with my 9900K, which sits at 20% usage but still bottlenecks. Do we know? Nope. Will we at some point? Probably not, unless you work with it. But like the DF guy says in his video, it will be fun to test stuff out and see how it interacts in games like Spider-Man on the PS5, or uncapped-framerate games, to see how the PS5 copes with it, or the Xbox for that matter.

As for 4K and dynamic resolution: we already saw this generation that GPUs had an awful time hitting stable performance out of the gate after a massive jump forward. Games like Watch Dogs, Assassin's Creed Unity and The Witcher 3 all struggled to get stable performance even though the box was a metric ton faster than the generation before. This means the load on those GPUs will go up straight from day one until they are exhausted; anybody who thinks it will take time for devs to get used to them probably doesn't realize how easy it is for devs to dig into the performance of technology that is well understood, much like what happened with the PS4. Now, the SSD part is debatable, but the GPU and CPU will be hit hard straight out of the gate by first- and third-party titles pretty much directly.

Everybody with a PC knows that 4K murders GPUs. I've got a 1080 Ti, and frankly even 1440p in current top-end titles on ultra is wishful thinking for 60 fps; a 2080 Ti has issues hitting 60 fps at 4K, and even 60 at 1440p in some modern PC titles, and that problem will stay the moment complexity goes up, which it will. So how is that GPU going to cope with that? Are they going to lock every new title to a fixed resolution with a fixed fps? Nope. That's like asking for 60 fps at 1080p at the start of the PS4 gen; you were lucky to get 1080p at 30 fps. They will put heavy dynamic resolution into play, much like what you see with the Pro and the X, and especially as the X starts struggling in newer games, the PS5 will have an even worse time.

This is why I mentioned that both the CPU and GPU will always be pegged hard, and there is absolutely no indication they won't be. I have now used a 9900K and a 1080 Ti for a year, and frankly both get crippled and nuked hard in the games I play, because the single-core performance simply isn't there; and with the 1080 Ti, even though it's a fast card and probably somewhat comparable to the PS5 GPU (I personally think both boxes will outperform it handily next gen, though, in the end), I need 3x more performance just at 1080p to fill my needs and play games like Red Dead Redemption 2 in their full glory.

The bad design is simply based on the fact that their clocks aren't fixed; I explained that here in one of my posts until people started to go full derp on me. There is absolutely no reason for this to happen at all. If the GPU and CPU can't hold those clocks, that just means they are constrained by power or heat, and that can be dealt with accordingly in all kinds of different ways. I call it bad design because I find it bad design if it can't sustain them.

Now, if what the other guy quoted is right and it can run at full clocks on both most of the time, then that changes things, because then the clocks are effectively fixed, even though clocks can obviously change in different circumstances. Which is completely understandable, but Cerny did not explain this well, even remotely, if that's the case.

Now, about your troll accusations and whatnot, like the other three guys: I know my material, and I know what hardware does. I program myself, for emulators; not triple-A games, and it's mostly a hobby of mine, but I know how things interact, as I have been doing hardware benchmarking for more than two decades now. I own a lot of hardware privately, but I also have a lot of hardware at my disposal to bench with through the work I do, which is not related to games at all; but as a game nerd, and most of all a tech nerd, especially on the IT side, I love tech and like to discuss it, and I probably spend a good 50% of my free time doing so on multiple forums. I know enough of my stuff to make statements. And if I say something that's factually incorrect, then say so, because I am always curious about new information. Ban bait and troll fanboy reactions with no substance get ignored by me quickly, until you start directly attacking me with them.

However, when I see reactions like this:

I’m really quite certain you don’t have a clue what you are even talking about. This post is the text equivalent of this:

You are concern trolling, ignoring arguments made left and right while providing no data of your own beyond a hyper-cynical look at the data Cerny provides, and expecting others to do the homework for you while you just shit on things. You keep doing you... just do not throw a fit if people do not buy into what you are saying and poke holes in it.

This sounds exactly like that "intervention" by X fans on videos about the PS5; I guess that plan is already in motion.

And you're a system architect too, ugh. The lengths some X fans are going to are so embarrassing.

You are concern trolling, ignoring arguments made left and right while providing no data of your own beyond a hyper-cynical look at the data Cerny provides, and expecting others to do the homework for you while you just shit on things. You keep doing you... just do not throw a fit if people do not buy into what you are saying and poke holes in it.

Then I am not sure what planet you guys reside on.
 
Last edited:

StreetsofBeige

Gold Member
Yeah, I think this is it.
So here is how I understand all this:
Cerny wanted to boost the system somehow. He achieved this by introducing a constant power draw. A constant power draw means constant fan speed and noise. This constant power draw allows the clocks to be boosted to 3.5 GHz and 2.23 GHz respectively. The system can theoretically do both at the same time, but if both ran at max clocks all the time the system would overheat because of the constant fan speed. To solve this problem he introduced variable clock rates. This doesn't sound too bad in theory, but it will make game designers' jobs more difficult. I already mentioned an example: dynamic open-world games would suffer from this, because what happens in such a game cannot be controlled all the time by the developers. If, let's say, there are too many characters at a certain moment when everything is blowing up, it can result in tanking the framerate. Well, this also happens with existing consoles. So maybe it's not as bad as I imagined :) I just hope this doesn't mean developers will "downgrade" their games because they want to avoid such cases.
People have been bringing up the "the GPU only needs to downclock 2% max" kind of thing, and if it does that, everything works out. Who knows how true or false the 2% claim is.

But let's say it is true; it sure seems like an odd way to design a system for 2%.

Just downclock whichever of the CPU or GPU by that 2% and let them both max out in full all the time, and Cerny wouldn't even need to care as much about fan speed or overheating, if this 2% is the make-or-breaker.
 
Last edited:
A very small group of people seem to think a faster SSD, fewer CUs at a higher clock rate and Tempest audio will make up the difference in CPU, GPU and bandwidth speed.

Nobody is saying the ultra-fast SSD will give the PS5 an extra teraflop. Why would anyone say that? Teraflops don't matter anyway. That 18% advantage in teraflops will only result in a higher screen output resolution that you can't notice even in YouTube comparison videos.

People are saying (including developers) that the PS5 SSD will allow unprecedented and unmatched insane texture detail and more varied art. A developer explained this in detail in one of his tweets.
 
Last edited:

Psykodad

Banned
In other words, the PS5 has trouble running at the full 3.5 and 2.23 at all times. If there weren't issues, they'd just leave it maxed out at all times like every other console.

So it has to downclock to cut back on power and heat. That's exactly what all of us noticed two weeks ago.
Maybe I'm misunderstanding something, but it seems to me that Cerny says that they could both run at full speed all the time if necessary, but since that apparently is never the case, the speeds can be downclocked.
 

mckmas8808

Mckmaster uses MasterCard to buy Slave drives
Yeah, I think this is it.
So here is how I understand all this:
Cerny wanted to boost the system somehow. He achieved this by introducing a constant power draw. A constant power draw means constant fan speed and noise. This constant power draw allows the clocks to be boosted to 3.5 GHz and 2.23 GHz respectively. The system can theoretically do both at the same time, but if both ran at max clocks all the time the system would overheat because of the constant fan speed. To solve this problem he introduced variable clock rates. This doesn't sound too bad in theory, but it will make game designers' jobs more difficult. I already mentioned an example: dynamic open-world games would suffer from this, because what happens in such a game cannot be controlled all the time by the developers. If, let's say, there are too many characters at a certain moment when everything is blowing up, it can result in tanking the framerate. Well, this also happens with existing consoles. So maybe it's not as bad as I imagined :) I just hope this doesn't mean developers will "downgrade" their games because they want to avoid such cases.

Keep in mind Mark Cerny was the producer of both open-world games Spider-Man and Death Stranding. He helped with the tech side of both games. I'd trust he knows how open-world games are made and how they run, especially on a console he designed (considering he designed the PS4, PS4 Pro, and PS5).
 

GymWolf

Member
In theory it can, but in practice it never would. Devs will decide how much it should take and plan accordingly. Devs not wanting to budget too much for it and the Tempest hardware not being fully utilized could be a reality. Speaking of the 20 GB/s, even that is a ton, and I doubt it will hit that number often.
I hope not; the majority of people don't give a fuck about 3D audio with crappy TV speakers.
I'm not gonna be happy if audio steals even a miserable 5% of power.
 

mckmas8808

Mckmaster uses MasterCard to buy Slave drives
Nobody is saying the ultra-fast SSD will give the PS5 an extra teraflop. Why would anyone say that? Teraflops don't matter anyway. That 18% advantage in teraflops will only result in a higher screen output resolution that you can't notice even in YouTube comparison videos.

People are saying (including developers) that the PS5 SSD will allow unprecedented and unmatched insane texture detail and more varied art. A developer explained this in detail in one of his tweets.

Cerny also said this to DF during their interview.....

In short, to get instant access to urgent data, more of it needs to be stored in RAM on the current generation consoles - opening the door to a huge efficiency saving for next-gen. The SSD alleviates a lot of the burden simply because data can be requested as it's needed as opposed to caching a bunch of it that the console may need... but may not. There are further efficiency savings because duplication is no longer needed. Much of a hard drive's latency is a factor of the fact that a mechanical head is moving around the surface of the drive platter. Finding data can take as long - or longer - as reading it. Therefore, the same data is often duplicated hundreds of times simply to ensure that the drive is occupied with reading data as opposed to wasting time looking for it (or "seeking" it).

"Marvel's Spider-Man is a good example of the city block strategy. There are higher LOD and lower LOD representations for about a thousand blocks. If something is used a lot it's in those bundles of data a lot," says Cerny.

Without duplication, drive performance drops through the floor - a target 50MB/s to 100MB/s of data throughput collapsed to just 8MB/s in one game example Cerny looked at. Duplication massively increases throughput, but of course, it also means a lot of wasted space on the drive. For Marvel's Spider-Man, Insomniac came up with an elegant solution, but once again, it leaned heavily on using RAM.

"Telemetry is vital in spotting issues with such a system, for example, telemetry showed that the city database jumped in size by a gigabyte overnight. It turned out the cause was 1.6MB of trash bags - that's not a particularly large asset - but the trash bags happened to be included in 600 city blocks," explains Mark Cerny. "The Insomniac rule is that any asset used more than four hundred times is resident in RAM, so the trash bags were moved there, though clearly there's a limit to how many assets can reside in RAM."

It's another example of how the SSD could prove transformative to next-gen titles. The install size of a game will be more optimal because duplication isn't needed; those trash bags only need to exist once on the SSD - not hundreds or thousands of times - and would never need to be resident in RAM. They will load with latency and transfer speeds that are a couple of orders of magnitude faster, meaning a 'just in time' approach to data delivery with less caching.


Behind the scenes, the SSD's dedicated Kraken compression block, DMA controller, coherency engines and I/O co-processors ensure that developers can easily tap into the speed of the SSD without requiring bespoke code to get the best out of the solid-state solution. A significant silicon investment in the flash controller ensures top performance: the developer simply needs to use the new API. It's a great example of a piece of technology that should deliver instant benefits, and won't require extensive developer buy-in to utilise it.
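As a toy illustration of the residency rule described in the quote above (the ~400-reference threshold and the trash-bag example come from the article; the second asset name is made up for the example):

# Toy version of the Insomniac-style rule from the quote: assets referenced in more
# than ~400 city blocks stay resident in RAM; everything else lives once on the SSD
# and is streamed on demand instead of being duplicated across the disc.
RAM_RESIDENCY_THRESHOLD = 400

def placement(asset, reference_count):
    if reference_count > RAM_RESIDENCY_THRESHOLD:
        return f"{asset}: keep resident in RAM ({reference_count} references)"
    return f"{asset}: store once on SSD, stream on demand ({reference_count} references)"

print(placement("trash_bags", 600))       # the 1.6MB asset used in 600 blocks -> RAM
print(placement("corner_newsstand", 12))  # hypothetical rarely-used asset -> single copy on SSD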
 
A whole lot of “stuff”

Yikes. Off the bat, you are making a whole lot of assumptions based on your experience with PC building and overclocking, including your example of a 1080 Ti, a three-year-old card. The main point you are missing, however, is that the GPUs in both the XSX and PS5 are RDNA 2, whose cores are going to run at higher clocks and more efficiently than anything you have running in your PC from three years ago. Furthermore, being able to push power back and forth between the GPU and CPU not only helps keep the system regulated for cooling purposes, but also allows the chip to run at stable clocks instead of at a locked speed, which would invariably cause the system to either overheat or waste power when either component is run at full throttle with nothing for it to do. Like, this is all literally covered by Cerny in the interview and the PS5 deep dive. Your bad-design comment got called out because this is anything but bad. It's actually quite elegant.
 