
[Digital Foundry] PS5 uncovered

Tripolygon

Banned
I thought it didn't exist. If you fought half as hard to better your life as you do to defend Sony's product, I guarantee you would be too happy to be trolling game forums to systematically deny the truth on behalf of a corporation.
Quote the full context mate.
There's likely more to discover about how boost will influence game design. Several developers speaking to Digital Foundry have stated that their current PS5 work sees them throttling back the CPU in order to ensure a sustained 2.23GHz clock on the graphics core. It makes perfect sense as most game engines right now are architected with the low performance Jaguar in mind - even a doubling of throughput (ie 60fps vs 30fps) would hardly tax PS5's Zen 2 cores.
"Regarding locked profiles, we support those on our dev kits, it can be helpful not to have variable clocks when optimising."
My life is pretty good, thanks for your concern. I got "furloughed" because of the whole Corona situation, but I'm financially stable for now. You should spend more time educating yourself, because it seems reading comprehension is not your strong suit, and less time being a fanboy of a plastic device. It'll do wonders for your sanity.
 

Windows-PC

Banned
Has anyone seen this shit floating around on Twitter? It is posted by a WindowsCentral guy, so... a salt mine's worth of salt. Also, I seriously hope it is not true; no one wants to see the console delayed. And does anyone know who the fuck the guy saying all that is?



Thanks for sharing this; it would explain Sony's horrible marketing, and it confirms what I've always believed.

Microsoft caught Sony off guard, and now Sony is barely managing to match the competition, among other things by aggressively overclocking the GPU.

It really looks to me like Sony doesn't have a product right now that they are proud of, and their current quietness and marketing reflect that!
 
Quote the full context mate.


My life is pretty good, thanks for your concern. I got "furloughed" because of the whole Corona situation, but I'm financially stable for now. You should spend more time educating yourself, because it seems reading comprehension is not your strong suit, and less time being a fanboy of a plastic device. It'll do wonders for your sanity.

I was wondering who you were talking to, and then I realized it was another Xbox fan being a console warrior as usual. I really admire your ability to be patient with these children because I couldn’t do it lol.
 

dxdt

Member
I am puzzled by the need to go to 2.23 GHz with 36 CUs especially when dropping the frequency by 10% can reduce power by 27%. Which is the better option? I am assuming the extra 4 CUs won't have as much heating as the extra 230 MHz. It seems like a lot of effort went into the heatsink and cooling design to squeeze 2.23 GHz to get 10.28 TF when +4 CU can get the same thing. I would have loved to be part of this trade-off study.

36 CU @ 2.23 GHz = 10.28 TF
40 CU @ 2.00 GHz = 10.24 TF
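The arithmetic behind those two lines is easy to check. A quick sketch, assuming the standard RDNA figures of 64 shaders per CU and 2 FLOPs per shader per clock, plus a rough cubic frequency-to-power model for the 10%-clock/27%-power claim above (an approximation, not an official figure):

```python
# Peak FP32 throughput for an RDNA-style GPU:
# CUs x 64 shaders x 2 FLOPs (FMA) per clock x clock speed.

def tflops(cus: int, ghz: float) -> float:
    return cus * 64 * 2 * ghz / 1000.0

print(round(tflops(36, 2.23), 2))  # -> 10.28
print(round(tflops(40, 2.00), 2))  # -> 10.24

# If power scales roughly with frequency cubed (voltage tracking
# frequency), a 10% clock drop cuts power by about 27%:
print(round(1 - 0.9 ** 3, 2))      # -> 0.27
```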
 

Dory16

Banned
Quote the full context mate.


My life is pretty good, thanks for your concern. I got "furloughed" because of the whole Corona situation, but I'm financially stable for now. You should spend more time educating yourself, because it seems reading comprehension is not your strong suit, and less time being a fanboy of a plastic device. It'll do wonders for your sanity.
If developers are throttling the PS5 CPU today while working on current gen games in order to ensure stable GPU performance (which you finally stopped denying), it does not bode well for what will happen when they start working on CPU bound next-gen games that require high GPU performance.
Sorry to hear you got furloughed, I don't wish it on anyone and it's terrible what's happening at the moment. Jobs come and go, I hope we all manage to stay safe.
 
I am puzzled by the need to go to 2.23 GHz with 36 CUs especially when dropping the frequency by 10% can reduce power by 27%. Which is the better option? I am assuming the extra 4 CUs won't have as much heating as the extra 230 MHz. It seems like a lot of effort went into the heatsink and cooling design to squeeze 2.23 GHz to get 10.28 TF when +4 CU can get the same thing. I would have loved to be part of this trade-off study.

36 CU @ 2.23 GHz = 10.28 TF
40 CU @ 2.00 GHz = 10.24 TF

This is a concern trolling post. Cerny LITERALLY explained why they went this way in the PS5 deep dive.
 
Last edited:
Has anyone seen this shit floating around on Twitter? It is posted by a WindowsCentral guy, so... a salt mine's worth of salt. Also, I seriously hope it is not true; no one wants to see the console delayed. And does anyone know who the fuck the guy saying all that is?

Regardless of the validity of these statements, I honestly don't doubt their accuracy. This console is an enigma, with complications and issues written all over it.
 

Tripolygon

Banned
I am puzzled by the need to go to 2.23 GHz with 36 CUs especially when dropping the frequency by 10% can reduce power by 27%. Which is the better option? I am assuming the extra 4 CUs won't have as much heating as the extra 230 MHz. It seems like a lot of effort went into the heatsink and cooling design to squeeze 2.23 GHz to get 10.28 TF when +4 CU can get the same thing. I would have loved to be part of this trade-off study.

36 CU @ 2.23 GHz = 10.28 TF
40 CU @ 2.00 GHz = 10.24 TF
The 4 extra CUs are for redundancy: headroom to make sure all chips ship with the same performance. Say you have 10 chips: 2 are 100% defect free, 4 are 90% defect free, and the remaining 4 are 80% defect free. You bin them all to the performance of the 4 chips that are 80% defect free, so you use all 10 of them instead of throwing some out and losing money.
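A minimal sketch of that binning idea, with invented yield numbers purely for illustration:

```python
# Hypothetical binning: the shipping spec is set by the weakest dies
# you still want to use, so every console performs identically.
# All clock numbers below are made up for illustration.

dies = [2.35, 2.35, 2.30, 2.30, 2.30, 2.30, 2.23, 2.23, 2.23, 2.23]

target = min(dies)                        # spec every die can hit
usable = [d for d in dies if d >= target]

print(target)       # -> 2.23 (the shipping clock)
print(len(usable))  # -> 10   (no dies are scrapped)
```

Spare CUs work the same way: fusing off 4 CUs per die means a die with defects in up to 4 CUs still bins as a full 36-CU part.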
 
Last edited:

phil_t98

#SonyToo
So then give options, or send a crew over and explain how you want games to run and/or look on your hardware. The One X is the most powerful console and yet there are compromises.
They choose how to make it. Also, the PlayStation version in this video is running a later build of the game; the Xbox version is an older patch. The game isn't out yet, so we'll have to wait until it launches to know.
 

TheStruggler

Report me for trolling ND/TLoU2 threads
They choose how to make it. Also, the PlayStation version in this video is running a later build of the game; the Xbox version is an older patch. The game isn't out yet, so we'll have to wait until it launches to know.
Sure and if performance is still trash?
 

Tripolygon

Banned
If developers are throttling the PS5 CPU today while working on current gen games in order to ensure stable GPU performance (which you finally stopped denying), it does not bode well for what will happen when they start working on CPU bound next-gen games that require high GPU performance.
Sorry to hear you got furloughed, I don't wish it on anyone and it's terrible what's happening at the moment. Jobs come and go, I hope we all manage to stay safe.
But games are developed with locked frequencies now and we still get frame drops, screen tearing and bugs. Locked frequency or not, we will have games that do that, but I reckon it won't be because of variable frequency. We will find out sooner or later when next-gen games start to come out.

No worries mate, hopefully this Corona situation is at least under control before or by then so our lives can go back to normal.
 
Last edited:

darkinstinct

...lacks reading comprehension.
So then give options, or send a crew over and explain how you want games to run and/or look on your hardware. The One X is the most powerful console and yet there are compromises.
And yet, RE3 will run at locked 4K/60 on XSX because of those compromises and at the same 1620p/60 as on Pro on PS5.
 

Dabaus

Banned
Saw this on NPC Era, thought it was worth a quote:


Soprano said:
Just a little notice: the concern and FUD you are seeing around PS5 news lately is being done on purpose. Some of it is planned in a certain Discord.

This thread heading to the dumps is a part of it, I presume. There's this sad attempt to make PS5 look like it's going to fail and experience another PS3 generation.

That is all. Bye.
 

DESTROYA

Member
Below the waist it's minimal, so that allows the brain to workload at max.

Just like when he came with the magical teraflop figure needed for 4K gaming that was, by coincidence, higher than the new Xbox One X.

He is just a designer, bro.
Sort of like your fantasy "Is she playing me" thread, right? It's all just some made-up story :goog_unsure:
 
I thought it didn't exist. If you fought half as hard to better your life as you do to defend Sony's product, I guarantee you would be too happy to be trolling game forums to systematically deny the truth on behalf of a corporation.
Was this necessary?
Who the fuck are you to judge people? Stop being an ass
 
Last edited:

joe_zazen

Member
it_wasnt_me was going to bring some matt quotes over, but he got banned. I’ll bring a couple over.

So you are looking at it backwards.

Any developer (including Sony) would ideally prefer the maximum possible GPU and CPU chip performance set at all times.

But that’s simply not possible. You just are not able to reach these high clock speeds all the time under every possible workload.

The PS5 is designed to take advantage of the fact that variable performance allows the system to reach a higher performance level than it would if the same hardware had to conform to stable clocks. Stable clocks inevitably leave some performance on the table that the hardware would otherwise be able to achieve with more flexibility.

So, for example, if the SX could magically adopt this variable system with all other specs remaining the same, that would produce a stronger system in the end than the SX is now, not a weaker one. But I say “magically” because that’s not possible, there would be trade offs for MS to do this that they did not want to make.

MS has been more reserved privately than Sony, but more open publicly.

They are just different strategies.

Well MS built a system like most other systems and desktops have always been built, which is tried and true. I certainly don’t know for sure what would need to be altered if they adopted a pretty basic design change, but just logically speaking such a system would draw more power and generate a meaningful amount of additional heat. That would require a different cooling and case design, because there is little chance MS built their system as is to allow for that level of tolerance (why would they? It would be a waste). And maybe having that number of CUs running faster would create so much more heat and require so much more power that it’s just not practical in a consumer device like this. And maybe that change in heat and power, even if possible, would limit other areas of the system or additional chips or logic that could be part of the SOC.

Or maybe that level of heat or power or complexity would make deploying SX based hardware for xCloud significantly more problematic.

Then we get into the OS side. MS has done amazing work with BC largely because of their software layer. Maybe having a variable design would add an additional level of complication that MS is unwilling to introduce to their ecosystem.

But I think it was just never really a consideration. MS built basically a computer the way you build computers. Sony is just trying something a little different.

The SSD. It’s not the commonly derided (usually rightfully) “secret sauce”. It’s fucking fast, much faster than the one in the SX (which, to be clear, is no slouch either), and it will, for games that are designed to take advantage of it, open the door to better game design and new gameplay possibilities. The SSD in the SX will absolutely also accomplish this, but just not to the same extent.

I’m really excited to see this in action in Sony’s first party games.


For those who don't know, matt is 100% trustworthy.
 
it_wasnt_me was going to bring some matt quotes over, but he got banned. I’ll bring a couple over.

For those who don't know, matt is 100% trustworthy.
That doesn't make any sense; adding variability doesn't make a system more capable, and it doesn't allow you to tap into more compute resources.

If the PlayStation 5 were fixed at 2.23GHz and 3.5GHz, it would be strictly more capable than if either side had to make concessions for the other.

This crazed thinking peddled by Matt or whoever defies all reason and logic. The Series X adding downward variability to its fixed frequencies wouldn't make it more capable.

How do people even come up with this shit?
 
Last edited:

marquimvfs

Member
I see lots of people here who believe that Blu-rays can kill you! At least, they appear to have the same level of text interpretation as the guy who believes it... Please, read/watch the entire damn video before coming here and saying "it's not gonna sustain that clock most of the time" or "it will throttle af". There's no evidence of that, and Cerny is saying the ABSOLUTE OPPOSITE OF IT! If your opinion were true, then Cerny lied; don't act as if you're the only one who really understood the presentation.
 
Last edited:

Jtibh

Banned
What a useless video.
Zero information that would explain more about the PS5.

The whole I/O system, the biggest difference in all of next gen, is conveniently left out, as by so many others.

DF, just stick to your Xbox coverage and don't make any more videos about PS5. You make it hard to believe you even care.

Lazy work. Very lazy.
 
Well, as already stated by DF, devs will most likely throttle one to maximize the other. The thing is, doing it this way will allow devs to extract more performance than having set clocks. Otherwise we would likely have a PS5 running at say 3.2 GHz CPU and 1.9 GHz GPU flat, for example. But this way, you can boost your CPU or GPU at the times where the other is idle.

No, it was stated that the reason they throttle back is that current games don't use that much CPU. Why hold a high clock just to sit idle?

“Several developers speaking to Digital Foundry have stated that their current PS5 work sees them throttling back the CPU in order to ensure a sustained 2.23GHz clock on the graphics core. It makes perfect sense as most game engines right now are architected with the low performance Jaguar in mind - even a doubling of throughput (ie 60fps vs 30fps) would hardly tax PS5's Zen 2 cores. However, this doesn't sound like a boost solution, but rather performance profiles similar to what we've seen on Nintendo Switch. "Regarding locked profiles, we support those on our dev kits, it can be helpful not to have variable clocks when optimising. Released PS5 games always get boosted frequencies so that they can take advantage of the additional power," explains Cerny.”

Current games use the GPU to calculate things that used to be calculated on the CPU. For example, around the time Uncharted 4 was released, the PS4 devkits were updated to calculate physics on the GPU by default. That means more free time for the Jaguar CPU, so naturally games don't demand as much from a more powerful CPU and you can throttle back its clocks. It doesn't imply you cannot use both at max clock; in fact, it was stated:

"There's enough power that both CPU and GPU can run at their limits of 3.5GHz and 2.23GHz, it isn't the case that the developer has to choose to run one of them slower."

And yes, if they run at that speed it's because the workload requires it; otherwise they can simply downclock. What is the point of holding a high clock if the chip will be idle? If you can downclock, do it.


Say, for example, you have to render 50 people on a street, with high polygon counts. That is heavy on the CPU, but the street itself is not that taxing on the GPU. You lower the GPU power usage and give the CPU its max performance.
If you have a scene with ray tracing but with barely any polygons in it, the CPU will not need to use that much power but the GPU is under a heavy workload. You then shift the power to the GPU.
If you had set clocks, you would have lower performance in either of the above scenarios.

NO, games used to work like that about 20 years ago. Back then you handed the GPU vertex data calculated by the CPU to render, like OpenGL's immediate mode; that is why Carmack spent a lot of time battling to send the absolute minimum required geometry in Quake. It changed: we now store vertex data in vertex buffer objects and vertex array objects, which live in GPU memory, and the CPU only tells the GPU which one to use to render a specific character or object in the scene. To tax the CPU with lots of characters you have to run calculations that are genuinely CPU-heavy, for example physics boxes (those are calculated on the GPU now) and AI, or issue lots of draw calls (on PC), as those have to be "negotiated" with the OS and driver and so consume CPU time; that's why you prefer to draw many objects with a single draw call, or use something like Vulkan/DX12.
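For what it's worth, the CPU-throttling behaviour debated above can be pictured as a shared power budget. This is a toy model, not Sony's actual algorithm: the wattages, the cubic power curve, and the split policy are all invented for illustration.

```python
# Toy model: one power budget shared by CPU and GPU, power ~ clock^3.
# All wattages and the split policy are invented; this only illustrates
# the idea of shifting budget between the two chips per workload.

BUDGET_W = 200.0
CPU_MAX_GHZ, GPU_MAX_GHZ = 3.5, 2.23
CPU_MAX_W, GPU_MAX_W = 60.0, 180.0

def clock_at_power(watts: float, max_w: float, max_ghz: float) -> float:
    """Invert power ~ f^3: the clock a given power allowance buys."""
    return max_ghz * min(1.0, watts / max_w) ** (1 / 3)

def split(cpu_demand: float):
    """Feed the CPU what its workload demands, give the GPU the rest."""
    cpu_w = cpu_demand * CPU_MAX_W
    gpu_w = min(GPU_MAX_W, BUDGET_W - cpu_w)
    return (clock_at_power(cpu_w, CPU_MAX_W, CPU_MAX_GHZ),
            clock_at_power(gpu_w, GPU_MAX_W, GPU_MAX_GHZ))

cpu, gpu = split(cpu_demand=0.3)   # Jaguar-era engine: light CPU load
print(round(gpu, 2))               # -> 2.23 (GPU holds its max clock)

cpu, gpu = split(cpu_demand=1.0)   # CPU-bound scene
print(round(cpu, 2))               # -> 3.5 (CPU maxed, GPU dips below 2.23)
```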
 
Last edited:

GymWolf

Member
But in this case this is something that developers have to pay attention to, isn't it? That's a huge task if I understand it correctly. Just an example: let's say they have to test a certain scene where everything is blowing up, there are lots of characters, etc. They achieve their target fps with a certain number of characters, but if they added more, the framerate would be worse. And they have to test everything like this. They always have to keep the "budget" in mind, which is quite difficult in a dynamic open-world game. They can't just let the system take care of this problem, because that could mean a certain number of characters is too much for the system; it downclocks itself and the framerate tanks. I hope it's clear what I mean. I'm not a developer, so it's easily possible that I'm talking nonsense; I just want to understand this whole thing.
This is exactly what I was thinking. In open-world games you can turn a calm scene into a giant mess just by playing the damn game and creating disasters; just look at how RDR2 slows down during scenes with fire, explosions or a shitload of NPCs fighting on screen.
 

joe_zazen

Member
Moar

"There's enough power that both CPU and GPU can potentially run at their limits of 3.5GHz and 2.23GHz, it isn't the case that the developer has to choose to run one of them slower."


this has been quoted multiple times already and was mentioned in the article.


Workloads and clock speeds are not the same thing. You can have a workload that means having to run the CPU at a lower speed in order to maintain the top GPU clock, just like you can have a workload that allows you to keep both the CPU and GPU at max.



Do you know if Sony experimented with the clock boost from the start (which would make sense if they were indeed limited to 36 CUs due to BC), or did that come over time, adjusting to feedback from developers?

As far as I know, it’s always been part of the plan.


 
You should ask AMD, Apple, and Intel, since all their top-of-the-line shit uses this type of tech.
No, just absolutely 100% no. That's not how any of this works.

If your base clocks are 3.2GHz and 2.02GHz and you boost to 3.5GHz and 2.23GHz, yes, that would be a computational uplift. That would make your system more capable when boosting. Taking something which is at 3.5GHz and 2.23GHz and then giving it variable draw to scale back frequency removes compute and capability from the system whenever it falls below those figures.

The fact that this even needs to be explained shows the level of insane delusion which has captivated the minds of people with this nonsense. Unbelievable.
 

GymWolf

Member
You're missing the point.
Devkits only have profiles the devs must choose, hence the quote.
Retail units are not bound to those profiles since they have SmartShift, and devkits don't.
So... devs develop their games on non-definitive hardware that differs from what people will have at home? How does this even work?
How can you squeeze a closed hardware platform when you don't even work with an exact replica of the final hardware?

Is this just for the pre-launch period, or is it permanent?

This point is not really clear, tbh.
 

joe_zazen

Member
No, just absolutely 100% no. That's not how any of this works.

If your base clocks are 3.2GHz and 2.02GHz and you boost to 3.5GHz and 2.23GHz, yes, that would be a computational uplift. That would make your system more capable when boosting. Taking something which is at 3.5GHz and 2.23GHz and then giving it variable draw to scale back frequency removes compute and capability from the system whenever it falls below those figures.

The fact that this even needs to be explained shows the level of insane delusion which has captivated the minds of people with this nonsense. Unbelievable.

What it means is that, given budgets (power, thermals, and money), a system with variable clocks performs better, everything else being equal. So if the XSX remained exactly the same except it could implement this tech, the console would perform better. Likewise, if the PS5 had to keep 100% stable clocks, it would perform worse.

This is why Apple, Intel, and AMD include variable clocks in their products: they perform better with them than without, everything else being equal.
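The argument can be made concrete with a toy simulation: under one power cap, a fixed clock has to be chosen for the worst-case workload, while a variable clock can rise whenever a frame is lighter. All numbers below are invented; only the shape of the argument matters.

```python
# Toy model of fixed vs variable clocks under one power cap.
# power ~ activity x clock^3 is an assumed model; figures are invented.

POWER_CAP = 100.0
MAX_GHZ = 2.23

def best_clock(activity: float) -> float:
    """Highest clock (capped at MAX_GHZ) that stays under the power cap,
    assuming power = activity * 100 W at MAX_GHZ and scales with f^3."""
    return min(MAX_GHZ, MAX_GHZ * (POWER_CAP / (activity * 100.0)) ** (1 / 3))

workloads = [0.6, 0.8, 1.0, 1.2]          # per-frame activity factors

fixed = best_clock(max(workloads))        # must survive the worst frame
variable = [best_clock(a) for a in workloads]

print(round(fixed, 2))                    # -> 2.1 (locked for every frame)
print([round(c, 2) for c in variable])    # -> [2.23, 2.23, 2.23, 2.1]
assert all(c >= fixed for c in variable)  # variable never does worse
```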
 

recursive

Member
For sure. I know a guy at work who had a shitty looking VW Rabbit or Golf and he messed with it to be fast.

The thing is even if it's fast, it's still a shit car.

The PS4 supercharged PC... lol... pretty sure a decent PC in 2013 could run games well at 1080p or higher at better settings.
But could a shitty laptop?
 
What it means is that, given budgets (power, thermals, and money), a system with variable clocks performs better, everything else being equal. So if the XSX remained exactly the same except it could implement this tech, the console would perform better. Likewise, if the PS5 had to keep 100% stable clocks, it would perform worse.

This is why Apple, Intel, and AMD include variable clocks in their products: they perform better with them than without, everything else being equal.
It would be more power efficient, but it wouldn't benefit the system computationally in any regard. Also, that's not how those systems work: they boost up from a base frequency, whereas Sony's system appears to draw down.

It's apples to oranges, and you're extremely confused.
 

II_JumPeR_I

Member
What a useless video.
Zero information that would explain more about the PS5.

The whole I/O system, the biggest difference in all of next gen, is conveniently left out, as by so many others.

DF, just stick to your Xbox coverage and don't make any more videos about PS5. You make it hard to believe you even care.

Lazy work. Very lazy.
If there isn't anything more to reveal about the PS5, it's not DF's fault that Sony can't get their shit together...
 
Matt is not trustworthy at all.
Of course he's not; he's part of the reason people were on this 13-teraflop "PS5 is more powerful" hype train to begin with. He's part of the reason there's so much disappointment; he helped set up the unreal expectations which didn't land.

Matt is fake af, a bullshit "insider", a nobody with a history of totally inaccurate information. This variability trash takes the cake, though.
 

ethomaz

Banned
I was simply referring to the below but go ahead, say it's photoshopped. Group denial in all its splendor:

“Several developers speaking to Digital Foundry have stated that their current PS5 work sees them throttling back the CPU in order to ensure a sustained 2.23GHz clock on the graphics core. "


[attached screenshot of the Digital Foundry article]
Go ahead and read the full paragraph.
Don’t cut the content to make up FUD.

It makes perfect sense as most game engines right now are architected with the low performance Jaguar in mind - even a doubling of throughput (ie 60fps vs 30fps) would hardly tax PS5's Zen 2 cores. However, this doesn't sound like a boost solution, but rather performance profiles similar to what we've seen on Nintendo Switch. "Regarding locked profiles, we support those on our dev kits, it can be helpful not to have variable clocks when optimising. Released PS5 games always get boosted frequencies so that they can take advantage of the additional power," explains Cerny.”

Devs are choosing a fixed profile that decreases the CPU clock because their engines are built for the Jaguar CPU and because they are debugging/profiling the game.
 
Last edited:

dxdt

Member
The 4 extra CUs are for redundancy: headroom to make sure all chips ship with the same performance. Say you have 10 chips: 2 are 100% defect free, 4 are 90% defect free, and the remaining 4 are 80% defect free. You bin them all to the performance of the 4 chips that are 80% defect free, so you use all 10 of them instead of throwing some out and losing money.
I was thinking of 44 CUs with 4 redundant. I am not sure which scales better later on once these SoCs get the 5 nm treatment. I am assuming it'll be cheaper with 36 CUs at 2.23 GHz than 40 CUs at 2.00 GHz.
 

Ascend

Member
No, it was stated that the reason they throttle back is that current games don't use that much CPU. Why hold a high clock just to sit idle?

“Several developers speaking to Digital Foundry have stated that their current PS5 work sees them throttling back the CPU in order to ensure a sustained 2.23GHz clock on the graphics core. It makes perfect sense as most game engines right now are architected with the low performance Jaguar in mind - even a doubling of throughput (ie 60fps vs 30fps) would hardly tax PS5's Zen 2 cores. However, this doesn't sound like a boost solution, but rather performance profiles similar to what we've seen on Nintendo Switch. "Regarding locked profiles, we support those on our dev kits, it can be helpful not to have variable clocks when optimising. Released PS5 games always get boosted frequencies so that they can take advantage of the additional power," explains Cerny.”

Current games use the GPU to calculate things that used to be calculated on the CPU. For example, around the time Uncharted 4 was released, the PS4 devkits were updated to calculate physics on the GPU by default. That means more free time for the Jaguar CPU, so naturally games don't demand as much from a more powerful CPU and you can throttle back its clocks. It doesn't imply you cannot use both at max clock; in fact, it was stated:

"There's enough power that both CPU and GPU can run at their limits of 3.5GHz and 2.23GHz, it isn't the case that the developer has to choose to run one of them slower."

And yes, if they run at that speed it's because the workload requires it; otherwise they can simply downclock. What is the point of holding a high clock if the chip will be idle? If you can downclock, do it.
Assuming all you just said is true... I have the following questions for you;

  • Where is what Cerny talked about, regarding the PS5 doing things differently? Because what you just described is how basically every phone, laptop and PC out there works, and he was specific that the PS5 works differently.
  • Where does the power limit come in? Cerny was quite clear about workloads and the power limit. You didn't mention that anywhere.
  • If the console could handle both the CPU and GPU at max load, why would developers have to choose a profile to throttle the CPU to ensure the GPU runs at 2.23 GHz?
  • If developers prefer non-variable clocks for optimization, why have variable clocks if the console can reach the max clocks at max workloads at all times anyway?

Good luck.
 
Last edited:
Assuming all you just said is true... I have the following questions for you;

  • Where is what Cerny talked about, regarding the PS5 doing things differently? Because what you just described is how basically every phone, laptop and PC out there works, and he was specific that the PS5 works differently.
  • Where does the power limit come in? Cerny was quite clear about workloads and the power limit. You didn't mention that anywhere.
  • If the console could handle both the CPU and GPU at max load, why would developers have to choose a profile to throttle the CPU to ensure the GPU runs at 2.23 GHz?
  • If developers prefer non-variable clocks for optimization, why have variable clocks if the console can reach the max clocks at all times anyway?

Good luck.
Right? They make it so easy to deconstruct the BS.
 

ethomaz

Banned
Saw this on NPC Era, thought it was worth a quote:


Soprano said:
Just a little notice: the concern and FUD you are seeing around PS5 news lately is being done on purpose. Some of it is planned in a certain Discord.

This thread heading to the dumps is a part of it, I presume. There's this sad attempt to make PS5 look like it's going to fail and experience another PS3 generation.

That is all. Bye.
That has been known for a while already.
ERA Xbox fans made a Discord group to spread fake news about PS5, including the famous GitHub leaks :messenger_tears_of_joy:
I could list some members, but it's not worth it.
 
Last edited: