
Next-Gen PS5 & XSX |OT| Console tEch threaD

Status
Not open for further replies.

SSfox

Member
Made a thread for it, haha. Let's see if there are Little Boy or Fat Man sized explosions.

Edit: Forgot to link the thread, so here it goes:

So Crytek is another 3rd party on the long list that got paid by Sony to say the PS5 won't suck

 

PaintTinJr

Member
Overall, in most scenarios that is indeed the case. But with a shared bus, unless you can afford to properly alternate accesses between the fast and slow "pools", you will lose quite a bit of bandwidth whenever the GPU needs data in the slow pool and the pools cannot be accessed in parallel. That is possibly not something that will happen often at launch, but it does add complications if you already store more in RAM, as you need a somewhat larger temporary buffer for streaming data from the SSD.
It feels like a conundrum when looking at the balance of compute in each system against its use of memory bandwidth: how will the medium, unified speed play out against the slow-plus-fast split bandwidth?
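As a back-of-the-envelope illustration of that bandwidth loss (a toy Python model; the 560/336 GB/s figures are the published XSX pool speeds, but the time shares are made up for illustration):

```python
# Toy model of a shared bus that serially time-slices between a "fast"
# and a "slow" memory pool (i.e. the pools cannot be read in parallel).
FAST_BW = 560.0  # GB/s, published fast-pool speed
SLOW_BW = 336.0  # GB/s, published slow-pool speed

def effective_bandwidth(fast_time_share: float) -> float:
    """Average bandwidth when a fraction of bus time goes to each pool.

    fast_time_share: fraction of bus time spent addressing the fast pool.
    """
    slow_time_share = 1.0 - fast_time_share
    return fast_time_share * FAST_BW + slow_time_share * SLOW_BW

# Spending even 20% of bus time in the slow pool drags the average
# well below the fast pool's headline figure.
print(round(effective_bandwidth(1.0), 1))  # 560.0
print(round(effective_bandwidth(0.8), 1))  # 515.2
```

The point of the sketch: the headline number only holds while every access lands in the fast pool.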

My presumption is that Zen2 isn't exactly what either console needs – in terms of CPU compute – for gaming, but that it was chosen primarily for being a modern-ish, brawny x64 processor at 3+ GHz with 2-way SMT that could be cost-effectively incorporated into an HSA console APU by the only viable supplier.

(AFAIK) Beyond the primary and secondary cores, PS5 and XSX game compute demands would be better served by using the remaining Zen2 cores' chip area for something closer to tensor cores, providing the versatility of a CPU core but at efficiency nearer to CUs – like the SPU/Tempest engine. Presumably the overriding need for a familiar/reliable CPU for developers meant that in both consoles it was better to just incorporate a tweaked Zen2 rather than do an AMD x64 version of the Cell.

I think Mark Cerny made little comment about the Zen2 in the PS5 because he expects the six extra cores to be utilised as optimisation overspill for the main CPU core, overspill for GPGPU functions on the GPU, and for taking trivial stuff off the Tempest engine. Xbox/DF said that developers were favouring the higher clock without SMT on XsX at launch. The more the remaining six Zen2 cores, in each system, do (memory-intensive) work they aren't optimally designed for, the more memory bandwidth they take from the GPU. With SMT workloads being more memory-bound and less compute-bound (IIRC) than with SMT disabled, it seems that early XsX developers are conscious of the CPU cores overburdening memory bandwidth at the expense of the GPU. That, or the message was to imply better games via higher CPU clocks @ 3.8GHz on XsX.
 

CJY

Banned
I think I've figured out how the variable clocks are a genuinely positive thing for PS5. It will lead to overall better (more stable) performance of games, meaning what's important... Framepacing and framerates can and probably will be rock solid 99.9% of the time on PS5.

Until today, I've remained quite confused about the whole thing, but it's actually genius and quite logical.

Based on the last DF article, the PS5 devkit doesn't boost, but instead has profiles which the dev programs and optimises against. So the devkit has locked clocks at whatever, say a fixed 10.0TF/3.3GHz. The dev will feel comfortable in optimising right up against this limit with the knowledge that on a production console, the clocks will be variable and they'll boost up to their max frequencies to give the game that extra performance ceiling for those split seconds when the game engine requires it. This is what is meant by "continuous boost"; there is clearly no throttling going on here.

This is the "new paradigm" that Cerny mentioned. It's not just a new paradigm in console design, but also a revolution in how devs normally optimise their games.


Let's take the XSX, you have a max fixed clock. You're a dev and you program against the peak theoretical limit of 12.15TF. However, suddenly there is a ton of action on screen when you're playing the game, hitting that GPU limit and there is no more power left so you'll experience stutters and dropped frames. Of course, extensive testing could mitigate all this and for sure, top studios may have the time and budget to do so, but this system is more difficult and more prone to dropped frames.

This isn't a slight on XSX... just that a game will never actually optimise up to the peak theoretical limit of the GPU, because there will always need to be some left over.


Bottom line, I think this optimisation phase of development is where the term "lazy developers" comes from... It's the spit and polish; it's what DF is mainly testing for: how steady is the framerate, and how optimised is the game engine for the platform to be able to deliver a consistent frame time. With the PS5, it would appear the boost/variable clocks are there in the console to compensate for those instances that a dev did not foresee when developing their game, and to have that extra performance overhead to maintain steady performance for us, the end user. It simply saves time and headaches. It really is a new paradigm and it's really ingenious.

Where this new way of optimising games will matter most is VR, where a steady, high framerate is absolutely key to reducing the chance of motion sickness.

PS5, to me, screams "steady, consistent, performance". Absolutely contrary to what the "variable clocks" makes it sound like. That is the true genius of it.
 

UrgeLoLUS

Neo Member
I get that most of the forum is fueled by the fanboy wars, but I find it interesting that we are ending up with two powerful consoles based on the same architecture, and at the same time with enough differences that Digital Foundry's comparison videos will be interesting.
The consoles are in the RTX 2070–2080 range, and with the efficiency we see in consoles I can only dream about what results we will get in two to four years. The consoles are so close in power that they will be 99% the same in multiplatform titles.
 

BGs

Industry Professional
So.... do you agree with him?

If you don't want to answer I have a question regarding PSVR2. Not about details.

I am not an engineer. Therefore I cannot comment directly on everything.

I do not know if everything he says is true, but on the one hand it coincides with most of what I have known, whether from my own direct experience or from my company colleagues or colleagues in the sector with whom I have been able to talk. Therefore, I would venture to say that, if it matches in the parts that I do know, I see no reason to distrust the parts that I do not.

One thing is clear. Under NDA one cannot speak for good or for bad. If this is the case I do not know. But it could very well be the case.

One thing should be clear for the future: being a professional in the sector does not mean knowing everything about the sector. Everyone has their own work and their own specialties. There are things you know from what you work on yourself, and things you know from what your colleagues work on. Then there is the part you know from what other professionals tell you, who may have different devkits (DKs) than yours and their own personal experiences. But the general opinion of impartial developers is usually the same, and it is the one I share.

I doubt that Ali Salehi gave an interview in order to lie. I think he came out to tell the truth and was naively candid, as I was. I don't think he deliberately compromised his reputation and that of his company; that would be stupid for a direct developer. That said, I do not know him.

Now, I'm sure he has access to more advanced DKs than we did. And between now and when the consoles come out, everything can change in one direction or the other. But if they do not touch anything and leave everything as it is, I agree with him, both from knowing some things first hand and from having heard colleagues comment on others. Some things he says I did not know, but I imagine they are true; I don't have a colleague on hand right now to corroborate them. Maybe some other neutral GAF or ERA insider can shed more light than I can right now.

Right now we have put aside development on the new consoles to focus on the fight against Covid-19, from wherever our position allows, collaborating with the pharmaceutical companies and trying to collaborate with the WHO as far as we are allowed.
 
Last edited:

ZywyPL

Banned
I think I've figured out how the variable clocks are a genuinely positive thing for PS5. It will lead to overall better (more stable) performance of games, meaning what's important... Framepacing and framerates can and probably will be rock solid 99.9% of the time on PS5.

*snip*

PS5, to me, screams "steady, consistent, performance". Absolutely contrary to what the "variable clocks" makes it sound like. That is the true genius of it.


That's what I thought about as well some time ago:

 
Thank you, BGs, for your response.

The question I have regarding PSVR2 is this :
To what extent will the PS5's SSD impact PSVR2 games? Will it benefit flat games more than VR games, the opposite, or both equally? Will eye-tracking, for example, benefit much from the SSD?
 
Last edited:

CJY

Banned
That's what I thought about as well some time ago:

Cool man, I missed that cos I was banned for a few days. It has to be exactly what is going on; it's the only justification for the variable clocks, and the information was all out there anyway. It was just difficult to get your head around before we got all the info. Now, knowing it all... it seems like a really smart and brilliant idea, don't you think?

The PS5 seems to be placing a huge focus on allowing games to go from concept to reality as quickly as possible, with as few bottlenecks and restrictions as possible.

Letting devs reach the peak potential of the console as easily as possible. It sounds like an absolute dream to develop on to be honest. I can see why there are reports of devs loving it.
 

Gediminas

Banned
An SSD always benefits in all aspects of its usefulness.
I have another question, if you're willing to respond :)

What do you think: is Sony's new SSD overkill for this generation? Or would an Xbox-level SSD have been enough for this generation's games (this generation being PS5 and XSX), taking into account all the bottlenecks and new technologies?
 
Last edited:

Disclaimer: Don't quote me to tell me any kind of BS; if you dislike the info, call Sony :messenger_smirking:
Breaking News: Electronics generate heat, company responds by using cooler.

That reeks of someone realizing they put out BS and covering it with "oh, but they'll have it solved by launch". I mean, how are you going to prove them wrong? lol
 

mckmas8808

Mckmaster uses MasterCard to buy Slave drives
I think I've figured out how the variable clocks are a genuinely positive thing for PS5. It will lead to overall better (more stable) performance of games, meaning what's important... Framepacing and framerates can and probably will be rock solid 99.9% of the time on PS5.

*snip*

PS5, to me, screams "steady, consistent, performance". Absolutely contrary to what the "variable clocks" makes it sound like. That is the true genius of it.

That makes a lot of sense.
 

SatansReverence

Hipster Princess
Really? You're using a link to an article about a debunked rumor?

That article is every bit as valid as the now-canned article from the mister who "has no experience developing for XSX", "states several demonstrably false claims", "has Twitter and Insta plastered with pro-Sony posts", "miraculously hits every one of team Sony's wet dreams", and "has the textbook definition of FUD in every 'maybe', 'perhaps', 'might not', 'possibly won't' claim that could exist regarding XSX".

So yes, if that dumpster of an article gets posted, the in-no-way-debunked rumor article is every bit as valid.
 

Neo Blaster

Member
That article is every bit as valid as the now canned article from mister "has no experience developing for XSX"...

*snip*
Whoa, calm down, the way you talk makes it seem like the guy cursed your mother. Are you ok?
 

Neo Blaster

Member
Funny, your entire post history makes it seem like the XSX being better cursed your whole family :messenger_tears_of_joy:
If you had REALLY looked at my post history, you would have known I always wanted both consoles to be about the same; I was even ok with MS having a slight edge (which turned out to be true). It's Xbox fanboys who try to push this narrative that there's a huge gap between XSX and PS5, when they just have different approaches to reach their goals.
 

BGs

Industry Professional
I have another question, if you're willing to respond :)

What do you think: is Sony's new SSD overkill for this generation? Or would an Xbox-level SSD have been enough for this generation's games (this generation being PS5 and XSX), taking into account all the bottlenecks and new technologies?
I think the XsX SSD is sufficient for a closed system, and it is not exactly bad; it is very good for its purpose. But the PS5 SSD goes one step further, allowing you to use it in different ways. Whether it will be used, and how, only time will tell. Only time...
 

IntentionalPun

Ask me about my wife's perfect butthole
*snip*
The dev will feel comfortable in optimising right up against this limit with the knowledge that on a production console, the clocks will be variable and they'll boost up to their max frequencies to give the game that extra performance ceiling for those split seconds when the game engine requires it.

*snip*
Let's take the XSX, you have a max fixed clock. You're a dev and you program against the peak theoretical limit of 12.15TF. However, suddenly there is a ton of action on screen when you're playing the game, hitting that GPU limit and there is no more power left so you'll experience stutters and dropped frames.

Huh? I don't think this is anywhere close to accurate; the PS5 doesn't save frequency until the "game needs it". It scales down the frequency during certain heavy workloads; otherwise it runs at max.

The XSX just always runs at max... you are describing these scenarios quite strangely. Why would a dev with a constant 12.1TF code their game for dropped frames? During testing they'd scale it back; same with the PS5. There is no boost waiting for "when it's needed"; if anything, the PS5 will scale down when it's needed most, and devs will have to account for that (but I don't think it's some huge problem or anything; it's just a lower-performance GPU overall, with some advantages during workloads that don't scale with CUs).

For both systems devs will run into dropped frames at testing and have to scale things back if they want consistency; neither GPU has some magic tech that stops frames from being dropped.
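That testing loop boils down to checking every frame against a fixed time budget, on either console; a minimal sketch (the capture numbers are invented):

```python
# Hypothetical sketch: judging a frame-time capture against a target
# budget (16.67 ms for 60 fps), the kind of check both DF-style
# analysis and in-studio profiling perform.
TARGET_FPS = 60
BUDGET_MS = 1000.0 / TARGET_FPS  # ~16.67 ms per frame

def dropped_frames(frame_times_ms):
    """Count frames that overshoot the budget; each overshoot shows
    up on screen as a stutter or a repeated frame."""
    return sum(1 for t in frame_times_ms if t > BUDGET_MS)

capture = [16.1, 16.4, 17.9, 16.2, 33.3, 16.0]  # made-up trace
print(dropped_frames(capture))  # 2
```

Whatever the clocking scheme, a scene only ships smoothly once every frame in traces like this fits the budget.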
 
Last edited:
That article is every bit as valid as the now canned article from mister "has no experience developing for XSX"...

*snip*

Why all that hate, my friend? :)

This Crytek dev is just saying the same thing that Jason Schreier has already reported from 3 different 3rd-party devs/sources.

Let's listen to him again:

 

IntentionalPun

Ask me about my wife's perfect butthole
A quote from Liabe Brave of era regarding the memory of XSX:


Don't kill me, I am just quoting.
Yeah, it's unified memory... that's how it works. The CPU and GPU contend for access to the unified pool (as do things like 3D audio processors and any on-board processor that needs RAM).
 
Last edited:

CJY

Banned
Huh? I don't think this is anywhere close to accurate; the PS5 doesn't save frequency until the "game needs it." It scales down the frequency during certain heavy workloads, it otherwise runs at max.

*snip*
Everything you need to know is in my post already. Your conclusion seems to suggest that devs on both consoles will need to do the same thing and optimise in the same way, and that's right to a degree, which I said already, but it's going to be far easier on PS5 because of the new paradigm and the extra performance ceiling that will be present because of the continuous boost.

And anyway, if it's so easy, then why do we have DF testing for this with every game, and why is finding a game that runs a solid 60fps without dropped frames such a rarity? It's because it's hard to account for every situation in a game from moment to moment.

Look, I'm not saying we'll never see a dropped frame on PS5, but the way it's set up and the devkit is set up, it should be a rarity for there to be any dropped frames... the complete opposite of what we experience in games today.


An example where we'll likely still see dropped frames would be a sandbox open-world game where you can do lots of crazy things that push the limits of the engine and what the developer intended... like Just Cause.

For linear games where things are more controlled, I think we're never going to see a dropped frame. Dynamic Resolution Scaling will also help, but the continuous boost is what will eradicate the issue in 99.9% of scenarios, like I said.
 
Last edited:

IntentionalPun

Ask me about my wife's perfect butthole
Everything you need to know is in my post already.

Your post describes the PS5 wrong. Everything you need to know about why you were wrong is in my post; I suggest you read it.

And anyway, if it's so easy, then why do we have DF testing for this with every game, and why is a game that runs a solid 60fps without dropped frames such a rarity? It's because it's hard to account for every situation in a game from moment to moment.

I didn't say it's easy; I said it's the same on a locked setup vs the PS5 GPU's setup. Developers have to discover frame drops and choose whether to scale back those scenes; there is no magic tech on the PS5 that boosts frequencies to fix the framerate.

Look, I'm not saying we'll never see a dropped frame on PS5, but the way it's set up and the devkit is set up, it should be a rarity for there to be any dropped frames... the complete opposite of what we experience in games today.

And I'm saying you are flat-out wrong, describing the tech incorrectly. What in the world in the PS5 is going to stop frames from being dropped? You appear to have a 100% backwards understanding of its clocks. They aren't sitting around idling when not needed, lol; if anything they are at max when they are needed least, and have to scale back when they are needed most (heavy workloads).
 
Yeah it's unified memory.. that's how it works. CPU and GPU contend with access to the unified pool (as do things like 3d audio processors and any processor on board that needs RAM.)
That was not exactly my point. I mean the console can only access one type of memory at a time, so if you want to check something related to the OS or the game, you need to stop reading from the faster memory and then start with the slow. So yes, part of the memory is faster, but in practice both memories need to be read each second, not only the fastest, so your real average speed per second will be lower.

Another quote from Liabe (I love how simply this guy describes a complicated thing):
And a 2080Ti has 11GB. Say the CPU only needs 30% of total RAM transfer time, and GPU 70%. But then GPU memory usage tips over to 10.5GB. While that's the case, the average theoretical bandwidth for XSX would be ~480 GB/s. Versus 448GB/s for PS5 in the same scenario.
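The quoted ~480 GB/s figure can be roughly reconstructed under some assumptions (CPU traffic all hitting the slow pool, the GPU spilling 0.5 GB past the 10 GB fast pool; the time shares are illustrative, not measured):

```python
# Rough reconstruction of the quoted figure. The pool speeds are the
# published XSX numbers (560 GB/s for 10 GB, 336 GB/s for 6 GB);
# the traffic shares below are assumptions for illustration.
FAST_BW, SLOW_BW = 560.0, 336.0
cpu_share, gpu_share = 0.30, 0.70   # assumed fractions of total bus time

# GPU uses 10.5 GB: 10 GB sits in the fast pool, 0.5 GB spills to slow.
gpu_fast_frac = 10.0 / 10.5
gpu_slow_frac = 0.5 / 10.5

gpu_bw = gpu_share * (gpu_fast_frac * FAST_BW + gpu_slow_frac * SLOW_BW)
cpu_bw = cpu_share * SLOW_BW        # CPU traffic assumed all slow-pool
print(round(gpu_bw + cpu_bw))       # ≈ 485, in the ballpark of ~480
```

With different assumed shares the average moves around, but the shape of the result is the same: once the GPU's working set spills past 10 GB, the blended average drops toward the PS5's flat 448 GB/s.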
 
Last edited:

CJY

Banned
Your post describes the PS5 wrong. Everything you need to know about why you were wrong is in my post; I suggest you read it.

I'm not describing the PS5 wrong. Care to elaborate on where I'm wrong? And what makes you think the GPU always runs at max? Until recently everybody was doubting whether the PS5 could even sustain those clocks. You mentioned workload... If a game's workload for a particular scene only needs 90% of the GPU, then why would it run at full tilt?

I didn't say it's easy; I said it's the same on a locked setup vs the PS5 GPU's setup. Developers have to discover frame drops and choose whether to scale back those scenes; there is no magic tech on the PS5 that boosts frequencies to fix the framerate.

Yep, it's not easy, and it has never been easy, but with the PS5's power profiles it will be substantially easier, and there's breathing room left over to boost the clocks past what they've optimised for, to account for unforeseen spikes in GPU requirement.

EDIT: the job for the dev in trying to eliminate dropped frames will be the same, but the result will be different. Meaning, PS5's continuous-boost variable clocks are designed to help maintain game performance, whereas on traditional consoles there's more chance of there being nothing "left in the tank", so to speak. All this is without even mentioning SmartShift, which will give even more power to the GPU if the CPU is underutilised.

And I'm saying you are flat-out wrong, describing the tech incorrectly. What in the world in the PS5 is going to stop frames from being dropped? You appear to have a 100% backwards understanding of its clocks. They aren't sitting around idling when not needed, lol; if anything they are at max when they are needed least, and have to scale back when they are needed most (heavy workloads).

The boost is what will stop the dropped frames, together with devs optimising within the profile they've chosen on the devkit. It's very simple really. It may not be a panacea that magically makes dropped frames disappear; that's not what I'm talking about. It just makes optimisation far easier than the old paradigm, and the continuous-boost clocks allow for more performance when needed.
 
Last edited:

FeiRR

Banned
So it looks like the biggest request from the devs to Xbox was 'For God's sake, don't use Windows again on your console', and they just heard "please give me more flops and split memory".

I know a joke and it's not console related. But just wait:
A business helicopter is lost amidst NYC skyscrapers. They hover around for a while, then the pilot says, "Okay, I have an idea." He hovers next to a huge office building with glass windows; you can see people working inside. When the heli hovers next to them, there's some movement. The pilot takes pen and paper and writes a message, sticking it to the front of the cabin: "Where are we?" You can see movement in the skyscraper, people running around; they also find pen and paper and write a message, then hold it against the window. It says: "You are in a helicopter." The pilot shakes his head in dismay, yet his passenger shouts: "Okay, I know the way now! This is the Microsoft building!"
 

IntentionalPun

Ask me about my wife's perfect butthole
I'm not describing the PS5 wrong. Care to elaborate on where I'm wrong? And what makes you think the GPU always runs at max? Until recently everybody was doubting whether the PS5 could even sustain those clocks. You mentioned workload... If a game's workload for a particular scene only needs 90% of the GPU, then why would it run at full tilt?

You are 100% describing it wrong, and this paragraph proves it. It's actually the PS5 GPU that manages to run at 100% frequency when only 90% of it is needed, not the other way around, lol (as does the Xbox, since it's locked... other variable-frequency designs WOULD scale down for a low-needs scene; that's not how the PS5 works, though).

The PS5 uses a specific amount of power, as in watts. If the workload is low, it's actually likely to be at 100% frequency for both CPU and GPU. It's when the workload is HIGH that it scales down.
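That fixed-power-budget behaviour can be sketched as a toy model (all constants are invented for illustration; this shows the principle, not Sony's actual algorithm):

```python
# Toy model of a fixed-power-budget APU: clocks sit at max under light
# workloads and shave a little off under the heaviest ones. Constants
# are invented, not Sony's figures.
MAX_CLOCK_GHZ = 2.23     # PS5's advertised GPU frequency cap
POWER_BUDGET = 0.9       # normalised budget (assumed value)

def clock_for_workload(activity: float) -> float:
    """activity: 0..1, how much of the chip is switching each cycle.
    Power is modelled as activity * (f / f_max)^3; return the highest
    clock that keeps power inside the fixed budget."""
    scale = (POWER_BUDGET / max(activity, 1e-9)) ** (1 / 3)
    return round(MAX_CLOCK_GHZ * min(1.0, scale), 3)

print(clock_for_workload(0.5))  # light scene: pegged at 2.23
print(clock_for_workload(1.0))  # pathological scene: drops to ~2.15
```

Note the inversion CJY's reading misses: the model downclocks at the *heaviest* activity, and because power rises roughly with the cube of frequency, a small frequency drop recovers a lot of power.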

Yep, it's not easy, and it has never been easy, but with the PS5's power profiles, it will be substantially easier and there's breathing room left over to boost the clocks past what they've optimised for to account for unforeseen spikes in GPU requirement.

Look, if a dev optimises for some super-low profile and then finds "hey, performance is actually better in real life", that could happen; but that isn't some magical thing the PS5 APU is doing, and it really has little to do with the variable frequencies (other than making it so devs would rather dev against a locked clock). Someone could optimise for a GPU weaker than the one in any given hardware and then see better real-world performance, regardless of whether that APU was doing variable frequencies.

The PS5 GPU does not act like you say it acts either way.

The boost is what will stop the dropped frames and devs optimizing within the profile they've chosen on the dev kit. It's very simple really. It may not be a panacea and magically make dropped frames disappear. It's not what I'm talking about. It just makes optimization far easier than the old paradigm, an the continuous boost clocks allows for more performance when needed.

If this is such an amazing idea, why doesn't MS just let devs underclock their Xbox Series X's? You'd get the same effect without any need for variable frequencies. You are basically just saying "devs will optimise for lower clocks, therefore we will see better framerates in real life" here. Which has little to do with the variable-frequency APU that you are describing incorrectly.
 
Last edited:

CJY

Banned
You are 100% describing it wrong, and this paragraph proves it. It's actually the PS5 GPU that manages to run at 100% frequency when only 90% of it is needed, not the other way around, lol (as does the Xbox, since it's locked... other variable-frequency designs WOULD scale down for a low-needs scene; that's not how the PS5 works, though).
You keep trying to say I'm wrong without actually describing what it is actually doing. I really don't understand what you're saying, because it flies in the face of what we know about the PS5. It sounds like you're saying that the PS5 GPU runs at 100% clocks at all times? Then where do the variable clocks come in?


The PS5 uses a specific amount of power, as in watts. If the workload is low, it's actually likely going to be at 100% frequency for both CPU and GPU. It's when the workload is HIGH that it scales down.
You keep mentioning or alluding to the fixed power budget, but this has nothing to do with the workload whatsoever. The fixed power budget is put in place to have a predictable thermal envelope, so PS5 can stay quiet no matter what kind of workload it's under. Mentioning anything about power budget and watts makes zero sense here.



Look, if a dev optimizes for some super low profile and then "hey, performance is actually better in real life" happens, sure, that could occur; but that isn't some magical thing the PS5 APU is doing, and it really has little to do with the variable frequencies (other than making devs prefer to dev against a locked clock). Someone could optimize for a GPU weaker than the one in any hardware and then see better real-world performance, regardless of whether that APU was using variable frequencies.

The PS5 GPU does not act like you say it acts either way.

It doesn't have to be "super low". Those tolerances will be up to the developer to decide. If there is very little variance in a particular game from frame to frame, they'll profile higher; if there is a lot of variance, they'll profile lower. The whole point of the continuous boost is to improve performance at runtime. Why else would there be continuous boost at all? Let me guess: you think it's to try to get closer to XSX's 12TF figure?

If this is such an amazing idea, why doesn't MS just let devs underclock their Xbox Series X? You'd get the same effect without any need for variable frequencies. You are basically just saying "devs will optimize for lower clocks, therefore we will see better framerates in real life" here. Which has little to do with the variable-frequency APU that you are describing incorrectly.

PS5 is not underclocked or overclocked; its clocks are variable depending on workload. You will find that XSX and every GPU has variable clocks. It's not running at full tilt at all times, that's ridiculous. It's why I don't quite understand why Xbox decided to say their clocks are fixed. I think it's because they thought PS5 has boost clocks like a normal PC boost, and that it can't sustain those peaks. Well, we now know that's not true and that it's boosting for a different reason.

And it's that very reason that I'm talking about... to make the game performance at runtime better and more consistent for the end user.

I'm really not sure if XSX can adopt this new paradigm; I haven't thought about that. It's a difficult thing to get your head around when PCs have been one way for decades. Maybe they can do it with XSX, but I have no doubt all consoles will be like this in the future. It just makes too much sense. The old way was too open-ended and full of unknowns in terms of power draw, heat, and required cooling. The old way won't let you get the absolute maximum out of your silicon because of those unknowns.
 

IntentionalPun

Ask me about my wife's perfect butthole
You will find that XSX and every GPU has variable clocks. It's not running at full tilt at all times, that's ridiculous.

OK have fun, you are just flat out wrong and this is beyond frustrating. XSX will always run at the same clock frequency because it has locked clocks. WTF?

quite understand why Xbox decided to say their clocks are fixed.

Because they are, in fact, fixed.

Not going to keep repeating myself; you flat out do not understand this topic and you apparently have no interest in listening to someone who does.
 

CJY

Banned
OK have fun, you are just flat out wrong and this is beyond frustrating. XSX will always run at the same clock frequency because it has locked clocks. WTF?



Because they are, in fact, fixed.

Not going to keep repeating myself; you flat out do not understand this topic and you apparently have no interest in listening to someone who does.

Nope, you're absolutely wrong and being taken for a ride by MS PR department. No GPU runs at maximum fixed frequencies at all times. It's wasteful and stupid. I think you need to go and educate yourself.

Edit: what MS means is that it's capable of fixed, sustained maximum clocks. Which PS5 can also do. PS5 just has a different use for the boost, which MS misinterpreted.

Edit 2: Why else would they bother revealing that XSX has "fixed" clocks at all, when every console since the dawn of time has had fixed clocks? They (MS) clearly got a hold of Sony info prior to the reveal.
 

IntentionalPun

Ask me about my wife's perfect butthole
Nope, you're absolutely wrong and being taken for a ride by MS PR department. No GPU runs at maximum fixed frequencies at all times. It's wasteful and stupid. I think you need to go and educate yourself.

Edit: what MS means is that it's capable of fixed, sustained maximum clocks. Which PS5 can also do. PS5 just has a different use for the boost, which MS misinterpreted.

Except consoles generally use fixed clocks...

I'm well aware that the vast majority of processors on multi-purpose machines (PCs, phones, tablets) use variable frequencies. Consoles historically don't; that's why Sony making the PS5 variable (using a unique method, not the same as used on PC) is making the news so much.

You seriously need to educate YOURSELF dude, and maybe fuck off a little bit.
 

CJY

Banned
Except consoles generally use fixed clocks...

I'm well aware that the vast majority of processors on multi-purpose machines (PCs, phones, tablets) use variable frequencies. Consoles historically don't; that's why Sony making the PS5 variable (using a unique method, not the same as used on PC) is making the news so much.

You seriously need to educate YOURSELF dude, and maybe fuck off a little bit.
Nah... you're the one who came in here telling me I'm wrong when I don't see you demonstrating any knowledge whatsoever. I only noticed you earlier shilling for Xbox, so I don't expect you to understand anything anyway.
 

IntentionalPun

Ask me about my wife's perfect butthole
Nah... you're the one who came in here telling me I'm wrong when I don't see you demonstrating any knowledge whatsoever. I only noticed you earlier shilling for Xbox, so I don't expect you to understand anything anyway.
OK now definitely fuck off, a lot.
 

Kusarigama

Member
Nope, you're absolutely wrong and being taken for a ride by MS PR department. No GPU runs at maximum fixed frequencies at all times. It's wasteful and stupid. I think you need to go and educate yourself.

Edit: what MS means is that it's capable of fixed, sustained maximum clocks. Which PS5 can also do. PS5 just has a different use for the boost, which MS misinterpreted.

Edit 2: Why else would they bother revealing that XSX has "fixed" clocks at all, when every console since the dawn of time has had fixed clocks? They (MS) clearly got a hold of Sony info prior to the reveal.
No, it is fixed at those clock speeds. The power consumption of XSX varies with the workload.

From the examples Mark Cerny gave, what I understand is that in certain games even simple scenes, like the map screen of Horizon, can cause an increase in workload, resulting in more heat, power demand and fan noise. On PS5, to mitigate such occurrences, the algorithm they have in place catches moments where the workload is increasing even when the displayed geometry is very simple, and reduces the frequency of the GPU, keeping power draw, heat production and noise in check.
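To make that concrete, here's a toy sketch of a fixed-power-budget clock scheme. Every number here and the assumption that dynamic power scales roughly with frequency cubed (voltage tracking frequency) are mine for illustration only; Sony hasn't published the actual algorithm or budget:

```python
# Toy model of a fixed-power-budget clock scheme (illustrative only;
# the real PS5 algorithm and budget are not public).
POWER_BUDGET_W = 200.0   # hypothetical fixed power budget
F_MAX_GHZ = 2.23         # PS5's advertised GPU clock ceiling

def gpu_clock_for_workload(power_at_fmax_w: float) -> float:
    """Deterministic clock pick for a workload whose estimated power
    draw at the maximum clock is power_at_fmax_w (watts)."""
    if power_at_fmax_w <= POWER_BUDGET_W:
        # Light workload: run at the full clock, even for a "busy"
        # map screen, as long as it fits the budget.
        return F_MAX_GHZ
    # Heavy workload: drop frequency just enough to fit the budget.
    # With the assumed P ~ f^3, the scale factor is the cube root.
    return F_MAX_GHZ * (POWER_BUDGET_W / power_at_fmax_w) ** (1 / 3)
```

Under this model a workload drawing 10% over budget only costs about 3% of clock, which lines up with Cerny's point that giving up a couple of percent of frequency buys back roughly 10% of power.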


Edit: The SmartShift technology is there to increase performance, not to help with heat management.
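A similarly hedged sketch of the SmartShift idea: CPU and GPU share one power budget, and headroom the CPU isn't using gets handed to the GPU. The caps and numbers are made up, and AMD hasn't published the real algorithm:

```python
# Toy SmartShift-style budget split (illustrative only; AMD's actual
# algorithm is not public). Unused CPU headroom shifts to the GPU so
# it can hold a higher clock — a performance feature, not a cooling one.
TOTAL_BUDGET_W = 200.0   # hypothetical shared power budget
CPU_CAP_W = 60.0         # hypothetical maximum the CPU may claim

def split_budget(cpu_demand_w: float) -> tuple[float, float]:
    """Return (cpu_watts, gpu_watts) for the current CPU demand."""
    cpu_alloc = min(cpu_demand_w, CPU_CAP_W)
    gpu_alloc = TOTAL_BUDGET_W - cpu_alloc  # GPU absorbs the headroom
    return cpu_alloc, gpu_alloc
```

So on a CPU-light frame the GPU side of the budget grows, and vice versa, while the total (and therefore heat and noise) stays fixed.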
 