
Next-Gen PS5 & XSX |OT| Console tEch threaD

Status
Not open for further replies.

-Arcadia-

Banned
How many of you are content with having "Next Generation" games be native 4k, 30FPS with RTX on?
I think many will end up like that and we've already seen proof of it happening. (Assassin's Creed Valhalla).

That, frankly, is well beyond what I was expecting with ray-tracing in the picture. I would be ecstatic.

I really liked the GT7 demo for that. All that ray-tracing in use, at what appeared to be 4K and 60 FPS. I was additionally fascinated to learn that the ray-tracing was done at a lower resolution, because I couldn't tell, and developers making use of smart tricks like that bodes really well for widespread use without harsh compromises.
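A quick bit of arithmetic shows why running ray-tracing below output resolution is such an effective trick: ray count scales with pixel count, so halving the resolution on each axis cuts the ray budget to a quarter. The resolutions and rays-per-pixel figure below are illustrative assumptions, not numbers from the GT7 demo.

```python
# Rough ray-budget arithmetic for tracing reflections below output resolution.
# Resolutions and rays-per-pixel are illustrative assumptions only.

def rays_per_frame(width, height, rays_per_pixel=1):
    """Reflection rays needed for one frame at this resolution."""
    return width * height * rays_per_pixel

full = rays_per_frame(3840, 2160)  # tracing at native 4K
half = rays_per_frame(1920, 1080)  # tracing at half resolution per axis

print(full // half)  # 4 -> a quarter of the rays, then upscaled back to 4K
```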
 
Oh absolutely. One of the things I've loved about this generation is that I rarely, if ever, noticed a game that really chugged or hitched. I'm sure it's out there, but it was at least WAY less frequent than in the previous generation. So games being locked at a minimum of 30fps is the priority. I want them to run smoothly, continuously, without performance issues. Then expand the complexity and mechanics, and THEN I care about adding more RT, maxing frame rates, etc.

The problem we have is that the graphics gap is already going to be small when compared to ps4 pro/ xbox one x, and if you double the framerate I doubt that you will get much improvement in fidelity. A lot of games will have a 60fps mode, but the majority won’t. I think having too many options isn’t good and just means none of them are perfect. I would prefer one version that the developers have spent time on making great.
 

Sinthor

Gold Member
That, frankly, is well beyond what I was expecting with ray-tracing in the picture. I would be ecstatic.

I really liked the GT7 demo for that. All that ray-tracing in use, at what appeared to be 4K and 60 FPS. I was additionally fascinated to learn that the ray-tracing was done at a lower resolution, because I couldn't tell, and developers making use of smart tricks like that bodes really well for widespread use without harsh compromises.

I have been surprised by how much RT has been shown so far as well. To be honest, when I saw the Minecraft demo for XSX I figured 'that's it, we'll hardly see RT at all this generation,' because of the massive performance hit it showed, and that hit was hardly a surprise as RT is DEMANDING. But from what I've read so far, it seems like devs can get some good bang for the buck from RT while taking only about a 10% performance hit, which appears to be what we're seeing right now. I'm happy with that, as it still seems to let them hit 4K at at least 30fps. So, OK. And for games where 60fps is needed, take the RT off, fine with me!

I'm just pleasantly surprised that it seems we'll see a lot more RT in games than I had expected. Again, as long as they don't make games start chugging through sections of levels, I'm all for it. Can't wait to see more demos and games from both XSX and PS5 so we can see more of what we can expect!
 
Can someone just react to this post god dammit


 
I hope it doesn't happen. I don't want to pay more and get a pile of old crap I'm not interested in. Let me pay for my MP and cloud saves, leave the rest out. I prefer to pick games I play.

Yes, but if they want PS Now to have a more active scene on the interface, they need to add it to the PS Plus tier. Out of my whole friend list of around 300 PSN friends, I think I'm the only one that has it. They say, "you actually pay for that shit?"
 

ZywyPL

Banned
That, frankly, is well beyond what I was expecting with ray-tracing in the picture. I would be ecstatic.

I really liked the GT7 demo for that. All that ray-tracing in use, at what appeared to be 4K and 60 FPS. I was additionally fascinated to learn that the ray-tracing was done at a lower resolution, because I couldn't tell, and developers making use of smart tricks like that bodes really well for widespread use without harsh compromises.

Yeah, investing in CBR development surely pays off now, more than anyone could've imagined. I just hope it becomes a standard technique, rather than, for example, cutting objects out of reflections like R&C does. GT7 not only looks better because of it, but also runs at 60FPS; it's a win-win solution.
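For anyone curious what checkerboard rendering amounts to mechanically, here is a deliberately tiny sketch: each frame shades only half the pixels (one checkerboard phase) and carries the other half over from the previous frame. Real CBR reconstructs the missing samples with motion vectors and ID buffers rather than a naive carry-over; this only shows where the roughly-half shading cost per frame comes from.

```python
# Toy checkerboard-rendering sketch: shade half the pixels per frame and
# reuse the previous frame for the rest. Illustrative only; real CBR uses
# motion vectors and ID buffers to reconstruct the missing samples.

def cbr_frame(shade, prev, phase, h, w):
    """Shade pixels where (y + x + phase) is even; reuse prev elsewhere."""
    return [[shade(y, x) if (y + x + phase) % 2 == 0 else prev[y][x]
             for x in range(w)] for y in range(h)]

H = W = 4
scene = lambda y, x: y * W + x            # a static stand-in "scene"
empty = [[0] * W for _ in range(H)]

f0 = cbr_frame(scene, empty, 0, H, W)     # half the pixels shaded
f1 = cbr_frame(scene, f0, 1, H, W)        # the other half filled in

# For a static scene the full image is recovered after two frames.
assert f1 == [[y * W + x for x in range(W)] for y in range(H)]
```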
 
I hope it doesn't happen. I don't want to pay more and get a pile of old crap I'm not interested in. Let me pay for my MP and cloud saves, leave the rest out. I prefer to pick games I play.
It would be good to have options. I have both services, so if they could be bundled in for a small discount I would say yes please :)
 

sircaw

Banned
I don't think anybody has recently brought up the way PS5's variable frequency mitigates situations like 'race to idle' and how that is a big win compared to fixed frequency; there's a link somewhere in the past posts with a graph showing an example of how it's implemented. I was gonna elaborate on this... but I'm gonna hold off till pest control has done their job (come on mods!)... I have no desire to waste my time in twisted fucked-up debates with the 'he/she'!!

Haha, I love it. Wonder if they take cats!
 

Bo_Hazem

Banned
I want a stable framerate at 30fps, but with greater resolution and fidelity. I don’t want ps4 pro looking games at 60fps. People who want 60fps should just go to pc, where it can be achieved with higher fidelity.

I'm more for native 4K@30fps over 1800-1440p@60fps checkerboard 4K. But it depends on the game. The last Tomb Raider game with an unlocked framerate looked identical to the 30fps mode in visuals. I played GOW at 30fps because it looked better than the unlocked mode. But for an FPS like Borderlands 3 I went for framerate, because the resolution mode was way below 30fps to begin with.

To me it depends, but for third-person games I would rather go for visuals over framerate. For multiplayer/car games/FPS, 60fps is a must, just like Uncharted 4's MP mode runs at 60fps, but at a lower resolution.

With OLED though, you're gonna suffer with 30fps gaming because of stutter.
 

CrysisFreak

Banned
I'm more for native 4K@30fps over 1800-1440p@60fps checkerboard 4K. But it depends on the game. The last Tomb Raider game with an unlocked framerate looked identical to the 30fps mode in visuals. I played GOW at 30fps because it looked better than the unlocked mode. But for an FPS like Borderlands 3 I went for framerate, and the resolution mode actually was way below 30fps to begin with.

To me it depends, but for third-person games I would rather go for visuals over framerate. For multiplayer/car games/FPS, 60fps is a must, just like Uncharted 4's MP mode runs at 60fps, but at a lower resolution.

With OLED though, you're gonna suffer with 30fps gaming because of stutter/judder.
Tell that to Ubisoft lmao.
That's the point. Devs rolling with 30 when in reality 60 is a fckn necessity, like in For Honor/The Crew.
 

Vae_Victis

Banned
no pdf though

source:
Found the original patent filing: https://www.j-platpat.inpit.go.jp/p0200

There's no English version though, so I'm not really sure what this is all about or how it is being presented.
 

Bo_Hazem

Banned
Tell that to Ubisoft lmao.
That's the point. Devs rolling with 30 when in reality 60 is a fckn necessity, like in For Honor/The Crew.

Yeah, I love The Crew 1-2, but that 30fps is kinda nasty. It already looks like shit at 30fps, imagine at 60fps! I think it's too ambitious for current gen; I hope next-gen TC3 will be great!
 

Lunatic_Gamer

Gold Member
Games With “Massive, Detailed Worlds” Will Benefit the Most From PS5 and Xbox Series X SSDs – Dysmantle Dev

“Everything will feel a bit snappier, but that’s already something you’ve sort of gotten used to on PCs with SSDs,” Töyssy said. “But now for the first time, you can rely on that speed being available on a console. There have been studies on how much of loading the player can tolerate before it starts to be annoying. Making it load faster, you can simply add more stuff in shorter time and still provide a good user experience.


“Streaming world content is one of the big things that will get better. You can have a more detailed world as you can stream data faster from mass storage into the GPU. You can also move faster in the game world as the hardware can keep up better. Dysmantle also has a streaming system for the world, but we are probably not exceeding even current-gen capabilities in that regard. Games with massive, extremely detailed worlds will benefit the most.”
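The "add more stuff in a shorter time" point is easy to put numbers on. Using the commonly quoted raw read rates (roughly 5.5 GB/s for the PS5 drive, 2.4 GB/s for the Series X, and around 0.1 GB/s for a typical console HDD) and an arbitrary 10 GB asset payload:

```python
# Back-of-envelope load times for a 10 GB asset payload. Drive rates are
# the commonly quoted raw figures; the payload size is an arbitrary example.

def load_seconds(payload_gb, throughput_gb_per_s):
    return payload_gb / throughput_gb_per_s

for name, rate in [("HDD", 0.1), ("XSX SSD", 2.4), ("PS5 SSD", 5.5)]:
    print(f"{name}: {load_seconds(10, rate):5.1f} s")
```

Which is the whole pitch: a streaming budget that used to span a loading screen now fits inside a camera turn.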


 

Xplainin

Banned
So basically, under a fixed clock system the PS5 may have been fixed at maybe 2GHz for the GPU and probably a lower CPU clock too, with games optimised for that, effectively leaving performance on the table most of the time.

With the new variable system, games are optimised at locked profiles, probably approaching or maybe even at the max limits; let's go with that and say 2.23GHz and 3.5GHz.

Now, under a fixed clock system this would be well beyond the system's cooling capability in worst-case code areas and therefore can't happen: without being able to lower clocks even temporarily on the fly, the demand at those moments may be too much, so clocks are set lower to account for this.

But on the PS5 it can happen, because game code does not come near to fully utilising the GPU/CPU at all times, so games optimised at these much higher clock speeds are able to run at those higher levels and perform much better than a fixed-clock version, because they have the ability to clock down, if needed, during those worst-case scenarios.

Now to SmartShift, and I'm just trying to clarify here: if the GPU is being highly utilised and running at max frequency and the CPU isn't at that moment (or vice versa), you could transfer more of the available power budget to enable the GPU to stay at max, even if it already was, because the workload has increased and now uses more power to stay at max. Without SmartShift, I presume it wouldn't be able to sustain that performance at the higher workload and may have had to downclock for the duration of it without that additional power boost. Because prolonged spikes of high utilisation are apparently so rare, both should be able to run at max most of the time, as power can be shifted to each. So in cases where one has spare power, which with game code not being at max utilisation seems like it could be often, there's scope to push these components to max frequency and higher workloads than possible before?

If both GPU and CPU are running at max frequency, this is possible but depends on workload. If the workload increased on both, pushing the power budget to or beyond the maximum, the component under least load would downclock, or maybe even both would. Downclocks could last from a few ms in a frame upwards and be very minor (a 10% drop in power results in only a few % drop in frequency, and there's no reason to jump straight to a 10% power cut; it could be much less, depending on what's needed to meet the power and cooling limit). But this is likely very rare, as the loads on the GPU and CPU are rarely anywhere near max, especially at the same time?

So to me the variable clocks and SmartShift seem like a very clever solution, and together they should ensure very high levels of performance not attainable from the same part under a fixed clock system.

Maybe someone can correct me, add anything, or fill in any gaps, as this is just me trying to fully understand all aspects of it.
The GPU can't go over 2.23GHz. So if the GPU is already running full tilt and the CPU is only at, say, 90% utilization, no, the GPU isn't going to get any additional power from SmartShift.

This whole thing stems from Sony going with a smaller GPU and having to clock it higher to get the performance they needed.
The power draw on the GPU would be very high, likely higher than the XSX GPU's.
Because of this thermal load it is not feasible, from a cooling point of view, to cool it with everything running at full clocks for long periods of time.
So the clocks will be backed off if the game is demanding for periods of time.
It's not all about the speed of the GPU either.
The XSX GPU will be running at 1825MHz when you have stopped in game and are looking at the sky, and also at 1825MHz when you are in a 100-man battle royale with shit flying everywhere. The power draw will be less when you are looking at the sky compared to when you are in the heat of the battle.
So with the PS5, if you are playing a game like TLOU2, where it's quite slow-moving, the PS5 will no doubt be able to run full clocks. If however you are playing COD, where there is intense fighting and effects on screen, then you are more likely to need to reduce clock speeds. An easy way to think about it: the games and areas on your Pro that spin the jet engines up are the times when the variable clocks might come into play. What SmartShift will do is, if in these intense areas the GPU is at full tilt but the CPU is only at 90% utilization, then SmartShift will allow the GPU to stay at full frequency. If the CPU is at full utilization, then the GPU will get its clock reduced. No one really knows how far it can go down.
It also needs to be pointed out that if there is only a brief section of intense gameplay, it might not need to reduce clocks.
As no game is going to be running at 100% utilization of the APU at all times, then, as Cerny said, they will spend most of their time at full frequency.
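The "10% drop in power results in a few % drop in frequency" figure this exchange keeps coming back to is consistent with the usual rule of thumb that dynamic power scales roughly with f·V², with voltage tracking frequency, i.e. P ∝ f³. The cubic exponent below is that textbook approximation, not a published PS5 spec:

```python
# If P ∝ f^3 (dynamic power ~ f * V^2 with V tracking f), the frequency
# sustainable at a reduced power budget is roughly the cube root of the
# power fraction. The cubic exponent is a rule-of-thumb assumption.

def freq_at_power(power_fraction, exponent=3.0):
    """Relative frequency sustainable at a fraction of full power."""
    return power_fraction ** (1.0 / exponent)

print(f"{freq_at_power(0.90):.3f}")  # ~0.965 -> a 10% power cut costs ~3.5% clock
```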
 

CrysisFreak

Banned
Yeah, I love The Crew 1-2, but that 30fps is kinda nasty. But it already looks like shit in 30fps, imagine with 60fps! I think it's too ambitious for current gen, I hope next gen TC3 will be great!
Yeah the devs of Crew are fckn stupid.
"Hey look our LOD system collapses like a noodle when an object is a bit further away let's include planes so everyone can see how ugly it is lmao"
 

Neo Blaster

Member
I agree. I'd like to see more complexity in the games, and different types of gameplay, rather than being stuck on "4K 60fps 4EVA!" Really, a ton of games today are basically the same game with different graphics and some variation on gameplay. Not saying they all have to be different, but seeing the power of these boxes result in at least some games with very different mechanics and features would be pretty nice. I'd take that over tons of ray tracing or 60fps in every game any day.
I think you'll get what you want not from power but from I/O speed, and not from the first next-gen games; game design doesn't change that fast. I'm excited about what we'll see later on.
 

Bo_Hazem

Banned
Yeah the devs of Crew are fckn stupid.
"Hey look our LOD system collapses like a noodle when an object is a bit further away let's include planes so everyone can see how ugly it is lmao"

And I had a slightly faster HDD than the OEM one, and still heard lots of complaints about cars popping in right in front of you at high speeds!

I still love the game, but it has lots of flaws, and that RPG shit, especially the part that gives you perk points to make you superior to newcomers, should be gone.
 

Xplainin

Banned
Being sick of users like @SleepDoctor (or @Jon Neu, etc.), I am currently in an open conversation with the Staff. Until that conversation is over, I will not participate in the forum the way that I did. Once the discussion with them is over, I will decide whether I continue to contribute to the forum or pack my suitcase (or my BS, as some users call it) and leave. I am sure these types of users are normal on forums, but it is something I am not used to; if it has to be that way, then I will simply leave. If, on the contrary, measures are taken in this regard, then I will make another decision.
I wouldn't be so sensitive to people questioning you.
It is always going to happen.
Some of the things you have said are questionable. That could be because English is your second language, or because you are repeating something you have been told that isn't correct, I don't know.
You said on here that the PS5 SSD can do up to 22 GB/s, which is incorrect. If you watch the PS5 presentation you will see he was saying that the hardware decompression block on the PS5 can output up to 22 GB/s if the data compresses particularly well, not that the SSD itself can read at 22 GB/s.
Maybe you got the 22 GB/s from the PS5 presentation and, English being your second language, misheard what he was saying.
Having someone question you about stuff like that is fair.
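To make the distinction concrete: the drive's raw rate and the decompressor's output differ by the compression ratio of the data being read. The 5.5 GB/s raw figure is Sony's stated spec; the ratios below are illustrative examples, not measured game data.

```python
# Effective read rate = raw drive rate x compression ratio of the data.
# 5.5 GB/s is the stated raw spec; the ratios are illustrative examples.

RAW_GB_PER_S = 5.5

def effective_gb_per_s(compression_ratio):
    return RAW_GB_PER_S * compression_ratio

print(effective_gb_per_s(1.6))  # 8.8  -> in the quoted "typical" range
print(effective_gb_per_s(4.0))  # 22.0 -> the best-case ceiling
```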
 

DrDamn

Member
This whole thing stems from Sony going with a smaller GPU and having to clock it higher to get the performance they needed.

This I kind of agree with, whether it was down to needing to put other stuff on the APU for IO/Sound or related to keeping the CU count in line with the PS4 Pro - probably a combination of both.

Because of this thermal load it is not feasible, from a cooling point of view, to cool it with everything running at full clocks for long periods of time.
So the clocks will be backed off if the game is demanding for periods of time.

This isn't how I understand it. How long the game is demanding for is not a factor in clocks backing off, because for that time period the power draw is the same. So if it fits within the power profile, then it can be maintained for as long as needed. If you just think about heat then yes, but for power no; and the clock speed is dependent on power draw (which is consistent), not heat (which can build up and be affected by external factors).

So with the PS5, if you are playing a game like TLOU2, where it's quite slow-moving, the PS5 will no doubt be able to run full clocks. If however you are playing COD, where there is intense fighting and effects on screen, then you are more likely to need to reduce clock speeds.

This is not a good example. TLOU2 makes my PS4 fan spin up like a jet engine. The GPU usage is high. Just because CoD is a faster game it doesn't mean the GPU usage is higher.
 
The GPU can't go over 2.23GHz. So if the GPU is already running full tilt and the CPU is only at, say, 90% utilization, no, the GPU isn't going to get any additional power from SmartShift.

This whole thing stems from Sony going with a smaller GPU and having to clock it higher to get the performance they needed.
The power draw on the GPU would be very high, likely higher than the XSX GPU's.
Because of this thermal load it is not feasible, from a cooling point of view, to cool it with everything running at full clocks for long periods of time.
So the clocks will be backed off if the game is demanding for periods of time.
It's not all about the speed of the GPU either.
The XSX GPU will be running at 1825MHz when you have stopped in game and are looking at the sky, and also at 1825MHz when you are in a 100-man battle royale with shit flying everywhere. The power draw will be less when you are looking at the sky compared to when you are in the heat of the battle.
So with the PS5, if you are playing a game like TLOU2, where it's quite slow-moving, the PS5 will no doubt be able to run full clocks. If however you are playing COD, where there is intense fighting and effects on screen, then you are more likely to need to reduce clock speeds. An easy way to think about it: the games and areas on your Pro that spin the jet engines up are the times when the variable clocks might come into play. What SmartShift will do is, if in these intense areas the GPU is at full tilt but the CPU is only at 90% utilization, then SmartShift will allow the GPU to stay at full frequency. If the CPU is at full utilization, then the GPU will get its clock reduced. No one really knows how far it can go down.
It also needs to be pointed out that if there is only a brief section of intense gameplay, it might not need to reduce clocks.
As no game is going to be running at 100% utilization of the APU at all times, then, as Cerny said, they will spend most of their time at full frequency.

You do not understand how SmartShift is supposed to work, so you make incorrect assumptions. You're assuming it's a situation similar to thermal throttling, where in particularly heavy workloads the CPU/GPU downclocks to keep temperatures below a certain level.
SmartShift moves power around to where it's needed, and does so at low single-digit millisecond intervals. For reference, within 16ms (the time it takes to generate a single frame at 60FPS), SmartShift can perform 8 distinct power allocations. The few milliseconds the CPU needs to calculate "stuff" for the next frame can be done at max clock speed; then, when the GPU kicks in to render the image, the power can be shifted again, if necessary, to run the GPU at max clocks.

And so on, and so on.

I find it interesting that the XSX, while having RDNA 2 and Zen 2 capabilities, does not use SmartShift. They went with a more traditional design, which has me wondering how they will cope with heat and whether thermal throttling will happen at points.
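As a toy illustration of the per-frame reallocation described above (eight roughly 2 ms slices per 60 FPS frame, with a fixed shared budget shifted toward whichever side is busier), here's a hypothetical sketch; the budget and demand numbers are invented for illustration only:

```python
# Hypothetical per-slice power split within one 16 ms frame: a fixed total
# budget is divided between CPU and GPU every 2 ms in proportion to demand.
# All numbers are invented; this is not AMD's actual allocation policy.

TOTAL_BUDGET = 100.0  # arbitrary shared power units

def allocate(cpu_demand, gpu_demand):
    """Split the fixed budget in proportion to instantaneous demand."""
    cpu = TOTAL_BUDGET * cpu_demand / (cpu_demand + gpu_demand)
    return cpu, TOTAL_BUDGET - cpu

# Eight 2 ms slices: CPU-heavy game logic early, GPU-heavy rendering later.
frame = [(80, 20), (70, 30), (40, 60), (20, 80),
         (15, 85), (15, 85), (20, 80), (30, 70)]

for i, (cpu_d, gpu_d) in enumerate(frame):
    cpu_w, gpu_w = allocate(cpu_d, gpu_d)
    print(f"slice {i}: CPU {cpu_w:5.1f}  GPU {gpu_w:5.1f}")
```

The key property the sketch shows is that the total never exceeds the budget; only its division moves around within the frame.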
 

azertydu91

Hard to Kill
This I kind of agree with, whether it was down to needing to put other stuff on the APU for IO/Sound or related to keeping the CU count in line with the PS4 Pro - probably a combination of both.



This isn't how I understand it. How long the game is demanding for is not a factor in clocks backing off - because for that time period the power draw is the same. So if that fits within the power profile then it can maintain for as long as needed. If you just think about heat then yes, but power no, and the clock speed is dependent on power draw (which is consistent) not heat (which can build up and be related to external factors).



This is not a good example. TLOU2 makes my PS4 fan spin up like a jet engine. The GPU usage is high. Just because CoD is a faster game it doesn't mean the GPU usage is higher.
Quick question about the jet engine sound... Does it make this sound because you don't take care of your PS4, or did I just get a lucky unit?
Because mine is absolutely silent regardless of what I'm playing. Could that be because it's a launch model that I took good care of?
Once a year I use a Q-tip to dust off the horizontal vents, and it sits on open furniture with airflow.
I know I'm apparently part of the minority here with a silent PS4, so that's why I'm wondering if it could just be a lucky batch.
 

DrDamn

Member
Quick question about the jet engine sound... Does it make this sound because you don't take care of your PS4, or did I just get a lucky unit?
Because mine is absolutely silent regardless of what I'm playing. Could that be because it's a launch model that I took good care of?
Once a year I use a Q-tip to dust off the horizontal vents, and it sits on open furniture with airflow.
I know I'm apparently part of the minority here with a silent PS4, so that's why I'm wondering if it could just be a lucky batch.

Little bit of column A, little bit of column B :messenger_grinning: . Mine is a launch model regular PS4. It's in an open TV stand, but I don't clean it religiously, just a few times. Last time I did a good clean there was something of an improvement, but it reverted back to jet engine fairly soon after - so probably build up. On the flip side it does let me know when it's working hard so I know which games are intensive :D
 
Yeah, the only reason the PS4 Pro and One X were released was the high demand for 4K gaming. I don't see the purpose of a mid-gen refresh for the PS5 or Series X; both consoles will be more than capable of 4K gaming, and I don't see 8K gaming on the horizon anytime soon.

Sony doesn't need to release a mid-gen refresh just to close a small deficit in compute performance; it doesn't make sense to me. We have true generational leaps with the next-gen consoles, including the CPU/GPU/SSD.

But this is just my 2 cents, we'll have to wait and see what happens.
No, the mid-gen refresh was to prevent people from moving to PC, and it was successful. I didn't upgrade my PC because the PS4 Pro provided me with higher res and more stable performance. For a mid-range PC guy like me, the PS4 Pro was good.

If a PS5 Pro provides 4K60 all the time, then I will buy it. I believe Sony and Microsoft see this. They are definitely already working on them, probably waiting for RDNA3 and new Zen CPUs for more efficient power consumption. 3D stacking seems to be an option for Sony to experiment with on a PS5 Pro before moving to the PS6.
 
You do not understand how SmartShift is supposed to work, so you make incorrect assumptions. You're assuming it's a situation similar to thermal throttling, where in particularly heavy workloads the CPU/GPU downclocks to keep temperatures below a certain level.
SmartShift moves power around to where it's needed, and does so at low single-digit millisecond intervals. For reference, within 16ms (the time it takes to generate a single frame at 60FPS), SmartShift can perform 8 distinct power allocations. The few milliseconds the CPU needs to calculate "stuff" for the next frame can be done at max clock speed; then, when the GPU kicks in to render the image, the power can be shifted again, if necessary, to run the GPU at max clocks.

And so on, and so on.

I find it interesting that the XSX, while having RDNA 2 and Zen 2 capabilities, does not use SmartShift. They went with a more traditional design, which has me wondering how they will cope with heat and whether thermal throttling will happen at points.
The XSX has significantly more CUs at a lower clock rate than the PS5. That seems like a better design for sustained performance while also producing less heat. It explains how the XSX is almost 2 TF more powerful, yet smaller and very quiet.
 

azertydu91

Hard to Kill
Little bit of column A, little bit of column B :messenger_grinning: . Mine is a launch model regular PS4. It's in an open TV stand, but I don't clean it religiously, just a few times. Last time I did a good clean there was something of an improvement, but it reverted back to jet engine fairly soon after - so probably build up. On the flip side it does let me know when it's working hard so I know which games are intensive :D
Nice. I remember mine being loud only once, when there was a bug in the Rocket League menu; once that was fixed, no problem anymore, not in TLOU2 nor GOW nor any other game.

And I will sell it just before getting a PS5. It's been a flawless console for me (even the controller I banged hard against a wall playing VR still works great).

It's the complete opposite of the Switch experience I had, which led me to sell it and maybe wait for a refresh, if Nintendo ever fixes their Joy-Cons.
 

Xplainin

Banned
You do not understand how smart shift is supposed to work, so you make incorrect assumptions. You're assuming it's a similar situation as thermal throttling, that in particularly heavy workloads the cpu/gpu downclocks to not raise temperatures above a certain level.
Smartshift moves power around to where it's needed and does so at low single digit milisecond intervals. For reference in 16ms (the time it takes to generate a single frame at 60FPS), smartshift can perform 8 distinct power allocations. The few miliseconds the cpu needs to calculate "stuff" for the next frame, can be done at max clock speed, then when the gpu kicks in to render the image, the shift of power can be done again, if necessary, to run the gpu at max clocks.

And so on, and so on.

I find it interesting that XSX while having rdna 2 and zen 2 capabilities, does not use smartshift. They went with a more traditional design, which has me wonder how they will cope with heat and if thermal throttling will happen at points.
It directs power between the CPU and GPU. How it works is in the PS5 reveal; it is what it is.
The XSX doesn't need SmartShift. Sony have only used it because they have thermal issues running their APU at peak loads.
As MS have said, the XSX will run at full speeds all day, every day. It won't throttle.
 
It directs power between the CPU and GPU. How it works is in the PS5 reveal; it is what it is.
The XSX doesn't need SmartShift. Sony have only used it because they have thermal issues running their APU at peak loads.
As MS have said, the XSX will run at full speeds all day, every day. It won't throttle.
My understanding is that it can't run both the GPU and CPU at full speed at the same time.
 

sircaw

Banned
It directs power between the CPU and GPU. How it works is in the PS5 reveal; it is what it is.
The XSX doesn't need SmartShift. Sony have only used it because they have thermal issues running their APU at peak loads.
As MS have said, the XSX will run at full speeds all day, every day. It won't throttle.

This does not sound right...

Taken from Amd website in regards to their laptops

"AMD SmartShift technology dynamically shifts power in your laptop to help boost performance for gaming, video editing, 3D rendering, content creation and productivity.

Get up to 14% EXTRA Performance with SmartShift Enabled¹

A new interface within AMD Radeon Software Adrenalin 2020 Edition makes it easy to see how power is being shifted to the CPU and GPU.

Unlike other implementations, AMD SmartShift can boost both components during the same workload."


It sounds to me like it would improve Microsoft's design if they had it.

This might be about AMD's laptops, but what's good for the goose is good for the gander, surely.

But I am a simple, non-technical person; perhaps people with more know-how can chip in.
 

ToadMan

Member
It directs power between the CPU and GPU. How it works is in the PS5 reveal; it is what it is.
The XSX doesn't need SmartShift. Sony have only used it because they have thermal issues running their APU at peak loads.
As MS have said, the XSX will run at full speeds all day, every day. It won't throttle.

MS have identical thermal issues - they've implemented a different solution than Sony to address those issues.
 

Xplainin

Banned
My understanding is that it can't run both the GPU and CPU at full speed at the same time.
It can if it's not a demanding scene.
The heat is generated by the load on the APU. If there isn't much load, say it's a 2D game like Ori, the PS5 would be able to run both clocks at full speed, no problem.
If the scene is taxing (lots of polys, lots of AI, alpha effects, etc.), then more heat is generated.
So in a COD game where all hell is breaking loose, that is where the CPU and GPU won't be at full speed, which is ironic, as that's the exact scenario where you need the extra compute power.
That's how it has been explained to me.
If I'm wrong, I guess we will find out down the track.
 

Xplainin

Banned
MS have identical thermal issues - they've implemented a different solution than Sony to address those issues.
It's only an issue if you can't cool it. Sure, if MS reduced the size of the fan and heat sink and reduced the number of air vents, then yeah, it would be a problem.
Sony's APU will most likely generate more heat than the XSX's, so Sony has a bigger thermal issue to address. They needed to add variable clocks to help fix theirs.
So, not identical issues.
 

DrDamn

Member
It directs power between the CPU and GPU. How it works is in the PS5 reveal; it is what it is.
The XSX doesn't need SmartShift. Sony have only used it because they have thermal issues running their APU at peak loads.
As MS have said, the XSX will run at full speeds all day, every day. It won't throttle.

If MS used SmartShift it could run at faster speeds than it currently does. That's how it works. Why would you not want it?
 
It can if it's not a demanding scene.
The heat is generated by the load on the APU. If there isn't much load, say it's a 2D game like Ori, the PS5 would be able to run both clocks at full speed, no problem.
If the scene is taxing (lots of polys, lots of AI, alpha effects, etc.), then more heat is generated.
So in a COD game where all hell is breaking loose, that is where the CPU and GPU won't be at full speed, which is ironic, as that's the exact scenario where you need the extra compute power.
That's how it has been explained to me.
If I'm wrong, I guess we will find out down the track.

Ouch. So indie games won't be taxing enough, but AAA games will be too demanding most of the time for the PS5 to run both GPU and CPU at full speed?

I remember reading Brad Sams talking about this exact thing, and he said most devs will choose to push the GPU at full speed and that the CPU side would suffer
 

Thirty7ven

Banned
It can if it's not a demanding scene.
The heat is generated from the load on the APU. If there isn't much load, say it's a 2D game like Ori, the PS5 would be able to run both clocks at full speed, no problem.
If the scene is taxing, lots of polys, lots of AI, alpha effects etc., then more heat is generated.
So in a COD game where all hell is breaking loose, that is where the CPU and GPU won't be at full speed, which is ironic, as that's the exact scenario where you need the extra compute power.
That's how I have had it explained to me.
If I'm wrong, I guess we will find out down the track.


Ouch. So indie games won't be taxing enough, but AAA games will be too demanding most of the time for the PS5 to run both GPU and CPU at full speed?

I remember reading Brad Sams talking about this exact thing, and he said most devs will choose to push the GPU at full speed and that the CPU side would suffer


You guys are hilarious. It's page 2393 and you're still snowballing this shit.
 
If MS used SmartShift it could run at faster speeds than it currently does. That's how it works. Why would you not want it?
I assume it's better to run both the GPU and CPU at full capacity with predictability. MS states it would create imbalance within the console and is more difficult for devs. MS does have wiggle room to up the clocks, though, according to DF and others.
 

DrDamn

Member
I assume it's better to run both the GPU and CPU at full capacity with predictability. MS states it would create imbalance within the console and is more difficult for devs.

It's not full capacity if it can go faster though, is it? Shifting power from the CPU to the GPU, or vice versa, when the other doesn't need it is the very definition of balance.

Consider these situations where you are hitting the limits and the overload is slight:

A) CPU overloaded, GPU not overloaded
B) CPU not overloaded, GPU overloaded
C) CPU and GPU overloaded

Each would cause frame-rate dips.

For a Smartshift APU
For A) and B) it would shift power to boost the overloaded CPU/GPU and automatically manage the problem.
For C) you can optimise either the CPU or GPU load to manage the problem.

For a fixed clock APU
For A) you need to optimise for CPU
For B) you need to optimise for GPU
For C) you need to optimise for CPU and GPU

Don't think of it as PS5 vs XSX. It's not about PS5 vs XSX; it's whether the tech would improve a machine that doesn't use it.
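The fixed-caps vs SmartShift contrast in the three scenarios above can be sketched roughly like this. The watt figures and the proportional split for case C are my own assumptions for illustration, not AMD's actual controller:

```python
# Sketch of a shared power budget vs per-component caps
# (hypothetical numbers; not AMD's real SmartShift controller).

FIXED_CPU_CAP_W = 80.0
FIXED_GPU_CAP_W = 120.0
TOTAL_BUDGET_W = FIXED_CPU_CAP_W + FIXED_GPU_CAP_W

def fixed_caps(cpu_w, gpu_w):
    # Fixed-clock style: each side is clipped to its own cap,
    # even if the other side has power to spare.
    return min(cpu_w, FIXED_CPU_CAP_W), min(gpu_w, FIXED_GPU_CAP_W)

def smartshift(cpu_w, gpu_w):
    # Pooled budget: unused headroom on one side flows to the other.
    if cpu_w + gpu_w <= TOTAL_BUDGET_W:
        return cpu_w, gpu_w               # cases A and B resolve themselves
    scale = TOTAL_BUDGET_W / (cpu_w + gpu_w)
    return cpu_w * scale, gpu_w * scale   # case C: both must be trimmed

print(fixed_caps(40.0, 150.0))  # GPU clipped to 120 W despite 40 W spare on CPU
print(smartshift(40.0, 150.0))  # GPU gets the full 150 W it asked for
```

In cases A and B the pooled budget absorbs the overload automatically; only in case C, where both sides demand more than the total, does the developer have to optimise, just as with fixed clocks.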
 

ToadMan

Member
Apparently it depends on the load. So it depends!

You said you understood that the PS5 could not run at max cpu/gpu clock simultaneously.

So that was misunderstanding on your part - glad you cleared that up. Let's not hear it again eh?

"Load" is a different matter and is entirely deterministic from a developer's perspective - so it doesn't "depend" it's entirely measurable and within software control.
 
Ouch. So indie games won't be taxing enough, but AAA games will be too demanding most of the time for the PS5 to run both GPU and CPU at full speed?

I remember reading Brad Sams talking about this exact thing, and he said most devs will choose to push the GPU at full speed and that the CPU side would suffer
A few questions and one answer...
Who is this Sams the wise?
Does he have the game dev or tech chops to really tell us what devs will choose or how SmartShift really works?
I think not.
 
All I know is the XSX has almost a 2 TF power advantage at sustained performance, which is superior to variable clocks in terms of performance and ease of development.

Other than that, I have some armchair devs spinning things in favor of the weaker PS5. I'll leave it at that.
 

sinnergy

Member
This does not sound right..

Taken from Amd website in regards to their laptops

"AMD SmartShift technology dynamically shifts power in your laptop to help boost performance for gaming, video editing, 3D rendering, content creation and productivity.

Get up to 14% EXTRA Performance with SmartShift Enabled¹

A new interface within AMD Radeon Software Adrenalin 2020 Edition makes it easy to see how power is being shifted to the CPU and GPU.

Unlike other implementations, AMD SmartShift can boost both components during the same workload."


It sounds to me it would improve Microsofts design if they have it.

This might be about their laptops at AMD, but what's good for the goose is good for the gander, surely.

But I am a simple, non-technical person; perhaps people with more know-how can chip in.
If you add those 14% on top of 9 TF, you get 10.26 TF... coincidence??
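That back-of-envelope figure does check out; this is just the multiplication in the post above, not a claim about real hardware clocks:

```python
# Back-of-envelope check of the 14% SmartShift uplift quoted above.
base_tf = 9.0                 # hypothetical baseline teraflops
boosted_tf = base_tf * 1.14   # +14% with SmartShift enabled, per AMD's claim
print(round(boosted_tf, 2))   # 10.26
```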
 

ToadMan

Member
It's only an issue if you can't cool it. Sure, if MS reduced the size of the fan and heat sink and reduced the number of air vents, then yeah, it would be a problem.
Sony's APU will most likely generate more heat than the XSX's, and so Sony has a bigger thermal issue to address. They needed to add variable clocks to help fix theirs.
So not identical issues.

No... Sony and MS both have to determine power usage and cooling vs performance for their systems and they have taken different approaches to achieving that.

Sony aren't using variable clocks to solve a thermal "issue" - they are using power allocation (new tech) and variable clocks (old tech) to achieve greater performance from their hardware with attendant cooling hardware.

By suggesting Sony have a thermal issue and have used variable clocks to solve it, you imply the Sony engineers are incompetent - 4 generations of gaming consoles and multitudes of successful consumer electronic devices stand against your claim.

Variable clocks have been around in the PC domain for 15 years or more now - it's not some dark unknown magic.

Power allocation is the new technology only just being rolled out by AMD and Nvidia - and that's what Sony have chosen to exploit to gain more performance. MS didn't use this new tech and so are leaving (<10% according to Nvidia) performance "on the table".
 