
Next-Gen PS5 & XSX |OT| Console tEch threaD


onQ123

Member
No. Just no.



Cool, but 12.1 tflops of RDNA 2 > "10.2" tflops of RDNA 2.

I find it funny that some users are trying to act as if the XSX is GCN and PS5 is RDNA2.


10.2TF from a 36CU GPU clocked at 2.23GHz wouldn't get you the same result as 10.2TF from a 44CU GPU clocked at 1.825GHz


Is it that hard for you to understand that they would end up producing different results in games?
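For reference, a quick back-of-the-envelope sketch in Python of where those headline numbers come from, assuming the usual RDNA formula of CUs x 64 shaders x 2 FLOPs per clock (the point being that identical TFLOPS can come from very different CU/clock mixes):

# Rough TFLOPS estimate: CUs * 64 shaders per CU * 2 FLOPs per shader per clock.
def tflops(cus, clock_ghz):
    return cus * 64 * 2 * clock_ghz / 1000

print(tflops(36, 2.23))   # ~10.28 TF, narrow and fast (PS5-style)
print(tflops(44, 1.825))  # ~10.28 TF, wider and slower (hypothetical config)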
 

Panajev2001a

GAF's Pleasant Genius
Stop!
Both the N64 and GCN launched at $199
They were both more powerful and cheaper than the PS1 and PS2.
The problem was Sony had already sold more than 10-20 million before those consoles even released.
And Sony only matched their price
I thought PS1 was already $199 by the time N64 launched, extra tricky for Nintendo...
 

Panajev2001a

GAF's Pleasant Genius
By your complete failure of logic, the Xbox One was better than the PS4 because it had fewer CUs with higher clocks.

There were some things it could do that closed the gap a bit, yes... one of them was being able to run fully virtualised, and we are seeing the benefits there in terms of BC (see Xbox One X too).

Faster frequency, when not bottlenecked (how you achieve high frequency can hurt actual efficiency, see the Pentium 4 for example), makes your job a bit easier when compared to equivalently wider yet slower HW. Not sure why this is being debated as if it were a surprising development.

The other Xbox One problem was that it had fewer resources even for the shared components such as ROPs and the async compute engines (fewer ACEs and fewer queues per ACE, which further lowered efficiency). Its non-ESRAM bandwidth was also a lot lower percentage-wise than the PS4's, a bigger gap than you get even comparing the XSX's fast memory pool to the PS5's regular memory.
 
Last edited:

BluRayHiDef

Banned
By your complete failure of logic, the Xbox One was better than the PS4 because it had fewer CUs with higher clocks.

It's illogical to compare the difference in power between the base models of the current generation's consoles to the difference in power between the next generation's consoles for one specific reason: only the base model of one of the current generation's consoles was powerful enough to perform at its target resolution, 1080p, and that console was the PlayStation 4. However, next generation, both consoles will be powerful enough to perform at their target resolution, 4K.

Hence, the difference in power will be irrelevant relative to other factors, such as the quality and quantity of first-party titles and the ability to provide revolutionary gaming experiences due to being designed in a way that encourages new game-design philosophies. And it goes without saying which console will do this.
 
I'm sticking to my belief that the PS5 will mostly be a 9.2 TF console with the ability to occasionally go higher if devs want to momentarily push up to 10.2 TF for the Michael Bay scenes in games, while avoiding overheating at those whopping 2.23 GHz clocks. The variable clocks are there for a reason. PS5 games are simply not going to be running at such a high clock rate of 2.23 GHz 90% of the time without serious potential heating/noise issues.

The sad thing is, nobody will probably ever know. It's a smart PR decision when your console is more like a 9.2 TF console that can occasionally be pushed into boost mode, hitting the ceiling of 10.2 TF in the highest-stress areas of a game in rare instances. It allows Sony to claim it's 10.2 TF to lessen the blow vs 12.1 TF of sustained performance throughout.

The variable clock thing is very wishy-washy IMO.
 

ZywyPL

Banned
10.2TF from a 36CU GPU clocked at 2.23GHz wouldn't get you the same result as 10.2TF from a 44CU GPU clocked at 1.825GHz


Is it that hard for you to understand that they would end up producing different results in games?

Given the same end computing power (TFLOPS), sure, that might be the case: a 10.3 TF PS5 with 36 CUs @ 2.23 GHz would most likely perform slightly better than a 10.3 TF PS5 with 52 CUs @ 1500 MHz. But as the XB1 (vs the PS4), with its lower CU count and higher clock, already strongly proved over and over again, at the end of the day the higher the TFLOPS, the better. And no one knows what the actual average/sustainable clocks for the PS5 are; even Cerny said they "expect" them to be close to the maximum ones most of the time, but expectations and reality often don't line up. Time will tell.
 

SatansReverence

Hipster Princess
Panajev2001a


DForce

Apparently you guys need to be clued in on how the flop performance difference between the XB1 and the PS4 almost directly translated into final game performance (900p < 1080p), regardless of all the secret sauce dGPU bullshit of the XB1. This same difference will manifest in XSX vs PS5.

It's either that, or the XB1 was all along the better console because it was narrower but faster.

It's illogical to compare the difference in power between the base models of the current generation's consoles to the difference in power between the next generation's consoles for one specific reason: only the base model of one of the current generation's consoles was powerful enough to perform at its target resolution, 1080p, and that console was the PlayStation 4. However, next generation, both consoles will be powerful enough to perform at their target resolution, 4K.

Hence, the difference in power will be irrelevant relative to other factors, such as the quality and quantity of first-party titles and the ability to provide revolutionary gaming experiences due to being designed in a way that encourages new game-design philosophies. And it goes without saying which console will do this.

You heard it here first, GAF: the next-gen consoles will just be today's graphics at native 4K and that's it.
 
Last edited:

B_Boss

Member
I'm wondering why the hell Sony hasn't shown off the console still... wtf are they waiting on?

What if they're still having thermal issues and just advertised smartshift boost mode *in hopes* that they could get cooling sorted out but end up having to step it back and run the machine at slower clocks?

I'd hope they are working on a form factor similar to XsX to allow for higher clocks as advertised.

Unless it was (what some would consider) a "PR stunt" or corporate "CYA", Cerny sounded fairly confident that the cooling in the PS5 was efficient.
 

nosseman

Member
I'm sticking to my belief that the PS5 will mostly be a 9.2 TF console with the ability to occasionally go higher if devs want to momentarily push up to 10.2 TF for the Michael Bay scenes in games, while avoiding overheating at those whopping 2.23 GHz clocks. The variable clocks are there for a reason. PS5 games are simply not going to be running at such a high clock rate of 2.23 GHz 90% of the time without serious potential heating/noise issues.

The sad thing is, nobody will probably ever know. It's a smart PR decision when your console is more like a 9.2 TF console that can occasionally be pushed into boost mode, hitting the ceiling of 10.2 TF in the highest-stress areas of a game in rare instances. It allows Sony to claim it's 10.2 TF to lessen the blow vs 12.1 TF of sustained performance throughout.

The variable clock thing is very wishy-washy IMO.

The only thing that would show this is the games, and I don't think there is going to be such a big difference. DF will find it in multiplatform games.

The thing is - what is better? Visuals or framerate?

We had a couple of situations like this when the Xbox One and PS4 were released. A game ran at 900p on Xbox One at almost 60 fps. On PS4 the game ran at 1080p but with a lower FPS.

I wonder if MS could "unlock" the GPU and allow it to boost, and also claim a peak TFLOPS figure higher than the guaranteed 12.155 TF.
 

Imtjnotu

Member
I'm sticking to my belief that the PS5 will mostly be a 9.2 TF console with the ability to occasionally go higher if devs want to momentarily push up to 10.2 TF for the Michael Bay scenes in games, while avoiding overheating at those whopping 2.23 GHz clocks. The variable clocks are there for a reason. PS5 games are simply not going to be running at such a high clock rate of 2.23 GHz 90% of the time without serious potential heating/noise issues.

The sad thing is, nobody will probably ever know. It's a smart PR decision when your console is more like a 9.2 TF console that can occasionally be pushed into boost mode, hitting the ceiling of 10.2 TF in the highest-stress areas of a game in rare instances. It allows Sony to claim it's 10.2 TF to lessen the blow vs 12.1 TF of sustained performance throughout.
The variable clock thing is very wishy-washy IMO.
 

KWAB

Banned
I'm not a big fan of how the PS5 was designed. A 10.3 teraflops GPU would be fine, but there are doubts about the sustainability of the clocks. The SSD looks great, but was it really necessary, or did they go overkill for no reason?
Sony has some explaining to do. Show us why we really need a 5.5GB/s SSD and how the Xbox's 2.4GB/s SSD is not enough. Same thing for the clocks. Is the downclocking really minor like Cerny said? Well, you'd better prove it. Maybe call Digital Foundry, put them in front of a devkit, and let them benchmark the CPU and GPU clock speeds using games from a variety of genres. I think ten would be enough.
I feel like the way Sony revealed their specs did more harm than good. It's okay, it's not too late to fix it, but they had better have some convincing answers.
 

Way to refute my post with an intelligent, informed reply. Do you understand the difference between sustained and variable clocks? Do you understand that very high clocks (2.23 GHz) running 90% of the time can lead to heating/cooling issues? It's why MS went with far more CUs at a much lower clock. It's the better design, as it gives you more power at sustained performance with fewer heating/noise issues.
 
Last edited:

Shio

Member
Ummm, if it has to be designed around the PS4 then yes, that means the PS5 hardware can't be taken full advantage of. The PS4's CPU alone hinders what the developers can do in that game. Just because the PS5 has much better hardware doesn't mean they can make it a generational leap compared to the PS4. The only way they can do that is if the game was specifically made for the PS5, and if it was, the game would likely not be able to run on the PS4; so basically the PS5 is just gonna brute-force that game for higher resolutions and frame rates.
Sooooo does that mean xbox series x will suffer from the same limitation?
 

Imtjnotu

Member
Way to refute my post with an intelligent, informed reply. Do you understand the difference between sustained and variable clocks? Do you understand that very high clocks (2.23 GHz) running 90% of the time can lead to heating/cooling issues? It's why MS went with far more CUs at a much lower clock. It's the better design, as it gives you more power at sustained performance with fewer heating/noise issues.
Pretty sure Cerny said the engineers came up with a good cooling solution for it. Just wait and see, how about that.
 

DForce

NaughtyDog Defense Force
Apparently you guys need to be clued in on how the flop performance difference between the XB1 and the PS4 almost directly translated into final game performance (900p < 1080p), regardless of all the secret sauce dGPU bullshit of the XB1. This same difference will manifest in XSX vs PS5.

I don't know why you try so hard to deny clear evidence.



Microsoft claims it weighed the benefits of running 12 CUs (768 cores) at 853MHz vs. 14 CUs (896 cores) at 800MHz and decided on the former. Given that the Xbox One APU only features 16 ROPs and ROP performance scales with clock speed, Microsoft likely made the right decision. Thermal and yield limits likely kept Microsoft from doing both - enabling all CUs and running them at a higher frequency. Chances are that over time Microsoft will phase out the extra CUs, although it may take a while to get there. I'm not sure if we'll see either company move to 20nm, they may wait until 14/16nm in order to realize real area/cost savings which would mean at least another year of shipping 14/20 CU parts at 28nm.

You can find these facts anywhere.

Difference between PS4 and Xbox One's TF is about 28%
Difference between PS5 and XsX TF is about 15%

High clocks will bring better results, but the PS4 just had a much better GPU. There's also far less of a CU difference in comparison to the last-gen consoles.
 
It does confirm it. They're speaking in general terms about how GPUs work. You can find this anywhere.

You guys are trying too hard to deny clear facts.
I'm referring to sustained clocks vs variable clocks and the frequency stuff.
DF has already confirmed the XSX has a far better GPU that is heavily customized and packed with a big feature set.
 
Pretty sure Cerny said the engineers came up with a good cooling solution for it. Just wait and see, how about that.
I would hope so. That still doesn't explain the wishy-washy variable clocks situation, where the PS5 will supposedly run at a 2.23 GHz clock speed 90% of the time on both the GPU and CPU side.
 
Last edited:

BluRayHiDef

Banned
Panajev2001a


DForce

Apparently you guys need to be clued in on how the flop performance difference between the XB1 and the PS4 almost directly translated into final game performance (900p < 1080p), regardless of all the secret sauce dGPU bullshit of the XB1. This same difference will manifest in XSX vs PS5.

It's either that, or the XB1 was all along the better console because it was narrower but faster.



You heard it here first, GAF: the next-gen consoles will just be today's graphics at native 4K and that's it.
LOL, I never said that the next generation of consoles will simply process today's quality of graphics at native 4K and that's that. I was implying that they'll be able to perform adequately at native 4K with very good frame rates and next-gen quality graphics.
 

DForce

NaughtyDog Defense Force
I'm referring to sustained clocks vs variable clocks and the frequency stuff.
DF has already confirmed the XSX has a far better GPU that is heavily customized and packed with a big feature set.

You didn't even read the post because your comment had nothing to do with the conversation.
 
By changing the voltage. You can run the GPU at that speed without the excess heat and power draw.
If it were that simple, why doesn't MS do it? Why hasn't big tech been doing that for years? Why worry about heat at all? Just speed ahead to 5.23 GHz clocks and just "CHANGE THE VOLTAGE".
 
Last edited:

-kb-

Member
I would hope so. That still doesn't explain the wishy-washy variable clocks situation, where the PS5 will supposedly run at a 2.23 GHz clock speed 90% of the time on both the GPU and CPU side.

The entire solution was designed to push as much performance as possible out of a given power envelope, and to not have to deal with some possible future game that generates so much excess heat that it causes the console to go nuts. It's okay if you don't completely understand it; it's not the easiest thing to understand and it's a shift in the paradigm for the console industry. May I suggest you watch the Road to PS5 video to understand it better.
 
Last edited:

SatansReverence

Hipster Princess
I don't know why you try so hard to deny clear evidence.
Your "evidence" says the XB1 was superior.

Difference between PS4 and Xbox One's TF is about 28%

Nice math, it's actually closer to a 40% difference.

Difference between PS5 and XsX TF is about 15%

This 18% difference gets smaller by the day.

High clocks will bring better results, but PS4 just had a much better GPU. There's also far less CU's in comparison to last gen consoles.

And the XSX just has a much better GPU than the PS5.

Again, the flop delta between the XB1 and PS4 almost directly translated to final game performance.

I was implying that they'll be able to perform adequately at native 4K with very good frame rates with next-gen quality graphics.

I'll simplify this for you.

If both consoles are pushing the exact same graphics, the XSX will be doing so at a near 20% higher resolution. If both consoles are pushing the same resolution, the XSX will be capable of doing so with more graphical bells and whistles.
 
The PS5 is not going to sustain 10 TF; that's just a marketing number, which is theoretical anyway and has little to do with the real world. The power difference is still small.

All I care about is the noise. Sony should be called out if they can't make a quiet console. This is how many generations of noiseboxes now? Especially since MS clearly has the cooling tech sorted out.

It won't sustain 10.3 all the time, but Cerny said it would spend most of the time at or near that number. I highly doubt any downclocks would be higher than say 3% for the GPU, so overall we are still dealing with a 10 Tflop machine.
 
Last edited:

SatansReverence

Hipster Princess
it's not the easiest thing to understand and it's a shift in the paradigm for the console industry.

No, it's a stopgap measure so Sony didn't have to market a single-digit console vs a double-digit console.

There is nothing even remotely extraordinary or revolutionary about variable clocks.

The PS5 seems to be the first console designed with inadequate cooling/power delivery, incapable of running its components at their nominal clock speeds. That'll happen when you get blindsided by a significantly more powerful competitor too late in your development cycle.
 
Way to refute my post with an intelligent, informed reply. Do you understand the difference between sustained and variable clocks? Do you understand that very high clocks (2.23 GHz) running 90% of the time can lead to heating/cooling issues? It's why MS went with far more CUs at a much lower clock. It's the better design, as it gives you more power at sustained performance with fewer heating/noise issues.
Answering you with one of my posts.
The truth is that we don't know. The variability is based on the power budget (watts consumed), so we don't know how much power any type of game, and every scene in it, will require. If the power consumption goes too high, the developers will set a lower clock for the GPU in advance, but again: how this will be handled depends game by game, scene by scene. For all we know the clocks could stay fixed all the time, because a game was designed not to overload the power budget, regardless of the temperatures (which are taken care of by the cooling).
Also, we're not taking SmartShift into account: if the CPU has power to spare, that power will be transferred to the GPU if needed, meaning the GPU could remain at max clocks even when it would otherwise surpass the power consumption limit, because the CPU will lower its clocks; the total power consumed will remain the same but the GPU will get a boost.
We can't predict how SmartShift will work out, because we don't know how often Zen 2 will be at a full 100% and thus unable to lower its clocks to raise those of the GPU while still keeping the power balanced.
Also, mind you, this means that the heat directly generated by the components is fixed each time and testable by the devs, because setting the power consumption effectively means setting the internal temperatures. The only thing the cooling needs to take care of is external temperature, so the console will not get crazy loud simply because of the workload this time.
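A purely conceptual sketch of the idea described above (not Sony's actual algorithm; the wattage figures and the linear clock scaling are made-up simplifications): clocks are derived from a fixed power budget estimated from the workload, not from measured temperature, and SmartShift-style logic can hand unused CPU budget to the GPU.

# Conceptual sketch only; numbers and scaling are illustrative, not real PS5 values.
CPU_BUDGET_W = 60      # hypothetical CPU share of the SoC power budget
GPU_BUDGET_W = 140     # hypothetical GPU share
GPU_MAX_GHZ = 2.23

def gpu_clock(cpu_demand_w, gpu_demand_w):
    """Pick a GPU clock from estimated workload power demand, deterministically."""
    spare = max(0, CPU_BUDGET_W - cpu_demand_w)   # CPU not using its full budget
    budget = GPU_BUDGET_W + spare                 # SmartShift-style: hand the spare to the GPU
    if gpu_demand_w <= budget:
        return GPU_MAX_GHZ                        # stay at max clock
    # Otherwise trim the clock just enough to fit the budget. In reality the
    # power/frequency curve is non-linear, so a small clock drop saves a lot of power.
    return GPU_MAX_GHZ * budget / gpu_demand_w

print(gpu_clock(40, 130))  # light CPU load: GPU holds 2.23 GHz
print(gpu_clock(60, 150))  # worst-case scene: GPU drops to ~2.08 GHz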
 
No, it's a stopgap measure so Sony didn't have to market a single-digit console vs a double-digit console.

There is nothing even remotely extraordinary or revolutionary about variable clocks.

The PS5 seems to be the first console designed with inadequate cooling/power delivery, incapable of running its components at their nominal clock speeds. That'll happen when you get blindsided by a significantly more powerful competitor too late in your development cycle.

You honestly think the PS5's high GPU clocks were a reaction to the Series X specs? Yeah, I doubt that. Sony has had a target for a while; I'm pretty sure double-digit TFLOPS is what they were always aiming for, but they had to wait until they got RDNA 2 silicon to push the frequencies.
 

-kb-

Member
No, it's a stopgap measure so Sony didn't have to market a single-digit console vs a double-digit console.

There is nothing even remotely extraordinary or revolutionary about variable clocks.

The PS5 seems to be the first console designed with inadequate cooling/power delivery, incapable of running its components at their nominal clock speeds. That'll happen when you get blindsided by a significantly more powerful competitor too late in your development cycle.

A stopgap measure that was literally baked in from the start and has been designed for years? I personally don't think you quite understand it either. It'll spend the majority of its time at the clocks mentioned, and when the worst-case game comes along, instead of turning into a turbine or failing, it'll simply clock down.
 
Answering you with one of my posts.

Also, mind you, this means that the heat directly generated by the components is fixed each time and testable by the devs, because setting the power consumption effectively means setting the internal temperatures. The only thing the cooling needs to take care of is external temperature, so the console will not get crazy loud simply because of the workload this time.

Yes, people keep forgetting that the PS5 won't downclock based on the actual thermals of the silicon, but based on the workload; that way every PS5 gives a consistent result regardless of your environment.
 
No, it's a stopgap measure so Sony didn't have to market a single-digit console vs a double-digit console.

There is nothing even remotely extraordinary or revolutionary about variable clocks.

The PS5 seems to be the first console designed with inadequate cooling/power delivery, incapable of running its components at their nominal clock speeds. That'll happen when you get blindsided by a significantly more powerful competitor too late in your development cycle.
It basically allows consoles to run at generally stable and crazy-high clocks; the Series X could get 25% higher clocks and use even more power with this, in theory. Of course we are not certain of anything, but it's a system designed to squeeze out the maximum power possible, and any hardware would benefit.
 

FranXico

Member
Apparently you guys need to be clued in on how the flop performance difference between the XB1 and the PS4 almost directly translated into final game performance (900p < 1080p), regardless of all the secret sauce dGPU bullshit of the XB1.
That wasn't due to TF, but rather due to the massive memory bandwidth difference. It was 8GB DDR3 (+ ESRAM) vs 8GB GDDR5. The biggest advantage the PS4 had, by far, was the bandwidth, not the GPU. And even then, multiplats eventually started performing pretty close midway through this generation.
 

SatansReverence

Hipster Princess
You honestly think the PS5's high GPU clocks were a reaction to the Series X specs? Yeah, I doubt that. Sony has had a target for a while; I'm pretty sure double-digit TFLOPS is what they were always aiming for, but they had to wait until they got RDNA 2 silicon to push the frequencies.

1. Leaks

2. Variable clocks

It really isn't hard to understand. If the console was always intended to run at 2.2 GHz, they would have had the power and cooling to do so without ever needing to downclock.

Sony has clearly stated that their console is NOT capable of doing so. Whether the problem lies in the power delivery or the cooling capacity, the console simply was not originally intended to run at such frequencies. When Sony realised that the XSX was going to have a much more powerful GPU (obviously this would have been months before the reveal, so don't even bother shitting it up with "reacting in less than a week" lines), variable clocks were what they decided was their best option to reduce the power delta between the consoles.

This is exactly the same situation Microsoft went through when they overclocked the XB1; however, Microsoft didn't bother with variable clocks.
 
1. Leaks

2. Variable clocks

It really isn't hard to understand. If the console was always intended to run at 2.2 GHz, they would have had the power and cooling to do so without ever needing to downclock.

Sony has clearly stated that their console is NOT capable of doing so. Whether the problem lies in the power delivery or the cooling capacity, the console simply was not originally intended to run at such frequencies. When Sony realised that the XSX was going to have a much more powerful GPU (obviously this would have been months before the reveal, so don't even bother shitting it up with "reacting in less than a week" lines), variable clocks were what they decided was their best option to reduce the power delta between the consoles.

This is exactly the same situation Microsoft went through when they overclocked the XB1; however, Microsoft didn't bother with variable clocks.

Yeah, and those leaks were probably of a pre-RDNA 2 silicon test bench setup.


Well, we know they are using a more robust cooling system than PS4/PS4 Pro.
 
Last edited:

mitchman

Gold Member
Way to refute my post with an intelligent, informed reply. Do you understand the difference between sustained and variable clocks? Do you understand that very high clocks (2.23 GHz) running 90% of the time can lead to heating/cooling issues? It's why MS went with far more CUs at a much lower clock. It's the better design, as it gives you more power at sustained performance with fewer heating/noise issues.
With RDNA1, that would have been the case. RDNA2 delivers roughly 50% better performance per watt than RDNA1, meaning you can increase clocks significantly without using more power or generating more heat than RDNA1. The expectation is that RDNA2-based graphics cards from AMD will run at up to 2.5 GHz sustained boost clocks.
 
Last edited:

SatansReverence

Hipster Princess
That wasn't due to TF, but rather due to the massive memory bandwidth difference. It was 8GB DDR3 (+ ESRAM) vs 8GB GDDR5. The biggest advantage the PS4 had, by far, was the bandwidth, not the GPU. And even then, multiplats eventually started performing pretty close midway through this generation.

Oh, it's all about memory bandwidth now? Oof, the PS5 is in for a VERY rough time then. The XB1 must have had a dGPU hidden in the power brick.

Yeah, and those leaks were probably of a pre-RDNA 2 silicon test bench setup.
Leaks that were practically spot on about all but clocks :pie_thinking:

Well, we know they are using a more robust cooling system than PS4/PS4 Pro.

We also know said cooling system is incapable of cooling the PS5 enough to run at its nominal clock speeds (or the PSU hasn't got enough headroom to do so).

With RDNA1, that would have been the case. RDNA2 delivers roughly 50% better performance per watt than RDNA1, meaning you can increase clocks significantly without using more power or generating more heat than RDNA1.

Does it need to be explained why this is entirely irrelevant?

A stopgap measure that was literally baked in from the start and has been designed for years? I personally don't think you quite understand it either. It'll spend the majority of its time at the clocks mentioned, and when the worst-case game comes along, instead of turning into a turbine or failing, it'll simply clock down.

A stopgap that was never (AFAIK) needed on any previous console and is simply a basic part of AMD's APUs, already seen in laptops?
 
Last edited:

SaucyJack

Member
Your "evidence" says the XB1 was superior.



Nice math, it's actually closer to a 40% difference.



This 18% difference gets smaller by the day.



And the XSX just has a much better GPU than the PS5.

Again, the flop delta between the XB1 and PS4 almost directly translated to final game performance.



I'll simplify this for you.

If both consoles are pushing the exact same graphics, the XSX will be doing so at a near 20% higher resolution. If both consoles are pushing the same resolution, the XSX will be capable of doing so with more graphical bells and whistles.

This is correct. But if all the XSX is doing is pushing 18% more pixels, then outside of the console warriors most people just aren't going to care.

40% last gen was, sometimes, the difference between HD and sub-HD in multi-platform games, and tbh it wasn't really that big a deal for most people. Assuming that 10.3 TFLOPS is enough to do 4K at a stable frame rate, the extra 18% is going to be even less of a deal.

Price and games will be far more important factors as we go into the next gen.
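For a sense of scale, here's a rough sketch of what an ~18% pixel-count gap would mean if it all went into resolution (illustrative only; real games spend the budget on many things besides pixels):

import math

# If the XSX renders native 4K and the PS5 renders ~18% fewer pixels at the same settings:
xsx_pixels = 3840 * 2160
ps5_pixels = xsx_pixels / 1.18
scale = math.sqrt(ps5_pixels / xsx_pixels)            # per-axis scale factor
print(round(3840 * scale), "x", round(2160 * scale))  # roughly 3535 x 1988, i.e. ~1990p vs 2160p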
 

DForce

NaughtyDog Defense Force
Your "evidence" says the XB1 was superior.

Just because you get better performance doesn't mean it will match or exceed the other GPU.

1.3 TF w/ fewer CUs and a higher clock vs 1.3 TF w/ more CUs and a lower clock.

So was Digital Foundry wrong or not?

Nice math, it's actually closer to a 40% difference.
Really? :messenger_grinning_sweat:

1.84 TF minus 28.80% brings you to 1.31 TF
1.31 TF plus 40.46% brings you to 1.84 TF



And the XSX just has a much better GPU than the PS5.

Again, the flop delta between the XB1 and PS4 almost directly translated to final game perfomance.

+17.96% (from ~10.3 TF up to 12.155 TF)
-15.23% (from 12.155 TF down to ~10.3 TF)

You'll end up with the same result.

But anyway, you're not making much sense. Fewer CUs with higher clocks bring better results. This is stated just about everywhere. I can bring up article after article and you're going to continue to deny it.
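For anyone following the percentage back-and-forth, a quick sketch of how the same two TFLOPS figures give different headline percentages depending on which console you treat as the baseline (using the commonly quoted peak numbers):

ps4, xb1 = 1.84, 1.31
xsx, ps5 = 12.155, 10.3

print((ps4 - xb1) / xb1 * 100)   # ~40.5%: PS4 is ~40% above XB1
print((ps4 - xb1) / ps4 * 100)   # ~28.8%: XB1 is ~29% below PS4
print((xsx - ps5) / ps5 * 100)   # ~18.0%: XSX is ~18% above PS5
print((xsx - ps5) / xsx * 100)   # ~15.3%: PS5 is ~15% below XSX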
 
Oh, it's all about memory bandwidth now? Oof, the PS5 is in for a VERY rough time then. The XB1 must have had a dGPU hidden in the power brick.


Leaks that were practically spot on about all but clocks :pie_thinking:



We also know said cooling system is incapable of cooling the PS5 enough to run at its nominal clock speeds (or the PSU hasn't got enough headroom to do so).



Does it need to be explained why this is entirely irrelevant?



A stopgap that was never (AFAIK) needed on any previous console and is simply a basic part of AMD's APUs, already seen in laptops?

You act like changes in clock frequency during the development of a system are something new. This is pretty normal.
 

SaucyJack

Member
Also, where did this fast-and-narrow vs wide-and-slow comparison for the PS4 and XBO come from?

If 18 CUs at 800 MHz is wide and slow, then 12 CUs at 853 MHz is narrow and slow, or at least the 6% higher clock isn't close to enough to compensate.
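A quick sketch of that comparison using the same TFLOPS formula as earlier in the thread; the 6% clock advantage is swamped by the 50% CU advantage:

def tflops(cus, clock_ghz):
    return cus * 64 * 2 * clock_ghz / 1000

ps4 = tflops(18, 0.800)   # ~1.84 TF
xb1 = tflops(12, 0.853)   # ~1.31 TF
print(ps4 / xb1)          # ~1.41: the PS4 still has ~40% more shader throughput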
 

SatansReverence

Hipster Princess
This is correct. But if all the XSX is doing is pushing 18% more pixels, then outside of the console warriors most people just aren't going to care.

40% last gen was, sometimes, the difference between HD and sub-HD in multi-platform games, and tbh it wasn't really that big a deal for most people. Assuming that 10.3 TFLOPS is enough to do 4K at a stable frame rate, the extra 18% is going to be even less of a deal.

Price and games will be far more important factors as we go into the next gen.

If the difference is native 4k vs upscaled, it is going to make people want native.

A lot of people apparently just brush over the XB1's pathetic marketing and the fact that it was significantly weaker AND more expensive.

Just because you get better performance doesn't mean it will match or exceed the other GPU.

1.3 TF w/ fewer CUs and a higher clock vs 1.3 TF w/ more CUs and a lower clock.

Oh look, it's irrelevant 1.3 vs 1.3 comparisons.

Maybe try it with 1.3 vs 1.8 next time.

So was Digital Foundry wrong or not?
They were just parroting Cerny's talking points.

Really? :messenger_grinning_sweat:

1.84 TF minus 28.80% brings you to 1.31 TF
1.31 TF plus 40.46% brings you to 1.84 TF





+17.96% (from ~10.3 TF up to 12.155 TF)
-15.23% (from 12.155 TF down to ~10.3 TF)

You'll end up with the same result.

I guess we're at the "selectively choosing the direction of percentage calculations to downplay the difference" stage.

But anyway, you're not making much sense. Fewer CUs with higher clocks bring better results. This is stated just about everywhere. I can bring up article after article and you're going to continue to deny it.

So again, the XBone was the superior, more powerful console. It had fewer CUs, higher clocks. Apparently developers just decided to render their multiplats at a significantly lower resolution because... reasons... I guess?
 

SatansReverence

Hipster Princess
Running at constant power is not a part of AMD APUs; it's similar to SmartShift but not the same.

Smartshift is almost exactly what it is in all but name.

You have a finite amount of power you can pump through the APU before either power delivery limitations become a problem or thermal capacity is exceeded.

The only difference is no console player would want their frame rates/resolution to tank randomly so Sony has set profiles for developers to try to work around whether they need more CPU performance or GPU performance.

It is objectively worse than having been designed from the get go to be capable of cooling/powering itself consistently.

A higher peak of power could have been reached on any console using this method.

Yes, Microsoft easily could have done the same, but they didn't need to because they already have the significant power advantage with a console already designed for a specific thermal output/power requirement.
 