
Next-Gen PS5 & XSX |OT| Console tEch threaD


Tripolygon

Banned
All true.... Except in those cases, the GPU boost and CPU boost work independently. Both clock up and down as needed and do not interfere with each other, unlike the PS5, which has been confirmed to need to balance between the CPU and GPU workload.
Confirmed by who? All this stuff will be revealed at next year's GDC, when devs start to talk about their next-gen games. Game design is a balancing act, mate; even more so on consoles, which have always been about adjusting your scope to a power budget. Want a higher frame rate? You lower the resolution and other things.
 

pasterpl

Member
Do you think they'll put every second console into a cloud rack just to offer a feature not many people will appreciate? Also remember that a lot of decisions are made in the USA, a country which is still medieval in terms of Internet standards (data caps? I forgot about those 20 years ago).

It might be something offered only for first-party games. And remember, in my theory this would only be streamed while the actual game installs, so after your first session is finished, the second time you start that game it would run from the local machine.

Re: data caps: streaming service usage, digital game sales, etc. suggest that data caps would not be a massive issue. Besides, if these US companies were really worried about them, why would they invest in Stadia, xCloud, GeForce Now, etc.?
 

henau212

Neo Member
That will never happen. And why is the worst case design target being extrapolated to be the norm?

They expect the norm for the clocks to be at or around those max clocks without the need to drop frequency.
The question was not about the norm, but whether both can be at absolute max frequency. The answer seems to be no; otherwise he would not be talking about the worst case and throttling when that happens. And yes, that is not the norm. But that was not the question. The talk about frequency is kind of strange. Why is it so hard to run the CPU at a fixed 3 GHz if the Xbox is doing it way above that?

But who cares in the end. I am more interested in the cooling of the PS5 before I buy one... That will have to be very powerful and hopefully silent. They seem to be way above the efficiency curve on the GPU if 10 percent of power only yields a couple percent of performance...
 
This is the first time I'm hearing about the L2/L3 cache bandwidth advantage. Where was this mentioned? Also, is the faster rasterization going to make up a 17% gap in TFLOPS? The PS5 only has 22% higher clock speeds, while the XSX has 44% more CUs; I don't see how faster clocks on 16 fewer CUs will make a dent.
Not sure if someone already told you, but if you go back to 32:30 in the Road to PS5 video, Cerny talks about how increasing the clock speed of the GPU also boosts rasterisation and the caches' bandwidth. The reason Cerny went with a lower CU count is that it's easier to fully load the GPU that way. And with the higher cache bandwidth, the memory can feed more data to the GPU in less time, and the cache scrubbers, in tandem with the coherency engines, allow new data to flow in without the GPU stalling.
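To put rough numbers on that, here's a back-of-the-envelope sketch using only the public spec-sheet figures (the helper function is illustrative, not from the video):

```python
# Compute scales with CUs * clock, but fixed-function front-end rates
# (rasterizer, primitive/pixel throughput, cache clocks) scale with clock alone.

def tflops(cus, clock_ghz, lanes=64, ops_per_clock=2):
    """FP32 TFLOPS = CUs * 64 SIMD lanes * 2 ops (FMA) * clock."""
    return cus * lanes * ops_per_clock * clock_ghz / 1000.0

ps5 = {"cus": 36, "clock": 2.23}   # GHz
xsx = {"cus": 52, "clock": 1.825}  # GHz

print(f"PS5 compute: {tflops(ps5['cus'], ps5['clock']):.2f} TF")  # ~10.28 TF
print(f"XSX compute: {tflops(xsx['cus'], xsx['clock']):.2f} TF")  # ~12.15 TF
print(f"PS5 front-end clock advantage: {ps5['clock'] / xsx['clock'] - 1:+.0%}")  # ~+22%
```

So the XSX wins on raw compute, while everything on the PS5 die that scales with clock rather than CU count runs roughly 22% faster.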
 

quest

Not Banned from OT
Not sure if someone already told you, but if you go back to 32:30 in the Road to PS5 video, Cerny talks about how increasing the clock speed of the GPU also boosts rasterisation and the caches' bandwidth. The reason Cerny went with a lower CU count is that it's easier to fully load the GPU that way. And with the higher cache bandwidth, the memory can feed more data to the GPU in less time, and the cache scrubbers, in tandem with the coherency engines, allow new data to flow in without the GPU stalling.
Those idiots who bought Core 2 Duos should have gone with high-clocked P4s. Those idiots who bought the low-clocked 2080 Ti should have got a higher-clocked 2060 for more performance. Just because Mark "NetBurst" Cerny has to sell his design to the public does not make it correct. Wanna bet the next high-end NVIDIA cards are not low-core, super-high-clocked? It's all PR, plain and simple: he way overspent on the SSD and there was limited budget left for the APU, so clock it just short of exploding. It's fine to admit he is wrong here, unless you're saying NVIDIA is also wrong lol.
 

Ascend

Member
Confirmed by who? All this stuff will be revealed at next year's GDC, when devs start to talk about their next-gen games. Game design is a balancing act, mate; even more so on consoles, which have always been about adjusting your scope to a power budget. Want a higher frame rate? You lower the resolution and other things.
By Cerny AKA Sony? Who else?
 

Evilms

Banned
 
Those idiots who bought Core 2 Duos should have gone with high-clocked P4s. Those idiots who bought the low-clocked 2080 Ti should have got a higher-clocked 2060 for more performance. Just because Mark "NetBurst" Cerny has to sell his design to the public does not make it correct. Wanna bet the next high-end NVIDIA cards are not low-core, super-high-clocked? It's all PR, plain and simple: he way overspent on the SSD and there was limited budget left for the APU, so clock it just short of exploding. It's fine to admit he is wrong here, unless you're saying NVIDIA is also wrong lol.
Rip open a 2060 and tell me if it has any cache scrubbers. Find any I/O from any desktop part that contains a DMAC, Kraken decompression chip, and coherency engines like the ones the PS5's I/O has. I'll wait....
 

yewles1

Member
Those idiots who bought Core 2 Duos should have gone with high-clocked P4s. Those idiots who bought the low-clocked 2080 Ti should have got a higher-clocked 2060 for more performance. Just because Mark "NetBurst" Cerny has to sell his design to the public does not make it correct. Wanna bet the next high-end NVIDIA cards are not low-core, super-high-clocked? It's all PR, plain and simple: he way overspent on the SSD and there was limited budget left for the APU, so clock it just short of exploding. It's fine to admit he is wrong here, unless you're saying NVIDIA is also wrong lol.
How does that first sentence apply to two consoles that have the same architecture? And why ignore the CPU on the XS being higher-clocked than the PS5's? This argument doesn't work; it seems almost as if certain people are trying to say the PS5 is barely more powerful than a PS4 Pro or something.
 
I don't doubt that - I think questions are mainly how mixing slow/fast access will impact performance.

And that's a really valid question.

But there are some crazy "tech analysis" fuckwits (like me) who promote themselves as authorities (I'm definitely not one) churning out misinformation fodder for sites like NeoGAF.

Case in point - this thread:


Yes - while the CPU providing a fast path to 'its' memory was always part of the PS3 design (so the GPU got fast access either way) - RSX wasn't built the other way around.

My feeling looking back at the PS3 is that Sony were on the cusp of a revolution, but weren't able to fully predict the future or detach themselves from the past.

As with the PS2, I didn't appreciate it at the time. It might not have ended up setting the agenda for the future, even if it foreshadowed it. But it was a really honest and bold shot at moving graphics forwards.

It's a mix - the GC had a really slow pool in the mix - but the Wii upgraded all of its memory to have roughly the same bandwidth, so it was really just latency differences between the SRAM and DRAM pools. The PSP was interesting, as bandwidth was largely the same; the main use for embedded memory was lower latency/direct access for the 'owning' chip (no bus contention).

Interesting point about the Wii - I'd forgotten how much faster its "A-RAM" was!

On a different note, I wonder sometimes if Sony aiming so high in terms of performance in the handheld space limited their options for competing in a very family / kid oriented market.
 
I guess that could be seen as a bad thing, given how we are on a forum to debate... oh wait.
Give it a rest, mate.

The developer made certain claims regarding the PS5 and how the system would be bottlenecked by the CPU, and he stated he would release a video to prove it. People have been waiting and asking for said video, and nothing has materialised so far. He has also gone radio silent on these bottleneck claims.

He just signed up to Twitter last month, is working on an Xbox Series X exclusive, and has his company plastered all over his Twitter page.

It doesn't take a slide rule to figure out this dev is taking part in the console wars, and in the noise surrounding them, to further promote his game by making unsubstantiated claims on social media.
 

ethomaz

Banned
No, that is exactly how it works.
By design they cannot both be at 100% frequency and 100% power budget. Otherwise you could make the clocks fixed and it would not make a difference.
And Cerny never said they're both at max speed at the same time. The whole design philosophy only makes sense that way. If the SoC could run the GPU and CPU at 100% frequency under load, it would be the same classic console design as the PS4 or Series X.

Again, that is literally impossible under load. And without load it also does not matter how high your frequency is, because it's not needed anyway.
Another one denying Cerny's words lol

Cerny said both can be at 100% at the same time... in fact, he said that is what happens most of the time.
 

ethomaz

Banned
I know that. And that is exactly what I said.
100% GPU and 100% CPU at 100% power target is impossible.
Yes, they continually run in boost mode. Guess where the boost starts? Not at 100% max frequencies.

It's at 80%, 85% or 90%.
Nope.

100% CPU and 100% GPU at the same time.
CPU at 3.5 GHz and GPU at 2.23 GHz at the same time.


He did not.
If he did, it would be easy for you to find a video or a quote of it.
And if he did, he would've said something wrong.
Of course he did lol

Road to PS5... watch it... it was linked on the last page but you ignored it.
 

Nickolaidas

Member
I'm at 90%.

My family is not.

Mom, dad, wife, son... Everyone's sick. My dad is very sick.

Yes, a nightmare.
My dad is 73 years old, diabetic, with high blood pressure. I sincerely hope he won't get it - at least not until a solid cure or a vaccine is available.

I really pray your dad will make it.

This fucking thing is so contagious it's not even funny.
 

Ascend

Member
Stop lying. Present proof with a timestamp. You have no idea how SmartShift works. You have no idea how the PS5 works.
I'm not going to go through the entire presentation to provide you with a timestamp. Go do that yourself. You claim I have no idea how SmartShift works? The fact that it is a function for laptops already says quite a lot, but let me quote what it is:

AMD SmartShift is a new technology that is being introduced with the new AMD Ryzen 4000 Mobile processor family.
It can be simply described as a smart power distribution technique, to dynamically improve CPU or GPU performance with a limited power budget.

How Does AMD SmartShift Work?
In a typical laptop, the CPU and GPU each have their own pre-defined power budget.
The CPU and GPU will individually adjust their power consumption according to the load, using less power for a longer battery life, but never exceeding their power budget even when there is a need for more performance.

And here's a video as a bonus



In other words, if the GPU is drawing a lot of power, it will hamper the CPU from performing at its max capacity and vice versa. SmartShift is an advantage only when you are comparing it with a system using the exact same power without SmartShift. If you have a system that is allowed to use more power, the one with SmartShift will generally simply lose, because it is by default limited by power constraints.
And remember that the specs that were given for the PS5 were the MAX performance numbers. So SmartShift is already accounted for. It can even be argued that they are painting it as better than it really is, since it will not be able to run at its max GPU speed and CPU speed at the same time.

Tell me again I don't know how smart shift works or how the PS5 works.
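To make the quoted description concrete, here's a toy model of a shared power budget (purely illustrative: the budget figure is hypothetical, and the real SmartShift arbitration lives in firmware):

```python
TOTAL_BUDGET_W = 180.0  # hypothetical SoC budget, not an official figure

def split_budget(cpu_demand_w, gpu_demand_w, total=TOTAL_BUDGET_W):
    """Grant each side its demand if it fits; otherwise trim both
    proportionally so the sum never exceeds the cap."""
    demand = cpu_demand_w + gpu_demand_w
    if demand <= total:
        return cpu_demand_w, gpu_demand_w
    scale = total / demand
    return cpu_demand_w * scale, gpu_demand_w * scale

print(split_budget(40, 120))  # both fit: (40.0, 120.0)
print(split_budget(60, 160))  # over budget: each trimmed to ~82% of its ask
```

The point of the toy model: under a fixed total, watts granted to one side are watts the other side can no longer use.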
 
from that dumb tweet


Agreed that it's dumb. It's a very stupid and simplistic take.

In other words, if the GPU is drawing a lot of power, it will hamper the CPU from performing at its max capacity and vice versa. SmartShift is an advantage only when you are comparing it with a system using the exact same power without SmartShift. If you have a system that is allowed to use more power, the one with SmartShift will generally simply lose, because it is by default limited by power constraints.
It's more that if the workload is more GPU-intensive, power will be diverted from the CPU to the GPU in order for the system to actually execute the workload. Without SmartShift, the GPU would not have enough power to take on the workload while the CPU had more power than it really needed. It should also be noted that laptops usually run at a very low TDP compared to consoles. The Asus Zephyrus G14 has a TDP of 35W, iirc. That is what SmartShift is for.

For the PS5, it's a different story because it will have a much higher power draw. Also, power draw and frequency do not tell the whole story. The workload that the CPU and GPU have to work through is what determines the amount of heat produced. As long as the workload is within the capabilities of the CPU and GPU (along with the other custom hardware), both will run at max frequency. However, if the workload is way too high, that is where the PS5 needs to reduce frequency.

For instance, I can make my 2700X run at 3.7 GHz right now with cpu-freq. Is the CPU fan going crazy because I'm constantly running the CPU at a high frequency? No, because the workload is not intensive. However, if I were to run that 2700X at 3.7 GHz with a CPU-intensive game, then the CPU fan would ramp up.
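The 2700X example is exactly the textbook dynamic-power relation. A minimal sketch (the constants are made up; only the ratios matter):

```python
# P_dynamic ≈ activity * C * V^2 * f: frequency alone doesn't set power,
# the switching activity of the workload does.

def dynamic_power(activity, capacitance=1.0, voltage=1.2, freq_ghz=3.7):
    return activity * capacitance * voltage**2 * freq_ghz

light = dynamic_power(activity=0.10)  # browsing, pinned at 3.7 GHz
heavy = dynamic_power(activity=0.90)  # CPU-intensive game, same 3.7 GHz

print(f"light load: {light:.2f} (arbitrary units)")
print(f"heavy load: {heavy:.2f} -> {heavy / light:.0f}x the heat at the same clock")
```

Same frequency, ~9x the power in this toy example, which is why the fan only ramps up when the game does.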
 

xacto

Member
That's it boys, pack it in, the PS5 just lost the war... /s

Fastest AND most powerful... they really are going into this headstrong, and it makes me wonder if they are actually paying for the creation and spread of FUD on forums, given that they are marketing against the PS5 with 'fastest' messaging. Some people claimed the GitHub fiasco was bought and paid for to create FUD and make PS fans root for impossibly high TF figures, only to be left feeling bad when the official announcement came. The insider debacle would also make sense if MS planned that strategy and is now going all in on the console war themselves.

Well, the PS brand is a giant, so I don't see them winning this generation either.

If they think that's enough for them to win this generation, they haven't learned a thing from the past. Trying the same thing over and over and expecting different results... we all know what that is, right?
 

LordOfChaos

Member
This is for the technically illiterate. The GPU of the current gen is too large, but the image describes the situation perfectly.



What this is showing is how SmartShift works, and part of the point of the talk was that this is decidedly not how the PS5 behaves.

In a worst-case scenario, with the highest-power operations on both the CPU and GPU, a "couple" of percent drop in clock speed cuts power by 10%. They expect it to run at or near the peak most of the time.

It's flipping the idea on its head somewhat. On a PC laptop with SmartShift, you'd have a TDP that the cooler can remove and shared power between the CPU and GPU; if the GPU is being maxed out and the CPU isn't, it would give more of that TDP to the GPU, the cooling being the hard limit. SmartShift isn't running near its peak clocks most of the time; it's an opportunistic TDP user.

The PS5 flips this around by saying the power output is constant and the cooling isn't the bottleneck; the CPU and GPU are actually capped from going higher, because they built the system around a constant power output rather than a constant frequency.
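A sketch of that flip, under the common approximation that power scales roughly with f³ once voltage has to rise along with frequency (all numbers illustrative):

```python
F_MAX_GHZ = 2.23  # PS5 GPU frequency cap
P_CAP = 1.0       # normalized power budget at f_max on a worst-case workload

def clock_under_power_cap(workload_intensity):
    """Constant-power regime: the highest clock that fits the budget,
    never boosting past the frequency cap."""
    f = F_MAX_GHZ * (P_CAP / workload_intensity) ** (1 / 3)
    return min(f, F_MAX_GHZ)

for load in (0.7, 1.0, 1.1):
    print(f"workload {load:.1f} -> {clock_under_power_cap(load):.2f} GHz")
# 0.7 and 1.0 both hold 2.23 GHz; only a beyond-worst-case load
# forces a drop, and it's small (~2.16 GHz here).
```

A constant-frequency console does the inverse: the clock is fixed, and the power (and thus the cooling requirement) floats with the workload.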
 

Ascend

Member
I did, and he never said it.
Just stop spreading misinformation.
Just give me the quote or timestamp.

I'll give you $1000 when you find it.
I already stated that I'm not going to look for it. I neither need nor want your money. The bottom line is that he did say it, but in a sugar-coated kind of way, because, well, he has to sell his inferior console as being equally if not more capable. It's the difference between saying "I ate half your ice cream" and "I left some of your ice cream for you".
 

ethomaz

Banned
I'm not going to go through the entire presentation to provide you with a timestamp. Go do that yourself. You claim I have no idea how SmartShift works? The fact that it is a function for laptops already says quite a lot, but let me quote what it is:

AMD SmartShift is a new technology that is being introduced with the new AMD Ryzen 4000 Mobile processor family.
It can be simply described as a smart power distribution technique, to dynamically improve CPU or GPU performance with a limited power budget.

How Does AMD SmartShift Work?
In a typical laptop, the CPU and GPU each have their own pre-defined power budget.
The CPU and GPU will individually adjust their power consumption according to the load, using less power for a longer battery life, but never exceeding their power budget even when there is a need for more performance.

And here's a video as a bonus



In other words, if the GPU is drawing a lot of power, it will hamper the CPU from performing at its max capacity and vice versa. SmartShift is an advantage only when you are comparing it with a system using the exact same power without SmartShift. If you have a system that is allowed to use more power, the one with SmartShift will generally simply lose, because it is by default limited by power constraints.
And remember that the specs that were given for the PS5 were the MAX performance numbers. So SmartShift is already accounted for. It can even be argued that they are painting it as better than it really is, since it will not be able to run at its max GPU speed and CPU speed at the same time.

Tell me again I don't know how smart shift works or how the PS5 works.

Just to add.

The PS5 uses SmartShift to send any unused power from the CPU to the GPU, but SmartShift is not what decides which frequencies the CPU/GPU run at.

That logic is different from SmartShift and any other PC logic, and is done by custom Sony logic... it only uses SmartShift when there is unused power on the CPU to send to the GPU.

Cerny chose variable frequency because he wanted to let devs get the most out of the other parts of the GPU if they need it... so rasterization, the command buffer, the L1 & L2 caches, etc. are 33% faster than they would be on the 48-CU design running at the same TFLOPS (~1670 MHz).

It is easier for devs to use 36 parallel CUs than 48 parallel CUs (that was their other design option)... efficiency is the key here.

He also says that in the worst-case scenario the GPU will run at lower clocks, but not that much lower than 2.23 GHz, because a couple of percent off the frequency already saves 10% in power draw.

Any downclock will be pretty minor.
Most of the time, both the CPU and GPU will run at max frequencies.

So the GPU will probably drop between 50-100 MHz to sustain the capped power draw... maybe in the worst case a bit more, like 150-200 MHz, but from Cerny's words that seems unlikely.

I'm very interested in seeing some game stats in the future... max, min, and avg frequencies.

Some here think the GPU and CPU will heavily downclock, which goes against what Cerny said.
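The "couple of percent buys ~10% power" claim checks out on a cube-law assumption (power roughly tracking f³ when voltage scales with frequency; a back-of-the-envelope check, not Cerny's actual numbers):

```python
for drop_pct in (2, 3, 4):
    f = 1 - drop_pct / 100
    saved = 1 - f**3            # fraction of power saved under P ∝ f^3
    mhz = 2230 * drop_pct / 100
    print(f"{drop_pct}% clock drop (~{mhz:.0f} MHz off 2.23 GHz) -> ~{saved:.0%} power saved")
# 2% -> ~6%, 3% -> ~9%, 4% -> ~12%: right in the ballpark Cerny described.
```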
 
from that dumb tweet



Two fanboys of the same console chatting. If they only look at the numbers, then:

XSX GPU > PS5 GPU (20% is not huge, sorry)
XSX CPU > PS5 CPU (almost the same)
PS5 SSD >> XSX SSD (more than double)

RAM, well, it will depend on how much is left for devs on the PS5, and in some scenarios one will be better than the other.

Sorry, just as I believe Cerny when he talks about the PS5 specs, I also believe Microsoft when they state the theoretical limit of the XSX SSD, and it is still less than half of the best-case scenario for the PS5.

Also, if Lockhart exists, it will have an impact on the graphics of XSX games; but even if not, the first parties need to make their games work on the old Jaguar CPU and very weak GPU of the Xbox One for two years.

If you think this is not important, then why did Microsoft put so much money into making an SSD that can reach around 6 GB/s in the best-case scenario, and why is it basically the only feature the devs talk about? I don't see devs saying that with the new GPUs they can do things that looked impossible before (talking about raw TF numbers); instead they talk about the SSD, the CPU, the audio chips (in both), and GPU technologies like VRS and mesh shaders.
 

ethomaz

Banned
Hint: random read/write speed.
5.5 GB/s is sequential read only; that is the fastest an SSD can do and the most used pattern in games... you basically don't change the game assets (write), and the files are already stored sequentially to be read.

Write is slower than 2.4/5.5 GB/s on both consoles.
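If you want to see the sequential-vs-random gap on your own drive, here's a minimal sketch (illustrative only: the file path is hypothetical, os.pread is POSIX-only, and a proper benchmark would use fio or O_DIRECT to bypass the OS cache):

```python
import os, random, time

PATH = "testfile.bin"  # hypothetical pre-made file, at least ~100 MB
CHUNK = 4096           # small random reads hurt far more than large sequential ones

def read_throughput(path, sequential=True, reads=10_000):
    size = os.path.getsize(path)
    fd = os.open(path, os.O_RDONLY)
    t0 = time.perf_counter()
    for i in range(reads):
        offset = i * CHUNK if sequential else random.randrange(0, size - CHUNK)
        os.pread(fd, CHUNK, offset)  # read CHUNK bytes at the chosen offset
    os.close(fd)
    return reads * CHUNK / (time.perf_counter() - t0) / 1e6  # MB/s

if os.path.exists(PATH):
    print(f"sequential: {read_throughput(PATH, True):.0f} MB/s")
    print(f"random:     {read_throughput(PATH, False):.0f} MB/s")
```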
 

Kusarigama

Member
I'm not going to go through the entire presentation to provide you with a timestamp. Go do that yourself. You claim I have no idea how SmartShift works? The fact that it is a function for laptops already says quite a lot, but let me quote what it is:

AMD SmartShift is a new technology that is being introduced with the new AMD Ryzen 4000 Mobile processor family.
It can be simply described as a smart power distribution technique, to dynamically improve CPU or GPU performance with a limited power budget.

How Does AMD SmartShift Work?
In a typical laptop, the CPU and GPU each have their own pre-defined power budget.
The CPU and GPU will individually adjust their power consumption according to the load, using less power for a longer battery life, but never exceeding their power budget even when there is a need for more performance.

And here's a video as a bonus



In other words, if the GPU is drawing a lot of power, it will hamper the CPU from performing at its max capacity and vice versa. SmartShift is an advantage only when you are comparing it with a system using the exact same power without SmartShift. If you have a system that is allowed to use more power, the one with SmartShift will generally simply lose, because it is by default limited by power constraints.
And remember that the specs that were given for the PS5 were the MAX performance numbers. So SmartShift is already accounted for. It can even be argued that they are painting it as better than it really is, since it will not be able to run at its max GPU speed and CPU speed at the same time.

Tell me again I don't know how smart shift works or how the PS5 works.

The burden of proof lies on you, since you are claiming that Cerny said that in that video; he did not say it, which is why I can't find it in that video.

There is more to SmartShift than that; go do your homework.
 

icerock

Member
from that dumb tweet

That tweet has nothing on the follow-up 😂



Brad Sams: "Cerny is lying about the 5.5 GB/s raw speed. My Microsoft-programmed brain is not complex enough to understand how we allowed Sony to get a 2.25x advantage in I/O. Hence, it must be a lie."

I'm sure you are following the PS5 thread at the other place; it is absolutely hilarious. I'm convinced that bloke 'SPDIF' sent out a beacon to that Discord clique, given how all of them assembled in the space of a few minutes to champion his post and offer support for what Brad Sams wrote.
 

Ascend

Member
The burden of proof lies on you, since you are claiming that Cerny said that in that video; he did not say it, which is why I can't find it in that video.

There is more to SmartShift than that; go do your homework.
Oh. So that's how it works. When it is something that you claim, i.e. that there's more to SmartShift than that, then suddenly the burden of proof is not on the one making the claim anymore, and I have to go look for it myself? GTFO.
 
That tweet has nothing on the follow-up 😂





I'm sure you are following the PS5 thread at the other place; it is absolutely hilarious. I'm convinced that bloke 'SPDIF' sent out a beacon to that Discord clique, given how all of them assembled in the space of a few minutes to champion his post and offer support for what Brad Sams wrote.

Actually, the comparison should be with a high-end PC, because, you know, no game is exclusive to Xbox...
 

Gediminas

Banned
That tweet has nothing on the follow-up 😂





I'm sure you are following the PS5 thread at the other place; it is absolutely hilarious. I'm convinced that bloke 'SPDIF' sent out a beacon to that Discord clique, given how all of them assembled in the space of a few minutes to champion his post and offer support for what Brad Sams wrote.

I went to Twitter and trolled that dumb Brad Sams :)
 

Ascend

Member
What this is showing is how SmartShift works, and part of the point of the talk was that this is decidedly not how the PS5 behaves.

In a worst-case scenario, with the highest-power operations on both the CPU and GPU, a "couple" of percent drop in clock speed cuts power by 10%. They expect it to run at or near the peak most of the time.

It's flipping the idea on its head somewhat. On a PC laptop with SmartShift, you'd have a TDP that the cooler can remove and shared power between the CPU and GPU; if the GPU is being maxed out and the CPU isn't, it would give more of that TDP to the GPU, the cooling being the hard limit. SmartShift isn't running near its peak clocks most of the time; it's an opportunistic TDP user.

The PS5 flips this around by saying the power output is constant and the cooling isn't the bottleneck; the CPU and GPU are actually capped from going higher, because they built the system around a constant power output rather than a constant frequency.
It really isn't as different as they are advertising it to be. In both cases (laptops and the PS5), a certain max power consumption target is given (like 45W, for example), and the frequency is allowed to go wild up to that power consumption target. Those limits are determined by real factors like max cooler size and heat output. But minimum voltages also play a role here, and the higher you go in frequency, the more voltage you will need. So you'll need to cap the frequency to not start burning things. That's the way they know exactly how to design the cooler and what power envelope is allowed. The difference is that in laptops temperature is set as the limiting factor, while in the PS5 it's the workload.

Has anyone here used WattMan? Or Afterburner? You can do the same thing with your graphics card. You set your fan at X speed, you say you want to reach a max of 70 °C for example, and then let the graphics card run wild. The clocks will be 'floating' to give you the max performance within the limits you have set. Or, instead of the max temperature, you simply lower the power limit. Then you can tweak the voltages to try and squeeze more frequency out of it. But instead of manually, this is done automatically on the PS5. This doesn't mean that you wouldn't be better off simply removing the power limit, if we're talking pure performance.

Lastly, the workload being the limiting factor is technically a translation of heat and temperature. And what do I mean by that? A certain workload is expected to have a certain power draw, which correlates with a certain amount of heat production and thus a certain temperature output. You then set a certain limit considering the cooler you have. This data has likely been put into some sort of lookup tables for the chip, so the GPU/CPU can behave the same way without needing temperature sensors and a control system. It's a cost-saving measure, even though the temperature method would generally be more accurate, provided the sensor is working correctly.
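That floating-clock behaviour boils down to a simple feedback loop; a sketch (the step size, limit, and telemetry values are all made up):

```python
POWER_LIMIT_W = 180.0  # user-set power limit, WattMan-style
STEP_MHZ = 15          # hypothetical DVFS step

def settle_clock(measured_power_w, clock_mhz, f_max_mhz=2230):
    """One control iteration: back off when over the limit,
    creep back toward the cap when there's headroom."""
    if measured_power_w > POWER_LIMIT_W:
        return clock_mhz - STEP_MHZ
    return min(clock_mhz + STEP_MHZ, f_max_mhz)

clock = 2230
for power in (170, 188, 195, 182, 176):  # fake telemetry samples
    clock = settle_clock(power, clock)
    print(f"power {power} W -> clock {clock} MHz")
```

Whether the input to that loop is a temperature sensor (laptops) or a deterministic workload/power model (the approach described for the PS5), the control idea is the same.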

Really... I do not see how all of this is a big deal.
 