I'm not going to go through the entire presentation to provide you with a timestamp. Go do that yourself. You claim I have no idea how SmartShift works? The fact that it is a laptop feature already says quite a lot, but let me quote what it is:
AMD SmartShift is a new technology that is being introduced with the new AMD Ryzen 4000 Mobile processor family.
It can be simply described as a smart power distribution technique, to dynamically improve CPU or GPU performance with a limited power budget.
How Does AMD SmartShift Work?
In a typical laptop, the CPU and GPU each have their own pre-defined power budget.
The CPU and GPU will individually adjust their power consumption according to the load, using less power for a longer battery life, but never exceeding their power budget even when there is a need for more performance.
And here's a video as a bonus
In other words, if the GPU is drawing a lot of power, it will keep the CPU from performing at its max capacity, and vice versa. SmartShift is an advantage only when you compare it with a system using the exact same total power without SmartShift. If you have a system that is allowed to use more power, the one with SmartShift will generally just lose, because it is by design limited by its power constraints.
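To make that concrete, here is a toy model of a shared CPU+GPU power budget (my own sketch, not AMD's actual algorithm; the 60W budget and the proportional-scaling policy are invented purely for illustration):

```python
# Toy model of shared-budget power shifting. NOT AMD's actual SmartShift
# algorithm: the 60W budget and proportional-scaling policy are assumptions
# made up for this example.

TOTAL_BUDGET_W = 60.0  # one combined CPU+GPU budget instead of two fixed caps

def allocate(cpu_demand_w: float, gpu_demand_w: float) -> tuple[float, float]:
    """Grant each side its full demand when the sum fits the budget;
    otherwise scale both down so the combined draw never exceeds the cap."""
    total = cpu_demand_w + gpu_demand_w
    if total <= TOTAL_BUDGET_W:
        return cpu_demand_w, gpu_demand_w
    scale = TOTAL_BUDGET_W / total
    return cpu_demand_w * scale, gpu_demand_w * scale

# GPU-heavy scene: the GPU soaks up the power the CPU isn't using.
print(allocate(cpu_demand_w=15.0, gpu_demand_w=45.0))  # (15.0, 45.0)

# Both sides demand their max at once: the shared cap forces a compromise,
# so neither component reaches its peak. This is the limitation above.
print(allocate(cpu_demand_w=30.0, gpu_demand_w=50.0))  # (22.5, 37.5)
```

A system simply allowed to draw 80W would grant both demands in the second case, which is exactly why SmartShift only wins at equal total power.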
And remember that the specs given for the PS5 were the MAX performance numbers, so SmartShift is already accounted for. It can even be argued that they are painting it as better than it really is, since it will not be able to run at its max GPU speed and max CPU speed at the same time.
Tell me again I don't know how SmartShift works or how the PS5 works.
Just to add.
The PS5 uses SmartShift to send any unused power from the CPU to the GPU, but SmartShift is not what decides which frequencies the CPU/GPU run at.
That logic is different from SmartShift and from any other PC logic, and is done by custom Sony logic... SmartShift only comes into play when there is unused power on the CPU to send to the GPU.
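A rough sketch of that two-stage idea (purely illustrative: the cube-law power model, the wattages, the budget, and the 10MHz step are all my assumptions, not Sony's implementation):

```python
# Illustrative sketch of the two-stage behaviour described above.
# NOT Sony's implementation: the power model, wattages, budget, and step
# size are assumptions. For brevity it only ever trims the GPU clock.

CPU_MAX_MHZ, GPU_MAX_MHZ = 3500, 2230
CPU_PEAK_W, GPU_PEAK_W = 60.0, 180.0  # made-up peak draws
SOC_BUDGET_W = 220.0  # fixed total budget, independent of temperature

def draw_w(freq: int, max_freq: int, activity: float, peak_w: float) -> float:
    """Toy model: power scales with workload activity and roughly with the
    cube of frequency (P ~ f * V^2, with voltage V scaling with f)."""
    return peak_w * activity * (freq / max_freq) ** 3

def pick_clocks(cpu_activity: float, gpu_activity: float) -> tuple[int, int]:
    """Deterministic clock selection: the same activity inputs always give
    the same clocks on every console; temperature never enters into it."""
    cpu, gpu = CPU_MAX_MHZ, GPU_MAX_MHZ
    while (draw_w(cpu, CPU_MAX_MHZ, cpu_activity, CPU_PEAK_W)
           + draw_w(gpu, GPU_MAX_MHZ, gpu_activity, GPU_PEAK_W)) > SOC_BUDGET_W:
        gpu -= 10  # small frequency cuts give outsized power savings
    return cpu, gpu

# Light CPU load: the CPU's unused headroom covers the GPU, so both
# run at max clocks (this is where the SmartShift hand-off helps).
print(pick_clocks(cpu_activity=0.5, gpu_activity=1.0))  # (3500, 2230)

# Pathological load on both: the GPU sheds ~90MHz, about 4%.
print(pick_clocks(cpu_activity=1.0, gpu_activity=1.0))  # (3500, 2140)
```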
Cerny chose that variable frequency because he wanted to let devs get the most out of the other parts of the GPU when they need it... so rasterization, the command buffer, the L1 & L2 caches, etc. run about 33% faster than they would at roughly 1.67GHz, the clock the alternative design would have needed for the same teraflops.
It is easier for devs to use 36 parallel CUs than 48 parallel CUs (which was their other design option)... efficiency is the key here.
He also says that in the worst-case scenario the GPU will run at lower clocks, but not that much lower than 2.23GHz, because even a couple of percent drop in frequency already saves about 10% in power draw.
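You can sanity-check that with the common rule of thumb that power scales with frequency times voltage squared, and voltage scales roughly with frequency, so power goes roughly with the cube of frequency (a simplification, not Cerny's exact model):

```python
# Sanity check of "a couple of % frequency drop saves ~10% power" using
# the rule-of-thumb approximation P ~ f * V^2 with V scaling with f,
# i.e. P ~ f^3. A simplification, not Cerny's actual power model.
for drop_pct in (2, 3, 5):
    f = 1 - drop_pct / 100
    saved = 100 * (1 - f ** 3)
    print(f"{drop_pct}% frequency drop -> ~{saved:.1f}% power saved")
# 2% -> ~5.9%, 3% -> ~8.7%, 5% -> ~14.3%
```

The cube rule alone gives about 6% savings for a 2% drop; the real voltage/frequency curve gets steeper near the top of the range, which is how Cerny's figure ends up around 10%.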
Any downclock will be pretty minor.
Most of the time, both the CPU and GPU will run at their max frequencies.
So the GPU will probably drop between 50-100MHz, only about 2-4% of 2230MHz, to stay within the capped power draw... maybe in the worst case a bit more, like 150-200MHz, but from Cerny's words that seems unlikely.
I'm very interested in seeing some game stats in the future... max, min, and avg frequencies.
Some here think the GPU and CPU will heavily downclock, which goes against what Cerny said.