Yeah, most of the time, meaning if the GPU needs a boost clock it will do so, but if it doesn't, it won't, and it's up to the developer to balance things between the CPU and GPU because both of them can't run at boost clocks at the same time on the PS5. By the PS5 running at a fixed power system, I assume you mean 200-220W? Correct me if I'm wrong, but are you basically saying the PS5 runs at that power range on an indie 2D game too? Or does Cerny mean a fixed upper limit of 220W for the PS5 in order to prevent heating issues?
Sorry, but some of this is straight-up FUD. The 3.5 GHz CPU and 2.23 GHz GPU clocks are not "boost clocks"; they do not function the way boost clocks do on PC CPUs and GPUs. The system has a fixed power budget, so when a game runs into a GPU-limited scenario, power from the CPU's budget can be shifted to the GPU's budget, since most of the time the CPU will have the spare budget to allow it. If the scenario is the inverse of that, then the inverse shift occurs.
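A minimal sketch of that fixed-budget idea. To be clear, the wattage numbers and allocation logic here are made up for illustration (the real CPU/GPU split and SmartShift behavior aren't public); the only point it demonstrates is that budget unused by one block can be lent to the other while the total never exceeds a fixed cap:

```python
# Illustrative sketch only: numbers and logic are assumptions, not Sony's
# actual SmartShift implementation.

TOTAL_BUDGET_W = 200.0   # assumed fixed total SoC power budget
CPU_NOMINAL_W = 60.0     # assumed nominal CPU share
GPU_NOMINAL_W = 140.0    # assumed nominal GPU share

def allocate(cpu_demand_w: float, gpu_demand_w: float) -> tuple[float, float]:
    """Shift unused budget between blocks; the total never exceeds the cap."""
    cpu = min(cpu_demand_w, CPU_NOMINAL_W)
    gpu = min(gpu_demand_w, GPU_NOMINAL_W)
    spare = TOTAL_BUDGET_W - cpu - gpu
    # Lend the spare headroom to whichever side is still power-limited.
    if gpu_demand_w > gpu:
        gpu += min(spare, gpu_demand_w - gpu)
    elif cpu_demand_w > cpu:
        cpu += min(spare, cpu_demand_w - cpu)
    return cpu, gpu

# GPU-limited frame: the CPU only needs 40 W, so the GPU borrows the spare 20 W.
cpu_w, gpu_w = allocate(cpu_demand_w=40.0, gpu_demand_w=160.0)
assert cpu_w + gpu_w <= TOTAL_BUDGET_W
```

The key design point the sketch captures: the developer isn't choosing clocks, the system is just reallocating a constant power total based on demand.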
Both the CPU and GPU can run at their peak clocks when required, and only in very rare instances will the GPU's power budget need to be cut by at most 10%, resulting in just a ~2% frequency reduction (which would put the GPU at about 2.185 GHz, still slightly above 10 TF). Yes, if a game doesn't call for full utilization of hardware resources, the system won't need to provide them, and when it does, the system will.
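The numbers above check out if you run them. Quick worked arithmetic, using the publicly stated PS5 GPU figures (36 CUs, 64 shader ALUs per CU, 2 FLOPs per ALU per clock):

```python
# Verify the 2% frequency drop and the resulting TFLOPS figure.
PEAK_CLOCK_GHZ = 2.23
CUS, ALUS_PER_CU, FLOPS_PER_CLOCK = 36, 64, 2

def tflops(clock_ghz: float) -> float:
    # FP32 TFLOPS = CUs * ALUs/CU * FLOPs/clock * clock (GHz) / 1000
    return CUS * ALUS_PER_CU * FLOPS_PER_CLOCK * clock_ghz / 1000.0

reduced_clock = PEAK_CLOCK_GHZ * 0.98      # 2% reduction
print(round(reduced_clock, 3))             # 2.185 GHz
print(round(tflops(PEAK_CLOCK_GHZ), 2))    # 10.28 TF at peak clock
print(round(tflops(reduced_clock), 2))     # 10.07 TF after the 2% drop
```

So even the worst-case clock still lands above the 10 TF mark, which is the whole point.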
But it can do this for BOTH the CPU and GPU simultaneously while staying within the total system power budget, and it only drops frequency by about 2% in a worst-case scenario. Also, it's not "up" to the developer as if they need to micromanage their code between CPU and GPU specifically for the PS5; the only instructions cited as causing a worst case are 256-bit AVX instructions, which are barely used in any game whatsoever. Most dev code won't trigger a worst-case scenario at all.