Strictly speaking, if your only metrics are power consumption and TF, then it makes no sense, like you say. The intention isn't about consumption or TF for its own sake; it's about improving the other metrics, such as L1/L2 cache. Modern CPU/GPU design throttles based on thermals, scaling the workload down or up with heat, which is why you see a framerate drop when a game gets too demanding. The PS5 approaches this very differently: the power consumption is fixed, and the frequency varies with the workload. So if the CPU only demands X amount of the budget, the rest can go to the GPU to push additional resolution, etc.

Despite what people think, the GPU isn't as fully utilized as people assume. That's why GPGPU existed last gen: graphics output doesn't require the full power of the GPU, so the spare capacity can do general-purpose work in lieu of the traditional CPU, which in the PS4's case meant very weak Jaguar cores. It's also why TF isn't as important as people think; it has its uses, but other factors are at play. In a way, the most wasteful tech out there is SLI: you double the cost and your TF goes through the roof, but it doesn't double the performance. So why not double the power? The short answer is that it doesn't need it, or should I say it can't use it, because of the other elements at play.
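To make the fixed-consumption idea concrete, here's a minimal toy model of a fixed power budget with variable frequency, in the spirit of what Sony has described (and AMD SmartShift). This is not Sony's actual algorithm; all the wattages, ceilings, and the power-to-clock curve are made-up numbers purely for illustration:

```python
# Toy model: fixed total power budget, variable frequency.
# All constants are hypothetical, for illustration only.

TOTAL_BUDGET_W = 200.0   # assumed fixed SoC power budget
CPU_MAX_W = 60.0         # assumed per-part power ceilings
GPU_MAX_W = 180.0

def gpu_clock_ghz(power_w: float) -> float:
    """Map GPU power to clock speed.

    Dynamic power grows roughly with the cube of frequency in
    real silicon, so clock gains flatten as power rises; a cube
    root captures that shape (illustrative constants only).
    """
    return 2.23 * (power_w / GPU_MAX_W) ** (1 / 3)

def allocate(cpu_demand_w: float) -> tuple[float, float, float]:
    """Split the fixed budget: CPU takes what it needs, GPU gets the rest."""
    cpu_w = min(cpu_demand_w, CPU_MAX_W)
    gpu_w = min(TOTAL_BUDGET_W - cpu_w, GPU_MAX_W)
    return cpu_w, gpu_w, gpu_clock_ghz(gpu_w)

for demand in (20.0, 40.0, 60.0):
    cpu_w, gpu_w, ghz = allocate(demand)
    print(f"CPU demand {demand:5.1f} W -> CPU {cpu_w:5.1f} W, "
          f"GPU {gpu_w:5.1f} W @ ~{ghz:.2f} GHz")
```

The point of the sketch is just the shape of the tradeoff: total draw never moves, so a light CPU frame hands watts (and therefore clocks) to the GPU, while a heavy CPU frame pulls them back.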
What Sony is counting on is maximizing utilization across all of a game's computational needs, not just TF. At this point it's hard to say what the gap is really going to look like without seeing the games in motion. I think the Xbox's brute power will probably win on the resolution front, just from the power difference alone. But performance-wise, things like FPS, stutter in gameplay, and open-world designs that need data served instantly are a lot blurrier, and it may surprise some that the PS5 could win on that front due to the lack of a bottleneck. I think RDNA 2.0's efficiency probably removed the heavy power consumption that would have kept Sony out of this realm, but how the thermals keep it constant is going to be interesting.
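For reference on the raw power difference, the quoted TF numbers come from a simple formula (CUs × 64 shaders per CU × 2 FLOPs per clock for an FMA × clock speed), so the paper gap is just CU counts and clocks. Using the publicly announced specs (the helper function here is mine, not either vendor's):

```python
# Peak FP32 TFLOPS = CUs * 64 shaders * 2 ops per clock (FMA) * GHz / 1000
def tflops(cus: int, ghz: float) -> float:
    return cus * 64 * 2 * ghz / 1000

print(f"PS5:      {tflops(36, 2.23):.2f} TF (variable clock, up to 2.23 GHz)")
print(f"Series X: {tflops(52, 1.825):.2f} TF (fixed 1.825 GHz)")
```

That works out to roughly 10.28 TF vs 12.15 TF, about an 18% paper gap, and note the formula contains nothing about caches, I/O, or how consistently those clocks are held, which is exactly the part that's blurry until we see games running.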