Folks get way too hung up on percentage differences, whether it's 20, 30, 40, or 50%, and end up with this notion that at some arbitrary percentage point the power difference somehow becomes "significant." Really, the raw numbers are being blown way out of proportion.
To illustrate the point, let's compare some PC GPUs with a genuinely significant gap in performance, the kind of gap where you'd see a huge leap in resolution, IQ, and fidelity between them. Take the fairly pitiful HD7670, with a meager 768 GFLOPS, and compare it to the comparatively monstrous HD7990, still a beast of a card today: a dual-GPU design packing 3891 GFLOPS per chip, roughly 7782 GFLOPS total.
This represents a truly massive power differential. This is where you'd expect to drop resolution, IQ, fidelity, everything, just to get a decent framerate on the weaker hardware. And it makes sense: the HD7990's total GFLOPS work out to about 1013% of the HD7670's, a bit over ten times the raw throughput. That's the level of gap where you can expect something that arguably looks like a generational leap, Xbox vs. PS2, maybe more.
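If you want to sanity-check that gap yourself, here's the back-of-the-envelope arithmetic as a quick Python sketch (the GFLOPS figures are just the spec-sheet numbers quoted above, so treat them as ballpark):

hd7670 = 768.0           # HD7670, spec-sheet GFLOPS
hd7990 = 3891.0 * 2      # HD7990: 3891 GFLOPS per GPU, two GPUs

ratio = hd7990 / hd7670
print(f"HD7990 = {ratio:.2f}x the HD7670's raw GFLOPS")   # ~10.13x
print(f"as a percentage: ~{ratio * 100:.0f}%")            # ~1013%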
Now look back at the Xbone vs. PS4. 50%, what does that amount to again? What do 20, 30, 40% actually result in? Probably not as much as folks have in their minds. Yeah, it's a difference; yeah, first-party titles will show it when they fully exploit each system's power. But multiplats? If there's a difference, don't talk yourself into believing it's going to be some huge gulf. Some things could get trimmed going from PS4 to Xbone, sure, but agonizing over how much will actually come of a ~50% or smaller difference starts to get silly.
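Same arithmetic for the console side, this time with numbers I'm assuming rather than taking from anything above: the commonly quoted launch-window figures of ~1.84 TFLOPS for the PS4's GPU and ~1.23 TFLOPS for the Xbone's (pre-upclock), which is roughly where that 50% comes from:

ps4 = 1840.0    # PS4 GPU GFLOPS (~1.84 TFLOPS, commonly quoted; my assumption)
xbone = 1230.0  # Xbone GPU GFLOPS (~1.23 TFLOPS pre-upclock; my assumption)

ratio = ps4 / xbone
print(f"PS4 = {ratio:.2f}x the Xbone")                     # ~1.50x
print(f"i.e. ~{(ratio - 1) * 100:.0f}% more raw GFLOPS")   # ~50% more

Put that 1.5x next to the 10x gap above and the point kind of makes itself.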
Just a general observation I'm throwing out there, take it for what y'all will.