I'd say a 6-8 year lifecycle would've been okay, had PS4 and Xbox One launched exceptionally powerful, the way the PS3 was in 2006, and even more so the Xbox 360 in 2005.
If you had an Xbox 360 in late 2005 or even early 2006 and wanted a PC to match its gaming capability, you'd have had to spend a few grand on high-end desktop-class hardware, and even then it would only have been more or less equal. It certainly wouldn't have blown the Xbox 360 out of the water. Last-gen consoles introduced three new high-end concepts for console developers: multi-core processing, HD rendering/asset creation, and complex fully programmable shaders on both MS and Sony hardware (yes, the og Xbox had programmable shaders in 2001, but the PS2 did not).
Those consoles were a huge leap: a massive hurdle for devs to come to grips with, expensive to make quality games for, and costly for MS and Sony to manufacture. On top of that, online multiplayer and a constant online ecosystem had to be standard, unlike the generation before, where Xbox Live launched a year after the og Xbox came out, the PS2 barely had anything at all, and the Dreamcast was dial-up.
Whereas, when PS4 and Xbox One launched in 2013, they were equivalent to a midrange gaming laptop GPU and a low-end laptop CPU. This was and is the HD remaster generation combined with full social media integration. Pretty incremental in itself. Nothing groundbreaking.
As others have said, the traditional console cycle model is archaic and dates back to the 1970s & 80s.
This is not an argument in favor of PS4K, just where the industry is heading.