Consoles are vastly more price sensitive. PC enthusiasts don't balk at a $400 GPU, but a $400 console is considered expensive.
When you have to work within margins that tight, the hardware isn't going to be up to snuff. And even if the console manufacturers eat a loss at launch, as they usually do, to build a bigger install base, they can't meaningfully improve the hardware after launch.
Console lifespans are being stretched too, now running 8+ years. For PC hardware, that's at least four generations of improvements that aren't being reflected in console hardware. As a comparison, the Xbox 360 launched in November 2005, when GeForce was still on its 7000 series of cards. Since then, there have been the 8000s, 9000s, GTX 100s, 200s, 300s, 400s, 500s, now 600s, and likely 700s (GK110) before the next-gen consoles arrive. PCs are now running games with eight generations of GPU improvements behind them.
There's no doubt that console gaming is more economical for both consumers and developers; that's why development has shifted to that side for the lion's share of games. But asking why console tech can't do this or that is basically asking why consoles don't cost $900 and get new revisions every four years.