The CPU is two generations behind a PS4/Xbox One and just a little worse than a 360's. The GPU setup, though, is basically a much weaker version of what's in a PS4/XB1. Max it out and it can probably produce visuals on par with the best PS3/360 games, but it's not easy to do. It does have access to more modern shaders and more memory, which compensates in some ways, but there's really nothing to fix the crappy CPU. I should say, it's crappy in terms of computational output for games; it's actually very energy efficient and it still works well enough.
I've used this scale before -
1-10, with 1 being an Xbox 360 or a PS3 and 10 being a PS4.
1-2 are actually shared by the 360/PS3, depending on the game.
2-3 are the Wii U's hot spots.
9 is an Xbox One.
10 is a PS4.
Basically, in the right hands I'd expect the average game to always look and run just a hair better than the best the other two systems can muster, but it takes almost as much work as a PS3 game. The userbase just isn't there, so you don't see that kind of effort being put in. Also, if a game makes heavy use of the GamePad and TV at the same time without simply cloning the screen, it eats up a decent chunk of that extra power, and you end up with games that look a bit worse than on 360/PS3.
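To put a rough number on that last point, here's a back-of-envelope sketch. It assumes the TV view is rendered at 1280x720 (a common but not universal Wii U render resolution) and the GamePad view at its native 854x480, and that GPU fill/shading cost scales roughly with pixel count; real games vary a lot:

```python
# Rough pixel-count math for rendering two distinct views at once.
# Assumes GPU cost scales ~linearly with pixels drawn, which is a
# simplification (duplicated draw calls and CPU scene work add more).

TV_RES = (1280, 720)       # assumed TV render resolution
GAMEPAD_RES = (854, 480)   # GamePad's native screen resolution

tv_pixels = TV_RES[0] * TV_RES[1]                  # 921,600
gamepad_pixels = GAMEPAD_RES[0] * GAMEPAD_RES[1]   # 409,920

extra = gamepad_pixels / tv_pixels
print(f"Second view adds ~{extra:.0%} more pixels per frame")
# -> Second view adds ~44% more pixels per frame
```

So a non-cloned GamePad view costs somewhere around 44% more pixels per frame before you even count the doubled-up scene work, which is why those games can end up looking worse than their 360/PS3 counterparts.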