But it's bad that we just accept this. I mean, nowadays we basically have to assume that if something looks unbelievable, it's because it's unachievable on current hardware. Yes, it's mostly 3rd party games, but even exclusives go through some minor downgrades. I think previous gens were different: games tended to look better closer to release, not worse. Don't get me wrong, I'm okay with how games look today, but I just hate that they still look worse than what we're shown initially.
I agree it's bad. Witcher 3, Watch Dogs and The Division were the main culprits this gen, especially the two W's... Those games looked much worse than what was initially shown, and to Witcher 3's further detriment, console performance and graphical settings were terrible at launch. Even now it's still not a locked 30fps on today's mid-gen consoles.
So my thought was, if you're going to lower graphical detail, it should be to fix framerate and other miscellaneous issues, right? But even then, that doesn't always pan out, which is a shame and not acceptable. Witcher 3 still has slowdown, launched with settings below "low" on consoles, and had bad pop-in, texture loading issues, long load times, etc., all at a much lower fidelity than when it was first shown. At the very least, if you can't live up to the graphics touted in the initial trailers, nail the performance to justify where the render budget went. I'm sure if you asked EA, they'd say Anthem was running at 4K 30fps on consoles at E3...