I'm having trouble understanding why it's on software developers to ensure that a GPU won't overheat from being under-stressed (e.g., a lightweight menu scene rendering at thousands of FPS because nothing caps it). I understand that it's "simple enough" to implement an FPS cap, but what's the difference between a GPU running at full blast rendering as many frames as possible and a GPU running at full blast rendering, say, a 20 FPS chop-fest?
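For concreteness, this is the kind of "simple enough" cap I have in mind: a minimal sleep-based sketch, where `update_and_render` is a hypothetical stand-in for a game's actual per-frame work and 60 FPS is just an example target.

```cpp
#include <chrono>
#include <thread>

// Hypothetical stand-in for the game's real per-frame work.
void update_and_render() { /* ... */ }

int main() {
    using clock = std::chrono::steady_clock;
    constexpr double target_fps = 60.0;  // example cap
    const auto frame_budget = std::chrono::duration_cast<clock::duration>(
        std::chrono::duration<double>(1.0 / target_fps));

    auto next_frame = clock::now() + frame_budget;
    while (true) {
        update_and_render();
        // Sleep off whatever remains of this frame's 1/60 s budget, so the
        // GPU sits idle instead of churning out thousands of frames per second.
        std::this_thread::sleep_until(next_frame);
        next_frame += frame_budget;
    }
}
```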
Am I wrong to think that neither of these scenarios should pose a threat to hardware? If so, why?