borborygmus
Member
I don't have a good sense of what people think about hardware-accelerated real-time ray tracing. Sony, Microsoft and nVidia are marketing the hell out of it, but I don't think I've read many positive impressions of it on forums.
To me it seems like an obvious waste of resources. You can still have very good looking reflections and lighting/shadows without it, and imho the quality is "good enough" for now in the big picture. When I look at "RTX on" screenshot galleries, they very quickly resort to showing scenes with large puddles, which reminds me how situational and marginal the benefit is.
Is this an artifact of how cynical this field has become? Or maybe it's a necessary stepping stone toward something that will become worthwhile in the future. But then that raises new questions: should we dedicate silicon to specific features, and how much? This seems to contradict the previous trend of generalizing GPU programmability.
Am I an idiot? I'm open to that possibility. Let me know.
edit: I a word.