Back to voxels?
JS: So you guys are just going to use CUDA or whatever?
TS: It could be any general-purpose programming language. But I assume in that case we'll write an algorithm that takes as its input a scene in our own little representation, defined by our own data structures, and spits out a framebuffer full of colors, generating them using any technique we like.
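That view of a renderer, as nothing more than a function from a scene in your own data structures to a framebuffer full of colors, can be sketched in a few lines. Everything here is illustrative and hypothetical, not from any real engine; the "scene" is just colored rectangles painted in submission order.

```python
# A renderer as a plain function: scene (our own data structures) in,
# framebuffer of colors out. All names here are made up for illustration.
from dataclasses import dataclass

@dataclass
class Rect:
    x0: int; y0: int; x1: int; y1: int
    color: tuple  # (r, g, b)

def render(scene, width, height, background=(0, 0, 0)):
    """Fill a framebuffer by whatever technique we like; here, painter's order."""
    fb = [[background] * width for _ in range(height)]
    for r in scene:
        for y in range(max(0, r.y0), min(height, r.y1)):
            for x in range(max(0, r.x0), min(width, r.x1)):
                fb[y][x] = r.color
    return fb

fb = render([Rect(1, 1, 3, 3, (255, 0, 0))], 4, 4)
print(fb[2][2])  # → (255, 0, 0)
```

The point is that nothing about this signature cares whether it runs on a CPU, a GPU, or anything else; the rendering technique inside is entirely up to the programmer.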
Remember the Skaarj?
You could write a software rasterizer that uses the traditional SGI rendering approach; you could write a software renderer that generates the scene using a tiled rendering approach. Take Unreal Engine 1, for instance, which was one of the industry's last really great software renderers. Back in 1996, it was doing colored lighting, volumetric fog, and filtered texture mapping, all in real time on a 90MHz Pentium. The prospect now is that we can do that quality of rendering on a multi-teraflop computing device, and whether that device calls a CPU or a GPU its ancestor is really quite irrelevant.
Once you look at rendering that way, you're just writing code to generate pixels. So you could use any possible rendering scheme: you could render triangles; you could use the REYES micropolygon tessellation scheme and render sub-pixel triangles with flat shading but really high-quality anti-aliasing (a lot of off-line movie renderers do that); or you could represent your scene as voxels and raycast through it to generate the pixels. Remember all the old voxel-based games in the software rendering era?
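The voxel raycasting idea can be shown in miniature: march a ray through a 3D occupancy grid and report the first solid voxel it hits. This is a naive fixed-step sketch with made-up names; a real voxel renderer would run a proper DDA grid traversal per pixel and shade from the hit voxel's material.

```python
# Naive voxel raycast sketch: step along the ray, return the first solid
# voxel's coordinates, or None for "sky". Illustrative only.
def raycast(grid, origin, direction, step=0.05, max_dist=100.0):
    ox, oy, oz = origin
    dx, dy, dz = direction
    t = 0.0
    while t < max_dist:
        x, y, z = int(ox + dx * t), int(oy + dy * t), int(oz + dz * t)
        if 0 <= x < len(grid) and 0 <= y < len(grid[0]) and 0 <= z < len(grid[0][0]):
            if grid[x][y][z]:
                return (x, y, z)  # hit: this voxel would shade the pixel
        t += step
    return None  # ray escaped the scene

# 4x4x4 grid with one solid voxel at (2, 0, 0)
grid = [[[False] * 4 for _ in range(4)] for _ in range(4)]
grid[2][0][0] = True
print(raycast(grid, (0.0, 0.0, 0.0), (1.0, 0.0, 0.0)))  # → (2, 0, 0)
```

Doing this per pixel, for a large grid, is exactly the kind of embarrassingly parallel workload that maps well onto a many-core device, whatever its ancestry.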
JS: Yeah.
TS: You could do that with enormous fidelity for complete 3D voxel environments now in real-time. You might even be able to do that on an NVIDIA GPU in CUDA right now. But whether or not you can actually do that today, I have little doubt that you'll be able to do that on processors from multiple vendors in a few years.
And what else could you do? You could do a ray-tracing-based renderer. You could do a volumetric primitive-based renderer: one idea is to divide your scene into a bunch of tiny spherical primitives and then just render the spheres with anti-aliasing. That way you can render scenes like forests and vegetation really efficiently.
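The sphere-primitive idea, stripped to its core, is splatting: for each sphere, find the pixels it covers and let the nearest sphere win each pixel. The orthographic projection and all names below are assumptions for the sake of a small example; a production renderer would add perspective projection and the anti-aliasing mentioned above.

```python
# Sketch of sphere splatting: orthographic projection along z, with a
# depth buffer so the nearest sphere wins each pixel. Illustrative only.
import math

def splat_spheres(spheres, width, height, background=(0, 0, 0)):
    """spheres: list of (cx, cy, cz, radius, color) tuples."""
    fb = [[background] * width for _ in range(height)]
    depth = [[math.inf] * width for _ in range(height)]
    for cx, cy, cz, r, color in spheres:
        # Only visit the sphere's 2D bounding box.
        for y in range(max(0, int(cy - r)), min(height, int(cy + r) + 1)):
            for x in range(max(0, int(cx - r)), min(width, int(cx + r) + 1)):
                if (x - cx) ** 2 + (y - cy) ** 2 <= r * r and cz < depth[y][x]:
                    depth[y][x] = cz
                    fb[y][x] = color
    return fb

fb = splat_spheres([(4, 4, 5.0, 2, (0, 128, 0))], 8, 8)
print(fb[4][4])  # → (0, 128, 0)
```

Because each sphere touches only its own bounding box of pixels, millions of tiny spheres, the leaves of a forest, say, can be processed independently and in parallel.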
There are really countless possibilities. I think you'll see an explosion of new game types and looks and feels once rendering is freed from the old SGI rendering model.
Remember, the model we're using now with DirectX was defined 25 years ago by SGI with the first OpenGL API. It's a very, very restrictive model that assumes you're going to generate all the pixels in your scene by submitting a bunch of triangles in a fixed order to be blended into some framebuffer using some blending equation. And the fact that we have these ultra-complex programmable pixel shaders running on each pixel just means that part of the pipeline has been made extensible; it's still the same back-end rendering approach underneath.
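That fixed "blending equation" back end really is just a small per-pixel combine. As one concrete (and assumed, for illustration) instance, classic alpha blending computes out = src * alpha + dst * (1 - alpha):

```python
# Classic alpha blending, the kind of fixed combine the traditional
# pipeline applies after the (now programmable) shading stages.
def blend(src, dst, alpha):
    return tuple(round(s * alpha + d * (1 - alpha)) for s, d in zip(src, dst))

print(blend((255, 0, 0), (0, 0, 255), 0.5))  # → (128, 0, 128)
```

The shaders upstream can compute anything, but every fragment still funnels through a fixed-order, fixed-equation merge like this one; that is the restriction being described.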
JS: So to follow up with that, I hear that Larrabee will be more general-purpose than whatever NVIDIA has out at the time, because NVIDIA is still gonna have some hardware blocks that support whatever parts of the standard rasterization pipeline.
TS: That's kind of irrelevant, right? If you have a completely programmable GPU core, the fact that you have some fixed-function stuff off to the side doesn't hurt you. Even if a 100 percent software-based renderer never utilizes it at all, there are economic arguments for keeping that hardware around even if it goes unused during a lot of the game: for instance, if it consumes far less power when you're running old DirectX applications, or if it performs better for legacy usage cases.
Because one important thing in moving to future hardware models is that they can't afford to suddenly lose all the current benchmarks. So DirectX will remain relevant even after the majority of shipping games use 100 percent software-based rendering techniques, just because those benchmarks can't be ignored.
So I think you'll see some degree of fixed-function hardware in everybody's architectures for the foreseeable future, and it doesn't matter. And as long as the hardware is sufficiently programmable, we're fine with that.