
Microsoft announces RayTracing for DirectX.

Leonidas

Member


For the last thirty years, almost all games have used the same general technique—rasterization—to render images on screen. While the internal representation of the game world is maintained as three dimensions, rasterization ultimately operates in two dimensions (the plane of the screen), with 3D primitives mapped onto it through transformation matrices. Through approaches like z-buffering and occlusion culling, games have historically strived to minimize the number of spurious pixels rendered, as normally they do not contribute to the final frame. And in a perfect world, the pixels rendered would be exactly those that are directly visible from the camera:
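The 3D-to-2D mapping the article describes can be sketched in a few lines. This is a toy example of my own (the function name, field of view, and viewport numbers are invented, not from the article), showing a perspective transform taking a camera-space point to pixel coordinates:

```python
# Toy sketch (mine, not from the article): the perspective transform a
# rasterizer applies to map a camera-space 3D point onto the 2D screen.
import math

def project(point, fov_deg=90.0, width=640, height=480):
    """Map a camera-space point (x, y, z), with z > 0 in front of the
    camera, to pixel coordinates on a width x height screen."""
    x, y, z = point
    f = 1.0 / math.tan(math.radians(fov_deg) / 2)  # focal length from FOV
    aspect = width / height
    ndc_x = (f / aspect) * x / z  # normalized device coords, roughly [-1, 1]
    ndc_y = f * y / z
    px = (ndc_x + 1) * 0.5 * width   # viewport transform to pixel space
    py = (1 - ndc_y) * 0.5 * height  # flip y: screen y grows downward
    return px, py

print(project((0.0, 0.0, 1.0)))  # -> (320.0, 240.0): straight ahead = centre
```

A real rasterizer does the same thing with 4x4 matrices and then uses z-buffering to keep only the nearest surface per pixel, which is exactly the "minimize spurious pixels" point above.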

[...]

Today, we are introducing a feature to DirectX 12 that will bridge the gap between the rasterization techniques employed by games today, and the full 3D effects of tomorrow. This feature is DirectX Raytracing. By allowing traversal of a full 3D representation of the game world, DirectX Raytracing allows current rendering techniques such as SSR to naturally and efficiently fill the gaps left by rasterization, and opens the door to an entirely new class of techniques that have never been achieved in a real-time game.

What Does This Mean for Games?
DXR will initially be used to supplement current rendering techniques such as screen space reflections, for example, to fill in data from geometry that’s either occluded or off-screen. This will lead to a material increase in visual quality for these effects in the near future. Over the next several years, however, we expect an increase in utilization of DXR for techniques that are simply impractical for rasterization, such as true global illumination. Eventually, raytracing may completely replace rasterization as the standard algorithm for rendering 3D scenes. That said, until everyone has a light-field display on their desk, rasterization will continue to be an excellent match for the common case of rendering content to a flat grid of square pixels, supplemented by raytracing for true 3D effects.

https://blogs.msdn.microsoft.com/directx/2018/03/19/announcing-microsoft-directx-raytracing/
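The hybrid idea from the blog post — reuse cheap screen-space data when it exists, trace a ray only when it doesn't — can be toy-modelled like this. Everything here is a made-up 1-D illustration of mine (the buffers, names, and values are not from the article):

```python
# Hypothetical 1-D toy of the hybrid SSR + raytracing idea: reuse
# rasterized screen-space data when the reflected sample is on-screen,
# and fall back to tracing into the scene representation when it is not.
screen_buffer = {0: "red", 1: "green", 2: "blue"}  # rasterized pixels
scene = {-1: "sky", 3: "wall"}                     # off-screen geometry

def reflection_color(sample_x):
    if sample_x in screen_buffer:      # SSR hit: data already rasterized
        return screen_buffer[sample_x]
    return scene.get(sample_x, "miss") # off-screen: trace into the scene

print(reflection_color(1))  # -> green (cheap screen-space lookup)
print(reflection_color(3))  # -> wall  (traced fallback fills the SSR gap)
```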

Remedy and EA have posted a video each showcasing this technology.



 

Shifty

Member
Neat. Some of the materials in the Northlight scene have some serious noise on them, though. That's pretty typical of modern raytracing, since it casts multiple rays per pixel and 'resolves' the frame over time, but it's surprising to me that they'd put out a demo with such obvious artifacting all over the place. I wouldn't call this production-ready for a while yet.
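The noise comes from Monte Carlo estimation: each pixel averages a small number of randomly-jittered ray results, so low sample counts show visible variance. A toy illustration (invented numbers, nothing to do with the actual demo):

```python
# Illustrative sketch of why few-sample raytracing is noisy: a Monte
# Carlo estimate of a pixel's brightness converges as more rays are
# averaged, so low-sample frames show visible variance (grain).
import random

def shade_pixel(true_value=0.5, samples=1, rng=None):
    """Average `samples` noisy ray results around the true brightness."""
    rng = rng or random.Random(0)
    total = sum(min(1.0, max(0.0, true_value + rng.uniform(-0.4, 0.4)))
                for _ in range(samples))
    return total / samples

rng = random.Random(42)
for n in (1, 4, 64, 1024):
    estimates = [shade_pixel(samples=n, rng=rng) for _ in range(5)]
    spread = max(estimates) - min(estimates)
    print(n, round(spread, 3))  # larger n -> lower variance (less grain)
```

Accumulating over frames (or denoising) is just a way of getting more effective samples without paying for them all in one frame.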
 

sertopico

Member
Basically, in the short term they're gonna fix that annoying SSR problem where reflections disappear and reappear depending on the camera angle, and which also prevents surfaces like mirrors from reflecting the environment correctly. Well, good to hear.
 

DryvBy

Member
Haven't there already been games with ray tracing? Also, isn't that more reliant on the graphics engine than the API? Sorry, tired and can't think straight.
 

Shifty

Member
Haven't there already been games with ray tracing? Also, isn't that more reliant on the graphics engine than the API? Sorry, tired and can't think straight.
Technically Wolfenstein 3D was raycast (a 2D cousin of raytracing). The renderer would cast a ray for each column of pixels, then figure out the texture from the type of block hit and the on-screen size from the distance travelled.
You'll find it here and there in older games because, up to a point, raytracing is simpler and cheaper than rasterization. Past that point, however, the performance cost explodes, which is why we haven't seen the technique much in modern rendering.
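A toy version of that per-column raycast (the map, start position, and angles are invented for illustration, not Wolfenstein's actual code):

```python
# Hypothetical sketch of Wolfenstein-style column raycasting: one ray per
# screen column, marched through a 2D tile map until it hits a wall, with
# the wall's on-screen height derived from the distance travelled.
import math

MAP = [
    "#####",
    "#...#",
    "#...#",
    "#####",
]

def cast_column(px, py, angle, max_dist=20.0, step=0.01):
    """March a ray from (px, py) until it enters a '#' tile; return distance."""
    dx, dy = math.cos(angle), math.sin(angle)
    dist = 0.0
    while dist < max_dist:
        x, y = px + dx * dist, py + dy * dist
        if MAP[int(y)][int(x)] == "#":
            return dist
        dist += step
    return max_dist

def wall_height(dist, screen_h=200):
    """Closer walls fill more of the column: height ~ 1 / distance."""
    return min(screen_h, int(screen_h / max(dist, 1e-6)))

# One ray per column across a 60-degree field of view.
columns, fov = 10, math.radians(60)
for col in range(columns):
    ray_angle = -fov / 2 + fov * col / (columns - 1)
    d = cast_column(2.5, 1.5, ray_angle)
    print(col, round(d, 2), wall_height(d))
```

Real engines use a grid-stepping DDA instead of this fixed-step march, but the idea is the same: one cheap ray per column, which is why it was affordable in 1992.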

And as far as graphics engine vs API goes, it's been more engine-side up to this point because GPUs have always been rasterization-focused. Any raytracing solution running on a hardware-accelerated backend like DirectX or OpenGL would still have to rasterize at least one quad in order to display the traced rays onscreen.
Now that we have these new to-the-metal APIs like DX12 and Vulkan, developers and platform holders are able to explore more novel techniques. You could probably still implement your own solution engine-side, but creating a unified API makes sense, since raytracing is easy to implement but hard to optimize.
 

Leonidas

Member
I guess it would only be for those (like myself) who have Nvidia GPUs, as they teamed up with Microsoft

DirectX Raytracing is supported by AMD as well.

Nvidia's RTX is different: it requires a Volta GPU (the only currently available Volta GPU is the $3000 Titan V). Though I suspect Nvidia will announce mainstream Volta GPUs sometime soon...


https://www.anandtech.com/show/12547/expanding-directx-12-microsoft-announces-directx-raytracing
 

oldergamer

Member
I'm almost certain we will see some of this in Xbox One X games. It's fairly easy to use in limited places to increase fidelity.
 
Hate to burst your bubble, but this is not happening. Look at DX12 and Vulkan; look at 25-year-old (yes, 25 years) real-time raytracing... This is just marketing and announcement BS.
 

onQ123

Member
Hate to burst your bubble, but this is not happening. Look at DX12 and Vulkan; look at 25-year-old (yes, 25 years) real-time raytracing... This is just marketing and announcement BS.


Wrong. Devs can already use ray-tracing for shadows and so on; what this will do is make it part of the API. Both PS4 & Xbox One have features that make them ripe for ray-traced games, but the problem was that devs would most likely have to do all the groundwork, and it wouldn't be easily ported to other platforms. Now there will be a middle ground for bigger games to use it.
 

wellwhy

Neo Member
I'm not sure what form it will be in, but realtime raytracing is the future. This isn't Turf Effects or HairWorks; it's an insanely important rendering method, the one that makes "pre-rendered" footage like trailers and movies look so good. It's more representative of real light's behavior in a single package, encompassing depth of field, ambient occlusion, global illumination, spotlight shadows, reflections, etc. Nowadays those are all separate techniques and faked tricks, but that will soon change. Right now AI gives raytracing a crutch by denoising low-sample results, but it still looks great. The jump in graphical quality between generations has been getting harder to see, but photogrammetry, PBR, and now realtime raytracing are the next big jumps we've been waiting for.
 