
DX11 Real-Time Raytracing Tech Demo Running on an i7 + Radeon 5870

Zaptruder

Banned
That's a gob-smacking bit of tech demo.

But you gotta realize that the scene is also exemplary for the strengths of ray-tracing... namely reflective surfaces.

In other real-time ray-tracing demos, the results aren't significantly improved over what we've already been shown in real-time demos using traditional rendering - the technology has simply come so far that few obvious benefits are left for such a computationally intensive way of rendering a scene.

The main difference you'll see between existing advanced engines (UE4, Luminous, CryEngine 3, etc.) and a real-time ray-tracing engine, outside of much nicer reflections - is just a better sense of cohesion in the visuals... elements like bloom and shadows don't result from separate techniques and algorithms overlaid and blended on top of each other - they come from the same thing.

But even real time reflections in UE4, luminous, etc have gotten pretty good already.
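
As a rough illustration of that "they come from the same thing" point, here is a minimal toy Whitted-style tracer in C++ (nothing from the demo; the one-sphere scene and all names are invented): the shadow test and the reflection are both just further calls to the same ray-vs-scene query, which is why the effects cohere without separate overlaid passes.

// Minimal Whitted-style sketch (illustrative only, not the demo's code).
#include <algorithm>
#include <cmath>
#include <cstdio>

struct V3 { float x, y, z; };
static V3 operator+(V3 a, V3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static V3 operator-(V3 a, V3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static V3 operator*(V3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }
static float dot(V3 a, V3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
static V3 norm(V3 a) { float l = std::sqrt(dot(a, a)); return a * (1.0f / l); }

// One hard-coded sphere stands in for "the scene".
static bool hitSphere(V3 o, V3 d, float& t, V3& n) {
    V3 c = {0, 0, -3}; float r = 1.0f;
    V3 oc = o - c;
    float b = dot(oc, d), disc = b * b - (dot(oc, oc) - r * r);
    if (disc < 0) return false;
    t = -b - std::sqrt(disc);
    if (t < 1e-3f) return false;
    n = norm((o + d * t) - c);
    return true;
}

// Primary visibility, shadows and reflections are all the same operation:
// fire a ray, see what it hits.
static float shade(V3 o, V3 d, int depth) {
    float t; V3 n;
    if (!hitSphere(o, d, t, n)) return 0.1f;                    // background
    V3 p = o + d * t;
    V3 toLight = norm(V3{5, 5, 0} - p);
    float tS; V3 nS;
    bool inShadow = hitSphere(p + n * 1e-3f, toLight, tS, nS);  // shadow ray
    float direct = inShadow ? 0.0f : std::max(0.0f, dot(n, toLight));
    if (depth == 0) return direct;
    V3 refl = d - n * (2.0f * dot(d, n));                       // reflection ray
    return 0.7f * direct + 0.3f * shade(p + n * 1e-3f, refl, depth - 1);
}

int main() {
    // One sample pixel through the middle of the image.
    std::printf("radiance = %f\n", shade({0, 0, 0}, norm({0, 0, -1}), 2));
}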


For the most part, I'd like to see this tech being applied to 3D modelling and rendering packages. A much more WYSIWYG experience would just make the task of working in those packages that much more joyful.
 

vio

Member
Well, it's raytracing, in real time, and it's got big environments. And the lighting is very complex. It just shows realistically how far we are from quality real-time raytracing in games.
 
Well, it's raytracing, in real time, and it's got big environments. And the lighting is very complex. It just shows realistically how far we are from quality real-time raytracing in games.

So this is what the OP demo would look like if instead of looking at rings on a tabletop, we were looking at a building on a street? Depressing.
 

Man

Member
Carmack's take on Ray-tracing: http://raytracey.blogspot.com.au/2011/08/john-carmack-eventually-ray-tracing.html

"Vertex fragment polygon based rasterizers are so far and away the most successful parallel computing architecture ever it’s not even funny. I mean all the research projects and everything else just haven’t added up to one fraction of the value that we get out of that. And there’s a lot of work, lots of smart people, lots of effort and lots of great results coming out of it. Eventually ray tracing will win, but it’s not clear exactly when it’s gonna be."

"...I do think that some form of raytracing, of forward tracing or reversed tracing rather than forward rendering will eventually win because there’s so many things that just get magically better there. There’s so much crap that we deal with in rasterisation with, okay let’s depth fade or fake our atmospheric stuff using environment maps, use shadows. And when you just say “well just trace a ray” a lot of these problems vanish. But one interesting thing that people say “look real-time raytracing on current hardware”, that’s what I did in OpenCL recently and I did some interesting work with that."
 

TheD

The Detective
The main difference you'll see between existing advanced engines (UE4, Luminous, CryEngine 3, etc.) and a real-time ray-tracing engine, outside of much nicer reflections - is just a better sense of cohesion in the visuals... elements like bloom and shadows don't result from separate techniques and algorithms overlaid and blended on top of each other - they come from the same thing.

No, what you described is Path Tracing, not Ray Tracing.

http://en.wikipedia.org/wiki/Path_tracing
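
For what it's worth, the practical difference can be sketched in a few lines. A hedged toy in C++ (all names invented; the "scene" is just a diffuse floor under a gradient sky): a path tracer scatters rays randomly according to the BRDF and averages many noisy samples, whereas Whitted-style ray tracing only ever fires deterministic shadow/mirror/refraction rays.

#include <cmath>
#include <cstdio>
#include <random>

// Sky is brighter straight overhead (radiance = up-component of the direction).
static float sky(float dy) { return dy > 0.0f ? dy : 0.0f; }

// Scene: an infinite diffuse floor at y = 0 with albedo 0.5 under that sky.
static bool hitFloor(float oy, float dy, float& t) {
    if (dy >= 0.0f) return false;   // pointing up: the ray escapes to the sky
    t = -oy / dy;
    return t > 1e-4f;
}

// One path-traced sample: scatter randomly off the floor per the diffuse BRDF.
static float pathSample(float oy, float dy, std::mt19937& rng) {
    std::uniform_real_distribution<float> u(0.0f, 1.0f);
    float throughput = 1.0f;
    for (int bounce = 0; bounce < 4; ++bounce) {
        float t;
        if (!hitFloor(oy, dy, t)) return throughput * sky(dy); // escaped: gather sky light
        throughput *= 0.5f;                 // floor albedo
        oy = oy + dy * t;                   // move to the hit point (lands on y = 0)
        dy = std::sqrt(u(rng));             // cosine-weighted random bounce (y component only)
    }
    return 0.0f;
}

int main() {
    std::mt19937 rng(42);
    double sum = 0.0;
    const int N = 100000;
    for (int i = 0; i < N; ++i) sum += pathSample(1.0f, -1.0f, rng);
    // Averages many noisy random-bounce samples; converges near 1/3 for this toy setup.
    std::printf("estimated radiance = %f\n", sum / N);
}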
 
But you gotta realize that the scene is also exemplary for the strengths of ray-tracing... namely reflective surfaces.


What about realistic lights and shadowing and natural materials? That's still the biggest difference that separates rasterization from CG nowadays...
 

onesvenus

Member
the nice thing about raytracing is that it scales really well to larger scenes.

It does?? Although the amount of rays you fire is the same for a fixed resolution, the ray traversal gets slower and slower with bigger scenes and ray depth, even when using space-partitioning techniques to decide which triangles to check with the ray-triangle intersection test.
 

Vic

Please help me with my bad english
This fucking blew my mind. It's a huge jump graphically compared to what we're currently seeing in games.
 

HyperionX

Member
It does?? Although the amount of rays you fire is the same for a fixed resolution, the ray traversal gets slower and slower with bigger scenes and ray depth, even when using space-partitioning techniques to decide which triangles to check with the ray-triangle intersection test.

True, but the cost of each ray traversal only increases by the log of the total number of polygons in the scene. With rasterization, rendering cost increases linearly with the number of polygons.
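
A toy sketch of that scaling claim (everything here is invented, with bounds reduced to 1D for brevity): with a bounding volume hierarchy a ray that overlaps few primitives visits on the order of 2·log2(N) nodes, while brute force - or a rasterizer submitting every triangle - has to touch all N.

#include <cstdio>
#include <vector>

struct Node { float lo, hi; int left, right; }; // 1D bounds, child indices (-1 = leaf)

// Build a balanced BVH over primitives [first, last) laid out along one axis.
static int build(std::vector<Node>& nodes, int first, int last) {
    int idx = (int)nodes.size();
    nodes.push_back({(float)first, (float)last, -1, -1});
    if (last - first > 1) {
        int mid = (first + last) / 2;
        int l = build(nodes, first, mid);
        int r = build(nodes, mid, last);
        nodes[idx].left = l;
        nodes[idx].right = r;
    }
    return idx;
}

// Count nodes visited by a "ray" that only overlaps position x.
static int traverse(const std::vector<Node>& nodes, int idx, float x) {
    const Node& n = nodes[idx];
    if (x < n.lo || x >= n.hi) return 1;  // tested this node's bounds, prune subtree
    if (n.left < 0) return 1;             // leaf: one primitive intersection test
    return 1 + traverse(nodes, n.left, x) + traverse(nodes, n.right, x);
}

int main() {
    for (int N : {1000, 1000000}) {
        std::vector<Node> nodes;
        int root = build(nodes, 0, N);
        // Brute force would do N tests; the BVH visits roughly 2*log2(N) nodes.
        std::printf("N = %7d  brute force tests = %7d  BVH nodes visited = %d\n",
                    N, N, traverse(nodes, root, 0.5f));
    }
}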
 
10 fps on two 580s? damn.

But imagine this thing running on 3x 580's :O

Feels like we're so, so close, doesn't it? And if we're this close, next-gen machines need to have this tech, because as nice as Epic's global illumination tech looks, it doesn't get close to this.
 

EatChildren

Currently polling second in Australia's federal election (first in the Gold Coast), this feral may one day be your Bogan King.
There's a tiny real time ray tracing demo downloadable here, for those interested. Third one from the top.

 

.nimrod

Member
It would be a lot more impressive if it used some form of BRDF to render the materials instead of making everything perfectly specular. Mirror surfaces take the fewest samples to look good.
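
A quick hedged illustration of why mirrors are the cheap case (toy numbers, nothing from the demo): a perfect mirror has exactly one reflected direction, so one ray suffices, while a diffuse or glossy BRDF has to average many sampled directions before the noise settles.

#include <cmath>
#include <cstdio>
#include <random>

// Incoming light: a simple gradient sky, brighter straight up.
static float sky(float cosTheta) { return cosTheta > 0.0f ? cosTheta : 0.0f; }

int main() {
    std::mt19937 rng(1);
    std::uniform_real_distribution<float> u(0.0f, 1.0f);

    // Mirror: the single reflected direction is known, so one "sample" suffices.
    float mirror = sky(0.7f);

    // Diffuse/glossy: Monte Carlo average over cosine-sampled directions;
    // few samples -> visible noise, many samples -> converges (to 2/3 here).
    for (int samples : {4, 64, 4096}) {
        double sum = 0.0;
        for (int i = 0; i < samples; ++i) sum += sky(std::sqrt(u(rng)));
        std::printf("diffuse estimate with %4d samples: %f\n", samples, sum / samples);
    }
    std::printf("mirror needs 1 sample: %f\n", mirror);
}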

That's a gob-smacking bit of tech demo.
[...]
For the most part, I'd like to see this tech being applied to 3D modelling and rendering packages. A much more WYSIWYG experience would just make the task of working in those packages that much more joyful.

At least that's already happening with renderers like Octane Render or Blender Cycles.
 
True, but the cost of each ray traversal only increases by the log of the total number of polygons in the scene. With rasterization, rendering cost increases linearly with the number of polygons.
The total rendering cost/time doesn't increase linearly with the number of polygons, since rasterization is only a small part of the total cost, and the "only increases by the log of the total number of polygons in the scene" part only applies to static polygons.

For animated characters you need to process every polygon either way, because you don't know beforehand where it will end up.
 

Dire

Member
Looks great aside from the DoF effect. I can only assume that was added for performance purposes since I refuse to believe any human alive could see that as anything but distracting and just plain ugly.
 

Perkel

Banned
Was this ray-tracing (and real-time HDR) in the GT5:prologue (garage screens)?

Nope. PD are amazing because of their texture and shader work.

And GT5 has FP32 HDR (128-bit), which is amazing if you know what it means (and a lot of the wow factor comes from that).
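
Just to spell out the arithmetic behind that claim (the format names in the comment are the D3D equivalents, purely for reference):

#include <cstdio>

// A float render target stores 4 channels (RGBA). At 32 bits per channel that
// is 128 bits per pixel, versus the more common half-float (FP16) target at
// 64 bits. (In D3D terms: R32G32B32A32_FLOAT vs R16G16B16A16_FLOAT.)
int main() {
    const int channels = 4;                                       // R, G, B, A
    std::printf("FP32 target: %d bits/pixel\n", channels * 32);   // 128
    std::printf("FP16 target: %d bits/pixel\n", channels * 16);   //  64
}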
 

goomba

Banned
Lol, prior to every gen since the N64 there has been hype about how "real-time ray tracing" will soon be realized.
 

HyperionX

Member
The total rendering cost/time doesn't increase linearly with the number of polygons, since rasterization is only a small part of the total cost, and the "only increases by the log of the total number of polygons in the scene" part only applies to static polygons.

For animated characters you need to process every polygon either way, because you don't know beforehand where it will end up.

You rarely move the whole scene. Only a few characters/items at a time are moving.

Right now, animated scenes are still the land of rasterization, but I hear they are making progress on updating acceleration structures for ray traversals.
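
A minimal sketch of what "updating acceleration structures" can mean in practice (invented names, 1D bounds for brevity): keep the BVH topology and just refit the node bounds bottom-up when geometry animates, instead of rebuilding the whole tree.

#include <algorithm>
#include <cstdio>
#include <vector>

struct Node {
    float lo, hi;          // 1D bounds for brevity; real BVHs store 3D boxes
    int left, right;       // child node indices, -1 for leaves
    int prim;              // primitive index for leaves, -1 for inner nodes
};

// Recompute bounds without touching the tree structure.
static void refit(std::vector<Node>& nodes, const std::vector<float>& primPos, int idx) {
    Node& n = nodes[idx];
    if (n.prim >= 0) {                       // leaf: bound the (possibly moved) primitive
        n.lo = primPos[n.prim] - 0.5f;
        n.hi = primPos[n.prim] + 0.5f;
        return;
    }
    refit(nodes, primPos, n.left);
    refit(nodes, primPos, n.right);
    n.lo = std::min(nodes[n.left].lo, nodes[n.right].lo);   // merge child bounds
    n.hi = std::max(nodes[n.left].hi, nodes[n.right].hi);
}

int main() {
    // Two primitives under one root; primitive 1 then animates to the right.
    std::vector<Node> nodes = {
        {0, 0, 1, 2, -1},   // root
        {0, 0, -1, -1, 0},  // leaf for primitive 0
        {0, 0, -1, -1, 1},  // leaf for primitive 1
    };
    std::vector<float> primPos = {0.0f, 1.0f};
    refit(nodes, primPos, 0);
    std::printf("root bounds before move: [%.1f, %.1f]\n", nodes[0].lo, nodes[0].hi);

    primPos[1] = 5.0f;      // animation moves primitive 1
    refit(nodes, primPos, 0);
    std::printf("root bounds after  move: [%.1f, %.1f]\n", nodes[0].lo, nodes[0].hi);
}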
 
Was this ray-tracing (and real-time HDR) in the GT5:prologue (garage screens)?

As far as I can tell it's just IGN bullshit which still somehow persists to this day. Is there anything there which looks different compared to the on-track shots?


That's a gob-smacking bit of tech demo.

But you gotta realize that the scene is also exemplary for the strengths of ray-tracing... namely reflective surfaces.

Well, refractive in this case. If there's any ray tracing going on here that isn't pointless, it's for the refractions in translucent surfaces, because the only reflections seem to be of an HDRI map and the old mirrored-floor trick, both of which lots of games do already.
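
For reference, the "old mirrored floor trick" amounts to rendering the scene a second time reflected across the floor plane and blending it in under the floor; a tiny hedged sketch of the reflection math (not from any particular game):

#include <cstdio>

struct V3 { float x, y, z; };

// Reflect point p across the plane dot(n, x) + d = 0 (n assumed unit length):
// p' = p - 2 * (dot(n, p) + d) * n. For the plane y = 0 this just negates y.
static V3 reflectAcrossPlane(V3 p, V3 n, float d) {
    float k = 2.0f * (n.x * p.x + n.y * p.y + n.z * p.z + d);
    return {p.x - k * n.x, p.y - k * n.y, p.z - k * n.z};
}

int main() {
    V3 floorNormal = {0.0f, 1.0f, 0.0f};          // floor is the plane y = 0
    V3 vertex = {1.0f, 2.0f, -3.0f};
    V3 mirrored = reflectAcrossPlane(vertex, floorNormal, 0.0f);
    // The mirrored copy ends up below the floor: (1, -2, -3).
    std::printf("mirrored vertex: (%.1f, %.1f, %.1f)\n", mirrored.x, mirrored.y, mirrored.z);
}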
 

pottuvoi

Banned
Not sure how much of the demo is ray-traced.
From the get-go it is clear that it uses traditional image-based DoF like games do, rather than sending rays from the area of a lens (as it's properly done in ray tracers).
My guess is that it's either a simple ray tracer with refractions/reflections and shadows, or a hybrid renderer where reflections/refractions and perhaps shadows are traced.
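
A hedged sketch of the lens-sampling approach being described (invented names, square aperture for brevity): each camera ray starts at a random point on the aperture and is aimed at the in-focus point on the focal plane, which is what produces real depth of field rather than a post-process blur of the finished image.

#include <cmath>
#include <cstdio>
#include <random>

struct Ray { float ox, oy, oz, dx, dy, dz; };

// Generate one thin-lens camera ray for a pinhole direction (dx, dy, -1).
static Ray lensRay(float dx, float dy, float aperture, float focalDist, std::mt19937& rng) {
    std::uniform_real_distribution<float> u(-1.0f, 1.0f);
    // Where the pinhole ray would be in perfect focus.
    float fx = dx * focalDist, fy = dy * focalDist, fz = -focalDist;
    // Jitter the ray origin across the lens aperture (square for simplicity).
    float lx = u(rng) * aperture, ly = u(rng) * aperture;
    float ddx = fx - lx, ddy = fy - ly, ddz = fz;
    float len = std::sqrt(ddx * ddx + ddy * ddy + ddz * ddz);
    return {lx, ly, 0.0f, ddx / len, ddy / len, ddz / len};
}

int main() {
    std::mt19937 rng(7);
    // Average several lens samples per pixel; objects off the focal plane land
    // in different places per sample, which is what produces real bokeh.
    for (int i = 0; i < 3; ++i) {
        Ray r = lensRay(0.1f, 0.0f, 0.05f, 10.0f, rng);
        std::printf("origin (%.3f, %.3f) dir (%.3f, %.3f, %.3f)\n",
                    r.ox, r.oy, r.dx, r.dy, r.dz);
    }
}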

And yes, ray-tracing has been possible for a long time, and simple versions have been used in games for years. (All those parallax mapping methods are simple tracers.)
The Samaritan demo used raytracing of billboards for specular reflections (up to 4? quads per pixel).
Unreal Engine 4 uses cone tracing of voxels to create that good-looking global illumination and specular.
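
And a toy version of the "parallax mapping methods are simple tracers" aside (invented height function, CPU code rather than shader code from any game): parallax occlusion mapping marches the view ray through a heightfield until it dips below the surface, which is the same hit-finding idea as ray tracing.

#include <cmath>
#include <cstdio>

// Toy heightfield: a bump in the middle of the texture, heights in [0, 1].
static float height(float u, float v) {
    float du = u - 0.5f, dv = v - 0.5f;
    return std::exp(-40.0f * (du * du + dv * dv));
}

// March along the view ray (in tangent space) and return the UV where the ray
// first passes below the heightfield.
static void parallaxTrace(float u, float v, float viewX, float viewY, float viewZ,
                          float scale, float& outU, float& outV) {
    const int steps = 32;
    float rayH = 1.0f;                          // start at the top of the height volume
    float du = -viewX / viewZ * scale / steps;  // uv step per iteration
    float dv = -viewY / viewZ * scale / steps;
    float dh = 1.0f / steps;
    for (int i = 0; i < steps; ++i) {
        if (rayH <= height(u, v)) break;        // ray dipped below the surface: hit
        u += du; v += dv; rayH -= dh;
    }
    outU = u; outV = v;
}

int main() {
    float u, v;
    parallaxTrace(0.40f, 0.50f, 0.6f, 0.0f, 0.8f, 0.1f, u, v);
    std::printf("shifted sample UV: (%.3f, %.3f)\n", u, v);
}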

As far as I can tell it's just IGN bullshit which still somehow persists to this day. Is there anything there which looks different compared to the on-track shots?
There certainly wasn't a single piece of evidence for it in the garage view, which is a fail considering what low-hanging fruit a proper sharp reflection is.
 

orioto

Good Art™
I'm a noob but I have to ask: will it be that interesting to use if it's this resource-heavy, when we're already achieving good results with current tools?
 
As far as I can tell it's just IGN bullshit which still somehow persists to this day. Is there anything there which looks different compared to the on-track shots?

It's not IGN bullshit. I remember someone from PD saying in an interview that not only was it using tracing, it was also full 1080p (unlike during gameplay, where it's upscaled), and how proud they were of that scene.
 