
It's About Time - Photorealistic game demo using Brigade 2 path tracing engine

-SD-

Banned
EDIT: Looks like the demo has been pulled.

There's a playable path-traced game that uses the Brigade 2 engine.

Direct download: http://igad.nhtv.nl/~bikker/files/AboutTime R1.zip
Project site: http://igad.nhtv.nl/~bikker/

This game is playable on Nvidia hardware only, and will use all your GPUs if you have more than one installed. A single high-end GPU should run the game quite well; more GPUs will reduce the noise.

To be honest, it's not that "photorealistic" because of the heavy noise (I only have one GPU, a GTX 570, and a six-year-old CPU), but the feeling I get from this is that we're now closer than ever to playing ray-traced games. It's About Time, indeed.

This is how it's supposed to look if you've got enough GPU power:

Siggraph 2012 video: http://www.youtube.com/watch?v=n0vHdMmp2_c (using cloud computing)
Screens: http://raytracey.blogspot.co.nz/2012/08/real-time-path-traced-brigade-demo-at.html



There's also a downloadable version of the Brigade 2 engine, if you want to build your own stuff with it.

http://igad.nhtv.nl/~bikker/downloads.htm
 

Dennis

Banned
I have two GTX580 GPUs with 3GB VRAM each.

Just hook this straight into my veins.

I only need like 15 fps to play anyway.
 

injurai

Banned
Wow, so it's actually collecting virtual rays of light to render the scene... If they could tone down the grain, this could revolutionize lighting engines.
 

Jackpot

Banned
What determines the noise level? Couldn't you have it delay rendering the frame until it has all the info, even if that means it plays in slow motion?
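For context on what sets the noise level: each pixel is a Monte Carlo average of random light-path samples, and the error only shrinks with the square root of the sample count. A toy sketch (the brightness distribution is made up for illustration, nothing from the actual engine):

```python
import random
import statistics

def pixel_sample():
    # one random light-path sample for a pixel; pretend the true
    # brightness is the mean of this distribution (0.5 here)
    return random.random()

def render_pixel(n_samples):
    # a path tracer's pixel value: the average of n random samples
    return sum(pixel_sample() for _ in range(n_samples)) / n_samples

def noise(n_samples, trials=2000):
    # standard deviation of the pixel estimate across many renders
    return statistics.pstdev(render_pixel(n_samples) for _ in range(trials))

random.seed(1)
n1, n4 = noise(1), noise(4)
# quadrupling the samples only roughly halves the noise (n1 / n4 ~ 2)
```

So you could wait for a converged frame, but halving the grain costs 4x the render time, which is exactly the slow-motion trade-off being asked about.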
 

Stallion Free

Cock Encumbered
Wow, so it's actually collecting virtual rays of light to render the scene... If they could tone down the grain, this could revolutionize lighting engines.

The excitement isn't so much over the method (this tech has been studied for a longggg time) as it is over it running real-time on consumer hardware.
 

dab0ne

Member
Wow. Soon the models will be imported straight from ZBrush instead of normal-mapping low-poly characters too. Pretty amazing stuff.
 

mrklaw

MrArseFace
couldn't it use 1/10 or 1/100 the number of rays, and just apply the lighting to a wider area? I don't see why you need full ray tracing
 

Wortany

Member
It's amazing to get real-time ray-tracing, but so much information gets lost right now (lack of GPU power) that it's hardly visually appealing.
This thing eats my GTX 680 for breakfast, and I don't think a second 680 would satisfy it either. It still demands a boatload of processing.

Still kudos to those guys, I do like what I'm seeing for now. It's amazing to have this running in at least some capacity.

Rays Max, Bounce Minimum:

Rays Minimum, Bounce max before slider goes back (around 4/6):


Would love to see what an SLI gets out of this.
 

n0n44m

Member
doesn't really seem to work on my 670 SLI with the 306.97 drivers

menu runs OK, turning the sliders up really maxes out the GPU usage, but when I start a new game it is 0.5 fps with 30% usage no matter what the sliders are set to ...
 

injurai

Banned
The excitement isn't so much over the method (this tech has been studied for a longggg time) as it is over it running real-time on consumer hardware.

So let's say in another 5-10 years: would hardware be capable of rendering this way without the grain? Or is it just not possible? It seems like if more of these rays were captured, they would properly develop the image, kind of like a video camera.
 

JNT

Member
couldn't it use 1/10 or 1/100 the number of rays, and just apply the lighting to a wider area? I don't see why you need full ray tracing

In ray tracing there is no distinction between the contents of the screen and the lighting. This would mean that applying light to a bigger area is the same as rendering at a lower resolution. This would increase performance, but you would get a lower-res look.
 
Dat grain. Running through a shadowy area on my GTX470 is like looking at the sky and trying to see something in the stars.
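That grain is the sampling error made visible. A toy 2D soft-shadow estimate (all geometry invented for illustration) shows how a handful of shadow rays gives a speckled value that only settles down with many more samples:

```python
import random

# Toy 2D scene: a point P = (0.375, 0) under an area light (the segment
# y = 1, x in [0, 1]); an occluder blocks x in [0.25, 0.5] at y = 0.5.
def occluded(x_light, px=0.375):
    # the shadow ray from P to (x_light, 1) crosses y = 0.5 at x_mid
    x_mid = px + 0.5 * (x_light - px)
    return 0.25 <= x_mid <= 0.5

def soft_shadow(n_rays):
    # fraction of random shadow rays that reach the light unblocked
    hits = sum(not occluded(random.random()) for _ in range(n_rays))
    return hits / n_rays

random.seed(0)
few = soft_shadow(8)         # noisy: can land well off the true value
many = soft_shadow(100_000)  # converges toward the analytic answer, 0.5
```

With 8 rays per pixel, neighboring pixels get visibly different answers to the same question; that speckle is the grain in shadowy areas.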
 

Lord Error

Insane For Sony
I saw this video a while ago. I think it does a terrible job of showcasing the kind of thing this would be good for, which is high-quality dynamic lighting and shadowing. As it is, it looks like a noisier version of environments we've already seen in traditional games, where complex lighting is partially precomputed and static.
 

mrklaw

MrArseFace
In ray tracing there is no distinction between the contents of the screen and the lighting. This would mean that applying light to a bigger area is the same as rendering at a lower resolution. This would increase performance, but you would get a lower-res look.

ah, I was figuring they were using it just for calculating lighting and applying it to normally modeled and textured scenes.
 

JNT

Member
ah, I was figuring they were using it just for calculating lighting and applying it to normally modeled and textured scenes.

Can't say I'm well read in this particular project, so I guess they could be using a hybrid approach.
 

Stallion Free

Cock Encumbered
So let's say in another 5-10 years: would hardware be capable of rendering this way without the grain? Or is it just not possible? It seems like if more of these rays were captured, they would properly develop the image, kind of like a video camera.

Oh yeah, as hardware gets better, the performance with this will get better as well. It's hard to say if 5 years will be enough time to optimize it and get it running well on consumer-grade hardware, but I would bet it happens within 10.
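A back-of-envelope version of that timeline, with every number an assumption (roughly 1 sample/pixel today, ~1024 needed for a visually clean frame, GPU throughput doubling every two years):

```python
import math

current_spp = 1      # roughly what the demo manages per frame (assumed)
target_spp = 1024    # ballpark samples/pixel for a clean frame (assumed)
doubling_years = 2   # assumed GPU throughput doubling period

doublings = math.log2(target_spp / current_spp)        # 10 doublings needed
years = doublings * doubling_years                     # ~20 years of raw scaling
noise_reduction = math.sqrt(target_spp / current_spp)  # 32x less grain
```

By raw hardware scaling alone that lands past the 10-year guess, which is why noise filtering and smarter sampling matter at least as much as faster GPUs.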
 

Spazznid

Member
Could have sworn I've seen both this thread and this demo about 10 times each throughout the last year or so....
 
Wow. Soon the models will be imported straight from ZBrush instead of normal-mapping low-poly characters too. Pretty amazing stuff.

Nah, we're years away from that. No sane person would even do that; shading millions of polygons is inefficient. That, and deforming meshes that dense in realtime is next to impossible. Normal maps and displacement maps are here to stay.
 

Krilekk

Banned
I don't see the beauty in it. There was a realtime raytracing demo of Quake 3 Arena some years ago that was much more impressive, TBH. This just seems like a proof of concept of scaling raytracing to the hardware (basically, showing what can be rendered in time and adding noise to everything else). It's simply ugly, even on high-end hardware.
 

Brera

Banned
Am I missing something?

Looks like grainy shit. Bad lighting. Boxy neat looking buildings. Crap water...not photorealistic at all!
 

E-Cat

Member
New demo

Amazing how fast they're making progress, path tracing with very little noise seems a real possibility next next-gen!

Some details:

- City scene in the Brigade engine, made by Hayssam Keilany of iCEnhancer fame: 750 instanced animated characters (30k triangles each, 22.5 million animated triangles in total), all of them physics-driven with Bullet physics, in a 600k-triangle city

- The Piazza: 16,384 instances of an 846k-triangle city, 13.8 billion triangles in total

- Interior scene in Octane Render, created by Enrico Cerica: 1 million triangles

- Rendered on a couple of Titans: 40 fps @ 720p, 25 fps @ 1080p
 

schuey7

Member
New demo

Amazing how fast they're making progress, path tracing with very little noise seems a real possibility next next-gen!

Some details:

- City scene in the Brigade engine, made by Hayssam Keilany of iCEnhancer fame: 750 instanced animated characters (30k triangles each, 22.5 million animated triangles in total), all of them physics-driven with Bullet physics, in a 600k-triangle city

- The Piazza: 16,384 instances of an 846k-triangle city, 13.8 billion triangles in total

- Interior scene in Octane Render, created by Enrico Cerica: 1 million triangles

- Rendered on a couple of Titans: 40 fps @ 720p, 25 fps @ 1080p

The reflections in the interior scene are the part that made me say wow. Hope the team makes further progress.
 

E-Cat

Member
The reflections in the interior scene are the part that made me say wow. Hope the team makes further progress.
Another cool thing about a path tracer is that it doesn't care about how much geometry you throw at it, as long as the scene fits in the VRAM. Can't wait for fields full of instanced grass.
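That geometry claim is mostly instancing at work: the Piazza stores the 846k-triangle mesh once plus a transform per instance, not 13.8 billion triangles. A rough memory comparison, where the per-triangle and per-transform sizes are assumptions (36 bytes per triangle with no vertex sharing, a 4x4 float matrix per instance):

```python
mesh_tris = 846_000      # triangles in the base mesh (from the demo notes)
instances = 16_384       # instances in the Piazza scene
tri_bytes = 3 * 3 * 4    # 3 vertices x 3 floats x 4 bytes (assumed layout)

total_tris = mesh_tris * instances             # ~13.9 billion triangles
naive_gb = total_tris * tri_bytes / 2**30      # storing every copy: ~465 GB
transform_bytes = 16 * 4                       # one 4x4 float matrix each
instanced_mb = (mesh_tris * tri_bytes
                + instances * transform_bytes) / 2**20  # ~30 MB
```

So a scene that would be hundreds of gigabytes stored naively fits in a few tens of megabytes, which is why "as long as it fits in VRAM" does so much work in that sentence.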
 

ZealousD

Makes world leading predictions like "The sun will rise tomorrow"
Wow, so it's actually collecting virtual rays of light to render the scene... If they could tone down the grain, this could revolutionize lighting engines.

Ray tracing is old as dirt. It's just computationally expensive as all fuck. It's been impractical for real-time applications for a long time. It's just much less taxing if you cheat.

In an alternate universe, videogames have had ray tracing since 1999, but won't have bump mapping or complicated meshes until 2020.

So in other words, video games look like shit in this alternate universe?
 

baphomet

Member
Beautiful. Hopefully the gen after next will have some sort of path tracing abilities. The interior portion was amazing.
 

sTeLioSco

Banned
[images: cubecity9.png, 397318_472898982744587_1257142561_n.jpg]


https://www.facebook.com/photo.php?...7799.111426.198351220199366&type=3&permPage=1
 

Blizzard

Banned
The lighting in the first outdoor scene, and the indoor scene with the chess pieces, looks amazingly realistic to me. I liked changing the material in realtime as well.

Moving the light outside with the repeated building models was neat, but I wonder how computational cost would scale if there were lots of small lights to dynamically render in a scene as well. I don't know a lot about ray tracing.
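On the many-lights question: a common path-tracing trick (light sampling, not something confirmed for this particular demo) is to pick one light at random per sample and reweight by the pick probability, so per-sample cost stays flat as lights are added; the price is extra noise instead. A sketch with made-up intensities:

```python
import random

# Made-up light intensities as seen from one shading point.
lights = [0.2, 1.0, 0.05, 0.4, 0.35]

def one_sample(lights):
    # shoot ONE shadow ray toward a uniformly chosen light, then divide
    # by the pick probability 1/K (i.e. multiply by K) to stay unbiased
    i = random.randrange(len(lights))
    return lights[i] * len(lights)

def estimate(lights, n_samples):
    # average many one-light samples; converges to sum(lights)
    return sum(one_sample(lights) for _ in range(n_samples)) / n_samples

random.seed(2)
est = estimate(lights, 50_000)  # approaches sum(lights) = 2.0
```

So adding more small lights doesn't make each frame slower per sample, it just means more samples are needed before the speckle settles.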
 