Aintitcool
After watching the new Nvidia Siggraph conference I finally see how VR could really be the future. Of course we are still very far from that at 120Hz. So maybe in 20 years.
Machine learning has already resulted in significant improvements to reducing ray samples. Foveated Rendering is even more promising.

This thread is kind of weird; what examples did you see that demonstrated that ray tracing inside the VR space is the future of VR? A link? Summary? Anything?
In any case: ray tracing can produce some incredible visuals, no question, but the performance cost is obscene. It's why we largely don't use it. Utilising Nvidia's new non-consumer-grade hardware, we're now able to produce visuals on par with early original Xbox games in real time:
The shadow, lighting, and reflection quality is significantly better than that era's, but overall visual complexity takes an enormous hit. We're decades away from consumer-level hardware that could meet gaming-industry standards. Outside of real-time rendering, we see some pretty terrific results in terms of modelling and development:
This is enterprise-level stuff, but it's interesting nonetheless.
This type of technology isn't outside of our reach; it's just that the quality of it is so far beneath what we'd be prepared to accept:
Ray tracing in tandem with typical rendering techniques could yield some fairly incredible results. We see some of this in global illumination solutions and reflection passes in modern engines. But shifting all rendering to ray tracing? Not likely; it's a waste of the hardware for the results, and will be for decades. The horsepower needed to pull off proper ray tracing could be spent faking the same output in a game engine with so much overhead to spare that developers would be spoiled for choice.
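To put rough numbers on the hybrid argument: a 90 Hz VR target leaves about 11 ms per frame, and every millisecond given to rays comes out of that budget. All per-pass costs below are made-up placeholders for illustration, not measurements from any engine or demo.

```python
# Hypothetical hybrid frame budget at 90 Hz (all pass costs are invented).
budget_ms = 1000.0 / 90.0  # ~11.1 ms per frame
passes_ms = {
    "g-buffer raster": 3.0,
    "ray-traced reflections": 2.5,
    "ray-traced shadows": 2.0,
    "denoise + temporal AA": 1.5,
    "post + compositor": 1.0,
}
spent = sum(passes_ms.values())
print(f"budget {budget_ms:.1f} ms, spent {spent:.1f} ms, "
      f"headroom {budget_ms - spent:.1f} ms")
```

The point of the sketch: with only a couple of ray-traced passes the budget is nearly gone, which is why "fake it with rasterization and keep the overhead" stays attractive.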
That first video linked demonstrates all the problems reducing ray-samples brings (basically it's taking steps backwards in temporal stability again). And it's not like we're currently doing great in that regard - lots of standard rasterization stuff that is accepted as good/great quality on 2D panels looks dreadfully poor in VR.

Machine learning has already resulted in significant improvements to reducing ray samples.
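The trade-off both posts are arguing about can be sketched with back-of-the-envelope math: Monte Carlo path-tracing noise falls roughly as 1/sqrt(N) in the sample count, so the low sample budgets that make real-time rates possible are exactly the gap a learned denoiser has to fill. The sample counts below are illustrative assumptions, not figures from the demos.

```python
import math

def relative_noise(samples_per_pixel: int) -> float:
    """Relative standard error of a Monte Carlo estimate, ~ 1/sqrt(N)."""
    return 1.0 / math.sqrt(samples_per_pixel)

# Illustrative budgets: an offline reference render vs. a real-time one.
offline = relative_noise(1024)
realtime = relative_noise(4)
print(f"cutting 1024 spp to 4 spp raises noise {realtime / offline:.0f}x")
```

That 16x noise gap is what the temporal-stability complaint is pointing at: the denoiser hides it spatially, but at the risk of reintroducing artifacts over time.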
Most of the benefits of this will be eaten by catching up to "retina" resolution in VR HMDs - basically just increasing raw pixel counts. The major short-term benefit of RT acceleration is actually making non-linear projection performant - which will benefit standard rasterization too.

Foveated Rendering is even more promising.
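A toy sketch of why foveated rendering is promising: only a small foveal region needs full shading rate, so total work drops sharply even with a modest reduction in the periphery. The region size and rates here are illustrative assumptions, not figures for any real HMD or eye tracker.

```python
def shaded_fraction(foveal_frac: float, periphery_rate: float) -> float:
    """Fraction of full-resolution shading work performed when the
    periphery is shaded at a reduced rate."""
    return foveal_frac + (1.0 - foveal_frac) * periphery_rate

# Assume the fovea covers ~5% of the frame at full rate and the
# periphery is shaded at a quarter rate.
print(shaded_fraction(0.05, 0.25))
```

Under those assumptions, shading cost drops to under a third of naive full-resolution rendering - headroom that could be spent on resolution or on rays.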
Can you post some examples of the machine learning output and how it’s improved? I haven’t seen much in this regard.

Machine learning has already resulted in significant improvements to reducing ray samples. Foveated Rendering is even more promising.
Full raytracing in VR is easier to achieve than you think.
Can you post some examples of the machine learning output and how it’s improved? I haven’t seen much in this regard.
Then again it's doing what it's doing (namely throwing 300G polys at the frustum) with minimal effort at temporal AA. It's not like a rasterizer would achieve temporal AA effortlessly -- it's just the fact that a rasterizer would be plainly unable to handle the scene, so a LOD system would be mandatory if the scene were to have any chance of being rasterized at this fps. This raytraced scene uses no LOD, but nothing precludes it from doing so.

That first video linked demonstrates all the problems reducing ray-samples brings (basically it's taking steps backwards in temporal stability again).
Well that's been the promise (and the main argument for eventual crossing of streams) for ray tracing over other approaches since day one - scaling to much larger data sets/geometric complexity without other "tricks". In fact it's been over a decade (that I know of) since the first demos of RT outperforming rasterization under large-data-set (contrived) scenarios. I'd argue not having to rely on LOD as much should also lead to improvements in temporal stability that I've been complaining about for the past 15 years - but I digress.

Then again it's doing what it's doing (namely throwing 300G polys at the frustum) with minimal effort at temporal AA.
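The scaling claim can be sketched: with an acceleration structure such as a BVH, per-ray traversal cost grows roughly with the log of the triangle count, while brute-force rasterization work grows linearly. Constants and culling tricks are deliberately omitted - only the shapes of the curves are meant, and the triangle counts are illustrative.

```python
import math

def bvh_traversal_steps(tris: int) -> float:
    # Rough O(log N) model of per-ray BVH traversal depth.
    return math.log2(tris)

def raster_work(tris: int) -> float:
    # Rough O(N) model of brute-force triangle submission.
    return float(tris)

# Scaling a scene from 3M triangles up to the "300G polys" figure above:
small, huge = 3_000_000, 300_000_000_000
print(f"BVH cost grows {bvh_traversal_steps(huge) / bvh_traversal_steps(small):.1f}x")
print(f"raster cost grows {raster_work(huge) / raster_work(small):,.0f}x")
```

A 100,000x larger scene costs the ray tracer under 2x more per ray in this model, which is the "no LOD needed" argument in miniature.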
Poor people who think real-time raytracing is a near-term thing...
You won't see a SINGLE experience, let alone game, using RTRT until at least 10 years in the future, and this is an optimistic guess.
The only thing you might see are pre-rendered "holograms" like ORBX or Google Seurat, but as far as I'm concerned they're vaporware.
However, if you want the one sure way to get a "snapshot" of what the future will look like, get a VR headset and download a stereo cubemap. This is one of my favorite prospective demos to get clients hyped on VR and gaming's future.
lol what are you on about?
That GPU costs $10,000 (correction: apparently it will also run on the $6,300 version), and it's a Quadro, which is a workstation GPU, not a consumer GPU. It is going to be a LONG time before that level of performance is in an affordable consumer GPU (and this is just the first gen of this technology).
And those demos aren't rendering a full game environment, either.
20 years???

After watching the new Nvidia Siggraph conference I finally see how VR could really be the future. Of course we are still very far from that at 120Hz. So maybe in 20 years.
It runs on a 2080 Ti at $999. You guys are smoking some weird shit.
We are about 1 year from 7nm, which will probably at least double the performance. Then AMD will probably be ready with their 7nm raytracing answer, and prices could come down even more. From 7nm the transition to smaller nodes will go faster because of EUV, at least down to 3nm. The future of graphics is looking bright.
My comment was pre-conference -- at that point, it had only been demoed on the Quadros. Now it seems that the consumer cards can run the demos as well.
EDIT: It is important to note that the intention with these cards isn't to run fully ray traced games-- the idea is to ray trace certain components, and to continue using existing graphical techniques for other things. Like the Tomb Raider example; they're ray tracing shadows, but lighting and AO are still not ray traced.
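The split described in that edit can be expressed as a per-pass backend choice. The pass names and assignments below are illustrative, loosely following the Tomb Raider example above (only shadows on the RT path); real engines slice their pipelines differently.

```python
# Hypothetical per-pass backend assignment in a hybrid renderer.
pipeline = {
    "primary visibility": "raster",
    "shadows": "ray traced",      # the one RT effect in the example above
    "direct lighting": "raster",
    "ambient occlusion": "raster",
}
rt_passes = [name for name, backend in pipeline.items()
             if backend == "ray traced"]
print(rt_passes)  # -> ['shadows']
```

Moving a single entry from "raster" to "ray traced" per hardware generation is essentially the adoption path these cards are betting on.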
Are you sure about that? You do know that AI is used with ray tracing to do full scenes now. The demo they had with the two robots was fully ray traced and ran on a 2080 Ti. For games it's hybrid rendering for now, but with the upcoming 7nm and then 5nm (2023), games will probably be close to fully ray traced.
lol what are you on about?
The title of the thread is about ray tracing.

20 years???
If my Vive had foveated rendering sensors and functionality, I bet I could do it without an RTX card right now just fine on my system.
lol what are you on about?
Oh I just happen to have worked on engines for years, and trust me, you won't see a single RTRT game until at least 10 years in the future (and that's optimistic).
Are you sure about that? You do know that AI is used with ray tracing to do full scenes now.
There is actually lots of delay in that demo; many rays and reflections don't travel at the speed of light (which is impossible anyway, but they get close digitally). Also the headlights have the wrong materials, and to my eyes it still looks fake. But Porsche made that demo, and I give them a break because they are car makers, not real-time graphics experts.
The title of the thread is about ray tracing.
We have had 2D screens for how long? And you think they are eh? Nah bro, they work just fine. Although 3D OLED screens were the fucking best.

Funny, I was just about to think that the only next-gen jump to be made was VR
Super high res at 120fps
2d is getting kinda eh in jumps
Nah, that's what I meant to say, silly.

We have had 2D screens for how long? And you think they are eh? Nah bro, they work just fine. Although 3D OLED screens were the fucking best.
A few posts back you said it would take 10 years for an RTRT experience, and today they have shown several? Are you a car mechanic?