
Dedicated ray tracing cores for Xbox Scarlett?

Being the non-tech-savvy person that I am, how much would this actually benefit games? That's assuming it's true and that it's as interesting as it sounds. I'm assuming the PS5 would have something similar, even though the article tries to slant it as something only on the Xbox. If this is old news, just disregard the thread. I'm a lazy person. Cheers.

 

Journey

Banned
Being the non-tech-savvy person that I am, how much would this actually benefit games? That's assuming it's true and that it's as interesting as it sounds. I'm assuming the PS5 would have something similar, even though the article tries to slant it as something only on the Xbox. If this is old news, just disregard the thread. I'm a lazy person. Cheers.



There was a rumor that this was the case, based on comments made by both Sony and MS: it seemed like Scarlett included this, while Sony made no mention of it. The rumor goes back as far as the announcement, but this is the first I've heard of a developer actually speaking about it.
 

Jigsaah

Gold Member
I'm not sure... I mean, NVIDIA had these ray tracing cores in their RTX graphics cards at first... then all of a sudden, now GTX cards can do it too? It makes it sound like ray tracing doesn't require specific hardware to run, but maybe having cores dedicated to it makes for a better implementation?

Hell, I dunno.
 

Journey

Banned
I'm not sure... I mean, NVIDIA had these ray tracing cores in their RTX graphics cards at first... then all of a sudden, now GTX cards can do it too? It makes it sound like ray tracing doesn't require specific hardware to run, but maybe having cores dedicated to it makes for a better implementation?

Hell, I dunno.


When you have dedicated cores for any specific task, it usually means better performance when a game heavily uses said task. What is the performance difference between GTX cards and RTX cards with dedicated cores?
 

Jigsaah

Gold Member
When you have dedicated cores for any specific task, it usually means better performance when a game heavily uses said task. What is the performance difference between GTX cards and RTX cards with dedicated cores?
I'm not sure. I've never tried to use it, mainly because I don't have the rig to run it properly. I'm on a 1070 Ti with an i5-8400. If I wasn't at work right now, I'd take the time to look it up.
 

pawel86ck

Banned
I'm not sure... I mean, NVIDIA had these ray tracing cores in their RTX graphics cards at first... then all of a sudden, now GTX cards can do it too? It makes it sound like ray tracing doesn't require specific hardware to run, but maybe having cores dedicated to it makes for a better implementation?

Hell, I dunno.
Thanks to RT cores, an RTX 2080 Ti runs Quake 2 RTX at 1440p with a similar framerate to a 1080 Ti (no RT cores) at 480p, so there's a huge performance benefit on Turing.
 

Komatsu

Member
Contrary to what most people believe, even high-end GPUs (graphics cards) are actually not that great at handling unstructured, incoherent work, and ray tracing is not a very coherent rendering technique. Remember, light can go in a million different directions, bounce off many objects, shade different textures, etc. GPU speed is highly dependent on memory management, and what GPUs are great at is applying the same process to lots of data points.

Let's take a look at the RTX (Turing) architecture. Yep, I know Scarlett will be using an AMD GPU, but still, the principles remain the same.

[Image: Turing SM architecture block diagram]


We have the CUDA (Compute Unified Device Architecture) cores, which are part of the Turing Streaming Multiprocessor (SM) architecture. Think of those as parallel processors, like the cores in your CPU (the comparison is inaccurate but helpful). The Tensor Core is a specialized execution unit developed to perform the core compute functions used in deep learning and other types of AI.

The RT Core included in each SM is there to accelerate Bounding Volume Hierarchy (BVH) traversal and ray casting functions. BVH is basically a tree-based acceleration structure that contains multiple bounding boxes that are leveraged when running calculations for illumination.

As for whether a dedicated core would be a good idea or not, let's first talk about the function of an RT core. In machines without a dedicated hardware ray tracing engine, the bounding volume hierarchy calculations for illumination would be handled by your standard graphics pipeline and shaders, with thousands and thousands of instructions bouncing to the CPU for every ray of light cast against the rendered geometry - that is, the bounding boxes containing the object. It keeps going on and on, onto ever smaller geometry, up until the moment it hits a polygon. It's very computationally intensive, and in situations where render time is not an issue, like in movies, most people/studios would let the CPU bear most of the brunt. But that's not the case with gaming.

The ray tracing cores are there to handle all ray-triangle intersection calculations, leaving the rest of the card to handle the remainder of the pipeline. Basically the SM only has to launch the ray probe and the RT core takes it from there.
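
To make that concrete, here's a toy sketch of the loop involved - plain Python, every name made up for illustration, nothing to do with NVIDIA's actual hardware - showing the two tests a BVH walk hammers over and over: ray vs. bounding box to cull subtrees, then ray vs. triangle once you reach a leaf. This inner loop is what an RT core runs in fixed-function silicon instead of burning shader instructions:

```python
# Toy CPU-side sketch of BVH traversal: illustrative only, not NVIDIA's hardware.
from dataclasses import dataclass, field
from typing import Optional

def dot(a, b):
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def ray_box_hit(origin, inv_dir, lo, hi):
    """Slab test: does the ray cross this node's bounding box at all?"""
    tmin, tmax = 0.0, float("inf")
    for axis in range(3):
        t1 = (lo[axis] - origin[axis]) * inv_dir[axis]
        t2 = (hi[axis] - origin[axis]) * inv_dir[axis]
        tmin = max(tmin, min(t1, t2))
        tmax = min(tmax, max(t1, t2))
    return tmin <= tmax

def ray_triangle_t(origin, direction, v0, v1, v2, eps=1e-8):
    """Moller-Trumbore: distance along the ray to the triangle, or None on a miss."""
    e1 = tuple(v1[i] - v0[i] for i in range(3))
    e2 = tuple(v2[i] - v0[i] for i in range(3))
    p = cross(direction, e2)
    det = dot(e1, p)
    if abs(det) < eps:
        return None                       # ray parallel to the triangle plane
    inv_det = 1.0 / det
    s = tuple(origin[i] - v0[i] for i in range(3))
    u = dot(s, p) * inv_det
    if u < 0.0 or u > 1.0:
        return None
    q = cross(s, e1)
    v = dot(direction, q) * inv_det
    if v < 0.0 or u + v > 1.0:
        return None
    t = dot(e2, q) * inv_det
    return t if t > eps else None

@dataclass
class Node:
    lo: tuple                             # bounding-box corners
    hi: tuple
    triangles: Optional[list] = None      # leaf: list of (v0, v1, v2) tuples
    children: list = field(default_factory=list)

def closest_hit(root, origin, direction):
    """Walk the tree, skipping whole subtrees whose box the ray misses."""
    inv_dir = tuple(1.0 / d if d != 0.0 else float("inf") for d in direction)
    best, stack = None, [root]
    while stack:
        node = stack.pop()
        if not ray_box_hit(origin, inv_dir, node.lo, node.hi):
            continue
        if node.triangles is not None:    # leaf: test the actual geometry
            for v0, v1, v2 in node.triangles:
                t = ray_triangle_t(origin, direction, v0, v1, v2)
                if t is not None and (best is None or t < best):
                    best = t
        else:
            stack.extend(node.children)
    return best

# One leaf holding one triangle at z = 5, camera ray fired straight down +Z:
leaf = Node(lo=(-1, -1, 4), hi=(1, 1, 6),
            triangles=[((-1, -1, 5), (1, -1, 5), (0, 1, 5))])
print(closest_hit(leaf, (0.0, 0.0, 0.0), (0.0, 0.0, 1.0)))   # -> 5.0
```

On a GTX card every one of those box and triangle tests is just ordinary shader arithmetic; with an RT core, the SM fires the probe and waits for the answer, which is the division of labour described above.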

Given that it's likely that Zen 2 might prove underwhelming, having a dedicated core (which AMD will call a "Ray Processor" or whatever) to handle some sort of ray tracing does make sense.

Very good resource: NVIDIA White Paper on the Turing architecture
 

Leonidas

Member
When you have dedicated cores for any specific task, it usually means better performance when a game heavily uses said task. What is the performance difference between GTX cards and RTX cards with dedicated cores?

The difference is massive. Without RT cores, GTX cards suffer greatly when attempting Ray Tracing. Take the most advanced Ray Tracing benchmark for example.

[Chart: Quake II RTX benchmark at 1080p]


Dedicated ray tracing cores seem like an unnecessary addition and a reason to drive up cost in "value" pitching.

It's necessary to get much improved performance when using Ray Tracing.
 

Jigsaah

Gold Member
Contrary to what most people believe, even high-end GPUs (graphics cards) are actually not that great at handling unstructured, incoherent work, and ray tracing is not a very coherent rendering technique. Remember, light can go in a million different directions, bounce off many objects, shade different textures, etc. GPU speed is highly dependent on memory management, and what GPUs are great at is applying the same process to lots of data points.

Let's take a look at the RTX (Turing) architecture. Yep, I know Scarlett will be using an AMD GPU, but still, the principles remain the same.

[Image: Turing SM architecture block diagram]


We have the CUDA (Compute Unified Device Architecture) cores, which are part of the Turing Streaming Multiprocessor (SM) architecture. Think of those as parallel processors, like the cores in your CPU (the comparison is inaccurate but helpful). The Tensor Core is a specialized execution unit developed to perform the core compute functions used in deep learning and other types of AI.

The RT Core included in each SM is there to accelerate Bounding Volume Hierarchy (BVH) traversal and ray casting functions. BVH is basically a tree-based acceleration structure that contains multiple bounding boxes that are leveraged when running calculations for illumination.

As for whether a dedicated core would be a good idea or not, let's first talk about the function of an RT core. In machines without a dedicated hardware ray tracing engine, the bounding volume hierarchy calculations for illumination would be handled by your standard graphics pipeline and shaders, with thousands and thousands of instructions bouncing to the CPU for every ray of light cast against the rendered geometry - that is, the bounding boxes containing the object. It keeps going on and on, onto ever smaller geometry, up until the moment it hits a polygon. It's very computationally intensive, and in situations where render time is not an issue, like in movies, most people/studios would let the CPU bear most of the brunt. But that's not the case with gaming.

The ray tracing cores are there to handle all ray-triangle intersection calculations, leaving the rest of the card to handle the remainder of the pipeline. Basically the SM only has to launch the ray probe and the RT core takes it from there.

Given that it's likely that Zen 2 might prove underwhelming, having a dedicated core (which AMD will call a "Ray Processor" or whatever) to handle some sort of ray tracing does make sense.

Very good resource: NVIDIA White Paper on the Turing architecture

Ok, I get the gist of what you're saying. Basically, having dedicated RT cores in Scarlett will save its GPU from having to deal with ray tracing itself. Just because a GPU could do it on its own doesn't mean it's efficient to do so, because of how complex and computationally heavy ray tracing is.

Am I in the ballpark?
 

Jigsaah

Gold Member
The difference is massive. Without RT cores, GTX cards suffer greatly when attempting Ray Tracing. Take the most advanced Ray Tracing benchmark for example.

[Chart: Quake II RTX benchmark at 1080p]




It's necessary to get much improved performance when using Ray Tracing.
Why even have it available on GTX cards then?
 

Komatsu

Member
Ok, I get the gist of what you're saying. Basically, having dedicated RT cores in Scarlett will save its GPU from having to deal with ray tracing itself. Just because a GPU could do it on its own doesn't mean it's efficient to do so, because of how complex and computationally heavy ray tracing is.

Am I in the ballpark?

Yep, not far off. Bear in mind those consoles will have APUs - CPU and GPU integrated in one die. The RT cores will be there, so one can't say that the GPU won't be handling ray tracing. But yes, the cores will handle that specific set of instructions, leaving the remainder of the architecture to handle everything else.
 

psorcerer

Banned
What Nvidia calls "ray tracing" is just a marketing shtick that does not bring anything new to in-game graphics.
It fixes minor problems at a very high performance cost.
 

Nikana

Go Go Neo Rangers!
Someone explain to me what an RT core does that a regular CUDA core doesn't, to make ray tracing more accessible.
 

muteZX

Banned
Someone explain to me what an RT core does that a regular CUDA core doesn't, to make ray tracing more accessible.

Developers use DXR to cast rays; RT cores accelerate their traversal through the scene.


see Metro Exodus frame times ..

 

The Alien

Banned
Not sure of all the technical jargon, but pretty sure Microsoft confirmed "hardware accelerated ray tracing" in their E3 Scarlett vid.

Conversely Sony (vaguely) stated the PS5 would have ray tracing...which many assume to be "software" assisted.

Again, not up on my tech jargon here, but the consensus has been hardware is preferred/better as it would not be as taxing or create bottlenecks. Not sure if that translates to dedicated cores, etc.
 

bitbydeath

Member
Not sure of all the technical jargon, but pretty sure Microsoft confirmed "hardware accelerated ray tracing" in their E3 Scarlett vid.

Conversely Sony (vaguely) stated the PS5 would have ray tracing...which many assume to be "software" assisted.

Again, not up on my tech jargon here, but the consensus has been hardware is preferred/better as it would not be as taxing or create bottlenecks. Not sure if that translates to dedicated cores, etc.

A patent stated they’ll be using a mix of both hardware and software in order to obtain the best results possible.

It would seem to have been doing so for quite some time, too. A patent application came to light in July 2019, which pointed to AMD utilizing a “hybrid” hardware and software approach to ray tracing. This patent suggests that AMD’s plan for ray tracing involves leveraging bespoke hardware to accelerate it, whilst performing the bulk of the work on more general hardware via software. This, it claims, makes it so that there isn’t too much of a performance hit, without requiring game developers to work with its very specific definition of ray tracing rendering.
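
A crude way to picture the split that patent describes - purely illustrative Python, not AMD's actual design or API - is a traversal loop that stays in programmable shader code while the raw intersection test is punted to a fixed-function unit, stood in here by an ordinary function:

```python
# Purely illustrative sketch of a hybrid hardware/software split; not AMD's design.
from dataclasses import dataclass, field
import math

@dataclass
class Node:
    center: tuple                 # bounding sphere used as a stand-in for a BVH box
    radius: float
    children: list = field(default_factory=list)

def hw_intersect(origin, direction, node):
    """Stand-in for the dedicated intersection unit: ray vs. bounding sphere.
    This is the piece the hybrid approach keeps in fixed-function hardware."""
    oc = tuple(origin[i] - node.center[i] for i in range(3))
    b = sum(oc[i] * direction[i] for i in range(3))    # direction assumed normalised
    c = sum(x * x for x in oc) - node.radius ** 2
    disc = b * b - c
    if disc < 0.0:
        return None
    t = -b - math.sqrt(disc)
    return t if t > 0.0 else None

def shader_traverse(origin, direction, root):
    """The part that stays in software: traversal order, culling, any custom logic."""
    stack, best = [root], None
    while stack:
        node = stack.pop()
        t = hw_intersect(origin, direction, node)      # offloaded step
        if t is None:
            continue
        if not node.children:                          # leaf node
            best = t if best is None or t < best else best
        else:
            stack.extend(node.children)                # software decides what to visit
    return best

# One leaf sphere straight ahead of the ray:
root = Node(center=(0.0, 0.0, 5.0), radius=1.0)
print(shader_traverse((0.0, 0.0, 0.0), (0.0, 0.0, 1.0), root))   # -> 4.0
```

The appeal, per the article, is that the expensive test gets hardware speed while the traversal logic stays flexible, so developers aren't locked into one rigid definition of ray tracing.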

 

pawel86ck

Banned
Only if you consider ray tracing to be necessary.
IMO they can fake reflections with good results, and shadows too, but RT GI makes a drastic difference. When it comes to the performance impact, with RT cores it's not big, as long as developers limit their RT calculations, because RT cores have their limits.


[Chart: 1080p performance comparison, RTX off vs. on]



At 1080p, as you can see, the performance difference between RTX off and on isn't big. RT cores in current Turing GPUs are barely enough for 1080p 60fps, but that's all. On consoles, however, developers can make 30fps games and optimize performance much better, so I think RT cores in Xbox Scarlett and PS5 will make a huge difference. The OG Xbox was the first console with shaders, and although shader performance was extremely bad on GeForce 3/4-class GPUs on PC, we saw many great looking games on the OG Xbox even on first-gen shaders.
 

Gargus

Banned
That's all well and good, but this is why I haven't played my Xbox One in a year (it's in a storage bin) and why I have no interest in their next system and won't buy it: Microsoft seems to think hardware is what makes a good game. It's all they have talked about for quite some time now; occasionally they talk about games, but not often.

I don't care if it has ray tracing, is "the most powerful console ever created", or how many teraflops it puts out. All I want is good games, and Xbox has so few exclusives I want to play that I wrote them off a while ago.

Now it's all about playstation, PC, and occasionally switch.
 

magnumpy

Member
I assume all next-gen consoles will support ray tracing (except for Nintendo, who is famous for using outdated tech in their systems. Not hating; they've had some success with that strategy. Even if NVIDIA, Nintendo's GPU supplier, is all in on ray tracing, Nintendo won't be using the latest and greatest NVIDIA tech). Both NVIDIA and AMD see ray tracing as the future.
 
I don't want RTX. It adds nothing to a game other than making it look shiny.

How about we use the power to improve physics? How am I playing console games in 2019 without destructible environments?

Pure bullshit.
 

psorcerer

Banned
Here's a paper on how Pixar added RT to their proprietary renderer for the movie Cars.

That's one effect for specific scenes.
Generally, ray tracing doesn't even solve the lighting equation well enough.
Unless you really don't approximate it by casting rays from the camera, but brute-force cast them from every object.
And even in that prohibitively expensive case it still can't handle quite a lot of materials: try to ray trace a human eye.
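
For reference, the "lighting equation" in question is (presumably) the rendering equation; every ray/path tracer is estimating this integral, usually with a handful of camera-first samples per pixel:

L_o(x, \omega_o) = L_e(x, \omega_o) + \int_{\Omega} f_r(x, \omega_i, \omega_o)\, L_i(x, \omega_i)\, (\omega_i \cdot n)\, d\omega_i

Casting rays outward from every object instead of from the camera is just a different (and far more expensive) way of sampling the same integral.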
 

Komatsu

Member
That's one effect for specific scenes.
Generally, ray tracing doesn't even solve the lighting equation well enough.
Unless you really don't approximate it by casting rays from the camera, but brute-force cast them from every object.
And even in that prohibitively expensive case it still can't handle quite a lot of materials: try to ray trace a human eye.

I'm not going to argue the limitations of the technique - you're absolutely right that it's often too expensive, computationally speaking.

I just wanted to clarify that your statement that Pixar "doesn't ray trace" wasn't accurate. That paper about a 2006 movie of theirs using RT selectively was just one example. RenderMan has extensive features to facilitate tracing of all kinds and there's plenty of documentation for new users on how to leverage it. So, yes, Pixar actually employs RT quite liberally in their productions, in their tools and in their demos. They use it to push their product.
 

psorcerer

Banned
Pixar "doesn't ray trace" wasn't accurate.

Okay, agree. I was trying to illustrate that RT is not the holy grail of lighting and even for offline renderers its cost is prohibitively high.
It's just another technique that can be applied in specific cases to create a better looking image.
All the "RTX" battlecrys we hear from every garbage bin nowdays is just a marketing shtick. Nobody cried "cone tracing" or "shadow maps" AFAIR.
 
Pointless spending, money down the drain. This reminds me of dedicated PhysX cards; it's utter nonsense. Microsoft don't have a clue what they're doing 80 percent of the time. At this point I think only Windows is keeping them alive; the rest of their departments are just wasting Microsoft's money!
 
Pointless spending, money down the drain. This reminds me of dedicated PhysX cards; it's utter nonsense. Microsoft don't have a clue what they're doing 80 percent of the time. At this point I think only Windows is keeping them alive; the rest of their departments are just wasting Microsoft's money!
I always wondered if people like you are real, and here you are. You really believe ray tracing, which has been the holy grail and an industry standard in animated movies, is a gimmick now that we finally have hardware that can do it in real time? What will you say when Sony and Nintendo also use hardware with RT cores? What I'm asking, after all, is: are you just a fanboy or plain dumb?
 
I always wondered if people like you are real, and here you are. You really believe ray tracing, which has been the holy grail and an industry standard in animated movies, is a gimmick now that we finally have hardware that can do it in real time?
I'm not against ray tracing; I'm against wasting console resources and money on gimmicky dedicated cores. They've done it on older cards using CryEngine, and to a degree on Xbox One and PS4 as well, so why spend money making dedicated cores?
 
I'm not against ray tracing; I'm against wasting console resources and money on gimmicky dedicated cores. They've done it on older cards using CryEngine, and to a degree on Xbox One and PS4 as well, so why spend money making dedicated cores?
You realise the GPU manufacturers, and not Microsoft or Sony, make these kinds of decisions. Their R&D departments decided RT cores are the most efficient way to do it. The same thing happened when shaders came around, and now every modern GPU has shading units.
 

CeeJay

Member
Pointless spending, money down the drain. This reminds me of dedicated PhysX cards; it's utter nonsense. Microsoft don't have a clue what they're doing 80 percent of the time. At this point I think only Windows is keeping them alive; the rest of their departments are just wasting Microsoft's money!
Really bad post, you make it sound like Microsoft are floundering in the market and behind the curve :messenger_tears_of_joy:
 

meirl

Banned
This was already known. PS5 only has software-based RT. It's the same as this gen:

Xbox One X: native 4K
PS4 Pro: checkerboard crap fake 4K
 