
Dedicated ray tracing cores for Xbox Scarlett?

thelastword

Banned
A lot of anonymous sources today and no receipts.
They're all professionals including DF, no receipts necessary, it's 4K native you hear, the PRO is that fake 4K. Their opinion is gold, forget the people who prove things correctly.....Those guys who prove things correctly, they are a poor man's DF, a poor imitation of Gavon & Meirl etc.....

Remember, all these guys are professionals.....



This is what happens to these types of professionals.....

 

Mass Shift

Member
So is this a case of Physical Core vs Logical Cores? Would that apply in this scenario?

I'm going to break the vein and say MS' RT scenario is very likely to be built specifically around the parameters of their own DXR API. Engineering it in that way would complement the API's methods for submitting workloads to the GPU and whatever "dedicated hardware" is provided for RT tasks.

I would imagine that throughout DXR's development they've had a lot of time to think about it, and just how to apply its acceleration structure at both the top and bottom levels. And it will likely have its own custom "hooks" the way Nvidia utilizes Volta.
 
I'm going to break the vein and say MS' RT scenario is very likely to be built specifically around the parameters of their own DXR API. Engineering it in that way would complement the API's methods for submitting workloads to the GPU and whatever "dedicated hardware" is provided for RT tasks.

I would imagine that throughout DXR's development they've had a lot of time to think about it, and just how to apply its acceleration structure at both the top and bottom levels. And it will likely have its own custom "hooks" the way Nvidia utilizes Volta.
That seems to make sense. I believe they did the same thing with the Xbox One X to some degree with Direct X, if I'm not mistaken. Either way, with the tidbits of info, it appears that next gen will be an interesting generation.
 

trikster40

Member
Lol at reading this thread, there are some pretty funny comments out there. I love being able to see dialogue that discusses differing opinions without getting aggressive like other places.

On topic, barely anything has been confirmed hardware-wise for either of these systems because neither wants to show their cards yet. We will all find out soon enough, so let's quit acting like there's definitive proof out there for either console. Just comments being misinterpreted to fit our own visions of what we want.

 

Mass Shift

Member
That seems to make sense. I believe they did the same thing with the Xbox One X to some degree with Direct X, if I'm not mistaken. Either way, with the tidbits of info, it appears that next gen will be an interesting generation.

You're not mistaken, that's exactly what they did with the One X, and I wouldn't at all be surprised if that will be their approach with Scarlett.

Just one thing though. Colin Penty is hardly stupid, and I just don't see a Freudian slip here. As a developer who probably knows better than any marketing voice, he knows EXACTLY what's going on with Scarlett's hardware. Does anyone honestly believe the man doesn't know what MS' RT solution is actually called?

This gave me a chuckle 😁
 

Mass Shift

Member
I'm out of the loop now, but what do some here believe MS did with the X?

Instead of the usual software application, MS engineered Direct X as a hardware integration into the Xbox One X GPU's command processor. This hardware implementation significantly improved game to GPU communication and reduced CPU overhead.

For example, instruction set calls to the CPU can number in the thousands; by baking Direct X into the hardware, those instruction sets could be reduced to fewer than a dozen, cutting overall CPU workload in half.

Which was a good thing when you consider what an utterly shitty processor the Jaguar is. This of course was contingent upon the game being a DIRECT X build.
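To put a rough picture on that overhead point, here's a toy sketch (plain Python, numbers and function names made up, nothing from the actual XDK): the idea is just that thousands of per-draw CPU-side steps collapse into a handful of calls once the work lives in the command processor.

```python
# Toy model of the overhead argument above -- not real Direct X or XDK code.
NUM_DRAWS = 10_000

def submit_per_draw(num_draws):
    """Every draw call costs several CPU-side steps (validate, translate, encode)."""
    cpu_steps = 0
    for _ in range(num_draws):
        cpu_steps += 5          # illustrative per-draw cost on the CPU
    return cpu_steps

def submit_prebaked(num_draws):
    """A pre-built command buffer handed to the command processor in one go."""
    return 3                    # a handful of calls, independent of draw count

print("per-draw CPU steps: ", submit_per_draw(NUM_DRAWS))   # 50000
print("pre-baked CPU steps:", submit_prebaked(NUM_DRAWS))   # 3
```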
 

pottuvoi

Banned
Having a path to trace rays without burdening the shader cores will be nice. (It's basically the same as having ROPs instead of the developer having to write a rasterizer for the shader cores.)

Actually, any new features developers get access to will be nice, as they will find ways to use them.
Hopefully all consoles have those features so they will see more use, and so that we'd see them opened up on PC as well. (Like barycentric coordinates this gen..)
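For a feel of the per-ray work that otherwise runs on the shader cores, here's a minimal ray/triangle intersection (Möller–Trumbore) in plain Python; the ray and triangle are made-up example data, and real hardware obviously does this in parallel fixed-function logic rather than scalar code.

```python
# Minimal ray/triangle intersection test (Moller-Trumbore) -- the kind of
# per-ray arithmetic a dedicated RT unit can take off the shader cores.
def sub(a, b):   return [a[i] - b[i] for i in range(3)]
def cross(a, b): return [a[1]*b[2] - a[2]*b[1],
                         a[2]*b[0] - a[0]*b[2],
                         a[0]*b[1] - a[1]*b[0]]
def dot(a, b):   return sum(a[i] * b[i] for i in range(3))

def ray_triangle(orig, dirn, v0, v1, v2, eps=1e-8):
    """Return hit distance t, or None if the ray misses the triangle."""
    e1, e2 = sub(v1, v0), sub(v2, v0)
    pvec = cross(dirn, e2)
    det = dot(e1, pvec)
    if abs(det) < eps:                 # ray parallel to the triangle plane
        return None
    inv_det = 1.0 / det
    tvec = sub(orig, v0)
    u = dot(tvec, pvec) * inv_det
    if u < 0.0 or u > 1.0:
        return None
    qvec = cross(tvec, e1)
    v = dot(dirn, qvec) * inv_det
    if v < 0.0 or u + v > 1.0:
        return None
    t = dot(e2, qvec) * inv_det
    return t if t > eps else None

# Example: a ray shot straight down -z at a triangle lying in the z=0 plane.
print(ray_triangle([0.2, 0.2, 1.0], [0.0, 0.0, -1.0],
                   [0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]))  # 1.0
```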
 
You realise the GPU manufacturers, and not Microsoft or Sony, make these kinds of decisions. Their R&D departments decided RT cores are the most efficient way to do it. The same thing happened when shaders came around, and every modern GPU has shading units.
The same way they came up with dedicated PhysX cards, which turned out to be shit, and the same way they came up with SLI, which is shit, these dedicated ray tracing cores are another shitty idea. Soon enough, as cards get faster, they'll scrap it.
 
RT is not a gimmick, just the opposite. Standard lighting methods are the gimmicks, because developers try to replicate with various techniques what RT would look like. RT is expensive, but no one should call it a simple gimmick. Minecraft shows how drastic a difference RT GI can make, and Metro Exodus also looks great, although they limited GI RT to the extreme (the sun was the only source of RT GI, and on top of that light bounces were limited to "2", and that's not enough to light certain high-contrast scenes evenly).
There's no game out there that has shown any drastic difference from traditional rendering to ray tracing. Yes, there is a difference, but the difference isn't enough to be worth wasting computing resources on. I mean, who cares about a fucking reflection and a shadow? It's pointless; it's a gimmick until they can properly utilise it.
 
 

pawel86ck

Banned
There's no game out there that has shown any drastic difference from traditional rendering to ray tracing. Yes, there is a difference, but the difference isn't enough to be worth wasting computing resources on. I mean, who cares about a fucking reflection and a shadow? It's pointless; it's a gimmick until they can properly utilise it.

The difference is big in Metro Exodus, as you can see here, and that's just a very limited RT implementation (the sun is the only source of RT GI, so only outdoor scenes are affected, and there are only 2 light bounces, too few to light a high-contrast scene evenly).


At 1080p, performance is not that much worse compared to RTX off, and a 2070S can run RTX effects at around 63 fps (71 fps without RTX), while the game looks clearly much better in all outdoor scenes.

Of course current games use limited RT (a limited amount of rays and effects), but limited or not, it's still RT, so you should not call it a gimmick, because it's not a gimmick, and that's exactly why it's so demanding. You can call RT an unnecessary feature, but gimmick is the last word you should use.
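As a rough illustration of the bounce-count point (a toy model, not Metro's actual GI): imagine light that has to hop from surface to surface to reach a shadowed area. With the bounce budget capped at 2, anything further along the chain simply stays black, which is exactly the uneven look you get in high-contrast scenes.

```python
# Toy model: why capping GI at two bounces leaves deeply shadowed areas unlit.
ALBEDO = 0.5          # fraction of light re-emitted at each bounce
SUN    = 100.0        # direct light hitting surface 0 only

def indirect_light(surface_index, max_bounces):
    """Light reaching a surface that is `surface_index` bounces away from the sun."""
    if surface_index > max_bounces:
        return 0.0                      # bounce budget exhausted: stays black
    return SUN * (ALBEDO ** surface_index)

for cap in (2, 8):
    print(f"bounce cap {cap}:",
          [round(indirect_light(i, cap), 2) for i in range(6)])
# bounce cap 2: [100.0, 50.0, 25.0, 0.0, 0.0, 0.0]    <- abrupt falloff to black
# bounce cap 8: [100.0, 50.0, 25.0, 12.5, 6.25, 3.12] <- smoother, more even lighting
```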
 

ethomaz

Banned
Instead of the usual software application, MS engineered Direct X as a hardware integration into the Xbox One X GPU's command processor. This hardware implementation significantly improved game to GPU communication and reduced CPU overhead.

For example, instruction set calls to the CPU can number in the thousands; by baking Direct X into the hardware, those instruction sets could be reduced to fewer than a dozen, cutting overall CPU workload in half.

Which was a good thing when you consider what an utterly shitty processor the Jaguar is. This of course was contingent upon the game being a DIRECT X build.
That I believe is actually false.

MS used the AMD programmable command processor in GCN to implement DX instructions to run on the hardware.
That is still software... not a hardware implementation.

I think they are talking about this maybe?


I dunno
This part says it all...

"they’ll all natively implemented in the logic of the command processor"

They are using the GCN programmable command processor to implement the logic they need... it is a software, not hardware, implementation... there is no silicon specific to running DX in the Xbox One X APU.

BTW, that DX-implemented logic exists in the Xbox One too, and the PS4/Pro API has instructions in the programmable command processor as well (of course in this case not DX lol).

Edit - Just so people have a better idea... you can create functions/instructions and update them in the command processor via microcode on GCN... your logic is saved in the command processor to be called at any time you want, but that doesn't mean there is a dedicated hardware unit created to execute these instructions.

That logic continues to be software, just at a lower level than implementing it in the game or API code, for example.
It is more like using Assembly to code your logic instead of C/C++... the overhead is lower.

That is a GCN feature... not something exclusive to MS Xbox One X lol
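A toy way to picture this (illustrative Python, all names invented, not GCN microcode): the "instruction" you upload is still software, it just lives in the command processor and expands there instead of being issued piece by piece from the API side.

```python
# Toy illustration of a programmable command processor: you register a new
# "instruction" that bundles many small steps. It's still software, just
# running closer to the hardware. All names here are made up.
class CommandProcessor:
    def __init__(self):
        self.microcode = {}                     # name -> list of low-level ops

    def upload_microcode(self, name, ops):
        """Save a bundled sequence so it can be invoked as one command later."""
        self.microcode[name] = ops

    def execute(self, name):
        """One call from the API side expands into the stored op sequence here."""
        return [f"exec {op}" for op in self.microcode[name]]

cp = CommandProcessor()
cp.upload_microcode("draw_indexed_fast",
                    ["set_state", "bind_vb", "bind_ib", "draw"])
print(cp.execute("draw_indexed_fast"))
# The API made a single call; the expansion happened in the command processor,
# not in dedicated DX silicon.
```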
 

psorcerer

Banned
The difference is big in Metro Exodus, as you can see here

Looks like bullshit. They disable any AO or GI to make it look better.
And overall, the graphics in Metro are nothing to write home about; it's sometimes ugly and usually has uneven quality. Some scenes are nice, but others are straight out of last gen.
There is nothing in RTX but gimmicks.
 

Mass Shift

Member
That I believe is actually false.

MS used the AMD programmable command processor in GCN to implement DX instructions to run on the hardware.
That is still software... not a hardware implementation.


This part says it all...

"they’ll all natively implemented in the logic of the command processor"

They are using the GCN programmable command processor to implement the logic they need... it is a software, not hardware, implementation... there is no silicon specific to running DX in the Xbox One X APU.

BTW, that DX-implemented logic exists in the Xbox One too, and the PS4/Pro API has instructions in the programmable command processor as well (of course in this case not DX lol).

Edit - Just so people have a better idea... you can create functions/instructions and update them in the command processor via microcode on GCN... your logic is saved in the command processor to be called at any time you want, but that doesn't mean there is a dedicated hardware unit created to execute these instructions.

That logic continues to be software, just at a lower level than implementing it in the game or API code, for example.
It is more like using Assembly to code your logic instead of C/C++... the overhead is lower.

That is a GCN feature... not something exclusive to MS Xbox One X lol

Lol. Thanks for the clarification.
 

lukilladog

Member
The difference is big in Metro Exodus, as you can see here, and that's just a very limited RT implementation (the sun is the only source of RT GI, so only outdoor scenes are affected, and there are only 2 light bounces, too few to light a high-contrast scene evenly).
...

You are comparing an effect specifically injected to run on Nvidia's latest and greatest PC hardware to a global illumination solution implemented to run on a console with roughly the same theoretical output as a GTX 750 Ti (1.4 TFLOPS and 1.38 TFLOPS respectively). Of course it's gonna look better, even if it were made without RT :pie_expressionless:
 

lukilladog

Member
Remember bilinear texture filtering, perspective-correct texture mapping, per-pixel lighting, etc.

All of them looked like a big waste of resources... not anymore.

I'd agree if those were considered a big waste of resources; the comparison seems like an anecdotal oversimplification or a false equivalence.

PS - What about a lot of surface detail or variation among several elements? I think that's also a big no with RT.
 

muteZX

Banned
Mwahahaha.
Try to render a simple kitchen scene with a lot of pots with RTX before you salivate about infinite reflections.

A500 - 8 HW sprites, Neo Geo - hundreds, modern PC - millions and millions.
Infinite reflections are the same thing.
 

pawel86ck

Banned
Looks like bullshit. They disable any AO or GI to make it look better.
And overall, the graphics in Metro are nothing to write home about; it's sometimes ugly and usually has uneven quality. Some scenes are nice, but others are straight out of last gen.
There is nothing in RTX but gimmicks.
Well, at least you can't deny RTX looks better in Metro Exodus 😃.
 

muteZX

Banned
I'd agree if those were considered a big waste of resources; the comparison seems like an anecdotal oversimplification or a false equivalence.

PS1 - no texture filtering, same with perspective correction, Phong, AA, AF, z-buffer, etc.
Fighting against RT is futile.
 

Vesper73

Member
There's no game out there that has shown any drastic difference from traditional rendering to ray tracing. Yes, there is a difference, but the difference isn't enough to be worth wasting computing resources on. I mean, who cares about a fucking reflection and a shadow? It's pointless; it's a gimmick until they can properly utilise it.

Minecraft would like a word with you.
 
The difference is big in Metro Exodus, as you can see here, and that's just a very limited RT implementation (the sun is the only source of RT GI, so only outdoor scenes are affected, and there are only 2 light bounces, too few to light a high-contrast scene evenly).


At 1080p, performance is not that much worse compared to RTX off, and a 2070S can run RTX effects at around 63 fps (71 fps without RTX), while the game looks clearly much better in all outdoor scenes.

Of course current games use limited RT (a limited amount of rays and effects), but limited or not, it's still RT, so you should not call it a gimmick, because it's not a gimmick, and that's exactly why it's so demanding. You can call RT an unnecessary feature, but gimmick is the last word you should use.

Nope, it isn't big, and sometimes the rasterized version looks better.
 

Kenpachii

Member
I'm not sure...I mean NVIDIA had these ray-tracing cores in their RTX graphics cards at first...then all of a sudden, now GTX cards can do it too? It makes it sound like Raytracing doesn't require specific technology to run it, but maybe having cores dedicated to it makes for a better implementation?

Hell, I dunno.

You can run ray tracing even without RTX cards; performance is just wonky.
 

CeeJay

Member
PS4 was released 6 years ago.
We are talking about times 4 years ahead, when PS6 will be looming.
Yeah sorry, was just wishing that time still went as slowly as when I was a kid :p

Hopefully in 2023 we will be in the mid-gen sweet spot where we are getting an abundance of second-wave games making full use of the hardware, with no talk of the PS6 yet. I'm looking forward to it :D
 

llien

Member
A few relevant numbers here.
There have been different estimates of how big a portion of the silicon the "RT hardware" takes up in RTX cards.
Estimates went from about 20% down to 8% or so, I've heard.

AMD is notably taking a different approach:


In the application, AMD said that software-based solutions "are very power intensive and difficult to scale to higher performance levels without expending significant die area." It also said that enabling ray tracing via software "can reduce performance substantially over what is theoretically possible" because they "suffer drastically from the execution divergence of bounded volume hierarchy traversal."

AMD didn't think hardware-based ray tracing was the answer either. The company said those solutions "suffer from a lack of programmer flexibility as the ray tracing pipeline is fixed to a given hardware configuration," are "generally fairly area inefficient since they must keep large buffers of ray data to reorder memory transactions to achieve peak performance," and are more complex than other GPUs.
So the company developed its hybrid solution. The setup described in this patent application uses a mix of dedicated hardware and existing shader units working in conjunction with software to enable real-time ray tracing without the drawbacks of the methods described above. Here's the company's explanation for how this system might work, as spotted by "noiserr" on the AMD subreddit:


[image: excerpt describing AMD's hybrid hardware/software ray tracing approach]



Doesn't sound like something compatible with the RTX approach. And if so, with this going into consoles and AMD's GPUs from 2020 onwards, the "G-Sync vs FreeSync" story might repeat itself.
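A rough sketch of what that hybrid split could look like (toy Python, invented names and data, not AMD's actual design): traversal stays in programmable shader-style code, while the box/triangle intersection test is the part you'd hand to a fixed-function unit.

```python
# Toy hybrid traversal: the shader code keeps control of the BVH walk (the
# flexible part), while the intersection test -- the heavy, regular math --
# is treated as a black-box "fixed-function" call. Names are illustrative.
def hw_intersect(node, ray):
    """Stand-in for the fixed-function intersection engine: which children does the ray hit?"""
    return [c for c in node.get("children", []) if ray in c.get("hit_by", [])]

def traverse(root, ray):
    """Shader-driven traversal loop: stack handling and ordering stay programmable."""
    stack, hits = [root], []
    while stack:
        node = stack.pop()
        for child in hw_intersect(node, ray):      # offloaded intersection test
            if "triangle" in child:
                hits.append(child["triangle"])     # leaf: record the hit
            else:
                stack.append(child)                # interior node: keep walking
    return hits

# Tiny hand-built BVH: ray "r1" reaches one triangle, everything else is culled.
leaf_a = {"triangle": "tri_A", "hit_by": ["r1"]}
leaf_b = {"triangle": "tri_B", "hit_by": []}
root   = {"children": [{"children": [leaf_a, leaf_b], "hit_by": ["r1"]}], "hit_by": ["r1"]}
print(traverse(root, "r1"))   # ['tri_A']
```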
 

Dontero

Banned
Depends only on whether developers start implementing those features via DX12/Vulkan and not with Nvidia extensions.
The big issue everyone doesn't seem to notice is the performance impact.

Console GPUs will be mid-range GPUs at best. RTX performance with the 2060 is currently abysmal; you need at least a 2070, which costs more than a new next-gen console and then some.

I think you could reasonably expect some small-time fixed-function ray tracing hardware, but it won't be huge. It will be more like ray tracing that shoots a few rays to establish what is happening in the scene, plus some software function to smear the rays' effects onto objects.

The Killzone devs used ray tracing for their voxel global illumination, to tell the voxels what to do. With such a setup you don't need many rays, but you can't do stuff like precise shadows.
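A very rough sketch of that "few rays feeding a voxel grid" idea (toy Python, loosely in the spirit of voxel GI, not Killzone's actual renderer): a handful of rays deposit energy into coarse voxels, and surfaces then just read their voxel, which is why you get soft, approximate bounce light rather than precise shadows.

```python
# Toy voxel GI: a few rays leave energy in a coarse grid, shading reads the grid.
GRID = 8                                   # 8x8 voxel grid (2D for brevity)
grid = [[0.0] * GRID for _ in range(GRID)]

def deposit(ray_path, energy):
    """March a ray through the grid, leaving energy in every voxel it crosses."""
    for (x, y) in ray_path:
        grid[y][x] += energy

# A few hand-picked rays bouncing around the scene (illustrative paths).
deposit([(0, 0), (1, 1), (2, 2), (3, 3)], 1.0)
deposit([(0, 1), (1, 2), (2, 3), (3, 4)], 0.5)

def shade(vx, vy):
    """A surface point just reads its voxel -- cheap, but light is blurred to voxel size."""
    return grid[vy][vx]

print(shade(2, 2), shade(5, 5))   # 1.0 near a ray path, 0.0 where no ray ever reached
```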
 