
PS5 hardware based Ray-Tracing vs Xbox Series X hardware accelerated Ray Tracing: Is there a difference in the tech or just the wording?


JaffeLion

Banned
Negative, almost all people that have mentioned RT solutions and are supposed to be insiders have said the PS5 RT solution is the one with the edge. We've all been here reading up on the leaks that gave the nod to PS5, except for like one person that said they were similar.

Some people here have selective memory and reading.

Where did you read that?

What I've seen:

The PS5 GPU is a heavily customised variant of AMD's Radeon's Navi (RDNA 1.0), which can simulate 3D audio and support ray-tracing.

and

Navi GPU based on 'next-generation' Radeon RDNA architecture (RDNA 2.0), which we're told will allow Xbox Series X to support hardware-accelerated real-time ray tracing.

Source: https://www.gamesradar.com/ps5-vs-xbox-series-x/

So, why would RDNA 1.0 with only a ray tracing simulation be better than next-gen RDNA 2.0 with hardware-accelerated real-time RT?

Doesn't make any sense lol
 

Gamernyc78

Banned
Where did you read that?

What I've seen:

The PS5 GPU is a heavily customised variant of AMD's Radeon's Navi (RDNA 1.0), which can simulate 3D audio and support ray-tracing.

and

Navi GPU based on 'next-generation' Radeon RDNA architecture (RDNA 2.0), which we're told will allow Xbox Series X to support hardware-accelerated real-time ray tracing.

Source: https://www.gamesradar.com/ps5-vs-xbox-series-x/

So, why would RDNA 1.0 with only a ray tracing simulation be better than next-gen RDNA 2.0 with hardware-accelerated real-time RT?

Doesn't make any sense lol

There have been posts from supposed insiders, and from some actual industry people that might know, on this forum; I will not sift through them all, however. If you've been here reading all the leaked posts, it's been the narrative. The only person or people that have said otherwise, or hinted at it, have been fans of the other side.
 

sendit

Member
You really think Xbox is RDNA 2.0 and PS5 RDNA 1.0 😂😂🤦‍♂️🤦‍♂️ Oh, now I see what's going on.

Agreed. It is pretty laughable some people think this to be true. Both the PS5 and XBXS are coming out around the same time frame. There isn't going to be much of a tech gap between the two, especially not a major version gap (RDNA 1.0 vs RDNA 2.0).
 
Last edited:

Journey

Banned
I fully expect someone with the name Playstation Latest to be unbiased.

Definitely an irrefutable source. I wonder what Xbox Latest has to say.

(I actually don’t because none of these people know anything and are just trying to stoke the fire.)

Stop listening to random people on the internet and wait to see for yourself. This shit is beyond belief right now and some of you are giving attention to people that don’t deserve attention.


Not that I believe the source or care; I was replying to TLW to point out how he can take good PS5 rumors and ignore things when they're not in PS5's favor. Back when the rumor was that the PS5 was more powerful, everyone was happy to accept the news even if it also claimed Scarlett had the ray tracing advantage. But now that Xbox Series X is announced and new rumors point to it having more TF performance, his gut is telling him that PS5 has better ray tracing because Sony will want to please its developers, ignoring the rumors about Scarlett having a ray tracing advantage lol.
 

Journey

Banned
Agreed. It is pretty laughable some people think this to be true. Both the PS5 and XBXS are coming out around the same time frame. There isn't going to be much of a tech gap between the two, especially not a major version gap (RDNA 1.0 vs RDNA 2.0).


It would be pretty laughable to assume that the Xbox 360 (2005) would have a GPU with unified shaders, a geometry tessellation unit and programmable ALUs, whereas the PS3 (2006) would not have these newer features; it's just unfathomable!
 

Armorian

Banned
It would be pretty laughable to assume that the Xbox 360 (2005) would have a GPU with unified shaders, a geometry tessellation unit and programmable ALUs, whereas the PS3 (2006) would not have these newer features; it's just unfathomable!

You're comparing GPUs from different vendors. The PS4 and X1 both had AMD GPUs in 2013, and the only differences between them were the number of cores (and clock, of course) and the better implementation of async compute on the PS4.
 

Journey

Banned
You're comparing GPUs from different vendors. The PS4 and X1 both had AMD GPUs in 2013, and the only differences between them were the number of cores (and clock, of course) and the better implementation of async compute on the PS4.


You're missing the point, which is that it's incorrect to assume that Sony will automatically have the most cutting-edge design because of the launch timeline, when history shows us that is not necessarily the case. RDNA 2.0 is a hardware design, and if Sony locked in a compliant RDNA 1.0, they're not going to switch designs this late in the game just because MS announced it's using RDNA 2.0.

Let's be clear, though: I'm not saying Sony is not going with RDNA 2.0, just that his assumption based on release timing is flawed.
 
Last edited:

Armorian

Banned
You're missing the point, which is that it's incorrect to assume that Sony will automatically have the most cutting-edge design because of the launch timeline, when history shows us that is not necessarily the case. RDNA 2.0 is a hardware design, and if Sony locked in a compliant RDNA 1.0, they're not going to switch designs this late in the game just because MS announced it's using RDNA 2.0.

Let's be clear, though: I'm not saying Sony is not going with RDNA 2.0, just that his assumption based on release timing is flawed.

That's true, Sony could have locked specs earlier than MS. So far insiders say that the PS5's solution for RT is better, so not having the RDNA2 version of it could be for the best, but if they don't have VRS support it might hurt them in the long run. It's funny, though: MS advertises the XSX as an 8K machine only to smear all that high-resolution detail with VRS :messenger_grinning_smiling:

Sony really fucked up with the PS3 GPU; Nvidia gave them a defective G70 when they had the G80 far along in development. An 8800 prototype in the PS3 would have mopped the floor with the 360.
 
That's true, Sony could have locked specs earlier than MS. So far insiders say that the PS5's solution for RT is better, so not having the RDNA2 version of it could be for the best, but if they don't have VRS support it might hurt them in the long run. It's funny, though: MS advertises the XSX as an 8K machine only to smear all that high-resolution detail with VRS :messenger_grinning_smiling:

Sony really fucked up with the PS3 GPU; Nvidia gave them a defective G70 when they had the G80 far along in development. An 8800 prototype in the PS3 would have mopped the floor with the 360.
Who is saying the PS5's RT solution is better? All that's been said, to my knowledge, is that it appears Series X has RT-accelerated hardware incorporated into its APU via AMD's solution, whereas Sony does not. This leads me to believe that Sony started development first, so the basis of their GPU architecture is RDNA 1.0, which as we're all aware has no RT acceleration, so they had to find a way to manage this outside of the APU and sourced hardware for it elsewhere.

This is also likely why there are reports that the PlayStation 5 is 9.2 teraflops, which again sits right around that RDNA 1.0 5700 XT performance threshold. To add to this, Sony also had devkits out much sooner than Microsoft; apparently Microsoft temporarily sent out PCs to fill the void until they were ready to ship their own devkits.

All in all, it appears that Microsoft held out for a more advanced architecture (RDNA 2.0), which explains the built-in RT, which explains the 12 teraflops, which explains the lack of shipped devkits. The more you unravel things, the more it all starts to fit together.
 
Last edited:

Armorian

Banned
Who is saying the PS5's RT solution is better? All that's been said, to my knowledge, is that it appears Series X has RT-accelerated hardware incorporated into its APU via AMD's solution, whereas Sony does not. This leads me to believe that Sony started development first, so the basis of their GPU architecture is RDNA 1.0, which as we're all aware has no RT acceleration, so they had to find a way to manage this outside of the APU and sourced hardware for it elsewhere.

This is also likely why there are reports that the PlayStation 5 is 9.2 teraflops, which again sits right around that RDNA 1.0 5700 XT performance threshold. To add to this, Sony also had devkits out much sooner than Microsoft; apparently Microsoft temporarily sent out PCs to fill the void until they were ready to ship their own devkits.

All in all, it appears that Microsoft held out for a more advanced architecture (RDNA 2.0), which explains the built-in RT, which explains the 12 teraflops, which explains the lack of shipped devkits. The more you unravel things, the more it all starts to fit together.

I think Orisisi said this here. Nvidia has its RT cores on the same die as the GPU; Sony could be doing something similar, since anything outside the APU would be stupid as fuck. Perhaps AMD's solution is using modified shader cores or something.
 
Last edited:
I think Orisisi said this here. Nvidia has its RT cores on the same die as the GPU; Sony could be doing something similar, since anything outside the APU would be stupid as fuck. Perhaps AMD's solution is using modified shader cores or something.
So Sony has third-party RT cores on the same die as the GPU, which is in an APU built by AMD?

That doesn't seem realistic whatsoever, and it may be retarded to have RT external to the APU, but if that's what had to be done, it's what had to be done.
 

Armorian

Banned
So Sony has third-party RT cores on the same die as the GPU, which is in an APU built by AMD?

That doesn't seem realistic whatsoever, and it may be retarded to have RT external to the APU, but if that's what had to be done, it's what had to be done.

I wasn't saying anything about third parties; this could still be developed by AMD. Nvidia's solution to RT using RT cores is probably not the only way to do this; in games with no RT, those cores remain idle while still occupying space on the die. Perhaps AMD wants something different for its GPU lineup and will get it with the RDNA2 GPUs (MS could have approached them later than Sony and decided to use that), while something similar to RT cores was designed by AMD to be used alongside RDNA1 in the PS5 (and those cores won't be idle in most games, like they currently are on PC). Or perhaps Sony contracted AMD to develop an RT solution that can't be used in AMD's PC GPUs or by MS. This is pure speculation; we don't know much at this point.
 

psorcerer

Banned
I think people in this thread should first clarify for themselves what "hardware RT" actually means. Is NV Turing HWRT? Why? What exactly makes it so? Quantify.
Then we can continue the discussion.
 
I think people in this thread should first clarify for themselves what "hardware RT" actually means. Is NV Turing HWRT? Why? What exactly makes it so? Quantify.
Then we can continue the discussion.
Well, this is the way I look at it, at least from my logic and point of view.

When someone says "hardware accelerated ray tracing" (Microsoft), I take that to mean you have the regular raster capability of the GPU, plus RT cores or the like in addition, which take on the brunt of the extra load introduced by ray tracing.

When someone says "hardware based ray tracing" (Sony), I take that to mean you're cannibalizing the raster capability of the GPU to handle the RT load being introduced; it happens in hardware, as stated, but there's no form of discrete acceleration.
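To put rough numbers on that distinction, here's a toy cost model of the two readings. Every figure in it is made up purely for illustration, not taken from any spec:

Code:
# Toy cost model of the two readings above.
# All numbers are invented for illustration; no real spec implied.

SHADER_BUDGET = 100.0  # arbitrary units of shader-ALU time available per frame

def shader_time(raster_work: float, rt_work: float, discrete_rt_units: bool) -> float:
    """Shader-ALU time one frame consumes under each interpretation."""
    if discrete_rt_units:
        # "hardware accelerated": BVH traversal and intersection tests run on
        # dedicated units; the shaders only pay for shading the hit points
        # (assumed here, arbitrarily, to be half the RT load).
        return raster_work + 0.5 * rt_work
    # "hardware based" with no discrete acceleration: the same ALUs that
    # rasterize also do every traversal and intersection test.
    return raster_work + rt_work

print(shader_time(70, 50, discrete_rt_units=True))   # 95.0  -> fits the budget
print(shader_time(70, 50, discrete_rt_units=False))  # 120.0 -> over budget; raster
# resolution or detail gets "cannibalized" to compensate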
 
Last edited:

onQ123

Member
So Sony has third-party RT cores on the same die as the GPU, which is in an APU built by AMD?

That doesn't seem realistic whatsoever, and it may be retarded to have RT external to the APU, but if that's what had to be done, it's what had to be done.

You do know that AMD GPUs/APUs/SoCs have 3rd-party co-processors on them, right?
 
You do know that AMD GPUs/APUs/SoCs have 3rd-party co-processors on them, right?
That's not what I'm getting at; it seems like it would be a pretty ridiculous undertaking to retrofit something like that instead of handling it externally and tying it in.
 

Tomeru

Member
Well, this is the way I look at it, at least from my logic and point of view.

When someone says "hardware accelerated ray tracing" (Microsoft), I take that to mean you have the regular raster capability of the GPU, plus RT cores or the like in addition, which take on the brunt of the extra load introduced by ray tracing.

When someone says "hardware based ray tracing" (Sony), I take that to mean you're cannibalizing the raster capability of the GPU to handle the RT load being introduced; it happens in hardware, as stated, but there's no form of discrete acceleration.

To me that sounds like it's the other way around.
 

psorcerer

Banned
Well, this is the way I look at it, at least from my logic and point of view.

When someone says "hardware accelerated ray tracing" (Microsoft), I take that to mean you have the regular raster capability of the GPU, plus RT cores or the like in addition, which take on the brunt of the extra load introduced by ray tracing.

When someone says "hardware based ray tracing" (Sony), I take that to mean you're cannibalizing the raster capability of the GPU to handle the RT load being introduced; it happens in hardware, as stated, but there's no form of discrete acceleration.

I don't see any number here. Are we in a humanities class?
Obviously everything uses the raster pipeline and the shader pipeline, and other GPU parts.
What exactly is Turing accelerating? Do you have an idea?
 
Last edited:

Journey

Banned
And how do those two "phrases" mean anything other than hardware that takes care of ray tracing?


I would think that there's a difference between hardware that completely handles ray tracing vs hardware that helps with ray tracing.
 
Well, this is the way I look at it, at least from my logic and point of view.

When someone says "hardware accelerated ray tracing" (Microsoft), I take that to mean you have the regular raster capability of the GPU, plus RT cores or the like in addition, which take on the brunt of the extra load introduced by ray tracing.

When someone says "hardware based ray tracing" (Sony), I take that to mean you're cannibalizing the raster capability of the GPU to handle the RT load being introduced; it happens in hardware, as stated, but there's no form of discrete acceleration.


It sounds the other way around to me.

If it's "accelerated", it sounds as if it's using unspecified parts of the GPU to improve the process, without specifying how much of the process is "accelerated"; if it says "based", it sounds like special hardware to do everything RT-related, like RT cores.

But in reality they will probably be the same thing anyway, since both cases can refer to RT cores. If we are going word by word, though, it makes more sense that "based" refers directly to special hardware that takes care of RT.
 
Last edited:
Agreed. It is pretty laughable some people think this to be true. Both the PS5 and XBXS are coming out around the same time frame. There isn't going to be much of a tech gap between the two, especially not a major version gap (RDNA 1.0 vs RDNA 2.0).

Actually, there very well can be if Microsoft pays for it. And maybe Sony's design was simply locked in sooner. It's easily possible, and it wouldn't be the first time such a thing happened. It happened when the Xbox 360 launched an entire year before the PS3 with the world's first unified-shader GPU in Xenos, and I guess it happened somewhat again with the Xbox One X, though most will say the Xbox One X launched a year after. Still, it shows that if one simply aims higher, anything is possible. Microsoft more or less had that design locked in even when the PS4 Pro was launching; they just likely needed time to manufacture it for the right price.

If RDNA 2 cards really launch this year, it isn't at all impossible that Xbox Series X could be RDNA 2. In fact, AMD has all but confirmed that it is. Anandtech and every major site have agreed that AMD was referring to RDNA 2 as "next-gen RDNA".


With a single exception, there also aren’t any new graphics features. Navi does not include any hardware ray tracing support, nor does it support variable rate pixel shading. AMD is aware of the demands for these, and hardware support for ray tracing is in their roadmap for RDNA 2 (the architecture formally known as “Next Gen”). But none of that is present here.


AMD and Microsoft have co-designed and co-engineered a custom, high performance AMD SoC to power Project Scarlett to deliver an incredible gaming experience, including the next-generation of performance, graphics, lighting, visuals, and audio immersion. This processor builds upon the significant innovation of the AMD Ryzen™ "Zen 2" CPU core and a "Navi" GPU based on next-generation Radeon™ RDNA gaming architecture including hardware-accelerated raytracing.

Xbox Series X is definitely RDNA 2.
 
Last edited:
It's also important to note that at no point did AMD refer to RDNA as "next-gen RDNA" during the initial reveal event where they showed the 5700 XT, the same event where they conveniently also announced they were in the PS5.
 
It sounds the other way around to me.

If it's "accelerated", it sounds as if it's using unspecified parts of the GPU to improve the process, without specifying how much of the process is "accelerated"; if it says "based", it sounds like special hardware to do everything RT-related, like RT cores.

But in reality they will probably be the same thing anyway, since both cases can refer to RT cores. If we are going word by word, though, it makes more sense that "based" refers directly to special hardware that takes care of RT.
I'm sorry but it factually does not. Do you know what a sound card is? Hardware acceleration for the audio system. Do you know what an Ageia PhysX card was? Hardware acceleration for physics systems.

Hardware acceleration is already clearly defined in terms of what to expect.
 
Good for them; I'm sure all those PC games will look good, but why was none of it in Hellblade?

It's the buzzword of the hour, and their showpiece didn't have it.

Are you going to spin me a yarn about devkits not being out or something like that?
Lol. RT will only suit certain scenarios.
Like, how much sun was there in Hellblade?
 
I'm sorry but it factually does not. Do you know what a sound card is? Hardware acceleration for the audio system. Do you know what an Ageia PhysX card was? Hardware acceleration for physics systems.

Hardware acceleration is already clearly defined in terms of what to expect.

Either way, we will ultimately see what's what. I honestly view both to mean more or less the same thing, although it is true that "hardware-based" can mean both: a dedicated piece of hardware designed to make a task faster or easier, or existing computing resources being used to assist software in performing what dedicated hardware could do better.

As always, time will tell.
 
Lol. RT will only suit certain scenarios.
Like, how much sun was there in Hellblade?

Even without there being sun, there are many clever ways it can be used. I'm not exactly expecting it to be everywhere. In fact, I'm sure many games simply won't use it, but for the ones that actually do, it can mean clever tricks with lighting from a massive bonfire of sorts, like that sacrifice that took place. It can be all sorts of otherworldly visual tricks that take advantage of ray tracing. Could be reflections in water, shadows, etc. We will see.
 
Even without there being sun, there are many clever ways it can be used. I'm not exactly expecting it to be everywhere. In fact, I'm sure many games simply won't use it, but for the ones that actually do, it can mean clever tricks with lighting from a massive bonfire of sorts, like that sacrifice that took place. It can be all sorts of otherworldly visual tricks that take advantage of ray tracing. Could be reflections in water, shadows, etc. We will see.
Either way, comparing the amount of ray tracing you see in a PS5 demo, which had shiny armour and the like, to an XSX demo where it's nighttime and rainy, and then claiming that as evidence that PS5 has better RT, is just stupid.
 
I'm sorry but it factually does not. Do you know what a sound card is? Hardware acceleration for the audio system. Do you know what an Ageia PhysX card was? Hardware acceleration for physics systems.

Hardware acceleration is already clearly defined in terms of what to expect.

no, "hardware acceleration" is not defined in what to expect, a sound card without DSP or a specific DSP uses the CPU to do the processing instead, it still "helps" or in other words "accelerate" the process since it can have its own memory or certain codecs but you cannot take it for granted, "acceleration" does not imply it takes care of the whole process

if you help you mother to do the dishes you are accelerating the process of cleaning dishes but it doesn't imply what you or your mother do or who will do most of the work



In addition to the basic components needed for sound processing, many sound cards include additional hardware or input/output connections, including:

Digital Signal Processor (DSP): Like a graphics processing unit (GPU), a DSP is a specialized microprocessor. It takes some of the workload off of the computer's CPU by performing calculations for analog and digital conversion. DSPs can process multiple sounds, or channels, simultaneously. Sound cards that do not have their own DSP use the CPU for processing.

Memory: As with a graphics card, a sound card can use its own memory to provide faster data processing.


to be fair "hardware based" and "hardware accelerated" can be interchangeable terms but if we are going to be picky then "hardware acceleration" is historically more ambiguous as 3d accelerators before shaders had to rely on CPU to process geometry while "modern" 3d acceleration cards that support shaders can do geometry calculations on GPU both 3D cards "accelerate" but one helps the CPU more than the other
 
Last edited:

VFXVeteran

Banned
All in all, it appears that Microsoft held out for a more advanced architecture (RDNA 2.0), which explains the built-in RT, which explains the 12 teraflops, which explains the lack of shipped devkits. The more you unravel things, the more it all starts to fit together.

That doesn't explain the 12 TFLOPS. Just because the card has built-in RT cores doesn't mean its overall TFLOPS goes up.
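For anyone who wants the arithmetic: the TFLOPS figure is just shader math throughput, roughly CUs x 64 lanes x 2 ops per clock (fused multiply-add) x clock speed, and RT cores never enter that formula. Quick sanity check with illustrative numbers, not confirmed specs for either box:

Code:
# Peak FP32 throughput for an RDNA-style GPU:
#   TFLOPS = CUs * 64 shader lanes * 2 ops/clock (FMA) * clock_GHz / 1000
def peak_tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000

# Illustrative numbers only, not confirmed specs:
print(peak_tflops(36, 2.0))  # 9.216  -> ~9.2 TF: fewer CUs at a higher clock
print(peak_tflops(52, 1.8))  # 11.98  -> ~12 TF: more CUs at a lower clock

# Dedicated RT/intersection units do no FP32 shader math, so adding them
# changes none of the inputs above.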
 
Last edited:

VFXVeteran

Banned
Either way, comparing the amount of ray tracing you see in a PS5 demo, which had shiny armour and the like, to an XSX demo where it's nighttime and rainy, and then claiming that as evidence that PS5 has better RT, is just stupid.

That demo didn't show RT on the PS5.
 
Last edited:
That doesn't explain the 12 TFLOPS. Just because the card has built-in RT cores doesn't mean its overall TFLOPS goes up.
Well, a more capable architecture, the same one that likely results in them releasing "Big Navi", opens the door to more CUs, which would explain the uptick.
 

VFXVeteran

Banned
Lol. RT will only suit certain scenarios.
Like, how much sun was there in Hellblade?

That is completely untrue. When you look at Frozen 2 on the big screen, or the upcoming Avatar movie, do you say that the RT is only practical in certain conditions? The entire CG sequence is done using nothing but RT. RT is far superior to rasterizing, period. When graphics cards get powerful enough to handle an entire game using nothing but RT, it will look like the movies. It's a long wait, though...
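If anyone wants to see what tracing a ray actually means: at its core it's nothing more than intersection tests like the toy one below, fired millions of times per frame, and that's the workload dedicated RT hardware exists to chew through. A toy sketch, obviously nothing like a production film renderer:

Code:
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Distance t along the ray to the nearest sphere hit, or None on a miss.
    Solves |origin + t*direction - center|^2 = radius^2, a quadratic in t."""
    oc = [o - c for o, c in zip(origin, center)]
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * a * c
    if disc < 0:
        return None                         # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2.0 * a)  # nearer of the two roots
    return t if t > 0 else None

# One primary ray from the origin toward a unit sphere 5 units down +z:
print(ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # 4.0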
 
Last edited:

VFXVeteran

Banned
Well, a more capable architecture, the same one that likely results in them releasing "Big Navi", opens the door to more CUs, which would explain the uptick.

Well, that makes more sense... Do you guys really think the XSX will have an RDNA 2.0 architecture, which isn't even out as a graphics card yet? AMD's main priority is the PC space over the consoles, for sure. Listen to their financials in the coming Q2 to see how their sales are distributed.
 
Well, that makes more sense... Do you guys really think the XSX will have an RDNA 2.0 architecture, which isn't even out as a graphics card yet? AMD's main priority is the PC space over the consoles, for sure. Listen to their financials in the coming Q2 to see how their sales are distributed.
Given what's been said about Sony's devkits being out for quite some time, and many people saying Microsoft's are not, or were simply PCs, it does give some credence to them possibly holding out for RDNA 2.

I mean, the Pro had Vega feature sets before Vega actually released...
 
Last edited:

VFXVeteran

Banned
Given what's been said about Sony's devkits being out for quite some time, and many people saying Microsoft's are not, or were simply PCs, it does give some credence to them possibly holding out for RDNA 2.

The hardware is already made. I would be surprised if the XSX has RDNA 2.0 in its shipping unit.
 
Last edited:
That is completely untrue. When you look at Frozen 2 on the big screen, or the upcoming Avatar movie, do you say that the RT is only practical in certain conditions? The entire CG sequence is done using nothing but RT. RT is far superior to rasterizing, period. When graphics cards get powerful enough to handle an entire game using nothing but RT, it will look like the movies. It's a long wait, though...
Maybe I should have said RT will vary depending on the scene. A scene with windows, water puddles and shiny paint will have the ability to show more RT than a scene at nighttime.
 

VFXVeteran

Banned
Maybe I should have said RT will vary depending on the scene. A scene with windows, water puddles and shiny paint will have the ability to show more RT than a scene at nighttime.

Mirror surfaces aren't the only thing RT has going for it. Using ambient occlusion in a night scene, or rendering excellent shadows from small light sources, is a very good thing.
 

Dural

Member
Well, that makes more sense... Do you guys really think the XSX will have an RDNA 2.0 architecture, which isn't even out as a graphics card yet? AMD's main priority is the PC space over the consoles, for sure. Listen to their financials in the coming Q2 to see how their sales are distributed.

The original Xbox (2001) and the Xbox 360 both had GPUs that weren't available in the PC space until the following year. If MS told AMD they wanted RT hardware in their GPU, I'm sure AMD would give them options.
 
That doesn't explain the 12 TFLOPS. Just because the card has built-in RT cores doesn't mean its overall TFLOPS goes up.

Exactly. If there are dedicated RT cores, which there are expected to be, they would have an entirely different performance metric, outside of the TFLOPS of the GPU.
 
Not that I believe the source or care; I was replying to TLW to point out how he can take good PS5 rumors and ignore things when they're not in PS5's favor. Back when the rumor was that the PS5 was more powerful, everyone was happy to accept the news even if it also claimed Scarlett had the ray tracing advantage. But now that Xbox Series X is announced and new rumors point to it having more TF performance, his gut is telling him that PS5 has better ray tracing because Sony will want to please its developers, ignoring the rumors about Scarlett having a ray tracing advantage lol.
Or puts up comparison videos that show a 1 fps advantage for the Pro over the X, yet doesn't post the many other videos that show the X beating the Pro.
 