
XSX's Ray Tracing Performance vs Nvidia

The 2080 Max-Q is less powerful than an RTX 2070, so Nvidia is wrong on that one.
Of course they're wrong. We already know where the chips land.

The PlayStation 5 GPU is more powerful than the 5700 XT, the 2080 Super is about 22% more powerful than the 5700 XT, the Series X GPU is at minimum 18% more powerful than the PlayStation 5's...

We already know, these guys are just talking BS trying to defend their computers.
 

Tripolygon

Banned
I'm not sure what you're showing me. This has nothing to do with what I said. Why are you showing me graphs? (Oh, and by the way, the 1080 Ti is more powerful than both the RTX 2070 Super and the Radeon 7, and on par with the 2080 (non-Super), plus or minus a 3% difference depending on the game, in 90% of games if not more.)

But that's still my point: the PS5 is a little better than the 5700xt and the XSX around a 2070 Super. And neither of these cards will handle proper ray tracing even if you boost their ray tracing power 200% thanks to RDNA 2 tech.

Ray tracing on both consoles will be close to shit. Period.
You have no idea what you are talking about.

XSX - 3328 X 1825 X 2 = 12TF
1080Ti - 3584 X 1582 X 2 = 11.3TF
PS5 - 2304 X 2230 X 2 = 10.3TF
RTX 2080 - 2944 X 1800 X 2 = 10.6TF
5700XT - 2560 X 1905 X 2 = 9.7TF
RTX 2070S - 2560 X 1770 X 2 = 9TF

That's just pure theoretical TF figures. In terms of architectural improvement, RDNA is on par with Turing, give or take 5% average in games performance. RDNA 2 is said to be improving on that and also clocks better as you can see.
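Those figures are just shader count x clock x 2 FMA ops per cycle. A quick sketch if anyone wants to check the math themselves (rounding aside, note the 2080 at its quoted 1800 MHz works out to ~10.6 TF):

```python
# Theoretical FP32 throughput: shaders x clock (MHz) x 2 ops/cycle (FMA), in TFLOPS.
def tflops(shaders: int, clock_mhz: int) -> float:
    return shaders * clock_mhz * 2 / 1_000_000

gpus = {
    "XSX":       (3328, 1825),
    "1080 Ti":   (3584, 1582),
    "PS5":       (2304, 2230),
    "RTX 2080":  (2944, 1800),
    "5700XT":    (2560, 1905),
    "RTX 2070S": (2560, 1770),
}
for name, (shaders, clock) in gpus.items():
    print(f"{name}: {tflops(shaders, clock):.1f} TF")
```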

In terms of ray tracing, the Series X Minecraft demo has shown it is doable, and that was path tracing. Doing ray-traced global illumination like Metro Exodus is way simpler and cheaper in terms of performance.
 
Last edited:
Of course they're wrong. We already know where the chips land.

The PlayStation 5 GPU is more powerful than the 5700 XT, the 2080 Super is about 22% more powerful than the 5700 XT, the Series X GPU is at minimum 18% more powerful than the PlayStation 5's...

We already know, these guys are just talking BS trying to defend their computers.
Says the guy trying to defend his company/console. Like I said, we have proof of what Nvidia AND AMD can do on PCs. The same can't be said about consoles. And AMD hasn't shown much of anything on the PC side. And going by many years of history, AMD can't hold a candle to Nvidia in high-end or enthusiast GPUs.

You have no idea what you are talking about.

XSX - 3328 X 1825 X 2 = 12TF
1080Ti - 3584 X 1582 X 2 = 11.3TF
PS5 - 2304 X 2230 X 2 = 10.3TF
RTX 2080 - 2944 X 1800 X 2 = 10.6TF
5700XT - 2560 X 1905 X 2 = 9.7TF
RTX 2070S - 2560 X 1770 X 2 = 9TF

That's just pure theoretical TF figures. In terms of architectural improvement, RDNA is on par with Turing, give or take 5% average in games performance. RDNA 2 is said to be improving on that and also clocks better as you can see.

No one goes by TF #'s though... Nvidia can beat AMD cards while having fewer TF. Literally 2 or more TF LESS than AMD.
 
Last edited:
Says the guy trying to defend his company/console. Like I said, we have proof of what Nvidia AND AMD can do on PCs. The same can't be said about consoles. And AMD hasn't shown much of anything on the PC side. And going by many years of history, AMD can't hold a candle to Nvidia in high-end or enthusiast GPUs.
Oh my boy I hope you remember this post, you sound just like the Intel evangelicals.
 
I thought it was common sense? My argument is that the 2080 Max-Q is less powerful than both next-gen consoles' GPUs. The Max-Q doesn't match a 2070, so how can it match a 12 TF GPU in the XSX?
Refer to my previous post
Oh my boy I hope you remember this post, you sound just like the Intel evangelicals.
I don't even like Intel. I already said, AMD's CPU department is doing great... GPU department on the other hand... Not so much.
 
Last edited:
You have no idea what you are talking about.

XSX - 3328 X 1825 X 2 = 12TF
1080Ti - 3584 X 1582 X 2 = 11.3TF
PS5 - 2304 X 2230 X 2 = 10.3TF
RTX 2080 - 2944 X 1800 X 2 = 10.6TF
5700XT - 2560 X 1905 X 2 = 9.7TF
RTX 2070S - 2560 X 1770 X 2 = 9TF

That's just pure theoretical TF figures. In terms of architectural improvement, RDNA is on par with Turing, give or take 5% average in games performance. RDNA 2 is said to be improving on that and also clocks better as you can see.
All wrapped up in a far more optimized system that can outperform the equivalent PC part.
Show me a PC card of 1.84 TFLOPS that managed to have graphics anywhere near the same level as GOW, for instance.
 
Last edited:
Refer to my previous post

I don't even like Intel. I already said, AMD's CPU department is doing great... GPU department on the other hand... Not so much.
And you don't think they can come back from that? Clearly their architecture is on the right track. They could have released a 64 CU RDNA 1.0 GPU and it would have matched the 2080 Ti in raster performance.
 
And you don't think they can come back from that? Clearly their architecture is on the right track. They could have released a 64 CU RDNA 1.0 GPU and it would have matched the 2080 Ti in raster performance.
Well... Did they? Nvidia showed performance before the RTX cards released. The only thing to go by for AMD is consoles, and by hardware stats they aren't that impressive. Nothing near the level of Nvidia's high-end cards. Not only that, but AMD is quiet as a mouse in regards to PC GPUs this generation. So again, nothing to go by, and based on history, they have not been able to compete with top-of-the-line GPUs for MANY years now. And going by their 5700/XT... I don't have my hopes up. Driver issues out the wazoo.
 

//DEVIL//

Member
You have no idea what you are talking about.

XSX - 3328 X 1825 X 2 = 12TF
1080Ti - 3584 X 1582 X 2 = 11.3TF
PS5 - 2304 X 2230 X 2 = 10.3TF
RTX 2080 - 2944 X 1800 X 2 = 10.6TF
5700XT - 2560 X 1905 X 2 = 9.7TF
RTX 2070S - 2560 X 1770 X 2 = 9TF

That's just pure theoretical TF figures. In terms of architectural improvement, RDNA is on par with Turing, give or take 5% average in games performance. RDNA 2 is said to be improving on that and also clocks better as you can see.

In terms of ray tracing, the Series X Minecraft demo has shown it is doable, and that was path tracing. Doing ray-traced global illumination like Metro Exodus is way simpler and cheaper in terms of performance.
Posting numbers means you are the one who has no idea what's going on by these measures. By your measures, how is the RTX 2080 about the same performance as the 1080 Ti? lol

Man, wasn't the Radeon 7 13.5 TFLOPS, or close to 14, with 16GB of memory, yet the same performance as a 2080? lol
Dear god lol
 

//DEVIL//

Member
All wrapped up in a far more optimized system that can outperform the equivalent PC part.
Show me a PC card of 1.84 TFLOPS that managed to have graphics anywhere near the same level as GOW, for instance.
This is exactly the saving grace for consoles, and what I said in my previous posts: Sony has talented guys. PS5 and XSX are powerful consoles. They will produce amazing games, but in terms of just power numbers and data? They are less than a 2080. And this topic was about ray tracing, what happened? Ray tracing on consoles will be shit, I stand by what I said lol
 
Posting numbers means you are the one who has no idea what's going on by these measures. By your measures, how is the RTX 2080 about the same performance as the 1080 Ti? lol

Man, wasn't the Radeon 7 13.5 TFLOPS, or close to 14, with 16GB of memory, yet the same performance as a 2080? lol
Dear god lol
I've been saying TFs are the worst metric to go by, and it pretty much separates the individuals who know what they are talking about from those who have no clue. Nvidia GPUs beat AMD cards that have 2 or more TERAFLOPS on them, with ease. TF count is baseless af in the realm of PCs.
 

Tripolygon

Banned
Posting numbers means you are the one who has no idea what's going on by these measures. By your measures, how is the RTX 2080 about the same performance as the 1080 Ti? lol

Man, wasn't the Radeon 7 13.5 TFLOPS, or close to 14, with 16GB of memory, yet the same performance as a 2080? lol
Dear god lol
And the 5700XT is neck and neck with the Radeon 7. So what should that tell you? RDNA is a better gaming architecture than the previous Vega and GCN architectures, and RDNA 2, which the Series X and PS5 are based on, is better than that. The Series X will be better than the RTX 2080.
 
Last edited:

//DEVIL//

Member
And the 5700XT is neck and neck with the Radeon 7. So what should that tell you?

Means your graph about teraflops is useless, and you just proved your own post above doesn't count lol

Also, off-topic question: heard the Radeon 7 got discontinued. Any reason why? Or did I get the wrong information?
 

Tripolygon

Banned
Means your graph about teraflops is useless, and you just proved your own post above doesn't count lol

Also, off-topic question: heard the Radeon 7 got discontinued. Any reason why? Or did I get the wrong information?
mans wasn’t the Radeon 7 13.5 tflops or close to 14 with 16gb of memory yet same performance as 2080 lol .
RX 5700XT performs similar to Radeon 7

So that means RX5700XT performs close to RTX 2080

Series X is RDNA 2 and will trounce RX5700XT, Radeon 7, RTX 2080.

Go back to philosophy class because you have failed common deductive reasoning. My graph holds up pretty fine.
 
Last edited:
RX 5700XT performs similar to Radeon 7

So that means RX5700XT performs close to RTX 2080

Series X is RDNA 2 and will trounce RX5700XT, Radeon 7, RTX 2080.

Go back to philosophy class because you have failed common deductive reasoning. My graph holds up pretty fine.
I've given up, talking to these guys is a lost cause. Nothing they're saying corresponds to computational reality.
 
I've given up, talking to these guys is a lost cause. Nothing they're saying corresponds to computational reality.
Why not just show proof of xsx beating a 2080 max-q, 2080, 2080S, or even a 2080 TI? Wouldn't that be the easiest thing to prove, and put the ball in your court for once?

Any examples are welcome of course.
 
Last edited:

//DEVIL//

Member
RX 5700XT performs similar to Radeon 7

So that means RX5700XT performs close to RTX 2080

Series X is RDNA 2 and will trounce RX5700XT, Radeon 7, RTX 2080.

Go back to philosophy class because you have failed common deductive reasoning. My graph holds up pretty fine.
I give up. Soon you will say the 5700xt is more powerful than the 2080ti.
You don't even read properly.
First you came in swinging your graphs and flops, and I plus others showed you that they mean nothing.
Now you come and say the 5700xt is the same as the Radeon 7 because of a video, God knows what he is running and under what conditions, from an unknown YouTuber with a joker face in his video. That alone was enough for me not to even watch it further.
Please stop.

The games will come this fall and you will see for yourself.
 
Last edited:
This is exactly the saving grace for consoles, and what I said in my previous posts: Sony has talented guys. PS5 and XSX are powerful consoles. They will produce amazing games, but in terms of just power numbers and data? They are less than a 2080. And this topic was about ray tracing, what happened? Ray tracing on consoles will be shit, I stand by what I said lol
We shall see, but according to Nvidia's own numbers, the XSX will have amazing RT.
 

jaytyla

Neo Member
In the Digital Foundry video they say that the Series X Minecraft is path traced, which they say is the hardest-to-run ray tracing workload. Check around the 3:00 minute mark. Is Nvidia's Minecraft path traced too?
 
Why not just show proof of xsx beating a 2080 max-q, 2080, 2080S, or even a 2080 TI? Wouldn't that be the easiest thing to prove, and put the ball in your court for once?

Any examples are welcome of course.
  • There's no debate of the PlayStation 5's GPU being more computationally capable than the 5700 XT, correct?
  • There's no debate of the Series X GPU being a minimum of 18% more computationally capable than the PlayStation 5 GPU, correct?
  • There's no debate of the RTX 2080 Super being 22% more computationally capable than the 5700 XT, correct?

Welp, that was hard.
 
Last edited:

Tripolygon

Banned
I give up. Soon you will say the 5700xt is more powerful than the 2080ti.
You don't even read properly.
First you came in swinging your graphs and flops, and I plus others showed you that they mean nothing.
Now you come and say the 5700xt is the same as the Radeon 7 because of a video, God knows what he is running and under what conditions, from an unknown YouTuber with a joker face in his video. That alone was enough for me not to even watch it further.
Please stop.
No, the 2080 Ti is more powerful than the Series X.
Another YouTube channel


And a different YouTube channel



Similar results across the board: in some games the Radeon 7 is slightly faster, in some the 5700XT is faster, and in some they are about equal. The fact still remains that the Series X will be a newer architecture that is better than the 5700XT and more powerful than the RTX 2080. If you don't like it you can go jerk off to your PC and cry.
 
Last edited:

BeardGawd

Banned
We actually have the results from a benchmark for XSX and 2080 Ti:



As you can see, the Gears 5 on Xbox Series X does look noticeably better than the Xbox One X version. On XSX the game basically runs PC Ultra settings (and sometimes beyond Ultra) at 60fps, even in cutscenes. We also see evidence of some advanced techniques, including more natural-looking Screen Space Global Illumination lighting and some impressive volumetric fog. Ray tracing and other new DirectX 12 Ultra features have not yet been implemented, but there’s a very good chance they’ll be added by the time the Xbox Series X launches later this year (again, this demo was thrown together in only two weeks). The Coalition has also said they hope to get multiplayer running at 120fps.

Digital Foundry also got to see the Gears 5 benchmark running on the Xbox Series X and performance was almost identical to a PC equipped with an AMD Ryzen Threadripper 2950x and GeForce RTX 2080 Ti graphics card. So yeah, at least when it comes to running current-gen games, Microsoft’s new console stands up to (almost) the best PC hardware you can get.


Supposedly a dirty 2 week port. With optimizations it should be even better.
 
Last edited:

//DEVIL//

Member
Another YouTube channel


And a different YouTube channel



Similar results across the board: in some games the Radeon 7 is slightly faster, in some the 5700XT is faster, and in some they are about equal. The fact still remains that the Series X will be a newer architecture that is better than the 5700XT and more powerful than the RTX 2080. If you don't like it you can go jerk off to your PC and cry.

lol @ jerk off. Shows you're lower class than the rest of the people here.

I am not a PC fanboy, and nothing would make me happier than the XSX or PS5 being as powerful as a 2080. Competition is good. It will make Nvidia lower their prices.

Then come people like you that start comparing the 5700xt to the 2080.....

Yep, I am done here. Good luck banana
 
These Nvidia PC guys really, really want to downplay consoles, it would appear.
I've had more than double the GPUs from AMD than from Nvidia. Same with CPUs. I choose performance over bending over for a certain company. For the longest time AMD hasn't been able to compete with Nvidia GPU-wise. I started off with AMD CPUs, transitioned to Intel, and now will be switching back to AMD. If AMD had the best GPU 2 years ago, I would have gone with them. I'm hardware agnostic, unlike many console warriors. I just need the best performance out there. PC doesn't operate the same way as Microsoft vs Sony. I'm not sure why many don't understand this basic concept. PC players can play the best of all worlds: PC exclusives, the best of 3rd-party games, and the best way to play both MS and Sony exclusives. This can all be done regardless of allegiance to any hardware manufacturer.
 
Last edited:
I've had more than double the GPUs from AMD than from Nvidia. Same with CPUs. I choose performance over bending over for a certain company. For the longest time AMD hasn't been able to compete with Nvidia GPU-wise. I started off with AMD CPUs, transitioned to Intel, and now will be switching back to AMD. If AMD had the best GPU 2 years ago, I would have gone with them. I'm hardware agnostic, unlike many console warriors. I just need the best performance out there. PC doesn't operate the same way as Microsoft vs Sony. I'm not sure why many don't understand this basic concept. PC players can play the best of all worlds: PC exclusives, the best of 3rd-party games, and the best way to play both MS and Sony exclusives. This can all be done regardless of allegiance to any hardware manufacturer.
And yet here you are saying these things which don't make any practical sense in relation to computational tiering.
 
I've had more than double the GPUs from AMD than from Nvidia. Same with CPUs. I choose performance over bending over for a certain company. For the longest time AMD hasn't been able to compete with Nvidia GPU-wise. I started off with AMD CPUs, transitioned to Intel, and now will be switching back to AMD. If AMD had the best GPU 2 years ago, I would have gone with them. I'm hardware agnostic, unlike many console warriors. I just need the best performance out there. PC doesn't operate the same way as Microsoft vs Sony. I'm not sure why many don't understand this basic concept. PC players can play the best of all worlds: PC exclusives, the best of 3rd-party games, and the best way to play both MS and Sony exclusives. This can all be done regardless of allegiance to any hardware manufacturer.
I don't know dude, you are certainly going out of your way to bend over for a certain company.
As I said, I am quoting direct Nvidia specs and direct XSX specs.
If that threatens the value you find in the $1000 GPU you bought, then I can't help you with that.
 

Tripolygon

Banned
lol @ jerk off. Shows you're lower class than the rest of the people here.

I am not a PC fanboy, and nothing would make me happier than the XSX or PS5 being as powerful as a 2080. Competition is good. It will make Nvidia lower their prices.

Then come people like you that start comparing the 5700xt to the 2080.....

Yep, I am done here. Good luck banana
Anybody who says someone is lower class than them is trash. I'm only engaging you with the same hostility you show others. Talk to people with respect and the same courtesy will be afforded to you.

The fact that a $350 GPU is within 20% of a $600 GPU should tell you it's not even worth comparing, and yet there are people like you doing it.

I'm not really done with you, but good luck orange? Lol

I am not a PC fanboy, and nothing would make me happier than the XSX or PS5 being as powerful as a 2080. Competition is good. It will make Nvidia lower their prices.
A minute ago the Series X was as powerful as an RTX 2070 and the PS5 less powerful than a 5700XT. Make up your mind dude.
 
Last edited:
I don't know dude, you are certainly going out of your way to bend over for a certain company.
As I said, I am quoting direct Nvidia specs and direct XSX specs.
If that threatens the value you find in the $1000 GPU you bought, then I can't help you with that.
Says the one trying to put the XSX over better hardware....

Where have you mentioned the core count, tensor cores, shaders, etc.? I don't recall that. I'll never regret buying my GPU, as my games play better than on the 1080 Ti that I sold, which also performed miles better than consoles over the past decade. I'm all about performance, if you haven't noticed yet. I'm not rich, but I'll spend money on the best products.

If consoles were a better buy, I wouldn't be a PC gamer, now would I? It's only better now, as I don't have to upgrade my PC for next-gen games and will still have better performance than next-gen consoles. I'm not shitting on people that prefer consoles, but I'll stop someone who spews utter b.s. about hardware they have no clue about.
 

CrustyBritches

Gold Member
BeardGawd
Despite what Wccftech states, in the video DF says "they compared it directly to a PC with an RTX 2080 and a Threadripper 2950X", "they produce nearly identical results, now the PC still has some slight advantages", and "basically what we have is performance on par with an RTX 2080". Basically, almost 2080 performance, and they could have used a 3000-series 8-core CPU and had better performance, but "16-core CPU" sounds hardcore.

Not your fault, just more shabby work from Wccftech.

---
As for the comparison between Minecraft RTX and XSX Minecraft DXR, it's hard to make any conclusions at this point based on the comment "it runs between 30-60fps with RT enabled". We don't know what maps, settings, and precision they used. My 2060 Super gets around 18-25fps at native 1080p depending on the level and the chunk size being used. On Crystal Castle I get around 25fps with an aerial shot of the level at 20 Chunk Size. Anything between a 2070 Super and a 2080ti would fall into that "30-60fps" range on 20 Chunk Size. Who knows how the settings line up and the specifics of the test. Best guess is that XSX has between 2080 and 2080 Super rasterization and RT performance.
 
Last edited:
Says the one trying to put the XSX over better hardware....

Where have you mentioned the core count, tensor cores, shaders, etc.? I don't recall that. I'll never regret buying my GPU, as my games play better than on the 1080 Ti that I sold, which also performed miles better than consoles over the past decade. I'm all about performance, if you haven't noticed yet. I'm not rich, but I'll spend money on the best products.

If consoles were a better buy, I wouldn't be a PC gamer, now would I? It's only better now, as I don't have to upgrade my PC for next-gen games and will still have better performance than next-gen consoles. I'm not shitting on people that prefer consoles, but I'll stop someone who spews utter b.s. about hardware they have no clue about.
Sometimes people only read what they want to.
I'm not here saying the XSX is more powerful than a 2080 Ti, because it's not.
All I have done is post official ray tracing benchmarks direct from Nvidia and MS. According to those official specs, the XSX will have an advantage with RT.
This may well be correct, as we are talking about AMD tech that hasn't even come to market vs 18-month-old Nvidia tech.
It's obviously triggered you, and you are trying really hard to discount it.
RT is in its infancy, and it's not out of bounds for big leaps to be made. I would not be surprised or hurt if the next gen of Nvidia cards overtakes the XSX spec of 380 billion i/s. In fact I expect them to.
 
Sometimes people only read what they want to.
I'm not here saying the XSX is more powerful than a 2080 Ti, because it's not.
All I have done is post official ray tracing benchmarks direct from Nvidia and MS. According to those official specs, the XSX will have an advantage with RT.
This may well be correct, as we are talking about AMD tech that hasn't even come to market vs 18-month-old Nvidia tech.
It's obviously triggered you, and you are trying really hard to discount it.
RT is in its infancy, and it's not out of bounds for big leaps to be made. I would not be surprised or hurt if the next gen of Nvidia cards overtakes the XSX spec of 380 billion i/s. In fact I expect them to.
If you think the XSX has better ray tracing, why doesn't its Minecraft have all the features and benefits, or even the material qualities, of the PC version, yet still run vastly slower? Imagine if it had all the PC features, could it even reach anywhere near 30fps? Why do console games have checkerboard rendering, while PC games run at native resolution? Why would I be triggered when I have the best experience, and will continue to through this entire gen, versus a console that isn't even released yet?

This is the problem with console gamers. They can't stay in their own lane. You don't see threads on here made by PC gamers shitting on console players. But you'll see millions of threads made by clueless people claiming ridiculous things, like this thread for instance. I literally only came here to comment on the ridiculousness of this thread. Then here you came, pushing the same pipe dreams console warriors are claiming. I know my 2080 Ti won't beat the next NEXT-gen Xbox Series Z, or the PS6. But for now, do you see me making threads about how my 2080 Ti will shit on the Series X or PS5? No, you won't see that. I just hate the spread of misinformation, like what you and others are doing.
 

Ascend

Member
As far as I'm aware, the implementation for BVH is the same for both nVidia and AMD, albeit they do it in another "place" within the GPU pipeline.
Assuming this is true (it might turn out not to be), then their performance should be similar per CU/SM, simply because each CU/SM has one RT "core". Unless there is some magic mojo or something really wrong, it won't be too different from nVidia's implementation. Clock speeds also have an influence, obviously. But considering the XSX GPU falls between a 2080 Super and a 2080 Ti in terms of both CU count and clock speeds, the performance should be somewhere between those for RT as well (see the rough check after the list).

2080 super = 48 RT cores @ 1.8 GHz
2080 Ti = 68 RT cores @ 1.5 GHz
XSX GPU = 52 RT cores @ 1.8 GHz
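If RT throughput really does scale with RT core count x clock, which is a big assumption across two different architectures, the back-of-the-envelope check looks like this:

```python
# Crude relative RT throughput: RT cores x clock, normalized to the 2080 Super.
# Assumes one RT unit per CU/SM and linear scaling -- a big simplification.
cards = {
    "2080 Super": (48, 1.8),
    "2080 Ti":    (68, 1.5),
    "XSX GPU":    (52, 1.8),
}
base = cards["2080 Super"][0] * cards["2080 Super"][1]
for name, (cores, ghz) in cards.items():
    print(f"{name}: {cores * ghz / base:.2f}x")  # Super 1.00x, Ti ~1.18x, XSX ~1.08x
```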
 
Last edited:

S0ULZB0URNE

Member
As far as I'm aware, the implementation for BVH is the same for both nVidia and AMD, albeit they do it in another "place" within the GPU pipeline.
Assuming this is true (it might turn out not to be), then their performance should be similar per CU/SM, simply because each CU/SM has one RT "core". Unless there is some magic mojo or something really wrong, it won't be too different from nVidia's implementation. Clock speeds also have an influence, obviously. But considering the XSX GPU falls between a 2080 Super and a 2080 Ti in terms of both CU count and clock speeds, the performance should be somewhere between those for RT as well.

2080 super = 48 RT cores @ 1.8 GHz
2080 Ti = 68 RT cores @ 1.5 GHz
XSX GPU = 52 RT cores @ 1.8 GHz
hmm i choose XSX
 
If you think the XSX has better ray tracing, why doesn't its Minecraft have all the features and benefits, or even the material qualities, of the PC version, yet still run vastly slower? Imagine if it had all the PC features, could it even reach anywhere near 30fps? Why do console games have checkerboard rendering, while PC games run at native resolution? Why would I be triggered when I have the best experience, and will continue to through this entire gen, versus a console that isn't even released yet?

This is the problem with console gamers. They can't stay in their own lane. You don't see threads on here made by PC gamers shitting on console players. But you'll see millions of threads made by clueless people claiming ridiculous things, like this thread for instance. I literally only came here to comment on the ridiculousness of this thread. Then here you came, pushing the same pipe dreams console warriors are claiming. I know my 2080 Ti won't beat the next NEXT-gen Xbox Series Z, or the PS6. But for now, do you see me making threads about how my 2080 Ti will shit on the Series X or PS5? No, you won't see that. I just hate the spread of misinformation, like what you and others are doing.
Again, if you are using a demo knocked up by one guy over a 4-week period and comparing it to a beta release that has been worked on for more than a year by hundreds of people, then it really knocks down your credibility.
You know that's not in any way a meaningful comparison.
If that's the only thing you have, you don't have much. And again, Nvidia have given the RT specs of their GPU. It is what it is.
 
As far as I'm aware, the implementation for BVH is the same for both nVidia and AMD, albeit they do it in another "place" within the GPU pipeline.
Assuming this is true (it might turn out not to be), then their performance should be similar per CU/SM, simply because each CU/SM has one RT "core". Unless there is some magic mojo or something really wrong, it won't be too different from nVidia's implementation. Clock speeds also have an influence, obviously. But considering the XSX GPU falls between a 2080 Super and a 2080 Ti in terms of both CU count and clock speeds, the performance should be somewhere between those for RT as well.

2080 super = 48 RT cores @ 1.8 GHz
2080 Ti = 68 RT cores @ 1.5 GHz
XSX GPU = 52 RT cores @ 1.8 GHz
You are discounting any improvements in the tech.
Do you think Nvidia will be able to improve the RT performance of their next cards outside of just CUDA core and clock increases?
I fully expect they will.
AMD's RT tech hasn't even hit the streets yet.
 
Look at my post history. This has been disproven time and time before. An SSD in 2016 was faster than the PS5 SSD, at 6.7GB/s. In 2019 there was an SSD with over 9GB/s. Stop posting things as fact that can be easily disproven with a simple Google search.
He is talking about HIS SSD--for all we know he has a SATA III drive.

Very few people have drives that go beyond PCIe 3.0 speeds (it tops out around 3.5GB/s?)... So unless you are one of the lucky few with a PCIe 4.0 or some magic enterprise-class SSD... no regular desktop has a 5.5GB/s drive yet (well, very few; I'm sure some PCIe 4.0 drive has reached that kind of speed).
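The interface math backs this up; a quick sketch using the usual effective per-lane rates (after 128b/130b encoding overhead):

```python
# PCIe effective bandwidth: ~0.985 GB/s per lane for gen3, ~1.969 GB/s for gen4.
GEN3_PER_LANE = 0.985
GEN4_PER_LANE = 1.969

print(f"PCIe 3.0 x4: {4 * GEN3_PER_LANE:.1f} GB/s ceiling")  # ~3.9; real drives top out ~3.5
print(f"PCIe 4.0 x4: {4 * GEN4_PER_LANE:.1f} GB/s ceiling")  # ~7.9; room for a 5.5 GB/s drive
```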

MS will run with the RT thing, especially if they have a significant advantage over the PS5 with it, and for good reason.
 
If that's the only thing you have, you don't have much. And again, Nvidia have given the RT specs of their GPU. It is what it is.
This is all that's available at the moment, you can't really hold it against him.

Also, sometimes one guy working 4 weeks on something with enough dedication does more than 1 million monkeys working on something similar.
 

OmegaSupreme

advanced basic bitch
I don't know dude, you are certainly going out of your way to bend over for a certain company.
As I said, I am quoting direct Nvidia specs and direct XSX specs.
If that threatens the value you find in the $1000 GPU you bought, then I can't help you with that.
I'm not getting into this pissing contest, but I will say this. There is always a price to pay for early adoption. It's awesome consoles are getting ray tracing. It's awesome they are getting SSDs. Finally. We've been enjoying those for ten years. Enjoy your console. It'll be a great piece of tech at launch. Just like every other generation, though, it will become outdated and PC will be miles ahead.
 
Last edited:
He is talking about HIS SSD--for all we know he has a SATA III drive.

Very few people have drives that go beyond PCIe 3.0 speeds (it tops out around 3.5GB/s?)... So unless you are one of the lucky few with a PCIe 4.0 or some magic enterprise-class SSD... no regular desktop has a 5.5GB/s drive yet (well, very few; I'm sure some PCIe 4.0 drive has reached that kind of speed).

MS will run with the RT thing, especially if they have a significant advantage over the PS5 with it, and for good reason.
PC didn't need a super fast SSD, though... That's why we have RAM; even lower-speed RAM is multiple times faster than an SSD... I seriously wish people got this through their heads already.... The average PC gamer has 16GB of RAM, which is at least 4x faster than the PS5's SSD.
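Rough numbers, assuming a typical dual-channel DDR4-3200 setup (configs vary, so treat this as a sketch):

```python
# Dual-channel DDR4-3200: 3200 MT/s x 8 bytes/transfer x 2 channels, vs the PS5's rated SSD speed.
ram_gbs = 3200 * 8 * 2 / 1000  # ~51.2 GB/s
ps5_ssd_gbs = 5.5              # rated raw throughput
print(f"RAM is ~{ram_gbs / ps5_ssd_gbs:.0f}x faster")  # ~9x, so "at least 4x" holds easily
```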
Again, if you are using a demo knocked up by one guy over a 4-week period and comparing it to a beta release that has been worked on for more than a year by hundreds of people, then it really knocks down your credibility.
You know that's not in any way a meaningful comparison.
If that's the only thing you have, you don't have much. And again, Nvidia have given the RT specs of their GPU. It is what it is.
And Nvidia RTX specs blew next-gen consoles out of the water 2 years ago, before the next-gen consoles were even released... You want to discredit Minecraft as a minimal release, without material-based lighting or high-quality textures, yet have nothing to make up for it. Who has the better credibility now? The one who's making excuses, or the one with proof in the pudding?

Even half-assed performance would blow Nvidia cards out of the water if it came anywhere near that spec, which is supposedly 38x better than the best Nvidia card out there. Wouldn't you agree? Yes, no, maybe?
 
Last edited:
I'm not getting into this pissing contest, but I will say this. There is always a price to pay for early adoption. It's awesome consoles are getting ray tracing. It's awesome they are getting SSDs. Finally. We've been enjoying those for ten years. Enjoy your console. It'll be a great piece of tech at launch. Just like every other generation, though, it will become outdated and PC will be miles ahead.
It's the way it is. PC will steam ahead, consoles will be reborn in 6 years' time, and the process will repeat.

And I'm not having a go at Nvidia here. Nvidia have been pushing the new tech out before anyone. VRS, Mesh Shading, Ray Tracing, DLSS etc.
We need to be thankful that Nvidia has been putting this stuff out into the market even without any competition pushing them to.
They have led the way.
 

Ascend

Member
You are discounting any improvements in the tech.
Do you think Nvidia will be able to improve the RT performance of their next cards outside of just CUDA core and clock increases?
I fully expect they will.
AMD's RT tech hasn't even hit the streets yet.
Oh. You haven't seen AMD's RT patent? Here:

[diagram from AMD's ray tracing patent]

Here's nVidia's:

[diagram of Nvidia's RT core implementation]
It's basically the same thing. I'm sure there are differences in the details, but RT is not THAT complicated, so there's little room for divergence. Its requirement for such high computational power comes from its parallelism, not from its complexity. If the article is correct, AMD will be more efficient on die space while basically achieving the same result.
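To illustrate how simple the per-ray math is, here's a minimal sketch of the ray/AABB slab test at the heart of BVH traversal; the dedicated hardware exists because you need billions of these per second, not because any single test is hard:

```python
# Ray vs. axis-aligned bounding box (slab test), the core op RT units accelerate.
def ray_hits_box(origin, inv_dir, box_min, box_max):
    tmin, tmax = 0.0, float("inf")
    for axis in range(3):
        t1 = (box_min[axis] - origin[axis]) * inv_dir[axis]
        t2 = (box_max[axis] - origin[axis]) * inv_dir[axis]
        tmin = max(tmin, min(t1, t2))
        tmax = min(tmax, max(t1, t2))
    return tmin <= tmax

# A ray along +X from the origin against a box spanning x = 2..3:
print(ray_hits_box((0, 0, 0), (1.0, float("inf"), float("inf")),
                   (2, -0.5, -0.5), (3, 0.5, 0.5)))  # True
```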
 
PC didn't need a super fast SSD, though... That's why we have RAM; even lower-speed RAM is multiple times faster than an SSD... I seriously wish people got this through their heads already.... The average PC gamer has 16GB of RAM, which is at least 4x faster than the PS5's SSD.

And Nvidia RTX specs blew next-gen consoles out of the water 2 years ago, before the next-gen consoles were even released... You want to discredit Minecraft as a minimal release, without material-based lighting or high-quality textures, yet have nothing to make up for it. Who has the better credibility now? The one who's making excuses, or the one with proof in the pudding?

Even half-assed performance would blow Nvidia cards out of the water if it came anywhere near that spec, which is supposedly 38x better than the best Nvidia card out there. Wouldn't you agree? Yes, no, maybe?
The Minecraft thing isn't helping your argument, dude.
You have really taken this badly.
 
I'm sure it's much simpler than that, and you're making it seem convoluted because, shocker: you're an Nvidia guy and gush over your 2080 Ti and RTX any chance you get. Don't think we don't have history that I wouldn't bring up just because we're 'friends'.

The only thing it needs to do is gather surface and corresponding volume data and calculate based upon that; there's nothing to suggest that the texture quality or resolution itself plays any part in the calculation of ray interaction.

Carpet is carpet, marble is marble. It doesn't matter if it's a 512x512 or 4096x4096 texture map; the resulting interaction would be the same.
The texture has nothing to do with it. The materials do... and you know that. Stop being stupid. Notice how there are NO reflections in the Series X demo? Did you notice that? I sure did. I also noticed the number of bounces is reduced... as is the chunk draw distance... as are many other things.

Hey... you have to learn to stop taking logical and obvious observations as an attack on your system of choice... and you know damn well I know what I'm talking about.
 
The Minecraft thing isn't helping your argument, dude.
You have really taken this badly.
Any other half-assed examples you got? Cause one of us has an argument, the other, not so much..... You haven't helped your original post yet. No factual information, not even speculation.... No?
 

Real

Member
Numbers are bullshit.

AMD calculation: 4 TMUs * 52 CUs * 1.825 GHz = 379,600,000,000 intersections per second

Nvidia calculation: (14 TFLOPS [FP32] * 80%) + (14 TIPS [INT32] * 28% [~35 INT32 ops for every 100 FP32 ops, which take up 80% of the workload]) + (100 TFLOPS [ray tracing] * 40% [half of 80%]) + (114 TFLOPS [FP16 Tensor] * 20%) = 77.9 RTX-OPS

You can't compare straight across.
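Side by side as a quick script (figures as quoted above; the workload weightings are Nvidia's own marketing assumptions, not measurements), the units mismatch is obvious:

```python
# AMD's quoted metric: intersection units x CUs x clock = ray-box intersections/sec.
amd_intersections = 4 * 52 * 1.825e9  # ~3.8e11 intersections/s

# Nvidia's "RTX-OPS" marketing metric, weighted by assumed workload share.
rtx_ops = 14 * 0.80 + 14 * 0.28 + 100 * 0.40 + 114 * 0.20  # ~77.9

print(f"AMD:    {amd_intersections:.3e} intersections/s")
print(f"Nvidia: {rtx_ops:.1f} RTX-OPS")
# Different units, different assumptions -- no straight-across comparison possible.
```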
 