
Tech Press Accidentally Prove RTX Ray Tracing ISN'T Ready!

Hardware-accelerated ray tracing in consumer products is new tech, you obnoxious pedant.

This.

Ray tracing itself is nothing new. Some of the earliest algorithms were developed in the 1960s and 1970s. The hardware just wasn't there back then to render ray tracing at a reasonable speed. It was a very long and slow process, and the more complex the scene, the more intense the rendering became.
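To illustrate why scene complexity hurt so much, here is a minimal sketch of the classic ray tracing loop (a toy written for this post, not any particular renderer): one primary ray per pixel, tested against every object in the scene, so the cost grows with pixel count times object count before you even add shadow or bounce rays.

```cpp
// Toy ray tracer: one primary ray per pixel, tested against every sphere.
// Real renderers add shadow rays, bounces and samples on top of this,
// which is why complex scenes took hours per frame on 1980s hardware.
#include <cmath>
#include <cstdio>
#include <vector>

struct Vec3 { double x, y, z; };
static Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static double dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

struct Sphere { Vec3 center; double radius; };

// Distance along the (normalized) ray to the nearest hit, or -1 on a miss.
static double intersect(Vec3 origin, Vec3 dir, const Sphere& s) {
    Vec3 oc = sub(origin, s.center);
    double b = 2.0 * dot(oc, dir);
    double c = dot(oc, oc) - s.radius * s.radius;
    double disc = b * b - 4.0 * c;
    if (disc < 0.0) return -1.0;
    double t = (-b - std::sqrt(disc)) / 2.0;
    return t > 0.0 ? t : -1.0;
}

int main() {
    const int width = 64, height = 48;                 // tiny "framebuffer"
    std::vector<Sphere> scene = {{{0, 0, -5}, 1.0}, {{1.5, 0, -6}, 0.7}};

    for (int y = 0; y < height; ++y) {
        for (int x = 0; x < width; ++x) {
            // Primary ray through this pixel (simple pinhole camera at the origin).
            Vec3 dir = {(x - width / 2.0) / width, (y - height / 2.0) / height, -1.0};
            double len = std::sqrt(dot(dir, dir));
            dir = {dir.x / len, dir.y / len, dir.z / len};

            // Every ray is tested against every object in the scene.
            bool hit = false;
            for (const Sphere& s : scene)
                if (intersect({0, 0, 0}, dir, s) > 0.0) { hit = true; break; }
            std::putchar(hit ? '#' : '.');
        }
        std::putchar('\n');
    }
    return 0;
}
```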

Here's a famous example of ray tracing done on an Amiga computer in 1986. Each frame took an hour to render 64k worth of light beams.



Pixar didn't use ray tracing until Monsters University in 2013. Before that, they used their own methods to simulate ray tracing with their RenderMan algorithms, though they may have still used ray tracing in smaller doses. I think they use RTX cards for prototyping and setting up scenes in real time now.

There have been 3D modeling packages with ray tracing feature sets in them. But as far as hardware ray tracing goes, the RTX cards are the first. It's still baby steps.
 
Last edited:

Ascend

Member
Hardware-accelerated ray tracing in consumer products is new tech, you obnoxious pedant.
Really? Something being present for the first time in a specific market does not make that something itself new... That would be like putting 3D screens that don't require glasses in cars for the first time, and calling glass-free 3D new, even though the 3DS did it like over 8 years ago. Yeah... It doesn't work that way.
nVidia sure knows how to brainwash people...

Lastly... This is not for you, but a general statement... Rather than calling someone a fanboy because of his content, how about you start using counter arguments saying why you think he's wrong, rather than dismissing things based on your own bias?
 
Last edited:

GymWolf

Member
I just hope that they don't waste resources on the next-gen consoles for this thing....
I'm very happy with great graphics at 4K with my fake shadows and reflections, thanks.

I tried RTX in Control, but it doesn't come close to the benefits of native 4K or 60 frames...
 
Last edited:

TUROK

Member
Really? Something being present for the first time in a specific market does not make that something itself new... That would be like putting 3D screens that don't require glasses in cars for the first time, and calling glass-free 3D new, even though the 3DS did it like over 8 years ago. Yeah... It doesn't work that way.
nVidia sure knows how to brainwash people...

Lastly... This is not for you, but a general statement... Rather than calling someone a fanboy because of his content, how about you start using counter arguments saying why you think he's wrong, rather than dismissing things based on your own bias?
Yeesh, y'all are so eager to discredit the work of engineers who busted their asses to get hardware-accelerated ray tracing working in a viable, fast way.

Fucking gamers.
 

Ascend

Member
Yeesh, y'all are so eager to discredit the work of engineers who busted their asses to get hardware-accelerated ray tracing working in a viable, fast way.



Anything that requires a $1000+ GPU to run at 1080p 60fps is not 'fast', and whether it's considered a viable solution is for each person to decide. To me, it's not. And to many people, it's not a viable solution... See the comments in this video;

 

thelastword

Banned


Anything that requires a $1000+ GPU to run at 1080p 60fps is not 'fast', and whether it's considered a viable solution is for each person to decide. To me, it's not. And to many people, it's not a viable solution... See the comments in this video;

IQ looks much better in Control and Battlefield on the AMD card...... Are they using DLSS in Control for this video? It looks muddier on Nvidia... Very big IQ gap in Control; NV has this fuzzy look in this video..... Well, framerates? As I said in the OP, people value framerates and res over anything RTX is bringing, which seems to be minimal changes over rasterization. Even when they dial down rasterization from previous efforts, the differences are still minimal..... That has to say something now, innit...
 

01011001

Banned


Anything that requires a $1000+ GPU to run at 1080p 60fps is not 'fast', and whether it's considered a viable solution is for each person to decide. To me, it's not. And to many people, it's not a viable solution... See the comments in this video;



getting ray tracing of this quality running in real time at 1080p 60fps was thought to be a pipe dream only a couple of years ago.
so yes, this is "fast"
 
None of these are an optimal use of raytracing. It's all about maximising the screenshot potential of Nvidia hardware.

Raytracing, when used independently of vendor interference, will still be worthwhile, but will be more subtle and with less of an insane performance hit.

There are ways to use, for example, ray traced reflections as a step between the screen-space and reflection-map options, and you can control the number of rays used per frame to limit the performance hit. But this takes integration beyond short-term vendor influence vs not using it at all.
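As a rough illustration of that kind of budgeted integration, here's a hedged sketch; every function name and threshold below is invented for the example, the point is only the structure: reuse screen-space data where it's valid, spend real rays on the pixels that matter most until the per-frame budget runs out, and fall back to a cubemap for the rest.

```cpp
// Hypothetical reflection pass with a per-frame ray budget and fallbacks.
// All functions and thresholds here are stand-ins, not any engine's real API.
#include <cstdio>
#include <vector>

struct Color { float r, g, b; };

static Color TraceReflectionRay(int)         { return {1.0f, 1.0f, 1.0f}; } // accurate, expensive
static Color ScreenSpaceReflection(int)      { return {0.8f, 0.8f, 0.8f}; } // cheap, fails off-screen
static Color SampleCubemap(int)              { return {0.5f, 0.5f, 0.5f}; } // cheapest, least accurate
static bool  ScreenSpaceHitAvailable(int px) { return px % 3 == 0; }        // fake visibility test
static float ReflectionImportance(int px)    { return (px % 100) / 100.0f; } // roughness, area, etc.

static std::vector<Color> ResolveReflections(const std::vector<int>& pixels, int rayBudget) {
    std::vector<Color> out(pixels.size());
    int raysSpent = 0;
    for (size_t i = 0; i < pixels.size(); ++i) {
        int px = pixels[i];
        if (ScreenSpaceHitAvailable(px)) {             // 1) reuse on-screen data when valid
            out[i] = ScreenSpaceReflection(px);
        } else if (raysSpent < rayBudget &&
                   ReflectionImportance(px) > 0.5f) {  // 2) real rays only where they matter
            out[i] = TraceReflectionRay(px);
            ++raysSpent;
        } else {
            out[i] = SampleCubemap(px);                // 3) everything else: prefiltered cubemap
        }
    }
    std::printf("rays spent this frame: %d\n", raysSpent);
    return out;
}

int main() {
    std::vector<int> reflectivePixels(10000);
    for (int i = 0; i < 10000; ++i) reflectivePixels[i] = i;
    ResolveReflections(reflectivePixels, /*rayBudget=*/2000);  // budget tuned per GPU tier
}
```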

We're currently in a messy spot between naysayers and zealots.
 

TUROK

Member


Anything that requires a $1000+ GPU to run at 1080p 60fps is not 'fast', and whether it's considered a viable solution is for each person to decide. To me, it's not. And to many people, it's not a viable solution... See the comments in this video;


Completely unsurprising that you're unable to understand this outside of the context of "muh vidya gayms!"
 

Trimesh

Banned
Completely unsurprising that you're unable to understand this outside of the context of "muh vidya gayms!"

This is a technology that is being sold on the basis of the idea that it will improve video quality in games, being discussed on a website the primary focus of which is video games. What exactly are you expecting people to focus on?
 

Ascend

Member
Completely unsurprising that you're unable to understand this outside of the context of "muh vidya gayms!"
Oh, I understand the implications perfectly. RTX is not unlike G-Sync. nVidia comes out with their 'premium' solution. AMD follows with their open and flexible solution that gains more mass adoption, and ultimately nVidia's solution, while still moderately useful, will be too expensive to be the standard, and they will be forced to adapt to AMD's implementation despite resisting at first.
 
Last edited:

Trimesh

Banned
Oh, I understand the implications perfectly. RTX is not unlike G-Sync. nVidia comes out with their 'premium' solution. AMD follows with their open and flexible solution that gains more mass adoption, and ultimately nVidia's solution, while still moderately useful, will be too expensive to be the standard, and they will be forced to adapt to AMD's implementation despite resisting at first.

I think that's a little unfair - although I personally don't think that RTX is ready for prime time, I still feel the technology has some actual utility. G-SYNC was just bullshit from day 1 - they basically made the ability to support VRR conditional on the display using an extremely overpriced nVidia scaler chip.
 

Ascend

Member
Yep, sounds about right.
Really?
Who uses smaller nodes first? (In Before "They do it because they need to!")
Who had DX 10.1 features first?
Who came with a unified shader architecture first?
Who first came with tessellation support on their cards?
Who was the first to put sound through HDMI on their cards?
Who had DX12 support first?
Who had concurrent async compute first?
Where is nVidia's equivalent of Radeon Chill?
And how exactly did the whole Anti-lag thing go? Oh right... nVidia lied saying they did it first, and then they released a low latency mode to compete with AMD after AMD already released their anti-lag.

Long story short, your statement is pure BS, and it disgusts me.

I used to agree. Then I played Metro Exodus and experienced by far the best lighting I've ever seen in a videogame. Not even close.
Yeah... So? You do realize that a 5700XT and even an RX 580 can run ray tracing in Minecraft and also look amazing?



And there's the Crytek Demo.
And Intel is now implementing a DX11 ray tracing solution that works on all graphics hardware... If you hate AMD, at least take this one as a sign. So... Yeah... The signs point to RTX becoming obsolete in the future, but hey, we'll let time decide who's right and who isn't.
 
Last edited:

kraspkibble

Permabanned.
i have an RTX card and i agree it's not ready. at least not for most people. right now it's still very demanding to run. in a game where you can be playing at 4K 60fps on high/ultra settings you need to go down to 1080/1440p and deal with chaotic framerates.

i just think most people don't realise how impressive it is that we even have realtime ray tracing games today. sure the performance sucks but it's realtime raytracing! before RTX cards you only ever saw raytracing in cutscenes or movies and of course they are prerendered so they had all the time in the world to render them to just 30fps!

raytracing is definitely the future. with next gen consoles supporting it and future RTX cards it will become more mainstream and performance will improve. i don't think consoles will have a great experience with raytracing but next gen Nvidia cards will see an improvement. hopefully AMD/Intel can also provide gpus with raytracing support.

i know i must sound like an nvidia fanboy but if it wasn't for them most likely nobody would be even talking about raytracing right now. someone had to come out and give others a push. by all means don't buy an RTX card and shit on nvidia to your hearts content but 5 years from now raytracing will be in all our games (or majority of them) and we'll think nothing of it. it'll be normal.

Yeah... So? You do realize that a 5700XT and even an RX 580 can run ray tracing in Minecraft and also look amazing?

And there's the Crytek Demo.
And Intel is now implementing a DX11 ray tracing solution that works on all graphics hardware... If you hate AMD, at least take this one as a sign. So... Yeah... The signs point to RTX becoming obsolete in the future, but hey, we'll let time decide who's right and who isn't.

oh wow. minecraft. im impressed.... :messenger_tears_of_joy:

yes AMD, future intel cards, and non RTX cards can run raytracing. the point of the RTX cores is that they're dedicated for raytracing. a 1080 Ti can't perform the same as a 2080. simple as that.

RTX won't be obsolete. i very much doubt it's going anywhere. we might get another 1 or 2 generations of RTX branded cards but i suspect eventually they'll go back to GTX branding for all their cards or do something new. the actual technology on the cards will still be there. a point will come where raytracing is more supported and mainstream. every card in nvidia's lineup will be capable of RTX right down to the x30/x50 cards. right now RTX is limited to the higher end cards. it wouldn't make sense to stick RTX cores into cards like the GTX 1650/1660. when the cost drops then sure we'll see it in those cards.
 
Last edited:

Ascend

Member
i have an RTX card and i agree it's not ready. at least not for most people. right now it's still very demanding to run. in a game where you can be playing at 4K 60fps on high/ultra settings you need to go down to 1080/1440p and deal with chaotic framerates.

i just think most people don't realise how impressive it is that we even have realtime ray tracing games today. sure the performance sucks but it's realtime raytracing! before RTX cards you only ever saw raytracing in cutscenes or movies and of course they are prerendered so they had all the time in the world to render them to just 30fps!

raytracing is definitely the future. with next gen consoles supporting it and future RTX cards it will become more mainstream and performance will improve. i don't think consoles will have a great experience with raytracing but next gen Nvidia cards will see an improvement. hopefully AMD/Intel can also provide gpus with raytracing support.

i know i must sound like an nvidia fanboy but if it wasn't for them most likely nobody would be even talking about raytracing right now. someone had to come out and give others a push. by all means don't buy an RTX card and shit on nvidia to your hearts content but 5 years from now raytracing will be in all our games (or majority of them) and we'll think nothing of it. it'll be normal.
You don't come off as a fanboy, because you acknowledge the current issues. Coreteks said it best... His analogy of RTX is on point. He says to imagine that a company comes out with a new car and puts wings on it. The car can't really fly, but if you drive off a cliff, you will be able to float for a short while. Even though the car will be smashed afterwards, everyone including reviewers are saying that since flying cars are the future, it's better to pay more to buy this car with wings. And another company that brings out a car without wings gets bad reviews, even though the 1st company doesn't actually really achieve flying.

I must agree that if it wasn't for nVidia, ray tracing would not be such a hot topic today. But I'm still not sure if that's a good thing. Games have many more important problems today that should get a higher priority over shinier graphics. What good is ray tracing in battlefield when the creators bash their own audience/customers over political jargon?
 

kraspkibble

Permabanned.
You don't come off as a fanboy, because you acknowledge the current issues. Coreteks said it best... His analogy of RTX is on point. He says to imagine that a company comes out with a new car and puts wings on it. The car can't really fly, but if you drive off a cliff, you will be able to float for a short while. Even though the car will be smashed afterwards, everyone including reviewers are saying that since flying cars are the future, it's better to pay more to buy this car with wings. And another company that brings out a car without wings gets bad reviews, even though the 1st company doesn't actually really achieve flying.

I must agree that if it wasn't for nVidia, ray tracing would not be such a hot topic today. But I'm still not sure if that's a good thing. Games have many more important problems today that should get a higher priority over shinier graphics. What good is ray tracing in battlefield when the creators bash their own audience/customers over political jargon?

i do agree with the last part. as nice as Raytracing is i think there are other areas that could use more attention. i mean games look great today without raytracing but i'd like to see more focus on animations and AI.
 

Ascend

Member
oh wow. minecraft. im impressed.... :messenger_tears_of_joy:
What is the issue with Minecraft? Compare for yourself... The AMD video above... And the RTX 2070 Super in Minecraft with Ray Tracing. Note that the 2070 Super is at 1440p rather than 1080p, so it's not a direct comparison. But it should give some sort of indication of how close or far apart the RT cores really are for ray tracing...


yes AMD, future intel cards, and non RTX cards can run raytracing. the point of the RTX cores is that they're dedicated for raytracing. a 1080 Ti can't perform the same as a 2080. simple as that.
I am aware of that. I don't see how that changes anything. But... I've posted this before and I'll post it again.... Draw your own conclusion (2:15 to 3:50);



RTX won't be obsolete. i very much doubt it's going anywhere. we might get another 1 or 2 generations of RTX branded cards but i suspect eventually they'll go back to GTX branding for all their cards or do something new. the actual technology on the cards will still be there. a point will come where raytracing is more supported and mainstream. every card in nvidia's lineup will be capable of RTX right down to the x30/x50 cards. right now RTX is limited to the higher end cards. it wouldn't make sense to stick RTX cores into cards like the GTX 1650/1660. when the cost drops then sure we'll see it in those cards.
I wasn't referring to RTX as a brand, but RTX as it's implemented in current Turing cards. It most likely will need some adaptation to run things more efficiently.

Bonus video;
 

01011001

Banned
What is the issue with Minecraft? Compare for yourself... The AMD video above... And the RTX 2070 Super in Minecraft with Ray Tracing. Note that the 2070 Super is at 1440p rather than 1080p, so it's not a direct comparison. But it should give some sort of indication of how close or far apart the RT cores really are for ray tracing...


this Minecraft mod doesn't use the RT cores in the RTX card. it's also a very simplified form of raytracing since it doesn't track dynamic objects like animals or your player character. it basically only updates static blocks in the environment.

useless as a benchmark.

the official RTX support that's coming actually is fully realtime raytracing and will use the RT cores. meaning a non RT core card will completely buckle while a card with RT cores will run it just fine
 
Last edited:
Really?
Who uses smaller nodes first? (In Before "They do it because they need to!")
Who had DX 10.1 features first?
Who came with a unified shader architecture first?
Who first came with tessellation support on their cards?
Who was the first to put sound through HDMI on their cards?
Who had DX12 support first?
Who had concurrent async compute first?
Where is nVidia's equivalent of Radeon Chill?
And how exactly did the whole Anti-lag thing go? Oh right... nVidia lied saying they did it first, and then they released a low latency mode to compete with AMD after AMD already released their anti-lag.

Long story short, your statement is pure BS, and it disgusts me.

The company whose plans line up with the production capabilities of their manufacturing partners...
Who gives a SHIT about DX10? lmao
Who had the first GPU with programmable pixel shader support?
AMD, and their cards suck at it lmao
lmao... who the fuck uses HDMI sound from their GPU?
Nvidia had FULL DX12 support first.
Who cares when you have the performance crown without concurrent async compute?
Nvidia Whisper Mode... been around since 2017... And who needs an equivalent when any framerate limiter will do this.. lmao
Anti lag went uh... fine? Nvidia didn't lie.. lmao.. they LITERALLY took the option (max pre-rendered frames) that was already there, and changed how it was worded to bring attention to it since people didn't know...



Would you look at that... Nvidia's works just as well as AMD's... and incredibly enough... you can get even better results by keeping your GPU under 95-98% utilization without using either of them...

Long story short... Nvidia comes... AMD follows.

SLI - Crossfire
Nvidia Hairworks - TressFX
Gsync - Freesync
Shadowplay - Relive
Freestyle - RIS
Ray Tracing - Raydeon Tracing :rolleyes:
 
Last edited:

thuGG_pl

Member
So I should buy AMD to get worse performance than nvidia, and NO ray tracing option for similar price?
Where is the logic here?
 
Last edited:

Ivellios

Member
So I should buy AMD to get worse performance than nvidia, and NO ray tracing option for similar price?
Where is the logic here?

Current AMD cards have better performance in their price range compared to Nvidia, just look at the benchmarks.

The problem is that they have fewer features as well, and having no ray tracing is not great when the next-gen consoles will support it.
 

meirl

Banned
The vendetta some people have against RTX is ridiculous.

At least Nvidia are making an effort to advance tech. They wouldn't be able to get away with the silly prices if the competition was up to scratch.


This is mostly because the PS5 won't have true hardware-based RT (only audio RT; for visuals it's only software-based RT).
The next Xbox, on the other hand, WILL have hardware-based RT. That's why people try to downplay it beforehand.
 

01011001

Banned
This is mostly because the PS5 won't have true hardware-based RT (only audio RT; for visuals it's only software-based RT).
The next Xbox, on the other hand, WILL have hardware-based RT. That's why people try to downplay it beforehand.


to be fair, we don't know that yet.
it is highly suspicious that Sony didn't mention hardware accelerated RT yet but they also didn't confirm that they don't have it
 

FireFly

Member
Sad that the discussion seems to have devolved into "my GPU company can beat up your GPU company". Both AMD and Nvidia have been first with different features at different times, and a commonality is that first generation features are often powerful but not without compromise.

I think a good example is the Geforce 3. Having programmable pixel shaders was a huge advancement, and when Carmack demoed the Doom 3 engine running on the card it blew everyone away. On the other hand the final release required the majority of visual settings to be turned down to get a decent frame rate. Though it still looked highly impressive.
 

psorcerer

Banned
RTX is an example of diminishing returns. Where each new "technology" is not improving image quality enough to justify a performance drop.
Modern GPU is a general purpose processor with massive SMT and a bandwidth-oriented architecture.
The step forward would be to remove the barriers from that processor accessibility for developers but Nvidia is unwilling to do so. Currently Nvidia is a cancer. Sorry.
 
RTX is an example of diminishing returns. Where each new "technology" is not improving image quality enough to justify a performance drop.
Modern GPU is a general purpose processor with massive SMT and a bandwidth-oriented architecture.
The step forward would be to remove the barriers from that processor accessibility for developers but Nvidia is unwilling to do so. Currently Nvidia is a cancer. Sorry.
I disagree completely. This isn't diminishing returns. This isn't about image quality... this is about the accuracy of lighting and shadowing. It's about simulating the effects of something rather than imitating them. Simulation holds up from all angles, whereas imitation inevitably falls apart at the seams. Looking at a still screen-shot does NOT give you the effects of what this technology improves.

Not to mention this is all currently AN OPTION. If a technology's worth was based on whether or not its improvement justified a performance drop, then we'd never get anywhere.

Make the push... the technology will catch up. It's why I wish PC game devs would put more future looking settings into their games. Oh no... the "Insane" preset makes my fps drop :messenger_loudly_crying: "this game is terribly optimized!!!".... lmao no.

And I think it's going to be great looking back on this and the naysayers. People have been thoroughly convinced that current lighting and shadowing in games is 'accurate enough' and looks realistic... Visuals with OBVIOUS lighting and shadowing inconsistencies are accepted as 'acceptable'... After a couple years of games developed with RT in mind... people are going to look back on this current generation of games and think.. "How did we ever think that was acceptable?"

It happens all the time. Ambient Occlusion? People at the time said "oh it darkens the corners. Who cares about that?" and now try to play a game with no AO. It looks disgusting lol. Or how about games that we've thought looked "real" at the time and you look back and realize it was garbage? Gran Turismo PS1, PS2, PS3, and now GT Sport.

Watch... when GT comes out on PS5 with RT reflections, lighting, and shadows... it's going to put that to shame.

It's UNDENIABLY where the industry is going. GPUs are finally getting there to do these effects in real-time. And now that the initial push has been made... everyone is hopping on board. AMD fanboys are just mad because it was Nvidia that made the push. 🤷‍♂️
 
Last edited:

psorcerer

Banned
this is about the accuracy of lighting and shadowing

Nope. Accuracy is still shit.
It's just a different tradeoff.
You now need denoising and you have bad data structures that will bite you. Yes, you get slightly better reflections and shadows. But it's still a marginal improvement over path tracing that works okay on any GPU.
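For anyone wondering what the denoising point means in practice, here's a toy sketch (my own illustration, not any shipping denoiser): with only a few rays per pixel the raw result is noisy, so frames are typically blended over time, trading noise for potential ghosting.

```cpp
// Toy temporal accumulation: blend each noisy ray-traced frame into a history buffer.
// 'alpha' trades noise (higher alpha) against ghosting and lag (lower alpha).
#include <cstdio>
#include <vector>

struct Color { float r, g, b; };

static void AccumulateFrame(std::vector<Color>& history,
                            const std::vector<Color>& noisyFrame,
                            float alpha) {
    for (size_t i = 0; i < history.size(); ++i) {
        history[i].r = history[i].r * (1.0f - alpha) + noisyFrame[i].r * alpha;
        history[i].g = history[i].g * (1.0f - alpha) + noisyFrame[i].g * alpha;
        history[i].b = history[i].b * (1.0f - alpha) + noisyFrame[i].b * alpha;
    }
}

int main() {
    const size_t pixels = 1920 * 1080;
    std::vector<Color> history(pixels, {0.0f, 0.0f, 0.0f});
    std::vector<Color> noisy(pixels, {1.0f, 1.0f, 1.0f});  // pretend: a 1 ray/pixel result
    for (int frame = 0; frame < 30; ++frame)
        AccumulateFrame(history, noisy, 0.1f);
    std::printf("pixel 0 after 30 frames: %.3f\n", history[0].r); // converges toward 1.0
}
```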
 

slade

Member
So I should buy AMD to get worse performance than nvidia, and NO ray tracing option for similar price?
Where is the logic here?

This is where I am at the moment. If you build a top of the line PC that will keep up for the next few years, chances are few if any parts are going to come from AMD. The only thing they have at the moment is price vs performance comparisons but those only really make sense in the short term. If you want to build a top line PC for gaming for the next five years with no to minimal upgrades, an Intel CPU paired with a Nvidia GPU makes more sense.
 
Nope. Accuracy is still shit.
It's just a different tradeoff.
You now need denoising and you have bad data structures that will bite you. Yes, you get slightly better reflections and shadows. But it's still a marginal improvement over path tracing that works okay on any GPU.
Slightly better, he says :messenger_tears_of_joy:

Marginal improvement over path tracing?? What the hell are you even talking about? lol
 

Ascend

Member
Who gives a SHIT about DX10? lmao
We were talking about who did what first right? At that time, it was a big deal. Why? Because DX10.1 added tessellation as a feature (among other things), and nVidia did not have it. But we already know you like to dismiss all evidence that is inconvenient to your biases. In any case...

Who had the first GPU with programmable pixel shader support?
Completely missing the point of my post... Giving other examples does not change the fact that AMD came with innovations first too. You wish to dismiss them for no reason.

lmao... who the fuck uses HDMI sound from their GPU?
I do. And I bet more people do than you think. But thank you for once again showing that you want to undermine or downplay everything that AMD did first, even if nVidia copied it afterwards... Nice double standard you got going there...

Nvidia had FULL DX12 support first.
False. nVidia still does not have "FULL DX12" support, since it still does not support Stencil reference value from Pixel Shader and tier 2 resources heap.

Who cares when you have the performance crown without concurrent async compute?
Who cares if you lose the performance crown to $200 cards when you turn on RTX, right?

The point was to show that AMD has done its fair share of things first as well. That you wish to stick your head in the sand and pretend it's only nVidia that always does everything first is your problem. It does not conform to reality.

this Minecraft mod doesn't use the RT cores in the RTX card. it's also a very simplified form of raytracing since it doesn't track dynamic objects like aninals or your player characters. it basically only updates static blocks in the environment.
Is that why you can see shadows of the player on the ground with it on?

the official RTX support that's coming actually is fully realtime raytracing and will use the RT cores. meaning a non RT core card will completely buckle while a card with RT cores will run it just fine
If you say so... We'll see when it's released, and in addition, we will see if 'superior' RTX version will indeed have so much better performance and so much better image quality than this 'inferior' mod.

So I should buy AMD to get worse performance than nvidia, and NO ray tracing option for similar price?
Where is the logic here?
There indeed is no logic here, because AMD's cards are better price/performance and are definitely not similar prices. RTX is in no better state than Async compute is. The difference is that nVidia is pushing giant marketing for RTX in order to convince people that it's a killer feature, pretty much just like the times of PhysX, which was supposed to be the top of the line way of doing physics in games, and look how that turned out.

Sad that the discussion seems to have devolved into "my GPU company can beat up your GPU company". Both AMD and Nvidia have been first with different features at different times, and a commonality is that first generation features are often powerful but not without compromise.

I think a good example is the Geforce 3. Having programmable pixel shaders was a huge advancement, and when Carmack demoed the Doom 3 engine running on the card it blew everyone away. On the other hand the final release required the majority of visual settings to be turned down to get a decent frame rate. Though it still looked highly impressive.
Wow. Some logical sense in here. Thankfully....
I agree with you. And unified shaders was also an important evolution from that. I don't understand why people are so blind towards nVidia.

RTX is an example of diminishing returns. Where each new "technology" is not improving image quality enough to justify a performance drop.
Modern GPU is a general purpose processor with massive SMT and a bandwidth-oriented architecture.
The step forward would be to remove the barriers from that processor accessibility for developers but Nvidia is unwilling to do so. Currently Nvidia is a cancer. Sorry.
THANK YOU

This is where I am at the moment. If you build a top of the line PC that will keep up for the next few years, chances are few if any parts are going to come from AMD. The only thing they have at the moment is price vs performance comparisons but those only really make sense in the short term. If you want to build a top line PC for gaming for the next five years with no to minimal upgrades, an Intel CPU paired with a Nvidia GPU makes more sense.
Is that why Apple switched to AMD videocards, Google Stadia uses AMD videocards and Microsoft switched to AMD CPUs for their Surface laptops?
Yeah yeah I get it. It's not desktop.

But seriously... Buying an Intel CPU right now is really NOT the way to future proof your PC... It's the equivalent of buying an i5 vs an i7 5 years ago. The ones with an i7 can still use their CPUs, the ones with an i5 will have to upgrade or deal with a bunch of stutters. The situation is the same right now, except it's Ryzen vs Intel CPUs. The additional cores/threads that Ryzen has WILL be used in the very near future. Single core performance is not expected to jump that much anymore, and developers are forced to go 'wide', which is exactly why Intel is really in trouble right now, and they know it. They are losing market share in practically every market, and that includes the desktop market.
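To make "going wide" concrete, here's a toy example with nothing engine-specific assumed: the same chunk of per-frame work split across however many hardware threads the CPU reports, so extra cores translate directly into more chunks in flight at once.

```cpp
// Toy "go wide" example: split one workload across all available hardware threads.
#include <algorithm>
#include <cstdio>
#include <functional>
#include <future>
#include <numeric>
#include <thread>
#include <vector>

// Pretend per-frame work: sum one chunk of simulation data.
static long long ProcessChunk(const std::vector<int>& data, size_t begin, size_t end) {
    return std::accumulate(data.begin() + begin, data.begin() + end, 0LL);
}

int main() {
    std::vector<int> data(10000000, 1);
    unsigned workers = std::max(1u, std::thread::hardware_concurrency()); // 6, 12, 24 threads...

    std::vector<std::future<long long>> jobs;
    size_t chunk = data.size() / workers;
    for (unsigned i = 0; i < workers; ++i) {
        size_t begin = i * chunk;
        size_t end = (i == workers - 1) ? data.size() : begin + chunk;
        jobs.push_back(std::async(std::launch::async, ProcessChunk,
                                  std::cref(data), begin, end));
    }

    long long total = 0;
    for (auto& j : jobs) total += j.get();  // more cores -> more chunks running at once
    std::printf("workers: %u, total: %lld\n", workers, total);
}
```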
As for graphics... The 5700 XT cards, or the RDNA architecture, are the equivalent of AMD's first GCN cards, the HD 7000 series... That basically says it all about longevity.

Look around. Every major tech giant is going with AMD. What do you think that is going to do with developer support for AMD products? And don't get me wrong. I'm not saying you are definitely wrong. What I am saying is that there are signs that things are shifting. And if you really do want a system to last 5 years, you currently buy a good X570 AM4 motherboard with PCI-E 4.0 which will let you upgrade to even the Ryzen 4000 series, and you buy the graphics card that satisfies your performance needs while ignoring RTX.

And as a final reminder... I said RTX, NOT Ray Tracing. Let's quote Mr. Jensen here, on what RTX actually is;

"The Nvidia RTX is a PLATFORM, consisting of architecture, software, SDKs and libraries that allows us to COMBINE DIFFERENT types of rendering technologies into ONE unified and cohesive PLATFORM."
Source

Just remember that RTX is the next GameWorks. That is all.
 
Last edited:
All these companies pitching ray tracing don't know how to pitch it. Why not make a racing game demo? Cars are easy to pitch with RTX: put reflections on the car, global illumination, some proper tyre dirt with PhysX, and call it a day. Definitely not a woman putting on make-up, or "Control", where you're just inside offices. Same to the guys pitching new game engines: guys, simply recreate a Pixar film scene just like the original with the best graphics you can throw at it. Simple.
 
We were talking about who did what first right? At that time, it was a big deal. Why? Because DX10.1 added tessellation as a feature (among other things), and nVidia did not have it. But we already know you like to dismiss all evidence that is inconvenient to your biases. In any case...
Tessellation as a feature came first on the Xbox 360 GPU, then DX10.
 
SLI - Crossfire

Nvidia did not create SLI. It was developed by 3dfx for their Voodoo graphics cards in the late '90s. Nvidia bought the remnants of 3dfx after the company folded, then adapted SLI to their early GeForce cards.



Multiple GPUs are definitely going to be a requirement for pushing ray tracing going forward. Things like SLI and Crossfire are dead, but multiple GPUs are definitely still a thing with lower-level APIs like Vulkan and DX12. I think 20 teraflops is the ballpark estimate to get solid ray tracing at 1080p or 1440p?
 
Last edited:
Nvidia did not create SLI. It was developed by 3dfx for their Voodoo graphics cards in the late '90s. Nvidia bought the remnants of 3dfx after the company folded, then adapted SLI to their early GeForce cards.



Multiple GPUs are definitely going to be a requirement for pushing ray tracing going forward. Things like SLI and Crossfire are dead, but multiple GPUs are definitely still a thing with lower-level APIs like Vulkan and DX12. I think 20 teraflops is the ballpark estimate to get solid ray tracing at 1080p or 1440p?
In that case AMD didn't create Crossfire... ATI did...

And I didn't say they created it... I said they came first with it. Between Nvidia and AMD... Nvidia came first with a multi-GPU solution.
 
Last edited:

slade

Member
Is that why Apple switched to AMD videocards, Google Stadia uses AMD videocards and Microsoft switched to AMD CPUs for their Surface laptops?
Yeah yeah I get it. It's not desktop.

When you don't have a commanding lead in the industries you cater to, you can give all sorts of incentives to companies to pick your products. I mean, at the high end, Intel and Nvidia hardware isn't cheap. AMD's problem, though, is that it has had problems competing even at the low end.

But seriously... Buying an Intel CPU right now is really NOT the way to future proof your PC... It's the equivalent of buying an i5 vs an i7 5 years ago. The ones with an i7 can still use their CPUs, the ones with an i5 will have to upgrade or deal with a bunch of stutters.

You say this here and then just a paragraph down:

And if you really do want a system to last 5 years, you currently buy a good X570 AM4 motherboard with PCI-E 4.0 which will let you upgrade to even the Ryzen 4000 series, and you buy the graphics card that satisfies your performance needs while ignoring RTX.

OK, granted this may work but remember I said minimal to no upgrades over those five years. Currently, only Intel offers a CPU at 5ghz.

AMD may be at that speed next year but we know for sure the consoles won't be there. Meaning that if you build a high end gaming PC right now, AMD won't be the one you go to for either a CPU or a GPU.
 

thelastword

Banned
Time and time again people misunderstand and lose focus of the real conversation..... No one is against ray tracing. BTW, RTX is not just ray tracing, it's hybrid ray tracing + DLSS + whatever else Nvidia decides to implement in its RTX suite of features.....

So yes, ray tracing as a pure technology, nobody is against that. Nvidia did not invent ray tracing, it has been there for ages... The PS3 did ray tracing, older PCs before the PS3 did it...... It was done on cheap cell phone technology before Nvidia, and the list goes on..... As I said, nobody has really tried to commercialize ray tracing, mainly because they did not have a well-researched package that would make the feature viable yet for modern gamers......

For people who just can't seem to get any perspective, my arguments against Nvidia are simple......

-Imagination showed real-time ray tracing with a $150 tablet GPU way back in 2016 that was capable of 6 gigarays, equivalent to the RTX 2070, a card which debuted at $600..... A 2080 Ti by comparison has 10 gigarays, which makes this very clear. Now let's advance to late 2018 when RTX launched; don't you know that 6 gigarays per second is now much cheaper than $150, and 10 gigarays is also now much cheaper than that $150 tablet technology too?......

-Yet Nvidia is not even doing real full path tracing, bi-directional path tracing or real-time ray tracing as can be seen in some of those old demos mentioned, with Imagination or on PS3...... They are mostly using one feature in the graphics pipeline: shadows, reflections or lighting, and using rasterization to stitch the effect together for slightly better gains, or not, in that one aspect of the graphics pipeline...... Why did I say "or not"? In many instances the results are not ideal, like reflections everywhere, on all surfaces, in Battlefield or Control, which is not realistic or aesthetically pleasing..... So what we have in many instances is an overuse of one feature, which becomes overdone and wasted, even when it's not necessary or correct on all surfaces, e.g. the blood on the floor in Control being a mirror, where every surface is reflective with the same intensity. Then we can also talk about all the denoising that must be done on the image in Battlefield to correct lots of IQ issues, and then you combine that with the shimmering, noise and blur you get with DLSS on top of RTX's noise..... With such results so far, it only means that people are not even aware of what proper ray tracing should look like, and in RTX's case, nothing close to what proper and controlled hybrid ray tracing could look like with good hardware. Remember, I said good hardware, because NV's RT hardware is anything but good; it's cheap, it's old technology, and this is why it's not performant.... Plain and simple.

In essence.... 6-10 gigarays of old mobile GPU technology performance is what Nvidia is trying to sell you as the holy grail.... Forget about Nvidia ploughing folks without lube in the price department, that's just one of their sins.... The reason why ray tracing performs so badly on Nvidia isn't because it's expensive to produce, it's because it's old and cheap technology that just can't deliver the uplift we need whilst keeping the resolutions and framerates we prefer.... It has nothing to do with ray tracing hardware being so expensive or being such a strain on hardware; the truth is much better hardware can be produced to commercialize RT today, and much stronger and longer R&D investments for better software solutions can be made too....... RTX is all about Nvidia's greed to sell you something they did not spend enough time and research on, just in an effort to dominate the industry since they knew mindshare with them was high..... It's just that many folk woke up and did not drink the Kool-Aid...... They very well knew AMD was working on an open-standard ray tracing tech/policy for years, way before Nvidia ever showed a video or spoke about ray tracing, so they tried to one-up AMD by rushing some cheap mobile technology to market and, worse yet, fooling people into believing it's some expensive stuff........ Yet it's only a Chevrolet Spark under an Enzo's shell.... Anyone thinking buying RTX means they are future-proofed for ray-traced games in the next 2 years is in for a rude awakening....... RTX will not be the ray tracing standard late next year......
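For some rough context on those gigaray figures, here is back-of-the-envelope arithmetic only; the rays-per-pixel counts are illustrative assumptions, not measurements of any game, and real rays differ wildly in cost, so gigarays is a crude metric at best.

```cpp
// Rough math: how many gigarays/s different per-pixel ray counts demand at 1080p60.
#include <cstdio>

int main() {
    const double pixelsPerSecond = 1920.0 * 1080.0 * 60.0;   // ~124 million

    const struct { const char* name; double raysPerPixel; } workloads[] = {
        {"1 primary ray only",                  1.0},
        {"+ 1 shadow ray",                      2.0},
        {"+ reflection ray and a bounce",       4.0},
        {"path traced, a few samples/bounces", 16.0},
    };

    for (const auto& w : workloads) {
        double gigaraysPerSecond = pixelsPerSecond * w.raysPerPixel / 1e9;
        std::printf("%-36s ~%.2f gigarays/s at 1080p60\n", w.name, gigaraysPerSecond);
    }
    // Climbs from ~0.12 to ~2 gigarays/s, so quoted figures like "6 or 10 gigarays"
    // only leave headroom for modest per-pixel ray counts.
}
```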



 

TUROK

Member
I feel like this thread is an excellent example as to why you should never ever go full fanboy. That shit rots your brain.
 

Ascend

Member
OK, granted this may work but remember I said minimal to no upgrades over those five years. Currently, only Intel offers a CPU at 5ghz.
Even if you don't upgrade, you have the choice of going for an R5 3600 or an R9 3900X, depending on your budget. And I'm quite sure you'll still be able to find a new AM4 CPU in 5 years if you go for a budget option like the R5 3600. I mean really... Are these numbers below really THAT different, enough to justify the 9900K?
(1080p Ultra benchmark chart)


The answer is... No... And that's at 1080p... Go to 4K and the difference completely diminishes.

Next point... What is it with people and 5 GHz? It's a number and nothing more. Did you know what else did 5 GHz? The AMD FX-9590. That doesn't somehow make it a faster CPU than modern ones running at lower clocks. Clock speed alone means nothing, and the IPC difference between Intel and AMD is not huge anymore.
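Some rough arithmetic on why clock speed alone doesn't settle it (the numbers below are made up purely for illustration): per-core throughput is roughly IPC times clock, so a small clock deficit can be offset by a small IPC advantage.

```cpp
// Illustrative only, not benchmarks: per-core throughput ~ IPC x clock.
#include <cstdio>

int main() {
    const double clockA = 5.0, ipcA = 1.00;   // hypothetical "5 GHz" part
    const double clockB = 4.6, ipcB = 1.08;   // lower clock, slightly higher IPC

    double perfA = clockA * ipcA;             // 5.000 units
    double perfB = clockB * ipcB;             // 4.968 units -> within ~1% of A
    std::printf("A: %.2f  B: %.2f  (B is %.1f%% of A)\n", perfA, perfB, 100.0 * perfB / perfA);
}
```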

Additionally, as already mentioned, developers HAVE to go wide in their programming to get more out of CPUs. Does single core performance help? Obviously. But AMD is not so far behind as to see it as a concern. The 9900K is about 5% faster in single core speed. That is really nothing to write home about. The only real advantage that Intel has is that it has more widespread support than AMD CPUs, and as already mentioned, that is changing.

And since this thread is actually about RTX... Do you know how the additional CPU cores on AMD's CPUs that are currently idling might be used? To assist with ray tracing. So there's two things. You care about performance now, where the R5 3600 is the best choice, or you care about the future, which means you know the importance of more cores, and the 3900X is a better deal than the 9900K. Just my two cents, which just so happens to agree with this YouTube channel.

And yes, I really am suspecting that AMD will be using its Ryzen CPU cores to assist in ray tracing. Call me crazy.
 
Last edited: