
AMD ray-tracing hardware strategy overview.

This is what next-gen graphics will look like with a bit of luck; no need for ray tracing for another 10 years lol:


I particularly liked the clothing animation/physics; I expected similar from the next Halo.
Thing is: without RT, or at least some decent realtime GI, there will be a big graphical gap between realtime cinematics and gameplay. I don't want this to be the case.
I don't know; the effect of RTX being enabled on current games is quite subtle. Outside of reflections it doesn't seem to do much, and for non-reflective surfaces it might not be much better than voxel-based global illumination, which only requires a few TFs.
 

ethomaz

Banned
Are you sure the I/O portion will scale for nvidia? That's a big chunk of any die. It's a big reason AMD ended up with chiplets.
Of course I'm not sure... nVidia has no 7nm product to rely on :D
But the die size being 100% bigger is exactly the difference between the process nodes... whether nVidia can achieve that in a die shrink is another story.

AMD, for example, left a lot of unused space on Vega 7nm, which made the transition less effective in terms of die-space savings.
On that point, RDNA is a much better design:

Navi 10 diagram (not how the units are placed on the chip):



Navi 10's chip (real placement):

 

gspat

Member
Of course I'm not sure... nVidia has no 7nm product to rely on :D
But the die size being 100% bigger is exactly the difference between the process nodes... if nVidia can archive that in the die shrink is another story.

AMD for example left a lot of unused space on Vega 7nm that made the transition not that good in die space save.
In that point RDNA is a way better design:

Navi 10 diagram:



Navi 10's chip:

That's a worry to me...

No one knows just how badly the I/O stuff shrinks (AMD never mentioned exact ratios), so for all we know that portion may only shrink 10-25%, eating a good chunk of the die space and taking away room from other portions of the die.

This is probably why Navi has the CU limit it does for the 5700 series cards.

For Nvidia to make a 7 nm version of the 2080 Ti, it may have to make a mammoth die.
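To make that worry concrete, here's a rough back-of-envelope sketch. The split and scale factors are illustrative assumptions, not published figures; TU102 (the 2080 Ti die) really is ~754 mm² on 12 nm, but its I/O fraction and the per-block scaling are guesses:

```python
# Back-of-envelope die-shrink model: logic scales with process density,
# while I/O (PHYs, memory controllers) barely scales at all.
def shrunk_area(total_mm2, io_fraction, logic_scale, io_scale):
    """Estimate die area after a shrink when logic and I/O scale differently."""
    io = total_mm2 * io_fraction
    logic = total_mm2 - io
    return logic * logic_scale + io * io_scale

# Assumed numbers: TU102 (2080 Ti) is ~754 mm^2 on 12 nm; suppose 20% of it
# is I/O, logic shrinks to 0.52x, and I/O only manages 0.85x.
print(round(shrunk_area(754, 0.20, 0.52, 0.85)))  # ~442 mm^2, still a big die
```

Even with generous logic scaling, poorly shrinking I/O keeps the hypothetical 7 nm part well above mid-size, which is the point above.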
 

Kenpachii

Member
The reason they released a 5700 is the 8GB of VRAM. It's something Nvidia skimps on to artificially create upgrade pressure when next gen comes out. The 1660 and 2060 have no legs.

The 5700 is a far better choice to go for if you've got a lot of CPU performance and can deal with the AMD jank.

However, the price is iffy at best; it's more expensive than what a 2060 sells for atm (benchmarks directly from AMD aren't something anybody should care about), which simply isn't a good thing to push. It will be interesting for people that want to sit with the card for a while, but honestly, a year in front of a new console launch is an incredibly bad time to upgrade anyway.

So the market is going to be limited to people that really need an upgrade, don't want to upgrade again for a while, and do want to spend almost 400 bucks on a GPU just for the extra 2GB of VRAM.

It will have a market among people basically searching for a 2070 with a bit less performance at a cheaper price, who don't care about RTX.

The 5700 XT sits at about the same price as a 2070 and offers about the same performance, with no memory advantage. Basically a DOA product.

It is what it is, really.
 

somerset

Member
Again this ray nonsense replicates what we saw with the hardware tessellation nonsense a number of years back. Remember when hardware tessellation was going to be a 'game changer'? Despite AMD and Nvidia finally implementing a hardware *standard* supported via OpenGL and DirectX, hardware tessellation was a giant nothing-burger.

Today the gimmick is ray-tracing, which is actually *not* ray tracing (a method of rendering the entire image via screen space rays sent into world space) but ray/triangle collision algorithms calculated on the same old shader ALUs.

Nvidia currently has little ASICs in Turing that allow a configuration where geometry data can be more easily fed into shader code, the so-called ray shaders. A crude hardware hack that pre-empts industry-standard support for such configurations in DirectX and Vulkan.

Today you can do the *same* calculations on any GPU, which is why Nvidia retrofitted the ray extensions to Pascal- albeit at a far lower performance than Turing. But there are far better algorithms to use on existing GPUs.
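For the curious, the ray/triangle collision test described above is a handful of ALU operations per triangle, which is why any GPU (or CPU) can run it; dedicated RT units mainly accelerate BVH traversal and batch these tests. Here is a minimal sketch of the standard Möller–Trumbore intersection algorithm (illustrative only, not any vendor's actual implementation):

```python
# Möller–Trumbore ray/triangle intersection on plain 3-tuples.
def sub(a, b): return (a[0] - b[0], a[1] - b[1], a[2] - b[2])
def dot(a, b): return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]
def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def ray_triangle_intersect(origin, direction, v0, v1, v2, eps=1e-8):
    """Return distance t along the ray to the hit point, or None on a miss."""
    e1, e2 = sub(v1, v0), sub(v2, v0)
    p = cross(direction, e2)
    det = dot(e1, p)
    if abs(det) < eps:               # ray parallel to triangle plane
        return None
    inv_det = 1.0 / det
    tvec = sub(origin, v0)
    u = dot(tvec, p) * inv_det       # first barycentric coordinate
    if u < 0.0 or u > 1.0:
        return None
    q = cross(tvec, e1)
    v = dot(direction, q) * inv_det  # second barycentric coordinate
    if v < 0.0 or u + v > 1.0:
        return None
    t = dot(e2, q) * inv_det
    return t if t > eps else None

# A ray along +z hits the unit triangle one unit away:
print(ray_triangle_intersect((0.25, 0.25, -1.0), (0.0, 0.0, 1.0),
                             (0, 0, 0), (1, 0, 0), (0, 1, 0)))  # 1.0
```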

As for AMD's 5700 XT announcement: wait for the reviews. AMD has still revealed too little to speculate on. And the prices are just *obscene* (early adopters will be morons; AMD and Nvidia are going to batter down prices over the next 6 months to something far more realistic).

The biggie is not garbage for the hard-of-thinking like 'ray tracing' but the *only* thing that matters, namely performance per watt. Until the skilled benchmarkers get their hands on Navi, we don't have much that is concrete, just hope.

In many ways, computer hardware is now tracking like AAA games: silly prices on release, but pretty damn good deals six months down the road. My 1700 CPU fell to *one third* of its cost from purchase to today, and that's an amazing deal. Admittedly I'd hoped for a *sane* price from AMD like they managed with the 470/480, but success has clearly gone to their heads, and idiotically they think they are Nvidia now. Nvidia is going to teach AMD a *very* hard lesson over the next 6 months.

PS: everyone saying AMD is stuck at a ROP limit or CU limit or any other such garbage is a moron. How such idiocy spreads, heaven only knows. The *only* reason a company sticks with a specific data structure is convenience; often they have software tools that work well and are well tested within certain parameters. In the case of a chip company, you have the concept of automated layout, which respects the boundaries of *2D* geometry. We call this *topography*, and it speaks to interconnects and busses.

But mathematically, there is *nothing* preventing one from folding to higher levels. Yes, the dimensional split in the data trees introduces higher latencies between blocks, which in the case of rendering is a matter for the software systems that distribute workloads to take into account. True low-level rendering APIs like Vulkan can suppose the application coder will take responsibility for the utilisation of the work units.

Windows has this *lazy* curse where it is supposed that the OS/driver magically takes sh-t code and makes it work well across increasing resources (CPU cores/GPU 'cores'/flash etc.). Now this isn't so; it cannot be so. On the consoles the games can code 'to the metal', so the APU can jump to *any* level of complexity and be well utilised.

On the PC, it is tempting to keep to pre-historic configurations for as long as possible (and this applies to the *entire* PC architecture, which is now aging very badly indeed, largely thanks to Intel, Nvidia and Microsoft). Today the 5700 simply does this: it plays safe and milks the pre-new-console release period.

However what AMD and Nvidia have planned for the near future is going to blow people's minds.
 

thelastword

Banned
Hey, what do you know: anytime a new Radeon product releases, it's all about power draw, and if Radeon outperforms Nvidia? Ohhh, the price is too high....These guys want 2080ti performance for the price of a GTX 1060 from AMD.....It's preposterous.....

Having the best performing gaming GPU means nothing if only a few denizens buy it or can afford it....So I want to ask all these kids saying AMD is not matching the 2080ti: do they have a 2080ti? Moreover, how many people talking about the 2080ti in this thread have one? Yet again, all of a sudden, raytracing on Turing is a talking point and an important feature again.....So, what games are you guys playing with RTX on at this moment? Tell me, what recent games? More than that, how many games over the course of the last 9 months? Could you imagine if AMD released a card with only 4 games supporting its biggest feature pitch? Yet NV's GPU die is gargantuan in the name of raytracing, and Turing is already inching toward a year on the market.....

In truth, raytracing on Turing means nothing, nothing unless AMD comes on board, "whenever that is". Only then will the market open up to the feature, and only then will regular raytraced games be a thing, just as you have non-raytraced content dominating now........That will only happen with AMD onboard: when consoles have RT, when regular low-end to mid-range PC GPUs have RT, when even mobile has RT...It's the only way raytracing takes off and is seen as viable to devs: when the majority of people purchasing games have some type of RT hardware that's capable of giving suitable/acceptable gaming performance across all tiers of GPUs....
 

Tygeezy

Member
What's the point of releasing expensive video cards this year without ray tracing? The only reason I am for sure going with an AMD card for my ITX build is that I have to, due to my TV only having HDMI inputs and Nvidia not supporting VRR over HDMI. So if I want my build now, I have to buy a graphics card now and then upgrade in less than a year to get exactly what I want.
 

llien

Member
445 * 0.52 = 232mm2

Ohhh the magic.
Come on, you can't claim both things at the same time: either AMD is behind (and then, yeah, magic) or it is on par (and then, sure, no magic).

I expect nvidia to simply add a "S" at the end of the name, charge 50 bucks more and call it a day.
That wouldn't be confusing enough; since they still want to sell the older stuff, and nobody complained the last time they muddied the water, they might just keep the old naming.

Plus Turing has RT units in hardware that increases the die size.
That was estimated to be at around 22% of the die.

How your face will look if RTX 2070 in 7nm ends smaller than RX 5700???
If they'd strip off the RT cores and actually "cut it down", yeah, it would be a bit smaller. My face would look the same as now, maybe several months older.
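The arithmetic behind this exchange can be made explicit. The inputs below come from the thread itself (the ~445 mm² TU106/RTX 2070 die, the ~22% RT-area estimate, the 0.52 ideal node scaling); Navi 10's ~251 mm² is AMD's announced figure. Note the 0.52 factor is optimistic, since, as discussed earlier, I/O shrinks worse than logic:

```python
tu106_mm2 = 445      # RTX 2070 die (TU106, 12 nm), per the quote above
rt_fraction = 0.22   # thread's estimate of die area spent on RT hardware
node_scale = 0.52    # ideal 12 nm -> 7 nm area scaling used in the quote
navi10_mm2 = 251     # RX 5700 XT die area announced by AMD

stripped = tu106_mm2 * (1 - rt_fraction)  # TU106 with RT hardware removed
shrunk = stripped * node_scale            # ideal full-node shrink
print(round(shrunk), navi10_mm2)          # ~180 vs 251
```

Under these (generous) assumptions a stripped, shrunk 2070 would indeed come out smaller than Navi 10, which is the concession being made above.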
 

gspat

Member
Whats the point of releasing expensive videocards this year without ray tracing? The only reason I am for sure going with an amd card form my itx build is because I have to due to my tv only having hdmi inputs and nvidia not supporint vrr over hdmi. So if I want to have my build now I have to buy a graphics card now and then upgrade in less than a year to get exactly what I want.
These cards aren't really all that expensive by Nvidia standards.

I personally can't see the point in buying a card this year (and maybe next) for a "feature" that only a niche group of games uses.
 

Ascend

Member
Whats the point of releasing expensive videocards this year without ray tracing? The only reason I am for sure going with an amd card form my itx build is because I have to due to my tv only having hdmi inputs and nvidia not supporint vrr over hdmi. So if I want to have my build now I have to buy a graphics card now and then upgrade in less than a year to get exactly what I want.
What's the point of releasing ANY card WITH ray tracing? No one can argue that the visual improvement is worth the performance cost. The 2080 Ti, a $1200 graphics card, can run Quake II, a game that is over 20 years old, at only ~90 fps at 1080p... And you want ray tracing on slower cards...?

Stop buying into Ray Tracing. Seriously.
 

SonGoku

Member
It was a 2-part bet:
1. A GTX 1080/RTX 2060 can beat the PS5.
2. The PS5 is Navi 10 (i.e. 44-48 CUs, though to clarify, we now know Navi 10 is 36-40 CUs), not the bigger Navi 20.

All my guesses and bets are based around performance and power consumption. Features had nothing to do with it.
I did the two part bet, birthbysleep only bet on Navi Big
 

Tygeezy

Member
These cards aren't really all that expensive, according to Nvidia standards.

I personally can't see the point in buying a card this year (and maybe next) with a "feature" that a niche group of games use.
How niche is it really going to be with the next console generation supporting it?
 

CyberPanda

Banned
Whats the point of releasing expensive videocards this year without ray tracing? The only reason I am for sure going with an amd card form my itx build is because I have to due to my tv only having hdmi inputs and nvidia not supporint vrr over hdmi. So if I want to have my build now I have to buy a graphics card now and then upgrade in less than a year to get exactly what I want.
AMD is trying to stay relevant but is always a step behind Nvidia.
 

ethomaz

Banned
Wasn't aware spiderman was ever niche?
That is the joke... it was niche for a lot of gamers because it is exclusive to Sony.
Ray-tracing is niche for a lot of gamers because it is exclusive to nVidia.

See the similarity?

BTW, Niche-man sold gangbusters no matter how many gamers "thought" it was niche lol
 

gspat

Member
That is the joke... it was niche for a lot of gamers because it is exclusive to Sony.
Ray-tracing is niche for a lot of gamers because it is is exclusive to nVidia.

See the similarity?

BTW Niche-man sold gangbuster no matter how gamers "thought" it was niche lol
lol

Almost similar. RT is niche because it's 5 games/demos on special hardware within the PC ecosystem. Every machine within the PS4 ecosystem can run Spider-Man.

The only way they are both niche is they both fall under "Computer gaming" which is a niche unto itself.
 

thelastword

Banned
Whats the point of releasing expensive videocards this year without ray tracing? The only reason I am for sure going with an amd card form my itx build is because I have to due to my tv only having hdmi inputs and nvidia not supporint vrr over hdmi. So if I want to have my build now I have to buy a graphics card now and then upgrade in less than a year to get exactly what I want.
$450 for good 1440p performance is expensive?....Yet the Nvidia equivalent with less performance is more expensive......Raytracing is a non-feature atm......No one is buying these cards for raytracing; to play what? Battlefield at low rez and squashed framerates, Quake II at 540p 25 fps? Is that why AMD is behind Nvidia?

So you see, these folk have no problem paying sky-high for Nvidia and its mediocrity, trumpeting raytracing like it's revolutionary, when all they have is hybrid raytracing, four games with one raytraced graphical feature each, not even the real thing, minus a Quake demo......And yet, no one is tanking their rez+perf in Battlefield, in Tomb Raider or Metro for raytracing, when the visual differences are not even arresting to any degree....


$450 is too much money to spend on AMD for more perf, but $500/600 is A-OK to spend on Nvidia for less perf.....Better leave that logic at the door....The other thing that's a bit disingenuous is how no one speaks of AMD features and how revolutionary they are...Radeon Chill, the ID buffer, HBCC etc.....and now Fidelity FX, the Image Sharpening Filter and Anti-Lag, plus on average more RAM than the competition.....As you said, AMD is the only place you can get variable refresh rate over HDMI, which is common knowledge, but nobody speaks of these strengths of AMD over Nvidia as a positive. When they do, like you, they say "the only reason I'm buying AMD is because of this or that, if not I'd buy Nvidia".......Yet they can never praise AMD for doing things or offering features NV DOESN'T. Instead, they want all these AMD features that NV does not have at $5, whilst NVIDIA can charge $1200.00....to play a Quake demo at 1080p 60fps, because that's what we've been waiting for and that's what justifies RTX...

It's just like the Radeon VII. At $699, you can get the best productivity card for youtubers, amateurs and even entry-level pros without breaking the bank; it's a great 4K gaming card, and of course it has tremendous bandwidth and 16GB of HBM for futureproofing (which is expensive as all hell)....Yet NV charges $700-800 for an RTX 2080 with half the VRAM/bandwidth (cheaper VRAM too), which plays Quake at what? And nobody enables RTX to play Battlefield online or SP at lower frames and rez.......Yet every NV fan is justifying these prices.....Yet it's AMD that must be every NVIDIA fan's charity sponsor....I think it's about high time AMD gives the proverbial finger to said persons.....No NV fan is saying to NV "hey, RTX support is a mess, these cards are too expensive for just Pascal performance with less RAM at a 40% markup".....Everyone wants 2080ti performance for the price of a GTX 1060 from AMD, but they will never ask this of NV....Yet they will blast AMD for offering better perf in many GPU classes; there's always something, innit.....

Yet I know that such internet talk is just NV fans huffing and puffing; they all have 2080ti's, you see, and AMD just can't put out a card better than that, so why should they upgrade....:messenger_smirking:.....Lmao..... In any case, AMD has momentum now and Navi will do very well in this market......They targeted the right set of cards for the first line of NAVI products....
 

Ivellios

Member
Ray tracing might be niche now, but when next-gen console games start adopting it, it will become mainstream pretty fast. Plus, the number of games with ray tracing is only increasing at the moment; here is a list:

Vampire: The Masquerade Bloodlines 2
New Call of Duty Modern Warfare
Cyberpunk 2077
Wolfenstein Youngblood

and some more random games I don't know about

There is also the older games which already support the technology:

Battlefield 5
Shadow of the Tomb Raider
Metro Exodus

Source: https://www.digitaltrends.com/computing/games-support-nvidia-ray-tracing/

Personally, I'm interested in playing all of these games, and from everything I've read, even an RTX 2060 can run them with ray tracing on if you play with a 1080p/60fps limit.

I would ignore ray tracing in favor of AMD Navi if it were priced cheaper in the budget mainstream market (GTX 1660/RTX 2060), but since their cheapest card is more expensive than the RTX 2060, I don't see a reason to go AMD; I think ray tracing and DLSS at a cheaper price are worth more than AMD's 8GB of VRAM and small performance gains.
 

Tygeezy

Member
$450 for good 1440p performance is expensive?....Yet the Nvidia equivalent with less performance is more expensive......Raytracing is a non feature atm......No one is buying these cards for raytracing, to play what? Battlefield at low rez and squashed framerates, Quake II at 540p 25 fps? Is that why AMD is behind Nvidia?

So you see, these folk have no problem paying sky high for Nvidia and it's mediocrity, trumpeting raytracing like it's revolutionary, when all they have is hybrid raytracing for four games in one graphic feature, not even the real thing, minus a Quake demo......And yet, no one is tanking their rez+perf in Battlefield, in Tombraider or Metro for raytracing, when the visual differences are not even arresting to any degree....


$450 is too much money to spend on AMD for more perf, but $500/600 is A-OK to spend on Nvidia for less perf.....Better keep that logic at the door....The other thing that's a bit disingenuous is how no one speaks of AMD features and how revolutionary they are...Radeon Chill, ID buffer, HBCC etc.....and now Fidelity FX, Image Sharpening Filter and Anti-Lag, on average more ram over the competition...…..As you said, the only place you can get Variable rate refresh over HDMI, which is common knowledge, but nobody speaks of these strengths of AMD over Nvidia as a positive, when they do, like you, they say, "only reason I'm buying AMD is because of this or that", if not I'd buy Nvidia.......Yet, they can never praise AMD for doing things or offering features NVDON'T, instead they want all these AMD features that NV does not have at $5, whilst NVIDIA can charge $1200.00....to play a quake demo at 1080p 60fps, because that's what we've been waiting for and that's what justifies RTX...

It's just like Radeon 7, At $699, you can get the best productivity card for youtubers, amateurs and even entry level PRO's without breaking the bank, it's a great gaming 4k card and of course it has tremendous bandwidth and 16GB of HBM for futureproofing, (which is expensive as all hell)….Yet NV charges $700-800 for an RTX 2080 with half the vram/bandwidth (cheaper vram too) which plays quake at what? Where nobody enables RTX to play battlefield online or SP at lower frames and rez.......Yet every NV fan is justifying these prices.....Yet it's AMD that must be every NVIDIA FAN's charity sponsor....I think it's about high time AMD gives the proverbial finger to said persons...…No NV fan is saying to NV, hey RTX support is a mess, these cards are too expensive for just Pascal performance with less ram at a 40% markup...…Everyone wants 2080ti performance for the price of a GTX 1060 from AMD, but they will never ask this of NV....Yet they will blast AMD for offering better perf in many GPU classes, there's always something innit...…

Yet, I know that such internet talk is just NV fans huffing and puffing, they all have 2080ti's you see and AMD just can't put a card better than that, so why should they upgrade....:messenger_smirking:…...Lmao...… In any case, AMD has momentum now and Navi will do very well in this market......They targeted the right set of cards for the first line NAVI products....
I agree with a lot of what you say here, and would like to add the AMD tech where they do tone mapping on the graphics card for HDR gaming, which reduces input lag. That being said, I think you are downplaying ray tracing. It's going to be a staple of next-gen cards, and it's a good idea to at least have your graphics card on par, bare minimum, with console tech. I don't like upgrading my graphics card yearly, and I would feel very tempted to upgrade in less than a year when Navi with ray tracing comes out.
 

SonGoku

Member
Whats the point of releasing expensive videocards this year without ray tracing? The only reason I am for sure going with an amd card form my itx build is because I have to due to my tv only having hdmi inputs and nvidia not supporint vrr over hdmi. So if I want to have my build now I have to buy a graphics card now and then upgrade in less than a year to get exactly what I want.
What's the point of selling cheap cards when your overpriced cards are gonna sell anyway?
 

Ascend

Member
Ray Tracing might be niche now, but when next gen consoles games start adopting it will become mainstream pretty fast. Plus the number of games with ray tracing is only increasing at the moment, here is a list:

Vampire: The Masquerade Bloodlines 2
New Call of Duty Modern Warfare
Cyberpunk 2077
Wolfenstein Youngblood

and some more random games i dont know about

There is also the older games which already support the technology:

Battlefield 5
Shadow of Tomb raider
Metro Exodus

Source: https://www.digitaltrends.com/computing/games-support-nvidia-ray-tracing/

Personally im interested im playing all of these games, and from everywhere i read even a RTX 2060 can run these games with ray tracing on if you play at 1080p/60fps limit.

I would ignore ray tracing in favor of AMD Navi if it was priced cheaper in the budget mainstream market (GTX 1660/RTX2060), but since their most cheap card is more expensive than the RTX 2060, i dont see a reason to go AMD, since i think ray tracing and DLSS at cheaper price is worth it more than AMD 8gb vram and small performance gains.
At this point, ray tracing for consoles sounds to me like when Microsoft was talking about the power of the cloud for effects when they were about to launch the Xbox One. A gimmick.

Why would ray tracing be used if you can get very close with the traditional way of rendering, when you take a ~50% performance drop from it? It's either that, or AMD found a way to do it significantly faster than nVidia with their next RDNA architecture, in which case your nVidia RTX card will be useless anyway...
 

gspat

Member
Ray Tracing might be niche now, but when next gen consoles games start adopting it will become mainstream pretty fast. Plus the number of games with ray tracing is only increasing at the moment, here is a list:

Vampire: The Masquerade Bloodlines 2
New Call of Duty Modern Warfare
Cyberpunk 2077
Wolfenstein Youngblood

and some more random games i dont know about

There is also the older games which already support the technology:

Battlefield 5
Shadow of Tomb raider
Metro Exodus

Source: https://www.digitaltrends.com/computing/games-support-nvidia-ray-tracing/

Personally im interested im playing all of these games, and from everywhere i read even a RTX 2060 can run these games with ray tracing on if you play at 1080p/60fps limit.

I would ignore ray tracing in favor of AMD Navi if it was priced cheaper in the budget mainstream market (GTX 1660/RTX2060), but since their most cheap card is more expensive than the RTX 2060, i dont see a reason to go AMD, since i think ray tracing and DLSS at cheaper price is worth it more than AMD 8gb vram and small performance gains.
Sounds uncannily similar to people talking about VR... But they are just fanboys talking out their asses, right?
 

thelastword

Banned
I agree with a lot of what you say here and would like to add the AMD tech where they do tone mapping on the graphics card for hdr gaming which reduces input lag. That being said I think you are downplaying ray tracing. It's going to be a staple of next gen cards and it's a good idea to at least have your graphics card on par bare minimum with console tech. I don;t like upgrading my graphics card yearly and I would feel very tempted to upgrade in less tan a year when navi with ray tracing comes out.
Yes, but let's think about it.....These Navi cards, even without RT hardware, will not be as crippled as Pascal is using RT.......NV is dirty; I have no doubt they gimp performance on Pascal through the algorithms they use, only to sell Turing and its expensive "you need Turing hardware for raytracing"......

Then Crytek shows a beautiful raytraced demo on a Vega 56 running at a nice clip, none of that noisy Battlefield mess or overdone reflections where every object in the environment is a mirror.....Thing is, not just current Navi but Vega too will do much better than Pascal is doing now with RT games, because its CU architecture and streaming processors suit RT much more......Then all the unicorns that NV promised with RTX hardware are what AMD will deliver with Navi 20, by upping the CU counts with Infinity Fabric; they're already litmus-testing that with the PRO VEGA DUO.......

There's a reason Lisa "Bae" Su said she "loves small things" when showing Navi's die.....So, just as she's putting so many chiplets on a CPU, you'll see the same in the GPU space.....Nvidia's way is not forward-thinking or revolutionary; they just wanted to be first to market because they thought their fans were/are lemmings and would buy anything they produced at alarming rates and prices....and they are right to some degree, but I guess the camel's back broke with Turing.....


Also, the people thinking NV can just lower prices on Turing against Navi are not thinking.....Turing is a huge die; they can't just lower the price.....Nvidia already had a backup plan in the event that Turing did not jive, and that's the 1650, 1660, 1660ti, 1670ti, 1680ti etc.....They've already launched that series without RTX; that's how they planned to lower prices......
 

Ivellios

Member
At this point, ray tracing for consoles to me sounds like when Microsoft was talking about the power of the cloud for effects when they were gonna launch the Xbox One. A gimmick.

Why would ray tracing be used if you can get very close with the traditional way of rendering, if you get a 50% performance drop from it? It's either that, or AMD found a way to do it significantly faster than nVidia with their next RDNA architecture, in which case your nVidia RTX card will be useless anyway...

I don't have an RTX card yet, though.

But this is all pure speculation on your part. Both Sony and Microsoft announced ray tracing; we will only know how it will be implemented when the next-gen consoles launch. So it's far too early to say it will be just a gimmick.

Sounds uncannily similar to people talking about VR... But they are just fanboys talking out their asses, right?

I don't get what you are trying to say, but VR and ray tracing are completely different things.
 

Kenpachii

Member
$450 for good 1440p performance is expensive?....Yet the Nvidia equivalent with less performance is more expensive......Raytracing is a non feature atm......No one is buying these cards for raytracing, to play what? Battlefield at low rez and squashed framerates, Quake II at 540p 25 fps? Is that why AMD is behind Nvidia?

So you see, these folk have no problem paying sky high for Nvidia and it's mediocrity, trumpeting raytracing like it's revolutionary, when all they have is hybrid raytracing for four games in one graphic feature, not even the real thing, minus a Quake demo......And yet, no one is tanking their rez+perf in Battlefield, in Tombraider or Metro for raytracing, when the visual differences are not even arresting to any degree....


$450 is too much money to spend on AMD for more perf, but $500/600 is A-OK to spend on Nvidia for less perf.....Better keep that logic at the door....The other thing that's a bit disingenuous is how no one speaks of AMD features and how revolutionary they are...Radeon Chill, ID buffer, HBCC etc.....and now Fidelity FX, Image Sharpening Filter and Anti-Lag, on average more ram over the competition.....As you said, the only place you can get Variable rate refresh over HDMI, which is common knowledge, but nobody speaks of these strengths of AMD over Nvidia as a positive, when they do, like you, they say, "only reason I'm buying AMD is because of this or that", if not I'd buy Nvidia.......Yet, they can never praise AMD for doing things or offering features NV DOESN'T, instead they want all these AMD features that NV does not have at $5, whilst NVIDIA can charge $1200.00....to play a quake demo at 1080p 60fps, because that's what we've been waiting for and that's what justifies RTX...

It's just like Radeon 7, At $699, you can get the best productivity card for youtubers, amateurs and even entry level PRO's without breaking the bank, it's a great gaming 4k card and of course it has tremendous bandwidth and 16GB of HBM for futureproofing, (which is expensive as all hell)….Yet NV charges $700-800 for an RTX 2080 with half the vram/bandwidth (cheaper vram too) which plays quake at what? Where nobody enables RTX to play battlefield online or SP at lower frames and rez.......Yet every NV fan is justifying these prices.....Yet it's AMD that must be every NVIDIA FAN's charity sponsor....I think it's about high time AMD gives the proverbial finger to said persons...…No NV fan is saying to NV, hey RTX support is a mess, these cards are too expensive for just Pascal performance with less ram at a 40% markup...…Everyone wants 2080ti performance for the price of a GTX 1060 from AMD, but they will never ask this of NV....Yet they will blast AMD for offering better perf in many GPU classes, there's always something innit...…

Yet, I know that such internet talk is just NV fans huffing and puffing, they all have 2080ti's you see and AMD just can't put a card better than that, so why should they upgrade....:messenger_smirking:…...Lmao...… In any case, AMD has momentum now and Navi will do very well in this market......They targeted the right set of cards for the first line NAVI products....

Look at all those fanboys.

If they only knew the truth, they would all be buying AMD. Those idiots that bought nvidia and are having a good time playing games without any issues. What morons.



Hell, even devs won't give a damn about AMD anymore; look at how many people got their cards: not worth your time. AMD is no longer a thing in the PC segment other than to press prices down on nvidia hardware. Sadly, even nvidia isn't giving a shit anymore; that's how much they are not a thing anymore.
 

Ascend

Member
Look at all those fanboys.

If they only knew the truth. They would all be buying AMD. Those idiots that bought nvidia and have a good time playing games without any issue's. What a morons.



Hell even devs won't give a dam about AMD anymore look at how many people got there cards not worth your time. AMD is no longer a thing on PC segment other then to press prices on nvidia hardware. Sadly even nvidia isn't given a shit anymore that's how much they are not a thing anymore.
 
Summary of thread:

AMD: "We have no strategy, we're just going to hope that RT will be a fad since we don't actually fund any R&D for RTG."

Nvidia: "Call of Duty and Cyberpunk 2077 are going to support RT"
 