
AMD David Wang: We Won’t Implement DirectX Raytracing (DXR) In Games Until It’s Offered In All (Gaming) Product Ranges.

IbizaPocholo

NeoGAFs Kent Brockman
https://wccftech.com/amds-david-wan...-dxr-until-its-offered-in-all-product-ranges/

According to Wang, the company does not plan to support DXR until it becomes available across their entire product line – from the low end to the high end.

“For the time being, AMD will definitely respond to DirectX Raytracing. For the moment we will focus on promoting the speed-up of offline CG production environments centered on AMD’s Radeon ProRender, which is offered free of charge ….. Utilization of ray tracing in games will not proceed unless we can offer ray tracing in all product ranges from low end to high end.” – David Wang, AMD, in an interview with 4Gamer.
 

peronmls

Member
I think this is just a clever way of saying "we don't even have it yet.."
Uhhh, no? They know not to waste resources on features that aren't ready for the game industry. RT is a performance hog and you won't be able to use it at 4K with anything. Smart move for AMD. Learn from others.
 
Last edited:

manfestival

Member
I think this is just a clever way of saying "we don't even have it yet.."
What is the point of that? What he is saying makes total sense. No need to waste resources trying to launch a competitive product when there's still really nothing to show for it.
 
Sounds like he is saying it won't come from them until it can come to budget, midrange and console gaming. Seems like a winning financial strategy. I'll get an Xbox 2 and PS5 before I upgrade my 1080.
 

lukilladog

Member
1.5nm should be able to put 2080 Ti-like RT performance on mainstream cards, but that's like year 2030 lol, and nobody is gonna want to play at 1080p with mediocre frame rates at that point in time. RT is a joke and a waste of people's resources.
 

octiny

Banned
Completely agree. And this is after buying a 2080 Ti, which died, by the way, like many others on the interwebz lolz. Nobody, including myself, wants minimal *hybrid* RT @ 1080P on a $1200 card. The sad part is it's two months later & not a single RTX game has been released. Nvidia really borked the reveal & launch.

On another note, RTX is only based on Microsoft's new ray-tracing implementation in DirectX 12. It's not like Nvidia developed it from scratch. AMD could easily come up with their own implementation based on it, but it just doesn't make sense performance-, capability- & price-wise right now. Real non-hybrid ray tracing in the majority of cards is at least 2 generations away.
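
That's really the point of DXR living inside D3D12: the game just asks the runtime what the installed driver reports. A minimal sketch of that check (assuming the Windows 10 October 2018 SDK headers; the helper name is just for illustration), where whichever vendor's driver is present, Nvidia's today or AMD's whenever they ship one, simply answers with its supported tier:

    #include <windows.h>
    #include <d3d12.h>

    // Ask the D3D12 runtime whether the installed GPU/driver exposes DXR.
    // Nothing here is vendor-specific; an AMD driver could report a tier
    // the same way Nvidia's does today.
    bool SupportsDXR(ID3D12Device* device)
    {
        D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
        if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                               &options5, sizeof(options5))))
            return false;  // older runtime: the feature struct isn't known yet
        return options5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
    }

If it comes back D3D12_RAYTRACING_TIER_NOT_SUPPORTED, the game falls back to its raster path, which is exactly what it already does on every card without RT hardware.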
 
Last edited:

GreenAlien

Member
Well, makes sense, no (or almost no?) games support it yet and that will not change until there are enough potential users.
 

Leonidas

Member
Sad to see AMD continue to fall farther behind in GPU technology.

They still haven't matched the 1080 Ti (which is a previous-gen card now).
No signs of AMD having built-in ray-tracing technology.

Hoping Navi will at least provide 1080 Ti level performance at a decent price.

The sad part is it's two months later & not a single RTX game has been released.

MS hasn't reinstated the Windows update which allows DXR/RTX to work. BFV is supposed to get it in a day-0 patch (11/15), but that can only happen if MS releases the update, which was supposed to come out last month.
 

dirthead

Banned
I don't even buy AMD cards, but to suggest that the Nvidia 20x0 series has been anything but a failure so far is a laugh.

It's something that the hardware can't do properly yet, and that won't be supported in a non-trivial way in any games, since they're all designed around consoles that can't do it. It's bullcrap. Full HDMI 2.1 support should have been prioritized before this crap.
 

octiny

Banned
MS hasn't reinstated the Windows update which allows DXR/RTX to work. BFV is supposed to get it in a day-0 patch (11/15), but that can only happen if MS releases the update, which was supposed to come out last month.

Exactly, because it was borked. So time will tell when the first RTX game arrives. Regarding BFV, they did say "near launch window," not specifically a day-1 patch, which leaves room for a delay of up to another month.

"DXR enables realistic real-time ray traced reflections in Battlefield V for players with NVIDIA GeForce RTX graphics cards including the GeForce RTX 2070, 2080, and 2080 Ti. An early release of DXR will be available in an upcoming patch, near the Battlefield V Deluxe Edition release window. EA, DICE, and NVIDIA will also continue to optimize this implementation and deliver regular updates after its release."

Sad to see AMD continue to fall farther behind in GPU technology.

They still haven't matched the 1080 Ti (which is a previous-gen card now).
No signs of AMD having built-in ray-tracing technology.

They don't need to match Nvidia's top 1 or 2 cards. Only a small minority, like myself, buy such cards; the terrible yields are just not worth it. Cards are priced according to their tier in comparison to Nvidia: RTX 2070 vs. Vega 64, with the 2070 (essentially a 1080) coming in @ $500-600, while Vega 64 is around $400 now ($399.99 @ Newegg). On top of Vega 64 already coming within 15% of a 1080 Ti (2080) in Battlefield V, it's handily beating a 1080 (2070) in the majority of newer games @ 1440P or higher. That's without even taking into account its huge Vulkan game advantage & the consistent driver performance improvements to the Vega architecture.

Do I wish AMD had a top-tier card? Sure, but it's not necessary from a business perspective. The RX 480/580/Vega 56 & the 1060/1070 price range is where the vast majority of gamers are, which is why so many people bought used 1080 Ti's & 1080's before the 20-series reveal, as prices temporarily dropped to the points where most people buy. Now they've gone back up in price as the used market & new stock have been depleted, due to the similar performance of the 20 series. The RX 590, based on 12nm FinFET, is going to sell like hot cakes when it launches in 1 week. Also, Navi originally wasn't technically meant to be top-tier performance, that's what Kuma was; ironically they changed the Kuma name (top-tier) back to Navi because they didn't like the Kuma code name lol. So we'll essentially be seeing Navi 10 & Navi 20 (with 20 launching after Navi 10 near the end of next year), in similar fashion to Vega 10 & 20 (though the 7nm Vega 20, aka Instinct, will not be coming to the consumer GPU market).

Edit: Tad more info
 
Last edited:

dirthead

Banned
Do I wish AMD had a top-tier card? Sure, but it's not necessary from a business perspective. The RX 480/580/Vega 56 & the 1060/1070 price range is where the vast majority of gamers are. The RX 590, based on 12nm FinFET, is going to sell like hot cakes when it launches in 1 week. Also, Navi originally wasn't technically meant to be top-tier performance, that's what Kuma was; ironically they changed the Kuma name (top-tier) back to Navi because they didn't like the Kuma code name lol. So we'll essentially be seeing Navi 10 & Navi 20 (with 20 launching after Navi 10 near the end of next year).

You're completely right. High end cards are basically made for developers as a target for next generation games. End users hit diminishing returns very quickly with video cards. Software on the shelf simply doesn't take advantage of them until they're already out of date.
 

SonGoku

Member
Eh? I will never use 4K on my PC because I find it a waste. Why shouldn't I have access to it?
lol never, really? In 10 years 4K will be a trivial res that any midrange card will blow through, just like 1080p is today.
You're also missing the point: if their highest-end card can only do 1080p, that would make it unfeasible for mid and low range at any res. So it won't be able to support all ranges.
You're completely right. High end cards are basically made for developers as a target for next generation games. End users hit diminishing returns very quickly with video cards. Software on the shelf simply doesn't take advantage of them until they're already out of date.
If you want 60fps+ at 1440p/4k they are absolutely necessary
or for people who run games at 100fps+
 
Last edited:

Shai-Tan

Banned
I think Nvidia is already doing this. Hybrid rendering isn't good enough and won't be for a while. They're hyping this up so gamers pay for their customers in rendering/modeling, etc. (anyone who bought an RTX card for ray tracing in games is being hoodwinked). I expect very little from the non-raster part of my 2080 Ti. This isn't quite PhysX again, but with all the corners cut to do ray tracing there will be quite a few instances where old raster techniques look close enough, or are close enough considering the difference in performance.

DLSS is another instance of the same marketing strategy. Apparently 4K DLSS performs and looks similar to an 1800p upscale. Shoehorning neural networks into it is how they get gamers to pay for a general-compute use case.
 
Last edited:

Leonidas

Member
"DXR enables realistic real-time ray traced reflections in Battlefield V for players with NVIDIA GeForce RTX graphics cards including the GeForce RTX 2070, 2080, and 2080 Ti. An early release of DXR will be available in an upcoming patch, near the Battlefield V Deluxe Edition release window. EA, DICE, and NVIDIA will also continue to optimize this implementation and deliver regular updates after its release."

I see they've delayed it then. At any rate, it can't be too much longer.

They don't need to match Nvidia's top 1 or 2 cards.

Agreed, but at this point there are 4 Nvidia cards which outperform the Vega 64: all 3 RTX cards and the 1080 Ti. I don't consider the RTX 2070 and Vega 64 to be on the same level. The RTX 2070 conclusively beats Vega 64.

Vega 64 is around $400 now ($399.99 @ Newegg).

It's not right now. Outside of a loud reference model (for $449), Newegg is selling Vega 64s for $499-$649. A rip-off compared to the RTX 2070.
 

octiny

Banned
Agreed, but at this point there are 4 Nvidia cards which outperform the Vega 64: all 3 RTX cards and the 1080 Ti. I don't consider the RTX 2070 and Vega 64 to be on the same level. The RTX 2070 conclusively beats Vega 64.

What you consider doesn't matter; the actual reality is what matters. Two Nvidia GPUs outperform it (2080/2080 Ti). Updated drivers have put the Vega 64 ahead in most benchmarks, with even the Vega 56 sometimes beating a 1080. There was actually an in-depth test done on Reddit as well with the newest drivers; I'll try & find it. The 1080/2070 loses in a good number of the newer benchmarks, this isn't even up for debate. BFV is the icing on top as the newest AAA game out, with a Vega 56 even kicking the 1080's ass @ 1080P/1440P. Even worse, that's without undervolting, which Vega cards are known for....which would push it ahead even more.

Link


It's not right now. Outside of a loud reference model (for $449), Newegg is selling Vega 64s for $499-$649. A rip-off compared to the RTX 2070.

It's $400. Reference or not, it's still a lot cheaper. Via promo code on Newegg for both the ASRock Vega 64 & the Sapphire, and on eBay as well with no promo code needed. These sales have been going on for the past 2 weeks, which points to an incoming official price drop. If you want to talk about a rip-off, that would be the 2070, aka the 1080, for $500+ 2 years later :messenger_grinning_sweat:. Every single RTX card is worthless besides the 2080 Ti at their current price points when compared to the 10 series.

Link

Man, look at all those Vega 64 cards being sold today lol.
 
Last edited:

octiny

Banned
Oh, the eBay Newegg store; I was looking at newegg.com. At newegg.com they are ripping people off on Vega 64.

Yeah, usually when it comes to AMD on Newegg, they send out the promo codes in their newsletters, which are easily obtainable from Slickdeals or /r/buildapc on Reddit if you don't subscribe. In the past, this happened before an official price drop, so I'm thinking that's where it's headed, to coincide with the RX 590 launch.
 

Leonidas

Member
What you consider doesn't matter; the actual reality is what matters.

These are results from reputable websites.

https://www.3dcenter.org/news/gefor...tresultate-zur-wqhd-performance-im-ueberblick

The 2070 won on every site, and does so while consuming much less power. Averaged across multiple websites, the stock reference 2070 came out 10% ahead. In one of the results where Vega 64 was close (TechSpot), they used an overclocked Vega 64 vs. a stock reference RTX 2070.

I'll believe the above results over some dude on Reddit.
 
Last edited:

octiny

Banned
These are results from reputable websites.

https://www.3dcenter.org/news/gefor...tresultate-zur-wqhd-performance-im-ueberblick

The 2070 won on every site, and does so while consuming much less power. Averaged across multiple websites, the stock reference 2070 came out 10% ahead. In one of the results where Vega 64 was close (TechSpot), they used an overclocked Vega 64 vs. a stock reference RTX 2070.

I'll believe the above results over some dude on Reddit.

You're reaching so hard lol. You just pointed to a review-aggregation site (a "lol" of one at that) that compiles launch-review scores taken on launch drivers, when it's already been proven that Vega has improved a shitload since the launch drivers. Talk about a lazy way to try & prove a point. You only need to work your way through new data on newer drivers; it's out there. Vega 64 performs better than the 1080/2070 in the majority of newer games, which makes sense as more games move towards DX12 & Vulkan. Unless you enjoy ignoring facts? This historically also coincides with previous AMD cards performing better as the generations went on. Like I said, this isn't debatable. AMD GPUs age like fine wine.

Here's gamegpu, one of the most reputable benchmark sites out there. For every game where you find the 1080, aka the 2070, doing better (both cards trade blows, depending on the game), you'll find the Vega 64 now fares better in twice as many. All taken across various resolutions, mainly 1440P. Even if you add 8% for a 2070 going by the site you posted (mind you, they averaged sites using launch 1080 drivers as well) to each score, Vega 64 still comes out ahead today....even without undervolting, which adds an easy 150MHz+ to the sustainable boost under load without OCing, something a 2070 wishes it could achieve as it barely has any headroom. Let's not forget Turing is just an extension of Pascal, so its driver potential for performance improvement is minimal at best, compared to, say, when Pascal first launched. On top of the fact that minimums & 1-percentile frames are almost always better with Vega, Vega is simply the better card w/ a better price to boot. I'm a guy with 2 Titan XPs, a 2080 Ti, 2 1080 Tis & a 1080 (which I'm more than happy to prove), so I'm no AMD homer...just spitting facts.









The only major games that come to mind where you'll see a 1080/2070 with a clear win are COD and Assassin's Creed, as well as some games based on old engines. SOTTR is too close to call, as AMD performs on par with both depending on what site you go to.

In any event, my case still stands. Vega 64 is a steal now & it's only going to get better as drivers constantly mature & newer games come out, especially once next-gen systems arrive. It reminds me of the 290X in terms of longevity, which has been handily outperforming the 780 Ti by a huge margin for quite some time now, which wasn't the case near launch.
 
Last edited:

OverIt

Member
lol never, really? In 10 years 4K will be a trivial res that any midrange card will blow through, just like 1080p is today.
You're also missing the point: if their highest-end card can only do 1080p, that would make it unfeasible for mid and low range at any res. So it won't be able to support all ranges.

Do you not understand how PC gamers play games? All of us customize our gameplay experience: some of us put focus on lighting, some of us on resolution (multiple screens), some of us on FPS or textures. That's the whole point of the PC platform.
 

Leonidas

Member
You're reaching so hard lol. You just pointed to a review-aggregation site that compiles launch-review scores taken on launch drivers

A few examples of those compiled reviews have these drivers being used

From Anand

NVIDIA Release 416.33 Press
AMD Radeon Software Adrenalin Edition 18.9.1

SweClockers

AMD Radeon Software 18.8.1

Where are you getting launch drivers from? These are up to date.

Cherry picked benchmarks don't mean anything to me. Vega 64 is faster in some games, but it's not on equal footing with the 2070, as seen in the compiled review results which were run on updated drivers.
 

octiny

Banned
A few examples of those compiled reviews have these drivers being used

From Anand



SweClockers



Where are you getting launch drivers from? These are up to date.

Cherry picked benchmarks don't mean anything to me. Vega 64 is faster in some games, but it's not on equal footing with the 2070, as seen in the compiled review results which were run on updated drivers.

Correct, clearly facts showing newer games run better don't mean anything to you lol. Google the newest games out: Vega 64 is ahead in Hitman 2 & BFV. And no, the majority of those reviews reused data from launch reviews without retesting every single card in the performance charts in said reviews.

Edit: But hey, if you like taking data from older games with skewed driver data & averaging it out to try and prove a point, be my guest! If that's the case, there really isn't anything to debate about :)
 
Last edited:

Leonidas

Member
Correct, clearly facts showing newer games run better don't mean anything to you lol. Google the newest games out

I'm not getting into a cherry picking contest. I'm sure if I looked, I'd find many newer titles where the 2070 won.

And no, the majority of those reviews reused data from launch reviews without retesting every single card in the performance charts in said reviews.

The majority of the sites updated their test suite with games that came out this year.
 
Last edited:

SonGoku

Member
Do you not understand how PC gamers play games? All of us customize our gameplay experience: some of us put focus on lighting, some of us on resolution (multiple screens), some of us on FPS or textures. That's the whole point of the PC platform.
In 10 years 4K will be as trivial as 1080p is today.
Claiming you will never move past a certain resolution is naive and the equivalent of playing at 480p nowadays.
 

PocoJoe

Banned
I see they've delayed it then. At any rate, it can't be too much longer.



Agreed, but at this point there are 4 Nvidia cards which outperform the Vega 64: all 3 RTX cards and the 1080 Ti. I don't consider the RTX 2070 and Vega 64 to be on the same level. The RTX 2070 conclusively beats Vega 64.



It's not right now. Outside of a loud reference model (for $449), Newegg is selling Vega 64s for $499-$649. A rip-off compared to the RTX 2070.

And how many consumers have money to waste 1000+ €/£/$ on one card?

Only rich ones + simple-minded ones (that have little to spend but still buy 'em).

It is wiser just to drop details/resolution than pay for overpriced cards.

The reason why I quit PC gaming and moved to PS4:

I had PCs from 1995 till 2013; I could always buy a cheap CPU (50-150€) and overclock it, plus a midrange GPU for 150-200€.

Now a good overclockable CPU costs 250-350€ and a midrange GPU 300-400€.

And Nvidia just continues to pump up the prices, just like Apple, product by product.

AMD is wise and offers what most gamers want:

A CPU/GPU that is fast enough for normal gaming, at an affordable price.
 

octiny

Banned
I'm not getting into a cherry picking contest.

Right, so you don't want to compare data on new games? I see! Lol......


Proof? How is that even possible when the majority of sites updated their game suite for this year? You're reaching at this point...

I haven't seen any site re-using the same data; I must be clicking on all the wrong ones. Anand, SweClockers, Guru, they all updated their test suites to include games that weren't out when Vega 64 launched, and the others probably did as well. If you can show otherwise, do so, because I'm not going through all the reviews.

Proof? Just compare their launch-review numbers for Vega to the numbers in the new reviews. It's not rocket science. Data is reused on the majority of them. I'm not linking every single one; it's your link you're trying to use to prove a point, which has already been nullified by the pics I posted of NEW games. Since the beginning I've said Vega 64 has been outperforming the 1080 in the majority of newer games. You tried to dispute it with reviews of old data, the majority still using older games as benchmarks, which has done nothing to disprove my point of Vega performing better in the majority of newer games & continuing to do so as next-gen approaches with more use of newer APIs & updated engines.
 

OverIt

Member
In 10 years 4K will be as trivial as 1080p is today.
Claiming you will never move past a certain resolution is naive and the equivalent of playing at 480p nowadays.

More like 20 years. If ray tracing is really that expensive, I'd rather just use that and play at 1440p. I just don't see the point of going to a higher resolution than 1440p on a 30-inch screen. I won't be able to see the difference.
 

Leonidas

Member
Since the beginning I've said Vega 64 has been outperforming the 1080 in the majority of newer games.

I didn't contest that, nor did I look into it in regards to Vega 64 vs. 1080. You mentioned the 1080/2070 in that post with the cherry-picked benchmarks without posting 2070 results.

And how many consumers have money to waste 1000+ €/£/$ on one card?

Who knows, the 2080 Ti has been virtually out of stock since launch.

And that's the only $1000+ RTX card. 2080 prices have come down some since launch and will in time hit the "starting" price. RTX 2070 started at $499.

A CPU/GPU that is fast enough for normal gaming, at an affordable price.

Their CPUs are fine to me since I only care about high-res/60 FPS, but their GPUs all leave something to be desired as far as I'm concerned (I've had bad experiences with AMD cards in the past few years). I've owned both an RX 480 and a Vega 56 and was disappointed each time. The only good experience with AMD graphics I've had recently has been on consoles.
 
Last edited:

llien

Member
This is from the 20th of March, 2018.
Formerly known as AMD FireRays, Radeon Rays 2.0 is targeted at content developers who want to utilize high-performance ray-tracing capabilities with AMD GPUs, CPUs, and APUs via asynchronous compute.
AMD Has Its Own Ray-Tracing Technology (tomshardware)

Nvidia released its RTX in late August.

Do you not understand how PC gamers play games? All of us customize our gameplay experience: some of us put focus on lighting, some of us on resolution (multiple screens), some of us on FPS or textures. That's the whole point of the PC platform.
Because about 100 million players will suddenly abandon the "it just plays games" simplicity of consoles and dive into the world of "there's that shiny new GPU that costs as much as a 2-year-old GPU but is a whopping 6% faster", "oh, and while the 2080 Ti struggles RT-ing even at 1080p, there's also the 2070 RTX, just for lolz" and "oh, $1.3k for a graphics card alone is cool"?

I doubt it.
 
Last edited:
If you want 60fps+ at 1440p/4k they are absolutely necessary
or for people who run games at 100fps+

A 580 already handles 1440p pretty comfortably around 60FPS at a mixture of medium/high settings. Navi is rumored to offer 1080-level performance at $250. That would handle 1440p comfortably at high/max settings for most.
 

Solomeena

Banned
If there is one constant in life, like the sun rising every day, it's people making excuses for AMD and its incompetence with graphics cards. Whether you like it or not, ray tracing is the future. It's pathetic that AMD is using an excuse to cover up its failure to keep up with Nvidia.
 

dark10x

Digital Foundry pixel pusher
Nobody, including myself, wants minimal *hybrid* RT @ 1080P on a $1200 card. The sad part is it's two months later & not a single RTX game has been released. Nvidia really borked the reveal & launch.
This isn't entirely their fault, though. The only way to solve this problem would have been to delay the launch, basically. Windows itself isn't ready. That's the issue. Then Microsoft pulled that Windows Update due to an unrelated issue and, thus, here we are.
 
If only AMD could actually produce a card that's not a minor refresh of a years-old 480. The stagnation in the graphics card market is real, and Nvidia repurposing its machine learning cores for gaming at super-high prices isn't much of an advancement in my book.

I think AMD might have an angle here: if next-gen consoles also won't include any hardware utilizing DXR (or something similar for Sony), AMD could use all the die space for more conventional performance and actually beat Nvidia outside RTX-supported titles. This would pretty much force Nvidia to either push really hard on dev support for RTX features or be put in a really difficult position, since they absolutely want to push the tensor and RT cores due to markets outside gaming, but what they really don't want to do is make bespoke chips for gaming and AI. The FP64 HPC chips seem to be the only thing Nvidia and AMD are willing to invest in for the HPC market alone, probably because they have a few big guaranteed clients like the DoE.

On the other hand, ray tracing seems like it's going to be the next thing eventually, and Nvidia's bespoke-cores approach might end up being the only real option in the long term. In any case, if AMD has learned anything from its past, it's that they probably shouldn't jump into unsupported tech first. They don't have the clout to push support, so even when they had some hardware feature that was superior, dev support was never really there. Even for Nvidia it's going to be an uphill battle to get RTX going. If AMD provides a Navi card with 2080 levels of performance for a significantly lower price, without ray-tracing support, people might still easily opt for it just because the current games they play won't benefit from it.
 

llien

Member
If only AMD could actually produce a card that's not a minor refresh of a years-old 480.
It's there and it's called Vega.
In fact there are 2 cards, Vega 56 and Vega 64, both quite competitive with current discounts.

Nvidia repurposing its machine learning cores for gaming
"Machine learning cores" multiply/add matrices. I can't even remotely imagine how that kind of operation could help with ray tracing.
 

octiny

Banned
This isn't entirely their fault, though. The only way to solve this problem would have been to delay the launch, basically. Windows itself isn't ready. That's the issue. Then Microsoft pulled that Windows Update due to an unrelated issue and, thus, here we are.

Oh, I know. The problem is Nvidia hyped the crap out of it to try & push sales, so in the end it doesn't really matter whose fault it is. It's still not in the hands of the minority of gamers who plan on actually using it. Nevertheless, the 20-series launch will definitely go down as one of Nvidia's worst. From the RTX delay, to a crap ton of 2080 Ti's dying (including mine), coinciding with the price increases and no sellouts across the line besides the Ti, they are definitely feeling the backlash & rightfully so.
 
Last edited:
It's there and it's called Vega.
In fact there are 2 cards, Vega 56 and Vega 64, both quite competitive with current discounts.


"Machine learning cores" multiply/add matrices. I can't even remotely imagine how that kind of operation could help with ray tracing.

That's because they aren't. Nvidia is using them for DLSS. RT cores are used for ray-tracing. Nvidia's designs are definitely built for machine learning and pro rendering purposes, and they're trying their best to come up with use cases in gaming to not have to do different chips for their gaming and professional markets.

Vega is anything but competitive in reality. It's priced appropriately against what Nvidia has on offer, but it places near-zero pressure on them, nor is it particularly profitable for AMD.
 

kraspkibble

Permabanned.
Ray tracing is expensive in both money and performance. A 2080 Ti can only do 1080p at 30-60fps with it on, and that costs well over $1,000.

It will be a good many years yet before it works its way into all cards. Forget about ray tracing on consoles until 2030-40. We're talking next-next-gen or even next-next-next-gen consoles.
 
Last edited:

llien

Member
nor is it particularly profitable for AMD
I see your point, given the size of the chip/interposer + HBM2 costs, but given how inflated GPU prices are at this point (Nvidia refusing to drop even the 1xxx-series prices), it should still be quite profitable for AMD.
 
I think plastic models and mercury puddles can be implemented well on screen with DX11 or even DX10; let's leave this fancy stuff as the $500+ justification over the previous gen for Nvidia fans.
 

JCK75

Member
You're reaching so hard lol. You just pointed to a review-aggregation site (a "lol" of one at that) that compiles launch-review scores taken on launch drivers, when it's already been proven that Vega has improved a shitload since the launch drivers. Talk about a lazy way to try & prove a point. You only need to work your way through new data on newer drivers; it's out there. Vega 64 performs better than the 1080/2070 in the majority of newer games, which makes sense as more games move towards DX12 & Vulkan. Unless you enjoy ignoring facts? This historically also coincides with previous AMD cards performing better as the generations went on. Like I said, this isn't debatable. AMD GPUs age like fine wine.

Here's gamegpu, one of the most reputable benchmark sites out there. For every game where you find the 1080, aka the 2070, doing better (both cards trade blows, depending on the game), you'll find the Vega 64 now fares better in twice as many. All taken across various resolutions, mainly 1440P. Even if you add 8% for a 2070 going by the site you posted (mind you, they averaged sites using launch 1080 drivers as well) to each score, Vega 64 still comes out ahead today....even without undervolting, which adds an easy 150MHz+ to the sustainable boost under load without OCing, something a 2070 wishes it could achieve as it barely has any headroom. Let's not forget Turing is just an extension of Pascal, so its driver potential for performance improvement is minimal at best, compared to, say, when Pascal first launched. On top of the fact that minimums & 1-percentile frames are almost always better with Vega, Vega is simply the better card w/ a better price to boot. I'm a guy with 2 Titan XPs, a 2080 Ti, 2 1080 Tis & a 1080 (which I'm more than happy to prove), so I'm no AMD homer...just spitting facts.









The only major games that come to mind where you'll see a 1080/2070 with a clear win are COD and Assassin's Creed, as well as some games based on old engines. SOTTR is too close to call, as AMD performs on par with both depending on what site you go to.

In any event, my case still stands. Vega 64 is a steal now & it's only going to get better as drivers constantly mature & newer games come out, especially once next-gen systems arrive. It reminds me of the 290X in terms of longevity, which has been handily outperforming the 780 Ti by a huge margin for quite some time now, which wasn't the case near launch.





There has been an ongoing series of articles about how AMD/Radeon launches a bit below Nvidia, but for some reason, once games/drivers mature later in the life of the cards, AMD outperforms Nvidia across the board. This has been true for the last few generations, so I don't doubt for a second that it's true with Vega.
 

LordOfChaos

Member
Hoping this also means it's included with the ~2020 consoles. Maybe adoption will take a while, but an entire 5-7-year console generation missing out on any form of it while it was taking off on PCs would be a big miss.
 
AMD are willingly (even if unconsciously) CHOOSING to lose the game and go bankrupt...

When the crap RTX cards were released I was enthusiastic and expected AMD's response to tackle it. So far, nothing, and all I'm hearing is pretty much "we're not even bothering anymore".
 

LordOfChaos

Member
AMD are willingly (even if unconsciously) CHOOSING to lose the game and go bankrupt...

Or they're running each architecture off 1/3rd the R&D budget Nvidia does.

Vicious cycle. Subpar performance leads to lower revenue leads to lower R&D leads to subpar performance.

Last I checked Nvidia was plopping 3 billion on each new architecture, and having split consumer and compute architectures at that, while AMD may have just scraped a billion for Vega.
 
Last edited:
Or they're running each architecture off 1/3rd the R&D budget Nvidia does.

Vicious cycle. Subpar performance leads to lower revenue leads to lower R&D leads to subpar performance.

Last I checked Nvidia was plopping 3 billion on each new architecture, and having split consumer and compute architectures at that, while AMD may have just scraped a billion for Vega.

Budget is not some magical currency that transforms into ideas, conceptions, planning, research and engineering the more you pour in. I think AMD did a way better job with less budget for years, until they completely abandoned the competition 2 years ago, which would have been okay if they were waiting for RTX to react...

The fact that they still don't have a card with 1080 Ti performance, and are not pursuing a hybrid ray-tracing pipeline, or at least announcing anything, is worrying.
 

SonGoku

Member
More like 20 years. If ray tracing is really that expensive, I'd rather just use that and play at 1440p. I just don't see the point of going to a higher resolution than 1440p on a 30-inch screen. I won't be able to see the difference.
Sure, if the choice was ray tracing or 4K, I would choose ray tracing at 1080p.
But in 10 years you won't have to make that choice; 4K will be the de facto base resolution (like 1080p is today) and we will be pushing even higher resolutions. 4K will be commonplace for midrange GPUs.
A 580 already handles 1440p pretty comfortably around 60FPS at a mixture of medium/high settings. Navi is rumored to offer 1080-level performance at $250. That would handle 1440p comfortably at high/max settings for most.
MAX SETTINGS, or at the very least everything on high (not ultra), with no compromises. For that, high end is still needed, not to mention we are getting close to a new generation where the lowest common denominator will raise the bar higher, and no single card exists yet that can push 4K at 60fps consistently on all titles with no compromises.
Also, for people who run games at 100fps+ at 1080p, high end is needed.
 
Last edited: