
AMD Radeon VII Announced

llien

Member
"it's very quiet"



Benchmarks (by AMD):
[Image: AMD benchmark slide]


 

shark sandwich

tenuously links anime, pedophile and incels
Funny that the example you are championing is a last-last-gen game (PS360).
When games developed exclusively for next gen happen, 8GB won't be enough.

You know which card isn't that much more powerful than the 1080 Ti either? The RTX 2080.
Funny how some people are using this to diss the VII when the RTX 2080 is the exact same... They are both in the same ballpark, similarly priced.
If 2080 were Nvidia’s flagship product and it released 2 full years after AMD released a similar-performing product, at the same price, we would be bashing it exactly the same as we are bashing Radeon VII now.

And the only reason why Nvidia can charge $700 for the 2080 is because they have no real competition in that market segment. Sadly Radeon VII doesn’t change much in that regard.
 

SonGoku

Member
AMD needs 7 nm to match Nvidia's 16 nm and they are so proud of it they put the 7 in the name. Okay.
This is just Vega on a new process, an old-ass arch with a hard CU limit; when Navi hits, then you can judge AMD.
Companies tend to release old-arch refreshes on new process nodes to get a grip on the new process and better yields.

Also, AMD is matching Nvidia's 12 nm with an old-ass arch (RTX 2080).
If 2080 were Nvidia’s flagship product and it released 2 full years after AMD released a similar-performing product, at the same price, we would be bashing it exactly the same as we are bashing Radeon VII now.

And the only reason why Nvidia can charge $700 for the 2080 is because they have no real competition in that market segment. Sadly Radeon VII doesn’t change much in that regard.
But this is not AMD's answer; the VII is a repurposed workstation card and their practice run at 7nm.
When their actual new arch hits this year, then you can judge AMD's performance.

Bolded: You are right, but you should be bashing Nvidia as well for offering the same performance for the money. It doesn't excuse them.
 

llien

Member
2 full years after AMD released a similar-performing product
Market didn't buy AMD's products even when they were superior all around. Money for R&D needs to come from somewhere; it's amazing AMD managed to roll out Zen.

Vega 7 is a die shrink of Vega, with some CUs disabled.
It is a 331 mm² chip built for another market, competing with nVidia's 545 mm² chip built for gaming on an older process.
AMD can't afford multiple parallel R&D pipelines (hopefully just yet); having a market-ready 7nm product this early is quite an achievement.
 

kraspkibble

Permabanned.
Very underwhelming.

This is the best they can do with their 7nm process? Why on earth did they put 16GB in? Not only is 16GB overkill, but HBM is expensive as shit. 8GB of GDDR6 would not compromise performance and would be much cheaper. They have shot themselves in the foot here. Also, I bet that this card will consume a shitload of power and output insane levels of heat.

I love AMD for what they're doing in the CPU market but let's face it they simply can't compete with Nvidia in the GPU market. Hopefully Intel can enter and give them some serious competition. Until then I'm even more excited for Nvidia's 7nm GPUs.
 

kraspkibble

Permabanned.
No.


Radeon Instinct.

570 is simply awesome, Vega 56 is not bad either, 64/7 are mediocre, but certainly far from "can't compete".

And that's not factoring in adaptive sync, after Nvidia admitted defeat.
Are you suggesting that they intentionally released a half-assed product then? It's the best they can do now. Of course it will improve, but we're talking about now. Not later this year or next year. For 7nm this is awful.

And why would any gamers need Radeon Instinct? They should have released an 8GB version for gamers.

OK, I take it back... maybe they can compete at the low end, but this isn't really competition for the 2080, and where is their card to compete against the 2080 Ti? Oh yeah, probably need to wait another year for that.
 

Ascend

Member
If 2080 were Nvidia’s flagship product and it released 2 full years after AMD released a similar-performing product, at the same price, we would be bashing it exactly the same as we are bashing Radeon VII now.
I seriously doubt that. Maybe you'll think all the following is unrelated, but look at the sentiment regarding DLSS right now, and compare it with the sentiment around Async compute back then... Even though both, for their time, have a similar amount of support, DLSS is touted as something great, and even a reason to get nVidia, while back then, Async compute was touted as a useless feature and not a reason to get AMD...

Look at the GTX 970's 3.5GB vs the RX 480 PCI-E power consumption issue. The GTX 970's 3.5GB was a deliberate deception that had no fix, while the RX 480 PCI-E power consumption was a driver error/bug that was fixed in less than a week and blown way out of proportion. The GTX 970 is still seen as a great card, while the RX 480 is seen as an 'ok' card. Not to mention that with the R9 390 being practically superior to the GTX 970 in every possible way except power consumption, people got the GTX 970 anyway, because higher power consumption is apparently worse than purposefully deceiving your customers. Except when it was the HD 5870 vs the GTX 480. Then all the importance regarding power consumption and heat went out the window.

nVidia promised working and improved Async compute drivers for Maxwell ages ago. No one cares that it was never implemented and swept under the rug. Yet everyone that knows about Vega's draw stream binning rasterizer is complaining about it never being implemented.

The point is that AMD is judged a lot more harshly than nVidia on all fronts, and nVidia is barely criticized, and if they are, it's not for long.
 

JohnnyFootball

GerAlt-Right. Ciriously.
I seriously doubt that. Maybe you'll think all the following is unrelated, but look at the sentiment regarding DLSS right now, and compare it with the sentiment around Async compute back then... Even though both, for their time, have a similar amount of support, DLSS is touted as something great, and even a reason to get nVidia, while back then, Async compute was touted as a useless feature and not a reason to get AMD...

Look at the GTX 970's 3.5GB vs the RX 480 PCI-E power consumption issue. The GTX 970's 3.5GB was a deliberate deception that had no fix, while the RX 480 PCI-E power consumption was a driver error/bug that was fixed in less than a week and blown way out of proportion. The GTX 970 is still seen as a great card, while the RX 480 is seen as an 'ok' card. Not to mention that with the R9 390 being practically superior to the GTX 970 in every possible way except power consumption, people got the GTX 970 anyway, because higher power consumption is apparently worse than purposefully deceiving your customers. Except when it was the HD 5870 vs the GTX 480. Then all the importance regarding power consumption and heat went out the window.

nVidia promised working and improved Async compute drivers for Maxwell ages ago. No one cares that it was never implemented and swept under the rug. Yet everyone that knows about Vega's draw stream binning rasterizer is complaining about it never being implemented.

The point is that AMD is judged a lot more harshly than nVidia on all fronts, and nVidia is barely criticized, and if they are, it's not for long.

AMD is often a victim of high hopes, and those high hopes unfairly turn into high expectations. False rumors about amazing performance get published by sites with credibility, and people start to believe that an absolutely killer and amazing performer is in the pipeline. And of course guys like IbizaPocholo post these on sites like GAF and the rumors spread like wildfire.

AMD is absolutely terrible at managing these rumors, as many of them are allowed to fester for far too long, and when the announced product isn't what people hoped for, people naturally get upset. They really need to get a PR team on the ball to squash the rumors.

Heck, AMD's best bet may be to lead people to believe that Navi and Ryzen 2 will not be what they were hoping for and then deliver something exceptional.

Nvidia, on the other hand, is the complete opposite, in that they are good at pissing people off initially and making people assume the worst, but then do something that surprises people (G-Sync now being compatible with FreeSync monitors, and the 2060 being a good performer at an OK price).

The mantra is that the only way AMD can "win" is if they deliver something with twice the performance and half the cost of the Nvidia rival. It's now ancient history, but people were bitching about Ryzen because it didn't destroy Kaby Lake at the time it was released, since the single-core performance was below Intel's. Since then Ryzen has become way more beloved, since it has opened the door for developers to take advantage of extra cores and the performance gap has closed.

However, I have no hesitation in saying that I am disappointed in the Vega 2 announcement. The performance looks to be pretty good if the benchmarks are to be believed; I expect it will trade blows with the 2080 in non-raytracing games. I don't see the lack of ray tracing as a big deal, because by the time ray tracing becomes more mainstream most people will have upgraded and AMD will likely have their own ray tracing. But the $699 price is a major turn-off, and they have very much left the door wide open for Nvidia to do something simple like drop the price of the 2080 by $50-$150 and immediately make the Vega 2 much less of a competitor. I personally was hoping for a $200-$300 card that could rival a 2070 and a $400-$500 card that could rival a 2080.
 

SonGoku

Member
Are you suggesting that they intentionally released a half-assed product then?
Yes! This is a Vega refresh at 7nm; it's impressive for what it is.
Companies tend to release old-arch refreshes on new process nodes to get a grip on the new process and better yields.

When Navi hits, we will see the best they can do with the 7nm process.
but this isn't really competition for the 2080
Why is that?
 

JohnnyFootball

GerAlt-Right. Ciriously.
OK, I take it back... maybe they can compete at the low end, but this isn't really competition for the 2080.

Dude, yes it is. The performance is on par with a 2080, so yes, the Vega 2 is competing with it. You can't say its competition is the 1080 Ti (or even the 1080) when those cards are no longer available. Yes, the 2080 can do ray tracing, but that feature is still in its infancy and will drastically impact game performance to the point that almost nobody will use it.
 

SonGoku

Member
Dude, yes it is. The performance is on par with a 2080, so yes, the Vega 2 is competing with it. You can't say its competition is the 1080 Ti (or even the 1080) when those cards are no longer available. Yes, the 2080 can do ray tracing, but that feature is still in its infancy and will drastically impact game performance to the point that almost nobody will use it.
So much truth.
Hilarious to diss the VII's performance as 1080 Ti range when the 2080 is the exact same with less RAM.
 
Oh boy. I hope that nobody really expects AMD to suddenly release an APU that is as powerful as this power-hungry Vega refresh that is already on a 7nm process...

What this card clearly shows us is that you can't expect next-gen consoles [both PS5 and Xbox] to be as powerful as a top card from 2017. I predicted 1080 performance for next gen - it seems that I might be totally on point.

I'd say that sounds about right, given that we know that:

1) Navi will probably be mid range and similar to a 1080 in terms of raw performance
2) The PS5 is strongly rumoured to use Navi

I always thought people were a bit optimistic expecting 1080ti/2080 performance for next gen consoles and VERY optimistic to expect native 4k/60 as a standard.

I predict it will be 4k/30 as standard, with 4k/60 for fighting games, racers, some sports Sims and UHD remasters of current gen games. A few games may offer a performance mode (probably 1440p/60).
 

JohnnyFootball

GerAlt-Right. Ciriously.
I'd say that sounds about right, given that we know that:

1) Navi will probably be mid range and similar to a 1080 in terms of raw performance
2) The PS5 is strongly rumoured to use Navi

I always thought people were a bit optimistic expecting 1080ti/2080 performance for next gen consoles and VERY optimistic to expect native 4k/60 as a standard.

I predict it will be 4k/30 as standard, with 4k/60 for fighting games, racers, some sports Sims and UHD remasters of current gen games. A few games may offer a performance mode (probably 1440p/60).
I'd say this hits the nail on the head as far as what is reasonable to expect. What does give me hope that 4K/60 fps can happen on next-gen consoles is the fact that the CPU will be that much more powerful. I'd be pretty happy with 1440p with solid AA and 60 fps.

What I do hope to see is developers implementing dynamic resolution scaling; that's a very future-proof feature that will automatically give games a free upgrade if Sony and MS release more powerful consoles.
 

thelastword

Banned
For all those people saying Radeon 7 competes with a 2-year-old card: at least Vega 7 is a die shrink with double the bandwidth, double the memory, double the ROPs... I'd take that over Nvidia's 8GB equivalent named RTX 2080 on a new architecture, but guess what, it performs on par with the 2-year-old Pascal card (1080 Ti)... So serious question, did you guys mention that when you tried to prop up the $800 RTX 2080 FE, or only when you gave a fair assessment like you are doing now?... The RTX 2080 is such value, all with 1 RTX hybrid game where you slash your perf and rez in half for no transformative gain in visual fidelity... Frames and rez over mirrors everywhere, because that's how life is, you see reflections everywhere...

It looks like basically a 7nm Vega Instinct, which had those 16GB, with 4 out of 64 CUs disabled (probably faulty).
AdoredTV mentioned AMD having a pile of about 60k of those cards.
For sure...

"Though it should be noted that AMD’s performance estimates are realistically conservative here; while 7nm does bring power consumption down, AMD is still only touting >1.25x performance of MI25 at the same power consumption. The true power in the new cards lies in their new features, rather than standard FP16/FP32 calculations that the existing MI25 card was already geared for."
https://www.anandtech.com/show/1356...ct-mi60-mi50-accelerators-powered-by-7nm-vega

But there's some unique engineering for the cards too... increased clocks, double the ROPs, double the memory for the gaming variant, and double the bandwidth... That in itself should net you a nice jump, +25% on the conservative side... Yet there is more to Radeon 7; the boosting scheme has been reworked, amongst other things, which we will get more details on later... So it's basically Vega 64 in Rage mode... HBM will be properly fed, enough bandwidth to run 4K without issue or stutter... And it's $699... There's simply no way the RTX 2080 FE beats this card, especially when it is overclocked like the FE... I am on board for that 16GB of HBM and the bandwidth alone tbh; I will run my games with max AA and at 4K, and there's enough bandwidth to even supersample better...

It's crazy, but people are pretending that benches are already live; this card is a better card than the RTX 2080 FE at a lower price, and finally this card is giving AMD users a 4K card that offers stutter-free gaming with 16GB at $700.00... That's a steal imo, not even the 2080 Ti has 16GB... I will take this and run come February 7th... Then I will look to build another system on Ryzen 3000 later in the year or early 2020 with Navi...

590 comes with 3 games. With 580 you can choose 2 out of 3.
Vega, being high end, will definitely come with all 3.
Yes, that is confirmed... You get The Division, DMC5, RE2... I suspect these games will run better on AMD hardware too... And these are some of the best-looking and most anticipated multiplat games this year... With more Vulkan and DX12 games to come this year as well... No longer bogged down by DX11; it was about time... I think Vega 7 will bench very nicely for the year, and above the RTX 2080 FE at that...
 

JohnnyFootball

GerAlt-Right. Ciriously.
Supposedly it beats an RTX 2080 in Battlefield V @ 1440p Ultra settings (DirectX 11)

[Image: AMD CES 2019 Radeon VII Battlefield V benchmark slide]
Don't post this. Remember, it's competing with the 1080 Ti and showing it being compared to the 2080 is not accurate and simply not fair to those with an agenda.

sarcasm alert
 

llien

Member
Are you suggesting that they intentionally released a half-assed product then?
A good-looking card, with twice as much memory as the competition, coming with 3 games, that is said to be rather quiet, is in no way "half-assed".

It's the best they can do now.
Depends on what you mean by "can do now".
It's the best product they have for the given niche at the moment, having focused their (very limited) R&D resources elsewhere.

Closing a 40% IPC gap with Intel was amazing; they can do much better on the GPU front too.

...maybe they can compete at the low end
They've just presented a card that trades blows with the 2080, which is by no means low end.

...where is their card to compete against the 2080 Ti?
There doesn't have to be a card to compete against the 2080 Ti.
 

onQ123

Member
So people really can't comprehend the fact that this is just Vega with a die shrink?

This is like complaining that the Xbox One S isn't blowing the Xbox One away.
 

JohnnyFootball

GerAlt-Right. Ciriously.
So people really can't comprehend the fact that this is just Vega with a die shrink.

Right. A die shrink counts for a lot, but it's not the end-all be-all of performance. Based on these quotes you would be led to believe that Intel could simply reuse their original 65nm Core 2, shrink it down to 7nm, and be competitive with Coffee Lake.
 

GoldenEye98

posts news as their odd job
If anything this is a showcase for 7nm.

Die size comparison:

Vega VII: 331mm^2
RTX 2070: 445mm^2
RTX 2080: 545mm^2

So while the Vega architecture might not be the best, you are getting near-2080 performance (assuming the early numbers are anything to go by) in a smaller (cooler/quieter) package.
 

onQ123

Member
If anything this is a showcase for 7nm.

Die size comparison:

Vega VII: 331mm^2
RTX 2070: 445mm^2
RTX 2080: 545mm^2

So while the Vega architecture might not be the best, you are getting near-2080 performance (assuming the early numbers are anything to go by) in a smaller (cooler/quieter) package.

This is also with 4 CUs disabled.
 

shark sandwich

tenuously links anime, pedophile and incels
This is just Vega on a new process, an old-ass arch with a hard CU limit; when Navi hits, then you can judge AMD.
Companies tend to release old-arch refreshes on new process nodes to get a grip on the new process and better yields.

Also, AMD is matching Nvidia's 12 nm with an old-ass arch (RTX 2080).

But this is not AMD's answer; the VII is a repurposed workstation card and their practice run at 7nm.
When their actual new arch hits this year, then you can judge AMD's performance.

Bolded: You are right, but you should be bashing Nvidia as well for offering the same performance for the money. It doesn't excuse them.
I'm not giving Nvidia a free pass. They deserve all the hate they get for their ridiculous prices and anticompetitive behavior. But the fact is they are behaving exactly how you'd predict a company with no competition to behave.

As for "wait for Navi", that's true. But keep in mind most of the rumors so far say that the first Navi products will be mid-range. My guess is it'll be something around Vega 64 performance but cheaper and with lower power consumption - more like a successor to the RX 580 than to Vega. It could be a year or more before we have high-end Navi.
 

GoldenEye98

posts news as their odd job
This is also with 4 CUs disabled.

Right, well, when you are comparing it to the Vega 64, the Vega VII has 4 fewer CUs and is beating it by an average of 30% (if the AMD figures are anything to go by). So yeah, it is impressive in that sense, but we don't know how those tests were done, i.e. the clocks of the Vega 64 card vs the Vega VII card. Also there is the fact that it has better memory.
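For a rough sense of what that ~30% claim implies once the CU count difference is stripped out, here is a minimal back-of-the-envelope sketch in Python. The boost clocks used are the commonly quoted figures (1.55GHz for Vega 64, 1.75GHz for Radeon VII) treated purely as assumptions, since AMD's actual test clocks aren't stated:

```python
# Normalize AMD's claimed ~30% average uplift by CU count and (assumed) boost clock.
vega64_cus, vii_cus = 64, 60
vega64_boost_ghz, vii_boost_ghz = 1.55, 1.75   # assumed boost clocks, not measured test clocks

perf_gain = 1.30                                # AMD's claimed average uplift over Vega 64
per_cu_gain = perf_gain / (vii_cus / vega64_cus)
per_cu_per_clock_gain = per_cu_gain / (vii_boost_ghz / vega64_boost_ghz)

print(f"per-CU gain: {per_cu_gain:.2f}x")                       # ~1.39x
print(f"per-CU, per-clock gain: {per_cu_per_clock_gain:.2f}x")  # ~1.23x
```

In other words, if the real clocks are anywhere near the quoted boost figures, roughly a ~1.2x per-CU, per-clock gain would be left to explain with the doubled memory bandwidth, ROP changes, and other tweaks.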
 

thelastword

Banned
May I ask where exactly you live? Even here in Germany, where prices are usually higher, you can get the 2080 for 699€.



Err no. $5 in electricity per year is so blatantly false. If I were to compare the 225W 2080 against the presumably 300W VII at THE BEST electricity rate in my local town, it's around 50€ a year with my gaming habits. Since I DON'T have that electricity deal, it's actually around 70€ in electricity I'd pay for the VII per year... sooo I get the same performance for the same price as a 2080 and even have to pay ~50€ at least on TOP... That's a perfect example of a premium on top just to go Team Red.

Who knows, maybe the real next arch IS going to be very competitive or even bash Nvidia, but let's be honest. This card is in no way really competitive. Fewer features and more power draw for the same price as a 2080? Tell me please, what is the reason to get this card if I can buy a 2080 with RTX, DLSS and now even FreeSync support for the same price and even save ~50€ a year on electricity?

Waiting for your arguments.
Buying a Radeon 7 already saves me $100.00, as opposed to if I had to buy an RTX 2080 FE... According to Gamers Nexus, HBM is $150.00, so I know I'm getting quality memory and AMD is not pricing the card at $700.00 just for shits and giggles. And guess what, I don't get 8GB of HBM, I get 16GB, as opposed to 8GB on an $800.00 card, where even Polaris offered 8GB on the RX 580 and RX 590 at sub-$300.00... More than this, I get double the bandwidth of the $800.00 Nvidia card... So I think I could spend a few more bucks on electricity for such value...

Furthermore, Nvidia is offering this cheap RTX hardware solution which has been available in mobile hardware since 2016, with 6 gigarays back then to boot... What they're offering here is still hybrid raytracing, lots of noise on the image, and it's only 1 game after 4 months, so yeah, I'll spend a few bucks on some watts tbh... Further to that, the DLSS feature is very noisy and destroys IQ and distant detail, as can be seen in the Infiltrator demo and now FF... None of these features were ready for primetime or "just work", far less properly... Also, I have a 1440p screen and a 4K screen; if I want to game in 1440p the Radeon 7 is more than capable, and if I want to game in 4K it has enough bandwidth and memory to run at that rez stutter-free, even more stutter-free than an Nvidia card at even lower resolutions, as can be seen with Vega 64 vs RTX 2070 in a video I posted elsewhere recently... Notwithstanding, the Vega 64 has only 8GB, half the bandwidth of Radeon 7... and half the ROPs... So you think this is a bad deal?

Hey! I think AMD could offer 32GB of HBM, 256 ROPs, and more bandwidth @ $1100, which is $100.00 less than a 2080 Ti, and outperform that card, and people would still complain about a few bucks in electricity per year... I just feel that if you're buying cards at such prices, a few extra cycles on your current bill is the least of your worries...

Also, bear in mind, all the features of Vega 64 - the same applies to Radeon 7, but it has even more, as several features were retooled... So I can undervolt Radeon 7 just like Vega to save some cycles, and I can use Wattman if I'm such a guardian of the electrical bill-axy. Hey, it all works out, but I'd dare say the returns on this card seem to offset any issues we may come up with... Hey, I'd pay $800.00 for such a card if I knew all the competition had was some hybrid raytracing in 1 game, one noisy DLSS game, half the bandwidth, half the memory, fewer ROPs etc... If NV can offer this RTX 2080 for $800... I feel like AMD is robbing themselves for what they offer in comparison...
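For what it's worth, the electricity argument being batted around above is easy to sanity-check yourself. Here is a minimal sketch of the arithmetic in Python; the wattage gap, hours per day, and EUR/kWh rate are all illustrative assumptions, not anyone's measured numbers:

```python
# Rough annual electricity-cost estimate for the TDP gap argued about above
# (225W RTX 2080 vs a presumed ~300W Radeon VII). All inputs are assumptions.
def yearly_cost_eur(extra_watts, hours_per_day, eur_per_kwh):
    kwh_per_year = extra_watts / 1000 * hours_per_day * 365
    return kwh_per_year * eur_per_kwh

print(f"{yearly_cost_eur(75, 3, 0.30):.0f} EUR/year")  # ~25 EUR/year at 3h of gaming/day
print(f"{yearly_cost_eur(75, 6, 0.30):.0f} EUR/year")  # ~49 EUR/year at 6h of gaming/day
```

Heavy daily gaming at German-style rates lands in the ~50€/year territory quoted above; lighter use drops it well below that.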

I'll say this also: there's a reason AMD has not offered raytracing yet. It's not ready, it's still hybrid; they're doing research and trying to make it open, where everybody can contribute and consumers won't have to pay 8 arms and 8 legs for the feature... It would be one thing if NV's hybrid RT didn't cut perf and rez, but it does... So I'll tell you now, when Radeon Rays debuts, I can tell you that Radeon 7 will be compatible with it, because it will be done via CUs, and they will have enough bandwidth to do so on this card... Just look at the jump in OpenCL performance from regular Vega to Radeon 7... And this is only a preview driver, and yet the card has not even been overclocked like the RTX 2080 FE... I have no doubt this card will have access to higher HBM voltage for those who want to put it on water or just go wild on clocks and OC... Yet, however I spin this card, it's definitely great value in comparison to the competition and what they are offering...

Don't post this. Remember, it's competing with the 1080 Ti and showing it being compared to the 2080 is not accurate and simply not fair to those with an agenda.

sarcasm alert
A massive leap in Horizon 4 and Strange Brigade. The more DX12 and Vulkan games we get in the future, the more we will see a gulf between Radeon 7 and the RTX 2080 FE, and especially the 1080 Ti Pascal...
 

SonGoku

Member
I'm not giving Nvidia a free pass. They deserve all the hate they get for their ridiculous prices and anticompetitive behavior. But the fact is they are behaving exactly how you'd predict a company with no competition to behave.

As for "wait for Navi", that's true. But keep in mind most of the rumors so far say that the first Navi products will be mid-range. My guess is it'll be something around Vega 64 performance but cheaper and with lower power consumption - more like a successor to the RX 580 than to Vega. It could be a year or more before we have high-end Navi.
Indeed, rumors say mid-range Navi for 2019 and high end in 2020. All I'm saying is, dismissing AMD based on Vega performance is missing the point.
Still, imagine a mid-range Navi card in the $300 range with 1080 Ti performance; that would be a great start.
 

thelastword

Banned
FWIW, this is current Vega OC'd vs RTX 2080



You can get a PowerColor Vega 64 on Newegg right now for $399; just flash the BIOS to get LC clocks... You can also get a liquid-cooled Vega for $499... This matched up against an $800.00 Turing card is pretty impressive, especially in DX12 titles and even in DX11, where AMD cards are crippled... At $400.00, that's quite the difference; I think they should compare the Vega 64 against the RTX 2080 with RTX on. Yet, check the graphs and see how much smoother it is on Vega's side, just some perspective before Radeon 7 launches...

Also, for people saying Navi will only be midrange, I'd not be so sure... Remember how some people said there would not be a 7nm Vega gaming card; well, it's here... Besides, AMD said they will be competing with NV on the high end... So on the new architecture, midrange might be cheaper because of Navi, but that's technology and progress for you in mid-late 2019, early 2020... There will be high-end products on Navi too... Just that current high end will be mid tomorrow... It's just the evolution of tech...
 

Ascend

Member
So people really can't comprehend the fact that this is just Vega with a die shrink?
It can't be just a die shrink if it has double the ROPs. One of GCN's main issues was being unable to go beyond 64 ROPs. They must have adapted quite a bit to be able to double the ROPs, which was the weakest part of GCN in general.
 

onQ123

Member
It can't be just a die shrink if it has double the ROPs. One of GCN's main issues was being unable to go beyond 64 ROPs. They must have adapted quite a bit to be able to double the ROPs, which was the weakest part of GCN in general.


Not just a die shrink but mostly a die shrink
 

SonGoku

Member
It can't be just a die shrink if it has double the ROPs. One of GCN's main issues was being unable to go beyond 64 ROPs. They must have adapted quite a bit to be able to double the ROPs, which was the weakest part of GCN in general.
It has some tweaks and improvements for sure, but it's essentially Vega.
The real deal will be when Navi hits.
 

SonGoku

Member
I remember people saying the same at Polaris launch... the real deal will be when Vega hits.
I wasn't around yet when that happened.
But it was supposed to be a bigger deal than Polaris; it was just a clusterfuck.
This time it's more applicable than ever; the VII is just Vega done right. Navi will be a proper new arch, and what's more, it's supposed to be the first post-GCN card since the 7xxx series.
When Navi launches you can judge AMD, whether they botch it or knock it out of the park.
 

Redneckerz

Those long posts don't cover that red neck boy
I only asked the guy for a demo, because he kind of implied that raytracing can be done without dedicated hardware designed for it, by emulating it. It certainly can be done - very poorly. Absolutely not anywhere near the same IQ and perf we see in BF5.
It's a bit of a mixed answer, really. AMD can do this through Radeon Rays and OpenCL, but that stuff isn't much supported in games.
Then there is also the bit that DXR has a fallback layer that runs on CUDA/GCN cores. Meaning it will run, but obviously with lesser performance.

On the other end of the bargain, you absolutely do not need RTX cores to generate the same gigarays/s perf - the demoscene uses a lot of raytracing in their demos, and that all runs on any DX11-capable CUDA/GCN-based hardware.

So this is why it is somewhat of a mixed bag - at least for raytracing. It depends specifically on what is traced and how many rays are shot throughout the scene.
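To illustrate the underlying point that ray tracing is just general-purpose math any CUDA/GCN-class compute (or even a CPU) can execute, only far more slowly and nowhere near BF5-level image quality, here is a toy ray-sphere intersection sketch in plain Python. The scene, resolution, and numbers are made up purely for illustration:

```python
# Toy "ray tracer": one primary ray per pixel tested against a single sphere.
# No RT cores needed; this is just the quadratic ray-sphere intersection math.
import math

def ray_sphere_hit(origin, direction, center, radius):
    # Solve |origin + t*direction - center|^2 = radius^2 for t (quadratic in t).
    oc = [o - c for o, c in zip(origin, center)]
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None                       # ray misses the sphere
    t = (-b - math.sqrt(disc)) / (2 * a)  # nearest intersection distance
    return t if t > 0 else None

# Tiny 8x4 "image" looking down -z at a sphere centered at z = -3.
for y in range(4):
    row = ""
    for x in range(8):
        d = ((x - 3.5) / 4, (y - 1.5) / 2, -1.0)
        row += "#" if ray_sphere_hit((0, 0, 0), d, (0, 0, -3), 1.0) else "."
    print(row)
```

Dedicated RT hardware accelerates exactly this kind of intersection testing (plus BVH traversal) so that millions of rays per frame become feasible; on plain compute the same math works, just orders of magnitude slower.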
Radeon Ga Ga

I'd sit alone and watch the light
You rendered every gaming night
With every graphics option on
I saw it on my Radeon
Radeon

You gave them all a high framerate
Though sometimes your drivers were late
You made 'em run without a hitch
In BF1942, Nvidia was your bitch

So don't become some console pleb
Running shovelware made for the web
For casuals who are so easy to impress
And can't see above 30fps
You had your time, you had the power,
I want to put you in my tower
Radeon

All we hear is Radeon ga ga
Radeon goo goo
Radeon ga ga
All we hear is Radeon ga ga
Radeon what's new?
Radeon, someone still loves you!

We watch the keynotes, we watch some suit
but no one cares about Compute
We hardly need to watch this trash
When you announce slow parts for lots of cash

Let's hope you support HDMI 2.1
So we can buy new TVs and have some fun
And stick around cos we might miss you
When we're stuck with Nvidia for a discrete GPU
You had your time, you had the power,
I want to put you in my tower

All we hear is Radeon ga ga
Radeon goo goo
Radeon ga ga
All we hear is Radeon ga ga
Radeon goo goo
Radeon ga ga
All we hear is Radeon ga ga
Radeon blah blah
Radeon what's new?
Radeon, someone still loves you!

You had your time, you had the power,
I want to put you in my tower
Radeon

All we hear is Radeon ga ga
Radeon goo goo
Radeon ga ga
All we hear is Radeon ga ga
Radeon blah blah
Radeon what's new?
Radeon, someone still loves you!
Loves you
Okay, not gonna lie, this gave me a good chuckle, especially when it was near the bottom of the page and everything else was serious talk.
 
People in this thread going on about RTX this, DLSS that... where are the games??? If there were 20+ games then fine, but it has been almost 4 months since release and there is 1 ray tracing title and 1 DLSS title. Let's be generous and say another game will be available in another 3 months, and so on; that would mean, let's say, 5 games this time next year. How does that in any way, shape or form make RTX cards worthwhile now? The 3000 series might be out by then!

RTX features are unproven, and if only Nvidia supports them they will always be added as an afterthought, as 99% of games are made for consoles first - a la the Nvidia GameWorks stuff Ubisoft jammed into their titles a few years back that in the end just made games run worse.
 

LordOfChaos

Member
So by their own advertising, they're expecting this to be a bit better in some titles than a 2080 (undoubtedly hand-picked titles) and it'll trade blows. So 2080 performance, for the same price, 5 months later, with a generation's fab lead and no RTX bits taking die space? That's underwhelming. Oh, all that and, judging by the power connectors, it also appears to take more power...

But this will be bigger for compute. 16GB of 60% faster memory than Nvidia has, plus the same FP64 perf as Instinct. It's a similar play as some Titans and Frontier: more of a pro card for cheaper, rather than an expensive gaming card. iMac Pro with this soon maybe, if they're never going to get over their Nvidia spat.
 

The Skull

Member
Workstation card that can play games. TBH, I'm not expecting much out of the Radeon group until an architecture change. Is Navi the final GCN iteration? It's also confirmed over at PC Gamer that it will have a 300W TDP, along with a 1.45GHz base clock and 1.75GHz boost clock. It will be interesting to see the undervolt and overclock potential. I'll probably skip this, as I'm happy with my reference Vega 56 I got for £300, then flashed to a 64 and put on water.
 

SonGoku

Member
So by their own advertising, they're expecting this to be a bit better in some titles than a 2080 (undoubtedly hand-picked titles) and it'll trade blows. So 2080 performance, for the same price, 5 months later,
Why is this a bad thing? Options, I mean. If anything, the extra RAM makes it more future-proof for 4K gaming; if I were in the market for one of these I would choose the VII over the 2080.
with a generation's fab lead and no RTX bits taking die space? That's underwhelming. Oh, all that and, judging by the power connectors, it also appears to take more power...
But it's the old arch with even fewer CUs; it's impressive for what it is. I don't think AMD intended to retake the performance crown with this product. Everybody knew 7nm Vega was coming, and it performed as expected, I think? Nobody was expecting miracles from an old-ass arch.

It's probably a repurposed workstation chip, and it also helps them get a grip on the new process with a familiar arch.
Is Navi the final GCN iteration?
It's not clear... rumors used to say it was, but the latest rumors now claim Navi to be post-GCN.
 

blastprocessor

The Amiga Brotherhood
Near 300W TDP vs 215W (RTX 2080) TDP is the problem. Doesn't bode well for PS5, and we need to temper our expectations unless they can pull something out of the hat with Navi. Make no mistake, the Vega architecture is a disappointment for perf per watt.
 

SonGoku

Member
Near 300W TDP vs 215W (RTX 2080) TDP is the problem. Doesn't bode well for PS5, and we need to temper our expectations unless they can pull something out of the hat with Navi. Make no mistake, the Vega architecture is a disappointment for perf per watt.
Why even make this comment if you already know that this is not the arch that's going inside any console?
Just old-ass Vega with some tweaks shrunk to 7nm. The high TDP is likely because of diminishing returns being hit on clock speeds.

There's no miracle to pull, just a more efficient arch that breaks the CU limit, which is becoming severely limiting. GCN has reached its peak/limit.
 

Panajev2001a

GAF's Pleasant Genius
Why even make this comment if you already know that this is not the arch that's going inside any console?
Just old-ass Vega with some tweaks shrunk to 7nm. The high TDP is likely because of diminishing returns being hit on clock speeds.

There's no miracle to pull, just a more efficient arch that breaks the CU limit, which is becoming severely limiting. GCN has reached its peak/limit.

Going with this much super-fast memory (1 TB/s with non-embedded DRAM is mind-boggling) and possibly raising the clock so much that you need a voltage bump would explain the higher TDP. We will need to see how they achieved the high clocks, but if they need to raise the voltage, their power increases with the square of V, not linearly.
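A quick back-of-the-envelope for that last point: dynamic power scales roughly as P ≈ C·f·V², so a clock bump that also requires a voltage bump costs disproportionately more. The frequencies and voltages below are illustrative assumptions, not Vega's actual voltage/frequency points:

```python
# Relative dynamic power under the P ~ C * f * V^2 approximation.
def rel_power(f_new, f_old, v_new, v_old):
    return (f_new / f_old) * (v_new / v_old) ** 2

# +20% clock at the same voltage: ~1.20x power
print(round(rel_power(1.8, 1.5, 1.00, 1.00), 2))
# +20% clock that also needs +10% voltage: ~1.45x power
print(round(rel_power(1.8, 1.5, 1.10, 1.00), 2))
```

That second case is the scenario being described: a relatively modest clock increase ending up with a much larger jump in TDP once the voltage has to rise with it.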

It seems this design is a bit rushed and that we may see another Vega refresh (with a more optimised solution), then semi-custom Navi in the PS5, then Navi on desktop...
 

Silver Wattle

Gold Member
Competes against the 2080, but it should have been 12GB (768GB/s) and $649; the extra memory loses impact at $699.

Though this is just a temporary solution for AMD; its main objective is PR, to prove that they can still compete with Nvidia at the high end.

I don't even expect this card to be in production in 6 months time.
 

Kenpachii

Member
Can't see a market for this. Nvidia will just slash their prices if they need to, or make an additional GPU, a 2070 Ti or whatever, and kinda dumpster this thing.

Also, tailored benchmarks from AMD mean nothing.

Competes against the 2080, but it should have been 12GB (768GB/s) and $649; the extra memory loses impact at $699.

Though this is just a temporary solution for AMD; its main objective is PR, to prove that they can still compete with Nvidia at the high end.

I don't even expect this card to be in production in 6 months time.

Would be a dumb move, as this thing is going to age far better than any Nvidia card simply because of the RAM setup. It's the only strong point of the card, and AMD always does this, while Nvidia artificially limits VRAM to push their base into upgrading over time.
 

Senior

Neo Member
Competes against the 2080, but it should have been 12GB (768GB/s) and $649; the extra memory loses impact at $699.
Since HBM is not soldered to the PCB but instead it's all one big package, AMD is saving costs by keeping the 16GB of HBM2 that comes with the Instinct GPUs instead of manufacturing a whole new GPU.
Makes perfect sense, as they really don't want to take any risks, especially since Vega is not as powerful and efficient as Nvidia's chips.
Navi is likely when we will see much better results.
 