
Radeon RX Vega thread

Firenze1

Banned
Still, Ryzen if I am correct is also made by GloFo. And those chips don't have the issues that there seems to be with Polaris and Vega.
Vega is also 14nm compared to Polaris' 28nm, so they're a different process, and I think 14nm was licensed from Samsung.
I think there's more to Vega's issues than GloFo.
Polaris is 14nm
 

Locuza

Member
Ryzen die size is 192 mm², reported Vega die size is in the ~500 mm² range.

Vega die is four times larger, and in the world of semiconductors your difficulty of manufacturing increases exponentially as your die size ramps up. GloFo + Samsung 14LPP is probably fine for Exynos SoCs and Ryzen, but gets really iffy for something as gigantic as Vega.
And prior GCN GPUs had really good voltage and clock-frequency settings because they were made on 28nm TSMC?
There is no proof of a significant difference.
The GP107 on 14nm Samsung isn't noticeably worse than its TSMC siblings.
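The die-size intuition quoted above can be sketched with the standard Poisson yield model, where the defect-free fraction of dice falls off exponentially with area. The defect density below is a purely illustrative assumption, not a real 14LPP figure:

```python
import math

def poisson_yield(die_area_mm2: float, defect_density_per_cm2: float) -> float:
    """Expected fraction of defect-free dice: Y = exp(-D * A)."""
    return math.exp(-defect_density_per_cm2 * die_area_mm2 / 100.0)

D = 0.2  # defects per cm^2, an assumed illustrative value
ryzen_yield = poisson_yield(192, D)  # ~0.68
vega_yield = poisson_yield(500, D)   # ~0.37
```

Under this toy model the ~500 mm² die loses roughly twice as many candidates per wafer as the 192 mm² one, before any voltage/frequency binning is even considered.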

Excuse my ignorance, but which of these points explains the poor frequency scaling, exactly? I don't get it.
Do you mean the OC headroom or the performance scaling?
 
Do you mean the OC headroom or the performance scaling?

Did you watch the video? GN downclocked Vega to Fury X frequencies there. So while Vega runs 50% higher clock speeds than the downclocked variant, gaming benches are only around 20% higher on the non-downclocked Vega. That seems pretty strange to me and suggests some severe bottlenecking is going on.
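A quick sanity check on those figures (numbers taken from the post, nothing measured here):

```python
clock_gain = 0.50  # stock Vega vs. Vega downclocked to Fury X speeds
perf_gain = 0.20   # observed gaming uplift at the higher clock

# Fraction of the extra clock that actually shows up as frames:
scaling_efficiency = perf_gain / clock_gain
print(scaling_efficiency)  # 0.4, i.e. 60% of the clock advantage evaporates
```

Anything well below 1.0 points at a bottleneck that doesn't scale with core clock, such as memory bandwidth.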
 

Locuza

Member
Did you watch the video? GN downclocked Vega to Fury X frequencies there. So while Vega runs 50% higher clock speeds than the downclocked variant, gaming benches are only around 20% higher on the non-downclocked Vega. That seems pretty strange to me and suggests some severe bottlenecking is going on.
Only partly; I primarily read their article:
http://www.gamersnexus.net/guides/2977-vega-fe-vs-fury-x-at-same-clocks-ipc

There they don't have Vega @ 1600 MHz but at stock frequency, which ranges roughly from 1300-1500 MHz, mostly settling at 1440 MHz.
PCGH did comparisons at 1050 MHz for both Fiji and Vega, and additionally Vega at 1600 MHz, where they raised the power target and fan speed to make sure it stays at that clock speed.

One part of the synthetics makes it quite clear that Vega is bandwidth starved.
C&P:
- Effective Texture Bandwidth

Now here things are getting really interesting.

That's a bandwidth test where two different types of textures are tested.
One black texture and one with random colors.
Since recent GPUs use Delta Color Compression techniques, you see a big difference between a black texture, where no color deltas are found and compression can be optimal, and a random-colored texture, where the compression effectively doesn't work.

Nvidia is quite the king here; the difference between the best case and worst case is about 105-130% in bandwidth.
GCN Gen 3 only manages 17%, GCN Gen 4 47%.
One speculation was that Nvidia's color compression might not be that much better than AMD's, but that the tile-based renderer additionally helps the color compression technique.
But the DSBR currently seems to be inactive, as the triangle-bin test doesn't indicate any tiling.
Without the DSBR, the results with the black texture are 52-60% better for GCN Gen 5.
The range is maybe too small to call it a clear improvement over GCN Gen 4; maybe a few percent.

What's more interesting are the results with the random-colored texture, where the achieved bandwidth is actually 24% lower than the Fury X's; you would expect 6% (484 GB/s vs. 512 GB/s), but not 24%.

The bandwidth utilization is miserable and explains the limited scaling seen with Vega.

You also see ~11% higher memory-copy throughput in AIDA (336 GB/s vs. 303 GB/s) for the Fury X in comparison to Vega.
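The arithmetic behind that expectation, using only the paper specs already quoted (a quick check, not new data):

```python
fury_x_bw = 512.0  # GB/s, Fury X paper bandwidth (HBM1)
vega_bw = 484.0    # GB/s, Vega FE paper bandwidth (HBM2)

# On paper, Vega should trail by only ~5.5% (the "6%" above)...
expected_deficit = 1.0 - vega_bw / fury_x_bw

# ...but the random-texture test measured a 24% deficit, so roughly
# 18 percentage points are lost to worse bandwidth utilization.
utilization_loss = 0.24 - expected_deficit

# The AIDA memory-copy numbers tell the same story: Fury X ~11% ahead.
copy_advantage = 336.0 / 303.0 - 1.0
```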
 

dr_rus

Member
Can somebody tell me why the frequency scaling of Vega in this test done by GamersNexus is so abysmal?
The card throttles heavily with higher clocks, seemingly from insufficient power supply and HBM2 temperatures.

The article says the water cooled version competes pretty closely and even beats a 1080 on occasion...
not sure that's a "dud" if they bother to price it right (probably won't).

A WC card consuming 300W+, built with a 500mm^2 chip and HBM2 memory, being merely close to a year-old 314mm^2 GPU running GDDR and consuming 180W, is a dud. They will have to price it lower than the stock 1080 for people to even look at it as an alternative.
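A rough perf-per-watt reading of those figures (board powers as cited in the post; both cards taken as performing similarly):

```python
vega_fe_power = 300.0   # W, cited for the water-cooled Vega FE
gtx_1080_power = 180.0  # W, GTX 1080 reference board power

# At roughly equal performance, relative efficiency is just the power ratio:
power_ratio = vega_fe_power / gtx_1080_power
print(round(power_ratio, 2))  # 1.67, i.e. ~67% more power for similar frames
```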
 
The card throttles heavily with higher clocks, seemingly from insufficient power supply and HBM2 temperatures.



A WC card consuming 300W+, built with a 500mm^2 chip and HBM2 memory, being merely close to a year-old 314mm^2 GPU running GDDR and consuming 180W, is a dud. They will have to price it lower than the stock 1080 for people to even look at it as an alternative.

The silver lining here is Freesync (2) value, which is important to some people.
 
Well, yeah, those who've chosen AMD's Freesync don't have many alternatives if they want to play in anything higher than 1080p.

It's not just those who already chose Freesync. There is probably a market for people who are looking to upgrade from older displays to monitors with VRR.
 

dr_rus

Member
It's not just those who already chose Freesync. There is probably a market for people who are looking to upgrade from older displays to monitors with VRR.

IMO, this is where things get a lot muddier: even with Freesync's arguable price advantage, it's really hard to justify going with AMD GPU h/w because of their recent track record at the high end. People looking at high-end video cards are less likely to be attracted by FS's lower price, as they tend to upgrade video cards more often; spending some premium on a G-Sync display which they'll use for some five years is a lesser problem to them than the inability to upgrade their video cards in a timely fashion.
 
IMO, this is where things get a lot muddier: even with Freesync's arguable price advantage, it's really hard to justify going with AMD GPU h/w because of their recent track record at the high end. People looking at high-end video cards are less likely to be attracted by FS's lower price, as they tend to upgrade video cards more often; spending some premium on a G-Sync display which they'll use for some five years is a lesser problem to them than the inability to upgrade their video cards in a timely fashion.

I don't think this conclusion can be easily made without data to support it.

Even if it were true, there are a lot of assumptions a consumer has to make before making a decision. What is the price/performance proposition of Nvidia and AMD right now? How significant is the G-Sync premium? Did Nvidia and AMD produce competitive products during the last 2-3 years? For 99.9% of gamers, the assumptions stop there. Nobody cares about analyzing products in the pipeline for the next 3-5 years, because even top analysts don't really know that.
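The premium question reduces to simple amortization. All figures below are hypothetical placeholders, not actual market prices:

```python
gsync_premium = 200.0       # assumed extra cost of a comparable G-Sync panel
monitor_lifetime_years = 5  # how long the posts assume a display is kept

# Spread over the display's life, the premium per year is small...
premium_per_year = gsync_premium / monitor_lifetime_years  # 40.0

# ...so the argument turns on whether being locked to one GPU vendor
# costs more than that per year in missed upgrade options.
```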
 

PFD

Member
I think the fact that Vega is slower than Fury X clock-for-clock brings up a lot of questions, and means that we might get a decent performance boost from drivers.

It might; at worst it should perform similarly to Fury X clock-for-clock, no?

How does Polaris fare?
 

Locuza

Member
PCGH also did an undervolting test with MandelS:
300 seconds of MandelS (burn-test demo) at 1200p:

1,200 mV (default PT): PC draws ~380 W
1,200 mV, +50% PT: PC draws ~540 W after 30 seconds, at which point I aborted
1,100 mV, +20% PT: PC draws ~460 W
1,075 mV, +20% PT: PC draws ~430 W
1,075 mV, +50% PT: PC draws ~430 W
1,050 mV, +50% PT: PC draws ~415 W

MandelS is stable; the crash I had was during the benchmark (possibly +50% PT was still active there, I can't say for sure anymore). Consider that a first idea of what undervolting brings.

Regards,
Raff
https://www.forum-3dcenter.org/vbulletin/showthread.php?p=11429155#post11429155

And in games, 1.075 V + 50% PT was stable, and the GPU consumed roughly the same amount of power at 1600 MHz as with the default settings of 1.2 V and ~1440 MHz.
At 1.060 V they had their first crash, although they are not sure whether 1.060 V was really applied or it internally switched to 1.050 V.
http://www.pcgameshardware.de/Vega-Codename-265481/Tests/Benchmark-Preis-Release-AMD-Radeon-Frontier-Edition-1232684/2/#a5
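That result (roughly unchanged consumption at 1600 MHz / 1.075 V vs. ~1440 MHz / 1.2 V) lines up with the usual dynamic-power rule of thumb, P ∝ f·V². A sketch using the article's settings:

```python
f_stock, v_stock = 1440.0, 1.200  # MHz, V (default behaviour)
f_uv, v_uv = 1600.0, 1.075        # MHz, V (undervolted, +50% power target)

# Dynamic power scales roughly with frequency times voltage squared:
relative_power = (f_uv / f_stock) * (v_uv / v_stock) ** 2
print(round(relative_power, 2))  # 0.89: slightly less predicted power
# despite the ~11% higher clock, consistent with PCGH's measurement
```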
 

JapeMincers

Member
Yes, GTX 1080 performance after all this time is a little disappointing considering the power requirements, but if it's priced well (<£400) and driver improvements inch it closer to the GTX 1080 Ti, then I would call it a success.

Also, now that AMD have this new architecture ready, it will be easier to refine and build on it, so we shouldn't see such large gaps between releases.
 
So it will release at 1080-level performance, then inch closer to the 1080 Ti over time thanks to FineWine technology.

Volta will be out in a few months, then what?

The FineWine argument is increasingly stupid when they are more than a year behind Nvidia and getting close to a full generation behind.

Anyone who was looking for 1080 performance has probably bought a 1080 over the past 1 year and 2 months. Who's left that waited except a tiny number of truly dedicated fans? I mean the Fury X was only a month behind the 980 Ti and even that was enough to put the nail in the coffin. Does AMD actually think they are going to have people who want to buy Vega more than a year after the 1080 and offering the same performance while drawing almost double the power and needing a closed-loop water cooler to reach the performance of a year-old card which uses air cooling?

Vega would have been a tough sell 1 year ago going against the brand-new 1080. Now AMD might as well just cancel the stupid thing because who the hell wants to buy it?
 
It is a good deal if they release it at $100 less than the 1080, considering the driver enhancements possible.

On the other hand, I strongly suspect they compared the prices to the 1080 FE, which would make it a much tougher sell.
 

ISee

Member
So performance will be similar to a Vega FE running at ~1600MHz. No one expected this result.
/s
Hopefully they are able to sell RX Vega for a good price, at least.
 
So it will release at 1080-level performance, then inch closer to the 1080 Ti over time thanks to FineWine technology.

Doubtful. A lot of the improvements GCN saw were courtesy of it being better equipped to handle DX12 and of developers building console games for GCN-based consoles. It's unlikely Vega sees the kind of longevity GCN did, because almost no other architecture has. It might improve a little, but a 1080 Ti FE is 35% faster than a stock/reference 1080, and the AIB versions are upwards of 40-50% faster. I find it unlikely a gap that size is closed thanks to driver optimizations.
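For scale, the gap described there in numbers (figures from the post; treat them as approximate):

```python
ti_fe_over_1080 = 1.35   # 1080 Ti FE vs. reference 1080
ti_aib_over_1080 = 1.45  # midpoint of the quoted 40-50% AIB range

# If RX Vega lands at reference-1080 level, drivers alone would need
# to find an extra 35-45% performance to reach 1080 Ti territory:
needed_gain_fe = ti_fe_over_1080 - 1.0
needed_gain_aib = ti_aib_over_1080 - 1.0
```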

Volta will be out in a few months, then what?

The FineWine argument is increasingly stupid when they are more than a year behind Nvidia and getting close to a full generation behind.

Anyone who was looking for 1080 performance has probably bought a 1080 over the past 1 year and 2 months. Who's left that waited except a tiny number of truly dedicated fans? I mean the Fury X was only a month behind the 980 Ti and even that was enough to put the nail in the coffin. Does AMD actually think they are going to have people who want to buy Vega more than a year after the 1080 and offering the same performance while drawing almost double the power and needing a closed-loop water cooler to reach the performance of a year-old card which uses air cooling?

Vega would have been a tough sell 1 year ago going against the brand-new 1080. Now AMD might as well just cancel the stupid thing because who the hell wants to buy it?

Bingo. The 1080 has been out for over a year; releasing a GPU that barely competes with it while being incredibly inefficient is silly.
 
Doubtful. A lot of the improvements GCN saw were courtesy of it being better equipped to handle DX12 and of developers building console games for GCN-based consoles. It's unlikely Vega sees the kind of longevity GCN did, because almost no other architecture has. It might improve a little, but a 1080 Ti FE is 35% faster than a stock/reference 1080, and the AIB versions are upwards of 40-50% faster. I find it unlikely a gap that size is closed thanks to driver optimizations.


Bingo. The 1080 has been out for over a year; releasing a GPU that barely competes with it while being incredibly inefficient is silly.


My view of Vega was always that if they managed 1080 performance and released at a really aggressive price point, then they could have a mainstream success on their hands. I never saw it competing with the 1080 Ti and have pretty much been proven right, but the completely inefficient nature of Vega was a total shock to me, as recently their GPUs have been pretty good in this respect.

I'm also dubious that they can price this aggressively enough given the use of HBM2, and I'm certain at this point Nvidia will aggressively cut prices on the 1080 when Vega launches to counter it.
 

Ac30

Member
Volta will be out in a few months, then what?

The FineWine argument is increasingly stupid when they are more than a year behind Nvidia and getting close to a full generation behind.

Anyone who was looking for 1080 performance has probably bought a 1080 over the past 1 year and 2 months. Who's left that waited except a tiny number of truly dedicated fans? I mean the Fury X was only a month behind the 980 Ti and even that was enough to put the nail in the coffin. Does AMD actually think they are going to have people who want to buy Vega more than a year after the 1080 and offering the same performance while drawing almost double the power and needing a closed-loop water cooler to reach the performance of a year-old card which uses air cooling?

Vega would have been a tough sell 1 year ago going against the brand-new 1080. Now AMD might as well just cancel the stupid thing because who the hell wants to buy it?

Let's hope the miners do, so AMD gets another shot with Navi.
 

ISee

Member
My view of Vega was always that if they managed 1080 performance and released at a really aggressive price point, then they could have a mainstream success on their hands. I never saw it competing with the 1080 Ti and have pretty much been proven right, but the completely inefficient nature of Vega was a total shock to me, as recently their GPUs have been pretty good in this respect.

I'm also dubious that they can price this aggressively enough given the use of HBM2, and I'm certain at this point Nvidia will aggressively cut prices on the 1080 when Vega launches to counter it.

We are too close to Volta for Nvidia to care about a competitor to their second-strongest and soon 'outdated' product. They will most probably position a certain Volta card (maybe the xx70) against RX Vega. But at this point I wouldn't be surprised if RX Vega ends up just as expensive as a GTX 1080 in the first place.
 
We are too close to Volta for Nvidia to care about a competitor to their second-strongest and soon 'outdated' product. They will most probably position a certain Volta card (maybe the xx70) against RX Vega. But at this point I wouldn't be surprised if RX Vega ends up just as expensive as a GTX 1080 in the first place.

I agree; by the time Vega launches, the 1080 will be over a year old, and they will be in a position to cut prices due to having been in production this long and having sold plenty of cards already. It's going to be really hard to sell Vega as it stands unless they can somehow come in and launch at, say, $50-100 less than a 1080.
 
I agree; by the time Vega launches, the 1080 will be over a year old, and they will be in a position to cut prices due to having been in production this long and having sold plenty of cards already. It's going to be really hard to sell Vega as it stands unless they can somehow come in and launch at, say, $50-100 less than a 1080.

I don't know if they can undercut the 1080 that easily.

Die size, cooling, and power delivery are all in 1080 Ti-class territory.

And even if they do, Nvidia will be releasing their next iteration of cards soon (3 to 9 months depending on estimates), and they can pull a 970-style killer price again if they want.
 

dr_rus

Member
PCGH did also an undervolting test with MandelS:

https://www.forum-3dcenter.org/vbulletin/showthread.php?p=11429155#post11429155

And in games, 1.075 V + 50% PT was stable, and the GPU consumed roughly the same amount of power at 1600 MHz as with the default settings of 1.2 V and ~1440 MHz.
At 1.060 V they had their first crash, although they are not sure whether 1.060 V was really applied or it internally switched to 1.050 V.
http://www.pcgameshardware.de/Vega-Codename-265481/Tests/Benchmark-Preis-Release-AMD-Radeon-Frontier-Edition-1232684/2/#a5

I dunno, I tend to think that AMD isn't run by idiots who can't even choose proper voltages for their chips. It's more likely than not that such undervolting will lead to stability issues, maybe obscure ones that only become obvious after the chip has been used for several months. But still, such experiments aren't really applicable to real-world usage, IMO.

Yes, GTX 1080 performance after all this time is a little disappointing considering the power requirements, but if it's priced well (<£400) and driver improvements inch it closer to the GTX 1080 Ti, then I would call it a success.

Also, now that AMD have this new architecture ready, it will be easier to refine and build on it, so we shouldn't see such large gaps between releases.

GCN5 is hardly new, and this is pretty evident from the results it is showing. It is arguably the biggest update GCN has gotten so far, but all in all it's still GCN, dating back to 2012.
 

Renekton

Member
I don't know if they can undercut the 1080 that easily.

Die size, cooling, and power delivery are all in 1080 Ti-class territory.

And even if they do, Nvidia will be releasing their next iteration of cards soon (3 to 9 months depending on estimates), and they can pull a 970-style killer price again if they want.
Even die size? Man, that is rough 😔 They can't really undercut much thanks to die size and HBM.
 

Marmelade

Member
GPU ranking including Vega FE from PCGH

7% faster than a stock 1070
13% slower than a stock 1080

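As a consistency check, those two index numbers imply the 1080-over-1070 gap (pure arithmetic on the figures above):

```python
vega_over_1070 = 1.07  # Vega FE 7% ahead of a stock GTX 1070
vega_over_1080 = 0.87  # and 13% behind a stock GTX 1080

# Implied stock-1080 advantage over the stock 1070 in this index:
implied_1080_over_1070 = vega_over_1070 / vega_over_1080
print(round(implied_1080_over_1070, 2))  # 1.23, i.e. ~23%
```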
 

AmyS

Member
https://seekingalpha.com/article/4089195-amd-vega-stillborn

AMD Vega Is Stillborn

Summary

AMD is doing very well on the CPU side of things.

However, AMD isn't just CPUs.

Arguably, AMD is doing very, very badly on the GPU side of things. Vega just shows how bad.

On the side that's going right, CPUs, you have (had) a factor that the other side, GPUs, does not have. That factor is Jim Keller. Jim Keller was there at AMD when AMD last became competitive with Intel back in the early Opteron days. Jim Keller was at Apple right before Apple had a breakthrough with its own custom-designed CPUs. And now Jim Keller was at AMD when AMD designed the new Zen cores at the base of its current resurgence. Jim Keller is gone now, but let's leave that for later.

AMD, however, isn't just CPUs. It also has a discrete GPU segment, which it got from acquiring ATI back in 2006. And there, as I will show, things are going horribly wrong. How can this be going wrong when CPUs are going right? Well, apparently, that would be because Jim Keller doesn't do GPUs.

It's very easy to see that AMD has had a huge problem when it comes to GPUs. Since May 2016, when Nvidia launched the GeForce 10 series, AMD has had no competitive GPU at the top end of the market. AMD has had nothing to compete with either the Nvidia GTX 1070 or the GTX 1080, based on Nvidia's Pascal architecture.

Only now, nearly 1.5 years later, is AMD getting ready to launch its new Vega architecture, namely the Vega RX for the consumer market. Early indications, though, show that both the Vega Frontier Edition and the Vega RX struggle to even match the Nvidia GTX 1080 in performance terms. That is, cards based on AMD's new architecture just match Nvidia's top-end cards from 1.5 years ago.

It gets worse, though. Much worse. Consider the following:

Some will quickly say that these cards will shine in the data room, where GPUs are increasingly being used for AI purposes. In other words, that these cards will bring the fight to Nvidia for AI purposes.

Unfortunately, there's no hope in that either, and for a very simple reason:

Nvidia is now bringing out its Volta architecture. The Volta architecture includes tensor cores specifically for AI jobs, which massively accelerate neural network training and inference. The Vega architecture has no such specific-purpose cores and thus will be unable to compete when it comes to AI applications.

There really is no saving grace, either in the consumer side of things or the data room side of things. Vega is a stillborn, unlike Zen.

I would tend to agree, and Vega is very, very late. It was way back in spring 2015 when tech sites reported on AMD's next major GPU architecture, codenamed Greenland.

Vega 10 = Greenland.

http://www.fudzilla.com/news/graphics/37386-amd-working-on-greenland-hbm-graphics
http://www.fudzilla.com/news/graphics/37584-amd-greenland-2016-is-a-14nm-gpu
http://www.fudzilla.com/news/graphics/37712-amd-greenland-gpu-comes-in-2016-gets-finfet

Then, summer 2015, Greenland also showed up as part of an APU design.


Hopefully AMD has enough resources to correct its mistakes with 'Navi' and 'Next Gen' GPU architectures (and products based on them).

 
From what I've seen mentioned in various comments, the Vega architecture was the first one done by AMD's design team in Shanghai.

Probably what they call leapfrogging design teams in one of those slides.

And while I don't really care too much about AMD GPUs on desktop, this can have really bad repercussions on next-gen console APU design, as power efficiency will be super important there.
 

AmyS

Member
From what I've seen mentioned in various comments, the Vega architecture was the first one done by AMD's design team in Shanghai.

Probably what they call leapfrogging design teams in one of those slides.

And while I don't really care too much about AMD GPUs on desktop, this can have really bad repercussions on next-gen console APU design, as power efficiency will be super important there.

Exactly, so we can rule out the PS5 GPU being based on Vega (I know the PS4 Pro GPU is mostly Polaris-based with a couple of Vega's features like FP16, but the situation for next gen isn't comparable).
I'm guessing the PS5 GPU will be a somewhat more even blend of IP from 'Navi' and 'Next Gen'; it depends on when the PS5 hits the market, late 2019 or late 2020.
The timing of the future Xbox will then determine whether its GPU is about the same as the PS5's, or leans more toward AMD's 'Next Gen'. ...Late 2020 or late 2021?
 
Hopefully AMD has enough resources to correct its mistakes with 'Navi' and 'Next Gen' GPU architectures (and products based on them).

$20 says that Navi is still a new iteration of the zombie corpse of GCN. Maybe they can bolt on more irrelevant shit to become Nvidia also-rans in brand new markets and further erode their gaming marketshare.

I'm really not happy with AMD right now. This showing up late to the party with disappointing garbage is just getting worse and worse. At least if you had enough with Nvidia's top end price gouging you could get a cheaper card that could still reasonably compete with everything but a Titan. What are AMD going to do when Nvidia show up with an 84 SM GV102 on a card? Hell, what are they gonna do when they show up with a GV104 in six months that's going to outperform a 1080 Ti by some reasonable amount?

It's really not going to be fun to see an 1180 launch at a $649/$749 price point, but what can someone really do? Well, there's Vega with 2/3 the performance and... crickets.

Without healthy competition we're fucked on the top end and all we can do is sit there and let ourselves be fucked.
 
It's even more disappointing because I've given out a few RX x80s to friends and relatives as birthday/Christmas presents and they're really excellent, great value cards. They just screw up the top end so utterly completely and it's so infuriating.
 

iavi

Member
It's even more disappointing because I've given out a few RX x80s to friends and relatives as birthday/Christmas presents and they're really excellent, great value cards. They just screw up the top end so utterly completely and it's so infuriating.

This is where I'm at. The X80s are amazing cards for their price.

Vega is a massive disappointment.
 

AmyS

Member
$20 says that Navi is still a new iteration of the zombie corpse of GCN. Maybe they can bolt on more irrelevant shit to become Nvidia also-rans in brand new markets and further erode their gaming marketshare.

I'm only somewhat hopeful Navi will be to Vega what Rv770 (Radeon HD 4870, summer 2008) was to R600 (Radeon HD 2900, spring 2007).

However, I won't place a bet! :p
 

etrain911

Member
If I'm considering an RX 480 over an Xbox One X is that a safe bet for performance? I don't really care about playing at anything higher than 1080p but they're priced in such a way where I would like the one with the best performance. I know this requires speculation.
 

dr_rus

Member
This is where I'm at. The X80s are amazing cards for their price.

Vega is a massive disappointment.

How's Vega any different from the RX x80 cards, though? It will likely be more or less even with the 1080 while consuming considerably more power, cost about the same, and be available before NV switches to a new architecture (which could happen whenever, really, but probably won't within a week). This seems exactly like the RX x80 cards to me.
 

chaosblade

Unconfirmed Member
If I'm considering an RX 480 over an Xbox One X is that a safe bet for performance? I don't really care about playing at anything higher than 1080p but they're priced in such a way where I would like the one with the best performance. I know this requires speculation.

Probably better off getting a 1060. Cheaper, and they perform about the same. Unless you want adaptive sync, in which case it's a small price to pay, since G-Sync alone is going to run you $200+ on top of whatever else you would buy.
 
If I'm considering an RX 480 over an Xbox One X is that a safe bet for performance? I don't really care about playing at anything higher than 1080p but they're priced in such a way where I would like the one with the best performance. I know this requires speculation.

It's hard to find an RX 480/580 because of miners. You would do better to get a 1060: it uses way less power, runs much cooler, and performs similarly. It's also still widely available; the miners on the green side seem to be buying out the 1070 and leaving the 1060 alone.
 
If I'm considering an RX 480 over an Xbox One X is that a safe bet for performance? I don't really care about playing at anything higher than 1080p but they're priced in such a way where I would like the one with the best performance. I know this requires speculation.

I'd be very surprised if an Xbox One X didn't outperform a 480.
 

iavi

Member
How's Vega any different from the RX x80 cards, though? It will likely be more or less even with the 1080 while consuming considerably more power, cost about the same, and be available before NV switches to a new architecture (which could happen whenever, really, but probably won't within a week). This seems exactly like the RX x80 cards to me.

8GB RX 480s could be found for as low as $180 at one point before this mining craze; I bought one off of Jet.com. That's a value that negates any differences between it and the similarly specced 1060 at a much higher price.

If Vega runs hotter, louder, and hungrier than the 1080, with not even half the add-ons like streaming etc., while getting the same fps and costing the same as well, that's a disappointment.
 