
Radeon RX Vega thread

dr_rus

Member
8GB RX 480s could be found for as low as $180 at one point before this mining craze (I bought one off of Jet.com). That's a value that negates any differences between it and the similarly spec'd 1060 at a much higher price.

If Vega runs hotter, louder, and hungrier than the 1080, with not even half the add-ons like streaming etc., while getting the same fps and costing the same as well, that's a disappointment.

The 1060 is similarly priced, which means the deals on it are also similar. It's also pretty much always cheaper than the x80 in our part of the world, even without the deals. Can't see how prices are better on x80 cards.
 

Bashtee

Member
I think HBM2 is one of the biggest negative factors. It seems like it made driver development a lot more difficult without the much-needed big gains. I wonder if they can figure something out, or whether we'll have to wait for Navi's successor.

https://seekingalpha.com/article/4089195-amd-vega-stillborn
I would tend to agree, and Vega is very, very late. It was way back in spring 2015 when tech sites reported on AMD's next major GPU/architecture, codenamed Greenland.

Vega 10 = Greenland.

http://www.fudzilla.com/news/graphics/37386-amd-working-on-greenland-hbm-graphics
http://www.fudzilla.com/news/graphics/37584-amd-greenland-2016-is-a-14nm-gpu
http://www.fudzilla.com/news/graphics/37712-amd-greenland-gpu-comes-in-2016-gets-finfet

Then, in summer 2015, Greenland also showed up as part of an APU design.


Hopefully AMD has enough resources to correct its mistakes with 'Navi' and 'Next Gen' GPU architectures (and the products based on them).

Wait a second. Will the consumer version of Volta also contain the Tensor cores? I thought those were exclusive to their Tesla cards.

If that is true, then I might as well save the money I had planned to spend on the 1080 Ti for some deep learning work.

Vega is dead either way; the most popular Python libraries utilize CUDA and NVIDIA GPUs. There are open-source projects to convert CUDA code, but it just isn't the same. I have yet to see an OpenCL library with AMD support for machine learning.
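To illustrate that CUDA dependence, here's a minimal PyTorch sketch (the layer and tensor shapes are just placeholders): the GPU path only kicks in when torch.cuda.is_available() reports a CUDA-capable NVIDIA device, and an AMD card would need a ROCm/HIP build of the framework instead.

Code:
# Minimal device-selection sketch in PyTorch; falls back to CPU on non-CUDA hardware.
import torch

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model = torch.nn.Linear(128, 10).to(device)   # move the (placeholder) model to the GPU if one is usable
x = torch.randn(32, 128, device=device)       # a dummy batch of 32 inputs
print(device, model(x).shape)                 # e.g. "cuda torch.Size([32, 10])"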
 

dr_rus

Member
Wait a second. Will the consumer version of Volta also contain the Tensor cores? I thought those were exclusive to their Tesla cards.

I wouldn't count on it. Tensor cores are useless in gaming markets. They may leave something in the chips for CUDA compatibility, but chances are it will run as "well" as native FP16 math does on Pascal gaming GPUs.
 

napata

Member
I'd be very surprised if an Xbox One X didn't outperform a 480.

I wouldn't be surprised if even a 470 outperforms it on average. I'm basing this on the Pro, where you often see devs deliver meagre bumps in resolution/graphics, or sometimes not even that. Games like Prey bring the average down hard.

When developers actually take advantage of the extra hardware it should beat the 480, but that's definitely not a guarantee. I think the install bases are just too small for developers to properly care.
 

ISee

Member
Worth noting that that is an overclocked 1080, but still, we're talking 5-10% at best

Still a rather low overclock though, at least for manual overclocking. The MSI Gaming X should be able to go for 2036 MHz, maybe even 2050. The next question is how much headroom is left on RX Vega. Is 1700 MHz even possible?
 
Still a rather low overclock though, at least for manual overclocking. The MSI Gaming X should be able to go for 2036 MHz, maybe even 2050. The next question is how much headroom is left on RX Vega. Is 1700 MHz even possible?

Based on PCPer's analysis of the FE (air and water-cooled), I don't think there's much overclocking headroom. At least not unless you want to draw huge amounts of power for modest gains.

And that's true about the 1080 OC, I saw that too. Not that it's a big jump, but all of the good cards will do 2050+ no problem.

For as well as AMD has done with Ryzen, I think they've screwed the pooch (where does that phrase even come from?) with Vega. I don't think they'll be able to keep the cost low, which would mean Nvidia would just have to introduce a modest price cut ($50? $75?) to undermine them. Or just show off Volta if it's around.
 

thelastword

Banned
Hey guys, the reveal is only 5 days away, so there's no need to speak of disappointment already. Even with such performance in 3DMark, we don't know how RX Vega performs in actual games. There are quite a few features that may give Vega better fps in-game.

In any case, there are also many games coming up that will target Vega's architectural strengths, so there's a lot to consider. Price to performance is the other thing. I think many people will be satisfied with Vega if its price-to-performance ratio is in line with Ryzen's...
 

PaNaMa

Banned
That's HBM for ya,

That's unfortunate. AMD not supplying even the big-name sites with review samples was, I guess, foreshadowing of these results. Like others, I hope they at least provide consumers a solid price-to-performance ratio.

I'm running a Team Green 980 Ti right now (was Team Red with dual 7970 GHz Editions before that). No brand loyalty here. I like healthy competition though, and was hoping AMD was gonna pull off something special.
 

Steel

Banned
That's unfortunate. AMD not supplying even the big-name sites with review samples was, I guess, foreshadowing of these results. Like others, I hope they at least provide consumers a solid price-to-performance ratio.

I'm running a Team Green 980 Ti right now (was Team Red with dual 7970 GHz Editions before that). No brand loyalty here. I like healthy competition though, and was hoping AMD was gonna pull off something special.

To be clear, the low frequency doesn't mean anything with HBM; the bus is so much wider that it's still a lot faster than the memory in the Nvidia cards. That's not the problem.
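As a rough back-of-the-envelope illustration (figures below are the commonly quoted specs, so treat them as approximate): peak bandwidth is roughly bus width times per-pin data rate, which is why a "slow" 2048-bit HBM2 setup still matches a 352-bit GDDR5X card.

Code:
# Peak memory bandwidth ~= (bus width in bits / 8) * per-pin data rate in Gbps.
# The specs below are the commonly quoted figures, used purely for illustration.
def bandwidth_gbs(bus_width_bits, data_rate_gbps):
    return bus_width_bits / 8 * data_rate_gbps

print(bandwidth_gbs(2048, 1.89))  # Vega 10 (FE / RX Vega), 2x HBM2  -> ~484 GB/s
print(bandwidth_gbs(256, 10.0))   # GTX 1080, GDDR5X                 -> 320 GB/s
print(bandwidth_gbs(352, 11.0))   # GTX 1080 Ti, GDDR5X              -> 484 GB/s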
 

ISee

Member
Why did they want to go with HBM anyway?

Because it's cool (from a PR perspective), the theoretical memory bandwidth is even higher than with GDDR5X (though bandwidth isn't a bottleneck currently anyway), and because it allows them to go for 16 GB of VRAM without having to place a ridiculous number of GDDR5X chips on their PCB. Memory stacking is the future; it's just not really needed right now (imo).
 

Marmelade

Member
If those results are confirmed, could this mean a later release than expected for gaming Volta?
I mean, it looks like Nvidia will be able to compete with anything from AMD with their current lineup...
 
This is so irritating because FreeSync monitors are so much more affordable than G-Sync ones.

Guess I'll be paying the premium for an OLED G-Sync monitor, whenever that comes out. No way I'm coming over to AMD with such weak GPUs.
 

JaseC

gave away the keys to the kingdom.
Hey guys, the reveal is only 5 days away, so there's no need to speak of disappointment already. Even with such performance in 3DMark, we don't know how RX Vega performs in actual games. There are quite a few features that may give Vega better fps in-game.

The thing is, though, the 3DMark results are only oh-so-slightly better than those of the FE, and you can be sure it's among the selection of software that AMD has sought to further optimise at the driver level, given it's the go-to benchmark tool. It should be clear by now that AMD has no ace up its sleeve that's going to produce significantly better, 1080 Ti-esque results in games.
 

ISee

Member
Are the coin miners going to want this?

Not sure. HBM2 could be a problem for current mining software. Miners apparently skipped the 1080 / Ti because there were some problems with GDDR5X. Vega power consumption also seems to be very high, which could be a problem.
 

thelastword

Banned
The thing is, though, the 3DMark results are only oh-so-slightly better than those of the FE, and you can be sure it's among the selection of software that AMD has sought to further optimise at the driver level, given it's the go-to benchmark tool. It should be clear by now that AMD has no ace up its sleeve that's going to produce significantly better, 1080 Ti-esque results in games.
I don't think anything is clear now; things will become clear when the product launches and is officially benchmarked. There's no need to draw conclusions on anything yet... carts before horses and all that.
 

JaseC

gave away the keys to the kingdom.
I don't think anything is clear now; things will become clear when the product launches and is officially benchmarked. There's no need to draw conclusions on anything yet... carts before horses and all that.

That would be a valid counter-point if it weren't for the short timeline. Those benchmarks are hours old (not to mention no benchmarks have put Vega within striking distance of the 1080 Ti) and the end of July is less than a week away.
 

AmyS

Member
HBM2 in Vega is a huge disappointment.

There's actually less bandwidth than with HBM1 in Fiji back in 2015, although HBM1 was inherently limited to 4 GB.
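Back-of-the-envelope math with the commonly quoted specs (bus width in bits divided by 8, times per-pin data rate in Gbps):

Code:
# (bus width in bits / 8) * per-pin data rate in Gbps, commonly quoted specs.
print(4096 / 8 * 1.0)    # Fiji (Fury X), 4x HBM1 @ 1 Gbps   -> 512.0 GB/s
print(2048 / 8 * 1.89)   # Vega 10, 2x HBM2 @ ~1.89 Gbps     -> ~484 GB/s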
 

TSM

Member
For as well as AMD has done with Ryzen, I think they've screwed the pooch (where does that phrase even come from?) with Vega. I don't think they'll be able to keep the cost low, which would mean Nvidia would just have to introduce a modest price cut ($50? $75?) to undermine them. Or just show off Volta if it's around.

Even worse, Nvidia could deal a serious blow to them by moving up Volta, and AMD's flagship card could be competing with the 1160 in a few months at an untenable price point for AMD. I'd imagine that would be the nightmare scenario for AMD, since it'd drop the floor out from under their mid-tier cards. I wonder what kind of time frame Nvidia could have Volta ready for.
 

Steel

Banned
Even worse, Nvidia could deal a serious blow to them by moving up Volta, and AMD's flagship card could be competing with the 1160 in a few months at an untenable price point for AMD. I'd imagine that would be the nightmare scenario for AMD, since it'd drop the floor out from under their mid-tier cards. I wonder what kind of time frame Nvidia could have Volta ready for.

I don't see any reason for Nvidia to rush Volta with Vega where it is. They're probably pretty confident in their current schedule.
 
Even worse, Nvidia could deal a serious blow to them by moving up Volta, and AMD's flagship card could be competing with the 1160 in a few months at an untenable price point for AMD. I'd imagine that would be the nightmare scenario for AMD, since it'd drop the floor out from under their mid-tier cards. I wonder what kind of time frame Nvidia could have Volta ready for.

I think Nvidia would rather stretch out Pascal and rake in the extra cash. Unless something changes, I don't think they need to do much other than keep selling them, maybe bundle a game or two.

I imagine Vega stock is going to be quite low for the next couple of months anyway (especially if you want an AIB card), so more easy cruising for Nvidia, I guess.

I would think we'll see Volta early next year (just imo).
 

Durante

Member
OK, is this gonna wow people or what? It seems like the sentiment is that this is gonna be a dud, but this guy is hyping it up.
If you have a GPU which is a successful product, you don't need "hands-on events" or "blind gaming tests"; you just need to get it out to legitimate reviewers and have them evaluate and present the full spectrum of numbers.
 

Newboi

Member
If you have a GPU which is a successful product, you don't need "hands-on events" or "blind gaming tests"; you just need to get it out to legitimate reviewers and have them evaluate and present the full spectrum of numbers.

Any ideas as to why Vega's design appears to be so behind the curve? It draws a massive amount of power compared to the GTX 1080, while requiring water cooling to barely outperform it.

Even if the card was a year late due to HBM2 production issues, it would have been underwhelming regardless, given the massive heat and power concerns.
 

Kuosi

Member
A year late, higher power draw, higher manufacturing costs vs. the 1080; I don't think AMD can be too happy about it. Sure, if they sell it for under the price of the 1080 it may get some sales, but how much are they actually making from those sales?
 

dr_rus

Member
OK, is this gonna wow people or what? It seems like the sentiment is that this is gonna be a dud, but this guy is hyping it up.

"RX Vega FreeSync vs. GTX 1080 Ti G-Sync Blind Gaming test video being edited now (AMD Told me to compare it with GTX 1080)"
Depending on the price of the card, it can be either hype or a burn: if RX Vega has the 1080 Ti's price with the 1080's performance, then it makes sense to compare it to the 1080 Ti despite AMD's suggestion.

I don't think that will be the case though. AMD isn't stupid enough to price the card against the 1080 Ti. I think they just want to have some fun seeing whether people can notice a ~30% performance difference on a VRR display.

I thought that part of the appeal was that HBM2 would allow the cards to be much smaller? I take it that it's only feasible with water cooling since you need all that radiator surface area to cool the card?

The radiator is one thing; the power circuitry is another. It's hard to have a small card when you need to supply 300W+ to it. In fact, the WC Vega FE teardown on PCPer has shown that they use a second water chamber to take heat off the power converters as well:

 

ISee

Member
Depending on the price of the card, it can be either hype or a burn: if RX Vega has the 1080 Ti's price with the 1080's performance, then it makes sense to compare it to the 1080 Ti despite AMD's suggestion.

I don't think that will be the case though. AMD isn't stupid enough to price the card against the 1080 Ti. I think they just want to have some fun seeing whether people can notice a ~30% performance difference on a VRR display.



The radiator is one thing; the power circuitry is another. It's hard to have a small card when you need to supply 300W+ to it. In fact, the WC Vega FE teardown on PCPer has shown that they use a second water chamber to take heat off the power converters as well:


The second water chamber is just additional storage, according to GamersNexus.

http://www.gamersnexus.net/guides/2988-vega-fe-watercooled-tear-down-and-internals


GamersNexus said:
Anyway, once the liquid feeds in through the inlet, it eventually feeds out of the outlet and hits the VRM components. This includes chokes, FETs, and capacitors – everything, basically. The VRM components are hit with liquid after the core and HBM, meaning that the water will be pre-heated, but the VRM can handle far higher heat than the GPU + HBM2. This is the ideal way to route the coolant.
...
We are fairly confident that this is an additional liquid tank, intended to prolong life of the card as liquid inevitably permeates the tubes. Typically, you see around 5 years of life on CLCs for CPUs, and GPUs
 

dr_rus

Member
The second water chamber is just additional storage, according to GamersNexus.

http://www.gamersnexus.net/guides/2988-vega-fe-watercooled-tear-down-and-internals

There's clearly a second (blue) water plate over the VRMs in the photo, and the chamber above it isn't just additional storage; it's a secondary diaphragm water pump:

This unit on the right is part of the diaphragm pump design that makes this card interesting. Think of this as a flexible reservoir with a high-tension spring to create pressure back into the system. A diaphragm pump works with one-way check valves and reciprocating diaphragm material to create alternating areas of pressure and vacuum. The T-split you see at the top of the primary pump allows the liquid stored in the overflow area to maintain reliable usage of the cooler through the course of natural evaporation of fluid. This is very similar to the kinds of pumps used in fish tanks and artificial hearts, likely a more expensive solution than was found on the Radeon Pro Duo or Fury X as an attempt to correct the deficiencies of older generations (noise, reliability).

Basically, it's there to help cool the 350W+ thing off.
 

longdi

Banned
HBM2 in Vega is a huge disappointment.

There's actually less bandwidth than with HBM1 in Fiji back in 2015, although HBM1 was inherently limited to 4 GB.

My overclocked 1080 Ti has more bandwidth than HBM2 Vega :(

520 GB/s vs 484 GB/s.
 

Deleted member 325805

Unconfirmed Member
I got a FreeSync monitor today, but I'm using a GTX 970. I'm still interested in switching to AMD with Vega; even the bottom card would be a good upgrade for me. I want something that can hit 144 fps more regularly at 1080p.
 
Well I guess I'll stick to the RX 480. Was thinking about upgrading to Vega but this doesn't feel right :/

Maybe wait until 2018 or early 2019 to upgrade. By then GPUs (and displays/TVs) ought to have HDMI 2.1.
 