
Radeon RX Vega thread

FingerBang

Member
I'm glad I bought a Gigabyte G1 Gaming 1070 for $430 last August. They are $600 now? Can I mine bitcoins with this bitch or what is going on?

Yep. Basically, after the price of the 480/580 skyrocketed for the same reason, people started to move towards the 1070, which offers the best performance per watt.

As a consequence, nobody is able to buy a 480/580/1070 (and probably Vega) for retail price anymore. Yay.
 

kuYuri

Member
As with all GPU launches at this point, I'm more interested in AIB versions of Vega since I never buy reference due to the type of build I have, plus AIBs are just generally better.

I just hope that whatever AMD does next isn't super late relative to Nvidia's next major GPU launch, the way Vega was.
 

Krayz

Member
Yeah, you're clearly much more intelligent. The fact that different levels in a game can have vastly different performance metrics is clearly lost on you. And a Vega 56 beating a 1080 in Doom isn't a surprise at all. Between Vulkan and shader intrinsics, it's likely the best-optimized PC game there is for AMD.



1060 v 480 at launch
https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1060/26.html

1060 v 480 now
https://www.techpowerup.com/reviews/AMD/Radeon_RX_Vega_56/31.html

Dude, do you not understand what everyone else has been trying to tell you? Using relative performance charts to prove AMD cards age better is flawed.

Year one, four games
The 1060 performs better in three out of four of those games compared to the 480:
1080 (baseline) => 100%
1060 (GTAV, BF4, COD) => 75%
480 (BF1) => 25%

Year two, four games again, but two were replaced by new games. The two games that were removed were the ones that performed better on the 1060. The new games favor the 480. So now the 480 performs better in three out of the four games tested.

1080 (baseline) => 100%
480 (BF1, D2, DOOM) => 75%
1060 (GTAV) => 25%

Do you see how those charts could skew the outcome? Now see what happens when you reintroduce the two games that were removed and remove one game that favors the 480.

Year two, five games
1080 (baseline) => 100%
1060 (GTAV, BF4, COD) => 60%
480 (BF1, DOOM) => 40%
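
If you want to play with the arithmetic yourself, here's a rough Python sketch of the same idea (the per-game numbers are invented just to mirror the example above, they are not real benchmark results):

Code:
# Illustrative only: how a "relative performance" summary shifts when the
# game suite changes, even though no card actually got faster.

def relative_summary(results, baseline="1080"):
    """Average each card's fps as a percentage of the baseline card."""
    cards = results[next(iter(results))].keys()
    summary = {}
    for card in cards:
        ratios = [game[card] / game[baseline] * 100 for game in results.values()]
        summary[card] = sum(ratios) / len(ratios)
    return summary

# Year-one suite: three games that favor the 1060, one that favors the 480.
year_one = {
    "GTAV": {"1080": 100, "1060": 62, "480": 55},
    "BF4":  {"1080": 100, "1060": 60, "480": 54},
    "COD":  {"1080": 100, "1060": 61, "480": 56},
    "BF1":  {"1080": 100, "1060": 55, "480": 60},
}

# Year-two suite: BF4 and COD swapped out for D2 and DOOM, which favor the 480.
year_two = {
    "GTAV": {"1080": 100, "1060": 62, "480": 55},
    "BF1":  {"1080": 100, "1060": 55, "480": 60},
    "D2":   {"1080": 100, "1060": 56, "480": 61},
    "DOOM": {"1080": 100, "1060": 57, "480": 65},
}

print(relative_summary(year_one))  # 1060 ahead of the 480 on average
print(relative_summary(year_two))  # 480 ahead of the 1060: the suite changed, the cards didn't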
 

SRG01

Member
Watching this video, it seems Steve Burke from Gamers Nexus can do a 180MHz OC on the memory and it remains stable... All in all, he was able to get a 20% perf increase on the Vega 56 in this video in the final analysis, but he is quite adamant that they can do even better with more time and tweaking... He even suggests they will do a watercooling solution on Vega 56 to see how things improve for much bigger gains.

There are still constant reports of the primitive shaders, DSBR and HBC being off in Vega right now... from other sites mostly; however, Steve has contacted AMD on one thing... "trying to get them to open up the BIOS"... One tech reviewer was able to push more power through the Vega FE via a Windows registry entry; who knows if he will have any luck with Vega 56, but AMD is considering opening it up...

I'm watching buildzoid and he's describing the voltage/OC limits of HBM with respect to Vega. Seems like it's literally free performance.

On a personal note, a hardmod of the HBM voltage would be the best bet as it would bypass the BIOS.
 
Or maybe there are more games that run better on AMD in 2017 than in 2016... 2018 can be the opposite.

well that's kinda his point: legacy amd cards work better in modern game environments.

it's highly unlikely that it will be the other way around in 2018, as it never was in the past six years.

->

There's a misunderstanding going on here. People who claim AMD cards 'age well' don't just mean it purely from the perspective of what you've presented above, i.e. improvements from drivers to older games. It's about the fact that Polaris cards also perform relatively stronger in newer games, or games released after launch.
 

Nydus

Member
Yeah, you're clearly much more intelligent. The fact that different levels in a game can have vastly different performance metrics is clearly lost on you. And a Vega 56 beating a 1080 in Doom isn't a surprise at all. Between Vulkan and shader intrinsics, it's likely the best-optimized PC game there is for AMD.



1060 v 480 at launch
https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1060/26.html

1060 v 480 now
https://www.techpowerup.com/reviews/AMD/Radeon_RX_Vega_56/31.html

Then plz show me these levels. I looked at two other sites which show different fps but a constant performance delta.
 
I see, and it very well could be true. But it is impossible to prove that "fact".
Mainly because we have to take 1060 performance as a given that does not change over time in order to prove that Polaris "aged well" (we need some sort of baseline to work with). And that assumption is obviously false, because Nvidia and graphics engineers also spend time developing and improving their engines/games for Pascal.

But okay, let's push some data. Not because you gave me a task btw, just because I want to find this out for myself.

The RX 480 is on average 4.7% slower than a GTX 1060 FE (18 games total)

Performance improved by 2.53% this year (if 1060 FE performance is consistent). It's something?
[Benchmarks are mostly from computerbase, pcgh, gamegpu and techpowerup]

That is telling. Because you've used such large samples the differences in % seem smaller but 2.53% slower over 18 games is actually quite a significant swing from 7.7% the previous year.

If you took the 10 biggest AAA games from both years (because these are more relevant titles that people will push with their cards) the swing in perf would be even larger.
 

Karanlos

Member
Ordered a Vega 64. I know they recommend a 750W PSU but I only have a 700W, you guys think this is enough for Balanced mode? Got a Ryzen 1700X as CPU.
 

llien

Member
I just hope that whatever AMD does next isn't super late relative to Nvidia's next major GPU launch, the way Vega was.

AMD promised Navi to be 7nm, which means second half of 2018 at the earliest, more likely Q4, and quite likely to slip into 2019. If they are lucky and GloFo scores a home run, perhaps Q2 2018. It's a gamble for them, but with their budget, they must gamble.

After seeing Ryzen beat Intel on perf/watt, I think it's clear that GloFo isn't as bad as many thought.


Ordered a Vega 64. I know they recommend a 750W PSU but I only have a 700W, you guys think this is enough for Balanced mode? Got a Ryzen 1700X as CPU.

More than enough. AdoredTV had the liquid Vega 64 (which has a 50W higher TDP than the air-cooled one).
So he measured:

550W-ish (total system power consumption) in Turbo mode
460W-ish in Balanced (lost only 1-2% of perf, Turbo makes no sense)
370W-ish in Power Saving mode (5-15% loss, differs from game to game)

Even with a not-very-efficient power supply you shouldn't have problems.
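
If you want to sanity-check the headroom yourself, a rough sketch (assuming the ~460W Balanced figure above is measured at the wall, and guessing ~90% PSU efficiency at that load; not a substitute for a proper PSU calculator):

Code:
# Rough PSU headroom estimate. Assumptions: wall draw of ~460 W in Balanced
# mode (AdoredTV's figure above) and ~90% PSU efficiency at that load.

psu_rating_w = 700        # rated DC output of the PSU
wall_draw_w = 460         # whole-system draw at the wall, Balanced mode
efficiency = 0.90         # assumed PSU efficiency

dc_load_w = wall_draw_w * efficiency   # what the PSU actually has to deliver
headroom_w = psu_rating_w - dc_load_w

print(f"Estimated DC load: {dc_load_w:.0f} W")
print(f"Headroom on a {psu_rating_w} W unit: {headroom_w:.0f} W ({headroom_w / psu_rating_w:.0%} spare)")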
 

ISee

Member
That is telling. Because you've used such large samples the differences in % seem smaller but 2.53% slower over 18 games is actually quite a significant swing from 7.7% the previous year.

If you took the 10 biggest AAA games from both years (because these are more relevant titles that people will push with their cards) the swing in perf would be even larger.

It is at least an indication that there could be improvements. I'd even go the extra mile and say: yes, there seem to be small improvements (look, I'm an engineer, going from "I see no proof" to "there seems to be an improvement" is a huge step). But as already mentioned, the biggest problem here is that there is no base data to check the results against (in the end you are still comparing apples and oranges), and the second problem is that you have to make the assumption that 1060 FE performance is consistent in all engines and games. Which is false, because we have known forever that some games prefer GCN over Pascal/Maxwell in the first place (= there is no consistent lead over Polaris in the first place). Further, Nvidia drivers also evolve over time, but we have no data about the gradient at which Nvidia performance is improving vs. the gradient at which AMD performance is improving. I'm not a big fan of Linus, but take a look at the video he made about Nvidia drivers the other day. Even old Kepler cards still profit from newer drivers in games (sometimes). There are just too many unknown factors to really make a profound statement.

I've used as much data as I was able to find on purpose (without having to work for hours, because come on, I don't want to do that). The problem with just picking the 10 biggest games is that you start to influence the data. Is it, for example, okay to take AMD- and Nvidia-sponsored AAA titles? Is it okay to prefer DX12 over DX11? Who defines what an important/big AAA title is? Is Hellblade big-10 AAA worthy? It's just a 30€ game, but it for sure is demanding and very good looking. Or what about Injustice 2? Long story short: the only way to approach this is to stay open minded and to collect as much data as possible. Everything else could be declared as cherry-picking (and rightfully so), which is very counterproductive when trying to prove a point. But it's at least an interesting conversation :)
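
To make the "gradient" point concrete, a toy sketch with invented numbers: if the 1060 also gets faster over the year, what a relative chart shows is only the difference between the two improvement rates, not AMD's absolute driver gain.

Code:
# Toy numbers, purely illustrative: what a relative chart "sees" when both
# vendors improve, versus what each card actually gained.

fps_480_2016, fps_480_2017 = 50.0, 54.0     # hypothetical RX 480 fps in the same game
fps_1060_2016, fps_1060_2017 = 54.0, 56.0   # hypothetical GTX 1060 fps (it improved too)

amd_gain = fps_480_2017 / fps_480_2016 - 1    # +8.0% actual gain
nv_gain = fps_1060_2017 / fps_1060_2016 - 1   # +3.7% actual gain

rel_2016 = fps_480_2016 / fps_1060_2016       # 480 at 92.6% of the 1060
rel_2017 = fps_480_2017 / fps_1060_2017       # 480 at 96.4% of the 1060
apparent_gain = rel_2017 / rel_2016 - 1       # ~+4.1%, roughly the difference of the two rates

print(f"AMD actual gain: {amd_gain:.1%}, NV actual gain: {nv_gain:.1%}")
print(f"What the relative chart shows: {apparent_gain:.1%}")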
 
It is at least an indication that there could be improvements. I'd even go the extra mile and say: yes, there seem to be small improvements (look, I'm an engineer, going from "I see no proof" to "there seems to be an improvement" is a huge step). But as already mentioned, the biggest problem here is that there is no base data to check the results against (in the end you are still comparing apples and oranges), and the second problem is that you have to make the assumption that 1060 FE performance is consistent in all engines and games. Which is false, because we have known forever that some games prefer GCN over Pascal/Maxwell in the first place (= there is no consistent lead over Polaris in the first place). Further, Nvidia drivers also evolve over time, but we have no data about the gradient at which Nvidia performance is improving vs. the gradient at which AMD performance is improving. I'm not a big fan of Linus, but take a look at the video he made about Nvidia drivers the other day. Even old Kepler cards still profit from newer drivers in games (sometimes). There are just too many unknown factors to really make a profound statement.

I've used as much data as I was able to find on purpose (without having to work for hours, because come on, I don't want to do that). The problem with just picking the 10 biggest games is that you start to influence the data. Is it, for example, okay to take AMD- and Nvidia-sponsored AAA titles? Is it okay to prefer DX12 over DX11? Who defines what an important/big AAA title is? Is Hellblade big-10 AAA worthy? It's just a 30€ game, but it for sure is demanding and very good looking. Or what about Injustice 2? Long story short: the only way to approach this is to stay open minded and to collect as much data as possible. Everything else could be declared as cherry-picking (and rightfully so), which is very counterproductive when trying to prove a point. But it's at least an interesting conversation :)

There is a trend. On your point about cherry-picking I'd disagree somewhat. As we're talking about fps/performance here, the games that benchers test in their suites are specifically selected because they are both graphically demanding to some degree and popular, notable PC releases. Therefore we should cull some games from a selection to get more relevant data, otherwise you could end up with stuff like Rocket League in test suites. I know you put that together quickly, but I would take out games like Styx or RIME. Are these graphically demanding and notable PC releases? No, so SNIP, out they come (notably, more obscure releases like these tend to be unoptimized messes because of the smaller budget and lack of resources/expertise to properly optimize for AMD or NV hardware, another reason to extract them, and it shows in the anomalous figures). Take out the more obscure titles, replace them with more relevant titles, and the improvement to AMD performance would be even more pronounced.

Lastly, the AMD cards 'aging well' (like succulent fine red wine) applies more drastically to cards previous to the Polaris arch. I mean look at the turnaround of the Fury line. WOW. Similarly the Fiji line. Polaris aging well has been relatively tame. Maybe I'll put this together when I get home from work.
 
There is a trend. On your point about cherry-picking I'd disagree somewhat. As we're talking about fps/performance here, the games that benchers test in their suites are specifically selected because they are both graphically demanding to some degree and popular, notable PC releases. Therefore we should cull some games from a selection to get more relevant data, otherwise you could end up with stuff like Rocket League in test suites. I know you put that together quickly, but I would take out games like Styx or RIME. Are these graphically demanding and notable PC releases? No, so SNIP, out they come (notably, more obscure releases like these tend to be unoptimized messes because of the smaller budget and lack of resources/expertise to properly optimize for AMD or NV hardware, another reason to extract them, and it shows in the anomalous figures). Take out the more obscure titles, replace them with more relevant titles, and the improvement to AMD performance would be even more pronounced.

Lastly, the AMD cards 'aging well' (like succulent fine red wine) applies more drastically to cards previous to the Polaris arch. I mean look at the turnaround of the Fury line. WOW. Similarly the Fiji line. Polaris aging well has been relatively tame. Maybe I'll put this together when I get home from work.

there is no cherry picking happening. or is techpowerup now amd biased?
 

thelastword

Banned
I'm watching buildzoid and he's describing the voltage/OC limits of HBM with respect to Vega. Seems like it's literally free performance.

On a personal note, a hardmod of the HBM voltage would be the best bet as it would bypass the BIOS.
Yes, buildzoid is great, just like Steve. Here's the thing though: some users have been able to bypass the security measures of the Vega 56 BIOS through a modified Linux kernel and custom BIOS. That's Linux though; however, they have been able to bypass power limits on the Vega cards, including the Vega 56, in Windows through a PowerPlay table entry in the registry, I guess the same as they did with Vega FE... So I'm looking forward to seeing Steve's next video on gains using the PPTs. I wish buildzoid would get an RX Vega 56 and 64 card though, he would have some great insight into overclocking potential as well...


OTOH


I wish some of these sites would stop talking about Vega 56 and 64 going up to $500.00 and $600.00 after launch. It's from the same guy who said RX Vega 56 would have a 70 MH/s hash rate... I could not even get a preorder of RX Vega off Amazon, stock is not available or they've been sold out, and in such instances there will always be a markup on prices. People are pretending like they've never seen this before and are trying to obfuscate what's really going on, with baseless price markup tweets and posts...

It's crazy that people boast about the 1070 and 1080 being on the market for over a year, but when was the last time you could get either at MSRP? And one of the Vega cards has not even launched yet. Vega 64 is just not widely available to purchase yet, especially online, so maybe there has been a clean-out at retail. Someone is buying all of them before they get through the channel. So yes, it's nothing strange; for a long time you could not get a 6700K at MSRP either...

I mean, if AMD increases the price just after launch and did not make that clear at SIGGRAPH, they would be shooting themselves by doing so now. They had enough time to decide internally how to price these cards, so these $100 markup ramblings after launch make little sense. At least let AMD announce it before it makes the rounds on the internet...

I also don't want to preface an excuse for them on a markup, but it's also clear that HBM2 is expensive and AMD has given us a very high-quality GPU with probably the best VRM and power phases in a video card yet, so maybe in hindsight they think they should have priced it higher... As it stands, AIB Vega 56 cards will wallop 1070 AIBs and get closer to GTX 1080 performance OC'd, so I can understand the sentiment. I mean, if 1070s are going for $440-500, why can't a better card go for a $500.00 MSRP?... But what's done is done. AMD needs the marketshare, that should be top priority imo...
 

ISee

Member
There is a trend. On your point about cherry-picking I'd disagree somewhat. As we're talking about fps/performance here, the games that benchers test in their suites are specifically selected because they are both graphically demanding to some degree and popular, notable PC releases. Therefore we should cull some games from a selection to get more relevant data, otherwise you could end up with stuff like Rocket League in test suites. I know you put that together quickly, but I would take out games like Styx or RIME. Are these graphically demanding and notable PC releases? No, so SNIP, out they come (notably, more obscure releases like these tend to be unoptimized messes because of the smaller budget and lack of resources/expertise to properly optimize for AMD or NV hardware, another reason to extract them, and it shows in the anomalous figures). Take out the more obscure titles, replace them with more relevant titles, and the improvement to AMD performance would be even more pronounced.

Lastly, the AMD cards 'aging well' (like succulent fine red wine) applies more drastically to cards previous to the Polaris arch. I mean look at the turnaround of the Fury line. WOW. Similarly the Fiji line. Polaris aging well has been relatively tame. Maybe I'll put this together when I get home from work.

Would be interesting to see.

I wonder if the 'it's aging well' argument has a lot of meaning in the gaming business in the first place. Hardware loses value every single day (just a tiny bit) and will be outdated and replaced one day in the future. Most people only care about getting the best performance for their display resolution, desired refresh rate and set budget, not about name branding (or at least I do). I bought a GTX 1080 ~9 months ago, but if I were buying today I would have to decide between a Vega 64 and a GTX 1080. That would be difficult enough, but I don't think that "the fact" that Vega 64 may be faster than a 1080 FE in 2-3 years would influence my decision at all, because I will be buying a new GPU anyway when that happens.
Let's assume we have a 2% performance increase every 7 months and a -10% performance difference to the competing product on release. We would need ~3 years to close the performance gap, and would have better performance after nearly 4 years. Is that really relevant? A 2017 GPU will be outdated in 2021. A GTX 780 and a 7970 were top dogs in 2013, but nobody would consider them good gaming GPUs by today's standards, and entry-level GPUs like a 570 are able to run circles around them.
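
The back-of-the-envelope version of that, treating the 2% as percentage points as above (hypothetical figures, just to show the arithmetic):

Code:
# Hypothetical figures from the paragraph above: a card launches 10% behind
# the competitor and gains 2 percentage points every 7 months.

relative_perf = 90.0      # percent of the competitor's performance at release
gain_per_step = 2.0       # percentage points gained per step
months_per_step = 7

months = 0
while relative_perf < 100.0:
    relative_perf += gain_per_step
    months += months_per_step

print(f"Pulls even after ~{months} months (~{months / 12:.1f} years)")  # ~35 months, roughly 3 years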

Vega 64 is available for €509 for those who are fine with waiting 15 days (check the code), in France:

http://www.ldlc.com/navigation/rx+vega/

That's a very good deal, even for blower cards.
 

ethomaz

Banned
well that's kinda his point: legacy amd cards work better in modern game environments.

it's highly unlikely that it will be the other way around in 2018, as it never was in the past six years.

->
Sure? Let's take the links you guys are basing this on.

2016: https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1060/26.html
2017: https://www.techpowerup.com/reviews/AMD/Radeon_RX_Vega_56/31.html

Now look at the 2017 benchmarks and see that they didn't test the RX 480 you are using to make the comparison... they estimated the RX 480 results to create the Performance Summary. That is something I don't like about TechPowerUp: they test some cards but make an overall performance graphic with cards that they didn't test again in 2017.

The only cards tested in 2017 are: RX 580, Fury X, GTX 1060, 980 Ti, Vega 56, GTX 1070, Vega 64, GTX 1080 and GTX 1080 Ti.

1) Choose which two you want to compare and post the results you get.

2) Remove DOOM (the controversial benchmark in 2017) from the comparison and post the results you get again.

Tell me how changing the games (the data) affects the performance difference... that doesn't show proof of improvement... that is just different games being tested... if 2016 had had something like Doom, the result could have been better for AMD in the performance summary.
 

dr_rus

Member
Which was exactly my point. I never said improvements in existing games were a major source of AMD's better aging.

This point is flawed as well, as this year, for example, there are many more games so far which actually perform better on NV's h/w, which is enough to make the whole "gets better over time" thing into what it is - a fantasy. You're using a comparison between different benchmarking suites as an indication of general performance change while there's no basis that this is even remotely true.

Or maybe there are more games that run better on AMD in 2017 than in 2016... 2018 can be the opposite.

It's already the opposite. In fact, it has always been the opposite. Outside of a small number of AAA titles (of which most of the ones used in benchmarks are actually AMD sponsored - take a look at Anandtech's new suite used for Vega benchmarking, for example), the bulk of all games releasing on PC run better on NV h/w comparatively. This has been the case in 2015, it has been the case in 2016, and in 2017 it's even more pronounced so far.
 

Locuza

Member
After Computerbase, PCGH also tested the HBCC.
PCGH only allocated as little as they could, ~3.8 GB for ~11.8 GB of "unified memory" (Computerbase reserved 8 GB for 16 GB).
Under 1080p the HBCC never decreased performance but improved it a little bit (1-3%).
One game stood out and that was Metro LL Redux.
Metro LL Redux was on average 6% faster with HBCC on, and with SSAA that increased to 11%.
The min FPS got 1% faster, respectively 7% (with SSAA).

[HBCC benchmark chart]

http://www.pcgameshardware.de/Radeon-RX-Vega-64-Grafikkarte-266623/Specials/HBCC-Gaming-Benchmark-1236099/

Some open questions remain.
How much does the allocated memory affect performance, and does it make a difference if the platform uses dual-channel vs. quad-channel memory?
One might also test the effects at higher resolutions.
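
For reference, the "unified memory" figure is simply the card's 8 GB of HBM2 plus the system RAM reserved for the HBCC segment, so the two configurations above work out like this (trivial sketch, assuming the 8 GB Vega cards used in those tests):

Code:
# HBCC "unified memory" = on-card HBM2 + system RAM reserved for the HBCC segment.
# Assumes an 8 GB Vega card, as used in the PCGH and Computerbase tests.

HBM2_GB = 8.0

def unified_pool(system_ram_reserved_gb):
    return HBM2_GB + system_ram_reserved_gb

print(unified_pool(3.8))   # ~11.8 GB (PCGH's minimal allocation)
print(unified_pool(8.0))   # 16 GB (Computerbase's configuration)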
 
This point is flawed as well, as this year, for example, there are many more games so far which actually perform better on NV's h/w, which is enough to make the whole "gets better over time" thing into what it is - a fantasy. You're using a comparison between different benchmarking suites as an indication of general performance change while there's no basis that this is even remotely true.



It's already the opposite. In fact, it has always been the opposite. Outside of a small number of AAA titles (of which most of the ones used in benchmarks are actually AMD sponsored - take a look at Anandtech's new suite used for Vega benchmarking, for example), the bulk of all games releasing on PC run better on NV h/w comparatively. This has been the case in 2015, it has been the case in 2016, and in 2017 it's even more pronounced so far.

Just checked anands

Battlefield - amd
Ashes - amd
Doom - amd
Ghost recon - nvidia
Dawn of war - amd
Deus Ex - amd
Gta v - nvidia (why is this title even benched anymore)
F1 - amd
Total war - nvidia

6 to 3 amd
 
Just checked anands

Battlefield - amd
Ashes - amd
Doom - amd
Ghost recon - nvidia
Dawn of war - amd
Deus Ex - amd
Gta v - nvidia (why is this title even benched anymore)
F1 - amd
Total war - nvidia

6 to 3 amd


Let's use some more

Prey-Nvidia
Hellblade-Nvidia
Mass Effect Andromeda-Nvidia
Dishonored 2-Nvidia
For Honour-Nvidia
Gears of War 4-Nvidia
Quantum Break-Nvidia
Watchdogs 2-Nvidia
Far Cry Primal-Nvidia
Fallout 4-Nvidia

10 to 0 Nvidia.

This goes both ways and only serves to illustrate that some games are favourable to AMD and others to Nvidia, and your choice of games will affect the outcome.
 

kuYuri

Member
Just checked anands

Battlefield - amd
Ashes - amd
Doom - amd
Ghost recon - nvidia
Dawn of war - amd
Deus Ex - amd
Gta v - nvidia (why is this title even benched anymore)
F1 - amd
Total war - nvidia

6 to 3 amd

I'm guessing it's because it's a popular game that people buy.

I don't understand why some places benchmark PUBG considering the game isn't even optimized and is in early access.
 
I'm guessing it's because it's a popular game that people buy.

I don't understand why some places benchmark PUBG considering the game isn't even optimized and is in early access.

It's extremely old and its rendering isn't at all indicative of what games will be doing. Very poor choice of title.

And yes, PUBG is also a very poor choice.
 

ethomaz

Banned
I'm guessing it's because it's a popular game that people buy.

I don't understand why some places benchmark PUBG considering the game isn't even optimized and is in early access.
Because it is what gamers are talking about and playing right now, so they want to know how it runs.

In simple terms, it is popular, and that is what increases review page views.
 

AmyS

Member
Post-Volta most certainly. Volta is unlikely to reach gamers, ever, as of right now.

Well, a full Volta V100, as used in the newest Tesla card, is unlikely to ever reach gamers, but that certainly doesn't mean the Volta GPU architecture isn't going to reach consumer graphics cards in 2018, i.e. a GV104 in the next mainstream GeForce GTX card.
 

Deleted member 17706

Unconfirmed Member
What a bunch of garbage if true. It's just not a $600 card.
 

ZOONAMI

Junior Member
Yes, buildzoid is great, just like Steve. Here's the thing though: some users have been able to bypass the security measures of the Vega 56 BIOS through a modified Linux kernel and custom BIOS. That's Linux though; however, they have been able to bypass power limits on the Vega cards, including the Vega 56, in Windows through a PowerPlay table entry in the registry, I guess the same as they did with Vega FE... So I'm looking forward to seeing Steve's next video on gains using the PPTs. I wish buildzoid would get an RX Vega 56 and 64 card though, he would have some great insight into overclocking potential as well...


OTOH


I wish some of these sites would stop talking about Vega 56 and 64 going up to $500.00 and $600.00 after launch. It's from the same guy who said RX Vega 56 would have a 70 MH/s hash rate... I could not even get a preorder of RX Vega off Amazon, stock is not available or they've been sold out, and in such instances there will always be a markup on prices. People are pretending like they've never seen this before and are trying to obfuscate what's really going on, with baseless price markup tweets and posts...

It's crazy that people boast about the 1070 and 1080 being on the market for over a year, but when was the last time you could get either at MSRP? And one of the Vega cards has not even launched yet. Vega 64 is just not widely available to purchase yet, especially online, so maybe there has been a clean-out at retail. Someone is buying all of them before they get through the channel. So yes, it's nothing strange; for a long time you could not get a 6700K at MSRP either...

I mean, if AMD increases the price just after launch and did not make that clear at SIGGRAPH, they would be shooting themselves by doing so now. They had enough time to decide internally how to price these cards, so these $100 markup ramblings after launch make little sense. At least let AMD announce it before it makes the rounds on the internet...

I also don't want to preface an excuse for them on a markup, but it's also clear that HBM2 is expensive and AMD has given us a very high-quality GPU with probably the best VRM and power phases in a video card yet, so maybe in hindsight they think they should have priced it higher... As it stands, AIB Vega 56 cards will wallop 1070 AIBs and get closer to GTX 1080 performance OC'd, so I can understand the sentiment. I mean, if 1070s are going for $440-500, why can't a better card go for a $500.00 MSRP?... But what's done is done. AMD needs the marketshare, that should be top priority imo...

1080s are readily available at sub-MSRP in most cases because they are shit for mining Ether.

You can get a Gigabyte 3-fan 1080 for $509 on Newegg right now, which is insane. It will roast even a watercooled Vega 64 once OC'd.
 

TSM

Member
What a bunch of garbage if true. It's just not a $600 card.

At this point it looks like the $499 MSRP was just AMD trying not to look like assholes to gamers while they move these cards along to miners at higher margins. Retailers were always going to mark these up due to high demand, so AMD decided to get a piece of the action.
 

Marmelade

Member
At this point it looks like the $499 MSRP was just AMD trying not to look like assholes to gamers while they move these cards along to miners at higher margins. Retailers were always going to mark these up due to high demand, so AMD decided to get a piece of the action.

Not really, it's just a huge 486mm², 12.5B-transistor chip with 8GB of HBM2.
That's expensive.

edit: for comparison, GP102 (1080 Ti) is 471mm², 12B transistors and uses GDDR5X
 

TSM

Member
Not really, it's just a huge 486mm², 12.5B-transistor chip with 8GB of HBM2.
That's expensive.

Yes, so AMD should have had a higher MSRP to reflect that fact. Instead they lowballed the MSRP and then made it completely unprofitable for retailers to sell their cards at the lowball MSRP. This works out exactly the same as AMD having a higher MSRP and giving the retailers better margins. The only difference is AMD doesn't look like they are price gouging, thanks to a lower MSRP that you can't actually buy the cards at outside of an impossible-to-find SKU.
 

napata

Member
Just checked anands

Battlefield - amd
Ashes - amd
Doom - amd
Ghost recon - nvidia
Dawn of war - amd
Deus Ex - amd
Gta v - nvidia (why is this title even benched anymore)
F1 - amd
Total war - nvidia

6 to 3 amd

Pretty much all games from 2016 and older. 2017 has been heavily in favor of Nvidia, so it seems like the 1060 is actually aging better. <- this is a joke btw.

It just seems pretty even/random to me and might switch by the end of the year. Or maybe the 1060 will keep winning. Who knows?

I don't see why PUBG is not relevant? It's what I'm playing. It's what everyone's playing. It's the most popular PC game in a long time. Who cares if it isn't doing anything new tech-wise. It's a very relevant game for a lot of people. People build PCs for it.
 

Deleted member 17706

Unconfirmed Member
At this point it looks like the $499 MSRP was just AMD trying not to look like assholes to gamers while they move these cards along to miners at higher margins. Retailers were always going to mark these up due to high demand, so AMD decided to get a piece of the action.

Fair enough, but man, FreeSync be damned, just about anyone interested in gaming and willing to drop that kind of cash on a video card would be much better off with a 1080 Ti, or saving a decent amount of money and going with a normal 1080.
 

Worth pointing this out as well:
Our present understanding is that Newegg received 60-70 units allocated for "packs" on their store, but a significantly lower number of standalone cards. That'd explain why we saw the inventory and sell-through behavior at launch.
That would explain why the Vega 64 has simply not been in stock, if one of the largest computer component retailers in the US got less than 200 cards to sell. "Increasing stock for launch" my arse.
 

sirap

Member
Getting my hands on one of these is more stressful than the Switch launch. FUCK, it's not like Nvidia prices are great either.
 

Kayant

Member
Price spike was 50 Euro, not 100, so not sure what to make with this news.
Especially after gibbo's "RX Vega can do 70-100 MH/s" story..
http://www.ldlc.com/navigation/rx+vega/
Sure, his statement there turned out not to be accurate, but this is AIB partners confirming to GN what Gibbo said, and also this:
Kitguru said:
Also, when a rep says this and then edits it later, it's pretty clear something is up.
From reddit -
AMDMatt on the Overclockers UK forum:
"We are working on a statement and plan to issue something in the following days.
Note, the statement won't come from me, I'm part of AMD GCC, it will come from AMD PR."
Surely a simple question regarding what the true price of Vega is doesn't require days to craft an answer.
For anyone not in the know, in the UK (and other countries) the price of Vega started off at £449 but was quickly increased by £100. Overclockers stated that this was on the instruction of AMD. They were being subsidised to sell at the lower price only for a limited number of cards.
Original post that has been edited - https://forums.overclockers.co.uk/threads/the-vega-review-thread.18788643/page-43#post-31069661
 