
Radeon RX Vega thread

horkrux

Member
Sony and MS should try to get a good Nvidia contract for the next consoles. How are they going to get 4K games with better-than-current graphics with a Vega 56 equivalent in the PS5 and XB2?

Can't they get Navi for consoles? I mean it's still quite a long way off (but so is Navi to be fair lol).
 

Schlonky

Neo Member
Sony and MS should try to get a good Nvidia contract for the next consoles. How are they going to get 4K games with better-than-current graphics with a Vega 56 equivalent in the PS5 and XB2?

How Nvidia's and AMD's GPU architectures compare on the PC - where there is generally going to be a strong CPU to support the GPU, and where the driver has to do tons of work to generate optimal workloads from more generic code - isn't necessarily going to translate directly to how well those architectures would compare in a console, where there is a weak CPU to support the GPU and where the workloads can be fully optimized to get the best performance on a specific GPU architecture.
 

dr_rus

Member
Just saw the DF review and Vega 64 really is a massive failure. How can a 47 percent increase in core clocks and a 40 percent increase in transistors over Fury X only result in a 25 percent uplift in performance in most games? Can anyone explain this outcome? Given how inefficient their architecture is, one gets very worried about the next-gen consoles and their performance leap over the current gen. PS5 and Xbox Two will most likely use the follow-up to the RX 480 on a 7nm process (Navi), which should result in largely the same flops and performance as Vega 56 or Vega 64. Not very impressive considering what we could expect from a next-gen GTX 1160 or 1260 (Volta or beyond). I really hope AMD switches architecture before Navi. Otherwise, AMD's law of diminishing returns will destroy next-gen console performance.

I also want to question the driver argument. Based on history and previous AMD cards, can we really expect a dramatic increase in performance over time? Seems like wishful thinking to me. Given the flops of Vega 64, we should expect GTX 1080 Ti performance, but we are not even close and I doubt drivers will change that.

It's not really 47% because Vega is universally power limited, which means that the clocks are throttled down all the time in real-world apps. Also, its bandwidth is essentially unchanged, which means that whatever shader processing increases it has can be limited by memory.

Tbh, Vega's performance is mostly where a GCN chip would land if you account for the clock throttling in its theoretical specs. What is still quite unbelievable is how many transistors they've spent on getting there - and the fact that none of these complexity increases has any effect on execution effectiveness.
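As a rough sanity check on that argument, here's a back-of-envelope comparison using public spec-sheet numbers for Fury X and Vega 64 (boost clocks; sustained clocks on Vega throttle lower, as noted above):

```python
# Peak FP32 throughput: 2 ops (fused multiply-add) per shader per clock.
# Spec-sheet boost clocks; real sustained clocks on Vega sit below these.
def tflops(shaders, clock_mhz):
    return 2 * shaders * clock_mhz * 1e6 / 1e12

fury_tf = tflops(4096, 1050)   # Fury X:  ~8.6 TFLOPS, 512 GB/s HBM
vega_tf = tflops(4096, 1546)   # Vega 64: ~12.7 TFLOPS, 484 GB/s HBM2

print(f"compute uplift:  {vega_tf / fury_tf - 1:.0%}")    # ~47% on paper
print(f"bytes/FLOP Fury: {512 / (fury_tf * 1000):.3f}")
print(f"bytes/FLOP Vega: {484 / (vega_tf * 1000):.3f}")   # far more bandwidth-starved
```

So the paper uplift is ~47%, but bytes of bandwidth per FLOP drop by more than a third, which is consistent with the memory-limited explanation.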
 
All y'all talking about what's going in the PS5 & XBWhatever are jumping the gun a bit. Next gen is almost certainly a long ways away (we wouldn't be getting the 4K refresh consoles otherwise), so by the time Sony and MS finalize what's going to be in their next systems Vega will be nothing more than a bad memory.
 

TarNaru33

Banned
Doesn't the Xbox One X have a Polaris equivalent running 4k already? (to some degree). Zen + Vega APU sounds entirely plausible.

That guy was talking about Nvidia hardware in the next console. I don't see it as very likely due to Nvidia's desire for higher margins and having no real need to do so if AMD still can't compete.

I am not ruling out a better GPU or CPU for the next gen consoles, just ones from Intel or Nvidia.
 

horkrux

Member
So why is everyone hating on the Vega 64? I upgraded from an RX 480 8GB and the performance upgrade is amazing.

Because it seems to be entirely pointless for gaming when you can just overclock a 56 to get the same performance. Add a 64 BIOS into the mix and things get really ugly.
 
So why is everyone hating on the Vega 64? I upgraded from an RX 480 8GB and the performance upgrade is amazing.

Because you could have had this performance for over a year already. The die size is also bigger than a 1080 Ti's, meaning AMD has less room than Nvidia to scale up while starting from a much lower performance point.
 
So why is everyone hating on the Vega 64? I upgraded from an RX 480 8GB and the performance upgrade is amazing.

because while it is a 'good' card (in relative AMD terms), it is barely comparable to Nvidia's flagship architecture from LAST year....

Just how many years are AMD behind in terms of GPU when compared to Nvidia?

Ryzen might have given Intel a run for its money for system builders on the mid to high end range in terms of price to performance ratio.

But Vega? Going by the current trajectory, all Nvidia needs to do is drop their cards' prices to slaughter AMD.

And no....don't think 'future driver releases' will fix this performance gap in the GPU space.
 

thelastword

Banned
So why is everyone hating on the Vega 64? I upgraded from an RX 480 8GB and the performance upgrade is amazing.
Don't worry about it, there are just lots of NV fans in here trying to pounce on anything that could be considered negative on Vega and skew it quite a bit. It's like these guys don't even want competition in the market, because NV has been so good to us....right ;(.. The way I see it, if you've been gaming on NV and enjoying it and think it's the best value out there, why are you so disgruntled when someone else expresses that they're buying Vega because they believe it's great value to them, they have a FreeSync monitor, DX12 and Vulkan are the way ahead, and they expect drivers will improve perf significantly going forward.

I think Richard is right on one thing though: on paper, Vega should be doing much better. I agree with this and it's one of the reasons why I figured they had enough power to take on the NV triplets....Yet despite Vega 64's current performance, they finally have some cards on the market that are targeting the high-end NV cards, and that's a good thing....I believe 64 and 56 performance will improve quite a bit, especially with better cooling (AIBs) and better drivers, and I believe anyone who's into tech can see that just fine...That's my takeaway from Richard's video tbh....
 

joesiv

Member
Whether they are going to fix the drivers and whether the drivers are at fault for the lower-than-expected performance increase are two separate questions, though.

Raja has said on more than one occasion that the driver/software side is very hard. So it seems that the software AMD wants is taking far longer than estimated.

The interesting thing to me is that if they do finally sort out the software/driver side of things, even if it takes Navi to get there, it will help Vega owners as well, so Vega could be the best fine wine argument yet.
 
D

Deleted member 17706

Unconfirmed Member
So why is everyone hating on the Vega 64? I upgraded from an RX 480 8GB and the performance upgrade is amazing.

Because the GTX 1080 came out 1 year and 4 months ago and Vega is just now sort of kind of matching it while consuming a lot more power to get there.

It's a good card. It's just incredibly late. Compound that with the fact that it's being sold for way over MSRP by almost everyone and it's hard to recommend at the moment.

Don't worry about it, there are just lots of NV fans in here trying to pounce on anything that could be considered negative on Vega and skew it quite a bit. It's like these guys don't even want competition in the market, because NV has been so good to us....right ;(.. The way I see it, if you've been gaming on NV and enjoying it and think it's the best value out there, why are you so disgruntled when someone else expresses that they're buying Vega because they believe it's great value to them, they have a FreeSync monitor, DX12 and Vulkan are the way ahead, and they expect drivers will improve perf significantly going forward.

I think Richard is right on one thing though: on paper, Vega should be doing much better. I agree with this and it's one of the reasons why I figured they had enough power to take on the NV triplets....Yet despite Vega 64's current performance, they finally have some cards on the market that are targeting the high-end NV cards, and that's a good thing....I believe 64 and 56 performance will improve quite a bit, especially with better cooling (AIBs) and better drivers, and I believe anyone who's into tech can see that just fine...That's my takeaway from Richard's video tbh....

I don't think it's fair at all to call people in here that are down on Vega Nvidia fanboys. I bought a Vega 64 despite the performance because I was able to get it at $599/no sales tax with the games bundled in and I wanted something for my FreeSync monitor (which I probably shouldn't have bought in retrospect). I absolutely want competition, but AMD is just not bringing it in the GPU space right now.
 
So why is everyone hating on the Vega 64? I upgraded from an RX 480 8GB and the performance upgrade is amazing.

It runs hot, performs well below expectations, and consumes too much power. Nvidia released a GPU with almost half the TDP that's just as good, for the same price, more than a year ago.
 
D

Deleted member 17706

Unconfirmed Member
It runs hot, performs well below expectations, and consumes too much power. Nvidia released a GPU with almost half the TDP that's just as good, for the same price, more than a year ago.

Let's not forget that the Founder's Edition and the initial AIB cards were all around $700 or more for the GTX 1080 at release. It took a while for the price to come down where it is now.
 

thelastword

Banned
I don't think it's fair at all to call people in here that are down on Vega Nvidia fanboys. I bought a Vega 64 despite the performance because I was able to get it at $599/no sales tax with the games bundled in and I wanted something for my FreeSync monitor (which I probably shouldn't have bought in retrospect). I absolutely want competition, but AMD is just not bringing it in the GPU space right now.
We are making the same point though; there are clear reasons why you bought one...We are all asking why we aren't getting more perf out of 10.5, 12.6 and 13.7 TF cards......and it's what we've all been asking and deciphering since launch, with UV and OC and of course wanting better drivers or more feature unlocks. It's generally DF's conclusion too....Even now, we have an inkling of what's holding Vega back: it has the raw processing power, but the cooling and drivers are not there yet....

We all know what NV cards are/were on the market before now; nobody has to tell me that the GTX 1080 and Ti existed a year before Vega, and we know their perf and their prices too....Despite all that, Vega's performance in its current state is good enough for someone who does not want to go Nvidia at the high-end.....for several reasons as stated above, and the fact that we do expect perf to improve over time.....Also, more DX12 and Vulkan titles are making the cut too......I think these are valid enough reasons......to prevent people from bombarding you with why you didn't buy NV a year ago in a free market..

FWIW, I did buy NV a year+ ago, it's just that it was not a GTX 1080, it was a 750Ti ;)
 

ApharmdX

Banned
So why is everyone hating on the Vega 64? I upgraded from an RX 480 8GB and the performance upgrade is amazing.

In a vacuum the Vega 64 is acceptable. It's just very late and power-hungry. But when you look at the prices of these things on the street ($699 for the three different models I saw at Microcenter on Saturday), it's a pretty terrible buy. You could step up to the 1080 Ti and get a lot more performance, at lower power and heat, for that price.

If Vega 64 were available at its $499 MSRP, that would be a different story. It would be cheaper and as fast as or faster than the 1080. With the dumpster fire that is the G-Sync monitor market, it would be a very competitive offering when paired with a quality FreeSync monitor. At $700, no, it's DOA.
 

ZOONAMI

Junior Member
In a vacuum the Vega 64 is acceptable. It's just very late and power-hungry. But when you look at the prices of these things on the street ($699 for the three different models I saw at Microcenter on Saturday), it's a pretty terrible buy. You could step up to the 1080 Ti and get a lot more performance, at lower power and heat, for that price.

If Vega 64 were available at its $499 MSRP, that would be a different story. It would be cheaper and as fast as or faster than the 1080. With the dumpster fire that is the G-Sync monitor market, it would be a very competitive offering when paired with a quality FreeSync monitor. At $700, no, it's DOA.

Well, all this and also the 56 is so close to a 64 it makes no sense to get the 64 if you're going to go with AMD.
 

thelastword

Banned
In a vacuum the Vega 64 is acceptable. It's just very late and power-hungry. But when you look at the prices of these things on the street ($699 for the three different models I saw at Microcenter on Saturday), it's a pretty terrible buy. You could step up to the 1080 Ti and get a lot more performance, at lower power and heat, for that price.

If Vega 64 were available at its $499 MSRP, that would be a different story. It would be cheaper and as fast as or faster than the 1080. With the dumpster fire that is the G-Sync monitor market, it would be a very competitive offering when paired with a quality FreeSync monitor. At $700, no, it's DOA.
It's DOA, yet they can't stay on shelves...It's also the reason why there's a markup.....

Gamers aren't the only people buying graphics cards anyway; if AMD has found a market favoring RX Vega's compute capabilities and mining capabilities, it's a win for them regardless....Yet gamers want and are buying them just as well, to go with their FreeSync monitors, so it looks like they are winning on all ends here....

And to think that AIB cards and better cooling solutions have not even landed yet, and it hasn't even been a month since launch, far less the 3-6 months needed to see how drivers improve and give a fairer assessment down the line.....NV cards have had over a year of AIB cards and driver improvements to mature; I'm sure they would be better in many respects...At least let the RX Vega purchasers see what their purchase grows into.....at least 3 months....
 
D

Deleted member 17706

Unconfirmed Member
In a vacuum the Vega 64 is acceptable. It's just very late and power-hungry. But when you look at the prices of these things on the street ($699 for the three different models I saw at Microcenter on Saturday), it's a pretty terrible buy. You could step up to the 1080 Ti and get a lot more performance, at lower power and heat, for that price.

If Vega 64 were available at its $499 MSRP, that would be a different story. It would be cheaper and as fast as or faster than the 1080. With the dumpster fire that is the G-Sync monitor market, it would be a very competitive offering when paired with a quality FreeSync monitor. At $700, no, it's DOA.

Other than being a bit expensive, what's wrong with the G-Sync monitor market? You can get something like this for about $400.

https://www.amazon.com/gp/product/B01IOO4SGK/?tag=neogaf0e-20

24" 2560 x 1440, 165Hz 8-bit color panel, 1ms response time and super low input lag.

Really not a bad deal at all.

It's DOA, yet they can't stay on shelves...It's also the reason why there's a markup.....

Can't stay on shelves? I'll admit I haven't been in a brick and mortar PC parts shop in ages, but Vega 64 has been available at around $699 at various places according to Nowinstock for weeks now.
 

ZOONAMI

Junior Member
It's DOA, yet they can't stay on shelves...It's also the reason why there's a markup.....

Gamers aren't the only people buying graphics cards anyway; if AMD has found a market with RX Vega's compute capabilities and mining capabilities, it's a win for them regardless....Yet gamers want and are buying them as well, to go with their FreeSync monitors, so it looks like they are winning on all ends here....

And to think that AIB cards and better cooling solutions have not even landed yet, and it hasn't even been a month, far less the 3-6 months needed to see how drivers improve and give a fairer assessment down the line.....NV cards have had over a year of AIB cards and driver improvements to mature; I'm sure they would be better in many respects...At least let the RX Vega purchasers see what their purchase grows into.....at least 3 months....

How many of these cards are they actually selling, though? My guess is it's nowhere near even 1080 Ti numbers, much less 1070s and 1080s.

And yeah it looks like plenty of 56 and 64 are in stock at my local microcenter.
 

Papacheeks

Banned
All y'all talking about what's going in the PS5 & XBWhatever are jumping the gun a bit. Next gen is almost certainly a long ways away (we wouldn't be getting the 4K refresh consoles otherwise), so by the time Sony and MS finalize what's going to be in their next systems Vega will be nothing more than a bad memory.

Actually it's sooner than you think.

2019 will be the year we actually hear about a new console.

Rumors might start sometime next year if we have any leaks.
 

spyshagg

Should not be allowed to breed
I believe AMD is very, very competitive with Nvidia at the TDP levels used in consoles. It's only when pushing to catch Nvidia on PC that the efficiency goes out the window. Within a certain window, though, they are very good.
 

thelastword

Banned
Can't stay on shelves? I'll admit I haven't been in a brick and mortar PC parts shop in ages, but Vega 64 has been available at around $699 at various places according to Nowinstock for weeks now.

How many of these cards are they actually selling, though? My guess is it's nowhere near even 1080 Ti numbers, much less 1070s and 1080s.

And yeah it looks like plenty of 56 and 64 are in stock at my local microcenter.

I'm seeing mostly AMD game packs on Newegg....You can hardly find a 56...Solo 64 cards go very fast on Amazon; we're looking at only 2 Gigabyte cards left in stock, XFX will be in stock in two days, and about 20 of the MSI in stock atm. I believe it was out of stock just recently...They are selling....and the fact that air RX 64s are selling at $699 means there's demand...
 

dr_rus

Member
I believe AMD is very, very competitive with Nvidia at the TDP levels used in consoles. It's only when pushing to catch Nvidia on PC that the efficiency goes out the window. Within a certain window, though, they are very good.

[chart: power consumption comparison]

Believe!

That's a ~50% deficit in power efficiency at the console TDP levels you are talking about. If you look at how the 1080 competes with Vega 64, you will see that it's essentially the same ~50% deficit.
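For concreteness, a quick sketch of that deficit using ballpark review board-power figures (~180W for a GTX 1080, ~295W for Vega 64 under gaming load - approximate numbers, not official TDPs), normalized to roughly equal performance:

```python
# Approximate gaming board power at roughly equal average performance.
# These wattages are ballpark review figures, not official TDPs.
gtx1080_w, vega64_w = 180, 295

extra_power = vega64_w / gtx1080_w - 1   # how much more Vega draws
ppw_deficit = 1 - gtx1080_w / vega64_w   # perf/W shortfall vs Pascal

print(f"extra power for same perf: {extra_power:.0%}")
print(f"perf/W deficit:            {ppw_deficit:.0%}")
```

Depending on which side you normalize, that's roughly a 40-65% gap, in the same ballpark as the ~50% figure above.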
 
D

Deleted member 17706

Unconfirmed Member
I'm seeing mostly AMD game packs on Newegg....You can hardly find a 56...Solo 64 cards go very fast on Amazon; we're looking at only 2 Gigabyte cards left in stock, XFX will be in stock in two days, and about 20 of the MSI in stock atm. I believe it was out of stock just recently...They are selling....and the fact that air RX 64s are selling at $699 means there's demand...

https://www.nowinstock.net/computers/videocards/amd/rxvega64/

To me, this level of "In Stock" indicates they aren't flying off shelves.
 

thelastword

Banned
https://www.nowinstock.net/computers/videocards/amd/rxvega64/

To me, this level of "In Stock" indicates they aren't flying off shelves.
I'm not saying they are never in stock; stock has to replenish at some point...However, they do go in and out of stock quite frequently at $699, so it means they are selling...

It will be great when there's enough stock that everyone will be able to purchase an LC at $599, an air 64 at $499 and a 56 at $399, but I don't think that will happen till AIB cards hit the market.... Then regular AMD cards will sit at MSRP much more easily, with AIBs with better coolers selling for $20-100 more as per the cooling solution upgrades...
 

ApharmdX

Banned
It's DOA, yet they can't stay on shelves...It's also the reason why there's a markup.....

Gamers aren't the only people buying graphics cards anyway; if AMD has found a market favoring RX Vega's compute capabilities and mining capabilities, it's a win for them regardless....Yet gamers want and are buying them just as well, to go with their FreeSync monitors, so it looks like they are winning on all ends here....

And to think that AIB cards and better cooling solutions have not even landed yet, and it hasn't even been a month since launch, far less the 3-6 months needed to see how drivers improve and give a fairer assessment down the line.....NV cards have had over a year of AIB cards and driver improvements to mature; I'm sure they would be better in many respects...At least let the RX Vega purchasers see what their purchase grows into.....at least 3 months....

This is a very optimistic take. Vega is in stock pretty widely; it's just nowhere near MSRP. It doesn't seem to be screaming off of shelves, certainly not to gamers. Who would buy a Vega 64 at $700 when you could get a 1080 at ~$530 or a 1080 Ti at $700? Or a Vega 56 at $550 when the 1070 is at $420?

Other than being a bit expensive, what's wrong with the G-Sync monitor market? You can get something like this for about $400.

https://www.amazon.com/gp/product/B01IOO4SGK/?tag=neogaf0e-20

24" 2560 x 1440, 165Hz 8-bit color panel, 1ms response time and super low input lag.

Really not a bad deal at all.

That's a $400, 24", QHD TN monitor. That's wretched. There's a lack of options and really inflated pricing in the G-Sync market compared to the FreeSync market.
 
D

Deleted member 17706

Unconfirmed Member
That's a $400, 24", QHD TN monitor. That's wretched. There's a lack of options and really inflated pricing in the G-Sync market compared to the FreeSync market.

1440p 165Hz with very low response time and input lag. I don't think 165Hz FreeSync panels exist right now, but even a 144Hz monitor at the same resolution won't be that much cheaper.

The TN panel is very good, too. Straight on, it looks just as good as my IPS panel. Obviously viewing angles will suffer.
 

llien

Member
Just how many years are AMD behind in terms of GPU when compared to Nvidia?

/looks at 580 vs 1060

Uh, not at all?


Sony and MS should try to get a good Nvidia contract for the next consoles. How are they going to get 4K games with better-than-current graphics with a Vega 56 equivalent in the PS5 and XB2?

This kind of weird expectation reminds me of what people expected the Switch to be, perf-wise.
Beating the Xbone and PS4 seemed "real" because, you know, magical Nvidia sticker superduperpowers.

In the real world, Huang has a bad habit of pissing off his partners; on the other hand, AMD has that sweet all-in-one package that is so hard for console manufacturers not to buy into.
 

spyshagg

Should not be allowed to breed
Believe!

That's a ~50% deficit in power efficiency at the console TDP levels you are talking about. If you look at how the 1080 competes with Vega 64, you will see that it's essentially the same ~50% deficit.

Don't be a smart ass. I'm referring to perf/watt at under 80W TDP, Polaris vs Maxwell. We don't know about Vega yet. Too early.
 
D

Deleted member 17706

Unconfirmed Member
/looks at 580 vs 1060

Uh, not at all?

Luckily the 580s look like they are starting to come back in stock at more normal prices. Definitely great cards for 1080p FreeSync gaming.
 

TVexperto

Member
Because that performance has been available to you for over a year with a superior power and temperature profile

But how? I have FreeSync, so obviously I couldn't get a GTX 1080 for over a year.

Also, I didn't pay over 700 dollars; I paid 500 euros, so I guess that's a good price?
 
Y'know, with all this talk of undervolting resulting in increased performance, it was only a matter of time before the miners also tried it. Apparently, a Reddit user is claiming to have gotten 43.5 MH/s in Ethereum with a Vega 64 card while using about 130W. It appears that this one's a silicon lottery winner, but if other people can replicate similar results, then RIP Vega's medium-term availability.

His math is off on a number of levels. People tried replicating this result and got 300-350W instead of 130.


so i've been investigating the claims about Vega's mining efficiency, with the following results:

[chart: mining performance]

[chart: mining power draw]

[chart: mining efficiency]


for the power numbers, keep in mind that i'm using a less efficient PSU than computerbase.de. with the right PSU, the optimized 56 with the 64 BIOS should draw basically the same as the 1070 system. doing the math, the optimized 56 with the 64 BIOS should draw around 120W on the DC side of the PSU. therefore i think the numbers posted by the reddit user for the 64, with the full 64 CUs available and optimized settings, should be accurate.
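The AC-to-DC conversion in that paragraph can be sketched like this (the wall reading and PSU efficiency here are illustrative placeholders, not the poster's actual measurements):

```python
# A wattmeter at the wall measures AC input; the PSU delivers roughly
# ac_watts * efficiency to the components on the DC side.
def dc_draw(ac_watts, psu_efficiency):
    return ac_watts * psu_efficiency

ac_reading = 150.0   # W at the wall (hypothetical reading)
efficiency = 0.87    # assumed PSU efficiency near this load point

dc = dc_draw(ac_reading, efficiency)
hashrate = 43.5      # MH/s, the figure claimed in the Reddit post
print(f"DC-side draw: {dc:.1f} W")
print(f"mining efficiency: {hashrate / dc:.2f} MH/s per watt")
```

With those assumed numbers, a ~150W wall reading corresponds to roughly 130W of DC draw, which is how a "130W" claim can be consistent with a higher wall measurement.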


please note: i don't want to promote mining here. really, i think mining and cryptocurrencies are kinda stupid when considering their environmental and macroeconomic impact. just wanted to help clear up some disinformation lurking around the net and here.
 

llien

Member
Vega is a problem for AMD (chip size, expensive RAM, not beating and not even competing with the 1080 Ti).
OTOH, from a consumer's perspective, Vega 56 is a very compelling product at 400-450 euros (at the moment it can be had for 509 in DE, and you can get a Vega 64 for that price).


Perhaps that's why 1070Ti is coming.
 

dr_rus

Member
Don't be a smart ass. I'm referring to perf/watt at under 80W TDP, Polaris vs Maxwell. We don't know about Vega yet. Too early.

I have no idea what you are referring to, because what you're saying simply doesn't exist. Polaris vs Maxwell? Why not vs Fermi then, to make this even more laughable? Why 80W, when console APUs are ~200W with less than 50W of that for the CPU?

The simple truth is that if any console manufacturer decided to go with an NV GPU instead of AMD right now, they could use a GP104-class GPU in the same power envelope as the PS4 Pro's or XBX's. The same GPU which Vega is struggling to beat at 300W power consumption.

The power efficiency gap between Pascal and GCN4/5 doesn't go away at 150W or even 80W. The percentage is the same; it's just that the absolute difference becomes irrelevant when you compare a 40W GPU to a 60W one, or even a 100W one to a 150W one. But you must understand that while this is somewhat irrelevant in the PC space, it means a clear loss of performance in a fixed-power console platform. An NV GPU would provide considerably higher performance in the same power envelope - be it 80W or the actual ~200W of modern consoles.
 

dragn

Member
What's going on in the US with availability? Hundreds of Vega 64s for €500 like every 2-3 days in Germany, but only the 56s for €400 fly off the online shelves in minutes.
 
I have no idea what you are referring to, because what you're saying simply doesn't exist. Polaris vs Maxwell? Why not vs Fermi then, to make this even more laughable? Why 80W, when console APUs are ~200W with less than 50W of that for the CPU?

The simple truth is that if any console manufacturer decided to go with an NV GPU instead of AMD right now, they could use a GP104-class GPU in the same power envelope as the PS4 Pro's or XBX's. The same GPU which Vega is struggling to beat at 300W power consumption.

The power efficiency gap between Pascal and GCN4/5 doesn't go away at 150W or even 80W. The percentage is the same; it's just that the absolute difference becomes irrelevant when you compare a 40W GPU to a 60W one, or even a 100W one to a 150W one. But you must understand that while this is somewhat irrelevant in the PC space, it means a clear loss of performance in a fixed-power console platform. An NV GPU would provide considerably higher performance in the same power envelope - be it 80W or the actual ~200W of modern consoles.

The problem for console manufacturers is that to go with an Nvidia GPU they would probably need to switch to ARM like, well, Nintendo Switch.

So it depends on how much Sony and MS really like x86. Intel isn't going to give them the time of day, and Intel iGPUs suck anyway. I don't really think they want to go back to having discrete CPUs and GPUs on their console boards, but I suppose if they felt they had no choice, maybe they would. The problem is that cost mounts rapidly when you're paying for two licensing agreements, say for an embedded Ryzen variant and also some semi-custom Nvidia GPU design.

The reality is that the console manufacturers still have a few years to see how things shake out before committing to designing the PS5 and Xbox Two. Personally, I feel like switching to ARM and committing to Nvidia Tegra is the right approach. The Nintendo Switch shows the versatility of using the mobile Tegra X1 as a home console/portable hybrid. The real question is whether Nvidia is willing to sell automotive Tegra as a home console part or not. The Tegra variant used in the Nvidia Drive PX2 is a beast, but it's also a pretty specialized embedded part for installation in cars. It's not clear if there even exists a variant of Tegra X2 and Xavier that is suitable for non-automotive applications.
 
I really need your help and advice, guys:

An air-cooled Sapphire Vega 64 just came to me. The box recommends a 750W PSU, but I know that they basically always overshoot.

I have a 600W PSU - a Zalman ZM600-GSⅡ. It has 84% efficiency and of course has 2x 8-pin connectors for the card. According to Tom's Hardware, Vega 64's power peaks are 350W. Anandtech's crazy setup with this card maxes out at 470W total system consumption under FurMark.

Now my system is nowhere near their overclocked monster:
CPU - i5 6600
RAM - 16GB DDR4 2400MHz
1x HDD
1x SSD
MB - MSI Z170A PC MATE
2 case fans

84% of 600W is 504W, which should be sufficient when a much more powerful overclocked system maxes out at 470W total system draw under full load.

Am I a crazy lunatic, or does this assumption and reasoning make sense? Does anybody have experience with a similar setup?
 

dr_rus

Member
The problem for console manufacturers is that to go with an Nvidia GPU they would probably need to switch to ARM like, well, Nintendo Switch.

So what's stopping them now after about 30 years of constant switching between different CPU architectures? ARM is arguably a better fit for a fixed h/w platform than x86 as well.

Also, that's not the only option, as there's a clear possibility of licensing an x86 CPU core from either AMD or Intel and pairing it with an NV GPU in a SoC. I mean, we did have an Xbox which was made from an Intel CPU and an NV GPU, so it's clearly possible. The biggest roadblock here would be the cost of such a licensing deal, as both Intel and AMD would presumably set pricing for such a license at levels which would make it unattractive if not straight-up unprofitable, although for different reasons. NV GPUs aren't cheap either.
 

joesiv

Member
I really need your help and advice, guys:

An air-cooled Sapphire Vega 64 just came to me. The box recommends a 750W PSU, but I know that they basically always overshoot.

I have a 600W PSU - a Zalman ZM600-GSⅡ. It has 84% efficiency and of course has 2x 8-pin connectors for the card. According to Tom's Hardware, Vega 64's power peaks are 350W. Anandtech's crazy setup with this card maxes out at 470W total system consumption under FurMark.

Now my system is nowhere near their overclocked monster:
CPU - i5 6600
RAM - 16GB DDR4 2400MHz
1x HDD
1x SSD
MB - MSI Z170A PC MATE
2 case fans

84% of 600W is 504W, which should be sufficient when a much more powerful overclocked system maxes out at 470W total system draw under full load.

Am I a crazy lunatic, or does this assumption and reasoning make sense? Does anybody have experience with a similar setup?

Seems like you could undervolt the Vega 64 and probably be fine.
 
84% of 600W is 504W, which should be sufficient when a much more powerful overclocked system maxes out at 470W total system draw under full load.

the 600W is on the DC side, so you don't have to multiply by any efficiency factors. you just have to be aware that you usually don't have the full 600W on your +12V rail. this datasheet says your PSU has a continuous 540W on the 12V rail. i don't know the manufacturer, but if their measurements are correct, you will be fine.

i've been running stress tests at a +50% power limit with the V64 BIOS and haven't run into any problems with the 460W provided by my +12V rail.
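Putting the numbers from this exchange together, a rough headroom check (the non-GPU component draws are my own rough assumptions, not measurements):

```python
# Rough +12V headroom check for the i5-6600 / Vega 64 build above.
rail_12v_w    = 540   # continuous +12V capacity per the datasheet cited above
vega64_peak_w = 350   # Tom's Hardware peak figure quoted earlier
cpu_w         = 65    # i5-6600 TDP
misc_w        = 40    # board, RAM, drives, fans (rough assumption)

total = vega64_peak_w + cpu_w + misc_w
print(f"estimated peak draw: {total} W, headroom: {rail_12v_w - total} W")
# Not every component draws from +12V, so real headroom is a bit larger.
```

Even with the Vega at its worst-case peak, the estimate leaves comfortable margin on the 540W rail, which matches the "you will be fine" verdict above.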
 

Renekton

Member
AMD's CTO Mark Papermaster announced that the company will be transitioning "graphics and client products" from the GlobalFoundries 14nm LPP FinFET process it uses today to the new 12nm LP process in 2018. GlobalFoundries also announced that 12LP will begin production in 1Q18.

http://www.tomshardware.com/news/amd-ryzen-vega-12nm-lp-2018,35502.html

The real question is whether Nvidia is willing to sell automotive Tegra as a home console part or not. The Tegra variant used in the Nvidia Drive PX2 is a beast, but it's also a pretty specialized embedded part for installation in cars. It's not clear if there even exists a variant of Tegra X2 and Xavier that is suitable for non-automotive applications.
Strangely enough, automotive competition heating up and Switch's success could encourage Nvidia to give some attention to consoles again.
 

dr_rus

Member
So, a half-node shrink in 2018. That will at least bring the power drain of Vega down a bit [but not by much].

[image: 12LP announcement slide]


It doesn't look like there are any power improvements, or they would probably mention them. Performance improvement doesn't always equal power improvement, although the former can be a result of the latter. In any case, it's ~10-15% (and that's compared to 16FF++, not AMD's own 14LPP).

It's also pretty likely to be just a marketing renaming of "14nm+" which they've had on their roadmap for years.
 
so i've been investigating the claims about Vega's mining efficiency, with the following results:

[chart: mining performance]

[chart: mining power draw]

[chart: mining efficiency]


for the power numbers, keep in mind that i'm using a less efficient PSU than computerbase.de. with the right PSU, the optimized 56 with the 64 BIOS should draw basically the same as the 1070 system. doing the math, the optimized 56 with the 64 BIOS should draw around 120W on the DC side of the PSU. therefore i think the numbers posted by the reddit user for the 64, with the full 64 CUs available and optimized settings, should be accurate.


please note: i don't want to promote mining here. really, i think mining and cryptocurrencies are kinda stupid when considering their environmental and macroeconomic impact. just wanted to help clear up some disinformation lurking around the net and here.

forgot to mention that those were conducted with the normal Crimson driver, since i haven't had much luck with the beta crypto driver. with the crypto driver i couldn't hold a stable HBM overclock, which compromised the results of the optimized settings to way below the Crimson driver's level.
 