
Radeon RX Vega thread

StereoVsn

Member
Gigabyte says no to custom Radeon RX Vega 64

And Asus Strix has actually ended up being a bit slower than the reference.

Vega's launch is truly the worst GPU launch AMD ever had.
And because of that Nvidia doesn't have to do jack shit other than maybe reshuffling prices and/or rebadging a few things (like the rumored 1070 Ti).

I wish Vega was like Ryzen and had lit a fire under Nvidia the way Ryzen did with Intel. Unfortunately that didn't happen, and here we are facing a lackluster year and next six months.
 

RaijinFY

Member
And because of that Nvidia doesn't have to do jack shit other than maybe reshuffling prices and/or rebadging a few things (like the rumored 1070 Ti).

I wish Vega was like Ryzen and had lit a fire under Nvidia the way Ryzen did with Intel. Unfortunately that didn't happen, and here we are facing a lackluster year and next six months.

imo once Volta is out, AMD's GPUs will have to be discontinued altogether. AMD is utterly uncompetitive in the perf/watt metric with Vega, and their cards are more expensive to manufacture than their Nvidia counterparts!
 
Gigabyte says no to custom Radeon RX Vega 64

And Asus Strix has actually ended up being a bit slower than the reference.

Vega's launch is truly the worst GPU launch AMD ever had.

2900XT was pretty bad back in the day

imo once Volta is out, AMD's GPUs will have to be discontinued altogether. AMD is utterly uncompetitive in the perf/watt metric with Vega, and their cards are more expensive to manufacture than their Nvidia counterparts!

There's at least a 50% chance the reason Raja is going away for a while is so Lisa can spin off or sell RTG and be rid of the GPU division.
 

Deleted member 17706

Unconfirmed Member
There's at least a 50% chance the reason Raja is going away for a while is so Lisa can spin off or sell RTG and be rid of the GPU division.

I'm not terribly knowledgeable on this subject, but really? Seems like having both GPU and CPU divisions is key to AMD's business and their push of APUs. Surely the Radeon group contributes a ton to the APU R&D, too, right?
 

thelastword

Banned
Wow, so many NV fans in here, of course to say how bad a launch Vega was, yet RX Vega continues to sell well...

Some people now want RTG to shut down so NV will be all alone in the market, so they can continue to sell their cards at high markups and monopolize the market through GameWorks et al...

Anyone who wants the competition out of the market does not care for this industry, and their words on Vega's launch should be discarded. Many of these posters have no intention of buying Vega, but they want AMD to fail so their company of choice can assume 100% of the market. What a market that would be...

At the end of the day, if you want and favor NV, go buy that and have a riot with 300fps on your 60Hz monitor at 1080p, but coming to a Vega thread to spill how you wish they would exit the industry is the most ridiculous anti-consumer thing I've read here in a minute...
 
Wow, so many NV fans in here, of course to say how bad a launch Vega was, yet RX Vega continues to sell well...

Some people now want RTG to shut down so NV will be all alone in the market, so they can continue to sell their cards at high markups and monopolize the market through GameWorks et al...

Anyone who wants the competition out of the market does not care for this industry, and their words on Vega's launch should be discarded. Many of these posters have no intention of buying Vega, but they want AMD to fail so their company of choice can assume 100% of the market. What a market that would be...

At the end of the day, if you want and favor NV, go buy that and have a riot with 300fps on your 60Hz monitor at 1080p, but coming to a Vega thread to spill how you wish they would exit the industry is the most ridiculous anti-consumer thing I've read here in a minute...

I don't even know what this implication is. ???
 

dr_rus

Member
http://www.tomshardware.com/news/amd-vega-custom-graphics-cards-problems said:
The company traditionally offers re-engineered graphics cards with custom PCB designs for all high-end GPU platforms, but it appears to be skipping the Vega lineup. A company representative told us that MSI “won’t be making a custom card anytime soon,” but could offer no additional information.
MSI is out too.

Hardware.fr reports that Vega prices are still too high in France, with Vega 56 essentially competing with GTX1080, not 1070.
 

kuYuri

Member
What a clusterfuck Vega continues to be. Kind of crazy that MSI and Gigabyte are sitting out. I have to imagine Sapphire and XFX will make something, since they are AMD-only GPU manufacturers. Though I'm surprised nothing has happened yet...
 
The rumors are that AMD doesn't have enough Vega GPUs to supply the AIBs for custom cards. This might also be why we only briefly saw the Strix Vega 64 before it vanished without a trace. The yield on Vega must be very poor and the cost to manufacture extremely high. At this point we can think of Vega as a proof of concept more than a production part, given how few AMD can apparently make.

I'm not terribly knowledgeable on this subject, but really? Seems like having both GPU and CPU divisions is key to AMD's business and their push of APUs. Surely the Radeon group contributes a ton to the APU R&D, too, right?

It's pretty clear that RTG has been getting starved of R&D funding for a while now. I mean, we got Ryzen out of the CPU side, but the GPU side can't keep rehashing GCN over and over. Vega is GCN at its absolute limit. They need to either front the R&D cash to develop an entirely new architecture or cut RTG loose. Let's see what Lisa does.
 
The rumors are that AMD doesn't have enough Vega GPUs to supply the AIBs for custom cards. This might also be why we only briefly saw the Strix Vega 64 before it vanished without a trace. The yield on Vega must be very poor and the cost to manufacture extremely high. At this point we can think of Vega as a proof of concept more than a production part, given how few AMD can apparently make.



It's pretty clear that RTG has been getting starved of R&D funding for a while now. I mean, we got Ryzen out of the CPU side, but the GPU side can't keep rehashing GCN over and over. Vega is GCN at its absolute limit. They need to either front the R&D cash to develop an entirely new architecture or cut RTG loose. Let's see what Lisa does.

Ryzen will probably be cash flow positive from now on, so at least she'll have the option to lever up AMD to invest in GPUs. Considering the expansion into AI, machine learning, autonomous driving, etc., it's clear GPUs are the way forward. Selling the Radeon mobile division (now Adreno) was a terrible and short-sighted decision. I hope they won't repeat it.
 

dr_rus

Member
It's pretty clear that RTG has been getting starved of R&D funding for a while now. I mean, we got Ryzen out of the CPU side, but the GPU side can't keep rehashing GCN over and over. Vega is GCN at its absolute limit. They need to either front the R&D cash to develop an entirely new architecture or cut RTG loose. Let's see what Lisa does.

They won't sell off RTG; their business depends on GPUs being a part of their APUs, not only in the console space (which would simply be impossible without RTG) but with Ryzen in notebooks and the low-end / office PC space too. Hopefully, with Zen being out, they can re-allocate R&D money back to RTG now, and this may result in them actually improving their GPUs in Navi and beyond.

It's also kinda hard to think of anyone who would be interested in buying RTG right now. I mean, even a much cheaper Imagination was bought by an investment fund tied to the Chinese government for some leftover money. RTG in its current form, and without its custom APU business, is way less interesting than Imagination.
 

thelastword

Banned
I knew something was fishy about this news; seems like people just wanted to blow another RX Vega piece way out of proportion... It's looking like more stock is entering the pipelines for RX Vega atm, so most likely AIB cards should be here in October. I think that has given them enough time to sort out some issues; maybe AMD is trying to sort out some drivers as well and they want to go in tandem with that...

At this point, we can expect air- and watercooled RX Vega 56 and 64 cards, some undervolted + OC'd ones too... Cooling is essential to how well this thing runs, so it will be interesting to see how some AIB partners tackle it...
 

TVexperto

Member
So who else here has a Vega 64? I upgraded my GPU from an RX 480 8GB and I am so happy (I am an ultrawide user); it looks fantastic and runs very well, even though it gets too hot sometimes.
 

Deleted member 17706

Unconfirmed Member
So who else here has a Vega 64? I upgraded my GPU from an RX 480 8GB and I am so happy (I am an ultrawide user); it looks fantastic and runs very well, even though it gets too hot sometimes.

I do. It runs great for me, but I'm only using it on a 4K 60Hz FreeSync monitor. I usually turn it down to 2560 x 1440 for gaming, because this definitely is not a 4K card for most games.

I have a separate Intel/Nvidia (GTX 1080) machine on a 1440p 165Hz G-Sync monitor on which I do most of my gaming, though.
 
Man, I hope I can get my hands on a Vega, preferably liquid-cooled (if it ever comes down to MSRP). I'm willing to spend maybe $570 on an air-cooled Vega 64 if it has a DVI port though, so hopefully by the end of October the prices will come down a bit (they will come down, right...?).
Had to sell my Fury two months ago, so I'd really like to replace it ASAP! I'll have to get a new PSU as well; I think I'd be cutting it pretty close with a 600W.
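For anyone doing the same PSU math, here's a rough sketch in Python. The wattage figures are just ballpark TDP-style assumptions (295 W board power for the reference air-cooled Vega 64, ~90 W for a typical quad-core CPU), not measurements from any real build, so treat the output as a sanity check rather than a spec:

# Rough PSU headroom estimate for an air-cooled Vega 64 build.
# All figures are ballpark TDP-style assumptions, not measured draw.
components_w = {
    "vega_64_air": 295,             # AMD's rated board power for the reference card
    "cpu": 90,                      # typical quad-core under load (assumption)
    "motherboard_ram_storage": 50,  # assumption
    "fans_misc": 25,                # assumption
}

psu_rating_w = 600
load_w = sum(components_w.values())      # roughly 460 W estimated peak load
headroom_w = psu_rating_w - load_w

print(f"Estimated load: {load_w} W ({100 * load_w / psu_rating_w:.0f}% of {psu_rating_w} W)")
print(f"Headroom: {headroom_w} W")
# ~460 W on a 600 W unit works, but it leaves little margin for transient
# spikes or overclocking, which is why it feels like cutting it close.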
 
Vega's doing incredibly good in the first FM7 benchmark here: https://www.computerbase.de/2017-09/forza-7-benchmark/

Likely means that T10 put about zero fucks into code optimization of PC version though

How does this statement make any sense when the RX 580 and GTX 1060 are actually pretty close?


Probably some Nvidia DX12 driver issue. Or T10 actually managed to use all those Vega shaders properly. I find the unusually good scaling from Fiji also interesting.
 
Vega's doing incredibly good in the first FM7 benchmark here: https://www.computerbase.de/2017-09/forza-7-benchmark/

Likely means that T10 put about zero fucks into code optimization of PC version though.

Damn, the Fury X is struggling. It's an 8.6 TF 3rd Gen GCN GPU that is performing close to the RX 580, which is a 6 TF 4th Gen GCN GPU, at 1080p and 1440p. Hopefully a new driver or an update to the game can help out its performance.

I'm guessing that it's running out of memory with the settings they're using at 4K seeing as the frame-rate bit the dust.

 

Deleted member 17706

Unconfirmed Member
Looks like the prices are finally starting to come back down to normal. Still around $100 overpriced at their lowest for single cards, but they're regularly in stock and the prices aren't quite as ridiculous as they were a month ago.
 
If Vega performed similarly in all other games it would still be a tough pill to swallow due to the price and power draw. I really hope they don't disappoint with Navi

If it performed similarly in every other game, I think the current prices would be pretty reasonable relative to the 1080/Ti. Unfortunately, this isn't the case (yet?). Hopefully they drop in price soon. They need to fix their damn drivers as well.
 

longdi

Banned
In the worst cases, the 1080 Ti will be 30% faster than the 1080, so F7 is not scaling well on NV.

At least we see a game built on AMD console hardware that is working just as well on AMD PC hardware. No NV fuckery, I guess.
 
If Vega performed similarly in all other games it would still be a tough pill to swallow due to the price and power draw. I really hope they don't disappoint with Navi

A Vega 56 performing like a 1080 Ti would be a tough pill to swallow? This thread...

In the worst cases, the 1080 Ti will be 30% faster than the 1080

wut

I'm guessing that it's running out of memory with the settings they're using at 4K seeing as the frame-rate bit the dust.

I guess it's bandwidth and not so much sheer memory amount. I could probably test that.

Nvidia's compression techniques still seem to be significantly better.
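For what it's worth, the paper bandwidth comparison is just bus width × data rate / 8. A quick Python sketch with the commonly quoted per-pin rates (treat the exact figures as assumptions from memory, not something I've re-verified):

# Theoretical memory bandwidth = bus width (bits) x data rate (Gbps per pin) / 8
def bandwidth_gbs(bus_width_bits, data_rate_gbps):
    return bus_width_bits * data_rate_gbps / 8

cards = {
    # name: (bus width in bits, effective data rate in Gbps per pin)
    "Fury X (HBM1)":  (4096, 1.0),
    "Vega 64 (HBM2)": (2048, 1.89),
    "GTX 1080 (G5X)": (256, 10.0),
    "GTX 1080 Ti":    (352, 11.0),
}

for name, (bus, rate) in cards.items():
    print(f"{name:16s} ~{bandwidth_gbs(bus, rate):6.1f} GB/s")
# Raw bandwidth alone doesn't settle the 4K question, since Nvidia's
# delta color compression stretches effective bandwidth further.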
 

thelastword

Banned
Likely means that T10 put about zero fucks into code optimization of PC version though
What does that even mean? RX Vega isn't the PC version? So because it's performing better, something is wrong? RX Vega 64 air is a 12.7 TF GPU; of course if a game is made utilizing its strengths it will outdo many NV GPUs... Still, I don't even think Forza is that game as such; there's no FP16 amongst many other Vega features... The only issue with F7 is the lack of proper multithreading rather than a GPU one...

Damn, the Fury X is struggling. It's an 8.6 TF 3rd Gen GCN GPU
I still remember people saying Vega was just a Fury refresh, but here we are and Vega is showing a huge divide in this release against the Fury...

At least we see a game built on AMD console hardware that is working just as well on AMD PC hardware. No NV fuckery, I guess.
At least AMD is not purposely gimping NV hardware with any GameWorks-style shenanigans. It's simply a game that was devved on AMD hardware and arch.
 

dr_rus

Member
How does this statement make any sense when the RX 580 and GTX 1060 are actually pretty close?

Probably some Nvidia DX12 driver issue. Or T10 actually managed to use all those Vega shaders properly. I find the unusually good scaling from Fiji also interesting.
So this DX12 driver issue is somehow exclusive to two chips out of four Pascals there? Note that it's not only the 1070/1080/1080 Ti which struggle but the 960, 970 and 1050 Ti as well. In fact, the 1060 is the only card there which seems to be doing somewhat okay, which kinda hints heavily at an engine issue here. Another huge hint is the frametime spikes on the 1080, which are a pretty clear indication of the engine doing something funky with resource management on NV h/w. Then there's the fact that the 1080 Ti is barely faster than the 1080 here. So do the math.

What does that even mean? RX Vega isn't the PC version? So because it's performing better, something is wrong? RX Vega 64 air is a 12.7 TF GPU; of course if a game is made utilizing its strengths it will outdo many NV GPUs... Still, I don't even think Forza is that game as such; there's no FP16 amongst many other Vega features... The only issue with F7 is the lack of proper multithreading rather than a GPU one...
It means that an Xbox game requires pretty substantial optimizations to run properly on PC h/w (CPUs and NV GPUs mostly). FM7 so far seems like a pretty low-effort dump of Xbox One code to PC, which is now apparent from both the issues in CPU utilization and the performance problems on NV GPUs. GCN GPUs are doing better simply because they are running XBO's shaders, which are already heavily optimized for GCN GPU(s), not because of some mystic FP16 or Vega's 12.7 TFLOPS suddenly outdoing the 1080 Ti's 11.3 by 20%.
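For reference, those paper TFLOPS numbers are just shader count × 2 ops per clock (FMA) × clock speed. A quick sketch using the advertised boost clocks, so treat the exact outputs as approximations (real clocks vary):

# Peak FP32 throughput = shaders x 2 (fused multiply-add) x clock in GHz, in GFLOPS
def fp32_tflops(shaders, boost_clock_ghz):
    return shaders * 2 * boost_clock_ghz / 1000.0

cards = {
    # name: (shader count, advertised boost clock in GHz)
    "RX Vega 64 (air)": (4096, 1.546),
    "GTX 1080 Ti":      (3584, 1.582),
    "GTX 1080":         (2560, 1.733),
}

for name, (shaders, clock) in cards.items():
    print(f"{name:18s} ~{fp32_tflops(shaders, clock):5.2f} TFLOPS")
# ~12.7 vs ~11.3 TFLOPS on paper, which is exactly why raw TFLOPS alone
# can't explain a 20% lead over a 1080 Ti in a single title.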
 

thelastword

Banned
It means that an Xbox game requires pretty substantial optimizations to run properly on PC h/w (CPUs and NV GPUs mostly). FM7 so far seems like a pretty low-effort dump of Xbox One code to PC, which is now apparent from both the issues in CPU utilization and the performance problems on NV GPUs. GCN GPUs are doing better simply because they are running XBO's shaders, which are already heavily optimized for GCN GPU(s), not because of some mystic FP16 or Vega's 12.7 TFLOPS suddenly outdoing the 1080 Ti's 11.3 by 20%.
First off, I never said they used FP16; I said they did not use any of the Vega features in Forza 7. Yet it was a title devved with AMD arch in mind, and it's also DX12, which AMD GPUs always do well in. There's no sign that anyone gimped anything here, unlike Nvidia with their GameWorks titles... Even AMD-featured games don't have any code to gimp NV hardware; Quake Champions, Sniper Elite 4, etc. all do well under NV GPUs. I think when more titles use the Vega arch and DX12 in tandem we will see more results like this in the future...

Also, just to quell all of this talk of 'unoptimized' FUD: these tests were run with Forza 7-optimized drivers from both AMD and Nvidia, and this is what Nvidia had to say when contacted...

ComputerBase reached out to NVIDIA and the company confirmed that the results were indeed accurate.

ComputerBase.de – Google Translation
"The ranking in Forza 7 is very unusual. However, Nvidia has confirmed to ComputerBase that the results are correct, so there is no problem with the editorial team's test system as far as the GeForce cards are concerned."

From this article, which has a more thorough read and analysis on Forza 7's performance on AMD hardware...
 
First off, I never said they used FP16; I said they did not use any of the Vega features in Forza 7. Yet it was a title devved with AMD arch in mind, and it's also DX12, which AMD GPUs always do well in. There's no sign that anyone gimped anything here, unlike Nvidia with their GameWorks titles... Even AMD-featured games don't have any code to gimp NV hardware; Quake Champions, Sniper Elite 4, etc. all do well under NV GPUs. I think when more titles use the Vega arch and DX12 in tandem we will see more results like this in the future...

Also, just to quell all of this talk of 'unoptimized' FUD: these tests were run with Forza 7-optimized drivers from both AMD and Nvidia, and this is what Nvidia had to say when contacted...



From this article, which has a more thorough read and analysis on Forza 7's performance on AMD hardware...

I don't think there's anything underhanded going on, but it's quite clear that Nvidia GPUs are underperforming. Probably pretty much as Rus said: the console codebase was quickly ported to PC, and we already know GCN console code benefits AMD on PC too. Microsoft probably deemed Nvidia performance good enough, so they shipped the game.
 

dr_rus

Member
First off, I never said they used FP16; I said they did not use any of the Vega features in Forza 7. Yet it was a title devved with AMD arch in mind, and it's also DX12, which AMD GPUs always do well in.
This isn't true at all; AMD GPUs do not in fact "always" do well in DX12. They do well in DX12 (and coincidentally in DX11 as well in such titles, if there's an option) in games ported from consoles, because these games are heavily optimized for GCN GPUs by default. Once you run into a PC-only game, the results are rather variable for AMD even in DX12.

There's no sign that anyone gimped anything here, unlike Nvidia with their GameWorks titles... Even AMD-featured games don't have any code to gimp NV hardware; Quake Champions, Sniper Elite 4, etc. all do well under NV GPUs. I think when more titles use the Vega arch and DX12 in tandem we will see more results like this in the future...
Why are we suddenly talking about "gimping" and Gameworks?

Also, just to quell all of this talk of 'unoptimized' FUD: these tests were run with Forza 7-optimized drivers from both AMD and Nvidia, and this is what Nvidia had to say when contacted...
DX12 games require developer-side optimization to run well; driver-side optimizations in DX12 are very limited. NV said that yeah, that's the best they can do with that particular game. It does not mean that the game is running as it should on NV GPUs though. The RX 580's 99th percentile should not be higher than that of a 1080 Ti, a card which is essentially double in power, but this is what you can see in FM7 right now. It's just a title that's very badly optimized for PC.
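To illustrate what that 99th percentile number actually measures, here's a tiny sketch of the usual calculation review sites do; the frametimes below are made up purely for illustration:

# 99th-percentile metric from a list of frametimes in milliseconds.
frametimes_ms = [14.2, 15.0, 13.8, 16.1, 14.9, 33.0, 15.3, 14.4, 15.8, 14.7]

def percentile_frametime(samples_ms, pct=99):
    # Simple nearest-rank percentile: the frametime that pct% of frames beat.
    ordered = sorted(samples_ms)
    index = min(len(ordered) - 1, int(round(pct / 100 * len(ordered))) - 1)
    return ordered[index]

p99_ms = percentile_frametime(frametimes_ms)
print(f"99th percentile frametime: {p99_ms:.1f} ms (~{1000 / p99_ms:.0f} FPS)")
# A single 33 ms spike drags this figure down even when average FPS looks fine,
# which is why frametime spikes show up in the percentile numbers first.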
 

ZOONAMI

Junior Member
Hmm, just watched some videos on YouTube of an OC Ryzen + Vega 56 setup vs. an Intel OC 7700K + 1070 setup, and the Vega basically stomps the 1070, especially at 1440p. It's basically in 1080 territory. I actually think the 56 is a pretty good chip. The 64 vs. the 1080 with the same setups is pretty even at 1440p, but the 56 is just so close to the 64 that it's the only AMD higher-end chip worth buying.

Edit: at $560 the 64 honestly isn’t a bad way to go.
 
Hmm, just watched some videos on YouTube of an OC Ryzen + Vega 56 setup vs. an Intel OC 7700K + 1070 setup, and the Vega basically stomps the 1070, especially at 1440p. It's basically in 1080 territory. I actually think the 56 is a pretty good chip. The 64 vs. the 1080 with the same setups is pretty even at 1440p, but the 56 is just so close to the 64 that it's the only AMD higher-end chip worth buying.

Edit: at $560 the 64 honestly isn’t a bad way to go.

At MSRP I'd buy a 56 over a 1070. I wouldn't buy a 64 over a 1080, though.
 
I think so, but AFAIK they're all around like $500 right now?

Yeah they are. You would have to be insane to pay $500 for a Vega 56 instead of a 1080. Which is where AMD's pricing problems are right now. It can't be cheap to make Vega and they seem to only be able to make a small quantity anyways because of the bottleneck in HBM2 production, so they might as well charge as much as they think they can get away with to fleece what few fans they have left.
 

thelastword

Banned
Yeah they are. You would have to be insane to pay $500 for a Vega 56 instead of a 1080. Which is where AMD's pricing problems are right now. It can't be cheap to make Vega and they seem to only be able to make a small quantity anyways because of the bottleneck in HBM2 production, so they might as well charge as much as they think they can get away with to fleece what few fans they have left.
Everything is not about fandom; AMD is selling those cards to customers who think it's a great deal, for gaming, compute/desktop work, mining, or whatever else they deem fit. They are selling well, hence why the prices are still not at MSRP. I don't see why you should be bitter about that and try to obfuscate what's really going on in the market with Vega... Vega is not a failed product, not by any means whatsoever, and they're not fleecing fans to sell them either...

Didn't you say you have a 1080 Ti or something? You should be on your high horse gaming at 4K 60fps for every title, amirite??? Instead you're here wishing the little competition we have in the market to fail and trying to diminish AMD's launch of this product with lots of false hyperbole...
 

AmyS

Member
Just saw this rumor:

https://www.tweaktown.com/news/5942....it&utm_medium=twitter&utm_campaign=tweaktown

AMD's next-gen Navi GPU launching in August 2018 at SIGGRAPH

Once again I have an exclusive story that AMD will have Navi ready to go sometime in July-August 2018, with a Navi-based professional card being launched at SIGGRAPH 2018. We're still waiting for AMD to launch Radeon Pro SSG, something they unveiled during SIGGRAPH 2017 that hasn't yet materialized. In the meantime, Radeon Technologies Group boss Raja Koduri has taken a sabbatical from the company until early-2018.

AMD needs to launch a refresh of Vega before Navi in order to keep up with NVIDIA's current-gen GTX 10 series cards, let alone any form of Pascal refresh. NVIDIA has their upcoming GeForce GTX 1070 Ti coming out later this month which should put out the fire that AMD started with the Radeon RX Vega 56 in the $400 market, as NVIDIA will have countless AIB partners throwing their custom GTX 1070 Ti cards into the ring over the holidays.

Navi will be made on the 7nm process, but other than that we don't know much. So let's start thinking about what AMD could deliver to change things up and actually fight NVIDIA in the GPU arena. Navi could feature modular Navi GPU dies that would be similar to the way AMD made Ryzen and Ryzen Threadripper, where Ryzen Threadripper 1950X is really an EPYC server CPU with dies disabled.

AMD could indeed make Navi a modular GPU, by making smaller not-so-complicated GPU dies instead of a massive GPU die that we've been used to since the introduction of the GPU. This is something NVIDIA is also doing, with multiple GPU modules on future graphics cards instead of a huge "monolithic" GPU. If AMD were to beat NVIDIA to the modular GPU approach with Navi, and on 7nm, 2018-2019 could be two massive years for GPU technology.

http://wccftech.com/amd-navi-gpu-launching-siggraph-2018-monolithic-mcm-die-yields-explored/

Exploring the multi-chip module die philosophy for GPUs

Here's the thing however, AMD has proven itself to be exceptionally good at creating MCM based products. The Threadripper series (the 1920X and 1950X at any rate) were absolutely disruptive to the HEDT market space. They single handedly turned what was usually a 6-core and very expensive affair to a 16 core affordable combo. The power of servers and Xeons was finally in the hands of the average consumers. So why can't the same philosophy work for GPUs as well?


Well, theoretically speaking, it should work better in all regards for GPUs, which are parallel devices, than for CPUs, which are serial devices. Not only that, but you are looking at massive yield gains from just shifting to an MCM-based approach instead of a monolithic die. A single huge die has abysmal yields, is expensive to produce and usually has high wastage. Multiple chips totaling the same die size would offer yield increases straight off the bat.


I took the liberty to do some rough approximations using the lovely Silicon Edge tool and was not surprised to see instant yield gains. The Vega 64 has a die measuring 484mm², which equates to a die measuring 22mm by 22mm. Splitting this monolithic die into four 11mm by 11mm dies gives you the same net surface area (484mm²) and will also result in yield gains. How much? Let's see. According to the approximation, a 200mm wafer should be able to produce 45 monolithic dies (22×22) or 202 smaller dies (11×11). Since we need 4 smaller dies to equal 1 monolithic part, we end up with 50 484mm² MCM dies. That's a yield gain of 11% right there.


The yield gains are even larger for bigger chips. The upper limit of lithographic techniques (with reasonable yields) is roughly 625mm². On a single 200mm wafer, we can get about 33 of these (25×25) or 154 smaller dies (12.5×12.5). That gives us a total of 38 MCM-based dies for a yield increase of 15%. Now, full disclosure, this is a very rough approximation and does not take into account several factors such as packaging yields, complicated high-level design, etc., but the basic idea holds well. At the same time, it also does not take into account the increased gains from lowered wastage: a faulty 625mm² monolithic die is much more wastage than a faulty 156mm² one!

Long story short, AMD is perfectly capable of creating an MCM-based GPU and would even get some serious yield benefits out of it if it chooses to run with this for Navi. Considering the 7nm node is very much at the early, bleeding-edge stage, yields can't be too good even by mid-2018 for very large, high-performance ASICs. Switching to smaller dies for an MCM-based approach would solve that problem and even allow it to surpass the total 600mm² surface-area limitation of monolithic dies. Nvidia is also actively pursuing this path for the same reasons.

Edit, also here: http://www.guru3d.com/news-story/amd-navi-based-graphics-cards-might-arrive-in-august-2018.html

Seems like AMD needs Navi out ASAP (and they know it), with Vega looking to be such a screw-up in every way.

The MCM / chiplet approach sounds like it'll be the basis for AMD GPUs going forward and could potentially have positive implications for next-gen consoles as well.
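For anyone who wants to poke at the dies-per-wafer arithmetic from the quoted article without the Silicon Edge tool, here's a rough Python sketch using the standard gross-die approximation plus a simple Poisson yield model. The wafer size and defect density are assumptions I picked for illustration, so the absolute numbers won't match the article, but the trend (smaller dies, more good silicon) is the point:

import math

def gross_dies_per_wafer(wafer_diameter_mm, die_area_mm2):
    # Standard approximation: usable wafer area minus an edge-loss term.
    d, s = wafer_diameter_mm, die_area_mm2
    return int(math.pi * (d / 2) ** 2 / s - math.pi * d / math.sqrt(2 * s))

def die_yield(die_area_mm2, defects_per_cm2):
    # Simple Poisson yield model: Y = exp(-defect_density * die_area)
    return math.exp(-defects_per_cm2 * die_area_mm2 / 100.0)

WAFER_MM = 300         # assumption: standard 300mm wafer
DEFECTS_PER_CM2 = 0.2  # assumption: illustrative defect density

monolithic_area = 484.0            # Vega 64-sized die, 22mm x 22mm
chiplet_area = monolithic_area / 4 # four 11mm x 11mm chiplets

mono_good = gross_dies_per_wafer(WAFER_MM, monolithic_area) * die_yield(monolithic_area, DEFECTS_PER_CM2)
chip_good = gross_dies_per_wafer(WAFER_MM, chiplet_area) * die_yield(chiplet_area, DEFECTS_PER_CM2)

print(f"Good monolithic dies per wafer: {mono_good:.0f}")
print(f"Good 4-chiplet 'GPUs' per wafer: {chip_good / 4:.0f}")
# Smaller dies waste less wafer edge and each defect kills less silicon, which is
# where the article's ~11-15% gross-die gain comes from (and more, once defects count).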
 
Modular GPUs seem crazy in the desktop space.

People don't realize it, but modular GPU designs are already common in the mobile space. ARM Mali GPUs in phones are modular designs, and the SoC designer can put as few or as many clusters of GPU cores as they want in the SoC.
 

dr_rus

Member
Modular GPUs seem crazy in the desktop space.

People don't realize it, but modular GPU designs are already common in the mobile space. ARM Mali GPUs in phones are modular designs, and the SoC designer can put as few or as many clusters of GPU cores as they want in the SoC.

Not the same thing. Chiplets are chips on a package, not modules inside a SoC (a SoC is one chip). The closest things we have right now are HBM memory on the same package as the GPU, and AMD's Threadripper and Epyc CPUs, which are two/four chips on the same package.

However, I'm not even remotely convinced that this approach would even work with GPUs these days. It seems like a good backup plan for when we reach the absolute limits of the production process, but until then a single monolithic GPU will likely be both faster and more efficient than any such multi-chip solution. Right now this looks like a production-cost optimization which would in fact negatively affect performance.
 

RaijinFY

Member
The MCM / chiplet approach sounds like it'll be the basis for AMD GPUs going forward and could potentially have positive implications for next-gen consoles as well.

What are they? Because I, for one, see none...
 

Marmelade

Member
Vega's doing incredibly good in the first FM7 benchmark here: https://www.computerbase.de/2017-09/forza-7-benchmark/

Likely means that T10 put about zero fucks into code optimization of PC version though.

Looks like Nvidia did some work on Forza 7 with their new 387.92 drivers
https://www.computerbase.de/2017-10...eiber-pascal-vega/#diagramm-forza-7-3840-2160

Edit: The 1080 performs better as the resolution goes up: behind the Vega 64 at 1080p, around the same at 1440p, and ahead at 4K.
 

dr_rus

Member
Looks like Nvidia did some work on Forza 7 with their new 387.92 drivers

Yeah, this was apparently an Nvidia issue which they fixed somehow. That's pretty rare with D3D12 titles though, as the D3D12 driver usually doesn't have many options for fixing things after the developers (a known case being QB@DX12 vs QB@DX11). Maybe they did something which expanded their capacity here; the D3D12 driver is still in pretty active development.

The 387.92 results are very impressive for the GTX 1060 now.
 
2 Vega 64s left for $564.99 at Amazon, looks like.

Man, when are the prices gonna really drop!?

EDIT: Oh whoops, these are 56s, were listed as 64s on PCPartPicker. Removed link!
The 64s WERE on sale for $560 on Newegg's eBay store, but that sale ended.

I hope the 64 Liquids drop closer to $700 by the end of the month. I saw some for $750 a while ago, but they went up.
 