
AMD Radeon VII Announced

ethomaz

Banned
Athlon says hi; have you forgotten about that high-clocked P4 guy?

There is no "better clocks".
Actually that was one of the weakest parts of the GCN arch... it always lacked the high clocks to be competitive.
AMD finally could boost them with 7nm... that is good now but probably won't matter anymore once nVidia moves to 7nm too.

All the issues AMD has with power draw and performance per watt come from having to run GCN at clocks higher than it can actually deliver.

Seeing this card hit higher clocks gives a chance for a mid-range part to be more balanced in power draw and performance/watt.

And yes... there are better clocks... the issue with the P4 was its IPC and longer-latency pipeline... it still had better clocks.
 
Last edited:

llien

Member
it still had better clocks.
P4 had higher clocks, which had nothing to do with "better".
It was losing to AMD's lower-clocked Athlon on performance, price, and power consumption.

Intel's own next-gen chips stepped back from the clock race.


Actually that was one of the weakest parts of the GCN arch... it always lacked the high clocks to be competitive.
It was the CGN chip that wiped the floor with Fermi.

It was still a CGN chip that trounced the GPU market.

All the issues AMD has with power draw and performance per watt come from having to run GCN at clocks higher than it can actually deliver.
AMD was always the first to embrace the next fab process, but 20nm never took off, and that hit AMD harder than others.
 
Last edited:
This is worth it for the FreeSync support alone. G-Sync monitors are utterly inaccessible to most due to their price. Even the tiny number of FreeSync monitors nVidia is now supporting are the super expensive kind. If it costs the same, is slightly faster than the 2080, and has more memory, I'd gladly trade the RTX gimmick for actual FreeSync support. Some amazing monitors can be had for dirt cheap with this feature integrated.
 
Last edited:

ethomaz

Banned
P4 had higher clocks, which had nothing to do with "better".
It was losing to AMD's lower-clocked Athlon on performance, price, and power consumption.

Intel's own next-gen chips stepped back from the clock race.



It was the CGN chip that wiped the floor with Fermi.

It was still a CGN chip that trounced the GPU market.


AMD was always the first to embrace the next fab process, but 20nm never took off, and that hit AMD harder than others.
The P4 had better clocks, that will always be true, but its lower IPC and longer-latency pipeline hurt its performance a lot.
Clocks are equal across processors... more clocks means better clocks.
Of course the other parts combined dictate the performance.

GCN always had issues with higher clocks and that became even more apparent with Polaris/Vega.
The power draw and performance/watt scale disproportionately with increased clocks... a big issue for AMD.

Fermi? nVidia changed architectures... it is not like they are trying to rely on the same old arch for 8 years in a row.

BTW it is GCN (Graphics Core Next) and not CGN lol

I'm not sure what you are debating.
 
Last edited:

Makariel

Member
Is selling my 1080 and getting this instead a crazy thing to do? I'm just getting tired of Nvidia and want to move away from them.
Depends on your definition of crazy. I think it's not a good economic decision, at least until independent reviews have come in and we know more about the value proposition. It also depends on how much you could sell your 1080 for, and the street prices for partner VII cards. I'm sitting here with a 1080 and would need to see some significant improvements in performance to pay a few hundred for an upgrade. I personally won't upgrade my GPU until the next generation of consoles hits shelves, my assumption being that until then the 1080 will still easily be able to deal with any new title that comes out.

At the moment there are just a few cards to consider IMO:
RX 570 if you're on a budget
second hand 1080ti, Radeon VII or 2080 if you're in the market for high spec gaming
2080ti if you want the best of the best
 

llien

Member
I'm not sure what you are debating.
Let me try once more to address your confusion:

1) There is no "higher clock" competition.
2) Only performance, perf/watt, and price matter.
3) Higher clock is not the only way to achieve that, as we have seen many times.
4) There is better performance, better performance per watt, better performance per buck; there is no "better clock".
5) It was CGN that trounced nVidia's Fermi, stop being dismissive about it, it was a great architecture.
 

evanft

Member
Is selling my 1080 and getting this instead a crazy thing to do? I'm just getting tired of Nvidia and want to move away from them.

Yes. If you don't do content creation, this card is basically a non-starter. You'd be better off buying a used 1080ti for $500 or a new 2080 for ~$700.

Also, if you're tired of nVidia, wait until you experience AMD drivers.
 

ethomaz

Banned
Let me try once more to address your confusion:

1) There is no "higher clock" competition.
2) Only performance, perf/watt, and price matter.
3) Higher clock is not the only way to achieve that, as we have seen many times.
4) There is better performance, better performance per watt, better performance per buck; there is no "better clock".
5) It was CGN that trounced nVidia's Fermi, stop being dismissive about it, it was a great architecture.
It is GCN, c'mon.

It was a good arch 8 years ago... nVidia surpassed it with newer archs years ago... Maxwell was already better than GCN.
I'm not even sure why you are comparing it with Fermi, because Fermi is a 2010 arch while GCN is from 2012 and competed with Kepler (2012).

Fermi's rival was the TeraScale 2 arch... both 2010.

GCN is today an old and dated arch that has big issues reaching the high clocks needed to be competitive, and that is why it suffers from disproportionate power draw and heat... AMD has to push the clocks over the limit to try to be competitive... so yes, there are "higher clock" issues.

GCN is in the situation it is today because its arch can't reach better clocks... that made AMD give up the high-end market.

It is a well-known issue on any graphics card / PC technical site.
 
Last edited:

Ascend

Member
Yes. If you don't do content creation, this card is basically a non-starter. You'd be better off buying a used 1080ti for $500 or a new 2080 for ~$700.

Also, if you're tired of nVidia, wait until you experience AMD drivers.
There is nothing wrong with AMD drivers. They are arguably superior to nVidia's now.
 

Makariel

Member
Also, if you're tired of nVidia, wait until you experience AMD drivers.
AMD drivers work fine. They used to be rubbish, but with my previous R9 390 I had zero driver issues.

Also, AMD Eyefinity is superior to the nVidia version, since Eyefinity allowed me to run all sorts of different monitors with varying native resolutions and refresh rates together. The stupid nVidia drivers are very picky about which screens I can combine, which is why I had to retire one of my screens. I miss Eyefinity :messenger_crying:
 

Panajev2001a

GAF's Pleasant Genius
It is GCN, c'mon.

It was a good arch 8 years ago... nVidia surpassed it with newer archs years ago... Maxwell was already better than GCN.
I'm not even sure why you are comparing it with Fermi, because Fermi is a 2010 arch while GCN is from 2012 and competed with Kepler (2012).

Fermi's rival was the TeraScale 2 arch... both 2010.

GCN is today an old and dated arch that has big issues reaching the high clocks needed to be competitive, and that is why it suffers from disproportionate power draw and heat... AMD has to push the clocks over the limit to try to be competitive... so yes, there are "higher clock" issues.

GCN is in the situation it is today because its arch can't reach better clocks... that made AMD give up the high-end market.

It is a well-known issue on any graphics card / PC technical site.

Sure, but as llien was saying, CU counts, processing cores per CU, warp/wavefront width, additional instructions and/or improvements to throughput and latency, cache hierarchy, additional units (see the new shader types added by the RTX line beyond RT and Tensor cores), etc... clocks are one side of the battle.
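
To put rough numbers on that (an illustrative sketch in Python; only the Vega 64 figures are real, the 96-CU configuration is hypothetical):

# Peak FP32 throughput = CUs x lanes per CU x 2 FLOPs per FMA x clock.
def peak_fp32_tflops(cus, lanes_per_cu, clock_ghz):
    return cus * lanes_per_cu * 2 * clock_ghz / 1000.0

print(peak_fp32_tflops(64, 64, 1.55))  # Vega 64-like: ~12.7 TFLOPS at ~1.55 GHz
print(peak_fp32_tflops(96, 64, 1.03))  # hypothetical wider-but-slower part: also ~12.7 TFLOPS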
 
AMD Radeon VII Detailed Some More: Die-size, Secret-sauce, Ray-tracing, and More




It depends on your priorities. If you are in the "f*ck off nVidia" camp and have the patience to wait for a Q3 launch, why not.
That seems like quite an advancement, thanks for posting this.
 

LordOfChaos

Member
Hot on the news of the VII not having uncapped FP64 like the MI50, which further limited its appeal to people who may have wanted it as a compute card rather than a gaming one, it turns out AMD is making fewer than 5000 of these...

...Honestly, that makes more sense than the rest of it. AMD never expected this card to have much appeal. Not a great one for gamers, who can get a 2080 for the same price and performance with bonus features, and no longer for those in the compute field either, unless they fall perfectly into the niche of people who, all at once:

1) Don't benefit from reduced-precision Tensor cores

2) Don't benefit from FP64 double precision

3) Just need massive, fast VRAM close to the GPU above all


...And those are the 5000 customers I guess, lol

https://www.tweaktown.com/news/64501/amd-radeon-vii-less-5000-available-custom-cards/index.html
 
Last edited:

shark sandwich

tenuously links anime, pedophile and incels
Hot on the news of the VII not having uncapped FP64 like the MI50, which further limited its appeal to people who may have wanted it as a compute card rather than a gaming one, it turns out AMD is making fewer than 5000 of these...

...Honestly, that makes more sense than the rest of it. AMD never expected this card to have much appeal. Not a great one for gamers, who can get a 2080 for the same price and performance with bonus features, and no longer for those in the compute field either, unless they fall perfectly into the niche of people who, all at once:

1) Don't benefit from reduced-precision Tensor cores

2) Don't benefit from FP64 double precision

3) Just need massive, fast VRAM close to the GPU above all


...And those are the 5000 customers I guess, lol

https://www.tweaktown.com/news/64501/amd-radeon-vii-less-5000-available-custom-cards/index.html
Holy shit. Making less than 5000 of them and losing money. Sounds like this thing only exists so that AMD can at least show its face in the high-end market.
 

LordOfChaos

Member
Holy shit. Making less than 5000 of them and losing money. Sounds like this thing only exists so that AMD can at least show its face in the high-end market.

Also maybe a lower-clocked variant makes it into the next iMac Pro while they rack up well-binned silicon for it, I guess.
 

thelastword

Banned
I like how we're running with everything the press says... The press said there would be no Gaming Vega too... The press has not even benched the card but they're already giving results...

Do you do a lot of content creation? Because that's where the 16GB of HBM2 comes in handy. The Radeon VII is more of a 'prosumer' card than a gamer card, imo.
Lisa said it's both: this will be a gaming card and a PROsumer card. The FP64 performance, the memory, and the bandwidth should increase perf for either potential buyer... AMD certainly would not make a dedicated gaming card on Vega, which was already a compute card to begin with; it's just that it's great at gaming too... So two birds with one stone... If gamers are crying that it's too expensive and would rather buy an RTX 2080 with half the RAM and bandwidth for $800.00, then content producers will buy this out of stock in a heartbeat... AMD wins either way, and then they go on to launch Navi (completely gamer-centric) in July...

I welcome this card for 8K and 4K gaming; it will be future-proof with all that bandwidth. Besides... RE7 already uses over 12GB of VRAM, RE2 can use over 14GB, and there are quite a few games coming that will use lots of VRAM in 2019... Persons who think they are safe with 8GB or even 6GB don't know half the story...
 

Ivellios

Member
I like how we're running with everything the press says... The press said there would be no Gaming Vega too... The press has not even benched the card but they're already giving results...

Lisa said it's both: this will be a gaming card and a PROsumer card. The FP64 performance, the memory, and the bandwidth should increase perf for either potential buyer... AMD certainly would not make a dedicated gaming card on Vega, which was already a compute card to begin with; it's just that it's great at gaming too... So two birds with one stone... If gamers are crying that it's too expensive and would rather buy an RTX 2080 with half the RAM and bandwidth for $800.00, then content producers will buy this out of stock in a heartbeat... AMD wins either way, and then they go on to launch Navi (completely gamer-centric) in July...

I welcome this card for 8K and 4K gaming; it will be future-proof with all that bandwidth. Besides... RE7 already uses over 12GB of VRAM, RE2 can use over 14GB, and there are quite a few games coming that will use lots of VRAM in 2019... Persons who think they are safe with 8GB or even 6GB don't know half the story...

I did some very quick research and the maximum RE7 reached at 4K with ultra settings was 8GB of VRAM.

Source: https://www.overclock3d.net/reviews/software/resident_evil_7_biohazard_pc_performance_review/9

As for RE2, it is still a demo as far as I know, so maybe the game is just not well optimized and drivers and patches can fix this.
 

LordOfChaos

Member
Lisa said it's both: this will be a gaming card and a PROsumer card. The FP64 performance, the memory, and the bandwidth should increase perf for either potential buyer..

Were you aware of this? Because FP64 performance isn't rosy...
https://techgage.com/news/radeon-vii-caps-fp64-performance/

We reached out to AMD’s Director of Product Marketing Sasa Marinkovic to inquire whether or not the FP64 inclusion was real, and were told quite simply that “Radeon VII does not have double precision enabled.” That means instead of delivering 6.7 TFLOPS of FP64 like the MI50, Radeon VII will be closer to ~862 GFLOPS (it’s 1:16 with single-precision like RX Vega).

Kind of kills one of the main ways it could have been interesting, as a Titan-equivalent compute card in a gamer card's clothing, but with 1/16th-rate FP64 that's a no-go for anything needing higher precision.
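
For reference, the arithmetic behind those figures, as a rough sketch (assuming the commonly quoted ~13.8 TFLOPS FP32 peak):

fp32 = 3840 * 2 * 1.8e9 / 1e12  # 3840 cores x 2 FLOPs/FMA x ~1.8 GHz boost = ~13.8 TFLOPS FP32
print(fp32 / 16)                # 1:16 cap -> ~0.86 TFLOPS, i.e. the ~862 GFLOPS quoted above
print(fp32 / 2)                 # a 1:2 MI50-style rate -> ~6.9 TFLOPS (AMD quotes 6.7 at rated clocks)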
 
Last edited:

ZywyPL

Banned
If that is true then 7nm is more like a paper launch because it is still immature.

7nm is doing well, just look at Zen 2 for example; the issue in this particular case is that GCN has never really benefited much from die shrinks. At least they could boost the clocks significantly, since GCN is limited to 4096 cores, 64 ROPs etc., there's not much they can do with the architecture anymore (rumors say there is a 6144-core/21TF Vega 20 behind closed doors tho).
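
Quick sanity check on that rumor (my own arithmetic, not a confirmed spec):

# clock = FLOPS / (cores x 2 FLOPs per FMA)
print(21e12 / (6144 * 2) / 1e9)  # ~1.71 GHz, roughly Radeon VII's boost clock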
 

CuNi

Member
I like how we're running with everything the press says... The press said there would be no Gaming Vega too... The press has not even benched the card but they're already giving results...

Lisa said it's both: this will be a gaming card and a PROsumer card. The FP64 performance, the memory, and the bandwidth should increase perf for either potential buyer... AMD certainly would not make a dedicated gaming card on Vega, which was already a compute card to begin with; it's just that it's great at gaming too... So two birds with one stone... If gamers are crying that it's too expensive and would rather buy an RTX 2080 with half the RAM and bandwidth for $800.00, then content producers will buy this out of stock in a heartbeat... AMD wins either way, and then they go on to launch Navi (completely gamer-centric) in July...

I welcome this card for 8K and 4K gaming; it will be future-proof with all that bandwidth. Besides... RE7 already uses over 12GB of VRAM, RE2 can use over 14GB, and there are quite a few games coming that will use lots of VRAM in 2019... Persons who think they are safe with 8GB or even 6GB don't know half the story...

If games reach 14 GB of VRAM then someone should slap their fingers and order them to fecking optimize their games. I'm running the RE2 demo on a 970 without issues at well above 60 FPS. Not all maxed, but most settings medium to high. Also, to test it out I once cranked all the settings up to max. My 970 was running close to 60 fps but of course it had freezes every couple of seconds because of the crippled VRAM. If it had 4GB of VRAM instead of 3.5 + 0.5 it would've been fine too. This demo wants way too much VRAM.
 
Rumors that AMD has fewer than 5,000 aren't surprising. These are parts which failed to qualify as MI50s, which is already a low-volume part. So a small quantity of failed silicon from a low-volume part is being resold at a loss to salvage what little revenue can be recovered, since at least selling them will result in a smaller loss than just throwing them in the garbage. That's the Radeon Fantasy VII.
 
Last edited:

Kenpachii

Member
I like how we're running with everything the press says...The press said there would be no Gaming Vega too....Press has not even benched the card but they're already giving results.....

Lisa said it's both, this will be a gaming card and a PROsumer card, the FP64 performance, the memory and bandwidth should increase perf for either potential buyer...….AMD certainly would not just make a dedicated gaming card on Vega, which was already a compute card to begin with, it's just that it's great at gaming too......So two birds with one stone......If gamers are crying it's too expensive and would rather buy an RTX 2080 with half the ram and bandwidth for 800.00, then content Producers will buy this out of stock in a heartbeat….AMD wins either way and then they go on to launch Navi (completely gamercentric) in July....

I welcome this card for 8k and 4k gaming, it will be futureproof with all that bandwidth, besides.....RE7 already uses over 12Gb of VRAM, RE2 can use over 14Gb and there are quite a few games coming that will use lots of Vram in 2019...…..Persons think they are safe with 8Gb's or even 6GB don't know half the story...

And new cards will arrive at that point. Which makes this card again useless.
Have fun buying a card that gets zero support so you can play games 3 years from now a bit faster, at a higher VRAM setting whose difference you'd need a microscope to even see.

And good luck with their driver support... no thanks.

Also, if RE2 uses 14GB of VRAM it's a shit port.

My 970 runs the game at 60+ fps without issues, and anybody sitting on a 970 or higher GPU at this point in time on PC has practically zero reason to upgrade until next generation. And if you want 4K you want all the performance you can get, which isn't this Vega card.

There is zero market for it.

Nvidia knows this, and AMD even knows it themselves.
 
Last edited:

ethomaz

Banned
7nm is doing well, just look at Zen 2 for example; the issue in this particular case is that GCN has never really benefited much from die shrinks. At least they could boost the clocks significantly, since GCN is limited to 4096 cores, 64 ROPs etc., there's not much they can do with the architecture anymore (rumors say there is a 6144-core/21TF Vega 20 behind closed doors tho).
Being limited to 5000 units means yield issues with 7nm.

I can't see any other business reason except production issues.

5000 units is definitely fewer than even a usual paper launch.
 
Last edited:

Ascend

Member
Being limited to 5000 units means yield issues with 7nm.

I can't see any other business reason except production issues.

5000 units is definitely fewer than even a usual paper launch.
Or the yields are exceptionally good for the Instinct GPUs, and these are failed parts that are downgraded to gaming graphics cards.
 

Shotpun

Member
I welcome this card for 8K and 4K gaming; it will be future-proof with all that bandwidth. Besides... RE7 already uses over 12GB of VRAM, RE2 can use over 14GB, and there are quite a few games coming that will use lots of VRAM in 2019... Persons who think they are safe with 8GB or even 6GB don't know half the story...

In Digital Foundry's RE2 Demo Tech Analysis the game was actually using 9.6GB of VRAM at 4K max settings, while the game options were saying it would use nearly 14GB. Also, keep in mind there is a difference between how much VRAM a game can make use of and how much it actually needs to perform smoothly.

Radeon VII will probably be an OK 4K card for a while, but you can forget 8K.

Edit: ram -> vram
 
Last edited:

JohnnyFootball

GerAlt-Right. Ciriously.
In Digital Foundry's RE2 Demo Tech Analysis the game was actually using 9.6GB of ram at 4K max settings, while the game options were saying it would use nearly 14GB. Also, keep in mind there is a difference between how much VRAM a game can make use of and how much it actually needs to perform smoothly.

Radeon VII will probably be an OK 4K card for a while, but you can forget 8K.

8K needs to be a word (letter?) that hibernates for the next 5 years. It's very surprising and disappointing to me that TV manufacturers are rushing 8K to the market when the market is just now starting to become acclimated to 4K. There is almost no 8K content, and I suspect 35mm film has more or less reached its max resolution at 4K. I am just not seeing the benefit at all.

And gaming at 8K? Considering that, as of right now, pretty much only the 2080 Ti can break 4K/60fps in most games, the idea of 8K is just a pipe dream.
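
The pixel counts alone make the point (plain arithmetic):

uhd_4k = 3840 * 2160    # ~8.3 million pixels
uhd_8k = 7680 * 4320    # ~33.2 million pixels
print(uhd_8k / uhd_4k)  # 4.0 -- 8K pushes exactly 4x the pixels of 4K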
 

LordOfChaos

Member
Were you aware of this? Because FP64 performance isn't rosy...
https://techgage.com/news/radeon-vii-caps-fp64-performance/



Kind of kills one of the main ways it could have been interesting, as a Titan-equivalent compute card in a gamer card's clothing, but with 1/16th-rate FP64 that's a no-go for anything needing higher precision.


Correction, it's a 1/8th rate, still a far cry from Vega 20's full rate, but, well, half as bad as 1/16th.

Boy, AMD's messaging is all over the place. But I imagine a 1/8th rate kills it just as dead as 1/16th for anyone needing DP and thinking this was a cheaper MI50.
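
Updated sketch at the corrected 1/8th rate (same assumed ~13.8 TFLOPS FP32 peak as before):

fp32 = 13.8      # assumed FP32 peak in TFLOPS
print(fp32 / 8)  # ~1.7 TFLOPS FP64 at 1:8 -- double the 1:16 figure
print(fp32 / 2)  # ~6.9 TFLOPS at a 1:2 MI50-style rate, still ~4x more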

 

SonGoku

Member
Correction, it's a 1/8th rate, still a far cry from Vega 20's full rate, but, well, half as bad as 1/16th.

Boy, AMD's messaging is all over the place. But I imagine a 1/8th rate kills it just as dead as 1/16th for anyone needing DP and thinking this was a cheaper MI50.


It's still the best 4K gaming card in that price sector; it has that going for it.
 

ethomaz

Banned
It's still the best 4K gaming card in that price sector; it has that going for it.
Is it? It has the same price as the RTX 2080 and it is more expensive than the GTX 1080 Ti.

How do these cards compare in performance?
 
Last edited:

JohnnyFootball

GerAlt-Right. Ciriously.
Is it? It has the same price as the RTX 2080 and it is more expensive than the GTX 1080 Ti.

How do these cards compare in performance?
It's really getting annoying seeing these posts about the 1080 Ti, because it's simply not true any more. All 1080 Tis are completely sold out at this point. I guarantee you will not find one for $699. Go check Amazon, Newegg, or Microcenter. They're either completely sold out or being sold by 3rd-party sellers in the $800 - $1200 range. The ones on eBay are used and likely spent a lot of time as miner cards, so those don't count.
 
Last edited:

SonGoku

Member
Is it? It has the same price as the RTX 2080 and it is more expensive than the GTX 1080 Ti.

How do these cards compare in performance?
The 1080 Ti is discontinued; the cheapest one can be found for $650 on Newegg (that's refurbished), the cheapest new one is $799.
The VII performs like the RTX 2080 at 4K, but it has the advantage of higher bandwidth, which gives it an edge at 4K, and the extra memory gives it longer legs for 4K.

If your interest is 4K gaming, the VII is the better option in that price bracket.
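
The raw bandwidth part is just spec-sheet arithmetic (paper numbers, not a benchmark; whether it shows up in games is another matter):

# Memory bandwidth = (bus width in bits / 8) x data rate in Gbps -> GB/s
vii_gb_s   = 4096 / 8 * 2.0   # Radeon VII: 4096-bit HBM2 @ 2.0 Gbps = 1024 GB/s
r2080_gb_s = 256 / 8 * 14.0   # RTX 2080: 256-bit GDDR6 @ 14 Gbps = 448 GB/s
print(vii_gb_s / r2080_gb_s)  # ~2.3x the raw bandwidth on paper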
 
Last edited:

ethomaz

Banned
The 1080 Ti is discontinued; the cheapest one can be found for $650 on Newegg.
The VII performs like the RTX 2080 at 4K, but it has the advantage of higher bandwidth, which gives it an edge at 4K, and the extra memory gives it longer legs for 4K.

If your interest is 4K gaming, the VII is the better option in that price bracket.
Where did you see that "edge" in 4K? Can you share any source or is it just speculation?

The GCN arch is indeed hungrier for bandwidth than nVidia's arch... that doesn't mean better performance.
 
Last edited:

SonGoku

Member
Where did you see that "edge" in 4K? Can you share any source or is it just speculation?

The GCN arch is indeed hungrier for bandwidth than nVidia's arch.
4K = bandwidth good.
Plus some benches were posted on this page where the VII edged the 2080.

Vega has a tile-based renderer, no?
 

shark sandwich

tenuously links anime, pedophile and incels
Where did you see that "edge" in 4K? Can you share any source or is it just speculation?

The GCN arch is indeed hungrier for bandwidth than nVidia's arch.
Purely speculation because we don’t have any independent benchmarks yet.
 

ethomaz

Banned
4K = bandwidth good.
Plus some benches were posted on this page where the VII edged the 2080.

Vega has a tile-based renderer, no?
Different architectures = different bandwidth requirements at 4K.

There are no benchmarks so far.

I'm not saying you are wrong... it just seems you made up your minds without any evidence.
 
Last edited:

JohnnyFootball

GerAlt-Right. Ciriously.
Different architectures = different bandwidth requirements at 4K.

There are no benchmarks so far.

I'm not saying you are wrong... it just seems you made up your minds without any evidence.
Speaking of lack of evidence, you just claimed that 1080 Tis are cheaper when in fact they are not.
 

shark sandwich

tenuously links anime, pedophile and incels
Different architectures = different bandwidth requirements at 4K.

There are no benchmarks so far.

I'm not saying you are wrong... it just seems you made up your minds without any evidence.
Yeah that is nothing but wishful thinking at this point. Radeon VII has lots of bandwidth, more than even the 2080 Ti, but obviously it’s not going to outperform that GPU.

He’s just latching on to the one clear advantage Radeon VII has, and claiming (without evidence) that that’s going to be the Most Important Thing. It has the unmistakable stench of fanboyism.

We’ll see. My guess is it’ll win a few and lose a few (by a small margin either way) vs the 2080. But I personally doubt we’ll be looking at the benchmarks and saying “it’s a wash at 2560x1440, but when you go up to 4K this card really has the edge!”

Also, think critically for a minute. If this had "the edge at 4K" then they would've focused far more on Radeon VII vs 2080 benchmarks. Instead they only showed a comparison with Vega 64.
 
Last edited:

JohnnyFootball

GerAlt-Right. Ciriously.
It was not too long ago.

When did the Radeon VII become the "best 4K gaming" card?
It’s been close to 3 months since 1080 Tis were affordable. They were pretty much all gone by November. Same with the regular 1080. Now that the 2060 is out there is literally no reason to compare the 10 series to the new Radeon.
 

llien

Member
AMD has issued a brief denial that it's limited to 5k units or that there will be no AIB versions:

AMD (China) said (translated): "We will not release production figures, but when released on February 7, AMD.com official website and AIB vendor partners will have products on sale, and we expect the supply of Radeon VII to meet the needs of gamers."
techpowerup
 

SonGoku

Member
Different architectures = different bandwidth requirements at 4K.

There are no benchmarks so far.
I'm not saying you are wrong... it just seems you made up your minds without any evidence.
Sure, but one has an excess of bandwidth, which will come in handy for 4K.
I reckon the VII and the 2080 are about equal; I'm just saying the VII is a better investment for 4K gaming due to its bandwidth and, more importantly, the extra memory giving it longer legs.
It was not too long ago.

When did the Radeon VII become the "best 4K gaming" card?
In that price bracket*
 
Last edited:

ethomaz

Banned
Sure, but one has an excess of bandwidth, which will come in handy for 4K.
I reckon the VII and the 2080 are about equal; I'm just saying the VII is a better investment for 4K gaming due to its bandwidth and, more importantly, the extra memory giving it longer legs.

In that price bracket*
How do you know that? Please share your sources.
 

JohnnyFootball

GerAlt-Right. Ciriously.
How do you know that? Please share your sources.
It comes from AMD's press conference where they showed benchmarks. Obviously they are shown in a positive light, but there is no reason to believe that the 2080 won't be the benchmark to match.

We will know much more in 3 weeks once reviews are out.
 