Athlon says hi. Have you forgotten about that high-clocked P4 guy?
Actually, that was one of the weakest parts of the GCN arch... it always lacked the high clocks to be competitive.
There is no such thing as "better clocks".
P4 had higher clocks, which had nothing to do with "better".
It was the GCN chip that wiped the floor with Fermi.
AMD was always the first to embrace the next fab process, but 20nm never took off, and it hit AMD harder than others.
All the issues AMD has with power draw and performance per watt come from having to run clocks higher than GCN can actually deliver.
P4 had higher clocks, that will always be true, but its lower IPC and long pipeline hurt its performance a lot.
It lost to AMD's lower-clocked Athlon on performance, price, and power consumption.
Intel's next-gen chips stepped back from the clock race.
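The P4-vs-Athlon point reduces to simple throughput arithmetic: performance scales with IPC times clock, so a lower-clocked chip with higher IPC can win. A minimal sketch with purely illustrative numbers (not measured IPC figures for either CPU):

```python
# Performance ~ IPC x clock; the numbers here are illustrative only,
# not measured IPC values for the Pentium 4 or the Athlon.
p4_clock_ghz, p4_ipc = 3.0, 1.0
athlon_clock_ghz, athlon_ipc = 2.2, 1.5
p4_perf = p4_clock_ghz * p4_ipc
athlon_perf = athlon_clock_ghz * athlon_ipc
print(athlon_perf > p4_perf)  # True: lower clock, higher throughput
```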
It was still the GCN chip that trounced the GPU market.
Depends on your definition of crazy. I think it's not a good economic decision, at least until independent reviews have come in and we know more about the value proposition. It also depends on how much you could sell your 1080 for, and on the street prices for partner VII cards. I'm sitting here with a 1080 and would need to see significant improvements in performance to pay a few hundred for an upgrade. I personally won't upgrade my GPU until the next generation of consoles hits shelves, my assumption being that until then the 1080 will still easily be able to handle any new title that comes out.
I'm not sure what you are debating.
Is selling my 1080 and getting this instead a crazy thing to do? I'm just getting tired of Nvidia and want to move away from them.
Let me try once more to address your confusion:
1) There is no "higher clock" competition.
2) Only performance, perf/watt, and price matter
3) Higher clocks are not the only way to achieve that, as we have seen many times
4) There is better performance, better performance per watt, better performance per buck; there is no "better clock"
5) It was GCN that trounced nVidia's Fermi, stop being dismissive about it, it was a great architecture
Yes. If you don't do content creation, this card is basically a non-starter. You'd be better off buying a used 1080 Ti for $500 or a new 2080 for ~$700.
Also, if you're tired of nVidia, wait until you experience AMD drivers.
There is nothing wrong with AMD drivers. They are arguably superior to nVidia's now.
AMD drivers work fine. They used to be rubbish, but with my previous R9 390 I had zero driver issues.
It is GCN, c'mon.
It was a good arch 8 years ago... nVidia surpassed it with newer architectures years ago... Maxwell was already better than GCN.
I'm not even sure why you are comparing it with Fermi, because Fermi is a 2010 arch while GCN is from 2012 and competed with Kepler (2012).
Fermi's rival was the TeraScale 2 arch... both from 2010.
GCN is today an old and dated arch that has big issues reaching the high clocks needed to be competitive, and that is why it suffers from disproportionate power draw and heat... AMD has to push the clocks over the limit to try to be competitive... so yes, there are "higher clock" issues.
GCN is in the situation it is in today because the arch can't reach higher clocks... that is what made AMD give up the high-end market.
It is a well-known issue on any graphics card / PC technical site.
AMD Radeon VII Detailed Some More: Die-size, Secret-sauce, Ray-tracing, and More
That seems like quite an advancement, thanks for posting this.
It depends on your priorities. If you are in the "f*ck off nVidia" camp and have the patience to wait for the Q3 launch, why not.
Hot on the news of the VII not having uncapped FP64 like the MI50, which further limited its appeal to people who may have wanted it as a compute card rather than a gaming one, it turns out AMD is making less than 5000 of these...
...Honestly, that makes more sense than the rest of it. AMD never expected this card to have much appeal. It's not a great one for gamers, who can get a 2080 for the same price and performance with bonus features, and no longer for those in the compute field either, unless they fall perfectly into the niches where all of the following apply:
1) Don't benefit from reduced precision Tensor cores
2) Don't benefit from FP64 double precision
3) Above all, just need a massive amount of fast VRAM close to the GPU
...And those are the 5000 customers I guess, lol
https://www.tweaktown.com/news/64501/amd-radeon-vii-less-5000-available-custom-cards/index.html
Holy shit. Making less than 5000 of them and losing money. Sounds like this thing only exists so that AMD can at least show its face in the high-end market.
Do you do a lot of content creation? Because that's where the 16GB of HBM2 comes in handy. The Radeon VII is more of a 'prosumer' card than a gamer card, imo.
I like how we're running with everything the press says... The press said there would be no gaming Vega too... The press hasn't even benched the card, but they're already giving results...
Lisa said it's both: this will be a gaming card and a PROsumer card. The FP64 performance, the memory, and the bandwidth should increase perf for either potential buyer... AMD certainly would not make a dedicated gaming card on Vega, which was already a compute card to begin with; it's just that it's great at gaming too... So, two birds with one stone... If gamers are crying that it's too expensive and would rather buy an RTX 2080 with half the RAM and bandwidth for $800.00, then content producers will buy this out of stock in a heartbeat... AMD wins either way, and then they go on to launch Navi (completely gamer-centric) in July...
I welcome this card for 8K and 4K gaming; it will be futureproof with all that bandwidth. Besides, RE7 already uses over 12GB of VRAM, RE2 can use over 14GB, and there are quite a few games coming in 2019 that will use lots of VRAM... People who think they are safe with 8GB or even 6GB don't know half the story...
We reached out to AMD’s Director of Product Marketing Sasa Marinkovic to inquire whether or not the FP64 inclusion was real, and were told quite simply that “Radeon VII does not have double precision enabled.” That means instead of delivering 6.7 TFLOPS of FP64 like the MI50, Radeon VII will be closer to ~862 GFLOPS (it’s 1:16 with single-precision like RX Vega).
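The quoted ~862 GFLOPS can be sanity-checked against the card's single-precision spec; a quick sketch, assuming Radeon VII's published peak of roughly 13.8 TFLOPS FP32:

```python
# FP64 at a 1:16 rate of FP32, as quoted by AMD
fp32_tflops = 13.8                     # Radeon VII peak FP32 (published spec)
fp64_gflops = fp32_tflops / 16 * 1000  # convert TFLOPS -> GFLOPS
print(f"{fp64_gflops:.1f} GFLOPS")     # 862.5 GFLOPS, i.e. the "~862" figure
```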
If that is true, then 7nm is more of a paper launch because it is still immature.
Limited to 5000 units means yield issues with 7nm.
7nm is doing well, just look at Zen 2 for example. The issue in this particular case is that GCN has never really benefited much from die shrinks. At least they could boost the clocks significantly; since GCN is limited to 4096 cores, 64 ROPs, etc., there's not much they can do with the architecture anymore (rumors say there is a 6144-core / 21 TF Vega 20 behind closed doors, though).
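As a rough plausibility check on that rumor, 21 TFLOPS is about what 6144 shaders would deliver at Radeon VII-class clocks. A sketch; the 6144-core part is rumor and the ~1.71 GHz boost clock is an assumption:

```python
# Peak FP32 = shaders x 2 ops/clock (FMA) x clock
shaders = 6144        # rumored core count
clock_ghz = 1.71      # assumed boost clock, roughly Radeon VII territory
tflops = shaders * 2 * clock_ghz / 1000
print(round(tflops, 1))  # ~21.0, matching the rumored 21 TF figure
```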
Or the yields are exceptionally good for the Instinct GPUs, and these are failed parts that are downgraded to gaming graphics cards.
I can’t see any other business reason except production issues.
5000 units is definitely less than the usual paper launch.
In Digital Foundry's RE2 demo tech analysis, the game was actually using 9.6GB of VRAM at 4K max settings, while the in-game options claimed it would use nearly 14GB. Also, keep in mind there is a difference between how much VRAM a game can make use of and how much it actually needs to perform smoothly.
The Radeon VII will probably be an OK 4K card for a while, but you can forget 8K.
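For context on why 8K is a stretch: it quadruples the pixel count of 4K, so shading work and bandwidth demands scale roughly 4x per frame. Simple resolution arithmetic, independent of any particular GPU:

```python
# 4K UHD vs 8K UHD pixel counts
px_4k = 3840 * 2160   # 8,294,400 pixels
px_8k = 7680 * 4320   # 33,177,600 pixels
print(px_8k / px_4k)  # 4.0 -> roughly 4x the work per frame
```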
TeraScale (the HD 5000 series) wiped the floor with Fermi, not GCN.
Were you aware of this? Because FP64 performance isn't rosy...
https://techgage.com/news/radeon-vii-caps-fp64-performance/
Kind of kills one of the main ways it could have been interesting: as a Titan equivalent, a compute card in gamer card's clothing. But with 1/16-rate FP64, that's a no-go for anything needing higher precision.
Correction: it's a 1/8 rate. Still a far cry from Vega 20's full rate, but, well, half as bad as 1/16.
Boy, AMD's messaging is all over the place. But I imagine a 1/8 rate kills it just as dead as 1/16 for anyone needing DP and thinking this was a cheaper MI50.
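For scale, here is what those rates work out to, assuming Radeon VII's published ~13.8 TFLOPS FP32 peak; either way it sits far below the MI50's 1:2 rate:

```python
fp32_tflops = 13.8       # Radeon VII peak FP32 (published spec)
print(fp32_tflops / 8)   # 1.725 TFLOPS at the corrected 1:8 rate
print(fp32_tflops / 16)  # 0.8625 TFLOPS at the originally quoted 1:16
# The MI50 runs FP64 at 1:2, which is how it reaches its official 6.7 TFLOPS
```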
It's still the best 4K gaming card in that price sector; it has that going for it.
Is it? It has the same price as the RTX 2080 and it is more expensive than the GTX 1080 Ti.
It's really getting annoying seeing these posts about the 1080 Ti, because it's simply not true any more. All 1080 Tis are completely sold out at this point. I guarantee you will not find one for $699. Go check Amazon, Newegg, or Micro Center. They're either completely sold out or being sold by third-party sellers in the $800-$1200 range. The ones on eBay are used and likely spent a lot of time as mining cards, so those don't count.
How do these cards compare in performance?
The 1080 Ti is discontinued; the cheapest one can be found for $650 on Newegg.
Where did you see that "edge" in 4K? Can you share any source, or is it just speculation?
The VII performs like the RTX 2080 at 4K, but it has the advantage of higher bandwidth, which gives it an edge at 4K, and the extra memory gives it longer legs for 4K.
If your interest is 4K gaming, the VII is the better option in that price bracket.
4K = bandwidth good.
GCN arch is indeed more hungry for bandwidth than nVidia’s arch.
Purely speculation, because we don't have any independent benchmarks yet.
Different architectures = different bandwidth requirements at 4K.
Plus, some benches posted on this page showed the VII edging the 2080.
Vega has a tile-based renderer, no?
Speaking of lack of evidence, you just claimed that 1080 Tis are cheaper when in fact they are not.
There are no benchmarks so far.
I'm not saying you are wrong... it just seems you made up your minds without any evidence.
It was not too long ago.
Yeah, that is nothing but wishful thinking at this point. The Radeon VII has lots of bandwidth, more than even the 2080 Ti, but obviously it's not going to outperform that GPU.
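The bandwidth numbers themselves aren't in dispute; Radeon VII's announced configuration (4 HBM2 stacks, a 1024-bit interface per stack, 2.0 Gbps per pin) works out to roughly 1 TB/s:

```python
# Radeon VII memory bandwidth from its announced HBM2 configuration
stacks = 4
bits_per_stack = 1024  # HBM2 interface width per stack
gbps_per_pin = 2.0     # per-pin data rate
bandwidth_gb_s = stacks * bits_per_stack * gbps_per_pin / 8
print(bandwidth_gb_s)  # 1024.0 GB/s, vs ~616 GB/s on the RTX 2080 Ti
```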
It's been close to 3 months since 1080 Tis were affordable. They were pretty much all gone by November. Same with the regular 1080. Now that the 2060 is out, there is literally no reason to compare the 10 series to the new Radeon.
When did the Radeon VII become the "best 4K gaming" card?
Sure, but one has an excess of bandwidth, which will come in handy for 4K.
In that price bracket*
How do you know that? Please share your sources.
I reckon the VII and the 2080 are about equal; I'm just saying it's a better investment for 4K gaming due to its bandwidth and, more importantly, the extra memory giving it longer legs.
In that price bracket*
It comes from AMD's press conference, where they showed benchmarks. Obviously they are shown in a positive light, but there is no reason to believe that the 2080 won't be the benchmark.
That they are equal, or that the extra memory will give it longer legs?