https://www.anandtech.com/show/13923/the-amd-radeon-vii-review

Kicking off our wrap-up, then, let's talk about the performance numbers. Against its primary competition, the GeForce RTX 2080, the Radeon VII ends up 5-6% behind in our benchmark suite. Unfortunately, the only games where it takes the lead are Far Cry 5 and Battlefield 1, so the Radeon VII doesn't get to 'trade blows' as much as I'm sure AMD would have liked to see. Meanwhile, not unlike the RTX 2080 it competes with, AMD isn't looking to push the envelope on price-to-performance ratios here, so the Radeon VII isn't undercutting the pricing of the 2080 in any way. This is a perfectly reasonable choice for AMD to make given the state of the current market, but it does mean that when the card underperforms, there's no pricing advantage to help pick it back up.
Comparing the uplift over the original RX Vega 64 puts the Radeon VII in a better light: it is about 24% faster at 1440p and 32% faster at 4K. Reference-to-reference, this might even be grounds for an upgrade rather than a side-grade. But the fact of the matter is that its predecessor was competing against the second-tier GTX 1080, and now, with the Radeon VII, Vega is still looking to match the performance of last generation's flagship, the GTX 1080 Ti. The positioning is still set in pure gaming terms, too, and power efficiency isn't one of the allures of the Radeon VII.
Where it does have some interesting potential is on the compute and professional visualization side, though given our limited tests there's little conclusive to say. So where does this leave AMD? Their situation is improved, but the overall competitive landscape hasn't been significantly changed. The renewed AMD option in high-performance graphics is still important in terms of maintaining FreeSync possibilities. As an upgrade choice, the Radeon VII offers itself better as a high-VRAM prosumer card for gaming content creators, and of course that is the intended justification for its $699 cost, a price point initially carved out by its competitor. For pure gamers, then, the price is really the point of contention.
> AMD, stay losing. 2080 is the same price with more features.

Here in the UK the VII is going for about £650-700, while a 2080 is £750-850. Both are expensive as hell, but you can get the VII a bit cheaper upfront, so it ain't a terrible deal if you don't care about RTX/DLSS. Well, I guess the "savings" from buying a VII will be lost when your power bill skyrockets, lol.
> No review thread is complete without Tech Jesus:

Tech Jesus's review was detailed and full of great info, as usual.
Another thing to note: this card is being compared to an $800 overclocked 2080 Founders Edition, while the Radeon VII is at reference clocks. It is not being compared to the $700 standard 2080 with lower clocks. So take that as you will, or as it is.
> Techradar's review was MUCH more positive... as is IGN's

What's weird is a review with only two games tested, where in one the RTX 2080 has a good lead and in the other it trades blows.
https://www.techradar.com/reviews/amd-radeon-vii
https://www.ign.com/articles/2019/02/07/amd-radeon-vii-review-and-benchmarks
Digital Trend's review is even better https://www.digitaltrends.com/computing/amd-radeon-vii-review/
Weird.
> That should have no real impact. It's only a 90MHz offset, if I recall correctly, and the difference between reference and overclocked is almost non-existent in practice. Looking at different versions of the 2060, for instance, the EVGA XC Black (bone stock, 1680MHz listed boost clock) vs the ASUS ROG Strix (overclocked to 1860MHz) shows no real performance difference (1-2 fps).

Every bit helps and skews results. Sometimes a game wins by 1-2 fps, and it's still recorded as a win. Besides, depending on the game, I think an OC can give you a bit more than that. If a title is already an Nvidia-developed title or heavily favors NV, the OC boosts it even more over the competition. It would just be fairer to compare a $700 card against a $700 card; then, if they want, they can OC both of them.
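To put the clock-offset argument in perspective, here is a quick back-of-the-envelope sketch. The clocks and fps figures are the ones quoted above; the linear-scaling assumption is an optimistic upper bound, not how games actually behave, which is exactly why the measured 1-2 fps difference is smaller than the theoretical ceiling:

```python
def max_fps_gain(base_clock_mhz: float, oc_clock_mhz: float, base_fps: float) -> float:
    """Upper-bound fps at the overclocked boost clock, assuming perfectly
    linear scaling with core clock (real games scale sub-linearly)."""
    return base_fps * (oc_clock_mhz / base_clock_mhz)

# EVGA XC Black (1680 MHz listed boost) vs ASUS ROG Strix (1860 MHz), per the
# post above: at 60 fps the linear-scaling ceiling is ~66 fps, so a measured
# 1-2 fps difference means the OC delivers only a fraction of its headroom.
print(round(max_fps_gain(1680, 1860, 60.0), 1))  # -> 66.4
```

So both posters can be right: the offset is worth a few percent on paper, but whether it "skews results" depends on whether the game is clock-bound at all.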
> I'm surprised the Radeon VII can match 2080 results; I wasn't expecting such a big performance jump from Vega 64. Compared to the RTX 2080 there's of course one issue here, the lack of an equivalent to the Tensor and RT cores, but on the other hand the Radeon VII has an impressive 16GB of VRAM. The amount of VRAM should be a very important factor in the near future, because one year from now we should see next-gen consoles with around 16-24GB of system memory. So when 8GB cards like the RTX 2080 run out of VRAM (games like the RE2 remake can allocate around 9GB even today, and that's a game made with current consoles in mind), 16GB on the Radeon should suffice for a long time.

Sorry, but this is ridiculous. You really think it makes sense to buy a card that has worse performance today in the hopes that it'll maybe have better performance in the future?
> Sorry, but this is ridiculous. You really think it makes sense to buy a card that has worse performance today in the hopes that it'll maybe have better performance in the future?

Performance today is very comparable, and when games start using more VRAM the RTX 2080's results will only get much worse. It's not a question of if: when next-gen consoles launch in the near future with 16-24GB of system memory, VRAM usage in multiplatform games on PC will increase without any doubt. It happened many times before and it will happen again. I still remember how angry I was when I bought a GTX 680 2GB in 2012. It was an extremely fast card, but very soon the PS4 launched, and although I could run PS4 games with 2x better performance or 2x higher resolution, my GPU was VRAM-limited in almost every game, and the stuttering killed the whole experience.
I guess we’ll see, but I’d be surprised if the relative performance between 2080 and Radeon VII changes all that much. I certainly wouldn’t base a $700 purchase on that hope.
> It's a decent card, but the problem is that there is little reason to get it over the 2080. I wouldn't be surprised if nvidia announces a tiny price drop on the 2080 to make it better.

Well, I think there is more than a little. I guess it can also depend on which vendor you prefer, because people buy cards (or don't) based on that reason alone. But technically, the Radeon VII has 8GB more VRAM and about 2.1x the memory bandwidth, which translates to less stuttering in games (watch the Optimum Tech video for a few examples). These features make it more futureproof. Nobody buys a card at $700 simply for games you can get now or for older games, as was shown in many of the videos, even the bit-tech video, btw. There are many games and applications where the extra memory is being used now. Even Richard Leadbetter's productivity application errored out with an NV card decked with 12GB of VRAM but finished the job with the 16GB Radeon.
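For what it's worth, the bandwidth gap can be sanity-checked from the two cards' published memory specs. A small sketch (spec-sheet figures, not measurements; the "2.1x" above may come from pairing slightly different numbers, since the raw spec ratio works out closer to 2.3x):

```python
def bandwidth_gbps(bus_width_bits: int, data_rate_gbps_per_pin: float) -> float:
    """Peak memory bandwidth in GB/s: pins * per-pin data rate / 8 bits per byte."""
    return bus_width_bits * data_rate_gbps_per_pin / 8

radeon_vii = bandwidth_gbps(4096, 2.0)   # 4 stacks of HBM2 at 2.0 Gbps/pin -> 1024 GB/s
rtx_2080   = bandwidth_gbps(256, 14.0)   # 256-bit GDDR6 at 14 Gbps -> 448 GB/s
print(radeon_vii, rtx_2080, round(radeon_vii / rtx_2080, 2))  # -> 1024.0 448.0 2.29
```

Either way, the Radeon VII has well over twice the 2080's peak bandwidth on paper, though as the reviews note, bandwidth alone doesn't close a performance gap.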
> Performance today is very comparable, and when games start using more VRAM the RTX 2080's results will only get much worse...

VRAM usage won't move that much in the coming years. It will have a spike when games start to use 8K resolutions, but the trend is toward better memory management and compression, using less and less VRAM.
NV should never have released the RTX 2080 with just 8GB. NV probably wanted to make more money on RTX 2080 cards, so they went with just 8GB. Personally, I have learned from my own mistake in the past, and right now I consider the amount of GPU VRAM an extremely important aspect. A year and a half ago I could have bought a GTX 1080 with 8GB, but I wanted more VRAM, so I bought the 11GB 1080 Ti. Today there are already games like RE2 that use around 9GB of VRAM in 4K, and next year, when the next-gen consoles launch, people will see more and more games like that.
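As an aside on where those gigabytes go at 4K: raw render targets are actually a small slice of the total, which a bit of arithmetic makes clear. The render-target math below is exact; the conclusion that texture streaming dominates is the standard explanation, not something measured in this thread:

```python
def render_target_mb(width: int, height: int, bytes_per_pixel: int) -> float:
    """Memory for one full-resolution render target, in MiB."""
    return width * height * bytes_per_pixel / (1024 * 1024)

# One RGBA8 color target at 4K: 3840 * 2160 * 4 bytes ~= 31.6 MiB.
# Even a deferred renderer with, say, ten full-resolution targets (G-buffer,
# depth, post-processing chains) needs well under 1 GiB for render targets
# alone, so multi-GB usage like RE2's comes almost entirely from textures.
print(round(render_target_mb(3840, 2160, 4), 1))  # -> 31.6
```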
> VRAM usage won't move that much in the coming years. It will have a spike when games start to use 8K resolutions, but the trend is toward better memory management and compression, using less and less VRAM.

Your graphs show a memory bandwidth reduction, but not a memory usage reduction! RE2 is a game made with current-gen consoles in mind, and look at its VRAM usage on PC in 4K: around 9GB! There are also other games, like COD and Final Fantasy, with similar VRAM usage.
Next-gen consoles are set to use 16GB shared memory... maybe 20GB shared memory... no more than that.
That is what has been happening with each new GPU architecture...
Some games allocate all available VRAM but don't actually use all of it, so running them on 8GB or 4GB cards won't make any difference in performance or graphics quality.
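That allocation-versus-usage distinction can be checked with `nvidia-smi`'s query mode, which reports memory actually resident on the GPU rather than what a game's overlay claims. A minimal parser sketch for its CSV output (the sample string below is made up for illustration; run the real command on your own machine):

```python
import csv
import io

def parse_meminfo(csv_text: str) -> list:
    """Parse `nvidia-smi --query-gpu=memory.used,memory.total --format=csv`
    output into one dict per GPU."""
    rows = list(csv.reader(io.StringIO(csv_text.strip())))
    header = [h.strip() for h in rows[0]]
    return [dict(zip(header, (c.strip() for c in r))) for r in rows[1:]]

# Hypothetical sample output for a single 8 GiB card:
sample = """memory.used [MiB], memory.total [MiB]
5210 MiB, 8192 MiB"""

gpus = parse_meminfo(sample)
print(gpus[0]["memory.used [MiB]"])  # resident usage, not what the game allocated
```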
Radeon VII has 16GB for prosumers... not gamers.
We have reached an era in the PC world where more space (including memory) is not really needed; that is why the advance is so slow.
This is clearly a stopgap product just so AMD can at least show their face in the high-end gaming market. Wait for Navi.
Great review here. Probably one of the best. He was the only one I've seen able to get the card OC'd, too. Great gaming results, superior production capabilities, and great temps/wattage. The truth is, if you undervolt the Radeon VII slightly, you can easily boost clocks up over 200MHz, and you can boost the memory as well. Of course, by now many tech guys should know that the first thing you do on an AMD card is raise the power limit to max. I think we will see some interesting results from the Radeon VII once everyone gets the memo on how to use the card properly.
Another solid review with a different config.
The way the Radeon VII is designed, with its sensors, at default it pushes the fan speed up quite a bit. AMD is aware of that and will offer a quiet mode option, but undervolting the Vcore not only allows an OC on the Radeon VII, it also brings lower temps, lower decibels, and of course lower fan speeds. Some updated/public drivers should sort out a few issues with Afterburner, etc.
No. It's priced right at where it's supposed to be. Its performance is on par with the 2080, and it's priced at that.

Why do you.... type sentences...like this....
And why are you defending this card? It’s overpriced to the extreme for what it is.
> No. It's priced right at where it's supposed to be. Its performance is on par with the 2080, and it's priced at that.

...except it doesn't have half the features the 2080 has.
That's not to say it's a great deal by any means, but it's not overpriced.
> Your graphs show a memory bandwidth reduction, but not a memory usage reduction! RE2 is a game made with current-gen consoles in mind, and look at its VRAM usage on PC in 4K: around 9GB!

Did you check with another tool? Because it didn't use 9GB of VRAM; the in-game tool's math is wrong. It is broken.
Next-gen consoles will have at least twice as much memory as current-gen consoles, so if 8GB of VRAM can be an issue already, then you can bet next-gen consoles will push VRAM requirements to a new level.
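A rough, heavily hedged budget sketch of how a shared console pool might split. The OS-reserve and CPU-side figures below are illustrative assumptions, not published specs, but they show why a 16GB shared pool is often read as pressure on 8GB cards:

```python
def gpu_budget_gb(total_shared_gb: float, os_reserve_gb: float, cpu_side_gb: float) -> float:
    """What's left of a console's shared memory pool for GPU-style data."""
    return total_shared_gb - os_reserve_gb - cpu_side_gb

# e.g. a 16 GB shared pool with an assumed ~2.5 GB OS reserve and ~4.5 GB of
# CPU-side game data leaves ~9 GB for GPU-resident assets -- uncomfortably
# close to an 8 GB card's limit, which is the worry voiced in this thread.
print(gpu_budget_gb(16, 2.5, 4.5))  # -> 9.0
```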
The 2080, unlike the Radeon VII, has the very interesting RTX feature (and DLSS), but RTX increases VRAM requirements even more. Battlefield V uses around 5-6GB of VRAM without RTX, and 8GB with RTX. Basically, 8GB of VRAM is not enough for a high-end card with RTX on top of that; the 2080 has 8GB, the same amount of memory as the much cheaper and older RX 580! But I must say, what NV has done is very clever, because now people will buy the 2080 anyway, and 1-2 years later they will be forced to buy a new GPU because of VRAM issues.
> ...except it doesn't have half the features the 2080 has.

And the 2080 doesn't possess the non-gaming features that could be provided by 16GB of HBM2.
> So if I'm in the market for a high end GPU for 4K gaming, but can't afford a 2080 Ti, am I right in thinking it goes something like this in terms of best price/performance: used 1080 Ti >>>> Radeon VII >>>> 2080? Isn't Navi going to be aimed more at the mid-range market though?

Navi is the next-generation architecture. Rumors are that the first Navi cards will be midrange, but high-end will come later this year or in early 2020.