
AMD Radeon VII Announced

LordOfChaos

Member
Sure, but one has an excess of bandwidth which will come in handy for 4K
I reckon the VII and 2080 are about equal; I'm just saying it's a better investment for 4K gaming due to its bandwidth and, more importantly, the extra memory giving it longer legs


Bandwidth is only as good as your ability to use it. It still has 64 ROPs. Needs more testing before we can make either claim.
 

gspat

Member
It will be interesting to see the benchmarks come out in a couple weeks.

4K textures are huge. Being able to keep them in memory will really be a plus for this card.

I almost jumped at a 2060, but I can wait a couple weeks to see if it's worth it.
 

SonGoku

Member
Since when is a resource that is unused a good investment?

So if I slap on 32 GB of VRAM, by your logic, it should sell like hotcakes.
There are already games that go over 8 GB at 4K; Resident Evil is said to use 12 GB
Next gen is around the corner as well, which will raise the ceiling and make it all the more likely for games to take advantage of more than 8 GB of VRAM
So if I slap on 32 GB of VRAM, by your logic, it should sell like hotcakes.
No, that's not what I said
It would be more like having two cards that perform the same and cost the same, with one of them having more memory. The one with more memory is a better investment for 4K gaming because it has more longevity.
 
Last edited:

ethomaz

Banned
There are already games that go over 8 GB at 4K; Resident Evil is said to use 12 GB
The demo is broken... it shows 12-13 GB but it is actually using less than 7 GB.

It reports 13.5 GB even on video cards with less VRAM than that.
 
Last edited:

SonGoku

Member
The demo is broken... it shows 12-13 GB but it is actually using less than 7 GB.

It reports 13.5 GB even on video cards with less VRAM than that.
Are the requirements over 8 GB? Nevertheless, there are games that take advantage of more than 8 GB of VRAM
 

Leonidas

Member
Buying a 16 GB card with ~2080-level performance (Radeon VII) is like buying an 8 GB card with ~GTX 970-level performance (R9 390).
Not a good investment. Radeon VII will not be a viable 4K/high-detail card in a year or two. RTX 2080 might be, with DLSS (+30% performance boost).
 

LordOfChaos

Member
It's still in excess; that's a good thing, no? Even if you can't use it efficiently

If I have 1 TB/s of memory bandwidth but a chip that is only able to put a 600 GB/s load on it, the excess isn't just inefficient, it's unused. I don't know that that's the case here, but if this is still Vega unchanged but for the shrink, as AMD says it is, and it still has 64 ROPs like the Vega 64 did, it remains to be seen how much of that extra bandwidth it benefits from. I don't know whether the Vega 64 hit its limit in memory or ROPs first; that's why I want to see this tested before anyone says it is or isn't a better 4K card.
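To put rough numbers on what I mean, here's the kind of back-of-envelope I'm thinking of. Every figure in it is an assumption for illustration (the ~1.8 GHz clock, RGBA8 writes, one pixel per ROP per clock), and it ignores texture traffic, compression and everything else that also eats bandwidth, so treat it as a sketch rather than a spec:

```python
# Back-of-envelope: how much memory bandwidth can 64 ROPs alone demand?
# All numbers are illustrative assumptions, not measured figures.

rops = 64                  # ROP count shared by Vega 64 and Radeon VII
clock_ghz = 1.8            # assumed boost clock in GHz
bytes_per_pixel = 4        # RGBA8 colour writes

# Peak colour-write traffic if every ROP retires one pixel per clock
write_gb_s = rops * clock_ghz * bytes_per_pixel        # ~461 GB/s
# Alpha blending also reads the destination, roughly doubling the traffic
blend_gb_s = write_gb_s * 2                            # ~922 GB/s

print(f"pure colour writes: {write_gb_s:.0f} GB/s")
print(f"with blending:      {blend_gb_s:.0f} GB/s of the ~1024 GB/s available")
```

The point isn't the exact numbers, just that the ROP side caps how much of the 1 TB/s a pure fill-rate workload can actually touch.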
 

Ascend

Member
Buying a 16 GB card with ~2080-level performance (Radeon VII) is like buying an 8 GB card with ~GTX 970-level performance (R9 390).
Not a good investment. Radeon VII will not be a viable 4K/high-detail card in a year or two. RTX 2080 might be, with DLSS (+30% performance boost).
Extremely bad example...

 

SonGoku

Member
If I have 1 TB/s of memory bandwidth but a chip that is only able to put a 600 GB/s load on it, the excess isn't just inefficient, it's unused. I don't know that that's the case here, but if this is still Vega unchanged but for the shrink, as AMD says it is, and it still has 64 ROPs like the Vega 64 did, it remains to be seen how much of that extra bandwidth it benefits from. I don't know whether the Vega 64 hit its limit in memory or ROPs first; that's why I want to see this tested before anyone says it is or isn't a better 4K card.
If anything, the limit would be much closer to 1 TB/s, something like 850 GB/s to 900 GB/s.
It doesn't make sense to spend resources on 1 TB/s if all they could take advantage of is 600 GB/s, which is barely over half.
Not a good investment. Radeon VII will not be a viable 4K/high-detail card in a year or two. RTX 2080 might be, with DLSS (+30% performance boost).
It will be better than a 2080 at 4K gaming; that alone makes it a good investment.
A bit hypocritical to pimp DLSS when there are already more released games that take advantage of more than 8 GB of VRAM than there are games supporting DLSS.

DLSS is nothing but a meme atm; even RTX is getting more support than it. For all we know, DLSS will never take off.
 
Last edited:

CuNi

Member
I was with you until this...

Using that logic, RTX and DLSS are just as piss poor "investments".

No, because RTX and DLSS are options you can choose to use or not. But it's not up to the user to decide how much VRAM a game is going to use. You have an active choice with RTX and DLSS, but you don't have that choice with VRAM. If games keep being optimized, they're likely not going to need VRAM well above 8 GB. Even when a game could technically take up 10 GB, you will still be fine with 8 GB without a noticeable performance hit.
 

SonGoku

Member
No, because RTX and DLSS are options you can choose to use or not. But it's not up to the user to decide how much VRAM a game is going to use. You have an active choice with RTX and DLSS, but you don't have that choice with VRAM. If games keep being optimized, they're likely not going to need VRAM well above 8 GB. Even when a game could technically take up 10 GB, you will still be fine with 8 GB without a noticeable performance hit.
There are more released games that take advantage of more than 8 GB of VRAM than there are games supporting RTX, and DLSS support is non-existent atm. That's where your argument falls apart.
You are also ignoring that we are just about to enter a new gen where the minimum and recommended requirements will catapult to new heights. It's pretty much guaranteed you will be seeing games taking advantage of more than 8 GB of VRAM in the future.

If both cards are on par, with one having extra memory, that alone makes it a better investment for 4K gaming.
 

SonGoku

Member
Which games?
Rise of the Tomb Raider, Shadow of Mordor, Black Ops III, FFXV, Watch Dogs 2 and RE2 Remake (pending release of the full game)
There are more examples of games using 7 GB+ of VRAM at 4K, and those are current-gen games; with next gen around the corner, 8 GB+ games will be all the more common.
 
Last edited:

Shotpun

Member
Which games?

There have been games that can gobble up more than 8 GB of VRAM, but I haven't yet seen any that actually need more than 8 GB to perform smoothly without issues. If there are any, I'd like to know what they are.

I'm sure there will be (if there already aren't) some special extra-super-overmax graphical settings at some point that really do need over 8 GB of VRAM, intended for use only with the most powerful GPUs, but I can't see the norm being over 8 GB anytime soon; only after the majority of 4K users have more than 8 GB.
 

ethomaz

Banned
Rise of the Tomb Raider, Shadow of Mordor, Black Ops III, FFXV, Watch Dogs 2 and RE2 Remake (pending release of the full game)
There are more examples of games using 7 GB+ of VRAM at 4K, and those are current-gen games; with next gen around the corner, 8 GB+ games will be all the more common.
None of these uses more than 8 GB of VRAM.

Shadow of the Tomb Raider had a bug that brought it over 8 GB, but that was fixed already.
 
Last edited:

CuNi

Member
Rise of the Tomb Raider, Shadow of Mordor, Black Ops III, FFXV, Watch Dogs 2 and RE2 Remake (pending release of the full game)
There are more examples of games using 7 GB+ of VRAM at 4K, and those are current-gen games; with next gen around the corner, 8 GB+ games will be all the more common.

Just because a game gobbles your VRAM and ends up taking up more than 8 GB doesn't mean it NEEDS more than 8 GB. I have been sitting here with my GTX 970 and its crippled 4 GB of VRAM, playing games on high settings for the last couple of years, while every GPU generation somebody like you proclaimed the rise of VRAM needs and that we should all prepare for higher VRAM needs... and see where that got us so far.

So no, unfortunately it's your argument that falls apart, as those games don't need 8 GB or more.
 

shark sandwich

tenuously links anime, pedophile and incels
This thread is getting ridiculous. Can we all just agree to wait for some independent apples-to-apples benchmarks?
 

JohnnyFootball

GerAlt-Right. Ciriously.
Just because a game gobbles your VRAM and ends up taking up more than 8 GB doesn't mean it NEEDS more than 8 GB. I have been sitting here with my GTX 970 and its crippled 4 GB of VRAM, playing games on high settings for the last couple of years, while every GPU generation somebody like you proclaimed the rise of VRAM needs and that we should all prepare for higher VRAM needs... and see where that got us so far.

So no, unfortunately it's your argument that falls apart, as those games don't need 8 GB or more.



Except that's exactly what it means. That's like saying my car has run out of gas, but that doesn't mean it NEEDS more gas.

Well yeah, it kinda does....
 
Last edited:

Shotpun

Member

Except that's exactly what it means. That's like saying my car has run out of gas, but that doesn't mean it NEEDS more gas.

Well yeah, it kinda does....


Some games can reserve more VRAM than they actually need. Unless you already knew that and just wanted to throw a lil jab?
 

gspat

Member
No, because RTX and DLSS are options you can choose to use or not. But it's not up to the user to decide how much VRAM a game is going to use. You have an active choice with RTX and DLSS, but you don't have that choice with VRAM. If games keep being optimized, they're likely not going to need VRAM well above 8 GB. Even when a game could technically take up 10 GB, you will still be fine with 8 GB without a noticeable performance hit.
I would totally agree with you, except we both know games at 1080p and 1440p won't need the extra memory and many games at 4K won't even come close either ... Until they do ... And they always do.

I liken it to buying a car. You can either buy one with the option of an oversized trunk (RTX/DLSS) or one with an oversized gas tank (Vega VII). Both could be useful, but you'll find you use one offered feature more than the other.

But you buy it to suit your (perceived) needs. Some will want two spare tires and others will want longer driving range.

... And in the end it doesn't matter, because a few years down the road you'll end up buying another car anyways.
 

ethomaz

Banned


Except that's exactly what it means. That's like saying my car has run out of gas, but that doesn't mean it NEEDS more gas.

Well yeah, it kinda does....

Some games run exactly the same on cards with less VRAM... they just reserve more VRAM when more is available.

Even though it shows more VRAM in use, performance and graphics settings don't change when it runs with less VRAM.
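A toy sketch of the behaviour I mean (this is purely my own illustration, not any real engine's code): a texture streamer that opportunistically fills whatever VRAM it's given will report ~16 GB "in use" on a 16 GB card and ~8 GB on an 8 GB card, while doing exactly the same ~5 GB of per-frame work in both cases:

```python
# Toy illustration (not any real engine's streamer): a cache that fills
# whatever VRAM budget it is handed. The reported "usage" tracks the cache
# size, while the amount a frame actually *needs* is only the working set.
from collections import OrderedDict

class TextureCache:
    def __init__(self, vram_budget_gb):
        self.budget = vram_budget_gb
        self.resident = OrderedDict()   # asset -> size in GB, LRU order

    def request(self, asset, size_gb):
        # Already resident: cheap, just mark it recently used
        if asset in self.resident:
            self.resident.move_to_end(asset)
            return
        # Evict least-recently-used assets only when we would exceed the budget
        while sum(self.resident.values()) + size_gb > self.budget and self.resident:
            self.resident.popitem(last=False)
        self.resident[asset] = size_gb

    def reported_usage(self):
        return sum(self.resident.values())

# The same frame workload (a ~5 GB working set) on an 8 GB and a 16 GB card:
for budget in (8, 16):
    cache = TextureCache(budget)
    for frame in range(100):
        for i in range(10):
            cache.request(f"texture_{(frame + i) % 40}", 0.5)
    print(f"{budget} GB card -> tool reports {cache.reported_usage():.1f} GB in use")
```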
 
Last edited:

LordOfChaos

Member
If anything, the limit would be much closer to 1 TB/s, something like 850 GB/s to 900 GB/s.


Based on?

Again, I don't know how much bandwidth this chip can demand; that's why I'm saying wait and see. The 600 I pulled from my behind as an example. With the same ROP setup as Vega 64, we just have to wait and see how much of a bottleneck, if any, this extra bandwidth alleviates. You're claiming the memory bandwidth will make this a better gaming card, but it's entirely possible it's not that useful for games and is there more for the compute side.

Again, let's just see results, we don't know either way right now.
 
Last edited:

thelastword

Banned
Extremely bad example...


More VRAM is better for the best textures/settings and higher resolutions.....I mean who saw this coming...:messenger_smirking:..Such a mysterious thing that...

I did some very quick research and the maximum RE7 reached at 4K with ultra was 8 GB of VRAM

Source: https://www.overclock3d.net/reviews/software/resident_evil_7_biohazard_pc_performance_review/9

As for RE2, it is still a demo as far as I know, so maybe the game is just not well optimized and drivers and patches can fix this
A GPU cannot use more VRAM than it has, so developers have many ways to work around that; you saw what happened in Mirror's Edge Catalyst......Another thing that can happen is that system RAM usage is increased, along with CPU usage, on cards where there is a VRAM constraint.....

FWIW, RE7 has a high ceiling and you can use it if you have it........If you look at a game like FF (though unoptimized and filled with NV performance de-enhancing drugs), you can only really get stutter-free gaming at 4K with a card with lots of VRAM, like the Titan, for example.....

Just look at the link below and see which statement they have bolded in their conclusion, and that's at 1080p.....Yet remember, a card cannot use more memory than it has....That doesn't mean it won't run the game, but a card with more VRAM will run it smoother, handle better settings with more aplomb, and manage higher resolutions with less stutter, better performance, etc....

https://www.tomshardware.com/reviews/resident-evil-biohazard-re7-test,4907-4.html

Also, I don't understand the comments on RE2 from other users; the game uses 8 GB of VRAM on textures alone at 1080p, so how can we conclude it will not need more or cannot use more? If you have an 8 GB card you will have to run RE2 at lower than 8 GB textures to avoid stuttering, especially at higher resolutions.......Plus, RE2 has taken things to a whole new level over RE7 as far as textures, lighting and volumetrics..You will need lots of VRAM and high bandwidth to run this at top spec at 4K........Which again brings us to the point of consoles; that is why VGTech is correct in saying that the consoles are using reconstruction for both RE2 and RE7, surely to save some cycles from that (would-be) resolution imprint........(Barring console settings vs. best PC setting differences as well)....... So uhhh!.. I'm looking forward to hearing Dictator's breakdown of PC vs. console in RE2; it seems they dialed it up a bit more. Yes, we've had our differences, but I think he will do a very good job there. So looking forward to that.....

As for Vram in total.....

"The memory side has seen a major uplift with the card featuring 16 GB of HBM2 VRAM across a 4096-bit wide bus interface. There are four stacks, each of which operates at 256 GB/s, delivering a total of 1 TB/s bandwidth. AMD states that the excess memory is useful for content creation and upcoming titles in 2019 can use up to 11 GB of onboard graphics memory."

https://wccftech.com/amd-radeon-vega-vii-gaming-performance-benchmarks-specs-official/
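For what it's worth, the quoted figures are internally consistent. A quick sanity check (the ~2 Gb/s per-pin HBM2 data rate is my assumption; the stack count and per-stack bandwidth come from the quote above):

```python
# Sanity check of the quoted Radeon VII memory figures.
# The 2.0 Gb/s per-pin data rate is an assumption; the bus width and
# stack count are taken from the quote above.

stacks = 4
bus_width_per_stack_bits = 1024        # 4096-bit total bus / 4 stacks
pin_rate_gbps = 2.0                    # assumed HBM2 data rate per pin

per_stack_gb_s = bus_width_per_stack_bits * pin_rate_gbps / 8   # 256 GB/s
total_gb_s = per_stack_gb_s * stacks                            # 1024 GB/s

print(f"{per_stack_gb_s:.0f} GB/s per stack, {total_gb_s:.0f} GB/s total")
```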

Another exec said that games like The Division, DMC, and RE2 will use the VRAM, especially at 4K, as can be seen already....And of course many other games in 2019. Metro and many other titles are looking like good candidates....
 

ethomaz

Banned
The RE2 demo didn't use 8 GB at 1080p... it is a bug that shows more than it is actually using.

Even at 4K it didn't cross 8 GB of VRAM.

It even shows 20 GB of VRAM usage on an 8 GB card while the game is running fine at 60 fps lol
 
Last edited:

SonGoku

Member
None of these uses more than 8 GB of VRAM.

Shadow of the Tomb Raider had a bug that brought it over 8 GB, but that was fixed already.
https://www.guru3d.com/articles-pag...-graphics-performance-benchmark-review,8.html
https://www.overclock3d.net/reviews/software/watch_dogs_2_pc_performance_review/11
https://www.gamersnexus.net/game-be...ck-ops-iii-vram-consumption-benchmark-titan-x
https://www.geforce.com/whats-new/guides/rise-of-the-tomb-raider-graphics-and-performance-guide

Then we have games like Dishonored 2, which takes 5.5 GB at 1080p; there are more examples like this.
With next gen around the corner, this trend will continue, with games taking upwards of 10 GB of VRAM.
Just because a game gobbles your VRAM and ends up taking up more than 8 GB doesn't mean it NEEDS more than 8 GB. I have been sitting here with my GTX 970 and its crippled 4 GB of VRAM, playing games on high settings for the last couple of years
So no, unfortunately it's your argument that falls apart, as those games don't need 8 GB or more.
1) I never claimed they needed it; I said several games can take advantage of more than 8 GB, and this trend will continue as we enter next gen.
2) The 970 is forced to lower textures in some games or face stutter.
while every GPU generation somebody like you proclaimed the rise of VRAM needs and that we should all prepare for higher VRAM needs... and see where that got us so far.
It has nothing to do with a new GPU generation and everything to do with the next generation of consoles, which will push the bar higher.
If you really expect 8 GB of VRAM to be sufficient for the entirety of next gen, you are in for a rude awakening.

Based on?

Again, I don't know how much bandwidth this chip can demand; that's why I'm saying wait and see. The 600 I pulled from my behind as an example. With the same ROP setup as Vega 64, we just have to wait and see how much of a bottleneck, if any, this extra bandwidth alleviates. You're claiming the memory bandwidth will make this a better gaming card, but it's entirely possible it's not that useful for games and is there more for the compute side.

Again, let's just see results, we don't know either way right now.
Why would they waste resources on 1 TB/s of bandwidth if all they can use is half?
All I'm saying is that if there is wasted bandwidth, it would be much less than half.
 
Last edited:

Leonidas

Member
Extremely bad example...

I never said it didn't make a difference in any game, it's just not a good investment. Buying a potentially slower "high end" card today in hopes that it becomes faster in some games in the future...
 
Last edited:

LordOfChaos

Member
Why would they waste resources on 1 TB/s of bandwidth if all they can use is half?
All I'm saying is that if there is wasted bandwidth, it would be much less than half.

I was only responding, with an example, to you saying that even if the 1 TB/s wasn't efficiently used, it was better for 4K. The 600 was in no way a guesstimate of what this could use; all I was saying is that any bandwidth in excess of what the chip can exert doesn't help it any further towards 4K performance. My concern is about it sharing the exact same ROP config as the Vega 64. This is why I'm waiting to see how much the bandwidth does help.

AMD's own slides show ~25%, while the bandwidth is more than twice as high. Just looking at the bandwidth available has never been a reliable metric for how much something has improved.
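Quick math on that, using Vega 64's commonly quoted ~484 GB/s and the ~25% figure from AMD's slides (both are rough, so treat the result as a ballpark):

```python
# Bandwidth scaling vs. claimed performance scaling, Vega 64 -> Radeon VII.
# 484 GB/s is the commonly quoted Vega 64 spec; ~25% is AMD's own claim.

vega64_bw_gb_s = 484
radeon_vii_bw_gb_s = 1024
claimed_perf_gain = 0.25

bw_ratio = radeon_vii_bw_gb_s / vega64_bw_gb_s      # ~2.1x
print(f"bandwidth: {bw_ratio:.2f}x, claimed performance: {1 + claimed_perf_gain:.2f}x")
# Over twice the bandwidth for roughly a quarter more performance:
# the card clearly wasn't purely bandwidth-limited to begin with.
```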

Classic Jiren and Goku
 
Last edited:

Ascend

Member
I never said it didn't make a difference in any game, it's just not a good investment. Buying a potentially slower card today in hopes that it becomes faster in some games in the future.
That's fair. But if the cards are equal in performance and equal in price, I'd take the one with more VRAM.


On another note... This is slightly relevant, if people are interested in how it happened that the Radeon VII got announced instead of something else (Navi).

 

SonGoku

Member
I was only responding, with an example, to you saying that even if the 1 TB/s wasn't efficiently used, it was better for 4K. The 600 was in no way a guesstimate of what this could use; all I was saying is that any bandwidth in excess of what the chip can exert doesn't help it any further towards 4K performance. My concern is about it sharing the exact same ROP config as the Vega 64. This is why I'm waiting to see how much the bandwidth does help.

AMD's own slides show ~25%, while the bandwidth is more than twice as high. Just looking at the bandwidth available has never been a reliable metric for how much something has improved.
I'm not sure how to word it properly, but what I was trying to say is that the VII has bandwidth to spare, so it won't be a bottleneck in the future, especially for 4K; other components will lag behind first.
Memory-wise, the VII is better equipped to handle future games at 4K than the 2080.*

*Pending third-party benchmark confirmation that both cards have similar performance.
Classic Jiren and Goku
lol true
I never said it didn't make a difference in any game, it's just not a good investment. Buying a potentially slower "high end" card today in hopes that it becomes faster in some games in the future...
What if both cards have similar performance and price but one offers more memory? In that situation, more is better.
 
Last edited:

Leonidas

Member
Yep. Exactly why this 16 GB card will age much better than the RTX 2080.

Give the RVII a few months for drivers to mature and then we can compare the 2080 to it.

That's always the excuse for AMD cards: wait months (years) for driver updates and then it'll be faster. There are already games where the Radeon VII will only hit 28-40 FPS (personal estimate from AMD's numbers, can't be too far off) with maximum settings at 4K; having 16 GB is not going to help it in many cases. And as games get more demanding it's only going to get worse, but I guess you'll be able to turn settings down while making use of 8K textures to have playable frame rates!
 
Last edited:
That's always the excuse for AMD cards: wait months (years) for driver updates and then it'll be faster. There are already games where the Radeon VII will only hit 30-40 FPS with maximum settings at 4K; having 16 GB is not going to help it in certain graphically intense games that are already available. It's going to perform even worse at 4K in the future as games become more demanding.
It's not even out yet. Nvidia slaves should wait a month or so before shouting victory. The RT cards have had a chance for drivers to mature; why not a few months for AMD?
 
Last edited:

Leonidas

Member
It's not even out yet. Nvidia slaves should wait a month or so before shouting victory. The RT cards have had a chance for drivers to mature; why not a few months for AMD?

Doesn't have to be out to have a very good idea of the performance numbers. Turing was a new architecture; Vega VII is still Vega, and they've been working on drivers for years already :messenger_tears_of_joy:
 
Doesn't have to be out to have a very good idea of the performance numbers. Turing was a new architecture; Vega VII is still Vega, and they've been working on drivers for years already :messenger_tears_of_joy:
Yep, don't wait for reviews. Sounds like a corporate slave to me dood.

Turing shares a lot in common with Pascal/Maxwell.
 

Leonidas

Member
Yep, don't wait for reviews. Sounds like a corporate slave to me dood.

Turing shares a lot in common with Pascal/Maxwell.

Sure, it's great to see the reviews for exact results, but it won't change anything. There will 100% be games, when the reviews come out, where the VII is only ~28-40 FPS at 4K max settings, and that's in games that are already out.
 
Last edited:

CuNi

Member
It's not even out yet. Nvidia slaves should wait a month or so before shouting victory. The RT cards have had a chance for drivers to mature; why not a few months for AMD?

So people are "Nvidia slaves" when they don't agree with your opinion. Okay, I see where this discussion is heading.
Why can't people have a normal conversation on GAF anymore when it comes to Consoles or Hardware?
It's not like people wish for AMD to burn and go bankrupt; we genuinely ask and point out that all this card has to set it apart from a 2080 is its larger VRAM, while also mentioning that the VRAM won't give you any real performance advantage.
And even though a new console generation is coming soon, I don't think VRAM requirements will rise as much as people think they will. I bet we will stay at 8 GB easily until 2021. Only then would I say that people need more VRAM.
Games may use up to 10 GB of VRAM, but that doesn't mean they need it to run properly. Like I said before, I can run this game with everything maxed on a 970 at 1080p at 50-60 FPS, even though the game tells me I need 13.4 GB of VRAM.

If games were optimized properly, I bet we could get by with 8 GB until 2022 or even further.
4K screens have a low adoption rate anyway, sitting at roughly 1.5% according to the Steam hardware survey.
 
You're a corporate slave when you're slamming a product that isn't out yet. On PC, where drivers are a thing. Yes, that fits my definition.

Maybe not you, but duck guy is; he knocked AMD because he had a faulty power supply lol.

Yes, we do not know comprehensive numbers for the VII yet, and YES, the RTX 2080 will run into VRAM limitations where the Radeon won't.
 
Last edited:

Leonidas

Member
Maybe not you, but duck guy is; he knocked AMD because he had a faulty power supply lol.

I never would have known I had a faulty power supply if it wasn't for AMD GPU power draw, so that was a positive :messenger_smiling_with_eyes:
I've given AMD credit where credit was due (Ryzen, Vega 56 before Nvidia adaptive sync, cheap 500-series cards with free games).
 
I never would have known I had a faulty power supply if it wasn't for AMD GPU power draw, so that was a positive :messenger_smiling_with_eyes:
I've given AMD credit where credit was due (Ryzen, Vega 56 before Nvidia adaptive sync, cheap 500-series cards with free games).
Vega 56 didn't stop being better than the 1070 because Jensen panicked and unlocked adaptive sync.
 
And he panicked because nobody wants the RTX cards; they'd rather get the 1080 Ti lol. Only slaves defend Nvidia prices and BS features.
 
Last edited:

CuNi

Member
And he panicked because nobody wants the RTX cards; they'd rather get the 1080 Ti lol. Only slaves defend Nvidia prices and BS features.

That's the thing. It's like you think that us going after that card means we'd praise everything Nvidia does.
I can't speak for Leonidas, but I am pretty sure he sees it just as I do.

Of course Nvidia deserves the flak they get for the horrendous prices they ask for their 20XX series. I completely agree. They went nuts.
I set aside like 800-ish bucks because the rumours said that the 80 and 80 Ti would release simultaneously, and I planned on getting the Ti since it mostly (before mining happened) sold in that price range. When they unveiled the Ti for 1.4k € my jaw hit the ground so hard, I think some nearby seismographs picked up the force of it impacting the floor. And I also think that RTX, in its current state, is just not worth mentioning or usable at all.
But all of that together just shows that this card is in an awkward position. It's pitted against a card that has all this RTX and DLSS crap on its chip, and it still only matches it while even having the 7nm advantage.
So while I wouldn't call RTX a must-get feature or advise anyone to go for RTX, the fact that I get such a card in the same price segment just favors the RTX card, because you get the same performance AND the RTX/DLSS features on top for basically free, at an even lower energy consumption. And while your argument that "it's future-proof for 4K" is technically correct, you still forget something else. Like I said, 4K isn't that widely adopted now and I don't see it exploding in popularity anytime soon. It ofc will be on the rise, but I don't think it will be a relevant percentage until well into 2021. Until then, people with 4K setups are the high-end segment anyway, and if games were to need more than 8 GB, they would simply get the then-latest GPU monster for that.

And since you bring up the upcoming console generation and its increase in quality for future games... yes, consoles will up the level of quality and fidelity, but it's not only textures that will go up but probably also model complexity etc. So while this card is well equipped for 4K textures, you would still probably have to take some drawbacks in other areas of a game, and then you'd have to ask yourself again "Was it worth it to buy this 700$ card to play at mixed medium/high but with 4K textures?". Another thing is GameWorks features. Afaik they will (probably artificially) run worse on AMD cards, which pushes the 2080 even further into a better "bang for the buck" position.


tl;dr
I don't think this card is trash, I just think it's in a very weird spot because of its price and expected performance, and most people will opt to simply go with Nvidia for the RTX/DLSS/GameWorks features instead of just having double the VRAM. I mean, people that can spend 700$ on a card now can probably just sell the 80/Ti in 1-2 years for still a decent price and upgrade to the then-4K-capable GPU.
 
Tbh I would never buy the VII either; I'm only arguing that from a performance standpoint it will be superior to the 2080 in time; it's simply a more capable card. Taking power usage out of the equation, imagine these cards in a console; you should know damn well that memory setup and the lack of PC overhead would put it well ahead of Nvidia.

Anyways, I wouldn't buy any current high-end card right now: on the Nvidia side because fuck them, they need a kick in the balls, and they're too expensive. On the Radeon side it's too expensive as well, and there's the power consumption.

And I definitely think it's dumb as hell to upgrade before a new console generation sets the baseline. Just a really shoddy time for high-end PC parts.
 
Tbh I would never buy the VII either; I'm only arguing that from a performance standpoint it will be superior to the 2080 in time; it's simply a more capable card. Taking power usage out of the equation, imagine these cards in a console; you should know damn well that memory setup and the lack of PC overhead would put it well ahead of Nvidia.

Anyways, I wouldn't buy any current high-end card right now: on the Nvidia side because fuck them, they need a kick in the balls, and they're too expensive. On the Radeon side it's too expensive as well, and there's the power consumption.

And I definitely think it's dumb as hell to upgrade before a new console generation sets the baseline. Just a really shoddy time for high-end PC parts.
If the June/July release for Navi is true, then getting the Radeon VII doesn't make sense unless you're a collector or a prosumer. That card is just there to bridge the gap from now to Navi's release so Nvidia isn't the only one offering high-end GPUs.
 