
Radeon VII Review Thread....

Makariel

Member
I think this whole discussion gets more relevant once it's actually possible to buy that bloody thing. Here it's pretty much sold out everywhere I looked, and the few places with stock are charging more than an RTX 2080.
 

pawel86ck

Banned
Did you check with another tool? Because it didn't use 9GB of VRAM... the in-game tool's maths are wrong... it is broken.
We are not talking about ingame vram usage info, that's real vram usage, and it can go up even higher than that at least in 4K.
 

ethomaz

Banned
We are not talking about ingame vram usage info, that's real vram usage, and it can go up even higher than that at least in 4K.
Source?

I tried to find it and there is nothing on Google.

Is there any difference in graphic quality between 4K max settings on a lower-VRAM card versus a higher-VRAM card?
 
Last edited:

shark sandwich

tenuously links anime, pedophile and incels
If I hear another "wait for Navi" I will just go ahead and buy everything Intel and Nvidia, because you can wait forever for new hardware; there is always something new on the horizon with all the promise of summer.

I’ve followed PC gaming since the late 90s and there’s definitely been an “AMD cycle” since Core 2 Duo released in 2006. Goes something like this:

- “it’s not fair to compare their current-gen product with AMD’s last-gen product, wait till AMD’s new one launches”

- rabid fantasizing about how much ass product++ is going to kick

- product launches to underwhelming reviews

- “wait until better drivers/devs optimize for the new hardware/games are optimized for AMD because of next-gen consoles”

- “well, it’s still a better value when you consider the motherboards are cheaper/it comes with free games”

- blame consumers for not buying AMD

- competitor releases new product that has a clear lead

- repeat the cycle




I will say that Zen most definitely broke the cycle on the CPU side of things. If Navi does to the GPU market what Zen did to CPUs (aka almost-as-good performance at a much lower price) then we’re in for a treat. But Radeon VII definitely fits into the cycle I mentioned.
 

pawel86ck

Banned
Source?

I tried to find it and there is nothing on Google.

Is there any difference in graphic quality between 4K max settings on a lower-VRAM card versus a higher-VRAM card?
I have linked you a screenshot with REAL VRAM usage, so what more do you want to see? :)

Here's the entire video, in case you want to see it.


The in-game options suggest RE2 can allocate even 13.69 GB, but that's of course an estimate. The in-game VRAM readout showing 9GB in use is a real number though, and that's still an insane amount of VRAM no matter how you look at it.

Besides the RE2 remake there are also other VRAM-hungry games.
The Division 2 in 4K: around 9GB VRAM usage




COD WW2 - around 10GB in 4K


Final Fantasy: around 10GB VRAM usage at just 1440p :p


And keep in mind, these are games made with current-gen consoles in mind :), just running at a much higher resolution.
 
Last edited:
And the 2080 doesn't possess the non-gaming features that could be provided by 16GB HBM2.

Most importantly, those RTX features are essentially meaningless if the performance and support is non-existent.

As of right now, after close to 6 months on the market, Battlefield is still the only game with support for ray tracing.
Yes and DLSS is and will forever be useless right?
 
Based on the reviews, this has been a disaster. It's hard to fathom how stupid AMD was to release these out to review with the drivers being in such poor shape.

Despite that, the performance in a lot of cases is decent, but not exceptional.

The thing that surprised me is how good they made the 1080 Ti look. It's too bad everyone missed out when they could be had for $600 - $650.
I didn't. :messenger_sunglasses:
 
Exactly what I was predicting.

Same price as RTX 2080, louder, hotter, higher power consumption, worse performance in most games, and none of the RTX features.

Might be good for “prosumers” but there’s very little reason for a gamer to choose this card.


This is clearly a stopgap product just so AMD can at least show their face in the high-end gaming market. Wait for Navi.

You do get 3 new games with it. Just saying.

Performance in DX12/new games seems on par with 2080 or occasionally faster.
 

thelastword

Banned
Yes and DLSS is and will forever be useless right?
Yes, you are asking folk to consider RTX for a feature that's pretty much useless and missing in all games except BFV, with no appreciable visual or performance gains..... As opposed to that, people have been benefitting since yesterday from Radeon 7's extra 8GB and roughly double the bandwidth, in games at 4K with less stutter; you can also go crazy on Radeon with AA and supersampling at 4K, and productivity sees an immense boost over the competition.

I just think people are vastly unfair to AMD, and it's one of the reasons why Nvidia has monopolized and crippled this industry for so long. You even have people saying this card should be $600 when Radeon 7 is much more expensive to produce than the RTX 2080, and yet the FE card which competes with the Radeon is $800.00. Yet no one is asking NV to make the RTX 2080 $400, which it should be; they want AMD to sell an expensive card at a severe loss so NV would lower their price, and of course they could then run and buy Nvidia.

AMD is in the business to make money, not to lower the price of NV products for Nvidia fans. At this point, such persons just want AMD to sell at a severe loss so they can go under...... Yet, that's not going to happen, hence the reason NV fans have been so vocal before and after this launch. Even Jensen Huang is shook and hearing the impending bells, especially after yesterday's news broke of investors pulling out. I assure you this, NV's dominance is not for too much longer. The industry will never move forward with RTX unless AMD is on board; for now the feature is just not ready.... I'll tell you this, never encourage a monopoly or bad practice in any way in any industry, it never ends well, it will always crash and burn......
Is it weird that the Radeon VII works better on the AMD Threadripper than on the Intel i7 8700K?
No doubt, I think Radeon 7 and Navi are gonna sing with Ryzen 3000 performance.........
 
The problem with AMD is their driver team must be seriously underfunded compared to Nvidya. You see this in Battlefield 5 benchmarks where the Radeon 7 is 8% faster at 4K.

Then you look at 'smaller', unoptimized games like Dragon Quest XI where it's a massive 30%+ slower!
 

Barakov

Gold Member
I had hopes for this but I think it's time to hop back on the Nvidia train. Isn't Intel releasing a graphics solution this year or did I dream that up?
 

thelastword

Banned
The problem with AMD is their driver team must be seriously underfunded compared to Nvidya. You see this in Battlefield 5 benchmarks where the Radeon 7 is 8% faster at 4K.

Then you look at 'smaller', unoptimized games like Dragon Quest XI where it's a massive 30%+ slower!
I think they've invested in the driver/software team heavily recently; their drivers are pretty good tbh, but every product will have some issues at launch...Turing had more issues: space-invaders corruption, pixelated and heavy dot-pitch screens, dying cards etc......I don't think Radeon 7's issues are too bad in contrast.....Some persons were even able to get the card overclocked and working nicely with cool temps and lower dB......OptimumTech knows what's up......

Yes, but like anything else, not just for this launch, some AMD results are strange considering the power of the cards.....but I can chalk it up to DX11 and Nvidia-sponsored titles, and even older games like Crysis 3 and AC which seem to favor NV, yet it seems AMD has improved its drivers over the years to make some of these titles run better on their cards......However, I look at Fortnite and there's no way this should be running much better on Nvidia, despite great framerates everywhere....I look at Overwatch and it's the same thing......GTA 5 is a strange one, but even that has improved on AMD hardware over the years....I look at CS:GO, and NV used to be way ahead on that, but now it's pretty close...Do you remember Kingdom Come: Deliverance's performance on AMD hardware at launch? It was pitiful, but now AMD hardware excels over NV in that game......PUBG on AMD hardware was awful at launch, but that has improved tremendously and they're about par now.....Witcher 3 was much better on NV as well, but AMD made good strides....

So there have been great improvements, but like anything else, there will be some weird odd cases.....I still remember RX 580 benches at launch, and now it's a different story entirely..... Yet, if Radeon 7 is already doing so well now, it bodes well for a couple of weeks in......

Yet another thing people overlook is that you get 3 new games with this card, not old titles; two have not even launched yet, and they all look to be great games so far.....all at a value of $180.00.......I think it's super great value overall, but AMD needs to stock up this card; they had to know that people would lap this up for what it offers.....
 
I'm glad this card's a flop. AMD have done nothing to lead in this industry recently... they're following Nvidia's lead, trying to keep up. Essentially just riding their coattails, trying to match any features or technology Nvidia comes up with, months later, as best they can with open-source initiatives, and playing the "good guy vs bad guy" shtick to try to make Nvidia look bad. They follow this pattern every time. Which is exactly why people view them as competition only insofar as they lower Nvidia's pricing.

Worse performance, worse thermals, worse acoustics, and worse power efficiency.. even on a new process node which is dramatically smaller and more efficient. Terrible issues with drivers not working, overclocking being busted, black screens, constant freezing, BSODs. This is why AMD cannot compete simply by being close in performance... AMD needs to DOMINATE in performance to make any difference... and it has to do so at an intriguing price point. The best thing about the card is that it has a simple name. Radeon VII. Sounds nice. (oh and the 3-game bundle of RE2, DMC5, and Division 2 is probably the best value bundle I've ever seen for a GPU)

This card.. simply isn't what AMD needs. Of course AMD knows that, as this card wasn't ever meant for massive consumer adoption. AMD needs Navi to be something special. They need another "9700pro" GPU architecture that is undeniably the king.
 
Seems the Radeon 7 is simply a case of salvaging leftover parts. The pricing and composition make no competitive sense, as does the limited run (5,000 units). If they wanted to crush the 2080 on price and performance, all they had to do was not use the super-expensive memory, which comprises 60% of the GPU production cost. They could then have added additional cores and turned it into a monster for less than $500. So this is a nothingburger GPU, a way of trying to sell unsold HBM modules.
 
Last edited:
AMD doesn't use HBM because they hate money or something. The reason AMD uses HBM, and this is something that I feel no one realizes, is because HBM requires much less power than GDDR. They have to do this because their GPU cores suck down so much power. They basically have no choice unless they want the video card to consume 500W and require liquid nitrogen to cool properly. The reason Nvidia has been able to continuously get away with using GDDR is because their GPU cores use so much less power, which gives them the power budget to use memory that consumes more power.
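The power-budget argument above can be put in rough numbers. A sketch of the arithmetic — the board power is Radeon VII's rated figure, but the memory wattages are ballpark public estimates for ~1 TB/s of bandwidth, not official specs:

```python
# Rough power-budget arithmetic. Memory wattages below are illustrative
# estimates, not official figures from AMD or Nvidia.
def core_power_budget(board_power_w: float, memory_power_w: float) -> float:
    """Watts left for the GPU core after the memory subsystem takes its share."""
    return board_power_w - memory_power_w

BOARD_POWER_W = 300  # Radeon VII's rated board power

hbm2_core = core_power_budget(BOARD_POWER_W, memory_power_w=25)  # HBM2 estimate
gddr_core = core_power_budget(BOARD_POWER_W, memory_power_w=70)  # GDDR estimate

# At the same board power, the lower-power HBM2 leaves ~45W more headroom
# for a power-hungry core.
print(hbm2_core, gddr_core)  # 275 230
```

Under those assumed numbers, picking HBM2 is less about loving expensive memory and more about buying back watts for the core.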
 
Last edited:

thelastword

Banned
Optimum Tech showing how it's done.......He speaks about any issues he had, then he goes on to show how he got the Radeon 7 overclocked.......He showed how he was able to get the card silent at 1500 rpm on the fan, how he was able to get wattage down below RTX 2080 levels, etc......Also, the OC brought a 9% average FPS gain in Far Cry 5, so it's not just 1-2 fps (non-average)...

 

thelastword

Banned
I think it's also important to update folk on FP64 computing on Radeon 7......I guess some are quick to report anything they deem bad news, and try to obfuscate any good news or updates.....So I know some posters made a big issue about FP64 performance on Radeon VII a while ago........Yet this is what we got for launch....


"The Radeon VII graphics card was created for gamers and creators, enthusiasts and early adopters. Given the broader market Radeon VII is targeting, we were considering different levels of FP64 performance. We previously communicated that Radeon VII provides 0.88 TFLOPS (DP = 1/16 SP). However based on customer interest and feedback we wanted to let you know that we have decided to increase double precision compute performance to 3.46 TFLOPS (DP = 1/4 SP).

If you looked at FP64 performance in your testing, you may have seen this performance increase as the VBIOS and press drivers we shared with reviewers were pre-release test drivers that had these values already set. In addition, we have updated other numbers to reflect the achievable peak frequency in calculating Radeon VII performance as noted in the [charts]."

https://www.anandtech.com/show/13923/the-amd-radeon-vii-review/3

FP 64 comparison/List;

Radeon 7 = 3.46 TFlops
Radeon 7 (prior to launch) = 880 GFlops
Vega 64 = 786.4 GFlops
Vega 56 = 660.4 GFlops
RX590 = 445 GFlops
RX 580 = 385.9 GFlops
RTX Titan = 509.8 GFlops
RTX2080 Ti = 420.2 GFlops
RTX 2080 = 314.6 GFlops
RTX 2070 = 233.3 GFlops
RTX 2060 = 201.6 GFlops
Titan X Pascal= 342.9 GFlops
GTX 1080 Ti = 354.4 GFlops

https://www.techpowerup.com/gpu-specs/
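For reference, the headline TFLOPS figures in that list fall out of simple arithmetic: peak single-precision throughput is 2 FLOPs (one fused multiply-add) per shader per clock, and double-precision is SP scaled by the hardware's DP ratio. A quick sketch, taking Radeon VII's 3840 shaders at an assumed ~1.8 GHz boost:

```python
def peak_sp_tflops(shaders: int, clock_ghz: float) -> float:
    # 2 FLOPs per shader per clock (one fused multiply-add)
    return 2 * shaders * clock_ghz / 1000

def peak_dp_tflops(sp_tflops: float, dp_ratio: float) -> float:
    # DP throughput is SP scaled by the hardware ratio (1/4, 1/16, ...)
    return sp_tflops * dp_ratio

sp = peak_sp_tflops(3840, 1.8)       # Radeon VII: 3840 shaders, ~1.8 GHz boost
dp_new = peak_dp_tflops(sp, 1 / 4)   # launch VBIOS: DP = 1/4 SP
dp_old = peak_dp_tflops(sp, 1 / 16)  # as originally announced: DP = 1/16 SP

print(round(sp, 2), round(dp_new, 2), round(dp_old, 2))  # 13.82 3.46 0.86
```

That lands on AMD's corrected 3.46 TFLOPS figure; the earlier 0.88 number implies a slightly higher clock in AMD's own calculation.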
 

CuNi

Member
I think it's also important to update folk on FP64 computing on Radeon 7......I guess some are quick to report anything they deem bad news, and try to obfuscate any good news or updates.....So I know some posters made a big issue about FP64 performance on Radeon VII a while ago........Yet this is what we got for launch....


"The Radeon VII graphics card was created for gamers and creators, enthusiasts and early adopters. Given the broader market Radeon VII is targeting, we were considering different levels of FP64 performance. We previously communicated that Radeon VII provides 0.88 TFLOPS (DP = 1/16 SP). However based on customer interest and feedback we wanted to let you know that we have decided to increase double precision compute performance to 3.46 TFLOPS (DP = 1/4 SP).

If you looked at FP64 performance in your testing, you may have seen this performance increase as the VBIOS and press drivers we shared with reviewers were pre-release test drivers that had these values already set. In addition, we have updated other numbers to reflect the achievable peak frequency in calculating Radeon VII performance as noted in the [charts]."

https://www.anandtech.com/show/13923/the-amd-radeon-vii-review/3

FP 64 comparison/List;

Radeon 7 = 3.46 TFlops
Radeon 7 (prior to launch) = 880 GFlops
Vega 64 = 786.4 GFlops
Vega 56 = 660.4 GFlops
RX590 = 445 GFlops
RX 580 = 385.9 GFlops
RTX Titan = 509.8 GFlops
RTX2080 Ti = 420.2 GFlops
RTX 2080 = 314.6 GFlops
RTX 2070 = 233.3 GFlops
RTX 2060 = 201.6 GFlops
Titan X Pascal= 342.9 GFlops
GTX 1080 Ti = 354.4 GFlops

https://www.techpowerup.com/gpu-specs/

Which just goes to show that it's a prosumer card and not 100% dedicated to gaming. You simply don't need this feature in gaming. So them putting in the silicon to support FP64 to this extent just shows that it's either a cut-down workstation GPU or it aims to be an entry-level workstation GPU. That actually counts against it being a good gaming GPU.
 
Last edited:

CrustyBritches

Gold Member
Did you check with another tool? Because it didn't use 9GB of VRAM... the in-game tool's maths are wrong... it is broken.
Digital Foundry specifically mentioned that RE2 seems to fill whatever VRAM is available, but doesn't suffer stuttering even at 4K max on a GTX 1060 6GB, suggesting that it's not VRAM-bound.
 

ethomaz

Banned
Digital Foundry specifically mentioned that RE2 seems to fill whatever VRAM is available, but doesn't suffer stuttering even at 4K max on a GTX 1060 6GB, suggesting that it's not VRAM-bound.
That means it is only allocating all VRAM and not using all of it.

That is why I'm asking if there is any tool that really shows how much a game uses. Not allocation only.

Another way is to use maxed settings on an 8GB card vs a 16GB one and see if there is any difference in image quality.

I will be surprised if 8GB is not enough for any game... even RE2R.

People need to stop believing these VRAM numbers from tools that only show allocation.

Some tools, like the one used in Killzone profiling, show actual VRAM usage, but I don't know if there is anything like that unless you have the SDK tools to run the code.
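The allocation-vs-usage distinction being drawn here can be shown with a toy model (everything below — the class, the numbers — is made up for illustration; it is not any real engine or tool API): a streamer grabs most of free VRAM up front as a cache, while each frame only touches its working set.

```python
# Toy model of a texture streamer: it allocates most of VRAM as a cache up
# front, but each frame only reads the textures actually visible. An overlay
# that reports allocation sees the big number; the working set is far smaller.
class TextureStreamer:
    def __init__(self, vram_mb: int):
        self.allocated_mb = int(vram_mb * 0.9)  # grab ~90% of VRAM as cache
        self.resident = set()

    def render_frame(self, visible_textures, mb_each: int) -> int:
        self.resident = set(visible_textures)   # only these get sampled
        return len(self.resident) * mb_each     # true per-frame usage, in MB

streamer = TextureStreamer(vram_mb=16384)                # a 16GB card
touched = streamer.render_frame(range(120), mb_each=30)  # 120 visible textures

print(streamer.allocated_mb)  # 14745 -> what an "allocation" readout reports
print(touched)                # 3600  -> what the frame actually touched
```

This is consistent with the Digital Foundry observation quoted above: a game can "fill" whatever VRAM exists without being VRAM-bound.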
 
Last edited:

CrustyBritches

Gold Member
That means it is only allocating all VRAM and not using all of it.

That is why I'm asking if there is any tool that really shows how much a game uses. Not allocation only.

Another way is to use maxed settings on an 8GB card vs a 16GB one and see if there is any difference in image quality.
There are a few vids that break down the difference between texture settings. Going from 'High 8GB' to 'High 4GB' is basically nothing. Digital Foundry went into it a little in the 2nd vid linked below.


Digital Foundry's vid. Despite the warning and the specified texture VRAM requirement, it was only using 6GB of 8GB on an RX 580 8GB in this vid. He explains what he thinks is happening:


Vid I made testing their findings on a GTX 1060 6GB. I didn't notice any stuttering even with an over-12GB VRAM warning:
 

flomp

Banned
Which just goes to show that it's a prosumer card and not 100% dedicated to gaming. You simply don't need this feature in gaming. So them putting in the silicon to support FP64 to this extent just shows that it's either a cut-down workstation GPU or it aims to be an entry-level workstation GPU. That actually counts against it being a good gaming GPU.

I mean, the Radeon 7 is just a renamed Instinct MI50.
 
Radeon Fantasy VII - 33 games benchmarked


Radeon Fantasy VII manages to consistently keep up with the 3-year-old 1080 Ti in most games. A huge win for AMD! (spoiler: /s)

The real GOAT of this really is the 1080 Ti. I've had my 1080 Ti for more than 2 years now and I don't see myself replacing it anytime soon, especially if Ampere is planned for 2020 as rumored. Great video card which ended up having much more longevity than I could ever have anticipated.
 
Last edited:

shark sandwich

tenuously links anime, pedophile and incels
Radeon Fantasy VII - 33 games benchmarked


Radeon Fantasy VII manages to consistently keep up with the 3-year-old 1080 Ti in most games. A huge win for AMD! (spoiler: /s)

The real GOAT of this really is the 1080 Ti. I've had my 1080 Ti for more than 2 years now and I don't see myself replacing it anytime soon, especially if Ampere is planned for 2020 as rumored. Great video card which ended up having much more longevity than I could ever have anticipated.

The 1080 Ti came out in March 2017, so it's only 2 years old.

But yeah that’s crazy to think about. It took AMD 2 full years to finally come out with something that can match the 1080 Ti. And it launched at the same price $700.

Only reason Radeon VII looks even slightly appealing as a gaming card is that RTX is way overpriced.
 
Here's the reality, AMD matched the 1080 Ti and 2080 while making the card more friendly for prosumers and also gaming applications which require ridiculous amounts of RAM.

The card can be undervolted the same as the 64 and 56 can while consuming much less power, it can be made to perform quietly, it can be run on liquid. Let's not forget these things when trying to go nuts on AMD. What did Nvidia do? They released a redundancy for the 1080 Ti and priced themselves out of their own consumer market by making the 2080 Ti $1,200.

They pushed their flagship consumer card to the cost of Titans, cards that basically no one buys because they're so cost-prohibitive. So for actual normal consumers, all they did was release a 1080 Ti with ray tracing and DLSS, which just about nothing uses and most things probably won't for another two hardware generations.

So how exactly is this a disaster for AMD? Their card is more useful.
 

CuNi

Member
Here's the reality, AMD matched the 1080 Ti and 2080 while making the card more friendly for prosumers and also gaming applications which require ridiculous amounts of RAM.

The card can be undervolted the same as the 64 and 56 can while consuming much less power, it can be made to perform quietly, it can be run on liquid. Let's not forget these things when trying to go nuts on AMD. What did Nvidia do? They released a redundancy for the 1080 Ti and priced themselves out of their own consumer market by making the 2080 Ti $1,200.

They pushed their flagship consumer card to the cost of Titan's, cards that basically no one buys because they're so cost prohibitive. So for actual normal consumers all they did was release a 1080 Ti with Ray Tracing and DLSS which just about nothing uses and most things probably won't for another two hardware generations.

So how exactly is this a disaster for AMD? Their card is more useful.

It's a disaster because, like you said, they only matched a 2-year-old card. But just for the record, Nvidia is not doing any better. Honestly, after the 10XX series, neither one of them released anything meaningful or worthwhile. That's why Nvidia stock is dropping and why people just hope for a slumbering miracle inside AMD R&D. Like I said, truth is neither one of them released any kind of upgrade, but people shit on AMD because they are yet again over a year late to the party. Let's hope Navi won't be another GCN GPU... although I hope it is, just to see the internet melt down from all the salt that will be poured out by AMD fans.
 

thelastword

Banned
Which just goes to show that it's a prosumer card and not 100% dedicated to gaming. You simply don't need this feature in gaming. So them putting in the silicon to support FP64 to this extent just shows that it's either a cut-down workstation GPU or it aims to be an entry-level workstation GPU. That actually counts against it being a good gaming GPU.
Yes, the Radeon VII is a great PROsumer card, and it's a great gaming card too; it's already showing less stutter in games and better frametimes than the RTX 2080....Drivers will only improve Radeon 7's performance........

Look at the RX 580, Vega 56 and Vega 64 at launch and now, versus the GTX 1060, GTX 1070 and GTX 1080 respectively; I've posted many benchmarks showing how these cards are doing now......Besides, 16GB is futureproof for games, and the bandwidth helps at 4K with less stutter on Radeon.....Even the not-yet-relevant RTX uses more memory when it's on......So much so that the 6GB RTX 2060 has to be played with ray tracing on low and textures on low at 1080p for 60fps......Anybody who says VRAM is not important or 16GB is too much is straight bonkers....
 

CrustyBritches

Gold Member
Look at RX580 at launch and now over the GTX 1060...
I'm not disagreeing with your post, but it was the RX 480 that launched against the GTX 1060, not the RX 580. Many of the RX 580 AIBs ship with base clocks near the RX 480's max OC levels. The RX 580 is generally ~7% faster than the RX 480 out of the box.

The games that the RX 480 does better in are better optimized because consoles use AMD/Polaris architecture, which relies more on compute; it has nothing to do with the amount of VRAM. I find the GTX 1060 6GB to be a better card than the RX 480 8GB for gaming.

Anybody who says VRAM is not important or 16GB is too much is straight bonkers....
I'd guess a theoretical 8GB Radeon VII would have a better price/performance ratio than the 16GB model. If I could get the same card with 8GB and save $150 while only losing a few frames and the "promise" of fine wine, I'd go for it easy.

That's what makes this more of a converted PROsumer card than a made-for-gaming card. So in that sense, we agree.
 

CuNi

Member
Anybody who says VRAM is not important or 16GB is too much is straight bonkers....

I guarantee you, we'll be around 8 to 10GB of VRAM for the next 3-4 years.
VRAM is important, but not as important as you think.
People think that if you can fit in 16GB of HD or 4K textures, then games will look super realistic, while completely forgetting that the complete image is what counts.
If a GPU can only render at medium settings for things like shadows, illumination etc., then 4K textures make the scene look even worse than if the textures were a bit muddier.
Just look at Minecraft: do you think this game would look beautiful with 4K realistic textures but no shaders applied? That would look horrendous. But if you take even HD textures and add in shaders, then it already looks 10x better than 4K textures alone. If VRAM were all it took, hell, we'd be seeing 32GB VRAM cards left and right, but guess what... we're not, because they don't solve many issues.

What I'm trying to say is that banking on one aspect of the GPU alone and counting on it to deliver superb performance and future-proofing is naive at best.
People say we'll need a lot of VRAM once next-gen consoles arrive because of the bump in quality we'll see happen to games.
I agree that consoles will push the image quality and looks of games upwards, but that means calculation complexity will go up as well, and VRAM alone won't be the holy grail that circumvents those problems. It will be something you use less and less the lower you have to go with your graphics settings, as crappy lighting and effects coupled with 4K textures just look very odd.
 

thelastword

Banned
I'm not disagreeing with your post, but it was the RX 480 that launched against the GTX 1060, not the RX 580. Many of the RX 580 AIBs ship with base clocks near the RX 480's max OC levels. The RX 580 is generally ~7% faster than the RX 480 out of the box.

The games that the RX 480 does better in are better optimized because consoles use AMD/Polaris architecture, which relies more on compute; it has nothing to do with the amount of VRAM. I find the GTX 1060 6GB to be a better card than the RX 480 8GB for gaming.


I'd guess a theoretical 8GB Radeon VII would have a better price/performance ratio than the 16GB model. If I could get the same card with 8GB and save $150 while only losing a few frames and the "promise" of fine wine, I'd go for it easy.

That's what makes this more of a converted PROsumer card than a made-for-gaming card. So in that sense, we agree.
The RX 580 is essentially the RX 480 with slightly better clocks; the 580 has been AMD's offering in the mid-to-low range for almost 2 years now, and the 480 is no longer in production. I compared the 580 at its launch against the 1060 to now because it's the card being sold by AMD now, and it's been the 1060's competitor for much longer than the 480 ever was.......If you compare the 1060 with GDDR5X against the 580 that launched almost two years ago, I have no problem with that btw.....

Also, you not going for 16GB is fine; others see the value and will go for it......Some of you are pretending that Radeon being a PROsumer card is a bad thing; it's not, many of us don't only play games, and it's the perfect card for us........If you remember, just a few months back the 1080 Ti was all the rage, people said AMD had nothing coming that could touch the 1080 Ti because they heard Navi is only going for the mid-range (which is misguided too), and the press said Vega 7 wasn't coming, when I said it would. Now that Radeon 7 has launched, Hardware Unboxed says that Radeon 7 is 11% faster than the GTX 1080 Ti, and it's doing so with immature drivers; it's also the best PROsumer card out there at $699, you'd probably need to buy a $3000.00 PRO card to match it, and not even the $1200.00 2080 Ti beats it in productivity.....It also has 40% more bandwidth than the 2080 Ti and 5GB more memory, notwithstanding less stutter and better frametimes at 4K than its $700-800 competitor......Again, how is that bad value?

I don't know.......Just look at the negative reactions here; some persons reported the whole of the UK only had 100 units for launch, which clearly wasn't true, as single stores had 100+ units....Then some went on to say "who is this card for?", "they said it was not ready for release", "they say it's only offering 1080 Ti performance 2 years later" (how ironic)......because the 2080 is only offering 1080 Ti performance 2 years later with only 8GB of VRAM as opposed to 11GB two years ago? So do we need less RAM the more we progress? Should we go back to 4GB or 2GB because 8GB is too much or more than enough? There's too much hypocrisy in this industry; just imagine if Nvidia offered 16GB and 1TB/s of bandwidth on a $700.00 card, you would see all the swooning.....It's something the press is obfuscating vehemently, because even NV's $1200.00 card does not offer 16GB of VRAM......So on a value basis, it's not even a question.....And I may have heard wrongly, and I'm willing to concede if I'm wrong here, but didn't Leadbetter say that even the RTX Titan only has a 12GB allocation for some productivity workloads, like in Premiere etc.? I'm pretty sure he said that; a $2500.00 card, talk about value, had this been AMD?..... "ooh boy".....

Yet guess what, the press is talking about reasons to purchase a 2080 over a Radeon 7 because of what????? RTX (in one game after 4 months), DLSS in one game on the coattails of better IQ and performance (which is a lie, with smudged detail/textures in FF and lots of shimmering and aliasing), and worse performance than a PC gamer just doing what he could always do: set your card to 1440p instead of 4K and see that framerate fly up....FYI, 1440p with TAA or any other AA, or no AA, is much superior in performance and in resolved detail over DLSS......No detail is approximated, it's all there natively, IQ is good and perf is better....

This is how the Radeon 7, a $699 card, holds up to a $1200.00-$1300.00 card.....If only Radeon had CrossFire, it would be something else now, wouldn't it....




I guarantee you, we'll be around 8 to 10GB of VRAM for the next 3-4 years.
VRAM is important, but not as important as you think.
People think that if you can fit in 16GB of HD or 4K textures, then games will look super realistic, while completely forgetting that the complete image is what counts.
If a GPU can only render at medium settings for things like shadows, illumination etc., then 4K textures make the scene look even worse than if the textures were a bit muddier.
Just look at Minecraft: do you think this game would look beautiful with 4K realistic textures but no shaders applied? That would look horrendous. But if you take even HD textures and add in shaders, then it already looks 10x better than 4K textures alone. If VRAM were all it took, hell, we'd be seeing 32GB VRAM cards left and right, but guess what... we're not, because they don't solve many issues.

What I'm trying to say is that banking on one aspect of the GPU alone and counting on it to deliver superb performance and future-proofing is naive at best.
People say we'll need a lot of VRAM once next-gen consoles arrive because of the bump in quality we'll see happen to games.
I agree that consoles will push the image quality and looks of games upwards, but that means calculation complexity will go up as well, and VRAM alone won't be the holy grail that circumvents those problems. It will be something you use less and less the lower you have to go with your graphics settings, as crappy lighting and effects coupled with 4K textures just look very odd.
Before this gen started, when people saw the PS4 specs, they said 8GB was overkill... 2GB was more than enough, 4GB was a pipe dream, and 8GB was an insane-asylum-level prediction. Today it's the same: consoles are dropping next year, and I'll tell you something, they will have more than 16GB, and since AMD has the consoles locked up, devs will use that higher VRAM count to the best of their ability and push visuals, fidelity and even higher resolutions on the back of that RAM and better bandwidth... Even RTX (in its current form) is a VRAM fiend, and that's only a low-quality hybrid solution in Turing, only for reflections... So tell me, what happens when they offer raytracing in other parts of the pipeline, like shadows? Why can't we see past our toes?

You know how Nvidia is papering over their lower VRAM counts and the not-so-huge upgrade from Pascal to Turing? They're using VRS (Variable Rate Shading): when you choose 4K, they drop many aspects of the rendering pipeline as low as 1080p to free up performance, and we got to know about that by looking at Wolfenstein with VRS... So Nvidia knows they don't have enough VRAM and they know they don't have enough bandwidth, but who cares, that's Business 101: sell you as little value as possible at the highest price (as long as they can get away with it)... The RTX 2080 is an $800 1080 Ti with only 8GB in 2019, pitched on the coattails of RTX (raytracing = BFV) and DLSS (= FF15) after 4 months... Yet Nvidia knows people will defend this at all costs, and some will even buy it, but it's not enough, because it's really not selling all that well... Huang lying to investors, investors pulling out, a court date imminent... I'm sure someone over at Nvidia HQ is saying "had it not been for these meddling kids, we would have gotten away with it too" (Scooby-Doo reference, btw)... Yet I think people can tell value when they see it, to be frank; people are waking up and smelling the coffee... You're not fooled when someone brews Hawaiian Kona and tries to pass it off as Black Ivory Coffee...
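As a toy sketch of the VRS idea (the rates here are illustrative; real implementations pick a shading rate per screen tile or per draw rather than literally re-rendering at 1080p):

```python
# With variable rate shading, a region shaded at NxM rate runs one
# pixel-shader invocation per NxM block of pixels instead of one per pixel.
def shaded_fraction(rate_x, rate_y):
    return 1.0 / (rate_x * rate_y)

print(shaded_fraction(1, 1))  # 1.0    full rate, every pixel shaded
print(shaded_fraction(2, 2))  # 0.25   a quarter of the shading work
print(shaded_fraction(4, 4))  # 0.0625 the coarsest common rate
```

That is the trade being made: shading cost drops by the block area, at the price of detail in the regions shaded coarsely.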
 

CrustyBritches

Gold Member
The RX 580 is essentially the RX 480 with slightly better clocks; the 580 has been AMD's offering in the low-to-mid range for almost 2 years now, and the 480 is no longer in production. I compared the 580 at its launch against the 1060 to now, because it's the card being sold by AMD now and it's been the 1060's
The RX 480 launched against the GTX 1060 in the 150W and below class. RX 480 ended up being more power hungry than they let on, and the RX 580 even more so, consuming over 200W on some models against the GTX 1060's ~130W power consumption.

I've used both cards since they released and I never experienced fine wine as a result of more memory. Can you point me to a specific example of that?

Anyway, I agree with you that the Radeon VII is a PROsumer card, and I consider the RX 480 to be a great card for AMD and an equal to the GTX 1060. I just wanted to nitpick a bit.
 

iHaunter

Member
Yikes, who would pay the same for less?

Unfortunately, AMD used to be "similar" performance for a "lesser" cost. Now it's similar performance with fewer features for the same cost.
 

ethomaz

Banned
If you keep rebranding and overclocking your chip, eventually you'll be ahead in performance lol

RX 480 -> RX 580 -> RX 590
 
Last edited:

thelastword

Banned
If you keep rebranding and overclocking your chip, eventually you'll be ahead in performance lol

RX 480 -> RX 580 -> RX 590
Well, it's not like Nvidia didn't refresh the 1060 either... Besides, Nvidia GPUs (Pascal) are generally more overclockable than AMD GPUs...

The RX 480 launched against the GTX 1060 in the 150W and below class. RX 480 ended up being more power hungry than they let on, and the RX 580 even more so, consuming over 200W on some models against the GTX 1060's ~130W power consumption.

I've used both cards since they released and I never experienced fine wine as a result of more memory. Can you point me to a specific example of that?

Anyway, I agree with you that the Radeon VII is a PROsumer card, and I consider the RX 480 to be a great card for AMD and an equal to the GTX 1060. I just wanted to nitpick a bit.
FineWine is real... Again, go back to PUBG at launch versus now, Arma 3 on AMD hardware then versus now, Vermintide, Kingdom Come: Deliverance and many more...



It's not only about memory, but memory plays a big role, especially when you're trying to play at higher resolutions and want to avoid or minimize stuttering in games... The point, more than anything, is that the drivers improve performance big time over launch... As you can see in the video, when a game is new AMD can sometimes be behind, more so in the past than now, because Nvidia is more popular and there are more Nvidia-sponsored titles than AMD ones, though thankfully that is slowly changing as AMD holds the console market... Something like Fortnite will always skew benchmarks, even Overwatch; those titles and many older ones just run better on NV, and I don't think AMD will catch up there or even attempt to, but titles not littered with NV profiles, and even many DX11 titles, have seen huge improvements on AMD over time, with less stuttering during gameplay...

More memory is always better....

 
I just ordered an AMD.....580. Ok i mainly did it for the 2 free games. XFX better not let me down.
The RX 580 is a pretty good bang-for-your-buck card. I got mine a few months ago and plan to wait until AMD releases Arcturus, when they'll finally break away from GCN.
 

CrustyBritches

Gold Member
Finewine is real......
I feel that was tied more to Nvidia sabotaging their older hardware while AMD had poorer initial drivers that they eventually worked out.

I've had both the RX 480 and the GTX 1060 since launch. I've seen nothing to indicate fine wine, and you're relying on a later-released, heavily overclocked model as your example. I guess if somebody decided to buy the card twice they got the actual wine, right?

It's not only about memory, but memory plays a big role, especially when trying to play at higher resolutions and when you want to avoid or minimize stuttering in games...
I've made previous posts on how erroneous RE2's VRAM reporting is. It's a poor example anyway since the half-memory models perform so closely in this game and the frame gap can largely be explained by the lower memory clocks on the half-sized models.

RE2 is just an example of a game built for AMD-style hardware. It's a nice win along with stuff like DOOM, etc.
 

MadYarpen

Member
I just ordered an AMD.....580. Ok i mainly did it for the 2 free games. XFX better not let me down.
I have been using an RX 580 from Sapphire for a month already. It is a fine card IMO, though it gets quite warm under the desk when it's under stress haha.

But it took only a month for the drivers to do some funny things... I see the error I am getting is rather common. Maybe a clean install with the next update will help.

Anyway, I feel this is a good choice for a year of 1080p gaming, to see what the market will look like when Navi is here.
 
Last edited: