
AMD Radeon VII Announced

DeepEnigma

Gold Member
7nm apu (gpu+cpu) confirmed. But what does 7nm navi have that 7nm vega doesn't?

We don't know yet, but all we hear right now is that it is a different platform not based on GCN.

GCN has hit its peak, thus Navi will be a whole new architecture supposedly expanding beyond that.
 

TeamGhobad

Banned
We don't know yet, but all we hear right now is that it is a different platform not based on GCN.

GCN has hit its peak, thus Navi will be a whole new architecture supposedly expanding beyond that.

are you aware of any benchmark results for Jaguar vs Zen 2?
 

Dontero

Banned
We don't know yet, but all we hear right now is that it is a different platform not based on GCN.

GCN has hit its peak, thus Navi will be a whole new architecture supposedly expanding beyond that.

Latest rumors go that Navi is just polaris+.

AMD is working on a next-gen arch with MS and Sony, and they will release it when the consoles head into production.
So either the consoles release this year and Navi is indeed the new arch, or the consoles release next year and Navi is just Polaris+
 

DeepEnigma

Gold Member
Latest rumors go that Navi is just polaris+.

AMD is working on a next-gen arch with MS and Sony, and they will release it when the consoles head into production.
So either the consoles release this year and Navi is indeed the new arch, or the consoles release next year and Navi is just Polaris+

This is getting confusing because I thought Navi was the next gen architecture from the AMD slides, and the consoles were one and the same with it.
 

thelastword

Banned
Near 300W TDP vs 215W (RTX 2080) TDP is the problem. Doesn't bode well for PS5 and we need to temper our expectations unless they can pull something out of the hat with Navi. Make no mistake, the Vega architecture is a disappointment for perf per watt.
Is this how cards are compared now, via TDP? Forget about twice the VRAM, over double the bandwidth of the 2080, better performance especially at 4K, which I imagine is why you would buy such a card: for 4K gaming or playing at 1080p/1440p+144fps on esports titles.

This is also a perfect card for production work. I think anyone who thinks these cards will not sell are just kidding themselves. There is just insane value here for gamers and content producers.....It hits a healthy gamut of potential customers quite nicely...

OK i take it back...maybe they can compete at the low end but this isn't really competition for the 2080 and where is their card to compete against the 2080 Ti? Oh yeah, probably need to wait another year for that.
Why is this not competition for the 2080FE when it beats that OC card, will be a better card than the 2080FE at 4K with less/no stuttering in games and will deliver less stutter even over the $1200 2080ti at 4K. You will also be less crippled if you want to use lots of AA and if you want to supersample on the Radeon 7 vs the competition.

Also, there's no need to compete with the 2080 Ti. The NV crowd is crying so much about the value of this card, when it has the 2080 FE beat in all metrics. You think they would think differently if AMD went through the pain of custom engineering more CUs onto an old arch, say combined with going to 256 on the ROP count and doubling the memory to 32GB of HBM, so even more bandwidth..... Say they delivered this with a 350W TDP and a price $100 less than a 2080 Ti ($1,100), and beat the 2080 Ti or was on par, the conversation would be the same as it is now..... TDP is the most important thing in the world, let's forget about the net of plusses that Radeon brings, let's concentrate on TDP. Let's forget about the lower price, the much higher VRAM and ROP count, the better performance, the better performance in production, let's concentrate on TDP, because the benefits of feeding the high bandwidth of HBM are null, the smoother gameplay with less stutter is null, it's all about TDP..
 

Dontero

Banned
Hm, mind linking it, unless it was a conjecture from "no PS5 this year"? I hope it's not from nvidia.com/huangSaultyAboutLosingVRRWar, chuckle.

Don't remember the source now, but i think it came from AMD itself: when they published the roadmap, Navi was just degraded to "7nm" and nothing more, where every other milestone was labeled like "next gen memory" etc.

I still think consoles will be released by the end of this year. MS will release first, and Sony this time will be late to the game, because from what i see they are complacent in their market share and PS4 sales.

MS is the only one that can gain right now from releasing a next gen console, especially if their next gen console will be more of an API where Xbox One, Pro and Xbox Two could play all new games, just with varying amounts of detail and different resolutions.
 

SonGoku

Member
Going with this much super fast memory (1 TB/s with a non embedded DRAM is mind boggling) and possibly raising the clock so much you needed a voltage bump would explain the higher TDP: we will need to see how they achieved the high clocks, but if they need to raise the voltage their power increases with the square of V not linearly.

It seems this design is a bit rushed and that we may see another Vega refresh (with a more optimised solution), then Navi semi custom on PS5, then Navi on Desktop...
Is it though? I thought HBM was more power efficient and can reach insane bandwidth without high clocks, and that the bandwidth can be attributed to the number of stacks and the 4096-bit bus.
Memory clock-wise it's not that much higher than old Vega (2Gbps vs 1.89Gbps).

I think the high TDP can be attributed to a number of things: the Vega arch not being power efficient, core clock hitting diminishing returns to squeeze as much performance as possible, early 7nm yields producing chips that don't clock very high. One thing i'm not sure about is ROPs, can those increase power consumption dramatically?
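The voltage point in the quote above can be sketched with the usual back-of-the-envelope dynamic-power rule, P ∝ f·V² (the scaling numbers below are purely illustrative, not actual Vega figures):

```python
# Dynamic power scales roughly with clock frequency and the square of voltage.
def relative_power(freq_scale, volt_scale):
    """Power relative to baseline when clock and voltage are scaled."""
    return freq_scale * volt_scale ** 2

# A 10% clock bump at the same voltage costs ~10% more power...
print(relative_power(1.10, 1.00))  # 1.10
# ...but if that bump also needs 10% more voltage, it costs ~33% more.
print(relative_power(1.10, 1.10))  # ~1.331
```

Which is why a clock increase that also needs a voltage bump gets expensive fast.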

Either way, all of these issues will be solved by the time consoles enter production; most important of all, they will use a new arch
I'm hoping consoles go with HBM at 1TB/s, should net some impressive games and gfx fx designed for the hw
I just wish AMD had something to compete with nVidia at the high levels.
This competes with the RTX 2080. That's high end, just not the highest
7nm apu (gpu+cpu) confirmed. But what does 7nm navi have that 7nm vega doesn't?
Performance per watt, i.e. power efficiency
The biggest change i can think of is them getting past the 64 CU limit present in GCN Vega
Latest rumors go that Navi is just polaris+.
I've only seen one site post that, and it was speculation on the writer's part.
Because the first Navi designs will be midrange, he assumed it will be a Polaris successor. Which in a way it is if we are talking market-wise, but arch-wise it's completely different.
Don't remember source now but i think it came from AMD itself when they published roadmap and Navi was just degraded to "7nm"
Roadmaps change all the time, see Nvidia for example. Based on this roadmap, old rumors pointed to Navi being the last GCN design, but the latest ones claim it's a full break from GCN
We'll see i guess, but based on all the variables i'm more inclined to believe Navi will be post-GCN
 
Last edited:

Alexios

Cores, shaders and BIOS oh my!
It was discussed a couple pages back, but people are reporting the Resident Evil 2 demo can require over 8GB VRAM in 1080p, as it has a setting for 8GB of textures alone or something along those lines. Taking that down to the 4GB textures setting makes the game use about 6GB total, so we can assume it would need at least about 10GB at max settings, possibly more to avoid stuttering in the more complex areas of the full game. It seems like there's a use for over 8GB VRAM already, and not in the far future, even if last-gen benchmarks like GTAV don't *jab*. Of course maybe this is a case of the game using as much as it has available and it's not actually necessary, though there's no indication of that yet. I guess we'll see with more extensive testing, maybe from DF (probably not for the demo but the full game, yeah?).
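For what it's worth, the ~10GB figure follows from simple arithmetic on the reported numbers (the split into texture vs non-texture memory is my assumption):

```python
# Reported: with the 4GB-texture setting, the demo uses ~6GB total.
total_at_4gb_textures = 6
non_texture_overhead = total_at_4gb_textures - 4  # ~2GB for everything else
# The max setting reserves 8GB for textures alone.
estimate_at_max = 8 + non_texture_overhead
print(estimate_at_max)  # ~10GB total at max settings
```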
 
Last edited:

SonGoku

Member
It seems like there's a use for over 8GB VRAM already
Of course there is and this will be more apparent as games are designed for next gen consoles
People claiming 8GB vram is all you'll ever need sound just as ridiculous as those claiming early this gen that 2GB vram will carry you for the rest of the gen.
 

CrustyBritches

Gold Member
Of course there is and this will be more apparent as games are designed for next gen consoles
People claiming 8GB vram is all you'll ever need sound just as ridiculous as those claiming early this gen that 2GB vram will carry you for the rest of the gen.
I don't have a horse in this race, and I agree 8GB is not future proof, but I thought I'd chime in that the R7 265 2GB I have more than handles itself against the PS4. I think DOOM was the only exception I noticed, which was tied to the heavy use of async, and the PS4 having 8 ACEs instead of 2 like the 7850/R7 265. But the Sapphire R7 265 I have can OC to 1125MHz core and blows the PS4 out of the water in games like The Witcher 3, or NBA 2k. There, I white-knighted for my old GPU.:messenger_grinning_sweat:
 

Leonidas

Member
People claiming 8GB vram is all you'll ever need sound just as ridiculous as those claiming early this gen that 2GB vram will carry you for the rest of the gen.

GTX 1060 kills current gen base console and even beats PS4 Pro with only 3 GB. GTX 1050(a 2 GB card) also offers a better gaming experience than the base consoles...
The only people claiming you'll need more than 8 GB next-gen are AMD/console fanboys spreading FUD, since more VRAM is the only advantage they have.
 
Last edited:

SonGoku

Member
I don't have a horse in this race, and I agree 8GB is not future proof, but I'd thought I'd chime in that the R7 265 2GB I have more than handles itself against the PS4. I think DOOM was the only exception I noticed, which was tied to the heavy use of async, and the PS4 having 8 ACEs instead of 2 like the 7850/R7 265. But the Sapphire R7 265 I have can OC to 1125MHz core and blows the PS4 out of the water in games like The Witcher 3, or NBA 2k. There, I white-knighted for my old GPU.:messenger_grinning_sweat:
That's the thing though, that card should outperform PS4 in EVERY SINGLE game, but it's being bottlenecked by memory. You'll have to downgrade some settings compared to PS4
GTX 1060 kills current gen base console and even beats PS4 Pro with only 3 GB. GTX 1050(a 2 GB card) also offers a better gaming experience than the base consoles...
They do, and they should based on specs, but those cards are bottlenecked by memory; in memory intensive games you'll have to lower settings that consume memory compared to PS4
You can have a card with less memory that performs better than one with more. What i'm arguing is that the card with less memory will be bottlenecked and required to lower settings that are memory intensive
The only people claiming you'll need more than 8 GB next-gen are AMD fanboys spreading FUD since it's the only advantage they have over Nvidia with Radeon VII.
Define need
You clearly think you don't need more than 3GB this gen, yet several games will require downgrades because of memory to run properly
Is 8GB enough to run at next gen console settings, is what you are claiming? Because there will be some settings where consoles will be superior if it's memory dependent, just like those cards you listed vs PS4

On this very page we have info of a current gen game requiring 6GB
 
Last edited:

CrustyBritches

Gold Member
That's the thing though that card should outperform PS4 IN EVERY SINGLE game
But according to your assertion, 2GB VRAM is not enough for the gen, so it's the R7 265 that should be bottlenecked. In reality, the only way the PS4 could claim a victory was via its extra ACEs, although it had nearly double the VRAM. So maybe a handful of titles. The R7 265 could almost push 1440p at PS4 settings in The Witcher 3 (~27fps). That's the typical gap a good OC gave you in any game that's not DOOM.
 
Last edited:

SonGoku

Member
RE2? It doesn't require 6 GB. They bundled it with 4 GB cards.
Stop the FUD.
It was just posted on this page that it has the option for 4GB and 6GB textures, how is that FUD? Surely you can run it at lower settings on a 4GB card
The general consensus for this gen is you need 4GB to run games at 1080p without memory bottlenecks
 

Leonidas

Member
It was just posted on this page that it has the option for 4GB and 6GB textures, how is that FUD? Surely you can run it at lower settings on a 4GB card
The general consensus for this gen is you need 4GB to run games at 1080p without memory bottlenecks

An option is not a requirement. Mirror's Edge had an option for high res textures as well. It's not a requirement. GTX 1060 3GB will still look much better than base consoles on RE2 with 1050 2 GB still offering a better overall experience than base consoles.
 

SonGoku

Member
But according to your assertion, 2GB VRAM is not enough for the gen, so it's the R7 265 that should be bottlenecked. In reality, the only way the PS4 could claim a victory was via its extra ACEs, although it had nearly double the VRAM. So maybe a handful of titles. The R7 265 could almost push 1440p at PS4 settings in The Witcher 3 (~27fps). That's the typical gap a good OC gave you in any game that's not DOOM.
Define not enough... I'm not even arguing victory
My argument is that 2GB is not enough because you will run into memory bottlenecks forcing you to downgrade some settings
A more powerful card should outperform PS4; it doesn't change the fact that settings that are memory intensive will require a downgrade

My definition of being enough is running games comfortably at the resolution it targets without running into memory bottlenecks
An option is not a requirement. Mirror's Edge had an option for high res textures as well. It's not a requirement. GTX 1060 3GB will still look much better than base consoles on RE2 with 1050 2 GB still offering a better overall experience than base consoles.
You are arguing on two fronts now: consoles and competing GPUs
There have been games where PS4 had superior textures due to its memory advantage. I'm not arguing PS4 will be superior on all settings, just those that are memory intensive
The other front is a similarly priced card with similar performance but more memory edging out the 2GB/3GB card in memory intensive games

If the card is being bottlenecked by VRAM, then 2GB is limiting and therefore not enough. Many games require 3GB+ to run comfortably at high settings at 1080p
 
Last edited:

CrustyBritches

Gold Member
Define not enough... Im not even arguing victory j
My argument is that 2GB is not enough because you will run into memory bottle necks forcing you to downgrade some settings
Never seemed to happen. Settings never had to be dropped except async-heavy titles. Additionally, the GTX 1060 3GB has less RAM than the PS4, and it outperforms the PS4 Pro and can even match the Xbox One X in some cases.

Anyway, I'll let you have the last word. I just wanted to comment that the R7 265 2GB almost always beats a PS4, and 2GB VRAM was never a problem.
 

SonGoku

Member
Never seemed to happen. Settings never had to be dropped except async-heavy titles. Additionally, the GTX 1060 3GB has less RAM than the PS4, and it outperforms the PS4 Pro and can even match the Xbox One X in some cases.

Anyway, I'll let you have the last word. I just wanted to comment that the R7 265 2GB almost always beats a PS4, and 2GB VRAM was never a problem.
Textures: there have been documented cases where PS4 had better textures.
The only point i'm making is that 2/3GB cards are bottlenecked by memory, not that they perform worse than consoles, in case there was any confusion still...

I don't have a horse in this race either... well i do actually, i want as much RAM as possible for next gen
 

LordOfChaos

Member
Why is this a bad thing? options i mean. If anything the extra ram makes it more future proof for 4k gaming, if i was in the market for one of these i would choose the VII over the 2080

But it's the old arch with even fewer CUs; it's impressive for what it is. I don't think AMD intended to retake the performance crown with this product. Everybody knew 7nm Vega was coming and it performed as expected, i think? Nobody was expecting miracles from an old-ass arch

It's probably a repurposed workstation chip, and it also helps them get a grip on the new process with a familiar arch.

It's not clear... rumors used to say it was, but the latest rumors now claim Navi to be post-GCN


The gains of 7nm as a fab shrink alone were in excess of the performance increase they're claiming. This is because they also disabled CUs for yields here; it's not the fully enabled Pro part. So that's why everyone is so tepid: even expecting a straight shrink, it's a little meh.

As for options...I mean, I guess? But why would I pick it if it only matches the 2080 for the same price, when the 2080 uses less power and has added features like DLSS and RTX? Especially now that Nvidia also added Freesync. More VRAM, ok, that's a point, but as I said nothing is really hurting on the RTX's 8GB either, the 16GB of 60% faster memory seems squarely for people buying it as a baby compute card rather than a high end gaming one. It's a valid option to pick it for future proofing, but my spidey sense says GPU hardware will be radically different by the time the rest is needed by games. Every time I've ever gone for a GPU that had a bit overkill memory capacity, it was the silicon itself that became a bottleneck far before the extra memory became handy, with the one possible exception of Fermi's low capacities at launch.

And then about it being an old architecture and not expecting much - yeah, that's why I'm so whelmed ;)
 
Last edited:

Alexios

Cores, shaders and BIOS oh my!
It was discussed a couple pages back, but people are reporting the Resident Evil 2 demo can require over 8GB VRAM in 1080p, as it has a setting for 8GB of textures alone or something along those lines. Taking that down to the 4GB textures setting makes the game use about 6GB total, so we can assume it would need at least about 10GB at max settings, possibly more to avoid stuttering in the more complex areas of the full game. It seems like there's a use for over 8GB VRAM already, and not in the far future, even if last-gen benchmarks like GTAV don't *jab*. Of course maybe this is a case of the game using as much as it has available and it's not actually necessary, though there's no indication of that yet. I guess we'll see with more extensive testing, maybe from DF (probably not for the demo but the full game, yeah?).
Well, just to add to this, I played the demo a bit but I sucked and didn't progress too much before my time expired. This was my GPU/VRAM usage with max settings (even CA etc) on a GTX 1080. In the menu it said I would need 12GB+ and it could result in errors, but in the end it didn't need that much, though it came close enough to 8GB. Of course the full game or different areas could be a whole different deal.
[attached screenshot: GPU/VRAM usage]

I have to say it doesn't look too impressive and can get aliased despite the TAA+FXAA combo option, but it's fairly pretty and solid. Some things can look pretty grainy, like the self shadowing on character faces i think, and the reflections can look off. Some assets seem to have blurry/muddy textures too, and the lens distortion effect, I have no idea what it's supposed to make better; the game looks quite a bit cleaner and sharper if you disable all the blur/dof/filmgrain/distortion stuff. Edit: I just experimented with the supersampling, I could maybe do solid 30fps @ 200% 1080p. Big maybe though.
 
Last edited:

SonGoku

Member
As for options...I mean, I guess? But why would I pick it if it only matches the 2080 for the same price, when the 2080 uses less power and has added features like DLSS and RTX?
Depends on your target/monitor
For 4K the VII is the better card, more future proof for that resolution
For 1080p/1440p the RTX 2080 is the better card since it can run raytracing at those resolutions

I can understand being underwhelmed but Vega was never meant to steal the show
 

LordOfChaos

Member
Good to know, thanks for the info
I'm relieved actually, thought a 128 ROP config was needed to drive 1TB/s bandwidth

Yeah, I think the bottlenecks in GCN have been pretty misattributed, people looking at ROPs when it was the front end. It's been a 4 tris/clock bottleneck for AMD's top parts since Hawaii. Primitive shaders were supposed to fix that and just never came.
 

thelastword

Banned
16 gigs of memory? Seems like overkill for games.
It will be used.....You don't buy a card just for today. You buy it so it's future-proof, especially if you spend in excess of $500.....

"The memory side has seen a major uplift with the card featuring 16 GB of HBM2 VRAM across a 4096-bit wide bus interface. There are four stacks, each of which operates at 256 GB/s, delivering a total of 1 TB/s bandwidth. AMD states that the excess memory is useful for content creation and upcoming titles in 2019 can use up to 11 GB of onboard graphics memory."

https://wccftech.com/amd-radeon-vega-vii-gaming-performance-benchmarks-specs-official/
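The quoted bandwidth figure is just the four stacks added up:

```python
# HBM2 bandwidth as quoted: four stacks, each delivering 256 GB/s.
stacks = 4
per_stack_gb_s = 256
total_gb_s = stacks * per_stack_gb_s
print(total_gb_s)  # 1024 GB/s, i.e. the advertised ~1 TB/s
```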




Thank You!!!!

Though I will say AMD has been competitive, they have not been able to capture the market or amass sales like Nvidia, but they have always been competitive relative to performance with their cards.....I guess what they need now is to produce a card or a set of GPUs that really takes the world by storm, but they have been warming up... The RX 570, 580, Vega 56 and 64 did really well for them, mining or not......Lots of people bought RX 580s as gaming cards in the last two years...
 

ZywyPL

Banned
AMD is, AGAIN, two years late to the party; that's their major issue, always has been. A 4K60-capable GPU, AWESOME! But people who needed a 4K60 GPU already bought a 1080 Ti two years/a year/half a year ago; the market is already saturated, there's barely any place for this card. But I have to say, AMD's CPU and GPU lineup is shaping up really, REALLY well. For someone who is building a PC from scratch, there's absolutely no reason NOT to go for a full AMD build.
 
That Star Wars ray tracing demo runs at 11 fps on a 1080 Ti and 60fps on a 2080 Ti. Vega 7 isn't that much more powerful than a 1080 Ti.

I wouldn't doubt for a second that even consoles will be capable of some sort of significantly downgraded, simplified version of raytracing next gen, so of course this Vega 7 could even do the same IQ raytracing we see in BF5, but at what crappy framerate? 11 fps like on the 1080 Ti won't cut it.
How about voxel-based global illumination? The current consoles nearly had the specs for that, and it looks killer. If the high voxel optimizations of Euclideon or Atomontage can help with reflections too, it would basically offer similar visual quality at a fraction of the performance cost.
 

ZywyPL

Banned
If Navi really is another GCN card, then Nvidia will easily have the performance lead once more.

Given that AMD doesn't use any specialized cores in their GPUs like NV does, they still have quite a lot of space to increase the die size/core count. There was a rumored Vega 20 with 6144 cores/21TF of computing power being tested. They could even go as far as an 8192 core version at almost 30TF and call it a day. Sadly that HBM memory isn't cheap, to say the least :/
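Those TF figures follow from the standard FP32 formula, FLOPS = cores × 2 (FMA) × clock; the clock speeds below are my guesses, picked to match the rumored numbers:

```python
# Peak FP32 throughput for a GCN-style GPU: 2 FLOPs per core per clock (FMA).
def tflops(cores, clock_ghz):
    return cores * 2 * clock_ghz / 1000.0

print(round(tflops(6144, 1.71), 1))  # ~21.0 TF, the rumored Vega 20 figure
print(round(tflops(8192, 1.80), 1))  # ~29.5 TF, near the "almost 30TF" mark
```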
 

CuNi

Member
Given that AMD doesn't use any specialized cores in their GPUs like NV does, they still have quite a space to increase the die size/core count. There was a rumored Vega 20 with 6144 cores/21TF of computing power being tested. They could even go as far as 8192 core version at almost 30TF and call it a day. Sadly that HBM memory isn't cheap, to say the least :/

Isn't GCN architecturally limited in the number of cores and ROPs it can have connected on die? Wouldn't that limit the theoretical peak performance of this card too?
 

ethomaz

Banned
Given that AMD doesn't use any specialized cores in their GPUs like NV does, they still have quite a space to increase the die size/core count. There was a rumored Vega 20 with 6144 cores/21TF of computing power being tested. They could even go as far as 8192 core version at almost 30TF and call it a day. Sadly that HBM memory isn't cheap, to say the least :/
GCN is limited to 64 ROPs and 4096 cores.

Unless AMD changes to a new arch, they won't put in that many, unless they crossfire two GPU dies on the same card like in the past to reach 8192 cores.
 
Last edited:

LordOfChaos

Member
https://www.pcgamesn.com/amd/amd-navi-gpu-release-date-performance

According to them, 2019 Navi will only be low to mid range with high end Navi coming 2020. And supposedly still use GCN. If true, that would really be a bummer. Nvidia is expected to release Ampere in 2020 on 7nm as well. If Navi really is another GCN card, then Nvidia will easily have the performance lead once more.

I think that was pretty widely accepted to be the case, but the confirmation still matters. Navi not being talked up a lot yet, plus the conspicuous naming of "Next Gen" after it, made me pretty sure Navi was GCN again already, with all that implies for the four shader engine limit, etc.

So expecting a value play, a "1080 for cheaper" type deal here, not anything beyond the 2080 at first. What the VII should have been, really, but for the bold pricing.

Word was also that Sony worked closely with them on Navi for the PS5.
 

Altera

Neo Member
Is selling my 1080 and getting this instead a crazy thing to do? I'm just getting tired of Nvidia and want to move away from them.
 
Is selling my 1080 and getting this instead a crazy thing to do? I'm just getting tired of Nvidia and want to move away from them.
Do you do a lot of content creation? Because that's where the 16GB of HBM2 comes in handy. The Radeon VII is more of a 'prosumer' card than a gamer card, imo.
 
Navi being based on GCN is not good. That will just let Nvidia get ahead of AMD even more.

If Navi really is still based on GCN we can keep writing AMD off in GPU's until after 2020.

At that point Ampere will be a thing and Intel may be back in the GPU game, so AMD will be finished by then. They can't compete with both Nvidia and Intel; there's no hope left for AMD once Intel gets in, if its product doesn't suck.
 
Last edited:

shark sandwich

tenuously links anime, pedophile and incels
Do you do a lot of content creation? Because that's where the 16GB of HBM2 comes in handy. The Radeon VII is more of a 'prosumer' card than a gamer card, imo.
Definitely seems like more of a prosumer card, but on the other hand, this will probably be their best performing card for the next year.
 

SonGoku

Member
Is selling my 1080 and getting this instead a crazy thing to do? I'm just getting tired of Nvidia and want to move away from them.
Premature move. 1080 is a stellar card
You should at least wait for Navi and Nvidia 7nm lineup before making any rash decisions.
 

llien

Member
AMD Radeon VII Detailed Some More: Die-size, Secret-sauce, Ray-tracing, and More

[attached slide images]


Is selling my 1080 and getting this instead a crazy thing to do? I'm just getting tired of Nvidia and want to move away from them.
It depends on your priorities. If you are in the "f*ck off nVidia" camp and have the patience to wait for the Q3 launch, why not.
 