AMD board leak hints at next-gen 'Navi' graphics tech

Shin

Member
Feb 4, 2013
4,266
1,934
605
Little is known about AMD's upcoming graphics hardware based on the new Navi architecture, but a number of leaks and driver releases are starting to bring the project into focus. Navi is the codename for new Radeon video cards based on the new 7nm process technology, allowing for more rendering power and higher efficiency - and it's also the architecture confirmed for use in Sony's upcoming PlayStation 5 GPU.

Beyond that, official details on the tech are somewhat thin on the ground. All we really know is that, like Vega before it, the new architecture is arriving late to market, while AMD roadmaps refer only to support for 'next generation memory' as a defining feature. Beyond AMD's next GPU releases, the roadmap's following phase of development refers to a 'next generation' architecture, suggesting that Navi itself is based on the same Graphics Core Next (GCN) foundations as hardware going all the way back to 2011's Radeon HD 7970. This has been confirmed by Linux drivers released this week, which explicitly tie the Navi codename to the GCN architecture, according to this Phoronix report.

How the core has been enhanced over Vega and its prior GCN stablemates remains to be seen, but a PCB leak from Komachi Ensaka via a now-deleted post on a Chinese forum reveals an all-new board design we've not seen before, marked with AMD branding. The boards seen in the photos are not populated with any silicon, but a lot of information can be gleaned from them.


Credit: Eurogamer <--- read the full story.
Extra: Navi is GCN-based

For easier understanding by interested parties: large commits were made to the architecture's code recently, and they also touch on the Gen 8 consoles (PS4/Xbox).
 
Last edited:

Shin

Member
Feb 4, 2013
4,266
1,934
605
Next-gen memory ---> GDDR6
Scalability ---> HBM2/GDDR6 memory, 7nm memory controller AMD debuted with Radeon 7, can be used for all markets.




@Three it is, yet a single Tweet doesn't tell less knowledgeable readers much, especially for a GPU that will power next-gen consoles (PS5 at least).
That's my view at least, if mods feel that this topic isn't necessary or adds anything they can close it.
 

Three

Member
Oct 26, 2014
3,507
641
310
Next-gen memory ---> GDDR6
Scalability ---> HBM2/GDDR6 memory, 7nm memory controller AMD debuted with Radeon 7, can be used for all markets.




@Three it is, yet a single Tweet doesn't tell less knowledgeable readers much, especially for a GPU that will power next-gen consoles (PS5 at least).
That's my view at least, if mods feel that this topic isn't necessary or adds anything they can close it.
Fair enough. I feel any new information could be added to the original leak thread it's based on, instead of being scattered around different threads.
 
  • Like
Reactions: PhoenixTank
Jun 9, 2012
3,566
1,182
455
Just gonna copy this from the other thread re. my thoughts on this being another GCN-based design:

GCN has lagged behind the competition for years, in terms of performance/watt, performance/memory bandwidth, and performance/transistor.

Hell, the Radeon VII performs about on par with the 1080 Ti despite having double the memory bandwidth, 10% more transistors, 20% higher power consumption, and being on a much more advanced process (7nm vs 16nm), released 2 full years later.
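Those ratios can be sanity-checked with quick back-of-the-envelope arithmetic using only the rough figures quoted above - a purely illustrative Python sketch (the inputs are this post's approximations, not measured data):

```python
# Radeon VII relative to GTX 1080 Ti, using only the rough figures
# above: roughly equal performance, ~2x the memory bandwidth,
# ~10% more transistors, ~20% higher power consumption.
perf = 1.0
bandwidth = 2.0
transistors = 1.1
power = 1.2

print(f"perf/watt:       {perf / power:.2f}x")        # ~0.83x of the 1080 Ti
print(f"perf/bandwidth:  {perf / bandwidth:.2f}x")    # 0.50x
print(f"perf/transistor: {perf / transistors:.2f}x")  # ~0.91x
```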

So hell yes it’s a disappointment that Navi is still based on GCN (an architecture that will be freaking 9 years old by the time PS5 launches).

Might Navi be the GCN revision that finally brings AMD’s GPU to parity with Nvidia in terms of the above metrics? It would take a miracle.

You have to be dishonest or delusional to spin this news as anything but disappointing.
 
  • Like
Reactions: ethomaz

llien

Gold Member
Feb 1, 2017
4,689
1,901
500
I do not expect Navi to be perceived as a sort of Ryzen moment, even if it is a major breakthrough for AMD.
To achieve that, it would need to be a comeback many times stronger than Zen was; the GPU market is broken in much worse ways, with people paying more for cards half as fast as the competitor's.
It's hilarious that even nVidia's "Turing" xx50 series is still slower than the similarly priced card from the competitor.
 
Last edited:

ethomaz

Member
Mar 19, 2013
23,145
2,250
520
36
Brazil
FUD? Yeah, AMD posting new code to its open drivers calling Navi GCN must be FUD.

😂😂😂😂
 
Last edited:
Jun 9, 2012
3,566
1,182
455
Complain all you want about Nvidia’s pricing. You’re 100% right. They are gouging their customers and they deserve all the criticism they get.

But the fact remains that AMD’s tech is years behind. Even when they pushed GCN to the absolute limit with the most advanced manufacturing process available, the highest memory bandwidth by far, and maxing out the power draw of 2x 8-pin power connectors, they just managed to reach parity with Nvidia’s best from 2 full years prior.
 

Redneckerz

Banned
Jun 25, 2018
3,611
3,160
505
The stillness of time.
I dunno about the GCN-is-trash sentiment. Yes, there are limitations. At the same time, the performance longevity of cards like the HD 7970 GHz cannot be overstated. This is of course in part due to the fact that current-gen consoles have GPUs of that generation, but an HD 7970, which would previously lose to a GTX 680, now outperforms a GTX 780 and even a GTX 780 Ti. So perf over time in that regard is really something.

Plus, GCN is a mature arch now, so AMD being comfortable with it is also not strange.
 
  • Love
  • Like
Reactions: Ascend and SonGoku

Shin

Member
Feb 4, 2013
4,266
1,934
605
navi GCN based is not what we were promised..
To be fair, it was kinda staring at us on the roadmap all along; WCCFTech even reported it in February last year.
They have spouted, and probably still spout, a lot of nonsense like most sites, but keeping track of their AMD-related articles, I believe they have actual sources (and that's also supported by other tech sites).
AMD listed the GPU after Navi as next-gen for a reason, but then an internal AMD person tried to be coy that "Arcturus" is a special chip; as far as we know, it is their next-gen GPU (and it was spotted in the Xbox Scarlett XDK - Dante).

On the topic of Navi, Sony had a hand in some kind of BC customization, as traces of Polaris/Vega or something like that can be found in the recent commits.
I didn't bother reading it all, but long story short, it kinda supports the Navi-made/financed-by-Sony theory.
 
Last edited:

llien

Gold Member
Feb 1, 2017
4,689
1,901
500
AMD listed the GPU after Navi as next-gen for a reason, but then an internal AMD person tried to be coy that "Arcturus" is a special chip; as far as we know, it is their next-gen GPU (and it was spotted in the Xbox Scarlett XDK - Dante).

Even Fermi outsold the competitor.
That, to a large extent, is why we can't have good things.
 

Armorian

Member
Jan 17, 2018
574
373
210
Even Fermi outsold the competitor.
That, to a large extent, is why we can't have good things.
Fermi was very hot and loud, that's a fact, but it was also MUCH better than the 5870/6970 in DX11. Plus:

7970 - hot and loud
290 - very hot and very loud (almost like Fermi)
Fury X - needed LQ
Vega 64 - hot and loud

I'm talking about reference designs of course; some third-party models put tons of metal on those cards to cool them down :p

I had a 7970 and a 290
 
  • Like
Reactions: SonGoku

llien

Gold Member
Feb 1, 2017
4,689
1,901
500
it was also MUCH better than 5870/6970 in DX11
I don't see that in Tom's review (they only covered one DX11 game):



Fury X - needed LQ
I'm missing the point.

Not gaining much from its great products, AMD naturally had to (and still has to) gamble, unlike the competitor, who could afford to develop Volta and just throw it into the garbage bin (along with likely a lot more stuff that we have never seen). Today we have the 1050/Ti outselling the much faster (still faster than the 1650, even) and similarly priced 570, to which no standard mantra applies.

I find it remarkable that AMD made it this far and has anything at all to compete with, covering the range up to the 2080.
The VII consumes 20% more power but has double the RAM of its competitor, and is still regarded as a "very bad" card.
 
Last edited:

Armorian

Member
Jan 17, 2018
574
373
210
I don't see that in Tom's review (they only covered one DX11 game):




I'm missing the point.

Not gaining much from its great products, AMD naturally had to (and still has to) gamble, unlike the competitor, who could afford to develop Volta and just throw it into the garbage bin (along with likely a lot more stuff that we have never seen). Today we have the 1050/Ti outselling the much faster (still faster than the 1650, even) and similarly priced 570, to which no standard mantra applies.

I find it remarkable that AMD made it this far and has anything at all to compete with, covering the range up to the 2080.
The VII consumes 20% more power but has double the RAM of its competitor, and is still regarded as a "very bad" card.
The GTX 285 is DX10-only, so that test was done on that API.

https://www.purepc.pl/karty_graficzne/test_geforce_gtx_480_i_470_vs_radeon_hd_5870_i_5850?page=0,8



Fury X was so power hungry that it needed LQ to achieve clocks that made it somewhat competitive with Maxwell.
 

Redneckerz

Banned
Jun 25, 2018
3,611
3,160
505
The stillness of time.
Fermi was very hot and loud, that's the fact but it was also MUCH better than 5870/6970 in DX11. Plus:

7970 - hot and loud
290 - very hot and very loud (almost like Fermi)
Fury X - needed LQ
Vega 64 - hot and loud

I'm talking about reference designs of course, some third party putted tons of metal to cool those cards down :p

I had 7970 and 290
It seems you tagged the wrong user. Instead of me, the tag belongs to @llien. :messenger_smiling_with_eyes:
 

llien

Gold Member
Feb 1, 2017
4,689
1,901
500
Fury X was so power hungry that it needed LQ to achieve clocks that made it somewhat competitive with Maxwell.
Sigh:

"AMD does a great job managing power, and we're happy to see the company competing on a more even footing with Nvidia in this area as well. The Radeon R9 Fury X even manages to beat Nvidia’s reference GeForce GTX 980 Ti due to the Fury's better cooling and the resulting lower leakage currents. "
tomshardware

The contrast between the imaginary and the real AMD product is astounding; green FUD is that strong.
 
Last edited:
  • Thoughtful
Reactions: Ascend

The Skull

Member
Jul 8, 2018
268
286
190
Vega is fairly power efficient, but it needs tuning and undervolting, whereas Nvidia's cards are pretty good already out of the box. I've got my Vega 64 set at 1600MHz (actual in-game core clock speed), HBM at 1100MHz, and power draw stays between 190-210 watts under full load. I just hope AMD spends some of that Ryzen cash on the Radeon group and gets a really competitive GPU out again, not a year later.
 
  • Like
Reactions: thelastword

Armorian

Member
Jan 17, 2018
574
373
210
No, page explicitly mentions DX11.
Your graph doesn't even list 5970 2Gb.
It says DX10/11, and the GTX 2xx series is DX10-only. The 5970 is a dual-GPU card...



Sigh:

"AMD does a great job managing power, and we're happy to see the company competing on a more even footing with Nvidia in this area as well. The Radeon R9 Fury X even manages to beat Nvidia’s reference GeForce GTX 980 Ti due to the Fury's better cooling and the resulting lower leakage currents. "
tomshardware

The contrast between the imaginary and the real AMD product is astounding; green FUD is that strong.
The only thing that saved it from disaster was HBM; it consumed less energy than GDDR5, but the GPU was still hot enough to require LQ.
 
Last edited:

Armorian

Member
Jan 17, 2018
574
373
210
Well, DX11 game, Stalker:



anandtech
So, what, 90%+ of the performance from a card that was about half the price?


This is the point where you say "oh, I thought Fury X was a power hungry card, where, in fact, it wasn't", instead of doubling down on FUD.
When tessellation was a factor, Fermi killed TeraScale: https://www.geeks3d.com/20100525/quick-test-unigine-heaven-2-1-gtx-480-vs-gtx-470-vs-hd-5870-in-opengl-4-0-and-direct3d-11-in-extreme-tessellation/

DX11 games were a rarity in 2010; Fermi performed much better than AMD's pre-GCN cards in later years, when more games started to use DX11 features. In the same way, GCN killed Kepler down the line.

In the end, Fury used more power than any other card



Without HBM it would look worse.
 
Last edited:

SonGoku

Gold Member
Aug 16, 2018
2,027
1,710
320
GCN is a memory-starved design as well, requiring the use of expensive HBM for performance gains, though it has been reported that Navi will take advantage of GDDR6.
Makes me believe even more in 24 GB of GDDR6 on a 384-bit bus as a strong possibility.

24GB GDDR6 on a 384-bit bus:
16 Gbps / 8 = 2 GB/s per pin
2 GB/s x 384 pins = 768 GB/s

18 Gbps / 8 = 2.25 GB/s per pin
2.25 GB/s x 384 pins = 864 GB/s

20 Gbps / 8 = 2.5 GB/s per pin
2.5 GB/s x 384 pins = 960 GB/s
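For anyone who wants to check the arithmetic, peak bandwidth is just the per-pin data rate (Gbit/s) divided by 8 to get GB/s, times the bus width in bits - a minimal Python sketch of the same calculation:

```python
def gddr6_bandwidth_gb_s(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s: per-pin data rate (Gbit/s)
    divided by 8 bits per byte, times the number of bus pins."""
    return data_rate_gbps / 8 * bus_width_bits

# The three GDDR6 speed grades above on a 384-bit bus
for rate in (16, 18, 20):
    print(f"{rate} Gbps x 384-bit = {gddr6_bandwidth_gb_s(rate, 384):.0f} GB/s")
# 16 -> 768 GB/s, 18 -> 864 GB/s, 20 -> 960 GB/s
```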
Even when they pushed GCN to the absolute limit
Vega is not the best it can do; it's not GCN's final form. Let's judge the best it can do after Navi releases.
I know and is on top of AMD GPU/CPU information, I'll keep ignoring it, just as people keep re-posting it.
You keep making assumptions about Arcturus being next-gen; now you're going as far as to discredit info from official AMD personnel.
 
Last edited:

DeepEnigma

Gold Member
Dec 3, 2013
20,643
16,632
685
Makes me believe even more in 24 GB of GDDR6 on a 384-bit bus as a strong possibility.

24GB GDDR6 on a 384-bit bus:
16 Gbps / 8 = 2 GB/s per pin
2 GB/s x 384 pins = 768 GB/s

18 Gbps / 8 = 2.25 GB/s per pin
2.25 GB/s x 384 pins = 864 GB/s

20 Gbps / 8 = 2.5 GB/s per pin
2.5 GB/s x 384 pins = 960 GB/s
This would be fantastic.
 
  • Like
Reactions: McHuj and SonGoku

SonGoku

Gold Member
Aug 16, 2018
2,027
1,710
320
could develop Volta and just throw it into garbage bin (and likely a lot more stuff that we have never seen)..
tbf most of Volta tech was put to good use in Turing and Xavier
When tessellation was a factor, Fermi killed TeraScale: https://www.geeks3d.com/20100525/quick-test-unigine-heaven-2-1-gtx-480-vs-gtx-470-vs-hd-5870-in-opengl-4-0-and-direct3d-11-in-extreme-tessellation/

DX11 games were a rarity in 2010; Fermi performed much better than AMD's pre-GCN cards in later years, when more games started to use DX11 features. In the same way, GCN killed Kepler down the line.

In the end, Fury used more power than any other card
This is all true, but the point @llien was making is that while having very price-competitive products, AMD (ATI back then) was still getting outsold by the competition.
I see your point as well; one could argue that people bought more Fermi cards because they were more future-proof.
 
Last edited:

Armorian

Member
Jan 17, 2018
574
373
210
This is all true, but the point @llien was making is that while having very price-competitive products, AMD (ATI back then) was still getting outsold by the competition.
I see your point as well; one could argue that people bought more Fermi cards because they were more future-proof.
That's true, Nvidia always sells better regardless of the quality of their products - Fermi's performance was great, but its power consumption/loudness were horrible. Some people might think I'm an Nvidia fanboy or something, when in reality I don't like this company at all; they just happen to (currently) have the best GPUs on the market. For me personally, these are the factors that prevent me from going AMD again:

1. Lower CPU overhead in DX11 drivers (crucial for my old-ass 2600K)
2. HBAO+ support in older games
3. SGSSAA in older games
.
.
.
99. PhysX support in some games - it's sad that this feature isn't used more; Mirror's Edge still looks great with it

AMD sticking to GCN is not making things better. Navi will probably still lose in perf/watt to 14nm RTX cards, and with the CU limit still in place they don't have a chance to catch up with the 2080 Ti next year (Navi 20), not to mention that Nvidia will probably release 7nm Turing/Ampere cards in the following months.
 
Last edited:
Dec 1, 2018
87
28
150
Next-gen memory ---> GDDR6
Scalability ---> HBM2/GDDR6 memory, 7nm memory controller AMD debuted with Radeon 7, can be used for all markets.




@Three it is, yet a single Tweet doesn't tell less knowledgeable readers much, especially for a GPU that will power next-gen consoles (PS5 at least).
That's my view at least, if mods feel that this topic isn't necessary or adds anything they can close it.
HBM3 is next-gen memory, not GDDR6.
 

Ascend

Member
Jul 23, 2018
335
225
190
That's true, Nvidia always sells better regardless of the quality of their products - Fermi's performance was great, but its power consumption/loudness were horrible. Some people might think I'm an Nvidia fanboy or something, when in reality I don't like this company at all; they just happen to (currently) have the best GPUs on the market. For me personally, these are the factors that prevent me from going AMD again:

1. Lower CPU overhead in DX11 drivers (crucial for my old-ass 2600K)
2. HBAO+ support in older games
3. SGSSAA in older games
.
.
.
99. PhysX support in some games - it's sad that this feature isn't used more; Mirror's Edge still looks great with it

AMD sticking to GCN is not making things better. Navi will probably still lose in perf/watt to 14nm RTX cards, and with the CU limit still in place they don't have a chance to catch up with the 2080 Ti next year (Navi 20), not to mention that Nvidia will probably release 7nm Turing/Ampere cards in the following months.
1) If you truly care about low CPU overhead, DX12 and Vulkan are more important, and nVidia has been actively stagnating development of the newer APIs, simply because AMD's 9-year-old architecture, which everyone loves to whine about being too old, is actually superior for these newer APIs.
2) Is HBAO+ REALLY that important?


3) Why do you need that in older games? Surely your current GPU is strong enough to use other AA methods?
.
.
.
99) You can still enable PhysX to run on your CPU, if so desired. PhysX was, and still is, a gimmick, just like RTX is now. I can't believe this nonsense actually convinces people to go for nVidia.

If you don't like a company, don't support them. Simple as that. In the majority of cases there are alternatives. The Radeon VII, for one, was trashed more than it deserved.


As for that PCB... it could be anything. It's presumed to be Navi, but it could be an engineering sample for a workstation card, or an upgraded Polaris with GDDR6 for the lower end (although I don't see why they would do that). If Navi really will be using next-gen memory, I'd expect something other than GDDR6...
 
Last edited:

Armorian

Member
Jan 17, 2018
574
373
210
1) If you truly care about low CPU overhead, DX12 and Vulkan are more important, and nVidia has been actively stagnating development of the newer APIs, simply because AMD's 9-year-old architecture, which everyone loves to whine about being too old, is actually superior for these newer APIs.
2) Is HBAO+ REALLY that important?


3) Why do you need that in older games? Surely your current GPU is strong enough to use other AA methods?
There are still a TON of DX11 (and below) games that I want to play, and being able to inject AO and great-quality AA into them is amazing; for example, the Mass Effect series looks great with HBAO+ and SGSSAA.

Nvidia Inspector is the big reason why I stayed with Nvidia cards after swapping GPUs back and forth a few years ago.


99) You can still enable PhysX to run on your CPU, if so desired. PhysX was, and still is, a gimmick, just like RTX is now. I can't believe this nonsense actually convinces people to go for nVidia.

If you don't like a company, don't support them. Simple as that. In the majority of cases there are alternatives. The Radeon VII, for one, was trashed more than it deserved.
PhysX runs like crap on the CPU. I listed it at 99 because it's a non-factor nowadays, but there are a few games that use it, and at least one that I still need to play (Cryostasis) :)
 
Last edited:

Leonidas

"Ask me about computers"
Mar 6, 2007
1,559
948
1,235
After being disappointed by years' worth of AMD GPUs, I'm setting my expectations so low that there is no way I'm going to be disappointed this time :lollipop_smiling_face_eyes:
Expectations: ~GTX 1080-level performance & power consumption at $299.
 
  • Fire
Reactions: CurryPanda

Ivellios

Member
Mar 26, 2015
1,158
514
345
Brazil
After being disappointed by years' worth of AMD GPUs, I'm setting my expectations so low that there is no way I'm going to be disappointed this time :lollipop_smiling_face_eyes:
Expectations: ~GTX 1080-level performance & power consumption at $299.
GTX 1080 performance for $300 would be a better deal than the RTX 2060, especially if it comes with 8GB of VRAM.
 
  • Like
Reactions: SonGoku

llien

Gold Member
Feb 1, 2017
4,689
1,901
500
tbf most of Volta tech was put to good use in Turing and Xavier
It could well be; we can only speculate.
The same goes for GCN.
I doubt that even the competitor's lead engineers could make statements even half as bold as what people keep throwing around here.

After being disappointed by years' worth of AMD GPUs...
Given the near-orgasmic titles you pick for nVidia GPU threads, I wonder exactly which AMD GPUs have disappointed you, and what the distance between you and said GPUs was when that happened, chuckle. :D

PS
What was the title for 1650?
 
Last edited:
  • Like
Reactions: Samsomite