
AMD Radeon 300 series (possible) specs

wachie

Member
This has a 285 and a 290, with no 290X

285 scores 19.9 GTexels/s
290 scores 16.6 GTexels/s

Here is the 290X, which scores 16.8 GTexels/s

The 285 is also faster than both the 290 and the 290X at tessellation.
Pixel benchmark with GTexels? Surely that's a typo.

Thanks, I wasn't aware of this but would like more such cases. As for the improved tessellation, that's due to a new geometry engine front-end setup rather than any compression algorithms.
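As a sanity check on those measured numbers: peak pixel fillrate is just ROPs × core clock. The ROP counts and clocks below are the commonly cited spec-sheet figures (treat them as assumptions); the gap between theoretical and measured is the interesting part.

```python
# Back-of-the-envelope theoretical pixel fillrate: ROPs * core clock.
def peak_pixel_fill(rops, clock_ghz):
    """Theoretical peak pixel fillrate in GPix/s."""
    return rops * clock_ghz

# Spec-sheet figures (assumed): R9 285 = 32 ROPs @ ~0.918 GHz,
# R9 290X = 64 ROPs @ up to 1.0 GHz.
r9_285 = peak_pixel_fill(32, 0.918)
r9_290x = peak_pixel_fill(64, 1.000)

# Measuring ~19.9 vs ~16.8 GPix/s despite the 290X's 2x ROP advantage
# suggests the 290X is bandwidth-bound in this test, while Tonga's
# color compression lets it get closer to its theoretical peak.
print(f"R9 285 peak:  {r9_285:.1f} GPix/s (measured ~19.9)")
print(f"R9 290X peak: {r9_290x:.1f} GPix/s (measured ~16.8)")
```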
 

Marlenus

Member
Pixel benchmark with GTexels? Surely that's a typo.

Thanks, I wasn't aware of this but would like more such cases. As for the improved tessellation, that's due to a new geometry engine front-end setup rather than any compression algorithms.

Yeah, the graph is labelled as GTex/s, so a mistake on their part, but I should have picked it up too since they do mention GPix/s in the text.

I know the tessellation is due to an improved front end, the point being that the architecture is better than Hawaii but it is castrated by the low shader count and low clock speeds. If you gave Tonga the same number of shaders and ROPs as Hawaii and clocked it at the same speed it would be faster.
 

wachie

Member
Yeah, the graph is labelled as GTex/s, so a mistake on their part, but I should have picked it up too since they do mention GPix/s in the text.

I know the tessellation is due to an improved front end, the point being that the architecture is better than Hawaii but it is castrated by the low shader count and low clock speeds. If you gave Tonga the same number of shaders and ROPs as Hawaii and clocked it at the same speed it would be faster.
I would be disappointed if it wasn't; after all, it's a newer architecture. I was mainly interested in Tonga for its memory utilization, etc. There are some guesses that it has a larger L2 cache as well; it would be interesting to see how a "full" Tonga performs.
 

AmyS

Member
http://www.fudzilla.com/news/graphics/36971-more-amd-r9-300-series-details-show-up

the real next generation GPU is Fiji, the one that should be a part of AMD's Radeon R9 390, R9 390X and the dual-GPU R9 390X2 cards, or whatever AMD decide to call them in the end. Earlier rumors suggested that the Fiji GPU will be based on GCN 1.3 architecture, have 4096 Stream Processors, 256 TMUs, 128 ROPs and be paired up with 4GB of High-Bandwidth Memory, that should offer 640GB/s of memory bandwidth and provide much better performance.

If true, wouldn't having 128 ROPs go a long way toward helping a single card / single GPU push 4K resolution at good framerates?

Especially when you consider things like HBM and DX12.
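For reference, that 640GB/s figure can be sanity-checked from how HBM bandwidth is computed (bus width × per-pin data rate). The stack count and pin speeds below are assumptions based on first-gen HBM rumors, not confirmed specs.

```python
# HBM bandwidth = (bus width in bits / 8) * per-pin data rate in Gbps.
# First-gen HBM uses a 1024-bit interface per stack.
def hbm_bandwidth_gbs(stacks, bus_bits_per_stack, pin_rate_gbps):
    """Aggregate memory bandwidth in GB/s."""
    return stacks * bus_bits_per_stack / 8 * pin_rate_gbps

# 4 stacks at 1 Gbps/pin (500 MHz DDR) -> 512 GB/s
print(hbm_bandwidth_gbs(4, 1024, 1.0))   # 512.0
# The rumored 640 GB/s would need ~1.25 Gbps/pin on 4 stacks
print(hbm_bandwidth_gbs(4, 1024, 1.25))  # 640.0
# Compare: the 290X's 512-bit GDDR5 at 5 Gbps/pin
print(hbm_bandwidth_gbs(1, 512, 5.0))    # 320.0
```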
 

Human_me

Member
http://www.fudzilla.com/news/graphics/36971-more-amd-r9-300-series-details-show-up



If true, wouldn't having 128 ROPs go a long way toward helping a single card / single GPU push 4K resolution at good framerates?

Especially when you consider things like HBM and DX12.

VRAM is still an issue; for 4K, 4GB isn't enough for future games.


Yes, provided the card is balanced in all other respects. This chip keeps sounding too good to be true.

Yes, though price is still a concern.
 

AmyS

Member
AMD Fiji XT R9 390X Coming With Cooler Master Liquid Cooler

Finally we can confirm to you that the new graphics card will indeed ship with a Cooler Master closed loop liquid cooling unit. The specific model in question is a 120mm Asetek based design variation that Cooler Master had licensed and used before. The pre-filled water cooler is very similar to what AMD had already introduced with the R9 295X2 and the boxed retail version of the FX 9590.

AMD’s reference designed R9 390X will ship with a liquid cooler, however AMD’s AIB partners such as Sapphire, XFX, HIS, Powercolor, Gigabyte, MSI and Asus may ship non-reference air cooled designs as well. But for it to make sense for the AIBs their designs will have to offer real advantages to users. Which is going to be very difficult to achieve considering that they’re going up against an extremely effective cooling system as is evidenced by the R9 295X2. Needless to say, the reference design will allow for extraordinary overclocking potential.

Back to the question of “when?”, the shipping data indicates that the card will be retail ready within four to six weeks. This puts its market introduction at the late March / early April time-frame, just as we had told you three weeks ago. However, make no mistake, within just a couple of weeks AMD will be capable of demoing the new GPU. And the shipping data indicates that AMD has enough in inventory to make this a possibility.

Hopefully no later than May.
 
Does this mean that this card will have a hard time doing triple 1080p surround gaming with a 4k supplemental monitor for video/movie playback simultaneously?
 
Nvidia won't drop 980 price unless it's seriously outperformed, they have strong mindshare.

This card will do a lot of damage to the 980 in game performance. I expect a healthy price drop if Nvidia wants to actually sell cards, either that or a Titan II launch within a month or so after.
 

wachie

Member
Isn't this AMD's high-end card? It should be comparable to GM200 and blow the 980 out of the water. Should be like a 290X vs a 680.
It should technically, hopefully it does.

It would also mean Nvidia can't do rip-off pricing with the GM200. Some wishful thinking (% in terms of performance relative to the 980):


GM200 (150% of 980) for $649
R9 390X (140% of 980) for $549
980 for $449
R9 390 (120% of 980) for $449
R9 380X (100% of 980) for $399
970 (85% of 980) for $299
R9 380 (85% of 980) for $279
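Taking that wishful lineup at face value, the implied dollars-per-performance-point works out like this (a throwaway sketch; all numbers are the speculative ones above, not real prices):

```python
# (relative performance with 980 = 100, speculative price in USD)
lineup = {
    "GM200":   (150, 649),
    "R9 390X": (140, 549),
    "980":     (100, 449),
    "R9 390":  (120, 449),
    "R9 380X": (100, 399),
    "970":     ( 85, 299),
    "R9 380":  ( 85, 279),
}

# Sort by dollars per performance point, best value first.
for card, (perf, price) in sorted(lineup.items(),
                                  key=lambda kv: kv[1][1] / kv[1][0]):
    print(f"{card:8s} ${price} / {perf}% = ${price / perf:.2f} per point")
```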
 

longdi

Banned
390X is on 20nm yay?
In that case, it should either match GM200 at lower wattage or beat it at the same wattage.

Love that they are going H2O even if it is just an Asetek OEM; benchmarks have shown even a single thin 120mm rad does wonders for GPU core cooling. Hopefully, since it's a reference design, the cooling of the VRMs is well taken care of, unlike the recent NZXT/Corsair/Arctic GPU brackets.
 

wachie

Member
390X is on 20nm yay?
In that case, it should either match GM200 at lower wattage or beat it at the same wattage.

Love that they are going H2O even if it is just an Asetek OEM; benchmarks have shown even a single thin 120mm rad does wonders for GPU core cooling. Hopefully, since it's a reference design, the cooling of the VRMs is well taken care of, unlike the recent NZXT/Corsair/Arctic GPU brackets.
No.
 

ZOONAMI

Junior Member
It should technically, hopefully it does.

It would also mean Nvidia can't do rip-off pricing with the GM200. Some wishful thinking (% in terms of performance relative to the 980):


GM200 (150% of 980) for $649
R9 390X (140% of 980) for $549
980 for $449
R9 390 (120% of 980) for $449
R9 380X (100% of 980) for $399
970 (85% of 980) for $299
R9 380 (85% of 980) for $279

I don't think the GM200 will outperform the 390X; it will still be using GDDR5.
 

ZOONAMI

Junior Member
I think it will, may not be by much.


Mammoth die, water cooling block, bleeding edge HBM, that performance.

Don't think it will be $500.


I don't really see how even the full GM200 can compete, if the rumored specs for the 390X pan out. Maybe if it is 8GB it will edge out a 4GB 390X at 4K, but there will probably be an 8GB 390X by the time GM200 releases, and 8GB of HBM would edge out the GM200.
 

ZOONAMI

Junior Member
The 290X is pretty much on par with the 980 at 4K, btw, so I think the next-gen Radeons with HBM will beat the full GM200.
 
It should technically, hopefully it does.

It would also mean Nvidia can't do rip-off pricing with the GM200. Some wishful thinking (% in terms of performance relative to the 980):


GM200 (150% of 980) for $649
R9 390X (140% of 980) for $549
980 for $449
R9 390 (120% of 980) for $449
R9 380X (100% of 980) for $399
970 (85% of 980) for $299
R9 380 (85% of 980) for $279

Doubt gm200 will be 50% over a 980. It will probably be 25% on a good day.

Look at it this way: a 980 already draws 200 watts, or close enough. If the gm200 is architecturally the same, including efficiency (and there's nothing to suggest that it isn't) then how much more can they really do with a 250w power budget?

Unless they increase it. Which isn't likely.
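That ~25% estimate is just linear power scaling, assuming perf/W stays fixed (a rough assumption, since binning and clocks don't scale perfectly):

```python
# If GM200 keeps Maxwell's perf/W, the extra performance over a 980 is
# roughly bounded by the extra power budget available to it.
gtx980_watts = 200   # approximate gaming draw, per the post above
gm200_budget = 250   # typical single-GPU board power limit

headroom = gm200_budget / gtx980_watts - 1
print(f"~{headroom:.0%} more performance at the same efficiency")
```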
 

ZOONAMI

Junior Member
Doubt gm200 will be 50% over a 980. It will probably be 25% on a good day.

Look at it this way: a 980 already draws 200 watts, or close enough. If the gm200 is architecturally the same, including efficiency (and there's nothing to suggest that it isn't) then how much more can they really do with a 250w power budget?

Unless they increase it. Which isn't likely.

Non-reference 980s can pull over 300 watts
 

ZOONAMI

Junior Member
Yes, but not at stock. I'm aware they have insane power limits, but that only applies to overclockers.

I'd seriously judge Nvidia if they release a GPU that draws over 250W. Even the 290X (and probably the 390X, too) follows a strict 250W limit under gaming.

Why? I feel like most of the enthusiast market buying these cards have at least 500 watt PSUs, which can handle a 300 watt draw just fine.
 

tuxfool

Banned
Non-reference 980s can pull over 300 watts

At what level of overclocking? That is nearly an extra 100W. If Nvidia does this, then the blower cooler used on the 980 just isn't up to par.

They will have to go back to the vapour chamber design they used on the 780ti. Even then I don't know if a design like that will stay quiet at 300W.
 

ZOONAMI

Junior Member
At what level of overclocking? That is nearly an extra 100W. If Nvidia does this, then the blower cooler used on the 980 just isn't up to par.

They will have to go back to the vapour chamber design they used on the 780ti. Even then I don't know if a design like that will stay quiet at 300W.

http://www.tomshardware.com/reviews/nvidia-geforce-gtx-980-970-maxwell,3941-11.html

Average consumption is ~175 watts, but the analysis shows spikes to around 300 watts when necessary. These are the reference cards Tom's tested, I think.
 

Serandur

Member
All the "leaked" rumors floating around now are very exciting; 980 slaughterer incoming. I feel sorry for the people buying into the little mid-range turds as of now, especially those who just returned their 970s over VRAM and are doing so.

I returned my 970s and grabbed a Lightning 290X to hold me over until Fiji, exciting times. I am very worried about the 390X being limited to 4GB of VRAM, however; a bit of a killjoy. I might just hold out until GM200 for that reason, it depends on the 390X's end-performance, pricing, and the rumors floating around on GM200's release at the time. I might even just sell my new 290X for the 390X in time for TW3, and then sell the 390X for an affordable GM200 card later in the year.

Regardless, the point is we're finally about to see an actual, substantial increase in the upper ceiling on GPU performance after like 2 years. :D

I'm ready for a real, worthy successor to my old 780.
 

ZOONAMI

Junior Member
All the "leaked" rumors floating around now are very exciting; 980 slaughterer incoming. I feel sorry for the people buying into the little mid-range turds as of now, especially those who just returned their 970s over VRAM and are doing so.

I returned my 970s and grabbed a Lightning 290X to hold me over until Fiji, exciting times. I am very worried about the 390X being limited to 4GBs of VRAM, however; a bit of a killjoy. I might just hold out until GM200 for that reason, it depends on the 390X's end-performance, pricing, and the rumors floating around on GM200's release at the time. I might even just sell my new 290X for the 390X in time for TW3, and then sell the 390X for an affordable GM200 card later in the year.

Regardless, the point is we're finally about to see an actual increase in the upper ceiling on GPU performance after like 2 years. :D

I'm ready for a real, worthy successor to my old 780.

I just returned my 970 for a 290x, liking the 290x more than the 970. 4k performance is better.
 

tuxfool

Banned
All the "leaked" rumors floating around now are very exciting; 980 slaughterer incoming. I feel sorry for the people buying into the little mid-range turds as of now, especially those who just returned their 970s over VRAM and are doing so.

...

I'm ready for a real, worthy successor to my old 780.

The whole 980 situation is exactly what Nvidia did with the 780, Titan and 780 Ti. Fools bought the overpriced Titan because it was the best. The more sensible stuck with the 780, but both were superseded 6 months later by the 780 Ti (as a gaming card). The 780 Ti then brought prices of the 770 and 780 down to more sensible levels.

The issue here isn't that new higher-performing models brought down prices, rather that people keep buying into the whole premium card nonsense. Nvidia is very happy to capitalize on that...
 

Three

Member
I hope no one on GAF is running 4 GPUs of any kind. I wouldn't wish that kind of latency, compatibility issues and frame pacing on anyone.

Is their x2 support just as bad as it's always been? I remember having to fight my own hardware when I had an x2 card. Made me want to switch to nVidia it was so bad. Their support was nonexistent.
 

tuxfool

Banned
Is their x2 support just as bad as it's always been? I remember having to fight my own hardware when I had an x2 card. Made me want to switch to nVidia it was so bad. Their support was nonexistent.

It is the same as CrossFire support. It tends to be better now, but like SLI, CrossFire support is always shaky. Some engines don't support multiple GPUs at all, and others sometimes have bugs and optimisation issues.
 

Pwn

Member
I'm getting confused.

Is TSMC's 16nm FinFET the same as Samsung's 14nm FinFET? And is TSMC's 16FF+ better than Samsung's process?
 

Renekton

Member
It is really the worst time, generally speaking, to upgrade your graphics card at this moment.
We don't know when we can actually see 16FF SoCs, never mind big dies.

You could do a lot worse than getting a GPU on the mature 28nm node.

Is TSMC's 16nm FinFET the same as Samsung's 14nm FinFET? And is TSMC's 16FF+ better than Samsung's process?
Samsung (and GF) is using a different 14nm process.

Since 16FF+ is a refinement over the first version, it might be better, who knows.
 

ZOONAMI

Junior Member
What games are you playing at 4K with a single 290X?

You don't have to run everything at ultra if you're at 4K. Gaming Evolved optimizes everything to 4K with a mixture of settings; Shadow of Mordor it optimizes to 1440p.

Running 4K for GRID, Crysis 3, Max Payne 3, The Witcher 2, Titanfall, Mass Effect 3.
 
All the "leaked" rumors floating around now are very exciting; 980 slaughterer incoming. I feel sorry for the people buying into the little mid-range turds as of now, especially those who just returned their 970s over VRAM and are doing so.

I returned my 970s and grabbed a Lightning 290X to hold me over until Fiji, exciting times. I am very worried about the 390X being limited to 4GBs of VRAM, however; a bit of a killjoy. I might just hold out until GM200 for that reason, it depends on the 390X's end-performance, pricing, and the rumors floating around on GM200's release at the time. I might even just sell my new 290X for the 390X in time for TW3, and then sell the 390X for an affordable GM200 card later in the year.

Regardless, the point is we're finally about to see an actual, substantial increase in the upper ceiling on GPU performance after like 2 years. :D

I'm ready for a real, worthy successor to my old 780.

You've completely drunk the Kool-Aid, haven't you?
 

Tablo

Member
It is really the worst time, generally speaking, to upgrade your graphics card at the moment.

ugh ikr... and I really want an upgrade from my GTX 670 sometime in June/July.
GM200 from NVidia probably isn't coming before then. All I can hope for is that the AMD stuff comes out, shits all over GTX 980/970, and NVidia prices drop.
Staying with NVidia personally.
 