
AMD introduces "Radeon Pro Duo" 16TFLOPS graphics card

OmegaDL50

Member
HBM isn't capable of more than 4GB per GPU, so 8GB total is the max for this card. HBM2 allows a higher amount of memory per GPU but that tech isn't ready yet.

Also, HBM has much higher overall bandwidth than GDDR5, so 4GB of HBM is significantly faster than 4GB of GDDR5 despite being the same amount of memory.

In many forums I've read, be it Anandtech or HardOCP, you still have some folks who think 4GB of HBM is one-to-one with 4GB of GDDR5 in terms of performance.

So I kind of don't get it when people go "meh" towards 4GB of HBM, especially considering the limits of the first revision of HBM. Now if this were 4GB of GDDR5, that would be a different matter entirely.
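For reference, here's a quick back-of-the-envelope comparison of the peak bandwidth figures (a rough sketch assuming the publicly listed Fury X / Fiji and GTX 980 Ti memory specs; the helper function is just for illustration):

```python
# Rough peak-bandwidth comparison: first-gen HBM (Fiji, as used per GPU here)
# vs GDDR5 (GTX 980 Ti). Numbers are the publicly listed specs.

def bandwidth_gb_s(bus_width_bits: int, rate_gbps_per_pin: float) -> float:
    """Peak bandwidth in GB/s: bus width * per-pin data rate / 8 bits per byte."""
    return bus_width_bits * rate_gbps_per_pin / 8

hbm_fiji    = bandwidth_gb_s(4096, 1.0)  # 4096-bit HBM @ 1 Gbps effective -> 512 GB/s
gddr5_980ti = bandwidth_gb_s(384, 7.0)   # 384-bit GDDR5 @ 7 Gbps          -> 336 GB/s

print(f"Fiji HBM:     {hbm_fiji:.0f} GB/s")
print(f"980 Ti GDDR5: {gddr5_980ti:.0f} GB/s")
```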
 
Lol, crazy. A $650 980 Ti does pretty well alone. I finally got to test the baby with 45 hours of The Division and I couldn't be happier. I don't see myself needing anything better than a $650 gfx card (the hybrid version of it for a little more, maybe).
 

AJLma

Member
I'm seriously questioning the logic of this pricing. It costs more than both of the individual cards combined, AND it costs more than two 980 Tis, which are faster as single cards and probably faster in SLI too.
 
Also, HBM has much higher overall bandwidth than GDDR5, so 4GB of HBM is significantly faster than 4GB of GDDR5 despite being the same amount of memory.

In many forums I've read, be it Anandtech or HardOCP, you still have some folks who think 4GB of HBM is one-to-one with 4GB of GDDR5 in terms of performance.

So I kind of don't get it when people go "meh" towards 4GB of HBM, especially considering the limits of the first revision of HBM. Now if this were 4GB of GDDR5, that would be a different matter entirely.

Because memory bandwidth hasn't really been a bottleneck for top-end cards this generation. Most of them (with the notable exception of the 8GB 390/390X) have more than enough memory bandwidth to push most of their video memory in a scene down the pipe more than 60 times a second.
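To put rough numbers on that (a sketch using the same publicly listed peak-bandwidth specs; a real frame touches far less memory than this idealized upper bound):

```python
# Idealized upper bound: how many times per second each card could read its
# entire VRAM at peak bandwidth. Not a real workload, just a sanity check.

cards = {
    # name: (peak bandwidth in GB/s, VRAM in GB)
    "Fury X (4GB HBM)":   (512, 4),
    "980 Ti (6GB GDDR5)": (336, 6),
    "390X (8GB GDDR5)":   (384, 8),
}

for name, (bandwidth, vram) in cards.items():
    print(f"{name}: {bandwidth / vram:.0f} full-VRAM reads per second")
# Fury X ~128/s, 980 Ti ~56/s, 390X ~48/s - which is why the 8GB 390/390X
# gets singled out above as the card with the least bandwidth per GB of VRAM.
```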
 

The Argus

Member
God damn video card prices have spiralled out of control in the last decade.

I used to think the $300 I spent on my 9700 Pro was ridiculous because that's what a console cost. Then later I figured $500 was the top-of-the-line limit when I got my GTX 680. Now I'm expecting to drop $600-700 this year for the "sorta the best for a few months" card.
 
Nope. Nvidia's SLI is bad enough. Going to Crossfire is just rubbing salt into the wound.

Yeah, I've had multiple SLI setups, and am about to have another 'cause I was crazy and bought a second Titan X from someone here on GAF that will be arriving in a couple of days. The majority of games launch without good SLI support. So, if you're the type that plays new games on launch day and has them done within a week or two, don't even bother with SLI. By the time it has good support, if it ever does, you'll be done with the game.
 

lyrick

Member
Just look at the R7 370 (1024 shading units) vs GTX 950 (768) with the 950 being close to 20% faster in most games.

You do know that the R7 370 is essentially a 2012 AMD part (7800-series Pitcairn Pro) that you're comparing to a 2015 Nvidia part... Right?!

You're also aware that FLOPS have much more to do with marketing than with actual graphical output, right?
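For what it's worth, the paper specs illustrate that point (a rough sketch using stock/boost clocks; real game performance depends on architecture, drivers, and bandwidth rather than theoretical FLOPS):

```python
# Theoretical single-precision throughput = shaders * 2 ops per clock (FMA) * clock.
# The 2012-era R7 370 "wins" on paper, yet the 2015 GTX 950 wins in most games.

def tflops(shaders: int, clock_ghz: float) -> float:
    return shaders * 2 * clock_ghz / 1000

r7_370  = tflops(1024, 0.975)   # ~2.0 TFLOPS (Pitcairn rebrand, 975 MHz boost)
gtx_950 = tflops(768, 1.188)    # ~1.8 TFLOPS (Maxwell, ~1188 MHz boost)

print(f"R7 370:  {r7_370:.2f} TFLOPS")
print(f"GTX 950: {gtx_950:.2f} TFLOPS")
```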
 
I used to think the $300 I spent on my 9700 Pro was ridiculous because that's what a console cost. Then later I figured $500 was the top-of-the-line limit when I got my GTX 680. Now I'm expecting to drop $600-700 this year for the "sorta the best for a few months" card.

Yeah, the 9200/9500/9700 was the last generation of cards I was really involved in; 9500 Pro here, though. My buddy unlocked the extra pipes on his regular 9500, and I was so jealous lol
 
Not too excited about AMD's future offerings. My watercooled 7970 warmed my room for the winter, and every new flagship keeps raising TDP.
 

AmyS

Member
This is way less money than Nvidia's GTX Titan-Z card a few years ago, which had two big GK110 GPUs and cost $2,999 at launch.
 

x3sphere

Member
SLI/CF is a huge waste of money in my experience. Even if this card was "reasonably" priced at $800 or something, I would not get it over a single 980 Ti or Fury X, since it's effectively still multi-GPU.
 

OmegaDL50

Member
That price is crazy. I'm starting to think I should just go with a mid range card and upgrade every couple of years.

That's basically what I've been doing. Saves me money in the long run and I still get substantial performance leaps to run every game beyond console level spec.

I'm still on my HD 7950 since 2012.

I've been considering jumping to a GTX 970, but at this point I'm likely to wait and see what Pascal brings to the table.

I usually upgrade my PC when the next Elder Scrolls game comes out, I did this back in 2007 for Oblivion, and I did the same for my current PC for Skyrim back in 2012.

But with stuff like The Witcher 3, Just Cause 3, and Fallout 4 pushing my HD 7950 a bit, holding out for Elder Scrolls VI to make my decision might be too far off, and I'm likely to make the transition sooner.
 
I guess if you're a gamer who has $1,500 to light on fire, this is the product for you. This is in the same category as that awful Titan Z.
 

x3sphere

Member
It's actually perfectly suited to VR. The SteamVR benchmark makes use of LiquidVR MultiGPU rendering tech.

SLI / Crossfire for other stuff though is a garbage fire.

Multi-GPU may lend itself well to VR, but so far very little VR content actually supports it. We'll see what happens, I guess. Hopefully once the headsets actually launch we'll see more widespread support, but I'm not holding my breath.
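The reason two GPUs map so neatly onto VR is that each GPU can render one eye's view in parallel. A purely illustrative toy sketch of that idea (the class and function names are made up for this example, not LiquidVR's or SteamVR's actual API):

```python
# Toy illustration of per-eye multi-GPU VR rendering ("affinity"-style):
# one GPU per eye, both views rendered concurrently, then composited for the HMD.
# Everything here is a stand-in; no real graphics API is being called.

from concurrent.futures import ThreadPoolExecutor

class FakeGpu:
    def __init__(self, name: str):
        self.name = name

    def render_eye(self, eye: str, scene: str) -> str:
        # A real GPU would rasterize the scene from that eye's camera here.
        return f"{scene} rendered for {eye} eye on {self.name}"

def render_vr_frame(gpus, scene: str):
    eyes = ("left", "right")
    # Both eye views are rendered concurrently, one per GPU.
    with ThreadPoolExecutor(max_workers=2) as pool:
        images = list(pool.map(lambda pair: pair[0].render_eye(pair[1], scene),
                               zip(gpus, eyes)))
    return images  # a real renderer would composite these and submit to the headset

print(render_vr_frame([FakeGpu("GPU0"), FakeGpu("GPU1")], "test scene"))
```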
 

hesido

Member
Standard enthusiast PC case in a few years:
[image: L5tiQUk.jpg]
 

Zojirushi

Member
So if 2018 already brings some sort of "next-gen memory," does that mean we're essentially only gonna get a single generation of HBM2 cards (2017)?

That doesn't sound very cost-effective at all.
 

//DEVIL//

Member
This is basically another version of the 295X2 that AMD released in 2014 (15?).

Two GPUs on one card working as Crossfire...

I used to have that card. I sold it and got a 980 Ti. While in general the performance of the two GPUs, WHEN they work, is much better than the 980 Ti, in most of my cases it was working as a single GPU only.

AMD is very slow and way behind when it comes to updating their Crossfire profiles. I am never gonna buy a Crossfire AMD card ever again.

Also, what's with the stupid price? You can buy two Fury X cards and save like $200 over this.
 

riflen

Member
So if 2018 already brings some sort of "next-gen memory," does that mean we're essentially only gonna get a single generation of HBM2 cards (2017)?

That doesn't sound very cost-effective at all.

It could be because HBM2 seems to use a lot of juice as you increase the bandwidth.

[image: 25683986592_f2ec93bb38_o.jpg]
 

knitoe

Member
This is basically another version of the 295X2 that AMD released in 2014 (15?).

Two GPUs on one card working as Crossfire...

I used to have that card. I sold it and got a 980 Ti. While in general the performance of the two GPUs, WHEN they work, is much better than the 980 Ti, in most of my cases it was working as a single GPU only.

AMD is very slow and way behind when it comes to updating their Crossfire profiles. I am never gonna buy a Crossfire AMD card ever again.

Also, what's with the stupid price? You can buy two Fury X cards and save like $200 over this.
It's mostly for people who want CF but only have one slot free (1 slot = 2 GPUs) or who want a quad-GPU setup from 2 slots. Very few people, so very limited quantities. So you don't have any choice but to pay the extra cost.
 