
Head count: Do you own an AMD or NVIDIA GPU?

What brand of discrete GPU do you use in your primary gaming computer?



Octavia

Unconfirmed Member
My 6950 was a good card. Everything worked great and I never had any issues.

Now I have a 670 and it's also a great, reliable card where everything works. I mostly switched to Nvidia for PhysX in Borderlands, and there was a good deal at the time. My 6950 lives on as a hand-me-down; I passed it to a friend when I built their computer.
 
I have the GTX 960. It's neck and neck with its competitor, the AMD 285, but I went with it since Nvidia seems to get special treatment in most AAA games.
 

dr_rus

Member
The X1900 series was pretty awesome too; ATI implemented a driver hack which forced AA in Oblivion. The Nvidia equivalent at the time had no such function. I pretty much bought the X1900XTX just to play Oblivion back in the day.

Come to think of it the only 2 ATI/AMD cards I've ever owned are the 9800 Pro during the Half-Life 2 era and the X1900XTX.

X1900XTX was a great card, yeah. The time between its release and GF8800 was one of these periods where I used Radeon almost exclusively.

He's talking about a final build of the game, not the source code, and about how they couldn't optimize their drivers because they didn't have that build until almost the release date. I'm sure that CDPR sent final builds to AMD as well as to a lot of other h/w vendors. TW3 went gold on April 16th, which means that the whole industry had a month to tweak their stuff for a final build of the game. The first patch was the day 1 patch AFAIK, and it didn't break anything on any GPU.

The talk here is about stability mostly with some "small" TressFX improvements (this is a copy of the patch notes basically - see below). Thing is that the biggest performance improvement for TR's TressFX option on GeForce came with a new driver from NV and not from the patch made by CD.

TressFX (and Hairworks) is a rather complex piece of software which a game developer is unlikely to want, or even be able, to optimize on their own. This is why we have them as separate "plugins" from IHVs in the first place.

Crystal Dynamics and Nvidia were both able to look at the TressFX code, Nvidia made changes on their driver side and CD made changes on the game side so that TressFX could work on the Nvidia cards. 30 seconds and Google would have shown you this. I thought this was common knowledge, so I didn't bother with a source originally.

Well, since you're talking about Google...

1. TR was released on March 5th, 2013

2. On March 15th a GeForce driver was released which provided up to 60% (!) performance improvements in the game. I myself remember that the game was unplayable for me on a GTX680 until the new driver came, which made it very smooth. I don't think that they could've optimized the TressFX source code for GeForces, tested it on Radeons with AMD - because that's how such changes are usually implemented - and published the update in 10 days. It is far more likely that almost all TressFX optimizations were done by NV in the new driver - this limits the amount of work significantly and makes it possible to produce such optimizations in two weeks.

3. There were two patches released by CD between 5th and 15th of March. The first one said only this about TressFX: "- Some small improvements to TressFX hair rendering." Note that it doesn't say anything about performance at all. The second one said this:

- Cost of TressFX reduced, especially in combination with SSAA.
- TressFX memory usage reduced for AMD Eyefinity and NVIDIA Surround setups.
- TressFX simulation and graphical quality improvements.

And also this:

We’ve been working closely with NVIDIA to address the issues experienced by some Tomb Raider players. In conjunction with this patch, NVIDIA will be releasing updated drivers that help to improve stability and performance of Tomb Raider on NVIDIA GeForce GPUs.

Nowhere does it say anything about them working on TressFX with Nvidia. From the notes it is clear that the cost of TressFX went down for everyone, not for GF users only. Later patches don't mention TressFX performance anywhere. The biggest performance increase came with the new driver from NV which was probably using special optimization paths for TR's (and TressFX) shaders.
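The kind of driver-level "special optimization path" described here can be sketched in miniature: the driver fingerprints incoming shader bytecode and, on a match, transparently substitutes a hand-tuned replacement before compiling. This is a hypothetical illustration only (the fingerprinting scheme, function names, and byte strings are all made up, not how any real driver is implemented):

```python
import hashlib

# Hand-optimized replacements keyed by fingerprint of the original shader.
OPTIMIZED_SHADERS = {}

def fingerprint(source: bytes) -> str:
    """Identify a shader by a hash of its bytecode."""
    return hashlib.sha1(source).hexdigest()

def register_replacement(original: bytes, optimized: bytes) -> None:
    """Ship a tuned variant for a known game shader with a driver update."""
    OPTIMIZED_SHADERS[fingerprint(original)] = optimized

def compile_shader(source: bytes) -> bytes:
    """Driver front-end: swap in a known-faster variant if one exists,
    otherwise compile the game's shader unchanged."""
    return OPTIMIZED_SHADERS.get(fingerprint(source), source)

# A game ships a slow hair-simulation shader; a driver update registers
# a tuned variant for it. Both strings are stand-ins for real bytecode.
slow = b"for i in 0..64: sample()"
fast = b"for i in 0..64 step 4: sample4()"
register_replacement(slow, fast)
```

The game never knows the substitution happened, which is why such gains can ship in a driver without any game patch.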

To my knowledge, this type of assistance isn't available with Hairworks.

For instance...
http://www.overclock3d.net/articles...nvidia_hairworks_unoptimizable_for_amd_gpus/1



And this quote from CD Projekt's Marcin Momot...


So unlike Tomb Raider and TressFX, which Crystal Dynamics was able to optimize to work better with Nvidia cards (and even Intel iGPUs), CD Projekt and AMD cannot do the same with Hairworks.

We're getting pretty far off topic at this point. If you'd like to discuss this further or if you have access to any other information on the topic please PM.

What NV did with their driver optimization for TR certainly can be done for TW3 by AMD's driver team - if the performance is really lowered by sub-optimal shaders and not by the h/w basics themselves like slower tessellation or some other feature which works faster on Maxwell. No one can stop AMD from doing this. NV is doing this with every Gaming Evolved title released basically.

This is one of the reasons why I tend to stick to NV GPUs lately - I don't care why a game is performing badly on my h/w; I paid an IHV to provide me with the best possible experience, and it's not my problem if some other company is somehow blocking them from providing this experience. There is always a way to fix stuff, which is what NV is doing more often than not. As a result, all GE titles work fine on GeForces while some TWIMTBP titles can work like crap on Radeons. This is happening not because AMD is providing some things as "open" but because NV is putting a hell of an effort into driver-level shader optimization.

It is quite interesting actually how things will unfold with DX12 as this "thin" API should limit IHV's options in driver level optimizations theoretically. It may actually lead to more GE games performing badly on NV hardware.
 

I'm not certain you read the same things I did... I'll condense. These are snips directly from the quotes in my previous comment.

Nvidia regarding Tomb Raider and TressFX: (He specifically states Tomb Raider at "maximum settings". The only thing that changes between the second-highest and max setting is the activation of TressFX. I'll add the full quote to the bottom of the comment so you can read it.)

Please be advised that these issues cannot be completely resolved by an NVIDIA driver. The developer will need to make code changes on their end to fix the issues on GeForce GPUs as well.

CD Projekt regarding TW3 and Hairworks:

unsatisfactory performance may be experienced as the code of this feature cannot be optimized for AMD products.


With Tomb Raider, we have Nvidia saying that drivers couldn't fix the issue and that Crystal Dynamics needed to make changes to the code so that the game would work correctly on GeForce. Then we have CD Projekt saying that they can't optimise the code for their game so that it will run well on AMD cards. Then you respond saying that AMD should just do what Nvidia did and fix it with their drivers - which Nvidia wasn't able to do either; they needed the dev to optimise the code to work with their hardware, which AMD can't do because CD Projekt aren't allowed to mess with the Hairworks code...


We are aware of performance and stability issues with GeForce GPUs running Tomb Raider with maximum settings. Unfortunately, NVIDIA didn’t receive final game code until this past weekend which substantially decreased stability, image quality and performance over a build we were previously provided. We are working closely with Crystal Dynamics to address and resolve all game issues as quickly as possible.

Please be advised that these issues cannot be completely resolved by an NVIDIA driver. The developer will need to make code changes on their end to fix the issues on GeForce GPUs as well. As a result, we recommend you do not play Tomb Raider until all of the above issues have been resolved.

In the meantime, we would like to apologize to GeForce users that are not able to have a great experience playing Tomb Raider, as they have come to expect with all of their favorite PC games.
 

elfinke

Member
If this were true, the AMD numbers would be lower. I understand outliers exist, but most of the time I can't help but think you'd need some weird bias to stick with AMD right now.

This assumes regular upgrading. There are a number of people on this very page who are still rocking 6950s - or other cards of that era - for example. That's because at the time it was one of the two far-and-away best bang-for-buck cards on the market (along with the 560 Ti). In fact I'd argue its relative bang-for-buck performance makes a mockery of today's market (at least here in Australia), where no card exists in the $300-$400 region (well, occasionally a 290 appears in there on sale). Noise aside, my unlocked 6950 is still a fabulous card, though it is absolutely nearing the end of its effective service life now.

You either buy a 280X for ~$300, or you spend $450 on a 970. That fucking sucks if you don't have - or want to spend - $450 for a GPU, as good as the 970 may be (of course, I think the 970 is at least $50 overpriced, probably closer to $100, but whatever, they've sold fucking truckloads of them /shrug). I'm really hoping the launch of the 3xx series will shake things up, cos I'd like to grab some new gear :D
 

To be fair to small markets, pricing is almost always higher there than in North America and most of Europe. Australia gets screwed on console and game pricing too. In large markets, the pricing of the 970 is in the USD $300-330 range, which is pretty insane for the performance it delivers, 3.5 GB of VRAM or no.

The Steam Hardware Survey gives you a hint as to just how popular the 970 has been and how crazily it's been selling:
http://store.steampowered.com/hwsurvey/videocard/
GTX 970 is in the top 5 of DX11 cards.

As far as AMD is concerned, they are being slaughtered by Nvidia at the high end: not a single Radeon R9 series card appears in the DX11 top 20, and in fact only 3 Radeons appear there at all, all of which are really old models now. The only reason the percentages aren't actually worse for AMD than they appear is that most people really do hold on to video cards for 3-4 years; the people who upgrade every year are a relatively small niche, and Nvidia almost exclusively controls that niche if the Steam Hardware Survey is anything to go by (which it is).

edit: Come to think of it, the situation is actually even more dire for AMD than I thought. I had forgotten that, starting with the Radeon Rx series, all cards in the same family identify as "Radeon R9 200 series", so the R9 2xx lurking down at #29 on the DX11 list is actually the R9 290X, 290, 285, 280X, 280, etc. all combined. This is a ritual killing; Nvidia basically is the high end of PC gaming at this point, and AMD basically doesn't exist.
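The aggregation effect described in that edit is easy to demonstrate. Here's a minimal sketch with made-up share numbers (illustrative only, not actual Steam Hardware Survey data) showing how several SKUs reporting under one family string collapse into a single survey line while each GeForce SKU keeps its own:

```python
from collections import defaultdict

# Hypothetical per-SKU shares (percent) -- invented for illustration.
reported = {
    "NVIDIA GeForce GTX 970": 3.1,
    "NVIDIA GeForce GTX 760": 2.0,
    "AMD Radeon R9 290X": 0.4,
    "AMD Radeon R9 290": 0.5,
    "AMD Radeon R9 280X": 0.6,
}

# Cards that identify themselves under one family string in the survey.
FAMILY = {
    "AMD Radeon R9 290X": "AMD Radeon R9 200 Series",
    "AMD Radeon R9 290": "AMD Radeon R9 200 Series",
    "AMD Radeon R9 280X": "AMD Radeon R9 200 Series",
}

def collapse(shares):
    """Merge per-SKU shares into the names the survey would display."""
    out = defaultdict(float)
    for card, pct in shares.items():
        out[FAMILY.get(card, card)] += pct
    return dict(out)

survey_view = collapse(reported)
```

With these toy numbers, three separate Radeon SKUs merge into one ~1.5% "R9 200 Series" entry, so the family's combined share sits at a single (higher) position in the list rather than three lower ones.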
 

Xdrive05

Member
GTX 980 from MSI (4G)

But I did own an AMD 7950 boost (also MSI) and it was an incredible card. Only got rid of it because the litecoin boom allowed me to make over $250 above what I had paid for it new just a few months prior.

I had mostly good experience with the AMD drivers at the time. Not having "game ready" drivers on day one was the only driver related knock I had against AMD.
 

Jazz573

Member
Laptop, so neither. Instead I'm stuck with a shitty Intel integrated graphics card. :( I'm planning on building a computer soon though, and I'll be picking NVIDIA.
 
In 2007 I bought a laptop with a GTX 6600M (I think that's what it was... forgetting now).

In 2011, I built a PC with a 560 Ti.

In 2015, I upgraded to a 750 Ti.

So, I'm 3-0 towards Nvidia since 2007. I don't like supporting the bigger of the two companies, but AMD just hasn't offered what I've wanted at the prices I'm looking at.
 

LaffTrack

Neo Member
I've had a 280x, Titan, and now a 980. Looking at the results of the poll my gpu choices align with the percentages shown in the poll. I really hope AMD pulls some market share back from Nvidia when they release the 300 series.
 

pezzie

Member
Currently running AMD R9 280X. It was the best video card in my price range at the time of purchase. I honestly don't care about which brand I have, as my previous card was a GTX 560 Ti.

Honestly speaking, I can't say I've had any problems with drivers with AMD's card so far. Everything's been as rock solid as when I was running on my Nvidia. With my next upgrade, of course again I will be looking at the best card to purchase within my price range, no matter the brand.
 

mobiusXP

Neo Member
Nvidia for me. Just sold my 760 and put in some extra money to pick up a 970, so I could get Batman and The Witcher along with my hardware upgrade.
 

AllenShrz

Member
Been an AMD user for quite a while now, currently using the HD 7970 and waiting for Fiji. My last Nvidia was the 8800 GT; then I switched to Radeon and never went back.

Claims about shitty drivers have always rubbed me the wrong way, since I can't remember having any driver issues myself.
 

dimb

Bjergsen is the greatest midlane in the world
I guess it comes down to what you're paying for. I feel like there is more to buying a card than chasing benchmarks, so when Nvidia dominates both the benchmarks and everything outside them, it's pretty much baffling why anyone would not go with an Nvidia card right now. In America the 970 is only about $350, and many of those include The Witcher 3 and Arkham Knight.

I understand people don't buy a new card every year but it's been like this for a while now...
 

Lace

Member
I have a GTX 770. The fact that some high profile games will run with less issues was one of the primary factors. I know that amd cards can struggle because of partnerships with Nvidia and I have no interest in being on the short end of that stick.
 
I have an AMD 6870, which I feel like I upgraded to just last year, but in truth it was 4 years ago =S

I did upgrade my CPU recently but the AMD card remained and so far nothing I play or own runs poorly so I guess that's fortunate. I see a video card upgrade in my future but I'm sure I will hold off until there's that one game I cannot play in order to do it.
 

Foxix Von

Member
I'd have to double check but I think the first GPU I ever bought was a GeForce FX 5600 Ultra. My laptop has a 460m in it and the PC I bought off craigslist had a 660ti.

The laptop's GPU nuked itself so I picked up a used PC to replace it which... once again had a GPU which promptly nuked itself. I said fuck it, I'm tired of Nvidia's hardware and rebuilt the PC with AMD hardware. Which... had a card that promptly nuked itself but was quickly RMA'd.

Anyway, now that I'm finally getting my AMD rig settled, I'm very satisfied with them, and I'll definitely be sticking with them for the foreseeable future. I ended up going with a really weird mixed-vendor set of 290X cards, as after the RMA Newegg was out of stock of the first set of HIS cards I ordered.

Looked at the specs and found that, for whatever reason, 290Xs in CrossFire trade punches with 980s in SLI at high resolutions. I figured if I picked up any of these cards I'd end up getting a second in a year or two. So why waste the money investing in a 980 or 970 that will end up scaling about as well as the significantly cheaper 290X? Just didn't make sense for me to go with Nvidia.
 

FLAguy954

Junior Member
r9 290.

waiting for the 390.

This is where I stand. It depends on how much the 390 is before I sell my current card for it.

I got my 290 for $250 last year and a 970 can't approach that price/performance yet.

I do admit that I have a bias for AMD's graphics cards, as AMD's 1.5-year-old cards (the 290 and 290X) still largely compete with two of the three top-end Maxwell cards (the 970 and the 980). I also have to give AMD their props for catching up so quickly with what were once Nvidia-exclusive features such as native downsampling, game recording, automatic game optimizers, etc.

The only thing that sucks is that AMD's CPU optimization sucks, so many games don't use my 4.5 GHz 4670K to its full potential.
 

elfinke

Member
...it's pretty much just baffling as to why anyone would not go with an Nvidia card right now. In America the 970 is only about $350, and many of those include The Witcher 3 and Arkham Knight.

But it's not baffling? The 970 is $450+ in Australia, and you get W3 only. If you only want to spend $400 on your GPU, then you're shit out of luck (well, the 290 exists at that point, and if you forget that there are new cards coming in the next few weeks, that's a fine purchase, but most people would say to "save another $50 and grab a 970" or to wait until June; both are fine alternatives).

If you have less than that in your budget for your GPU, say $300-$350, then you're really buggered, as nothing exists in that price bracket (aside from 4GB overclocked 960s and other similar tripe). That's my point. Neither vendor is providing a card in this upper-mid range of GPUs. You either buy a somewhat upper-entry-level 960 or 280/280X (at $250-$300), or you have to spend considerably more to get a 290 or 970 ($400-$450).

There must be room in Nvidia's lineup for something in between the 960 and the 970. Though, I reiterate: the 970 is selling by the truckload, so there is no compulsion to provide that card, unfortunately.

I think the 290 and the 970 should both be occupying that AUD$350-$400 price bracket, if not lower. Certainly the 290 could stand to be nearly $50 less, it'd be a no-brainer purchase at that point, even without packed in game codes (like it had this time last year). Hopefully it'll get there in the next week or three, and pull the 970 down with it!

Meh, it's neither here nor there in the grand scheme of things :). I just think about how much value was packed into the 6950 at $200, or even further back to the Geforce 2 MX at a similar price, and compare that to now and I think we're not getting the best deal.
 

Lork

Member
Things look pretty dire for AMD right now, but I wonder if it's not just a symptom of them losing interest in the PC market in favor of their console business. When you've positioned yourself to make a little bit of money off the back of every single PS4, Xbone and Wii U made, the high end PC market starts to look a little small in comparison.

Edit: The poll seems to be over already, but I have an AMD HD7900.
 
I'm currently on my AMD 6950, the one with the unlockable dual BIOS to a 6970; I've had the card for about 3+ years, fantastic value, though now I'm replacing it with an Nvidia 580 I got for cheap since it's on its last legs, randomly cutting off at times. I don't plan a real upgrade until the VR sets come out, as the games I play all still run well on my 6950 at 1080p and decent settings.

This was after coming off the Nvidia 275 GTX, which itself came off the venerable 8800 GT I got in 2007.

I tend to flip-flop to whoever has the value/performance I can afford at the time.
 

FrunkQ

Neo Member
Not allowed to vote...

But it does not matter as I can't really hit "BOTH".

I have used both pretty evenly in the past. Just goes with what seemed to make most sense at the time from a cost/power perspective and the money at hand.

It is unusual that all 4 of my gaming-capable rigs are fitted with Nvidia just now (970, 780, K5000 & 870M), after retiring a couple of ATI cards. So I'm in the Nvidia camp.

Both manufacturers have astounded me and frustrated me in equal measure. I don't really have a favourite, but the wind is in Nvidia's direction at the moment.
 
Current system...

560 (not the TI model)

Currently I have all the components for my new Node 304 box except for the CPU and the graphics card.

The GPU is going to be the last thing I purchase and I'm 90% sure it's going to be a GTX 970.

I won't be purchasing it until October of this year, so...that might change, but I'm pretty sure whatever I choose is going to be an Nvidia box.

1) Heat...AMD's offerings aren't as good (small form factor system)
2) Noise
3) Drivers

Right now, the only advantage I can see with AMD is the amount of VRAM you can get on a card for less than $400.

AMD... fix the heat and noise please. We don't all want to build a full or mid tower system.
 
Slightly related but just read on AnandTech news that the 2015 Apple MacBook Pro Retina will feature an AMD dGPU (R9 M370X) after the company previously went with a Geforce GT 750M. Apparently, the M370X is much more powerful than the 750M but consumes more power. Either way, this is surprising and is quite a big win for AMD.

So the new MacBook Pros will be better for gaming but may suffer in battery life.

http://www.anandtech.com/show/9276/2015-15inch-retina-macbook-pros-dgpu-r9-m370x-is-cape-verde
 

Stimpack

Member
I hate everything about Nvidia, but they have the performance, the deals, less power consumption, and typically release day drivers for big titles. I love AMD, but I'm just not seeing them as a healthy competitor.
 

Cels

Member
it's pretty simple.

if someone has $200 to spend I'd tell them to go with a 280X.

If someone has $330 to spend I'd tell them to buy a 970.

if someone has $500 to spend they should buy a 980.
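Those three tiers amount to a simple budget-to-card lookup. As a toy sketch (the thresholds and cards are the ones quoted above; the function itself is my own illustration):

```python
def recommend(budget_usd):
    """Map a GPU budget (USD) to the card suggested for that tier,
    per the 2015 US pricing quoted in the thread."""
    if budget_usd >= 500:
        return "GTX 980"
    if budget_usd >= 330:
        return "GTX 970"
    if budget_usd >= 200:
        return "R9 280X"
    return None  # below the tiers discussed in the thread
```

The tiers are exhaustive above $200, so the function always returns a card for any budget in the discussed range.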
 

harpingon

Neo Member
R9 285 in compact form factor for a small PC, it's plenty quick for me. I'll get the rest of this year out of it at least
 