
GeForce GTX 1060 announced - July 19, 6GB, $249 MSRP/$299 Founder's

Caayn

Member
I actually believe they are being honest here. Why would anyone throw $500 at 2x 1060 when you can buy a 1070 for less than $450 with fewer hassles and 2GB more VRAM?

They're also discouraging enthusiasts from running tri- or quad-SLI with big cards because of how bad the scaling is.

This is how the 480 scales in CrossFire across a bunch of games right now:
It only looks useful at high resolutions, and you wouldn't buy a 1060 for that anyway.
To add a second card for cheap without needing to buy an expensive new one: down the road, when your current one just doesn't cut it any more, or you want more performance without spending too much (i.e., the price of a better GPU).

Besides, this is how cherry-picking works:
[RX 480 CrossFire benchmark charts]
 

Durante

Member
I'd categorize not allowing dual-GPU for low-end cards as protecting people from themselves, just like not supporting >2 GPU configurations any more.

The multi-GPU situation is already bad (with many games not scaling at all, some scaling badly, and yet more having horrible frametime consistency in multi-GPU configurations), and it certainly hasn't been on an upward trajectory.
 
To add a second card for cheap without needing to buy an expensive new one: down the road, when your current one just doesn't cut it any more, or you want more performance without spending too much (i.e., the price of a better GPU).

By the time a card isn't enough, there will be far better options for the price than buying a past-gen one. Then there's no need for a bigger PSU either.

SLI/CF has always been about pushing current technology further, not about saving money.

Besides, this is how cherry-picking works:

My bunch of games still can't use multiGPU properly.
 
To add a second card for cheap without needing to buy an expensive new one: down the road, when your current one just doesn't cut it any more, or you want more performance without spending too much (i.e., the price of a better GPU).

Besides, this is how cherry-picking works:

Ehm, dude, he pointed out that a bunch of games don't gain anything from SLI, and he is right.

The ratio of games that either:
-don't support SLI
-scale poorly
-have terrible frame pacing, causing them to stutter so much you're wasting your time trying to SLI/CrossFire

is 10 to 1 compared to games that scale well and don't have frame pacing issues.

The idea that SLI'ing or CrossFiring two low-end cards is even remotely equivalent to one high-end card is ridiculous.

It's objectively a terrible, terrible idea to use two low-end GPUs. The "I'll buy a second GPU later" plan is also a terrible idea. People have made that assumption/mistake for the past 10 years. By the time you'd buy the second GPU you're already VRAM-bottlenecked, or you can get much better single-GPU performance for cheap.

Friends don't let friends SLI low-end GPUs. It has been a stupid idea since the dawn of SLI/CrossFire, regardless of whether Nvidia supports it or not.
 

jrcbandit

Member
Ehm, dude, he pointed out that a bunch of games don't gain anything from SLI, and he is right.

The ratio of games that either:
-don't support SLI
-scale poorly
-have terrible frame pacing, causing them to stutter so much you're wasting your time trying to SLI/CrossFire

is 10 to 1 compared to games that scale well and don't have frame pacing issues.

The idea that SLI'ing or CrossFiring two low-end cards is even remotely equivalent to one high-end card is ridiculous.

It's objectively a terrible, terrible idea to use two low-end GPUs. The "I'll buy a second GPU later" plan is also a terrible idea. People have made that assumption/mistake for the past 10 years. By the time you'd buy the second GPU you're already VRAM-bottlenecked, or you can get much better single-GPU performance for cheap.

Friends don't let friends SLI low-end GPUs. It has been a stupid idea since the dawn of SLI/CrossFire, regardless of whether Nvidia supports it or not.

I agree. It only made sense to SLI either the top-end card, for those who have to have the absolute best performance, or sometimes the second-best card if the price/performance works out. For instance, SLI'ing two 970s at launch significantly outperformed a single 980 for not much more money ($660 vs. $550-600). You don't want a mid-range card that is quite a bit slower than a top-end card for when SLI doesn't work, and when SLI does work well, it only equals or falls short of the top-end card... After the 980 Ti launched, it was incredibly stupid to SLI two 970s versus getting a single overclockable 980 Ti. In the current lineup, I don't really see any good SLI/CrossFire setups; the 1070 is typically $430-450 or so and not exceptionally overclockable, so SLI doesn't seem like a smart idea versus getting a single 1080 for $250-300 less.

In any case, SLI/CrossFire is a big HELL NO for me ever again. When the 970 came out, SLI seemed fantastic, especially compared to a single 980, because almost every game I tried supported it well with decent scaling. But less than a year into ownership I quickly became disillusioned, since newly launched games stopped shipping with SLI support or had horrible to non-existent scaling. If you were lucky, a patch would be released 2-4 weeks after launch to finally add SLI support or improve scaling, but many games never got proper support.
 
Lol what am I even reading.

Nvidia are only removing SLI for our benefit it turns out. Thanks billion-dollar corporation.

Sneaky, you need to take some of your own advice about brand loyalty. Removing features is all about making more money for them. They don't need defending all the time.
 

Smidget

Member
Is 6GB going to be a limiting factor with 1440p? I guess when the benches come out and compare 480 vs 1060 at 1440p I will have the answer.
 
Lol what am I even reading.

Nvidia are only removing SLI for our benefit it turns out. Thanks billion-dollar corporation.

Sneaky, you need to take some of your own advice about brand loyalty. Removing features is all about making more money for them. They don't need defending all the time.

I've been saying SLI'ing low-end GPUs is a bad idea for years. Nvidia not supporting it anymore does not suddenly make it a good idea.

That's the difference between a brand-warrior fanboy like you and a normal user like me: I don't change my views based on what a corporation is doing.



The ratio of games that either:
-don't support SLI
-scale poorly
-have terrible frame pacing, causing them to stutter so much you're wasting your time trying to SLI/CrossFire

is 10 to 1 compared to games that scale well and don't have frame pacing issues.

This is not an opinion.

But your refusal to acknowledge that speaks volumes about your attitude: you're interested in console wars only.
Your opinions don't come from a place of understanding, but from 'politics' and corporate banner-waving.

Go crossfire rx 480s mate, that'll really show me!
 

Ashhong

Member
Friends don't let friends get the 3 GB version. There are a number of existing games that bump up against the 2 GB wall at 1080p with low-medium/medium settings. If you're buying a new card, get one with a safe amount of RAM.

Got it. Then I will decide between the 8gb 480 and the 6gb 1060. Will probably come down to price since I'm not that hardcore
 
I wouldn't get the 3GB version. Games like GTA V still eat up VRAM at 1080p. The 3GB version would probably be adequate if you were just playing games like Minecraft, Dota, and LoL.
 

spyshagg

Should not be allowed to breed
I'd categorize not allowing dual-GPU for low-end cards as protecting people from themselves, just like not supporting >2 GPU configurations any more.

The multi-GPU situation is already bad (with many games not scaling at all, some scaling badly, and yet more having horrible frametime consistency in multi-GPU configurations), and it certainly hasn't been on an upward trajectory.

Ahahah.

Okay!
 

thelastword

Banned
Is 6GB going to be a limiting factor with 1440p? I guess when the benches come out and compare 480 vs 1060 at 1440p I will have the answer.
I think it will, sooner rather than later; there are already games launched since 2014 that use lots of VRAM. Imagine playing Shadow of Mordor with ultra textures and some nice AA; it's going to eat heavily into that VRAM. Games like Arkham Knight also use lots of VRAM if you want to max them at higher resolutions; the more VRAM you have in that game, the less stuttering you will have and the more fluid the game is by default.

Don't forget that Nvidia usually falls below the competition in sustaining good framerates at higher resolutions like 1440p and 4K... with less RAM, that equation gets a bit worse. I think any enthusiast/gamer should buy 8GB minimum at this point for future games; even the RX 470 has an 8GB version come the 29th, so I'm not envying the 3/6GB approach of the 1060 here... Come next year, or as early as October 2016, we will be seeing cards with 16GB of HBM2, so 3GB or even 6GB will look archaic by then.
 
I know right. SneakyStephan takes this shit way too seriously.

The thing is I've owned an Nvidia 750 Ti, 960, 980 and I've got a passive GT 710 on the way. I'm completely open to getting a 1060/1070 too when the dust has settled and prices are right, hence why I'm posting in here.
 
I think it will, sooner rather than later; there are already games launched since 2014 that use lots of VRAM. Imagine playing Shadow of Mordor with ultra textures and some nice AA; it's going to eat heavily into that VRAM. Games like Arkham Knight also use lots of VRAM if you want to max them at higher resolutions; the more VRAM you have in that game, the less stuttering you will have and the more fluid the game is by default.

Don't forget that Nvidia usually falls below the competition in sustaining good framerates at higher resolutions like 1440p and 4K... with less RAM, that equation gets a bit worse. I think any enthusiast/gamer should buy 8GB minimum at this point for future games; even the RX 470 has an 8GB version come the 29th, so I'm not envying the 3/6GB approach of the 1060 here... Come next year, or as early as October 2016, we will be seeing cards with 16GB of HBM2, so 3GB or even 6GB will look archaic by then.

Yeah, but remember these are 1080p cards, and while a 3GB version is stupid and shouldn't exist, 6GB is right about what these cards will ever need considering their power. It could be that in a game or two 6GB will not be enough for maxed settings, but if one is obsessed with maxed settings, this is not the card for that anyway. Overall I think the 6GB 1060 will do just fine, and by the time games require more VRAM at 1080p, the card will be lacking in more than just VRAM.
 
To add a second card for cheap without needing to buy an expensive new one: down the road, when your current one just doesn't cut it any more, or you want more performance without spending too much (i.e., the price of a better GPU).

Besides, this is how cherry-picking works:

By the time a card isn't enough, there will be far better options for the price than buying a past-gen one. Then there's no need for a bigger PSU either.

SLI/CF has always been about pushing current technology further, not about saving money.



My bunch of games still can't use multiGPU properly.

Agreed. Honestly, IMO the best way to go about tech is to treat it like a rental, and simply trade/sell off when you need to move to a new single-card solution.

I have a 970; it's not quite enough for me to play at 1440p. I could have bought another used 970 for $150-200 and run them in SLI. But instead I sold my 970 for $280 on Amazon (crazy, I know) and bought a 1070 for less than the cost of a second card, with more performance than 970 SLI, not to mention avoiding the grief that comes with a multi-GPU setup. If you upgrade often enough, your parts retain much better resale value and let you upgrade for respectable amounts. It's what I did with my X99/5820K build too: I sold my previous parts (4670K/Z97/16GB DDR3/case) for ~$500, and the difference was about $130 with the right promotion and a bundle (Newegg price-matched the Microcenter 5820K at $300 by giving me promotional credit). You just have to stay on top of it.
 

thelastword

Banned
Yeah, but remember these are 1080p cards, and while a 3GB version is stupid and shouldn't exist, 6GB is right about what these cards will ever need considering their power. It could be that in a game or two 6GB will not be enough for maxed settings, but if one is obsessed with maxed settings, this is not the card for that anyway. Overall I think the 6GB 1060 will do just fine, and by the time games require more VRAM at 1080p, the card will be lacking in more than just VRAM.
There are many games that use lots of VRAM on medium and high settings. I imagine if you want to game at 1440p, not necessarily at max settings, an 8GB card with performance metrics similar to the 1060's is a better bet.

They may be 1080p cards at ultra settings, but that means they can be 1440p cards with some sliders turned down to reach 60fps. I may decide to max out textures and AA on these cards and leave all the other settings at medium/high to get 60fps at 1440p. I for one have a 1440p monitor and I'm in the market for a new card; I believe a little more VRAM will go a long way for future titles, which will push that limit on top of DX12 gains and efficiencies.

I agree that 3GB has no place in this conversation, but as the year rolls on and more DX12 titles emerge, you will see more titles using more VRAM, and even 6GB will become a questionable purchase in a few months. 8GB is just a safer bet for this year and next, until 16GB cards start becoming commonplace.
 
Hmmm it appears the 1060 is better than a 480 in some areas, but also worse in others.

And definitely not better than a 980

http://videocardz.com/62086/nvidia-geforce-gtx-1060-rumors-part-5-full-specs-2-0-ghz-overclocking

I have my doubts as well. I believe it will be really close to the 980 and maybe match it or beat it eventually depending on how Nvidia handles driver optimizations. Same goes for the 480. I don't doubt the 480 will eventually surpass the 980 and 390X in games.

Also important to note here.

PCGAMER said:
Something else we need to point out is our use of reference model cards, as much as possible. AMD's new RX 480 along with their R9 Fury X and Nano are reference models sporting 'stock' clocks, while many retail cards may come with slight overclocks and different cooling; the R9 Fury (Asus Strix), R9 380X (Sapphire), and R9 380 (Sapphire) are all slightly overclocked. On the Nvidia side, the GTX 1080 and GTX 1070 are 'Founders Edition' cards, which is basically the same thing as a reference model. The GTX Titan X, 980 Ti, and 980 are also reference cards, while the GTX 970 (Zotac), GTX 960 (EVGA), and GTX 950 (Asus) are custom models that are slightly overclocked.

So the most important part here is that the 970 and Fury are slightly overclocked AIB cards, and even though it's omitted, I bet the 390 is as well. The Fury X, Nano, and 980 are stock cards.


Looking at this chart averaging performance across 15 games, the 480 looks like it will do well once drivers get a little more work. The 480 has no reason to be behind the 390 in any game; even if it's only by a few frames, it should be beating it. That's why I think the 390 is an overclocked card here. I think the 1060 will be a bit better than the 480 on average, but these two cards will be directly comparable. Consumers will have to decide between 6GB, Nvidia power efficiency, and features, or 8GB and a lower price.

So far the Zotac 1060 Mini looks really compelling. That would work great in my HTPC. I have an older i5 2400, so the reduced driver overhead and reduced power would be great in that setup.
 
I play essentially every high-end game that comes out and have never found myself limited by VRAM on my 980 Ti. I can't imagine a card with less shader perf being bottlenecked by the same amount of memory. My understanding is that "Game X uses Y (very large) amount of VRAM" is often just the game taking advantage of however much VRAM is present for caching, above and beyond what it strictly needs to run at those settings and resolutions.

Considering how many cards are in the wild using 2GB, 3GB and 4GB, and the fact that the new console upgrades aren't increasing the RAM pool (definitely not for Neo, probably not for Scorpio either), I can't imagine that more than a handful of games in the next few years will take full advantage of an 8GB or greater pool. Those games that do will still probably look great bumping textures down one setting.
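A quick back-of-envelope sketch supports the caching point: the fixed full-screen buffers a game strictly needs are tiny next to the multi-gigabyte VRAM totals people quote, so much of the rest is an opportunistic texture cache. The buffer count below is an illustrative assumption, not a measurement of any real engine.

```python
# Back-of-envelope: VRAM used by the fixed full-screen buffers at each resolution.
# An uncompressed 32-bit RGBA target is width * height * 4 bytes; everything
# beyond buffers like these is mostly textures, which engines cache opportunistically.

def render_target_mib(width, height, bytes_per_pixel=4):
    """Size of one uncompressed render target in MiB."""
    return width * height * bytes_per_pixel / (1024 ** 2)

# Assume ~5 full-screen buffers (back buffer, depth, a few G-buffer targets).
BUFFERS = 5

for name, (w, h) in {"1080p": (1920, 1080),
                     "1440p": (2560, 1440),
                     "4K": (3840, 2160)}.items():
    print(f"{name}: ~{BUFFERS * render_target_mib(w, h):.0f} MiB in render targets")
```

Even at 4K this lands well under 200 MiB, so a game "using" 6GB is mostly choosing to fill spare VRAM rather than strictly needing it.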
 
Large VRAM pools could be useless without a proper ratio of ROPs and TMUs.

6GB is more than enough for this range of cards; 8GB is just overkill. But you know, the price difference is minimal, so there is no reason to go lower.
 
There are many games that use lots of VRAM on medium and high settings. I imagine if you want to game at 1440p, not necessarily at max settings, an 8GB card with performance metrics similar to the 1060's is a better bet.

They may be 1080p cards at ultra settings, but that means they can be 1440p cards with some sliders turned down to reach 60fps. I may decide to max out textures and AA on these cards and leave all the other settings at medium/high to get 60fps at 1440p. I for one have a 1440p monitor and I'm in the market for a new card; I believe a little more VRAM will go a long way for future titles, which will push that limit on top of DX12 gains and efficiencies.

I agree that 3GB has no place in this conversation, but as the year rolls on and more DX12 titles emerge, you will see more titles using more VRAM, and even 6GB will become a questionable purchase in a few months. 8GB is just a safer bet for this year and next, until 16GB cards start becoming commonplace.

I don't believe those 2GB will make much of a difference, if any. First we have to wait for the 1060 reviews to properly compare it to the RX 480. If performance is similar, then the next factor for me is overclockability; let's say ~20% more performance gained for free is more important to me than 2GB of VRAM. But again, we know nothing yet to compare them, so any discussion on this is pretty much useless for now.

Also, I think you are overestimating a bit what VRAM requirements will be in the next few years. We may very well have 16GB cards soon enough, but remember, games are still being made with current-gen consoles as the common denominator, and that won't change for a while even with the new consoles. 4K ultra will require a lot of VRAM, sure; 1080p? 6GB will be fine for the life of these cards, and 2GB more won't make a difference in 95% of games. Again, I don't think the 1060 is a good 1440p card, and neither is the 480, for the simple reason that you don't buy these only to play old and present games. That level of performance is not enough for 1440p right now, let alone with more demanding games in the future.
 

120v

Member
I play essentially every high-end game that comes out and have never found myself limited by VRAM on my 980 Ti. I can't imagine a card with less shader perf being bottlenecked by the same amount of memory. My understanding is that "Game X uses Y (very large) amount of VRAM" is often just the game taking advantage of however much VRAM is present for caching, above and beyond what it strictly needs to run at those settings and resolutions.

Considering how many cards are in the wild using 2GB, 3GB and 4GB, and the fact that the new console upgrades aren't increasing the RAM pool (definitely not for Neo, probably not for Scorpio either), I can't imagine that more than a handful of games in the next few years will take full advantage of an 8GB or greater pool. Those games that do will still probably look great bumping textures down one setting.

yup... always told myself once i get to 6GB i'm just not going to worry about it. just upgraded to a 980 ti, will probably ride it out for years
 
yup... always told myself once i get to 6GB i'm just not going to worry about it. just upgraded to a 980 ti, will probably ride it out for years

Coming off 2GB, I said 6GB would be the minimum for my next card. I've had VRAM issues at the end of my last few cards' lives: a GTX 460 768MB and a 7870 2GB. I wish I had got the 1GB 460 and the 3GB 7950 instead. I'm still leaning heavily towards the 8GB 1070 atm. If 480 AIBs hit a decent price soon, I might consider its lower price and only keep it a couple of years.
 

Josman

Member
Been going back and forth between getting a 480 or a 1060. Those videocardz benchmarks put both at very similar performance; there would have to be a bigger delta for me to consider the 1060. $50 is a big difference in this range of cards.
 
Coming off 2GB, I said 6GB would be the minimum for my next card. I've had VRAM issues at the end of my last few cards' lives: a GTX 460 768MB and a 7870 2GB. I wish I had got the 1GB 460 and the 3GB 7950 instead. I'm still leaning heavily towards the 8GB 1070 atm. If 480 AIBs hit a decent price soon, I might consider its lower price and only keep it a couple of years.

This is kinda where I'm at in my thinking for my primary rig. The 1070 is looking mighty nice once they're actually available at or below MSRP (this paper launch BS is increasingly frustrating). I'm only half joking when I say by the time I can find the AIB model I want at MSRP, Vega will be shipping.

Meanwhile, I'm split on what GPU I'm going to put in my secondary HTPC. It's hooked up to a 1080p 65" Vizio. I don't plan to upgrade my main TV to 4K for at least another 2-3 years (I figure why bother until HDR, adaptive sync, etc. are standardized in mid-range sets). So it's down to the 480 and 1060 AIB models to hold me over until my next big TV upgrade (leaning OLED but wary of burn-in).

Once both are plentiful and available at or below MSRP, I'll pull the trigger. Right now I'm most interested in the Sapphire Nitro 480 and the Zotac Amp! Edition 1060. I'll wait for official benchies on the 1060, but the latest leaks today seem to indicate it performs slower than the 480, which is contrary to what most in here have been expecting.

That said, perf/watt efficiency and gaming temps/noise are more important for my HTPC than raw performance. Assuming the 480 and 1060 are +/- ~10% from each other, I care more about real world peak/sustained gaming TDP and case temps.

Which is why, to my preferences for this application, the 480 reference cards are a bit of a disappointment.

The silver lining is that there are many reports of 480s undervolting while holding stock 1266MHz speeds, or even overclocking, while dropping power draw down into the ~125W range (which is where I expected the 480 to be by default). Although I don't want to play the silicon lottery with reference PCBs/blowers, my hope is that a good AIB card like the Nitro has a better chance of achieving that nice undervolt/stock-clock state = cool and quiet.
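Those undervolting reports are plausible on paper: dynamic power scales roughly with C·V²·f, so at a fixed clock it falls with the square of the voltage. A toy sketch; the 165W baseline and the voltages are illustrative assumptions, not measured values for any particular card.

```python
# Why undervolting at stock clocks drops power: dynamic power scales roughly
# with C * V^2 * f, so at a fixed clock (f constant) it falls with the square
# of the voltage. Baseline wattage and voltages below are illustrative.

def undervolted_power(base_watts, v_stock, v_new):
    """Estimated dynamic power after an undervolt at the same clock."""
    return base_watts * (v_new / v_stock) ** 2

print(f"{undervolted_power(165, 1.15, 1.00):.0f} W")  # ~125 W
```

A ~13% voltage drop giving a ~25% power drop is exactly the kind of result those reports describe.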

If not, I'll go with a 1060, unless its performance is a disaster.
 

thelastword

Banned
I play essentially every high-end game that comes out and have never found myself limited by VRAM on my 980 Ti. I can't imagine a card with less shader perf being bottlenecked by the same amount of memory. My understanding is that "Game X uses Y (very large) amount of VRAM" is often just the game taking advantage of however much VRAM is present for caching, above and beyond what it strictly needs to run at those settings and resolutions.

Considering how many cards are in the wild using 2GB, 3GB and 4GB, and the fact that the new console upgrades aren't increasing the RAM pool (definitely not for Neo, probably not for Scorpio either), I can't imagine that more than a handful of games in the next few years will take full advantage of an 8GB or greater pool. Those games that do will still probably look great bumping textures down one setting.
There is a possibility that the new consoles will come with more VRAM and push usage a bit further. Scorpio is rumoured to have 12GB atm, and since consoles are leading the charge for most games now, if you subtract a bit of that for the OS, you have about 8-10GB available to devs.

I've seen so many leaks where benches lean towards the RX 480. I'm especially looking forward to DX12 game comparisons and 1440p tests.
 

rrs

Member
Saw it in the 480 thread, along with claims that both cards were OC'd. It's going to be interesting once game scores are out.
I've seen so many leaks where benches lean towards the RX 480. I'm especially looking forward to DX12 game comparisons and 1440p tests.
I honestly think DX12 is on equal footing for both teams based on the benches. As for 1440p, I think the 1060's narrower bus and slower memory transfers will begin to show at such resolutions.
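The bus-width point is easy to quantify: peak memory bandwidth is bus width (in bytes) times effective data rate. A rough sketch using the commonly cited specs for these two cards; treat the figures as assumptions until reviews confirm them.

```python
# Peak memory bandwidth = (bus width in bits / 8) * effective data rate in Gbps,
# which yields GB/s. Specs below are the commonly cited figures for these cards.

def peak_bandwidth_gbs(bus_width_bits, data_rate_gbps):
    """Theoretical peak memory bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

print(peak_bandwidth_gbs(192, 8))  # GTX 1060: 192-bit @ 8 Gbps -> 192.0 GB/s
print(peak_bandwidth_gbs(256, 8))  # RX 480 8GB: 256-bit @ 8 Gbps -> 256.0 GB/s
```

That ~33% bandwidth gap is the sort of thing that matters more as resolution climbs.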
 

Finaika

Member
I play essentially every high-end game that comes out and have never found myself limited by VRAM on my 980 Ti. I can't imagine a card with less shader perf being bottlenecked by the same amount of memory. My understanding is that "Game X uses Y (very large) amount of VRAM" is often just the game taking advantage of however much VRAM is present for caching, above and beyond what it strictly needs to run at those settings and resolutions.

Considering how many cards are in the wild using 2GB, 3GB and 4GB, and the fact that the new console upgrades aren't increasing the RAM pool (definitely not for Neo, probably not for Scorpio either), I can't imagine that more than a handful of games in the next few years will take full advantage of an 8GB or greater pool. Those games that do will still probably look great bumping textures down one setting.

My 980 Ti stutters when selecting Very High Textures in Rise of the Tomb Raider.
 

Type_Raver

Member
While there might be some sources pointing towards the RX 480 being slightly faster (see the benchmark linked above), it all comes down to the games and to Nvidia's performance and compatibility wins.
And with $10 between the two models (the 8GB RX 480 at $240 vs. Nvidia's GTX 1060 at $250), that's what it's all about. Granted, CrossFire can offer considerable performance increases, but considerable headaches can come along with it.
 

Durante

Member
Ahahah.

Okay!
What's your argument?

For anyone looking to actually play a variety of games well (rather than benchmark their system) multiple low-end (and even mid-end) GPUs have always been a terrible choice, and are even more so now. This is just a fact -- it was a fact (and one I pointed out continuously) even when Nvidia did offer this option.
 

lilltias

Member
What's your argument?

For anyone looking to actually play a variety of games well (rather than benchmark their system) multiple low-end (and even mid-end) GPUs have always been a terrible choice, and are even more so now. This is just a fact -- it was a fact (and one I pointed out continuously) even when Nvidia did offer this option.


I bought an extra 6970 down the line for around $20 and it helped me game through a lot of titles and postpone my next GPU upgrade. I couldn't use it for all games (it worked great for DS3!), but I am happy AMD didn't "protect" me from that choice, LMAO.
 

Durante

Member
I bought an extra 6970 down the line for around $20 and it helped me game through a lot of titles and postpone my next GPU upgrade. I couldn't use it for all games (it worked great for DS3!), but I am happy AMD didn't "protect" me from that choice, LMAO.
Obviously "protect people from themselves" was colorful language, and I don't doubt that there are corner cases like yours where it makes sense if you can get a GPU for almost nothing - though that in itself seems like an exceptional circumstance.

However, the fact is that the impact of low-end multi-GPU on frametime consistency is terrible -- both across games and within single games. And, as pointed out previously, support (from both HW manufacturers) has actually been getting worse over recent years, not better.

The position I'm coming from is multiple threads and hundreds of posts in PC performance threads of various games here on GAF of people unhappy (or, in many cases, furious) that their dual GPU configuration isn't working like it should. And most of the time I can really only tell them they shouldn't have gone multi-GPU in the first place.
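Durante's frametime point can be made concrete with toy numbers (illustrative only, not measurements from any game): badly paced alternate-frame rendering can double the average FPS while the long frames you actually perceive barely improve.

```python
# Toy numbers showing why average FPS hides microstutter: badly paced
# alternate-frame rendering doubles the average while the worst frames you
# actually feel barely improve. Frame times are illustrative, not measured.
import statistics

single = [25.0] * 100    # one GPU: steady 25 ms/frame -> 40 fps
dual = [5.0, 20.0] * 50  # two GPUs: alternating short/long frames

def avg_fps(frametimes_ms):
    return 1000 / statistics.mean(frametimes_ms)

def worst_frame_ms(frametimes_ms):
    return max(frametimes_ms)

print(avg_fps(single), worst_frame_ms(single))  # 40.0 fps, 25.0 ms
print(avg_fps(dual), worst_frame_ms(dual))      # 80.0 fps, but still 20.0 ms spikes
```

The benchmark bar says the dual setup is twice as fast; the 20 ms hitches every other frame mean it feels closer to 50 fps.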
 

lilltias

Member
The position I'm coming from is multiple threads and hundreds of posts in PC performance threads of various games here on GAF of people unhappy (or, in many cases, furious) that their dual GPU configuration isn't working like it should. And most of the time I can really only tell them they shouldn't have gone multi-GPU in the first place.

Yeah, the only way I can see the value of multi-GPU is getting the extra one down the line for cheap. It is a very nice option to have.
 

Widge

Member
While there might be some sources pointing towards the RX 480 being slightly faster (see the benchmark linked above), it all comes down to the games and to Nvidia's performance and compatibility wins.

I'll have to dig it out, but there was an interesting thread on here about how, over time, Nvidia's drivers get worse while AMD's actually get better.

Now, I'm not sure how the thread finished in the end, but I may go hunting for an interesting read.
 

Jimrpg

Member
To add a second card for cheap without needing to buy an expensive new one: down the road, when your current one just doesn't cut it any more, or you want more performance without spending too much (i.e., the price of a better GPU).

Besides, this is how cherry-picking works:

Those RX 480 fps numbers are all a little disappointing, to be honest. This was supposed to be a 1080p/60 card for $250.
 

Type_Raver

Member
I'll have to dig it out, but there was an interesting thread on here about how, over time, Nvidia's drivers get worse while AMD's actually get better.

Now, I'm not sure how the thread finished in the end, but I may go hunting for an interesting read.

Interesting. I'd be keen to read about what caused the performance to deteriorate, as I'm sure it wouldn't be deliberately malicious.

My statement comes from experiencing stuttering, and there were more games I played that required intensive driver updates to bring them up to par. My last ATI graphics card was a 6970.
 
Interesting. I'd be keen to read about what caused the performance to deteriorate, as I'm sure it wouldn't be deliberately malicious.

My statement comes from experiencing stuttering, and there were more games I played that required intensive driver updates to bring them up to par. My last ATI graphics card was a 6970.

Performance of supported games does not deteriorate over time; we've had this discussion on GAF many times. Usually, games that had proper driver support stay at that performance level, if not improve over time. It's more that Nvidia just stops supporting their cards sooner, meaning comparatively older AMD cards perform better for longer.
 

dr_rus

Member
The same report said that the GTX 1060 Founders Edition will only be sold by Nvidia via its German, French, and UK websites and will cost €319.
http://fudzilla.com/news/graphics/41086-nvidia-gtx-1060-be-priced-at-234-in-europe

Performance of supported games does not deteriorate over time; we've had this discussion on GAF many times. Usually, games that had proper driver support stay at that performance level, if not improve over time. It's more that Nvidia just stops supporting their cards sooner, meaning comparatively older AMD cards perform better for longer.
That has nothing to do with how NV supports their cards.

People tend to put too much stock in driver-level optimisation.
 

Widge

Member
Performance of supported games does not deteriorate over time; we've had this discussion on GAF many times. Usually, games that had proper driver support stay at that performance level, if not improve over time. It's more that Nvidia just stops supporting their cards sooner, meaning comparatively older AMD cards perform better for longer.

Thanks. As I said, I just saw the top level implication, just didn't see how the discussion panned out in the end.
 

Type_Raver

Member
Performance of supported games does not deteriorate over time; we've had this discussion on GAF many times. Usually, games that had proper driver support stay at that performance level, if not improve over time. It's more that Nvidia just stops supporting their cards sooner, meaning comparatively older AMD cards perform better for longer.

I see, thanks for the clarification.
 

Burai

shitonmychest57
So far the Zotac 1060 Mini looks really compelling. That would work great in my HTPC. I have an older i5 2400, so the reduced driver overhead and reduced power would be great in that setup.

Be sure that you can't fit a full-sized card in your setup before you go for a mini card like that. A single fast fan will make more noise, and mini cards are more susceptible to coil whine, which can really mess up the viewing experience on an HTPC. They are also more expensive.

Check out other people's completed builds on PCPartPicker and http://www.overclock.net/f/50/small-form-factor-systems to see what's possible.

I was looking at the 970 Minis from Asus and Gigabyte for my Node 304 but once I saw people putting full-sized 980Ti's in there I went for the full-sized MSI 970 Gaming card instead.
 