
GeForce GTX 1060 announced - July 19, 6GB, $249 MSRP/$299 Founder's

Luigiv

Member
Ah, I see. I didn't think 2x AA was that big of a hit, but maybe I'm confusing SSAA with those other kinds of AA, like FXAA or MSAA. I... don't know much about computers.

Different types of AA have different costs. SSAA is the most expensive as it's pretty much a brute force approach to AA. With 2x SSAA you're literally rendering the game at double the internal resolution.
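A rough sense of why SSAA is the brute-force option: its cost scales directly with the number of pixels shaded. A small sketch, using the common convention that the "Nx SSAA" factor multiplies the total sample count:

```python
# Cost of supersampling in shaded pixels. Assumption: the "Nx SSAA" factor
# multiplies the total sample count, so 2x SSAA shades twice the pixels and
# 4x SSAA doubles the resolution on each axis.

def ssaa_pixels(width, height, factor):
    """Internal pixels shaded per frame at a given SSAA factor."""
    return width * height * factor

native = ssaa_pixels(1920, 1080, 1)  # 1080p: 2,073,600 pixels
double = ssaa_pixels(1920, 1080, 2)  # 2x SSAA: 4,147,200 pixels
quad   = ssaa_pixels(1920, 1080, 4)  # 4x SSAA: 8,294,400 pixels, same as 4K

# MSAA only takes extra samples along geometry edges, and FXAA is a cheap
# post-process filter, so neither scales the whole frame like this.
```

That's why 4x SSAA at 1080p costs about as much as rendering natively at 4K, while MSAA and FXAA are far cheaper.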
 

LordOfChaos

Member
First line in my reply was

I'd stay away from the Founders Edition because they're gouging and the coolers aren't as good as third party anyways,


You went with...
Again, people finding ways to make excuses for them.

A 3-phase VRM is the absolute bare minimum you would put on a cheap card, let alone a $300 one that Nvidia is trying to swindle people with. I'm not an expert, but it's not just about power delivery with the VRMs, it's also voltage regulation, which is important for overclocking and stability.

Listen, there's no way you can convince anyone to buy one of these when much better 1060s with better engineering will be on the market soon. We need to put our finger up at Nvidia for trying to charge a premium for the bare minimum.



it's a bold strategy, cotton

I'm not finding excuses, I'm saying avoid the FE, but your points for avoiding the FE indicate some ignorance on the subject matter.

It wasn't that long ago that the 5970, a high-end card, was fed by a three-phase VRM. It doesn't in itself indicate cheapness; there are different ways to skin a pig. If your concern is voltage regulation, overclocking and stability, have we seen any issues with these things yet? If not, maybe wait until we do before making this claim?

You skipped over the 192-bit interface = 3GB or 6GB part, I see, and like I suggested, it could indicate the board will be reused for another GPU and it's just economies of scale at work; again, nothing new for any GPU maker.

Even if they did cut it down in response to the RX 480's price, isn't it still end performance per dollar that matters?

Wait for reviews, by all means, and avoid the FE like the greedy money grab it is, but some RAM pads or the number of VRM phases in a vacuum are silly reasons to avoid the card. There's reason enough without trying to reach that far.
 

Type_Raver

Member
How do these latest 1060 results compare to the current 980 cards?

Give or take a few fps, likely down to the CPU it's paired with.
 
I'm not finding excuses, I'm saying avoid the FE, but your points for avoiding the FE indicate some ignorance on the subject matter.

It wasn't that long ago that the 5970, a high-end card, was fed by a three-phase VRM. It doesn't in itself indicate cheapness; there are different ways to skin a pig. If your concern is voltage regulation, overclocking and stability, have we seen any issues with these things yet? If not, maybe wait until we do before making this claim?

You skipped over the 192-bit interface = 3GB or 6GB part, I see, and like I suggested, it could indicate the board will be reused for another GPU and it's just economies of scale at work; again, nothing new for any GPU maker.

Even if they did cut it down in response to the RX 480's price, isn't it still end performance per dollar that matters?

Wait for reviews, by all means, and avoid the FE like the greedy money grab it is, but some RAM pads or the number of VRM phases in a vacuum are silly reasons to avoid the card.

As I said, I'm not an expert on the matter, neither are you.

Go on Overclock.net, where the PCB was pictured (I linked to it), and the guys on there that know a lot more than either of us were slightly shocked by the engineering for something that costs $300. If you think sticking the power-pin on the cooler, 3-phase VRMs or leaving 2 empty slots for memory on the PCB is the usual seal of Nvidia quality, more fool you.
 

horkrux

Member
You're right and honestly, what do you expect with Nvidia's own tests.

Just a quick comparison with TechPowerUp, whom I trust, gives much better FPS for the 480 (TPU tests all games with highest quality settings unless otherwise stated):

GTAV:

TPU 480 FPS (very high settings as well):

1080p - 66.2
1440p - 48.6

Nvidia slides 480 FPS:

1080p - 45.1
1440p - 32.1

Witcher 3

TPU 480 FPS (Hairworks disabled):

1080p - 50.3
1440p - 37.7

Nvidia slides 480 FPS:

1080p - 47.9
1440p - 36.1

1. The GTAV benchmark on TechPowerUp was with AA off; Nvidia's had AA set to 4x. I don't personally know how much of an impact that has, though.
2. The difference in Witcher 3 is really small, and there is mention of 'AA', whatever that is supposed to mean in this case.
3. Fallout 4, for instance, even has a lower framerate for the 480 on TechPowerUp at 1080p. The difference is even larger there; why not cherry-pick that? Because it's the wrong way around? Comparing the tests like that can go either way, really.
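For what it's worth, the gaps in the quoted numbers can be computed directly. A quick sketch using only the figures above shows the discrepancy is concentrated in GTAV (roughly 47-51% there versus only 4-5% in Witcher 3), which fits the AA-settings difference in point 1:

```python
# Percent gap between TechPowerUp's RX 480 results and the numbers on
# Nvidia's slides, using only the figures quoted above.
tpu   = {"GTAV 1080p": 66.2, "GTAV 1440p": 48.6,
         "TW3 1080p": 50.3, "TW3 1440p": 37.7}
slide = {"GTAV 1080p": 45.1, "GTAV 1440p": 32.1,
         "TW3 1080p": 47.9, "TW3 1440p": 36.1}

for name in tpu:
    gap = (tpu[name] - slide[name]) / slide[name] * 100
    print(f"{name}: TPU higher by {gap:.1f}%")
```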
 

LordOfChaos

Member
As I said, I'm not an expert on the matter, neither are you.

Go on Overclock.net, where the PCB was pictured (I linked to it), and the guys on there that know a lot more than either of us were slightly shocked by the engineering for something that costs $300. If you think sticking the power-pin on the cooler, 3-phase VRMs or leaving 2 empty slots for memory on the PCB is the usual seal of Nvidia quality, more fool you.



You can only speak for yourself; I'm well versed on the matter. I'm sure you'll readily admit to not knowing the limits of MOSFETs. And I already agreed with you that routing the power through the cooler rather than the board was scummy, and that it discourages making a better product out of a cheaper 1060 by fitting a third-party cooler or watercooling. The forums also sprouted a lot of outrage over how the 480's power overdraw could not be fixed; are we really going to depend on what "those guys" are currently on about?

You again ignored a poignant point: it wasn't long ago that the top-end, 188-watt 5970 was fed by three phases per GPU, and the high-end 5870 by four; a 120-watt 1060 getting away with three really shouldn't baffle you that much. AMD's six are very, very overdesigned. You keep skipping my points: 600 A is way overkill, and so is 240 A continuous at 120°C. If you are going to overclock to those limits, don't get a 1060 OR a 480.


Again, end results, not theories, are all I'm preaching here. If three phases end up providing stable power and enough overclocking headroom that the GPU is the limit and not the VRM, what exactly is the issue? We will see.
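As a rough sanity check on the phase-count argument, here is a back-of-envelope sketch. The 1.0 V core rail is a hypothetical round number, and it pessimistically assumes the whole board power budget flows through the core VRM (in reality memory and fans take a cut):

```python
# Back-of-envelope per-phase current, assuming a hypothetical ~1.0 V core
# rail and (pessimistically) that the whole board power budget flows
# through the core VRM.

def amps_per_phase(board_watts, phases, vcore=1.0):
    """Average current each phase carries, split evenly."""
    return board_watts / vcore / phases

gtx_1060 = amps_per_phase(120, 3)  # 120 W TDP over 3 phases -> 40 A each
rx_480   = amps_per_phase(150, 6)  # 150 W TDP over 6 phases -> 25 A each
```

Integrated power stages are commonly rated well above 40 A continuous, so on paper three phases on a 120 W card isn't obviously underbuilt; whether it holds up is exactly the end-results question.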
 
You can only speak for yourself; I'm well versed on the matter.

That's reassuring to know.

You again ignored a poignant point: it wasn't long ago that the top-end, 188-watt 5970 was fed by three phases per GPU,

Is that a poignant point, really? I don't think it is.

But as for what you think you are trying to say, fair enough. I don't know why you have to go back to a seven-year-old graphics card to make an argument, but being well versed in the matter, maybe you know something more than me.
 

dr_rus

Member
You're right and honestly, what do you expect with Nvidia's own tests.

Just a quick comparison with TechPowerUp, whom I trust, gives much better FPS for the 480 (TPU tests all games with highest quality settings unless otherwise stated):

GTAV:

TPU 480 FPS (very high settings as well):

1080p - 66.2
1440p - 48.6

Nvidia slides 480 FPS:

1080p - 45.1
1440p - 32.1

Witcher 3

TPU 480 FPS (Hairworks disabled):

1080p - 50.3
1440p - 37.7

Nvidia slides 480 FPS:

1080p - 47.9
1440p - 36.1

Both TechPowerUp and Guru3D are testing with Gameworks disabled, which obviously results in higher performance on all cards tested. It's also pretty obvious that NV's internal testing is performed with Gameworks enabled - unless something can't be enabled on Radeons. In the case of both GTA5 and TW3, all GW features can be enabled on all cards.

Edit: yeah, also - AA, it can have a heavy hit on performance if it's MSAA or SSAA.
 
I would warn anyone away from buying the $300 Founders (reference) 1060 after an image of the PCB leaked:


http://www.overclock.net/t/1605250/tpu-nvidia-geforce-gtx-1060-founders-edition-pcb-pictured/0_20

This looks really cheap, especially compared to the 480. A few worrying design choices we can see for people who may not be familiar with PCB design:

- 3-phase VRM power delivery (Left of GPU. 480 has 6-phases. These dictate power delivery/overclock)
- 6-pin power stuck to the cooler (replacing the cooler will be a nightmare)
- 2 slots for VRAM memory modules are empty. Either this was originally meant to feature 8GB of memory but NV rushed a change of plans to release a cheaper product to combat the 480, or there will be a 1060 Ti in the future with 8GB and they are just using the same PCB for both.

Nvidia charging a premium for this is slightly scandalous. This card hasn't got the usual Nvidia polish for whatever reason.

For comparison, here is the 480's reference PCB, which is apparently a cheaper product ($240):

AMD-Radeon-RX-480-PCB_Front.jpg


You can see the 6 VRM phases to the left of the GPU (and 8x 1GB GDDR5 VRAM modules).

Well, this is one of the most idiotic observations in a while. You are comparing chokes and MOSFETs of different capacity just by the raw number of them, on a system with a lower power requirement to start with. It is like arguing the 1060 should be cheaper because it has a lower core count. Also, it is a 3+1 design, not a 3-phase one.

On top of that, as more and more things keep getting integrated into the die, the simpler PCB designs become. AMD's PCBs being more populated only means their engineering is worse, and as a user you shouldn't be asking them to charge more money, or anyone else to charge less, because they did their homework better.
 

wachie

Member
This seems way faster than what I was expecting. Now to wait for real official benchmarks, but if relative performance remains the same, the 1060 is worth roughly 20% more value-wise.
Average gain over the 480 was 13% excluding synthetic tests; obviously you don't play those. Even including the synthetic tests, it doesn't come close to 20%, it's 15%.

I don't know, but a 13% lead from Nvidia's own reviewer guide doesn't sound promising if it comes with the Nvidia tax.

Waiting for reviews.
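The value question in this post can be made concrete. A sketch assuming the $249 1060 MSRP and the 13% average lead quoted above; the RX 480 comparison price is my assumption (the 8 GB card launched at $239):

```python
# Performance per dollar under the figures in this post: a 13% average lead
# for the 1060 at $249 MSRP. The RX 480 comparison price is an assumption
# (the 8 GB card launched at $239).

def perf_per_dollar(relative_perf, price_usd):
    return relative_perf / price_usd

rx480_value    = perf_per_dollar(1.00, 239)  # baseline
gtx1060_value  = perf_per_dollar(1.13, 249)  # MSRP card
founders_value = perf_per_dollar(1.13, 299)  # Founders Edition

# Break-even: the price at which a 13%-faster card matches the 480's value.
break_even = 239 * 1.13  # about $270
```

On those assumptions the $249 card edges out the 480 on perf per dollar, while the $299 Founders falls behind it, which is roughly the "Nvidia tax" point.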
 

Type_Raver

Member
Do I spot a larger heatsink on the SC edition?

The bottom right corner of the card would suggest so...

I've been contemplating between the 1060 and the 480, and even though it's dearer, I need a DVI port for my LCD monitor and the 1060 has one.
Plus I'm likely to benefit from a small addition of performance, so that's not a bad deal.
 
Both TechPowerUp and Guru3D are testing with Gameworks disabled, which obviously results in higher performance on all cards tested. It's also pretty obvious that NV's internal testing is performed with Gameworks enabled - unless something can't be enabled on Radeons. In the case of both GTA5 and TW3, all GW features can be enabled on all cards.

Edit: yeah, also - AA, it can have a heavy hit on performance if it's MSAA or SSAA.

Gameworks should almost always be disabled for benchmarks, especially when they don't even work, which is the case with GTA V. In general, benchmarks from AMD or Nvidia directly are crap.
 

kami_sama

Member
The bottom right corner of the card would suggest so...

I've been contemplating between the 1060 and the 480, and even though it's dearer, I need a DVI port for my LCD monitor and the 1060 has one.
Plus I'm likely to benefit from a small addition of performance, so that's not a bad deal.

The AIB 480s have DVI in most cases.
 

dr_rus

Member
Gameworks should almost always be disabled for benchmarks, especially when they don't even work, which is the case with GTA V. In general, benchmarks from AMD or Nvidia directly are crap.

Then we should disable everything which isn't optimized for NV coming from consoles as well I think. How do you propose we do that?

I also find it very unlikely that any GF owner will disable GW unless its performance hit is too much for his card. So such benchmarks are meaningless to him.
 
Then we should disable everything which isn't optimized for NV coming from consoles as well I think. How do you propose we do that?

I also find it very unlikely that any GF owner will disable GW unless its performance hit is too much for his card. So such benchmarks are meaningless to him.

I disable virtually every Gameworks effect other than HBAO+ these days, so it's not meaningless at all. And I'd be perfectly fine with disabling an effect coded by AMD for benchmarks. They're extremely rare though, so it's not even really an issue, unlike the far more prevalent Gameworks effects, which are also usually very poorly optimized/implemented to boot (other than HBAO+).
 

dr_rus

Member
I disable virtually every Gameworks effect other than HBAO+ these days, so it's not meaningless at all. And I'd be perfectly fine with disabling an effect coded by AMD for benchmarks. They're extremely rare though, so it's not even really an issue, unlike the far more prevalent Gameworks effects, which are also usually very poorly optimized/implemented to boot (other than HBAO+).

They are in pretty much every console game at the moment. I fail to see how GW are more prevalent.
 

dr_rus

Member
?? I'm pretty sure AMD doesn't write shader libraries for console games.

I'm pretty sure that AMD provides a lot of examples and advice for both consoles' SDKs. It also doesn't have to be AMD who writes something optimized only for one architecture.
 
Open air. Blowers need a closed air circuit with a turbine fan on one side, whereas you can see a heat pipe on the second card with a regular fan on top.
 

Stainless

Member
I currently have a 660 Ti and was planning to upgrade to a 970 as I'm trying to stay in the $250-325 price range, so I'm curious to see how this card stacks up. I like that it's 6GB, as the ones I've been looking at are 4GB, so I will definitely wait. (I can run WoW and BF4 just fine but plan to upgrade in anticipation of the new Battlefield.)
 

Durante

Member
I currently have a 660 Ti and was planning to upgrade to a 970 as I'm trying to stay in the $250-325 price range, so I'm curious to see how this card stacks up. I like that it's 6GB, as the ones I've been looking at are 4GB, so I will definitely wait. (I can run WoW and BF4 just fine but plan to upgrade in anticipation of the new Battlefield.)
You should certainly get one of these rather than a 970. Buying the latter now for anything more than 200 USD would be a mistake.
 