
Nvidia upgrades 1060 and 1080 cards with faster memory

theultimo

Member
Mass Effect: Andromeda - 1060>480
Ghost Recon: Wildlands - 1060>480
Halo Wars 2, For Honor - more or less even

So why would 580 be suddenly faster in new AAA games if this clearly isn't the case with 480?
I agree with this, both have certain aspects that perform better, but it's a wash, or the 1060 is usually ahead by a few %.

It's not that the 480/580 is a bad card, though; it's usually that the 1060 has a hair more power, but it also usually costs more.
 
I wonder why AMD didn't also up the memory on their refresh? 1060s are already hitting high clocks, so they couldn't really improve there, but AMD could have done both.
 
I agree with this, both have certain aspects that perform better, but it's a wash, or the 1060 is usually ahead by a few %.

It's not that the 480/580 is a bad card, though; it's usually that the 1060 has a hair more power, but it also usually costs more.

They are generally separated as thus: 1060 faster on average in DX11 titles, 480 faster in DX12.
 
They did. This update will most certainly apply to factory OC cards only.



Mass Effect: Andromeda - 1060>480
Ghost Recon: Wildlands - 1060>480
Halo Wars 2, For Honor - more or less even

So why would 580 be suddenly faster in new AAA games if this clearly isn't the case with 480?

To the bolded, that's wrong:

[image: Mass Effect: Andromeda benchmark chart]


And the 480 is faster in both RE7 and SE4, which, combined with your list, covers more or less all the big PC titles released so far this year. I also haven't seen any benches for HW2.

RE7
[image: RE7 benchmark chart]


Sniper Elite 4
[image: Sniper Elite 4 benchmark chart]


1060 is faster in 1, 480 faster in 3, so that poster is right.
 

dr_rus

Member
To the bolded, that's wrong:

[image: Mass Effect: Andromeda benchmark chart]

No, it's not. This benchmark is for medium settings. This is how it looks on Ultra:

[image: Mass Effect: Andromeda Ultra benchmark chart]


And if you lowered the SE4 and RE7 settings, you'd probably see the 1060 winning too.

Also worth mentioning that there's basically no 1060 references on the market with most 1060OC being noticeably faster while most custom 480s perform very close to reference.

Generally, there simply is no reason to say that the 580 will be faster than the 1060 9Gbps in new AAA games.
 

Hux1ey

Banned
Yeah, I can pretty much hit everything at 1080p 60fps on Very High, depending on whether the game is optimised enough.

I bought a 4k TV recently and would love to get some 4K goodness out of my computer.

You could go 1080ti, but I'd personally wait for Volta if it is indeed Q3

1180 should be a real beast.
 
No, it's not. This benchmark is for medium settings. This is how it looks on Ultra:

[image: Mass Effect: Andromeda Ultra benchmark chart]


And if you lowered the SE4 and RE7 settings, you'd probably see the 1060 winning too.

Also worth mentioning that there's basically no 1060 references on the market with most 1060OC being noticeably faster while most custom 480s perform very close to reference.

Generally, there simply is no reason to say that the 580 will be faster than the 1060 9Gbps in new AAA games.

What? You are saying that the 1060 beats the 480 on ultra vs medium settings in Mass Effect. Then you say that in two other games it loses at ultra but would win at medium? Even if it did, why is the setting that the 1060 might win at the most relevant one? Countering cherry-picked benchmarks with cherry-picked benchmarks says a lot about your preference.

I looked up benchmarks to all three games on a handful of sites and they have these cards trading position depending on the site.

Personally, it's just easier to say these cards have a negligible performance gap and that, if you have to choose, you are choosing 8GB or 6GB, AMD features or NVIDIA features.

Example

[image: PC Gamer Mass Effect: Andromeda Ultra benchmark chart]


PC Gamer shows the 480 still beating the 1060 at ultra.
 
No, it's not. This benchmark is for medium settings. This is how it looks on Ultra:

[image: Mass Effect: Andromeda Ultra benchmark chart]


And if you lowered the SE4 and RE7 settings, you'd probably see the 1060 winning too.

Also worth mentioning that there's basically no 1060 references on the market with most 1060OC being noticeably faster while most custom 480s perform very close to reference.

Generally, there simply is no reason to say that the 580 will be faster than the 1060 9Gbps in new AAA games.

No, ultra settings:

[image: PC Gamer Mass Effect: Andromeda Ultra benchmark chart]


http://www.pcgamer.com/mass-effect-andromeda-pc-performance-analysis/

EDIT: Beaten
 


One game and a minor difference, especially looking at minimum frames. There are games that favor either card; even if the 1060 holds a small margin of victory overall, I would choose 8GB over that.



What? NVIDIA overclocking a component of a GPU is better than AMD overclocking a component of a GPU? Both are what I would consider Tepid tbh.



True, but it's the principle of the thing. Also not everyone likes overclocking. I would rather not over clock my 1070, but it reaches 1936MHz without it. I did overclock my CPU though. I might see if there is any significant increase from a memory boost.

The 580 does pretty much the same as this new 1060 while consuming almost twice as much power and generating more heat.
 
Would a 1060 be worth the upgrade over a 970? I'd be aiming for 1080p and I don't mind lowering shadows down to medium. I play a lot of AAA console ports but went back to consoles after I always seemed to fall short of 1080/60 on my 970. I recently got fucked out of my Sony digital account so PC is calling again.

Don't really want to go the extra £150 for a 1070. Cheers!
 

Durante

Member
A GTX 1080 with a 2GHz core is the kind of performance I'm looking for, it's around 2x faster than my 1535MHz GTX 970.
That's actually exactly what I told myself when I bought the 1080 ("it's expensive but it's twice as fast as your 970!").


As for the 1060/rx480 discussion: ultimately, these are ridiculously close on average. The stock 1060 is a bit slower than the stock rx580, but the 1060+ at stock will probably be a bit faster. OC models are generally a tiny bit more distant from stock for NV. Here's what it looks like in computerbase's most recent test (note that this is the "old" 1060):

But the real difference is that the RX is a bit cheaper and the Geforce requires less power (pretty significantly so, we are talking about 60-70 W while gaming which is 50%+ of the total).
 

Renekton

Member
Mass Effect Andromeda also has the Tessellation thing that drags down AMD cards <_< which is annoying. Rooting for underdog is exhausting.
 
Mass Effect Andromeda also has the Tessellation thing that drags down AMD cards <_< which is annoying. Rooting for underdog is exhausting.

I was going to buy the 480; my mind was all made up until I checked the 1060 benchmarks vs the 480. After seeing better performance in almost every damn game... I couldn't root for the underdog anymore. After 6 years straight of AMD cards, time to move on.
 
I was going to buy the 480; my mind was all made up until I checked the 1060 benchmarks vs the 480. After seeing better performance in almost every damn game... I couldn't root for the underdog anymore. After 6 years straight of AMD cards, time to move on.

I don't know what benchmarks you've seen. In new games it's almost the other way around. In any case, my next card is going to be vastly faster than a 1060/480. Either small Vega or I'll bag a used 1080.
 

UltSeer1

Banned
I don't see the big deal with PC gaming, 2400$ for a decent rig so you can get a few extra pixels and get a slight resolution bump?
 

CSJ

Member
I don't see the big deal with PC gaming, 2400$ for a decent rig so you can get a few extra pixels and get a slight resolution bump?

This isn't the discussion for that, come on.
Also, you're way off about the cost, the performance, and what games consoles don't have.
 
The 580 does pretty much the same as this new 1060 while consuming almost twice as much power and generating more heat.

This is kind of redundant. Consuming more power = generating more heat in terms of CPUs or GPUs.

Edit: Btw. it's not "almost twice", but rather about ~50% more.
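A quick sanity check of that "~50% more" figure, using the reference board-power numbers (the 120 W and 185 W TDPs are spec-sheet assumptions; actual draw varies by board and load):

```python
# Reference TDPs (assumed from the spec sheets, not measured)
gtx_1060_tdp_w = 120  # GTX 1060 reference TDP
rx_580_tdp_w = 185    # RX 580 reference TDP

# Relative extra power draw of the 580 over the 1060
extra = (rx_580_tdp_w - gtx_1060_tdp_w) / gtx_1060_tdp_w
print(f"RX 580 draws roughly {extra:.0%} more power than a GTX 1060")
# roughly 54% more on paper, i.e. ~50% rather than "almost twice"
```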
 

JaseC

gave away the keys to the kingdom.
I don't see the big deal with PC gaming, 2400$ for a decent rig so you can get a few extra pixels and get a slight resolution bump?

Well, you could apply that very same logic to the Pro and Scorpio. It's more applicable to iterative consoles, even.
 

ISee

Member
Would a 1060 be worth the upgrade over a 970? I'd be aiming for 1080p and I don't mind lowering shadows down to medium. I play a lot of AAA console ports but went back to consoles after I always seemed to fall short of 1080/60 on my 970. I recently got fucked out of my Sony digital account so PC is calling again.

Don't really want to go the extra £150 for a 1070. Cheers!


Wait a bit longer for Nvidia's next reveal. Shouldn't be too long.

In general, going from a 970 to a 1060 is like going from a 970 to a 980 (more or less; often enough, overclocked 980s are faster than overclocked 1060s). It is a nice upgrade but not really mind-blowing. It will of course be better for 1080p/60... BUT if a 1070 is also an option and you don't want to wait, there might be an even better solution.

Certain low-end custom 1080s have been dropping in price like crazy in Europe lately and have nearly reached 1070 levels of cost. I recently bought a Gigabyte 1080 Windforce (not the G1!) for just under 495€, which is only slightly over the cost of a 1070 Strix from Asus (480€). And yes, those cards have worse cooling and don't OC as well as other 1080s, but even with a moderate OC they bring a lot more power to the table than a regular 1070. Going from a 970 to a 1080 will be enough for a long time at 1080p, and is a significant upgrade imo. Other cards I've seen for just under 500€ are the KFA2 1080 EXOC and the MSI 1080 Armor.
 
Would a 1060 be worth the upgrade over a 970? I'd be aiming for 1080p and I don't mind lowering shadows down to medium. I play a lot of AAA console ports but went back to consoles after I always seemed to fall short of 1080/60 on my 970. I recently got fucked out of my Sony digital account so PC is calling again.

Don't really want to go the extra £150 for a 1070. Cheers!

Really? What games were you playing and what CPU do you have? I've been running my GTX 970 since late 2014 and it's been able to do 1080p 60 with pretty much every game.

About the 1060, not at all if you have a GTX 970 that can clock to around 1400-1500MHz.
With both GPUs near their max overclocks, which would be 1.9-2GHz on the GTX 1060 and 1450-1500MHz+ on the GTX 970 it's around 10-15% faster, it's not a worthwhile upgrade at all unless you're looking for more ram.

Digital Foundry included a stock and overclocked GTX 970 in their GTX 1060 review.


Your best bet is to upgrade to a GPU that is 50% or more faster if you want a worthwhile upgrade, like the GTX 1070 or perhaps even one of AMD's upcoming Vega GPUs. I myself prefer upgrading to GPUs that are twice or more as fast, like the GTX 1080 or future GPUs.

It's best to wait to see what Vega offers in a few weeks, if AMD brings solid competition then you could potentially have a decent upgrade path from either AMD or NVIDIA as price adjustments often come when there's solid competition.

That's actually exactly what I told myself when I bought the 1080 ("it's expensive but it's twice as fast as your 970!").

Haha! That's how I prefer to upgrade, to a GPU that is 2x or more faster than my previous.
 

napata

Member
About the 1060, not at all if you have a GTX 970 that can clock to around 1400-1500MHz.
With both GPUs near their max overclocks, which would be 1.9-2GHz on the GTX 1060 and 1450-1500MHz+ on the GTX 970 it's around 10-15% faster, it's not a worthwhile upgrade at all unless you're looking for more ram.

Almost every Pascal card can easily push past 2 ghz. The 1.9 ghz is almost the standard no oc boost. Between 2.1-2.2 is where I'd place max overclock. There aren't many cards that get to 2.2 though. Also from my experience memory overclocking is more important for the 1060.

I agree that going from a 970 to a 1060 is really not worth it.
 

Momentary

Banned
I don't see the big deal with PC gaming, 2400$ for a decent rig so you can get a few extra pixels and get a slight resolution bump?

What a shitpost. Not even discussing the topic at hand. This is more than likely a throw away account for someone here.
 
Almost every Pascal card can easily push past 2 ghz. The 1.9 ghz is almost the standard no oc boost. Between 2.1-2.2 is where I'd place max overclock. There aren't many cards that get to 2.2 though. Also from my experience memory overclocking is more important for the 1060.

I agree that going from a 970 to a 1060 is really not worth it.

Really? Hmm, I have seen some at 2.1GHz and above but I thought those were somewhat above average OCs? Interesting.

For the GTX 970s 1400-1500MHz were pretty average OCs, with max OCs typically topping out at around 1500-1550MHz.
 
Would a 1060 be worth the upgrade over a 970? I'd be aiming for 1080p and I don't mind lowering shadows down to medium. I play a lot of AAA console ports but went back to consoles after I always seemed to fall short of 1080/60 on my 970. I recently got fucked out of my Sony digital account so PC is calling again.

Don't really want to go the extra £150 for a 1070. Cheers!

1060 is slightly faster than a 970, so I would say it's not a worthwhile upgrade. How about selling the 970 and getting the 1070?

Really? Hmm, I have seen some at 2.1GHz and above but I thought those were somewhat above average OCs? Interesting.

For the GTX 970s 1400-1500MHz were pretty average OCs, with max OCs typically topping out at around 1500-1550MHz.

A 2200 OC is not common at all. Holding 2100+ across all games is pretty above average for Pascal. Honestly, you get more gains from memory overclocking on Pascal because GPU Boost 3.0 does a good job of finding speeds near the edge of stability. Plus the capped voltages and locked BIOS really suck the OC potential out.
 

VE3TRO

Formerly Gizmowned
Was going to pick up a 1060 in the next few weeks. Any timeframe when these will release?
 
Really? What games were you playing and what CPU do you have? I've been running my GTX 970 since late 2014 and it's been able to do 1080p 60 with pretty much every game.

About the 1060, not at all if you have a GTX 970 that can clock to around 1400-1500MHz.
With both GPUs near their max overclocks, which would be 1.9-2GHz on the GTX 1060 and 1450-1500MHz+ on the GTX 970 it's around 10-15% faster, it's not a worthwhile upgrade at all unless you're looking for more ram.

Digital Foundry included a stock and overclocked GTX 970 in their GTX 1060 review.



Your best bet is to upgrade to a GPU that is 50% or more faster if you want a worthwhile upgrade, like the GTX 1070 or perhaps even one of AMD's upcoming Vega GPUs. I myself prefer upgrading to GPUs that are twice or more as fast, like the GTX 1080 or future GPUs.

It's best to wait to see what Vega offers in a few weeks, if AMD brings solid competition then you could potentially have a decent upgrade path from either AMD or NVIDIA as price adjustments often come when there's solid competition.

Thanks a lot for those that replied.

I've always been puzzled why my system couldn't get the framerates others got. I did have 1080/60 most of the time with certain settings lowered, but there were drops in a lot of games like AC Unity, FC Primal, TW3 and RotTR, which took it down to 50-58. That kind of drop and the judder it caused really annoyed me, to the point where I started playing on console again with the PS4 Pro.

I think my CPU is an i5 4690K OC to 4.5GHz. I did try OC'ing my 970 but I was getting little red dots all over the screen which scared me into abandoning it.

I wonder if it could be my RAM causing framedrops, as I only have 8GB (I actually thought I had 16). Would a lack of RAM cause framerate drops at 1080p?

Looking at that DF chart, a 1060 looks like a decent upgrade for me tbh. It looks like a 5-10fps upgrade across the board, which would eliminate the drops to sub-60fps, so I think I may go for it. I don't really want to spend £350-£400, so a 1070 is not an option for now. Are there new GPUs due to be revealed, I take it? I've not had AMD for years; are they supported decently now with new game optimisation patches? That was the reason I went with Nvidia and have never gone back.

Thanks for any info.
 

napata

Member
Really? Hmm, I have seen some at 2.1GHz and above but I thought those were somewhat above average OCs? Interesting.

For the GTX 970s 1400-1500MHz were pretty average OCs, with max OCs typically topping out at around 1500-1550MHz.

Here's a bunch of 1060 overclocks: https://www.techpowerup.com/reviews/ASUS/GTX_1060_STRIX_OC/29.html

I think these numbers are initial boost clocks though which means they'll probably drop a bit when stress testing and actual prolonged gaming.
 

ISee

Member
Really? Hmm, I have seen some at 2.1GHz and above but I thought those were somewhat above average OCs? Interesting.

For the GTX 970s 1400-1500MHz were pretty average OCs, with max OCs typically topping out at around 1500-1550MHz.

I overclocked several Maxwell cards (970 and 980) and 1500MHz was rather hard to reach on most of them, and I'm talking about a stable 1500MHz, not "let's do one benchmark run and tell everybody how stable it is". 1550MHz was undoable on all of them without modding the voltage levels directly in the BIOS. The best result I got was 1600MHz on a Gigabyte GTX 980 G1, but temperatures were getting a bit too high for my liking and voltage was at maximum at all boost clock steps; not recommendable, because the card had the potential to die rather quickly as a result, but performance was great (6.5 TFLOPS).

On Pascal, 2GHz seems to be doable on all 1070 and 1080 cards so far. But above that it varies a lot. So far I wasn't able to push a single card to 2.1GHz; the maximum was 2063MHz on an MSI GTX 1070, and all the rest began to run unstable at ~2050MHz (3x 1070, 2x 1080). 1060s could of course reach higher clock speeds in general, but I wasn't able to get my hands on one and never bought one myself.
 
Nvidia upgrades its top of the line cards every year, right? Why not the mid-range ones too? Where's the 1060Ti? These releases are completely meaningless.
 
Thanks a lot for those that replied.

I've always been puzzled why my system couldn't get the framerates others got. I did have 1080/60 most of the time with certain settings lowered, but there were drops in a lot of games like AC Unity, FC Primal, TW3 and RotTR, which took it down to 50-58. That kind of drop and the judder it caused really annoyed me, to the point where I started playing on console again with the PS4 Pro.

I think my CPU is an i5 4690K OC to 4.5GHz. I did try OC'ing my 970 but I was getting little red dots all over the screen which scared me into abandoning it.

I wonder if it could be my RAM causing framedrops, as I only have 8GB (I actually thought I had 16). Would a lack of RAM cause framerate drops at 1080p?

Looking at that DF chart, a 1060 looks like a decent upgrade for me tbh. It looks like a 5-10fps upgrade across the board, which would eliminate the drops to sub-60fps, so I think I may go for it. I don't really want to spend £350-£400, so a 1070 is not an option for now. Are there new GPUs due to be revealed, I take it? I've not had AMD for years; are they supported decently now with new game optimisation patches? That was the reason I went with Nvidia and have never gone back.

Thanks for any info.

Rise of the Tomb Raider is pretty tough on the CPU; I remember holding a solid 60 fps with no drops was quite challenging in some areas, such as the Soviet Installation, even on my i7 4790K at 4.7GHz. However, when I upgraded from 1333MHz to 2400MHz memory I was able to run the game at a solid 60 fps. It was a huge improvement, as faster memory can help in CPU-bound scenarios.

Assassin's Creed Unity should run really well, what settings were you using if you can recall?
I'm not too sure about Far Cry Primal as I haven't played it, and I also haven't played much of The Witcher 3 yet to properly know how it performs.

About overclocking, did you check out any guides and slowly raise your clocks? For something like a GTX 970, it's best to raise them by around 10-50MHz each step and test its stability. Maybe even 100MHz if it's boosting to 1280MHz out of the box, and slowly raise it from there.

What are your GTX 970's boost clocks? I strongly advise against getting a GTX 1060 unless you're selling your GTX 970 and spending only a small amount on top, as it's honestly not worth it, but it's totally up to you. The minuscule 10 fps improvement is nothing if you want to sit on the card for 2+ years, as games will become more demanding in the future depending on the settings used.

You're much better off waiting for a newer GPU that will compete with or outperform the GTX 1070 at a lower price; GPU power like this will vastly outpace console performance. It's honestly not worth spending £/$200+ for a minuscule 10 fps improvement.

Here's a bunch of 1060 overclocks: https://www.techpowerup.com/reviews/ASUS/GTX_1060_STRIX_OC/29.html

I think these numbers are initial boost clocks though which means they'll probably drop a bit when stress testing and actual prolonged gaming.

Interesting clocks, these are the kind of OCs I've seen which appear to be somewhat above average.

1060 is slightly faster than a 970, so I would say it's not a worthwhile upgrade. How about selling the 970 and getting the 1070?

A 2200 OC is not common at all. Holding 2100+ across all games is pretty above average for Pascal. Honestly, you get more gains from memory overclocking on Pascal because GPU Boost 3.0 does a good job of finding speeds near the edge of stability. Plus the capped voltages and locked BIOS really suck the OC potential out.

Yeah it's a real shame, I wonder if they could reach clocks of 2.3-2.4GHz and be stable if they didn't have these restrictions? Many of the cards seem to have comparable overclocking capabilities which offer performance gains of around 10-15% from stock performance.

I overclocked several Maxwell cards (970 and 980) and 1500MHz was rather hard to reach on most of them, and I'm talking about a stable 1500MHz, not "let's do one benchmark run and tell everybody how stable it is". 1550MHz was undoable on all of them without modding the voltage levels directly in the BIOS. The best result I got was 1600MHz on a Gigabyte GTX 980 G1, but temperatures were getting a bit too high for my liking and voltage was at maximum at all boost clock steps; not recommendable, because the card had the potential to die rather quickly as a result, but performance was great (6.5 TFLOPS).

On Pascal, 2GHz seems to be doable on all 1070 and 1080 cards so far. But above that it varies a lot. So far I wasn't able to push a single card to 2.1GHz; the maximum was 2063MHz on an MSI GTX 1070, and all the rest began to run unstable at ~2050MHz (3x 1070, 2x 1080). 1060s could of course reach higher clock speeds in general, but I wasn't able to get my hands on one and never bought one myself.

2GHz on Pascal lines up with what I've seen; it seems to be quite common for them to reach those clocks, or sit just under 2GHz.

It's interesting and unfortunate to hear that your Maxwell GPUs had trouble hitting 1500MHz, those cards often had really great overclocking capabilities but I guess the silicon lottery is always in effect.
 

dr_rus

Member
What? You are saying that the 1060 beats the 480 on ultra vs medium settings with Mass Effect. Then say that on two other games it loses at ultra, but would win at medium? Even if it did why is the setting that the 1060 might win at the most relevant? Countering cherry picked benchmarks with cherry picked benchmarks says a lot about your preference.

Most games change the bottleneck when you change their settings, and thus a card which is faster at Ultra settings may begin to be slower on average. In your MEA example the gap on Ultra is half what it was in Coulomb_Barrier's graph. This is also pretty much the only benchmark I've seen where the 480 is even faster than the 1060 there, so I think it's safe to say that it's irrelevant to the end results.

I agree that the performance difference between 1060, 480 and 580 is negligible to the point where saying that one card is faster than the other makes zero sense. Which is what this has started from.

And my preference is to facts against cherry picking.

Is there a reason AIBs can't just add faster memory without "official" Nvidia support?

No, they should be able to do this. This is just NV providing an "official" configuration, meaning that AIBs can be 100% sure that these GPUs will work with 9 Gbps GDDR5 / 11 Gbps GDDR5X memory.

So just to be 100% sure this faster memory is already in the current crop of 1080ti cards?

Yes. The 1080 Ti uses 11 Gbps GDDR5X; that's how it's able to reach higher bandwidth than the Titan X (but not the Xp) while using a narrower memory bus.
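The arithmetic behind that is simple: peak memory bandwidth is just bus width times per-pin data rate. A quick sketch (the bus widths and data rates below are taken from the public spec sheets, so treat them as assumptions):

```python
# Peak memory bandwidth: bus width (bits) * per-pin data rate (Gbps) / 8 bits-per-byte
def bandwidth_gb_s(bus_width_bits, data_rate_gbps):
    return bus_width_bits * data_rate_gbps / 8

# Spec-sheet figures (assumed): 1080 Ti has a 352-bit bus, Titan X/Xp a 384-bit bus
for name, bus, rate in [
    ("GTX 1080 Ti (352-bit, 11 Gbps)", 352, 11.0),
    ("Titan X Pascal (384-bit, 10 Gbps)", 384, 10.0),
    ("Titan Xp (384-bit, 11.4 Gbps)", 384, 11.4),
]:
    print(f"{name}: {bandwidth_gb_s(bus, rate):.0f} GB/s")
# 484 GB/s vs 480 GB/s: the faster memory more than covers the 32-bit-narrower bus,
# while the Titan Xp's 11.4 Gbps on the full 384-bit bus still comes out ahead (~547 GB/s)
```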

Was going to pick up a 1060 in the next few weeks. Any timeframe when these will release?

They are out but prices for the first batch are higher than normal so you may want to wait for a couple of weeks.
 
Most games change the bottleneck when you change their settings, and thus a card which is faster at Ultra settings may begin to be slower on average. In your MEA example the gap on Ultra is half what it was in Coulomb_Barrier's graph. This is also pretty much the only benchmark I've seen where the 480 is even faster than the 1060 there, so I think it's safe to say that it's irrelevant to the end results.

I agree that the performance difference between 1060, 480 and 580 is negligible to the point where saying that one card is faster than the other makes zero sense. Which is what this has started from.

And my preference is to facts against cherry picking.

Not sure why you ignored the rest of my post or the context of your chosen "facts".

You produced a graph that shows the 1060 beating the 480 at ultra settings in Mass Effect as an answer to another poster showing a graph with the 480 outperforming the 1060 at medium settings. You then implied that where the 480 was outperforming the 1060 at ultra in two other games, the results would somehow swing in the 1060's favor at medium settings. What you implied flies in the face of logic to such a degree that the only possible explanation is that you wish to see the 1060 outperform the 480 regardless of facts or logic.

The part of my post that you chose not to respond to showed that a chart from the same article that contained the Mass Effect medium setting graph previously posted, showed the 480 beating the 1060 at ultra settings. This graph is at odds with the one you posted. So we have two different websites, benchmarking the same game at ultra settings. One says the 1060 is faster, the other says the 480 is faster. What is the likely cause? Maybe they benchmarked different areas, or they had different card models with faster or slower clocks.

Like I said, and you seem to agree, the cards really are too close to call. I don't agree that the graph you posted had more to do with facts than with trying to put the 1060 in a better light, considering what you implied about setting changes and how they would swing the benchmark results.

When the performance is this close, the "facts" are malleable, and I think it's important to remember that, no matter your preference.
 

theultimo

Member
Not sure why you ignored the rest of my post or the context of your chosen "facts".

You produced a graph that shows the 1060 beating the 480 at ultra settings in Mass Effect as an answer to another poster showing a graph with the 480 outperforming the 1060 at medium settings. You then implied that where the 480 was outperforming the 1060 at ultra in two other games, the results would somehow swing in the 1060's favor at medium settings. What you implied flies in the face of logic to such a degree that the only possible explanation is that you wish to see the 1060 outperform the 480 regardless of facts or logic.

The part of my post that you chose not to respond to showed that a chart from the same article that contained the Mass Effect medium setting graph previously posted, showed the 480 beating the 1060 at ultra settings. This graph is at odds with the one you posted. So we have two different websites, benchmarking the same game at ultra settings. One says the 1060 is faster, the other says the 480 is faster. What is the likely cause? Maybe they benchmarked different areas, or they had different card models with faster or slower clocks.

Like I said, and you seem to agree, the cards really are too close to call. I don't agree that the graph you posted had more to do with facts than with trying to put the 1060 in a better light, considering what you implied about setting changes and how they would swing the benchmark results.

When the performance is this close, the "facts" are malleable, and I think it's important to remember that, no matter your preference.
The biggest difference is power and heat. Otherwise it's very close.
 

dr_rus

Member
Not sure why you ignored the rest of my post or the context of your chosen "facts".

You produced a graph that shows

Let me stop you right there.

I did not "produce" anything. I replied to Coulomb_Barrier's graph, which is clearly cherry-picked (as I've said, this is pretty much the only MEA benchmark I've seen where the 480 is faster than the 1060, and I've seen at least a dozen of them). The graph I posted is more in line with what you would see in any other MEA benchmark of your choice. What I said about the original graph using medium settings is a theory which could explain the difference, nothing more.

So you should really think about what you want to say next, because right now you are barking up the wrong tree. And this is precisely why I ignored the rest of your previous post and will ignore the rest of this one.

Can't believe that the Xbox Project Scorpio has a faster memory speed than the Nvidia 1080 cards.

What? It has the same bandwidth and much slower memory speeds.
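The numbers behind that reply, using the same bandwidth-equals-bus-width-times-data-rate arithmetic (the 384-bit/6.8 Gbps Scorpio and 256-bit/10 Gbps GTX 1080 figures are assumptions from the public specs):

```python
def bandwidth_gb_s(bus_width_bits, data_rate_gbps):
    # Peak bandwidth = bus width (bits) * per-pin data rate (Gbps) / 8 bits-per-byte
    return bus_width_bits * data_rate_gbps / 8

scorpio = bandwidth_gb_s(384, 6.8)    # wide 384-bit GDDR5 bus at a slow 6.8 Gbps
gtx_1080 = bandwidth_gb_s(256, 10.0)  # narrower 256-bit bus of much faster GDDR5X
print(f"Scorpio: {scorpio:.1f} GB/s, GTX 1080: {gtx_1080:.1f} GB/s")
# Comparable total bandwidth (~326 vs ~320 GB/s) despite a much slower per-pin speed
```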
 