Definitely, man, since I have SLI 670s as well. My system can still hang. I hope the GTX 870 that comes out next year is at least more powerful than two 670s.
Pls understand.

Wait a minute, why is the R9 290X running in Quiet Mode? That 8-12% loss is important. Also, I have a feeling that the R9 290X with custom coolers will perform better than it does now (correct me on this if I am wrong). These benchmarks make no sense =/
We are looking at 500 watts here
Wait a minute, why is the R9 290X running in Quiet Mode? That 8-12% loss is important. Also, I have a feeling that the R9 290X with custom coolers will perform better than it does now (correct me on this if I am wrong). These benchmarks make no sense =/
sk3tch said: R9 290X ("Uber" switch on, no OC), i5 2500K @ 4.6 GHz.
Can you provide an argument as to why cards shouldn't be benchmarked at comparable noise levels?

Haha, wow. What's next? Justifications for the $999 prices? Oh wait, we already have those!
"Quiet Mode" is still louder than a stock 780, FWIW.

Is there anything between 'quiet mode' and 'uber mode' for a 290X? If not, it seems to me that 'quiet mode' is just a clever misnomer, labelled that way to make the card look better if somebody chose to test against it.
"Quiet Mode" is still louder than a stock 780, FWIW.
Did you champion this logic during the 280/480 era?

Can you provide an argument as to why cards shouldn't be benchmarked at comparable noise levels?
Can you provide an argument as to why cards shouldn't be benchmarked at comparable noise levels?
Well, aren't they comparing performance here? If they wanted to compare the noise levels, then yeah, that's fine, but this isn't really a good comparison.
No, I just laughed at NVidia. But you should have chosen the FX5800 era -- the original leaf blower. It was doubly hilarious set up against some of the best cards ATI ever made. I have no idea how NV ever expected to sell those.

Did you champion this logic during the 280/480 era?
I think for people considering a product it's useful to see how it performs within a noise envelope they actually want to use. Ideally, you'd benchmark all cards at all profiles and provide noise data for each profile (like computerbase does). Everyone can then decide how to weigh the multi-objective optimization problem between noise and performance.

Well, aren't they comparing performance here? If they wanted to compare the noise levels, then yeah, that's fine, but this isn't really a good comparison.
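That filter-then-rank weighing can be sketched in a few lines. Everything below is invented for illustration: the card names, FPS figures, dB(A) readings, and the `best_under_noise_cap` helper are all hypothetical, not real review data.

```python
# Pick the fastest (card, fan profile) combination whose measured noise
# stays under a chosen cap -- one entry per card+profile pairing.
cards = [
    # (name, profile, avg_fps, noise_dBA) -- invented illustrative numbers
    ("290X", "quiet", 88, 41.0),
    ("290X", "uber", 96, 48.5),
    ("780", "stock", 90, 39.5),
]

def best_under_noise_cap(entries, cap_dba):
    """Return the highest-FPS entry whose noise is within the cap, or None."""
    candidates = [e for e in entries if e[3] <= cap_dba]
    return max(candidates, key=lambda e: e[2], default=None)

# Comparing only the profiles that stay under 42 dB(A):
print(best_under_noise_cap(cards, 42.0))
```

With a loose cap every profile is a candidate and the loudest mode wins; with a tight cap only the quiet profiles get compared, which is the point the post is making about benchmarking at comparable noise levels.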
<- G-Sync

I DON'T KNOW WHAT TO BLOW MY MONEY ON ANYMORE
But that's more than 10 years ago; the examples I gave are relatively recent.

No, I just laughed at NVidia. But you should have chosen the FX5800 era -- the original leaf blower. It was doubly hilarious set up against some of the best cards ATI ever made. I have no idea how NV ever expected to sell those.
$699 for 3GB.

How much is this beast supposed to cost?
And seeing as my 780 benchmarked higher than some of the 290X owners running their card in Uber Mode, I doubt the benchmarks would look much different for it here.
And note that my 780 was running at 1245MHz but only 60% fan and 70C. Still quieter and cooler than the "Uber Mode" benches from sk3tch.
There really is little difference in performance between Quiet and Uber mode, and by default it is in Quiet mode since it otherwise gets quite loud. It will probably get a lot better with custom coolers, though.
Comparing top GPUs, which were created for maximum performance, by their noise levels is like comparing the fastest cars by their fuel consumption.
It's a valid point of comparison, but most people don't give a flying fuck about it, same as power consumption.
Are you using watercooling? Most reviews have the 290X reaching 90°C+, both in Uber and Quiet mode.

That particular benchmark favors NVIDIA, so it is no surprise that your 780 performed better. Your 780 was heavily OC'd, as well. My 290X was not OC'd -- just the "Uber" switch on, which just eliminates a lot of the throttling. You may have been quieter, but you were not cooler. My cards have never gone over 70°C, even in Crossfire.
I'm not sure why AMD cheerleaders keep pointing at the GTX 480 to prove that "power-hungry, loud and hot is totally fine and well accepted when Nvidia does it".

Did you champion this logic during the 280/480 era?
Are you using watercooling? Most reviews have the 290X reaching 90°C+, both in Uber and Quiet mode.
I'm not sure why AMD cheerleaders keep pointing at the GTX 480 to prove that "loud and hot is totally fine and well accepted when Nvidia does it".
Everyone *loathed* the GTX 480 for it, myself included.
At the time, the common opinion was "Nvidia really dropped the ball with this".
So why does the 690 beat it in multiple benchmarks?
The 690 is a dual-GPU card? And not an especially old one, either.
$700 for 3GB, though?
pass, I can't cosign that
Power-hungry for a top enthusiast card is pointless. Loud and hot are valid complaints, no matter who does it.

I'm not sure why AMD cheerleaders keep pointing at the GTX 480 to prove that "power-hungry, loud and hot is totally fine and well accepted when Nvidia does it".
Everyone *loathed* the GTX 480 for it, myself included.
At the time, the common opinion was "Nvidia really dropped the ball with this".
Power-hungry for a top enthusiast card is pointless. Loud and hot are valid complaints, no matter who does it.

The point of discussion here was that no one complained that the 280/480 needed to be benchmarked at similar noise levels back then.
It's not a bad thing at all. You can do something similar in software on NV (and I think earlier AMD?) cards by simply setting the power target and fan profiles way up -- that's what e.g. computerbase do, and they compare both performance and noise at normal and "max" levels. (I know I keep mentioning computerbase; I'm not affiliated with them in any way, it's just that I currently consider their GPU reviews some of the most thorough and professional around -- too bad they don't offer official English translations.)

Well, times have changed. The push for more efficient AND powerful GPUs has been on -- so AMD merely made their top-end part a bit more configurable by putting two BIOSes on it. Previously (with the 7970) there was a switch as well, but just two identical BIOSes. All this shows is AMD's acceptance that enthusiasts want to tinker, whether it be quiet or loud or custom BIOS -- they've set their cards up to accept the needs of enthusiasts. I'm not sure why that's a bad thing.
I'll drop a couple grand on two of these cards if the 6GB versions are $999. The 3GB versions would be nice at $650, but I'd feel like that was a step back from my Titans. The most I can get on them is 1202MHz core.
I'm hoping $800 for 6GB and $900 or $1000 for 12GB. No way it would be a $300 uptick to go from 3GB to 6GB.
I guess no one likes these benches.
Since you didn't notice the point of the discussion, let me repost it again:

What forums were you frequenting?
On Guru3d there was a huge uproar about the 400 series' heat and noise levels (its heat more than its noise, though). The 200 series was not as bad, from what I remember.
I am not sure anyone would argue otherwise... the 290X is hot and loud, and that is surely a detraction against it in comparison to the Nvidia offerings.
The point of discussion here was that no one complained that the 280/480 needed to be benchmarked at similar noise levels back then.
I don't think so.

People are worn out on $500+ GPUs.
I'll drop a couple grand on two of these cards if the 6GB versions are $999. The 3GB versions would be nice at $650, but I'd feel like that was a step back from my Titans. The most I can get on them is 1202MHz core.

Why do that? The 880 is set to have 6,800 CUDA cores and a Q2 launch. That's 2.5x the speed of a Titan.
Any source for this? We've gone from 280 -> 480 -> 580 -> 680 -> 780 with around 15-25% improvements in speed each generation. Why would it all of a sudden be 250% faster?
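A quick back-of-the-envelope check on that skepticism: compounding a steady per-generation gain shows how far the historical 15-25% steps actually get you. The rates and the four-step chain are just the figures from the post above; `compounded_gain` is a hypothetical helper, not anyone's real model of GPU scaling.

```python
# Compound a steady per-generation improvement over the
# 280 -> 480 -> 580 -> 680 -> 780 chain (4 generational steps).
def compounded_gain(per_gen: float, generations: int) -> float:
    """Overall speedup after `generations` steps of `per_gen` improvement each."""
    return (1 + per_gen) ** generations

for rate in (0.15, 0.25):
    print(f"{rate:.0%}/gen over 4 gens -> {compounded_gain(rate, 4):.2f}x total")
```

Even the optimistic 25%-per-generation rate compounds to only about 2.4x across four whole generations, so a single generation delivering 2.5x would be far outside the historical trend.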