zephervack
Member
Holy crap, so Kepler can handle 4 screens on 1 card? Game changer for me; first time I'll be able to go to the green camp in a while. Eagerly awaiting benchies @ 5760x1200. Don't let me down, Nvidia.
From what was said in that HKEPC review I posted, it seems the peripheral screens are refresh-capped. From the sound of it, that could look rough, but I'll withhold judgement until I hear more feedback from reviews or experience it personally.
"So... this means we're going to see 7 or 800 dollar single GPU cards now?"
Whatever gave you that idea? Don't forget, single-GPU cards have launched in the past for $749 (thank you, Nvidia!)
1D_FTW, you currently have an AMD card, right? I'm curious why you're looking to make the switch to Nvidia.
Nvidia is the way it's meant to be played.
Also, they moneyhat a shitload more than ATI these days.
Actually, what that review said is that, when playing in Surround, if the framerate drops, the card will prioritize the main screen (and preferentially drop frames on the peripheral screens). That sounds pretty smart to me.
It seems, performance-wise, a single GTX 680 = two GTX 560 Tis in SLI.
And price-wise, it's much cheaper to buy two GTX 560 Tis than one GTX 680...
Looks like GTX 680 might be an OC'ing beast...
And that's at STOCK voltage (0.987 V). I can't wait to OC one of these! A 1.05 V overvolt should allow 1.2-1.3 GHz core clocks. And dat RAM OC...
I think you underestimate the pull of the Green Goblin. Nvidia has almost always managed to make more money than ATI/AMD, even when they were slower or late to the party, and they're not slower in this case.
Wait, have we solved whatever needs to be solved so that you can get audio from your PC over an HDMI cable connected to your graphics card? I've had tons of problems with HDMI audio.
How are these new cards performing compared to a GTX 590? That dual-GPU card is what I have, and I'm just wondering if I should bother upgrading.
Maybe if we're lucky, the prices seen are only a bit of early gouging and not the actual MSRP. There seem to be a few GTX 680s out and around now, according to some forums (just reading HardOCP). One guy tested it with an i3, for God's sake..
That would be a big win for AMD, as the 7950 is the best OC card ever and costs $100 less (that is, if the 680 costs $549 and doesn't have as much OC room).
The average OC result is a shade below 40%. This is mind-boggling, because the previous overclocking champ was the GTX 460, and it was below 30%..
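For anyone wondering where those percentages come from, here's a minimal sketch of the headroom math. The specific clocks are illustrative assumptions, not measured data: the 7950's 800 MHz reference clock with a ~1110 MHz average overclock, and the GTX 460's 675 MHz reference with a typical ~850 MHz overclock.

```python
def oc_headroom(stock_mhz: float, oc_mhz: float) -> float:
    """Overclock headroom as a percentage gain over the stock clock."""
    return (oc_mhz - stock_mhz) / stock_mhz * 100

# HD 7950: assumed 800 MHz reference -> ~1110 MHz average OC
print(f"7950 headroom:    {oc_headroom(800, 1110):.1f}%")  # 38.8%, a shade below 40%
# GTX 460: assumed 675 MHz reference -> ~850 MHz typical OC
print(f"GTX 460 headroom: {oc_headroom(675, 850):.1f}%")   # 25.9%, below 30%
```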
I don't understand X3390 as a score. Someone explain?
That guy is awesome.
Not only is it a good test of i3 systems being CPU-limited, it also provides a good benchmark for scaling from 2C/4T to 4C/4T and even 4C/8T.
This is the best metric I've seen so far (IF true):
*GTX680 @ 1110/1800 = 7950 @ 1260/1800*
*A stock 680 also seems to be about on par with a stock 7970.
http://www.overclock.net/t/1231254/retail-gtx-680-vs-7970-benchmarks
An i3 2120 will beat any AMD processor. At stock speeds, it pretty much runs toe to toe with the 2500k and i7s in gaming benchmarks. It's only through overclocking, and the rare exception, that there's any separation. So he could have definitely done a lot worse (anything AMD).
Threads are more important than cores with today's games, and the i3 has 4 threads.
I'm not so sure about that. It looks like your min framerates take a huge hit in quad core friendly games like BF3. Dual core becomes especially limiting in certain RTS titles. I wouldn't compare i3 680 runs to a 7970 on a high end quad.
True, but the 2100's HT helps out a tad here, so it isn't as bad as that (TMK).
"Am I missing something? Isn't he just disabling 2 cores on the 2500k? No doubt it's gonna suck; 2-thread gaming has been having issues for years now. The i3 is 4 threads: it's 2 cores/4 threads. The 2500k is 4 cores/4 threads, so there's no difference in the number of threads. A 2500k with 2 cores disabled is far worse than an i3 2120."
Actually, people are reporting smoother performance in BF3 with HT off.
Yeah, especially at the resolution of the Heaven benchmark (1600x900 in his case). Of course it has a lot of people up in arms: "Wait until we see a 2500k @ 4.4GHz!" Well, you're right about the i3; I didn't realize it's a very appropriate CPU to test in this scenario, scaling well with gaming performance. Looks like the 680 will be similar in price and performance to the 7970, which I know will be a disappointment to many.
I said this months ago, that Kepler wouldn't blow away the 7970/7950. Some people thought I was crazy. Perhaps GK110 would've blown away Tahiti, but not GK104.
Either way, Nvidia should price this lower than the 7970 and force competition. If they price it at $450, then AMD will have to drop both the 7970 and the 7950 by at least $50.
Problem is, people like Nvidia better and will pay the premium. They have, what, 66 percent of the discrete market? And some would argue AMD was giving better value/performance during this period. So they don't have to undercut; if they get close, it's good enough for most people.
Yeah, the last thing either Nvidia or AMD want is a price war.
I think it's 59% or something like that.
Yeah, of course they don't want that... so they can continue making higher profit margins.
That's the reason you probably won't get the price wars you're hoping for. With such a small die, a smaller VRAM frame buffer, and a cheaper PCB, Nvidia is going to laugh all the way to the bank. They had record profit margins; all I can say for their next quarter is DAT MARGIN!
http://www.techspot.com/news/47593-jpr-discrete-gpu-shipments-down-65-nvidia-gains-market-share.html
Market shrank overall, but their market share went up to 63 percent during the all important 4th quarter last year.
Nvidia wasn't thrashing them enough to justify it on performance alone; there are other issues at work. Nvidia isn't ever going to undercut when they know people will pay for the intangibles. It's up to AMD to be the aggressor and put out cards that force Nvidia's hand. There's zero incentive in the current situation: it's close enough, and they're not going to move the price.
That's for Q4 alone, not total market share, unless I've misread something in that article. The last article I saw for overall market share was something closer to the 59-60% mentioned before. Anyway, AMD has rebuilt a good enough reputation since the HD 4000 series that they don't feel compelled to undercut Nvidia significantly at the high end, especially now that initial benchmarks indicate the 680 trades blows with the 7970.
It's just the 4th quarter. Although I would assume that's the largest and most profitable quarter (and the one you want to do your best in).
Just saying, Nvidia has a healthy lead that is in no way equal to their performance difference. So if people choose Nvidia for other reasons (and they clearly do), there's no incentive for Nvidia to drop it to 450 to "compete" with the 7970. Unfortunately, it's close enough for them to get away with what they're doing.
Now that the 600 series is out, will there be a discount coming soon for the 400 and 500 series? I don't have the kind of money required for the new hotness, but I also don't care about playing games beyond 1080p and a billion FPS either.
Buy used as folks like me dump their perfectly good cards for the new hotness...you'll get great deals.
BTW - the GTX 680 is not yet officially out. I believe it's March 22.
Oooh I thought it was already out, my bad.
Is it generally safe to buy people's used cards? I don't really know too much about this kind of stuff. I just know that most enthusiasts, like the kind on GAF, OC their cards to hell and back. Would that decrease the life of the card?