irfan said:
I have a 4870x2, works a charm, very happy with it. The dual GPUs scale well in most games. I haven't noticed the microstuttering or lag that people complain about. Maybe I'm just not very sensitive to that kind of thing. I'm all for improving Crossfire if they can, but personally I think it works pretty well already.
Dr. Light said:Don't you think you're blowing this out of proportion?
Although, if they do change the way crossfire works, would that affect existing cards?
Zaptruder said:Will buy if it has 5 or more outputs.
TheNiX said:I'd settle for 3, but I'm wondering if they'll be selling different models with support for 6 monitors, as seen in the pictures on page 1.
Charlie over at SemiAccurate says that, according to what he has heard, yields of NVIDIA's GT300 are currently at 1.7%. You read that right: under two percent. So, if what Charlie is reporting is even semi-accurate, yields are extremely bad.
How many worked out of the (4 x 104 =) 416 candidates? Try 7. Yes, Northwood was hopelessly optimistic - Nvidia got only 7 chips back. Let me repeat that: out of 416 tries, it got 7 'good' chips back from the fab. Oh, how it must yearn for the low estimate of 20%; talk about botched execution. To save you from having to find a calculator, that is 7 / 416 = 0.0168, or, rounded up, a 1.7% yield.
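The arithmetic in that quote checks out; a quick sketch, using only the figures given above, verifies it:

```python
# Verify the reported GT300 yield math: 4 wafers x 104 candidate dies each.
wafers = 4
candidates_per_wafer = 104
good_chips = 7

total_candidates = wafers * candidates_per_wafer   # 416 candidate dies
yield_rate = good_chips / total_candidates         # 0.0168...
print(f"{total_candidates} candidates -> {yield_rate:.1%} yield")
# 416 candidates -> 1.7% yield
```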
Tenacious-V said:More fuel.
http://hardocp.com/news/2009/09/15/nvidia_gt300_yields_are_under_263/
The full article.
http://www.semiaccurate.com/2009/09/15/nvidia-gt300-yeilds-under-2/
Things don't look too hot for the green camp right now.
Dr. Light said:Don't you think you're blowing this out of proportion?
Although, if they do change the way crossfire works, would that affect existing cards?
As someone who has used four sets of dual cards in the past three or so years, I have to say that I do not agree with the gameplay comment. Then again, some people are more sensitive to microstuttering than others, and maybe it has been so long that I no longer remember how games without microstutter are supposed to feel. However, I will give you this: I play very little on my 360, but the 30fps it displays feels a heck of a lot better than the 30fps I get in Crysis with all settings maxed. So what you are saying is not wrong at all.
brain_stew said:Not at all, the current solution sacrifices playability for high average framerates that look good in benchmarks. That's just something I can't knowingly support.
If this is true (I trust [H] but I don't trust CJ) then we as customers will see the negative effects. ATI had planned to launch these cards at very affordable prices, but now that they have realized Nvidia can't compete, the launch prices have apparently increased. I understand that a 5870 at $400 is still a great deal, and that if we are going to talk about gouging customers with high prices Nvidia takes the crown, but still, I like the humble ATI I fell in love with.
Tenacious-V said:More fuel.
http://hardocp.com/news/2009/09/15/nvidia_gt300_yields_are_under_263/
The full article.
http://www.semiaccurate.com/2009/09/15/nvidia-gt300-yeilds-under-2/
Things don't look too hot for the green camp right now.
cyberheater said:Hitler, as Nvidia's CEO, gets informed about ATI's Evergreen series
http://www.youtube.com/watch?v=FR45ja_fNzU
:lol :lol :lol
brain_stew said:5850 and an i5 750 + a P55 board and 4GB of DDR3. Use the ~$200 you save to buy a 1080p monitor; job done.
irfan said:
cyberheater said:Hitler, as Nvidia's CEO, gets informed about ATI's Evergreen series
http://www.youtube.com/watch?v=FR45ja_fNzU
:lol :lol :lol
Zaptruder said:Even worse for Nvidia is that all this negativity in the green camp is going to cause fence sitters to jump on the red wagon before they even get a chance to release... that's losing potential sales thanks to opportunity cost!
cyberheater said:Hitler, as Nvidia's CEO, gets informed about ATI's Evergreen series
http://www.youtube.com/watch?v=FR45ja_fNzU
:lol :lol :lol
Some quick Photoshop comparison with a 10.5" GTX295 puts that monster at just under twelve inches. I don't know if it'll actually fit in my case.
irfan said:5870x2 pic
People trust anything SA says about Nvidia. :lol
Tenacious-V said:More fuel.
http://hardocp.com/news/2009/09/15/nvidia_gt300_yields_are_under_263/
The full article.
http://www.semiaccurate.com/2009/09/15/nvidia-gt300-yeilds-under-2/
Things don't look too hot for the green camp right now.
He was THE whistleblower on bumpgate, so yeah.
K.Jack said:People trust anything SA says about Nvidia. :lol
It's already out, but somewhat limited: http://www.amd.com/us/products/technologies/ati-xgp/Pages/ati-xgp.aspx
SapientWolf said:Ugh, when are external video cards coming?
And as an owner of an SLI card, I've got your back, not that you need it.
brain_stew said:Not at all, the current solution sacrifices playability for high average framerates that look good in benchmarks. That's just something I can't knowingly support.
K.Jack said:People trust anything SA says about Nvidia. :lol
Davidion said:That's the most phallic card I've ever seen. :lol
Sleeker said:I love it when they do these vids.
The GPU wars are more interesting than the console wars.
That's what you recommended to me too.
But is it better to get 4GB of DDR3 or 6GB?
I'm not very familiar with the dual channel stuff.
More than 400 of them.
Binabik15 said:How many Gamecubes have to be taped together for one Hemlock?
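For the curious, the napkin math behind "more than 400" works out if you take the commonly cited peak-FLOPS figures; note that both numbers below are assumptions brought in for illustration, not from this thread:

```python
# Napkin math only -- both peak-FLOPS figures are outside assumptions:
# ~10.5 GFLOPS commonly cited for the GameCube as a system, and
# ~4640 GFLOPS single-precision peak for the HD 5970 ("Hemlock").
gamecube_gflops = 10.5
hemlock_gflops = 4640.0
print(hemlock_gflops / gamecube_gflops)  # roughly 440-ish GameCubes
```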
brain_stew said:Not at all, the current solution sacrifices playability for high average framerates that look good in benchmarks. That's just something I can't knowingly support.
I was in tears from 00:56-1:14.
cyberheater said:Hitler, as Nvidia's CEO, gets informed about ATI's Evergreen series
http://www.youtube.com/watch?v=FR45ja_fNzU
:lol :lol :lol
Xavien said:Can you please stop spouting this BS in every single PC graphics thread? SLI and Crossfire do not have microstutter anymore; newer drivers do not have this problem (180 series at least). Triple buffering is partially used automatically when you use SLI/Crossfire because of the way the buffers are implemented. I know this because I have SLI myself: when Vsync is on, the FPS count never goes 60-30-etc. Sure, it's capped at a maximum of 60, but the fps is pretty much identical to having triple buffering on.
Please tell me, do you have SLI/Crossfire? Or are you getting this from websites?
Now don't get me wrong, there are some drawbacks (certain areas in games that are not SLI/Crossfire optimized can cause some sharp slowdowns), but microstutter and triple buffering are not among them.
Microstuttering for 2-card SLI is mostly fixed. It's still a problem for 3x/4x SLI and any kind of crossfire. Also, each additional GPU will increase your input lag by one frame -- this will most likely remain the case as long as AFR is used.
Xavien said:Can you please stop spouting this BS in every single PC graphics thread, SLI and Crossfire do not have microstutter anymore, newer drivers do not have this problem (180 series at least).
I don't think Xavien truly understands the issue, and I don't mean that as an insult to him either. It's a hard issue for a lot of people to understand: with bad microstuttering, 60 fps can look more like 30, but the whole time the fps counter will report 60.
brain_stew said:Using a FRAPs counter to check whether or not triple buffering is working is not the way to do it, as it measures fps, not frametimes. Go look up the actual frametimes, and they won't correspond to a triple buffered game.
It's inherent to the way the technology currently works. Each GPU renders alternate frames, and once a frame is rendered it is displayed. If the two GPUs render frames one just after the other, you get two frames, then a long gap, then another two frames. Go record your frametimes, and I guarantee they're not the steady increments you'd see with a single GPU.
If SLI were optimised to remove microstutter with AFR, you'd see a significant decrease in your average fps, yet we haven't had a single driver release that proudly boasts of reducing SLI performance; quite the opposite.
Then there's the memory issue; what a horrible waste of resources that is. I don't have to personally run an SLI rig to understand how the technology works, and why it's not something I desire.
It's not at all comparable to a single GPU setup, and I don't really care if that offends people who have spent a lot of money.
Every scientific look at this has confirmed there is still microstutter; show me a readout of frametimes that proves otherwise.
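To make the frametime point concrete, here's a toy sketch (the millisecond numbers are made up for illustration) of why an fps counter can report 60 while AFR frame pacing feels much worse:

```python
# Two frametime sequences with identical average fps but very different pacing.
# AFR delivers frames in pairs: a short gap, then a long gap, repeated.
smooth_ms = [16.7] * 6       # single GPU: even ~60 fps pacing
afr_ms = [5.0, 28.4] * 3     # dual-GPU AFR: same total time, uneven pacing

for name, times in (("smooth", smooth_ms), ("AFR", afr_ms)):
    avg_fps = 1000 * len(times) / sum(times)   # what a FRAPS-style counter shows
    felt_fps = 1000 / max(times)               # what the long gaps feel like
    print(f"{name}: avg {avg_fps:.0f} fps, worst-gap {felt_fps:.0f} fps")
# smooth: avg 60 fps, worst-gap 60 fps
# AFR: avg 60 fps, worst-gap 35 fps
```

Both sequences average ~60 fps, but the AFR one spends most of each pair behind a 28 ms gap, which is closer to 35 fps territory; that's the gap between the counter and the experience.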
I am not seeing any data other than the usual graphs showing microstuttering, as seen here.
Durante said:Microstuttering for 2-card SLI is mostly fixed. It's still a problem for 3x/4x SLI and any kind of crossfire. Also, each additional GPU will increase your input lag by one frame -- this will most likely remain the case as long as AFR is used.
What I've written above has been confirmed by multiple independent tests; one of the most in-depth ones is here (though it's in German).
Zaptruder said:Well, it's the x2 version, so I'd definitely want 2 extra outputs... otherwise I'd just go with 2x 5870s.
Don't cry, I'll buy you an ATI!
Omnicent said:I was in tears from 00:56-1:14
This one will NEVER be topped. It's the best fit I've seen yet. :lol
cyberheater said:Hitler, as Nvidia's CEO, gets informed about ATI's Evergreen series
http://www.youtube.com/watch?v=FR45ja_fNzU
:lol :lol :lol
plagiarize said:i am not seeing any data other than the usual graphs showing microstuttering as seen here.
http://www.computerbase.de/artikel/...95/21/#abschnitt_mikroruckler_auf_der_gtx_295
But my German is very rusty.
I didn't know about microstutter when I got my GPU, and it started bothering me this summer before I even knew what it was. ATI's used to be really, really bad. It hasn't gone away; it's just been reduced compared to what it was.
It's still present and it's still artificially inflating performance figures. Until benchmarks take it into account, it isn't likely to go anywhere.
Septimus said:Jesus Christ, that thing will not fit in my case haha. :lol :lol @ Nvidia Hitler video. Although the Blu-ray vs. HD DVD one always made me laugh too.
Sebulon3k said:Have the dimensions for the card been released yet?
If they have, can anyone tell me if it'll fit in this case?
fu3lfr3nzy said:Dude, there is no way it'll fit in that case
It's a mid-tower first and foremost, which makes it difficult for long cards to squeeze in.
Then the HDD bays in that case pretty much seal the deal as far as not being able to fit this card (not enough space).
If this card really is longer than a 4870X2, oh my :lol