
ATI 58XX Preview - Media Stuff goes in here.

 
Dr. Light said:
Don't you think you're blowing this out of proportion?

Although, if they do change the way crossfire works, would that affect existing cards?
I have a 4870X2; it works a charm and I'm very happy with it. The dual GPUs scale well in most games. I haven't noticed the microstuttering or lag that people complain about. Maybe I'm just not very sensitive to that kind of thing. I'm all for improving Crossfire if they can, but personally I think it works pretty well already.

Edit: might buy that 5870X2 bad boy posted above. Damn. 2x2GB of VRAM is a must, though.
 
TheNiX said:
I'd settle for 3, but I'm wondering if they'll be selling different models with the support for 6 monitors as seen in the pictures on page 1.

Well, it's the x2 version, so I'd definitely want 2 extra outputs... otherwise I'd just go with 2x5870s.

Triple monitor setup, +1080p projector, +1080p TV, +next Gen Wacom Cintiq whenever that arrives...

I've got a lot of outputs to display to!
 
More fuel.

http://hardocp.com/news/2009/09/15/nvidia_gt300_yields_are_under_263/

Charlie over at SemiAccurate says that, according to what he has heard, yields of NVIDIA’s GT300 are currently at 1.7%. You read that right, under two percent. So, if what Charlie is reporting is even semi-accurate, yields are extremely bad.

How many worked out of the (4 x 104) 416 candidates? Try 7. Yes, Northwood was hopelessly optimistic - Nvidia got only 7 chips back. Let me repeat that, out of 416 tries, it got 7 'good' chips back from the fab. Oh how it must yearn for the low estimate of 20%, talk about botched execution. To save you from having to find a calculator, that is (7 / 416 = .01682), rounded up, 1.7% yield.
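The arithmetic in the quoted piece checks out; a quick sanity check in Python, using only the numbers the report itself gives:

```python
# Numbers from the quoted report: 4 wafers x 104 candidate dies per wafer.
candidates = 4 * 104   # 416 candidate chips
good = 7               # chips reportedly back from the fab in working order

yield_pct = good / candidates * 100
print(f"{yield_pct:.1f}%")   # 1.7%
```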

The full article.
http://www.semiaccurate.com/2009/09/15/nvidia-gt300-yeilds-under-2/

Things don't look too hot for the green camp right now.
 
Dr. Light said:
Don't you think you're blowing this out of proportion?

Although, if they do change the way crossfire works, would that affect existing cards?

Not at all, the current solution sacrifices playability for high average framerates that look good in benchmarks. That's just something I can't knowingly support.
 
brain_stew said:
Not at all, the current solution sacrifices playability for high average framerates that look good in benchmarks. That's just something I can't knowingly support.
As someone who has used 4 sets of dual cards in the past 3 or so years, I have to say that I do not agree with the gameplay comment. Then again, some people are more sensitive to microstuttering than others, and maybe it has been so long that I no longer remember how games without microstutter are supposed to feel. However, I will give you this: I play very little on my 360, but the 30fps that it displays feels a heck of a lot better than the 30fps I get in Crysis with all settings maxed. So what you are saying is not wrong at all.
Tenacious-V said:
If this is true (I trust [H] but I don't trust CJ), then we as customers will see the negative effects. ATI had planned to launch these cards at very affordable prices, but now that they've realized Nvidia can't compete, the launch prices have apparently increased. I understand that a 5870 at $400 is still a great deal, and that if we're going to talk about gouging customers with high prices Nvidia takes the crown, but still, I like the humble ATI I fell in love with.
 
cyberheater said:
Hitler, as Nvidia's CEO, gets informed about ATI's Evergreen series

http://www.youtube.com/watch?v=FR45ja_fNzU

:lol :lol :lol

i love when they do these vids.

the gpu wars are more interesting than the console wars

brain_stew said:
5850 and an i5 750 + a P55 board and 4GB of DDR3. Use the ~$200 you save and buy a 1080p monitor, job done.

that's what you recommended to me too
but is it better to get 4GB of DDR3 or 6GB?
i'm not too familiar with the dual channel stuff
 
The 5870X2 could very well make me walk away from the GTX295.
Although it would be unfortunate, as I have just picked up Batman AA with PhysX.
This does look like a great card though.
 
Zaptruder said:
Even worse for Nvidia is that all this negativity in the green camp is going to cause fence sitters to jump on the red wagon before they even get a chance to release... that's losing potential sales thanks to opportunity cost!

Well, I also think NV got complacent, though. A lot of the negativity is of their own doing: tons of rebadging, mobility problems, gimping some games for ATI, etc.

But regardless of that, I think a lot of fence sitters are going to jump to red anyway. What we can already say for certain is that, crap yields or not, NV is still only getting engineering samples back, and ATI launches full DX11 next week.
 
I'm a little worried about the lack of market competition over the coming months. Hopefully they stick to that pricing chart. The Juniper 5850 will be a nice $199 upgrade from a GTX260 c216 (what, about 50% faster?), and maybe at the same price (depending on how Nvidia prices things post ATI launch).

Can't wait to see some benchies at lower (normal) resolutions. I would imagine most PC gamers are still in the 900p-1050p range, whereas these new ATIs seem to be explicitly designed for the 2XXXp+ range. Notice the significant bumps only seem to come in at the insanely high resolutions...
 
irfan said:
5870x2 pic
Some quick Photoshop comparison with a 10.5" GTX295 puts that monster at just under twelve inches. I don't know if it'll actually fit in my case.

Glad it has a red stripe, though; it'll go much faster now.
 
I'm interested in a new card... I currently have two 8800 GTs in SLI, and I just WANT to get away from SLI. I haven't had great luck with it.

Are there any pre-release comparisons of these new cards vs. today's cards? Should I just keep the 8800 GTs for now and wait a little longer? I am so utterly confused with there being so many damn cards on the market, and all the information on the net just straight confuses me.
 
Fuck's sake, I eye-measured the MASSIVE card. Looks like I need a whole new shelf-built system again, along with a new tower case & PSU.

Time to sell one or two of my consoles for a MASSIVE upgrade.
 
brain_stew said:
Not at all, the current solution sacrifices playability for high average framerates that look good in benchmarks. That's just something I can't knowingly support.
and as an owner of an SLi card, i've got your back, not that you need it.

until they resolve microstuttering, every future graphics card purchase of mine will be the card with the most powerful single chipset money can buy.

my 9800 GX2 is a good card, and microstuttering isn't as big a deal in some games as in others, but 60 FPS Burnout Paradise on PC with SLi just isn't the same as 60 FPS Burnout Paradise on 360. every game where i can hit 60 in single card mode, you can bet your ass i do.
 
K.Jack said:
People trust anything SA says about Nvidia. :lol

Well the lack of any real news on progress from nVidia doesn't help to refute these claims.
 
Sleeker said:
i love when they do these vids.

the gpu wars are more interesting than the console wars



that's what you recommended to me too
but is it better to get 4GB of DDR3 or 6GB?
i'm not too familiar with the dual channel stuff

You want to buy RAM in matched pairs for dual channel, so 6GB isn't an option: 4GB or 8GB.

8GB is a waste for anyone that doesn't spend a lot of their time in seriously RAM-hungry applications, and that doesn't include games.
 
brain_stew said:
Not at all, the current solution sacrifices playability for high average framerates that look good in benchmarks. That's just something I can't knowingly support.

Can you please stop spouting this BS in every single PC graphics thread? SLI and Crossfire do not have microstutter anymore; newer drivers (the 180 series at least) do not have this problem. Triple buffering is partially used automatically with SLI/Crossfire because of how the buffers are implemented. I know this because I have SLI myself: when Vsync is on, the FPS count never goes 60-30-etc. Sure, it's capped at a maximum of 60, but the fps is pretty much identical to having triple buffering on.

Please tell me: do you have SLI/Crossfire, or are you getting this from websites?

Now don't get me wrong, there are some drawbacks (certain areas in games that are not SLI/Crossfire optimized can cause some sharp slowdowns), but microstutter and triple buffering are not among them.
 
Xavien said:
Can you please stop spouting this BS in every single PC graphics thread? SLI and Crossfire do not have microstutter anymore; newer drivers (the 180 series at least) do not have this problem. Triple buffering is partially used automatically with SLI/Crossfire because of how the buffers are implemented. I know this because I have SLI myself: when Vsync is on, the FPS count never goes 60-30-etc. Sure, it's capped at a maximum of 60, but the fps is pretty much identical to having triple buffering on.

Please tell me: do you have SLI/Crossfire, or are you getting this from websites?

Now don't get me wrong, there are some drawbacks (certain areas in games that are not SLI/Crossfire optimized can cause some sharp slowdowns), but microstutter and triple buffering are not among them.

Using a FRAPS counter to check whether or not triple buffering is working is not the way to do it, as it measures fps, not frametimes. Go look up the actual frametimes; they won't correspond to a triple-buffered game.

It's inherent in the way the technology currently works. Each GPU renders alternate frames. Once a frame is rendered it is displayed. If the two GPUs render their frames one just after the other, you'll have two frames, then a long gap, then another two frames. Go record your frametimes, and I'll guarantee they're not the steady increments you'd see with a single GPU.

If SLI were optimised to remove microstutter with AFR, then you'd see a significant decrease in your average fps, yet we haven't had a single driver release that proudly boasts of reducing SLI performance; quite the opposite.

Then there's the memory issue; what a horrible waste of resources that is. I don't have to personally run an SLI rig to understand the way the technology works, and why it's not something I desire.

It's not at all comparable to a single-GPU setup, and I don't really care if that offends people who have spent a lot of money.

Every scientific look at this has confirmed there is still microstutter; show me a readout of frametimes that proves otherwise.
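To put numbers on that, here's a toy sketch in Python (made-up frametimes, not measurements from any real rig): two traces that report the same average fps, one evenly paced like a single GPU, one with the paired short/long gaps AFR tends to produce.

```python
# Hypothetical frametime traces in milliseconds -- illustrative numbers only.
even   = [16.7] * 60        # single-GPU style: a steady ~16.7 ms per frame
paired = [5.0, 28.4] * 30   # AFR style: short gap, long gap, repeat

for name, trace in (("even", even), ("paired", paired)):
    avg_fps = 1000 * len(trace) / sum(trace)   # what an fps counter reports
    worst_gap_ms = max(trace)                  # what your eye actually notices
    print(f"{name:6s} avg {avg_fps:.0f} fps, worst gap {worst_gap_ms:.1f} ms")
```

Both traces report ~60 fps on a counter, but the paired trace leaves a ~28 ms hole between every visible pair of frames, which is roughly the cadence of a 35 fps game.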
 
Xavien said:
Can you please stop spouting this BS in every single PC graphics thread, SLI and Crossfire do not have microstutter anymore, newer drivers do not have this problem (180 series atleast).
Microstuttering for 2-card SLI is mostly fixed. It's still a problem for 3x/4x SLI and any kind of crossfire. Also, each additional GPU will increase your input lag by one frame -- this will most likely remain the case as long as AFR is used.

What I've written above has been confirmed by multiple independent tests; one of the most in-depth ones is here (though it's in German).
 
brain_stew said:
Using a FRAPS counter to check whether or not triple buffering is working is not the way to do it, as it measures fps, not frametimes. Go look up the actual frametimes; they won't correspond to a triple-buffered game.

It's inherent in the way the technology currently works. Each GPU renders alternate frames. Once a frame is rendered it is displayed. If the two GPUs render their frames one just after the other, you'll have two frames, then a long gap, then another two frames. Go record your frametimes, and I'll guarantee they're not the steady increments you'd see with a single GPU.

If SLI were optimised to remove microstutter with AFR, then you'd see a significant decrease in your average fps, yet we haven't had a single driver release that proudly boasts of reducing SLI performance; quite the opposite.

Then there's the memory issue; what a horrible waste of resources that is. I don't have to personally run an SLI rig to understand the way the technology works, and why it's not something I desire.

It's not at all comparable to a single-GPU setup, and I don't really care if that offends people who have spent a lot of money.

Every scientific look at this has confirmed there is still microstutter; show me a readout of frametimes that proves otherwise.
i don't think Xavien truly understands the issue. i don't mean that as an insult to him either. it's a pretty hard thing for a lot of people to understand: when microstuttering gets especially bad, 60 fps can look more like 30, but the whole time their fps counter will report 60 fps.

it isn't just a motion thing, and it isn't like what you see with 3:2 pulldown. the problem isn't just that one frame displays for less time than the other:

the closer together two frames are rendered, the more indistinguishable they become. the perceived frame rate is closer to one second divided by the longer gap between frames than it is to the actual number of frames per second.
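in code, that rule of thumb might look like this (my own paraphrase of the post with made-up gap numbers, not an official metric):

```python
def reported_fps(short_gap_ms: float, long_gap_ms: float) -> float:
    """What an fps counter shows: the average over both gaps of a pair."""
    return 2000.0 / (short_gap_ms + long_gap_ms)

def perceived_fps(long_gap_ms: float) -> float:
    """Rule of thumb: perception is dominated by the longer of the two
    gaps, not by the average."""
    return 1000.0 / long_gap_ms

# With a hypothetical 5 ms / 28.4 ms short/long AFR pattern:
print(round(reported_fps(5.0, 28.4)))   # counter says ~60
print(round(perceived_fps(28.4)))       # feels closer to ~35
```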
 
Durante said:
Microstuttering for 2-card SLI is mostly fixed. It's still a problem for 3x/4x SLI and any kind of crossfire. Also, each additional GPU will increase your input lag by one frame -- this will most likely remain the case as long as AFR is used.

What I've written above has been confirmed by multiple independent test, one of the most in-depth ones is here (though it's in German).
i am not seeing any data other than the usual graphs showing microstuttering as seen here.

http://www.computerbase.de/artikel/...95/21/#abschnitt_mikroruckler_auf_der_gtx_295

but my german is very rusty.

i didn't know about microstutter when i got my GPU, and it started bothering me this summer before i even knew what it was. ATI's used to be really, really bad. it hasn't gone away; it's just been reduced compared to what it was.

it's still present and it's still artificially inflating performance figures. until benchmarks take it into account, it isn't likely to go anywhere.
 
Zaptruder said:
Well, it's the x2 version, so I'd definitely want 2 extra outputs... otherwise I'd just go with 2x5870s.

Ah, I thought people were saying they were getting 5870 x2, didn't know it was a model. Thanks for clearing that up.
 
plagiarize said:
i am not seeing any data other than the usual graphs showing microstuttering as seen here.

http://www.computerbase.de/artikel/...95/21/#abschnitt_mikroruckler_auf_der_gtx_295

but my german is very rusty.

i didn't know about microstutter when i got my GPU, and it started bothering me this summer before i even knew what it was. ATI's used to be really, really bad. it hasn't gone away; it's just been reduced compared to what it was.

it's still present and it's still artificially inflating performance figures. until benchmarks take it into account, it isn't likely to go anywhere.

That certainly doesn't seem to suggest microstuttering has gone away at all, and those graphs look to be based on a GTX 295, which in theory should have the least microstuttering. I've seen much worse graphs in the past, sure, but the fact of the matter is that "60fps" from an SLI/Crossfire card is not the same as "60fps" from a single-GPU setup.

Add in lower minimum framerates, huge power draw, losing half of your memory and a heavy reliance on driver updates, and it's just not something I'm going to be a fan of in its current implementation. If a single GPU is within 20-30% of a dual GPU's average framerate, then I'd take the single-card solution every time.
 
Sebulon3k said:
Have the dimensions for the card been released yet?

If they have can anyone tell me if it'll fit in this Case :D

Dude, there is no way it'll fit in that case :P

It's a mid-tower first and foremost, which makes it difficult for long cards to squeeze in.

Then the HDD slots in that case pretty much seal the deal as far as not being able to fit this card (not enough space).

If this card really is longer than a 4870X2, oh my :lol
 
fu3lfr3nzy said:
Dude, there is no way it'll fit in that case :P

It's a mid-tower first and foremost, which makes it difficult for long cards to squeeze in.

Then the HDD slots in that case pretty much seal the deal as far as not being able to fit this card (not enough space).

If this card really is longer than a 4870X2, oh my :lol

Good thing, too; I was about to click Complete before I remembered the 5870 is supposed to be huge :lol
 
Short question; not sure how much I'd have to wade through in this thread to see if it's been answered.

Have current video cards had a significant drop in price due to this, or is that still coming?
 