
Rumor: AMD DirectX 11 Card Performances, Prices Revealed, Surprisingly Affordable

As long as this heated conversation is going on, I figured it might be interesting to drop this weekend sale from Fry's for the EVGA GTX 275 card.

$149 after rebate, not too shabby. Then again, ATI's new cards will just cause prices on the current ones to drop anyway, but that doesn't change that this is a pretty swell price right now.

http://www.frys.com/product/5892363?site=frysecampaign
 
Mr.Potato Head said:
I know piracy has a bit to do with it, but it's NOT the only reason behind the lack of PC games taking advantage of high-end gear, and it's also NOT the biggest reason behind it. I wish people would stop saying piracy is why... it's NOT! It's partly Intel's fault with the crap integrated graphics, and the fact that Microsoft has yet to address the "user friendliness" missing from the PC as a gaming platform for the average, not-too-PC-savvy gamer. PC gaming nowadays is not like it was back in the C64 days, when even I could pop in a floppy disc and play a game (I was around 12 back in the C64 days). If anything, computers are more complex for the average user these days.


[Image: C64 game-loading screenshot]


It wasn't hard but people would've been put off by the command line :p
 

ghst

Thanks for the laugh.
I'm just thankful for console gamers' neutered standards when it comes to performance/resolution/IQ. The whole situation could be a damn sight worse.

I don't have the same cloud-yelling passion, but I think there's a little bit of what ut66 is saying in most high-end PC gamers.

But I also think this bottom-line/forced-homogeneity bullshit is just that. PC isn't a hardware spec. It's a way for people with whatever hardware spec they feel like putting together to play whatever games they feel like playing on it. Anything else is pure compromise, and do that enough times and you end up with a console.

And who wants that.
 
ghst said:
i don't have the same cloud yelling passion, but i think there's a little bit of what ut66 is saying in most high end pc gamers.
I think we all would love to see the full capabilities of whatever hardware we have, but what he is suggesting is pure nonsense, and a few seconds of critical thinking makes it clear that the system we have now is not only infinitely more sustainable, but probably more consumer-friendly as well.
 

tokkun

Member
brain_stew said:
Well if SLI/Crossfire worked as advertised you might have a point, but they don't and will forever be a horrible upgrade path.

Can you expand on that? I don't run SLI myself, but the benchmarks I've seen on enthusiast sites showed (for example) a Crossfire pair of 4850s destroying the 4890 in the vast majority of benchmarks.

Edit: Here is an example. Even the extremely cheap 4770 in Crossfire is able to beat the 4890 in two thirds of the benchmarked games.
http://www.hardocp.com/article/2009/08/18/amd_radeon_hd_4770_crossfire_evaluation/1
 
tokkun said:
Can you expand on that? I don't run SLI myself, but the benchmarks I've seen on enthusiast sites showed (for example) a Crossfire pair of 4850s destroying the 4890 in the vast majority of benchmarks.

Edit: Here is an example. Even the extremely cheap 4770 in Crossfire is able to beat the 4890 in two thirds of the benchmarked games.
http://www.hardocp.com/article/2009/08/18/amd_radeon_hd_4770_crossfire_evaluation/1

Oh, they produce very nice-looking benchmarks, no doubt; the problem is that those benchmarks don't translate into how smoothly the game will actually play. I've talked about it at length before and there's plenty of discussion on the internet; just look up "AFR", "frametimes" and "micro stutter" if you want to find out why it's such a messy situation. Forgetting all that, of course: you are always at the mercy of drivers, performance is nowhere near consistent, "low" framerates (you know, the thing that arguably has the biggest impact on how smooth a game plays?) are universally lower, and power/heat output compared to what you get in game is just all out of whack. Oh, and you'll very often be limited by your framebuffer, which is only the size of one card's memory pool, not two.

It's a terrible upgrade path; always has been, always will be. Until there's a radical change in how the technology is employed, it's not something I'll recommend.
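The AFR micro-stutter complaint is easy to show with arithmetic: average FPS can look identical while frame pacing is far worse. A minimal Python sketch, with frametime traces invented purely for illustration:

```python
# Illustration (invented numbers): why average FPS hides micro-stutter.

def fps_stats(frametimes_ms):
    """Return (average FPS, worst-case frametime in ms)."""
    avg_ms = sum(frametimes_ms) / len(frametimes_ms)
    worst_ms = max(frametimes_ms)
    return 1000.0 / avg_ms, worst_ms

# Single card: a steady 20 ms per frame.
single = [20.0] * 100

# AFR multi-GPU: alternating short/long frames (micro-stutter),
# with the exact same 20 ms average.
afr = [8.0, 32.0] * 50

print(fps_stats(single))  # 50 FPS average, 20 ms worst frame
print(fps_stats(afr))     # 50 FPS average, 32 ms worst frame
```

Both traces benchmark at "50 FPS", but in the AFR-style trace every other frame takes 32 ms, which is what you actually feel as stutter.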
 

SapientWolf

Trucker Sexologist
tokkun said:
Can you expand on that? I don't run SLI myself, but the benchmarks I've seen on enthusiast sites showed (for example) a Crossfire pair of 4850s destroying the 4890 in the vast majority of benchmarks.

Edit: Here is an example. Even the extremely cheap 4770 in Crossfire is able to beat the 4890 in two thirds of the benchmarked games.
http://www.hardocp.com/article/2009/08/18/amd_radeon_hd_4770_crossfire_evaluation/1
Crossfire shows its strength at ridiculously high resolutions and settings, but the 4890 puts up comparable numbers up to 1080p and consumes far less power. You're generally better off trading up than going Crossfire/SLI.
 

pestul

Member
Dammit guys, with this thread being 7 pages long I was tricked into believing benchmarks would actually be included. Another 20 minutes gone to GAF.
 

artist

Banned
pestul said:
Damnit guys, with this thread being 7 pages long I was tricked into believing benchmarks would actually be included. Another 20min gone to GAF.
There are some performance numbers; see my post on page 2 or 3. UT66 has derailed yet another thread. :/
 
I might be upgrading to ATI's latest offering, swapping out my 4870, depending on whether I'd get any performance increase with my 3.2GHz dual-core E6750.

Hopefully ATI does something about some sort of PhysX equivalent. Hopefully devs use DirectX 11's API instead of Nvidia's.
 
Not completely on topic, but related.

Any word on refreshes of current cards using the new fab process? A part like a 4870 or 4890 with a considerable drop in power consumption would fill my needs. :D
 

pestul

Member
The timing couldn't be better for me. I'm getting ready to scrap an ancient AthlonXP machine for a powerhouse Windows 7 system this fall. A $300 video card is the perfect price point too.
 

mrgone

Member
UT66 said:
My $85 4770 laughs at this news. Hell my 9800GT and even the 8600GTS sit here unused and laugh at this fuking news. BASICALLY...

FUCK YOU ATI. FUCK YOU NVIDIA. FUCK YOU AMD. AND FUCK YOUR ASS INTEL. OH AND MICROSOFT. FUCK YOU TOO, YOU STUPID FUCKS. LET ME ASK YOU THIS...

Where the hell is my KZ2 caliber game, exclusive for the PC? MY GT5 CALIBER GAME? AH?

Crysis? Fuck that. YES, It was funny game for like a few days, but basically that's just throwing shit at my PC, so you FUCKS can justify selling your ridiculous hardware. That doesn't strike me as a good, intelligent, and honest effort. That's not efficient. That doesn't wow me. KZ2 does. GT5 does ( For the record, im no ps3 fan) And those games are running on a super piece of shit notebook gpu from 2005!!

So enougth of this bullshit. ENOUGH! YOU WANT ME TO BUY YOUR STUPID HARDWARE? WOW ME. USE WHAT I HAVE FOR A FUKING CHANGE. PUT SOME FUCKING EFFORT ON IT. HIGHER ANTI ALIASING AND HIGER RESOLUTION IS NOT GOING TO CUT IT ANYMORE. IM NOT ASKING FOR MUCH. 720P AND 30FPS IS GOOD ENOUGH FOR ME.
JUST TAKE WHATS LEFT AND SQUEEZE REALLY HARD. YOU KNOW? LIKE YOU FUCKS DO WITH THE CONSOLES. UNTIL THEN, FUCK YOU.

 

mjolnirsbane

Neo Member
Let's just face the facts here, people... consoles wouldn't exist without the PC. The ability to UPGRADE on the fly outlives a console's limited lifespan.

Consoles are meant for casual gamers and PCs are meant for serious gamers. I own both, so don't tell me otherwise.

Inflammatory? Maybe... right? Definitely.

And of course PCs are also good for infinitely many other applications. When is the last time you surfed the web with a console? Dreamcast. Yep.
 

theultimo

Member
mjolnirsbane said:
Lets just face the facts here people... consoles wouldn't exist without the PC. The ability to UPGRADE on the fly outlives the consoles limited lifespan.

Consoles are meant for casual gamers and PC's are meant for serious gamers. I own both so don't tell me otherwise.

Inflamatory? Maybe... right? Definitely.

And of course PC's are also good for infinite other applications. When is the last time you surfed the web with a console? Dreamcast. Yep.

Well, I sometimes surf with the Wii and PS3...


However, the easiest and best method will always be PC/Mac Browsers.
 

Xavien

Member
mjolnirsbane said:
Lets just face the facts here people... consoles wouldn't exist without the PC. The ability to UPGRADE on the fly outlives the consoles limited lifespan.

Consoles are meant for casual gamers and PC's are meant for serious gamers. I own both so don't tell me otherwise.

Inflamatory? Maybe... right? Definitely.

And of course PC's are also good for infinite other applications. When is the last time you surfed the web with a console? Dreamcast. Yep.

Surfing with the Wii can be fairly easy with the pointer, as long as you don't need to use a keyboard. But the other two consoles? I agree 100%.
 
Xavien said:
Surfing with the Wii can be fairly easy with the pointer, as long as you don't need to use a keyboard. but the other two consoles? i agree 100%.

The input device isn't the issue; the lack of RAM is (though this wouldn't be a big issue on the PS3 if Sony actually spent time making its browser decent). Overall, browsing the web on consoles is pretty useless.
 
U K Narayan said:
What do you guys usually do with a graphics card when you replace it? Sell it off?

If I can get some decent scrillas for it, I sell it off. If not, I have four computers in my house; there's a never-ending hand-me-down chain going on here.
 
U K Narayan said:
What do you guys usually do with a graphics card when you replace it? Sell it off?
I don't replace mine often enough to have a regular pattern for it (I buy one every few years, much less frequently than some PC detractors claim you need to in order to stay current with games), but last time I just ended up passing it down to my friend, who is pretty new to PC gaming. I did a pretty significant upgrade a few years back and passed on most of my major components (CPU/mobo/GPU) to him as a base to build his own PC so we could play Left 4 Dead. It's still doing fine for him.

Next time around I'm thinking about building an HTPC starting with leftovers.
 

artist

Banned
GT300 in 2010. He was the first one to have called GT212 cancelled, so take it for what it's worth.

Nvidia roadmaps turn up

September 1, 2009

THE BATTLE FOR GPU SUPREMACY this coming winter solstice holiday season is looking like Nvidia (Nasdaq: NVDA) is bringing a dull butter knife and a blindfold to a howitzer fight. At least that is the impression its latest round of roadmaps is giving.

The roadmaps recently shown to SemiAccurate confirm the bleak outlook we have been talking about recently. There are two of them, each dated mid-August, about two weeks ago. One covers Q3/2009 through Q2/2010, and the other covers the Holiday 2009 and Spring 2010 selling seasons.

[Roadmap slide: one year's worth of code names]

The information covered is pretty easy to interpret. The D10 generation is a 55nm part, but Nvidia is up to its usual sleazy naming tricks, even to its NDA'd partners. It is claiming that renamed G9x parts are the D10 generation even though they are not. If it can't be honest with itself, what chance does it have to be honest to customers? Don't answer that. There is literally nothing new here.

D11 is the code name for the 40nm shrinks of, well, we aren't really sure anymore. They could be G9x derivatives, or they could be G200 based, but since they are legitimate shrinks, we'll give them the new generational name.

Take note that the D11 parts are only in the bottom three categories, Mainstream A and B, along with Performance A. As we said a few days ago, this is nothing more than a cynical renaming borne from a gross inability to make new parts. It also is quite emblematic of Nvidia's inability to deliver even a G92 class part on 40nm. To bang that drum once again, if Nvidia has huge problems yielding ~100mm^2 and smaller dies on TSMC 40nm, what chance does it have of producing a >500mm^2 part? For the math averse, the answer is less than 1/25th the chance.
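The drum-banging about die size has real math behind it. Under the standard Poisson defect-yield model, yield falls exponentially with die area, so a 5x bigger die is far more than 5x harder to build. A sketch with a made-up defect density (the article gives no actual numbers):

```python
# Poisson defect-yield model: yield ~ exp(-defect_density * die_area).
# The defect density below is assumed for illustration, not a real figure.
import math

def poisson_yield(defect_density_per_mm2, die_area_mm2):
    """Fraction of dies with zero defects under a Poisson defect model."""
    return math.exp(-defect_density_per_mm2 * die_area_mm2)

d0 = 0.01  # defects per mm^2 on a hypothetical immature 40nm process

small = poisson_yield(d0, 100)  # ~100 mm^2 die -> ~37% of dies are good
big = poisson_yield(d0, 530)    # ~530 mm^2 die -> ~0.5% of dies are good

print(f"yield ratio: {small / big:.0f}x")  # roughly 74x worse
```

With these assumed numbers, the 5.3x-larger die isn't 5.3x harder to yield, it's roughly 74x harder, which is the shape of the "less than 1/25th the chance" claim above.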

The most interesting item is D12U or GT300. The U stands for Ultra, the highest end variant. Nvidia is claiming that will be out in Q1, but that forecast is really dependent on what it gets back from the fab later this week or early next. If its hot lots need a respin, this Q1/10 date is suddenly going to seem hopelessly optimistic.

On the original roadmaps shown to SemiAccurate, all of the lower 2 classes had the D10 to D11 transition happening on the Q4 to Q1 boundary, that is, January 1. The Enthusiast segment has the D10U to D12U transition happening after that - the bar of the graph clearly goes well into Q1. This is either fudging or a CES paper launch, followed by availability sometime later.

[Roadmap slide: product names by season]

More interesting bits can be found on the second page, the seasonal roadmap. ATI will have DX11 parts on sale in Q3/2009, which would be this month. For Q3 and Q4, all that Nvidia has to offer is the GTX280 with the RAM doubled to 1792MB. It will be pricing a 480mm^2 part with a hugely complex PCB against a 181mm^2 part with far fewer layers. GTX280 won't be able to compete with Cypress at all, and Juniper can obliterate it on price/performance. Dull butter knife indeed.

The rest of the stack is known, with the only changes to note being the GT230 in Performance A getting double the memory. Until the new year, it is simply more of the same.

Remember when Jen-Hsun said that Nvidia was producing more 40nm wafers than anyone else? Remember when he said that it would be mostly 40nm by the end of the year? Anyone believe him? Where is the SEC when we need it? Where are the non-gullible analysts?

Pain really starts for Nvidia in 2010. Q1 brings D12U/GT300, if it can make them, likely in very short supply unless it does something radical to fix the yield toilet it is swimming in. Assuming it gets that out, and it performs better than we hear, then Nvidia is going to be shipping a 530mm^2+ part in very short supply against a low 3xxmm^2 part in plentiful supply.

To give you an idea of how confident it is of this plan working out, the roadmaps for the Spring 2010 season still list the GTX280 with 1792MB as the lead card. If GT300 doesn't work perfectly and it needs to do a second spin, Nvidia is literally out of the high end graphics game. If that does work out, it gets to promote a card that it can't make at the expense of the products it can produce. Great choices there.

The meat of the market, Performance B, is still served by the venerable G92. Ouch. No, really. Nvidia's inability to make a part is getting painful to watch at this point. Juniper will kill this part at a lower cost.

The bottom three, the GT330/320/310 chips, are simply shrunk versions of their 230/220/210 predecessors. Nvidia is listing the performance gains from the 230 to the 330 as +20%, 220 to 320 as +60%, and 210 to 310 as +30%. For Performance B and Enthusiast, there are no gains listed because the parts don't change at all.

Some of those might look like decent numbers, but ATI has a new generation top to bottom, and should be at about +100%. Game over, top to bottom.


In the end, the Q4 and Q1 roadmaps are what we have been saying all along, devoid of any hope. What we were not expecting was the Q2/2010 roadmaps to be as bleak as they are now. If Nvidia isn't sandbagging on Q2, it is in deep deep trouble.

At this point, it is going to have to compete on price, and only in the low end at that. A single Cypress should be about on par with a GTX295, and Cypress will cost less than a GTX285 to produce. How long can Nvidia sell at a loss to make up volume? We will know by the end of Q1/2010. Until then, all it seems to have is spin, bluster, and the usual demos full of promise but lacking purchasable silicon. S|A

http://www.semiaccurate.com/2009/09/01/nvidia-roadmaps-turn/

And for the folks in the SF area:

AMD's been cooking up some very cool stuff ... if you were at QuakeCon you might have already seen a sneak peek.

Not sure if this is the best place to post this, but here goes ...

On the evening of September 10th in San Francisco, AMD would like to invite fans to join us for the unveiling of a new visual PC experience. Basically, I can't say exactly what you're going to see there, but I can tell you that we’ll be holding a big party with food, drink and tunes, very cool demos of unreleased games, incredible hardware set-ups you have to see to believe and a lot more.

Interested in joining us? Send me an e-mail at MonicaAMD@gmail.com with the following details:
Name:
Forum name:
Age:
City:
Blog/website (if you have one):
 

artist

Banned
Gwanatu T said:
In other news, 8 days!
GTX295 performance for $300 and DX11 is pretty sweet; hopefully they deliver on this. I don't want a multi-GPU solution; a single chip is damn fine.
 

artist

Banned
AstroLad said:
I thought it was just supposed to compete with the 280. That's new news to me.
Cypress XT (5870) is the new single-chip high end for AMD. If it's only faster than the 280, it would be an embarrassment of sorts. The 4890 is already pretty close to, and on occasion faster than, the 280.
 
TheExodu5 said:
I'd be highly surprised if it competes with the GTX 295. We'll see.

Performance will absolutely be in that ballpark, and since it's a single-GPU solution, anything remotely close is going to be a vastly superior experience.

We already know their $200 part is going to offer 2+ teraflops; the $300 5870 should be an absolute monster. Expect the dual-GPU Hemlock (which will launch later) to be a ~5 teraflop part.

Make no mistake about it, Nvidia are utterly fucked no matter what they do; the next few months aren't going to be pretty. This isn't just a launch of a high-end part to marvel over, either; it's a complete range of GPUs from three separate dies, with at least 5 different SKUs from $350 on down.
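For reference, headline teraflop figures like these come from simple arithmetic: stream processors x 2 ops per clock (a multiply-add) x core clock. The 4870's configuration is public; the second configuration below is purely hypothetical, just to show what clears the "2+ teraflop" bar:

```python
# Peak shader throughput: SPs x 2 FLOPs per clock (multiply-add) x clock.

def peak_tflops(stream_processors, core_clock_ghz, ops_per_clock=2):
    """Theoretical peak single-precision TFLOPs."""
    return stream_processors * ops_per_clock * core_clock_ghz / 1000.0

# Radeon HD 4870: 800 stream processors at 750 MHz.
print(peak_tflops(800, 0.75))   # 1.2 TFLOPs

# Hypothetical part: double the SPs at 850 MHz (assumed numbers).
print(peak_tflops(1600, 0.85))  # 2.72 TFLOPs
```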
 

Firestorm

Member
Damn. My cousins wanted me to build them computers later this year, I think, and I was hoping to use Nvidia GPUs since one of them plays WoW a lot and I'd love for him to be 3D-ready. It will be really hard to do this unless the GTX 275 drops to <$150 in Canada. Something tells me that the 5850 will be close to $300 when it launches here. =/
 

artist

Banned
AstroLad said:
dx11 though
DX11 by itself is not enough to justify the price; they'll have to bring performance to the table as well. If Cypress XT is merely on par with the 280, I pretty much guarantee it will be one of the biggest duds.
 

1-D_FTW

Member
brain_stew said:
Performance will absolutely be in that ballpark, and since its a single GPU solution, anything remotely close is going to be such a vastly superior experience.

We already know their $200 part is going to offer 2+ teraflops, the $300 5870 should be an absolute monster. Expect the dual GPU Hemlock (which will launch later) to be a ~5 teraflop part.

Make no mistakes about it, Nvidia are utterly fucked no matter what they do, the next few months aren't going to be pretty. This isn't just a launch of a high end part to marvel over either, its a complete range of GPUs from three separate dies, with at least 5 different SKUs from $350 below.

Which for PC gamers is a very good development. If AMD is going to continue offering competition to Intel and Nvidia, they desperately need a MEGA hit to rake in the $$$. Even Nvidia lovers should be pleased with this development. It's good for us consumers.
 

Minsc

Gold Member
ATI needs to get on the 3D gaming ball. Power's nice and all, but 3D Vision's a really good selling point.

It's probably not enough to let the 275 hold on against this new card, especially since it seems there will be a pretty huge gap, but the lack of 3D Vision does dampen ATI's offering a bit for those interested.
 

Ceebs

Member
Xdrive05 said:
Do you guys think a Phenom ii 940 BE quad core would bottleneck these new cards?
This is a good question... how soon until the new GPUs are no longer the bottleneck on a modern dual- or quad-core CPU?
 
Maybe it'll allow PCSX2 to run more smoothly? That's a big advantage... of course, that would only convince me to build a new PC when prices for these cards go down even more.
 

artist

Banned
Minsc said:
ATI needs to get on the 3D gaming ball. Power's nice and all, but 3D Vision's a really good selling point.

Probably not enough to let the 275 hold on against this new card, especially since it seems it will be a pretty huge gap, but no 3D vision w/ ATI does damper their offering a bit to those interested.
3D can be done via drivers; they'll probably jump in when there's a market for it.

PhysX, well, it'll be redundant come DX11. :lol
 
So with an E8400, should I wait for these cards to hit, or should I just upgrade to a 4890? Basically, will these new cards be bottlenecked by my processor?
 

artist

Banned
Gully State said:
So with an E8400, should I wait for these cards to hit, or should I just upgrade to a 4890? Basically, will these new cards be bottlenecked by my processor?
At low resolutions, maybe. At high resolutions, no.
 

Durante

Member
Regarding the bottleneck discussion it's always a question of your resolution and image quality requirements. If you have a 30" monitor you'll probably always have a GPU bottleneck, but if you are happy playing on a 22" display with low levels of AA you probably won't benefit much from upgrading your GPU.
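Durante's point can be put in a toy model: frame time is roughly the slower of the CPU and GPU stages, and only the GPU stage scales with pixel count. All numbers below are assumed purely for illustration:

```python
# Toy bottleneck model (invented numbers): the slower stage sets frame time,
# and only the GPU stage grows with resolution.

def frame_time_ms(cpu_ms, gpu_ms_per_mp, megapixels):
    """Frame time ~ max of the CPU stage and the resolution-scaled GPU stage."""
    return max(cpu_ms, gpu_ms_per_mp * megapixels)

cpu = 10.0        # ms of CPU work per frame (assumed)
gpu_per_mp = 4.0  # ms of GPU work per megapixel (assumed)

# 22" at 1680x1050 (~1.76 MP): GPU stage ~7 ms, so the CPU's 10 ms dominates.
print(frame_time_ms(cpu, gpu_per_mp, 1.76))  # 10.0 -> a faster GPU wouldn't help

# 30" at 2560x1600 (~4.1 MP): GPU stage ~16.4 ms dominates.
print(frame_time_ms(cpu, gpu_per_mp, 4.1))   # GPU-bound -> a faster GPU would help
```

Same CPU, same GPU: whether the GPU is "the bottleneck" flips purely on resolution, which is the whole point.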
 
Fyodor Dostoevsky said:
Maybe it'll allow PCSX2 to run smoother? That's a big advantage...of course, that would only convince me to build a new PC when prices for these cards go down even more.
I thought the CPU was usually the bottleneck on the emulators of recent consoles, unless you're running things at high resolution with large amounts of post-processing. I could be wrong, but I always got the impression that they benefit from a fast dual-core processor more than anything.
 
Durante said:
Regarding the bottleneck discussion it's always a question of your resolution and image quality requirements. If you have a 30" monitor you'll probably always have a GPU bottleneck, but if you are happy playing on a 22" display with low levels of AA you probably won't benefit much from upgrading your GPU.

So I guess in my case, my CPU wouldn't be a bottleneck since I'm thinking about upgrading my monitor from 1600x1200 to 1080p.
 
Minsc said:
ATI needs to get on the 3D gaming ball. Power's nice and all, but 3D Vision's a really good selling point.

Probably not enough to let the 275 hold on against this new card, especially since it seems it will be a pretty huge gap, but no 3D vision w/ ATI does damper their offering a bit to those interested.

This is becoming a bugbear for me as well. I've already decided that I'll make the leap to 3D gaming with my next HDTV purchase in a year or two, and I really don't want to be restricted to one brand.

ATI's next set of cards should be capable of matching my GTX 260 even in stereoscopic 3D mode, so let me use that power, dammit! 1080p is enough resolution for me, and my GTX 260 handles that just great, so the only thing on the horizon that will get me to upgrade is better performance in stereoscopic 3D.

Adding native triple-monitor support is damn nice; now go follow that up with a 3D Vision competitor please, ATI!
 
irfan said:
3D can be done via drivers, they'll probably jump in when there is market for it.

PhysX, well it'll be redundant come DX11. :lol

ATI aren't launching just one card. The 5850 should hit the same price bracket as a 4890, offer more performance, consume less power and have full DX11 compatibility. There are at least two SKUs below that as well, so even if you don't want ultra-high-end performance, you might as well get DX11 compatibility and lower power requirements for roughly the same price. Just wait.

"Bottlenecks" are so hard to define anyway; it's totally dependent on the game, and on individual situations within a game, and even then there are plenty of ways to shift the bottleneck back to the GPU. Start using stereoscopic 3D, crank up the resolution and add the highest level of IQ (with at least transparency supersampling, or flat-out supersampling if possible) and you'll be able to make pretty much any game GPU-bound, even with the latest hardware.
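The resolution/supersampling part of that is just multiplication: shaded samples per frame scale with pixel count, the supersample factor, and (for stereoscopic 3D) the two rendered views. A quick sketch, taking 720p with no AA as the baseline:

```python
# Relative GPU shading load: pixels x supersample factor x stereo views,
# normalized to 1280x720 with no AA.

def relative_gpu_load(width, height, ss_factor=1, stereo=False):
    """Shaded samples per frame, relative to 1280x720 with no AA."""
    base = 1280 * 720
    samples = width * height * ss_factor * (2 if stereo else 1)
    return samples / base

print(relative_gpu_load(1280, 720))                             # 1.0x
print(relative_gpu_load(1920, 1080))                            # 2.25x
print(relative_gpu_load(1920, 1080, ss_factor=4))               # 9.0x
print(relative_gpu_load(1920, 1080, ss_factor=4, stereo=True))  # 18.0x
```

1080p with 4x supersampling in stereoscopic 3D shades 18x the samples of plain 720p, which is how you make "pretty much any game" GPU-bound.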
 

TheExodu5

Banned
brain_stew said:
"Bottlenecks" are so hard to define anyway, its totally dependant on the game, individual situations in a game and even then, there's plenty of ways to shift that bottleneck back to the GPU. Start using sterescopic 3D, crank up the resolution and add the utter highest level of IQ (with at least transparency supersampling or flat out supersampling if possible) and you'll be able to make pretty much any game GPU bound, even with the latest hardware.

Well said. You can never have too much GPU power.
 

artist

Banned
Supposedly the 5670 (Juniper) is as fast as the current GTX 280. Even if it's slightly slower, man, that'd be one fucking awesome card for an HTPC. :D
 