
Rumor: AMD DirectX 11 Card Performances, Prices Revealed, Surprisingly Affordable

Pimpbaa

Member
irfan said:
Supposedly the 5670 (Juniper) is as fast as the current GTX280. Even if it's slightly slower, man that'd be one fucking awesome card for an HTPC. :D

That sounds too good to be true.
 
tokkun said:
Can you expand on that? I don't run SLI myself, but the benchmarks I have seen on enthusiast sites showed (for example) a Crossfire combination of 4850s destroying the 4890 in the vast majority of benchmarks.

Edit: Here is an example. Even the extremely cheap 4770 in Crossfire is able to beat the 4890 in two thirds of the benchmarked games.
http://www.hardocp.com/article/2009/08/18/amd_radeon_hd_4770_crossfire_evaluation/1

Cost vs. performance. The cost of a second card for a dual-card solution doesn't come close to the performance per dollar that a single-card upgrade gives you.
 

DieH@rd

Banned
irfan said:
Supposedly the 5670 (Juniper) is as fast as the current GTX280. Even if it's slightly slower, man that'd be one fucking awesome card for an HTPC. :D



How fast will the 5850 and 5870 be, then?
 

artist

Banned
Gully State said:
Any benchmarks on these new cards yet? Crysis at 60FPS would make me buy one of these cards in a heartbeat.
HD5870 (Cypress XT): ~P15xxx / P17xxx
HD5850 (Cypress Pro): ~P1?xxx
HD5670 (Juniper XT): ~P95xx
HD53xx (Redwood): ~P46xx


DieH@rd said:
http://rlv.zcache.com/smiley_oh_sticker-p217194901605792400qjcl_400.jpg

How fast will the 5850 and 5870 be, then?
5870 is rumored to be as fast as the GTX295.

Pimpbaa said:
That sounds too good to be true.
Yes, which makes the wait even more excruciating. :D
 

tokkun

Member
elrechazao said:
Cost vs. performance. The cost of a second card for a dual-card solution doesn't come close to the performance per dollar that a single-card upgrade gives you.

That's not true, though. Did you look at the link in the post you replied to? Even the 4770, which cost $100, outperformed the single high-end card when in Crossfire.

I will grant that some people encounter this micro-stutter issue, but I think it's hard to argue that the single-card upgrade has a better price-performance ratio, provided you have a motherboard and PSU that can handle it.
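For anyone weighing the two sides here, the disagreement really comes down to a dollars-per-frame calculation. Here's a minimal sketch of that math; the prices and frame rates below are hypothetical placeholders rather than benchmark results, so plug in numbers from whichever review you trust.

Code:
def dollars_per_fps(price_usd, avg_fps):
    """Lower is better: dollars paid per average frame per second."""
    return price_usd / avg_fps

# Hypothetical numbers, NOT benchmark data:
single_card = dollars_per_fps(price_usd=250, avg_fps=60)     # one high-end card
dual_cards = dollars_per_fps(price_usd=2 * 100, avg_fps=75)  # two budget cards in Crossfire

print(f"Single card: ${single_card:.2f} per fps")
print(f"Dual cards : ${dual_cards:.2f} per fps")
# Caveat: average fps ignores frame pacing (micro-stutter) and VRAM limits,
# which is exactly the counter-argument raised later in the thread.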
 
tokkun said:
That's not true, though. Did you look at the link in the post you replied to? Even the 4770, which cost $100, outperformed the single high-end card when in Crossfire.

I will grant that some people encounter this micro-stutter issue, but I think it's hard to argue that the single-card upgrade has a better price-performance ratio, provided you have a motherboard and PSU that can handle it.

1. Canned benchmark testing is a horribly flawed methodology.
2. There is no easy answer to the question, especially not one using canned benchmarks, because it completely depends on the resolution at which you are running the game. Think a 30-inch top-of-the-line Dell vs. a 19-inch widescreen. SLI and single-GPU solutions have different benefits there, and comparing top-end performance without specifying IQ, playability, framerate and resolution is just silly.
3. There are exceptions to every rule, but as a general matter the statement is true.

And again, you didn't even address cost vs performance, which is what I specifically stated.

See hardocp for all of your real life gaming tests and comparisons, not the 50 sites that run 10 tests in 3dmark.
 

artist

Banned
Zaptruder said:
Triple monitor support from a single card?

Jesus jumping jackrabbits, I'm upgrading as soon as there's availability!
ATI Eyefinity technology is something pretty cool that will be available in the upcoming Radeon HD 5800 series. It allows you to extend your game view across 3 displays, like the Matrox TripleHead2Go, except that it's going to be much more powerful. That's over 12 megapixels at 2560 x 1600 resolution for the ultimate gaming experience. It is ideal for flight sims, racing games, role-playing games, real-time strategy, first-person shooters and even multimedia apps. So how powerful exactly is this card?

This card sports a 2nd gen TeraScale engine that delivers more than 2 teraFLOPS of processing power. With so much power packed inside, it supports a new anisotropic filtering method too. If one isn't enough, you can get up to a 1.8x graphics performance boost by pairing them up. The launch is around the corner and reviews will start popping up by the middle of September. Stay tuned for more information.

http://vr-zone.com/articles/more-details-about-ati-eyefinity-technology/7518.html?doc=7518

Eyefinity - drive 3 displays
2+ TFlop - Compute Shader, here we come!
New AF method - supposed to be the best, REFERENCE material
CrossFire boost up to 1.8x - good to know
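For a sense of scale, the "over 12 megapixels" figure in the quote above is just three 2560x1600 panels added together. A quick sanity check:

Code:
# Sanity check of the "over 12 megapixels" figure from the quote above:
# three 2560x1600 displays driven by one card.
displays = 3
width, height = 2560, 1600

total_pixels = displays * width * height       # 12,288,000 pixels
print(f"{total_pixels / 1e6:.1f} megapixels")  # ~12.3 MP

# For comparison, a single 1920x1080 screen is about 2.1 MP, so Eyefinity at
# this resolution pushes roughly 6x the pixels of a typical monitor.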
 
Anxiously waiting for the reviews.

I'm wondering, though: I still have a PCI Express 1.0 motherboard. Is that going to cost me performance?

I read a while back that current-gen cards don't see much of an increase going from PCIe 1.0 to 2.0.
 

Azure J

Member
My God, the 5XXX series of ATi cards are fucking beastly sounding. 8 days is too fucking long to wait for this. :lol
 

artist

Banned
SapientWolf said:
Any word on power consumption? ATI has had the upper hand in that category for a while now.
The X2 had 6-pin & 8-pin connectors, so I would assume it's still in the same ballpark as the 4870X2, but obviously almost twice as powerful, so overall perf/W should be about twice the previous gen. Most probably the 5870 will have two 6-pins, and the 5850 & 5670 will have one 6-pin each.

The only consumption figures leaked so far are from ATI's mobile DX11 GPUs. So yeah, ATI is launching quite a few GPUs (5 desktop & 5 mobile), 10 total. o_O That would be a record if it happens.
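For context on why the connector count matters: the PCI Express spec puts hard caps on how much power each source can deliver, so you can bound a card's board power just from its plugs. A minimal sketch using the standard limits (75 W from the slot, 75 W per 6-pin, 150 W per 8-pin); the connector layouts are the ones guessed at above, not confirmed specs.

Code:
# Upper bound on board power implied by the connectors, using the standard
# PCI Express limits: 75 W from the x16 slot, 75 W per 6-pin, 150 W per 8-pin.
SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 150

def max_board_power(six_pins=0, eight_pins=0):
    return SLOT_W + six_pins * SIX_PIN_W + eight_pins * EIGHT_PIN_W

print(max_board_power(six_pins=1, eight_pins=1))  # 300 W ceiling (4870X2-style layout)
print(max_board_power(six_pins=2))                # 225 W ceiling (rumored 5870 layout)
print(max_board_power(six_pins=1))                # 150 W ceiling (rumored 5850/5670 layout)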
 

Zyzyxxz

Member
Damn ATI is changing the landscape of the mobile GPU market too!

If I buy another laptop I will make it mandatory to have a MXM slot!

Also, I can't wait for the 5870s to start dropping in price after a few months; it will be the perfect video card for a long time. I don't see game requirements increasing at the same rate they have in the past.
 

zoku88

Member
irfan said:
The X2 had 6-pin & 8-pin connectors, so I would assume it's still in the same ballpark as the 4870X2, but obviously almost twice as powerful, so overall perf/W should be about twice the previous gen. Most probably the 5870 will have two 6-pins, and the 5850 & 5670 will have one 6-pin each.

The only consumption figures leaked so far are from ATI's mobile DX11 GPUs. So yeah, ATI is launching quite a few GPUs (5 desktop & 5 mobile), 10 total. o_O That would be a record if it happens.
Watch how almost no one uses the mobile ones again, though :-/
 

godhandiscen

There are millions of whiny 5-year olds on Earth, and I AM THEIR KING.
TheExodu5 said:
I'd be highly surprised if it competes with the GTX 295. We'll see.
It is bound to surpass the 295. It has 2GB of faster memory and it is a single chip. I would be surprised if, after refined drivers, it still can't make the 295 seem outdated.
 

tokkun

Member
elrechazao said:
1. Canned benchmark testing is a horribly flawed methodology.
2. There is no easy answer to the question, especially not one using canned benchmarks, because it completely depends on the resolution at which you are running the game. A 30 inch top of the line dell vs a 19 inch widescreen. SLI and single GPU solutions have different benefits there, and comparing top end performance without specifying IQ and playability and framerate and resolution is just silly.
3. There are exceptions to every rule at every time, but as a general matter the statement is true.

And again, you didn't even address cost vs performance, which is what I specifically stated.

See hardocp for all of your real life gaming tests and comparisons, not the 50 sites that run 10 tests in 3dmark.

The link was from HardOCP, genius.
 

Tzeentch

Member
I had really wanted to avoid getting a new video card (my 8800GT has proven to be quite a trooper), but if ATI provides the cost-performance that has been rumored I'll be switching over ASAP along with my transition to Windows 7.
 

Xavien

Member
Well looks like AMD/ATI has another Radeon 9800 on their hands here :D

Been looking to upgrade my 8800GTs in SLI for some time now, so I'll be getting one of these (probably the 5870) if the rumours prove to be true.
 

Zaptruder

Banned
irfan said:
http://vr-zone.com/articles/more-details-about-ati-eyefinity-technology/7518.html?doc=7518

Eyefinity - drive 3 displays
2+ TFlop - Compute Shader, here we come!
New AF method - supposed to be the best, REFERENCE material
CrossFire boost upto 1.8x - good to know

So what kinda ports are we looking at for the triple monitor support?

DVI or HDMI?

Is the card I've seen, with the third DVI port taking up part of the airflow vent, this card?

I'd honestly prefer 3 HDMI outputs with dongles to reconvert to DVI packed in... but I'm a bit biased because all my displays have HDMI.
 
I looooooooooooooooooove new GPU architecture season.

All the excitement, all the speculation, all the rage and all the damage control.
 
tokkun said:
That's not true, though. Did you look at the link in the post you replied to? Even the 4770, which cost $100, outperformed the single high-end card when in Crossfire.

I will grant that some people encounter this micro-stutter issue, but I think it's hard to argue that the single-card upgrade has a better price-performance ratio, provided you have a motherboard and PSU that can handle it.

No, everybody encounters the microstutter "issue"; it's inherently linked to the way the technology works.

No triple buffering either, so go slash off up to 50% of your framerate from that as well, and the "low" framerate will always be lower. The end result is that the actual in-game experience is not anywhere close to what those benchmarks suggest, and being restricted to a 512MB framebuffer is a huge bottleneck as well.

You're also forgetting that you can, you know, sell that old GPU.
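To make the micro-stutter point above concrete, here's a toy sketch of alternate frame rendering (AFR), the mode Crossfire/SLI typically use: each GPU takes every other frame, so frames can arrive in uneven short/long pairs even though the average FPS looks high. The timings are invented purely for illustration.

Code:
# Toy illustration of AFR (alternate frame rendering) micro-stutter: two GPUs
# take turns on frames, but frames come out unevenly spaced. All timings here
# are invented purely for illustration.
frame_gaps_ms = [10, 23, 10, 23, 10, 23, 10, 23]  # short/long cadence between frames

avg_frame_time = sum(frame_gaps_ms) / len(frame_gaps_ms)
avg_fps = 1000 / avg_frame_time       # what a benchmark bar chart reports
felt_fps = 1000 / max(frame_gaps_ms)  # what the longest gaps feel like

print(f"Reported average: {avg_fps:.0f} fps")   # ~61 fps
print(f"Longest-gap rate: {felt_fps:.0f} fps")  # ~43 fps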
 
Zaptruder said:
So what kinda ports are we looking at for the triple monitor support?

DVI or HDMI?

Is the card I've seen, with the third DVI port taking up part of the airflow vent, this card?

I'd honestly prefer 3 HDMI outputs with dongles to reconvert to DVI packed in... but I'm a bit biased because all my displays have HDMI.

From what I gather, for triple monitor you'll be using either:

DVI/DVI/DisplayPort
DVI/HDMI/DisplayPort

Of course, converting any of those outputs to HDMI just requires a $2 adapter that is usually included in the box anyway. Cards still need to pack two DVI outputs so that they support dual-link DVI.
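On the dual-link point: a single DVI link tops out at a 165 MHz pixel clock, and 2560x1600 at 60 Hz needs more than that even before blanking intervals are counted, which is why the DVI outputs have to stay dual-link. Quick check:

Code:
# Why 2560x1600 needs dual-link DVI: a single DVI link is capped at a
# 165 MHz pixel clock (165 million pixels per second).
SINGLE_LINK_LIMIT = 165e6

width, height, refresh = 2560, 1600, 60
raw_pixel_rate = width * height * refresh  # 245,760,000 pixels/s, before blanking

print(raw_pixel_rate > SINGLE_LINK_LIMIT)  # True -> a single link can't drive it
# Dual-link DVI doubles the data pairs, which comfortably covers 2560x1600@60.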
 

Dennis

Banned
brain_stew said:
No, everybody encounters the microstutter "issue"; it's inherently linked to the way the technology works.

No triple buffering either, so go slash off up to 50% of your framerate from that as well, and the "low" framerate will always be lower. The end result is that the actual in-game experience is not anywhere close to what those benchmarks suggest, and being restricted to a 512MB framebuffer is a huge bottleneck as well.

You're also forgetting that you can, you know, sell that old GPU.

You seem very negative about dual GPU cards and while the issues you raise are legitimate, the benefit of that extra power can be worth it.

I have a 4870x2 and am very pleased with its performance. I will upgrade to a 5870x2 if it represents a very significant increase in power.

I am a total graphics whore and want to play with as much AA, Transparency AA, AF, etc. as possible, something that makes dual GPUs worth it.
 

Acosta

Member
MickeyKnox said:
I looooooooooooooooooove new GPU architecture season.

All the excitement, all the speculation, all the rage and all the damage control.

It's the PC version of console launches.

Looks very impressive, but I won't upgrade until at least Windows 7 SP1, so I have time to see how everything evolves.
 

Xavien

Member
brain_stew said:
No, everybody encounters the microstutter "issue"; it's inherently linked to the way the technology works.

No triple buffering either, so go slash off up to 50% of your framerate from that as well, and the "low" framerate will always be lower. The end result is that the actual in-game experience is not anywhere close to what those benchmarks suggest, and being restricted to a 512MB framebuffer is a huge bottleneck as well.

You're also forgetting that you can, you know, sell that old GPU.

Actually, due to the way SLI works, you get triple buffering for free. I have SLI, and with Vsync on I never go from 60 to 30, etc.; it's usually 60-50-45 and so on.

Micro-stutter comes down to drivers; on the driver set (186.39) I'm on right now there is no micro-stutter.

There is one disadvantage though: you can get some incredibly high fps, but you can also get some low fps too (if one area is optimized for SLI and another isn't, for example, then you're essentially running on one card).

One more is that the graphics card RAM doesn't get shared, so 512MB means you can only push the texture quality + AA so high before it runs out and starts stuttering.

SLI/Crossfire can be an incredibly good upgrade path if you find the right cards to do it with; two 8800GTs back in the day cost less than the latest and greatest and usually delivered better performance.
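On the 512MB point, here's a rough sketch of why high resolution plus MSAA eats a small framebuffer so quickly. The byte counts assume the usual 32-bit colour and 32-bit depth/stencil formats; real drivers, textures and geometry add plenty on top, so treat this as a lower bound.

Code:
# Rough estimate of render-target memory at a given resolution and MSAA level,
# assuming 4 bytes per sample for colour and 4 for depth/stencil, double-buffered.
# Textures, geometry and driver overhead all come on top of this.
def render_target_mb(width, height, msaa_samples, buffers=2):
    bytes_per_sample = 4 + 4  # colour + depth/stencil
    total = width * height * msaa_samples * bytes_per_sample * buffers
    return total / (1024 ** 2)

print(f"{render_target_mb(1920, 1200, msaa_samples=4):.0f} MB")  # ~141 MB
print(f"{render_target_mb(2560, 1600, msaa_samples=8):.0f} MB")  # ~500 MB -- a 512MB card is basically full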
 

Klyka

Banned
I am an ATI whore but I still have to say: They don't have PhysX, meaning Batman:AA will still be better on Nvidia :(
 

zoku88

Member
Gwanatu T said:
New Onimusha confirmed? Seriously, that looks like Jacques or whatever his name was from Onimusha 3.
Wasn't that guy a (really famous) French actor? Reno or something?

And that doesn't really look like him to me...
 

artist

Banned
rohlfinator said:
To add a dose of pessimism to the thread, looks like the prices might be a bit higher than hoped:
http://www.brightsideofnews.com/news/2009/9/2/ati-radeon-58502c-58702c-5870x2-pricing-revealed.aspx

5850: $279-299
5870: $379-399
5870X2: $599
Disregard that, it's utter and pure bullshit. They are Bullshit News for a reason ;) Apparently the guy who runs the site is a thief who plagiarizes, comes up with bullshit stories and whatnot. This is the guy who claimed GT300 taped out back in Q1 of this year. :lol

PowerColor basically confirmed the price of 5870 as $299 in their press release. :lol

rohlfinator said:
Wasn't the Frostbite Engine already on DX10? It's still impressive, but I would think the bigger and more common hurdle would be going from DX9 to DX11.
I know Frostbite was already DX10, just highlighting the fact that the port from DX10 to DX11 is quite simple. We'll just have to wait and see if Nvidia COCK BLOCKS devs from doing this in the near future.
 

SapientWolf

Trucker Sexologist
Klyka said:
I am an ATI whore but I still have to say: They don't have PhysX, meaning Batman:AA will still be better on Nvidia :(
You can get PhysX on ATI cards with custom drivers, so it's not a hardware issue. But I doubt that it's going to make much of a difference either way. Unreal 3 engine games are rarely GPU limited.
 

marsomega

Member
Minsc said:
ATI needs to get on the 3D gaming ball. Power's nice and all, but 3D Vision's a really good selling point.

Probably not enough to let the 275 hold on against this new card, especially since it seems it will be a pretty huge gap, but no 3D Vision w/ ATI does dampen their offering a bit for those interested.



Have you actually used the technology? Anyone here actually use the NVIDIA 3D Stereo feature?


I got a chance to play with this at BlizzCon 2009. They had many demo stations set up with World of Warcraft hooked up. It looked cool at first, but the technology is heavy on the eyes. It really isn't for everyone; it seemed like everyone I heard from mentioned eye strain. The biggest complaint was the frame rate. It ran great on the main huge display (GTX 295s), but on the others Dalaran was at single digits. Though I have my doubts about the demo station connected to the huge screen. The performance was great, but the guy was in Shattrath. In fact, most of the demo stations where people got good or decent framerates were that way because they were in Shattrath running around.


Anyway, just an FYI. The problem at BlizzCon 2009 was that those glasses are 200 USD and really feel gimmicky, since they can make your graphics card crawl and they do quite a number on the eyes. In no way, shape or form could I use those for more than 20 minutes without feeling like my eyeballs were going to explode. There wasn't much of a line to try them out, since most people couldn't keep them on for 10 minutes or more while I waited.
 

marsomega

Member
SapientWolf said:
You can get PhysX on ATI cards with custom drivers, so it's not a hardware issue. But I doubt that it's going to make much of a difference either way. Unreal 3 engine games are rarely GPU limited.


In Windows 7 and Windows XP you can run ATI cards for graphics and one Nvidia card for PhysX. The newest drivers don't work with it, but the ones just before do.
 

Decado

Member
Um...what does DX11 do? DX10 was such a flop...

I loved PC gaming, but the games just aren't there anymore and DX10 was such a let down I'm not sure why I should care anymore.
 
So I have an HP computer bought from Circuit City about 6 months ago. Quad core with 5 gigs of RAM, etc... Are these next-gen cards mainly for custom-built computers, or could I put one in my HP? Do I have to take into account fitting it in the current HP case and power supply, or are these new cards pretty easy to install?
 

1-D_FTW

Member
marsomega said:
Have you actually used the technology? Anyone here actually use the NVIDIA 3D Stereo feature?


I got a chance to play with this at BlizzCon 2009. They had many demo stations set up with World of Warcraft hooked up. It looked cool at first, but the technology is heavy on the eyes. It really isn't for everyone; it seemed like everyone I heard from mentioned eye strain. The biggest complaint was the frame rate. It ran great on the main huge display (GTX 295s), but on the others Dalaran was at single digits. Though I have my doubts about the demo station connected to the huge screen. The performance was great, but the guy was in Shattrath. In fact, most of the demo stations where people got good or decent framerates were that way because they were in Shattrath running around.


Anyway, just an FYI. The problem at BlizzCon 2009 was that those glasses are 200 USD and really feel gimmicky, since they can make your graphics card crawl and they do quite a number on the eyes. In no way, shape or form could I use those for more than 20 minutes without feeling like my eyeballs were going to explode. There wasn't much of a line to try them out, since most people couldn't keep them on for 10 minutes or more while I waited.

There have been a number of threads on GAF that have spoken very highly of them. Not sure I really get your complaint about the fps. People want them connected to high-end video cards because 120fps is what you want the game being rendered at. This is the gold standard, because the more you compromise, the more eye fatigue is going to be an issue (since you're cutting that in half with the shutters).
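To spell out the numbers behind that: active-shutter 3D alternates frames between the eyes, so whatever the GPU and display deliver gets halved per eye. A quick illustration:

Code:
# Active-shutter 3D shows left- and right-eye images alternately, so each eye
# only ever sees half of the delivered frame rate.
def per_eye_fps(delivered_fps):
    return delivered_fps / 2

print(per_eye_fps(120))  # 60.0 per eye -- the "gold standard" case
print(per_eye_fps(60))   # 30.0 per eye -- where flicker and eye fatigue set in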
 

mr stroke

Member
rohlfinator said:
To add a dose of pessimism to the thread, looks like the prices might be a bit higher than hoped:
http://www.brightsideofnews.com/news/2009/9/2/ati-radeon-58502c-58702c-5870x2-pricing-revealed.aspx

5850: $279-299
5870: $379-399
5870X2: $599


Wasn't the Frostbite Engine already on DX10? It's still impressive, but I would think the bigger and more common hurdle would be going from DX9 to DX11.

Well, SOB :(

I was really hoping for a $299 5870.
I wonder if there will be a big difference in performance (20-30 fps) between the 5850 and the 5870.
 

mr stroke

Member
TrAcEr_x90 said:
So I have an HP computer bought from Circuit City about 6 months ago. Quad core with 5 gigs of RAM, etc... Are these next-gen cards mainly for custom-built computers, or could I put one in my HP? Do I have to take into account fitting it in the current HP case and power supply, or are these new cards pretty easy to install?


You should be able to fit the card right into the mobo (I did this with a cheap Compaq last year).
The only concern is case space; in my experience you will probably only be able to fit a 5850 in those pre-built HP/Gateway/Acer cases.
 

SapientWolf

Trucker Sexologist
TrAcEr_x90 said:
So I have an HP computer bought from Circuit City about 6 months ago. Quad core with 5 gigs of RAM, etc... Are these next-gen cards mainly for custom-built computers, or could I put one in my HP? Do I have to take into account fitting it in the current HP case and power supply, or are these new cards pretty easy to install?
I doubt that it will have the necessary extra power connectors. HP is pretty stingy with those. You would be better off getting a 4770 in that case, or building your own.
 
mr stroke said:
Well, SOB :(

I was really hoping for a $299 5870.
I wonder if there will be a big difference in performance (20-30 fps) between the 5850 and the 5870.

As stated before, that site is a bunch of crap. It makes sense that AMD will go with the same pricing scheme that worked so well last generation.
 
mr stroke said:
You should be able to fit the card right into the mobo (I did this with a cheap Compaq last year).
The only concern is case space; in my experience you will probably only be able to fit a 5850 in those pre-built HP/Gateway/Acer cases.

Thanks for the reply. I've been out of the PC-building game for a few years now, so I wasn't sure if these use new PCI Express slots or something too.
 
TOAO_Cyrus said:
As stated before, that site is a bunch of crap. It makes sense that AMD will go with the same pricing scheme that worked so well last generation.
It's a rumor, just like the basis of this thread. I posted it to make people aware of conflicting information that's out there, not to claim it as fact. I can't speak to the reliability of the source, but the fact that Tech Report reposted it means I won't dismiss it outright.

It makes sense that they'd keep the same pricing scheme, but it would also make a lot of sense for them to go for a little more profit once they have the performance crown, especially since they've been having financial troubles for a while now.
 

DSN2K

Member
I've never been very excited about first-wave DirectX cards, because 99% of the time none of them, even the high-end releases, can run the games that take full advantage of the latest DirectX. By the time Crysis 2 or whatever lands, these early cards won't be running those games on High.

Regardless of that, I will likely pick one up come the end of September, as I'm now planning a full upgrade with a Core i7 or i5 system, so I might as well buy the best going. :lol
 

artist

Banned
DSN2K said:
I've never been very excited about first-wave DirectX cards, because 99% of the time none of them, even the high-end releases, can run the games that take full advantage of the latest DirectX. By the time Crysis 2 or whatever lands, these early cards won't be running those games on High.

Regardless of that, I will likely pick one up come the end of September, as I'm now planning a full upgrade with a Core i7 or i5 system, so I might as well buy the best going. :lol
Geforce 4200/4600/4800 series - DirectX 8
Radeon 9800 - DirectX 9
Geforce 8800GTX - DirectX 10

All of the above are THE best in the history of gaming. Cards introduced at a DirectX inflection point have turned out awesome, if the past is anything to go by.

rohlfinator said:
It's a rumor, just like the basis of this thread. I posted it to make people aware of conflicting information that's out there, not to claim it as fact. I can't speak to the reliability of the source, but the fact that Tech Report reposted it means I won't dismiss it outright.

It makes sense that they'd keep the same pricing scheme, but it would also make a lot of sense for them to go for a little more profit once they have the performance crown, especially since they've been having financial troubles for a while now.
You've got to look at the sources and weed them out. TR is nothing but a relay for that rumor; the source is crap. I'd take PowerColor's word over any rumor-mongering website any day of the week. ;)
 