
Rumor: AMD DirectX 11 Card Performances, Prices Revealed, Surprisingly Affordable

plagiarize said:
as awesome as this is, i'm not sure that i can live without my Phys-X and 3D vision support.

i really hope we can get a standard for those things same as we have with Direct X.

i remember when those first DX9 9700's hit and what an awesome time that was. the doors were blown off, and i happily went Red for a while.

i plan on being in the market for a single powerful GPU down the line... but yeah. i like gimmicks :(

DX11 should diminish PhysX's exclusivity to nvidia if I understand correctly. With DirectCompute and OpenCL, any physics engine (e.g. Havok) will be able to run across all cards. Too bad nvidia no longer lets you use an nvidia GPU as a stand-alone PhysX card while an ATI card drives your primary display; makes the whole thing look like a joke if you ask me.
 

artist

Banned
Sirolf said:
Where are the extra bells and whistles from DX 11 support ?
I'm not sure if you have seen tessellation in action, but the performance drop compared to other techniques for the same IQ is mind-blowing. :p

Once game devs get a taste of the tessellator, I'm pretty sure it'll be heavily used. It'll also give a nice boost to the 360 because Xenos (I think) has a tessellator.
 

sn00zer

Member
There has not been a serious jump in DX technology since 8.1 -> 9... seriously, 10 does next to nothing when compared side by side with DX9, and DX11 doesn't look too much better
 

artist

Banned
sn00zer said:
There has not been a serious jump in DX technology since 8.1 -> 9... seriously, 10 does next to nothing when compared side by side with DX9, and DX11 doesn't look too much better
Jump from DX10 -> DX11 looks much bigger than DX8 -> DX9.
 

artist

Banned
Smelly Tramp said:
So how much do you think the 5850 will cost, and when will it be out?

These two things are the only things i need to know.
$299 & Sept 23.

MickeyKnox said:
Eyefinity in action

[images: TechRadar Eyefinity hands-on photos]

http://www.techradar.com/news/gaming/hands-on-ati-eyefinity-review-634244
AMD is looking to expand your horizons with its Eyefinity technology, allowing you to run up to six monitors in HD from a single graphics card – and TechRadar has had the chance to play with the latest in graphics tech.

Although you'll have to wait and see how ATI is bringing the technology to our homes, Eyefinity is close to release and looking rather stunning. Essentially, the technology allows you to run multiple monitors in high definition from your graphics card – and TechRadar was at the top-secret launch event to test out whether it's merely a gimmick, or if it's really a game changer for the company.

Although it is working hard on getting partners to provide the kind of monitor hardware necessary for the practical necessities of sticking six monitors together, AMD seems aware that the majority of us will not be forking out for a half dozen panels just yet.

But, as we flew an aeroplane through stunning vistas (not the OS) with our peripheral vision taken up with screen and not wall, we have to confess that the high definition multiple monitors were certainly helping us feel more involved in things. Even with the rather sexy specially supplied Samsung monitors that sported much thinner bezels, the black lines were, of course, noticeable, but it was amazing just how quickly your eyes start discounting the edges of the screens.

The power of the technology that supplied Eyefinity was clear – this was a meaty rig indeed to provide stutter-free six monitor action (6 x 2,560x1,600 resolution, in fact) but it wasn't so pricey that it would be beyond the means of an enthusiast gamer. Stick six 23-inch monitors in the mix, however, and you'd be looking for seriously deep pockets. Still, as a concept, it was pretty damn cool, and we bore that in mind when we moved over to a more feasible three monitor setup – which will be available earlier than the six-screen behemoth.

Now we should point out that multiple monitors are nothing new, but in gaming terms getting a stable gaming experience while using the setup has been problematic. Eyefinity (and the surrounding tech) changes that, and in spectacular style.
On the three monitors, the gaming experience was, in all honesty, not significantly worse than the six monitor set-up – and less likely to be pie in the sky for Mr average income. Your peripheral vision extends far wider than it does vertically – and with the focus on the middle monitor, the side monitors gloriously plied our eyes with extra information without detracting from the gameplay.

The game being featured on this rig was Left 4 Dead, and it was certainly an advantage to be able to sense further around ourselves. It literally expanded our horizons in terms of gameplay – and, as a nice little added bonus – it made it much nicer to spectate. Apparently many games are perfectly capable of taking advantage of the ridiculously large field of vision, because they take their maximum resolutions from what the graphics card tells them. Because Eyefinity allows you to essentially treat the entire surface of your monitors as a single resolution, you simply choose what you are offered and the game adapts – giving you glorious action. AMD was at pains to point out that this isn't applicable to all games, but an extensive list was shown including major first person shooters like Half Life 2, Crysis, and Far Cry 2, real-time strategy games, flight sims and so on that could run well on Eyefinity setups.

Of course, multiple monitors have uses outside of gaming – and EyeFinity allows you to set up the monitors in multiple configurations – with some portrait and other landscape, in an inverted 'T' with four monitors or in an 'L' shape for instance.
This, of course, boosts productivity; for people who need multiple programmes running at the same time (AMD's example was city traders), and who have the computers that can cope, this will prove to be a major boon.
We also asked AMD if the EyeFinity tech could cope with monitors with different resolutions and sizes, and received an affirmative – which means that you could begin to add monitors as and when you like, including re-using old ones.
Plus you can clone and span monitors to your heart's content, or even group screens together. It's pretty damn cool, especially considering that it is close to a public release, and, although you might not be forking out for a six screen setup straight away, we can see the three monitor configuration gaining some traction.
:D
 

artist

Banned
Frencherman said:
No Jump was bigger than to Shadermodel 2.0!
You can argue the same for the unified shader model :p however, the thing that stopped it from taking off was its exclusive tie to Vista.
 
ooooooh that 2gb 5870 is so tempting. i've never bought an amd gpu before, this would almost feel dirty. does nvidia have anything to counter this coming out this year?

fuck sept 23rd is so soon. this is amazing.
 

delirium

Member
Damn, my graphics card recently died. Should I just buy a replacement now or wait until the 5xxx or gt3xx are out for price drops?
 

Kaako

Felium Defensor
Hmmm need more detailed performance charts with images for individual game tests. A bit pricey for me as well at the moment.
 

artist

Banned
Wanna see what 24.5 million pixels looks like?

[image: six Dell 30-inch displays running World of Warcraft]

That's six Dell 30" displays, each with an individual resolution of 2560 x 1600. The game is World of Warcraft and the man crouched in front of the setup is Carrell Killebrew; his name may sound familiar.

Driving all of this is AMD's next-generation GPU, which will be announced later this month. I didn't leave out any letters, there's a single GPU driving all of these panels. The actual resolution being rendered at is 7680 x 3200; WoW got over 80 fps with the details maxed. This is the successor to the RV770. We can't talk specs but at today's AMD press conference two details are public: over 2 billion transistors and over 2 TFLOPs of performance. As expected, but nice to know regardless.
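A quick sanity check of those numbers (just back-of-envelope Python, not from the article):

```python
# Six Dell 30" panels, each 2560 x 1600, arranged 3 wide by 2 tall
panel_w, panel_h = 2560, 1600
cols, rows = 3, 2

total_w = panel_w * cols   # combined width
total_h = panel_h * rows   # combined height
pixels = total_w * total_h

print(total_w, total_h)    # 7680 3200
print(pixels / 1e6)        # 24.576 (million pixels, the "24.5 million" quoted)
```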

The technology being demonstrated here is called Eyefinity and it actually all started in notebooks.

Not Multi-Monitor, but Single Large Surface

DisplayPort is gaining popularity. It's a very simple interface and you can expect to see mini-DisplayPort on notebooks and desktops alike in the very near future. Apple was the first to embrace it but others will follow.

The OEMs asked AMD for six possible DisplayPort outputs from their notebook GPUs: up to two internally for notebook panels, up to two externally for connectors on the side of the notebook and up to two for use via a docking station. To fulfill these needs AMD had to build six lanes of DisplayPort output into its GPUs, driven by a single display engine. A single display engine could drive any two outputs, similar to how graphics cards work today.

Eventually someone looked at all of the outputs and realized that without too much effort you could drive six displays off of a single card - you just needed more display engines on the chip. AMD's DX11 GPU family does just that.



[image: Eyefinity presentation slide]

At the bare minimum, the lowest end AMD DX11 GPU can support up to 3 displays. At the high end? A single GPU will be able to drive up to 6 displays.

[image: Eyefinity presentation slide]

AMD's software makes the displays appear as one. This will work in Vista, Windows 7, and Linux.

The software layer makes it all seamless. The displays appear independent until you turn on SLS mode (Single Large Surface). When on, they'll appear to Windows and its applications as one large, high resolution display. There's no multimonitor mess to deal with, it just works. This is the way to do multi-monitor, both for work and games.
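To make the Single Large Surface idea concrete, here's a toy sketch of how a coordinate on the combined surface could map back to one physical panel. This is illustration only; the function and its layout assumptions are mine, not AMD's driver logic:

```python
def locate(x, y, panel_w=2560, panel_h=1600, cols=3):
    """Map a coordinate on the combined SLS desktop to
    (column, row, local_x, local_y) on one physical panel,
    assuming identical panels tiled left-to-right, top-to-bottom."""
    col, local_x = divmod(x, panel_w)
    row, local_y = divmod(y, panel_h)
    return col, row, local_x, local_y

# A point at the middle of a 7680x3200 desktop lands on the
# center-bottom panel of a 3x2 grid:
print(locate(3840, 1600))  # (1, 1, 1280, 0)
```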

[image: screenshot of the 3x2 display setup]

Note the desktop resolution of the 3x2 display setup

I played DiRT 2, a DX11 title, at 7680 x 3200 and saw definitely playable frame rates. I played Left 4 Dead and the experience was much better. Obviously this new GPU is powerful, although I wouldn't expect it to run everything at super high frame rates at 7680 x 3200.

[image]

If a game pulls its resolution list from Windows, it'll work perfectly with Eyefinity.

With six 30" panels you're looking at several thousand dollars worth of displays. That was never the ultimate intention of Eyefinity, despite its overwhelming sweetness. Instead the idea was to provide gamers (and others in need of a single, high resolution display) the ability to piece together a display that offered more resolution and was more immersive than anything on the market today. The idea isn't to pick up six 30" displays but perhaps add a third 20" panel to your existing setup, or buy five $150 displays to build the ultimate gaming setup. Even using 1680 x 1050 displays in a 5x1 arrangement (ideal for first person shooters apparently, since you get a nice wrap around effect) still nets you an 8400 x 1050 display. If you want more vertical real estate, switch over to a 3x2 setup and then you're at 5040 x 2100. That's more resolution for less than most high-end 30" panels.
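The arrangement math above checks out; a small helper (hypothetical, just to redo the arithmetic) for any grid:

```python
def sls_resolution(panel_w, panel_h, cols, rows):
    """Total resolution of a cols x rows grid of identical panels."""
    return panel_w * cols, panel_h * rows

# Five 1680x1050 panels side by side (the FPS wrap-around setup)
print(sls_resolution(1680, 1050, cols=5, rows=1))  # (8400, 1050)

# The same panels in a 3x2 grid for more vertical real estate
print(sls_resolution(1680, 1050, cols=3, rows=2))  # (5040, 2100)
```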

Read more at Anandtech

[image: Homer Simpson drooling]


Does this deserve its own topic?
 

demolitio

Member
I can't see how people can stand playing with multiple monitors with the black frames covering up some important areas like part of your character in that screen. That would annoy me personally, but still pretty cool to see so many monitors hooked up.
 

Kaako

Felium Defensor
Shieeeeeeet. If it can handle that, it should be good for 1080P+2-4AA resolution for most games, correct?
 

godhandiscen

There are millions of whiny 5-year olds on Earth, and I AM THEIR KING.
Last year Nvidia released their cards first right? However, there were still tons of rumors about ATI's. This time there is nothing to say about Nvidia's lineup but bad news. For the sake of competition, I hope that Nvidia releases their top end parts before the 5890 hits the market.
Kaako said:
Shieeeeeeet. If it can handle that, it should be good for 1080P+2-4AA resolution for most games, correct?
lol of course. I think the 5870 is so mighty that displaying what it could achieve on a single monitor is impossible, since even an OC'd Phenom X2 would bottleneck it (they wouldn't show it with an Intel processor).
 

Cheeto

Member
demolitio said:
I can't see how people can stand playing with multiple monitors with the black frames covering up some important areas like part of your character in that screen. That would annoy me personally, but still pretty cool to see so many monitors hooked up.
I would love to have 3 displays for racing games.
 
Draft said:
Alright man, you're running at 7680x3200, you can probably turn off the AA.
heh.

this would allow me to play split screen L4D on multiple monitors on Vista again (i had it working but Valve fixed the bug that let me do it :( ).

3 monitors is a good number for something like this, especially on an FPS or something, so your cross hair isn't split in half.
 

Kaako

Felium Defensor
godhandiscen said:
Last year Nvidia released their cards first right? However, there were still tons of rumors about ATI's. This time there is nothing to say about Nvidia's lineup but bad news. For the sake of competition, I hope that Nvidia releases their top end parts before the 5890 hits the market.

lol of course. I think the 5870 is so mighty that displaying what it could achieve on a single monitor is impossible, since even an OC'd Phenom X2 would bottleneck it (they wouldn't show it with an Intel processor).
Would it be ok with a stock Core i7?
 
Draft said:
Cards are a little pricier than I was hoping (but not than I was expecting.)
Yeah, same here.

I think I'll be shooting for a 5850 once they drop to $250 or less. Hopefully Nvidia will force prices down when they launch their next gen, although I'm not too confident in that happening.

demolitio said:
I can't see how people can stand playing with multiple monitors with the black frames covering up some important areas like part of your character in that screen. That would annoy me personally, but still pretty cool to see so many monitors hooked up.
I could see it being a cool novelty for trade shows and other public events, but I wouldn't want to use it for my normal gaming setup. The standard triple-monitor setup seems like it would work a lot better, where the center of the viewing area is in the center of a monitor.

It's still pretty impressive that it works, and that games are actually playable at that resolution.
 

godhandiscen

There are millions of whiny 5-year olds on Earth, and I AM THEIR KING.
Kaako said:
Would it be ok with a stock Core i7?
Not even. With these new video cards the processor will be the bottleneck again. At least with the X2 version.
 

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
Draft said:
Alright man, you're running at 7680x3200, you can probably turn off the AA.

Ha ha. You can't have enough AA :)

I'd love to see this technology used with projectors 'cause you could get rid of the frames and have a wonderful high resolution huge display.
Obviously you'd need to calibrate a bit, but by god would it look magnificent.
 

Cheeto

Member
I wonder if there would be a competitive advantage to running a FPS on 3 displays...considering the huge increase in FOV you could realistically achieve.
 

Blackface

Banned
Cheeto said:
I wonder if there would be a competitive advantage to running a FPS on 3 displays...considering the huge increase in FOV you could realistically achieve.

It can actually make it more difficult to keep track of everything.
 

Draft

Member
Cheeto said:
I wonder if there would be a competitive advantage to running a FPS on 3 displays...considering the huge increase in FOV you could realistically achieve.
Well, FOV is usually restricted within the game, so you could get the same FOV on one tiny-ass 17" monitor as you could on three massive 30" monitors. So I guess it would afford the same advantage a rig capable of delivering 300 FPS has over a rig struggling to draw 30.
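For what it's worth, games that do widen the view for wide setups typically use "Hor+" scaling, where horizontal FOV is derived from a fixed vertical FOV and the aspect ratio. A sketch of that standard trig (the function name and numbers are illustrative; per-game behaviour varies):

```python
import math

def horizontal_fov(vertical_fov_deg, width, height):
    """Hor+ scaling: derive horizontal FOV from a fixed vertical FOV
    and the display aspect ratio."""
    v = math.radians(vertical_fov_deg)
    h = 2 * math.atan(math.tan(v / 2) * (width / height))
    return math.degrees(h)

# Single 16:10 monitor vs a 3x1 Eyefinity wall of the same panels
print(round(horizontal_fov(60, 1920, 1200), 1))      # ~85.5 degrees
print(round(horizontal_fov(60, 3 * 1920, 1200), 1))  # ~140.3 degrees
```

So under Hor+ a triple-wide setup really does show more of the world; games that instead lock the horizontal FOV would just stretch or crop, which is Draft's point.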
 

SleazyC

Member
irfan, the card powering multiple monitors is not the 5870 and I am assuming it is not the 5890? That seriously looks ridiculous and is the kind of upgrade I am looking at but I wonder how far off it is from being released.
 

artist

Banned
SleazyC said:
irfan, the card powering multiple monitors is not the 5870 and I am assuming it is not the 5890? That seriously looks ridiculous and is the kind of upgrade I am looking at but I wonder how far off it is from being released.
It's most certainly a 5870; there is no 5890 yet.
 

SleazyC

Member
irfan said:
It's most certainly a 5870; there is no 5890 yet.
Err... I guess I misread your post then. That is pretty absurd if the 5870 is pumping out that power. The 5870X2 will certainly whet my appetite. Wondering what Nvidia will bring to the game, but I am just about ready to put together a new machine either way.
 

Durante

Member
It must be a 5870X2 or (more likely, since that probably doesn't exist yet) crossfire'd 5870s. One can "only" manage 3 outputs.

Anyway, 3 is exactly what I'm interested in, so that's fine. Now they just need to get to adding 3D gaming support.
 

marsomega

Member
Hardocp.com said:
We can't tell you a lot about the video card since we are still bound by our Non-Disclosure Agreement until the product's launch date in a few weeks. However, AMD is allowing us to show you what is likely the most impressive feature this video card has to offer, besides the monster performance increase over current top end AMD GPU hardware

Can't wait till Hardocp gets their hands on these cards.
 

artist

Banned
Durante said:
It must be a 5870X2 or (more likely, since that probably doesn't exist yet) crossfire'd 5870s. One can "only" manage 3 outputs.

Anyway, 3 is exactly what I'm interested in, so that's fine. Now they just need to get to adding 3D gaming support.
Wrong.

One high-end (Cypress) can drive 6 while the lowest end (Redwood) will drive 3.
 

Durante

Member
irfan said:
Wrong.

One high-end (Cypress) can drive 6 while the lowest end (Redwood) will drive 3.
Source? I have problems visualizing the physical side of that. Either most of those 6 are mini-displayport or they need a proprietary connector with a breakout cable.
 

artist

Banned
Durante said:
Source? I have problems visualizing the physical side of that. Either most of those 6 are mini-displayport or they need a proprietary connector with a breakout cable.
Anand Lal Shimpi said:
At the bare minimum, the lowest end AMD DX11 GPU can support up to 3 displays. At the high end? A single GPU will be able to drive up to 6 displays.
http://www.anandtech.com/video/showdoc.aspx?i=3635

On Hardocp, there is a photo of a setup with 4 cards driving 24 LCDs. :D
 

Durante

Member
Oh, so they will probably actually use mini-DisplayPort. I wonder how many cards will be sold with that setup instead of the 2x DVI + HDMI + DisplayPort we've seen so far, and if there will be a significant markup.
 

Kevin

Member
These cards are starting to catch my interest. I am interested in seeing what the dual card versions offer in terms of performance and hopefully Nvidia will announce something soon so we know what we can expect from their cards.
 

squinters

astigmatic
The 5850 sounds like the card for me (was hoping the 5870 would be a bit cheaper, but beggars can't be choosers). I don't have a dual monitor setup and the 5850 is more than enough for the upcoming games on my list.

Still, I'd love to know I had all that extra power just in case a new graphical powerhouse gets announced.
 

evlcookie

but ever so delicious
So all we got was a little tease about its performance, and seeing shit most of us a) can't afford and b) don't have the room for.

Boo hiss. As much as i would love a 2nd monitor, i barely have enough room for this 24" :lol

Time to get excited again in 2 weeks time.
 