
ATI 58XX Preview - Media Stuff goes in here.

artist

Banned
brain_stew said:
Yup!

That's some ludicrous performance, I'm stunned. :D

Honestly, I hope this shuts up all those claiming the next generation of consoles won't be a big leap forward. This level of performance will be the low end of PC hardware when the new consoles launch in 2012, and will absolutely be in the sub-$100 category, possibly much lower.

Even a cautious approach to next gen. should yield something on par with what we're seeing today, and I don't think anyone will be upset with a console capable of running Cryengine 3 with much nicer settings than its current console iteration at 12x the resolution and a better framerate.

GPU technology has just moved so far on since 2005.
I wouldn't be surprised if the next Wii had an RV740 level of GPU. :p

luka said:
*stares at his 4870x2 with disgust and contempt*
Set it up for adoption to a good home (eBay). :D If any of you are planning to upgrade next month, now is the time to flip your current cards, before this news becomes mainstream.
 

Thunderbear

Mawio Gawaxy iz da Wheeson hee pways games
I've been waiting for realtime displacement... it's going to be awesome. And it's not much different from a developers point of view of generating a displacement map instead of a normal map. VRAM is going to be the limiting factor of course.

I want this card, I already have three monitors at home (one is a cintiq and when I bought it I put my second monitor in a box). Looks awesome.
 

Thunderbear

Mawio Gawaxy iz da Wheeson hee pways games
Most PC games these days you can play without any noticeable aliasing (except maybe for Crysis, those damn palm tree leaves). I'm really looking forward to a future when games across all platforms have no aliasing. Good image quality makes such a difference.
 
I wasn't planning on upgrading my GTX 260 for a while...but...HOLY SHIT.

LOOK AT THAT. Holy hell, that thing should eat up any game on the market at my resolution (1920x1200) with full AA and AF.

Do want. Do want hard.
 

Ceebs

Member
Diablohead said:
I take it the mobo I have right now, which runs a DX10 GTX 260, would be incompatible with DX11 cards? It was a cheap board to start with, with minimal room for upgrades, as this is a budget gaming rig.

Looks awesome though, few years time and I will own one :D
If you're using a GTX 260 now, you can just plug one of these in where it is; there's nothing else you'd need to buy. They still use PCI Express.
 
irfan said:
I wouldn't be surprised if the next Wii had an RV740 level of GPU. :p

Well, Fusion is set to integrate an RV730-based GPU into the processor package, so say an Athlon II X2 2.6GHz Fusion CPU should be cheap as peanuts by 2012, yet could offer awesome performance at 720p, especially if they integrate some eDRAM (enough for 720p w/ 2xMSAA without tiling) and ship it with 1GB of shared GDDR5.

It'd be a pretty noticeable leap over the 360, yet sit at the bottom of the tree cost- and power-wise. Seems like a nice fit to me, and more than sufficient to offer full and comprehensive software emulation, even with rendering enhancements.

Downports from the PS4/Xbox 3 could just be a case of running at half framerate (60fps down to 30fps), half resolution, with less AA, lower texture resolution and a couple of superfluous bells and whistles turned off. The games would still look great, and multiplatform development would be able to target Nintendo's platform this time around. All the major multiplatform engines would be able to add Wii support in a couple of days, as it'd effectively just be a low-end/midrange DX10 PC, so developers could start producing great results right from day one.
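For reference, a rough back-of-the-envelope sizing of the "enough eDRAM for 720p w/ 2xMSAA without tiling" claim (a sketch; the 32-bit color and depth buffer assumptions are mine, not from the post):

```python
# Rough eDRAM sizing for a 720p framebuffer with 2x MSAA, no tiling.
# Assumes 32-bit (4-byte) color and 32-bit depth/stencil per sample.
width, height = 1280, 720
bytes_per_sample = 4 + 4          # color + depth/stencil
msaa_samples = 2

framebuffer_bytes = width * height * bytes_per_sample * msaa_samples
framebuffer_mb = framebuffer_bytes / (1024 * 1024)
print(f"{framebuffer_mb:.1f} MB")  # ~14.1 MB, vs the Xbox 360's 10 MB eDRAM
```

Under those assumptions, roughly 14-15 MB of eDRAM would be needed, which is why the 360's 10 MB forces tiling for 720p with MSAA.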
 

joesmokey

Member
Pretty impressive. It'll be more amazing once they come out with sleeker monitor setups that do away with the thick borders.
 

Hazaro

relies on auto-aim
brain_stew said:
The 24 monitor setups have been confirmed to be using 4 5870s in Crossfire. We don't know any final pricing but we do know at least the 1GB 5870 will be less than $400.

I've heard no word that there were Hemlocks (5870X2s) at the event; that'd temper people's excitement if that were the case, but I've seen no reports suggesting it.
I saw a pic of the case with the cards in it and there was no xfire connector on them.
http://www.pcper.com/images/news/demo06.jpg

Top 2 might be.
 

Slayer-33

Liverpool-2
joesmokey said:
Pretty impressive. It'll be more amazing once they come out with sleeker monitor setups that do away with the thick borders.


I'm sure they will get thin and borderless with OLED lol.
 
Hazaro said:
I saw a pic of the case with the cards in it and there was no xfire connector on them.

That's weird, how were they syncing the outputs then?

ATI are apparently ditching AFR with this new generation in favour of SFR, so they may just dedicate each GPU to six monitors, which may not require a Crossfire bridge.

There's definitely a lot left to find out. Thinking about it, though, ATI pretty much has to have come up with some new sort of GPU scaling. A single ~96-megapixel framebuffer would take up way too much memory if Crossfire only had access to one memory pool, like it does with the traditional AFR method. They must have access to all 8GB, it seems.
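To put a rough number on that framebuffer claim (a sketch; the buffer format and double-buffering assumptions are mine):

```python
# Approximate memory for rendering to a single ~96-megapixel surface
# (24 monitors at 2560x1600), assuming 4-byte color, 4-byte depth,
# and double buffering -- before counting any MSAA or textures.
monitors = 24
pixels = monitors * 2560 * 1600          # ~98.3 million pixels
color = pixels * 4 * 2                   # double-buffered 32-bit color
depth = pixels * 4                       # 32-bit depth/stencil
total_mb = (color + depth) / (1024 * 1024)
print(f"{total_mb:.0f} MB")              # ~1125 MB just for framebuffers
```

Over a gigabyte for the framebuffers alone, before textures or geometry, which supports the idea that a single card's memory pool wouldn't cut it.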
 

wsippel

Banned
brain_stew said:
Well, Fusion is set to integrate an RV730-based GPU into the processor package, so say an Athlon II X2 2.6GHz Fusion CPU should be cheap as peanuts by 2012, yet could offer awesome performance at 720p, especially if they integrate some eDRAM (enough for 720p w/ 2xMSAA without tiling) and ship it with 1GB of shared GDDR5.
I don't see Nintendo abandoning the PPC architecture, but as far as I know, there's a joint AMD/ IBM research team working on a Fusion design using PowerPC cores instead of amd64 cores. At least that's what I got from a job description I've seen a while ago. If that's the case, I assume they do this for a future console design, so probably either for Microsoft or for Nintendo (or both).
 

Hazaro

relies on auto-aim
As I said before in the other thread, Chiphell is usually the first to get results in, along with LegitReviews, and in both past launches both sites have been spot on.
 
If that benchmark is accurate, I bet a 60fps average in Crysis could be reached with overclocked cards. It's only 17fps short. No need to wait for the X2 line.
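Assuming frame rate scales roughly linearly with core clock (a big idealized assumption, sketched here), the overclock implied by that claim works out as:

```python
# How much faster the card would need to run to turn a 43 fps average
# into a 60 fps average, assuming linear scaling with clock speed.
current_fps = 43
target_fps = 60
required_speedup = target_fps / current_fps
print(f"{(required_speedup - 1) * 100:.0f}% overclock needed")  # ~40%
```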
 

bee

Member
SuperEnemyCrab said:
If that benchmark is accurate, I bet a 60fps average in Crysis could be reached with overclocked cards. It's only 17fps short. No need to wait for the X2 line.

Overclock + custom cfg + DX9, and it would for sure run at 60fps at 1920x1200 with 4xAA/16xAF.
 

SapientWolf

Trucker Sexologist
Hazaro said:
That's ok though because 43 fps of Crysis feels like 60fps. :D
I plan on using v-sync though. 60fps locked is like gaming nirvana to me. It's the main reason why I'm still a PC gamer.
 

Aesius

Member
Man, there's no f'n way I could play on a dual monitor setup, let alone like 20 monitors. The little gaps in between the images/monitors would drive me insane.
 

dimb

Bjergsen is the greatest midlane in the world
Trax416 said:
I guess this bodes well for the single 1080p monitor I will be picking up when these cards launch.
Cards that are already out can run 99.9% of titles out there extraordinarily well at that resolution. Why do you need this specific card?

Is Crysis your game of choice and you really need that performance boost? I just don't understand the appeal of these cards at this juncture. Especially with no end to the current console generation in sight, which keeps a number of restrictions on most PC game graphics (which is why resolution is what they're pushing, I'd imagine).

Running games on multiple screens is not only less than ideal in space and financial terms, but you also have to deal with your visual space being spread out too far. This is even worse when you always need to keep an eye on a piece of the user interface. As a gamer, more often than not a single larger screen running at a resolution like 1080p or 1680x1050 is all you really need.

I just really don't see much need or use for these cards.
 
Aesius said:
Man, there's no f'n way I could play on a dual monitor setup, let alone like 20 monitors. The little gaps in between the images/monitors would drive me insane.
Some games support dual monitors in ways that don't simply split one big image; rather, you can isolate the full map on one screen, things like that.
 
So I will probably be ready to build a computer by this upcoming summer, maybe March. To build a computer with this card and compatible parts, anybody want to take a guess at what that would cost?

I do 3D/Motion Design and this would be sweet to have a 3 monitor setup with all my tool panels just on one screen, with 2 screens of space for actual 3D modeling or After Effects work.
 

Minsc

Gold Member
Dance In My Blood said:
Cards that are already out can run 99.9% of titles out there extraordinarily well at that resolution. Why do you need this specific card?

Is Crysis your game of choice and you really need that performance boost? I just don't understand the appeal of these cards at this juncture. Especially with no end to the current console generation in sight, which keeps a number of restrictions on most PC game graphics (which is why resolution is what they're pushing, I'd imagine).

Running games on multiple screens is not only less than ideal in space and financial terms, but you also have to deal with your visual space being spread out too far. This is even worse when you always need to keep an eye on a piece of the user interface. As a gamer, more often than not a single larger screen running at a resolution like 1080p or 1680x1050 is all you really need.

I just really don't see much need or use for these cards.

Another big feature is that they idle at like 30W, as opposed to around 100W+. Just think, if you don't use them long enough, they pay for themselves!
 

Hazaro

relies on auto-aim
SapientWolf said:
I plan on using v-sync though. 60fps locked is like gaming nirvana to me. It's the main reason why I'm still a PC gamer.
I don't think tearing has ever bothered me much, so I'm lucky in that regard.
Minsc said:
Another big feature is that they idle at like 30W, as opposed to around 100W+. Just think, if you don't use them long enough, they pay for themselves!
And if you buy 2 you'll make back money twice as fast! :lol
 

Kaako

Felium Defensor
gamerecks said:
Ugh, why did these have to be so awesome? I have dual 4870s, and now I'll have the urge to replace them.
No problem man. I'll take your dual 4870's off your hands for $200.
 

Kintaro

Worships the porcelain goddess
Dance In My Blood said:
I just really don't see much need or use for these cards.

Well, that wraps it up. They should stop business right now and close down shop.

Card makers make cards. Faster, better, stronger, more efficient. It's what they do. You're looking at the cards you dream of for the next generation of consoles. Except we get them now and not in 2012.
 

mr_nothin

Banned
1920x1200 @ 4xAA, all at a minimum of 30fps and an average of 43fps, is amazing. If you turn the AA down or just go with a lower resolution, then you'll have your 60fps average. 1920x1200 is not NEEDED.
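A quick sketch of how much dropping the resolution buys, assuming performance scales with pixel count (an idealized, fill-rate-bound assumption; real games scale less cleanly):

```python
# Estimated average fps at lower resolutions, scaling the benchmarked
# 43 fps at 1920x1200 by pixel count (idealized; real scaling varies).
base_fps = 43
base_pixels = 1920 * 1200

for w, h in [(1680, 1050), (1280, 800)]:
    est = base_fps * base_pixels / (w * h)
    print(f"{w}x{h}: ~{est:.0f} fps")
```

Under that assumption, even one step down to 1680x1050 gets close to a 60fps average.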
 

Dennis

Banned
CryEngine 3 on three monitors in that YouTube video is simply astounding.

I never really felt tempted to get a multi-monitor setup before, but...

I already have one 30" 2560x1600 screen. If I get 2 more and a 5870x2...

My God...I actually feel real temptation now...
 

dimb

Bjergsen is the greatest midlane in the world
Kintaro said:
Well, that wraps it up. They should stop business right now and close down shop.

Card makers make cards. Faster, better, stronger, more efficient. It's what they do. You're looking at the cards you dream of for the next generation of consoles. Except we get them now and not in 2012.
Huh? I'm a PC gamer, but there are obvious restrictions to how good a PC game is going to look because most of them are also designed around being played on a console.
 

Kintaro

Worships the porcelain goddess
Dance In My Blood said:
Huh? I'm a PC gamer, but there are obvious restrictions to how good a PC game is going to look because most of them are also designed around being played on a console.

I'm just saying card makers do what they do. They make cards. What should they do exactly?
 

Blackface

Banned
Dance In My Blood said:
Cards that are already out can run 99.9% of titles out there extraordinarily well at that resolution. Why do you need this specific card?

Is Crysis your game of choice and you really need that performance boost? I just don't understand the appeal of these cards at this juncture. Especially with no end to the current console generation in sight, which keeps a number of restrictions on most PC game graphics (which is why resolution is what they're pushing, I'd imagine).

Running games on multiple screens is not only less than ideal in space and financial terms, but you also have to deal with your visual space being spread out too far. This is even worse when you always need to keep an eye on a piece of the user interface. As a gamer, more often than not a single larger screen running at a resolution like 1080p or 1680x1050 is all you really need.

I just really don't see much need or use for these cards.

You don't need to lecture me about this. I mean, I was the sole reason the PC thread even existed for four months a while back, when nobody else on the forums was helping people with their PC questions.

So why am I getting one of these cards? Simple.

1. I normally skip a generation. I currently have a 9800GTX, which has been used every day since I bought it. For various games, from MMO's to shooters. It's served me well, but it is getting dated. I skipped the current generation of cards, and will be picking up one of ATI's new offerings.

2. The ATI card will simply last me longer than a 4890 will, and will perform better in future games, mainly future MMORPGs, which may not be as graphically stunning as some of the single-player ones out there but are very demanding.

3. I need HDMI, and will probably need DisplayPort eventually. The only affordable card in Canada that offers both DisplayPort and HDMI is the Sapphire Vapor-X 4890. The same card that also has "the cold boot bug," which has caused over a 60 percent failure rate. A problem I had to figure out with a friend, who, after RMAs, ended up with another card from the same DOA batch. Fortunately, it worked, but it still cost him slightly over $300 before all was said and done. Which is about what I will pay for the new ATI card.
 

artist

Banned
While all eyes are on the Radeon HD 5800 "Cypress" series, there is another series quietly brewing at AMD which NVIDIA should be really afraid of. It is targeted at the mainstream market, which is no doubt the most important segment for both camps.

Juniper XT and LE will be officially named Radeon HD 5770 and Radeon HD 5750 respectively when launched. The HD 5770 card is codenamed Countach while the HD 5750 is codenamed Corvette, and both come with 1GB of GDDR5 memory on a 128-bit memory interface. Juniper will possess all the features of its higher-end counterpart, like 40nm, DX11, Eyefinity technology, ATI Stream, UVD2 and GDDR5, and best of all, it is going to be very affordable.

One of the reasons why AMD is not mass producing the Radeon HD 4700 (RV740) series now is that the HD 5700 series will be replacing it soon, coming one month after the HD 5800 series. It will meet NVIDIA's D10P1 (GT215) series head on in October, so expect a full-fledged war then. With a performance target of 1.6x over the HD 4770 and 1.2x over the HD 4750, they are surely packed with enough power to pit against NVIDIA's lineup. Pair them up and you will get a boost of 1.8x, which is roughly the performance of a Cypress card.
Source

AMD is also launching their 5000 series Mobility GPUs pretty soon. Motherlode of all launches in history.
 

Dennis

Banned
Thunderbear said:
Most PC games these days you can play without any noticeable aliasing (except maybe for Crysis, those damn palm tree leaves). I'm really looking forward to a future when games across all platforms have no aliasing. Good image quality makes such a difference.
Yeees...if you apply like 16xAA, maybe. I game at 2560x1600 and can still see jaggies. I turn on AA of course, but some games (Crysis, ArmA 2, etc.) with AA bring my 4870x2 to its knees...roll on the 5870x2!!!
 
Hazaro said:
[image: 2lxjuhe.jpg]

I said holy fuck irl when i saw this!
 

dimb

Bjergsen is the greatest midlane in the world
Kintaro said:
I'm just saying card makers do what they do. They make cards. What should they do exactly?
Work on hitting the mass market with cheaper cards that are equivalent to the current high end.

Which sounds like the Juniper stuff posted in here just now.

I'm not condemning them in any way for putting out product though, for the record. Just putting out there, "who really needs this?"
 

Minsc

Gold Member
Dance In My Blood said:
Work on hitting the mass market with cheaper cards that are equivalent to the current high end.

Which sounds like the Juniper stuff posted in here just now.

I'm not condemning them in any way for putting out product though, for the record. Just putting out there, "who really needs this?"

People who have lots of disposable money, and like to game at 60fps with AA and at 1080p. There's more than just Crysis that won't run at high resolutions with AA on cards out now.
 