I wouldn't be surprised if the next Wii had an RV740-level GPU.
brain_stew said:Yup!
That's some ludicrous performance, I'm stunned.
Honestly, I hope this shuts up all those that are claiming the next generation of consoles won't be a big leap forward. This level of performance will be the low end of PC hardware when the new consoles launch in 2012, and will absolutely be in the sub $100 category, possibly much lower.
Even a cautious approach to next gen should yield something on par with what we're seeing today, and I don't think anyone will be upset with a console capable of running Cryengine 3 with much nicer settings than its current console iteration, at 12x the resolution and a better framerate.
GPU technology has just moved so far on since 2005.
Set it up for adoption to a good home (eBay). If any of you are planning to upgrade next month, now is the time to flip your current cards before this news becomes mainstream.
luka said:*stares at his 4870x2 with disgust and contempt*
If you are using a GTX 260 now, you can just plug one of these in where that is now. Nothing else you would need to buy. They are still using PCI-Express.
Diablohead said:I take it the mobo I have right now, which runs a DX10 GTX 260, would be incompatible with DX11 cards? It was a cheap board to start with, with minor space for upgrades, as this is a budget gaming rig.
Looks awesome though; in a few years' time I will own one.
irfan said:I wouldn't be surprised if the next Wii had an RV740-level GPU.
I saw a pic of the case with the cards in it and there was no xfire connector on them.
brain_stew said:The 24-monitor setups have been confirmed to be using 4 5870s in Crossfire. We don't know any final pricing, but we do know at least that the 1GB 5870 will be less than $400.
I've heard no word that there were Hemlocks (5870X2s) at the event; that'd temper people's excitement if that were the case, but I've seen no reports suggesting it.
joesmokey said:Pretty impressive. It'll be more amazing once they come out with sleeker monitor setups that do away with the thick borders.
Hazaro said:I saw a pic of the case with the cards in it and there was no xfire connector on them.
0_0 ATI calls it Eyefinity; it supports up to 6 monitors running at 7680x3200 resolution on 1 card ....
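For scale, that 7680x3200 figure works out as below (assuming, and this is just my guess at the layout, a 3x2 grid of 2560x1600 panels):

```python
# Back-of-the-envelope on the 7680x3200 Eyefinity figure, assuming
# (my guess at the layout) a 3x2 grid of 2560x1600 panels.
panel_w, panel_h = 2560, 1600
cols, rows = 3, 2

total_w = panel_w * cols              # 7680
total_h = panel_h * rows              # 3200
pixels = total_w * total_h            # 24,576,000 pixels

print(f"{total_w}x{total_h} = {pixels / 1e6:.1f} megapixels")
print(f"vs 1080p: {pixels / (1920 * 1080):.1f}x the pixels")   # ~11.9x
```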
I don't see Nintendo abandoning the PPC architecture, but as far as I know, there's a joint AMD/IBM research team working on a Fusion design using PowerPC cores instead of amd64 cores. At least that's what I got from a job description I saw a while ago. If that's the case, I assume they're doing this for a future console design, so probably either for Microsoft or for Nintendo (or both).
brain_stew said:Well, Fusion is set to integrate an RV730-based GPU into the processor package, so say an Athlon II X2 2.6GHz Fusion CPU should be cheap as peanuts by 2012, yet could offer awesome performance at 720p, especially if they integrate some eDRAM (enough for 720p w/ 2xMSAA without tiling) and ship it with 1GB of shared GDDR5.
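Rough numbers on that "enough eDRAM for 720p w/ 2xMSAA" aside; the 32-bit colour plus 32-bit depth/stencil per sample is my assumption, not anything AMD has stated:

```python
# Rough framebuffer size for 720p with 2xMSAA, assuming (my assumption)
# 32-bit colour + 32-bit depth/stencil stored per sample.
width, height = 1280, 720
samples = 2                     # 2xMSAA
bytes_per_sample = 4 + 4        # colour + depth/stencil

framebuffer_bytes = width * height * samples * bytes_per_sample
print(f"~{framebuffer_bytes / 2**20:.1f} MiB")   # ~14.1 MiB

# The Xbox 360's 10 MiB of eDRAM falls short of this, which is why it
# has to tile; roughly 16 MiB would fit the whole thing in one pass.
```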
Yeah, but 43 ain't 60 though. Guess I'll wait for the 5870x2.
Kaako said:I'd be somewhat content with Crysis @ 1080p (4xAA + 16xAF) averaging 43fps.
If the 5870x2 averages it at 55-60+, then I'm with you.
SapientWolf said:Yeah, but 43 ain't 60 though. Guess I'll wait for the 5870x2.
SuperEnemyCrab said:If that benchmark is accurate, I bet 60fps average in Crysis could be reached with overclocked cards. It's only 17fps more. Don't need to wait for the X2 line.
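The gap is bigger than "only 17fps" makes it sound; quick arithmetic below, nothing card-specific, just frame-time maths:

```python
# What "only 17fps" means in relative terms (simple arithmetic, not a benchmark).
current_fps, target_fps = 43, 60

speedup_needed = target_fps / current_fps       # ~1.40x
frame_time_now = 1000 / current_fps             # ~23.3 ms
frame_time_target = 1000 / target_fps           # ~16.7 ms

print(f"Need ~{(speedup_needed - 1) * 100:.0f}% more performance "
      f"({frame_time_now:.1f} ms -> {frame_time_target:.1f} ms per frame)")
```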
That's ok though because 43 fps of Crysis feels like 60fps.
SapientWolf said:Yeah, but 43 ain't 60 though. Guess I'll wait for the 5870x2.
I plan on using v-sync though. 60fps locked is like gaming nirvana to me. It's the main reason why I'm still a PC gamer.
Hazaro said:That's ok though because 43 fps of Crysis feels like 60fps.
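One caveat with v-sync if the card can't hold 60: with plain double buffering on a 60Hz display, a frame that misses the 16.7ms window waits for the next refresh, so a ~43fps average tends to bounce between 60 and 30 rather than sit at 43. A rough illustration, assuming double-buffered v-sync with no triple buffering:

```python
# Rough illustration of double-buffered v-sync on a 60Hz display:
# a frame that misses a refresh boundary waits for the next one.
import math

refresh_ms = 1000 / 60  # ~16.7 ms per refresh

def displayed_fps(render_ms):
    # Frame is shown on the first refresh boundary after it finishes rendering.
    refreshes_waited = math.ceil(render_ms / refresh_ms)
    return 1000 / (refreshes_waited * refresh_ms)

for render_ms in (15.0, 16.0, 17.0, 23.3):   # 23.3 ms is roughly 43 fps unsynced
    print(f"{render_ms:.1f} ms/frame -> {displayed_fps(render_ms):.0f} fps with v-sync")
```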
Cards that are already out can run 99.9% of titles out there extraordinarily well at that resolution. Why do you need this specific card?
Trax416 said:I guess this bodes well for the single 1080p monitor I will be picking up when these cards launch.
Some games support dual monitors in ways that don't simply split one big image--rather, you can isolate the full map on one screen, things like that.
Aesius said:Man, there's no f'n way I could play on a dual monitor setup, let alone like 20 monitors. The little gaps in between the images/monitors would drive me insane.
Dance In My Blood said:Cards that are already out can run 99.9% of titles out there extraordinarily well at that resolution. Why do you need this specific card?
Is Crysis your game of choice, and do you really need that performance boost? I just don't understand the appeal of these cards at this juncture. Especially with no end to the current console generation in sight keeping a number of restrictions on most PC game graphics (which is why the resolution is what they're pushing, I'd imagine).
Running games on multiple screens is not only less than ideal in terms of both space and money, but you also have to deal with your visual space being spread out too far. This is even worse when you always need to keep an eye on a piece of the user interface. As a gamer, more often than not a single larger screen running at a resolution like 1080p or 1680x1050 is all you really need.
I just really don't see much need or use for these cards.
I don't think tearing has ever bothered me much, so I'm lucky in that regard.
SapientWolf said:I plan on using v-sync though. 60fps locked is like gaming nirvana to me. It's the main reason why I'm still a PC gamer.
And if you buy 2 you'll make the money back twice as fast! :lol
Minsc said:Another big feature is they idle at like 30W though, as opposed to around 100W+. Just think, if you don't use them long enough, they pay for themselves!
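For anyone curious how far the "pays for itself" joke stretches, a back-of-the-envelope calc; the 30W vs ~100W idle figures are from the thread, while the hours and electricity price are assumptions:

```python
# Tongue-in-cheek check on the "pays for itself" idle-power savings.
# Idle wattage figures come from the thread; the electricity price and
# idle hours per day are assumptions for illustration.
idle_savings_w = 100 - 30            # watts saved at idle
hours_per_day = 8                    # assumed idle time per day
price_per_kwh = 0.12                 # assumed $/kWh

kwh_per_year = idle_savings_w / 1000 * hours_per_day * 365
savings_per_year = kwh_per_year * price_per_kwh
print(f"~{kwh_per_year:.0f} kWh/year saved, about ${savings_per_year:.0f}/year")
# Roughly $25/year, so it would take over a decade to "pay for" a card
# in the $300-400 range.
```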
No problem man. I'll take your dual 4870's off your hands for $200.
gamerecks said:Ugh, why did these have to be so awesome. I have dual 4870's, and now will have the urge to replace them.
Dance In My Blood said:I just really don't see much need or use for these cards.
Hazaro said:And if you buy 2 you'll make back money twice as fast! :lol
Huh? I'm a PC gamer, but there are obvious restrictions to how good a PC game is going to look because most of them are also designed around being played on a console.
Kintaro said:Well, that wraps it up. They should stop business right now and close down shop.
Card makers make cards. Faster, better, stronger, more efficient. It's what they do. You're looking at cards that you dream of in the next generation of consoles. Except we get them now and not in 2012.
Dance In My Blood said:Huh? I'm a PC gamer, but there are obvious restrictions to how good a PC game is going to look because most of them are also designed around being played on a console.
Dance In My Blood said:Cards that are already out can run 99.9% of titles out there extraordinarily well at that resolution. Why do you need this specific card?
Is Crysis your game of choice, and do you really need that performance boost? I just don't understand the appeal of these cards at this juncture. Especially with no end to the current console generation in sight keeping a number of restrictions on most PC game graphics (which is why the resolution is what they're pushing, I'd imagine).
Running games on multiple screens is not only less than ideal in terms of both space and money, but you also have to deal with your visual space being spread out too far. This is even worse when you always need to keep an eye on a piece of the user interface. As a gamer, more often than not a single larger screen running at a resolution like 1080p or 1680x1050 is all you really need.
I just really don't see much need or use for these cards.
Source
While all eyes are on the Radeon HD 5800 "Cypress" series, there is another series quietly brewing at AMD which NVIDIA should be really afraid of. It is targeted at the mainstream market, which is no doubt the most important segment for both camps.
Juniper XT and LE will be officially named Radeon HD 5770 and Radeon HD 5750 respectively when launched. The HD 5770 card is codenamed Countach while the HD 5750 is codenamed Corvette, and they both come with 1GB of GDDR5 memory on a 128-bit memory interface. Juniper will possess all the features of its higher-end counterpart, like 40nm, DX11, Eyefinity technology, ATI Stream, UVD2 and GDDR5, and best of all, it is going to be very affordable.
One of the reasons why AMD is not mass-producing the Radeon HD 4700 (RV740) series now is because the HD 5700 series will be replacing it soon and will come one month after the HD 5800 series. It will meet head on against NVIDIA's D10P1 (GT215) series in October, so expect a full-fledged war then. With a performance target of 1.6x over the HD 4770 and 1.2x over the HD 4750, they are surely packed with enough power to pit against NVIDIA's lineup. Pair them up and you will get a boost of 1.8x, which is roughly the performance of a Cypress card.
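My reading of those multipliers, as plain arithmetic; treating "a boost of 1.8x" as ~90% CrossFire scaling for a pair of HD 5770s is an assumption on my part, not something the article spells out:

```python
# Arithmetic on the performance multipliers quoted above.
# Assumption: "a boost of 1.8x" means two cards in CrossFire deliver
# ~1.8x the performance of one (i.e. ~90% scaling) relative to a single HD 5770.
hd5770_vs_hd4770 = 1.6          # single HD 5770 vs HD 4770 (quoted target)
crossfire_boost = 1.8           # two HD 5770s vs one (my reading)

pair_vs_hd4770 = hd5770_vs_hd4770 * crossfire_boost
print(f"Two HD 5770s ~= {pair_vs_hd4770:.1f}x an HD 4770")   # ~2.9x
# If that pairing is "roughly a Cypress card", Cypress lands around
# 2.9x an HD 4770 by these numbers; treat it as a rough sketch only.
```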
Yeees... if you apply like 16xAA maybe. I game at 2560x1600 and can still see jaggies. I turn on AA of course, but some games (Crysis, ArmA 2, etc.) with AA bring my 4870x2 to its knees.... roll on the 5870x2!!!
Thunderbear said:Most PC games these days you can play without any noticeable aliasing (except maybe for Crysis, those damn palm tree leaves). I'm really looking forward to a future when games across all platforms have no aliasing. Good image quality makes such a difference.
Hazaro said:
DennisK4 said:I game at 2560x1600 and can still see jaggies.
Work on hitting the mass market with cheaper cards that are equivalent to the current high end.
Kintaro said:I'm just saying card makers do what they do. They make cards. What should they do exactly?
Dance In My Blood said:Work on hitting the mass market with cheaper cards that are equivalent to the current high end.
Which sounds like the Juniper stuff posted in here just now.
I'm not condemning them in any way for putting out product though, for the record. Just putting it out there: "who really needs this?"
Xdrive05 said:Do borderless LCDs exist? Because that would be 100% sex with this tech.