~Kinggi~ said:
Anyone know when the 4870x2 will be released? I'm looking at that as my next card.
What resolution do you plan to game at?

Cheeto said:
I'm torn... do I get a 4850 now, or get a 4870 later?
Zzoram said:
I think I'm going to wait for an HD4870 with a better stock cooler for $300. It might be two months, but I can wait until September at the latest (The Witcher Enhanced Edition), so that's OK by me. I want to play Crysis Warhead when it comes out at 1680x1050 with 2xAA and a silky smooth frame rate, so the HD4850 is probably just a tad too slow for me.
Chiggs said:
For an extra forty or fifty bucks, you can probably get a 1GB version, too.
bathala said:
I game at 1680x1050

careful said:
What resolution do you plan to game at?

For 1920x1200 or above, play it safe and go with a 4870.
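The 1920x1200 advice above comes down to raw pixel count; here's a quick sketch (the resolutions are from the thread, everything else is plain arithmetic):

```python
# Rough pixel-count comparison behind the "play it safe with a 4870" advice.
# More pixels per frame means more shading work for the GPU.

def pixels(width, height):
    """Total pixels the GPU must render per frame."""
    return width * height

res_a = pixels(1680, 1050)   # 1,764,000 pixels
res_b = pixels(1920, 1200)   # 2,304,000 pixels

ratio = res_b / res_a
print(f"1920x1200 pushes {ratio:.2f}x the pixels of 1680x1050")
# -> 1.31x, i.e. roughly 31% more work per frame
```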
This generation, the $200 front looks to be the most exciting. I wish I was looking for a card with that budget, I would be soo happy.

Mindlog said:
http://www.engadget.com/2008/06/26/amd-smells-a-comeback-with-ati-all-in-wonder-hd/
Oh GOD YES!
ATI bringing back the AIW.
But they didn't mention what core they are using :[ I will be looking to upgrade my current 9800 Pro AIW soon.
dLMN8R said:
Meh, All In Wonder was never a good idea. You're basically buying a brand new TV card every time you buy a new video card - what a waste.
Far better to get a separate PCI TV card from Hauppauge, DVICO, or wherever. I've been using the same TV card for years and years now, but have upgraded my video card 2-3 times. Definitely would not want to have paid extra for an AIW version of the card each of those times.
Does it have a PCI slot?

Mamesj said:
Random question (don't laugh): would a 4870 work on my 965p board?
godhandiscen said:
Does it have a PCI slot?
Yeah, my bad. Isn't PCI even slower than AGP?

SRG01 said:
I think you mean PCI-E. :lol
godhandiscen said:
Yeah, my bad. Isn't PCI even slower than AGP?
Go look at benchmarks done at 1280x1024 for the 4870; the 4850 is probably included. I'm too lazy right now.

Foster said:
Quick question: if I play games at 1280x1024, would I notice the difference between the 4850 and the 4870?
And also, will a Core 2 Quad Q6600 2.4GHz (I'll overclock it to 3.0+ hopefully) with 4GB of RAM and one of the above cards be OK to play Crysis on high at that resolution?
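For reference, the Q6600 overclock mentioned above is normally done by raising the front-side bus, since the chip's multiplier is locked at 9x; a minimal sketch (stock multiplier and FSB are the Q6600's published specs, the 333 MHz target is just arithmetic):

```python
# Core clock = FSB (MHz) x multiplier. The Q6600's multiplier is locked
# at 9, so getting from 2.4 GHz to 3.0 GHz means raising the FSB.

MULTIPLIER = 9           # Q6600's locked multiplier
STOCK_FSB = 266.67       # MHz (quad-pumped to the marketed "1066 MHz" bus)

def core_clock(fsb_mhz, multiplier=MULTIPLIER):
    """Core clock in GHz for a given FSB."""
    return fsb_mhz * multiplier / 1000.0

print(f"stock:   {core_clock(STOCK_FSB):.2f} GHz")  # -> 2.40 GHz
print(f"333 FSB: {core_clock(333.0):.2f} GHz")      # -> 3.00 GHz, the 3.0+ target
```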
Why? 90% of games are done in Direct3D. I am a big supporter of OpenGL (it's my first 3D API; I programmed in it a lot), but when it comes to gaming, either you like id games or OpenGL is unimportant. Also, I doubt Blizzard is employing 3.0 for its games right now.

antiloop said:
I will wait for OpenGL 3.0 ready cards. It will be a long wait, but I don't game that much on my PC nowadays.
It sure does. Let's just apply a couple of "The Way It's Meant to Be Played" incentives here and there....

Chiggs said:
The 260GTX is finally starting to show up at Newegg...at $399.99. They need to drop that to $299.99 ASAP. I'll bet that it's done within the next two weeks, unless Nvidia has some sort of secret performance driver, or they think the 200 series has legs for newer games, whereas the 4800 series won't.
Things just got serious.

Outdoor Miner said:
EA, Sega, and Blizzard have signed on for DirectX 10.1.
Oh shit, suddenly 10.1 could be really relevant. Anyway, by the time Diablo 3 releases, Nvidia will have 10.1 hardware.

Outdoor Miner said:
EA, Sega, and Blizzard have signed on for DirectX 10.1.
Diablo III in 10.1? *fap*
godhandiscen said:
It sure does. Let's just apply a couple of "The Way It's Meant to Be Played" incentives here and there.... pay developers to cripple performance under ATI hardware (Age of Conan, Lost Planet, Assassin's Creed, etc).
WrikaWrek said:
My plan is to equip the new PC I'm building in August with the HD4870, and then by early next year, or this Christmas, add another HD4870 for Crossfire.
GHG said:
Or have drivers that cheat on some games by reducing the IQ for higher frames.
VaLiancY said:
So what's special about those All-In-Wonder cards?
The irony about the whole situation is that ATI cards perform very well (except in Lost Planet).

godhandiscen said:
It sure does. Let's just apply a couple of "The Way It's Meant to Be Played" incentives here and there.... pay developers to cripple performance under ATI hardware (Age of Conan, Lost Planet, Assassin's Creed, etc).
Those are pre-patch benchmarks. Ubisoft already removed all the DirectX 10.1 code at Nvidia's behest, so I doubt those benchmarks still hold true.

Chiggs said:
^^^^^
:lol at the 4850 beating the GTX 260 in Assassin's Creed.
I always laugh at conspiracy theories, but this is one I believe in.

Cheeto said:
Even after the patch, the 4870 is within 2% of the GTX280.

godhandiscen said:
Those are pre-patch benchmarks. Ubisoft already removed all the DirectX 10.1 code at Nvidia's behest, so I doubt those benchmarks still hold true.
I knew those were pre-patch figures. Also, I don't doubt that 2% figure, seeing how powerful the 4870 is. I just wish we could all move to DX10.1 and have free AA in all games, as it was meant to be with DirectX 10.

irfan said:
Even after the patch, the 4870 is within 2% of the GTX280.
http://www.tgdaily.com/html_tmp/content-view-38117-140.html

Shuichi Takagi, CyberLink's vice president of business development, ran a demonstration on a just-launched ATI Radeon 4850 512 MB, proving that the hardware and software are capable of converting four HD MPEG-2 movies into MPEG-4 simultaneously - in real time. According to Shuichi, it will take about 30 minutes to process four full-length movies and compress them into handheld-friendly 200+ MB files.
irfan said:
This is why you have an open API, so that anybody can use it.

irfan said:
http://www.tgdaily.com/html_tmp/content-view-38117-140.html
edit: Note that this is different from the free AVIVO transcoding software from AMD.
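The CyberLink demo figures are easy to sanity-check with back-of-the-envelope math; a sketch (the 200 MB file size and 30-minute batch time are from the article, the two-hour movie length is an assumption):

```python
# Sanity check on the transcode demo: a ~200 MB MPEG-4 file from a
# full-length movie implies a low, handheld-friendly bitrate, and four
# movies in 30 minutes implies well-over-real-time encoding per stream.

FILE_MB = 200
MOVIE_SECONDS = 2 * 60 * 60          # assumed 2-hour movie

bits = FILE_MB * 8 * 1024 * 1024     # file size in bits
kbps = bits / MOVIE_SECONDS / 1000
print(f"average bitrate: ~{kbps:.0f} kbps")  # -> ~233 kbps

# Each of the four streams turns 2 h of video into a file in 0.5 h of
# wall-clock time, i.e. roughly 4x real time per stream.
speedup = MOVIE_SECONDS / (30 * 60)
print(f"per-stream speed: ~{speedup:.0f}x real time")
```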