
June 2008: Battleground for PC Graphics - GeForce GTX 200 vs Radeon HD 4800

I think I'm going to wait for an HD4870 with a better stock cooler for $300. It might be two months, but I can wait until September at the latest (The Witcher Enhanced Edition), so that's OK by me. I want to play Crysis Warhead when it comes out at 1680x1050 with 2xAA and a silky smooth frame rate, so the HD4850 is probably just a tad too slow for me.
 
Zzoram said:
I think I'm going to wait for an HD4870 with a better stock cooler for $300. It might be two months, but I can wait until September at the latest (The Witcher Enhanced Edition), so that's OK by me. I want to play Crysis Warhead when it comes out at 1680x1050 with 2xAA and a silky smooth frame rate, so the HD4850 is probably just a tad too slow for me.

For an extra forty or fifty bucks, you can probably get a 1GB version, too.
 
Meh, All In Wonder was never a good idea. You're basically buying a brand new TV card every time you buy a new video card - what a waste.

Far better to get a separate PCI TV card from Hauppauge, DVICO, or wherever. I've been using the same TV card for years and years now, but have upgraded my video card 2-3 times. Definitely would not want to have paid extra for an AIW version of the card each of those times.
 
dLMN8R said:
Meh, All In Wonder was never a good idea. You're basically buying a brand new TV card every time you buy a new video card - what a waste.

Far better to get a separate PCI TV card from Hauppauge, DVICO, or wherever. I've been using the same TV card for years and years now, but have upgraded my video card 2-3 times. Definitely would not want to have paid extra for an AIW version of the card each of those times.

Doesn't fit very well inside an SFF box.

Then again, most current-gen video cards don't either :D
 
godhandiscen said:
Yeah, my bad. Isn't PCI even slower than AGP?

Yep. Amazingly, there are some really weird graphics solutions that still use the old PCI bus. I think Sparkle announced an 8500GT on PCI, but it never materialized on the market.
 
Mindlog said:
http://www.engadget.com/2008/06/26/amd-smells-a-comeback-with-ati-all-in-wonder-hd/

Oh GOD YES!

ATI bringing back the AIW.

But they didn't mention which core they're using :[ I will be looking to upgrade my current 9800 Pro AIW soon.

FYI, it's a 3650. http://ati.amd.com/products/aiwhd/requirements.html. However, the tuner hardware it's based on (ATI 650) is one of the best tuner cards you can buy right now.

But if you're into HTPC like me, this is a great card by default because it provides audio and video via HDMI in one card (although you still need a sound card). I just wish it supported the newer audio codecs like this sound card does.
 
Quick question: if I play games at 1280x1024, would I notice the difference between the 4850 and the 4870?

Also, will a Core 2 Quad Q6600 at 2.4GHz (I'll hopefully overclock it to 3.0+) with 4GB of RAM and one of the above cards be OK to play Crysis on High at that resolution?
 
Foster said:
Quick question: if I play games at 1280x1024, would I notice the difference between the 4850 and the 4870?

Also, will a Core 2 Quad Q6600 at 2.4GHz (I'll hopefully overclock it to 3.0+) with 4GB of RAM and one of the above cards be OK to play Crysis on High at that resolution?
Go look at benchmarks done at 1280x1024 for the 4870; the 4850 is probably included. I'm too lazy right now.
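A quick back-of-envelope on pixel counts is useful here: at 1280x1024 the GPU is pushing noticeably fewer pixels than at the higher resolutions discussed in this thread, so games are more likely to be CPU-limited and the 4850/4870 gap should narrow. A rough sketch (the resolution list is just the ones mentioned in this thread):

```python
# Rough pixel-count comparison. Fewer pixels per frame means less GPU work,
# so at 1280x1024 a game is more likely CPU-limited and the gap between a
# 4850 and a 4870 should narrow.
resolutions = [(1280, 1024), (1680, 1050), (1920, 1080)]

base = 1280 * 1024  # 1,310,720 pixels
for w, h in resolutions:
    pixels = w * h
    print(f"{w}x{h}: {pixels:,} pixels ({pixels / base:.2f}x vs 1280x1024)")
```

So 1920x1080 is roughly 1.6x the pixel load of 1280x1024, which is why the faster card matters more at higher resolutions.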
 
I can get a brand new Sapphire 4850 for 125€; I'm a bit tempted right now :lol

Don't know if it's worth it though, I've had an 8800GT for 3 months now and used it only to play Crysis

:)
 
So I finally got my projector hooked up to the PC via HDMI (I was using VGA before - needed a really long HDMI cable to run to the front of the room where my receiver is, as well as a switchbox since I am out of HDMI inputs on the receiver :p)...

I was pleasantly surprised that on my projector basically every resolution you'd care about is supported over HDMI (it's gimped over VGA - max was 1280x1024), not just the DTV resolutions. And the 7.1 HDMI audio works fine too :)

For shits and giggles I ran the Devil May Cry 4 demo with my Crossfired 4850s: 1080p resolution, Super High everything, 8xAA... I dunno how to get a screen grab of the result, but I got an S rank with the line pinned to the top of the chart, with the exception of a few spots in the second and fourth demo. FPS range looked like 90 minimum, 200 max.

I bought Crysis recently and I just did the "very high on DX9" hack - CF doesn't help so much, but I do get 30fps at 720p/NoAA, which is playable.

Also, the 4850s run HOT. I was getting lockups in the DMC4 benchmark until I pulled the side off my case. Not sure what I am going to do for cooling yet.
 
antiloop said:
I will wait for OpenGL 3.0-ready cards. It will be a long wait but I don't game that much on my PC nowadays.
Why? 90% of games are done in Direct3D. I am a big supporter of OpenGL (it's my first 3D API; I've programmed with it a lot), but when it comes to gaming, either you like id games or OpenGL is unimportant. Also, I doubt Blizzard is employing 3.0 for its games right now.
 
So I'm looking in the ~$200 range, and it comes down to the 4850 vs. the 9800GTX. From all the reviews, the 9800GTX has a slight edge in Crysis and lower temperatures under load, but it's loud, whereas the 4850 is quiet but hot and wins in almost every other game. I'm not worried about going to a multi-card solution, so it will probably come down to where I can get either one for the best price.
 
The 260GTX is finally starting to show up at Newegg...at $399.99. They need to drop that to $299.99 asap. I'll bet that it's done within the next two weeks, unless Nvidia has some sort of secret performance driver, or they think 200 series has legs for newer games, whereas the 4800 series won't.
 
Chiggs said:
The 260GTX is finally starting to show up at Newegg...at $399.99. They need to drop that to $299.99 asap. I'll bet that it's done within the next two weeks, unless Nvidia has some sort of secret performance driver, or they think 200 series has legs for newer games, whereas the 4800 series won't.
It sure does. Let's just apply a couple of "The Way It's Meant to Be Played" incentives here and there...
pay developers to cripple performance on ATI hardware (Age of Conan, Lost Planet, Assassin's Creed, etc.)
 
Outdoor Miner said:
EA, Sega, and Blizzard have signed on for DirectX 10.1.

Diablo III in 10.1? *fap*
Oh shit, suddenly 10.1 could be really relevant. Anyway, by the time Diablo III releases, Nvidia will have 10.1 hardware.
 
godhandiscen said:
It sure does. Let's just apply a couple of "The Way It's Meant to Be Played" incentives here and there...
pay developers to cripple performance on ATI hardware (Age of Conan, Lost Planet, Assassin's Creed, etc.)

Or have drivers that cheat in some games by reducing image quality for higher framerates.
 
Outdoor Miner said:
EA, Sega, and Blizzard have signed on for DirectX 10.1.

Diablo III in 10.1? *fap*

Let's not forget Dawn of War II either now. Unless there is some huge difference between DirectX 10 and 10.1 I'm missing.
 
My plan is to equip the new PC I'm building in August with the HD4870, and then by early next year, or this Christmas, add another HD4870 for Crossfire.
 
WrikaWrek said:
My plan is to equip the new PC I'm building in August with the HD4870, and then by early next year, or this Christmas, add another HD4870 for Crossfire.

Make sure you get an X38 motherboard; there have been some reports of performance penalties running Crossfire on P45 boards.
 
GHG said:
Or have drivers that cheat in some games by reducing image quality for higher framerates.

Aren't Nvidia and ATI both guilty of doing this?

godhandiscen said:
It sure does. Let's just apply a couple of "The Way It's Meant to Be Played" incentives here and there...
pay developers to cripple performance on ATI hardware (Age of Conan, Lost Planet, Assassin's Creed, etc.)

Scumbag tactic; that's for sure.
 
godhandiscen said:
It sure does. Let's just apply a couple of "The Way It's Meant to Be Played" incentives here and there...
pay developers to cripple performance on ATI hardware (Age of Conan, Lost Planet, Assassin's Creed, etc.)
[lolcats image]
 
godhandiscen said:
It sure does. Let's just apply a couple of "The Way It's Meant to Be Played" incentives here and there...
pay developers to cripple performance on ATI hardware (Age of Conan, Lost Planet, Assassin's Creed, etc.)
The irony of the whole situation is that ATI cards perform very well in those games (except Lost Planet):

[Assassin's Creed benchmark chart]
[second benchmark chart]
 
Chiggs said:
^^^^^

:lol at the 4850 beating the GTX 260 in Assassin's Creed.
Those are pre-patch benchmarks. Ubisoft already removed all the DirectX 10.1 code at Nvidia's behest, so I doubt those benchmarks would still hold true.

Cheeto said:
I always laugh at conspiracy theories, but this is one I believe in.
 
godhandiscen said:
Those are pre-patch benchmarks. Ubisoft already removed all the DirectX 10.1 code at Nvidia's behest, so I doubt those benchmarks would still hold true.
Even after the patch, the 4870 is within 2% of the GTX280. :D $299 says hello to $649. :p
 
Waiting for the 1GB version of the 4870; I'll run my PC through my telly (1080p) then and play some games on there - GRID, DMC4, VP, and other pad-friendly games :)
 
irfan said:
Even after the patch, the 4870 is within 2% of the GTX280. :D
I knew those were pre-patch figures. Also, I don't doubt that 2% figure, seeing how powerful the 4870 is. I just wish we could all move to DX10.1 and have free AA in all games, as it was meant to be with DirectX 10.
 
Shuichi Takagi, CyberLink's vice president of business development, ran a demonstration on a just-launched ATI Radeon 4850 512 MB, proving that the hardware and software are capable of converting four HD MPEG-2 movies into MPEG-4 simultaneously - in real time. According to Shuichi, it will take about 30 minutes to process four full-length movies and compress them into handheld-friendly 200+ MB files.
http://www.tgdaily.com/html_tmp/content-view-38117-140.html

edit: Note that this is different from the free AVIVO transcoding software from AMD.
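The claimed figures pass a quick sanity check. Assuming a full-length movie is ~120 minutes (my assumption, not stated in the article), four movies in 30 minutes works out to a 16x aggregate speed-up over real time, and a 200 MB output per movie implies roughly a 222 kbps stream - plausible for 2008-era handheld video:

```python
# Sanity check of the demo's numbers. The ~120-minute movie length is my
# assumption; the 200 MB output size and 30-minute wall clock come from
# the article.
MOVIE_MINUTES = 120        # assumed feature length (not stated in article)
OUTPUT_MB = 200            # "200+ MB" handheld-friendly files
NUM_MOVIES = 4
WALL_CLOCK_MINUTES = 30    # "about 30 minutes" for all four movies

# Aggregate speed-up over real time: 480 minutes of video in 30 minutes.
speedup = NUM_MOVIES * MOVIE_MINUTES / WALL_CLOCK_MINUTES

# Effective output bitrate: 200 MB over 2 hours, in kilobits per second.
bitrate_kbps = OUTPUT_MB * 8 * 1000 / (MOVIE_MINUTES * 60)

print(f"Aggregate transcode speed: {speedup:.0f}x real time")   # 16x
print(f"Effective output bitrate: {bitrate_kbps:.0f} kbps")     # ~222 kbps
```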
 