
AMD Radeon HD 6870 and HD 6850 benchmarks

DonasaurusRex

Online Ho Champ
If these cards are the new 5770-5830 segment, I cannot wait to see what we are gonna get in the 5850-5870 slot; if you look at the difference between the 5830 and the 6800 series, it's truly compelling.
 
Solstice

God damnit god damnit god damnit. I told myself I was gonna hold off, but I'm starting to feel my 4870's age, and these prices are too good to pass up. This weekend, I'm so buying one.
 

otake

Doesn't know that "You" is used in both the singular and plural
Solstice said:
God damnit god damnit god damnit. I told myself I was gonna hold off, but I'm starting to feel my 4870's age, and these prices are too good to pass up. This weekend, I'm so buying one.


But is there a compelling game worth upgrading for?
 
Shambles said:
"no fucker else" huh?

Those effects already exist and have for a long time. That's like saying no one is going to bother to make graphical eye candy in games because there is no point. And most games are not cheap console ports... I'm not sure what else could be wrong with this post. PhysX is just a label on a box. It intentionally uses shitty old x87 code to destroy CPU performance and artificially make the GPU look like it accelerates it. Often the CPU is better suited for physics simulations when systems aren't parallel (but you can also fake a situation to make the collisions parallel when they really aren't). In addition to all this, it's silly to run physics calculations on the GPU, which is already under full load trying to render frames, when the CPU is usually sitting there mostly idle.

Not this fucking argument again.

Good luck getting software physics on par with Nvidia's hardware-accelerated offering any time soon. And good luck getting developers to add vastly more complex physics simulation to their console ports without money hats from Nvidia or ATI.

I don't like proprietary bullshit, but it is what it is.

New pics of MLAA.

[MLAA comparison screenshots: maa4-l.jpg, maa3-l.jpg]
 

otake

Doesn't know that "You" is used in both the singular and plural
DaBuddaDa said:
Crysis 2 is only ~5 months away.

Isn't that game going to be consoled-down? Are the system requirements out yet?
 

DaBuddaDa

Member
otake said:
Isn't that game going to be consoled-down? Are the system requirements out yet?
The official reqs aren't announced, but I guess the developer spin is that it will look better than Crysis but require lower system specs. We'll see about that.

You could always try to go balls-to-the-wall maxed out Metro 2033 DX11...
 

otake

Doesn't know that "You" is used in both the singular and plural
nubbe said:
In 5 months we will see the 7870 do 120fps in Crysis


I remember people posting comments like that in the 4870 thread.

It bothers me that no one benchmarks at my resolution. If I knew how much of a performance improvement I'd see, this decision would be much easier.
 
flyinpiranha said:
So for somebody who does not OC and has a 4850 ... would you aim for a 6850 or a 6870? This is going to last me about 2+ years and will be moved into a new build, and most likely put into Crossfire to squeeze another couple of years out of it.

6870.
 
Ryoma-Echizen

Salacious Crumb said:
The 6950 and 6970 are both single-GPU designs.

The 6990 is AMD's dual-GPU card.

If Antilles is based on Barts GPUs, it will outperform the HD 5970. HD 6870 CF already surpasses HD 5870 CF (and is also better than a 5970), and a dual-GPU HD 6870 card would stay well below 300W at stock speeds.

The HD 5970 is toe to toe with 6850 CF (which draws 50W less than a 5970).

Now if Antilles is based on the Cayman GPUs, hell will be upon us.
 

mr stroke

Member
brain_stew said:
You'll purge the disease that is Crossfire from your machine; that's not something to be taken lightly.

In the end it's idle power consumption that really matters, and a single-card solution is always going to have the lower idle consumption, so if you care about reducing your power bills, then the 6970 is going to be the way to go, not dual 6850/6870s. You'll also probably be able to get rid of the 1GB VRAM bottleneck that has been crippling your 5970 as well.


What's wrong with Crossfire/SLI?

So I assume you would choose a single GTX 480 over Crossfire 6870s?
 

ghst

thanks for the laugh
Krauser Kat said:
I just noticed Newegg no longer carries the 4890. Any reason why it was phased out so early when the 4870 etc. are still around?
Like a candle in the wind. Only now are we seeing similar-performance cards go for the same price the 4890 hit a few months after launch (£130); it didn't make sense to keep it hanging around making the 5xxx series look overpriced. Can't imagine per-unit profit was exactly stellar, either.
 
Flying_Phoenix

This has probably been asked a million times already but is the 6870 really worth the extra $50?

All I want is a GPU that will last me 2.5 years (much like how my 8800GT has so far).

I don't care too much about maxing out every single game to ever come out, but more so running them at quality settings. I mean, the 6850 looks to be marginally better than the GTX 460 1GB, and that was the hot shit up until two days ago.
 

camineet

Banned
Ryoma-Echizen said:
If Antilles is based on Barts GPUs, it will outperform the HD 5970. HD 6870 CF already surpasses HD 5870 CF (and is also better than a 5970), and a dual-GPU HD 6870 card would stay well below 300W at stock speeds.

The HD 5970 is toe to toe with 6850 CF (which draws 50W less than a 5970).

Now if Antilles is based on the Cayman GPUs, hell will be upon us.


Antilles is based on the Cayman GPU, not Barts. So Antilles = 2 Cayman GPUs on a single card.
 

Akuun

Looking for meaning in GAF
Still quite happy with my 4870. I can wait until the newer cards come out. Maybe even until the next generation.
 
brain_stew

Flying_Phoenix said:
This has probably been asked a million times already but is the 6870 really worth the extra $50?

All I want is a GPU that will last me 2.5 years (much like how my 8800GT has so far).

If you're willing to OC (i.e. move a little slider for an instant performance boost), then no. A $180 6850 will perform the same as a stock 6870 once overclocked, and the 6870 is not capable of ever reaching much higher performance than what it delivers at stock.
 

mr_nothin

Banned
Nidain said:
k2k%K@:^JK@$^J$@^
:D :D :D

Bring on the MLAA hax!!
 

Avtomat

Member
Flying_Phoenix said:
This has probably been asked a million times already but is the 6870 really worth the extra $50?

All I want is a GPU that will last me 2.5 years (much like how my 8800GT has so far).

I don't care too much about maxing out every single game to ever come out, but more so running them at quality settings. I mean, the 6850 looks to be marginally better than the GTX 460 1GB, and that was the hot shit up until two days ago.

Think of it this way: $50 is not a hell of a lot to invest over 2 years. I mean, if the 6850 is not gonna cut it and you end up regretting not getting the 6870 in 18 months, well, you've got another 6 months to put up with below-par performance.

Akuun said:
Still quite happy with my 4870. I can wait until the newer cards come out. Maybe even until the next generation.

Next gen is prolly gonna be 2011, so in truth not that long then :p
 

otake

Doesn't know that "You" is used in both the singular and plural
Bad Company 2 appears to run badly on my card. I guess it is time to upgrade.
 
Check this out, an MLAA comparison with StarCraft 2.
http://www.tomshardware.com/reviews/radeon-hd-6870-radeon-hd-6850-barts,2776-4.html

4xAA causes a 50% drop in performance, while with MLAA there is virtually none. I wonder if it will work as well with other games.

Also a word of warning: the default AF setting in Catalyst 10.10 is actually worse than the default Catalyst AI setting in previous drivers. You have to set it to High Quality to get the same IQ on the HD 5xxx series and to get the improvements with the HD 6xxx.
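
If you're wondering why MLAA is nearly free while 4xAA costs 50%: MSAA makes the GPU shade and resolve extra samples for every pixel, while MLAA is a single post-process pass over the finished frame. For a rough feel of the idea, here's a toy numpy sketch of the general find-edges-then-blend approach (my own simplification, nothing like AMD's actual driver implementation):

Code:
import numpy as np

def toy_mlaa(frame):
    """frame: HxWx3 float array in [0,1]. Returns an edge-blended copy."""
    # Per-pixel luminance; edges are where luminance jumps.
    luma = frame @ np.array([0.299, 0.587, 0.114])
    edge_h = np.abs(np.diff(luma, axis=0, prepend=luma[:1])) > 0.1
    edge_v = np.abs(np.diff(luma, axis=1, prepend=luma[:, :1])) > 0.1
    edges = edge_h | edge_v
    # Real MLAA estimates coverage from the shapes of the edges;
    # a neighbour average stands in for that here.
    blurred = frame.copy()
    blurred[1:-1, 1:-1] = (frame[:-2, 1:-1] + frame[2:, 1:-1] +
                           frame[1:-1, :-2] + frame[1:-1, 2:]) / 4
    out = frame.copy()
    out[edges] = 0.5 * frame[edges] + 0.5 * blurred[edges]
    return out

Every pixel gets touched once no matter how the scene was rendered, which is why the cost barely registers next to shading 4x the samples.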
 
Flying_Phoenix

brain_stew said:
If you're willing to OC (i.e. move a little slider for an instant performance boost), then no. A $180 6850 will perform the same as a stock 6870 once overclocked, and the 6870 is not capable of ever reaching much higher performance than what it delivers at stock.

6850 it is then ... thanks for saving me $50!
 
evil solrac v3.0 said:
Is ATI going to have a PhysX equivalent for games?

They probably don't have the expertise to do that, and it would suck anyway because it would be proprietary and devs would have to choose between AMD and Nvidia. Nvidia bought PhysX and had them port their physics engine from their standalone PPUs to CUDA. Anyone can write a platform-agnostic, GPU-accelerated physics engine using DirectCompute, which will run on all DX10-and-above GPUs. Devs never seem to do their own physics engines these days; Havok seriously needs to hurry up and do this.
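
The reason physics maps so well onto GPU compute, whatever the API, is that each particle or body update is independent, so a kernel can assign one thread per element. Here's a toy CPU-side sketch of that data-parallel structure (a hypothetical numpy illustration, not code from PhysX, Havok, or any real engine):

Code:
import numpy as np

GRAVITY = np.array([0.0, -9.81, 0.0])

def step_particles(pos, vel, dt=1.0 / 60.0):
    """pos, vel: (N,3) arrays. One semi-implicit Euler step for all N."""
    vel = vel + GRAVITY * dt   # integrate acceleration
    pos = pos + vel * dt       # integrate velocity
    # Trivial ground-plane collision: reflect anything below y=0.
    below = pos[:, 1] < 0.0
    pos[below, 1] *= -1.0
    vel[below, 1] *= -0.5      # lose some energy on the bounce
    return pos, vel

# 100k particles, every one updated by the same independent math --
# exactly the shape of work a compute shader spreads across threads.
pos = np.random.rand(100_000, 3) * 10.0
vel = np.zeros_like(pos)
pos, vel = step_particles(pos, vel)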
 

Seth C

Member
So, I'm currently running the onboard graphics (wheee) and my monitor (20") only runs at 1600x900 anyway. Any of you upgraders want to help a gaming brother out on the cheap?
 

subversus

I've done nothing with my life except eat and fap
otake said:
Bad Company 2 appears to run badly on my card. I guess it is time to upgrade.

Yeah, it's time to upgrade your CPU.

But if you're on a quad-core and play at insanely high resolutions, then yeah, you should upgrade your video card.
 

otake

Doesn't know that "You" is used in both the singular and plural
subversus said:
Yeah, it's time to upgrade your CPU.

But if you're on a quad-core and play at insanely high resolutions, then yeah, you should upgrade your video card.

I'm on a Core 2 Duo. I don't think I'll be upgrading any time soon. So far, only one compelling game has required going quad.
 

subversus

I've done nothing with my life except eat and fap
otake said:
I'm on a Core 2 Duo. I don't think I'll be upgrading any time soon. So far, only one compelling game has required going quad.

Yes, and this one game is BFBC2, right? (Also GTA4, Dragon Age, and a dozen more games.) My framerate in it doubled when I upgraded from 2 to 4 cores. Don't waste your money on a GPU if you want to increase fps in BFBC2.
 

kinggroin

Banned
brain_stew said:
You'll purge the disease that is Crossfire from your machine; that's not something to be taken lightly.

In the end it's idle power consumption that really matters, and a single-card solution is always going to have the lower idle consumption, so if you care about reducing your power bills, then the 6970 is going to be the way to go, not dual 6850/6870s. You'll also probably be able to get rid of the 1GB VRAM bottleneck that has been crippling your 5970 as well.

I appreciate your help and persistence. We've done this dance before, but I have no intention of getting rid of my Crossfire setup. My "crippled" 5970 does just fine at spanking pretty much every other card in existence, so being able to use absolute max settings and image quality is something I am not willing to give up : )

Micro-stuttering is also not a concern of mine; the combination of vsync and triple buffering has pretty much eliminated that problem in most games that I play. Don't fret, however; as soon as a single GPU can match the performance of my CF setup, I'll bite.


And finally get you off my back, lol.
 

otake

Doesn't know that "You" is used in both the singular and plural
subversus said:
Yes, and this one game is BFBC2, right? (Also GTA4, Dragon Age, and a dozen more games.) My framerate in it doubled when I upgraded from 2 to 4 cores. Don't waste your money on a GPU if you want to increase fps in BFBC2.


I played Dragon Age on my C2D with no issues. Upgrading my CPU would mean upgrading my motherboard as well; not looking forward to that. The Core i5 stuff is still too expensive.

I thought gaming was becoming much more GPU bound than CPU bound.
 

subversus

I've done nothing with my life except eat and fap
otake said:
I played Dragon Age on my C2D with no issues. Upgrading my CPU would mean upgrading my motherboard as well; not looking forward to that. The Core i5 stuff is still too expensive.

I thought gaming was becoming much more GPU bound than CPU bound.

No; since this console generation is light on the GPU side, the opposite is happening.

CryEngine 3, Frostbite (DICE games from now on), RAGE (the GTA series): they all demand 4 cores to run decently on high settings. And again, your fps in DA will almost double if you're on a quad.

http://www.pcgameshardware.com/aid,...rks-75-percent-boost-for-quad-cores/Practice/

But yeah, Intel CPUs and motherboards are expensive.
 

Jin34

Member
otake said:
I played Dragon Age on my C2D with no issues. Upgrading my CPU would mean upgrading my motherboard as well; not looking forward to that. The Core i5 stuff is still too expensive.

I thought gaming was becoming much more GPU bound than CPU bound.

What card do you have? I have a 4850 but also a Phenom II X4 and can play BC2 @ 1680x1050 very well as long as I leave AO off and shadows on medium, so I do think it's the CPU. Most games are GPU-bound, but that's as long as you have a quad. At this point, though, you might as well wait for Sandy Bridge and upgrade then.
 

otake

Doesn't know that "You" is used in both the singular and plural
subversus said:
No; since this console generation is light on the GPU side, the opposite is happening.

CryEngine 3, Frostbite (DICE games from now on), RAGE (the GTA series): they all demand 4 cores to run decently on high settings. And again, your fps in DA will almost double if you're on a quad.

http://www.pcgameshardware.com/aid,...rks-75-percent-boost-for-quad-cores/Practice/

But yeah, Intel CPUs and motherboards are expensive.

Aren't the consoles limited to three cores? I know the 360 is three cores and the PS3 is some crazy six-or-seven-SPE bullshit.
 

subversus

I've done nothing with my life except eat and fap
otake said:
Aren't the consoles limited to three cores? I know the 360 is three cores and the PS3 is some crazy six-or-seven-SPE bullshit.

The PS3 made devs offload a lot of graphics work onto the CPU (since its GPU is shit) and parallelize tasks. There was a DICE presentation where they stressed that Frostbite will parallelize tasks heavily and use 6 or more cores, so the PS3 will benefit from it greatly. CryEngine 3 will use up to 6 cores, and all physics tasks will be offloaded to the CPU.

The 360's CPU is also pretty powerful for a console, and the lead engine architect at DICE has said numerous times on his Twitter (due to complaints about BFBC2 performance) that console CPUs rape most dual-core CPUs used in PCs.
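
For the curious, the job-based parallelism those DICE and Crytek presentations describe boils down to slicing a frame's work (physics, animation, particles) into independent tasks and letting a pool spread them over however many cores exist, whether that's 2, 4, or a console's 6+ hardware threads. A toy sketch of the pattern (a hypothetical illustration, not engine code):

Code:
import os
from concurrent.futures import ProcessPoolExecutor

def simulate_chunk(chunk_id, steps=200_000):
    # Stand-in for one job's worth of physics or animation work.
    x = 0.0
    for i in range(steps):
        x += (i * 1e-6) % 0.5
    return chunk_id, x

if __name__ == "__main__":
    jobs = range(16)  # 16 independent jobs for one "frame"
    with ProcessPoolExecutor() as pool:  # one worker per core by default
        results = list(pool.map(simulate_chunk, jobs))
    print(f"ran {len(results)} jobs across {os.cpu_count()} cores")

The same job list runs on a dual-core or a six-core machine; more cores just drain it faster, which is exactly why these engines scale past two cores.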
 

Shambles

Member
otake said:
I'm on a Core 2 Duo. I don't think I'll be upgrading any time soon. So far, only one compelling game has required going quad.

The easiest way to solve this is to run the game and monitor GPU and CPU usage. Whichever is being maxed out is the one that needs the next upgrade.
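
If you'd rather have numbers than eyeball Task Manager, a rough script like this will show whether your cores are pegged while the game runs (it assumes the third-party psutil package; GPU load itself still needs a vendor tool like GPU-Z or Afterburner, which this doesn't replace):

Code:
import psutil  # third-party: pip install psutil

def watch_cpu(seconds=30, interval=1.0):
    """Print per-core CPU load; a core pinned near 100% while the
    GPU isn't maxed points to a CPU bottleneck."""
    for _ in range(int(seconds / interval)):
        per_core = psutil.cpu_percent(interval=interval, percpu=True)
        for i, p in enumerate(per_core):
            flag = "<-- maxed" if p > 95 else ""
            print(f"core {i}: {p:5.1f}% {flag}")
        print("-" * 24)

if __name__ == "__main__":
    watch_cpu()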
 
brain_stew

otake said:
I played Dragon Age on my C2D with no issues. Upgrading my CPU would mean upgrading my motherboard as well; not looking forward to that. The Core i5 stuff is still too expensive.

I thought gaming was becoming much more GPU bound than CPU bound.

Not on dual-core rigs.

You could pick up a used Q6600 for <$100 and clock it to 3GHz+, and it would get rid of your CPU woes for many years to come. I can't rightfully recommend a GPU upgrade until you do at least this.
 
flyinpiranha

brain_stew said:
Not on dual-core rigs.

You could pick up a used Q6600 for <$100 and clock it to 3GHz+, and it would get rid of your CPU woes for many years to come. I can't rightfully recommend a GPU upgrade until you do at least this.

Where do you shop for a used one?
 
Flying_Phoenix

Being honest, I suspect that I'll start getting more and more life out of my hardware as time goes on. Gaming is reaching its limit in terms of production costs, and we haven't seen a game that truly brought PCs to their knees since Crysis.

flyinpiranha said:
Where do you shop for a used one?
Amazon.

And if you want one on the cheap, go AMD AM3.
 
brain_stew

Flying_Phoenix said:
Being honest, I suspect that I'll start getting more and more life out of my hardware as time goes on. Gaming is reaching its limit in terms of production costs, and we haven't seen a game that truly brought PCs to their knees since Crysis.

Metro 2033 would like a word.
 
Flying_Phoenix

brain_stew said:
Metro 2033 would like a word.

Exactly one game, and one that can be easily scaled down. Yes, I know you can probably find a few more if you really dig, but there's no denying that games today are far less in need of extra power than games were ten years ago.
 