Well, to be fair, I have the microstutter problem on my GTX 460 in Deus Ex: HR. This is the first time I've ever experienced it. It really does make a 60fps game feel like it's running at a junk framerate, and I'd quit PC gaming if this were the norm. Whatever causes it, it's some messed-up incompetence.
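For anyone wondering what microstutter actually looks like in numbers: the FPS counter averages over a whole second, so it hides it completely. Here's a toy Python sketch (made-up frame times, not captures from my card) showing how two traces with the same 60fps average can feel totally different:

```python
# Toy illustration: two made-up frame-time traces with the same average
# framerate, one smooth and one with microstutter-style alternation.

def fps_and_jitter(frame_times_ms):
    """Return (average fps, worst frame-to-frame swing in ms)."""
    avg = sum(frame_times_ms) / len(frame_times_ms)
    swings = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]
    return 1000.0 / avg, max(swings)

smooth = [16.7] * 8        # evenly paced ~60 fps
stutter = [8.4, 25.0] * 4  # same ~60 fps average, alternating frame times

print(fps_and_jitter(smooth))   # ~59.9 fps, 0.0 ms swing
print(fps_and_jitter(stutter))  # ~59.9 fps, ~16.6 ms swing
```

The counter says 60 either way, but the 25ms frames in the second trace are individually 40fps territory, which is exactly why it feels like a junk framerate.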
Let me count the ways:
-Prototype: shadows still completely borked after all these years, never fixed...
-Skyrim doesn't scale at all
-Saints Row 3: super low fps
-Homefront: unplayable, performance degrading even on the lowest settings
-Rage (well, Rage had stutter problems on both platforms, but still)
-Deus Ex: HR stutter
-Witcher 2 "drunk mode" slowdown, never fixed on the HD 48xx series! Only fixed on the newer cards (the usual fuck-you from AMD on 'legacy' support, if you can call 3 years legacy)
+microstutter in a myriad of games, even on a single GPU.
There's more but these are the ones that come to mind.
AMD has always had a few more problems than Nvidia, but lately it completely takes the piss.
Then there's the fact that Nvidia has way more awesome supersampling AA options and way better SLI support.
My old Radeon 9800 Pro was released something like 2 years before Vista and it NEVER got Vista drivers, ever.
You always had to run some broken XP driver, and the 9200 cards don't work at all.
AMD is terrible with support; in my experience you can count on them forgetting their hardware exists within 2-3 years of release.
People expecting a 60fps minimum in every game with all settings maxed are just setting themselves up for disappointment. I don't care if you've got dual flagship GPUs; there will be scenarios where they buckle and drop under 60, and I certainly don't just mean due to a game's technical/visual prowess...
Agreed. Unless there is actually any real information, this thread should be locked. This is just a desperate grab at landing the OT before anyone else can, or a pathetic attempt to draw attention away from the 7970.
This. Buying the latest and greatest has never given a constant 60 FPS, doesn't matter what GPU generation. In many cases CPUs are still the limitation too when trying to hit 60 FPS minimum.
The next GPU from Nvidia isn't going to be an endgame card; even in SLI I have no doubt it'll dip below 60 FPS here and there in today's titles. Moreover, buying dual GPU has never been good value. You always get more bang for your buck going single GPU, and it comes with none of the problems.
Umm...BP101 has been touting and praising the 7970 more than anybody.
What's the deal with the series name? Are they splitting the mobile and desktop lines like they did for the GTX 3xx/4xx series?
Their triple buffering implementation is also poor, since it's evidently queuing frames, causing a lot of mouse lag. I don't know why so few engines implement triple buffering properly...drives me nuts.
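To spell out the difference (a hypothetical toy model, not any engine's actual code): with a render-ahead queue, every rendered frame eventually gets displayed, so at vsync you're looking at input that's as old as the queue is deep. With real triple buffering the GPU keeps overwriting the stale back buffer, and the flip always grabs the newest completed frame:

```python
from collections import deque

# Toy model: the GPU finishes two frames per vsync interval; each frame is
# numbered by the input sample it was rendered from. The frame shown at
# each vsync tells you how stale the input you're seeing is.

def queued_vsync(rendered_per_tick, ticks, depth=3):
    """Render-ahead queue: every frame is displayed, oldest first."""
    queue, shown, frame = deque(), [], 0
    for _ in range(ticks):
        for _ in range(rendered_per_tick):
            if len(queue) < depth:      # queue fills up, then the GPU stalls
                queue.append(frame)
                frame += 1
        shown.append(queue.popleft())   # vsync shows the OLDEST queued frame
    return shown

def true_triple_buffer(rendered_per_tick, ticks):
    """Proper triple buffering: newest completed frame wins at vsync."""
    shown, frame = [], 0
    for _ in range(ticks):
        frame += rendered_per_tick      # older back buffers get overwritten
        shown.append(frame - 1)         # flip to the freshest frame
    return shown

print(queued_vsync(2, 5))        # [0, 1, 2, 3, 4] -- increasingly stale input
print(true_triple_buffer(2, 5))  # [1, 3, 5, 7, 9] -- always the newest frame
```

Same GPU throughput in both cases, but the queued version is always showing you input from a few frames ago, which is the mouse lag you feel.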
BF3, Batman Arkham Asylum DX11 (PhysX off), Saints Row The Third DX11, Crysis 2 with the DX11 patch and high-res textures. Those are just off the top of my head. 2-way SLI GTX 580s are a joke for what they cost me. I got the first one in December 2010 and the second in March 2011. I paid $580 and $550 respectively for two 1.5GB EVGA cards, which I keep OC'd to 820. It has barely been one year and the setup is struggling with the latest games worse than any other multi-GPU setup I've ever had. I'm upgrading as soon as Nvidia releases its successor.
It means I can't get a constant 60fps. Also, I am very nitpicky when it comes to my PC gaming. On consoles I'm willing to put up with graphical issues, but when I dump over 1k on video cards, I expect to make no compromises, and the sad truth is that I have to. Also, I find it unbelievable that you're getting a constant 60fps in those games. Maybe you're just oblivious to fps drops. In Saints Row The Third, which is the least graphically intensive game of the ones I mentioned, I can easily get fps drops when you hit a gang and cars and explosions start piling up. BF3 stays at 40-50fps, with drops even down to 30fps when I play on a 32v32 server. I had to dial back to High for this game.
Yeah, there is no way that BF3 is GPU limited with two 580s at 1080p. Again, what resolution and settings are you running in BF3? Need to know the CPU too; it sounds like something isn't right here :/
Yup, true that. And Kepler NEEDS to be a beastly generation of cards so we can have ourselves a 28nm price war, because it's best when competition is so fierce that consumers have to make a hard choice between great products. The 7970/7950 might be beast cards, but if Kepler comes in 2-3 months and is superior, I might just have to sell whatever I've got and grab one.
I personally hope (but don't fully expect) Kepler really is 20-40% faster than 79xx series and also OC's like a beast so AMD goes 'oh shit' and has to do huge price drops.
The only thing I can think of is that if he's running it on Ultra, BF3 can use more than 1.5GB of VRAM even at 1080p, so if he doesn't have the 3GB variant of the 580, the game would be spilling assets into system memory whenever it exceeds 1.5GB, causing stutter, frame rate issues, etc.
I can't fathom BF3 at 1080p bringing an SLI 580 setup to its knees :/
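The 1.5GB theory is easy to sanity-check with back-of-envelope numbers. Everything below is a guess for illustration (especially the texture and shadow figures), not measured BF3 data:

```python
# Rough, illustrative VRAM budget at 1920x1080 with MSAA and a deferred
# renderer. The fixed numbers are guesses, NOT measured BF3 figures.

MB = 1024 * 1024

def buffer_mb(width, height, bytes_per_pixel, samples=1):
    """Size of one screen-sized render target in MB."""
    return width * height * bytes_per_pixel * samples / MB

w, h = 1920, 1080
budget = {
    "front/back buffers (2x RGBA8)":   2 * buffer_mb(w, h, 4),
    "G-buffer (4x RGBA16F, 4xMSAA)":   4 * buffer_mb(w, h, 8, samples=4),
    "depth/stencil (D24S8, 4xMSAA)":   buffer_mb(w, h, 4, samples=4),
    "shadow maps + post buffers":      150,    # guess
    "textures + geometry on Ultra":    1100,   # guess
}
total = sum(budget.values())
print(f"estimated total: {total:.0f} MB vs 1536 MB on a 1.5GB card")
```

Even with generous rounding, the render targets alone eat ~300MB at that resolution, so a big Ultra texture pool pushes a 1.5GB card right to the edge while a 3GB card has tons of headroom.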
Realistically I shouldn't have to upgrade my setup for years, but I'm trying so hard not to fall into this trap with the 7970s. Been doing well so far.
It doesn't. That's why I'm wondering if he's playing above 1080p. Only map on BF3 that gives me "trouble" is Oman. And even then my FPS is in the 70ish range (everything Ultra).
You do realize that triple buffering is extremely common on PS3, right? It's been used in games for years, while it never (or rarely) shows up on Xbox 360.

The consoles have neither the VRAM nor the grunt to do triple-buffering, and most game engines are made with consoles in mind now. It should surprise nobody that triple-buffering is essentially a lost cause at this point.
Fortunately most console ports run locked at 60fps anyways on my machine, so I just turn Vsync on.
Though we are unable to confirm the concrete specifications or how the first-launched "Kepler" will actually perform, a well-informed source indicated that "Kepler" will feature a 256-bit memory controller, that the corresponding graphics cards will pack 2GB of memory, and that the TDP will be 225W. We infer the first product may be the GK104 "Kepler" GPU. Judging from the memory controller and memory capacity, it is supposed to be difficult to defeat the AMD Radeon HD 7970 or even the Radeon HD 7950; in addition, a TDP of 225W doesn't have much going for it compared with the competitor.
Way faster. I don't know why people think these 7xxx cards are such beasts when they compare them to a 580 3GB; the 7970 has a huge clock advantage. It should be compared against the 580 Classified Ultra with the 900MHz core, which also recently had a price drop to $550.
I wish someone would bench the two and compare:
7970 vs 3GB 580 Classified Ultra
@Stock
@Max OC
and Clock for Clock @ 925mhz.
I don't think the 580 Classy would be all that much slower, despite being on an older architecture.
Maybe. Wouldn't be a bad idea if they did.
The 300M series was completely stupid though, nearly disgraceful really. It hung Nvidia's mobile fans out to dry for nearly an entire year, stuck dicking around with 128-core G92 200M cards as the fastest things available, while the highest 300M card was a pitiful 96-shader, 128-bit GDDR5 POS that no one cared about.
Are Nvidia cards generally noisier? I got a Vapor-X 5870 specifically to keep things quieter and I've been quite happy with it. Wondering what options on the Nvidia side there are for keeping the dB down.
I'm not that desperate to upgrade now, perhaps later in the year. Is DX11.1 something that cards will have to specifically support? Might it be best to wait for that too?
Because 2GB is enough in most cases and it cuts costs. They don't feel pressured. I don't even understand what Nvidia is thinking if those rumors are true. We've seen them play the VRAM "numbers game" before (3GB GT 555Ms on the Alienware M14x, anyone? Pointless...esp. with its max 900p screen lol...), so why not toss out 3GB to match AMD, if nothing else? Seems like that'd be easy to change, whereas all of their other specs seem due to yield/architectural problems...
Anyone care to speculate on price? I'd want a 3GB version though; it is kind of strange they aren't 3GB.
Does any game use more than 2GB of data yet?
Would be funny if Nvidia releases their card and it matches or exceeds the 7970, and is cheaper.
This. I'm going to have about $199 to spend at the beginning of March. Will that be sufficient funds for a version of this card?
Slot in the correct price bracket, receive money. What's so fucking stellar about the 7970 at 480 ($610)? I don't get it, I paid 280 for my 4870 back when it released. Does AMD see a world where people are still making money hand over fist?
Neither are the 580s dropping in price over here. I need to build a new computer, but I really don't know what GPU I should go for.
Do we know if the 6xx/7xx series is going to be PCI Express 3.0? Or is it still 2.0?
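For what it's worth, here's the per-lane math on 2.0 vs 3.0 (3.0 raises the signaling rate from 5 to 8 GT/s and swaps 8b/10b encoding for the much more efficient 128b/130b):

```python
# Per-lane PCI Express bandwidth: raw transfer rate times encoding efficiency.
def lane_gbps(gt_per_s, payload_bits, line_bits):
    """Usable Gbit/s per lane after line-code overhead."""
    return gt_per_s * payload_bits / line_bits

pcie2 = lane_gbps(5.0, 8, 10)     # PCIe 2.0: 5 GT/s, 8b/10b  -> 4.0 Gbit/s
pcie3 = lane_gbps(8.0, 128, 130)  # PCIe 3.0: 8 GT/s, 128b/130b -> ~7.88 Gbit/s

# An x16 slot, in GB/s per direction:
print(f"2.0 x16: {pcie2 * 16 / 8:.2f} GB/s, 3.0 x16: {pcie3 * 16 / 8:.2f} GB/s")
```

So roughly 8 GB/s vs ~15.75 GB/s per direction for an x16 slot, i.e. close to double, even though the raw clock only goes up 60%.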
It should get a lot better, since there has been a shift to prioritizing performance per watt. I really, really hope the next couple of generations of ultrabooks get better gaming performance, whether through a dedicated card or an onboard solution; PC gaming has been hurting quite a bit, because it used to be that when you bought a new computer it was guaranteed to run all of the latest games. Now there are computers being sold that can barely run a 2007 game.
I know this thread is for the desktop stuff, but it's not nearly as interesting these days, and it won't be until the next generation of consoles rolls out and games are being developed for bleeding-edge hardware again.
If these cards are specced for PCIe 3.0, don't you need Ivy Bridge to unlock them?