
GTX880 Rumors [New stuff: Post #231]

SuperÑ said:
I'm on the verge of buying an R9 290 to replace my old 6950... should I bite the bullet now?

Yeah, the 290 is a great card and going from 6950 to 290 will be an awesome leap for you. Just buy new if it's 290/290X - too much risk of a used card being an ex-miner to buy used AMD, IMO.
 
That's a long time to wait, if true. My old 670 still does a great job at 1080p, but I'm soon planning to go for one of the 1440p monitors, so I'm still trying to decide if I should get a 780ti now, or wait for the 800-series.
 
Been considering buying a new PC for the last couple of months, but now this... it seems like ages away. Don't know if I can hang on for another 4+ months with this PC; I bought it at the wrong time, so it's pretty badly outdated now (4GB RAM, GTX 460). Confirmed by Dead Rising 3, where even the minimum spec is too high for my card!

Do the indications and rumours suggest these will be worth waiting for over getting a 770 or higher 700 series card now?
 
At the same chip size, the jump from 28nm to 20nm would bring a larger number of transistors, lower power consumption, and lower heat.

Better product.

For a reference today - look at the 750 Ti. Performance-wise, they have neutered it so it falls in line with the NVIDIA numbering scheme - but its power use and heat are insanely low compared to everything else.
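For rough context on why a node shrink helps, the ideal-scaling arithmetic behind that claim can be sketched like this (back-of-envelope only; `density_gain` is an illustrative helper name, and real foundry processes scale worse than this ideal):

```python
# Back-of-envelope node-shrink math (illustrative; real processes scale
# worse than this ideal). Shrinking features from 28nm to 20nm shrinks
# transistor area by the square of the ratio, so the same die can
# ideally hold about twice as many transistors.

def density_gain(old_nm: float, new_nm: float) -> float:
    """Ideal transistor-density multiplier for a full node shrink."""
    return (old_nm / new_nm) ** 2

gain = density_gain(28, 20)
print(f"Ideal 28nm -> 20nm density gain: {gain:.2f}x")  # 1.96x
```

That headroom can be spent on more transistors, lower clocks/voltage for the same performance, or some mix of the two, which is why a shrink tends to improve both performance and heat at once.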
 
At the same chip size, the jump from 28nm to 20nm would bring a larger number of transistors, lower power consumption, and lower heat.

Better product.
Not guaranteed, it seems. Signs point to 20nm possibly being more expensive and problematic:

20nm has higher cost per wafer than 28nm

Nvidia thinks 20nm is crap

Issues with 20nm and beyond

For a reference today - look at the 750 Ti. Performance-wise, they have neutered it so it falls in line with the NVIDIA numbering scheme - but its power use and heat are insanely low compared to everything else.
We'll need to see if it scales. As of now the 750 Ti is not a great price/perf value compared to the R7 265.
 
All this talk about 20nm, but has anyone heard if they're gonna go GDDR6 with the 8xx or hold off till the next card? I'm really curious about that. It's supposed to have higher bandwidth and higher clocks.
 
All this talk about 20nm, but has anyone heard if they're gonna go GDDR6 with the 8xx or hold off till the next card? I'm really curious about that. It's supposed to have higher bandwidth and higher clocks.

Would be with the release of the 9xx. If they want a major boost for 4K gaming it'll be 20nm + GDDR6, first released as part of a Titan series for $1K.
 
So this should raise hopes of the 20nm Maxwell cards being full-fledged cards then right? Isn't 16nm coming pretty soon afterwards? Can they really pad out their releases for that long and risk AMD putting out proper high end cards and blowing them away?
 
Bleh. The sooner the 6GB 780 Ti comes out the better. Maxwell isn't looking like it'll be worth it until the 900 series.
I would skip the 6GB 780 ti, because the ports that need that much memory have garbage optimization anyway. You would need more grunt than that.

Honestly, gsync has done more for my experience than my GPU upgrade did. BF4 and Trials Fusion went from a stuttery mess to buttery smooth.
 
That's too bad. I was hoping to see a 20nm GTX 880 late this year that would put my Titan to shame. Still, I might not be able to resist upgrading if I decide to stick with PC gaming.
 
Hmm, is the lack of 20nm sufficient reason to wait yet another generation before upgrading my HD7870? I'm still only aiming to hit the "low" target of 1080p60, no 2560x1440 or refresh rates over 60Hz in my immediate future.
 
If I can buy a G-Sync monitor off the shelf, and if there is a single-chip enthusiast-level part more powerful than the Titan (the Z isn't really a gaming part) in the new lineup, I'm still going to be building a PC later this year. My current CPU isn't quite cutting it anymore, and I haven't done a completely new build in something like five years.

If not... I guess I keep waiting, but I really don't want to have to. My DK2 arrives next month and my PC needs a shot in the arm to let me play Star Citizen on it.


I am close to pulling the trigger on a DK2; my wee boy is dying to try it, me too. :-)


I would skip the 6GB 780 ti, because the ports that need that much memory have garbage optimization anyway. You would need more grunt than that.

Honestly, gsync has done more for my experience than my GPU upgrade did. BF4 and Trials Fusion went from a stuttery mess to buttery smooth.


There is way too much stuff I want. I need G-Sync as well; judder drives me nuts.
 
There is way too much stuff I want. I need G-Sync as well; judder drives me nuts.

I hope it's just a case of early adopter tax, but the first G-Sync monitor for sale over at OCUK is literally twice the price of the regular model (£180). If that does happen to be the regular mark-up, I'd probably choose to put the money towards a better GPU instead.
 
The graphics cards keep getting more powerful, but there aren't a lot of CPUs that can even keep up without bottlenecking them.
Still depends a lot on the game and what sort of resolution you're running at.

And this will also be somewhat addressed by things like Mantle and DX12.

Wouldn't mind seeing 6 or 8 core mainstream CPUs become a thing, though.
 
So if I'm looking to upgrade from a 660 Ti for The Witcher 3 and beyond, what's my best bet for a ~$300 range card? Wait until 20nm?
 
And I hope a Maxwell Titan-killer from AMD gets announced immediately after at half the price to knock some sense into Nvidia.

I would love to see AMD stepping up too, but I hope they can do it without eating a mini nuclear plant and an F1 engine in the process.

Keep it quiet and cool pls.
 
Getting an Oculus DK2 within the next month and my computer is 4 years old, so I'm looking to upgrade.

I was thinking of going with a Geforce GTX 780 Ti. Should I wait for this? Will the price go down on this old one? I'm guessing the wait is going to be more than 2 months which I don't know if I can handle.

Also, for the CPU, is an Intel Core i5-4670K Haswell quad-core 3.4GHz LGA 1150 good enough? I read that an i5 is what you want for gaming? Were these people crazy?
 
I think memory is gonna be a big deal breaker for me. With most developers leaving behind the old consoles and their tight memory management, the next batch of games are gonna eat some serious VRAM. I wouldn't be surprised if The Witcher 3 on ultra eats close to 5GB.
 
Still depends a lot on the game and what sort of resolution you're running at.

And this will also be somewhat addressed by things like Mantle and DX12.

Wouldn't mind seeing 6 or 8 core mainstream CPUs become a thing, though.

Though of course it's scenario-dependent, for things like driving 120 Hz screens and more advanced emulation consistently, we could really use some big single-threaded CPU innovations.
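The 120Hz point is easy to put in numbers: the per-frame time budget shrinks in inverse proportion to the refresh rate, which is why single-threaded CPU speed matters so much there (a minimal sketch; `frame_budget_ms` is a made-up helper name):

```python
# Per-frame time budget at a given refresh rate: at 120 Hz the CPU has
# roughly half the time per frame that it has at 60 Hz, so any
# single-threaded bottleneck bites twice as hard.

def frame_budget_ms(refresh_hz: float) -> float:
    """Milliseconds available to produce one frame."""
    return 1000.0 / refresh_hz

budget_60 = frame_budget_ms(60)    # ~16.67 ms
budget_120 = frame_budget_ms(120)  # ~8.33 ms
print(f"60 Hz: {budget_60:.2f} ms, 120 Hz: {budget_120:.2f} ms")
```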
 
Getting an Oculus DK2 within the next month and my computer is 4 years old, so I'm looking to upgrade.

I was thinking of going with a Geforce GTX 780 Ti. Should I wait for this? Will the price go down on this old one? I'm guessing the wait is going to be more than 2 months which I don't know if I can handle.

Also, for the CPU, is an Intel Core i5-4670K Haswell quad-core 3.4GHz LGA 1150 good enough? I read that an i5 is what you want for gaming? Were these people crazy?

Both i5 and i7 are identical in 99% of games, so it's more of a price/performance thing.

Some games, such as Crysis 3, do take advantage of the i7's Hyper-Threading, though, and you'll get better framerates with an i7 there.
 
I would skip the 6GB 780 ti, because the ports that need that much memory have garbage optimization anyway. You would need more grunt than that.
Future proofing/protecting myself against Watch Dogs-tier shit ports; can't be much worse than the 2GB I have on my 760 now anyway. I'm sure the 780 Ti 6GB will be out soon.

Uh, okay. Roll on the 880 then, I guess?
 
So if I'm looking to upgrade from a 660 Ti for The Witcher 3 and beyond, what's my best bet for a ~$300 range card? Wait until 20nm?

Early returns on Maxwell lite (750 Ti) were pretty encouraging. The 800 series is supposed to be cheaper than the 600/700 releases as well. Who knows when 20nm will actually show up...
 
Probably will end up picking up 2-4 of the 880s when a specific version releases. My 680s are crying more than me about the lack of 20nm.

Really hope they do have 8GB versions (4K, 8th gen). Would make smooth 4K feasible. Think VRAM will be a major factor again now that the new consoles are out.
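As a sanity check on the VRAM point: raw framebuffers are only a small slice of the total (textures and render targets dominate), but a quick sketch shows how per-buffer cost grows with resolution (illustrative numbers only; `buffer_mib` is a made-up helper):

```python
# Rough framebuffer size: width x height x bytes per pixel. A 32-bit
# (4-byte) color buffer at 4K is 4x the size of the same buffer at
# 1080p; engines keep several such buffers, and textures dwarf them all.

def buffer_mib(width: int, height: int, bytes_per_pixel: int = 4) -> float:
    """Size of one raw color buffer in MiB."""
    return width * height * bytes_per_pixel / (1024 ** 2)

fb_1080p = buffer_mib(1920, 1080)  # ~7.9 MiB
fb_4k = buffer_mib(3840, 2160)     # ~31.6 MiB
print(f"1080p: {fb_1080p:.1f} MiB, 4K: {fb_4k:.1f} MiB")
```

Everything else (higher-resolution textures, shadow maps, G-buffers) scales up alongside it, which is where multi-gigabyte VRAM budgets actually go.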
 
Wouldn't mind seeing 6 or 8 core mainstream CPUs become a thing, though.

As long as it doesn't come at the cost of single-core/single-threaded performance, as we see with the current Xeons.

I might step up from my 770 4GB to an 880, just so I can play The Witcher 3 at max settings, hopefully over 60FPS. I would probably have to wait for the actual high-end Maxwell part to achieve that, though.
 
I'm really hoping the 880s won't be more than $650; $600 would be the sweet spot, though. If they try to sell them above the price of a 780 Ti, it's going to be a slap in the face to their customers.
 