SuperÑ said: I'm on the verge of buying an R9 290 to replace my old 6950... should I bite the bullet now?
Yeah, the 290 is a great card and going from 6950 to 290 will be an awesome leap for you. Just buy new if it's 290/290X - too much risk of a used card being an ex-miner to buy used AMD, IMO.
14nm is just a label, like Intel's 22nm. Effectively Intel's 22nm might not have been significantly better than 28nm (as their labelling would suggest)
Doesn't make it better though. How did they let Intel get this huge advantage? I know it's mostly TSMC's fault, but it's still disappointing when you consider Moore's Law says we should be at 14nm this year.
No 20nm this year isn't news. I don't know why people still expect miracles. :/
What is so great about 20nm?
Sorry, not very technical about PC stuff.
At the same chip size, the jump from 28nm to 20nm would bring a larger number of transistors, lower power consumption, and lower heat.
Better product.
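Rough math behind that claim, for the non-technical folks: transistor density scales with the inverse square of the feature size, so a 28nm-to-20nm shrink would ideally give about 2x the transistors in the same area. A quick sketch, assuming perfect scaling (which real processes never hit):

```python
# Idealized node-shrink arithmetic: density goes with the inverse
# square of the feature size (perfect-scaling assumption).
old_node_nm = 28.0
new_node_nm = 20.0

density_gain = (old_node_nm / new_node_nm) ** 2
print(f"Ideal density gain 28nm -> 20nm: {density_gain:.2f}x")  # 1.96x

# Flip it around: the same design would fit in about half the area.
area_ratio = (new_node_nm / old_node_nm) ** 2
print(f"Ideal area at 20nm for the same chip: {area_ratio:.2f}x")  # 0.51x
```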
At the same chip size, the jump from 28nm to 20nm would bring a larger number of transistors, lower power consumption, and lower heat. Better product.
Not guaranteed, it seems. Signs point to 20nm being possibly more expensive and problematic:
20nm has higher cost per wafer than 28nm
Nvidia thinks 20nm is crap
Issues with 20nm and beyond
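The "more expensive" angle in those links is about cost per transistor, not just cost per wafer: if 20nm wafers cost enough more than 28nm ones, the density gain stops translating into cheaper chips. A toy sketch, with wafer-price and yield numbers that are purely illustrative assumptions, not foundry quotes:

```python
# Relative cost per transistor = (new wafer cost / old wafer cost)
# divided by the effective density gain. All numbers are assumptions.
def relative_cost_per_transistor(wafer_cost_ratio: float,
                                 effective_density_gain: float) -> float:
    return wafer_cost_ratio / effective_density_gain

# 20nm wafer at 1.6x the 28nm price with a healthy 1.9x density gain:
print(relative_cost_per_transistor(1.6, 1.9))  # ~0.84 -> 16% cheaper

# Same wafer price, but yield problems cut the effective gain to 1.5x:
print(relative_cost_per_transistor(1.6, 1.5))  # ~1.07 -> now MORE expensive
```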
For a reference today, look at the 750 Ti. Performance-wise they have neutered it so it falls in line with the NVIDIA numbering scheme, but power use and heat are insanely low compared to everything else.
We'll need to see if it scales. As of now the 750 Ti is not a great price/perf value compared to the R7 265.
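"Price/perf" here is just performance per dollar, and Maxwell's whole pitch is performance per watt. A minimal comparison sketch; the ~$149 launch prices and TDPs are the published figures, but the relative-performance index is an assumption for illustration:

```python
# Toy price/perf vs perf/watt comparison. Prices and TDPs are launch
# specs; the performance index (750 Ti = 1.0) is an assumed placeholder.
cards = {
    #             price_usd, rel_perf, tdp_w
    "GTX 750 Ti": (149,      1.00,      60),
    "R7 265":     (149,      1.15,     150),
}

for name, (price, perf, watts) in cards.items():
    print(f"{name}: {perf / price * 100:.2f} perf per $100, "
          f"{perf / watts * 100:.2f} perf per 100 W")
```

On those numbers the 265 wins performance per dollar and the 750 Ti wins performance per watt by a mile, which is exactly the scaling question.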
14nm is just a label, like Intel's 22nm. Effectively Intel's 22nm might not have been significantly better than 28nm (as their labelling would suggest).
Intel 22nm is in label only
Intel 22nm not really 22nm
All this talk about 20nm, but has anyone heard if they're gonna go GDDR6 with the 8xx series or hold off till the next card? I'm really curious about that. It's supposed to have higher bandwidth and higher clocks.
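Memory bandwidth is just the effective data rate times the bus width, so faster memory pays off regardless of what node the GPU is on. A quick sanity check; the 780 Ti line uses its published specs, while the faster-memory line is a made-up example of the headroom:

```python
# Peak memory bandwidth (GB/s) = data rate (Gbps per pin) * bus width / 8.
def bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int) -> float:
    return data_rate_gbps * bus_width_bits / 8

# GTX 780 Ti: 7 Gbps effective GDDR5 on a 384-bit bus.
print(bandwidth_gbs(7.0, 384))   # 336.0 GB/s (matches the spec sheet)

# Hypothetical faster memory on the same 384-bit bus (assumed numbers):
print(bandwidth_gbs(10.0, 384))  # 480.0 GB/s
```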
bleh. the sooner the 6gb 780ti comes out the better. maxwell isn't looking like it'll be worth it until the 900 series.
If I can buy a G-Sync monitor off the shelf, and if there is a single-GPU enthusiast-level part more powerful than the Titan (the Z isn't really a gaming part) in the new lineup, I'm still going to be building a PC later this year. My current CPU isn't quite cutting it anymore and I haven't done a completely new build in something like five years.
If not... I guess I keep waiting, but I really don't want to have to. My DK2 arrives next month and my PC needs a shot in the arm to let me play Star Citizen on it.
I would skip the 6GB 780 ti, because the ports that need that much memory have garbage optimization anyway. You would need more grunt than that.
Honestly, gsync has done more for my experience than my GPU upgrade did. BF4 and Trials Fusion went from a stuttery mess to buttery smooth.
The graphics cards keep getting more powerful, but there aren't a lot of CPUs that can even keep up without bottlenecking them.
I think a Maxwell Titan in Q1/Q2 2015 is a given, probably the first thing to be 20nm.
And I hope a Maxwell Titan-killer from AMD gets announced immediately after at half the price to knock some sense into Nvidia.
There is way too much stuff I want. I need G-Sync as well; judder drives me nuts.
So if I'm looking to upgrade from a 660 Ti for Witcher 3 and beyond, what's my best bet for a ~$300 range card? Wait until 20nm?
A second-hand 290x would be your best bet imo. A lot of them were for the mining craze and didn't get too much use.
http://www.ebay.com/sch/i.html?_sacat=0&_from=R40&_nkw=radeon+r9+290x
Yeah, as long as you stay away from the versions with the reference cooler design. That one's fucking awful. Of course, if you can get a really good deal on one you can swap the cooler.
Still depends a lot on the game and what sort of resolution you're running at.
And this will also be somewhat addressed by things like Mantle and DX12.
Wouldn't mind seeing 6- or 8-core mainstream CPUs become a thing, though.
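Resolution matters because it moves the wall: GPU time per frame scales with pixel count while CPU time mostly doesn't, so the classic test is to drop the resolution and watch whether the framerate moves. A sketch of that logic with made-up frame times, not benchmarks:

```python
# Crude bottleneck model: a frame takes roughly max(cpu_time, gpu_time).
# GPU cost scales with pixel count; CPU cost is resolution-independent.
def fps(cpu_ms: float, gpu_ms: float) -> float:
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_ms = 12.0        # assumed per-frame CPU cost (draw calls, game logic)
gpu_ms_1080p = 16.7  # assumed per-frame GPU cost at 1920x1080

# Scale GPU cost down by the pixel ratio for 1280x720.
gpu_ms_720p = gpu_ms_1080p * (1280 * 720) / (1920 * 1080)  # ~7.4 ms

print(f"1080p: {fps(cpu_ms, gpu_ms_1080p):.0f} fps")  # ~60 fps, GPU-bound
print(f" 720p: {fps(cpu_ms, gpu_ms_720p):.0f} fps")   # ~83 fps, CPU-bound
```

If the framerate barely budges when you drop the resolution, the CPU is the wall, and that per-frame overhead is what Mantle and DX12's lower-level draw calls are meant to shrink.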
Getting an Oculus DK2 within the next month and my computer is 4 years old, so I'm looking to upgrade.
I was thinking of going with a GeForce GTX 780 Ti. Should I wait for this? Will the price go down on the old one? I'm guessing the wait is going to be more than 2 months, which I don't know if I can handle.
Also, for the CPU, is an Intel Core i5-4670K Haswell Quad-Core 3.4GHz LGA 1150 good enough? I read that an i5 is what you want for gaming. Were these people crazy?
I would skip the 6GB 780 ti, because the ports that need that much memory have garbage optimization anyway. You would need more grunt than that.
Future proofing/protecting myself against Watch Dogs-tier shit ports. It can't be much worse than the 2GB I have on my 760 now anyway, and I'm sure the 780 Ti 6GB will be out soon.
JacobF confirmed the 780 Ti 6GB has been canned.
http://www.overclock.net/t/1475993/...o-6gb-update-6gb-gtx-780-ti-cancelled/1000_50