Hopefully, this will bring more interesting chips for the mid-range sector as well. It's been what... 3 years, and the 7850/7870 still can't be beat.
The R9 280 or even R9 280X can be found for 150-180. Basically mid-range.
I'm not sure about games; I think game engines are designed with the bandwidth limit in mind. But when it comes to compute stuff, memory management is a big issue, and bandwidth is always the limiting factor.
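To make that concrete, here's a minimal sketch (assuming a CUDA-capable card and toolkit; the kernel name and buffer sizes are made up for illustration) of why copy-style compute work is bandwidth-bound: the kernel does almost no math, so its runtime is set entirely by how fast memory can be read and written.

```cuda
// Minimal sketch, assuming a CUDA-capable GPU and the toolkit installed.
// A pure copy kernel does ~0 FLOPs per element, so its runtime is set
// almost entirely by memory bandwidth -- the "bandwidth is limiting" case.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void copyKernel(const float* in, float* out, size_t n) {
    size_t i = blockIdx.x * (size_t)blockDim.x + threadIdx.x;
    if (i < n) out[i] = in[i];  // one read, one write, no math
}

int main() {
    const size_t n = 1 << 26;  // 64M floats = 256 MB per buffer (illustrative)
    float *in, *out;
    cudaMalloc((void**)&in, n * sizeof(float));
    cudaMalloc((void**)&out, n * sizeof(float));

    cudaEvent_t start, stop;
    cudaEventCreate(&start);
    cudaEventCreate(&stop);

    cudaEventRecord(start);
    copyKernel<<<(n + 255) / 256, 256>>>(in, out, n);
    cudaEventRecord(stop);
    cudaEventSynchronize(stop);

    float ms = 0.0f;
    cudaEventElapsedTime(&ms, start, stop);
    double gbMoved = 2.0 * n * sizeof(float) / 1e9;  // read + write
    printf("Effective bandwidth: %.1f GB/s\n", gbMoved / (ms / 1e3));

    cudaFree(in);
    cudaFree(out);
    return 0;
}
```

If the printed number lands near the card's rated memory bandwidth, the kernel is bandwidth-bound, and no amount of extra shader power would speed it up; that's the kind of workload HBM should help.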
For most current applications, and especially gaming, it doesn't make much of a difference, but there are obviously scenarios where 8GB will be an improvement over 4.
But the point is, the average consumer may not know this, and even if they do, 12GB is likely to be subconsciously more alluring.
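For what it's worth, whether the extra VRAM actually helps comes down to whether the working set fits. A hedged sketch of the check (CUDA again; the ~6 GB working-set figure is made up purely for illustration):

```cuda
// Hedged sketch: does the working set actually fit in VRAM? The ~6 GB
// figure below is hypothetical, not from any real game.
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    size_t freeB = 0, totalB = 0;
    cudaMemGetInfo(&freeB, &totalB);
    printf("VRAM: %.2f GB free of %.2f GB\n", freeB / 1e9, totalB / 1e9);

    size_t workingSet = 6ull << 30;  // hypothetical ~6 GB of assets
    if (workingSet > freeB)
        printf("Doesn't fit: expect paging over PCIe (i.e., stutter).\n");
    else
        printf("Fits entirely in VRAM: extra capacity wouldn't help here.\n");
    return 0;
}
```

In other words, 8GB over 4GB only matters once a game's resident assets cross the 4GB line; below that, the bigger number is mostly marketing.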
What about driver issues though?
When I built my PC I used an ATI card, and it had a fault where it couldn't run 3D games. I have no idea why, but it just shut down on games like Borderlands.
Ended up getting an Nvidia card with the same level of power, and it ran Crysis and Borderlands at max with no problems.
Would be wary to go back to AMD for graphics cards, but their prices are always so tempting.
Such a big jump. I wonder how much the reduction in wire diameter will improve signal speeds due to reduced parasitic capacitance, or if that's negligible.
Still 28nm. There won't be 20nm this year, but 16nm is expected for next year.
Good single GPU 4K performance.. but it's AMD...
nvidia drivers still prevail
Haha
From the link in the OP:
I own a GTX 970 but it's great to see AMD light a fire under Nvidia's ass.
Now hopefully the heat/noise/power consumption aren't as atrocious as the 290X.
edit: multiple sources are telling me that nVidia doesn't give a fuck and will probably price the Titan X at over $1000 anyways. I know my next card will be an AMD with HBM memory. nVidia can stick their ludicrous premiums where the sun don't shine
Amen. There's a reason nobody's doing business with Nvidia on the console side.
I thought 980 Ti wasn't possible?
Interesting. Funny thing is, my experience as a developer has been the opposite. On the current project the bug count is 2 to 1 in favor of AMD. Although to be fair, the AMD 'bug' was a mistake I made that the driver didn't handle properly: a compute kernel infinite loop that took down my PC (sketched below). The NVIDIA bugs were both shader compiler issues that were a pain to work around, things like a very specific type of texture gather returning just one value instead of four, etc.
There are things AMD cards just handle better, too; for example, large-texture UV precision is dramatically better on AMD.
Same experience with past projects. I think my all-time favorite was from *way* back in the day on a GeForce 2: I once had MSN Messenger appear in water reflections.
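For the curious, here's a hypothetical reconstruction of the infinite-loop bug class mentioned above; this is NOT the actual project code, and the names and the iteration guard are mine. The point is that a kernel waiting on a condition that never becomes true spins forever, and it's then up to the OS display watchdog and the driver to recover gracefully, which some don't.

```cuda
// Hypothetical sketch of the bug class, not the real code. The kernel
// waits on a flag nothing will ever set; without the guard it spins
// forever and the OS display watchdog (TDR on Windows) resets the GPU,
// taking the desktop down with it.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void spinUntilReady(volatile int* flag, long long maxIter) {
    long long guard = 0;
    while (*flag == 0 && guard < maxIter)  // drop the guard -> infinite loop
        ++guard;
}

int main() {
    int* flag;
    cudaMallocManaged((void**)&flag, sizeof(int));
    *flag = 0;  // the buggy input: nothing ever sets this
    spinUntilReady<<<1, 1>>>(flag, 1000000LL);  // the guard bails us out
    cudaDeviceSynchronize();
    printf("Kernel returned only because of the iteration guard.\n");
    cudaFree(flag);
    return 0;
}
```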
AMD with the elbow drop.
So, to clarify, the 980 SLi 4*4GB is 980s in 4-way SLi?
No, it's supposed to be single SLI; otherwise the perf would make no sense (4-way SLI with only 1.6x the performance of a single card?!). The R9 295 is a dual-GPU card and also says 4*4; it's just a confusing way of noting it. The chart is probably bogus anyway.
It's impressive that the 4GB Fiji card possibly bests the 12GB Titan X.
But I think it's important that AMD gets an 8GB 390X out there ASAP to really put the heat on Nvidia. Even though an 8GB version of the 390X will likely only yield a 5-10% improvement over a 4GB version, it's important for marketing purposes, so that Nvidia can't pull the wool over the eyes of the uninformed consumer who sees 12GB of memory as superior to something lower from AMD.
Or maybe people should pay attention to the driver situation and be willing to see that AMD has improved their drivers recently. In effect, statements like this completely validate the pricing strategy Nvidia is using on their captive market.
Well, we'll see if this turns out to be true. In real-world benchmarks, for the most part, 3- and 4-way SLI shows very poor scaling past 2-way (of course, it varies with the game/application). The 390X is looking like a winner according to these numbers, but then again, AMD.
Actually, in many games 3- and 4-way SLI doesn't really give THAT much of an improvement over 2-way. It also depends on the resolution you're testing at. See: http://www.pcper.com/reviews/Graphi...ay-and-4-Way-SLI-Performance/Metro-Last-Light
So it all depends on what they were using to test. Like you said, this is probably bogus anyways. We'll have to sit and wait for official benchmarks.
And every year I hear it isn't as bad as people make it seem.
Better, but they still suck on the OpenGL front, which is critical for things like Dolphin Emulator.
I'll wait for confirmation on power consumption and noise levels, things AMD has had trouble with in recent times.
AMD may not release drivers as often, but they're for damn sure more stable and reliable than what Nvidia is putting out. Fuck what anyone else tells you.
LOL. I literally hear this every year, like it's a new thing.
I'm sorry. What? Drivers are always pertinent (for PC gaming).
You trippin'
You must know nothing about NVIDIA.
AMD, going all in. It's just a shame HBM isn't available in higher capacities yet. 8GB of that could have really put the Titan X in a bad position; now people can just make the 'well, more RAM' argument.
Have rumoured pre-release benchmarks been accurate in the past?
It's only been a month since I had to switch to Nvidia because of the problem. Fool me once, shame on you, etc., etc.
If DX12 moves work the driver used to do into the game, isn't that a large burden for developers? I would think it means whoever has the strongest developer support wins, rather than whoever has the best hardware. Either that, or there will be a slow transition.
There is still hope!