I hope the used 980 Ti market tanks and the cards become worthless (so I can buy them for pennies).
When is the Polaris 10 rumored to be releasing?
Me too. Why didn't you show all the results?
Maybe because it shows a 20% advantage for the 1080 while the others show only 12-16%?
Compared to the 40%-plus advantage in other games on the same site, this is significant. Don't you think?
I hope the used 980 Ti market tanks and the cards become worthless (so I can buy them for pennies).
Nvidia will make sure of this once they start writing the new drivers for the new cards, making people buy Pascal or bust.
So what you're saying is, Nvidia will lower 980 Ti performance once the new Pascal drivers arrive?
I was replying to a benchmark for Hitman running in DX11 at 1440p, so I quoted the same test from a site that is very trustworthy. You can see a lot of the cards match up, except the Fury X, which on the one I quoted was at 61 fps vs. TPU's result of 51.7 fps. That's a 10 fps disparity on the AMD side alone, which I felt was unusual.
That's fine, but you can't ignore the Fury trailing the 1080 by 40% in other benches yet only 15% in Hitman. This points towards a Fury advantage in async.
Unless you are a competitive gamer and need insanely low input lag, there are plenty of high-quality 4K monitors or 4:4:4 4K 40" TVs that are wider than any ultrawide, for less money.
As I browse neogaf on my 65 inch curved 4k tv. Witcher 3 4k60 is bonkers.
Question about Vega: I could have sworn AMD got rights to early batches of HBM2 since they used HBM1 already, or am I wrong? That's the only way I can see them getting Vega out in October, unless they switched to GDDR5X.
All I know is if those rumors are true, they better announce soon before people like me buy up 1080s.
Hmm, to be fair, we can then discount Project Cars for favoring Nvidia. I did point out a page back that Hitman favored AMD heavily, another reason why I think it's a poor benchmark for new hardware.
Last page has a few posts on this. Like I said in one of them, I'm not one to believe in planned obsolescence, but in this case, Nvidia's case, I found damning evidence of them fucking up Kepler for the benefit of Maxwell cards. Just a few Google searches and you can find benchmarks and numbers to support the fact that Nvidia hobbled their old architecture in favor of the new one.
I did point out a page back that Hitman favored AMD heavily, another reason why I think it's a poor benchmark for new hardware.
So a benchmark that uses async and favors AMD is a poor benchmark? Why would you think that? AMD developed their hardware towards that particular feature. A feature that will be utilized heavily in DX12, but is somehow a poor benchmark? Come on now.
Do you disregard Doom benchmarks that favor Nvidia?
Unfortunately for AMD, sheer brute force easily overcomes whatever difference there is in implementation of async compute. The 1080 easily crushes the Fury X in Ashes of the Singularity and even hands the Fury X its ass in Hitman, a game AMD actually wrote the async compute code for themselves.
I thought the Metro: Last Light benchmarks were more interesting (odd), in that they have a 980 Ti (OC) and a Titan X outperforming the new 1080 at all resolutions tested, using DX11 obviously. Some weird results among all the tests, for sure.
Last page has a few posts on this. Like I said in one of them, I'm not one to believe in planned obsolescence, but in this case, Nvidia's case, I found damning evidence of them fucking up Kepler for the benefit of Maxwell cards. Just a few Google searches and you can find benchmarks and numbers to support the fact that Nvidia hobbled their old architecture in favor of the new one.
To be fair, last year he was leaning team red for his 4K setup, then AMD just HDMI 1.4'ed him. Maybe you should wait for AMD to release their cards before thumping your chest, geesh... you make it sound like these benchmarks are tit for tat when in reality only Nvidia has shown their cards. Sometimes it's impossible to hide inner fanboyism. No offense, but some of your posts reek of it.
This is a big fat no for me. Card still doesn't handle 4K correctly. Although the gains are not too bad on the 980Ti the price is definitely shitty.
G-sync + 980 Ti @1440p till 1080Ti hits!
I'm quite disappointed at this: after halving the process node, improving the architecture, adding many improvements along the way, this card comes at a higher TDP than the previous one and performs a bit better.
Chris, please do a review with GTX 980, 980 Ti and 1080 with the last one clocked exactly like a 980. I'd really like to see how much of this 15-FPS-avg gain is thanks to that 40%-ish increase in clock speed.
Lower power consumption... well, the 980 stood around the 300 W mark under load and the 1080 appears to spend most of its time around the 250 W mark. It's probably even better if we negate that clock speed advantage. But what about its performance in that case?
What are the actual architectural gains? Of course a 3.7GHz Pentium 4 is gonna be faster than a 2.6GHz one. We all know that... Combining clock speed improvements with architectural gains masks the actual gains, but I believe a clock-for-clock comparison would be very useful.
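For anyone who wants to do that clock-for-clock estimate on paper, here's a minimal sketch. The frame rates below are made up for illustration; the boost clocks are the reference-card specs, and it assumes performance scales linearly with core clock, which is only roughly true:

```python
# Rough clock-for-clock comparison (illustrative numbers, not measured data).
# Assumes performance scales linearly with core clock, so this only
# approximates the architectural (per-clock) gain.

def clock_normalized_speedup(fps_old, clock_old_mhz, fps_new, clock_new_mhz):
    """Speedup left over after removing the clock-speed advantage."""
    raw_speedup = fps_new / fps_old
    clock_ratio = clock_new_mhz / clock_old_mhz
    return raw_speedup / clock_ratio

# Example: GTX 980 (1216 MHz reference boost) vs GTX 1080 (1733 MHz),
# with illustrative frame rates.
print(clock_normalized_speedup(60.0, 1216, 90.0, 1733))  # ~1.05
```

With those made-up numbers, a 50% raw speedup shrinks to roughly a 5% per-clock gain once the ~43% clock advantage is factored out, which is the kind of comparison the post above is asking for.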
I see the 1080 as a disappointment in performance. Clock for clock it's not much faster than the 980, and part of the performance increase actually comes from the new image compression technique (according to Nvidia's slides, image compression alone increases performance by something like 30%). The good things are better frame times and much lower power consumption.
Last page has a few posts on this. Like I said in one of them, I'm not one to believe in planned obsolescence, but in this case, Nvidia's case, I found damning evidence of them fucking up Kepler for the benefit of Maxwell cards. Just a few Google searches and you can find benchmarks and numbers to support the fact that Nvidia hobbled their old architecture in favor of the new one.
That's so evil, holy shit. They need to be called out on this more.
My Twin Cities store had multiple open-box EVGA and MSI cards under $400. But yeah, now the new ones are back up to regular price.
So a benchmark that uses async and favors AMD is a poor benchmark? Why would you think that? AMD developed their hardware towards that particular feature. A feature that will be utilized heavily in DX12, but is somehow a poor benchmark? Come on now.
Do you disregard Doom benchmarks that favor Nvidia?
That's what everyone said in the thread we had about that: "OMG it's true, now nothing will happen and everything will stay the same"
After reading articles, benchmarks, etc., I'll just wait and get the 1070. Even if it isn't happening on purpose ($$$LOL$$$), I won't risk it again.
In this day and age, things like that can easily backfire. But I guess it just wasn't that big a deal unlike the 3.5 GB 970 debacle.
No single GPU 4K means I sit this one out.
GTX 780 is killing me.
It's not a big deal because, unlike the 970 VRAM issue, there's no substance behind the claim.
http://www.neogaf.com/forum/showpost.php?p=193814937&postcount=221
http://www.hardwarecanucks.com/foru...iews/70125-gtx-780-ti-vs-r9-290x-rematch.html
http://www.bytemedev.com/the-gtx-780-ti-sli-end-of-life-driver-performance-analysis/
Won't stop the FUD being spread, the meme is out there and rebuttals never spread as far or fast as the original scandal.
https://youtu.be/xtely2GDxhU?t=7610
Tom Petersen from Nvidia discussed the Founders Edition a bit in this video (around 2:09:00); he says they intended it to be in the middle of the product stack price/quality-wise.
Lol, you guys are complaining about updating your 980 cards. I have a 560 Ti.
Well, that settles it. It's not a big deal because, unlike the 970 VRAM issue, there's no substance behind the claim.
Debating whether I should get a 1440p or 4k monitor with a 1080. It seems a single GPU still can't get 4k/60 on most modern games.
So I'm wondering: how does dropping a game's resolution to 1440p look on a 4K monitor?
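A quick aside on the scaling math behind that question (just a sketch of the arithmetic, not benchmark data): 1440p doesn't divide evenly into a 2160p panel, so the display has to interpolate, whereas 1080p scales by a clean 2x per axis. That's why 1440p often looks a bit softer on a 4K screen than you'd expect:

```python
# How common render heights map onto a 4K (3840x2160) panel.
# A non-integer ratio means rendered pixels fall across fractional
# panel pixels, forcing the scaler to interpolate (blur).
for height in (1080, 1440, 2160):
    ratio = 2160 / height
    kind = "integer (clean)" if ratio.is_integer() else "non-integer (interpolated)"
    print(f"{height}p -> {ratio:.2f}x scale: {kind}")
```

In short: 1080p is a clean 2.00x, 1440p is an awkward 1.50x, and native 2160p is 1.00x.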
When is Kaby releasing?
Wait until Kaby Lake, then buy the i7-7xxxk.
That depends on your current GPU.
When is Kaby releasing?
At work atm, can't watch the video, but does this mean that overclocked 3rd-party 1080 models with good cooling (like the Gigabyte G1 Gaming series) will be priced above 800? I already read some rumors that 3rd-party prices could go up to 900.
Hmmmm, after looking at quite a few benchmarks, an OC'd 980 Ti is pretty close to the 1080, at least as far as I'm concerned and by my definition of "close". Will wait another card generation or two, methinks.