
GTX880 Rumors [New stuff: Post #231]

You mean that graphics API that has an adoption rate as good as DX 11's? And that has only one GPU manufacturer behind it?

[image: API adoption graph]

I only use Nvidia GPUs, so I might be a little bit "out of the loop," but that graph seems highly speculative. Let me guess, the graph comes straight from AMD? With such concrete information as "EXPECTED Mantle adoption" and "conservative (estimated?) Mantle adoption." I stand by my statement that it's a giant load of shit. I should ask (because I don't know): how many PC games support Mantle? All I could find on the internets doesn't seem promising:

AMD Mantle - Concerns Surrounding Adoption

It's Official: AMD's Mantle Is Doomed
AMD's Mantle may have pushed Microsoft toward improving DirectX, but it gives the company absolutely no advantage going forward. OpenGL can already deliver similar performance increases, and DirectX 12 will do the same when it's released. Developers have no reason to adopt Mantle, especially since it only works on AMD GPUs. The API will slowly drift into obscurity. Mantle is a buzzword, nothing more.
 
Forgive me for my stupid question, I am new to the PC world. How long do GPU series last? Is it a yearly thing? For example, the Nvidia 7 series: when did it release, and how long will it last before the 8 series comes?

edit : typo
GPU manufacturers generally come out with new graphics card lines every year.

But lately, many 'new' cards are just refreshed versions of an old card. The GTX770 is basically just a GTX680, for example. Or a 280X is essentially just a 7970. Gotta stay informed, cuz it's not always 100% clear.
 
Forgive me for my stupid question, I am new to the PC world. How long do GPU series last? Is it a yearly thing? For example, the Nvidia 7 series: when did it release, and how long will it last before the 8 series comes?

edit : typo
New cards come out every year or so.
 
Forgive me for my stupid question, I am new to the PC world. How long do GPU series last? Is it a yearly thing? For example, the Nvidia 7 series: when did it release, and how long will it last before the 8 series comes?

edit : typo

So technically anything 770 and below is a renamed 6xx series card. The 680 (enthusiast tier) becomes the 770 (high-end tier). Nvidia puts the new architecture in the enthusiast and the "why do you have this much money" slots.

Titan came out 2/13 and the 780 6/13 with the 780Ti 11/13.

Considering that we are now 18 months from when the last architecture was released, I would not expect the endgame cards (like the Ti series) to happen until 2015, honestly.
 
Except, you know, 2 out of 3 current-gen consoles run on AMD CPUs/GPUs.

Sony uses the PlayStation Shader Language API
Microsoft uses DirectX

Mantle wouldn't be remotely used on either one, and as such only serves purpose on PC.

With Microsoft making improvements to DirectX, which supports a wide range of different GPU manufacturers, it makes the most logical sense to just stick with that.

Mantle was a good push in the right direction, sure, but it ultimately doesn't have much of a future.
 
For the CPU, should I go for the Core i7-5960X Extreme Edition (8 cores, 16 threads) or a more overclockable 4- or 6-core Haswell / Haswell-E?

Jesus man, just buy a 4790K and buy me a PS4 with the change instead if you have that much money lying around.
 
Except, you know, 2 out of 3 current-gen consoles run on AMD CPUs/GPUs.

one of those two, the Xbox One, is made by Microsoft, which has its own graphics software library in the form of DirectX 12, which is coming to the X1 and will render Mantle superfluous. I can predict with a very high degree of certainty that future Xbox One games will use DirectX 12 rather than Mantle.

As for Sony and the PS4, they already have a low-level graphics API in the form of "GNM," which already renders Mantle redundant.

Mantle is a dead end. But you don't have to take my word for it:

Xbox One will not support AMD’s Mantle, and PS4 is also unlikely. Is Mantle DOA?
 
I guess a 4 GB GM204 / GTX 880 would be fine then. Just don't like the idea of buying a GPU right before the shift to 20nm happens.

For the CPU, should I go for the Core i7-5960X Extreme Edition (8 cores, 16 threads) or a more overclockable 4- or 6-core Haswell / Haswell-E?

4/6 then spend change on memory
 
Is there any idea how many shaders should be expected in the 880? A few articles popped up saying 1920 - 2560, but that seemed a bit low to me. Will the drop to 20nm (apparently maybe 16-18nm) increase the number of shaders that could fit on the next generation of Maxwell by much?
 
A new rumour surfaced today that the GM204 will be used in 4 SKUs - 880Ti, 880, 870, 860. This isn't much of a stretch, but it comes with another rumour that NVIDIA will skip 20nm and the refresh of Maxwell will be on a 16nm process.

http://videocardz.com/51009/nvidia-preparing-four-maxwell-gm204-skus

I don't know what a shader is physically like, but would that node shrink mean that the 980 could have ~double the shaders of the 880 / double the TFLOPs?
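The question above is answerable with back-of-the-envelope arithmetic. A minimal sketch, where the shader counts and clock speed are illustrative assumptions (not leaked specs), and real chips never hit the ideal node² density gain because power and design constraints intervene:

```python
# Back-of-the-envelope scaling, purely illustrative. The shader counts and
# clock below are assumptions, not leaked specs, and real chips never hit
# the ideal node^2 density gain (power and design constraints intervene).

def density_scaling(old_node_nm: float, new_node_nm: float) -> float:
    """Ideal transistor-density gain from a shrink: area scales ~ node^2."""
    return (old_node_nm / new_node_nm) ** 2

def peak_tflops(shaders: int, clock_ghz: float) -> float:
    """Peak FP32 throughput: 2 ops per shader per clock (one FMA)."""
    return 2 * shaders * clock_ghz / 1000

print(peak_tflops(2560, 1.0))      # 5.12 TFLOPs for a hypothetical 2560-shader 880
print(density_scaling(28, 16))     # 3.0625x ideal density gain from 28nm -> 16nm
print(peak_tflops(5120, 1.0))      # 10.24 TFLOPs if the shader count doubled
```

So yes, in the ideal case a 28nm-to-16nm shrink leaves room for roughly double the shaders and double the TFLOPs at the same clock, with area to spare.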
 
I would guess you are currently running something like an AMD 5870 or NVIDIA 480, so upgrading to something like an 870 would bring quite a leap in performance. The CPU will certainly start to lag behind, but depending on the game it might be a smaller or larger bottleneck.

I have overclocked my 920 to 3.9GHz and I would say it's easy, but it needs good cooling. Getting an overclock of 3.7GHz should be really easy if you follow a guide. An overclock of around 4.0GHz or over should be possible if you have a really good tower cooler/watercooler.

Do you have a detailed guide on overclocking the 920 specifically?

I have mine slightly overclocked, but haven't felt the need to really push it for games yet. Mine is liquid-cooled.
 
Do you have a detailed guide on overclocking the 920 specifically?

I have mine slightly overclocked, but haven't felt the need to really push it for games yet. Mine is liquid-cooled.

http://www.overclock.net/t/538439/guide-to-overclocking-the-core-i7-920-or-930-to-4-0ghz

Top result on Google and it's still quite thorough. I've had my i7 920 DO running at 4.2GHz for over a year now. It ran at 4GHz for the previous 4yrs. It becomes a power hog, but it still holds its own at those speeds.
 
Do you have a detailed guide on overclocking the 920 specifically?

I have mine slightly overclocked, but haven't felt the need to really push it for games yet. Mine is liquid-cooled.
http://www.overclock.net/t/538439/guide-to-overclocking-the-core-i7-920-or-930-to-4-0ghz

Top result on Google and it's still quite thorough. I've had my i7 920 DO running at 4.2GHz for over a year now. It ran at 4GHz for the previous 4yrs. It becomes a power hog, but it still holds its own at those speeds.
Yes, that is a good overclocking guide. You should also look for a guide specific to your motherboard, as there are settings that should be locked to a certain voltage/frequency instead of being left on auto. My settings were 21×190 with a P6T Deluxe V2, before I added more RAM and relaxed the OC:
Vcore: 1.275 V
PLL: 1.86 V
QPI/DRAM: 1.325 V
IOH: 1.18 V
ICH: 1.2 V
There were also some settings like spread spectrum that I had disabled.
Oh god how I miss the ability to have 50% overclocking potential in a normal 250€ CPU.
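For anyone new to Nehalem overclocking, the "21×190" in the settings above is just multiplier × base clock (BCLK). A quick sketch of the arithmetic, with stock i7 920 values for comparison:

```python
# Nehalem-era overclocking arithmetic: core clock = multiplier x BCLK.
# "21x190" in the settings above means a 21x multiplier on a 190 MHz BCLK.
# Memory and uncore clocks are also derived from BCLK, which is why raising
# it needs the voltage adjustments listed above.

def core_clock_mhz(multiplier: int, bclk_mhz: int) -> int:
    return multiplier * bclk_mhz

print(core_clock_mhz(21, 190))   # 3990 MHz, i.e. the ~4.0 GHz overclock
print(core_clock_mhz(20, 133))   # 2660 MHz, roughly the stock i7 920 clock
```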
 
Jesus man, just buy a 4790K and buy me a PS4 with the change instead if you have that much money lying around.

I don't so I have to start saving :P

4/6 then spend change on memory

Kinda thought something like that would be a better way to go.

Back to the topic of Maxwell, some really interesting speculation:

http://wccftech.com/nvidia-geforce-...pu-skus-rumors-point-16nm-revision-late-2015/

NVIDIA GeForce Maxwell Lineup To Feature Four GM204 GPU SKUs – Rumors Point To 16nm Revision in Late 2015

An interesting article has been written by Videocardz which points out that NVIDIA might be preparing four GM204 GPU SKUs for their next-generation GeForce Maxwell graphics cards. The GeForce Maxwell cards featuring the GM204 GPU, such as the one that was recently leaked, are reported to launch in the next few months, but that GM204 might not be the only one in the entire GeForce Maxwell stack.

If we look back at what NVIDIA did with Kepler, the report might actually be valid: NVIDIA's GeForce 600 series has three GK104 (GeForce Kepler) based GPUs (the GeForce GTX 680, GTX 670, and GTX 660 Ti), and the GeForce 700 series has two GK114 (GeForce Kepler revision) GPUs, the GeForce GTX 770 and GTX 760. Based on the 28nm Kepler architecture, the GK104 spanned quite some time as NVIDIA's top-end to mainstream chip before being replaced by the faster GK110, which debuted with the GeForce GTX Titan back in February 2013.


According to the source, and as we previously knew, the GM204 is part of the second-generation Maxwell core architecture, which will offer architectural and efficiency improvements over the first-generation Maxwell core currently available in the GeForce GTX 750 Ti and GTX 750. The GM107 is obviously an entry-level part and should not be compared to the more enhanced GM204, but the core architecture changes will carry over while keeping the same DNA: an extremely power-efficient design with more performance per watt than the current Kepler generation cards.

The GM204 will power the GeForce GTX 880, which leaked earlier in engineering form featuring 8 GB of GDDR5 memory and some other improvements we can't really decipher at the moment. In addition, there will be other GM204-based cards aside from the GeForce GTX 880, just like the cut-down GK104 GPUs in the GeForce GTX 670, GTX 760, and GTX 660 Ti. The four models will include the following:

* GeForce GTX 880 Ti
* GeForce GTX 880
* GeForce GTX 870
* GeForce GTX 860

If true, this would mean that the GeForce GTX 880 won't be the fastest graphics card in the lineup, leaving room for the GeForce GTX 880 Ti. It's hard to confirm whether this part will actually launch, and even harder to tell whether the differences will lie in the core configuration or in clock speeds. Judging from the parity in core configuration between the GeForce GTX 780 Ti and GTX 780, the core count will likely be lower on the non-Ti variant and toned down further in the lower models (e.g. GeForce GTX 870/GeForce GTX 860) to offer more competitively priced parts.

All GPUs will feature the second generation of the GeForce Maxwell core architecture and will be launched between Q4 2014 and Q1 2015, while the GeForce GTX 880 will hit shelves during summer (Q3 2014). More details on the GeForce GTX 880 can be found in the previous analysis of the PCB itself.


So all this suggests the era of the GK104 will finally come to an end, just like AMD's Tahiti is going to be replaced by Tonga in August 2014. Both GPUs have served for over two years in the graphics industry, and we can hope their successors will be true replacements for these glorified cores.


NVIDIA’s GeForce Maxwell Revision Rumored To Be Based on 16nm Process – Ditching 20nm?

The second- and first-generation Maxwell details are starting to get uncovered, but there's a revision of the NVIDIA GeForce Maxwell cards already being planned. Before beginning: all of this is just plain old rumor from SemiAccurate, who have been inaccurate regarding Denver and Sea Islands in the past but do get some pieces right, so for your own good, take this with a grain of salt.

NVIDIA's GeForce Maxwell was known to be a 20nm part since its announcement a couple of years back. However, due to process delays at TSMC, NVIDIA's schedule changed, and the first-generation Maxwell core that launched as the GM107 GPU didn't arrive as a 20nm part but was instead built on the existing 28nm process, which has been running since 2012. While TSMC now has 20nm production geared up since its Apple 20nm quota has been filled, the company is going to start manufacturing 20nm parts for the rest of the industry, which in graphics includes AMD and NVIDIA. Now there have been reports that the first- and second-generation Maxwell cores may not be built on 20nm and will instead use the existing 28nm process.

According to SemiAccurate, after second-generation Maxwell ships to the market, NVIDIA will have a third generation of Maxwell GPUs on the verge of launch. It will take around 6-10 months for NVIDIA to have these new cards ready, since they will include the flagship GM200 GPU that replaces the GK110, and the new cards will be built on a 16nm process, entirely ditching the 20nm process. This also suggests that another revision of the GM204 would launch when NVIDIA introduces the far-from-launch "GeForce 900 series" (not an official name), which will include the revised Maxwell GPUs. The GM204 is currently at revision A, so it could either move to a "B" revision, get a name change to "GM214" (similar to GK114 from GK104), or both.

There's a lot going on over at NVIDIA that we don't know about at the moment. The only thing we currently have on second-generation Maxwell is the PCB shot that leaked last week, and even its details are still not fully revealed. It's best to wait a bit longer until details on the GeForce GTX 880 start to pop up, since only then can we make a more comprehensive analysis of the GeForce Maxwell architecture and the GPUs beyond it.

* NVIDIA GM200 (Maxwell Architecture, High-Performance for Tesla/Quadro, Arrives Later for Consumers, Successor of GK110)

* NVIDIA GM204 (Maxwell Architecture, High-End Consumer, Successor of GK104, First GeForce 800 Series Products likely to feature)

* NVIDIA GM206 (Maxwell Architecture, Performance Minded, Successor of GK106, Mid-Range GeForce 800 Series products to feature)

* NVIDIA GM107/207 (Maxwell Architecture, Entry Level, Successor of GK107, Entry Level GeForce 800/700 Series To feature, Already introduced on GTX 750 Ti / GTX 750)

source
source 2
original source, SemiAccurate

These are the same kinds of speculations / questions I have myself.

We know that fully supporting the complete DX12 feature set (whatever that might be) will most likely require new hardware, not present in any current Kepler-based GPU nor in the entry-level, first-gen Maxwell GPUs now on the market. That is despite the fact that all Kepler and Maxwell (and Fermi as well?) cards will still support DX12.

I say that based on this article : http://techreport.com/news/26210/directx-12-will-also-add-new-features-for-next-gen-gpus

Will full DX12 support be included in the upcoming 28nm GM204 / 880 / 870 / 880 Ti, or will it only come with the speculated "3rd gen" Maxwell, which should at some point include a flagship GeForce GTX consumer GPU, the replacement for the GK100/GK110 used in the Titan line, meaning a Big Maxwell / GM210?

[image: a young James Clerk Maxwell]
 
Sure, because skipping a process which is already two years late for a more advanced process which is already two years late and is likely to add another two years on top of that is a smart thing to do.

Hey, NV should skip straight to 8nm! GM210 in 2030!

They will use 20nm for Maxwell chips. There is no way around this - unless you want to completely skip a generation and just sit idle for 3-4 years. 16nm, being TSMC's first FinFET process, is more than likely to be late and buggy, and it's just plain stupid to expect any high-performance GPUs using it in 2015. 2017 at best is my guess.

The only question is how long they will ride 28nm for the top end of the lineup. It'll be 20nm after that. You can pretty much count on this.
 
Sure, because skipping a process which is already two years late for a more advanced process which is already two years late and is likely to add another two years on top of that is a smart thing to do.

Hey, NV should skip straight to 8nm! GM210 in 2030!

They will use 20nm for Maxwell chips. There is no way around this - unless you want to completely skip a generation and just sit idle for 3-4 years. 16nm, being TSMC's first FinFET process, is more than likely to be late and buggy, and it's just plain stupid to expect any high-performance GPUs using it in 2015. 2017 at best is my guess.

The only question is how long they will ride 28nm for the top end of the lineup. It'll be 20nm after that. You can pretty much count on this.
Well, if Nvidia and AMD "collude" to eke out chips like they kinda sorta did this time around (2.5 years since the 7970 released...), they could totally play the waiting game for 16nm.

They couldn't care less about actually advancing the tech and providing reasonably priced products with good performance...

Man, this last generation of GPUs has really turned me sour.
 
Well, if Nvidia and AMD "collude" to eke out chips like they kinda sorta did this time around (2.5 years since the 7970 released...), they could totally play the waiting game for 16nm.

They couldn't care less about actually advancing the tech and providing reasonably priced products with good performance...

Man, this last generation of GPUs has really turned me sour.

What do you mean, collude? TSMC makes the chips for both of them.
 
What do you mean, collude? TSMC makes the chips for both of them.

I put it in quotes on purpose; of course I know that TSMC makes chips for both of them.

By collude, I mean creating market conditions and releasing chips at "strange" intervals and deliberately staged price points. From my understanding, the fact that TSMC produces both of their chips has nothing to do with why Nvidia launched with a GTX 680 or why the 780 Ti came out months after the 780.

Nor does TSMC developing both their chips explain the 7970 to 280 series rebranding.

Both companies created a stagnant market in which bullshit like that occurred and was 100% tolerated by consumers. People actually buying the Titan en masse ensured the price gouging will continue. Furthermore, the next generation of video cards will probably not launch with flagships because of the precedent this gen set.

This recent gen of GPUs led to horribly inflated prices on both sides, and initial buyers got screwed over by better cards releasing 3 months later. A trend I would like to see disappear.
 
Sure, because skipping a process which is already two years late for a more advanced process which is already two years late and is likely to add another two years on top of that is a smart thing to do.

Hey, NV should skip straight to 8nm! GM210 in 2030!

They will use 20nm for Maxwell chips. There is no way around this - unless you want to completely skip a generation and just sit idle for 3-4 years. 16nm, being TSMC's first FinFET process, is more than likely to be late and buggy, and it's just plain stupid to expect any high-performance GPUs using it in 2015. 2017 at best is my guess.

The only question is how long they will ride 28nm for the top end of the lineup. It'll be 20nm after that. You can pretty much count on this.


I think you misunderstand the situation. 16nm is actually 20nm with 3D transistors (like what Intel does on their CPUs); it's called FinFET. They started producing 20nm SoCs for Apple not long ago, and at the end of the year they will be able to make 20nm GPUs. The thing is that if they release 28nm Maxwell sooner, they can hold out 7-9 months and then jump straight to 16nm FinFET on a proven architecture. Seems plausible to me.

http://www.cadence.com/Community/bl...-ahead-for-16nm-finfet-plus-10nm-and-7nm.aspx

On top of that, the yields (the share of working dies versus the ones you make) are already on par with 20nm.

I'd take any rumors with a giant lake of salt at this point. We don't even know what the next card, releasing in under 6 months, is going to be based on, so speculating around the news is fun but pointless to a degree. Quite an exciting release.
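The yield point above is why node maturity matters so much for big GPUs. A minimal sketch using the classic Poisson die-yield model, yield = exp(-area × defect density); the defect densities and die area are invented illustrative values, not TSMC figures:

```python
import math

# Poisson die-yield model: the fraction of good dies falls exponentially
# with die area times defect density. Defect densities here are invented
# for illustration; they are not TSMC numbers.

def poisson_yield(die_area_cm2: float, defects_per_cm2: float) -> float:
    return math.exp(-die_area_cm2 * defects_per_cm2)

big_die = 4.0  # cm^2, roughly a GK110-class flagship GPU

print(round(poisson_yield(big_die, 0.1), 2))   # 0.67 on a mature process
print(round(poisson_yield(big_die, 0.5), 2))   # 0.14 on an immature node
```

Same die, five times the defect density, and suddenly most of the wafer is scrap, which is why nobody launches a flagship on a brand-new node.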
 
I think you misunderstand the situation. 16nm is actually 20nm with 3D transistors (like what Intel does on their CPUs); it's called FinFET.

Here's what I think: since Maxwell GPUs of all sorts are going to be produced for years and years to come (at least one Fermi GPU was recently released, IIRC), it's likely we will see various Maxwell GPUs on 28nm, 20nm, and later in Maxwell's life 16nm (16/20nm?) / FinFET / FinFET+.

Because Nvidia constantly changes their roadmaps, I would not count on the first graphics cards based on the Pascal GPU architecture until 2017, no matter what Jen-Hsun says about things that are so far away. I'd be shocked if we see Pascal anytime in 2016. Stacked DRAM / TSV in consumer products doesn't seem trivial.
 
Here's what I think: since Maxwell GPUs of all sorts are going to be produced for years and years to come (at least one Fermi GPU was recently released, IIRC), it's likely we will see various Maxwell GPUs on 28nm, 20nm, and later in Maxwell's life 16nm (16/20nm?) / FinFET / FinFET+.

Because Nvidia constantly changes their roadmaps, I would not count on the first graphics cards based on the Pascal GPU architecture until 2017, no matter what Jen-Hsun says about things that are so far away. I'd be shocked if we see Pascal anytime in 2016. Stacked DRAM / TSV in consumer products doesn't seem trivial.

Exactly what I'm counting on. Anyone waiting for Pascal is nuts. I just want a solid GPU with good power usage, solid performance stats, and a fair price.

If I'm lucky, I'll get 2 of those 3.
 
Exactly what I'm counting on. Anyone waiting for Pascal is nuts. I just want a solid GPU with good power usage, solid performance stats, and a fair price.

If I'm lucky, I'll get 2 of those 3.

I'd like to have a PC built around an Intel CPU with a base clock of 4.0 GHz and a GTX 880 around the end of the year. I won't be running games beyond 1080p for some time to come. I'll just need to make sure the power supply has much more capacity than I need, so that if I want to upgrade to a Big Maxwell based card a year or more later, I can.
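Sizing the PSU for that later upgrade is simple arithmetic. A sketch under stated assumptions: the component wattages below are ballpark guesses for an overclocked quad-core build with one high-end GPU, not measured figures, and the 1.5x headroom factor is one common rule of thumb, not a standard:

```python
# Rough PSU sizing sketch. Component wattages are ballpark assumptions,
# not measurements; the 1.5x headroom factor leaves margin for a future
# GPU upgrade and keeps the PSU in its efficient mid-load range.

def recommended_psu_watts(component_watts: dict, headroom: float = 1.5) -> int:
    return round(sum(component_watts.values()) * headroom)

build = {
    "overclocked_quad_core_cpu": 130,  # assumption
    "high_end_gpu":              250,  # assumption
    "board_ram_drives_fans":      80,  # assumption
}
print(recommended_psu_watts(build))    # 690 -> shop for a ~700-750W unit
```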
 
[image: performance comparison chart]


Kind of a guess as to the performance increase to come with the 880. At this point I want to rebuild my rig really bad, but I'm not dropping a dime until the 880 releases. I'm hearing recent info saying as early as Q3 2014 (which is anytime now) all the way until early 2015. The most popular time frame seems to be in the span of Q4 2014. This waiting game Nvidia is playing is driving me nuts. I can remember back in 2013 when everyone was saying Q1 2014...
 
I think you misunderstand the situation. 16nm is actually 20nm with 3D transistors (like what Intel does on their CPUs); it's called FinFET. They started producing 20nm SoCs for Apple not long ago, and at the end of the year they will be able to make 20nm GPUs. The thing is that if they release 28nm Maxwell sooner, they can hold out 7-9 months and then jump straight to 16nm FinFET on a proven architecture. Seems plausible to me.

http://www.cadence.com/Community/bl...-ahead-for-16nm-finfet-plus-10nm-and-7nm.aspx

On top of that, the yields (the share of working dies versus the ones you make) are already on par with 20nm.

I'd take any rumors with a giant lake of salt at this point. We don't even know what the next card, releasing in under 6 months, is going to be based on, so speculating around the news is fun but pointless to a degree. Quite an exciting release.

I know that. I also tend to expect a lot of problems from any new production process, be it "just 20nm with FinFETs" or not. TSMC has already moved on to 16FF+ - with no 16FF products in sight - which only tells me that the 16nm situation is far from ideal, and skipping a production-ready 20nm in its favour is plainly stupid.

It's a no-brainer to expect better prices and yields from 20nm first and 16nm later. Considering that no GPU maker has yet moved any chip to 20nm, waiting for 16nm sounds like a recipe to ride GK104/Tahiti/GK110/Hawaii for another 2-3 years. And I'm 100% sure that they'll one-up each other in that waiting - if NV decides to wait, AMD won't, and vice versa. Thus no waiting is possible at all. They will use 20nm. The question is: will they use it for top chips, or only for their mobile/laptop parts?
 
It's pretty clear most Maxwell GPUs will be on 28nm and on a conventional 20nm process.

As for very late in Maxwell's commercial life, when the first Pascal GPUs are coming out, who knows.

As for Nvidia Pascal, I'd expect graphics cards based on that architecture to be in widespread use by the time the PS5 and Xbox Next launch late this decade using next-gen AMD APUs. So yeah, holding out for Pascal in 2014 would be sheer insanity.
 
God, it seems like my 7870 was the best purchase of all time. It provided a significant jump, was decently priced, and now the whole industry seems to have ground to a near halt, letting it carry me halfway through the PS4 generation before it's worth upgrading.
 