
Rumor: Nvidia adopting AMD's GPU strategy next year, massive die to return in '13?

artist

Banned
This was already posted but got lost in the PC thread. This would be quite a shocking turn of events if true; besides, VR-Zone tends to be credible when it posts Nvidia-related rumors.

Report: NVIDIA 28nm Desktop GPU Roadmap

Japanese website 4Gamer has revealed a detailed roadmap for NVIDIA's upcoming 28nm Kepler GPU line-up for desktops. As previously rumoured, the Kepler roll-out will indeed be bottom-to-top, starting with the mainstream GK107 chip in Q2 2012 and ending with the high-end GK110/2 parts later in the year. All Kepler GPUs are manufactured on TSMC's 28nm process, use GDDR5 memory and feature DirectX 11.1 support, ready for Windows 8.

[4Gamer roadmap image]


First to release will be GK107, a budget mainstream GPU. There's unlikely to be a Kepler-based entry-level GPU. GK107 features a 128-bit memory interface and is expected first in notebooks, with the desktop release sometime in Q2 2012. Unlike the other GK chips (which support PCI Express gen 3), GK107 only features PCI Express 2.

GK106 will be the mainstream performance part, replacing the GeForce GTX 560, featuring 256-bit memory. According to the roadmap, it is expected to release in late Q2 2012, not long after the GK107 release.

In 2012, NVIDIA is expected to embrace AMD's sweet-spot strategy, with no massive die. Instead, the top single-GPU part will be GK104, featuring 384-bit 1.5 GB GDDR5 memory. GK104 is said to push out 2 TFLOPS, 30% higher than the GTX 580. However, despite the smaller die, 4Gamer claims it consumes over 250W. GK104 will release bang in the middle of 2012, perhaps around Computex. Following right after GK104 will be GK110 - a dual-GK104 flagship, thus completing NVIDIA's line-up for most of 2012 - remarkably similar to AMD's sweet-spot strategy.

However, not for long, as late in 2012 / early 2013, NVIDIA plans to return to a massive die with GK112, featuring 512-bit memory.

Details about Kepler remain sketchy, and such reports must be considered as preliminary estimates, at best. Meanwhile, AMD is rumoured to be preparing Southern Islands / HD 7000 for a Q1 2012 release, well before high performance Kepler GPUs reportedly hit the desktop market.

Read more: http://vr-zone.com/articles/report-nvidia-28nm-desktop-gpu-roadmap/14067.html

Cliff notes of this rumored roadmap:

GTX690
Massive die with 512-bit memory interface
Dec '12 - Jan '13 release

GTX680
2 x GTX670 chips, a la the GTX590
Launch after GTX670

GTX670
Juneish release
30% faster than GTX580
Consumes more than 250W
384-bit memory interface

GTX660
May release
256-bit memory interface
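For what it's worth, the "2 TFLOPS, 30% over the GTX 580" claim roughly checks out against simple peak-throughput arithmetic. A back-of-envelope sketch (the GTX 580 figures are its published specs; the GK104 number is just the rumour scaled up, not anything confirmed):

```python
# Peak single-precision throughput: cores x shader clock x 2 FLOPs/clock (FMA).
def sp_tflops(cores, shader_clock_mhz, flops_per_clock=2):
    """Peak single-precision TFLOPS for a given core count and shader clock."""
    return cores * shader_clock_mhz * 1e6 * flops_per_clock / 1e12

# GTX 580: 512 CUDA cores at a 1544 MHz shader clock.
gtx580 = sp_tflops(512, 1544)       # ~1.58 TFLOPS
rumoured_gk104 = gtx580 * 1.30      # +30% -> ~2.06 TFLOPS, close to the quoted "2 TFLOPS"

print(f"GTX 580: {gtx580:.2f} TFLOPS, rumoured GK104: {rumoured_gk104:.2f} TFLOPS")
```

So the two numbers in the rumour are at least internally consistent with each other.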
 

artist

Banned
Basically, isn't this the trade-off of yield vs. power consumption?
It could also be a trade-off of not much competition from AMD. With AMD moving to a new architecture next generation, Nvidia might feel comfortable with a 30% increase over the 580. AMD will need over a 50% improvement on the 6970, and that would be a big feat.
 

Reallink

Member
1.5GB isn't really going to cut it going forward, is it? I thought BF3 and Rage were already starting to experience issues on 1-1.5GB cards. I don't see that holding up at all once new consoles come out and asset quality increases exponentially.
 

Smokey

Member
Don't forget to buy one of these:
[image of a high-wattage Corsair PSU]
I run 580 SLI on 850W, so I doubt that's needed :p.

Sounds like I won't be upgrading anytime soon. Not going with a dual GPU on a single card. Meh. Who knows, maybe I'll cave, but it looks like 2013 at the earliest for me.

I've got the 3GB 580s, and the higher cards need to come with at least 2.5GB, if not 3GB. There are certainly cases where it helps.
 

Dennis

Banned
1.5GB isn't really going to cut it going forward, is it? I thought BF3 and Rage were already starting to experience issues on 1-1.5GB cards. I don't see that holding up at all once new consoles come out and asset quality increases exponentially.

1.5 GB was too low the day the GTX 580s came out.

With the flagship card you should never even have to consider that you don't have enough VRAM.
 

BurntPork

Banned
I'd say "shut up and take my money," but Nvidia doesn't have the self-control for me to trust them with that. :p That said, only a 30% increase over the GTX 580 at 28nm and 250W? That's not bad, but it's not particularly impressive either.
 

StevieP

Banned
I'd say "shut up and take my money," but Nvidia doesn't have the self-control for me to trust them with that. :p That said, only a 30% increase over the GTX 580 at 28nm and 250W? That's not bad, but it's not particularly impressive either.

You're not one of the folks who thought the shrink to 28nm would net a 60% performance increase with half the power consumption, were you?
 

BurntPork

Banned
You're not one of the folks who thought the shrink to 28nm would net a 60% performance increase with half the power consumption, were you?

Of course not. I guess I took the claims of a 50% increase at the same level of power consumption too seriously, though.

But anyway, I'm not disappointed, but I'm definitely not blown away either. It's too bad that we have to wait an extra year to see the true beast. :(
 

1-D_FTW

Member
Who the hell is 4Gamer? And why should I believe anything they say when claiming a mainstream part is going to use 250 watts?

I'll wait till next year before making any judgements. Silly season only brings out sites that have advertising agendas, are just plain ignorant, or are over-reliant on sourcing foreign sites that could be 100 percent nonsense for all any non-native speaker knows.

EDIT: I guess it's not the mainstream part, but these timelines still don't make sense. Mainstream is at the end of the year, ramping up production by power, but the flagship is coming mid-2012. Shit doesn't make any sense.
 

Izayoi

Banned
What the fuck? Only 1.5GB? We've long-since hit that barrier. Guess I'll be waiting for non-reference designs with higher RAM. What are they thinking?
 

Sinatar

Official GAF Bottom Feeder
What the fuck? Only 1.5GB? We've long-since hit that barrier. Guess I'll be waiting for non-reference designs with higher RAM. What are they thinking?

It's a rumor about a card that isn't due for another 8 months. Relax.
 

-COOLIO-

The Everyman
I also feel like I can't get excited about future cards unless there's something to tax them besides three screens of Battlefield 3 on Ultra at 2500x1600.
 
You're going to turn your computer on and it's going to sound like a Proton Pack charging.

That would actually be a great selling point.

At LAN parties you could each start up your computers...

"Doooo..."
"Reeee..."
"Egonnnn!"

And then the neighborhood has a black-out.
 

Izayoi

Banned
You ain't happy with your 580 sli?
I would be if I had two 3GB cards. As it is, all of that extra horsepower is completely useless because of the VRAM barrier.

Been strongly considering selling my 1.5GB cards and moving up, and this news only makes the proposition more appealing.
 
Me too. What an unnecessary buy that was.

so? it will last you for years and years.

I would be if I had two 3GB cards. As it is, all of that extra horsepower is completely useless because of the VRAM barrier.

Been strongly considering selling my 1.5GB cards and moving up, and this news only makes the proposition more appealing.

been thinking about that too, and buying two MSI Lightnings when I buy the 30-inch monitor.

512-bit means it will be 2 or 4GB.

oh, ok. good thing then.
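The 2-or-4GB reasoning follows from how GDDR5 is wired up: each chip has a 32-bit interface, so a 512-bit bus takes sixteen chips, and the common chip densities at the time were 1 Gbit and 2 Gbit. A quick sketch of that arithmetic (chip densities assumed, not from the roadmap itself):

```python
# Memory capacity implied by a given bus width and GDDR5 chip density.
def capacity_gb(bus_width_bits, chip_density_gbit, chip_width_bits=32):
    """Total VRAM in GB: one chip per 32-bit channel, 8 Gbit per GB."""
    chips = bus_width_bits // chip_width_bits
    return chips * chip_density_gbit / 8

print(capacity_gb(512, 1))  # 16 x 1 Gbit chips -> 2.0 GB
print(capacity_gb(512, 2))  # 16 x 2 Gbit chips -> 4.0 GB
print(capacity_gb(384, 1))  # 12 x 1 Gbit chips -> 1.5 GB, the rumoured GK104 config
```

The same formula also explains the GK104 rumour: a 384-bit bus with 1 Gbit chips gives exactly the 1.5GB everyone is complaining about.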
 

derFeef

Member
so? it will last you for years and years.

Fair enough, but hopefully the connectors stay the same. Just saying that a 650W would have been sufficient, and seeing that power consumption is going down and I am not a fan of multi-GPU (anymore), it is a bit overblown.
 
Fair enough, but hopefully the connectors stay the same. Just saying that a 650W would have been sufficient, and seeing that power consumption is going down and I am not a fan of multi-GPU (anymore), it is a bit overblown.

Oh, I don't think connectors will ever change their physical form (hopefully),
but a good 650W PSU is still good for 90% of systems today.
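A rough power budget shows why 650W covers most single-GPU builds. The component figures below are typical estimates for a high-end system of this era, not measurements:

```python
# Back-of-envelope system power budget vs. a 650 W supply.
components = {
    "GPU (high-end, ~250 W TDP)": 250,
    "CPU (overclocked quad-core)": 130,
    "Motherboard, RAM, drives, fans": 80,
}

total = sum(components.values())   # estimated peak draw in watts
headroom = 650 - total             # margin left on a 650 W unit

print(f"Estimated draw: {total} W, headroom on 650 W: {headroom} W")
```

Even with generous estimates there's well over 100W of margin, which is why the 850W-plus units really only matter for SLI setups.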
 

Izayoi

Banned
You can never have too solid a power supply. If anything, it will just ensure that your system gets strong, steady, clean power, which is exactly what's best for it.
 