
Nvidia GTX 6xx/7xx Launch Thread of Swirling Kepler Rumors

dr_rus

Member
$299 for a card that can beat a 7970 is the part that was in question.
7970 is a ~300mm^2 chip. It can and should cost $299 (right now it costs way more than AMD's own Cayman, which was a lot bigger). Can a chip of the same complexity beat it at the same price point, or is that somehow too good to be true?

Also, how do I know that all the posts "were coming from newly signed up members and from the same IP subnet"? Why should I trust this allegation when it may well come from AMD's shills themselves? PR is everywhere. Use your own brains.
 

artist

Banned
7970 is a ~300mm^2 chip. It can and should cost $299 (right now it costs way more than AMD's own Cayman, which was a lot bigger). Can a chip of the same complexity beat it at the same price point, or is that somehow too good to be true?
Only if we see things in a vacuum, which we can't. Wafer costs for new nodes are always higher than for older ones, so the comparison with Cayman is moot.
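
A rough back-of-the-envelope sketch of that cost trade-off; the die areas, wafer size, and 28 nm wafer-cost premium below are illustrative assumptions, not actual foundry numbers:

```python
import math

# Illustration only: assumed die areas and an assumed wafer-cost premium.
def gross_dies_per_wafer(die_area_mm2, wafer_diameter_mm=300.0):
    """Classic gross-die approximation; ignores defect yield and scribe lines."""
    d = wafer_diameter_mm
    return math.pi * (d / 2) ** 2 / die_area_mm2 - math.pi * d / math.sqrt(2 * die_area_mm2)

cayman_mm2 = 389        # Cayman (HD 6970), 40 nm
tahiti_mm2 = 300        # Tahiti, as claimed above, 28 nm
wafer_cost_ratio = 1.5  # assumption: a new 28 nm wafer costs 1.5x a mature 40 nm wafer

cayman_dies = gross_dies_per_wafer(cayman_mm2)
tahiti_dies = gross_dies_per_wafer(tahiti_mm2)
relative_die_cost = (wafer_cost_ratio / tahiti_dies) / (1.0 / cayman_dies)

print(f"Cayman: ~{cayman_dies:.0f} gross dies/wafer, Tahiti: ~{tahiti_dies:.0f}")
print(f"A Tahiti die costs ~{relative_die_cost:.2f}x a Cayman die under these assumptions")
```

The point being that a smaller die does not automatically mean a cheaper chip once the newer wafer's premium is factored in.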

Also, how do I know that all the posts "were coming from newly signed up members and from the same IP subnet"? Why should I trust this allegation when it may well come from AMD's shills themselves? PR is everywhere. Use your own brains.
Because it's the forum owners saying so? I mean, why would they suddenly make such allegations against Nvidia when clearly their revenue comes from information/leaks on both AMD and Nvidia products?

Suppose tomorrow Anand from Anandtech makes a similar claim; why wouldn't you trust him?

CORE CLOCK
Comparing architectures by artificially clocking a selected GPU domain? Sounds pretty loaded.
 

Hellish

Member
Comparing architectures by artificially clocking a selected GPU domain? Sounds pretty loaded.

You are a nut case. It is a simple comparison, with the Ultra bumped up to 925 MHz from 900 MHz, just to see how well they stack up. Both are $550, and yes, they are different architectures; it was just 1 of 3 ways to get the greatest scope of how well these cards truly compare to one another.
 

artist

Banned
You are a nut case. It is a simple comparison, with the Ultra bumped up to 925 MHz from 900 MHz, just to see how well they stack up. Both are $550, and yes, they are different architectures; it was just 1 of 3 ways to get the greatest scope of how well these cards truly compare to one another.
This might just blow your mind, but architectures are designed around different constraints, clocks being one of them. You can't design a Goliath chip like the GF110 to run at clocks of 900+ MHz, but you can design a GF114 to run at those frequencies.

Yeah, being reasonable makes one a nut case these days.

Will these cards use DX 11.1?
No official confirmation, but yes.
 

dr_rus

Member
Only if we see things in a vacuum, which we can't. Wafer costs for new nodes are always higher than for older ones, so the comparison with Cayman is moot.
Even in that case, what's too good to be true about a chip of the same complexity on the same process being faster?

Because it's the forum owners saying so? I mean, why would they suddenly make such allegations against Nvidia when clearly their revenue comes from information/leaks on both AMD and Nvidia products?

Suppose tomorrow Anand from Anandtech makes a similar claim; why wouldn't you trust him?
Why would I blindly trust anybody? Everyone in the industry has their own agenda. Anand is much more believable than some Chinese guys, btw, and I don't see him running around making such allegations.
And in any case, I haven't heard anything that's too good to be true, whether it's coming from shills or not.
 

Elios83

Member
When are these cards supposed to be unveiled?
Late Feb/early March? I think it will be really interesting to look at these new architectures, also because of their possible integration into next-gen consoles.
 

artist

Banned
Even in that case, what's too good to be true about a chip of the same complexity on the same process being faster?
I haven't seen GK104 make that jump, and neither have you.

Why would I blindly trust anybody? Everyone in the industry has their own agenda. Anand is much more believable than some Chinese guys, btw, and I don't see him running around making such allegations.
And in any case, I haven't heard anything that's too good to be true, whether it's coming from shills or not.
Anand was only an example; he would only run around with such an allegation if a case like that were to happen on the Anandtech forums.

Like you said, everybody has an agenda. Selective picking shows yours.
 

dr_rus

Member
I haven't seen GK104 make that jump, and neither have you.
You assume too much there.

Anand was only an example; he would only run around with such an allegation if a case like that were to happen on the Anandtech forums.

Like you said, everybody has an agenda. Selective picking shows yours.
What selective picking? I've done some research on this and here's what I've found:
http://forums.guru3d.com/showpost.php?p=4230286&postcount=11
http://hardforum.com/showpost.php?p=1038308445&postcount=36
http://www.semiaccurate.com/forums/showpost.php?p=150020&postcount=369

Add this to the fact that I still haven't seen any proof of those allegations from Chiphell. So who are the shills here, and who are they working for?
 

artist

Banned
You assume too much there.
I didn't jump, so the one assuming (too much) here is not me.

What selective picking? I've done some research on this and here's what I've found:
http://forums.guru3d.com/showpost.php?p=4230286&postcount=11
http://hardforum.com/showpost.php?p=1038308445&postcount=36
http://www.semiaccurate.com/forums/showpost.php?p=150020&postcount=369

Add this to the fact that I still haven't seen any proof of those allegations from Chiphell. So who are the shills here, and who are they working for?
Oh, links to random posts. Here, I can do a few myself:

http://hardforum.com/showpost.php?p=1038311102&postcount=110
http://hardforum.com/showpost.php?p=1038310917&postcount=106

And my trump card ..

http://www.nvnews.net/vbulletin/showpost.php?p=2522974&postcount=151

Rollo said:
Those Chiphell bastards banned all my accounts and .....

There's probably no Kepler; you should convince all your friends to buy 7970s, Roadie!
"NVIDIA Focus Group Member
NVIDIA Focus Group Members receive free software and/or hardware from NVIDIA from time to time to facilitate the evaluation of NVIDIA products. However, the opinions expressed are solely those of the Members."
 

Doc Holliday

SPOILER: Columbus finds America
Don't really care whether the rumors are true or false but the whole thing is pretty hilarious.

More fuel to the fire...

Charlie:

Since people are asking, let me say this for the record. What Chiphell wrote (or appears to have written with the best translation available to me) is blatantly false, period.

My sources are industry sources, not message boards, random web forums, or other dubious sources. When I use information from web based sources, I cite and credit them because it is both the correct thing to do, and lets readers know where to go to judge the validity of that information. If it is not cited, it is from a direct industry source(s). If it is credited, it is just that. If not, it is because the source requested anonymity.

What other sites do is what they do, but I don't follow the pack because I actually value accuracy and ethics.
 
You are a nut case. It is a simple comparison, with the Ultra bumped up to 925 MHz from 900 MHz, just to see how well they stack up. Both are $550, and yes, they are different architectures; it was just 1 of 3 ways to get the greatest scope of how well these cards truly compare to one another.

Why are you STILL talking about clocking DIFFERENT ARCHITECTURES to the same core clock!? A GTX 580 OC'd to 925 MHz is a huge OC, while a 7970 @ 925 is STOCK. How do you not understand this still?

In the car world, you are the same kind of guy who ignorantly tries to compare horsepower/torque between different dynos without realizing the complexity of the variables at play that render direct comparisons meaningless.

The only realistic comparisons are stock 580 and 7970 and max average OC comparisons between the two.
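
To put rough numbers on that, a minimal sketch; the stock core clocks below are quoted from memory, so treat them as approximate:

```python
# Illustrative arithmetic for the clock argument above.
# Assumed reference (stock) core clocks, in MHz.
REF_CLOCKS_MHZ = {"GTX 580": 772, "HD 7970": 925}

def oc_percent(card: str, target_mhz: float) -> float:
    """Overclock relative to the card's own stock clock, in percent."""
    ref = REF_CLOCKS_MHZ[card]
    return (target_mhz - ref) / ref * 100

for card in REF_CLOCKS_MHZ:
    print(f"{card} at 925 MHz is {oc_percent(card, 925):+.1f}% vs. its own stock clock")
# GTX 580 at 925 MHz -> roughly +19.8% over stock
# HD 7970 at 925 MHz -> +0.0%, because 925 MHz *is* its stock clock
```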
 

Hellish

Member
Why are you STILL talking about clocking DIFFERENT ARCHITECTURES to the same core clock!? A GTX 580 OC'd to 925 MHz is a huge OC, while a 7970 @ 925 is STOCK. How do you not understand this still?

In the car world, you are the same kind of guy who ignorantly tries to compare horsepower/torque between different dynos without realizing the complexity of the variables at play that render direct comparisons meaningless.

The only realistic comparisons are stock 580 and 7970 and max average OC comparisons between the two.

The Ultra is already at 900 MHz, so that's less than a 3% OC.

And it is $550.

http://www.newegg.com/Product/Product.aspx?Item=N82E16814130751&Tpk=gtx 580 ultra
 

Hellish

Member
The Ultra is not an official GTX 580 designation; it is simply a manufacturer's factory-overclocked GTX 580. And it still gets beaten by a stock-clocked 7970, despite being overclocked 128 MHz on the core and 200 MHz on the RAM.

I am not responding after this post because this is pointless, but as I stated, both are $550. Not being an "official GTX 580 designation", it is clearly not a reference card, but the price is the key factor here.
 

kitch9

Banned
[H]ard doesn't tend to make shit up. I say that as a long-time member.

Yes, they do?

I've read an ASRock motherboard review where he panned the board for being buggy because it kept entering a different number from the one he wanted to type in the BIOS.

Turns out he had Num Lock on, but he never retracted his review as it wasn't Asus, who are clearly fingering his ring piece with 20-dollar notes.
 

Sethos

Banned
You can't compare it to one that doesn't exist, can you?

You could wait and do a comparison with its real competitor, Kepler. Comparisons to see the performance gain? Great. Doing a head-to-head comparison pretending a (more than) year-old GPU is still its direct competitor? Stupid.



Wasn't that confirmed fake? A few of the comments suggest so as well. It was posted in multiple places and was probably done for quick page hits.
 
You could wait and do a comparison with its real competitor, Kepler. Comparisons to see the performance gain? Great. Doing a head-to-head comparison pretending a (more than) year-old GPU is still its direct competitor? Stupid.




Wasn't that confirmed fake? A few of the comments suggest so as well. It was posted in multiple places and was probably done for quick page hits.

Not confirmed fake, just people doubting whether it's true. I see these being close to the official specifications, even if the hotclocked cores are off.
 
Why is it, assuming the graph is real, that the jump from Kepler to Maxwell cards is so much greater than the jump from Fermi to Kepler? I know very little about what actually goes on inside a GPU or what differentiates each generation of card, so if my question is dumb I'm sorry.
 
Why is it, assuming the graph is real, that the jump from Kepler to Maxwell cards is so much greater than the jump from Fermi to Kepler? I know very little about what actually goes on inside a GPU or what differentiates each generation of card, so if my question is dumb I'm sorry.

The graph is kind of Nvidia saying 'this is what we hope to achieve'; it is not exactly a good indicator of what they WILL achieve. It's a very old graph as well.
 

McHuj

Member
Why is it, assuming the graph is real, that the jump from Kepler to Maxwell cards is so much greater than the jump from Fermi to Kepler? I know very little about what actually goes on inside a GPU or what differentiates each generation of card, so if my question is dumb I'm sorry.

That graph is specifically about 64-bit operations per watt and aimed at supercomputers, not necessarily (or just) graphics.

About 2-3 years ago, DARPA put out a research program called Ubiquitous High Performance Computing, and Nvidia and Intel won some of the money for the research. I'm pretty sure that money is helping fund R&D for Maxwell, as some of the goals of that program were high-performance parallel processing at very low power (high FLOPS/watt).
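
As a rough illustration of the metric on that roadmap graph, peak double-precision FLOPS divided by board power; the Fermi-era Tesla figures below are quoted from memory, so treat them as approximate:

```python
# Peak DP GFLOPS per watt for a Fermi-generation Tesla M2090 (approximate figures).
peak_dp_gflops = 665.0   # advertised peak double-precision rate
board_power_w = 225.0    # board TDP

print(f"~{peak_dp_gflops / board_power_w:.1f} DP GFLOPS/W")
# ~3.0 DP GFLOPS/W -- the roadmap projected Kepler, and especially Maxwell,
# well above this level.
```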
 
The graph is kind of Nvidia saying 'this is what we hope to achieve'; it is not exactly a good indicator of what they WILL achieve. It's a very old graph as well.

That graph is specifically about 64-bit operations per watt and aimed at supercomputers, not necessarily (or just) graphics.

About 2-3 years ago, DARPA put out a research program called Ubiquitous High Performance Computing, and Nvidia and Intel won some of the money for the research. I'm pretty sure that money is helping fund R&D for Maxwell, as some of the goals of that program were high-performance parallel processing at very low power (high FLOPS/watt).

So if I understand correctly, the graph means very little because of its age, and what it represents is efficiency and not actual horsepower?
 
So if I understand correctly, the graph means very little because of its age, and what it represents is efficiency and not actual horsepower?

Yup. It is a forecast, and Nvidia will either under- or over-deliver on their prediction. Hell, Maxwell may not even launch in the window they forecasted, as I believe Kepler has been delayed a few times.
 

Celcius

°Temp. member
^Thanks, fixed. Since the GTX 680 is the only Kepler I'm interested in, I automatically just typed Kepler, lol.
 

iNvid02

Member
seeing how unoptimized some of last year's major releases were bothers me. i also just went back to gta iv for a bit and that was horrible too.

there are very few games i can think of which are in desperate need of more power; most just perform poorly

i think i'll be keeping my 580s for another year or two
 

Jtrizzy

Member
Damn, I was hoping for June on the 680. Yes, GTA is annoying as hell on my single 580. I think I'm gonna give the ENB mods a try, although I still get stuttering down to 28 fps in the game even when I limit it to 30 fps.
 

pestul

Member
Bit of trollin' in that last comment..

dave Feb 7, 2012 at 10:15 am #

Look at the coin AMD is currently making selling 7970/7950 boards with chips the class size of Nvidia's 560!!! Could Nvidia ever sell 560/660-class GPUs at such high prices/margins? Obviously no!

Nvidia has to build incredibly massive, incredibly expensive, incredibly hard-to-manufacture chips (= low yield for the full-function die) in order to stand any chance of gaining the fastest-single-GPU-board crown.

And yet, each time, AMD is usually snapping at their heels with a much smaller chip. Indeed, this generation, unless Nvidia is at a 512-bit bus (forget the article above), there is a real chance an unleashed 7970 (clocked to its true potential) will equal or surpass Nvidia's new high-end chip. With Nvidia disappointing big time in the ARM SoC market, this would be a disaster.

If it is true that Nvidia intends another massive round of ‘dirty-pool’ via PhysX games designed to fail on AMD hardware, a large number of people are going to be looking forward to dancing on Nvidia’s grave.

I just can't see Nvidia fucking up that badly. I think they've learned enough from a past failure or two.
 
Why are you STILL talking about clocking DIFFERENT ARCHITECTURES to the same core clock!? A GTX 580 OC'd to 925 MHz is a huge OC, while a 7970 @ 925 is STOCK. How do you not understand this still?

In the car world, you are the same kind of guy who ignorantly tries to compare horsepower/torque between different dynos without realizing the complexity of the variables at play that render direct comparisons meaningless.

The only realistic comparisons are stock 580 and 7970 and max average OC comparisons between the two.


Speaking of cars, there are engines designed to rev up to 9400 rpm while others redline as low as 4800. That would be equivalent to asking to limit the rpm of an RX-8 to only 4800 rpm to try to make a fair comparison against an Oldsmobile overhead-valve engine... what would be the point of that? To watch an Oldsmobile station wagon from the '60s beat an RX-8 under ridiculous conditions? What would that prove? LOL
 