NVIDIA CEO GTC 2014 keynote Over - broken dreams; broken wallets

I hope you're right, but considering the 780 Ti and Titan Black are both pretty damn new, I'm thinking probably not.

120Hz 120Hz 120Hz 120Hz!

Am I a broken record yet?

Yeah, yeah, yeah. I hear you, lol. My wife already green-lit the ASUS, but reading a few forums, it's still up in the air when it's coming out. The good news is that now that we've moved into our new place, I have a computer/gaming room, and the way it's set up, I can easily connect the new monitor and still have my rig near my 65" screen to game on that when I want as well. That ASUS sure looks nice and has great specs; they just need to release it already.
 
Cyriades, that 780 Ti is calling my name!!! I'm just worried it's not gonna be "good enough" for Oculus CV1.

I have two 780 Tis; I actually may sell one of them since I've decided on VR gaming instead of springing for a 4K TV. I just bought it last month, so it's practically brand new. What's a fair price? $625-650 shipped? Dunno, PM me.
 
I thought it had the same power as a Titan.


The Ti has higher clock speed, and is a full board. Nothing disabled, unlike the Titan and the 780.

However, the Ti's compute is artificially neutered compared to the Titan/Titan Black because Nvidia wants all your money, forever and ever.
 
lmao. When I saw what time it was on and then realized I have a meeting during that time, my wife even said "isn't it that one guy that just goes on and on and on...". I mean I like his enthusiasm at times, but damn the man cannot get to the point without 20min of yapping.

Yep. I mean, he's selling his company's products, but the dude is enthusiastic, and that I like. But I don't know if I can stomach sitting through drawn out graphics, analogies, and wishy washy marketing speak.
 
How about Witcher 2 at 1080p with uber mode? Or Crysis 3 at max settings?

Witcher 2 and über is definitely a system killer - I get 20-30fps at 1080p on a 4.6GHz 3820 and the Titan. However, I can't imagine the next round will get that to 60fps, locked, quite yet. I'd like to stick with a single card solution. ... You are right, though.

Crysis 3, however, runs fine at max settings. The occasional dip below 50fps - definitely not locked at 60, but it's more than playable.
 
What kind of gains are we looking at for 20nm Maxwell?

I mean, the usual rule of thumb is that every year is a modest improvement over the last. And I think the 700 series showed that Nvidia is happy to gimp their cards to make sure it's only a modest improvement, which they can then unlock later at a higher price.

Well, the only Maxwell cards out in the wild are 28nm, but they paint a pretty good picture of what the architecture can offer. It's mostly about efficiency. Comparing a GTX 650 Ti Boost and a GTX 750 Ti, both on the 28nm process node, Nvidia managed to lower power consumption by ~50%, cut heat output considerably, and use ~30% fewer transistors, all for roughly equivalent performance. Yes, the 750 Ti lags behind a bit, but considering it has a smaller memory bus, ~30% fewer transistors, ~30% fewer ROPs, and ~14% fewer shader units, that's pretty damn impressive.

Now, Maxwell wasn't really designed for 28nm, so this probably isn't the most optimized variation of it, but imagine any card on the market now crammed into those specs: high-end Nvidia cards that only require a 6-pin connector and draw 125W instead of 200W. Consider that in two years you'll be able to get 780 Ti-level performance out of an entry-level card that runs cooler and uses less power, while the high-end card in that line would probably be edging towards 780 Ti SLI levels of performance, if it scales as well as I suspect.
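For what it's worth, the rough arithmetic behind that comparison can be sketched like this (the TDP, transistor, shader, and ROP figures are approximate published specs, so treat the exact percentages as ballpark):

```python
# Rough sketch of the Kepler-vs-Maxwell comparison above, using approximate
# public spec numbers (TDP in watts, transistor counts in billions).
kepler  = {"tdp": 134, "transistors": 2.54, "shaders": 768, "rops": 24}  # GTX 650 Ti Boost
maxwell = {"tdp": 60,  "transistors": 1.87, "shaders": 640, "rops": 16}  # GTX 750 Ti

for key in ("tdp", "transistors", "shaders", "rops"):
    reduction = 1 - maxwell[key] / kepler[key]
    print(f"{key}: {reduction:.0%} lower on the 750 Ti")
```

Run that and the eyeballed figures come out close: ~55% lower TDP, ~26% fewer transistors, ~17% fewer shaders, ~33% fewer ROPs.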

Maxwell will be bigger than Kepler was, which is a pretty big deal.
 
Yep. I mean, he's selling his company's products, but the dude is enthusiastic, and that I like. But I don't know if I can stomach sitting through drawn out graphics, analogies, and wishy washy marketing speak.
He doesn't even seem to have a script or just doesn't follow it. He just keeps talking.
 
This should be an awesome way to start my morning off tomorrow. Just as long as we don't end up talking store front cloud processing I'll be happy.
 
I wonder if there will be any high-end CG presentations again this year from any of the major visual effects or animation houses.
 
What we'll see:

- Charts
- Tegra
- G-SYNC
- Shadow Play
- GPU Servers
- Cars
- Cars
- Cars
These are my expectations.

- Charts with a vague release window for Maxwell GPUs

- Tegra in cars

- GPU servers will power some new supercomputer

- Announcements of new professional-grade GPUs with firm release windows, unlike their consumer parts

- Project Denver GPGPU to challenge Intel's and AMD's CPUs is discussed again (no release date)

- This will segue into hyping up Nvidia GPUs pushing the robotics industry forward
 
These are my expectations.

- Charts with a vague release window for Maxwell GPUs

- Tegra in cars

- GPU servers will power some new supercomputer

- Announcements of new professional-grade GPUs with firm release windows, unlike their consumer parts

- Project Denver GPGPU to challenge Intel's and AMD's CPUs is discussed again (no release date)

- This will segue into hyping up Nvidia GPUs pushing the robotics industry forward
Great, so I can turn it off pretty quickly!
 
I'm pleased to read that Nvidia is choosing to focus on DirectX rather than going head-to-head against Mantle with its own API.
 
Witcher 2 and über is definitely a system killer - I get 20-30fps at 1080p on a 4.6Ghz 3820 and the Titan. However, I can't imagine the next round will get that to 60fps, locked, quite yet. I'd like to stick with a single card solution. ... You are right, though.

Crysis 3, however, runs fine at max settings. The occasional dip below 50fps - definitely not locked at 60, but it's more than playable.

I don't know what kind of SSAA ubersampling is, but downsampling plus SMAA produces better IQ than ubersampling at a higher framerate.
 
I'm pleased to read that Nvidia is choosing to focus on DirectX rather than going head-to-head against Mantle with its own API.

They probably figured that, with all the different platforms gaining popularity and accessories like the Oculus making waves, developers don't really have the time to support another API that only appeals to a subset of people on one platform.
 
lmao. When I saw what time it was on and then realized I have a meeting during that time, my wife even said "isn't it that one guy that just goes on and on and on...". I mean I like his enthusiasm at times, but damn the man cannot get to the point without 20min of yapping.

But this is a keynote! This is a keynote by Huang. Right? So a keynote by Huang. What if they announce something really cool? So Huang doing a keynote and announcing something really cool. Wouldn't you like to be one of the first to see the reveal of something really cool and something really new? A keynote by Huang announcing something new - and you can be one of the first to witness this reveal. Imagine that.

Yes, the bloke takes fucking ages to get from one point to another.
 
Obviously I am pulling this from my ass but.....

I don't really see Intel being in a rush to push processors with more than four cores to the mainstream for a while yet. No one outside of professionals needs them, and that segment can afford to pay top dollar, so why not carry on as we are now, with essentially everyone subsidising their iGP, and add low-power SoC-type solutions like always-on listening and gesture cameras, etc.?

So IMO not in Broadwell, and probably not till the Skylake shrink.

You can say pretty much the same thing about any modern graphic card!

Maybe Intel wants all those other people with quad cores to spend some cash, because not a lot of people are pulling the trigger on those ~9% performance increases from architecture changes.

They have a product, they want to sell it; does it really matter if a lot of programs are not multithreaded properly?

Besides, Intel is on board with DX12, so maybe they saw the need for more cores in order to future-proof their products, kinda like slapping 128 MB of eDRAM on those Iris GPUs.
Witcher 2 and über is definitely a system killer - I get 20-30fps at 1080p on a 4.6Ghz 3820 and the Titan. However, I can't imagine the next round will get that to 60fps, locked, quite yet. I'd like to stick with a single card solution. ... You are right, though.

Crysis 3, however, runs fine at max settings. The occasional dip below 50fps - definitely not locked at 60, but it's more than playable.

Witcher 3 better fucking TXAA support!

Because Witcher 2 needed it BAD!
 
You can say pretty much the same thing about any modern graphic card!

Maybe Intel wants all those other people with quad cores to spend some cash, because not a lot of people are pulling the trigger on those ~9% performance increases from architecture changes.

They have a product, they want to sell it; does it really matter if a lot of programs are not multithreaded properly?

Besides, Intel is on board with DX12, so maybe they saw the need for more cores in order to future-proof their products, kinda like slapping 128 MB of eDRAM on those Iris GPUs.


Witcher 3 better fucking TXAA support!

Because Witcher 2 needed it BAD!

How well does TXAA perform?
Just got my first Nvidia card, so I'm still learning. I read that they're bringing it to Titanfall. How does it compare to MSAA?
 
Purchased a 770 in the middle of November, so I doubt I'll be getting too excited about the hardware showcased today. Perhaps new mobile tech or new GeForce software?
 
Nvidia won't announce a product that's 10 months away. When you hear about the 800 series, it will be weeks or days before they go on sale.
Any other approach just sends sales to your competition or encourages your customers to spend their disposable income elsewhere.

Part of the reason I think these new GPUs are still a long way off is that we've seen none of the usual leaks running up to a GPU launch, and Nvidia is now allowing a 6GB 780 Ti variant. The 780 Ti and the Titan Black are designed to see us through 2014 at the top end, in my opinion. It's possible we could get the mid-range parts in the summer, but who knows.
 
I'm trying to keep my heart steady for this one, boys. I always get way too excited for this, and then I just end up with a slight feeling of disappointment.
 
Indeed, deliver unto me the 790. Or EVGA is getting my money the moment the 780 Ti 6GB hits the market.
That's currently my plan of action.

I figure, when high-end Maxwell eventually launches, the resale value of the 780 Ti 6GB will still be decent enough to make a dent in that card.
 
Nvidia won't announce a product that's 10 months away. When you hear about the 800 series, it will be weeks or days before they go on sale.
Any other approach just sends sales to your competition or encourages your customers to spend their disposable income elsewhere.

Part of the reason I think these new GPUs are still a long way off is that we've seen none of the usual leaks running up to a GPU launch, and Nvidia is now allowing a 6GB 780 Ti variant. The 780 Ti and the Titan Black are designed to see us through 2014 at the top end, in my opinion. It's possible we could get the mid-range parts in the summer, but who knows.


My thinking as well... But I want to believe
 
Nvidia won't announce a product that's 10 months away. When you hear about the 800 series, it will be weeks or days before they go on sale.
Any other approach just sends sales to your competition or encourages your customers to spend their disposable income elsewhere.

Part of the reason I think these new GPUs are still a long way off is that we've seen none of the usual leaks running up to a GPU launch, and Nvidia is now allowing a 6GB 780 Ti variant. The 780 Ti and the Titan Black are designed to see us through 2014 at the top end, in my opinion. It's possible we could get the mid-range parts in the summer, but who knows.

You're killing me!
 