
Nvidia Mobile Kepler - 2W, 192 cores more powerful than 8800GTX

KKRT00

Member
[Images: Nvidia Logan presentation slides]


Island tech demo:
http://www.youtube.com/watch?v=miEglxHz1iA&feature=player_embedded

Ira tech demo [facework tech]:
http://www.youtube.com/watch?v=Vx0t-WJFXzo

---
According to the AnandTech article, it seems they need a 1 GHz core clock to achieve this performance, but Nvidia hasn't talked about core frequency yet.


Very impressive stuff. Maybe now people will believe how fast technology advances and how outdated the current gen is :)

More here:
http://www.anandtech.com/show/7169/nvidia-demonstrates-logan-soc-mobile-kepler
 
So those claims (I think it was Epic or Crytek) that the next generation of tablets would beat the PS360... are becoming reality! WHOA!
 

see5harp

Member
Geez. Battery tech is going to have to step up. I'm not sure I can justify a laptop that runs for maybe an hour. Awesome though.
 

Lasdrub

Member
This tech is 200x more powerful than the iPad 4? Is it a separate graphics chip? That seems unrealistic.

Edit: OK, I did the math wrong. If the iPad 4 is at 50x and the Nvidia chip is at 250x, then this new chip is 5x more powerful than the iPad 4.
 

blastprocessor

The Amiga Brotherhood
So at least 2x the performance of PS3/Xbox 360 in an iPad form factor.

Incredible, but I'll need to see it to believe it. Also, what about battery life?
 

No Love

Banned
Geez. Battery tech is going to have to step up. I'm not sure I can justify a laptop that runs for maybe an hour. Awesome though.

This uses far less power than a laptop GPU of equivalent performance. 90%+ less.

For example, a Radeon 6770M (which is close to this in performance, in theory) draws 30 W on average. That's 28 W more than this, or 15x as much!
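
Just to spell out that comparison (napkin math with the two figures quoted above; the 2 W is Nvidia's claim, not a measurement):

```python
# Napkin math comparing the quoted power figures (not measured data).
radeon_6770m_watts = 30.0   # typical draw cited above for the 6770M
mobile_kepler_watts = 2.0   # Nvidia's claimed figure for mobile Kepler

print(f"Difference: {radeon_6770m_watts - mobile_kepler_watts:.0f} W")    # 28 W
print(f"Ratio: {radeon_6770m_watts / mobile_kepler_watts:.0f}x as much")  # 15x
```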
 

Dambrosi

Banned
Seriously, what kind of battery tech are you going to need to power this monster? Twin batteries? Triple? Quad?

What about heat? Not just from the monster GPU, but from the monster battery powering it? A potential fire risk, maybe?

Wait, it's supposed to use only 2W?! Okay, I'm impressed. How much, though?
 

Keyouta

Junior Member
I think I should get a new GPU; the one in my computer is a 9800 GTX+, which isn't much better than the 8800 GTX. Impressive tech.

Seriously, what kind of battery tech are you going to need to power this monster? Twin batteries? Triple? Quad?

What about heat? Not just from the monster GPU, but from the monster battery powering it? A potential fire risk, maybe?

I dunno, this seems a bit pie-in-the-sky, but we'll see.

It's 2W, much lower than what a normal mobile GPU from Nvidia or AMD draws.
 

McHuj

Member
I really don't trust their 2W numbers on that slide. Wouldn't surprise me if it was something like NVIDIA GPU core power vs. full SoC power of the "competitors'" processors.
 
Seriously, what kind of battery tech are you going to need to power this monster? Twin batteries? Triple? Quad?

What about heat? Not just from the monster GPU, but from the monster battery powering it? A potential fire risk, maybe?

I dunno, this seems a bit pie-in-the-sky, but we'll see.

It's 2 W.

Tegra 3 and 4 are in the same power consumption ballpark (actually they have a higher TDP).
 

aeolist

Banned
it's entirely possible that power consumption for final retail implementations will go down, since by the time they launch it might be on TSMC's 20nm process node rather than the current 28nm
 
According to the AnandTech article, it seems they need a 1 GHz core clock to achieve this performance, but Nvidia hasn't talked about core frequency yet.

Yeah, this is probably just not going to happen. Apparently the iPad 4 GPU is clocked at 550 MHz, the Snapdragon Adreno 330 at 450 MHz, and Tegra 4 at 650 MHz.

Assuming the GPU in this SoC is clocked at 650 MHz:

192 * 0.650 * 2 = 249.6 GFLOPS.

But flops are only part of the picture; memory bandwidth is the biggest problem for mobile SoCs. This SoC (Logan) will probably have just 25-40 GB/s of memory bandwidth at best, which is still less than half the 8800 GTX's memory bandwidth.
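
For reference, here's that peak-throughput estimate written out (a rough sketch; it assumes 2 FLOPs per core per clock and deliberately ignores bandwidth, so it's an upper bound):

```python
# Peak single-precision throughput: cores * clock (GHz) * 2 FLOPs per core per clock.
# Illustrative only -- real-world graphics performance also depends on memory bandwidth.
def peak_gflops(cores: int, clock_ghz: float) -> float:
    return cores * clock_ghz * 2

print(peak_gflops(192, 0.650))  # 249.6 GFLOPS at a Tegra 4-like 650 MHz
print(peak_gflops(192, 1.0))    # 384 GFLOPS at the ~1 GHz clock AnandTech mentions
```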
 

aeolist

Banned
Yeah, this is probably just not going to happen. Apparently the iPad 4 GPU is clocked at 550 MHz, the Snapdragon Adreno 330 at 450 MHz, and Tegra 4 at 650 MHz.

Assuming the GPU in this SoC is clocked at 650 MHz:

192 * 0.650 * 2 = 249.6 GFLOPS.

But flops are only part of the picture; memory bandwidth is the biggest problem for mobile SoCs. This SoC (Logan) will probably have just 25-40 GB/s of memory bandwidth at best, which is still less than half the 8800 GTX's memory bandwidth.

they also ran this GPU at performance parity with the iPad 4 and measured 900 mW coming off the GPU rail, while the iPad drew 2.6 W on the GPU rail

and yes, you're right about memory bandwidth being a constraint. they're just measuring peak flops on the GPU, which doesn't translate directly to real-world graphics performance.
 

see5harp

Member
This uses far less power than a laptop GPU of equivalent performance. 90%+ less.

For example, a Radeon 6770M (which is close to this in performance, in theory) draws 30 W on average. That's 28 W more than this, or 15x as much!

Yeah, I missed the whole 2W thing. If this is truly real and they are getting performance like this without fudging numbers, then this is awesome.
 

KKRT00

Member
People really don't get how energy efficient this chip is.

They've managed to get the same level of performance as the iPad 4 with 0.9 W, where the iPad's GPU took 2.6 W. That's in the 1080p T-Rex HD test of the GLBenchmark 2.7 demo.

[Image: power measurement photo]
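
To put that in perspective, the implied efficiency gap is roughly 3x (a simple ratio of the figures above, nothing more):

```python
# Same benchmark result, different GPU-rail power draw (figures quoted above).
ipad4_gpu_watts = 2.6
logan_gpu_watts = 0.9

print(f"~{ipad4_gpu_watts / logan_gpu_watts:.1f}x better performance per watt")  # ~2.9x
```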
 

Dambrosi

Banned
2W. Two Watts. If true, and not simply fuzzy math, that's quite stunning.

But this is Nvidia, and they don't have the best track record for honesty in these things, do they?
 

Arkam

Member
Looks very impressive. Well done, Nvidia.

I can't imagine what the mobile landscape will look like in 5 years.
I could see Nintendo unifying their console/handheld platforms into one device.
 

KKRT00

Member
Isn't a GTX 8800 a lot stronger than the Wii U GPU?

The 8800 GT was able to run Crysis 2 at an average of 30+ fps at 1900x1200 on lowest settings [so higher than consoles]. So yeah.

[Image: Crysis 2 benchmark chart, 1900x1200]


---
Yeah, this is probably just not going to happen. Apparently the iPad 4 GPU is clocked at 550 MHz, the Snapdragon Adreno 330 at 450 MHz, and Tegra 4 at 650 MHz.

Assuming the GPU in this SoC is clocked at 650 MHz:

192 * 0.650 * 2 = 249.6 GFLOPS.

But flops are only part of the picture; memory bandwidth is the biggest problem for mobile SoCs. This SoC (Logan) will probably have just 25-40 GB/s of memory bandwidth at best, which is still less than half the 8800 GTX's memory bandwidth.
Anything above 30 GB/s is gold for such a chip. The new Haswell GPUs like the HD 4600 have around 30 GB/s of bandwidth and can still run Crysis 3 at higher settings than current-gen consoles.
 

No Love

Banned
2W. Two Watts. If true, that's quite stunning.

But this is Nvidia, and they don't have the best track record for honesty in these things, do they?

They like to make ridiculous charts, fudge numbers, and rebrand stuff like a motherfucker.

But damn, their tech kicks ass sometimes. I'm sure AMD could outdo them on the mobile front, though; their designs are almost always more power efficient.
 

aeolist

Banned
this isn't going to be faster than ps360/wii u

it's going to have more constrained clock speeds, a weaker CPU, and a much slower memory interface. the gpu architecture is great and efficient, and right now it looks like it blows qualcomm and imagination technologies out of the water, but it hasn't yet reached the point where it does better than dedicated high-wattage home consoles.
 

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
I would laugh so hard if someone brought out a handheld device in the next 6 months that outperformed the Wii U for around the same price.
 
People really don't get how energy efficient this chip is.

They've managed to get the same level of performance as the iPad 4 with 0.9 W, where the iPad's GPU took 2.6 W. That's in the 1080p T-Rex HD test of the GLBenchmark 2.7 demo.

[Image: power measurement photo]

We'll see how the next iPad does when it shows up with a 28nm SoC and probably a Rogue 6430 or 6630 graphics core. It's neat to compare, but by the time this chip launches from Nvidia, the next iPad will already be on the market.

The 6630 can do 525 MP/s, and that's @ 600 MHz, not 1 GHz. 315 GF/s @ 600 MHz too. Kepler is 240 GF/s @ 600 MHz, scaling off of AnandTech's number of 400 GF/s @ 1 GHz. Rogue would only need a 450 MHz clock to match that. Apple has run previous cores at 400 MHz in the iPad, so mobile Kepler is right in the same neighborhood as Rogue. We'll see how many cores Apple chooses to use if they go with Rogue.
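
Roughly how that clock-for-clock comparison works out, assuming peak GFLOPS scales linearly with clock (which is itself an approximation):

```python
# Linear scaling of the quoted peak numbers with clock speed (approximation only).
def scale_gflops(gflops: float, from_mhz: float, to_mhz: float) -> float:
    return gflops * to_mhz / from_mhz

kepler_at_600 = scale_gflops(400, 1000, 600)      # ~240 GF/s, from AnandTech's 400 GF/s @ 1 GHz
rogue_clock_to_match = 600 * kepler_at_600 / 315  # ~457 MHz for a Rogue 6630 rated 315 GF/s @ 600 MHz
print(round(kepler_at_600), round(rogue_clock_to_match))
```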
 

TheCloser

Banned
Yeah, this is probably just not going to happen. Apparently the iPad 4 GPU is clocked at 550 MHz, the Snapdragon Adreno 330 at 450 MHz, and Tegra 4 at 650 MHz.

Assuming the GPU in this SoC is clocked at 650 MHz:

192 * 0.650 * 2 = 249.6 GFLOPS.

But flops are only part of the picture; memory bandwidth is the biggest problem for mobile SoCs. This SoC (Logan) will probably have just 25-40 GB/s of memory bandwidth at best, which is still less than half the 8800 GTX's memory bandwidth.

What is the power consumption when running it at 1 TFLOPS? Because I'm almost willing to bet that it will not be running at that speed. It will also be a while before this becomes a baseline for tablets. The most important thing, though, is that tablet games are usually <$10, so we won't see companies spend AAA money to bring these experiences to tablets at that price. Games on tablets are successful because they are cheap or free, but if you start asking >$25 for tablet games, no one is going to buy them.

Overall, the tech is great, but no one is going to use the power. Tablets are like PC; there is no single target hardware to work with. Varying GPUs and poor APIs (iOS & Android) just ensure that the power will always be potential power.
 

quest

Not Banned from OT
Since this is shipping in the first half of 2014, did NVIDIA say whether it's 28nm or 20nm? The article suggested 28nm, but if it's shipping in the first half of 2014, shouldn't 20nm be impossible?
 

syko de4d

Member
I hope this is true; it sounds stunning.

Imagine what will be possible in a few years if you link your tablet or smartphone with an Oculus Rift :O
 
Mobile tech advances so damn fast.

The wall is coming. Mobile SoC TDPs have ballooned because they're at peak usage for less and less time, but the well is running dry there, with the 8 W peaks we see in the latest Exynos chips and the like.

Also, Nvidia now has their latest core in their mobile SoC line. They're now tied to their desktop line for architectural improvements.
 

KKRT00

Member
We'll see how the next iPad does when it shows up with a 28nm SoC and probably a Rogue 6430 or 6630 graphics core. It's neat to compare, but by the time this chip launches from Nvidia, the next iPad will already be on the market.

The 6630 can do 525 MP/s, and that's @ 600 MHz, not 1 GHz. 315 GF/s @ 600 MHz too. Kepler is 240 GF/s @ 600 MHz, scaling off of AnandTech's number of 400 GF/s @ 1 GHz. Rogue would only need a 450 MHz clock to match that. Apple has run previous cores at 400 MHz in the iPad, so mobile Kepler is right in the same neighborhood as Rogue. We'll see how many cores Apple chooses to use if they go with Rogue.

Do we even have any good info about Rogue or some presentations? I'm not really up to date.

We also have to consider that Kepler is very efficient at tessellation, aniso, etc.; it's very proven hardware.

---
The wall is coming. Mobile SoC TDPs have ballooned because they're at peak usage for less and less time, but the well is running dry there, with the 8 W peaks we see in the latest Exynos chips and the like.

Also, Nvidia now has their latest core in their mobile SoC line. They're now tied to their desktop line for architectural improvements.

I don't think the wall is coming. We have AVX advances on the Intel side that will translate to everyone else and bring up to 4x more performance to current-gen CPUs when used efficiently, and we'll have stacked RAM and Volta tech from Nvidia in the next two years to boost GPU performance significantly.
 