
Nintendo Switch Dev Kit Stats Leaked? Cortex A57, 4GB RAM, 32GB Storage, Multi-Touch.

Status
Not open for further replies.

ggx2ac

Member
Posting this again, regarding the GPU clock speeds: since it's clocked so low and the Switch still has a fan, we'd like to think there are more SMs in the Switch, but is that likely?

First, at 307.2 MHz the Switch gives 157 GFLOPS with 2 SMs or 236 GFLOPS with 3 SMs. The low clock speed suggests it isn't manufactured on a 16nmFF node; otherwise it could be clocked higher for more performance while still getting a ~60% reduction in power consumption.

Adding more SMs would make the GPU larger and increase the cost as well. By how much, I don't know, but it seems it'd cost more than manufacturing the SoC at 16nmFF with just 2 SMs.

Leaks suggest the new Shield TV being shown at CES also has 2 SMs, if there's any correlation.

There's still the matter of the old dev kits having loud fans, suggesting they were overclocked; it's possible this hints at more than 2 SMs for the Switch, but still, I don't know.
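The GFLOPS figures above check out under the standard Maxwell layout (128 CUDA cores per SM, 2 FP32 ops per core per cycle via FMA); a quick sketch:

```python
# Peak FP32 throughput = SMs * cores_per_SM * 2 ops/cycle (FMA) * clock.
# 128 cores per SM is the standard Maxwell layout (Tegra X1: 2 SM = 256 cores).
def gflops(sms, clock_mhz, cores_per_sm=128):
    return sms * cores_per_sm * 2 * clock_mhz / 1000

print(gflops(2, 307.2))  # ~157 GFLOPS for 2 SM
print(gflops(3, 307.2))  # ~236 GFLOPS for 3 SM
```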
 

antyk

Member
Stupid question - can Switch be connected to a TV without the dock? Even if the picture was to be stretched from 720p to TV's 1080p / 4K, it would still make sense to have this option with this being a portable device, right? Also, can it be charged without the dock?
 

LOL, that's basically me at this point.

Lowering my expectations as far as I can just to not get disappointed.

It's easy to be a Nintendo fan.
 

ggx2ac

Member
Stupid question - can Switch be connected to a TV without the dock? Even if the picture was to be stretched from 720p to TV's 1080p / 4K, it would still make sense to have this option with this being a portable device, right? Also, can it be charged without the dock?

Yes, it can be charged without the dock, otherwise you wouldn't be able to take it with you outside.

The Switch is most likely not programmed to output video over USB-C, though I could be wrong. They want you to use the dock, which connects to the TV over HDMI.
 
Stupid question - can Switch be connected to a TV without the dock? Even if the picture was to be stretched from 720p to TV's 1080p / 4K, it would still make sense to have this option with this being a portable device, right? Also, can it be charged without the dock?

No, you need the dock to pass-through the video signal.
 

z0m3le

Banned
Posting this again, regarding the GPU clock speeds: since it's clocked so low and the Switch still has a fan, we'd like to think there are more SMs in the Switch, but is that likely?

First, at 307.2 MHz the Switch gives 157 GFLOPS with 2 SMs or 236 GFLOPS with 3 SMs. The low clock speed suggests it isn't manufactured on a 16nmFF node; otherwise it could be clocked higher for more performance while still getting a ~60% reduction in power consumption.

Adding more SMs would make the GPU larger and increase the cost as well. By how much, I don't know, but it seems it'd cost more than manufacturing the SoC at 16nmFF with just 2 SMs.

Leaks suggest the new Shield TV being shown at CES also has 2 SMs, if there's any correlation.

There's still the matter of the old dev kits having loud fans, suggesting they were overclocked; it's possible this hints at more than 2 SMs for the Switch, but still, I don't know.

The fan isn't justified by the configuration we were expecting at these clocks. You can put a fan on a rock, so it isn't impossible that it's simply there. It could be that they were planning on higher clocks with Pascal, but it just didn't happen in time, and now they're stuck with a fan on a device that doesn't need one; for battery reasons, it will keep its low clocks. However, we don't know what happened or what the configuration is. We assume it's 2 SMs, and if Pascal and higher clocks were the original target, the fan might be there for that...

The fan being there also means they can push for higher clocks in the future; increasing the device's clocks by 25% could happen pretty easily given those outlines. The CPU could also receive a clock boost later, as we know the A57 is very stable at up to double that clock. When Nintendo stops caring about this model's battery life, that could be what they change; we'll have to wait and see if we've even got it right. Personally, all of this has been gravy: my original dream for Nintendo's next handheld was 154 GFLOPS of GCN, so Maxwell with FP16 on top is a nice surprise in that light.

I still don't like the idea of a fan on this device.
 

McHuj

Member
Posting this again, regarding the GPU clock speeds: since it's clocked so low and the Switch still has a fan, we'd like to think there are more SMs in the Switch, but is that likely?

First, at 307.2 MHz the Switch gives 157 GFLOPS with 2 SMs or 236 GFLOPS with 3 SMs. The low clock speed suggests it isn't manufactured on a 16nmFF node; otherwise it could be clocked higher for more performance while still getting a ~60% reduction in power consumption.

Adding more SMs would make the GPU larger and increase the cost as well. By how much, I don't know, but it seems it'd cost more than manufacturing the SoC at 16nmFF with just 2 SMs.

Leaks suggest the new Shield TV being shown at CES also has 2 SMs, if there's any correlation.

There's still the matter of the old dev kits having loud fans, suggesting they were overclocked; it's possible this hints at more than 2 SMs for the Switch, but still, I don't know.

I'm setting my expectations for a 28nm part. A 16nm part would probably allow a much higher clock rate, and 20nm is hardly used and expensive. Just because the X1 was on 20nm doesn't mean this has to be; both the ARM cores and Maxwell exist on 28nm as well.
 

ggx2ac

Member
I'm setting my expectations for a 28nm part. A 16nm part would probably allow a much higher clock rate, and 20nm is hardly used and expensive. Just because the X1 was on 20nm doesn't mean this has to be; both the ARM cores and Maxwell exist on 28nm as well.

It'd be hilarious if they went for a larger node plus a fan because they couldn't get 16nmFF.

Especially if they charge $250 and not a cheaper price.
 
Why not?

Are people just totally cool with Nintendo being lazy? "You KNOW they'd never do something to make their game better, why even expect it?"
If they're going to remake every damn thing, it's going to be a new game. Stars is explicitly another version of a game that already exists. GTAV and a hundred other cross-gen games didn't create all-new models for PS4One releases.
 

z0m3le

Banned
I'm setting my expectations for a 28nm part. A 16nm part would probably allow a much higher clock rate, and 20nm is hardly used and expensive. Just because the X1 was on 20nm doesn't mean this has to be; both the ARM cores and Maxwell exist on 28nm as well.
Expectations? The clock is already known; performance doesn't increase with a different node, just battery life. Since we don't know the battery capacity, all we know is that 28nm would mean less battery life than 20nm. The node means very little to anyone anymore; it might mean the difference between 4 or 5 hours of battery life, but with fast charging that becomes less important, imo.
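The 4-versus-5-hours framing is just battery energy divided by average draw. A sketch with purely hypothetical numbers (the battery capacity and power draws below are assumptions, not leaked figures):

```python
# Battery life in hours = capacity (Wh) / average system draw (W).
# All numbers below are hypothetical, only to show how a node-driven
# difference in draw maps onto the 4-vs-5-hour range mentioned above.
def battery_hours(capacity_wh, draw_w):
    return capacity_wh / draw_w

print(battery_hours(16.0, 4.0))  # 4.0 h at the higher (e.g. 28nm) draw
print(battery_hours(16.0, 3.2))  # 5.0 h at the lower (e.g. 20nm) draw
```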
 
Posting this again, regarding the GPU clock speeds: since it's clocked so low and the Switch still has a fan, we'd like to think there are more SMs in the Switch, but is that likely?

First, at 307.2 MHz the Switch gives 157 GFLOPS with 2 SMs or 236 GFLOPS with 3 SMs. The low clock speed suggests it isn't manufactured on a 16nmFF node; otherwise it could be clocked higher for more performance while still getting a ~60% reduction in power consumption.

Adding more SMs would make the GPU larger and increase the cost as well. By how much, I don't know, but it seems it'd cost more than manufacturing the SoC at 16nmFF with just 2 SMs.

Leaks suggest the new Shield TV being shown at CES also has 2 SMs, if there's any correlation.

There's still the matter of the old dev kits having loud fans, suggesting they were overclocked; it's possible this hints at more than 2 SMs for the Switch, but still, I don't know.

Adding an additional SM would indeed bump up the manufacturing cost, but the tradeoff here is the cost of the SM vs the cost/weight of a larger battery needed for higher clocks. So it could be possible that they find it cheaper to manufacture a 3SM die than it is to order a larger or heavier battery needed for the clock speeds which give them the performance they want.

16nmFF could have solved all of these problems by requiring less power overall, but it seems they missed that boat for some reason. Although DF did not actually confirm that.
 

LordRaptor

Member
Why not?

Are people just totally cool with Nintendo being lazy? "You KNOW they'd never do something to make their game better, why even expect it?"

How many game series have you played that have more than 800 fully animated and bespoke "main characters" (as in, potentially the main camera focus for extended amounts of playtime)?

I don't think you appreciate how much work that is. It's not like Assassin's Creed's crowds, where you can procedurally add male_A's head to male_Z's torso, use generic_man_walk animation, and it just works.
 

BDGAME

Member
About that "more SMs" possibility: isn't a 4SM Tegra, i.e. 512 CUDA cores, basically a Xavier?

The Tegra Xavier will launch in 2017 too. Maybe Nintendo will utilize some of that technology in their chip.
 

Rodin

Member
Expectations? The clock is already known; performance doesn't increase with a different node, just battery life. Since we don't know the battery capacity, all we know is that 28nm would mean less battery life than 20nm. The node means very little to anyone anymore; it might mean the difference between 4 or 5 hours of battery life, but with fast charging that becomes less important, imo.
28nm actually perfectly explains the clockspeeds and why the fan is in there with 2SM.
 
The IPC difference between the A15 and A57 is 20-25%. There's no way 4x A57 @ 1GHz roughly equals 8x Jaguar @ 1.6GHz.
Thank you, I'll redact the comment. I was looking at a comparison that showed the A15 trading blows with Jaguar, then out of curiosity I searched for A15 vs A57 and found a chart that made the A57 look ~40% better than the A15.

I was under the impression that the A57 and Jaguar at the same clock speed are close to even per core, ~10% difference.
That's the A15 that was in the first Shield portable; it trades blows with Jaguar cores. That's why the A15-to-A57 comparison threw me off. I was still wrong, though.
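To see why parity is implausible: aggregate throughput scales roughly with cores x clock x IPC. The IPC ratios below are rough assumptions taken from the discussion, not measurements:

```python
# Rough aggregate throughput ~ cores * clock (GHz) * relative IPC.
# IPC figures are assumptions from the thread (Jaguar as the 1.0 baseline).
def throughput(cores, ghz, ipc):
    return cores * ghz * ipc

jaguar = throughput(8, 1.6, 1.0)   # 8x Jaguar @ 1.6 GHz
a57 = throughput(4, 1.0, 1.0)      # 4x A57 @ 1 GHz, assuming ~parity per clock
print(a57 / jaguar)  # ~0.31: far short of the Jaguar cluster either way
```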
 

z0m3le

Banned
About that "more SMs" possibility: isn't a 4SM Tegra, i.e. 512 CUDA cores, basically a Xavier?

The Tegra Xavier will launch in 2017 too. Maybe Nintendo will utilize some of that technology in their chip.

Core count has nothing to do with architecture. If it's 4 SM, it's at the upper limit of what Maxwell should be able to offer in this size and cooling solution: you'd be looking at ~780 GFLOPS docked and ~314 GFLOPS portable. Maybe, if it did use 16nm.
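Those 4SM figures follow from the same per-SM arithmetic (128 CUDA cores per SM, 2 FP32 ops per cycle), using the rumored 768 MHz docked / 307.2 MHz portable clocks:

```python
# Peak FP32 formula applied to a hypothetical 4SM Maxwell part.
def gflops(sms, clock_mhz, cores_per_sm=128):
    return sms * cores_per_sm * 2 * clock_mhz / 1000

print(gflops(4, 768.0))   # ~786 GFLOPS docked (the post rounds to ~780)
print(gflops(4, 307.2))   # ~315 GFLOPS portable
```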

I think what happened is that they were originally planning higher clocks with Pascal, maybe 400 MHz mobile and 1 GHz docked, but Pascal fell through for whatever reason and they had to stick with 20nm and lower the clocks to 307 MHz portable and 768 MHz docked. This would fit the idea that they were looking at higher CPU and GPU clocks but 20nm held them back. I don't see why they wouldn't just delay the Switch until May and shoot for 16nm, though.
 

z0m3le

Banned
Yeah that would definitely do it. That would be a very odd decision but maybe Nvidia did give them an incredible deal on 28nm. That gives me hope for $199.

28nm would be weird because the Maxwell Tegra wasn't designed on 28nm, but maybe this is the case. What a poor choice compared to 20nm and no fan, though. I think the fan alone is the biggest disappointment; having a vent on top and a fan that shouldn't be there just makes it expensive. It turns a $199 device into a $249 device. But heck, if they did get it to retail at $199, that would be pretty cool. I've said it before: I'm fine with the performance, it's really the fan that bothers me.
 

ggx2ac

Member
28nm actually perfectly explains the clockspeeds and why the fan is in there with 2SM.

It's feeling more likely this is the case as opposed to more SMs.

1) The Switch better be $199, which would mean I only have to pay $250 for the premium model.

2) We still don't even know the CPU. It'd be stupid if they decided to just have 4 A57 cores and not do something like 4 A57 and 4 A53 with heterogeneous computing to at least make things better regarding the low clock speed.

3) It's likely then the SoC uses a 64-bit bus and somebody showed that the RAM at 25.6 GB/s wouldn't be a bottleneck for the current FLOPS speculated and even when downclocked would still provide plenty of bandwidth to CPU and GPU when portable.
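The 25.6 GB/s figure matches a 64-bit LPDDR4 bus at 3200 MT/s; that configuration is an assumption consistent with the number, not a confirmed spec:

```python
# Peak DRAM bandwidth (GB/s) = bus width in bytes * transfer rate (MT/s) / 1000.
# A 64-bit (8-byte) bus at LPDDR4-3200 speeds is one layout that yields 25.6 GB/s.
def bandwidth_gbs(bus_bits, mts):
    return bus_bits / 8 * mts / 1000

print(bandwidth_gbs(64, 3200))  # 25.6 GB/s
```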
 

Rodin

Member
Yeah that would definitely do it. That would be a very odd decision but maybe Nvidia did give them an incredible deal on 28nm. That gives me hope for $199.

Yeah, that would do. But it has to be incredibly cheap to be picked by Nintendo.

Thinking back to the WUSTs, I think it has a lot more to do with yields. Nintendo works on well-established nodes to increase yields and lower costs as much as possible, so that's probably what they did here as well... except I don't think 16nm would've been that exotic, as it's been around for over a year (almost two by the time the Switch releases). Even at the same performance level it would've clearly been the best choice, if anything to avoid putting a fan in there.

The problem for us (if this pans out) is that they again went for older components at lower clock speeds/core counts, at the expense of performance, to fit a certain power/thermal budget, instead of using more modern tech with better performance within those same limits. With the Wii U there was also the problem of a billion-dollar contract with IBM, but now that they could finally change the entire architecture, many of us thought they would use more cutting-edge tech, so I can see why people are bothered that nothing actually changed.

We were also discussing what cores they could've used; the speculation was that the A73 might have been too new to make it into a device releasing in March 2017, but Huawei just released the Mate 9 with the new Kirin 960 (4x A73 + 4x A53), so that clearly wasn't a problem, as final NS kits shipped in November. From the looks of it, though, they didn't even bother with the A72 (which also hints at 28nm, imho), so it's useless to even think about how much better the CPU could've been. They just went dirt cheap with this, and hopefully it will be priced accordingly.

It's feeling more likely this is the case as opposed to more SMs.

1) The Switch better be $199, which would mean I only have to pay $250 for the premium model.

2) We still don't even know the CPU. It'd be stupid if they decided to just have 4 A57 cores and not do something like 4 A57 and 4 A53 with heterogeneous computing to at least make things better regarding the low clock speed.

3) It's likely then the SoC uses a 64-bit bus and somebody showed that the RAM at 25.6 GB/s wouldn't be a bottleneck for the current FLOPS speculated and even when downclocked would still provide plenty of bandwidth to CPU and GPU when portable.
1) Personally I think it's going to be 249-299, which is why I'm kind of pissed. I won't complain too much if it's 199-249 with a game.

2) I think the A57 is a safe bet considering the clock speed and the (likely?) node, but yeah, hopefully they used at least a couple of A53/A35 cores to run the OS. The last thing we need is one of the 4 cores being dedicated to that task, leaving only 3 for games.

3) Maybe it's enough, but there'll likely be a bit of SRAM for cache (2-4MB), so we shouldn't worry about bandwidth anyway.
 

Mpl90

Two copies sold? That's not a bomb guys, stop trolling!!!
As a reference for the future, since I've mentioned his posts in the recent past, here are all the posts Matt made on the possibility of getting PS4/One ports on the Switch:

http://neogaf.com/forum/showpost.php?p=224727906&postcount=180

I've said it before and I'll say it again:

Business concerns, not technical ones, will be what decides the games that show up on the Switch in the majority of cases.

http://neogaf.com/forum/showpost.php?p=224985588&postcount=2798

A lot of the stuff devs could do to reduce space requirements would be basically imperceptible to most or all gamers.

Again, this conversation is largely academic. If a publisher wants to put a game on the Switch, the vast majority of the time a dev will be able to do it.

http://neogaf.com/forum/showpost.php?p=224986107&postcount=2805

By and large the experience can be transferred over intact.

After all, is the experience of playing GTAV on a 360 really very different from playing it on a PS4?

http://neogaf.com/forum/showpost.php?p=224986236&postcount=2810

GTA 5 remaster is an up port of a last gen game, not a downport of a current gen game. Not the same thing.

That distinction has really nothing to do with what I said.

http://neogaf.com/forum/showpost.php?p=224986575&postcount=2815

All its features and gameplay were created with the limitations of the PS3 and 360 in mind.

Right...which again doesn't really affect my point.

It was just a comparison to the kind of differences you might see.

http://neogaf.com/forum/showpost.php?p=224986755&postcount=2819

I didn't understand your point then. I thought it was that features and gameplay will remain equal when ported to Switch.

It is. Which is why bringing up that GTAV was an up-port was irrelevant.

It doesn't matter, I think everyone is on the same page.

Edit: No worries.

http://neogaf.com/forum/showpost.php?p=225413656&postcount=494

Would you be willing to share, for those of us who don't? The gap in system RAM seems like a large hurdle, especially for larger open-world games.

Eh.

http://neogaf.com/forum/showpost.php?p=225424406&postcount=514

Intriguing response.

It was my polite way of saying that poster was wrong. RAM is great, more is always better. Is 8 GB completely necessary to make the kind of games we are seeing today? Absolutely not.

http://neogaf.com/forum/showpost.php?p=225430134&postcount=551

Hard to say for sure without actual specs, but based on assumptions and the hardware limitations due to the form factor, look at anything where demands on the hardware are large for reasons outside graphical effects. Shadow of Mordor and its Nemesis system, or Assassin's Creed Unity and its civilian density, I very much doubt could run on Switch hardware.

Of course they both could. Maybe not as dense for Unity, I don't know, but it certainly wouldn't be a night and day difference.

http://neogaf.com/forum/showpost.php?p=225447116&postcount=598

Yes. And making a pared-down version of a game for Switch COULD be an expensive endeavor, and if it's too costly, the dev cost-to-profit ratio will mean it won't be worth it. I don't get why people can't grasp that devs only make cross-gen games during a transitional period when it's still worth it to do so. We aren't in a transitional period, at least not one as big as gen 7 to gen 8. If Switch isn't as powerful as Sony and MS' systems, and given how badly Wii U flopped, if Switch hardware doesn't sell, then there's no point in porting games to it if they won't sell.

And what I am telling you is that the sales potential, not the tech factors, will be what largely determine which games show up. So all this worry going around about the Switch's specs preventing third party support is looking at the issue from the completely wrong angle.

http://neogaf.com/forum/showpost.php?p=225843204&postcount=3544

I think much like the Wii U, the Switch is going to be CPU-limited again. Let's just hope they at least go with one pool of faster memory.

The Switch will be less "CPU limited" than the PS4 and XBO are.

It should be quite comprehensive. Hopefully he can comment on the recent clock rumours soon.
 

z0m3le

Banned
Thinking back to the WUSTs, I think it has a lot more to do with yields. Nintendo works on well-established nodes to increase yields and lower costs as much as possible, so that's probably what they did here as well... except I don't think 16nm would've been that exotic, as it's been around for over a year (almost two by the time the Switch releases).

The problem for us (if this pans out) is that they again went for older components at lower clock speeds/core counts, at the expense of performance, to fit a certain power/heat budget, instead of using more modern tech with better performance within the same thermal limits. With the Wii U there was also the problem of a billion-dollar contract with IBM, but now that they could finally change the entire architecture, many of us thought they would use more cutting-edge tech, so I can see why people are bothered that nothing actually changed. We were also discussing what cores they could've used; the speculation was that the A73 might have been too new to make it into a device releasing in March 2017, but Huawei just released the Mate 9 with the new Kirin 960 (4x A73 + 4x A53), so that clearly wasn't a problem, as final kits shipped in November. But apparently they didn't even bother with the A72 (which also hints at 28nm, imho), so it's useless to even think about how much better the CPU could've been. They just went dirt cheap with this.



1) Personally I think it's going to be 249-299, which is why I'm kind of pissed. I won't complain too much if it's 199-249 with a game.

2) I think the A57 is a safe bet considering the clock speed and the (likely?) node. Hopefully they used a couple of A53/A35 cores to run the OS; the last thing we need is one of the big cores being dedicated to that.

3) Maybe, but with a bit of SRAM for cache (2-4MB), bandwidth shouldn't be an issue.

Maxwell and Pascal are the same architecture, just on a smaller node; Volta is the successor to Maxwell. The A57 and A72 aren't drastically different, and the A73 just offers higher clock rates. There is nothing old about the Switch's architectures; they are the current ones.

The device is still very capable even at these lower clocks. We saw that even portable Zelda ran much better than on the Wii U; that's because even at just FP32 it's ~50% faster than the Wii U.

Seriously, the ~400 GFLOPS instead of the 500 GFLOPS everyone was expecting isn't the sky falling. The odd thing is the CPU clock speed, but if it's 28nm I can see why it would be clocked that low. It should be able to handle ports, just not as easily as we were hoping, which goes against older rumors; so maybe there are more than 4 cores, or extra A53 cores, it's hard to say.
 
They have a chance to attract some of the "gamer" crowd with the possibility of a portable Dark Souls or GTA5.

If the hardware is as weak as it seems to be, no one will be excited by a 15fps Dark Souls or a stuttering GTA.

Well...

I really don't understand. How can they take an existing chip and purposely make it weaker? It's as if they were frightened by the possibility of decent hardware...

Man, it reminds me of the Wii U pre-launch period: bad-news threads full of "lol Nintendo", "dirt cheap", "Krillin tier" and all this bullshit. I was surprised that each and every Switch thread was positive; it seems that's over.

Now, why did no one leak the weakness of the thing this time?
 

ozfunghi

Member
Well, after the Wii and Wii U, nothing from Nintendo surprises me. I had hoped for a better GPU solution, but hey, a portable Wii U can be fun. But I won't pay more than €199 for this. Otherwise this could be the first Nintendo console in nearly 20 years that I'll pass on.
 

Kimawolf

Member
Has he commented on the recent news?
That's why I'm not worried. It's obvious we're missing some major pieces of the puzzle. It's not like Matt is a Nintendo fanboy; he was the one who burst the Wii U bubble, after all.

It's just amazing that folks take admittedly incomplete info to the worst possible conclusion.

To think, Nintendo almost made it to their event without potentially negative news.
 

Avtomat

Member
It's feeling more likely this is the case as opposed to more SMs.

1) The Switch better be $199, which would mean I only have to pay $250 for the premium model.

2) We still don't even know the CPU. It'd be stupid if they decided to just have 4 A57 cores and not do something like 4 A57 and 4 A53 with heterogeneous computing to at least make things better regarding the low clock speed.

3) It's likely then the SoC uses a 64-bit bus and somebody showed that the RAM at 25.6 GB/s wouldn't be a bottleneck for the current FLOPS speculated and even when downclocked would still provide plenty of bandwidth to CPU and GPU when portable.

I think they will utilise 4x A57 and 4x A53; however, I suspect the A53 cores will not be directly accessible by games, probably just used for background tasks and the OS. It just seems it would introduce a lot of headaches to code for slower and faster cores at the same time while making sure they stay in sync.
 
As a reference for the future, since I've mentioned his posts in the recent past, here's all the posts Matt made on the possibility of getting PS4/One ports on Switch

It should be quite comprehensive. Hopefully he can comment on the recent clock rumours soon.

Most of that was in reference to the storage medium (16GB game cards being "standard"), but yeah, I hope it's all still true regardless of clock speeds. The Wii U was in a similar situation, although its CPU was cited several times as actually 100% preventing ports, so hopefully we don't have all of the Switch CPU info yet. GPU functions are always incredibly scalable.
 
Well, after the Wii and Wii U, nothing from Nintendo surprises me. I had hoped for a better GPU solution, but hey, a portable Wii U can be fun. But I won't pay more than €199 for this. Otherwise this could be the first Nintendo console in nearly 20 years that I'll pass on.

My sentiments exactly.
 
They have a chance to attract some of the "gamer" crowd with the possibility of a portable Dark Souls or GTA5.

If the hardware is as weak as it seems to be, no one will be excited by a 15fps Dark Souls or a stuttering GTA.

Well...

I really don't understand. How can they take an existing chip and purposely make it weaker? It's as if they were frightened by the possibility of decent hardware...

Man, it reminds me of the Wii U pre-launch period: bad-news threads full of "lol Nintendo", "dirt cheap", "Krillin tier" and all this bullshit. I was surprised that each and every Switch thread was positive; it seems that's over.

Now, why did no one leak the weakness of the thing this time?

Did GTA5 run at 15fps or stutter on XB360?
 
That's why I'm not worried. It's obvious we're missing some major pieces of the puzzle. It's not like Matt is a Nintendo fanboy; he was the one who burst the Wii U bubble, after all.

It's just amazing that folks take admittedly incomplete info to the worst possible conclusion.

To think, Nintendo almost made it to their event without potentially negative news.

lol, it was too good to be true.
 

sfried

Member
Now, why did no one leak the weakness of the thing this time?
Clock speeds were disclosed to devs in the early stages of development? Expectations were in check?

If sales figures are what we have going for us, and the draw here is that the unit itself is "incredibly cheap", well... that could explain why some higher-profile 3rd parties are intrigued.
 

Kimawolf

Member
Clock speeds were disclosed to devs in the early stages of development? Expectations were in check?

If sales figures are what we have going for us, and the draw here is that the unit itself is "incredibly cheap", well... that could explain why some higher-profile 3rd parties are intrigued.
Or maybe devs just see the full picture and have access to everything. We will know more soon enough.
 

TunaLover

Member
Well, after the Wii and Wii U, nothing from Nintendo surprises me. I had hoped for a better GPU solution, but hey, a portable Wii U can be fun. But I won't pay more than €199 for this. Otherwise this could be the first Nintendo console in nearly 20 years that I'll pass on.
Yeah, now I'm cautiously optimistic; I'll need to see Nintendo's stance on VC, game pricing, and the Japanese 3rd-party line-up. It looks like the Switch is going to be what the Wii was to the GameCube (power-wise), with the small-form-factor advantage and portability.
 

AlStrong

Member
except I don't think 16nm would've been that exotic, as it's been around for over a year (almost two by the time the Switch releases).

It's about demand.

Apple paid a hefty premium to get it first into their high-margin A9/A9X products (Sept 2015), and even then they had to split production with Samsung just to get volume. nVidia finally launched 16nmFF products halfway through this year, also at fairly high margins, so it's certainly worth it to them. There was no way nV would push desktop performance without the doubled density and significant power savings when they were already pushing die size, and Apple has loads of cash to burn to adopt the latest node at the earliest convenience while sucking up all the fab production volume.

There's no doubt that TSMC is having no trouble finding customers for the node, so they can demand a premium to recoup the R&D costs of the node in the first place.

Meanwhile, TSMC can offer discounts on older nodes to attract customers who aren't in terrible need of bleeding edge so that they have fabs busy instead of just sitting there collecting dust. ;)
 

ggx2ac

Member
Maxwell and Pascal are the same architecture, just on a smaller node; Volta is the successor to Maxwell. The A57 and A72 aren't drastically different, and the A73 just offers higher clock rates. There is nothing old about the Switch's architectures; they are the current ones.

I disagree there. Considering how drastically Nintendo had to downclock everything, an A73 would have been better for reducing power consumption, since it was designed for lower power draw than the A72 while still performing well.

Instead of choosing to maintain the A72's 3-wide decoder, or increase the microarchitecture's decoder width, ARM opted to go back to a 2-wide decoder such as found in the current Sophia family. Yet the A73 positions itself as a higher-performance and lower-power design compared to the larger A72.

http://www.anandtech.com/show/10347/arm-cortex-a73-artemis-unveiled
 

z0m3le

Banned
I disagree there. Considering how drastically Nintendo had to downclock everything, an A73 would have been better for reducing power consumption, since it was designed for lower power draw than the A72 while still performing well.



http://www.anandtech.com/show/10347/arm-cortex-a73-artemis-unveiled
My point was just that it is technically the same architecture; the entire 64-bit ARM family is fully hardware-compatible.
 
Question: the spec sheet DF is working from cites the GPU architecture as "Nvidia second-generation Maxwell".

This would suggest 20nm, no? They also work under the assumption of 20nm for the entire article. Would they/developers know if the final hardware is supposed to be on a 28nm process? And how would that explain the devkits being stock Tegra X1s with audible cooling?
 

Vena

Member
28nm actually perfectly explains the clockspeeds and why the fan is in there with 2SM.

So they back-ported a 20nm design to 28nm? That doesn't make much sense to me.

The DF article posits that they believe/have heard that the custom chip has absorbed other parts/improvements from Pascal. It would be a very weird chimera of a chip to sport advanced feature sets from Maxwell Gen2 plus Pascal improvements and then ride on 28nm, the former never having been designed for that node to begin with.
 

z0m3le

Banned
Question: the spec sheet DF is working from cites the GPU architecture as "Nvidia second-generation Maxwell".

This would suggest 20nm, no? They also work under the assumption of 20nm for the entire article. Would they/developers know if the final hardware is supposed to be on a 28nm process? And how would that explain the devkits being stock Tegra X1s with audible cooling?
It wouldn't, and none of it makes sense with the former rumors and leaks; we are basically throwing all that stuff away and trusting Eurogamer, and they could be right. Here's the thing, though: it has a fan. That's a big problem for these specs. It just doesn't make sense to have a fan; no matter what the savings, a smaller battery and no fan on a smaller node would absolutely have been the cheaper and better-designed option.

But I'm going with it for now and trying to make sense of the fan. 28nm could lead to a $199 price point, I guess, and that could be worth the small downgrade in performance. X1 performance is only 512 GFLOPS vs this ~400 GFLOPS; it's not like the sky is falling.
 

sfried

Member
So they back-ported a 20nm design to 28nm? That doesn't make much sense to me.

More likely, there were a shit-ton of 28nm spares lying around Nvidia from all of those unsold Shield devices, and Nvidia was selling them very cheap.

Which begs the question: what happens after they sell out of those 28nm chips? Another thing that doesn't stack up. But it does line up with the price point.
 