
A Nintendo Switch has been taken apart

LordOfChaos

Member
My final monkey wrench question before we get (hopefully) confirming info today.

We know the Switch chip literally has a different code on it, which includes "NX" indicating it is specific to the Switch and not a generic gen 2 X1. But does that 100% prove there is "something" different about it versus the gen 2 X1? Or is it possible or likely that it could be a set of gen 2 X1s that they just put a different stamp on for the Switch? I don't know enough about how these are produced to feel confident that it is actually a modified chip in some way.


The A2 masking revision points to it not just being old TX1 dies; that much we can say with 99% confidence. But at the same time, the 121mm² die size matches the TX1 too closely to assume a lot was changed. 250 people working for two years, including the NVN API, OS integration, etc., also isn't a whole lot of time in the chipmaking world.

I'm expecting the same CPU and GPU config as the TX1, maybe with the A53s removed (but at 0.7mm², they may not bother removing them), and maybe a redesigned memory controller and caches, but nothing drastically different from a TX1.
 

valouris

Member
I am expecting only very minor changes, with most of the customization pertaining to making the chip work in the form factor and power consumption/profiles Nintendo was envisioning for the Switch.
 

Hermii

Member
The A2 masking revision points to it not just being old TX1 dies; that much we can say with 99% confidence. But at the same time, the 121mm² die size matches the TX1 too closely to assume a lot was changed. 250 people working for two years, including the NVN API, OS integration, etc., also isn't a whole lot of time in the chipmaking world.

I'm expecting the same CPU and GPU config as the TX1, maybe with the A53s removed (but at 0.7mm², they may not bother removing them), and maybe a redesigned memory controller and caches, but nothing drastically different from a TX1.

I don't think anyone has actually measured it yet. It looks the same at a glance, but we can't know for sure it's 121mm² until we get the report from Chipworks.
 

TAS

Member
Chipworks is the best. I don't want to set myself up for disappointment, so I'm expecting a very lightly modified X1.
 
When is the shot supposed to be released? Also, with the existing cores they are using, would it be possible to swap out the A57s for A72s and the like? Do Maxwell 2 and Pascal look any different from a top-down die shot perspective?
 
Do we already know what CPU that thing has?

We know it's ARMv8; some devkits in the past had X1 SoCs with A57 cores, and Eurogamer claims this hasn't changed for retail, while another leaker claims different frequencies that point to newer CPU cores. We can't really know until the die shot is released.
 

LordOfChaos

Member
I don't think anyone has actually measured it yet. It looks the same at a glance, but we can't know for sure it's 121mm² until we get the report from Chipworks.

You don't really need Chipworks to take calipers to the die; it's out in the wild now. There's also no IHS on the die, so you can measure it directly. GamersNexus had a teardown as well. What Chipworks can do is confirm the node.

Edit: Btw, Anandtech took a look at the Switch today

http://www.anandtech.com/show/11181/a-look-at-nintendo-switch-power-consumption
 

Donnie

Member

Hmm, interesting: they've got the undocked Switch, fully charged (so not currently charging the battery), consuming 8.9W at max brightness in Zelda. But surely that would mean Zelda would run the battery down completely in less than two hours?

Even at minimum brightness they've got it consuming 7.1W, more than we thought and more than is possible if its battery really does last two and a half hours in Zelda.
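
For a quick back-of-the-envelope check (assuming the commonly quoted ~16Wh / 4310mAh pack, a figure that isn't in the AnandTech article itself):

    # rough sanity check; 16Wh is the commonly quoted pack capacity, not AnandTech's number
    battery_wh = 16.0
    for draw_w in (8.9, 7.1):                  # their max/min brightness readings
        print(f"{draw_w}W -> {battery_wh / draw_w:.1f}h")
    # prints 8.9W -> 1.8h and 7.1W -> 2.3h

So roughly 1.8 hours at 8.9W and 2.3 hours at 7.1W, which doesn't line up with the roughly two and a half hours of Zelda people are actually getting.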
 
What surprises me the most is: how does it only consume 12W when the Shield TV consumes 21W at the (allegedly) same frequencies while gaming?
16nm confirmed?
Answer to below: I thought it was confirmed that the Shield TV keeps the 2GHz CPU, 1GHz GPU in synthetic benchmarks, but during gaming loads (concurrent CPU and GPU) it throttles to 1GHz CPU, 768MHz GPU, and consumes 21W during said gaming loads.
 

Astral Dog

Member
Expect news from Chipworks very soon; I emailed them. Here is the correspondence:

Hi Neogaf,

Thank you for reaching out to us about the Nintendo Switch. I'm happy to report that we are tearing down the device as I type. We hope to have a blog post written about our findings no later than Monday. When the blog post goes live, we expect to be able to release some top metal die photos so you can examine them yourself.

I'll be in touch soon.

Hi Chipworks 😊
Oh my God.
😮
Was not expecting this totp
 
Hmm, interesting: they've got the undocked Switch, fully charged (so not currently charging the battery), consuming 8.9W at max brightness in Zelda. But surely that would mean Zelda would run the battery down completely in less than two hours?

Even at minimum brightness they've got it consuming 7.1W, more than we thought and more than is possible if its battery really does last two and a half hours in Zelda.
I think it's because of how they're measuring draw. They're using an inline USB-C meter, so these are not battery draw numbers. For one, they point out that the meter itself draws power, and that draw is part of the measurement. (We also don't know how accurate or precise the meter is.)

Second, even if the Switch is fully charged, when hooked to external power (as it always is in these tests) it may draw slightly more than strictly needed. That keeps the battery topped up, and an extra watt or two should be easy for the fan to dissipate.
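
To illustrate with made-up numbers (none of these are measured values, just an example of how the overheads could stack):

    # purely illustrative figures, not measurements
    system_draw_w   = 6.0    # what the console itself might actually be pulling
    charge_path_eff = 0.90   # assume ~90% efficiency on the USB-PD input path
    meter_w         = 0.3    # the inline meter's own draw
    top_up_w        = 0.5    # small trickle to keep the full pack topped up

    wall_w = system_draw_w / charge_path_eff + meter_w + top_up_w
    print(f"{wall_w:.1f}W at the meter")     # ~7.5W shown for a ~6W real load

A ~6W real load off a ~16Wh pack would be a bit over two and a half hours, even while the meter is showing something in the 7-9W range.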
 

LordOfChaos

Member
What surprises me the most is: how does it only consume 12W when the Shield TV consumes 21W at the (allegedly) same frequencies while gaming?
16nm confirmed?
Answer to below: I thought it was confirmed that the Shield TV keeps the 2GHz CPU, 1GHz GPU in synthetic benchmarks, but during gaming loads (concurrent CPU and GPU) it throttles to 1GHz CPU, 768MHz GPU, and consumes 21W during said gaming loads.

I thought MDave showed the CPU stayed at more like 1700-1800MHz while throttling, while the GPU came closer to where the docked Switch is (still averaging above the Switch, but with dips a touch below it). Certainly that could indicate the Shield is the one well past the TX1's sweet spot for efficiency.

If you look at, say, the RX 480, a near-imperceptible underclock can drastically reduce wattage and make the efficiency equation far more competitive, which suggests AMD pushed it past its sweet spot just to stay competitive on performance.
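
As a rough illustration of the idea (dynamic power scales roughly with V²·f; the voltages below are invented for the example, not real TX1 values):

    # dynamic power ~ C * V^2 * f; voltages here are invented, purely illustrative
    def rel_power(freq_mhz, volts):
        return volts ** 2 * freq_mhz

    shield_gpu = rel_power(1000, 1.05)   # hypothetical: pushed near the top of the V/f curve
    switch_gpu = rel_power(768, 0.85)    # hypothetical: backed off to a lower-voltage point
    print(switch_gpu / shield_gpu)       # ~0.50

Give up ~23% of the clock, drop the voltage accordingly, and the dynamic power can plausibly halve. That's the kind of gap being past the sweet spot creates, before you even invoke a node shrink.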
 
Anybody have a die-shot of a retail X1 from a Shield TV for comparison when the Switch die-shot comes out?

(sidenote, I still giggle every time I boot up Zelda in handheld mode because it amuses me that I'm getting this level of graphical fidelity out of a handheld system.)
 

LordOfChaos

Member
Anybody have a die-shot of a retail X1 from a Shield TV for comparison when the Switch die-shot comes out?

(sidenote, I still giggle every time I boot up Zelda in handheld mode because it amuses me that I'm getting this level of graphical fidelity out of a handheld system.)

Courtesy of Fritzchens Fritz



https://flic.kr/p/SjDAVS


I don't think there are shots around; there are shots of Maxwell and Pascal GPUs, and of dies of different phones.

This is the retail Shield TV's TX1.
 
What surprises me the most is: how does it only consume 12W when the Shield TV consumes 21W at the (allegedly) same frequencies while gaming?
16nm confirmed?
Answer to below: I thought it was confirmed that the Shield TV keeps the 2GHz CPU, 1GHz GPU in synthetic benchmarks, but during gaming loads (concurrent CPU and GPU) it throttles to 1GHz CPU, 768MHz GPU, and consumes 21W during said gaming loads.

Hint: It's not a Shield. Never was a Shield. Never was comparable to a Shield. Nintendo, Nvidia, and third party developers confirmed this.

But the reason the power consumption comparison is a rubbish, horribly flawed argument, and not a reflection of a machine's power/capabilities, is that a product releasing in 2017 isn't the same as one that released in 2013, 2014, 2015, or even last year. Advancements in performance and efficiency are a thing, so one has to account for those gains. It's the same reason A++-rated electrical goods see their ratings decrease when more advanced products release.

If you saw what was achieved on the Wii U for its time, it wouldn't be a surprise at all. Nintendo even promoted its energy efficiency on the official site, and to drive my point home, the Natural Resources Defense Council confirmed in a 2014 report that the Wii U consumed less power than the Wii.

So, they have know-how and experience in this area. When Nintendo speaks of low power consumption with high performance, it isn't "PR fluff", it's real. Low power consumption and energy efficiency will be very important going forward; you can see it in other areas of technology, for example with vacuum cleaners in the European Union.
 

LordOfChaos

Member
Oh, that's great. Are the different parts identified?

Anyone able to mark out the different blocks on this so I can tell what's what?

Here's how Nvidia (heavily) stylizes it; I think we need to rotate Fritz's shot one turn clockwise to match.

On the stylized version it's much easier: big yellow blocks = big cores, smaller yellow blocks by them = little cores, and the multitude of smaller blocks = GPU shaders (you can count 16x16 = 256).

[Nvidia's stylised X1 die diagram]


Rotated
[Fritz's die shot, rotated]



Chipworks, where you at!
 

LordOfChaos

Member
*checks Chipworks' time zone*

Ooh, I didn't know they were Ottawa-based. But yeah, they're all home playing BotW and getting ready to sleep. I guess tomorrow, hopefully.
 
Jesus fucking Christ, please release it already, I have a class to listen to :(
This Switch business is putting a strain on my life. I can't even afford one, but it's so fascinating.
Also, please be A72 CPU cores. I mean, they've been out for more than a year.
 