
The Curious Case of the Switch Foxconn Leak (Now a hardware fanfiction thread)

Rodin

Member
Well the question then becomes why run so much faster for high res backwards compatibility but limit things so much for native gaming. I think a possible increase down the line is most likely, and I'd think that would be across the board.
I think it was LKD or Tom Phillips from EG who said that GC emulation was a thing, but at native resolution (SD). Does Nintendo really need 1.78GHz to emulate old SD games from their old console, which they have all the documentation for, and then reduce the clocks to run modern, intensive games? I mean, it doesn't even begin to make sense.


I guess I would just say that it seems unlikely given what we know, but I guess possible, that the 1.78GHz CPU speed is solely for emulation. It still likely wouldn't explain how you can get 4 A57s to 1.78GHz and a 20nm 2SM Maxwell GPU to 921MHz continuously for 8 days when a Shield TV can't do that without throttling and has a much larger volume, and thus much better heat dissipation.

Add to that the fact that no one at any of the Switch events has felt air coming out of the vents, even when docked playing Zelda, and it looks quite possible that they built this on a 16nm node.
Here's the deal: if that CPU clock is solely for emulation, and the speculation is that it's reached at the expense of the GPU clock (which can be lower for emulation), why did they test 1.78GHz AND 921MHz for 8 days straight at the SAME time?
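The 16nm argument here can be put in rough numbers. Dynamic power scales roughly as C·V²·f, so a node that reaches the same clock at lower voltage saves power quadratically. A minimal sketch, where the 1.5 W baseline, the 5% voltage bump for the higher clock, and the 0.85x FinFET voltage ratio are all illustrative assumptions, not leaked or measured figures:

```python
# Back-of-envelope dynamic power scaling: P_dyn ~ C * V^2 * f.
def dynamic_power(base_power_w, f_ratio, v_ratio, c_ratio=1.0):
    """Scale a baseline dynamic power by capacitance, voltage and frequency ratios."""
    return base_power_w * c_ratio * (v_ratio ** 2) * f_ratio

# Hypothetical baseline: the GPU drawing 1.5 W at the Eurogamer docked
# clock of 768 MHz on 20nm. Pushing to 921 MHz is assumed to need ~5%
# more voltage; 16nm FinFET is assumed to reach the same clock at ~0.85x
# the 20nm voltage. None of these figures are measured values.
p_20nm_921 = dynamic_power(1.5, 921 / 768, 1.05)
p_16nm_921 = dynamic_power(1.5, 921 / 768, 1.05 * 0.85)

print(f"20nm @ 921 MHz: ~{p_20nm_921:.2f} W")
print(f"16nm @ 921 MHz: ~{p_16nm_921:.2f} W")
```

Whatever the exact baseline, the quadratic voltage term means the same clocks cost noticeably less power on 16nm, which is the crux of the no-throttling argument.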


From my point of view, either Nintendo realized that the cooling system allowed for those clocks at 16nm without throttling (confirmed by these 8-day straight tests), or the Foxconn guy knows his tech and made this shit up for some reason, knowing it would look believable after the Joy-Con/battery info was proven correct. Sounds like a conspiracy theory, but hey, who knows.

Or they tested these clocks and there was throttling, so they had to reduce clockspeeds to those given by DF. Or DF is dead wrong and the document given to them was utter bullshit. I really don't know, could be anything.
 

z0m3le

Banned
I think it was LKD who said that GC emulation was a thing, but at native resolution (SD). Does Nintendo really need 1.78GHz to emulate old SD games from their old console, which they have all the documentation for, and then reduce the clocks to run modern, intensive games? I mean, it doesn't even begin to make sense.



Here's the deal: if that CPU clock is solely for emulation, and the speculation is that it's reached at the expense of the GPU clock (which can be lower for emulation), why did they test 1.78GHz AND 921MHz for 8 days straight at the SAME time?


From my point of view, either Nintendo realized that the cooling system allowed for those clocks at 16nm without throttling (confirmed by these 8-day straight tests), or the Foxconn guy knows his tech and made this shit up for a reason, knowing it would look believable after the Joy-Con/battery info was proven correct. Sounds like a conspiracy theory, but hey, who knows.

Or they tested these clocks and there was throttling, so they had to reduce clockspeeds to those given by DF. Or DF is dead wrong and the document given to them was utter bullshit. I really don't know, could be anything.

The clocks were 100% stable for 8 days without a frame drop as far as we know, so these clocks didn't throttle. It also seems it was a Vulkan fish demo, not the Unity fish one I thought, which means the GPU was likely used to full capacity as well.

DF's clocks are real; they have way too much relation to the Foxconn leak clocks, and I'd say they strengthen each other. We don't have a real explanation for those clocks being tested the way they were, except for them ultimately being the clocks developers will eventually target on Switch, but since it is a positive change, we have to poke holes in it that don't exist. To me the clocks are more likely than not. There is a chance these clocks won't be used, but I'd say it's pretty low, and we don't know the details of how this product is rolled out or what sister products need 100% compatibility with it.
 
The clocks were 100% stable for 8 days without a frame drop as far as we know, so these clocks didn't throttle. It also seems it was a Vulkan fish demo, not the Unity fish one I thought, which means the GPU was likely used to full capacity as well.

DF's clocks are real; they have way too much relation to the Foxconn leak clocks, and I'd say they strengthen each other. We don't have a real explanation for those clocks being tested the way they were, except for them ultimately being the clocks developers will eventually target on Switch, but since it is a positive change, we have to poke holes in it that don't exist. To me the clocks are more likely than not. There is a chance these clocks won't be used, but I'd say it's pretty low, and we don't know the details of how this product is rolled out or what sister products need 100% compatibility with it.

How powerful would Switch's CPU be in relation to the Wii U, if DF's clockspeeds ended up being accurate? I know you guys talked about it earlier in the other speculation thread before this one, that it would be close to half as powerful as PS4/XB1 with the Eurogamer clocks, and actually equivalent with the Foxconn clockspeeds (A72 and 1.78GHz?). But what's being taken into account besides clockspeeds?
 
So someone on Reddit managed to bring a heat vision camera in while trying out the Switch and got a bunch of photos of the device playing different games:

www.imgur.com/gallery/egT5E

Note at least at one point the person believes the fan was indeed running actively in portable mode.
 

10k

Banned
So someone on Reddit managed to bring a heat vision camera in while trying out the Switch and got a bunch of photos of the device playing different games:

www.imgur.com/gallery/egT5E

Note at least at one point the person believes the fan was indeed running actively in portable mode.
The lengths people go to lol. Looks like the Tegra is on the right side and the battery is on the left.

The fan must be on the right then.
 

ggx2ac

Member
So someone on Reddit managed to bring a heat vision camera in while trying out the Switch and got a bunch of photos of the device playing different games:

www.imgur.com/gallery/egT5E

Note at least at one point the person believes the fan was indeed running actively in portable mode.

The heat output is similar to a Tegra Shield TV.

SATV.Thermal_575px.jpg


Although confined to one side of the Switch.
 

z0m3le

Banned
The heat output is similar to a Tegra Shield TV.

SATV.Thermal_575px.jpg


Although confined to one side of the Switch.

Like I said, it is silly to use heat; it won't tell you anything, as ~35°C will be the comfortable operating temp for a device you hold in your hand anyway. I'm sure you could point a thermal camera at a phone running a game for an hour and get a similar heat picture. About the most useful thing from this picture is that we can see how much space the device has minus the battery.
 

Buggy Loop

Member
Like I said, it is silly to use heat; it won't tell you anything, as ~35°C will be the comfortable operating temp for a device you hold in your hand anyway. I'm sure you could point a thermal camera at a phone running a game for an hour and get a similar heat picture. About the most useful thing from this picture is that we can see how much space the device has minus the battery.

Oh, even less than an hour for phones, and at higher temperatures.

https://youtu.be/4Vs4Yf3ld9M
 

Cuburt

Member
So someone on Reddit managed to bring a heat vision camera in while trying out the Switch and got a bunch of photos of the device playing different games:

www.imgur.com/gallery/egT5E

Note at least at one point the person believes the fan was indeed running actively in portable mode.
Clever way to get more info that we might not otherwise get before launch.

I'm assuming they didn't even attempt to photograph the dock?
 
I think it's impossible to tell from the heat vision alone.

I think the most interesting info from the heat vision pictures is:

1) The location of the SoC

2) The location/configuration of the heat sink

3) The fact that the fan does actually run in portable mode


Now, this brings up a few interesting questions. First, the Switch was always connected to a charger in these photos. I wonder if it gets hotter on battery power, as the battery likely heats up when discharging.

Second, if the fan runs in portable mode then it seems highly unlikely to be a 16nm process with the Eurogamer clocks. I don't know about the math but that seems to be low powered enough to be run without a fan.

Finally, this is tangentially related, but this and the IGN article where everyone commented on Zelda: BotW being choppy in docked mode got me thinking... Are we absolutely sure the SoC already increases its clock rate when docked? At least with these demo units? It could be possible that -at least these demo units- are running at the same clock speed when docked and undocked, and therefore BotW can output a very smooth 720p when portable but a choppy 900p when docked, both with the same GPU speed.

Taking this into account, and also the fact that all of the demo units appear to be connected to chargers, this could mean that either the clocks at the demo are the full, docked clocks (unlikely as that would have the fans likely running more often), they are the undocked clocks running games in TV mode at the undocked rate (would explain Splatoon 2 at 720p), or we are fundamentally misinformed about the nature of the Switch SoC raising its clock speed when docked- in other words, maybe that's not actually a feature of the final retail unit.
 

Donnie

Member
Yeah, that's a possibility, but it would seem very strange of Nintendo, given that it's not something they've ever done before. They'd also have to tank the battery life by quite a bit to make these kinds of changes to portable clocks, which I'm not sure they'd be willing to do.

Depends on what CPU and what process is being used. If we assume 3 hours of battery life at the original 1GHz CPU speed with a 16nm A72, then upping to 1.7GHz should drop that by about half an hour or even less.
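Donnie's half-an-hour estimate holds up if the CPU is only one slice of total system power. A quick sketch; every wattage below is a made-up illustrative budget (only the 4310 mAh figure comes from the leak), and CPU power is naively scaled linearly with frequency:

```python
# Rough battery-life check. All wattages are assumed, not measured.
battery_wh = 4310 / 1000 * 3.7   # leaked 4310 mAh pack at a nominal 3.7 V

screen_etc = 2.3                 # display, wifi, RAM, misc (assumption)
gpu        = 1.5                 # GPU at portable clocks (assumption)
cpu_1ghz   = 1.5                 # CPU cores at 1.0 GHz (assumption)

hours_1ghz  = battery_wh / (screen_etc + gpu + cpu_1ghz)
hours_17ghz = battery_wh / (screen_etc + gpu + cpu_1ghz * 1.7)  # linear f scaling

print(f"{hours_1ghz:.1f} h at 1.0 GHz vs {hours_17ghz:.1f} h at 1.7 GHz")
```

With these arbitrary numbers the drop comes out to roughly half an hour; if raising the clock also required a voltage bump, the drop would be larger.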

How powerful would Switch's CPU be in relation to the Wii U, if DF's clockspeeds ended up being accurate? I know you guys talked about it earlier in the other speculation thread before this one, that it would be close to half as powerful as PS4/XB1 with the Eurogamer clocks, and actually equivalent with the Foxconn clockspeeds (A72 and 1.78GHz?). But what's being taken into account besides clockspeeds?

If we assume four CPU cores with one for the OS and three for games, then 3x A57 at 1GHz would be about half the performance of 6x Jaguar at 1.6GHz (the PS4 CPU), while 3x A72 at 1.7GHz would be about the same as 6x Jaguar at 1.6GHz.

There is the possibility that the Switch CPU has extra cores just for the OS (A53 most likely). That would change things a bit, obviously giving an extra 33% CPU performance for games (4 cores instead of 3).

As far as Wii U goes, that's much harder to say for certain as we have no direct comparison, though it's safe to say an A57 at 1GHz would certainly be faster core for core.
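Donnie's comparison can be reproduced with a tiny throughput model: cores available to games × clock × per-clock factor. The per-clock factors (Jaguar = 1.0, A57 ≈ 1.6, A72 ≈ 1.85) are assumptions back-derived from his own ratios, not benchmark results:

```python
# Toy throughput model: game cores x GHz x per-clock factor.
# Per-clock factors are assumptions chosen to match the ratios above.
def game_cpu_perf(cores, ghz, per_clock):
    return cores * ghz * per_clock

ps4 = game_cpu_perf(6, 1.6, 1.0)      # 6 Jaguar cores for games @ 1.6 GHz
a57 = game_cpu_perf(3, 1.0, 1.6)      # 3 A57 cores @ 1.0 GHz (Eurogamer-ish)
a72 = game_cpu_perf(3, 1.785, 1.85)   # 3 A72 cores @ the Foxconn 1.785 GHz

print(f"A57 setup: {a57 / ps4:.0%} of PS4 game-CPU throughput")
print(f"A72 setup: {a72 / ps4:.0%} of PS4 game-CPU throughput")
```

Adding a fourth game core, as Donnie notes, would multiply either result by 4/3.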
 

Rodin

Member
Like I said, it is silly to use heat; it won't tell you anything, as ~35°C will be the comfortable operating temp for a device you hold in your hand anyway. I'm sure you could point a thermal camera at a phone running a game for an hour and get a similar heat picture. About the most useful thing from this picture is that we can see how much space the device has minus the battery.

35°C was also the "max" temperature, not the average (which is usually around 22-25°C in the various pics).

I think the most interesting info from the heat vision pictures is:

1) The location of the SoC

2) The location/configuration of the heat sink

3) The fact that the fan does actually run in portable mode


Now, this brings up a few interesting questions. First, the Switch was always connected to a charger in these photos. I wonder if it gets hotter on battery power, as the battery likely heats up when discharging.

Second, if the fan runs in portable mode then it seems highly unlikely to be a 16nm process with the Eurogamer clocks. I don't know about the math but that seems to be low powered enough to be run without a fan.

Didn't the patent show that the fan spins at low rpm when undocked?

Finally, this is tangentially related, but this and the IGN article where everyone commented on Zelda: BotW being choppy in docked mode got me thinking... Are we absolutely sure the SoC already increases its clock rate when docked? At least with these demo units? It could be possible that -at least these demo units- are running at the same clock speed when docked and undocked, and therefore BotW can output a very smooth 720p when portable but a choppy 900p when docked, both with the same GPU speed.

Taking this into account, and also the fact that all of the demo units appear to be connected to chargers, this could mean that either the clocks at the demo are the full, docked clocks (unlikely as that would have the fans likely running more often), they are the undocked clocks running games in TV mode at the undocked rate (would explain Splatoon 2 at 720p), or we are fundamentally misinformed about the nature of the Switch SoC raising its clock speed when docked- in other words, maybe that's not actually a feature of the final retail unit.
Good points, but the Zelda drops could be related to streaming issues and other unoptimized stuff, as the build on the showfloor wasn't the final one.

About Splatoon, I strongly believe it was running at undocked speed. I've been saying this for a while now; there's no other explanation for why it would run at the same res as the portable version when docked.
 

Rodin

Member
Someone found the original text of the Foxconn leak on Reddit here, and some folks on the translator subreddit are working on a translation here.

Thanks, it already clarified the "1080p screen" thing once and for all.

It also seems he himself stated that he saw 1080p on the benchmark so he assumed the screen was 1080p, hopefully someone can go in further detail and see if we missed anything.

About the GPU clock, it seems like he said "912MHz" first and then corrected himself to 921 (at least that's what Google Translate Chinese to Italian tells me).


EDIT:

The patent did show that but patents should never be taken as a full feature list of the final product. For instance, I'm pretty sure there was an embodiment described in the patent where the fan didn't run in portable mode, so they could have opted to go that way and it still would've been covered fully by the patent.

As for Zelda you could definitely be right about it just being unoptimized... The non-docked speed thing is just a theory at this point.

There's a lot of information about how the console performs that we won't be seeing until we get our hands on it, especially anything about the batteries, as it seems that all of the demo units are indeed connected to chargers. It could also be possible that the demo units have different configurations than the final unit will.

Either way I think it's an interesting question to keep in mind, as there has been no official confirmation of increased performance in docked mode.
Official no, but Matt said that Switch is "stronger than Wii U when undocked" and "a good deal stronger when docked", which indeed suggests different clocks to me.
 
Didn't the patent show that the fan spins at low rpm when undocked?


Good points, but the Zelda drops could be related to streaming issues and other unoptimized stuff, as the build on the showfloor wasn't the final one.

About Splatoon, I strongly believe it was running at undocked speed. I've been saying this for a while now; there's no other explanation for why it would run at the same res as the portable version when docked.

The patent did show that but patents should never be taken as a full feature list of the final product. For instance, I'm pretty sure there was an embodiment described in the patent where the fan didn't run in portable mode, so they could have opted to go that way and it still would've been covered fully by the patent.

As for Zelda you could definitely be right about it just being unoptimized... The non-docked speed thing is just a theory at this point.

There's a lot of information about how the console performs that we won't be seeing until we get our hands on it, especially anything about the batteries, as it seems that all of the demo units are indeed connected to chargers. It could also be possible that the demo units have different configurations than the final unit will.

Either way I think it's an interesting question to keep in mind, as there has been no official confirmation of increased performance in docked mode.

Someone found the original text of the Foxconn leak on Reddit here, and some folks on the translator subreddit are working on a translation here.

Ooooh very interesting, thanks!
 

Rodin

Member
Translation seems to be up.

Post 1
There are no additional functions in the base, it's just an output. The main unit is actually a 1080p capacitive touch screen, surprisingly. I haven't played with the controller but from observation the key travel of LR buttons is probably very short.
I saw heatsink fins on the main unit, probably because it was an engineering sample. I haven't seen a production unit yet.
The base is very light, feels very plasticky.
The side controllers are very small. They start charging when you plug them into the sides of the main unit. The split LR buttons are positioned under your hands. Handling feels weak due to the lack of buttons.
There's a 5cm vent at the top of the main unit and you can see heatsink fins inside.
The main unit is around 1.5kg.
Each small controller has 2 separate SL SR buttons. Key travel is still very short. Doesn't feel very nice.
The silicone buttons on the controller are very average.
The brightness of the main unit isn't very good. I'm guessing it'll be just visible enough outdoors if set to max brightness, but then you can forget about battery life.
I checked again today, the main unit definitely has a 1080p screen, which is great. There are no games on it so I couldn't tell much [about display quality]. Will try to check out the motherboard in the next few days.
I had a look at the heatsink today; there's an L-shaped heat pipe 0.8cm wide and about 12cm long, placed upside down. The fins on it are 5cm long and 0.8cm tall, and there's a turbine fan 4cm in diameter. Looks pretty shoddy, but should be enough since the main unit doesn't generate that much heat anyway.
The demo screen is very boring, just a bunch of fish swimming around. The factory floor is very noisy so I couldn't tell if it generates much noise. I touched it and it wasn't too warm after running for 2 hours, so it's not too bad.
Brightness really isn't very good.
As a portable console the Nintendo Switch is not bad, but don't expect it to be on par with home consoles. If you want 1080p60 you can forget about it, even PS4 Pro only barely achieves that. If The Legend of Zelda: Breath of the Wild can run at 1080p30 then it's definitely worth buying, after all it only runs at 720p on the Wii U. But if it's interlaced/interpolated then that'd be pretty embarrassing.
It's a good choice as a portable console, and the 2 small controllers allow multi-player Mario Kart too. I'll need to check the battery capacity of the controllers, not sure how good its battery life is.
Had a look at the motherboard today. The CPU looked like about 10mm2 and is made in Taiwan. There are 2 memory chips but couldn't see the model or type. The fan is pretty powerful, I can feel waves of hot air. The unit itself doesn't feel very warm. There are heat pipes so we shouldn't need to worry about frequency throttling.
On the right side of the main unit there's a 8cm2 slot that's about 0.6cm deep. Pretty sure that's for the battery, a big one at that.
There's no way Nintendo would be generous enough to include a hard drive.
Even the Nvidia GP106 is only 120cm2 [Ed: did he mean mm2 ?]. I think we can look forward to good graphics performance on the Nintendo Switch even though some chip area is taken up by the CPU.
The CPU in Nintendo Switch is what's commonly known as an APU. Its area is about the same as Nvidia GP107, but that includes the CPU too, so graphics performance won't be as good as GeForce 1050.
Reply from OP: Not an engineering sample, a verification unit assembled in Japan, should be the same as production units. You think our factory security are useless? [Its performance] beats iPhone 8 hands down, but miles behind PS4.

Post 2
Can't take any photos. I've tried to describe all I could. Haven't checked the battery capacity of the side controllers yet, will take a look soon.
The controllers feel okay when attached to the main unit. It sucks that there's no + button, but I guess it's a compromise to allow two player games.
Our factory doesn't have the traditional controller so I can't say anything about that.
The main unit structure is pretty simple, should be easy to repair.
The side controllers are very complex, make sure you take good care of them. If you break the gold pins on the bottom then you can't charge them.

Post 3
The battery in the small controllers is about 5x2x0.5cm. You guys can guess the capacity. I don't want to go into the clean room, that'd look too suspicious.
The advantage of Nintendo Switch is no frequency throttling. There's no lag whatsoever after running for 2 hours straight. Not like mobile phones that can last only 1 minute on full performance.
I almost took a controller button home today, but then I thought it's pretty useless so I didn't do it. If I get caught the punishment will be severe [Ed: lit. "they'll cut off my hands"].
The most expensive component in the NS, besides CPU and motherboard, would be the side controllers. The quality is very high. Feels good to use when connected together.
I'm confident in the small controller's battery life after seeing the battery dimensions.
Quality control is very strict, you can buy with confidence.
I think the base is pretty useless besides charging. It's mostly a portable console.
The base is easily scratched because it uses the shittiest plastic.
If you want to upgrade your portable console then buy this. The display is high definition and not blocky.
Nintendo Switch is definitely the best portable console.
The chip looks like it uses the same packaging as the Nvidia GPUs. Looks very big and powerful. It's made in Taiwan, there's no TSMC logo but we all know who made it. The surface of the chip is smooth like GP106. Have a look at your video card and you'll know what the NS CPU looks like.

Post 4
Diagonal size of the screen is about 17. [Ed: they didn't say what unit.]
Our production line made 2,100 units today and we're still increasing output. The whole factory floor should be able to make 20,000 units per day.
I saw the blue and orange versions of the controller today.
Nintendo Switch has active cooling with heat pipes and fans, you can't compare it to mobile devices. It won't throttle frequency even after prolonged use. Its performance beats any mobile device hands down, including iPad Pro.
There's a very important thing I still need to find out, will let you guys know in a few days.
What's the maximum distance of the controllers? This is an important question if you want to play 2-player Mario Kart with split screen.
I haven't seen the OS. When they do testing I'm also working, when I'm off shift I'm too tired to go check.
About the 1080p screen, I'm not 100% sure about my visual estimation. The testing computer says output is 1080p. I guessed it's 1080p based on the fact that the motherboard in the base is too simple. [Ed: is this in response to a question about whether it supports 4K? Some context would be nice here.]
The base doesn't enhance performance in any way. There's only the power connector, a USB 3.0 and a HDMI port on the inside, and 2 USB 2.0 ports on the outside.
Nintendo Switch is a traditional portable console. Its major selling point isn't innovation. The only thing extra is a video-out.
One characteristic of the NS is that everyone has a different view on how it should be used. For me, it's a great portable console.
Sorry for not being able to check the details of various things. I can only say that I haven't found any motion sensing capability yet, and our factory doesn't have any testing for it either. Maybe I just haven't found it yet! I'm too tired today, will reply tomorrow.
The motherboard in the base is way too simple. Just think of it as a charger.
The main unit has 2 small holes on the sides, and a heat vent at the top. Dust getting inside could be a problem.

Post 5
Here's some standard specs: CPU 1750 MHz, GPU 912 MHz, EMC 1600 MHz. (I remembered wrong, the GPU is actually 921 MHz. It sure looks like it [Ed: likely in reply to someone saying it's similar to some other chip], but this one is 16nm so it's probably a new product.)
Battery capacity is 4310 mAh. (The battery is not replaceable.)
Today I saw a cube module near the bottom of the controller, could be a gyroscope. I'll ask around when I get a chance.
GPU frequency is half of a desktop Pascal.
The controllers are very light, around 20–30 g. Together with the main unit they're less than 1,000 g.
Correction, the CPU is 1785 MHz.
Battery is built in.
One unit has been under test for 11,750 minutes in our factory, still working well.
It can be charged using a standard USB Type-C cable.
There's no such thing as frequency throttling when not plugged in. The performance isn't very high to begin with, if it throttles then the games will lag like hell.
There is indeed an infrared-looking light at the bottom of the right controller.
I've confirmed today that there will be a 4G version.
We have an updated motherboard and the old motherboard. Probably different firmware, I can't really tell. Battery life is about the same, probably just a different version.
Today we had units shipped to Japan and Australia. I was confused.
Today I saw a colleague making some accessory with a network port. The main unit is now thicker, with 2 layers of back cover. The plan is to make 2,000 first. (This is the 4G version, we're making 2,000 as a trial, I didn't see the unit myself, heard it from some other guy in my dorm. I'll take another look the day after tomorrow when I go back to work. There's a new circuit board behind the back cover, I reckon it's an accessory. I'll check it out on Monday when I go back to the factory.)
[to be continued]

Post 6
Here's a spec for the standard version: the standard main unit without controllers is 302 g.
Each controller is 50 g.
Today I saw the enhancer. There are still many questions but I'll tell you what I know.
The processor in the enhancer is even bigger than the main unit CPU, at 200mm2.
Dimensions looked like 12x18, on par with GP106.
There are 2 extra memory chips. (Pretty sure it adds 4 GB memory, for a total of 8 GB.)
It connects to the back of the main unit motherboard via some sort of PCI bridge.
There are 2 Wi-Fi [antennae], 1 HDMI, 1 Mini-DP, 1 network port, 2 unknown circular ports, 3 network activity indicator lights.
The enhancer is much more complex than the main unit. It also has 6 or 7 unknown storage chips.
The enhanced main unit also looks different than the standard unit. It's got extra bits on the motherboard, including the bridging connector.
What I don't know is whether this is a development unit or if it'll be officially announced, and of course I have no idea if it'll go on sale at all.
The enhanced version doesn't have a base yet.
The enhanced version can connect to a TV/display without the base. It also has a power input.
I don't know if the processor on the enhanced version includes CPU.
If it doesn't include a CPU, then it can at most upscale existing games to 4K, it won't bring any dramatic improvements. The weak CPU might be the bottleneck.
If the main processor on the enhanced version only has GPU, then it'll be more powerful than PS4 Pro.
At the moment our factory floor only plans to make 2,000 enhanced units. (I also think it's probably a development unit. Let's wait for Mr. Ren to give us more info.)
The enhanced version is very powerful, but also weighs more and feels worse in the hands. It's purely to cater for 4K TVs. I'm also anxious to know if it'll go on sale.
The performance will be off the charts with the main processor plus enhancer processor. I've never seen such a huge 16nm chip on a mobile device, especially because the main processor is already 100mm2.
The main processor is dwarfed by the enhancer processor.
There's no additional battery on the enhancer. You probably have to plug it in when playing.
The power adapter is also built in.
I've already said, this might be a development unit. I want to tell the whole story, please go back and have a look at my previous posts. (The base only has a USB-C port, it won't handle a huge amount of data transfer.)
The main unit is unexpectedly light, at 300 g. I'm liking the standard version more and more.
I weighed it myself with an electronic scale.
High definition is finally a reality.
Each smaller controller is 50 g. When combined with the main unit the total is 400 g. Is that acceptable to you guys?
Yeah I think the enhancer is a development unit too, but the performance is really something.
There's no way to boost the performance of the standard version via the base. It only has a USB-C connector, at most USB 3.0, that's obviously not enough bandwidth.
Alright I've said a lot. Flame on.

Original thread: https://www.reddit.com/r/NintendoSw...he_foxconn_leak_original_text_maybe/?sort=top

Props to Cynix for the translation
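Two of the leaked numbers are easy to sanity-check: the 11,750-minute stress test against the "8 days" figure quoted earlier in the thread, and the 4310 mAh battery expressed in watt-hours (the 3.7 V nominal cell voltage is an assumption, typical for Li-ion):

```python
# Sanity checks on two figures from the translated posts.
days = 11_750 / (60 * 24)   # leaked test duration converted to days
wh = 4310 / 1000 * 3.7      # 3.7 V nominal Li-ion voltage is an assumption

print(f"11,750 minutes = {days:.2f} days")   # lines up with the '8 days' claim
print(f"4310 mAh = {wh:.1f} Wh")
```

The duration working out to just over 8 days is a small internal-consistency point in the leak's favor.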
 

z0m3le

Banned
Sounds like he confirmed it is 16nm for the Switch, well that makes sense considering the clocks he talks about and the lack of throttling, which is mostly a problem with 20nm.
 
1080p screen? 8 GB of RAM?
Sounds very fanfictiony

Read through all of the posts: the 1080p screen was a mistake he addressed, saying the output was 1080p, which we know is the case. And the 8GB of RAM he surmises is the combined RAM of the normal unit and the mysterious devkit unit, meaning 4GB each.

Wait, what's going on?

Is the leaker claiming that there is a dock that enhances the Switch's capabilities?


No, he specifically says that cannot be the case.
 

z0m3le

Banned
1080p screen? 8 GB of RAM?
Sounds very fanfictiony

Try reading the whole thing.
No, he specifically says that cannot be the case.

No, he says there is no dock for it, but this configuration is exactly in line with what we have been saying about a dock with extra processing power. It has a separate GPU as well as the Switch SoC, and a separate pool of RAM (4GB). This is exactly the configuration you'd use for an enhanced dock and nothing else, mainly because it is far cheaper to have just one larger SoC and keep the memory together. This layout is exactly what Thraktor thought it would be.
 

TAS

Member
No way in hell could he make up the exact mAh capacity of the battery and weight of the unit/joycons. This guy is for real.
 

TAS

Member
Indeed. He is too spot on about everything else to completely screw up on the clock speeds. I'm going with Occam's Razor on this one. ;)
 
No, he says there is no dock for it, but this configuration is exactly in line with what we have been saying about a dock with extra processing power. It has a separate GPU as well as the Switch SoC, and a separate pool of RAM (4GB). This is exactly the configuration you'd use for an enhanced dock and nothing else, mainly because it is far cheaper to have just one larger SoC and keep the memory together. This layout is exactly what Thraktor thought it would be.

Doesn't this:

There's no way to boost the performance of the standard version via the base. It only has a USB-C connector, at most USB 3.0, that's obviously not enough bandwidth.

basically mean there is no way to connect the normal Switch to a base with this SoC due to the lack of bandwidth in USB 3.0?

EDIT: Ah, or this future "SCD" dock might have a different type of connector? I guess that could work then.
 

z0m3le

Banned
Doesn't this:



basically mean there is no way to connect the normal Switch to a base with this SoC due to the lack of bandwidth in USB 3.0?

EDIT: Ah, or this future "SCD" dock might have a different type of connector? I guess that could work then.

Not sure if it works like this, but you can clearly see that the USB-C connector on Switch has pins on both sides, so it's USB 3.1 minimum (correct me if I'm wrong), which can use the aux lanes and run a PCIe lane just fine for a dock like he is talking about.
 
Considering it's already in development, I wonder how soon Nintendo could show this SCD off and have a release date. Let's just consider, for this hypothetical situation, that Nintendo actually reveals it at E3/summer time and has it out by holiday 2017 or Q1 2018. Would this hurt Nintendo's image with consumers, or should they wait longer (holiday 2018)? I sure as hell wouldn't mind. Hmm, maybe too early for holiday.

More powerful than Pro but possibly less than Scorpio would make it GameCube all over again. Lol, I want that. ;_;
 

z0m3le

Banned
Considering it's already in development, I wonder how soon Nintendo could show this SCD off and have a release date. Let's just consider, for this hypothetical situation, that Nintendo actually reveals it at E3/summer time and has it out by holiday 2017 or Q1 2018. Would this hurt Nintendo's image with consumers, or should they wait longer (holiday 2018)? I sure as hell wouldn't mind. Hmm, maybe too early for holiday.

More powerful than Pro but possibly less than Scorpio would make it GameCube all over again. Lol, I want that. ;_;

We don't know the clocks, but the chip involved is the GTX 1060 (based on size and shape) if it is only a GPU, meaning it would be about equal to a ~6 TFLOPS AMD GPU like the RX 480.
 

z0m3le

Banned
I haven't followed this thread. A short summary would be nice. thx!

A Foxconn employee leaked a bunch of information about the Switch in November, stuff that was confirmed by Nintendo in January, like the battery capacity of 4310mAh and the weight of the tablet and Joy-Cons. He also saw clocks that haven't been confirmed by Nintendo yet: 1.78GHz CPU and 921MHz GPU, as well as 4GB RAM.

He also spotted an enhanced Switch with a second processing chip the same size and shape as the GTX 1060. This enhanced Switch was produced in a run of 2,000 units and is likely a devkit for a Switch with a dock that houses an external GPU.
 

EloquentM

aka Mannny
I haven't followed this thread. A short summary would be nice. thx!
A Chinese Foxconn worker correctly described aspects of the Switch before it was fully revealed, and hints at there being a dev unit that is stronger than a PS4 Pro, which the posters in this thread are speculating to be the SCD.
 

Rodin

Member
Who was the source that said that if devs want they can use the same clocks in both portable and docked mode?

If that was correct, I wonder if it's possible that 307MHz is simply the standard/recommended/lowest clock available for handheld mode, not necessarily the "fixed" one or the only one available. I mean, a 2D indie and DQ XI don't need the same kind of resources, and it would also explain why the battery life can range from 2.5 hours to 6.5 (not sure if brightness and WiFi alone can make all that difference, but a mix of that and different clock speeds would).
 

Vena

Member
I mean, given how thin the Switch turned out to be, isn't 16nm rather reasonable given it doesn't run hot and manages 3hr+ on the battery with a game like Zelda?
 
Wow, so they could release a special dock unit with a 1060 GPU and an extra 4GB of RAM that would put the Switch over a PS4 Pro?? Holy shit!
 

Thraktor

Member
Yeah I figured it shouldn't be too CPU intensive, it's just something that does take a bit of the CPU away from the main game. And that's a very interesting thought about the revamped physics engine... I wonder how much input Nvidia had with the HD rumble. I would have thought Immersion's licensed software would have most of that work done. Maybe Nvidia just helped merge that with their own NVN API.

I would imagine that what Nvidia could contribute would be simple integration with the game's physics system (so what you feel is calculated based on the physical properties of the objects in game, for example if your sword hits a particular object PhysX could use the speed of the collision and the material properties of both objects to calculate the appropriate signal to be sent to the linear actuators).
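
As a rough illustration of the idea above, here's a minimal sketch of how a physics engine might map a collision to a haptic signal. Every name and constant here is invented for illustration; this is not actual PhysX, Immersion, or NVN API code, just the shape of the calculation being speculated about.

```python
# Hypothetical mapping from a physics collision to an HD rumble signal.
# All function names and constants are invented for illustration.

def haptic_amplitude(impact_speed, hardness_a, hardness_b, max_speed=10.0):
    """Scale actuator amplitude by collision speed and material hardness."""
    # Average the two materials' hardness (0..1 each), clamp the speed term.
    hardness = (hardness_a + hardness_b) / 2.0
    speed_term = min(impact_speed / max_speed, 1.0)
    return hardness * speed_term  # 0.0 = no rumble, 1.0 = full amplitude

def haptic_frequency(hardness, soft_hz=80.0, hard_hz=320.0):
    """Harder materials feel 'sharper', so map hardness to higher frequency."""
    return soft_hz + (hard_hz - soft_hz) * hardness

# Example: a sword (hard, 0.9) hitting a wooden crate (medium, 0.5) at 6 m/s.
amp = haptic_amplitude(6.0, hardness_a=0.9, hardness_b=0.5)   # 0.42
freq = haptic_frequency((0.9 + 0.5) / 2.0)                     # 248 Hz
```

The point is just that the actuator signal falls out of data the physics system already has, which is why integration with PhysX would be a natural fit.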

I guess I would just say that it seems unlikely given what we know, but I guess possible, that the 1.78GHz CPU speed is solely for emulation. It still likely wouldn't explain how you can get 4 A57s to 1.78GHz and a 20nm 2SM Maxwell GPU to 921MHz continuously for 8 days when a Shield TV can't do that without throttling and has a much larger volume, and thus much better heat dissipation.

Add to that the fact that no one from any of the Switch events have felt air coming out of the vents even when docked playing Zelda, and it looks quite possible that they have built this on a 16nm node.

We don't strictly know how many CPU cores were being used during the tests. If they were using Nvidia's Vulkan fish demo, then that's not something designed to stress CPUs, and depending on how the test was configured, it's possible that they were only using a single core.

Even if it's docked-only, I find it pretty unlikely that they'd make clocks available for VC but not native games. The only time they've really done something like that before is relaxing the NX bit enforcement so VC stuff could use JIT compilers, and that was likely more for security than anything else.

It's strange, but this is also a very different device than anything Nintendo's developed before, and balancing portable and docked modes may force them to make very different decisions on things like this than they would have made previously. For example, I have no doubt that they could clock the CPU higher in docked mode than portable without much issue, but they seem to have made the decision to limit them to the same CPU clock to simplify development across the two performance targets.

So someone on Reddit managed to bring a heat vision camera in while trying out the Switch and got a bunch of photos of the device playing different games:

www.imgur.com/gallery/egT5E

Note that at least at one point the person believes the fan was indeed running actively in portable mode.

That's pretty interesting, particularly that you can see the heat being transferred through the joycon rail on the right hand side of the system (which shouldn't be that surprising in retrospect, as it's metal, so will conduct heat much better than the plastic & glass used for the rest of the casing).

Depends on what CPU and what process is being used. If we assume 3 hours is the battery life at the original 1GHz CPU speed and it's a 16nm A72, then upping to 1.7GHz should drop that by about half an hour or even less.

Yeah, it should be about half an hour based on my calculations. It would seem strange to not just allow those clocks in the first place, though, if they were fine with that battery life. If they're concerned with, for example, getting western third-party ports, then increasing the CPU clocks a year or two down the line isn't going to do them much good on that front.
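
For what it's worth, here's a back-of-envelope version of that "about half an hour" figure. Only the 4310mAh battery capacity comes from the leak; the wattage split and the 3.7V nominal cell voltage are assumptions picked purely for illustration.

```python
# Back-of-envelope battery estimate for the CPU clock bump being discussed.
# The wattage figures are assumptions, not measurements.

BATTERY_WH = 4.310 * 3.7   # 4310 mAh at a nominal 3.7 V, ~15.9 Wh
REST_W = 4.0               # screen, GPU, RAM, WiFi, etc. (assumed)
CPU_W_AT_1GHZ = 1.3        # assumed CPU draw at 1.0 GHz

def battery_hours(cpu_w, rest_w=REST_W, battery_wh=BATTERY_WH):
    """Runtime given CPU draw plus everything-else draw."""
    return battery_wh / (cpu_w + rest_w)

# Dynamic power scales roughly linearly with clock at a fixed voltage, so
# 1.78 GHz costs about 1.78x the CPU power (a voltage bump would make the
# hit bigger, so treat this as a best case).
base = battery_hours(CPU_W_AT_1GHZ)            # ~3.0 hours
boosted = battery_hours(CPU_W_AT_1GHZ * 1.78)  # ~2.5 hours
```

With these assumed numbers the drop comes out to roughly half an hour, consistent with the estimate above; a different CPU-to-rest power split would shift it somewhat either way.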

If we assume four CPU cores, with one for the OS and three for games, then 3x A57 at 1GHz would be about half the performance of 6x Jaguar at 1.6GHz (the PS4 CPU), while 3x A72 at 1.7GHz would be about the same as 6x Jaguar at 1.6GHz.

There is the possibility that the Switch CPU has extra cores just for the OS (A53 most likely). That would change things a bit, obviously giving an extra 33% CPU performance for games (4 cores instead of 3 for games).

As far as Wii U goes, that's much harder to say for certain as we have no direct comparison. Though it's safe to say A57 at 1GHz would certainly be faster core for core.
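
The comparison above is really just cores x clock x per-clock throughput. Here's that arithmetic spelled out; the IPC-relative-to-Jaguar factors are rough assumptions chosen to match the post's conclusions, not benchmark results.

```python
# Crude aggregate CPU throughput in "Jaguar-GHz-core" units. The IPC
# ratios vs Jaguar are assumptions for illustration, not benchmarks.

IPC_VS_JAGUAR = {"jaguar": 1.0, "a57": 1.6, "a72": 1.9}  # assumed ratios

def relative_throughput(arch, cores, ghz):
    """Cores x clock x assumed per-clock throughput."""
    return cores * ghz * IPC_VS_JAGUAR[arch]

ps4 = relative_throughput("jaguar", 6, 1.6)  # 9.6 (all six cores)
a57 = relative_throughput("a57", 3, 1.0)     # 4.8, about half of PS4
a72 = relative_throughput("a72", 3, 1.7)     # ~9.7, about on par with PS4

# If A53s handled the OS, freeing a fourth game core is a straight 4/3 bump.
four_core_gain = relative_throughput("a57", 4, 1.0) / a57  # ~1.33
```

This ignores everything that actually matters in practice (memory latency, SIMD width, thread scheduling), but it shows where the "half" and "33%" figures come from.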

I'd be quite surprised if there weren't a couple of A53 or A35 cores being used for the OS. They're small, they sip power, and they're surprisingly capable, particularly when it comes to crypto. Devoting an A57 or A72 to the OS would seem like a waste of die space and power draw in comparison. They would need to move to a HMP model (compared to TX1's clustered switching), but Nvidia have shipped Parker with (presumably) their own HMP interconnect, or Nintendo could have just used one of ARM's CCI interconnects to do the job.

Translation seems to be up.

[snip]

Original thread: https://www.reddit.com/r/NintendoSw...he_foxconn_leak_original_text_maybe/?sort=top

Props to Cynix for the translation

Thanks for posting this. I'm going to do a separate post digging into the details in a bit (this post is long enough as it is).

Doesn't this:



basically mean there is no way to connect the normal Switch to a base with this SoC due to the lack of bandwidth in USB 3.0?

EDIT: Ah, or this future "SCD" dock might have a different type of connector? I guess that could work then.

They don't need a different connector. USB-C is a physical interface that was specifically designed to be able to handle arbitrary communication protocols via the alternate mode specification. Obviously we have things like DisplayPort alt mode (which Switch uses while docked) and Thunderbolt, but manufacturers can also design their own proprietary alt modes in pretty much whatever manner they want, without having to publish them or be certified by USB-IF or anything like that.

There are basically two options for communicating with a dock containing a GPU or some other computational hardware. The first is to simply use the USB protocol, and despite the leaker's claims, there's no reason to believe they would be limited to USB 3.0 speeds. We know that the system uses DisplayPort alt mode, which operates by interleaving a 5Gb/s USB signal with a 5Gb/s DisplayPort signal. This means that the system is limited to 5Gb/s while also outputting video, but it also means that it's capable of signalling at 10Gb/s, meaning full USB 3.1 speeds are entirely possible if the system isn't outputting a video signal as well (which it wouldn't be for a performance-enhancing dock).
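
Mirroring the post's description of the alt-mode split, the lane arithmetic is simple: two high-speed lane pairs, each good for 5Gb/s of SuperSpeed-class signalling.

```python
# Lane arithmetic for the USB-C alt-mode argument above, mirroring the
# post's description of how Switch's DisplayPort alt mode splits the link.

LANE_GBPS = 5   # Gb/s of SuperSpeed-class signalling per high-speed lane pair
LANES = 2       # lane pairs available on the connector

# Docked with video out: one lane pair carries USB, the other DisplayPort.
usb_while_docked = 1 * LANE_GBPS       # 5 Gb/s of data alongside video
dp_while_docked = 1 * LANE_GBPS        # 5 Gb/s of video

# No video signal: both lane pairs can carry data, i.e. USB 3.1-class rates.
usb_without_video = LANES * LANE_GBPS  # 10 Gb/s
```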

The other option is that they define their own alt mode. USB, Thunderbolt, DisplayPort, etc., are all protocols intended for use over cables, perhaps over a distance of 5 metres or more, and require far tighter tolerances on signalling at a given frequency to accommodate signal degradation and interference over those lengths. Short-range protocols (where wire length is measured in centimetres, rather than metres) can hit far higher speeds at much lower cost, and if they wanted to they could co-opt one of these over the USB-C connection as an alt mode. As a simple example, they could run 2 PCIe 3 lanes giving 2GB/s of bandwidth both ways, or an asymmetric M-PHY (4 lanes up, 1 lane down) giving up to 2.3GB/s of data from Switch to dock. If they really wanted to push it, they could even get 5GB/s out of a dual-lane NVLink 1 connection, and as ridiculous as that sounds, it would almost certainly be a lot cheaper and simpler to implement than Thunderbolt. Given that it's purely an interface for use between their own products, they could even adopt a variation of any of these, increasing clock speed or moving to an asymmetric or half-duplex transmission system (which would increase usable bandwidth at a given clock speed over a symmetric full-duplex protocol, given Nintendo's use case is likely to be far from symmetric).
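
The per-lane rates behind those three options work out as follows. The PCIe 3.0 figure is the standard ~985MB/s post-encoding rate; the M-PHY and NVLink 1 per-lane numbers are rough estimates chosen to match the totals quoted above, so treat them as illustrative rather than spec-exact.

```python
# Per-lane rates behind the short-range link options discussed above.
# PCIe 3.0 uses the standard post-encoding figure; the M-PHY and NVLink 1
# per-lane numbers are estimates chosen to match the post's totals.

PCIE3_GBS_PER_LANE = 0.985   # GB/s per PCIe 3.0 lane (128b/130b encoding)
MPHY_GBS_PER_LANE = 0.58     # GB/s per M-PHY lane (assumed HS-Gear3 class)
NVLINK1_GBS_PER_LANE = 2.5   # GB/s per NVLink 1 lane, each direction

pcie_2lane = 2 * PCIE3_GBS_PER_LANE     # ~2 GB/s both ways
mphy_asym_up = 4 * MPHY_GBS_PER_LANE    # ~2.3 GB/s Switch -> dock
nvlink_2lane = 2 * NVLINK1_GBS_PER_LANE # 5 GB/s
```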

I think we're also overestimating the amount of bandwidth required for something like this. The GTX 980 achieves about 75% of its max performance at 1080p while on a PCIe v1 x4 link, which is only 1GB/s of bandwidth. Some games show an even smaller performance differential than this, but in all cases we're talking about Windows games running on DX11. A game built in Vulkan for Switch and decently optimised around the available bandwidth could likely get by on quite a bit less.
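
For reference, the 1GB/s figure for that GTX 980 example comes straight from the PCIe 1.x per-lane rate:

```python
# Where the 1 GB/s figure for a PCIe v1 x4 link comes from: a PCIe 1.x
# lane delivers 250 MB/s of payload after 8b/10b encoding.

PCIE1_MBS_PER_LANE = 250
v1_x4_gbs = 4 * PCIE1_MBS_PER_LANE / 1000   # 1.0 GB/s
```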
 

Malus

Member
Considering its already in development, I wonder how soon Nintendo could show this scd off and have a release date.. Lets just consider for this hypothetical situation that Nintendo actually reveals this in e3/summer time, and has it out by holidays 2017 or q1 2018. Would this hurt Nintendo's image with consumers, or should they wait longer(holiday 2018)? I sure as hell wouldn't mind. Hmm maybe too early for holiday..

More powerful than pro but possibly less than Scorpio would make it gamecube all over. Lol I want that. ;_;

If it's real I'm not expecting this thing any time soon lol.
 

jmizzal

Member
Considering its already in development, I wonder how soon Nintendo could show this scd off and have a release date.. Lets just consider for this hypothetical situation that Nintendo actually reveals this in e3/summer time, and has it out by holidays 2017 or q1 2018. Would this hurt Nintendo's image with consumers, or should they wait longer(holiday 2018)? I sure as hell wouldn't mind. Hmm maybe too early for holiday..

More powerful than pro but possibly less than Scorpio would make it gamecube all over. Lol I want that. ;_;

Just like PS4 Pro and Scorpio, it's an enhancement that you don't have to buy; it would be for the more core gamer.
 
That's... kind of interesting. I'm a bit hesitant to believe their social media folks have access to that kind of information, though. They're likely taking it from the article they linked to.
Fff, I thought A72 was more likely with a 16nm node...

Just like PS4 Pro and Scorpio, it's an enhancement that you don't have to buy; it would be for the more core gamer.
Right. I just hope nobody feels cheated; nobody should. Just wondering if it would be too soon to release it within a year.
I say this because I think the sooner Nintendo releases that SCD, the faster they can get third-party support on board and get more PS4/Xbone owners to pick up a Switch.

So I guess Fall '17 is too soon, but maybe it's better to have it out in Q1 '18 than holiday '18? It would give devs more months to work on Switch ports. Unless dev kits get passed out early, like Q1 2018 for a holiday release. A smoother launch for devs comes to mind.
 
They don't need a different connector. USB-C is a physical interface that was specifically designed to be able to handle arbitrary communication protocols via the alternate mode specification. Obviously we have things like DisplayPort alt mode (which Switch uses while docked) and Thunderbolt, but manufacturers can also design their own proprietary alt modes in pretty much whatever manner they want, without having to publish them or be certified by USB-IF or anything like that.

There are basically two options for communicating with a dock containing a GPU or some other computational hardware. The first is to simply use the USB protocol, and despite the leaker's claims, there's no reason to believe they would be limited to USB 3.0 speeds. We know that the system uses DisplayPort alt mode, which operates by interleaving a 5Gb/s USB signal with a 5Gb/s DisplayPort signal. This means that the system is limited to 5Gb/s while also outputting video, but it also means that it's capable of signalling at 10Gb/s, meaning full USB 3.1 speeds are entirely possible if the system isn't outputting a video signal as well (which it wouldn't be for a performance-enhancing dock).

The other option is that they define their own alt mode. USB, Thunderbolt, DisplayPort, etc., are all protocols intended for use over cables, perhaps over a distance of 5 metres or more, and require far tighter tolerances on signalling at a given frequency to accommodate signal degradation and interference over those lengths. Short-range protocols (where wire length is measured in centimetres, rather than metres) can hit far higher speeds at much lower cost, and if they wanted to they could co-opt one of these over the USB-C connection as an alt mode. As a simple example, they could run 2 PCIe 3 lanes giving 2GB/s of bandwidth both ways, or an asymmetric M-PHY (4 lanes up, 1 lane down) giving up to 2.3GB/s of data from Switch to dock. If they really wanted to push it, they could even get 5GB/s out of a dual-lane NVLink 1 connection, and as ridiculous as that sounds, it would almost certainly be a lot cheaper and simpler to implement than Thunderbolt. Given that it's purely an interface for use between their own products, they could even adopt a variation of any of these, increasing clock speed or moving to an asymmetric or half-duplex transmission system (which would increase usable bandwidth at a given clock speed over a symmetric full-duplex protocol, given Nintendo's use case is likely to be far from symmetric).

I think we're also overestimating the amount of bandwidth required for something like this. The GTX 980 achieves about 75% of its max performance at 1080p while on a PCIe v1 x4 link, which is only 1GB/s of bandwidth. Some games show an even smaller performance differential than this, but in all cases we're talking about Windows games running on DX11. A game built in Vulkan for Switch and decently optimised around the available bandwidth could likely get by on quite a bit less.

That's interesting about the USB C connection and different protocols, I wasn't really aware of that. It makes sense that they didn't really design themselves into a corner in that aspect.

I'm curious about your thoughts regarding the fan being run in handheld mode (reportedly). Also we still haven't had Switch units demoed when not charging if I'm not mistaken, so I wonder what the effects of being on battery power will be.

So has 4 x A57 been confirmed yet? If not, did ARM's Facebook page just confirm it?

I really doubt Nintendo would let them release info like that, so while it might be accurate, I very much doubt whoever posted that would actually know for sure.
 

Karanlos

Member
That's... kind of interesting. I'm a bit hesitant to believe their social media folks have access to that kind of information, though. They're likely taking it from the article they linked to.

They're linking to an Android Pit article, so it doesn't really seem like they know from official channels.
 
That's... kind of interesting. I'm a bit hesitant to believe their social media folks have access to that kind of information, though. They're likely taking it from the article they linked to.

The info originates from the Digital Foundry article, actually. So it's really nothing new, and very likely that as you said, the social media people just pulled info from the linked article.
 

Thraktor

Member
Alright, time to start going over the improved translation. Thanks to cynix over at Reddit for the translation! I'm going to focus on Post 6 for now, just as that's the most interesting part to me:

Today I saw the enhancer. There are still many questions but I'll tell you what I know.
The processor in the enhancer is even bigger than the main unit CPU, at 200mm2.
Dimensions looked like 12x18, on par with GP106.

There are a few interesting things here. Firstly, it's described as an "enhancer", which definitely sounds like an add-on rather than an entirely new device. Secondly, it seems like the original post actually specifically referred to the GP106, which is quite interesting, as that was something I had only speculated on based on the dimensions.

There are 2 extra memory chips. (Pretty sure it adds 4 GB memory, for a total of 8 GB.)

This is actually a little curious. If the chip were in fact a GP106, then any memory attached would be GDDR5. However, GDDR5 isn't available in capacities over 1GB (not even sampling, as far as I'm aware), so it wouldn't be possible to get 4GB from 2 chips. I'm not sure I can explain this one. Perhaps they're actually duplicated on each side of the motherboard (like the first PS4 model) for 4 chips total, at 1GB apiece.

It connects to the back of the main unit motherboard via some sort of PCI bridge.

This here is very interesting, as it's specifically telling us that it attaches to the Switch, whereas the previous translation seemed to imply that it was a single standalone device. The PCI bridge also bodes well for my speculation that they may be using a PCIe-based alt mode over USB-C for the final model.

There are 2 Wi-Fi [antennae], 1 HDMI, 1 Mini-DP, 1 network port, 2 unknown circular ports, 3 network activity indicator lights.
The enhancer is much more complex than the main unit. It also has 6 or 7 unknown storage chips.

This seems somewhat standard for a dev kit. I'm not sure what the extra "unknown storage" chips are, but I wouldn't be surprised if there were some single-purpose ICs on there which would get consolidated together onto one chip for the final model.

The enhanced main unit also looks different than the standard unit. It's got extra bits on the motherboard, including the bridging connector.

This would make sense if it's a dev version of the Switch specifically built to interface with the "enhancer". Particularly if they weren't ready to use their PCIe-based alt mode over USB-C yet.

What I don't know is whether this is a development unit or if it'll be officially announced, and of course I have no idea if it'll go on sale at all.
The enhanced version doesn't have a base yet.
The enhanced version can connect to a TV/display without the base. It also has a power input.

This would all make sense if the final form factor of the enhancer is a stationary device (ie a dock or something along those lines).

I don't know if the processor on the enhanced version includes CPU.
If it doesn't include a CPU, then it can at most upscale existing games to 4K, it won't bring any dramatic improvements. The weak CPU might be the bottleneck.

This is somewhat speculative on the part of the leaker, although if the CPU were actually 1.78GHz I don't see it being much more CPU-limited than PS4 Pro (or possibly even Scorpio). That said, I would agree that it would seem like the main purpose would be to run existing Switch games at higher resolutions. Nintendo wouldn't make any exclusive games for this thing, although they may allow third parties to do so.

If the main processor on the enhanced version only has GPU, then it'll be more powerful than PS4 Pro.

This, combined with the mention of GP106 above, would suggest that the leaker is talking about the possibility that it's actually a GP106. In which case, at full clocks, it would be more powerful than PS4 Pro. Even if Nintendo's using a GP106 in dev kits at the moment, though, I wouldn't expect the final product to be as powerful. It would almost certainly be clocked quite a bit lower, and would also likely have fewer functional SMs.

At the moment our factory floor only plans to make 2,000 enhanced units. (I also think it's probably a development unit. Let's wait for Mr. Ren to give us more info.)
The enhanced version is very powerful, but also weighs more and feels worse in the hands. It's purely to cater for 4K TVs. I'm also anxious to know if it'll go on sale.
The performance will be off the charts with the main processor plus enhancer processor.
I've never seen such a huge 16nm chip on a mobile device, especially because the main processor is already 100mm2.
The main processor is dwarfed by the enhancer processor.

This seems to clarify a point of contention, which is that the new chip is definitely in addition to, rather than replacing, the main Switch SoC.

There's no additional battery on the enhancer. You probably have to plug it in when playing.
The power adapter is also built in.

Again, this points towards the device having a stationary form factor when it's finished.

I've already said, this might be a development unit. I want to tell the whole story, please go back and have a look at my previous posts. (The base only has a USB-C port, it won't handle a huge amount of data transfer.)
[...]
Yeah I think the enhancer is a development unit too, but the performance is really something.
There's no way to boost the performance of the standard version via the base. It only has a USB-C connector, at most USB 3.0, that's obviously not enough bandwidth.

As mentioned above in my reply to Skittzo0413, I don't agree that this is a limitation. For this "enhancer" to work at all it has to do so over USB-C, and there are ample ways for Nintendo to get sufficient bandwidth from the port.

Based on the new translation, it looks like the "enhancer" is an add-on or dock of some sort with the following:


  • A chip that is the same size and shape as a GP106, and likely is just a GP106. I suspect it's probably a stand-in for a custom GPU that's somewhat less powerful.
  • An extra 4GB of dedicated RAM (although the type of RAM is a bit puzzling).
  • This chip and RAM would be used in addition to the Switch SoC and RAM, rather than replacing them.
  • A PCIe-based connection to the Switch. I believe this would likely be implemented as an alt mode over the USB-C connection.
  • No battery, and no screen (which the previous translation implied there was, which caused quite a bit of confusion).
This certainly fits what would be expected from a dev kit for an enhanced dock or SCD type device. It also eliminates a lot of points of confusion from the first translation (particularly the existence of a screen, and clarifying that the new chip is being used in addition to the Switch SoC, rather than replacing it).

The reference to PCIe also reminds me of an interesting find from about a year ago, where Tovarisch found a LinkedIn profile of a Nintendo software engineer working on a "new project" which included "Driver development for PCI/e and DMA devices". It didn't make a whole lot of sense at the time, but a dock or SCD add-on which has its own RAM and communicates via PCIe would definitely need PCIe/DMA drivers.
 