
The Curious Case of the Switch Foxconn Leak (Now a hardware fanfiction thread)

The Hermit

Member
One last thing... I don't know where to put this, but when you think about Switch iterations, what Nintendo has done is ingenious!

Instead of releasing all-new systems like the Xbox One S, PS4 Pro, Xbox Scorpio, or Sony Neo, all Nintendo has to do is sell an upgraded tablet with the new chips/tech inside it.
Much like Nvidia is doing with their Shield.

Also, Nintendo could theoretically bump up the resolution of the tablet as well.

So for 4K performance (to the TV), you'd buy the new Switch tablet for $250 or something along those lines. You'll already have the dock, Joy-Cons and other accessories ready to go!

I now believe that this is the way Nintendo will "push out" upgrades to Switch. They'll sell different Switch tablet SKUs with performance enhancements, i.e. new chips.

Thoughts anyone?

I think this will be Nintendo's final console (not their last, mind you).

It's basically a modular console, always evolving. And it's great for printing money, since they won't be stuck launching the whole package every 4-5 years.

I hate imagining all the possibilities, because more often than not Nintendo won't realize them.

But they are there, that's for sure.

To be on topic, yeah I think this leak is as legit as the 4chan one. In two months someone will open that thing, but it's good to know it has some power under the hood.

The conference actually made me believe it was weaker than I expected, especially when EA announced the last-gen port of FIFA.
 
Yeah, if the SCD happens, it would likely just be a GPU with its own VRAM and a hard drive for HD texture patches.

You'd expect 4x to 5x the power of the docked Switch, so around 2TFLOPS, though Switch could have a 1080p upgrade before that, and Switch would run at full clock with new games.

Basically you'd have a 393GFLOPS or 472GFLOPS Switch at all times, displaying 720p on the go and docked; then the new one would be 2.5x faster, or 1180GFLOPS, and they would then make the SCD a 4K dock with 5TFLOPS or so of performance.

This is all hypothetical
What about normal RAM though? VRAM can't make up for 4GB of RAM. How likely is it that we will get regular RAM with the SCD?
 

Hoo-doo

Banned
Yeah, if the SCD happens, it would likely just be a GPU with its own VRAM and a hard drive for HD texture patches.

You'd expect 4x to 5x the power of the docked Switch, so around 2TFLOPS, though Switch could have a 1080p upgrade before that, and Switch would run at full clock with new games.

Basically you'd have a 393GFLOPS or 472GFLOPS Switch at all times, displaying 720p on the go and docked; then the new one would be 2.5x faster, or 1180GFLOPS, and they would then make the SCD a 4K dock with 5TFLOPS or so of performance.

This is all hypothetical

Thing is, you are known for doubling and tripling down on these hypotheticals of yours, and people gobble this stuff up.
A month ago you were convinced the Switch would be rocking 600GFLOPS of undocked processing power, because it had to be Pascal per your own calculations. Can you at least acknowledge that you often miss the mark on these hardware speculations?

I truly admire your effort in coming up with these theories, and I can already hear people salivating over your 'hypothetical' 4K/5TFLOP Switch.
But man, at some point you need to realize that Nintendo's goals and roadmap lie so far apart from your personal desires for ultimate power that it really stops making sense to keep coming up with this stuff.
 

Mutagenic

Permanent Junior Member
Thing is, you are known for doubling and tripling down on these hypotheticals of yours, and people gobble this stuff up.
A month ago you were convinced the Switch would be rocking 600GFLOPS of undocked processing power, because it had to be Pascal per your own calculations. Can you at least acknowledge that you often miss the mark on these hardware speculations?

I truly admire your effort in coming up with these theories, and I can already hear people salivating over your 'hypothetical' 4K/5TFLOP Switch.
But man, at some point you need to realize that Nintendo's goals and roadmap lie so far apart from your personal desires for ultimate power that it really stops making sense to keep coming up with this stuff.
Amen. Stop chasing the numbers. Nintendo isn't.
 

Donnie

Member
https://browser.primatelabs.com/v4/cpu/compare/1482641?baseline=1610738

A57 1.40 GHz
1 processor, 8 cores

AMD A6-6310 @ 1.80 GHz
1 processor, 4 cores

Single-threaded comparison adjusted for clocks is in Jaguar's favor. By a lot, and not just AES or something skewing a result.

Those are A53 cores; the phone is called the Oppo A57, but it uses a Snapdragon 435, which is an 8-core A53 part.

This one is A57 cores and shows a very different picture:

https://browser.primatelabs.com/v4/cpu/compare/1398129?baseline=1396658
 

Instro

Member
It's still significantly weaker than a PS4 or Xbox One even docked; the games will speak for themselves (or don't).

Well on the GPU side anyway, although a 20% improvement is still quite nice. The CPU improvement would be pretty significant though.
 

LordOfChaos

Member
Those are A53 cores; the phone is called the Oppo A57, but it uses a Snapdragon 435, which is an 8-core A53 part.

This one is A57 cores and shows a very different picture:

https://browser.primatelabs.com/v4/cpu/compare/1398129?baseline=1396658

Oh my god, my bad, haha. Just name a phone Snapdragon 820 next time.

Though, with them clocked nearly the same there and the Cortex showing about a 30% lead, I'd still say "a Cortex-A57 at 1GHz beats Jaguar at 1.6GHz" is a touch off, since the Jaguar would have a big clock advantage at 1.6 vs 1, the Cortex running at only ~62% of the Jaguar's clock. It's much closer than I thought, though.
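To make the clock-for-clock arithmetic explicit, here's a quick Python sketch. The ~30% per-clock lead for the A57 is just read off the Geekbench comparison above, so treat it as a rough input rather than gospel:

Code:
# Per-clock comparison using the ~30% Geekbench lead from the link above.
A57_PER_CLOCK = 1.30       # A57 single-core perf per GHz, with Jaguar = 1.00
JAGUAR_PER_CLOCK = 1.00
a57_at_1ghz = A57_PER_CLOCK * 1.0          # rumoured Switch CPU clock
jaguar_at_1_6ghz = JAGUAR_PER_CLOCK * 1.6  # PS4/XB1 Jaguar clock
print(jaguar_at_1_6ghz / a57_at_1ghz)      # ~1.23: Jaguar ~23% ahead per core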
 

Donnie

Member
Well on the GPU side anyway, although a 20% improvement is still quite nice. The CPU improvement would be pretty significant though.

Even at the assumed 768MHz it's not as comparatively weak as some seem to think, given some time and effort by devs to optimise for Tegra's features (FP16, for one). It's never going to be Xbox One level at that speed, but it's not as bad as some make out.

Oh my god, my bad, haha. Just name a phone Snapdragon 820 next time.

Yeah, it took me a minute of looking at the results and thinking "eh, that doesn't seem to make sense...". From the other benchmarks I've seen, at 1GHz the A57 seems close to Jaguar at 1.6GHz. Jaguar has the edge overall at that speed, but they do trade blows.
 

MCN

Banned
SCD? I've seen a few people throw that around and I'm not sure what it means, haha.

EDIT: Nevermind, Switch Compute Dock I'm guessing

It means Supplementary Computing Device, a patent that Nintendo has had granted that allows expanding the power of a console with an external add-on. I think it can also contribute to cloud computing when not being used in a game.
 

z0m3le

Banned
I think it doesn't make much sense for the Switch to just be an X1, I'll say that much; 20nm is pretty much a dead end, and unlike with Wii U, they aren't forced into a process by the tech they want to use. The launch is something like six weeks away, and while I've bought mine, you won't see me with a razor and a microscope trying to unlock this mystery. But I'll say that it should be fairly easy to tell once someone does do that.

The other thing I'd like to mention is that people need to cool it in terms of this being some huge improvement. The GPU boost is mostly trivial, and the CPU clock wouldn't stop ports that developers wanted on the thing either way, but Eurogamer's clocks would make Michael Pachter's developer claim make much less sense than a CPU that is on par with current gen. The leak nailed so many things correctly, even with these clocks, that it is near impossible for these clocks to just be a guess from some random Foxconn employee.

This leak came out before Eurogamer's, by the way, and the clocks relate to Eurogamer's in terms of multiplier and power consumption; to do that intentionally would require a time machine. I understand everyone wanting to jump down the Switch's throat right now, and the concern trolls always seem to hover, ready to attack. The reality is that this rumor doesn't change the performance in any drastic way, it's more academic than anything, but it does put the "known" Switch clocks into question, because a device with these clocks was being produced at 20,000 units a day according to the leaker.

SCD? I've seen a few people throw that around and I'm not sure what it means, haha.

EDIT: Nevermind, Switch Compute Dock I'm guessing
basically... Here you go:
http://www.neogaf.com/forum/showthread.php?t=1246281

What about normal RAM though? VRAM can't make up for 4GB of RAM. How likely is it that we will get regular RAM with the SCD?

The SCD would use the 4 or 8GB it had as VRAM, and the 3.2GB left over on the Switch itself would be used entirely as normal RAM.

Thing is, you are known for doubling and tripling down on these hypotheticals of yours, and people gobble this stuff up.
A month ago you were convinced the Switch would be rocking 600GFLOPS of undocked processing power, because it had to be Pascal per your own calculations. Can you at least acknowledge that you often miss the mark on these hardware speculations?

I truly admire your effort in coming up with these theories, and I can already hear people salivating over your 'hypothetical' 4K/5TFLOP Switch.
But man, at some point you need to realize that Nintendo's goals and roadmap lie so far apart from your personal desires for ultimate power that it really stops making sense to keep coming up with this stuff.

It was 50GFLOPS north or south of 500GFLOPS for the portable, and it was based on the fan. That would put it at 450GFLOPS, and it ended up at 393 with Eurogamer's clocks, which is not actually a big difference. Even if the Foxconn clocks are the final clocks, you are looking at 472GFLOPS. You are splitting hairs over 20% and looking for a witch hunt, IMO.

The SCD speculation is simply based on the ratios Nintendo gave us with the current Switch: 307.2MHz (720p) to 768MHz is a 2.5x clock increase for 2.25x more pixels (1080p). An SCD would increase the resolution by 4x; Nintendo went about 10% higher than the pixel ratio before, so 4.4x would fit this configuration and give them a 4K dock as the SCD.

Eurogamer's specs: 157GFLOPS at 720p (portable), 393GFLOPS at 1080p (docked), 1729GFLOPS (4K dock, based on what they did to reach the docked specs)

Foxconn's specs: 189GFLOPS at 720p (portable), 472GFLOPS at 1080p (docked), 2076GFLOPS (4K dock)
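For anyone who wants to check the arithmetic, all of these figures fall out of the usual FP32 FLOPS formula (cores x 2 FLOPs per cycle x clock), assuming the TX1's 256 CUDA cores and Eurogamer's 40% portable/docked clock ratio. A quick Python sketch:

Code:
# Sketch of the GFLOPS figures above, assuming a TX1-style GPU
# (256 CUDA cores, 2 FP32 FLOPs per core per cycle).
CORES, FLOPS_PER_CYCLE = 256, 2
def gflops(mhz):
    return CORES * FLOPS_PER_CYCLE * mhz / 1000
for label, docked_mhz in [("Eurogamer", 768.0), ("Foxconn", 921.6)]:
    portable = gflops(docked_mhz * 0.4)  # portable = 40% of docked clock
    docked = gflops(docked_mhz)
    dock_4k = docked * 4.4               # the 4.4x SCD ratio speculated above
    print(f"{label}: {portable:.0f} / {docked:.0f} / {dock_4k:.0f} GFLOPS")
# Eurogamer: 157 / 393 / 1730 GFLOPS
# Foxconn:   189 / 472 / 2076 GFLOPS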

The difference here is very tiny. My original speculation was based on the crazier hypothetical that when a "NEW" Switch comes out, Nintendo raises the original's portable clock to the docked speed, since all the cooling is in the tablet and only a battery life issue prevents them from doing so now.

When this hypothetical product comes out, it would be a 1080p device with 2.5x the performance of the original Switch: with Eurogamer's specs, that's 983GFLOPS; with Foxconn's, it's 1180GFLOPS. Again, not a huge jump. The 4K dock would then need to be about 5TFLOPS.

Edit: As for my own personal goals for power, I own a PC with a GTX 1060 and a 4.4GHz 8-core CPU. I'm fine; it's just fun to mess around with numbers and deal with hypotheticals. Feel free to ignore me, as they seem to bother you so much.
 
The other thing I'd like to mention is that people need to cool it in terms of this being some huge improvement. The GPU boost is mostly trivial, and the CPU clock wouldn't stop ports that developers wanted on the thing either way, but Eurogamer's clocks would make Michael Pachter's developer claim make much less sense than a CPU that is on par with current gen. The leak nailed so many things correctly, even with these clocks, that it is near impossible for these clocks to just be a guess from some random Foxconn employee.

This leak came out before Eurogamer's, by the way, and the clocks relate to Eurogamer's in terms of multiplier and power consumption; to do that intentionally would require a time machine. I understand everyone wanting to jump down the Switch's throat right now, and the concern trolls always seem to hover, ready to attack. The reality is that this rumor doesn't change the performance in any drastic way, it's more academic than anything, but it does put the "known" Switch clocks into question, because a device with these clocks was being produced at 20,000 units a day according to the leaker.

I agree that the clocks seem like they couldn't be random guesses; the leaker would likely have seen these clock speeds when testing the unit.

However, that doesn't mean these are the clock speeds available to developers. It could be the case that DF's info is outdated, but that info was presented as "final" to developers, so it would be strange for it to change.

I think the biggest improvement we could be getting from these specs is the CPU cores being A72/A73, and I'm curious if the Foxconn leaker actually determined that somehow for sure, or if he's just guessing. That would likely be the biggest nugget of info to come out of this if it's true.
 

Thraktor

Member
Given the accuracy of the battery capacity, among other things, it would certainly seem like the leaker did have some kind of access to Switch hardware. This doesn't necessarily mean that everything they said is right, but it's certainly interesting to run through the reddit post and look at the implications if the entire post is accurate (or as accurate as the leaker could be, given what he knew).


  • "10 x 10 core" - Assuming measurements in millimetres, this tracks pretty close to the TX1 (which is 11mm x 11mm). This would indicate that the SoC is either 20nm, 16nm or 14nm (all have similar density), as a 28nm SoC would be noticeably larger. It would also seem to rule out any additional SMs over the TX1 (the Switch SoC being smaller could be from a simplified video codec block, as Switch doesn't need 4K h.265 encoding/decoding).
  • CPU: 1785 MHz - As I previously mentioned, I don't see any reason for Eurogamer's clock speeds to be wrong for games, but it's entirely possible that the CPU can clock up higher while running the OS, or for emulation.
  • GPU: 921 MHz - Again, I don't see the in-game GPU clocks being wrong, so they could just be stress-testing at a higher clock to be extra-cautious.
  • Speculated 2x RAM = 4GB - This is kind of interesting, as it implies that there are two memory modules. LPDDR4 modules are typically available at either 32-bit or 64-bit bus width per module, and although the Shield TV uses two 32-bit modules for a total 64-bit bus, this almost never happens in a tablet, as space is much more limited and a single 64-bit chip could be used instead. The only tablet I can think of which uses two LPDDR4 modules is the iPad Pro, which uses two 64-bit modules, as that's the only way to get a 128-bit bus. This doesn't really settle anything one way or the other, but it does leave the option open for a 128-bit memory bus (see the quick bandwidth sketch below).
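As a rough illustration of why the bus width matters: peak LPDDR4 bandwidth is just data rate times bus width. The 3200MT/s rate below is my assumption (it's what the stock TX1 uses), not a confirmed Switch spec:

Code:
# Peak LPDDR4 bandwidth for the two bus-width scenarios above.
def lpddr4_bandwidth_gbs(mt_per_s, bus_width_bits):
    return mt_per_s * bus_width_bits / 8 / 1000  # GB/s
print(lpddr4_bandwidth_gbs(3200, 64))   # 25.6 GB/s - single 64-bit channel
print(lpddr4_bandwidth_gbs(3200, 128))  # 51.2 GB/s - two 64-bit modules, iPad Pro style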
The bit regarding the "dev kit" is perhaps the most interesting, though, if it's true (which is again a big if). Nintendo's dev kits are usually made by Intelligent Systems (it's actually what the company was founded to do; Fire Emblem was just a side project originally), but I suppose it's theoretically possible that they've outsourced that to Foxconn instead. If it is true, we're looking at the following:


  • Producing 2000x units for now - This sounds like the kind of numbers you'd expect for a dev-kit. I don't know whether you'd expect a batch of 2000 units for an early dev kit or a late dev kit, though.
  • The core is 1x times bigger than the one above, 200m㎡, looking it looks like 12x18 - This is obviously quite a bit larger than above, and pretty much exactly the size of GP106 (the GPU die in the GTX 1060). If it's an SoC, then there also has to be space in there for the CPU, etc., but it would still be quite a bit more powerful than the regular Switch (perhaps 4x as powerful, for 4K games?). If it's not an SoC, but rather a separate GPU, then it should be a similar config to the GTX 1060, but probably with a lower clock.
  • Extra ram, this version is 8GB - Makes sense for a more powerful device.
  • No dock for this version for now. Can be plugged into TV without docking, power is inside - Of course it doesn't need the dock, if it is the dock ;)
  • Speculated provided the core is only include GPU, it would be even more powerful than PS4 pro - If it actually was the GP106 (or similar), then yeah, it would potentially be more powerful than the PS4 Pro, although I'd imagine that they would clock it down a bit.
  • Screen is the same size as the normal one
  • It's much more powerful, but also much heavier, not feeling great in hand, speculated for 4K gaming - This could be because it's early hardware, or it's just a dev kit, but it could also be because it's not actually a portable device.
  • Haven't seen such a huge core, and it's 16nm + 100mm2 main core - This is the bit to focus on right here. What he's saying is that this "dev kit" has both the standard 100mm² SoC used in the regular Switch and a separate 200mm² chip. This means the 200mm² chip isn't a replacement for the Switch's SoC, it's an addition to it. That is, it's a GPU.
  • There's no battery inside this version - No need for one of these, if this is what I assume it is.
So, I'm about to use a phrase that I've tried to avoid using for the past year or so since it came into GAF's vernacular, because I've consistently believed that, even though it's technically feasible, there's no business case for it. I still don't see the business case for it, but frankly it fits the supposed leak above too well for me to ignore it, so here goes:

SCD

What he's describing above matches pretty much exactly what we would expect from a "supplementary computing device", or SCD, which Nintendo patented a few years ago. The SCD would be an add-on unit to Switch (let's say a special version of the dock) which has extra computational hardware inside (i.e. a GPU).

In this particular case what we'd be looking at is a dev kit that is designed to provide the functionality of both devices combined (as it's simpler to make than a dev kit with actual detachable parts). Hence why both the Switch SoC and the new GPU are in the same unit, and why there's a screen (same size as Switch's) and why it's so heavy and doesn't include a battery. For the actual SCD, though, it would likely be a dock with the GPU and RAM, but not the Switch's SoC or the screen.

Regarding the specifics of the leak, when I said the dev kit's chip was "pretty much exactly the size of GP106", I wasn't kidding. According to this (which is the best source I could find), the GP106 measures approximately 11.67mm x 17.13mm, which is extremely close in both size and shape to the leaker's claim that it "looks like" 12mm x 18mm. It's worth noting that the die size of GP106 would have been common knowledge when this was posted (so if the leaker is screwing with us they could have looked this up), but if they're not screwing with us then it seems too close to be a coincidence.
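The area arithmetic, for reference (the GP106 measurements are from the source linked above; the leaker's own "200m㎡" figure falls right between the two):

Code:
# Die-size comparison from the paragraph above.
gp106 = 11.67 * 17.13   # reported GP106 dimensions, mm
leak = 12 * 18          # the leaker's "looks like 12x18"
print(f"GP106: {gp106:.1f} mm^2, leak: {leak} mm^2")  # ~199.9 vs 216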

As I see it, if the leak is true, then the dev kit is one of the following:


  • A dev kit for a Switch 2 portable/hybrid which has a GP106-class GPU (and presumably a 10-15 minute battery life)
  • An early "SCD" dock dev kit which is using GP106 as a stand-in for a custom GPU being developed (which could be less or more powerful than GP106)
  • A near-final "SCD" dock dev kit which is using a custom GPU which is very similar to GP106
  • An "SCD" dock dev kit, where they're just using the GP106 itself in the final product
I don't think there's even the remotest hope of Nintendo releasing a new portable Switch with a GTX1060 for a GPU in the next couple of years, so I think we can rule out the first option pretty safely.

Then it comes down to an SCD GPU add-on (most logically in the form of a dock) with a GP106 GPU or something similar. Even with an SM or two disabled and clocks pulled down a bit from GTX1060, that would still be an obscenely powerful GPU by Nintendo's standards, putting them roughly competitive with the PS4 Pro. It doesn't seem like the kind of thing Nintendo would do, but Nintendo is anything but predictable these days, and given their SCD patent and the fact that the leaker has got other things right, I couldn't really rule it out, either.

It does seem puzzling that they would go with something so powerful, though. If they just wanted to play Switch games in 4K, then a ~1.6TFLOPS GPU would do the job, whereas this would potentially be twice that. A 1080p/60fps Switch game like MK8 would in theory be able to hit properly native 8K (at 30fps) on that kind of hardware, let alone 4K. It's possible that it's their play to give people an option that's competitive with Sony and MS on power to get western third parties on board, but if the games have to run on the regular Switch as well, then the potentially 20-fold power difference between portable Switch and docked super-GPU would be tough to manage without sub-sub-HD resolutions in portable mode. Alternatively, they could let AAA devs design games that only work with the SCD, but then you're making things more complicated for consumers.
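To put rough numbers on those claims, assuming GPU cost scales linearly with pixels per second (a simplification, but fine for this purpose):

Code:
# Scaling checks for the paragraph above, using Eurogamer's figures.
docked, portable = 0.393, 0.157   # TFLOPS
print(docked * 4)                 # ~1.57 TFLOPS: 1080p -> 4K, same framerate
print(docked * 16 / 2)            # ~3.14 TFLOPS: 1080p60 -> 8K30 (16x pixels, half fps)
print(docked * 16 / 2 / portable) # ~20x: the "20-fold" portable-to-super-GPU gap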

Back to my initial issues with SCDs, though, which are that they don't make a whole lot of sense from a business perspective. Console add-ons have a notoriously bad track record, because by their very nature they're only going to be able to sell to a limited audience. If you build a new console, you can potentially sell that to anyone who likes video games, a total audience in the hundreds of millions. If, however, you build an add-on, you can only sell that to people who already own your existing system, which is likely cutting your potential audience down by a factor of ten or more. This is why PS4 Pro and Scorpio are new, standalone devices rather than add-ons to the PS4 and XBO, because if you have the choice of selling to an audience of hundreds of millions rather than tens of millions that's what you do.

The console add-on is also a route that is almost guaranteed to get pretty poor support from developers because, once again, the audience is smaller. If you can target your game to everyone who owns a Switch, or just to the people who own the SCD, the former group is guaranteed to be bigger. I suppose if most of Switch's big games will be made by Nintendo themselves this isn't as big of an issue, but it still means Nintendo devoting resources to accommodate a small subset of their total user base.

It would make more sense to me (although who knows what makes sense to Nintendo) for the SCD not to be a dock per se, but rather a standalone console which also operates as a dock. That is, the SCD includes both the Switch SoC and the new GPU, and comes bundled with a Pro Controller (or pair of joycons). When you turn on the SCD without a Switch docked, it boots up on its internal SoC and operates pretty much as a standard console. If you turn it on with a Switch docked, though, it boots up on the docked Switch's SoC, allowing the Switch to be undocked and the game to continue on the Switch. This wouldn't be that much more expensive than a dock, but would have a substantially larger audience. It would also make more sense, both to developers and to customers, for there to be games which are SCD-only than if it were just a Switch accessory.

There are a few other interesting things to think about if Nintendo went this route:


  • Devs could use the Switch's SoC GPU for compute or other tasks while docked with the SCD
  • Alternatively, Nintendo could disable the Switch GPU while docked with the SCD to allow the CPU to clock up higher (say to 1.78GHz)
  • Nintendo's SCD patent talks about the SCD and the gaming hardware (i.e. Switch) communicating over wired or wireless networks, with the SCD performing computational work for Switch. In a simple case, this could be implemented like Wii U, i.e. if you're using Switch as a portable the SCD could stream the game to you. Alternatively, there are potentially more sophisticated uses (even over longer range) where the SCD would perform background tasks while Switch still renders the final frame.
I should end by once again emphasising that this is purely speculation based on supposed leaks from a source which, although right about Switch, may be completely wrong about this device. That said, as they were right about Switch, it is worth discussing, if only for how damn crazy it would be.

Edit: TL;DR: If the leak is accurate (which is a big if), then it looks like Nintendo is making dev kits using Nvidia's GTX 1060 GPU. It seems likely that the final product would be a special dock for Switch allowing it to play games with this more powerful GPU. Although it would be clocked lower than the GTX 1060, it would still be very powerful, potentially competitive with PS4 Pro.

This is a stretch. At the same clocks, Jaguar and A72 will trade blows in micro-benchmarks. That's a bit higher clocked (10%) than the base PS4, but the PS4 has 7 cores (one partially) available to a game. Four A72s aren't going to match that, except per-core.

Now, 8 A72s would be amazing and I would agree those would match and exceed the PS4, but that's my dreamland.

Gaming-ish benchmarks tend to favour the A72, but it would vary on a case-by-case basis. If it were a quad-A72 at 1.78GHz where all four cores were fully usable by games, then I'd say it'd be in the ballpark of the PS4 (perhaps ahead in some cases), but if one or more A72s were used up by the OS then it would fall behind.

In any case, as I said above, I don't expect anything higher than 1GHz in games.

I think it would be best to wait for clarification from Eurogamer or Digital Foundry about this potential information, or better yet, calculations from Thraktor!

There are so many pieces of power-consuming hardware in Switch it's difficult to calculate too much. Things like the screen, the (various) wireless connections, the hd rumble, etc, will all consume power at varying rates, and unless you can nail down each of them precisely any estimate of what's going on in the SoC is going to have a large margin of error.

There is one thing that's easy to calculate, though: 5.3W

That's the typical power draw of the full system while playing Zelda in portable mode (based on a 4310 mAh battery and a 3 hour battery life).
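For anyone who wants to reproduce that figure, it's just the battery's watt-hours divided by the quoted battery life, assuming the usual 3.7V nominal voltage for a single-cell Li-ion pack (Nintendo hasn't confirmed the cell voltage):

Code:
# The 5.3W figure, reconstructed.
capacity_mah, nominal_v, battery_life_h = 4310, 3.7, 3.0
watt_hours = capacity_mah / 1000 * nominal_v  # ~15.9 Wh
print(watt_hours / battery_life_h)            # ~5.3 W average system draw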
 

Cerium

Member
Great Post As Usual
 
There are so many pieces of power-consuming hardware in Switch it's difficult to calculate too much. Things like the screen, the (various) wireless connections, the hd rumble, etc, will all consume power at varying rates, and unless you can nail down each of them precisely any estimate of what's going on in the SoC is going to have a large margin of error.

There is one thing that's easy to calculate, though: 5.3W

That's the typical power draw of the full system while playing Zelda in portable mode (based on a 4310 mAh battery and a 3 hour battery life).

Well, can't we at the very least compare the power consumption of the DF core configurations and clocks with the Foxconn leak's core configurations and clocks? Like, 4 A57s at 1GHz vs 4 A73s at 1.78GHz (on 16nm), and the same for the GPU? I think if those power consumption figures match decently, then we could say that the final hardware was improved over the info DF got, likely due to a die shrink to 16nm.

If the wattage numbers don't match at all, then it's less likely this is true for the final hardware.

Also, I love the SCD speculation, very interesting!
 

Thraktor

Member

Yeah, it's a bit of a wall of text after re-reading it. Here's a simplified version:

If the leak is accurate (which is a big if), then it looks like Nintendo is making dev kits using Nvidia's GTX 1060 GPU. It seems likely that the final product would be a special dock for Switch allowing it to play games with this more powerful GPU. Although it would be clocked lower than the GTX 1060, it would still be very powerful, potentially competitive with PS4 Pro.
 
They are over 4 months old at this point, as they admitted themselves.

I never knew that. It wouldn't surprise me if Nintendo were playing it safe during the summer, like they did with Wii U, and slowly increased the clocks until they hit what they felt was a good balance between battery life and performance.

Mario and Splatoon 2 running at 720p is extremely strange. Could it be like PS4 Pro and Scorpio, where individual developers are given the choice of better visuals instead of increased resolution? The Mario team, for instance, might have valued larger areas and more graphical effects at 720p over fewer effects at 900p docked.

I hope someone tears this thing down at launch for specific clocks.
 

Eolz

Member
Yeah, it's a bit of a wall of text after re-reading it. Here's a simplified version:

If the leak is accurate (which is a big if), then it looks like Nintendo is making dev kits using Nvidia's GTX 1060 GPU. It seems likely that the final product would be a special dock for Switch allowing it to play games with this more powerful GPU. Although it would be clocked lower than the GTX 1060, it would still be very powerful, potentially competitive with PS4 Pro.

Wait, what?
I understood most of the leak, but why would it be a 1060 in the devkit, instead of a 950 or similar for example, which seems more realistic?
 

Mr Swine

Banned
Wait, what?
I understood most of the leak, but why would it be a 1060 in the devkit, instead of a 950 or similar for example, which seems more realistic?

The 950 is an old mid-range GPU, and a 1060 rivals last gen's high-end GPU, the GTX 980.

Also, how in the world will Switch work with an SCD when it doesn't have a port?
 

Thraktor

Member
Wait, what?
I understood most of the leak, but why would it be a 1060 in the devkit, instead of a 950 or similar for example, which seems more realistic?

I don't know why Nintendo would go this route, but the leak describes a devkit that includes both the standard Switch SoC and a separate 16nm chip which measures about 12mm x 18mm. These measurements line up pretty much exactly with the size of the GP106 (used in GTX1060).

The 950 is an old mid-range GPU, and a 1060 rivals last gen's high-end GPU, the GTX 980.

Also, how in the world will Switch work with an SCD when it doesn't have a port?

You mean like the USB-C port at the bottom? Assuming Nintendo were planning this while designing Switch (and the patent is from 2014), then it wouldn't be an issue.
 

Mr Swine

Banned
I don't know why Nintendo would go this route, but the leak describes a devkit that includes both the standard Switch SoC and a separate 16nm chip which measures about 12mm x 18mm. These measurements line up pretty much exactly with the size of the GP106 (used in GTX1060).



You mean like the USB-C port at the bottom? Assuming Nintendo were planning this while designing Switch (and the patent is from 2014), then it wouldn't be an issue.

I have a hard time believing that USB Type-C has enough bandwidth to even handle the 1060. I think even Thunderbolt 3 is a bit too bandwidth-constrained to max out any high-end GPU from the 970 upward.
 

Osiris

I permanently banned my 6 year old daughter from using the PS4 for mistakenly sending grief reports as it's too hard to watch or talk to her
I have a hard time believing that USB Type-C has enough bandwidth to even handle the 1060. I think even Thunderbolt 3 is a bit too bandwidth-constrained to max out any high-end GPU from the 970 upward.

Thraktor has entirely ignored the bandwidth required for the HDMI signal to pass over the USB-C connection too, which at 4.46Gbps for 1080p@60Hz is pretty much most of the available bandwidth.

The leak is a crank; holding onto it is WUST II: Electric Boogaloo levels of delusion, and entertaining it just confuses, unless that is his and others' aim in entertaining it, of course.
 
The leak is a crank; holding onto it is WUST II: Electric Boogaloo levels of delusion, and entertaining it just confuses, unless that is his and others' aim in entertaining it, of course.

Again, outside of the SCD speculation, none of this is all that big of a change from the DF specs. The improved CPU would be the real story and nothing about it seems all that unbelievable. Especially when DF themselves stated they didn't know what the CPU cores would be.
 

prag16

Banned
I have a hard time believing that USB Type-C has enough bandwidth to even handle the 1060. I think even Thunderbolt 3 is a bit too bandwidth-constrained to max out any high-end GPU from the 970 upward.

Not that I believe the leak, and maybe I'm misunderstanding the argument, but if this theoretical 1060 is in the dock, why do the images going to the TV ever have to pass over USB-C? The USB-C bandwidth should be enough for the CPU to 'direct traffic', so to speak, while the GPU in the dock sends graphics to the screen over HDMI, no?
 

IC5

Member
They were producing 20k of these a day; that is a lot of devices to scrap...

Also the GPU performance is only 20% greater, nothing that is going to transform what we knew about the graphics capabilities.

The CPU, on the other hand, would move from being about equal to 4 PS4 CPU cores to faster than all 8 PS4 CPU cores. I'd like to add that this makes sense with Michael Pachter's dev info, where he was told by a developer working with Switch that it is the easiest of the 3 to develop for. That would lead me to believe that it has more CPU power in fewer cores, since that is the biggest headache with hardware.



He saw the clocks on the screen.

He also got the weight right, so he physically touched and weighed them too.
20% would be huge.

From a consumer standpoint, that would be like spending $150-$200 extra on your PC graphics card.
 

Eolz

Member
I don't know why Nintendo would go this route, but the leak describes a devkit that includes both the standard Switch SoC and a separate 16nm chip which measures about 12mm x 18mm. These measurements line up pretty much exactly with the size of the GP106 (used in GTX1060).

Ohhh, OK, thanks. Looks like I skipped that part, haha.
It seems plausible that they would go with recent hardware in their devkits, but a 1060 is a weird choice. Interesting.
 

Reallink

Member
Impressive, you guys only took 4 days to reach Misterxmedia levels of batshit. Now there's a 4K dock with a 1060; is it in the power brick?
 
Have you guys entertained the possibility that Switch (much like PlayStation and Xbox) is now on a "platform as a service" model, with hardware abstraction layers good enough to accommodate multiple hardware sets?

And that somewhere down the line, at some point, a Switch Pro or Switch 2 or what have you will be available for purchase, similar to a New 3DS or PS4 Pro or Xbox Scorpio, but much more useful?
 

Pokemaniac

Member
Thraktor has entirely ignored the bandwidth required for the HDMI signal to pass over the USB-C connection too, which at 4.46Gbps for 1080p@60Hz is pretty much most of the available bandwidth.

The leak is a crank; holding onto it is WUST II: Electric Boogaloo levels of delusion, and entertaining it just confuses, unless that is his and others' aim in entertaining it, of course.

It's highly unlikely they'd have both a video signal and a GPU using the USB port at the same time. Using the latter would generally mean the former is unnecessary, since the video signal would probably be produced by the external GPU, not the one in the Switch.
 

sits

Member
The previous Switch thread was 185 pages of don't-go-WUST-again expectation-tempering.

Don't do this to me GAF.
 

Theonik

Member
I have a hard time believing that USB Type-C has enough bandwidth to even handle the 1060. I think even Thunderbolt 3 is a bit too bandwidth-constrained to max out any high-end GPU from the 970 upward.
USB-C uses the same connector as Thunderbolt 3, so that's not impossible; these solutions are bespoke anyway, so it doesn't matter. If Nintendo wants, they can implement it. The USB 2.0 in the dock kind of confuses that point, though Nintendo could just be being cheap, like with the shitty controller dock they include in the box vs the accessory one.

As for bandwidth, existing Thunderbolt GPU docks only have a moderate overhead vs internal. Thunderbolt does lack bandwidth, but that's why GPUs have their own discrete memory; normal GPU interconnects tend to have similar limitations. This is a workable system, really.

Thraktor has entirely ignored the bandwidth required for the HDMI signal to pass over the USB-C connection too, which at 4.46Gbps for 1080p@60Hz is pretty much most of the available bandwidth.
1080p60 with 8-bit RGB is just 2.78Gbps. Where are you getting these numbers? Combined with 6-channel LPCM at 48kHz/24-bit, it's about 2.8Gbps.
Edit: 3 channels, rats.
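For what it's worth, here's where the two figures being argued over appear to come from; the 4.46Gbps matches HDMI's raw TMDS rate for 1080p60 (blanking intervals and 8b/10b encoding included), which I'm assuming is what Osiris quoted:

Code:
# Raw active-pixel bandwidth vs HDMI on-the-wire rate for 1080p60.
active = 1920 * 1080 * 60 * 24         # 8-bit RGB, active pixels only
print(active / 1e9)                    # ~2.99 Gbps (the "3x" corrected figure)
tmds = 2200 * 1125 * 60 * 24 * 10 / 8  # full timing incl. blanking, 8b/10b encoded
print(tmds / 1e9)                      # ~4.46 Gbps on the wire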
 

wildfire

Banned
It's not. Playing online everywhere with the Switch is a killer feature. If they include that in the monthly fee for online gaming, it will be a selling point for many.


I agree. I hate the subscription service as currently announced, but if this is a hidden feature they will talk about at E3, then I think there is decent value here. But I still think there should be more than that.
 

Theonik

Member
Yeah, USB 3.1 has plenty of spare bandwidth for the dock connection.
I forgot to count 3 channels per pixel, so the actual number is 3x that. Bear in mind this probably has only 5Gbps of available bandwidth (base USB 3.1, but they can do anything, including USB 2.0 over USB-C; my phone does that).

You can transfer an HDMI signal down a USB-C connector, but that is done in addition to USB signals; it doesn't go down the USB channel. USB-C has extra lines for added functionality.
 

ggx2ac

Member
The 4G part still sticks out to me as being a bit odd. Why wouldn't Nintendo have disclosed that yet if it were true? Seems very hard to believe.

As someone else pointed out on the first page, the guy made no mention of an SD card slot. He most likely mistook the microSD card slot on the Switch for a SIM card slot.
 
As someone else pointed out on the first page, the guy made no mention of an SD card slot. He most likely mistook the microSD card slot on the Switch for a SIM card slot.

Ah, I must have missed that piece of speculation. So that explains the 4G and 1080p guesses, which to me were the most obvious issues with that leak.

It's certainly very hard to doubt this guy had access to the hardware. I guess the remaining questions are how much is actually known and how much is guesswork, when DF got their reports on the clock speeds, and whether the clock speeds he wrote down are supposed to be used in games or just for testing purposes.

All in all, if the CPU cores are A72s or A73s that would be excellent news.
 

Thraktor

Member
I have a hard time believing that USB Type-C has enough bandwidth to even handle the 1060. I think even Thunderbolt 3 is a bit too bandwidth-constrained to max out any high-end GPU from the 970 upward.

The benefit of USB-C is that Nintendo can effectively send any kind of signal they want over it by defining their own alt mode. They get 10 data pins to work with and can run them at whatever data rate they like (benefiting from the fact that, unlike USB, Thunderbolt, etc., they don't have to worry about signal degradation over a potentially long cable). To use PCIe as an example, they could simply run two PCIe v3 lanes over those pins, giving about 2GB/s, or 16Gb/s.
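A quick sketch of where that "about 2GB/s" comes from (PCIe 3.0 runs at 8GT/s per lane with 128b/130b encoding):

Code:
# Usable bandwidth of two PCIe 3.0 lanes.
per_lane_gbs = 8 * (128 / 130) / 8  # ~0.985 GB/s per lane after encoding
print(per_lane_gbs * 2)             # ~1.97 GB/s
print(per_lane_gbs * 2 * 8)         # ~15.75 Gb/s, i.e. the "about 16 Gb/s"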

I also wouldn't worry too much about it in general. It may be an issue for PC GPUs, but a closed system with low-level APIs like Vulkan and developers who are consciously optimising around the available bandwidth would likely be able to get by with a lot less. Much of the patent actually deals with such a device operating over low-bandwidth connections (including wireless connections), so I'm sure Nintendo is very conscious of the software engineering required.

Thraktor has entirely ignored the bandwidth required for the HDMI signal to pass over the USB-C connection too, which at 4.46Gbps for 1080p@60Hz is pretty much most of the available bandwidth.

Why would they pass HDMI over the USB-C connection? The frame would be rendered by the GPU in the dock, which obviously has an HDMI cable directly attached.

The leak is a crank; holding onto it is WUST II: Electric Boogaloo levels of delusion, and entertaining it just confuses, unless that is his and others' aim in entertaining it, of course.

Impressive, you guys only took 4 days to reach Misterxmedia levels of batshit. Now there's a 4k dock with a 1060, is it in the power brick?

I have two questions for you:

1. How would the leaker know that Switch used a 4310 mAh battery without any inside knowledge?

2. If they do have inside knowledge, then what's the 12mm x 18mm sized die being used in the dev kit they're talking about?

A few days ago, if anyone had said that Nintendo was planning some kind of GP106-powered dock as a Switch add-on, I would have laughed at them, and I'm still seriously dubious about it. But when evidence comes along, we can't just ignore it. The leaker gave an incredibly precise description of Switch's battery capacity that turned out to be dead on, and got many other previously unrevealed features (red/blue controllers, SL/SR) correct. That's not a coincidence, and it's clear that the person typing it had some kind of insider info.

Now, it's certainly possible that they had correct info about the Switch but incorrect info about the other device, or that they had some correct info but also decided to make some stuff up for some reason. However, they clearly knew a lot more than we did when they posted that, so it's definitely worth discussing the implications of the second part of the "leak", even if it sounds rather fanciful.

Have you guys entertained the possibility that Switch (much as with Playstation and Xbox) are now in a "platform as a service" model with hardware abstraction layers good enough to take into account multiple hardware sets?

And that somewhere down the line, at some point, a Switch Pro or Switch 2 or what have you will be available for purchase, similar to a New3DS or PS4 Pro or Xbox Scorpio but much more useful?

I absolutely expect them to do that, as Iwata talked about taking that approach in the past. Their online partnership with DeNA, their hardware partnership with Nvidia, and the existence of technologies like ARMv8 and Vulkan all tie into the idea of a long-term software ecosystem with a variety of hardware form factors.
 

Durante

Member
You know, there are some things you can know if you put together a Switch in some factory. Like the name of a button, the size of a chip or the capacity of a battery.

What you can't know from that are clock frequencies, or the hardware architecture revision of the GPU.
 

Theonik

Member
You know, there are some things you can know if you put together a Switch in some factory. Like the name of a button, the size of a chip or the capacity of a battery.

What you can't know from that are clock frequencies, or the hardware architecture revision of the GPU.
This has been pointed out before. The worker claims to have seen a debug software screen on an assembled unit. There is no way to verify those claims, of course, until we get leaks of dev docs or units in the wild.

Most of the rest is speculation on the author's part. They also saw devkits with an additional Nvidia chip that matches the size of a GP106 die.
 
You know, there are some things you can know if you put together a Switch in some factory. Like the name of a button, the size of a chip or the capacity of a battery.

What you can't know from that are clock frequencies, or the hardware architecture revision of the GPU.

He's specifically stated that he saw the unit in a debug/testing state, with a demo showing "thousands/millions of fish on screen" that indicated the various clock speeds. Nothing about that seems too far-fetched. Also, the clock speeds he listed are lower than those of the base TX1 (2GHz CPU, 1GHz GPU), which would be an odd thing to lie about, and as z0m3le pointed out earlier, they match up perfectly with the frequency increments possible on a TX1.
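For what it's worth, that last point is easy to check: the TX1's GPU DVFS table steps in 76.8MHz increments (as I understand it), and every reported GPU clock lands exactly on a step, rounding the leaker's 921 to 921.6:

Code:
# Checking reported GPU clocks against the TX1's 76.8MHz DVFS steps.
for mhz in (307.2, 768.0, 921.6):  # Eurogamer portable, Eurogamer docked, Foxconn (~921)
    print(f"{mhz} MHz = {mhz / 76.8:g} x 76.8 MHz")  # 4, 10 and 12 steps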

I don't know how he got the info about 16nm or A72/A73 though, so that's what I'm hoping will be clarified at some point.
 