
The Curious Case of the Switch Foxconn Leak (Now a hardware fanfiction thread)

Hoo-doo

Banned
I'd buy an SCD dock. As the Switch stands, though, it might be a decent Vita replacement, which is why I am getting one. *shrug*

I'm sure you would, as would a few others in this thread.
But Nintendo isn't going to all these lengths, muddying up all their marketing and creating customer confusion for a handful of guys determined to play Mario games in 4K with some add-on.

When the base configuration will always be the undocked Switch unit for the life of the device, what developer would even invest the time to upgrade their game for the small niche group of customers who actually bought this SCD with PS4 Pro-like power, when all that power is going to evaporate the second the unit is removed from its dock? Nintendo's development houses are going to be spread thin on Switch either way, and third parties aren't there to pull their weight and fill the gaps.

People interested in high-end visuals and hardware only tangentially overlap with Nintendo's core audience, and that's not changing for the foreseeable future. Besides, that audience is already heavily catered to by the other two consoles, and Nintendo would be a fool to act like it can realistically catch up without a huge third party portfolio to back it up.
Nintendo should double down on its strengths and not desperately cling to what the other two consoles are showing with a me-too add-on that is only really desired by people interested in winning forum wars.
You're getting a handheld system that also displays stuff on the living room screen when you dock it. Because the portable gaming market is the only market that's even remotely up for grabs and Nintendo knows this.
 

Theonik

Member
I almost wouldn't be too surprised if the Scorpio is lower than $500. Actually, I half expect it to be $400, because Microsoft and Sony are usually neck and neck in undercutting each other to gain a larger install base.
If MS could do $399, Sony can do $349 or lower. I don't think either party benefits from that particular price war.

Just think about that for a moment. What are you actually achieving by docking a Switch to an SCD, instead of just using an SCD standalone? Ostensibly it would replace the SoC and RAM and take over input & output... so the purpose of the docked device would be what, being a cartridge and SD card slot?
The SCD as we have seen it so far is a PCI-E GPU connecting to the equivalent of a Thunderbolt 3 port. Some parts of the original Switch would remain useful, but the biggest benefit is continuity: playing Zelda with a 4K output, then switching to portable with minimal interruption. In practice this approach has a lot of technical challenges, both for Nintendo's hardware and for game design.
 

kIdMuScLe

Member
I wonder if an SCD is why we didn't get any AAA 3rd party games at launch. Any. I know this is expected for Nintendo at this point but I think even the 3rd parties that want to support Switch would want big titles there at launch, not only because it can help them sell when people are looking for software, but also because if there was any time to sow a tiny seed to help the foundation for a platform to sell games on, I would think it was in the first few months, rather than many months in, especially if these were just "market test" ports.

It could be that they are planning an end of the year launch that would be more like a standard console launch, whereas this March Switch release has been much more like a handheld launch, given the time of year and even the lineup. It's a bit of a soft launch, but rather than being "rushed" like many people have commented, they are allowing the Switch hardware to gain traction and win people over with the design and function, as well as getting a chance to see what features people want (apps, online, etc.), so by the time the holidays roll around, they can have closer to a full suite of features when the SCD and their paid online service launch.

It might also explain why NBA 2K18, FIFA18 and Skyrim are all coming out in the fall. Besides that being when the sports games usually release, it may also be why they decided on those franchises as the titles they put on the Switch. A few interesting things of note: Peter Moore says it's a custom-built version for the Switch, but they confirmed it is actually FIFA18; it is also being built by the same team working on the other versions; and, perhaps the oddest detail to me, it's releasing on the same date as the other versions. With how terribly the Wii U ports tended to be announced and marketed (or rather not announced and marketed), given delayed dates and missing features, this actually sounds a little promising for once. Also, while Skyrim may not be named as such right now, it appears to be the remastered Special Edition.

We also don't know when Steep will release, and despite Ubi saying they have more unannounced games in development for the Switch, and being a company that loves to be on a new platform day one, they only have Just Dance announced for the foreseeable future. Even Rayman Legends isn't announced for at least the launch window, which seems odd when you'd think that now is the time to release that game. Then there is also Project Sonic 2017, which is a winter title also being released on the other platforms. Seems a little odd that so many big 3rd party titles are sitting at the back half of the year.

This might also be a coincidence and they are trying to pack the holidays like Nintendo is with big titles for the system, but maybe they also know that the games will run better with the additional hardware.

It seems like a bit of a stretch, but the SCD from the patents will presumably be more than just an add-on to provide more power; it can also be used for cloud computing and such. It's possible that it can provide a benefit not just in docked mode, but also cloud/online features while you are out in portable mode, adding value for everyone no matter how you use the Switch. I've also speculated that by buying an SCD, you might be able to forgo the online fee, since you might hypothetically be helping to provide computing power for the cloud, and that would also give vanilla Switch owners the option to get some of the cloud benefits, without an SCD, by paying the online fee.

If there is any truth to this, maybe they are waiting until E3 to announce the SCD, which also seems like the most likely time Nintendo announces more upcoming games. That would also likely include more AAA third party games, and could explain why they and Nintendo wanted to wait.


So you want Nintendo to confuse the consumers on launch year? GAF is really going down the shitter...
 

TLZ

Banned
I'm sure you would, as would a few others in this thread.
But Nintendo isn't going to all these lengths, muddying up all their marketing and creating customer confusion for a handful of guys determined to play Mario games in 4K with some add-on.

When the base configuration will always be the undocked Switch unit for the life of the device, what developer would even invest the time to upgrade their game for the small niche group of customers who actually bought this SCD with PS4 Pro-like power, when all that power is going to evaporate the second the unit is removed from its dock? Nintendo's development houses are going to be spread thin on Switch either way, and third parties aren't there to pull their weight and fill the gaps.

People interested in high-end visuals and hardware only tangentially overlap with Nintendo's core audience, and that's not changing for the foreseeable future. Besides, that audience is already heavily catered to by the other two consoles, and Nintendo would be a fool to act like it can realistically catch up without a huge third party portfolio to back it up.
Nintendo should double down on its strengths and not desperately cling to what the other two consoles are showing with a me-too add-on that is only really desired by people interested in winning forum wars.
You're getting a handheld system that also displays stuff on the living room screen when you dock it. Because the portable gaming market is the only market that's even remotely up for grabs and Nintendo knows this.

We're simply discussing that Chinese translation leak. It's too legit to quit.
 

Donnie

Member
Because the Pro won't stay at $400 by the time this ever sells... So the difference in price would be bigger. And because I can guess most of the people who don't care about the portable aspect of it would feel they're paying extra for something they'd never use. You're basically forcing people to buy both your handheld and home console at the same time. That's a lot of cost, which is easier to bear when the two are sold separately.

Maybe, just maybe, I'd get both sooner if the handheld and SCD combined are close to Pro pricing.

Switch won't stay at $300 either, so if we're assuming this addon will be out long enough in the future for PS4 Pro to drop in price why assume that Switch will still be at full price?

Not to mention that just because any potential addon sells at $200 (made up price I might add) that doesn't mean there couldn't be a bundle of Switch plus addon for cheaper than the price of buying individually.

What you're basically saying isn't that it would be overpriced. In effect you're saying it would be overpriced to people who aren't interested in the Switch and its concept of gaming anywhere. That's the same as any other product: if you aren't interested in what it offers, it's not worth the money, obviously. For people who already have a Switch it's a great deal (much cheaper than having to pay for an entire new system), and for people who were interested in Switch but unhappy at the lack of performance in docked mode it can be the reason to jump on board.
 

Bluth54

Member
So we're still clinging to this SCD pipedream, are we?

Come on, people. That's not remotely what the Switch (or Nintendo) is about and you know it.

If Nintendo is charging $90 for a dock that's a piece of plastic with a few USB ports, imagine how much they would charge for a dock that had real hardware in it.
 

Donnie

Member
Scorpio.

I'm actually serious this time. PS4 Pro is great value at $399 and Scorpio will sit at $499-599, but by the time the Switch add-on comes out the competing systems will have been out for years, and if these are the specs it releases with as an add-on in 2 years or so, it will be a tough sell unless it's under $199.

Well, I don't know where 2 years has come from, but if we're assuming that, we have to consider any Switch price drop as well. So you have PS4 Pro at, let's say, $300 ($100 price drop in 2 years), and Scorpio at $450-500 ($50-$100 price drop in a year or so). Then, assuming a $50 price drop for Switch and a small saving on bundling, it can sell for around $400. Performance-wise the Switch would be right in the middle of both consoles; price-wise it would be in the middle. Then you'd have the added feature of gaming anywhere.
 

TLZ

Banned
Switch won't stay at $300 either, so if we're assuming this addon will be out long enough in the future for PS4 Pro to drop in price why assume that Switch will still be at full price?

Not to mention that just because any potential addon sells at $200 (made up price I might add) that doesn't mean there couldn't be a bundle of Switch plus addon for cheaper than the price of buying individually.

What you're basically saying isn't that it would be overpriced. In effect you're saying it would be overpriced to people who aren't interested in the Switch and its concept of gaming anywhere. That's the same as any other product: if you aren't interested in what it offers, it's not worth the money, obviously. For people who already have a Switch it's a great deal (much cheaper than having to pay for an entire new system), and for people who were interested in Switch but unhappy at the lack of performance in docked mode it can be the reason to jump on board.

As much as I'd love the Switch's price to drop before I buy (I'm counting on a drop to buy it later), we all know how Nintendo operates, and there's a high chance it'll stay at $300 for as long as possible.

As for the SCD, I honestly don't think most of the crowd who bought/will buy the Switch have any interest in the SCD if it ever comes to fruition. The reason they bought the Switch is the portable aspect. The SCD making their Switch more powerful at home is irrelevant to them, since they're happy with how it looks on the move, never mind coughing up the (imaginary) $200 for something they're not interested in.
 

Theonik

Member
Well, I don't know where 2 years has come from, but if we're assuming that, we have to consider any Switch price drop as well. So you have PS4 Pro at, let's say, $300 ($100 price drop in 2 years), and Scorpio at $450-500 ($50-$100 price drop in a year or so). Then, assuming a $50 price drop for Switch and a small saving on bundling, it can sell for around $400. Performance-wise the Switch would be right in the middle of both consoles; price-wise it would be in the middle. Then you'd have the added feature of gaming anywhere.
I don't think it could be coming out this year, so next year is the earliest it could happen, but we have no idea. And you can't use the SCD everywhere. That's the whole point. Add-ons, even successful ones, have always struggled.
 
I'm sure you would, as would a few others in this thread.
But Nintendo isn't going to all these lengths, muddying up all their marketing and creating customer confusion for a handful of guys determined to play Mario games in 4K with some add-on.

When the base configuration will always be the undocked Switch unit for the life of the device, what developer would even invest the time to upgrade their game for the small niche group of customers who actually bought this SCD with PS4 Pro-like power, when all that power is going to evaporate the second the unit is removed from its dock? Nintendo's development houses are going to be spread thin on Switch either way, and third parties aren't there to pull their weight and fill the gaps.

People interested in high-end visuals and hardware only tangentially overlap with Nintendo's core audience, and that's not changing for the foreseeable future. Besides, that audience is already heavily catered to by the other two consoles, and Nintendo would be a fool to act like it can realistically catch up without a huge third party portfolio to back it up.
Nintendo should double down on its strengths and not desperately cling to what the other two consoles are showing with a me-too add-on that is only really desired by people interested in winning forum wars.
You're getting a handheld system that also displays stuff on the living room screen when you dock it. Because the portable gaming market is the only market that's even remotely up for grabs and Nintendo knows this.

I completely agree with this. It doesn't make any sense for Nintendo to pursue a 4k dock with the Switch, at least as far as I can tell from Nintendo's market. Maybe they have market research which says otherwise but I doubt it.

The question remains though, what is this leak about? It's very hard to just toss out any portion of this leak considering how much info he got right, so what is he describing? Maybe a prototype unit for a standalone console? Maybe a VR prototype?

As Cuburt said above, the SCD patent had much more interesting potential as a local cloud device, and Cuburt brought up an excellent thought in that purchasing an SCD could grant that user a free online subscription, as the SCD would/could wind up strengthening online infrastructure. Could the kind of die (20x20mm) described by the leaker be explicitly purposed as a kind of "local cloud server"?

Something like that seems like a much more beneficial (and "Nintendo") avenue for Nintendo to go than a 4k dock.
 

_BC_

Member
somebody that's technical ... help me understand ...

in the "base" switch, you are playing docked and reach over and just pull it out ... seems like it works cause the GPU and memory are simply going along with the CPU ... just adjust clocks and such

but with this SCD ... when you pull the CPU out ... how does all that video memory stay with the handheld portion and just continue like nothing happened

how is this possible? .. ELI5 ...
 
somebody that's technical ... help me understand ...

in the "base" switch, you are playing docked and reach over and just pull it out ... seems like it works cause the GPU and memory are simply going along with the CPU ... just adjust clocks and such

but with this SCD ... when you pull the CPU out ... how does all that video memory stay with the handheld portion and just continue like nothing happened

how is this possible? .. ELI5 ...

The console would become a stationary one. You just couldn't use this power in handheld mode.
 
If MS could do $399, Sony can do $349 or lower. I don't think either party benefits from that particular price war.


The SCD as we have seen it so far is a PCI-E GPU connecting to the equivalent of a Thunderbolt 3 port. Some parts of the original Switch would remain useful, but the biggest benefit is continuity: playing Zelda with a 4K output, then switching to portable with minimal interruption. In practice this approach has a lot of technical challenges, both for Nintendo's hardware and for game design.

Both have always initially sold their consoles at a loss and sell much more competitively than Nintendo has, and it's usually helped them in the long run, particularly since last generation.
 
I can't believe people got so attached to the SCD patent. It's not going to happen for years if ever. There's going to be an upgraded Switch before any kind of SCD.
 
I can't believe people got so attached to the SCD patent. It's not going to happen for years if ever. There's going to be an upgraded Switch before any kind of SCD.

To be fair I recall someone (I think Rosti) being in touch with the inventor of that patent and hearing that Nintendo was very, very interested in actually pursuing and developing that technology. I still think it will relate to online functionality rather than a 4k dock though.
 

_BC_

Member
The console would become a stationary one. You just couldn't use this power in handheld mode.

no .. i'm asking if it would still be possible to "switch" from docked to handheld and back again if the GPU and CPU were separated like this (with the SCD)
 
To be fair I recall someone (I think Rosti) being in touch with the inventor of that patent and hearing that Nintendo was very, very interested in actually pursuing and developing that technology. I still think it will relate to online functionality rather than a 4k dock though.

To be fair I think every company would like to pursue every technology they patent. The SCD makes no sense with the concept of the Switch, and is the hardware fanfiction the mods are making fun of us for.
 
To be fair I think every company would like to pursue every technology they patent. The SCD makes no sense with the concept of the Switch, and is the hardware fanfiction the mods are making fun of us for.

The bolded is absolutely, 100% false. Look at Nintendo: you saw how far they pursued that horse riding patent, that U-shaped controller patent... Seriously, as someone who works in patents I can assure you that a large portion of patents coming from big companies (Nintendo, Samsung, etc.) are never pursued as an actual product to use or sell. Rather, they are used for litigation purposes, to prevent competitors from using or selling them without paying a licensing fee.

How that relates to an SCD, all I can say is that if Nintendo actively wants to pursue it more than other patents, we're likely to see some form of it.

I agree with you that a 4k dock doesn't make much sense, but a local cloud-like device for either storage or networking might make sense, as it could be used whether in handheld or docked mode.
 

Astral Dog

Member
So we're still clinging to this SCD pipedream, are we?

Come on, people. That's not remotely what the Switch (or Nintendo) is about and you know it.
Everything has to be tailored to the portable mode. If there is an extra dock it will mostly be used to push the resolution; it won't actually make the Switch "stronger" without breaking the userbase. More like a Pro Switch at best.

But at least the resolution haters will have a nice choice ;)
 
Everything has to be tailored to the portable mode. If there is an extra dock it will mostly be used to push the resolution; it won't actually make the Switch "stronger" without breaking the userbase. More like a Pro Switch at best.

But at least the resolution haters will have a nice choice ;)

If it can get Switch games to look like they do on CEMU at 4K with 60fps it would be huge. Zelda wouldn't have any jaggies or shimmering, for example, which is what I don't like about it.

I hope Nintendo goes for it.
 

z0m3le

Banned
For people saying that a 4K dock doesn't make sense: while Nintendo isn't about chasing power, it's pretty much an extension of the idea of the basic dock. There is no fundamental difference between having a stronger GPU when docked to hit 1080p and having a much stronger dock to hit 4K.

And no, the undocked Switch technically is not a limit for the device; the undocked Switch is capable of running the full docked GPU clock at the expense of battery life. We've seen Sony do this with the PSP, so it's not unheard of.
 

Thraktor

Member
So we're still clinging to this SCD pipedream, are we?

Come on, people. That's not remotely what the Switch (or Nintendo) is about and you know it.

Prior to the confirmation of the Foxconn leak, I would have agreed with you, and in prior threads on the SCD concept I've been very critical of the notion of an "enhanced dock" or add-on to increase the system's performance, as I don't believe there's a sensible business case for it. I still don't see a sensible business case for it.

However, we can't ignore the evidence. The Foxconn leak has now turned out to get a number of very specific things right about the Switch, well beyond public knowledge or educated guesses. Therefore we have to take the last part of the rumour, about the "enhancer", at least somewhat seriously. The rumour is very specific about a roughly 200mm² chip in a device which attaches to the Switch. I would be very open to any suggestions about what that chip is and what the form factor of the device is, but so far I haven't heard a single suggestion more plausible than GP106. Unlikely though it may be on the surface, it is the least unlikely explanation that I've heard thus far, so it merits consideration.

Just think about that for a moment. What are you actually achieving by docking a Switch to an SCD, instead of just using an SCD standalone? Ostensibly it would replace the SoC and RAM and take over input & output... so the purpose of the docked device would be what, being a cartridge and SD card slot?

In theory it could allow you to do the same thing the regular Switch dock does; allow you to start a game on the TV and undock to continue in portable mode, or vice versa. It would be more complex to implement from a technical perspective than regular Switch docking/undocking, but not impossible.
 

_BC_

Member
In theory it could allow you to do the same thing the regular Switch dock does; allow you to start a game on the TV and undock to continue in portable mode, or vice versa. It would be more complex to implement from a technical perspective than regular Switch docking/undocking, but not impossible.


so ... how would undocking in the middle of a game work?

wouldn't the SCD have all the GPU memory (assuming 4K would need more than the 4GB that's in the base console)? ... with the current configuration, all that data is in the console so undocking doesn't cause any disconnect between CPU and GPU/memory
 

Theonik

Member
so ... how would undocking in the middle of a game work?

wouldn't the SCD have all the GPU memory (assuming 4K would need more than the 4GB that's in the base console)? ... with the current configuration, all that data is in the console so undocking doesn't cause any disconnect between CPU and GPU/memory
As it has been described, it would probably contain a GPU and some scratch VRAM. Implementing this is difficult but not impossible; Intel has achieved it with Thunderbolt 3.
All the main memory would need to be on the base Switch, with the GPU memory being used exclusively for said GPU; the data would be regenerated on disconnect so you can return to gameplay after the fact.
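
Roughly, the split would look like the sketch below. Every name in it is hypothetical (nothing from any real Nintendo or Nvidia API); the only point is the data ownership: persistent game state lives in the Switch's own RAM, while the external GPU's VRAM only ever holds derived copies (textures, render targets) that can be rebuilt on the internal GPU after an undock.

```python
# Hypothetical sketch of the "regenerate on disconnect" flow described above.
# None of these class or function names come from any real SDK.

class GameState:
    """Lives in the Switch's main RAM; survives docking/undocking unchanged."""
    def __init__(self):
        self.player_pos = (0.0, 0.0, 0.0)
        self.world_seed = 12345
        self.loaded_asset_ids = ["zelda_terrain", "link_model"]

class Renderer:
    """Wraps whichever GPU is currently active (internal SoC or external SCD)."""
    def __init__(self, device_name, resolution):
        self.device = device_name
        self.resolution = resolution
        self.vram_assets = {}  # derived data only: textures, meshes, render targets

    def upload_assets(self, asset_ids, detail):
        # In a real system this would stream from the game card / eMMC.
        for asset_id in asset_ids:
            self.vram_assets[asset_id] = f"{asset_id}@{detail}"

def on_undock(state, external_renderer):
    """The external GPU (and its VRAM) is about to go away: rebuild on the internal GPU."""
    external_renderer.vram_assets.clear()  # everything here is disposable derived data
    internal = Renderer("switch-soc", resolution=(1280, 720))
    internal.upload_assets(state.loaded_asset_ids, detail="low")
    return internal

state = GameState()
docked = Renderer("scd-gpu", resolution=(3840, 2160))
docked.upload_assets(state.loaded_asset_ids, detail="high")
portable = on_undock(state, docked)
print(portable.device, portable.resolution, portable.vram_assets)
```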
 
So thought experiment:

The leak in the other thread continues to say the Switch has 4x A57s and a 256 core Maxwell GPU. I know it's from July, but still we have not heard anything from anyone besides the Foxconn leaker that would say differently. Even the Foxconn leaker doesn't seem to know for sure what the exact core configurations are.

So the experiment then is, let's assume the final SoC will indeed be 4x A57s with a 256 core Maxwell GPU on 20nm. Is it possible for this SoC to be running for 8 days straight with the CPU at 1.78GHz and the GPU at 921MHz considering the throttling involved in the Shield TV?

Could this be explained by having one CPU core disabled? Or one SM disabled? Or a more powerful cooling fan than the Shield TV has?
 

foltzie1

Member
I'll make an avatar bet against anyone that an SCD won't be revealed by E3 and am willing to re-up that bet for the rest of the year.

An SCD certainly made it to the prototyping stage and warranted patents, but I would be willing to bet an avatar that it died at that stage for not being feasible for actual customer use.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
So thought experiment:

The leak in the other thread continues to say the Switch has 4x A57s and a 256 core Maxwell GPU. I know it's from July, but still we have not heard anything from anyone besides the Foxconn leaker that would say differently. Even the Foxconn leaker doesn't seem to know for sure what the exact core configurations are.

So the experiment then is, let's assume the final SoC will indeed be 4x A57s with a 256 core Maxwell GPU on 20nm. Is it possible for this SoC to be running for 8 days straight with the CPU at 1.78GHz and the GPU at 921MHz considering the throttling involved in the Shield TV?

Could this be explained by having one CPU core disabled? Or one SM disabled? Or a more powerful cooling fan than the Shield TV has?
They could have 3 cores and 1 SM disabled just as well. We just don't know. If we knew a bit more about the full picture (big.LITTLE), we'd be able to draw firmer conclusions. It could be the case (purely hypothetical) that there's a sole cluster of A57s with no A53 in sight, so whenever the device goes to proper low loads it just shuts 3 cores and downclocks the one remaining A57 -- while the clock is per-cluster, individual cores within a cluster are suspendable. The opposite holds as well - at max single-threaded loads they could shut 3 cores and boost one core to its viable max (and an A57 can do ~2GHz at 20nm without issues).
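
If you want to see what per-core suspension looks like in practice: the Switch's own OS isn't Linux, but on a stock Linux board built around the same class of SoC (a Jetson TX1, say) offlining cores and capping the cluster clock are both plain sysfs writes. A rough sketch, assuming root access and the standard cpufreq/hotplug paths:

```python
# Sketch only: standard Linux CPU-hotplug and cpufreq sysfs interfaces, as found on a
# Jetson TX1 devboard. The Switch's own OS is not Linux and exposes none of this.
CPU_SYSFS = "/sys/devices/system/cpu"

def set_core_online(core, online):
    # cpu0 is typically not hot-pluggable; cores 1..3 of the A57 cluster usually are.
    with open(f"{CPU_SYSFS}/cpu{core}/online", "w") as f:
        f.write("1" if online else "0")

def set_cluster_max_khz(core, khz):
    # cpufreq policy is per-cluster on parts like this, so capping cpu0
    # caps every online core sharing that clock domain.
    with open(f"{CPU_SYSFS}/cpu{core}/cpufreq/scaling_max_freq", "w") as f:
        f.write(str(khz))

# Low-load scenario described above: keep one A57 awake at a modest clock.
for core in (1, 2, 3):
    set_core_online(core, False)
set_cluster_max_khz(0, 1020000)   # ~1.02 GHz

# Max single-threaded scenario: still one core, but let the cluster boost.
set_cluster_max_khz(0, 1989000)   # ~2 GHz, a comfortable A57 ceiling at 20nm
```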
 

Thraktor

Member
And this is where development on PS4 Pro will actually begin to greatly benefit the Switch with third parties, as new techniques for taking advantage of 2x FP16 will be developed.

There will be a bit of spillover from PS4 Pro and Scorpio supporting FP16, but I wouldn't expect anything major in the short-term. The impression I've got from reading about PS4 Pro support indicates that it tends to be rather small teams working on them, who would be unlikely to take the time to optimise for something like that, which wouldn't carry over to the other version(s) of the game. I'd expect the bigger middleware engines (e.g. UE4) to do so, and some internal teams like Naughty Dog would probably put quite a bit of work into optimising for PS4 Pro, but I wouldn't expect that much from internal engines by most third parties, as they're likely to focus on methods for improving performance on all platforms, and particularly on the ones with the largest install base (which is still going to be the regular PS4 and XBO for some time).

Anyway, what are your thoughts on the ability of a standard TX1 (like that in the Shield TV) to actually maintain the claimed clock speeds for 8 days straight? Presumably, since the Shield TV does indeed throttle (according to MDave) to levels below those from this leak, that would indicate that the SoC being tested with those leaked clock speeds cannot be a standard TX1, right? It would suggest 16nm, but I guess it could also suggest an SM or CPU core is disabled.

I just re-read through the updated translation, and it doesn't seem to actually mention testing for 8 days (unless I'm missing it). The only similar point it mentions is "There's no lag whatsoever after running for 2 hours straight"

In any case, it doesn't matter too much whether it's 2 hours or 8 days, if it's stable for one it's likely to be stable for the other.

The question would really come down to cooling. Electrically, we know the TX1 can hit those clocks (otherwise the Shield TV would have shut down immediately in MDave's test, rather than throttling), but it reaches a sufficiently high temperature to throttle in the Shield TV's case and with the Shield TV's cooling system. While we might expect Switch's cooling system to be less effective than Shield TV's, due to the smaller case, that's not necessarily guaranteed. There are a number of aspects of cooling design where, in theory, Nintendo may have improved over Shield TV's cooling system, such as using copper rather than aluminium cooling fins, using higher quality heatpipes, having a better thermal interface between chip and heatpipe, or between heatpipe and cooling fins, using a higher quality and/or higher speed fan, or just designing the system's internals in such a way as to facilitate better airflow.

We can't necessarily guarantee that Nintendo has done any of these, but we also couldn't rule them out. It should be possible to cool a 20nm chip at those clocks in a case the size of Switch, but it would likely require a pretty well designed cooling system.

The two problems I have with the 1.785GHz clock being used for games aren't related to the ability to power or cool the chip, but rather that (a) it's an extremely large jump over a 1GHz clock which had already been described as final and (b) Nintendo seems intent on keeping CPU clocks the same between docked and portable mode for games, and it seems like a very high clock to retain a decent battery life in portable mode (although this of course depends on architecture and manufacturing process).

If you pay close attention you'll notice the side profile of the device matching the Switch side profile to a T, the top bevel included, all underneath some form of a lid/cover placed over the Joy-Con rails - notice the different plastic texture of the lids vis-a-vis the device's chassis.

That's a fair point. I had assumed the side panels were screwed on in place of the rails, but it's interesting that they have that little lip at the top. It seems a little odd that they would feel the need to cover up the rails with anything, as the photo was obviously only ever supposed to be seen by certified developers in the first place.

Thanks for the explanation; looks like my understanding of FP16/FP32 was a bit too limited. If they can actually get that 70% ratio of FP16 that the Ubi dev mentioned, then that would mean quite a significant boost (54% if my math is correct).

Yeah, seems about right. It would obviously vary quite a bit from one game to the next, probably with better utilisation with Nintendo's internal engines. It's also not necessarily guaranteed that pure computational throughput is going to be the bottleneck in any specific scenario, but in cases where the game is tightly optimised for the architecture (i.e. first party games), it could potentially make a decent bit of a difference.
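
For anyone checking the arithmetic, the ~54% figure just comes from treating the FP16-eligible fraction of the shader work as running at double rate (the 70% is the Ubi dev's claim, not a measured number):

```python
# Amdahl-style estimate: a fraction f of the GPU work can use double-rate FP16.
def fp16_speedup(f):
    return 1.0 / ((1.0 - f) + f / 2.0)

print(fp16_speedup(0.7))   # ~1.538 -> roughly a 54% throughput gain
print(fp16_speedup(0.5))   # ~1.333 -> only 33% if half the work qualifies
```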

The points I bolded from the translation quote perfectly describe a home console IMO.

I don't believe it's a dev unit. Why would the Switch dev unit be at PS4 Pro power levels when the Switch is multitudes lower in power? I hope we get more leaks about this from someone. We can't be at Nintendo's mercy to release info.

The unit is described as an "enhancer" which attaches to the Switch, and is used in addition to the Switch's SoC and RAM. This would point towards it being some kind of add-on rather than a stand-alone console.

The argument that it's a dev unit isn't that it's a dev unit for the regular Switch (as you say there's no reason for an extra GPU in that case), but rather that it's a dev unit for some kind of future hardware, possibly an add-on for the Switch which is still in development.

It could easily be a different version, but I still think he's talking about something resembling the picture.
The picture most likely has either a standard or modified Switch motherboard, with a second board attached to the back of it via some kind of PCI bridge, no?

Yeah, I wouldn't be surprised if it did look somewhat like the unit that was pictured, I just don't think it was that one in particular. For one, I'd expect it to be somewhat larger if it's cooling both Switch's SoC and a separate 200mm² chip.

I completely agree with this. It doesn't make any sense for Nintendo to pursue a 4k dock with the Switch, at least as far as I can tell from Nintendo's market. Maybe they have market research which says otherwise but I doubt it.

The question remains though, what is this leak about? It's very hard to just toss out any portion of this leak considering how much info he got right, so what is he describing? Maybe a prototype unit for a standalone console? Maybe a VR prototype?

As Cuburt said above, the SCD patent had much more interesting potential as a local cloud device, and Cuburt brought up an excellent thought in that purchasing an SCD could grant that user a free online subscription, as the SCD would/could wind up strengthening online infrastructure. Could the kind of die (20x20mm) described by the leaker be explicitly purposed as a kind of "local cloud server"?

Something like that seems like a much more beneficial (and "Nintendo") avenue for Nintendo to go than a 4k dock.

I agree that it wouldn't seem to make much sense from a business perspective, but at the same time the other possibilities seem to make even less sense to me. I would narrow down the possibilities into two groups as follows:


  • Possibility 1: The 200mm² chip is an existing "off the shelf" chip

    By this I mean it's not a chip designed custom for Nintendo. It may be a stand-in for a custom chip, or they may for whatever reason decide to actually use an off the shelf chip for the final product.

    In this case, it would seem reasonable to restrict the possibilities to the chips Nvidia produce or have in production. It's certainly technically possible for them to use a chip from another source, but it would be extremely bizarre, for example, for them to use an AMD chip in the add-on to their Nvidia-powered Switch. Nvidia make GPUs and SoCs, and on the GPU front they have one model which matches the described dimensions almost exactly, which is the GP106. Their SoC lineup doesn't have a matching model, though. The TX1 and Parker are both smaller than the described chip, and Xavier is expected to be bigger (it also isn't due to sample until late this year).

    Out of the plausible candidates for a pre-existing chip that Nintendo may be using, the GP106 would seem like the most likely option. It would also make sense as a stand-in for an existing chip, either a dedicated GPU or an SoC of some sort. The timing would, then, also make some degree of sense. A device being developed with an intended launch of late 2018 or thereabouts would expect to have its first run of dev kits go out around now, with off the shelf hardware used in place of custom chip(s) still being finalised. It would also explain why we haven't heard anything about it from developers yet, as if it's early enough to still be using stand-in hardware it may not have made it to many (or any) third-parties yet.

  • Possibility 2: The 200mm² chip is a custom chip designed specifically for Nintendo

    In a theoretical sense, this chip could be pretty much anything. It could be an SoC, a GPU, a CPU, or a fixed-function chip of any kind. The potential applications are quite wide.

    The issue with this possibility, though, is the timing. For Nintendo to have 2000 samples of a custom chip to be ready to go out to developers, they would have to be very far along in development. Keep in mind that final Switch SoCs reportedly only hit third parties a few months prior, and it would suggest a launch timetable perhaps 6 months or less after Switch itself. This would seem very unlikely to me, firstly from Nintendo's point of view to release a significant "upgrade" to Switch after it's barely hit shelves, but also because we haven't heard anything about it until now. If this had been in development concurrently with Switch for 1-2 years, we would have expected at least some kind of hint of it in one of the numerous leaks we've had on Switch over the past 6 months or so. We haven't, though, which would indicate that it either hasn't hit third parties yet, or it has hit very few of them, and in either case that is evidence of a device early in its development cycle.

Regarding the concept of a "local cloud server", you'd have to be a bit more specific as to what that entails. For the simplest implementation, a wireless storage device (a NAS, effectively), it certainly wouldn't need a chip that large powering it. It could provide meaningful computational capacity (i.e. CPUs and/or GPU) over a wireless connection, but if it did so I would imagine it would be in addition to a hard-wired connection rather than a replacement for one, given the significantly enhanced capabilities a hard-wired connection would provide. The use of a PCIe bridge with the Switch (or a version thereof) in the dev kit would suggest to me a device that is intended to physically connect to Switch, at least in some circumstances.

Prior to all this I would have said that the most sensible way for Nintendo to pursue something like this would be to just release a traditional home console device (let's call it Switch Home, for the sake of argument). This would have a wider potential audience than a dock, and given Switch's specs it could likely play most Switch games at 4K while keeping the cost low. It could potentially also play third party games at 1080p if Nintendo allowed developers to make games supporting the Switch Home, but not the regular Switch. It would also be relatively low-risk, as all of Nintendo's development efforts would still benefit owners of other Switch devices.

I still think that seems like the most sensible way to go, but the evidence seems to suggest that the "enhancer" is something different. It's certainly possible that they're using the GP106 as a stand-in for a new custom SoC, with a few A72 cores and perhaps a GTX1050-level GPU. In that case, it may make sense to use both the Switch SoC and the GP106 combined together on the same board to stand-in for the new chip. The problem is, though, that this isn't a standalone box with both chips on one board, it's a device which attaches to Switch via a connector, and when combined they still have a screen (and can presumably be detached). It wouldn't seem to make much sense to design a dev kit in such a way if the device doesn't make any use of the regular Switch's functionality.

I suppose one option which is technically possible is that it's a stationary home console that also works as a dock for Switch. That is, when you buy it, it comes with a Pro Controller (or possibly a pair of Joy-Cons) and can be used on its own without a Switch plugged in. If, however, you chose to plug a Switch into it, you could use the same docking and undocking functionality as normal. It would be somewhat complicated from both a hardware and software design point of view (even more so than a GPU dock), but not necessarily impossible.

so ... how would undocking in the middle of a game work?

wouldn't the SCD have all the GPU memory (assuming 4K would need more than the 4GB that's in the base console)? ... with the current configuration, all that data is in the console so undocking doesn't cause any disconnect between CPU and GPU/memory

It's certainly possible, and laptops have been doing a similar thing for quite a while with automatic switching between a dedicated GPU (with its own RAM) and integrated GPU (which shares system RAM). Microsoft's Surface Book even allows users to physically separate the keyboard section (which includes a dGPU) and the system will revert to the integrated GPU automatically.

It is a little more complex in Switch, with a lower-level API and software specifically tailored to the hardware, but not impossibly so. It would likely be a matter of how much of a delay they'd be willing to allow for after you dock or undock the Switch. The Switch's memory pool would have to store all data relating to the game state, with the GPU's memory pool storing assets (i.e. textures, models, etc.) and render targets. This would pretty much be the expected division of data anyway, though, so would be unlikely to be an enormous burden to developers.

The decision then is whether at un-docking time you just re-load all assets from the game card or flash memory, which would be relatively straightforward but slow (effectively you're just loading the game again at the precise point you left it) or you could store lower-detail assets in Switch's memory pool to be prepared for undocking. This would reduce the delay after undocking, but would use up memory which then couldn't be used for other things, and would also be more complex for developers to implement.
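
As a very rough feel for that trade-off, with completely made-up numbers for the asset footprint, read speed and mirror size (nothing here comes from the leak):

```python
# Illustrative only: assumed resident asset footprint, storage read speed and mirror size.
assets_mb = 2000            # say ~2 GB of textures/models resident in the SCD's VRAM
game_card_mbps = 60         # assumed sequential read speed from the game card
low_detail_fraction = 0.25  # assumed size of a low-detail mirror kept in main RAM

full_reload_s = assets_mb / game_card_mbps
mirror_cost_mb = assets_mb * low_detail_fraction

print(f"Full reload at undock: ~{full_reload_s:.0f} s of loading")
print(f"Low-detail mirror: ~{mirror_cost_mb:.0f} MB of Switch RAM permanently reserved")
```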
 
They could have 3 cores and 1 SM disabled just as well. We just don't know. If we knew a bit more about the full picture (big.LITTLE), we'd be able to draw firmer conclusions. It could be the case (purely hypothetical) that there's a sole cluster of A57s with no A53 in sight, so whenever the device goes to proper low loads it just shuts 3 cores and downclocks the one remaining A57 -- while the clock is per-cluster, individual cores within a cluster are suspendable. The opposite holds as well - at max single-threaded loads they could shut 3 cores and boost one core to its viable max (and an A57 can do ~2GHz at 20nm without issues).

Interesting... I guess then we cannot make any definitive determinations about the hardware based on this stress test, even if we believe the stress test is 100% accurate. Would dynamically suspending cores be a viable way to reduce thermals when playing a game though? Especially if throttling isn't an option?

I just re-read through the updated translation, and it doesn't seem to actually mention testing for 8 days (unless I'm missing it). The only similar point it mentions is "There's no lag whatsoever after running for 2 hours straight"

In any case, it doesn't matter too much whether it's 2 hours or 8 days, if it's stable for one it's likely to be stable for the other.

He claims the unit is being tested for 11,750 minutes which comes out to 8 days, 3 hours. I suppose he doesn't specify that it's tested for that long without the clocks changing though.
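
The conversion, for anyone following along:

```python
minutes = 11_750
days, rem = divmod(minutes, 24 * 60)
hours, mins = divmod(rem, 60)
print(days, hours, mins)   # 8 days, 3 hours, 50 minutes
```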

The question would really come down to cooling. Electrically, we know the TX1 can hit those clocks (otherwise the Shield TV would have shut down immediately in MDave's test, rather than throttling), but it reaches a sufficiently high temperature to throttle in the Shield TV's case and with the Shield TV's cooling system. While we might expect Switch's cooling system to be less effective than Shield TV's, due to the smaller case, that's not necessarily guaranteed. There are a number of aspects of cooling design where, in theory, Nintendo may have improved over Shield TV's cooling system, such as using copper rather than aluminium cooling fins, using higher quality heatpipes, having a better thermal interface between chip and heatpipe, or between heatpipe and cooling fins, using a higher quality and/or higher speed fan, or just designing the system's internals in such a way as to facilitate better airflow.

We can't necessarily guarantee that Nintendo has done any of these, but we also couldn't rule them out. It should be possible to cool a 20nm chip at those clocks in a case the size of Switch, but it would likely require a pretty well designed cooling system.

The two problems I have with the 1.785GHz clock being used for games aren't related to the ability to power or cool the chip, but rather that (a) it's an extremely large jump over a 1GHz clock which had already been described as final and (b) Nintendo seems intent on keeping CPU clocks the same between docked and portable mode for games, and it seems like a very high clock to retain a decent battery life in portable mode (although this of course depends on architecture and manufacturing process).

Well, according to today's leak, back in July devkits only had 3 A57 cores available for games, and no limit on clock speed, meaning 2GHz max. Only later (sometime in the fall) was that apparently dropped to 1GHz, according to Digital Foundry. It would be strange asking developers to go from 3x A57s at 2GHz to 3x A57s at 1GHz, as that would be a 50% drop, a good deal bigger of a jump than from 1GHz to 1.78GHz.

It could also be that the reported 1GHz clock was tied to A72s (possibly more than 3) in a stronger October devkit, so that the dropoff from 2GHz wasn't as large. I don't know how big the difference between 1GHz A57s and 1GHz A72s is, but it could make up for a decent portion of the processing lost in the drop to 1GHz.

Basically what I'm saying is, back in July devkits had 3x A57s available for games with a max clock speed of 2GHz. Going from there to 1.78GHz is not as drastic of a difference as going from there to 1GHz.

Regarding the concept of a "local cloud server", you'd have to be a bit more specific as to what that entails. For the simplest implementation, a wireless storage device (a NAS, effectively), it certainly wouldn't need a chip that large powering it. It could provide meaningful computational capacity (i.e. CPUs and/or GPU) over a wireless connection, but if it did so I would imagine it would be in addition to a hard-wired connection rather than a replacement for one, given the significantly enhanced capabilities a hard-wired connection would provide. The use of a PCIe bridge with the Switch (or a version thereof) in the dev kit would suggest to me a device that is intended to physically connect to Switch, at least in some circumstances.

So, back when the SCD patent first surfaced I recall several users were discussing the possibility of a "local cloud" which could supplement any NX console regardless of whether the NX console owner actually owned an SCD. Meaning, this SCD would connect to the internet to supply processing power to nearby NX consoles (nearby as in within tens of miles) when it's not being used to supplement the NX console to which it's attached. That would be how the "reward" functionality detailed in the patent would work. You get certain rewards for "lending" your SCD to the cloud.

Now that we know the NX is the Switch, I think that idea sorta fizzled out because there doesn't seem to be much of a reason to supplement power to a handheld with a 720p screen, as it can't display more than 720p in handheld mode. However, it could still be something designed to enhance docked consoles in that way. Or somehow enhance online functionality.

That's what I mean by "local cloud server".
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
Interesting... I guess then we cannot make any definitive determinations about the hardware based on this stress test, even if we believe the stress test is 100% accurate. Would dynamically suspending cores be a viable way to reduce thermals when playing a game though? Especially if throttling isn't an option?
If a game expressed interest (read: ran so-and-so active threads) in a smaller number of cores than the total available, I don't see why not (save for OS deficiencies).
 

Ridley1

Neo Member
I wouldn't expect that much from internal engines by most third parties, as they're likely to focus on methods for improving performance on all platforms, and particularly on the ones with the largest install base (which is still going to be the regular PS4 and XBO for some time).

Hey Thraktor, great reply, well written and well thought out! Thank you! If I may, I have a couple silly questions:
1) Regarding the above quote... I'm not sure the source for the Ubisoft dev claiming 70% FP16, but that seems strange given what you said here. Why do they have so much FP16 code in their games if neither PS4 nor XBO can use it? Is he talking about mobile games?

2) Could the "enhancer" dock be referring to something much simpler, like upscaling to 4K? Like an upscaling Blu-ray player or TV? I know that doesn't make a ton of sense because if you have a 4K TV, chances are your TV or receiver is going to try and upscale it anyway... maybe they think they can do a better job of it?
 
If a game expressed interest (read: ran so-and-so active threads) in a smaller number of cores than the total available, I don't see why not (save for OS deficiencies).

Right, I get that on a game-by-game basis, but I'm more curious whether it's more efficient to suspend CPU cores than it is to throttle CPU speed for the system in general. Of course it will likely always depend on the game, as I guess some games require more single-threaded operations than others.

Hey Thraktor, great reply, well written and well thought out! Thank you! If I may, I have a couple silly questions:
1) Regarding the above quote... I'm not sure the source for the Ubisoft dev claiming 70% FP16, but that seems strange given what you said here. Why do they have so much FP16 code in their games if neither PS4 nor XBO can use it? Is he talking about mobile games?

XB1 and PS4 do use FP16 code, and I think it's slightly more efficient than FP32 code for them, but the difference with the Switch and Pro is that they get 2 times the performance out of FP16 code. So when people talk about the Switch having that advantage over XB1 and PS4 they mean the Switch can run that code twice as fast, not that the PS4/XB1 can't run that code at all.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
Right, I get that on a game-by-game basis, but I'm more curious whether it's more efficient to suspend CPU cores than it is to throttle CPU speed for the system in general. Of course it will likely always depend on the game, as I guess some games require more single-threaded operations than others.
Right, it's a game-by-game case - if a game needs 2 lower-latency threads there's no point in giving it 3 higher-latency threads. I'm not referring to ideal cases - I'm referring to day-to-day scenarios.
 

sits

Member
I've missed all the gossip in the other thread today. After a cursory glance at 25 pages, just checking: we're none the wiser, right?

Seems like it's just confirming the July Digital Foundry info we already knew.
 
I've missed all the gossip in the other thread today. After a cursory glance at 25 pages, just checking: we're none the wiser, right?

Seems like it's just confirming the July Digital Foundry info we already knew.

Yup, although we got Matt saying that the devkits actually got stronger after July, which has been rumored elsewhere. And the leak states that the July devkits have one out of four A57s unavailable for games, but with no restriction on clock speed.

So the devkits getting stronger while limiting the clock speeds to 50% of what they were doesn't quite add up to me.
 

foltzie1

Member
by E3 2018 I believe it will be

What compelling feature could such an SCD offer over what the Switch offers now?

Is Pascal or its successor capable of driving 4K 60fps-ish gaming competently?

It is hard to imagine what would be on the table to justify such a device.

I would be willing to do an avatar bet for 2018 too, if you're game.
 

Thraktor

Member
He claims the unit is being tested for 11,750 minutes which comes out to 8 days, 3 hours. I suppose he doesn't specify that it's tested for that long without the clocks changing though.

Ah, right you are. I was looking for the number 8 and I skipped past that.

Well, according to today's leak, back in July devkits only had 3 A57 cores available for games, and no limit on clock speed, meaning 2GHz max. Only later (sometime in the fall) was that apparently dropped to 1GHz, according to Digital Foundry. It would be strange asking developers to go from 3x A57s at 2GHz to 3x A57s at 1GHz, as that would be a 50% drop, a good deal bigger of a jump than from 1GHz to 1.78GHz.

It could also be that the reported 1GHz clock was tied to A72s (possibly more than 3) in a stronger October devkit, so that the dropoff from 2GHz wasn't as large. I don't know how big the difference between 1GHz A57s and 1GHz A72s is, but it could make up for a decent portion of the processing lost in the drop to 1GHz.

Basically what I'm saying is, back in July devkits had 3x A57s available for games with a max clock speed of 2GHz. Going from there to 1.78GHz is not as drastic of a difference as going from there to 1GHz.

Well, the spec sheet says that the hardware used in the dev kits was capable of running at 2GHz, but in "Functionality available to the application" they describe the clock as "TBD". Ditto with the GPU (which seems to be TBD in its entirety). It seems a bit strange that they would give the theoretical maximum speeds but not the actual in-game ones, but it's possible that in-game clocks were changing too frequently at that point for them to put in printed material (it may have been in the patch notes for firmware updates instead). I'd be quite surprised if clock speeds were ever actually hitting 2GHz in dev kits, though.

So, back when the SCD patent first surfaced I recall several users were discussing the possibility of a "local cloud" which could supplement any NX console regardless of whether the NX console owner actually owned an SCD. Meaning, this SCD would connect to the internet to supply processing power to nearby NX consoles (nearby as in within tens of miles) when it's not being used to supplement the NX console to which it's attached. That would be how the "reward" functionality detailed in the patent would work. You get certain rewards for "lending" your SCD to the cloud.

Now that we know the NX is the Switch, I think that idea sorta fizzled out because there doesn't seem to be much of a reason to supplement power to a handheld with a 720p screen, as it can't display more than 720p in handheld mode. However, it could still be something designed to enhance docked consoles in that way. Or somehow enhance online functionality.

That's what I mean by "local cloud server".

That's fair enough, and I wouldn't completely rule that out, but there are definitely limitations involved in that kind of approach. In a videogame scenario, the applications for remote computational capacity are going to become more and more limited the more "remote" the computational capacity is. This is just down to the majority of videogame computation being both latency critical (i.e. you want it done before the end of the frame) and often bandwidth intensive. As latency goes up and bandwidth goes down with distance, the pool of potential applications drops. The work required by developers also becomes more challenging as distance grows (when I say distance here I'm largely concerned with the logical distance over a network, although that's typically correlated with physical distance too).

The physically attached GPU is pretty much the most extreme case of how "near" an SCD can be, operating over (let's say) a 1GB/s+ low latency PCIe connection. In this case the limitations on what the attached computation device can do are pretty much zero. The GPU can (and would) perform any and all tasks suited for it, with only relatively little attention paid to the interconnect between the two. The functionality offered by a "far" SCD, over an internet connection, will be much more limited (and could vary quite a bit depending on the circumstances). You certainly wouldn't be able to run the majority of rendering tasks at that distance.

AI is one relatively obvious example where remote computation is feasible (and it can use the same latency-managing networking principles as are already used for multiplayer). Physics is also a potential use-case, depending on the specifics of the implementation and the quality of the connection (some physical simulations may be more latency-sensitive than others). You could perhaps also get a bit inventive about some other parts of the game, but you'd still be limited by latency and bandwidth.

The issue is that I don't really see much of an incentive either for developers to make good use of such a device, or for users to see value in what it adds. "Better AI" is a notoriously difficult thing for players to identify, whereas the classic "better graphics" that usually sells new hardware would be very difficult to deliver in this paradigm. The benefits of a "near" SCD are both much easier to implement and much easier to sell than a "far" SCD, and although Nintendo could certainly design a device which would provide for both, I can't imagine them releasing one which is limited only to that far-away functionality.

Of course, this is assuming that the Switch is the primary computational device and is assisted by the SCD, rather than the other way around. If you consider the SCD to be the primary device then you open up the possibility of it being used as a game streaming system, as I posted in the original SCD thread (as described here, here and here). In this case you don't need to transmit any high-bandwidth graphical data between the two systems, as the entire frame is rendered on the SCD, and then the resultant video stream can be encoded in VP9 and transmitted with relatively little bandwidth. This would seem like a reasonable use-case from a business perspective, as the benefits would be obvious to customers and would require minimal (or possibly even zero) implementation by developers.
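For a rough sense of why streaming the finished frame is the low-bandwidth option, here's a quick comparison; the ~15Mb/s figure is just an assumed bitrate for a decent-quality 1080p60 VP9 stream:

```python
# Why streaming the finished image is the low-bandwidth option.
# The 15Mb/s figure is an assumed bitrate for a decent 1080p60 VP9 stream.
raw_1080p60_mbps = 1920 * 1080 * 3 * 8 * 60 / 1e6   # uncompressed 24-bit RGB
vp9_1080p60_mbps = 15                               # assumed encoded bitrate
print(raw_1080p60_mbps)   # ~2986 Mb/s - needs a dedicated video link
print(vp9_1080p60_mbps)   # easily fits over wifi or ethernet
```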

It isn't without its potential issues, though. For one, with Switch being a portable device it would only be feasible while either docked or on a decent-quality wifi connection, which may limit its usefulness in many people's eyes. Secondly, the SCD would need a CPU, a full complement of RAM, etc., and would effectively be an all-in-one console. This seems to conflict with the leak this thread relates to, which describes an "enhancer" that attaches to Switch.

Switch's wireless capabilities are interesting from the perspective of an SCD, though. For one thing, the supposed "4G" version of the Switch described in the leak seems to be the one which connects to the "enhancer". This may imply that Nintendo would release a new version of the Switch which can connect wirelessly to an SCD-style enhancer over 4G, but I'd be quite skeptical of how well that kind of functionality would work over a 4G connection.

In the realm of confirmed facts, though, we know from the FCC filings that Switch supports 802.11ac with 2x MIMO over 80MHz channels, giving a theoretical 867Mb/s over 5GHz and 400Mb/s over 2.4GHz. This is comparable to high-end phones and tablets (and some of the better-specced mid-range models), and you have to go to high-end laptops before you see devices which support higher-speed wifi (such as the €2,800+ 15" MacBook Pro, but not the €1,750+ 13" version). For Nintendo, though, it's an enormous jump. The n3DS, which released in late 2014 after 802.11ac was starting to become common, supports only 54Mb/s 802.11g (an 11-year-old standard by that point). Even the Wii U only supports 150Mb/s 802.11n over 2.4GHz*.
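To show where those headline numbers come from, here's a quick sketch of the 802.11ac PHY maths (short guard interval assumed; note the 2.4GHz figure only reaches 400Mb/s with non-standard 256-QAM, which is an assumption about the chipset):

```python
# Where the headline 802.11ac rates come from (short guard interval assumed).
def phy_rate_mbps(data_subcarriers, bits_per_symbol, coding_rate, streams,
                  symbol_us=3.6):
    return data_subcarriers * bits_per_symbol * coding_rate * streams / symbol_us

# 5GHz: 80MHz channel, 256-QAM 5/6, 2 spatial streams
print(phy_rate_mbps(234, 8, 5/6, 2))   # ~866.7 Mb/s
# 2.4GHz: 40MHz channel, 2 streams, assuming non-standard 256-QAM ("TurboQAM")
print(phy_rate_mbps(108, 8, 5/6, 2))   # 400 Mb/s
```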

It's interesting to consider why they would have made such a change in behaviour. It may be part of a change in personnel or design philosophy beginning with Switch (and there definitely seems to be a change in design philosophy from prior hardware in a number of other ways), but it's hard to see why they would commit the BoM necessary for such wireless functionality when a cheaper solution would have drawn few complaints for services like online multiplayer and game downloads. It may make sense in the context of some kind of SCD-like functionality, though.

Consider, for example, a dock/SCD device which Switch could connect to via an ad-hoc wireless connection (i.e. a direct connection between the two, rather than through the user's router). The bandwidth provided by a 2x80MHz MIMO 802.11ac connection would actually be enough to do some pretty interesting things, potentially including some remote graphical work without having to just stream video a-la Wii U (although it could also do that). I also wouldn't really see a reason to limit it just because of the 720p screen. The best-looking games around at 720p still don't look as good as real-life (or good quality CG), so there's always scope to benefit from additional processing capacity.
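To put a rough number on that, here's what the theoretical link rate means per frame; the 50% efficiency figure for real-world wifi throughput is just my assumption:

```python
# Rough per-frame budget over an ad-hoc 802.11ac link. The 50% efficiency
# figure for real-world throughput is just an assumption.
usable_mbps = 867 * 0.5
per_frame_mb = usable_mbps / 8 / 60      # MB available per 60fps frame
print(per_frame_mb)                      # ~0.9 MB each way per frame
# For comparison, a raw 720p 24-bit framebuffer is ~2.6 MB, so you'd be
# sending compressed or partial data rather than whole frames.
```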

*To be fair, Wii U uses the 5GHz wifi PHY layer in its proprietary video-streaming protocol for the gamepad, so it's likely that they wanted to avoid any potential interference which might arise from also including 5GHz wifi connectivity.

Hey Thraktor, great reply, well written and well thought out! Thank you! If I may, I have a couple silly questions:
1) Regarding the above quote... I'm not sure of the source for the Ubisoft dev claiming 70% FP16, but that seems strange given what you said here. Why do they have so much FP16 code in their games if neither the PS4 nor XBO can use it? Is he talking about mobile games?

Well, Ubisoft seem to put a bit more effort into Nintendo platforms than most third parties. They're also rumoured to have an exclusive game in the works (Mario/Rabbids RPG), so it's possible the developer in question is working on that, and in that case you can certainly imagine they'd put a reasonable amount of work into optimising for Switch, given it's the only target platform.

2) Could the "enhancer" dock be referring to something much simpler like upscaling to 4k? Like an upscaling blu ray player or TV? I know that doesn't make a ton of sense because if you have a 4k TV, chances are your TV or receiver is going to try and upscale it anyway... maybe they think they can do a better job of it?

They wouldn't really need an extra box for that. If they wanted to output an upscaled 4K image then Switch could do it alone, as the SoC should be capable of scaling and outputting to 4K, and the USB-C connector would be capable of transmitting it (via DisplayPort alt mode). I'm not sure if the HDMI port on the dock is HDMI 2.0 compatible, but if 4K was something they wanted from the system they could have supported it out of the box at relatively little expense.
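As a rough sanity check on the bandwidth side (assuming 8-bit RGB, ignoring blanking overhead, and assuming the port could expose four DisplayPort lanes, which isn't confirmed):

```python
# Sanity check: does 4K60 fit in DisplayPort alt mode over USB-C?
# Assumes 8-bit RGB and ignores blanking overhead; whether the port exposes
# four DP lanes is also an assumption.
gbps_needed = 3840 * 2160 * 24 * 60 / 1e9   # ~11.9 Gb/s of active pixel data
dp12_4lane_gbps = 4 * 4.32                  # DP 1.2 HBR2 payload: 17.28 Gb/s
print(gbps_needed, dp12_4lane_gbps)
```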
 

Maybe the 802.11ac wifi support has more to do with local LAN multiplayer. I think for a game like Splatoon, with 10 (apparently) players locally in a single game, the bandwidth requirements might get fairly high. Also, you're pushing over 10x more pixels per Switch than you were per 3DS, though I don't know if that has much of an effect on wifi bandwidth requirements.

And regarding the July devkits, if they were labeled as 2GHz max and 1GHz max for the CPU and GPU on the hardware side, and TBD on the "for applications" side, how would developers know what the max for their games should be? Would the devkits be incapable of actually reaching those max clock speeds? I guess that would be hard for us to know without hearing from a developer who used those devkits back then.

Anyway, if the 4K SCD dock is a thing (which I'm still unsure about), then I don't think 3x A57s at 1GHz would be a very suitable CPU for running 4K applications. I guess I don't know how CPU requirements scale with GPU power, but I would imagine for 4K you'd need something a bit closer to what the PS4 Pro and Scorpio will have (aka what the PS4 and XB1 have).
 

Deleted member 465307

Unconfirmed Member
Can anyone tell me how this potential leak overcomes the bandwidth issue between the hypothetical Switch and dock? Last I heard, the single USB-C port presented a big issue if you wanted to have an external GPU or RAM.
 
Can anyone tell me how this potential leak overcomes the bandwidth issue between the hypothetical Switch and dock? Last I heard, the single USB-C port presented a big issue if you wanted to have an external GPU or RAM.

I think the hypothetical "enhanced dock" will have a different connection protocol which would work with the USB-C port in the base Switch.
 

Hermii

Member
I'll just say this... something is off. No way this is the console that devs say Nintendo worked on with third parties. We still haven't heard any major negative tone from developers about the specs. Something is missing. No way a stock X1, or even worse, is in the Switch while it's also a developer's dream that's running full Unreal 4 games in a week. Something is missing... what is the most "logical" answer? My personal opinion, like I said before, is that whatever they put in the Switch is nothing to write home about... but it does its job. It can get ports of XB1 and PS4 games without much hassle.
The feeling I get from these leaks is that 99% of the R&D budget for the Switch SoC was spent on creating awesome API/tools/middleware support for a standard TX1. And I agree, something sounds off about that.
 

antonz

Member
I'll just say this... something is off. No way this is the console that devs say Nintendo worked on with third parties. We still haven't heard any major negative tone from developers about the specs. Something is missing. No way a stock X1, or even worse, is in the Switch while it's also a developer's dream that's running full Unreal 4 games in a week. Something is missing... what is the most "logical" answer? My personal opinion, like I said before, is that whatever they put in the Switch is nothing to write home about... but it does its job. It can get ports of XB1 and PS4 games without much hassle.

Paper specs do not reveal everything about hardware. Not all flops are created equal, and in the end they have large performance differences across generations. A Maxwell flop performs around 40% better than the flops you would see from the PS4/XBO.

So if the Foxconn leak proves to be true, in the end the Switch would be performing at around 660 GFLOPs in GCN terms. That is still a gap from the 1.3 TFLOPs of the XBO and 1.84 of the PS4. Obviously there would be issues like memory bandwidth, but the device can punch reasonably high for what it is.
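For anyone wondering where the ~660 comes from, it's just the leak's 921MHz max GPU test clock applied to the TX1's 256 Maxwell cores, with the ~40% per-flop advantage layered on top (the 40% figure is my working assumption, not an official number):

```python
# Where ~660 comes from, taking the leak's figures at face value:
# 256 Maxwell cores at the 921MHz max test clock, FMA counted as 2 flops,
# then the assumed ~40% per-flop advantage over GCN.
maxwell_gflops = 256 * 2 * 0.921        # ~471.6 GFLOPs FP32
gcn_equivalent = maxwell_gflops * 1.4
print(maxwell_gflops, gcn_equivalent)   # ~471.6, ~660.2
```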
 
Paper specs do not reveal everything about hardware. Not all flops are created equal, and in the end they have large performance differences across generations. A Maxwell flop performs around 40% better than the flops you would see from the PS4/XBO.

So if the Foxconn leak proves to be true, in the end the Switch would be performing at around 660 GFLOPs in GCN terms. That is still a gap from the 1.3 TFLOPs of the XBO and 1.84 of the PS4. Obviously there would be issues like memory bandwidth, but the device can punch reasonably high for what it is.

There's also the FP16 performance to take into account. If we assume the maximum amount of FP16 code possible in a game is 70%, then that 660 GFLOPs turns into around 1 TFLOP max. You can see this in something like Snake Pass on UE4: the Switch version looks quite similar to the PS4 version at a locked 1080p/30fps, while the PS4 version runs at an unlocked 60fps.

That type of comparison wouldn't make sense if it were based on a flat flop-to-flop reading of 1.8TF vs ~400GF.
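The arithmetic behind the ~1TF figure, taking the 70% claim at face value:

```python
# If 70% of the shader work can run at double-rate FP16, that portion takes
# half the time, so overall throughput goes up by 1 / (0.3 + 0.7/2) ≈ 1.54x.
base_gflops = 660          # GCN-equivalent figure from above
fp16_share = 0.7           # the Ubisoft dev's claimed upper bound
speedup = 1 / ((1 - fp16_share) + fp16_share / 2)
print(base_gflops * speedup)   # ~1015, i.e. roughly 1TF effective
```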
 

Deleted member 465307

Unconfirmed Member
I think the hypothetical "enhanced dock" will have a different connection protocol which would work with the USB-C port in the base Switch.

But wouldn't it still be limited by USB-C's transfer speed if it uses that port? I thought that was the whole issue, but maybe I'm misunderstanding.
 
But wouldn't it still be limited by USB-C's transfer speed if it uses that port? I thought that was the whole issue, but maybe I'm misunderstanding.

I'm not an expert here, but I think the difference is the physical USB-C port versus the USB 3.0 or 3.1 protocol associated with that port. I think (based on a Thraktor post somewhere in the past few pages) Nintendo can opt to use a connection protocol with much higher bandwidth over the same USB-C port on the Switch.

Again, I could be very wrong on all of this, because I don't know anything beyond what other people on this forum have been saying.
 
I think the hypothetical "enhanced dock" will have a different connection protocol which would work with the USB-C port in the base Switch.

Whatever the protocol is, I'd imagine it would have more bandwidth available to it, as the USB-C connection would no longer have to carry the DisplayPort signal; the visuals would be created by the GPU in the dock.
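For reference, rough nominal rates for the protocols that can run over a USB-C connector (spec figures only; nothing here is confirmed about the Switch's port):

```python
# Nominal rates for protocols that can run over a USB-C connector.
# Spec figures only - nothing here is confirmed about the Switch's port.
gbps = {
    "USB 2.0":                            0.48,
    "USB 3.0 / 3.1 Gen 1":                5.0,
    "USB 3.1 Gen 2":                      10.0,
    "DisplayPort 1.2 alt mode (4 lanes)": 17.28,
    "Thunderbolt 3":                      40.0,
}
for name, rate in gbps.items():
    print(f"{name}: {rate} Gb/s")
```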
 
Here's a question. How far below 720p can a game's resolution in portable mode go before the IQ starts looking massively shitty?

I wanted to sort of test it on my current phone, which has a 5" 720p screen, but its pixel density is much higher than the Switch screen's. 480p wouldn't look great, but I wouldn't be surprised if someone attempted it.
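Quick pixel-density comparison, using the Switch's 6.2" panel size:

```python
# Pixel density of a 720p panel at the Switch's 6.2" size vs a 5" phone.
import math

def ppi(width_px, height_px, diagonal_inches):
    return math.hypot(width_px, height_px) / diagonal_inches

print(ppi(1280, 720, 6.2))   # Switch: ~237 PPI
print(ppi(1280, 720, 5.0))   # 5" 720p phone: ~294 PPI
```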
 
There's also the FP16 performance to take into account. If we assume the maximum amount of FP16 code possible in a game is 70%, then that 660 GFLOPs turns into around 1 TFLOP max. You can see this in something like Snake Pass on UE4: the Switch version looks quite similar to the PS4 version at a locked 1080p/30fps, while the PS4 version runs at an unlocked 60fps.

That type of comparison wouldn't make sense if it were based on a flat flop-to-flop reading of 1.8TF vs ~400GF.

The biggest worry for me, though, is that even though docked mode will be about 2.5x as powerful as handheld, I hope that power won't just be dedicated to bumping the resolution instead of effects. As we know, the 720p to 1080p bump takes roughly 2.25x the processing power. I'd hope Nintendo allows devs to use the remaining power for frame rate stability or graphical effects. I'm more curious about what happens if we get a AAA game like Call of Duty this fall: if handheld ends up being 720p (god forbid it's less), would devs make docked mode 1080p (just a resolution bump), or could they be given the choice to keep it at 720p but with added graphical effects to match the PS4/Xbone version?

I'd rather have COD on Switch at 720p docked with the same graphical fidelity as the Xbone/PS4 versions but with sacrificed resolution, than COD on Switch at 1080p with considerably fewer graphical effects: polygons, textures, lighting, shadows, etc.
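(For reference, the 2.25x figure above is just the pixel-count ratio:)

```python
# The 2.25x figure is just the pixel-count ratio between the two resolutions.
print((1920 * 1080) / (1280 * 720))   # 2.25
```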
 