
The Curious Case of the Switch Foxconn Leak (Now a hardware fanfiction thread)

Where'd you get the Wii U part from?

Also I'd hardly call Switch docked 900p30 with minimal drops "significantly more powerful". What would you say if docked was 1080p60 or 4k30 then?

One question is, if portable is natively 720p30 and the clock speed ratio reported by EG is correct (40% of docked), then why couldn't they do 1080p30 in docked?

This presents 3 possibilities:

- The portable would have to be 600p30 to account for the difference
- The portable is 720p30 but with some IQ/FPS sacrifices
- The portable is 720p30 with the same IQ/FPS, but they haven't optimized the docked version as well as they could.

Incidentally, Nintendo didn't mention anything about the resolution of the portable mode in the IGN article. We only have DF's report on that.
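Just to put numbers on that question, here's a rough sketch (assuming performance scales linearly with GPU clock, which real games rarely do exactly):

```python
# Back-of-envelope: pixel counts vs. the reported clock ratio.
pixels_720p  = 1280 * 720    # 921,600 pixels
pixels_1080p = 1920 * 1080   # 2,073,600 pixels

pixel_ratio = pixels_1080p / pixels_720p   # 2.25x the pixels at 1080p
clock_ratio = 1 / 0.4                      # docked GPU clock is 2.5x portable, per EG

print(round(pixel_ratio, 2), round(clock_ratio, 2))
# 2.5 > 2.25, so a purely GPU-clock-bound 720p30 portable game should, in theory,
# have enough headroom for 1080p30 docked -- which is exactly why the question comes up.
```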
 

Durante

Member
One question is, if portable is natively 720p30 and the clock speed ratio reported by EG is correct (40% of docked), then why couldn't they do 1080p30 in docked?

This presents 3 possibilities:

- The portable would have to be 600p30 to account for the difference
- The portable is 720p30 but with some IQ/FPS sacrifices
- The portable is 720p30 with the same IQ/FPS, but they haven't optimized the docked version as well as they could.

Incidentally, Nintendo didn't mention anything about the resolution of the portable mode in the IGN article. We only have DF's report on that.
... or it's memory bandwidth limited.
 

TLZ

Banned
Also NateDrake is banned and mods already advised against using him as a source for info.

Oh? When did that happen?

One question is, if portable is natively 720p30 and the clock speed ratio reported by EG is correct (40% of docked), then why couldn't they do 1080p30 in docked?

This presents 3 possibilities:

- The portable would have to be 600p30 to account for the difference
- The portable is 720p30 but with some IQ/FPS sacrifices
- The portable is 720p30 with the same IQ/FPS, but they haven't optimized the docked version as well as they could.

Incidentally, Nintendo didn't mention anything about the resolution of the portable mode in the IGN article. We only have DF's report on that.

Oh I actually thought that was reported by Nintendo themselves.
 
I generally agree, z0m3le, that it's very strange to have these stress tests running at such high clocks, but it does really seem like Eurogamer's info comes from the final devkit info given to developers. So I'm not expecting these clocks for the final hardware.

However, if Nintendo wants to leave open the possibility to raise the clocks via a patch one day, then this would indicate that either the CPU cores are indeed at least A72s, or that Nintendo would allow for a CPU boost mode when docked further down the road (which doesn't make much sense).

I'm curious: if we do a calculation of A57 power draw at 1GHz and A72 power draw at 1.78GHz, as well as a 768MHz 20nm GPU with 2 SMs vs a 921MHz 16nm GPU with 2 SMs, is it basically the exact same power consumption for both?

Interesting. I didn't check to see if there was a correlation with the CPU, but it appears to be there. It is interesting that this post's CPU info is precisely 75% higher than Eurogamer's CPU clockspeed.

As for the GPU, MDave's tests with the TX1 showed that the GPU was throttling up/down in 76.8MHz increments.

On top of all of that, keep in mind this leak came out weeks before Eurogamer's clockspeed leaks, so it really makes it hard to dismiss this leak as fake. Very hard.
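For anyone who wants those relationships spelled out, here's a quick sketch using the clocks discussed in this thread (just arithmetic, no new information):

```python
# How the Eurogamer and Foxconn clock figures line up.
STEP_MHZ = 76.8  # MDave's TX1 tests showed the GPU stepping in 76.8MHz increments

eurogamer_cpu_mhz = 1020.0
foxconn_cpu_mhz   = 1785.0
print(foxconn_cpu_mhz / eurogamer_cpu_mhz)   # 1.75 -- exactly 75% higher

# The GPU clocks discussed in the thread are all whole multiples of that step:
for clock_mhz in (307.2, 384.0, 768.0, 921.6):
    print(clock_mhz, round(clock_mhz / STEP_MHZ, 3))  # 4.0, 5.0, 10.0, 12.0
```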

Or the added performance was used for draw distance enhancements and extra detail...

This is what I'm thinking too. On a 6.2 inch screen you might not need the same LoD, draw distance or foliage detail/amount as you do on a 60 inch TV. Much simpler, less open games like Mario Kart 8 run at 1080p because Nintendo has chosen to prioritize the resolution over effects like these.
 

z0m3le

Banned
I generally agree, z0m3le, that it's very strange to have these stress tests running at such high clocks, but it does really seem like Eurogamer's info comes from the final devkit info given to developers. So I'm not expecting these clocks for the final hardware.

However, if Nintendo wants to leave open the possibility to raise the clocks via a patch one day, then this would indicate that either the CPU cores are indeed at least A72s, or that Nintendo would allow for a CPU boost mode when docked further down the road (which doesn't make much sense).

I'm curious: if we do a calculation of A57 power draw at 1GHz and A72 power draw at 1.78GHz, as well as a 768MHz 20nm GPU with 2 SMs vs a 921MHz 16nm GPU with 2 SMs, is it basically the exact same power consumption for both?



On top of all of that, keep in mind this leak came out weeks before Eurogamer's clockspeed leaks, so it really makes it hard to dismiss this leak as fake. Very hard.



This is what I'm thinking too. On a 6.2 inch screen you might not need the same LoD, draw distance or foliage detail/amount as you do on a 60 inch TV. Much simpler, less open games like Mario Kart 8 run at 1080p because Nintendo has chosen to prioritize the resolution over effects like these.

Problem is there is nothing to really compare it to with the GPUs. I can compare Maxwell to Pascal, but X1 is a 20nm chip while all other Maxwell chips are 28nm on earlier fabs, so they use more power than they would at 20nm. Technically the X1 at 768MHz should draw more power than an X1 manufactured on 16nm at 921MHz, which could be where the better battery life came from for the final devkits. It's hard to calculate since the latter chip doesn't exist.
 
Problem is there is nothing to really compare it to with the GPUs. I can compare Maxwell to Pascal, but X1 is a 20nm chip while all other Maxwell chips are 28nm on earlier fabs, so they use more power than they would at 20nm. Technically the X1 at 768MHz should draw more power than an X1 manufactured on 16nm at 921MHz, which could be where the better battery life came from for the final devkits. It's hard to calculate since the latter chip doesn't exist.

Can't we use the "60% improved power consumption" at the same clock rate to get a good idea though? It may not be precisely the final power draw but it would give us a reasonable expectation, right?
 

z0m3le

Banned
Can't we use the "60% improved power consumption" at the same clock rate to get a good idea though? It may not be precisely the final power draw but it would give us a reasonable expectation, right?

Isn't it a 40% performance increase at the same power consumption then? So you'd still have less power consumption with the clock increase.

Which is what the frame rate dips during DoF effects would point heavily towards.

Wii U's 35MB of embedded RAM is also coming into play here, or am I missing something? I mean, if Wii U is using all available embedded RAM, doesn't that mean it can push more alpha textures than its GPU has any right to? Especially the 3MB of very fast 1T-SRAM carried over from the legacy GameCube design on die?

It's also reckless to use this metric when we've seen Wii U with the same effects causing dips (alpha textures, fire, etc.), and that is the comparison we are making.
 
Isn't it a 40% performance increase at the same power consumption then? So you'd still have less power consumption with the clock increase.

Right, the GPU would likely be consuming less power with the Foxconn leak specs, but what I'm asking is when you add that to the Foxconn leak CPU power consumption, does that wind up almost matching the DF GPU and CPU power consumption? So:

Foxconn CPU power + Foxconn GPU power = DF CPU power + DF GPU power?

Because if those numbers do match, then it would definitely suggest that they managed to put the previous specs onto a 16nm node to increase performance at similar power consumption.

Again, I'm still of the opinion that the Foxconn specs are either outdated or just for stress tests though, due to the timing and wording of the DF article.
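Put another way, the check being proposed is just this; the watt figures plugged in below are the rough per-component estimates that come up later in the thread, not measurements:

```python
# Does the Foxconn-spec SoC land in the same power envelope as the DF-spec SoC?
def totals_match(cpu_a_w, gpu_a_w, cpu_b_w, gpu_b_w, tolerance_w=0.1):
    """True if the two CPU+GPU totals agree within tolerance_w watts."""
    return abs((cpu_a_w + gpu_a_w) - (cpu_b_w + gpu_b_w)) <= tolerance_w

# Foxconn guess (A72 @ 1.78GHz + 16nm GPU) vs. DF/Eurogamer (A57 @ 1GHz + 20nm GPU):
print(totals_match(2.05, 0.18, 1.83, 0.40))   # True -- both come out around 2.23W
```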

Wii U's 35MB of embedded RAM is also coming into play here, or am I missing something? I mean, if Wii U is using all available embedded RAM, doesn't that mean it can push more alpha textures than its GPU has any right to? Especially the 3MB of very fast 1T-SRAM carried over from the legacy GameCube design on die?

It's also reckless to use this metric when we've seen Wii U with the same effects causing dips (alpha textures, fire, etc.), and that is the comparison we are making.

Based on the E3 footage also, the Wii U version dips heavily during explosions with fire/smoke alpha effects while the Switch version does not. I'm not sure if that suggests anything though.

Personally I just think the DF BotW demo was the 12S version of the game, which is simply the Wii U E3 demo ported to the Switch (with little optimization), and the final 1.0 version played by the Treehouse likely will have none of those frame drops. I don't think BotW can tell us much about the hardware anyway, as it is a port from a completely different architecture.
 
Which is funny because everyone is throwing the kitchen sink at how terrible the hardware is based on these launch ports. To me, I use ARMS as an example. It looks like a game that might have been developed exclusively on Switch hardware and not started its life on Wii U. The game is clean as fuck! Eye-popping colors, and I believe the word coming out is 1080p 60fps. I'm not saying the Switch is ultra powerful, I'm just saying we don't have good enough proof of what it is or isn't. I can't wait for this device teardown.

I completely agree. ARMS honestly looks like it could be a PS4/XB1 game. Apparently it has little AA which I personally haven't noticed, but certainly at first glance it looks very, very good.

And as you said this really doesn't tell us much about the hardware either, but it's definitely interesting to me speculating about the hardware itself. I'm kinda fascinated by that end of console/hardware design, even including things like how business and marketing decisions affect the hardware.
 

z0m3le

Banned
Right, the GPU would likely be consuming less power with the Foxconn leak specs, but what I'm asking is when you add that to the Foxconn leak CPU power consumption, does that wind up almost matching the DF GPU and CPU power consumption? So:

Foxconn CPU power + Foxconn GPU power = DF CPU power + DF GPU power?

Because if those numbers do match, then it would definitely suggest that they managed to put the previous specs onto a 16nm node to increase performance at similar power consumption.

Again, I'm still of the opinion that the Foxconn specs are either outdated or just for stress tests though, due to the timing and wording of the DF article.



Based on the E3 footage also, the Wii U version dips heavily during explosions with fire/smoke alpha effects while the Switch version does not. I'm not sure if that suggests anything though.

Personally I just think the DF BotW demo was the 12S version of the game, which is simply the Wii U E3 demo ported to the Switch (with little optimization), and the final 1.0 version played by the Treehouse likely will have none of those frame drops. I don't think BotW can tell us much about the hardware anyway, as it is a port from a completely different architecture.

So the answer is that both the CPU and GPU clocks here together would consume less power, because the A72 CPU @ a 1.78GHz clock is under 2.1 watts, so the 0.27W should be within the scope of the power consumption that the GPU is saving, as a 500MHz X1 GPU seems to draw 1.5 watts and a 20% power savings there would be saving 0.3W. Here we are talking about what, a 30% linear power increase? I haven't seen other power consumption numbers for just the GPU though, so it's hard to estimate anything closer than just "less than DF's clocks".
 
So the answer is that both the CPU and GPU clocks here together would consume less power, because the CPU clock with an A72 @ 1.7GHz is the same, and the 1.78GHz clock is about 2 watts, so the 0.15W should be within the scope of the power consumption that the GPU is saving, as a 500MHz X1 GPU seems to draw 1.5 watts and a 10% power savings there would be the same. Here we are talking about what, a 30% linear power increase? So you could actually see these clocks save a noticeable amount of power. I haven't seen other power consumption numbers for just the GPU though, so it's hard to estimate anything closer than just "less than DF's clocks".

Interesting, thanks. So since those overall power consumption numbers appear to be lower for the Foxconn clocks and 16nm/A72 then it seems less likely that Nintendo simply opted for slightly more performance at the same power consumption, which would mean the Foxconn leaker's assumption of 16nm and A72 seems incorrect to me.

Which means these clocks were just for stress testing, and the DF clocks will be final. At least that's my read on this now.
 

z0m3le

Banned
Interesting, thanks. So since those overall power consumption numbers appear to be lower for the Foxconn clocks and 16nm/A72 then it seems less likely that Nintendo simply opted for slightly more performance at the same power consumption, which would mean the Foxconn leaker's assumption of 16nm and A72 seems incorrect to me.

Which means these clocks were just for stress testing, and the DF clocks will be final. At least that's my read on this now.
Actually it would line up with the rumor we heard that the final Devkits have lower power consumption...
I cleaned it up a bit.
 
Actually it would line up with the rumor we heard that the final Devkits have lower power consumption...
I cleaned it up a bit.

I don't remember seeing any such rumor, besides the 5-8 hour battery life rumor from a now banned Gaffer.

Anyway, I probably didn't phrase my point well. I was musing that, if the power consumption for the specs guessed in this leak matched exactly the power consumption of the DF specs, then it would seem much more likely that the leak is genuine, and that the guessed specs are accurate rather than mere guesses.

I admit it's just a theory and it might not make sense (hell, it might not be possible to match the exact same power consumption with A72s and 16nm) but it would have been an awfully big find if it was the case.

But as it is now, I see much more evidence pointing to DF's clock speeds being the final story and the clock speeds cited by the Foxconn leak being only for stress testing. Which means we still have no clue about the CPU core or process node, though it would seem more likely to be A57s and 20nm.
 

z0m3le

Banned
I don't remember seeing any such rumor, besides the 5-8 hour battery life rumor from a now banned Gaffer.

Anyway, I probably didn't phrase my point well. I was musing that, if the power consumption for the specs guessed in this leak matched exactly the power consumption of the DF specs, then it would seem much more likely that the leak is genuine, and that the guessed specs are accurate rather than mere guesses.

I admit it's just a theory and it might not make sense (hell, it might not be possible to match the exact same power consumption with A72s and 16nm) but it would have been an awfully big find if it was the case.

But as it is now, I see much more evidence pointing to DF's clock speeds being the final story and the clock speeds cited by the Foxconn leak being only for stress testing. Which means we still have no clue about the CPU core or process node, though it would seem more likely to be A57s and 20nm.

While that might have matched up perfectly and been a lock for the change, what we found out from that estimation literally changes nothing but your personal perspective on which side of the fence you fall. In fact, having a slightly lower power consumption would make a move to these clocks more likely, not less likely.

My estimation is lower, but the difference shouldn't be very big. A 700MHz X1 @ 16nm would do 1.5 watts according to the AnandTech article about the X1, and that is with the 40% clock increase at the same consumption. It needs to save ~0.27 watts to match up, so the question is how many watts 768MHz uses at 20nm; the 20% increase in clock with 16nm consumption should be smaller than the last 20%, as the higher clocks should require more power. Remember this is all estimation, but there is a slight chance that they do match up. If for instance it is 2.17 watts for 768MHz (just a round number for this estimation), you'd need to draw about 1.9 watts to match up perfectly with the clocks here, and what you get is somewhere around 0.3W to 0.35W.

Basically the same power consumption.
 
While that might have matched up perfectly and been a lock for the change, what we found out from that estimation literally changes nothing but your personal perspective on which side of the fence you fall. In fact, having a slightly lower power consumption would make a move to these clocks more likely, not less likely.

My estimation is lower, but the difference shouldn't be very big. A 700MHz X1 @ 16nm would do 1.5 watts according to the AnandTech article about the X1, and that is with the 40% clock increase at the same consumption. It needs to save ~0.27 watts to match up, so the question is how many watts 768MHz uses at 20nm; the 20% increase in clock with 16nm consumption should be smaller than the last 20%, as the higher clocks should require more power. Remember this is all estimation, but there is a slight chance that they do match up. If for instance it is 2.17 watts for 768MHz (just a round number for this estimation), you'd need to draw about 1.9 watts to match up perfectly with the clocks here, and what you get is somewhere around 0.3W to 0.35W.

Basically the same power consumption.

Like you said it only really changes my personal perspective on this (and it barely did that), but I still think the DF article has a lot more weight to it right now, especially if it got the clock speeds from the October devkits. It could wind up being the same power consumption, in which case yeah- that would highly suggest the October devkits had the improved hardware, but as of right now, until more details come out... the Digital Foundry article still seems to be referring to final hardware.

I really wouldn't bank on this leak being accurate about the final hardware and clock speeds, regardless of whether or not the contents of the leak are 100% true. Unless someone from DF or EG clarifies further about their article.
 

z0m3le

Banned
Like you said it only really changes my personal perspective on this (and it barely did that), but I still think the DF article has a lot more weight to it right now, especially if it got the clock speeds from the October devkits. It could wind up being the same power consumption, in which case yeah- that would highly suggest the October devkits had the improved hardware, but as of right now, until more details come out... the Digital Foundry article still seems to be referring to final hardware.

I really wouldn't bank on this leak being accurate about the final hardware and clock speeds, regardless of whether or not the contents of the leak are 100% true. Unless someone from DF or EG clarifies further about their article.

The one answer we got from Eurogamer is that clocks could have changed, but their info comes from "fall". The leak is 100% true and I believe that for both, but Eurogamer's clocks don't explain these clocks, and Occam's razor would suggest that they stress tested retail clocks, not clocks that the portable could never run at 20nm anyway.

Also, everything about the 40% power consumption reduction is a very close estimate. It's actually pretty interesting that the numbers line up that closely tbh, and at less than 0.1 watt I'd suggest the difference is just noise from the estimation anyway.


EDIT: Did the math with the best numbers I could get. I figured out the X1's power draw at different clocks based on TSMC's 16nm process and Pascal's power consumption chart Thraktor gave us in an earlier speculation thread.

First, here are the TSMC numbers for the 16nm process: 55% power consumption reduction or 35% clock increase. Source: http://techon.nikkeibp.co.jp/english/NEWS_EN/20131213/322503/

Second, Thraktor's chart:
[pascal_powercurve.png: Thraktor's Pascal power/clock curve]

Pascal might have slightly better power consumption, but these are the best numbers we can get without an actual Tegra Pascal chip.

So it looks like Pascal at 768MHz would consume around 0.95 watts per SM, or 1.9W for the X1's config, which puts the 20nm X1 at around 4.3 watts (this is docked of course).
Pascal at 307.2MHz would consume around 0.18W (it is off the chart, but the curve begins to flatten out here, so you are not saving very much from a higher clock). The X1 would be ~0.4 watts at this clock.
A quad-core A57 is 1.83 watts @ 1GHz, so that gives us ~2.23 watts for the portable SoC with Eurogamer's clocks.


Now, Foxconn's clocks can be looked at with the same breakdown. First let's get the portable's clock: 368.6MHz x 2.5 = 921.5MHz, so ~368.6MHz should be the Foxconn portable GPU clock.
Pascal at 921MHz is 3 watts for 2 SMs (this is docked).
Pascal at 384MHz is also slightly off the chart, so we have to go with ~0.09 watts per SM; with 2 SMs that is 0.18 watts.

[A72 power consumption chart]

Finally, an A72 @ 16nm at 1.78GHz consumes about 2.05 watts, or ~0.22 watts more than an A57 @ 1GHz on 20nm.

This means that the SoC in portable mode is ~2.23 watts: the exact same power consumption as above.


Speculation: They moved to these clocks because the portable clocks are drawing the same power consumption.
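Here's the same arithmetic collapsed into a few lines, for anyone who wants to play with the numbers (these are the rough estimates above, not measured figures):

```python
# Portable-mode SoC totals from the per-component estimates above.

# Eurogamer clocks on the assumed 20nm A57 + 2-SM Maxwell setup:
eg_cpu_w  = 1.83   # quad A57 @ 1.0GHz
eg_gpu_w  = 0.40   # 2-SM X1 GPU @ 307.2MHz

# Foxconn clocks on the assumed 16nm A72 + 2-SM setup:
fox_cpu_w = 2.05   # quad A72 @ 1.78GHz
fox_gpu_w = 0.18   # 2 SMs @ 384MHz (~0.09W per SM)

print(round(eg_cpu_w + eg_gpu_w, 2), round(fox_cpu_w + fox_gpu_w, 2))
# 2.23 vs 2.23 -- effectively the same portable power envelope, which is the whole point.
```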
 
EDIT: Did the math with the best numbers I could get. I figured out the X1's power draw at different clocks based on TSMC's 16nm process and Pascal's power consumption chart Thraktor gave us in an earlier speculation thread.

First, here are the TSMC numbers for the 16nm process: 55% power consumption reduction or 35% clock increase. Source: http://techon.nikkeibp.co.jp/english/NEWS_EN/20131213/322503/

Second, Thraktor's chart:


Pascal might have slightly better power consumption, but these are the best numbers we can get without an actual Tegra Pascal chip.

So it looks like Pascal at 768MHz would consume around 0.95 watts per SM, or 1.9W for the X1's config, which puts the 20nm X1 at around 4.3 watts (this is docked of course).
Pascal at 307.2MHz would consume around 0.18W (it is off the chart, but the curve begins to flatten out here, so you are not saving very much from a higher clock). The X1 would be ~0.4 watts at this clock.
A quad-core A57 is 1.83 watts @ 1GHz, so that gives us ~2.23 watts for the portable SoC with Eurogamer's clocks.

Now, Foxconn's clocks can be looked at with the same breakdown. First let's get the portable's clock: 368.6MHz x 2.5 = 921.5MHz, so ~368.6MHz should be the Foxconn portable GPU clock.
Pascal at 921MHz is 3 watts for 2 SMs (this is docked).
Pascal at 384MHz is also slightly off the chart, so we have to go with ~0.09 watts per SM; with 2 SMs that is 0.18 watts.


[A72 power consumption chart]

Finally, an A72 @ 16nm at 1.78GHz consumes about 2.05 watts, or ~0.22 watts more than an A57 @ 1GHz on 20nm.

This means that the SoC in portable mode is ~2.23 watts: the exact same power consumption as above.

Speculation: They moved to these clocks because the portable clocks are drawing the same power consumption.

Thanks for doing that legwork! That's very interesting- essentially the conclusion one can draw from this (not necessarily the only conclusion) is that, some time in the Fall, likely before October, Nintendo determined that they could manufacture the Switch with the exact same power consumption on a smaller node in order to achieve higher performance.

This would mean that DF's info is outdated, and the clock speeds have likely been raised since they got that info.

It wouldn't mean all that much for how games would look, but the greatly increased performance of the CPU alleviates some concerns around ports at the very least.
 

z0m3le

Banned
Thanks for doing that legwork! That's very interesting- essentially the conclusion one can draw from this (not necessarily the only conclusion) is that, some time in the Fall, likely before October, Nintendo determined that they could manufacture the Switch with the exact same power consumption on a smaller node in order to achieve higher performance.

This would mean that DF's info is outdated, and the clock speeds have likely been raised since they got that info.

It wouldn't mean all that much for how games would look, but the greatly increased performance of the CPU alleviates some concerns around ports at the very least.

I think they manufactured the chip before figuring out clocks. IIRC, Wii U ports were reported by Emily Rogers to be having trouble. The main reason for this would likely be CPU clocks lower than Wii U's CPU; clock speed can matter a lot against a short-pipeline design like Wii U's Espresso, and to match it you often need the same or higher clocks.

They probably increased the GPU with the multiplier and cranked the CPU as high as it could go inside their ~2.23W power envelope. I did believe the assumption of A73 was wrong from the start; it was introduced after Nintendo would have been designing this chip, so it made less sense. 50GB/s can also probably be guessed now as well. I'd like to point out Thraktor's speculation on it, as there was little point in using 2 32-bit chips when 1 64-bit chip could do the job. (The Foxconn leak showed 2 chips, thus 2 64-bit chips are more likely IMO than 2 32-bit chips.)
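For reference, here's the raw clock comparison being made against Espresso (clocks only; per-cycle throughput differs hugely between these CPU designs, so this is context rather than a performance claim):

```python
# Raw CPU clock comparison only -- not a like-for-like performance comparison.
wii_u_espresso_mhz = 1243   # Wii U's 3-core Espresso, commonly cited at ~1.24GHz
eurogamer_cpu_mhz  = 1020   # A57 clock from the Eurogamer report
foxconn_cpu_mhz    = 1785   # CPU clock from the Foxconn leak

print(round(eurogamer_cpu_mhz / wii_u_espresso_mhz, 2))  # ~0.82 -- below Wii U's clock
print(round(foxconn_cpu_mhz / wii_u_espresso_mhz, 2))    # ~1.44 -- comfortably above it
```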
 

disap.ed

Member
Thanks for doing that legwork! That's very interesting- essentially the conclusion one can draw from this (not necessarily the only conclusion) is that, some time in the Fall, likely before October, Nintendo determined that they could manufacture the Switch with the exact same power consumption on a smaller node in order to achieve higher performance.

If this whole case holds any credibility (which I hope, but I won't hold my breath), then Nintendo definitely didn't determine this in October; it was the plan from the beginning.
 
If this whole case holds any credibility (which I hope, but I won't hold my breath), then Nintendo definitely didn't determine this in October; it was the plan from the beginning.

True, I didn't word it correctly. What I mean is, before October (or whenever the devkits with this new hardware were sent out) the current "final" clocks were those reported by Digital Foundry. So whether or not the use of a 16nm process was always planned, this change seemingly happened sometime before the October devkits went out.

Again, if this is all true.
 

z0m3le

Banned
True, I didn't word it correctly. What I mean is, before October (or whenever the devkits with this new hardware were sent out) the current "final" clocks were those reported by Digital Foundry. So whether or not the use of a 16nm process was always planned, this change seemingly happened sometime before the October devkits went out.

Again, if this is all true.

Not sure it could be a fake; it's like watching someone walk into a store, buy their very first lotto ticket, and hit the jackpot.

Whenever they had Wii U porting problems, they sought out a faster CPU. They likely just didn't update people on the final clocks, we've heard far worse from Nintendo's pre-launch of hardware in the past.
 
Not sure it could be a fake; it's like watching someone walk into a store, buy their very first lotto ticket, and hit the jackpot.

Whenever they had Wii U porting problems, they sought out a faster CPU. They likely just didn't update people on the final clocks, we've heard far worse from Nintendo's pre-launch of hardware in the past.

Well I'm not questioning that the leak is legit, just that it isn't necessarily indicative of the final hardware. It appears to be leaning that way now in my mind though.
 

z0m3le

Banned
Well I'm not questioning that the leak is legit, just that it isn't necessarily indicative of the final hardware. It appears to be leaning that way now in my mind though.

Here's the thing: this leaker saw a final retail Switch running these clocks for 8 days at full throttle, and during this time his factory produced 160k more of these devices, and the clocks' relationship to both the X1 and the prior clocks can't be denied. For these exact clocks to just be used on the current X1 makes zero sense, especially for a stress test. You have to jump through so many logic hoops to get to "well, maybe this isn't true" that you are stretching the realm of reality to fit a worldview. With these power consumptions matching only in the case that they shrunk the chip and used A72s, anyone outside the whole "Nintendo" speculation world would conclude it's obviously a new SoC and new clocks, targeting the same power consumption.

This was actually your idea; I just did the legwork, as you said. Good job coming up with the solution to this problem. Enjoy the fruits of that idea, because they are ripe.
 
Here's the thing: this leaker saw a final retail Switch running these clocks for 8 days at full throttle, and during this time his factory produced 160k more of these devices, and the clocks' relationship to both the X1 and the prior clocks can't be denied. For these exact clocks to just be used on the current X1 makes zero sense, especially for a stress test. You have to jump through so many logic hoops to get to "well, maybe this isn't true" that you are stretching the realm of reality to fit a worldview. With these power consumptions matching only in the case that they shrunk the chip and used A72s, anyone outside the whole "Nintendo" speculation world would conclude it's obviously a new SoC and new clocks, targeting the same power consumption.

This was actually your idea; I just did the legwork, as you said. Good job coming up with the solution to this problem. Enjoy the fruits of that idea, because they are ripe.

Haha thanks, and I do appreciate you doing those calculations. Again, I don't think you need to do much more convincing to me personally. It's just hard to take this Foxconn leak as a 100% accurate depiction of the final hardware and clocks when it has not been corroborated whatsoever, and the most recent report from a very trustworthy source is saying differently.

Again, it's hard to deny this leak has some legitimate information, and it's certainly painting a certain picture that adds up, but I'd be a bit cautious about jumping in 100% before this has been corroborated elsewhere. I'm hopeful that this is representative of the final hardware but we really can't be sure yet.
 

z0m3le

Banned
Haha thanks, and I do appreciate you doing those calculations. Again, I don't think you need to do much more convincing to me personally. It's just hard to take this Foxconn leak as a 100% accurate depiction of the final hardware and clocks when it has not been corroborated whatsoever, and the most recent report from a very trustworthy source is saying differently.

Again, it's hard to deny this leak has some legitimate information, and it's certainly painting a certain picture that adds up, but I'd be a bit cautious about jumping in 100% before this has been corroborated elsewhere. I'm hopeful that this is representative of the final hardware but we really can't be sure yet.

This is very reasonable, we should be hearing something soon. I would like to point out a very noted insider did make an unusual comment in the Eurogamer clocks thread. It tipped some of us off that something wasn't right. I won't comment who it was, but if you are reading this, thanks.
 

Chronos24

Member
Though I understand most of what you guys are discussing on clock speeds and whatnot, for those who don't completely understand, what is the bottom line here? Does this mean more powerful than previously speculated?
 
Though I understand most of what you guys are discussing on clock speeds and whatnot, for those who don't completely understand, what is the bottom line here? Does this mean more powerful than previously speculated?

If the specs from the Foxconn leak in the OP are final, then we're looking at a GPU 20% faster than previously thought, in both handheld and console mode, and a CPU somewhat on par with that of PS4/XB1s (better at some tasks, worse at others).
 

z0m3le

Banned
Though I understand most of what you guys are discussing on clock speeds and whatnot, for those who don't completely understand, what is the bottom line here? Does this mean more powerful than previously speculated?

The GPU is 20% faster, a relatively small increase. Factoring in architecture improvements, instead of being like 4 Wii U's duct-taped together it's more like 5. As a portable it's 2.

The CPU is as fast as or faster than PS4's despite having half the cores.

Also likely that memory is 50GB/s.

To give a better understanding of that 20%: it is not enough to change the resolution of a game from 900p to 1080p. However, if your game was getting 50fps because of the GPU, it would now get 60fps.
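A quick sketch of why 20% covers a frame-rate bump but not a 900p-to-1080p jump (assuming GPU-bound work that scales with pixel count and frame rate):

```python
# 900p -> 1080p needs ~44% more pixel throughput; 50fps -> 60fps needs only 20% more.
pixels_900p  = 1600 * 900     # 1,440,000 pixels
pixels_1080p = 1920 * 1080    # 2,073,600 pixels

print(round(pixels_1080p / pixels_900p, 2))  # 1.44 -- more than a 20% boost can cover
print(50 * 1.2)                              # 60.0 -- exactly what a 20% boost buys you
```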
 

maxcriden

Member
Regarding the possibility of 4G:

Coming from Nintendo's David Young...

On preorder allotments

“What we’re doing is, of course, we want to meet demand, right?. I mean, it doesn’t do us any good not to, right? I mean, it’s in our best interest, and we certainly will do our best to meet that. Mr. [Nintendo president Tatsumi] Kimishima announced that during that launch window, you know, he announced a worldwide number of what’s available — I believe he said there was two million … so that’s in that launch window. So that’s the, you know, sort of the starting point, and then, we’ll have ways to react depending on what happens there, right? I mean, there’s probably some levers to pull if we need to accelerate things.”

On tech specs

- “more powerful than Wii U”
- no 4K mode
- 720p in handheld mode
- uses Wi-Fi and will not use 4G or LTE for any type of wireless connection
- does not have an optical out

On battery life

- screen has a brightness setting you can use to get more battery out of the Switch

On continuing with 3DS

“The Nintendo 3DS line is very healthy. We’ve had a great year. We’ve had several months of year-on-year growth … so that’s not going away. You know, we’ve got a good lineup of games we’re talking about in 2017 … so yeah, that business is strong. It’s not going anywhere. So, we’re still a company that makes, you know, the on-the-go games for the handheld system and these great experiences for the home console. But now, you can take your home console with you.”

On online play

“Yeah, you know, this’ll allow really a more robust kind of environment and development of that online. That’s what we’ve been discussing, is that this app will allow you [to use] the voice chat.”

On leaked info

“Yeah, absolutely. I mean, we keep our eyes open on those kind of things and it is disappointing because, you know, you want to deliver a message with the maximum impact, right? And so, we design things like today, right? And like the broadcast from Japan last night. You want to have things that are synchronized and timed and sometimes, when [there are] leaks, it will interfere with that. But, you know, we do the best we can, right? Sometimes, these things happen when you have this information spread in a broad way, you know. There’s somebody that always wants to say, ‘Hey, I know something you don’t know,’ and post things. And that stuff happens. But I think, for the most part, we did pretty well on moving through and sharing the information. And today, I think, is going really great for us.”

http://venturebeat.com/2017/01/18/n...ility-battery-life-leaks-and-the-3dss-future/ (via GoNintendo.com)
 

Fafalada

Fafracer forever
Durante said:
... or it's memory bandwidth limited.
Or primitive throughput, or a number of other things in the pipeline. Not everything in the graphics pipeline scales with resolution.
 
The GPU is 20% faster, a relatively small increase. Factoring in architecture improvements, instead of being like 4 Wii U's duct-taped together it's more like 5. As a portable it's 2.

The CPU is as fast as or faster than PS4's despite having half the cores.

Also likely that memory is 50GB/s.

To give a better understanding of that 20%: it is not enough to change the resolution of a game from 900p to 1080p. However, if your game was getting 50fps because of the GPU, it would now get 60fps.
That's pretty amazing if true. I want to keep my fingers crossed, but I'm keeping expectations low. I'm more anxious about finding out the actual specs than about the launch day itself.
 

Rodin

Member
The GPU is 20% faster, a relatively small increase. Factoring in architecture improvements, instead of being like 4 Wii U's duct-taped together it's more like 5. As a portable it's 2.

The CPU is as fast as or faster than PS4's despite having half the cores.

Also likely that memory is 50GB/s.

To give a better understanding of that 20%: it is not enough to change the resolution of a game from 900p to 1080p. However, if your game was getting 50fps because of the GPU, it would now get 60fps.

If the specs from the Foxconn leak in the OP are final, then we're looking at a GPU 20% faster than previously thought, in both handheld and console mode, and a CPU somewhat on par with that of PS4/XB1s (better at some tasks, worse at others).
Remember that Emily (or LKD) leak about Mario getting better performance around October/November? There sure seem to be quite a lot of coincidences around this.
 

Randomizer

Member
Remember that Emily (or LKD) leak about Mario getting better performance around October/November? There seem to be quite a lot of coincidences.
How would the performance of an in-house Mario get leaked? I understand third parties leaking stuff, but to be privy to information about Mario Odyssey you'd really need to be working directly for NCL as a member of Nintendo SPD.
 
How would the performance of an in-house Mario get leaked? I understand third parties leaking stuff, but to be privy to information about Mario Odyssey you'd really need to be working directly for NCL as a member of Nintendo SPD.

LKD specifically has a source within Nintendo. Anyway I think the quote about Mario struggling with performance before improving was from Tom Phillips of Eurogamer, which would indeed line up with the Foxconn leaked hardware.

And I think Eurogamer has pretty trustworthy sources.
 
Oh I actually thought that was reported by Nintendo themselves.

Not sure how DF determined the resolution of the undocked mode. Probably got told by a Nintendo rep as I think the direct footage they've captured was of the docked mode.

But yeah, in the IGN article, Nintendo only officially talked about the docked mode resolution.

Who has been talking about an SCD?

There was some speculation about the more advanced unit (most probably a dev kit, since they are already manufacturing 2000 a day; a 10th of the retail unit volume, but we don't know for how long they were hoping to keep that up) being an SCD.

I just think it's a Dev Kit as they usually have more memory to allow debugging. Why the apparent discrepancy in power, then? Dunno, maybe the Dev Kit allows for running the docked and undocked modes simultaneously (one on the device's screen and the other on an external monitor) so developers can compare performance in real time?

I think the SCD concept is a bit far-fetched and goes against Nintendo's philosophy of simple design. Moreover, the USB-C specification the Switch is using probably won't have enough bandwidth to support such expansion satisfactorily (unless they are using some unannounced proprietary technology to increase the bandwidth).
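For a sense of scale on that bandwidth point, here are the standard interface figures (raw signaling rates from the published specs, before encoding overhead; which mode Switch's port actually runs hasn't been confirmed):

```python
# Raw link rates for common external interfaces, in GB/s (spec figures, not Switch measurements).
def gbps_to_gb_per_s(gbps):
    return gbps / 8

print(gbps_to_gb_per_s(5))    # 0.625 GB/s -- USB 3.1 Gen 1
print(gbps_to_gb_per_s(10))   # 1.25 GB/s  -- USB 3.1 Gen 2
print(gbps_to_gb_per_s(40))   # 5.0 GB/s   -- Thunderbolt 3, what PC eGPU enclosures typically use
```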
 

elohel

Member
Something to add. Can't do it myself as I'm on mobile. However, the source speculated that TSMC was responsible for the chips inside it (Switch). This is a big deal if true, because TSMC was tied into the rumours about the deal Nvidia gave Nintendo on the chips/hardware support.

It could be another way all these crazy NX/Switch rumours tie together

Agreed, that would be strange, as usually only a few are true and the rest never get reconnected lol
 

z0m3le

Banned
Not sure how DF determined the resolution of the undocked mode. Probably got told by a Nintendo rep as I think the direct footage they've captured was of the docked mode.

But yeah, in the IGN article, Nintendo only officially talked about the docked mode resolution.



There was some speculation about the more advanced unit (most probably a dev kit, since they are already manufacturing 2000 a day; a 10th of the retail unit volume, but we don't know for how long they were hoping to keep that up) being an SCD.

I just think it's a Dev Kit as they usually have more memory to allow debugging. Why the apparent discrepancy in power, then? Dunno, maybe the Dev Kit allows for running the docked and undocked modes simultaneously (one on the device's screen and the other on an external monitor) so developers can compare performance in real time?

I think the SCD concept is a bit far-fetched and goes against Nintendo's philosophy of simple design. Moreover, the USB-C specification the Switch is using probably won't have enough bandwidth to support such expansion satisfactorily (unless they are using some unannounced proprietary technology to increase the bandwidth).

They made 2000; there is no continual production of these units. 2000 would suggest that they are going out to developers, as I can't see internal use of 2000 of these units making any sense.

As for USB-C, even the USB 3.1 Type-C they are likely using would carry enough bandwidth for an eGPU, and the short connection between the Switch and the dock should allow for no signal degradation.

The Switch is being produced at 20,000 units a day, which would put them at 9m units at current production levels. (The screen manufacturing would line up with 1 extra shipment and 1 in production by the end of the year, or 10.2m Switch screens shipped to Nintendo by year's end.)
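The arithmetic behind those figures, with the time window labelled as an assumption since the post doesn't spell it out:

```python
# Production-rate arithmetic implied above (the exact totals depend on the time window assumed).
dev_units_per_day    = 2_000
retail_units_per_day = 20_000
print(dev_units_per_day / retail_units_per_day)   # 0.1 -- "a 10th of the retail unit volume"

days = 365  # assumption: roughly a year of production at the current rate
print(retail_units_per_day * days / 1e6)          # 7.3 million from ongoing production alone
# The 9m/10.2m figures above presumably also fold in units already built for the launch window.
```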
 

Arizone

Banned
I should say that I do absolutely expect Nintendo to release new Switch hardware over the next few years, but I was assuming something along the lines of:

2017: Switch
2018: Switch Pocket (~5" screen, cheaper, no removable controls, no dock, uses Switch SoC at portable clocks)
2019: Switch Home (traditional home console, maybe 1.5-2TF for 4K Nintendo games and 1080p third party ports)

Nintendo have talked about iOS and Android and "brothers in a family of systems" for quite a while, and from both technological and business points of view it makes sense to have a few different form factors which all play the same game. Having a hybrid as the first one also makes some kind of sense in terms of trying to hit as wide an audience as possible before the more specialised devices hit.

That said, the concept of a SCD dock doesn't really fit that narrative. Instead of a "buy whatever you want, they all play the same games!" message, it's a "buy this device, but only if you have this other device" message, and if it's as powerful as the rumour suggests then there's also "oh, by the way, some games will only play on the second device, but you still need the first device to use the second one". It also just seems like overkill for what Nintendo would want or need from a home console-like setup.

I thought the same during the NX rumors, but the problem I have now is that the "Switch" name and logo start to fall apart when you don't revolve around the single core device like an SCD would. I also don't think this SCD would have to be that expensive sold standalone or optionally bundled at a discount, but I think it's important to note that the current dock despite doing essentially nothing has an inflated $90 price, possibly preparing consumers for whatever comes next.

Back to the name and logo, a dedicated home console would immediately lose "switching" to handheld, and is only left with "switching" the joy-cons between the grip and individual motion controllers, and that's if it didn't just come bundled with a Pro controller instead. Add in smaller handhelds and the joy-cons can't be attached, so either you lug them around separately or build some of the functions into the device but lose some game compatibility too which I doubt they want. The 3DS XL already sells better than the smaller unit anyways, right?

A 1080p upgrade down the line with full backwards compatibility is what I would most expect, just like the DSi and NN3DS, with or without the SCD being true. Until this happens I would not expect any of the games being exclusive to better hardware ("some games will only play on the second device"), just like the PS4 Pro will not have exclusives.
 

z0m3le

Banned
I thought the same during the NX rumors, but the problem I have now is that the "Switch" name and logo start to fall apart when you don't revolve around the single core device like an SCD would. I also don't think this SCD would have to be that expensive sold standalone or optionally bundled at a discount, but I think it's important to note that the current dock despite doing essentially nothing has an inflated $90 price, possibly preparing consumers for whatever comes next.

Back to the name and logo, a dedicated home console would immediately lose "switching" to handheld, and is only left with "switching" the joy-cons between the grip and individual motion controllers, and that's if it didn't just come bundled with a Pro controller instead. Add in smaller handhelds and the joy-cons can't be attached, so either you lug them around separately or build some of the functions into the device but lose some game compatibility too which I doubt they want. The 3DS XL already sells better than the smaller unit anyways, right?

A 1080p upgrade down the line with full backwards compatibility is what I would most expect, just like the DSi and NN3DS, with or without the SCD being true. Until this happens I would not expect any of the games being exclusive to better hardware ("some games will only play on the second device"), just like the PS4 Pro will not have exclusives.

Good point with the logo. A 1080p Switch does make the most sense for an iteration, and we should even know the GPU performance of such a device: either it will be the 472 GFLOPS that the original Switch has when docked, or 1062 to 1180 GFLOPS for a new 1080p target, meaning that the current Switch would run at full clock and target 720p with future titles once the "new" Switch is released.
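Where those GFLOPS figures come from (assuming a 256-core, 2-SM, TX1-style GPU doing 2 FP32 ops per core per clock, which is the usual back-of-envelope math):

```python
# FP32 GFLOPS = cores * ops-per-clock * clock; the 1080p target scales by pixel ratio.
def gflops(cuda_cores, clock_mhz, ops_per_clock=2):
    return cuda_cores * ops_per_clock * clock_mhz / 1000

print(round(gflops(256, 921.6)))                  # ~472 -- current Switch docked at the Foxconn clock
print(round(472 * (1920 * 1080) / (1280 * 720)))  # ~1062 -- 472 scaled by the 1080p/720p pixel ratio
print(round(472 * 2.5))                           # 1180  -- the higher figure, using a 2.5x scaling instead
```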
 

Durante

Member
Or primitive throughput, or a number of other things in the pipeline. Not everything in the graphics pipeline scales with resolution.
But what we are discussing here is why the resolution doesn't scale more between the docked and undocked modes. "Primitive throughput" seems a very unlikely answer to that, given that (a) as you say it doesn't scale with resolution, and (b) it actually does increase proportionally to the clock increase when docked.

Also likely that memory is 50GB/s.
Why?
 

Rodin

Member
But what we are discussing here is why the resolution doesn't scale more between the docked and undocked modes. "Primitive throughput" seems a very unlikely answer to that, given that (a) as you say it doesn't scale with resolution, and (b) it actually does increase proportionally to the clock increase when docked.

Why?
How can it run Zelda, MK8 and Fast at a higher resolution if Wii U had at least around 50GB/s of peak bandwidth? Not even Maxwell/Pascal color compression should be able to do more than twice the res with half the bandwidth in two of those.

It seems more likely to me that Zelda, being an enhanced port of a game for a different machine developed on the side, would have some kind of problem hitting 1080p like the other two do, rather than Nintendo bandwidth-starving their machine.
 

Durante

Member
How can it run Zelda, MK8 and Fast at a higher resolution if Wii U had at least around 50GB/s of peak bandwidth? Not even Maxwell/Pascal color compression should be able to do more than twice the res with half the bandwidth in two of those.
Because that peak bandwidth was harder to leverage? Or because of tiled rasterization and a more effective L2 cache? Or for lots of other reasons.

Or maybe because it has 50 GB/s of external memory bandwidth. It could be the case.

But, at least from my uneducated perspective, nothing here is "obvious".
 

Rodin

Member
Because that peak bandwidth was harder to leverage? Or because of tiled rasterization and a more effective L2 cache? Or for lots of other reasons.

Or maybe because it has 50 GB/s of external memory bandwidth. It could be the case.

But, at least from my uneducated perspective, nothing here is "obvious".
I meant more "likely". Anyway, I'll repost this quote from Manfred Linzner about Wii U memory bandwidth. I still remember people saying that it was bandwidth starved "cuz 12.8GB/s", and none of them pointed at the CPU's rather large cache or those 32MB of eDRAM being part of a more complex but balanced memory subsystem, despite the fact that literally zero devs ever complained about bandwidth while shitting on the CPU at every turn. Now you're saying there are other possibilities, like having more external bandwidth on Switch, which certainly makes more sense to me than "25GB/s, period". Still, I don't see any major challenge in using a 128-bit bus in a device this size, and it's certainly not a cheap one either.

Hopefully marcan will test this day one if we don't get other leaks.
 

z0m3le

Banned
Thraktor's post about this leak from earlier in the thread; he noticed the Foxconn employee saw 2 memory chips on the Switch's final hardware:
Speculated 2x RAM = 4GB - This is kind of interesting, as it implies that there are two memory modules. LPDDR4 modules are typically available at either 32 bit or 64 bit bus width per module, and although Shield TV uses two 32 bit modules for a total 64 bit bus, this almost never happens in a tablet, as space is much more limited, and a single 64 bit chip could be used instead. The only tablet I can think of which uses two LPDDR4 modules is the iPad Pro, which uses two 64 bit modules, as it's the only way to get a 128-bit bus. This doesn't really specify anything one way or the other, but it does leave the option open for a 128-bit memory bus.

Also his post on the battery and charging from here:

So I come into this thread, read the OP, see the 3 hours charge time, think "that's pretty reasonable", and see that they're using a 4310 mAh battery, which is on the top end of what I would have expected. Good stuff.

And then I start reading the replies... holy hell. I know neogaf has a reputation for over-reacting to things, but this thread is just straight up nuts. Go outside, take a walk, have a breath of fresh air, and when you've calmed down you can come back and we can actually look at the facts.

You're back? Great, nice day outside, isn't it? Okay, now let's talk batteries.

Battery Size

At 4310 mAh, Switch has by far the highest-capacity battery of any gaming device ever made. Here's what's currently out there:

3DS / 2DS - 1300 mAh
New 3DS - 1400 mAh
(New) 3DS XL - 1750 mAh
PS Vita - 2200 mAh

That's about twice the capacity of PS Vita or almost three and a half times the capacity of the 3DS or 2DS.

Now, you're surely saying "but X phone or Y tablet has such-and-such a capacity, Nintendo must be able to fit more in there!", but I'm willing to bet X phone and Y tablet aren't actively cooled. A typical phone or tablet these days is basically a battery with some electronics and a screen attached. For a phone the battery could account for 60-70% of the internal volume, and it can be even higher for tablets.

Nintendo doesn't have the luxury of dedicating so much internal space to batteries, because it has to fit a fan and heatsink in there, which likely occupies as much as half the space between the screen and the rear of the case. Have a look at this rear-view photo of the Switch:

[switch_battery_space.jpg: rear view of the Switch with the battery area (blue box) and cooling area (red box) marked]


The blue box I've included is pretty much the best-case-scenario for how much space Nintendo can allocate to the battery in Switch. If you look at the fan vents at the top and bottom of the unit you can see where the heatsink and fan are going to sit (and I've been pretty conservative with the red box for this, as I'm not even including the area around the third vent to the lower right). Then on the left we've got space taken up by the game card slot, microSD slot, kickstand, and likely part of the logic board (which will overlap the cooling system).

There's no space for a bigger battery. The reason I say 4310 mAh is at the top end of what I would have expected is because there's just nowhere to put anything bigger. A 4310 mAh battery is basically Nintendo squeezing as big a battery as they possibly can in there. Quote me on this, when we see teardowns in March there isn't going to be some big gap where they could have put a bigger battery. They've squeezed in as much as they can without increasing the physical size of the device.

Shows how valuable real estate is inside the Switch, so 2 32-bit memory chips don't make much sense when they can do the same job with 1 64-bit memory chip that would likely cost less money, especially in the long run. Far more likely that it is 2 64-bit chips for bandwidth, but this is my speculation; Thraktor's is simply that it could be.
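To show where the 25GB/s and ~50GB/s figures come from, here's the standard LPDDR4 math (assuming 3200MT/s modules, the usual figure for X1-class devices):

```python
# Peak LPDDR4 bandwidth = (bus width in bytes) * transfer rate.
def lpddr4_gb_per_s(bus_width_bits, mega_transfers_per_s=3200):
    return bus_width_bits / 8 * mega_transfers_per_s / 1000

print(lpddr4_gb_per_s(64))    # 25.6 GB/s -- one 64-bit chip (or 2x 32-bit chips)
print(lpddr4_gb_per_s(128))   # 51.2 GB/s -- 2x 64-bit chips, the ~50GB/s scenario
```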
 