
The Curious Case of the Switch Foxconn Leak (Now a hardware fanfiction thread)

prag16

Banned
that difference seems pretty similar to XB1 vs PS4, I wouldn't call the power gap between them "significant"

Okay. Based on Eurogamer's report it's 2.5x as powerful, and likely more like 3x or 4x, given that even handheld mode is probably more powerful than Wii U. If you don't think that's significant, keep in mind Xbone to PS4 is more like 1.5x, as we know.
 

Lord Error

Insane For Sony
No, that is incorrect.

Zelda Wii U: 720p30 with significant drops
Zelda Switch docked: 900p30 with minimal drops
Zelda Switch mobile: 720p30 with minimal drops

MK8 Wii U: 720p60 (720p30 for 3/4 players)
MK8 Switch docked: 1080p60 (1080p30 for 3/4 players)
MK8 Switch mobile: 720p60 (720p30 for 3/4 players)

Handheld mode seems only marginally more powerful than Wii U. Docked mode is significantly more powerful.
I see - I mixed up MK8's resolution with SSMB's resolution. I knew there was one very prominent game on Wii U that pulled off 1080p/60. So undocked Switch is slightly better in power than Wii U, rather than that being the case when it's docked like I originally thought. That's much better - very happy with that.
 

Thraktor

Member
Wait, so the leaker himself explicitly said the A73 (or A72) piece is his own speculation? I thought the OP stated that there was some power draw reason why it couldn't be A57 based on his info. If that's all 100% speculation then that certainly changes the picture for me.

Well, I can't speak for what the leaker said precisely, but the reddit translation says the following:

*Speculated CPU is arm A73 pascal, much powerful than X1, when tested it only shows ARM_V8 structure hence the speculation

z0m3le then says that it has to be A72 or A73, as A57 wouldn't be able to clock this high. This is true if we were talking about actual game clocks for portable mode, but from the evidence given to us all we can say is that these are the clocks used for thermal stress testing, which may bear no relation to clocks used in game. And from a stress testing point of view, A57 cores could absolutely get to 1.78GHz on 20nm, so there's no way to rule them out.

Thraktor, let me blow your mind.

What happens if you downclock a GTX 1060 to 768MHz? You get 1.77 teraflops.

1770 / 384 (Switch docked GPU) ~ 4.6094

720p to 1080p is a 2.25x bump in resolution. Switch provides 2.5x the power in docked mode. That's an extra 11% of headroom.

1080p to 4K is a 4x bump in resolution. The SCD would provide 4.6094x the power. That's an extra 15% of headroom.
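To sanity-check those numbers, here's a minimal sketch (assuming the 1152-core 3GB GTX 1060, which is what the 1.77TF figure implies, and the speculated 2 SM / 256-core Switch GPU; the 384 figure above is a rougher rounding of the same maths):

```python
# Sketch: peak FP32 throughput = CUDA cores x 2 FLOPs/cycle (FMA) x clock.
def gflops(cuda_cores, clock_mhz):
    return cuda_cores * 2 * clock_mhz / 1000.0

gp106_downclocked = gflops(1152, 768)  # ~1769 GFLOPS, i.e. ~1.77 TF
switch_docked = gflops(256, 768)       # ~393 GFLOPS (2 SMs at the Eurogamer docked clock)

print(gp106_downclocked / switch_docked)  # ~4.5x (the 1770/384 above gives ~4.61)
print((2.5 / 2.25 - 1) * 100)             # ~11% headroom going 720p -> 1080p
print((4.6094 / 4.0 - 1) * 100)           # ~15% headroom going 1080p -> 4K
```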

I don't see the reason to take a GPU like GP106 and clock it that low, compared to using a smaller (and cheaper) Pascal at higher clocks. As a point of reference, GP104 (which is the chip used in the GTX1080, with twice the SMs of GP106) consumes just 36W at 1GHz. A GP106-equivalent at 1-1.2GHz would probably only consume Wii U levels of power, and could be cooled with a fairly small heatsink and fan.

I want to see this SCD also be Nintendo's VR solution with a 1080p screen in it, but it would set fire to people's eyes due to also containing a fucking 1060.

So long as it comes with a neck brace to handle the weight I'm fine.

Seriously, though, although I'm intrigued to see what kind of craziness would result from Nintendo's strap-a-Switch-to-your-face VR patent, I'd really prefer if they just supported OpenVR headsets if they ever do VR. I don't own a separate TV for each of my consoles, and I'd prefer not to have to deal with a separate headset for each of them either.

I mean, if devkits are already out with an SCD, shouldn't this thing be coming out relatively soon? Like early next year?

It's hard to say. If it's just an off-the-shelf GP106 being used as a substitute while a custom chip is developed, then it could be a while. Nintendo were using kits with off-the-shelf Radeon GPUs for Wii U about two years before it launched. On the other hand if it's a late dev kit and they're just using the GP106 as-is (or have a similar custom GPU already completed) then in theory it could be released as early as the end of this year.

It could just be test cases.

This supposed dock would cost several (3-5) hundred dollars, lol. I can't see it hitting retail for at least a year, if not two.

Well, it may not have to be that expensive. In its most stripped-down form (where it's just the GPU and 4GB of RAM) it could conceivably be (relatively) affordable. The GP106 is about 200mm² (compared to about 300mm² for PS4 Pro and potentially ~400mm² for Scorpio), it would have half the RAM of the PS4 Pro, and it wouldn't include a controller, disc drive, hard drive, or a number of other components. In theory it could be quite a bit cheaper than the "competition". Of course in this scenario you need a Switch for it to work, so the total cost of both a Switch and an SCD would almost certainly be more than the cost of a PS4 Pro or a Scorpio.

Alternatively, if they were to take the route of making it a fully-fledged home console, then you have to add the cost of a CPU, a controller, potentially a hard drive, etc., etc., and you're into PS4 Pro/Scorpio territory.

I imagine they would sell it as a bundle. This would basically act as a complete console to compete with PS4 Pro and Scorpio for the attention of people who want 4K gaming, while existing Switch owners could purchase just the dock in order to upgrade at a lower price. Assuming they can develop games to scale easily between each configuration, this would be a pretty good value proposition for current Switch owners, and would also let developers easily make games for a wider range of consumer preferences.

EDIT: To explain a little bit more clearly, the proposed Switch configuration would ideally be able to run the same game in every graphics mode, which would seem to me to avoid the trap set by previous add-on setups like 32X or Sega CD. If anything, it might widen the audience for an individual game by being able to cater to a larger range of consumer tastes, such as a standard low cost Switch option for casual gamers vs the Switch + SCD bundle for gamers who care about 4K.

The problem with selling it as a bundle is that you have to include all the Switch's components (screen, battery, etc., etc.) in the cost, pushing the price up above PS4 Pro and Scorpio even though much of the target audience may not have any interest in the portable component. It would be cheaper, and sell to a wider audience, if they made it a stand-alone device which, if the player wants, can interact with their Switch. They could potentially sell it in an optional bundle with the Switch (Sony sold PS4 + Vita bundles for a while), but I can't see them doing that as the main SKU.

Didn't Todd Howard from Bethesda say he saw the most impressive demo he's ever seen from the Switch? The SCD patent is as old as the collaboration with Nvidia per SemiAccurate's original leak; what if Nintendo has been shopping this power dock around with the Switch from the beginning?

I'm with Skittzo0413 on this, I think it makes most sense as a demo of the HD rumble (which would be nice to see in Skyrim, actually). Besides, these dev kits were supposedly only manufactured in November, and if they had been demoing anything similar to this since last E3 I'm sure we would have heard of it.

Since no one actually reads the Foxconn leak: "the software demo testing is millions of fish and running almost 8 days, there's no single frames drops". This was at 1.78GHz and 921MHz. What we know of the X1 would lead us to believe that such a demo would throttle.

It would lead us to believe that it would throttle in the Shield TV. Throttling is a function of power delivery and cooling, which can change from one device to another even with an identical SoC. Switch could have more effective cooling, or be capable of delivering more power to the SoC, or it could simply have higher thermal limits applied, for the sake of the test (as they're stress-testing the cooling system).

Extra memory on the SoC seems unlikely. The leaker states a 10x10mm die area, which is already smaller than the X1, and having eDRAM or eSRAM takes a lot of die area, which reduces the space available for the actual GPU on the die.

It depends on how much memory is added. Switch wouldn't need a full 32MB due to the tile-based rendering, so they could have only added a couple of MBs, which would be within the margin of error for a chip measured in millimetres on each side. There's also hardware they could remove from the TX1 (such as some of the video codec block) to reduce the size.

My mind can't roll with this from a messaging standpoint.

Reggie goes on talk shows and says the entire console is in the tablet, there's nothing in the dock.

They reveal the thing and set everyone's minds to it having X level of performance. The keynote happened, advertising started, the site is up, gameplay video is up.

And then...What? Maybe a year or two down the line they advertise the new hardware addon? It just feels messy to me somehow, even more than the Pro/Scorpio.


There's also the technical side, with everything they've said it's a USB C port on the Switch, 5Gb/s if gen 1, 10Gb/s if gen 2...Thunderbolt 3 at 40Gb/s limits a card like the 1060 to ~85% of its performance.

There's also that it's Nintendo's fanbase; how many would spend maybe 550 dollars for the more powerful version of their console? And if a small number of people buy it, who would develop for it? You could say first party games matter most on Nintendo platforms, but do those need a dock with roughly a 1060? They could use some AA and AF lipstick, sure, but...


I guess I just have a lot of questions :p

Yeah, it's confusing.

I should say that I do absolutely expect Nintendo to release new Switch hardware over the next few years, but I was assuming something along the lines of:

2017: Switch
2018: Switch Pocket (~5" screen, cheaper, no removable controls, no dock, uses Switch SoC at portable clocks)
2019: Switch Home (traditional home console, maybe 1.5-2TF for 4K Nintendo games and 1080p third party ports)

Nintendo have talked about iOS and Android and "brothers in a family of systems" for quite a while, and from both technological and business points of view it makes sense to have a few different form factors which all play the same game. Having a hybrid as the first one also makes some kind of sense in terms of trying to hit as wide an audience as possible before the more specialised devices hit.

That said, the concept of a SCD dock doesn't really fit that narrative. Instead of a "buy whatever you want, they all play the same games!" message, it's a "buy this device, but only if you have this other device" message, and if it's as powerful as the rumour suggests then there's also "oh, by the way, some games will only play on the second device, but you still need the first device to use the second one". It also just seems like overkill for what Nintendo would want or need from a home console-like setup.

But there's not a whole lot else the rumoured dev kit could be. A portable or hybrid Switch 2 with a GP106 is obviously preposterous. A dedicated home console wouldn't have any need for a screen attached. I'd happily subscribe to any other explanation of what it might be, but from the possibilities I can think of the "turbo dock" seems the most likely, and the fact that it fits Nintendo's SCD patent is another point in its favour.

Regarding USB-C, Nintendo is free to use its own alt-mode without restricting itself to USB 3 or Thunderbolt or anything like that, and benefits from the fact that it would be a direct connection, with no signal degradation which would be expected over a long cable. USB-C alt mode gives (as far as I can tell) 10 data pins to play with, which would be enough for, as an example, a dual-lane PCIe 3 connection, providing 2GB/s (16Gb/s). Of course they could use whatever protocol they want, but PCIe is a useful short-range example.
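As a rough sketch of that dual-lane PCIe 3 example (8 GT/s per lane with 128b/130b encoding; alt-mode pin budgets and protocol overheads are simplified away here):

```python
# Sketch: usable bandwidth of a 2-lane PCIe 3.0 link.
GT_PER_LANE = 8e9     # PCIe 3.0: 8 GT/s per lane
ENCODING = 128 / 130  # 128b/130b line encoding overhead

lanes = 2
bits_per_s = lanes * GT_PER_LANE * ENCODING
print(bits_per_s / 1e9)      # ~15.75 Gb/s, i.e. the ~16Gb/s above
print(bits_per_s / 8 / 1e9)  # ~1.97 GB/s, i.e. the ~2GB/s above
```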
 
Isn't the only fully comparable game running at basically identical resolution and performance on Switch as it does on Wii U? MK8 is 1080/60 in 1/2 player mode, and 1080/30 in 3/4 player mode on both, no? Even looking at paper performance from Eurogamer's specs it really doesn't seem like something more powerful than Wii U, much less 4x more powerful like you're suggesting it should be (unless I'm reading wrong).

MK8 was built for the Wii U, and Nintendo might intentionally just want to conserve their resources and only do minimal upgrades, like a resolution bump, instead of fully taking advantage of the Switch hardware. When we see a Mario Kart game built for Switch from the ground up, it should look significantly better than MK8.

We'll know in a few months just how powerful Switch is. If we assume DF's clock speeds with 2 SMs for the GPU, then the Switch could be 1.5x as powerful as the Wii U in portable mode and close to 4x in docked mode. We'll see.
 

z0m3le

Banned
I agree with you, Thraktor. The SCD could target 4K if the Switch uses full clocks and targets 720p, so there is that. It also leaves room for an upgraded Switch somewhere down the road that is 2.25 to 2.5 times faster and targets 1080p.

If they really wanted to, they could even target 480p with a micro Switch that has a 3.5 or 4 inch screen.

I think the big problem with the SCD being a 1060 is that the CPU is only 1GHz with the Eurogamer clocks.

It's 6 weeks before we get these in our hands, so I think I'm done here for now. Hopefully Eurogamer can give us more specifics about timing, and a mod or the OP can update the thread.
 

Hermii

Member
z0m3le then says that it has to be A72 or A73, as A57 wouldn't be able to clock this high. This is true if we were talking about actual game clocks for portable mode, but from the evidence given to us all we can say is that these are the clocks used for thermal stress testing, which may bear no relation to clocks used in game. And from a stress testing point of view, A57 cores could absolutely get to 1.78GHz on 20nm, so there's no way to rule them out.

So this leak could be fully legit and nothing would necessarily contradict the EG article.
 
So this leak could be fully legit and nothing would necessarily contradict the EG article.

Yeah this is what I've been trying to articulate over the past few pages.

It's all a matter of timing. If EG heard the clock rates after this leak came out, which their article seems to suggest (but is not clear about), then that would mean Nintendo decided to lower the clock speeds from those used in the stress test.

If, however, EG's clock speed report came from July, then it's very possible that these clock speeds are the new final ones for the newer devkits which came in October.

Has anybody asked a DF employee yet?
 

Hermii

Member
Yeah this is what I've been trying to articulate over the past few pages.

It's all a matter of timing. If EG heard the clock rates after this leak came out, which their article seems to suggest (but is not clear about), then that would mean Nintendo decided to lower the clock speeds from those used in the stress test.

If, however, EG's clock speed report came from July, then it's very possible that these clock speeds are the new final ones for the newer devkits which came in October.

Has anybody asked a DF employee yet?

I can do it if someone tells me his username.
 

z0m3le

Banned
Yeah this is what I've been trying to articulate over the past few pages.

It's all a matter of timing. If EG heard the clock rates after this leak came out, which their article seems to suggest (but is not clear about), then that would mean Nintendo decided to lower the clock speeds from those used in the stress test.

If, however, EG's clock speed report came from July, then it's very possible that these clock speeds are the new final ones for the newer devkits which came in October.

Has anybody asked a DF employee yet?

The one response I know of from an employee at Eurogamer is "that rumor looks fake", which doesn't answer anything for us, as at this point it's more than a rumor.
 
Looks like October :)

Thanks!

I don't know how much that exactly helps clear this up though, because this leak was from the end of November, the DF article was from mid December, and reports of final devkits going out were around that same time in December.

Fall could mean anywhere from September to December, but it typically refers more to September-October from what I can tell.

Might just be since the Fall weather only lasts about that long where I live :p
 

Chronos24

Member
About the 4G part: Nintendo said there would be a free trial of certain online features for now. Now, suppose a 4G radio inside has been disabled for now and would be activated via firmware later, accessible via paid subscription. Just a thought. Is this even possible?
 

Theonik

Member
About the 4G part: Nintendo said there would be a free trial of certain online features for now. Now, suppose a 4G radio inside has been disabled for now and would be activated via firmware later, accessible via paid subscription. Just a thought. Is this even possible?
Considering they have already outlined the subscription service, it seems unlikely they wouldn't have mentioned it. Same goes for most of these rumours. If such major things were happening I wouldn't expect they'd keep it a secret. (With specs, at least, Nintendo is silent anyway.)

It depends on how much memory is added. Switch wouldn't need a full 32MB due to the tile-based rendering, so they could have only added a couple of MBs, which would be within the margin of error for a chip measured in millimetres on each side. There's also hardware they could remove from the TX1 (such as some of the video codec block) to reduce the size.
My point was that even if the inaccuracy of the person measuring, and Nintendo removing bits it doesn't want, were accounted for, the extra RAM might take extra space which we are not really sure is accounted for, especially since we can't speculate on its size without knowing the exact customisations Nintendo is going with.
 

ggx2ac

Member
About the 4G part: Nintendo said there would be a free trial of certain online features for now. Now, suppose a 4G radio inside has been disabled for now and would be activated via firmware later, accessible via paid subscription. Just a thought. Is this even possible?

It probably doesn't have 4G. The guy never mentioned an SD card slot, he probably mistook the microSD card slot for a SIM card slot.
 
About the 4G part: Nintendo said there would be a free trial of certain online features for now. Now, suppose a 4G radio inside has been disabled for now and would be activated via firmware later, accessible via paid subscription. Just a thought. Is this even possible?

Apparently it was likely that the leaker saw the SD card slot and assumed it was a slot for a SIM card, which led him to believe it would have 4G. This is the likeliest scenario I would say.
 

KingSnake

The Birthday Skeleton
Thanks!

I don't know how much that exactly helps clear this up though, because this leak was from the end of November, the DF article was from mid December, and reports of final devkits going out were around that same time in December.

Fall could mean anywhere from September to December, but it typically refers more to September-October from what I can tell.

Might just be since the Fall weather only lasts about that long where I live :p

We know that the latest devkits were sent out in October and also the retail unit was supposed to be in production by that time. I don't see any change of specs realistically possible after that. Sure, Nintendo could theoretically up the clocks at some point via patches, but I don't see it happening so fast. One day you must target these clocks, next day totally different clocks.
 

kyser73

Member
Isn't this discussion around devkits irrelevant? We've seen production hardware running a finished game. Foxconn will be in mass production mode by this point to hit that 2mn launch target - I'm guessing first shipments will already be on ships now, or at least close to it.

It isn't getting any more power, guys. Unless Nintendo decided to completely leave out a whole thing about a turbo-charger dock, much like MS "forgot" about the stacked GPU in the power brick in the Bone.

Title change on point.

4G? Again WHY WOULDN'T YOU TALK ABOUT SUCH A CORE FEATURE IN YOUR LAUNCH PRESENTATION?!?!!?
 
We know that the latest devkits were sent out in October and also the retail unit was supposed to be in production by that time. I don't see any change of specs realistically possible after that. Sure, Nintendo could theoretically up the clocks at some point via patches, but I don't see it happening so fast. One day you must target these clocks, next day totally different clocks.

I thought Nate (among others) said that the final devkits were sent out in late November or early December. He mentioned that when the Venture Beat article came out.

Otherwise, yeah I agree with you that this would mean the leaker's clock speeds were just for stress tests (if they weren't made up, which seems unlikely). It's a bit curious that they'd run the CPU at such a high clock rate as a stress test, but it's possible that they did go with A72s and are keeping open the option of upping those clocks one day via a firmware update. I'm not sure what reason they would have to run A57s at 1.78GHz to test a device that won't (and can't really) run them much higher than 1GHz.

Isn't this discussion around devkits irrelevant? We've seen production hardware running a finished game. Foxconn will be in mass production mode by this point to hit that 2mn launch target - I'm guessing first shipments will already be on ships now, or at least close to it.

It isn't getting any more power, guys. Unless Nintendo decided to completely leave out a whole thing about a turbo-charger dock, much like MS "forgot" about the stacked GPU in the power brick in the Bone.

Title change on point.

The discussion we're currently having is about a potential 20% boost in GPU clock speeds over those reported by DF, and a much larger boost in CPU clock speed. Neither of which would be noticeable from watching games. And it certainly seems like the leak was accurate, but not necessarily for the final clock speeds.
 

kyser73

Member
I thought Nate (among others) said that the final devkits were sent out in late November or early December. He mentioned that when the Venture Beat article came out.

Otherwise, yeah I agree with you that this would mean the leaker's clock speeds were just for stress tests (if they weren't made up, which seems unlikely). It's a bit curious that they'd run the CPU at such a high clock rate as a stress test, but it's possible that they did go with A72s and are keeping open the option of upping those clocks one day via a firmware update. I'm not sure what reason they would have to run A57s at 1.78GHz to test a device that won't (and can't really) run them much higher than 1GHz.



The discussion we're currently having is about a potential 20% boost in GPU clock speeds over those reported by DF, and a much larger boost in CPU clock speed. Neither of which would be noticeable from watching games. And it certainly seems like the leak was accurate, but not necessarily for the final clock speeds.

So basically trying to prove a website leak wrong over something not easily noticeable. Ok.
 

KingSnake

The Birthday Skeleton
I thought Nate (among others) said that the final devkits were sent out in late November or early December. He mentioned that when the Venture Beat article came out.

Towards the end of December we got this quote from Laura:

Like pretty much every games console, the Switch has gone through multiple dev kit iterations. There was one floating around in July that a LOT of info on specs has been based on. Another went out to bigger devs in October. Some bigger indies are still working from the July dev kits. The Oct kit is more powerful overall than the July dev kit.

The devkit info from NateDrake about devkits being shipped also talked about 5-8 hours of battery. We know how that worked out. Together with Pascal. Still hopes for that Metroid game, I guess. Also, NateDrake is banned and mods have already advised against using him as a source for info.
 
So basically trying to prove a website leak wrong over something not easily noticeable. Ok.

No one is claiming anyone is wrong, just that some info could be outdated. It's kinda silly to outright dismiss the Foxconn leak when you look at all the information they got right which would be very, very hard to guess.

Towards the end of December we got this quote from Laura:

The devkit info from NateDrake about devkits being shipped also talked about 5-8 hours of battery. We know how that worked out. Together with Pascal. Still hopes for that Metroid game, I guess. Also, NateDrake is banned and mods have already advised against using him as a source for info.

The Laura quote doesn't say anything about final devkits, but you may be right that the only source about final devkits was Nate. I thought I remembered Emily saying something similar but I can't find that now.

Also I had no idea he was banned!
 

Lord Error

Insane For Sony
An important thing to note about Switch is also that it has 4GB of RAM, most of which can be allocated towards the single game it is running. This is a huge contrast with iOS devices, for example, where you can only allocate a minuscule portion of the already much lower available RAM for any single app.
 

ggx2ac

Member
So are we nearly on lock at 394Gflops docked or is there anything that indicates otherwise?

Not certain yet.

It'd be great if the CPU/GPU clock speeds for running games were 1.78GHz and 921MHz respectively, because it would justify the price if they got something with an A72 and a 16nm node to keep clock speeds high. However, we'd still have to wait for more leaks, or a teardown when the Switch releases.
 

KingSnake

The Birthday Skeleton
I don't think a 20% increase in GPU speed would:

A) Be considered magic

or

B) Allow Zelda to render at 1080p if a 20% slower GPU could only do 900p.

So I'm not sure what you're talking about here.

This rumour at face value has better GPU clocks, much better CPU clocks and double the memory bandwidth if we consider it being Pascal Tegra. I'm pretty sure Zelda digs deep in all of the above. Even with the Eurogamer clocks the GPU should be able to handle the resolution increase from 720p to 1080p, so the limitations must be somewhere else, CPU or memory or both.
 

Metal B

Member
We have now confirmation from Nintendo that Zelda will indeed run at 900p30fps on Switch.

No magic happening yet.
Zelda (and also Mario Kart) maybe suffers from its position as a port. It needs to stay similar to the Wii U version, and this could hold the game back. Nintendo may want an easier time making changes to both versions and therefore shipped a fairly direct port, which takes away from performance (like emulating parts of the Wii U).
There are many games which run worse on better hardware.
 
This rumour at face value has better GPU clocks, much better CPU clocks and double the memory bandwidth if we consider it being Pascal Tegra. I'm pretty sure Zelda digs deep in all of the above. Even with the Eurogamer clocks the GPU should be able to handle the resolution increase from 720p to 1080p, so the limitations must be somewhere else, CPU or memory or both.

Or it could be that they are prioritizing other areas of the visuals, like draw distance, foliage detail, or AA which are probably more important on a big TV screen than a small 6.2 inch screen.

I just really don't think the Zelda visuals can tell us much about the hardware. Anyway, I think ARMS looks XB1 level from a visual standpoint (to me anyway), so clearly different games will look very different on the same hardware due to visual/processing priorities.
 

Vena

Member
This thread has spun into some crazy fan-fiction.

This rumour at face value has better GPU clocks, much better CPU clocks and double the memory bandwidth if we consider it being Pascal Tegra. I'm pretty sure Zelda digs deep in all of the above. Even with the Eurogamer clocks the GPU should be able to handle the resolution increase from 720p to 1080p, so the limitations must be somewhere else, CPU or memory or both.

Regardless of the veracity of the rumor, the Switch isn't going to be clocking its CPU at those clocks on a regular basis (the main use would be as highlighted previously: emulation when the GPU is under-used). Also as there is no Pascal Tegra that is really comparable to the X1 (the PX2 is really a different beast entirely), we don't even know if it will have double the bandwidth. They could have literally die shrunk the X1 to 16nm, so again, not really much changes in this scenario.

But these clocks, especially the GPU side, wouldn't really be making enough of a difference to somehow get us from 900p to 1080p. A 20% boost over the old supposed clocks just isn't enough to push ~45% more pixels, and even double the RAM bandwidth probably wouldn't gel.

There's just too much in this Foxconn leak that is of dubious or uncertain veracity versus outright speculation, and it's hard to verify. Durante highlighted some issues; Thraktor noted the dimensions not telling us much. We're just not going to know until someone snapshots the die or something.
 

ggx2ac

Member
Regardless of the veracity of the rumor, the Switch isn't going to be clocking its CPU at those clocks on a regular basis (the main use would be as highlighted previously: emulation when the GPU is under-used). Also as there is no Pascal Tegra that is really comparable to the X1 (the PX2 is really a different beast entirely), we don't even know if it will have double the bandwidth. They could have literally die shrunk the X1 to 16nm, so again, not really much changes in this scenario.

There's just too much in this Foxconn leak that is of dubious or uncertain veracity versus outright speculation, and it's hard to verify. Durante highlighted some issues; Thraktor noted the dimensions not telling us much. We're just not going to know until someone snapshots the die or something.

A57 at 1.78GHz ain't going to happen unless cores are disabled or the Switch is restricted to docked mode for the GCN emulation you suggest. The power draw for that CPU gets very close to 7W at that clock speed which is bad for portable mode alone.

That's why an A72 at 1.78GHz with a 16nm node puts it at an equivalent power draw to an A57 at 1GHz in a 20nm node.
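To illustrate that scaling, here's a rough sketch only: dynamic power goes roughly as P = C·V²·f, and the voltages below are placeholder guesses rather than measured A57 values:

```python
# Sketch: relative dynamic power P = C * V^2 * f (same chip, so C cancels out).
def rel_power(freq_ghz, volts, base_freq=1.0, base_volts=0.9):
    return (freq_ghz / base_freq) * (volts / base_volts) ** 2

# ~1.78x the clock at a guessed ~1.15V vs ~0.9V at 1GHz:
print(rel_power(1.78, 1.15))  # ~2.9x the power of the 1GHz baseline
```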
 

z0m3le

Banned
A57 at 1.78GHz ain't going to happen unless cores are disabled or the Switch is restricted to docked mode for the GCN emulation you suggest. The power draw for that CPU gets very close to 7W at that clock speed which is bad for portable mode alone.

That's why an A72 at 1.78GHz with a 16nm node puts it at an equivalent power draw to an A57 at 1GHz in a 20nm node.

Yeah, I personally don't see the point in those clocks being tested for 8 days if they aren't the shipped clocks. I mean, the X1 at those clocks, even on 16nm, would draw around 4 watts by itself and consume the battery in maybe an hour and a half in the full system.

At least the Eurogamer answer did leave room for clocks being changed. I do hope they check with their source now, or maybe they will have answers for us at launch, but I'm not sure we will know for sure until someone hacks the system enough to check the clocks.
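For what it's worth, the battery maths behind that hour-and-a-half guess, as a sketch (assuming the reported 4310mAh / 3.7V Switch battery; the ~10W whole-system draw is a guess):

```python
# Sketch: battery life = capacity / draw.
battery_wh = 4.31 * 3.7           # ~16 Wh (4310mAh at 3.7V)
system_watts = 10.0               # hypothetical whole-system draw at those clocks
print(battery_wh / system_watts)  # ~1.6 hours
```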
 
You do realize that Eurogamer's specs would easily allow 1080p60 versions of 720p30 Wii U games, right? It's not because of performance that these launch games aren't maxing out the device.

Hell NSMBU ran at 720p on Wii U at launch. Launch games don't really have the best performance record and Nintendo did say they wanted to try and hit 1080p for Zelda at launch.

Would they really? 720p to 1080p requires 2.25x the processing power. Please correct me if I'm wrong, but doubling framerate (30fps to 60fps) requires 2x the processing power as well, right? So wouldn't it be 4.5x then? That's more than Eurogamer's leaked clock speeds combined with our speculation of 2 SMs would give: the Switch would only be 4x as powerful as the Wii U.

What is framerate mostly dependent on? I hear it's mostly CPU. I'm wondering how much bandwidth, RAM, and GPU factor into frame rate.


2019: Switch Home (traditional home console, maybe 1.5-2TF for 4K Nintendo games and 1080p third party ports)
A 2 TFLOP console doing 4K? Ugh, no thanks. I don't want to see Wii U+ games at 4K; it's a waste. It would be better if they spent 4K resolution on a system with 4+ TFLOPS. Games like Mario and Zelda at 30fps at 4K resolution with PS4 Pro specs would look amazing.
 
Or it could be that they are prioritizing other areas of the visuals, like draw distance, foliage detail, or AA which are probably more important on a big TV screen than a small 6.2 inch screen.

I just really don't think the Zelda visuals can tell us much about the hardware. Anyway, I think ARMS looks XB1 level from a visual standpoint (to me anyway), so clearly different games will look very different on the same hardware due to visual/processing priorities.

I don't think Nintendo really even cares about using Switch's hardware to its full advantage for ports, just more stable framerates and higher resolution. MK8 and BotW were built from the ground up on Wii U as well. I remember the treatment Twilight Princess got with the Wii version vs GameCube: they're basically the same game but mirrored, and the Wii version uses motion controls, while the GameCube version has a free-roam camera.
 

z0m3le

Banned
Would they really? 720p to 1080p requires 2.25x the processing power. Please correct me if I'm wrong, but doubling framerate (30fps to 60fps) requires 2x the processing power as well, right? So wouldn't it be 4.5x then? That's more than Eurogamer's leaked clock speeds combined with our speculation of 2 SMs would give: the Switch would only be 4x as powerful as the Wii U.

What is framerate mostly dependent on? I hear it's mostly CPU. I'm wondering how much bandwidth, RAM, and GPU factor into frame rate.


2019: Switch Home (traditional home console, maybe 1.5-2TF for 4K Nintendo games and 1080p third party ports)
A 2 TFLOP console doing 4K? Ugh, no thanks. I don't want to see Wii U+ games at 4K; it's a waste. It would be better if they spent 4K resolution on a system with 4+ TFLOPS. Games like Mario and Zelda at 30fps at 4K resolution with PS4 Pro specs would look amazing.

You're right that it would take slightly more grunt to get Wii U 720p30 games to 1080p60. It's fairly close, though, and the better API should easily make up the difference, since it's just over 4x with Eurogamer's specs and about 5x with this leak.
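A quick sketch of that arithmetic, treating "2x the power per framerate doubling" as a rough rule of thumb:

```python
# Sketch: throughput needed to take a 720p30 game to 1080p60.
res_factor = (1920 * 1080) / (1280 * 720)  # 2.25x the pixels
fps_factor = 60 / 30                       # 2x the frames
print(res_factor * fps_factor)             # 4.5x total
```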

RAM, GPU and CPU need to not be a bottleneck for higher frames, while resolution is mostly dependent on GPU and RAM.

What is interesting to take away from Zelda is how much they changed the game on Switch. The Wii U is obviously holding the vision back, which is pretty disappointing to me; otherwise they would have pushed for 1080p on Switch. But they took on a port, and a port usually costs some performance, simply because you are designing a game around the strengths of a unique system and transferring that design to something that has different strengths. Overall power might not make up for the difference, at least not without time and final hardware during development.

It's not likely that they would clock a GTX 1060 low enough to do 2 TFLOPS when they could have used the smaller GTX 1050 series. It's more likely that they would push GPU clocks on the Switch to 768MHz or 921MHz and target 720p on both the portable and the TV with newer games when the dock is released, and accept around a 2 hour battery life; we've seen this sort of thing with the PSP. This would allow the full GTX 1060, which would trade blows with Scorpio. But this is such a soft topic for people concerned about us speculating about Nintendo hardware that we should probably avoid talking about it, as we have seen a few pages back the effect it has on those concerned posters.
 

tronic307

Member
I wouldn't be so quick to assume that the GPU is running at 921MHz. We already knew the GPU was capable of more than 768MHz and it could simply be a case of them stress-testing the system during manufacturing rather than it being clocked at 921MHz ingame.
921(.6)MHz is 3x 307.2, which is handheld GPU frequency on the Switch.
768 is 1/4 of 3072, anyone know the base frequency for Tegra X1's GPU? 1785 is 1.75x1020. These are round numbers so they bear a relationship...very interesting.
 

z0m3le

Banned
921(.6)MHz is 3x 307.2, which is handheld GPU frequency on the Switch.
768 is 1/4 of 3072, anyone know the base frequency for Tegra X1's GPU? 1785 is 1.75x1020. These are round numbers so they bear a relationship...very interesting.

Yeah, the base frequency of the X1 GPU is 76.8MHz, which is why these clocks can't really be dismissed as made up without the poster just being extremely lucky with all the details, since even power consumption should be the same, or slightly better, with these clocks on 16nm vs the 20nm X1.

Again, the GPU with Eurogamer's clock is the base frequency x10; the one here is x12, or a 20% increase in performance. That isn't enough to be noticeable to the naked eye - it's not even enough to bump resolutions from 900p to 1080p - but it should cover Wii U ports better.

As for the CPU, it's much the same. As you said, it could be that the base CPU frequency is 102MHz or 51MHz, giving 10x or 20x for Eurogamer's clocks and 17.5x or 35x for these clocks, while drawing the same power as the X1.
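As a quick sanity check of those multiples (a sketch; the 102MHz/51MHz CPU steps are the speculation from the post above, not confirmed values):

```python
# Sketch: the reported clocks as multiples of the X1's 76.8MHz GPU step.
GPU_STEP = 76.8
for mhz in (307.2, 768.0, 921.6):
    print(mhz, mhz / GPU_STEP)  # 4x, 10x, 12x -- all clean multiples

# Speculated CPU steps vs the Eurogamer (1020MHz) and Foxconn (1785MHz) clocks:
for step in (102.0, 51.0):
    for mhz in (1020.0, 1785.0):
        print(step, mhz, mhz / step)  # 10x/17.5x and 20x/35x
```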
 
921(.6)MHz is 3x 307.2, which is handheld GPU frequency on the Switch.
768 is 1/4 of 3072, anyone know the base frequency for Tegra X1's GPU? 1785 is 1.75x1020. These are round numbers so they bear a relationship...very interesting.

Yeah, the base frequency of the X1 GPU is 76.8MHz, which is why these clocks can't really be dismissed as made up without the poster just being extremely lucky with all the details, since even power consumption should be the same, or slightly better, with these clocks on 16nm vs the 20nm X1.

Again, the GPU with Eurogamer's clock is the base frequency x10; the one here is x12, or a 20% increase in performance. That isn't enough to be noticeable to the naked eye - it's not even enough to bump resolutions from 900p to 1080p - but it should cover Wii U ports better.

As for the CPU, it's much the same. As you said, it could be that the base CPU frequency is 102MHz or 51MHz, giving 10x or 20x for Eurogamer's clocks and 17.5x or 35x for these clocks, while drawing the same power as the X1.

Interesting. I didn't check to see if there was a correlation with the CPU, but it appears to be there. It is interesting that this post's CPU info is precisely 75% higher than Eurogamer's CPU clock speed.

As for the GPU, MDave's tests with the TX1 showed that the GPU was throttling up/down in 76.8MHz increments.
 

Hermii

Member
Yeah, the base frequency of the X1 GPU is 76.8MHz, which is why these clocks can't really be dismissed as made up without the poster just being extremely lucky with all the details, since even power consumption should be the same, or slightly better, with these clocks on 16nm vs the 20nm X1.

Again, the GPU with Eurogamer's clock is the base frequency x10; the one here is x12, or a 20% increase in performance. That isn't enough to be noticeable to the naked eye - it's not even enough to bump resolutions from 900p to 1080p - but it should cover Wii U ports better.

As for the CPU, it's much the same. As you said, it could be that the base CPU frequency is 102MHz or 51MHz, giving 10x or 20x for Eurogamer's clocks and 17.5x or 35x for these clocks, while drawing the same power as the X1.
All of this would be consistent with them running a stress test. The DF article is based on recent information, so that seems like a plausible explanation.
 

z0m3le

Banned
Interesting. I didn't check to see if there was a correlation with the CPU, but it appears to be there. It is interesting that this post's CPU info is precisely 75% higher than Eurogamer's CPU clock speed.

As for the GPU, MDave's tests with the TX1 showed that the GPU was throttling up/down in 76.8MHz increments.

Yeah, the question at this point isn't whether the clocks are real, but rather why the system was stress tested at these clocks instead of the earlier clocks, if those were meant for shipping. This leak came only a week or two after full development was rumored to have started, and weeks after final devkits went out, which were supposedly more powerful than the July devkits.

I mean, the use case of an A57 in a portable holding a solid 1.78GHz for the life of the battery is just not reasonable with a power draw of 7 watts, or nearly 3 watts at 16nm.

Lastly, if you have a chip like this that runs at those clocks for 8 days straight at full load without dropping a frame, why wouldn't you use those clocks for your device at launch? They were making 20k of these a day.
All of this would be consistent with them running a stress test. The DF article is based on recent information, so that seems like a plausible explanation.
That's the question I ask above: why would you stress test these clocks instead of your shipped clocks, especially if you can't clock the CPU that high in the retail product because it uses too much power? It obviously is a stress test - that isn't even an assumption, since they ran the test for 8 days straight - but it only makes real sense as a stress test of the final clocks.
 
Speaking of clock speed: if we really ended up getting a 20% boost over Eurogamer's clock speeds plus an A72 CPU, I think it would be substantial. It could mean the difference between 40-45% of an Xbone and 50-55% of an Xbone, which would make a difference for ports in favor of the Switch at least.

Regarding the GPU though: how certain are we that the Switch will be using 2 SMs? It wasn't revealed by Eurogamer or the Foxconn leaks, and it can make all the difference for GPU performance.

And if we're back at eurogamer leaks, are we back at 20nm nodes too? lol Then again, maybe not. 3 to 6 hours seems like something 16nm pascal could do. The lower the clockspeed, the more juice we get.
 

TLZ

Banned
No, that is incorrect.

Zelda Wii U: 720p30 with significant drops
Zelda Switch docked: 900p30 with minimal drops
Zelda Switch mobile: 720p30 with minimal drops

Handheld mode seems only marginally more powerful than Wii U. Docked mode is significantly more powerful.

Where'd you get the Wii U part from?

Also I'd hardly call Switch docked 900p30 with minimal drops "significantly more powerful". What would you say if docked was 1080p60 or 4k30 then?
 

z0m3le

Banned
Speaking of clock speed: if we really ended up getting a 20% boost over Eurogamer's clock speeds plus an A72 CPU, I think it would be substantial. It could mean the difference between 40-45% of an Xbone and 50-55% of an Xbone, which would make a difference for ports in favor of the Switch at least.

Regarding the GPU though: how certain are we that the Switch will be using 2 SMs? It wasn't revealed by Eurogamer or the Foxconn leaks, and it can make all the difference for GPU performance.

And if we're back at eurogamer leaks, are we back at 20nm nodes too? lol Then again, maybe not. 3 to 6 hours seems like something 16nm pascal could do. The lower the clockspeed, the more juice we get.

2 SMs would fit the measurements in this leak. I wouldn't expect more SMs, especially with those clocks and power consumption.

Where'd you get the Wii U part from?

Also I'd hardly call Switch docked 900p30 with minimal drops "significantly more powerful". What would you say if docked was 1080p60 or 4k30 then?

Zelda says more about how underpowered the Wii U is and how it couldn't meet the vision they had for the game. You can see that Switch removes a lot of that ugly N64-style fog and ugly lighting, while it has better draw distance and sharper textures. Those all require more power and use up some of that 400% faster GPU. That is why it is only capable of giving us about a 2x IQ bump (when the more stable frame rate is taken into account).

Personally really glad I don't have to play this game on the Wii U.
 