
Nintendo Switch Dev Kit Stats Leaked? Cortex A57, 4GB RAM, 32GB Storage, Multi-Touch.


ggx2ac

Member
Even though I'm confident on 2 SMs, I just thought I would throw this out.

What about the possibility of a 16nm chip with 4 SMs? Would that account for the fan and clocks? It would also explain why the system could have originally been running on an overclocked 20nm 2 SM setup, as well as the reason the final dev kit got a bump in battery life (shrinking the die).

I still have a problem with this suggestion that the dev-kits were overclocked.

Eurogamer speculated it and everyone has taken it as fact. Every insider, and Eurogamer themselves, said the leaked dev-kit specs are accurate, and none of them said anything to suggest the clock speeds were overclocked rather than stock.

They also went as far as to say that the dev-kit specs would be similar to what the retail unit gets. Yet, look where we are now.

Sure, a 20nm node and 3 SMs is possible, but is Nintendo really paying more to make the GPU larger for a modest increase in performance?

And even the CPU is a problem, given the suggestion that it's just four A57 cores.
 
The Cube was pretty advanced for its time, particularly on the GPU and RAM front. I don't know if the CPU was on the latest process node available at that moment, but it was on equal footing with the Intel chip in the Xbox (180nm).

Both the Xbox's CPU and the GCN's CPU were produced on 180nm.

Thanks! So it seems going with a two-generation-old process node would be fairly out of character in this situation.

Let's see: they kept the Wii U on a 45nm process due to the eDRAM, and they recently shrunk the process for the n3DS. What would be a good reason for them to stick with an older process on a handheld form factor? Would that actually be more expensive to do?

If that's somehow the case, it's actually crazy that the system is as powerful as the worst-case scenario suggests.
They could also make a dramatic revision (Switch Pro?) very soon with a drastic increase in energy efficiency, capability and/or battery life.

The only possible reason I've seen thrown around is that Nvidia or TSMC gave them a hell of a good deal on 28nm chips. But honestly that doesn't make sense to me either, as with 28nm you need to add the active cooling fan, which would likely drive up costs long term due to increased repairs/replacements caused by the presence of a moving part in a portable device.

They have never built a handheld with a fan before. It would seem like they only did so now because they had no other choice, but a 16nm SoC with these clock speeds (which almost certainly would not require a fan even docked) would represent that other choice.
 

bomblord1

Banned
I still have a problem with this suggestion that the dev-kits were overclocked.

Eurogamer speculated it and everyone has taken it as fact. Every insider, and Eurogamer themselves, said the leaked dev-kit specs are accurate, and none of them said anything to suggest the clock speeds were overclocked rather than stock.

They also went as far as to say that the dev-kit specs would be similar to what the retail unit gets. Yet, look where we are now.

Sure, a 20nm node and 3 SMs is possible, but is Nintendo really paying more to make the GPU larger for a modest increase in performance?

And even the CPU is a problem, given the suggestion that it's just four A57 cores.

I really don't see 4 A57 cores as much of a problem even at 1GHz, as long as all 4 cores are entirely available to games.
 

TLZ

Banned
I don't know, man. All that talk, whether from the WSJ's "latest modern tech" piece or from Nvidia's similar "latest modern tech" claims, just doesn't match this rumor :/
 
I don't know, man. All that talk, whether from the WSJ's "latest modern tech" piece or from Nvidia's similar "latest modern tech" claims, just doesn't match this rumor :/
An X1 would've been pretty modern tech by the time the WSJ wrote their article.
I wouldn't expect much more than what's stated.
 

Speely

Banned
I don't know, man. All that talk, whether from the WSJ's "latest modern tech" piece or from Nvidia's similar "latest modern tech" claims, just doesn't match this rumor :/

Barring any divergent rumors between now and January, this is what we're working with. I guess it's all just what people choose to value as relevant at this point.

Of course, in January we are very likely to get zero additional spec info, but we will at least have a wealth of analysis based on the games present, and how they look/perform.

I think that will speak volumes, whether for better or for worse.
 

aBarreras

Member
My lesson was "never trust anybody but LKD, not even the Eurogamer & Digital Foundry guys". Eurogamer have been wrong a couple of times before (where is my Mother 3?) and I don't know who Digital Foundry are. LKD's track record still looks like the best one; she's yet to be wrong. And if she says the Switch is much more powerful than the Wii U in portable mode and Dark Souls 3 is running on the Switch at a satisfactory level, I believe her. Ignore everything else.

Digital Foundry are Eurogamer.
 
I really doubt 16nm is even an option if it's Maxwell-based. They already shrunk Maxwell to 20nm; I don't see a reason why Nvidia would bother wasting time shrinking it to 16nm when they already have Pascal at 16nm.

It has to be 20nm or 28nm, especially if it needs that fan at those clocks.
 

Schnozberry

Member
I really doubt 16nm is even an option if it's Maxwell-based. They already shrunk Maxwell to 20nm; I don't see a reason why Nvidia would bother wasting time shrinking it to 16nm when they already have Pascal at 16nm.

It has to be 20nm or 28nm, especially if it needs that fan at those clocks.

Maxwell shrunken to 16nm is essentially Pascal, especially in the Tegra line. The Tegra X1 had a lot of features not found in earlier Maxwell chips that were later found in Pascal desktop chips.
 
Barring any divergent rumors between now and January, this is what we're working with. I guess it's all just what people choose to value as relevant at this point.

Of course, in January we are very likely to get zero additional spec info, but we will at least have a wealth of analysis based on the games present, and how they look/perform.

I think that will speak volumes, whether for better or for worse.

Not necessarily. Even after the Wii U launched, a lot of people were still surprised by the Wii U's GPU's actual specs. Recently, people were dumbfounded by how underclocked the PS Vita is.
 

Vic

Please help me with my bad english
It just doesn't make sense. I mean, if Nintendo was targeting the level of performance suggested by the DF rumours, would a partnership with Nvidia even be necessary for this project? Couldn't they use PowerVR or Mali blocks and call it a day?
 

Aroll

Member
It just doesn't make sense. I mean, if Nintendo was targeting the level of performance suggested by the DF rumours, would a partnership with Nvidia even be necessary for this project? Couldn't they use PowerVR or Mali blocks and call it a day?

Uh, sure, but it's not about power per se. It's about compatibility and localized tools. One massive barrier between third parties and Nintendo was always communication having to go through Japan; now it goes through Nvidia for the dev tools. On top of that, this system is far more amenable to scaling than the other options, so third parties could easily port their games depending on how much they are willing to scale.

Power isn't everything. A lot of stuff goes into this decision, and they could have gone in many directions if power was their primary goal.
 

usmanusb

Member
Another fact given by Nvidia was that they spent 500 man-years creating the custom Tegra solution. It seems irrational to hear that it's the same Tegra X1 with lower clocks after spending that much time on it. I think there are vital pieces of information missing.
 

Speely

Banned
Another fact given by Nvidia was that they spent 500 man-years creating the custom Tegra solution. It seems irrational to hear that it's the same Tegra X1 with lower clocks after spending that much time on it. I think there are vital pieces of information missing.

Maybe not vital (but then again, maybe). I would easily bet that there are pieces missing. How big they are, no one knows.
 

ggx2ac

Member
To recap.

Under the assumption of a 256 CUDA core GPU we get 157 GFLOPS at 307.2 MHz.

Under the assumption of a 384 CUDA core GPU we get 236 GFLOPS at 307.2 MHz.

Going from 2 SMs to 3 SMs gives a 50% increase in performance.

To get that same 50% performance increase while using 2 SMs requires a clock speed of about 461 MHz.

Surely a 16nm FinFET node would be able to handle this while balancing out the power consumption, since apparently making the GPU larger with 3 SMs would be more costly in comparison.

This is why I'm finding it difficult to tell whether Nintendo are really being that cheap, especially when we still know nothing about the CPU other than its clock speed.

(And the final dev-kits were confirmed to be Maxwell, not Pascal, which makes the node size unclear.)

Edit: To add to this I reference my doubts about the overclocked dev-kits in this post:

http://m.neogaf.com/showpost.php?p=226954114
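
To double-check the GFLOPS arithmetic above, here is a minimal sketch, assuming the standard 2 FP32 FLOPs per CUDA core per clock (none of this is confirmed; it's just the leaked numbers run through that formula):

# Rough FP32 throughput check: GFLOPS = CUDA cores * 2 FLOPs/clock * clock in GHz.
def gflops(cuda_cores, clock_mhz):
    return cuda_cores * 2 * clock_mhz / 1000.0

print(gflops(256, 307.2))  # ~157 GFLOPS: 2 SM case at the leaked portable clock
print(gflops(384, 307.2))  # ~236 GFLOPS: hypothetical 3 SM case
print(gflops(256, 461.0))  # ~236 GFLOPS: 2 SMs would need ~461 MHz to match 3 SMs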
 
To recap.

Under the assumption of a 256 CUDA core GPU we get 157 GFLOPS at 307.2 MHz.

Under the assumption of a 384 CUDA core GPU we get 236 GFLOPS at 307.2 MHz.

Going from 2 SMs to 3 SMs gives a 50% increase in performance.

To get that same 50% performance increase while using 2 SMs requires a clock speed of about 461 MHz.

Surely a 16nm FinFET node would be able to handle this while balancing out the power consumption, since apparently making the GPU larger with 3 SMs would be more costly in comparison.

This is why I'm finding it difficult to tell whether Nintendo are really being that cheap, especially when we still know nothing about the CPU other than its clock speed.

(And the final dev-kits were confirmed to be Maxwell, not Pascal, which makes the node size unclear.)

Even if we heard it was Pascal, I'm sure we'd have people here saying that it's probably on a 28nm process, because Nintendo.
 
Not necessarily. Even after the Wii U launched, a lot of people were still surprised by the Wii U's GPU's actual specs. Recently, people were dumbfounded by how underclocked the PS Vita is.
What is there to be surprised about with the Wii U's GPU? It's supposedly better than the 360's, but I don't recall anyone saying how much better. Of course it's hard to see with multiplats given the Wii U's CPU.
 

Vic

Please help me with my bad english
Uh, sure, but it's not about power per se. It's about compatibility and localized tools. One massive barrier between third parties and Nintendo was always communication having to go through Japan; now it goes through Nvidia for the dev tools. On top of that, this system is far more amenable to scaling than the other options, so third parties could easily port their games depending on how much they are willing to scale.

Power isn't everything. A lot of stuff goes into this decision, and they could have gone in many directions if power was their primary goal.
I was going to suggest that. The system's compliance with the Vulkan, OpenGL 4.5 and OpenGL ES APIs, and game engines like UE4 and Unity already supporting the system, among other facts, are tangible examples of Nintendo's effort to accommodate non-Nintendo developers working on their system. They want their games on the Switch.

With that said, wouldn't the system's performance also play a major role in supporting the points stated above? The hardware development teams who conceptualized the console are fully aware of the level of performance most third parties are crafting their games around at the moment. Wouldn't providing less than ideal performance, especially on the CPU side, be a detriment to that effort? This is why I have doubts that the Switch will be an "underpowered" system.

But what do I know? The leaked specs are probably the correct ones.
 

Vash63

Member
Even if we heard it was Pascal, I'm sure we'd have people here saying that it's probably on a 28nm process, because Nintendo.

It's really about the number of SMs at this point; Pascal isn't much faster per clock than the Maxwell 2 in the X1, and we already know the clocks. If it were Pascal, the clocks would probably be a lot higher.
 

atbigelow

Member
It's really about the number of SMs at this point; Pascal isn't much faster per clock than the Maxwell 2 in the X1, and we already know the clocks. If it were Pascal, the clocks would probably be a lot higher.

Pascal isn't much faster, but it would be a lot faster???
 

antonz

Member
Emily has broken her silence. She suggests people should not be surprised by the numbers being heard, as she said as far back as May that the only console the Switch was going to "blow away" was the Wii U.
 
Pascal isn't much faster, but it would be a lot faster???

I think he's saying that, at the same clock speed, Pascal and Maxwell cores perform the same, so the only reason to go Pascal is to get the increased performance with lower power draw. That increased performance can only be gained if you increase the clock speed. So given the clock speeds we've heard, they wouldn't really gain much by going 16nm Pascal.

Emily has broken her silence. She suggests people should not be surprised by the numbers being heard, as she said as far back as May that the only console the Switch was going to "blow away" was the Wii U.

Errr.. The specs projected by the DF article do not seem to present a device which "blows away" the Wii U. It seems much more like a minor to moderate upgrade.

Directed at Emily's statement, not you of course.
 

ggx2ac

Member
Emily has broken her silence. She suggests people should not be surprised by the numbers being heard, as she said as far back as May that the only console the Switch was going to "blow away" was the Wii U.

It's the 3DS all over again. People were comparing it to the 360 because they couldn't convey how powerful it was with numbers.
 

CrustyBritches

Gold Member
Another fact given by Nvidia was that they spent 500 man-years creating the custom Tegra solution. It seems irrational to hear that it's the same Tegra X1 with lower clocks after spending that much time on it. I think there are vital pieces of information missing.

My memory is a bit hazy, but wasn't AMD claiming the Xbox 720 would have "Avatar-like" graphics? Lol.
 

antonz

Member
I think he's saying that, at the same clock speed, Pascal and Maxwell cores perform the same, so the only reason to go Pascal is to get the increased performance with lower power draw. That increased performance can only be gained if you increase the clock speed. So given the clock speeds we've heard, they wouldn't really gain much by going 16nm Pascal.



Errr.. The specs projected by the DF article do not seem to present a device which "blows away" the Wii U. It seems much more like a minor to moderate upgrade.

Directed at Emily's statement, not you of course.

Honestly it feels like Nintendo was playing with a lot more power than they ended up going with. Early CPU reports said it was massively better than what's in the current consoles. Now it's roughly equal to, if not slightly worse than, the current console CPUs.
 

conpfreak

Member
Emily has broken her silence. She suggests people should not be surprised by the numbers being heard, as she said as far back as May that the only console the Switch was going to "blow away" was the Wii U.

But nobody expected the Switch to blow away the PS4/Xbox One. They just expected generational performance in a hybrid form factor, which was indicated by the reveal trailer but not yet confirmed with hard evidence of real-world performance. The Breath of the Wild footage was a good indicator that the Switch is at least capable of better-than-Wii U visuals, but it's a port and nothing more can be determined from that.

I think the footage we should be paying more attention to is the new 3D Mario footage, which looks significantly better than the HD Mario games on the Wii U. It looks like they put that engine on steroids in that footage.
 

Pokemaniac

Member
Emily has broken her silence. She suggests people should not be surprised by the numbers being heard, as she said as far back as May that the only console the Switch was going to "blow away" was the Wii U.

If the assumptions that DF made are all true, then saying the Switch could "blow away" the Wii U seems a little... generous.
 

atbigelow

Member
I think he's saying that, at the same clock speed, Pascal and Maxwell cores perform the same, so the only reason to go Pascal is to get the increased performance with lower power draw. That increased performance can only be gained if you increase the clock speed. So given the clock speeds we've heard, they wouldn't really gain much by going 16nm Pascal.
Right; they would only gain efficiency.


Errr.. The specs projected by the DF article do not seem to present a device which "blows away" the Wii U. It seems much more like a minor to moderate upgrade.

Directed at Emily's statement, not you of course.

I guess it depends on what you mean by "blow away". At least 2x the RAM (available for games, assuming), 1-2x the GPU power, and at least 2x the CPU power. That's at least twice the machine.

It ain't no XBO, for sure. But you could easily make a Switch game the Wii U would not be able to run.
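
To put very rough numbers on that "twice the machine" framing, here is a back-of-envelope sketch. The Wii U figures are common community estimates rather than official specs, and the Switch figures are the leaked 2 SM / 307.2 MHz portable and 768 MHz docked numbers, so treat all of it as assumption:

# Back-of-envelope comparison; every figure here is an estimate or a leaked number.
WII_U_GPU_GFLOPS = 176.0   # ~160 ALUs * 2 * 550 MHz (community estimate)
WII_U_RAM_GB = 2.0         # total system RAM, roughly 1GB of which is available to games

switch_portable_gflops = 256 * 2 * 307.2 / 1000.0  # ~157 GFLOPS (2 SM, leaked portable clock)
switch_docked_gflops = 256 * 2 * 768.0 / 1000.0    # ~393 GFLOPS (2 SM, leaked docked clock)
switch_ram_gb = 4.0                                # leaked dev kit figure

print(switch_portable_gflops / WII_U_GPU_GFLOPS)   # ~0.9x on paper in portable mode, though
                                                   # Maxwell does more useful work per FLOP than Latte
print(switch_docked_gflops / WII_U_GPU_GFLOPS)     # ~2.2x on paper when docked
print(switch_ram_gb / WII_U_RAM_GB)                # 2x the total RAM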
 
But nobody expected the Switch to blow away the PS4/Xbox One. They just expected generational performance in a hybrid form factor, which was indicated by the reveal trailer but not yet confirmed with hard evidence of real-world performance. The Breath of the Wild footage was a good indicator that the Switch is at least capable of better-than-Wii U visuals, but it's a port and nothing more can be determined from that.

I think the footage we should be paying more attention to is the new 3D Mario footage, which looks significantly better than the HD Mario games on the Wii U. It looks like they put that engine on steroids in that footage.

I don't think that message from Emily was for us. It was for people who were betting on it being very close to the XB1. Right now, we are arguing about a spec range that is hovering around 1/2 to 1/3 of the XB1.
 
Even if we heard it was Pascal, I'm sure we'd have people here saying that it's probably on a 28nm process, because Nintendo.
Yeah, those people would be wrong, in the sense that there can't be Pascal on anything larger than 16nm. Pascal is only Pascal because of the 16nm process it was created for.
 

guek

Banned
What if this is all a controlled leak and Nintendo is playing us for chumps? Now that expectations are so low, the actual specs will pleasantly surprise everyone while still falling short of the XB1. What if this is their way of setting expectations a month in advance? What if Nintendo marketing is playing chess while we're playing checkers?


I'm kidding
 
Oh, that is basically what handheld mode is. Handheld is a slightly improved Wii U. Docked is at best 3x the Wii U as far as GPU capabilities go.

Well, my impression was that the Wii U hit the CPU bottleneck well before the GPU.

OK, sub-question: which gap is bigger?

Wii U to PS4

or

Switch to PS4 Pro?

Wii U to PS4. The modern architecture and the significant RAM/CPU increase give the Switch a boost.
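
For a rough sense of scale on that question, here is a quick sketch comparing raw FP32 GFLOPS only. The Wii U and Switch figures are estimates and leaked numbers, not official specs:

# Raw FP32 GFLOPS ratios for the two gaps; Wii U and Switch figures are estimates.
WII_U = 176.0                    # ~160 ALUs * 2 * 550 MHz (community estimate)
SWITCH_DOCKED = 256 * 2 * 0.768  # ~393 GFLOPS (2 SM at the leaked 768 MHz docked clock)
PS4 = 1843.0                     # 1152 shaders * 2 * 800 MHz
PS4_PRO = 4200.0                 # the official 4.2 TFLOPS figure

print(PS4 / WII_U)               # ~10.5x: Wii U -> PS4
print(PS4_PRO / SWITCH_DOCKED)   # ~10.7x: Switch (docked) -> PS4 Pro

On paper the two ratios come out similar, which is why the answer above leans on architecture, RAM and CPU rather than raw GFLOPS.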
 

Vena

Member
Emily has broken her silence. She suggests people should not be surprised by the numbers being heard, as she said as far back as May that the only console the Switch was going to "blow away" was the Wii U.

That's hardly informative, lol.

Nothing we've discussed is anything other than "blowing away the Wii U"; it's just a matter of degree.
 
You won't need storage space if the games are played off of some cartridge, as the rumors say.
The comment you replied to talked about patch storage, not game storage. Unfortunately, today's console games are patched quite regularly, often on the first day of release. If the Switch is to receive 3rd party support, where is all that patch data going to be stored? This is yet another reason to expect that the Switch won't get 3rd party support for major games.
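
To put the patch concern into rough numbers, here is a purely illustrative sketch. The 32GB figure comes from the leaked specs in the thread title; the OS reservation and patch size are made-up assumptions, not known figures:

# Purely illustrative: how quickly patch data could fill the leaked 32GB of storage.
TOTAL_STORAGE_GB = 32.0   # from the leaked dev kit specs
os_reserved_gb = 6.0      # assumption, not a known figure
typical_patch_gb = 2.0    # assumption; modern day-one patches often run 1-5GB

usable_gb = TOTAL_STORAGE_GB - os_reserved_gb
print(usable_gb / typical_patch_gb)  # ~13 patches of that size before internal storage is full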
 
Wrong, the Wii U was gimped by the Radeon 4000 architecture, which came out in 2008, not by the Wii's TEV architecture.
Actually, going by that, the Wii U's GPU is even more than 10x weaker than the PS4's. That 10x number came from just looking at the GFLOPS, which shouldn't be compared directly due to the GCN architecture of the PS4.
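
For reference, here is a quick sketch of where that ~10x number comes from. It uses raw FP32 GFLOPS only, with commonly cited estimates rather than official specs, and it deliberately ignores the per-FLOP efficiency gap between the Wii U's older VLIW design and GCN, which is the point being made above:

# Raw GFLOPS ratio only; both figures are commonly cited estimates.
PS4_GFLOPS = 1843.0    # 1152 shaders * 2 * 800 MHz
WII_U_GFLOPS = 176.0   # ~160 ALUs * 2 * 550 MHz (estimate)

print(PS4_GFLOPS / WII_U_GFLOPS)  # ~10.5x on paper; the effective gap is larger because
                                  # GCN extracts more real-world performance per FLOP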
 