
Nintendo Switch Dev Kit Stats Leaked? Cortex A57, 4GB RAM, 32GB Storage, Multi-Touch.

Status
Not open for further replies.

AzaK

Member
Well we'll likely find out battery capacity and play time, which in turn will give us handheld power usage. That should help a bit, maybe.

Yeah, but we won't find out much from Nintendo themselves. It'll still be detective work for months or more to figure it out.
 
New Famitsu releases on the 12th. Just saying.

The new Famitsu's details are already coming out. A new game from Nippon Ichi was announced, plus there's new details for Nioh, Danganronpa V3, KH2.8 etc. Don't expect any new details on the Switch for this issue. They'll have their blowout on it next week.
 

ultrazilla

Gold Member
So tech NeoGAF, if I'm reading the last few pages correctly, it's safe to assume that when we get a Nintendo Switch, we will be getting a system with a minimum of 3-4 times the power of Wii U in docked mode and at least 1-1.5 times more powerful in portable mode?

Is that a good, safe assumption at this point from all the testing/speculation/leaks, etc?
 

Hermii

Member
So tech NeoGAF, if I'm reading the last few pages correctly, it's safe to assume that when we get a Nintendo Switch, we will be getting a system with a minimum of 3-4 times the power of Wii U in docked mode and at least 1-1.5 times more powerful in portable mode?

Is that a good, safe assumption at this point from all the testing/speculation/leaks, etc?

I wouldn't consider myself TechGAF, but that's my understanding too. That's without considering the impact of a better CPU, more RAM, a better GPU feature set, better tools, API support and middleware support.
 

tkscz

Member
So tech NeoGAF, if I'm reading the last few pages correctly, it's safe to assume that when we get a Nintendo Switch, we will be getting a system with a minimum of 3-4 times the power of Wii U in docked mode and at least 1-1.5 times more powerful in portable mode?

Is that a good, safe assumption at this point from all the testing/speculation/leaks, etc?

It's not as easy as quoting a raw multiplier to determine what we'd get graphically. While those numbers are about right (better to say 2 to 3 times undocked), the Switch will be able to run things (like UE4) and produce graphical effects that the Wii U cannot.
 

Rodin

Member
Because according to MDave's tests, the clocks we're getting are apparently the throttling sweet spot for a TX1 at 20nm.

Yeah, but he can't run tests to find the sweet spot at 16 or 28nm. Maybe they were conservative at 16, or the clocks were fine at 28 as well; who knows?
 

ultrazilla

Gold Member
Hermii and Tk, thanks! More than enough for me! And we still don't know if Nintendo added any "secret sauce" to the chips, etc.
 
I can't help but think that going with the A57 is a mistake when the A72 is far better in terms of power consumption, heat and performance.

What I don't understand is how anyone would think they know more than Nvidia or Nintendo with regard to what's best for the vision they have for their platform. I'm sure they're doing everything possible to make the best system they can for the best price possible. They aren't purposefully nerfing the system.
 

ggx2ac

Member
Which brings us back to the question: How sure can we be of a 20 or 28nm fabrication node?

Low clock speeds like the ones from Eurogamer.

If Laura Kate Dale isn't spilling the beans about what makes the October dev kits more powerful than the previous ones, even while she leaks plenty of other things about the Switch, then it doesn't really tell us whether Nintendo went with 16nmFF, which would let them increase the clock speeds and would be one way the dev kits could be more powerful.
 

Hermii

Member
What I don't understand is how anyone would think they know more than Nvidia or Nintendo with regard to what's best for the vision they have for their platform. I'm sure they're doing everything possible to make the best system they can for the best price possible. They aren't purposefully nerfing the system.

I'm sure they did the best they could with the budget they had, but the point of having a discussion thread is trying to figure out what choices they made and why they made them. Nobody is saying they know better than the engineers who built the system.
 

ggx2ac

Member
I'm sure they did the best they could with the budget they had, but the point of having a discussion thread is trying to figure out what choices they made and why they made them. Nobody is saying they know better than the engineers who built the system.

I don't know, it gets pretty hilarious when we've had people say that the Switch would be weak because they went with ARM instead of x86.

Haha
 

Hermii

Member
Low clock speeds like the ones from Eurogamer.

If Laura Kate Dale isn't spilling the beans about what makes the October dev kits more powerful than the previous ones, even while she leaks plenty of other things about the Switch, then it doesn't really tell us whether Nintendo went with 16nmFF, which would let them increase the clock speeds and would be one way the dev kits could be more powerful.

They could have gone with conservative clocks at launch and plan to increase them later in a patch, like the Vita. The Switch is thinner than the Shield; maybe these clock speeds were necessary even at 16nmFF.

I agree 20nm seems most likely.

I don't know, it gets pretty hilarious when we've had people say that the Switch would be weak because they went with ARM instead of x86.

Haha

There will always be those kinds of posts, hehe.
 

ggx2ac

Member
They could have gone with conservative clocks at launch and plan to increase them later in a patch, like the Vita. The Switch is thinner than the Shield; maybe these clock speeds were necessary even at 16nmFF.

I agree 20nm seems most likely.

If they're doing an upclock via patch, it would probably just be for the CPU. We know the four A57 CPU cores use a lot less wattage than the TX1 GPU, although this is something Nintendo would more likely do for themselves rather than for third parties.

The only exception is Capcom, considering Nintendo has always accommodated them for Monster Hunter: we got the Classic Controller Pro made for Monster Hunter Tri (Wii) and the Circle Pad Pro for Monster Hunter 3G (3DS).

Also, at 16nmFF they can afford to upclock CPU and GPU easily as I showed with the A8X and A9X SoCs as examples. We know the wattage for 4 A57 cores and a rough approximation for the TX1 GPU. We also know that 16nmFF can reduce power consumption by 60% compared to 20nm within the boundaries of the 20nm node clock speeds.

Adding to that, we know Nintendo have done this for the New 3DS when it got a die shrink from 45nm to whatever it was? 32nm?

The New 3DS is an interesting revision of the older hardware - ARM11 core count doubles and clock-speed radically shifts upwards from 268MHz to 804MHz, while memory and VRAM increase substantially. What's curious is that allegedly the GPU remains completely unchanged - in effect, New 3DS seems to be about bringing CPU power more into balance with its graphics potential.

ARM11 CPU: 2x MPCore/2x VFPv2 Co-Processor at 268MHz. Doubling to 4x MPCore/4x VFPv2 Co-Processor for new 3DS with 804MHz max clock.
ARM9 CPU: ARM946 at 134MHz
GPU: DMP PICA at 268MHz
VRAM: 6MB, 10MB for new 3DS
DSP: CEVA TeakLite at 134MHz. 24ch, 32728Hz sampling rates
System memory: 128MB, 256MB for New 3DS

http://www.eurogamer.net/articles/digitalfoundry-2016-nintendo-3ds-vs-new-3ds-face-off
 

Thraktor

Member
I think they were just a little late to the party to hit 16nm. Had the Wii U recovered enough to plan for a holiday 2017 release, we might be seeing A72 cores and a 16nm GPU that afford them a little more headroom for clocks. The leakage and thermal issues with 20nm are very real, as the Snapdragon 810 showed in 2015, and the Switch's limited clock speeds reflect that now.

I think the Switch is the most forward thinking piece of tech we've seen from Nintendo since the Gamecube. It's just easy to see how timing sort of tied their hands.

I don't think 16nm was out of reach for them (even assuming a late 2016 target, they were a year after the first 16FF+ SoCs, and A72s for that matter), but if they went with 20nm I'm sure it was simply a matter of cost. The A72 core should have been available to them right from the start of design work (late 2014/early 2015), so if they're using A57s it's a puzzling decision. I suppose it's cheaper from an R&D perspective than the A72 on 20nm (given Nvidia had already taped out A57s on the node), but given the long-term benefits of using A72s (including financial ones, due to the smaller die size), it would have to be a pretty big cost to outweigh the benefits.

What (if any) would be the most likely customizations Nintendo would ask for based on what we know of TX1?

The most likely customisations would be to the memory subsystem. So:

- A larger GPU L2 cache or an added L3
- Wider memory interface
- Modified ROPs to fully support tiling for Vulkan's renderpasses/subpasses (if they don't already)

There's also the implementation of HMP (heterogeneous multi-processing), allowing all CPU cores to be utilised at the same time (so that the A53 cores can be used for the OS).
 

AR15mex

Member
So Gaffers, I'm completely ignorant when it comes to discussing specs, but all I care about is the following.

How does the Switch stand versus its competitors?

Will it stand the test of time when it comes to graphics? Look at the Wii U: it looked good for a while, but by 2014-2015 it started showing its age.

So any thoughts?
 

LordOfChaos

Member
So... Who's crowd funding the Chipworks die shot?

Look what Marcan got with "a razor blade, a DSLR, and a $100 microscope". We might not need Chipworks, as awesome as they were last time, if they don't want to give it out this time.


https://twitter.com/marcan42/status/803281643750363136
 
So Gaffers, I'm completely ignorant when it comes to discussing specs, but all I care about is the following.

How does the Switch stand versus its competitors?

Will it stand the test of time when it comes to graphics? Look at the Wii U: it looked good for a while, but by 2014-2015 it started showing its age.

So any thoughts?

Based on what we are assuming, it sits halfway between the Wii U and the X1.

What we do know is that it will be the best portable gaming device on the market. Nothing will compete with it in the portable space in regards to exclusives and quality games.
 

MDave

Member
Any techies here know a good way of measuring memory bandwidth performance? If the Switch is operating at the clocks it is, I don't see why games won't be 1080p when docked. The Shield TV is pulling off 1080p 8xAA in Unity pretty comfortably! And this is using the Vulkan API too.

Using the same Vulkan API, same scene and same render quality settings (1080p 8x MSAA).

PC:
i5 4690K @ 4GHz, GTX 970 (3.5 TF):
284 FPS

Shield TV:
Clocks limited to 1GHz CPU, GPU fluctuating between 614MHz and 1GHz, averaging about 768MHz most of the time:
44 FPS

Frame rate fluctuates by barely 1 to 2 FPS on both platforms. Extrapolate that data between those platforms to get what it would perhaps be on the Xbox One and PS4, to see how far or how close the Switch might be? Hah!
 

BuggyMike

Member
The most likely customisations would be to the memory subsystem. So:

- A larger GPU L2 cache or an added L3
- Wider memory interface
- Modified ROPs to fully support tiling for Vulkan's renderpasses/subpasses (if they don't already)

There's also the implementation of HMP (heterogeneous multi-processing), allowing all CPU cores to be utilised at the same time (so that the A53 cores can be used for the OS).

Interesting, thanks for your detailed response. If you don't mind humoring my noobiness, I'm curious what areas of performance you expect these customizations to improve on over the vanilla TX1 (besides the CPU HMP, since you explained that pretty clearly).
 
Low clock speeds like the ones from Eurogamer.

If Laura Kate Dale isn't spilling the beans about what makes the October dev kits more powerful than the previous ones, even while she leaks plenty of other things about the Switch, then it doesn't really tell us whether Nintendo went with 16nmFF, which would let them increase the clock speeds and would be one way the dev kits could be more powerful.

It seems unlikely to me that there would be an upclock in the dev kits vs the Eurogamer leak, which stated that those clocks were the final retail spec. I'd imagine any bump in dev kit performance comes from API optimizations and the like.
 

Doczu

Member
They could have gone with conservative clocks at launch and plan to increase them later in a patch, like the Vita. The Switch is thinner than the Shield; maybe these clock speeds were necessary even at 16nmFF.

I agree 20nm seems most likely.



There will always be those kinds of posts, hehe.

The original 3DS also got a small power bump through a firmware update early in the life cycle. There was a long debate here on GAF, with people measuring by comparing some games on updated and non-updated consoles.
But that may not be the same case, as the update probably freed up some system-reserved power for games instead of bumping the clocks.
 

LordOfChaos

Member
Yup. My thoughts exactly. I'm sure blu or Thraktor will give good explanations though for a pleb like me.

Better on area, performance, and power draw, it's a win in every way; it just doesn't exist on 20nm so far.

But Nintendo is also no stranger to resynthesizing a CPU on a fab it was never meant for, the PPC750 on 45nm for instance. I'm not sure that makes it equally feasible in reverse: rather than an old architecture meeting a new fab, they'd be pushing one a fab back, which may have other challenges.
Then again, the 750 was introduced on the 250 or 150nm process, so that may have been a bigger effort than pushing the A72 a half node back.

16nmFF would be some really well-received news though, heh...


It seems unlikely to me that there would be an upclock in the dev kits vs the Eurogamer leak, which stated that those clocks were the final retail spec. I'd imagine any bump in dev kit performance comes from API optimizations and the like.

LKD said something like "new dev kits weren't much different, but performed better", which could be taken as what you said. Though that should also mean the old ones could get patched to be better? Unless there was a mix of both, which is most plausible.

NVN being a new API, I'd imagine we'll see continual improvement for a good while too.
 

ggx2ac

Member
It seems unlikely to me that there would be an upclock in the dev kits vs the Eurogamer leak, which stated that those clocks were the final retail spec. I'd imagine any bump in dev kit performance comes from API optimizations and the like.

I guess so.

"The information in this table is the final specification for the combinations of performance configurations and performance modes that applications will be able to use at launch."

And as for customisations, the following is probably what to expect, since it apparently shouldn't be a standard TX1 (Emily Rogers mentioned a custom GPU months back) but it's still going to be similar to it.

The most likely customisations would be to the memory subsystem. So:

- A larger GPU L2 cache or an added L3
- Wider memory interface
- Modified ROPs to fully support tiling for Vulkan's renderpasses/subpasses (if they don't already)

There's also the implementation of HMP (heterogeneous multi-processing), allowing all CPU cores to be utilised at the same time (so that the A53 cores can be used for the OS).
 

Hermii

Member
It seems unlikely to me that there would be an upclock in the dev kits vs the Eurogamer leak, which stated that those clocks were the final retail spec. I'd imagine any bump in dev kit performance comes from API optimizations and the like.
Eurogamer said:
Documentation supplied to developers along with the table above ends with this stark message: "The information in this table is the final specification for the combinations of performance configurations and performance modes that applications will be able to use at launch."

It doesn't say "these are the final performance modes applications will be able to use in the entirety of the system's lifecycle". Maybe I'm reaching, but this doesn't rule out an upclock as I see it.
 

Vena

Member
It doesn't say "these are the final performance modes applications will be able to use in the entirety of the system's lifecycle". Maybe I'm reaching, but this doesn't rule out an upclock as I see it.

Well, it's peculiar in that after the Euro leak we had this weird notion of "old vs. new" dev kits, but on the flip side, I also found it a bit perplexing that, given everything they seemingly have on the Switch... Eurogamer lacks some basic information on... the chip.

How do they have the clocks yet nothing else at all? Did someone leak them a single document page that also included "these are retail specs" in more words?

/shrug
 

Donnie

Member
How does the rumoured Switch CPU at 1GHz compare to the Wii U CPU at 1.2GHz, several times more powerful or no?

Nothing's certain until we find out exactly what the final CPU is. But as far as a standard 1GHz A57 goes, it's very similar in performance to the PS4's CPU at 1.6GHz (core for core, obviously).
 
How does the rumoured Switch CPU at 1GHz compare to the Wii U CPU at 1.2GHz, several times more powerful or no?
Looking only at Dhrystone MIPS, the Wii U CPU comes in at about 2.3 DMIPS/MHz (based on some cursory internet searches, so I'm not sure it's accurate), while the Cortex-A57 is 4.6 DMIPS/MHz. So at those clocks, for general/integer performance the A57 is easily 1.6x the Wii U per core, and that's not counting the immensely better SIMD support.
 

Polygonal_Sprite

Gold Member
Thank you both ^^ Is the A57 the high end of expectations for the Switch CPU or the low end? Because I have to say I didn't expect PS4-level CPU performance (although that's nothing to shout about in 2017) after all the doom and gloom when the clocks were released. I think that's fantastic for a CPU built around mobile constraints.
 

Mr Swine

Banned
Looking only at Dhrystone MIPS, the Wii U CPU comes in at about 2.3 DMIPS/MHz (based on some cursory internet searches, so I'm not sure it's accurate), while the Cortex-A57 is 4.6 DMIPS/MHz. So at those clocks, for general/integer performance the A57 is easily 1.6x the Wii U per core, and that's not counting the immensely better SIMD support.

Nice, how does it compare to the Cell and 360 CPU?
 

Schnozberry

Member
It seems unlikely to me that there would be an upclock in the dev kits vs the Eurogamer leak, which stated that those clocks were the final retail spec. I'd imagine any bump in dev kit performance comes from API optimizations and the like.

They may have had final hardware in the October kits, which would include any improvements Nintendo had requested to the TX1 memory subsystem and whatever else they decided to spend money on.
 

disap.ed

Member
Look what Marcan got with "a razor blade, a DSLR, and a $100 microscope". We might not need Chipworks, as awesome as they were last time, if they don't want to give it out this time.

https://twitter.com/marcan42/status/803281643750363136

User OC_burner on the German 3DCenter forum also does really good-quality die shots, for example:
https://www.forum-3dcenter.org/vbulletin/showthread.php?p=11160208#post11160208

Flickr gallery: https://www.flickr.com/photos/130561288@N04/
 

LordOfChaos

Member
How does the rumoured Switch CPU at 1GHz compare to the Wii U CPU at 1.2GHz, several times more powerful or no?

Looking only at Dhrystone MIPS, the Wii U CPU comes in at about 2.3 DMIPS/MHz (based on some cursory internet searches, so I'm not sure it's accurate), while the Cortex-A57 is 4.6 DMIPS/MHz. So at those clocks, for general/integer performance the A57 is easily 1.6x the Wii U per core, and that's not counting the immensely better SIMD support.


128-bit 4-way SIMD ALUs from ARM's Advanced SIMD (NEON), mandatory per core, vs the 2x32-bit 2-way paired-singles ALUs on Espresso; the SIMD side will absolutely see a huge uplift.
 