
Nintendo Switch Dev Kit Stats Leaked? Cortex A57, 4GB RAM, 32GB Storage, Multi-Touch.

Status
Not open for further replies.

Schnozberry

Member
Rather than customise the SoC to remove them, could they instead customise it so they run in conjunction with the higher-powered cores, handling background tasks like the OS? Not that you'd need four A53s to run an OS, I suppose, but a couple of them would do it with a tiny power draw.

It's possible. How the chip was customized is entirely up in the air.
 

Nikodemos

Member
They could have picked 20nm, but then they stuck a fan in the Switch, which shouldn't be necessary with how low it's clocked.

TSMC apparently cut the prices of their 28nm and 20nm nodes by 10% a few months ago, but I assume 28nm is still cheaper in comparison.
That's because, from all rumours and circumstantial evidence, it seems like 20nm is a bad node. The only advantage over 28nm is a smaller footprint. Its disadvantages are high cost, mediocre perf/watt and subpar thermals (it doesn't run a lot cooler than 28nm, like it should). It's the reason Qualcomm retained an improved 28nm process for midrange chips like the 650/652 while moving to 14nm for 820/821.

Also, I take the fact that the Tegra X1's A53 cores haven't been mentioned in any dev leaks as proof they're OS-locked and not dev-accessible.
 
Ok, I can tell you're just going to repeat the same thing over and over and ignore anything I say, so I won't bother anymore; life is too short to keep repeating myself :)

I'll just say it's going to be interesting to see the whole picture. Because like I said, Nintendo don't do off-the-shelf GPUs, and downclocks do not make a custom GPU.
I was following your posts on this topic, and they led me to the same question: why was that 1 TFLOPS (fp16) figure in a recent briefing? And why was there a clock speed doc leak without a listing for the type of CPU, core count, CUDA core count, etc.? I don't think it's far-fetched to assume that Nintendo is aiming for the performance of that TX1 kit via customizations.
 
Actual switch developers have been confirmed by digital foundry to have been briefed with those same specs. It doesn't matter if they were pulled from another development board. Devs working on the system have confirmed that's what they are working with.

DF said developers have been briefed on those TX1 devkit specs. We don't know when though. It could very well be when those TX1 devkits were sent out before July.

The fact that DF is explicitly saying there are likely customizations they don't know about clearly says they're not presenting those specs as any sort of confirmation.
 

Akhe

Member
The lesson from this is never trust in "insiders".

We don't know if the actual dev kits have a small form factor similar to the retail Switch's; if they don't, having a fan makes sense.


Not that I recall. Nate said it was a TX1 Maxwell; Emily said it was close to the XBONE and Laura confirmed it, which doesn't seem to be the case anymore; OsirisBlack said it would be able to handle XBONE/PS4 ports without much problem. But yeah, we'd been missing this downclock gift from Nintendo all this time.

I've tried so many times but it's so hard. I think I'm a masochist.
 

Vic

Please help me with my bad english
DF said developers have been briefed on those TX1 devkit specs. We don't know when though. It could very well be when those TX1 devkits were sent out before July.

The fact that DF is explicitly saying there are likely customizations they don't know about clearly says they're not presenting those specs as any sort of confirmation.
It was emphasized that the briefing happened recently.
 

Schnozberry

Member
DF said developers have been briefed on those TX1 devkit specs. We don't know when though. It could very well be when those TX1 devkits were sent out before July.

The fact that DF is explicitly saying there are likely customizations they don't know about clearly says they're not presenting those specs as any sort of confirmation.

They said as much in their Youtube Video. They only have a source for the clocks, and that the Dev Kits line up with the leaked specs.
 

Log4Girlz

Member
This thing will be marketed as a home console on the go. If multiplatform games are going to be severely gimped, or in some cases not even possible to get running, it's going to hurt it a lot, like the Vita.

It'll basically just be another Nintendo handheld and perform as such in sales if they are lucky.
 
It was emphasized that the briefing happened recently.

Recently is broad, and we don't know how long the devkits were just TX1s. Also...

They said as much in their Youtube Video. They only have a source for the clocks, and that the Dev Kits line up with the leaked specs.

This.

They explicitly say the specs they listed are not confirmed and were not given to them from developers. The fact that they are speculating about potential customizations should make this apparent.
 

Vena

Member
It'll basically just be another Nintendo handheld and perform as such in sales if they are lucky.

I think this will do better than the 3DS if they continue with the on-point marketing. It falls in a nice spot that I think can be very interesting to a wider audience than the pigeonholed upper console echelon or the freemium stuck mobile markets.

/shrug
 
It'll basically just be another Nintendo handheld and perform as such in sales if they are lucky.
I do think it has the added benefit of Nintendo's entire output, as well as console-only features like local multiplayer on one unit right out of the box.
I think it'll do pretty well if they play their cards right.
Right now they don't have the complex and appealing gimmick of the Wii U, which raised the price by $100, nor the 3DS's 3D, which went out of fashion pretty quickly. I think it seems fine for now.
 

z0m3le

Banned
It makes no sense that developers would know the clock speeds and not the core count; the people who leaked these clocks (not Eurogamer, but their source) are causing a lot of needless speculation here. Clock speeds without core counts for the CPU and GPU are not going to give developers performance numbers to target. I hope someone comes forward with core counts. As for the whole 28nm vs 20nm debate, 16nm is more likely than 28nm; 28nm is just really out there. Not to say it's impossible, but it's wacky tinfoil-hat crazy.
 

Vena

Member
It makes no sense that developers would know the clock speeds and not the core count; the people who leaked these clocks (not Eurogamer, but their source) are causing a lot of needless speculation here. Clock speeds without core counts for the CPU and GPU are not going to give developers performance numbers to target. I hope someone comes forward with core counts. As for the whole 28nm vs 20nm debate, 16nm is more likely than 28nm; 28nm is just really out there. Not to say it's impossible, but it's wacky tinfoil-hat crazy.

I suspect someone just gave them like a page out of the retail sdk document, lol.

Only way I figure that they have so little information otherwise.
 

Log4Girlz

Member
Their console days are over. Selling as well as the 3DS will be a challenge but worthwhile. Shoot, maybe it can sell as well as the original DS. There will never be another Wii/DS super combo, so selling as well as just one of them would be fantastic.
 

Speely

Banned
Is it possible that the dev kits were stock X1s overclocked as speculated (because of the loud fan) and that Nintendo was just approximating a performance target that they would later downclock and use more SMs to hit?

I am asking because early rumors mentioned a 3-hour gaming battery life, and according to NateDrake they are now targeting 5-8 (don't give me any NateDrake guff... sources can be mistaken and plans can change.)

If the original dev kits were running at higher clocks with standard X1s, 3 hours sounds about right since dev kits don't need good battery life. They just need to hit performance targets on the same architecture, right? High clock speeds eat up power quickly.

So cheap X1s running at higher clocks emulating the performance of lower-clocked final chips with more SMs (which would be more power-efficient than higher clocks) at least seems like a possibility to me as a cheaper and faster way to get devs up and running while still optimizing for a less power-hungry final retail model.

If it doesn't end up happening, I will still be excited, but I am tentatively considering a 3 SM setup right now. It's doable and makes more sense to me than Nintendo considerably downgrading the device after dev kits went out (this is assuming that the X1s were indeed overclocked and not just so inefficient that they needed fans.)
 
^ Make that two.

I wonder if anyone here can try to get in touch with the author of the DF article on Twitter or something to see if he can clarify what exactly they mean when they say "developers have recently been briefed" on those stock TX1 specs. Clearly based on their speculation about CUDA cores this doesn't mean those specs are confirmed, but I'm curious if it just refers to recent devkits.
 

Enduin

No bald cap? Lies!
^I think that is a best case scenario at this point with what we "know."

I don't love this EG story, but when it comes down to it it's not a dealbreaker for me. I'm sold on the premise of the device and doubly sold on the fact that I will only need one system to get all the Nintendo titles out there, plus the several 3rd party handheld franchises that have managed to stay alive over the years. I don't think I'd ever consider getting 3rd party ports. If it's available on PC I'm getting it there, unless it's some kind of indie game that performs equally on the Switch.
 

Schnozberry

Member
I said the same on the previous page. It's unlikely but not entirely implausible.

The pie-in-the-sky scenario is that Nintendo was approximating the performance of the final, lower-clocked custom chip with more SMs/cores by using a fully clocked TX1 in the dev kits that was readily available for people to work on.
 

z0m3le

Banned
Is it possible that the dev kits were stock X1s overclocked as speculated (because of the loud fan) and that Nintendo was just approximating a performance target that they would later downclock and use more SMs to hit?

I am asking because early rumors mentioned a 3-hour gaming battery life, and according to NateDrake they are now targeting 5-8 (don't give me any NateDrake guff... sources can be mistaken and plans can change.)

If the original dev kits were running at higher clocks with standard X1s, 3 hours sounds about right since dev kits don't need good battery life. They just need to hit performance targets on the same architecture, right? High clock speeds eat up power quickly.

So cheap X1s running at higher clocks emulating the performance of lower-clocked final chips with more SMs (which would be more power-efficient than higher clocks) at least seems like a possibility to me as a cheaper and faster way to get devs up and running while still optimizing for a less power-hungry final retail model.

If it doesn't end up happening, I will still be excited, but I am tentatively considering a 3 SM setup right now. It's doable and makes more sense to me than Nintendo considerably downgrading the device after dev kits went out (this is assuming that the X1s were indeed overclocked and not just so inefficient that they needed fans.)

If the X1 dev kits were clocked at 1.152GHz, that would give you exactly the same performance as 3 SMs at 768MHz (590 GFLOPS fp32, or 1.18 TFLOPS fp16), with the 3 SM setup drawing much less power, which fits with everything we heard before. But it isn't something we know, so assuming it makes us more optimistic than maybe we should be. 393 GFLOPS might sound like a big downgrade, but it is still just outside the pessimistic view of the Pixel C. The handheld performance actually doesn't matter at all, since it is worked out to perform identically to the docked performance at a lower resolution. My big "hope" is that they went with 6 A57 cores @ 1GHz rather than 4; 4 is enough to port games, but it will be a hassle when we've been hearing the opposite for months, so it is difficult to figure out. 6 A57 cores @ 1GHz are similar to 6 Jaguar cores at around 1.5GHz, but that is a pretty rough estimation.
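For what it's worth, the GFLOPS figures being thrown around here all come from the standard peak-throughput formula (CUDA cores × 2 FMA ops per cycle × clock). A quick sanity check in Python, assuming Maxwell's 128 CUDA cores per SM as in the stock Tegra X1 (the function name is just for illustration):

```python
# Peak throughput for a Maxwell-style GPU: cores * 2 ops/cycle (FMA) * clock.
# Assumes 128 CUDA cores per SM, as in the Tegra X1.

def peak_gflops(sms, clock_ghz, cores_per_sm=128):
    return sms * cores_per_sm * 2 * clock_ghz

# Stock-style X1 dev kit: 2 SMs at 1.152 GHz
print(peak_gflops(2, 1.152))   # 589.824 GFLOPS fp32
# Hypothetical retail chip: 3 SMs at the leaked 768 MHz docked clock
print(peak_gflops(3, 0.768))   # 589.824 GFLOPS fp32 -- identical
# Pessimistic case: 2 SMs at 768 MHz
print(peak_gflops(2, 0.768))   # 393.216 GFLOPS fp32
```

Double-rate fp16 doubles those numbers, which is where the 1.18 TFLOPS fp16 figure comes from.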
 

Dakhil

Member
Is it possible that the dev kits were stock X1s overclocked as speculated (because of the loud fan) and that Nintendo was just approximating a performance target that they would later downclock and use more SMs to hit?

I am asking because early rumors mentioned a 3-hour gaming battery life, and according to NateDrake they are now targeting 5-8 (don't give me any NateDrake guff... sources can be mistaken and plans can change.)

If the original dev kits were running at higher clocks with standard X1s, 3 hours sounds about right since dev kits don't need good battery life. They just need to hit performance targets on the same architecture, right? High clock speeds eat up power quickly.

So cheap X1s running at higher clocks emulating the performance of lower-clocked final chips with more SMs (which would be more power-efficient than higher clocks) at least seems like a possibility to me as a cheaper and faster way to get devs up and running while still optimizing for a less power-hungry final retail model.

If it doesn't end up happening, I will still be excited, but I am tentatively considering a 3 SM setup right now. It's doable and makes more sense to me than Nintendo considerably downgrading the device after dev kits went out (this is assuming that the X1s were indeed overclocked and not just so inefficient that they needed fans.)

I do hope that's the case. But I'm not going to hold my breath.
 

Reki

Member
Just wanted to ask, as I have read it a time or two: do lower-clocked chips with more SMs have better battery life than higher-clocked chips with fewer SMs?
 
If the X1 dev kits were clocked at 1.152GHz, that would give you exactly the same performance as 3 SMs at 768MHz (590 GFLOPS fp32, or 1.18 TFLOPS fp16), with the 3 SM setup drawing much less power, which fits with everything we heard before. But it isn't something we know, so assuming it makes us more optimistic than maybe we should be. 393 GFLOPS might sound like a big downgrade, but it is still just outside the pessimistic view of the Pixel C. The handheld performance actually doesn't matter at all, since it is worked out to perform identically to the docked performance at a lower resolution. My big "hope" is that they went with 6 A57 cores @ 1GHz rather than 4; 4 is enough to port games, but it will be a hassle when we've been hearing the opposite for months, so it is difficult to figure out. 6 A57 cores @ 1GHz are similar to 6 Jaguar cores at around 1.5GHz, but that is a pretty rough estimation.

Yeah, the more I consider the GPU part of this rumor, the less important it feels. It was never going to be that high in portable mode; around 200 GFLOPS would seem appropriate if the TX1 was at its max clock when docked.

But the CPU situation sounds like a real problem, especially in light of everything we've been hearing. Hopefully they customized it with 2-4 additional A57s at the very least.

Just wanted to ask, as I have read it a time or two: can lower-clocked chips with more SMs improve battery life compared to higher-clocked chips with fewer SMs?

Based on a chart posted by Thraktor a few months ago, yes, this is accurate. I think this is because power draw rises superlinearly as clocks increase (voltage has to climb with frequency) but only linearly as SMs increase.
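A toy model makes the intuition concrete: dynamic power scales roughly with frequency × voltage², and voltage has to rise with frequency, so a wider, slower chip can hit the same throughput for less power. All the constants below are illustrative placeholders, not real Tegra figures:

```python
# Toy model: dynamic power ~ num_SMs * frequency * voltage^2,
# with voltage rising linearly with frequency. The voltage curve
# below is purely illustrative, not a real Tegra X1 V/f curve.

def relative_power(sms, clock_ghz, v_base=0.8, v_per_ghz=0.25):
    voltage = v_base + v_per_ghz * clock_ghz  # assumed linear V/f relationship
    return sms * clock_ghz * voltage ** 2

# Both configs have equal peak throughput (SMs * clock is the same product):
narrow_fast = relative_power(2, 1.152)  # 2 SMs at 1.152 GHz
wide_slow = relative_power(3, 0.768)    # 3 SMs at 768 MHz
print(narrow_fast > wide_slow)          # True: wide-and-slow draws less
```

The exact gap depends on the real V/f curve, but the direction of the comparison holds whenever voltage rises with clocks.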
 

Rodin

Member
Is it possible that the dev kits were stock X1s overclocked as speculated (because of the loud fan) and that Nintendo was just approximating a performance target that they would later downclock and use more SMs to hit?

I am asking because early rumors mentioned a 3-hour gaming battery life, and according to NateDrake they are now targeting 5-8 (don't give me any NateDrake guff... sources can be mistaken and plans can change.)

If the original dev kits were running at higher clocks with standard X1s, 3 hours sounds about right since dev kits don't need good battery life. They just need to hit performance targets on the same architecture, right? High clock speeds eat up power quickly.

So cheap X1s running at higher clocks emulating the performance of lower-clocked final chips with more SMs (which would be more power-efficient than higher clocks) at least seems like a possibility to me as a cheaper and faster way to get devs up and running while still optimizing for a less power-hungry final retail model.

If it doesn't end up happening, I will still be excited, but I am tentatively considering a 3 SM setup right now. It's doable and makes more sense to me than Nintendo considerably downgrading the device after dev kits went out (this is assuming that the X1s were indeed overclocked and not just so inefficient that they needed fans.)

^ Make that two.

I wonder if anyone here can try to get in touch with the author of the DF article on Twitter or something to see if he can clarify what exactly they mean when they say "developers have recently been briefed" on those stock TX1 specs. Clearly based on their speculation about CUDA cores this doesn't mean those specs are confirmed, but I'm curious if it just refers to recent devkits.

I said the same on the previous page. It's unlikely but not entirely implausible.
There's also another scenario we can consider, and it's Mario Switch's performance. We heard from LKD or Emily (I think) that performance wasn't good until recently (September-November), and then it got fixed. Now, I know that simple optimization and new builds can fix these issues, and it's most likely just that... but what if the older build was on older kits with a stock TX1 (512 GFLOPS) and the new one is on the final hardware with 3 SMs? At the clock speed revealed by DF, it would be 590 GFLOPS.

I know it sounds like fan fiction, and I'm sure they simply improved performance with a new, more optimized build. But reading the posts above, the timing would be oddly perfect for this scenario, lol.

I still think it's 2 SMs at 28nm and Nintendo went dirt cheap with this, but it's fun to speculate about weird scenarios.

Yeah, the more I consider the GPU part of this rumor, the less important it feels. It was never going to be that high in portable mode; around 200 GFLOPS would seem appropriate if the TX1 was at its max clock when docked.

But the CPU situation sounds like a real problem, especially in light of everything we've been hearing. Hopefully they customized it with 2-4 additional A57s at the very least.
Yeah, the CPU is the most underwhelming part for sure. I really didn't expect that clock (my worst-case scenario was 1.4GHz).
 

Proelite

Member
Just wanted to ask, as I have read it a time or two: do lower-clocked chips with more SMs have better battery life than higher-clocked chips with fewer SMs?

It would be easier to cool, since you have the same heat spread over more surface area. Conversely, it'll be more costly to manufacture.
 

ggx2ac

Member
It makes no sense that developers would know the clock speeds and not the core count; the people who leaked these clocks (not Eurogamer, but their source) are causing a lot of needless speculation here. Clock speeds without core counts for the CPU and GPU are not going to give developers performance numbers to target. I hope someone comes forward with core counts. As for the whole 28nm vs 20nm debate, 16nm is more likely than 28nm; 28nm is just really out there. Not to say it's impossible, but it's wacky tinfoil-hat crazy.

So it's not impossible but crazy to suggest? lol

Maxwell originated on a 28nm node. 2nd-gen Maxwell may have moved to a 20nm node, but the fact that it's the same architecture suggests it was more of a die shrink than a new design that required more transistors.

If the Switch SoC is 16nmFF then it really shouldn't need a fan at the clocks it's running. If anything, it should be clocked higher while still requiring a fan, and yet that isn't the case.

Maybe they have stuck with 20nm and 3 SMs, but it's looking more and more unlikely when a 28nm node suggests they had to clock it very low to get it to around 1.5W when running the GPU.

If it doesn't end up happening, I will still be excited, but I am tentatively considering a 3 SM setup right now. It's doable and makes more sense to me than Nintendo considerably downgrading the device after dev kits went out (this is assuming that the X1s were indeed overclocked and not just so inefficient that they needed fans.)

Considering every insider and Eurogamer didn't debunk the leaked specs, it is unlikely. They never even said, "Hey, the GPU clock is wrong. It's supposed to be around 1.2GHz not 1GHz."

They even said the Switch final specs would be similar and yet we have a severely underclocked system.
 

Reki

Member
Based on a chart posted by Thraktor a few months ago, yes, this is accurate. I think this is because power draw rises superlinearly as clocks increase (voltage has to climb with frequency) but only linearly as SMs increase.

It would be easier to cool since you have same heat spread over more surface area. Contrarily it'll be more costly to manufacture.

Thank you both! Even if pretty unlikely, I sure hope this is the case.
 

Speely

Banned
I'm just trying to keep hope afloat without popping the flotation device due to my heavy breathing.

How's that for a messy metaphor?
 

Schnozberry

Member
28nm is so asinine I'm having a hard time formulating a response to it. I know Nintendo are seen as using old tech, but you'd have to believe they went to Nvidia and saw the finished design on the 20nm Tegra X1 and said "Hey, can we pay you a bunch of unnecessary money to make that run even shittier on an older process node?"
 
28nm is so asinine I'm having a hard time formulating a response to it. I know Nintendo are seen as using old tech, but you'd have to believe they went to Nvidia and saw the finished design on the 20nm Tegra X1 and said "Hey, can we pay you a bunch of unnecessary money to make that run even shittier on an older process node?"

Since this is the first time they've built a new console from scratch since the Gamecube, looking at their decisions there might be a good way to determine their thinking here. Was the Gamecube built on the most modern process node?

Although since this is more like a portable that might not be the most comparable situation. But with it being a portable you'd expect them to go with the smaller more efficient process node to begin with.

Who the hell knows...
 

Cerium

Member
For sure it's 28 nm 2SM

 

Pokemaniac

Member
Given what we know, I don't even think more than 2 SMs is unlikely. Otherwise I'm somewhat perplexed at the inclusion of the fan, as well as several other details (the dev kits originally being overclocked, the way devs who barely considered the Wii U seem somewhat satisfied with this, etc.).
 

bomblord1

Banned
Even though I'm confident on 2 SMs, I just thought I would throw this out.

What about the possibility of a 16nm chip with 4 SMs? Would that account for the fan and clocks? It would also explain why the system could have originally been running on an OC'd 20nm 2 SM setup, as well as why the final dev kit got a bump in battery life (shrinking the die).
 
Given what we know, I don't even think more than 2 SMs is unlikely. Otherwise I'm somewhat perplexed at the inclusion of the fan.

Me too about the fan, but let's remember to be a bit more conservative from now on with our expectations. It would be a very pleasant surprise if they added an SM or two but at this point I'm not banking on it.

Even though I'm confident on 2 SMs, I just thought I would throw this out.

What about the possibility of a 16nm chip with 4 SMs? Would that account for the fan and clocks? It would also explain why the system could have originally been running on an OC'd 2 SM setup.

I'm sure it's possible, but that likely increases the price too much. At that point, 3 SMs clocked higher with a slightly larger battery is likely more cost-effective, though maybe heavier or bigger than they want.
 

18-Volt

Member
The lesson from this is never trust in "insiders".

My lesson was "never trust anybody but LKD, not even the Eurogamer & Digital Foundry guys". Eurogamer have been wrong a couple of times before (where is my Mother 3?) and I don't know who Digital Foundry are. LKD's track record still looks like the best one; she's yet to be wrong. And if she says the Switch is much more powerful than the Wii U in portable mode and Dark Souls 3 is running on the Switch at a satisfactory level, I believe her. Ignore everything else.
 

Schnozberry

Member
Since this is the first time they've built a new console from scratch since the Gamecube, looking at their decisions there might be a good way to determine their thinking here. Was the Gamecube built on the most modern process node?

Although since this is more like a portable that might not be the most comparable situation. But with it being a portable you'd expect them to go with the smaller more efficient process node to begin with.

Who the hell knows...

The Cube was pretty advanced for its time, particularly on the GPU and RAM front. I don't know if the CPU was on the latest process node available at that moment, but it was on equal footing with the Intel chip in the Xbox (180nm).
 

magash

Member
Since this is the first time they've built a new console from scratch since the Gamecube, looking at their decisions there might be a good way to determine their thinking here. Was the Gamecube built on the most modern process node?

Although since this is more like a portable that might not be the most comparable situation. But with it being a portable you'd expect them to go with the smaller more efficient process node to begin with.

Who the hell knows...

Both the Xbox's CPU and the GameCube's CPU were produced on 180nm.
 
I think ultimately these specs come down to the "experience" Nintendo wants to provide. The whole Switch selling point seems to be "play the same game at home or away from the TV". Resolution aside, I really don't think Nintendo wants people to have an obviously downgraded experience on the go, or for them or third parties to have to create and optimise two different versions of assets for every single game.

I have faith we will see a much more powerful, traditional Nintendo home console based on the same Switch architecture by the end of 2018 anyway so the specs aren't a great concern to me personally.

Nintendo blew people away visually on the GameCube, Wii, 3DS and Wii U, all on "pathetic" specs. I have no doubt that they will do the same with this modest leap over the Wii U.

Would be great to get Nate's, Matt's, LC's or Osiris' take on these new revelations; surely one of them has something to add?
 
Let's see: they kept the Wii U on a 45nm process due to the eDRAM, and they recently shrunk the process for the n3DS. What would be a good reason to stick with an older process on a handheld form factor? Would that actually be more expensive to do?

If that's somehow the case, it's actually crazy that the system is as powerful as the worst-case scenario.
They could also make some dramatic revisions (Switch Pro?) very soon, with a drastic increase in energy efficiency, capability and/or battery life.
 

kIdMuScLe

Member
Is there a huge difference in wattage between 1GHz and 2GHz? And what kind of CPU speed would they need if Nintendo told them at the beginning to develop for 2GHz, but now we know it's 1GHz?
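As a rough rule of thumb: dynamic power goes as roughly frequency × voltage², and since voltage typically has to scale up with frequency, power grows close to the cube of the clock. A sketch under that simplified, leakage-free assumption (the function is just illustrative):

```python
# Rule-of-thumb dynamic power scaling: P ~ C * f * V^2.
# If voltage must scale roughly in proportion to frequency, P ~ f^3.
# Simplified: ignores static leakage and real DVFS curves.

def dynamic_power_ratio(f_new_ghz, f_old_ghz):
    """Relative dynamic power moving from f_old to f_new (V assumed ~ f)."""
    return (f_new_ghz / f_old_ghz) ** 3

print(dynamic_power_ratio(2.0, 1.0))  # 8.0 -- doubling the clock can mean ~8x the power
```

So yes, the wattage gap between 1GHz and 2GHz can be very large, which is one reason handhelds cluster at low clocks.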
 
My lesson was "never trust anybody but LKD, not even the Eurogamer & Digital Foundry guys". Eurogamer have been wrong a couple of times before (where is my Mother 3?) and I don't know who Digital Foundry are. LKD's track record still looks like the best one; she's yet to be wrong. And if she says the Switch is much more powerful than the Wii U in portable mode and Dark Souls 3 is running on the Switch at a satisfactory level, I believe her. Ignore everything else.

Pretty sure Eurogamer was the first to leak that the Switch is a hybrid
 