
Nintendo Switch Dev Kit Stats Leaked? Cortex A57, 4GB RAM, 32GB Storage, Multi-Touch.


Durante

Member
Can it really do that? I thought the chip was stuck with either one or the other. Like if you choose efficiency, then you're stuck with it, and not something that can be switched back and forth. Unless that is why the system is called Switch.

There is one choice you can only make once when you create the hardware.
That's die size, and it's a trade-off between cost and performance/efficiency.

Dynamically, you can change the clock and voltage, which induces a trade-off between performance and power consumption.
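To make that trade-off concrete, here's a minimal sketch of the standard dynamic-power approximation for CMOS chips, P ≈ C·V²·f. Every constant below is made up for illustration; none of them are actual Tegra values.

```python
# Rough sketch of the dynamic-power relation P ~ C * V^2 * f.
# The constants below are illustrative only, not real Tegra X1 numbers.

def dynamic_power_watts(cap_farads: float, volts: float, freq_hz: float) -> float:
    """Approximate switching power of a CMOS chip."""
    return cap_farads * volts ** 2 * freq_hz

# Hypothetical operating points: lowering the clock usually lets you lower
# the voltage too, so power falls faster than performance does.
CAP = 1.2e-9  # effective switched capacitance (illustrative)
for label, volts, freq in [("high-clock", 1.0, 1.0e9), ("low-clock", 0.8, 0.5e9)]:
    print(f"{label}: {dynamic_power_watts(CAP, volts, freq):.2f} W at {freq / 1e9:.1f} GHz")
```

Because voltage enters squared, halving the clock at a lower voltage cuts power by roughly two-thirds in this toy example, which is exactly why the same die can hit very different performance/power points.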
 

Xellos

Member
I linked that very same article in this thread a couple of pages back. What's notable about it is that throughout the entire development process the dev kits remained almost identical to the target hardware, aside from bug fixes.

If that's how Nintendo works, then we would have seen something long in advance of full production.

BTW a Tegra X1 in the NS is logically consistent with the custom SoC claim since the dev kit was never explicitly identified as a Jetson board. It could be a die shrunk X1 with a few alterations for power saving, and therefore custom in the same way RSX was a custom GPU. Would totally fit in with the way Nvidia works.

Yup, X1 (hopefully on 14nm or 16nm) seems like a reasonable expectation. Not the most powerful chip, but at least it's better than Wii U. Will be interesting to see how it handles something like DQXI.
 

Vena

Member
BTW a Tegra X1 in the NS is logically consistent with the custom SoC claim since the dev kit was never explicitly identified as a Jetson board. It could be a die shrunk X1 with a few alterations for power saving, and therefore custom in the same way RSX was a custom GPU. Would totally fit in with the way Nvidia works.

So... Pascal? We've now gone full circle.

A shrunk X1/Maxwell with alterations isn't an X1/Maxwell, it's Pascal.

Yup, X1 (hopefully on 14nm or 16nm) seems like a reasonable expectation. Not the most powerful chip, but at least it's better than Wii U. Will be interesting to see how it handles something like DQXI.

The only major gains from Maxwell to Pascal come from the die shrink from 20nm to 16nm; there is no such thing as a die-shrunk X1. It's redundant.
 
32gb of storage? Why would they do this again? Probably to save cost, I know, but a 1TB HDD is what, $70? You'll probably have to use an external HDD again too :/.
 
32gb of storage? Why would they do this again? Probably to save cost, I know, but a 1TB HDD is what, $70? You'll probably have to use an external HDD again too :/.

It's also a handheld, so an HDD is out of the question.

But again it should be pointed out (because this thread is constantly being bumped, by me too) that the "leak" in the OP is 100% BS.
 

Zil33184

Member
A die-shrunk X1 wouldn't be an X1 anymore.

Not even if it still has the same core count, 64-bit LPDDR4, the same clocks, identical cache, and no new instructions whatsoever? C'mon, that's like saying Cell @ 65nm isn't Cell.

If none of the above changes from the dev kit, I'd feel more comfortable calling it a die-shrunk X1 than an X2. Semantic games aside though, anyone disappointed with a Tegra X1 would be equally disappointed with the latter.
 

Vena

Member
Not even if it still has the same core count, 64-bit LPDDR4, the same clocks, identical cache, and no new instructions whatsoever? C'mon, that's like saying Cell @ 65nm isn't Cell.

If none of the above changes from the dev kit, I'd feel more comfortable calling it a die-shrunk X1 than an X2. Semantic games aside though, anyone disappointed with a Tegra X1 would be equally disappointed with the latter.

X2 doesn't exist.
 

Xellos

Member
So... Pascal? We've now gone full circle.

A shrunk X1/Maxwell with alterations isn't an X1/Maxwell, it's Pascal.



The only major gains from Maxwell to Pascal come from the die shrink from 20nm to 16nm; there is no such thing as a die-shrunk X1. It's redundant.

OK, so a chip very similar to X1, but with the major difference being that it is 14/16nm.
 

Zil33184

Member
Parker. It's still using the codename at this stage (X1 was Erista).


Because Nintendo prioritizes memory efficiency in their customizations. Also X1 is irrelevant.

Irrelevant, aside from being in the dev kit that's been confirmed by multiple sources.
 

Zil33184

Member
In the early devkit yes. Still irrelevant for what the final customized SoC will be.

"Early". We're talking about July here, with the console entering production in September and launching in March.

I mean there isn't even an alternative spec or leak floating out there, just GAF speculation.
 

Schnozberry

Member
Can it really do that? I thought the chip was stuck with either one or the other. Like if you choose efficiency, then you're stuck with it, and not something that can be switched back and forth. Unless that is why the system is called Switch.

I'm just referring to altering the power profiles. Draw more wattage with increased clocks while docked, and return to battery-friendly clocks while away.
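In practice that kind of switching is just firmware selecting a different clock table based on dock state. A toy sketch of the idea; every number here is hypothetical, not a leaked or confirmed clock:

```python
# Toy model of dock-aware clock profiles. All values are hypothetical,
# chosen only to illustrate switching power profiles, not real Switch clocks.
from dataclasses import dataclass

@dataclass
class ClockProfile:
    gpu_mhz: int
    cpu_mhz: int

PROFILES = {
    "docked":   ClockProfile(gpu_mhz=1000, cpu_mhz=2000),  # wall power available
    "handheld": ClockProfile(gpu_mhz=500,  cpu_mhz=1000),  # battery-friendly
}

def active_profile(is_docked: bool) -> ClockProfile:
    return PROFILES["docked" if is_docked else "handheld"]

print(active_profile(is_docked=False))  # ClockProfile(gpu_mhz=500, cpu_mhz=1000)
```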

The efficiency gains from going from 20nm planar to 16nm FinFET will remain no matter the clock speed. Pascal will be able to achieve higher clock speeds than Maxwell within the same power envelope, and it will draw less power at equivalent clock speeds.

Nvidia has known 20nm was a dead end since 2014. I'm having a hard time believing they would have been able to keep the shortcomings of the 20nm X1 from being readily apparent to Nintendo, especially when it comes to power consumption.
 

Schnozberry

Member
"Early". We're talking about July here, with the console entering production in September and launching in March.

I mean there isn't even an alternative spec or leak floating out there, just GAF speculation.

July was when Eurogamer leaked what was in a version of the devkit they had heard about. We don't know what stage of development that kit came from, or how long Eurogamer had been sitting on the rumor.
 

Somnid

Member
Irrelevant, aside from being in the dev kit that's been confirmed by multiple sources.

I don't think there have been multiple sources, and there has certainly been no confirmation. It's not any stronger than a couple of rumor reports stating it's using a newer architecture. All we really know is that it's a custom Tegra.
 

bomblord1

Banned
Uh... Man, sometimes the technical illiteracy of even gaffers manages to astound me. You can't be for real, dude...

If someone doesn't know something, instead of insulting them for it, it's usually more productive to teach them what it is they don't know in a non-condescending manner.

Just because you take a piece of knowledge for granted doesn't mean everyone knows it or has been in a situation where they would have been able to learn it.
 
I'd be surprised to see more than 32GB on this thing. It keeps cost down for those just using carts. I do expect it to be expandable with microSD, and I expect that to perform slowly. You know, just like the 3DS. I personally don't mind any of this. Likely there will be a revision with 64GB later. The Wii U started at 8GB...
 

MacTag

Banned
"Early". We're talking about July here, with the console entering production in September and launching in March.

I mean there isn't even an alternative spec or leak floating out there, just GAF speculation.
I asked for a source on the September production before. Can you provide that?

There are multiple Pascal rumors, including one from one of the same early sources citing Nvidia. X1 inclusion in any dev kit isn't confirmed either; it's also still a rumor at this stage.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
Screen isn't even 1080p? What year is this? Come on, Nintendo. Expected, but so disappointing.
It's the year when home consoles often struggle to do 1080p right.
 

guek

Banned
Not even if it still has the same core count, 64-bit LPDDR4, the same clocks, identical cache, and no new instructions whatsoever? C'mon, that's like saying Cell @ 65nm isn't Cell.

If none of the above changes from the dev kit, I'd feel more comfortable calling it a die-shrunk X1 than an X2. Semantic games aside though, anyone disappointed with a Tegra X1 would be equally disappointed with the latter.

Why would they do a die shrink for an older chip and keep everything else the same? That makes zero sense when it wouldn't save them any money.

I asked for a source on the September production before. Can you provide that?

There are multiple Pascal rumors, including one from one of the same early sources citing Nvidia. X1 inclusion in any dev kit isn't confirmed either; it's also still a rumor at this stage.
http://www.businessinsider.com/nintendo-nx-entering-production-report-2016-9
 

Zil33184

Member
July was when Eurogamer leaked what was in a version of the devkit they had heard about. We don't know what stage of development that kit came from, or how long Eurogamer had been sitting on the rumor.

Sure, they reported information related to them by multiple sources. It's possible to speculate about how old the information was or how long they kept silent about it (though not very long, I suspect, since publishing leaks is a big part of the tech enthusiast press), but nothing else has come out in the interim to suggest the final hardware is any different.

So yes, it's possible to construct a counter narrative regarding the dev kit leak, but we still have nothing substantial to replace it with, and this late in the game I'm very much doubtful of a surprising new development.
 

MacTag

Banned
Sure, they reported information related to them by multiple sources. It's possible to speculate about how old the information was or how long they kept silent about it (though not very long, I suspect, since publishing leaks is a big part of the tech enthusiast press), but nothing else has come out in the interim to suggest the final hardware is any different.

So yes, it's possible to construct a counter narrative regarding the dev kit leak, but we still have nothing substantial to replace it with, and this late in the game I'm very much doubtful of a surprising new development.
Pascal rumors have surfaced though, as you've been told repeatedly. Nothing indicates the devkit leak is representative of final hardware either. What would be unusual, though, is Nintendo slapping an off-the-shelf chip in the Switch, something they haven't done since the 1980s.
 
Pascal rumors have surfaced though, as you've been told repeatedly. Nothing indicates the devkit leak is representative of final hardware either. What would be unusual, though, is Nintendo slapping an off-the-shelf chip in the Switch, something they haven't done since the 1980s.

Again, adding on to that, one of the few things that's been confirmed is that the Switch uses a custom Tegra SoC.
 

KingSnake

The Birthday Skeleton
Not even if it still has the same core count, 64-bit LPDDR4, the same clocks, identical cache, and no new instructions whatsoever?

I think your confusion starts from the fact that you assume that Nintendo buys the SoC directly from Nvidia.

Nintendo doesn't buy the SoC from Nvidia; it just buys the design and pays royalties. And they outsource the production of the chips (usually to TSMC and one Japanese manufacturer whose name I can't remember right now).

If they want a 16nm SoC, what makes more sense: asking Nvidia to design a die-shrunk X1, or using the Parker design, which is already a 16nm SoC, as a basis? Especially since it has a better memory controller, and if there's one thing Nintendo loves, it's efficient memory designs.
 

tkscz

Member
X2 doesn't exist.

Parker does

Screen isn't even 1080p? What year is this? Come on, Nintendo. Expected, but so disappointing.

1080p screen, increased price, decreased battery life. I'll take my chances with the 720p screen, thank you.

Also ITT some claim that the Tegra X2 isn't a thing, even though Nvidia says it is. Go figure.

That's because there isn't an X2; the chip is called Parker. It's being made for self-driving cars (like the X1 was) but can be used for other hardware (like the X1). It was revealed in August but finished development before that. Nintendo could've had Parker in dev kits sometime in August.
 

Zil33184

Member
Why would they do a die shrink for an older chip and keep everything else the same? That makes zero sense when it wouldn't save them any money.

Power savings.

Plus, it seems as if everyone already believes that a die-shrunk Maxwell Tegra X1 automatically qualifies as a 16nm FinFET Pascal Tegra. Yet somehow it can't still be an X1, because apparently that's a logical impossibility or some such bizarre reason.

Also ITT some claim that the Tegra X2 isn't a thing, even though Nvidia says it is. Go figure.
 

Schnozberry

Member
Sure, they reported information related to them by multiple sources. It's possible to speculate about how old the information was or how long they kept silent about it (though not very long, I suspect, since publishing leaks is a big part of the tech enthusiast press), but nothing else has come out in the interim to suggest the final hardware is any different.

So yes, it's possible to construct a counter narrative regarding the dev kit leak, but we still have nothing substantial to replace it with, and this late in the game I'm very much doubtful of a surprising new development.

It's not a counter narrative. There is good reason to believe Nintendo would not go with a 20nm Maxwell-based TX1. Much of it is based on rumor, but then again, the rumors also pointed to a noisy active cooler in the Tegra X1 dev kit. That in itself makes no sense when you consider the only actively cooled version of the chip in production is in the Shield TV, and that cooler is completely silent. So either Nintendo managed to royally screw up building a thermal solution that's only required to dissipate 5-10W, or they were running the TX1 at clock speeds outside its normal capacity in order to approximate the final hardware.
 

Zil33184

Member
Pascal rumors have surfaced though, as you've been told repeatedly. Nothing indicates the devkit leak is representative of final hardware either. What would be unusual, though, is Nintendo slapping an off-the-shelf chip in the Switch, something they haven't done since the 1980s.

How many of these rumours have been vetted enough to have made it to the front page of Eurogamer?
 

Peterc

Member
https://twitter.com/NWPlayer123/status/789116886109655041

Four ARM Cortex-A57 cores, max 2GHz
NVidia second-generation Maxwell architecture
256 CUDA cores, max 1 GHz, 1024 FLOPS/cycle
4GB RAM (25.6 GB/s, VRAM shared)
32 GB storage (Max transfer 400 MB/s)
USB 2.0 & 3.0
1280 x 720 6.2" IPS LCD
1080p at 60 fps or 4k at 30 fps max video output
Capacitance method, 10-point multi-touch
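For what it's worth, the GPU numbers in that list are internally consistent: 256 CUDA cores each doing one fused multiply-add (2 FLOPs) per cycle gives 512 FLOPS/cycle in FP32, and the X1's double-rate FP16 doubles that to the quoted 1024. A quick check (reading the 1024 figure as FP16 is my assumption; the leak doesn't state a precision):

```python
# Sanity check on the leaked GPU figures.
cuda_cores = 256
flops_per_fma = 2                              # one fused multiply-add = 2 FLOPs
fp32_per_cycle = cuda_cores * flops_per_fma    # 512 FLOPS/cycle
fp16_per_cycle = fp32_per_cycle * 2            # 1024, matching the leak
clock_hz = 1.0e9                               # "max 1 GHz"

print(fp32_per_cycle * clock_hz / 1e9, "GFLOPS FP32")   # 512.0
print(fp16_per_cycle * clock_hz / 1e12, "TFLOPS FP16")  # ~1.02
```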

I follow this girl on Twitter, and Emily as well. I can say it's not the first time she's posted spec leaks. She also shares a lot of fake info. You can put her in the same basket as supermetaldev and Happy_Nintendofan.
 

KingSnake

The Birthday Skeleton
How many of these rumours have been vetted enough to have made it to the front page of Eurogamer?

If you use the Eurogamer article as gospel, why are you ignoring their report that the devkit had very noisy active cooling, which could point to Parker being used in the final version?
 

tkscz

Member
How many of these rumours have been vetted enough to have made it to the front page of Eurogamer?

Which means nothing. Do I have to post her previous tweet again?


The person who started the dev kit RUMOR also said that the Switch wouldn't use Tegra at all. It was also pointed out in this thread that the devkit specs she posted are those of other Tegra X1 devices. Eurogamer also once posted Wii U specs that were speculation from a GAF thread, which the poster said might not be true. Just saying.
 
I like how people are happy to accept low-quality tech. No one would be complaining if it was 1080p and the battery life was just as good.

You just answered your own question. The battery technology isn't good enough yet, so they can't just use whatever screen they want and expect a long battery life. Do you know what resolution your smartphone games play at, or how much power they are pushing? Even if the screen is 1440p or whatever, the games will play at a lower resolution, and with that comes some blurriness and input lag.
 

Deadstar

Member
You just answered your own question. The battery technology isn't good enough yet, so they can't just use whatever screen they want and expect a long battery life. Do you know what resolution your smartphone games play at, or how much power they are pushing? Even if the screen is 1440p or whatever, the games will play at a lower resolution, and with that comes some blurriness and input lag.

I understand this, but as someone who just switched to a 2K monitor, even 1080p looks bad in comparison. I can't even imagine 4K. I just want technology to improve faster than it is. Seeing a 720p screen, something we had 10 years ago, is just disappointing for someone who likes tech.
 

AzaK

Member
Threads trying to talk about Switch ideas get closed regularly, but a thread about bullshit specs is left to continue the madness.
 
Power savings.

Plus, it seems as if everyone already believes that a die-shrunk Maxwell Tegra X1 automatically qualifies as a 16nm FinFET Pascal Tegra. Yet somehow it can't still be an X1, because apparently that's a logical impossibility or some such bizarre reason.

Also ITT some claim that the Tegra X2 isn't a thing, even though Nvidia says it is. Go figure.

Nvidia has never called any of its chips a "Tegra X2." There is the Drive PX2, which is a completely different thing, and a chip called Parker, which is essentially the successor to the TX1. But a "TX2" doesn't officially exist.

Also, as far as I'm aware, one of the defining features of the TX1 is the Maxwell architecture, and the major difference between Maxwell and Pascal is the 16nm process, so I would say a hypothetical die-shrunk TX1 indeed would no longer be a TX1, and would be a (edit: custom) Pascal Tegra. I could be wrong on that though.

All in all, I'm not even sure what you're arguing anymore. We know for a fact that the Switch uses a custom SoC, and we definitely do not know if it will be closer in performance to a TX1 or to Parker, but that's where speculation based on rumors comes in. Either way, the final unit will not use a stock TX1 because Nvidia has confirmed such.
 

bomblord1

Banned
I understand this, but as someone who just switched to a 2K monitor, even 1080p looks bad in comparison. I can't even imagine 4K. I just want technology to improve faster than it is. Seeing a 720p screen, something we had 10 years ago, is just disappointing for someone who likes tech.

Question: when you open the web browser on this 2K-screen phone, does it look bad to you?

Are you aware you are looking at an image of around 720p, if not lower?

My 1440 x 2560 Nexus 6's browser window is currently at a resolution of 732x412.

That is then upscaled to your screen.
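The ratio between the panel and the viewport is just the device pixel ratio: the browser lays the page out in CSS pixels and the compositor scales it up. Rough arithmetic using the figures above:

```python
# Browser viewport (CSS pixels) vs. physical panel (device pixels),
# using the Nexus 6 figures quoted above (landscape orientation).
panel = (2560, 1440)
viewport = (732, 412)

scale_x = panel[0] / viewport[0]
scale_y = panel[1] / viewport[1]
print(round(scale_x, 2), round(scale_y, 2))  # ~3.5 on both axes, i.e. a ~3.5x upscale
```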
 

Instro

Member
I like how people are happy to accept low-quality tech. No one would be complaining if it was 1080p and the battery life was just as good.

Depends on how worthwhile it would be. If most of the software is targeting 720p, then a 1080p screen doesn't add a whole lot. Having said that, my thought has been that even in mobile mode it should have the power to drive some lower-end and/or indie games at 1080p, so maybe that type of screen could have been good, especially since it would be nice for non-gaming functions. In the end, though, this is probably designed to be a cheaper device, so a 1080p screen and the battery to drive it may push this into the realm of being too expensive.
 

KingSnake

The Birthday Skeleton
All in all, I'm not even sure what you're arguing anymore. We know for a fact that the Switch uses a custom SoC, and we definitely do not know if it will be closer in performance to a TX1 or to Parker, but that's where speculation based on rumors comes in. Either way, the final unit will not use a stock TX1 because Nvidia has confirmed such.

His hope is that the memory bandwidth will be 25.6GB/s. This whole thing with devkit theories and how the SoC will not change started from the memory bandwidth discussion in the other thread and that theory from Zlatan. That's why it's acceptable for it to get a die shrink but it can never be a Parker, because then the memory bandwidth would be too high.
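For context, that 25.6GB/s figure falls straight out of the leaked memory config: a 64-bit LPDDR4 interface moves 8 bytes per transfer, and at the LPDDR4-3200 data rate the Tegra X1 uses, that works out to exactly 25.6GB/s. The 3200MT/s rate is my assumption; the leak itself only gives the bandwidth.

```python
# Where 25.6 GB/s comes from: bus width x data rate.
# LPDDR4-3200 is the X1's standard memory speed; the leak itself
# only gives the bandwidth, so the data rate here is an assumption.
bus_width_bits = 64
transfers_per_sec = 3200e6            # LPDDR4-3200: 3200 MT/s

bytes_per_transfer = bus_width_bits / 8
bandwidth_gb_s = bytes_per_transfer * transfers_per_sec / 1e9
print(bandwidth_gb_s)  # 25.6
```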
 

Orbis

Member
I understand this, but as someone who just switched to a 2K monitor, even 1080p looks bad in comparison. I can't even imagine 4K. I just want technology to improve faster than it is. Seeing a 720p screen, something we had 10 years ago, is just disappointing for someone who likes tech.
I only have to look at the Vita screen to tell you that for gaming, even 540p isn't bad. We are nowhere near portable gaming at 1080p with acceptable battery life at an affordable price. If this were a smartphone you'd want the higher resolution for crisper text, but it matters less for gaming.
 