
Nintendo Switch Dev Kit Stats Leaked? Cortex A57, 4GB RAM, 32GB Storage, Multi-Touch.


ozfunghi

Member
I was under the assumption this had a Pascal-based chip.

Rumors are pointing towards it being a Pascal-based chip in the final hardware, but the devkits are running on Maxwell (Tegra X1) according to Eurogamer (who had the other devkit info right).
 
Rumors are pointing towards it being a Pascal-based chip in the final hardware, but the devkits are running on Maxwell (Tegra X1) according to Eurogamer (who had the other devkit info right).

The devkits are from July or earlier; we don't know exactly when.
 

MDave

Member
Which one? The TX1 is 512 GFLOPS at FP32 and 1 TFLOPS at FP16. But considering that FP16 can only be used for a subset of computations, calling it a 1 TFLOPS chip would be disingenuous.

It would also be disingenuous to say developers will only use FP32 and thus call it a 500 GFLOPS chip :p

I made a post in the Switch Nvidia chip thread, but unfortunately it's going to go unnoticed, slipping to the 3rd page while this one stays on the front.

I dug into what sort of balance of FP16 use we should expect, so we get an idea of the ratio of FP precision usage, and it looks like FP16 will be used more than FP32 for shaders.

http://www.neogaf.com/forum/showpost.php?p=221603499&postcount=1707

If Thraktor or anyone with a technical background can check this out, I'd greatly appreciate it!
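For reference, here's a quick worked example of where the figures quoted above come from, assuming the stock Tegra X1 configuration (256 Maxwell CUDA cores at roughly 1 GHz, with FP16 handled as two packed operations per FP32 lane):

```latex
% Theoretical peak throughput of a stock Tegra X1
\begin{align*}
\text{FP32:}\quad & 256\ \text{cores} \times 2\ \tfrac{\text{FLOP}}{\text{cycle}}\ (\text{FMA}) \times 1\ \text{GHz} = 512\ \text{GFLOPS} \\
\text{FP16:}\quad & 512\ \text{GFLOPS} \times 2\ (\text{vec2 packing}) = 1024\ \text{GFLOPS} \approx 1\ \text{TFLOPS}
\end{align*}
```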
 

Plinko

Wildcard berths that can't beat teams without a winning record should have homefield advantage
So is there any news or is this just fake?

NateDrake (insider on GAF) is adamant about a Pascal-based chip.

As for the info in the OP, there is zero validity to it whatsoever. No source has ever been verified, and the person who posted it was adamant that Nvidia had nothing to do with Switch. It is most likely fake.
 

Schnozberry

Member
First off, can we get some consensus on the whole "Pascal is just 16nm Maxwell" claim? For desktop cards it has worse FP16 support, and there have been changes to async compute, colour compression, and PolyMorph. Since it has architectural differences and new features not on Maxwell parts, I'm going to disagree that the two are identical.

If the two are not identical, is it then possible to die-shrink a 20nm Maxwell Tegra X1 and not inherit features from unrelated desktop cards and other products in the pipeline? Yep. There, problem solved: a 16nm TX1 isn't Pascal, like some have argued.

For the purposes of the Nintendo Switch and its power envelope, there would be little functional difference between Maxwell and Pascal at 16nm. The Switch isn't going to make heavy use of async compute or the new dynamic scheduler. On the consumer-focused boards, the changes to PolyMorph aren't much to talk about; they were intended for the deep-learning-focused GP100 chip, so multiple SMs could share a Texture Processor Cluster, since that chip isn't so graphics-focused. On GeForce cards the SM-to-TPC ratio is still 1:1, just like Maxwell.

It would be nice if Nintendo could take advantage of Pascal's new color compression. It would offer nice savings on memory bandwidth, but with a chip this small I'm not sure they would spend the die space on the new compression patterns.

The FP16 situation on the desktop GeForce parts appears to exist so Nvidia can sell more Tesla parts. There's no reason for them to repeat that on the GPU for the Switch.
 
It would also be disingenuous to say developers will only use FP32 and thus call it a 500 GFLOPS chip :p

I made a post in the Switch Nvidia chip thread, but unfortunately it's going to go unnoticed, slipping to the 3rd page while this one stays on the front.

I dug into what sort of balance of FP16 use we should expect, so we get an idea of the ratio of FP precision usage, and it looks like FP16 will be used more than FP32 for shaders.

http://www.neogaf.com/forum/showpost.php?p=221603499&postcount=1707

If Thraktor or anyone with a technical background can check this out, I'd greatly appreciate it!

I appreciate the work you put in for that post (and this one), even though I don't know enough to really speculate much on it. It seems like we'll see a fair amount of FP16 code for this, but I have no idea what kind of general or average ratio we might see (that would change on a game-by-game basis, obviously).
 
The screen isn't even 1080p? What year is this? Come on, Nintendo. Expected, but so disappointing.

1080p on a sub-7 inch screen would be a waste of horsepower and battery life. Of all the criticisms to have about the Switch and Nintendo, this is what you landed on?
 

AntMurda

Member
Even with our lowest expectations of the hardware now, it should still run Zelda, MK8+ and Splatoon+ at 1080p over HDMI out, no?

Granted, the third-party XB1/PS4 ports would have to run at lower pixel counts.
 

Zil33184

Member
I'm pretty sure it's just semantics. You're claiming that a simple die shrink of a 20nm Tegra X1 to a 16nm chip can also be called a Tegra X1. But the Tegra X1, which is a single defined product, is made on a 20nm process. Is a 16nm Tegra chip with the same CPU/GPU configuration as a Tegra X1 pretty similar, or nearly identical? Probably, but that's not the point: a chip on a 16nm process cannot be a Tegra X1, because part of the definition of the Tegra X1 is the 20nm process.

Like I said, it's semantics, but it seems to be a major problem with the way the argument is being phrased. I honestly have almost lost sight of your initial argument at this point too.

The point: a TX1 shrunk down on 16nm is a custom Tegra SoC, not a TX1, by definition.

That's pretty arbitrary, since a die-shrunk Cell is still a Cell, for example. But if that's the major sticking point, I'll just call a hypothetical 16nmFF TX1 equivalent "mike" or "exercise bike". Either way, I agree that the 16nmFF exercise bike would qualify as a custom SoC. In fact, I made that point earlier.

Getting back to the final NS hardware versus the dev kit, I still haven't seen a more recent spec. Given how leaky this industry is, that's actually not a great sign.
 

Eolz

Member
Even with our lowest expectations of the hardware now, it should still run Zelda, MK8+ and Splatoon+ at 1080p over HDMI out, no?

Granted, the third-party XB1/PS4 ports would have to run at lower pixel counts.

I'd say so for MK8+ and Splatoon+.
It depends what they want to do for Zelda; I could see them going for more detail or other stuff, still at 720p.
 
That's pretty arbitrary, since a die-shrunk Cell is still a Cell, for example. But if that's the major sticking point, I'll just call a hypothetical 16nmFF TX1 equivalent "mike" or "exercise bike". Either way, I agree that the 16nmFF exercise bike would qualify as a custom SoC. In fact, I made that point earlier.

Getting back to the final NS hardware versus the dev kit, I still haven't seen a more recent spec. Given how leaky this industry is, that's actually not a great sign.

Wouldn't it take more work to get a custom chip based on a Tegra X1 on 16nmFF than to base it off Pascal? Serious question.

Edit:
I'd say so for MK8+ and Splatoon+.
It depends what they want to do for Zelda; I could see them going for more detail or other stuff, still at 720p.

Hopefully they fix some of the pop-in on NS.
 

AntMurda

Member
I'd say so for MK8+ and Splatoon+.
It depends what they want to do for Zelda; I could see them going for more detail or other stuff, still at 720p.

It might be important for Nintendo to set a precedent and standardize first-party games at 1080p on the TV, especially to demonstrate superiority over the Wii U.
 
Pascal in the consumer space is mostly identical to Maxwell. It supports a higher CR tier, but I'm not sure how important that is in practice; regardless, it's a feature unlikely to gain traction anytime soon. It also supports dynamic load balancing, but that feature is probably rendered moot by the number of SMs in the Switch. The enhanced color compression is the only feature that separates it from Maxwell, especially considering the anemic bandwidth figures.

It would also be disingenuous to say developers will only use FP32 and thus call it a 500 GFLOPS chip :p

I made a post in the Switch Nvidia chip thread, but unfortunately it's going to go unnoticed, slipping to the 3rd page while this one stays on the front.

I dug into what sort of balance of FP16 use we should expect, so we get an idea of the ratio of FP precision usage, and it looks like FP16 will be used more than FP32 for shaders.

http://www.neogaf.com/forum/showpost.php?p=221603499&postcount=1707

If Thraktor or anyone with a technical background can check this out, I'd greatly appreciate it!

There are definitely use cases for FP16; it's just a question of whether devs are going to deem all that extra optimization work worth it. History says it's unlikely.
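To make the "extra optimization work" concrete, here's a minimal CUDA sketch (nothing Switch-specific, just the standard cuda_fp16.h intrinsics, with made-up kernel names for illustration) of the same multiply-add written at FP32 and at packed FP16. The half2 version gets two operations per instruction on FP16-capable chips like the Tegra X1, but the developer has to pack values and keep an eye on range and precision:

```cuda
#include <cuda_fp16.h>

// Plain FP32 multiply-add: one result per thread.
__global__ void madd_fp32(const float* a, const float* b, float* out, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) out[i] = a[i] * b[i] + 1.0f;
}

// Packed FP16 multiply-add: each __half2 carries two values, so one
// __hfma2 instruction does two multiply-adds. This is where the
// "double rate" FP16 throughput comes from (requires sm_53+, e.g. TX1).
__global__ void madd_fp16x2(const __half2* a, const __half2* b, __half2* out, int n2) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    const __half2 one = __float2half2_rn(1.0f);
    if (i < n2) out[i] = __hfma2(a[i], b[i], one);
}
```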
 

Schnozberry

Member
Wouldn't it take more work to get a custom chip based on a Tegra X1 on 16nmFF than to base it off Pascal?

There's no reason to go with Maxwell on 16nm when Pascal is so similar and the preliminary designs (barring Nintendo customization) have already been done for other chips.
 
There's no reason to go with Maxwell on 16nm when Pascal is so similar and the preliminary designs (barring Nintendo customization) have already been done for other chips.

Those are exactly my thoughts. It wouldn't really make sense to take Maxwell and do a die shrink when they have Pascal already there.
 
They have rather large allotments of memory reserved for the OS, though I believe developers have been given access to more memory over time.

Yeah, but so does the Wii U, so I'm not sure why people expect differently of the Switch?

Hell, as a percentage, the Wii U devotes a greater portion of its RAM to the OS than either of the other two systems.
 

Schnozberry

Member
Yeah, but so does the Wii U, so I'm not sure why people expect differently of the Switch?

Hell, as a percentage, the Wii U devotes a greater portion of its RAM to the OS than either of the other two systems.

I think 1GB reserved for the OS is probably a safe bet on the Switch. It just depends on whether Nintendo went with 4GB of RAM for the final kit or bumped it up. Either way, the percentage goes down.
 
You can only access 5GB as a developer

Yeah, but then the Wii U has only 1GB by that standard. By comparison, that's even worse than the person I quoted was insinuating, as the Wii U has only a fifth of its competitors' developer-accessible RAM, versus a fourth of their total RAM.

I think 1GB reserved for the OS is probably a safe bet on the Switch. It just depends on whether Nintendo went with 4GB of RAM for the final kit or bumped it up. Either way, the percentage goes down.

I'm not sure why that's a safe bet; I'd expect the OS to be more ambitious than the Wii U's.
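For what it's worth, the percentages being argued about work out roughly like this, using the figures cited in this thread (Wii U with 1GB of its 2GB usable by games, 5GB of 8GB accessible to developers on PS4, and the 1GB-of-4GB Switch reservation treated as the hypothetical it is):

```latex
% Approximate OS-reserved share of total RAM
\begin{align*}
\text{Wii U:} \quad & 1\ \text{GB} / 2\ \text{GB} = 50\% \\
\text{PS4/XB1:} \quad & \approx 3\ \text{GB} / 8\ \text{GB} \approx 37.5\% \\
\text{Switch (if 1 GB of 4 GB):} \quad & 1\ \text{GB} / 4\ \text{GB} = 25\%
\end{align*}
```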
 

rekameohs

Banned
In what way? Wii U needed a more modern UI and more apps, but it has a modern browser, plus Miiverse and the eShop were fine.
Wii U's OS was slow as molasses, so loading more of its resources into memory would help. Then you've got other features that are relatively standard now, like multitasking beyond what Wii U could do, more seamless Miiverse / Social integration, and gameplay recording.
 

Malakai

Member


Not that this adds anything to the specifics of the Switch's internals, but poster Zil33184, please read below and stop stressing about whether or not Nintendo is supposedly releasing an "underpowered" console.

To add context to my previous post (I was asked via PM): without going into too much detail, any game that runs on the XB1 or PS4 should run on the NX with little to no issue. What developers choose to port or not port to the console will more than likely depend on consumer support for the thing.
 

Schnozberry

Member
Wii U's OS was slow as molasses, so loading more of its resources into memory would help. Then you've got other features that are relatively standard now, like multitasking beyond what Wii U could do, more seamless Miiverse / Social integration, and gameplay recording.

The Wii U was crippled by reading from its really slow built-in flash memory, and the CPU wasn't setting the world on fire either. There are full desktop Linux distributions that run on less than 512MB of RAM. 1GB isn't an unreasonable target.
 

ozfunghi

Member
It would also be disingenuous to say developers will only use FP32 and thus call it a 500 GFLOPS chip :p

I made a post in the Switch Nvidia chip thread, but unfortunately it's going to go unnoticed, slipping to the 3rd page while this one stays on the front.

I dug into what sort of balance of FP16 use we should expect, so we get an idea of the ratio of FP precision usage, and it looks like FP16 will be used more than FP32 for shaders.

http://www.neogaf.com/forum/showpost.php?p=221603499&postcount=1707

If Thraktor or anyone with a technical background can check this out, I'd greatly appreciate it!

I already asked the same question in the other (or this, can't remember) thread. And Blu responded that indeed fp16 would be used for shaders, which would take up anywhere between 25 and 50%. But he also said not to quote him on that, hah. Thraktor jumped in on the discussion as well.

http://www.neogaf.com/forum/showpost.php?p=221402916&postcount=1540

The discussion with Thraktor is on the next page.
 
In what way? Wii U needed a more modern UI and more apps, but it has a modern browser, plus Miiverse and the eShop were fine.

In every way? Miiverse can be greatly improved beyond its seemingly barely hidden web-browser origins. Same for the eShop. Neither feels particularly elegant (the former more than the latter, at least).

Then there's that rumored Share button... if Nintendo wants to embrace sharing (even without the button), they're going to need better tools than what they have now.

And things like the friend list and maybe even Miiverse should always be running and easily visible, instead of the segmented mess they currently are.
 

Veal

Member
The Wii U was crippled by reading from its really slow built-in flash memory, and the CPU wasn't setting the world on fire either. There are full desktop Linux distributions that run on less than 512MB of RAM. 1GB isn't an unreasonable target.
Also, the Wii U's OS got sped up quite a bit over time. I don't feel it's much slower than the other consoles' OSes at this point.
 
So is there any news or is this just fake?

The President of Nintendo just reconfirmed that the Switch won't be sold at a loss, but that at the same time they are taking consumers' price expectations seriously. This is big news IMO, because it kisses the $200 and $350 outliers goodbye and narrows the price down to $250-300. This also gives us a better gauge for guesstimating the power of the console.

The best-case scenario is that Nintendo breaks even at either $250 or $300. If they go with $300, we could get more power out of it, which helps in the long run.
 
I think so also. $300 is the best balance between power and price: LED screen, Joy-Con controllers, dock. I hope they have enough space for storage (more than 32GB), or at least a game included.
 

Durante

Member
Pascal in the consumer space is mostly identical to Maxwell. It supports a higher CR tier, but I'm not sure how important that is in practice; regardless, it's a feature unlikely to gain traction anytime soon. It also supports dynamic load balancing, but that feature is probably rendered moot by the number of SMs in the Switch. The enhanced color compression is the only feature that separates it from Maxwell, especially considering the anemic bandwidth figures.
Fine-grained preemption and the Polymorph engine enhancements for multi-projection are also important consumer features which are new in Pascal (but probably irrelevant for Switch).
 

ggx2ac

Member
https://twitter.com/NWPlayer123/status/789116886109655041

Four ARM Cortex-A57 cores, max 2GHz
NVidia second-generation Maxwell architecture
256 CUDA cores, max 1 GHz, 1024 FLOPS/cycle
4GB RAM (25.6 GB/s, VRAM shared)
32 GB storage (Max transfer 400 MB/s)
USB 2.0 & 3.0
1280 x 720 6.2" IPS LCD
1080p at 60 fps or 4k at 30 fps max video output
Capacitance method, 10-point multi-touch
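As a sanity check, two of those numbers decode exactly to the stock Tegra X1: the "1024 FLOPS/cycle" line is the 256 CUDA cores doing a fused multiply-add on two packed FP16 values, and 25.6 GB/s is what a 64-bit LPDDR4 interface at 3200 MT/s works out to:

```latex
% Decoding two of the leaked figures against the stock Tegra X1
\begin{align*}
256\ \text{cores} \times 2\ (\text{FMA}) \times 2\ (\text{packed FP16}) &= 1024\ \text{FLOPS/cycle} \\
3200 \times 10^{6}\ \tfrac{\text{transfers}}{\text{s}} \times 8\ \text{bytes (64-bit bus)} &= 25.6\ \text{GB/s}
\end{align*}
```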

Time to speculate.


Typed into Google: nwplayer123 twitter nvidia

His/her posts denying Nvidia have been deleted; even the posts claiming AMD and DMP were being used for the NX are deleted.

The tweet in the OP was posted 50 minutes after the NX reveal, by which time it had been confirmed that Nvidia was being used for the Switch.

Next: http://www.nvidia.com/object/jetson-tx1-module.html

Oh hey, the specs in the OP are remarkably similar to the specs of the Jetson TX1. Where could that person have gotten that info?...

Oh yeah, from the articles by Eurogamer and Emily Rogers, the ones who were right in the first place about Nvidia and the hybrid.

http://www.eurogamer.net/articles/2016-07-26-nx-is-a-portable-console-with-detachable-controllers

https://arcadegirl64.wordpress.com/2016/09/02/rumor-what-should-we-expect-from-the-final-nx-product/

It looks like no one can prove that the info in the OP is falsified because no developer is going to be stupid enough to leak technical details of a dev-kit.

Edit: I'll also add that this person claimed the NX was being revealed around the 12th or 13th of September, and deleted that tweet too.

I remember seeing Zhugeex post a screenshot of it.

Edit 2: https://developer.nvidia.com/embedded/dlc/jetson-tx1-module-data-sheet

There's a module data sheet for the Jetson TX1... Hmm...

Edit 3: Ah I was already late to this from page 3.

Edit 4: Ohhhh........ *Leaves thread*
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
I already asked the same question in the other (or this, can't remember) thread. And Blu responded that indeed fp16 would be used for shaders, which would take up anywhere between 25 and 50%. But he also said not to quote him on that, hah. Thraktor jumped in on the discussion as well.

http://www.neogaf.com/forum/showpost.php?p=221402916&postcount=1540

The discussion with Thraktor is on the next page.
Here's more on the subject of how fp16 matters: http://www.realworldtech.com/apple-custom-gpu/
 

Malakai

Member
I get a feeling the Switch won't be priced at anything under $300.

I would buy it for $300.

The President of Nintendo just reconfirmed that the Switch won't be sold at a loss, but that at the same time they are taking consumers' price expectations seriously. This is big news IMO, because it kisses the $200 and $350 outliers goodbye and narrows the price down to $250-300. This also gives us a better gauge for guesstimating the power of the console.

The best-case scenario is that Nintendo breaks even at either $250 or $300. If they go with $300, we could get more power out of it, which helps in the long run.

I think so also. $300 is the best balance between power and price: LED screen, Joy-Con controllers, dock. I hope they have enough space for storage (more than 32GB), or at least a game included.




Looking at this report for a 6-inch phone:
http://www.techinsights.com/teardown/Mobile-Survey-Plus-Sample-Report.pdf
Page 3 of the above-linked PDF gives an estimated cost of components in a phablet.

Also, the site has a breakdown of the estimated bill-of-materials cost for the Moto E (2nd Gen) here:
http://www.techinsights.com/teardown/Motorola_Moto_E_2nd_Gen_XT1527-Sample.pdf

Also, below is a link to a 7-inch phone that retailed for $199:
http://www.androidpolice.com/2015/1...e-who-want-a-tablet-trapped-in-a-phones-body/

Given that the Switch won't have a camera, and that Nintendo won't have to pay royalties for telephony patents (the Switch won't have LTE/GSM modems), that should help shave down its price. Also, Nintendo will have a slightly larger scale than most phone/tablet makers, given that Nintendo is making just one device (vs. multiple models even for the smallest vendors) for the time being. Anyway, this is still huge guesswork. I think Nintendo could pull off a profit, or at least break even, at a $199 price point. Heck, even Barnes & Noble did it with the original Nook Color back in 2010 at a price point of $250.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
Also more power efficient... but how does that work if two FP16 calculations are made instead of one FP32? I take it that it's not more power efficient then, right? Same with memory bandwidth?
Power efficiency these days is largely a matter of how many bits you need to haul from wherever they were sitting in order to compute your desired result - the ALU is relatively cheap. So you're right that if you increase (read: double) the computations there won't be any gain. But for the same amount of computations, fp32 -> fp16 is a clear power and performance gain. And that's the crux of the entire movement in the hw industry, both GPU and CPU. Ironically, while doing that, tons of code will end up with accidental-success computations, which, being the worst thing that can happen in computations, will bite the authors (or their successors) in the bottom. /people-don't-do-enough-numerical-analysis-these-days rant
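To illustrate the kind of "accidental success" blu is describing, here's a tiny hypothetical CUDA example (again, nothing Switch-specific): FP16 only carries 11 bits of significand, so above 2048 the representable numbers are spaced 2 apart and small increments silently disappear:

```cuda
#include <cstdio>
#include <cuda_fp16.h>

// FP16 stores a 10-bit mantissa (11 bits of precision), so integers above
// 2048 are spaced 2 apart: adding 1.0 to 2048.0 rounds back to 2048.
// Requires a GPU with native FP16 arithmetic (sm_53+, e.g. Tegra X1).
__global__ void fp16_precision_demo() {
    __half big = __float2half(2048.0f);
    __half one = __float2half(1.0f);
    __half sum = __hadd(big, one);       // still 2048.0 in FP16
    printf("fp16: 2048 + 1 = %f, fp32: 2048 + 1 = %f\n",
           __half2float(sum), 2048.0f + 1.0f);
}

int main() {
    fp16_precision_demo<<<1, 1>>>();
    cudaDeviceSynchronize();
    return 0;
}
```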
 

ozfunghi

Member
Power efficiency these days is largely a matter of how many bits you need to haul from wherever they were sitting in order to compute your desired result - the ALU is relatively cheap. So you're right that if you increase (read: double) the computations there won't be any gain. But for the same amount of computations, fp32 -> fp16 is a clear power and performance gain. And that's the crux of the entire movement in the hw industry, both GPU and CPU. Ironically, while doing that, tons of code will end up with accidental-success computations, which, being the worst thing that can happen in computations, will bite the authors (or their successors) in the bottom. /people-don't-do-enough-numerical-analysis-these-days rant

The article says Apple's programmers are trying to focus on FP16 (which makes sense). I can imagine Nintendo is trying to build an engine that does the same as much as possible. So we might again see big differences between games built for the system and games that were ported with little effort, which is what Thraktor was also suggesting in the other thread, I believe.
 