
Nintendo Switch Dev Kit Stats Leaked? Cortex A57, 4GB RAM, 32GB Storage, Multi-Touch.


Dehnus

Member
That's not what using AOSP would mean, as it's just the operating system codebase. You can create an operating system from Android source code, deploy, and use it without ever contacting Google.


That being said, I doubt the Switch's OS is based on AOSP, as there's not really any code there that would be useful for a gaming OS over more basic UNIX codebases.

AOSP is a fully open source codebase, and Google would have nothing to do with it. So what you write there wouldn't be happening at all.
You can use AOSP's kernel just as flexibly as you could use vanilla Unix or BSD, AFAIK.

Google is the "leader" of that project, so I do NOT trust it then. Period. Even IF they don't have anything in there that datamines, it still helps their datamining shit by making their OS more popular and thus getting it onto more phones, consoles and other machines. I do not like that, nor do I want that. I do not want one overlord to rule all the datamining companies.

If you are going to use a Linux kernel or BSD? Go for Debian, Red Hat, or OpenBSD and FreeBSD. Why should I use a Google-inspired Linux kernel which I can't trust, or which only helps make them the biggest player? I know what AOSP is, I simply want nothing to do with it.
 

Dehnus

Member
I mean, don't use the web at all then? A shit ton of companies contribute to the Linux kernel, so good luck?

I block their scripts with plug-ins in Firefox. And there are other alternatives to Google, Facebook, Microsoft and the other scrooglers of data. But I won't go into it too deeply as it would derail. But thank you for AGAIN bringing up that "OMG you don't use the Internet at all then LOOOOL" defense that Google fanboys do each time, to try and ridicule those that actively use the net without using Google/FB/Twitter/etc.

Contributing to the Linux kernel is one thing; using a Linux kernel from a known Scroogler? Nooooope...
 

Vanillalite

Ask me about the GAF Notebook
I block their scripts with plug-ins in Firefox. And there are other alternatives to Google, Facebook, Microsoft and the other scrooglers of data. But I won't go into it too deeply as it would derail. But thank you for AGAIN bringing up that "OMG you don't use the Internet at all then LOOOOL" defense that Google fanboys do each time, to try and ridicule those that actively use the net without using Google/FB/Twitter/etc.

Contributing to the Linux kernel is one thing; using a Linux kernel from a known Scroogler? Nooooope...

Hey, it's your tinfoil hat thing. Just saying, a ton of companies from IBM to Samsung to MS to Google to Facebook make large to small contributions to the kernel itself.

Idk where you go tinfoil hat and where you draw the line? Lol
 

Hermii

Member
I block their scripts with plug-ins in Firefox. And there are other alternatives to Google, Facebook, Microsoft and the other scrooglers of data. But I won't go into it too deeply as it would derail. But thank you for AGAIN bringing up that "OMG you don't use the Internet at all then LOOOOL" defense that Google fanboys do each time, to try and ridicule those that actively use the net without using Google/FB/Twitter/etc.

Contributing to the Linux kernel is one thing; using a Linux kernel from a known Scroogler? Nooooope...
Why do you think Evillore is so rich? Just saying.
 
What I don't understand is lowering the clocks even in Docked mode.

Can this be because of undocked mode, to keep the transition between the two under control?

It is going to be a long week, I hope we get some answers.
 
To be fair, most of the internet thought Pascal until recently, and the Nvidia blog had the "same architecture as top-performing cards" line, which at least I and many others assumed confirmed Pascal. I doubt they have better sources than we do.

Fuck, this is a repeat of the Wii U "same technology as Watson" thing where people thought that meant that the Wii U was going to use a Goddamn quad-core POWER7 CPU.

EDIT: To be fair, very few people had seen how small the Wii U was until E3 2012.
 

Vanillalite

Ask me about the GAF Notebook
Honestly I'm more interested in the screen quality than anything. I don't mean resolution, but just general display quality.

More interested in that than even the core specs.
 

Xdrive05

Member
What I don't understand is lowering the clocks even in Docked mode.

Can this be because of undocked mode, to keep the transition between the two under control?

It is going to be a long week, I hope we get some answers.

Layman speculation here, but the way it seems to me is that even while docked it's still a very small form factor which has to run at full clock speed potentially for hours or days at a time, without the throttling profiles that its Android counterparts have. So they probably had to scale back the docked mode for that reason, which then has the "added benefit" of its 720p/undocked profile needing an even lower clock (proportional to the number of pixels needing to be rendered) and thus giving better battery life in that mode too.

So a 150gf GPU profile in handheld will presumably suck a lot less juice than, say, the 250gf many of us were speculating as the "lowest likely handheld performance". Maybe we were not considering the proportionality of 720p to 1080p in conjunction with the fact that even while docked it is still a tiny device without the same throttling options as the current market devices?

Again, total layman here, but makes sense on the surface.
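
For what it's worth, the 720p-to-1080p proportionality is easy to sanity-check. Here's a rough back-of-the-envelope sketch in Python, assuming GPU work scales linearly with pixel count (a big simplification that ignores vertex work, bandwidth and fixed per-frame costs) and using Eurogamer's rumoured 768 MHz docked clock:

[code]
# Toy check of the "portable clock should scale with pixel count" idea.
# Assumption: GPU work scales roughly linearly with pixels rendered.
pixels_720p = 1280 * 720       # 921,600 pixels (handheld screen)
pixels_1080p = 1920 * 1080     # 2,073,600 pixels (docked output)

pixel_ratio = pixels_1080p / pixels_720p
print(pixel_ratio)             # 2.25 -> 1080p has 2.25x the pixels of 720p

docked_clock_mhz = 768.0       # Eurogamer's rumoured docked GPU clock
print(docked_clock_mhz / pixel_ratio)   # ~341 MHz if scaled purely by pixels,
                                        # a bit above the rumoured 307.2 MHz
[/code]

So the rumoured portable clock actually sits slightly below a pure pixel-for-pixel scaling of the docked clock, which would line up with trading a little extra performance for battery life.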
 

ggx2ac

Member
What I don't understand is lowering the clocks even in Docked mode.

Can this be because of undocked mode, to keep the transition between the two under control?

It is going to be a long week, I hope we get some answers.

Two possible factors:

1) They only cared about a 2.5 times increase in FLOPS to be able to get Switch games output from 720p to 1080p when docked (rough math sketched below).

2) Seeing as that is the case, the portable mode's clock speeds are the baseline, and they wanted to get the Switch running well as a portable in terms of battery life and thermal output, regardless of how high they know a TX1 can run in docked mode. They could have gone with a GPU clock speed of 400 MHz that upclocks to 1 GHz when docked, but again, battery life and thermal output may have been the reason for lowering it.

That's speculation though; there are lots of variables, whether it's a new CPU or more CUDA cores, etc.
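
As a rough illustration of point 1), here's the back-of-the-envelope FLOPS math, assuming a stock TX1 layout of 256 CUDA cores and the clocks Eurogamer reported (307.2 MHz portable, 768 MHz docked); these are assumptions, not confirmed Switch specs:

[code]
# FP32 throughput for a 256-core Maxwell-style GPU at the rumoured clocks.
# GFLOPS = cores * 2 ops per clock (fused multiply-add) * clock in GHz.
CUDA_CORES = 256          # stock TX1 figure (assumption)
PORTABLE_MHZ = 307.2      # Eurogamer's rumoured portable GPU clock
DOCKED_MHZ = 768.0        # Eurogamer's rumoured docked GPU clock

def gflops(cores, mhz):
    return cores * 2 * mhz / 1000.0

portable = gflops(CUDA_CORES, PORTABLE_MHZ)   # ~157 GFLOPS
docked = gflops(CUDA_CORES, DOCKED_MHZ)       # ~393 GFLOPS
print(portable, docked, docked / portable)    # ratio is exactly 2.5x
[/code]

That 2.5x FLOPS ratio sits just above the 2.25x pixel-count jump from 720p to 1080p, which is consistent with the idea that the docked clock was chosen mainly to cover the resolution bump.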

Layman speculation here, -snip-

The Shield TV has a fan in it; it doesn't throttle at all. Its temperature was somewhere between 30°C and 40°C when playing games, and that is at the stock speeds we know: 2GHz CPU, 1GHz GPU.
 
Anything remotely touched by Google I do not want to touch myself. I do not trust it, nor do I want to be near it. I know what AOSP is, and with Google being a part of it I simply do NOT trust it. PERIOD!

Tinfoil hat much? Sure... but I'd rather stay away from things like Facebook, Trumper.. erm, I mean Twitter, and anything that farms my personal information. Like GMail, which you are FORCED to deal with as people get GMail accounts and use them, so if you send an email to them, you are still known to those feckers.

It thus wouldn't be that surprising that in the "Android Open Source Project" there are some remnants that collect data or phone home... and I want NOTHING of it.

Seeing as how your posting history indicates that you've used Snapchat, I assume you're OK with Apple collecting just as much of your data as Google does.
 
The Shield TV has a fan in it; it doesn't throttle at all. Its temperature was somewhere between 30°C and 40°C when playing games, and that is at the stock speeds we know: 2GHz CPU, 1GHz GPU.

To be fair, the 1GHz CPU in the Switch makes sense purely for the fact that having a 2GHz CPU in a portable sounds like an excellent way of wrecking battery life. Plus, it takes into account idiotic programmers who still insist on using CPU clocks as a timing metric versus an adaptive tick-rate solution.
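
For anyone wondering what the sane alternative looks like: instead of assuming a fixed CPU clock, you measure elapsed wall-clock time and run the simulation in fixed-size ticks, however long each frame actually took. A minimal sketch in Python (the names and structure are illustrative, not any particular engine's API):

[code]
import time

TICK_RATE = 60            # fixed simulation rate, independent of the CPU clock
DT = 1.0 / TICK_RATE

def update(dt):           # advance game state by dt seconds (stub)
    pass

def render():             # draw the current state (stub)
    pass

def game_loop():
    previous = time.perf_counter()
    accumulator = 0.0
    while True:
        now = time.perf_counter()
        accumulator += now - previous   # real elapsed time, not CPU cycles
        previous = now
        while accumulator >= DT:        # run as many fixed ticks as needed
            update(DT)
            accumulator -= DT
        render()
[/code]

A loop like this behaves the same whether the CPU runs at 1GHz or 2GHz; only the frame rate changes, not the simulation speed.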
 

Raet

Member
How would 16 nm or Pascal make any sense at all with the rumored clock speeds from Eurogamer? Pascal can be clocked a lot higher, so the only way I see that working out is if Eurogamer's report is based on the older July devkits. And that's doubtful considering their report is from December, so likely to be info on final dev kits.
 
How would 16 nm or Pascal make any sense at all with the rumored clock speeds from Eurogamer? Pascal can be clocked a lot higher, so the only way I see that working out is if Eurogamer's report is based on the older July devkits. And that's doubtful considering their report is from December, so likely to be info on final dev kits.

Saves on battery life in portable mode? Pascal cards, even at higher clock speeds, use less power than Maxwell cards.
 

Rolf NB

Member
That's not what using AOSP would mean, as it's just the operating system codebase. You can create an operating system from Android source code, deploy, and use it without ever contacting Google.


That being said, I doubt the Switch's OS is based on AOSP, as there's not really any code there that would be useful for a gaming OS over more basic UNIX codebases.
Well it is tuned to run with very little memory. I have tons of shit running on my phone, and it currently uses 720MB RAM.
I don't think starting with a more generic all-purpose Linux codebase would get you there.
 

Raet

Member
Saves on battery life in portable mode? Pascal cards, even at higher clockspeeds, use less power than Maxwell cards.
Possible, but even a Maxwell 20/28 nm chip clocked as low as Eurogamer reports in portable mode would have good battery life. If they went with Pascal, they could afford to clock it higher than what was reported and still have decent battery life.
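
The reasoning behind that is basically the dynamic power relationship: switching power scales roughly with capacitance x voltage² x frequency, and lowering the clock usually lets you lower the voltage too, so the savings compound. A toy illustration in Python; the voltage figures below are made up purely to show the shape of the curve, not real TX1 numbers:

[code]
# Rough dynamic-power scaling: P ~ C * V^2 * f (relative to a baseline).
def relative_power(voltage_ratio, freq_ratio):
    return voltage_ratio ** 2 * freq_ratio

print(relative_power(1.0, 1.0))   # 1.00 -> baseline clock and voltage
print(relative_power(1.0, 0.5))   # 0.50 -> half the clock, same voltage
print(relative_power(0.8, 0.5))   # 0.32 -> half the clock at a reduced voltage
[/code]

Which is why even an older 20/28 nm chip can land at an acceptable handheld power draw if the clocks are pulled down far enough, and why a 16nm Pascal part could afford higher clocks in the same power envelope.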
 
Possible, but even a Maxwell 20/28 nm chip clocked as low as Eurogamer reports in portable mode would have good battery life. If they went with Pascal, they could afford to clock it higher than what was reported and still have decent battery life.

Considering the backlash Nintendo took over battery life for both the launch 3DS and the Wii U (both of which I still have), I'd say that, given the opportunity, I'd want as much battery life out of the thing as I could get.

But, then again, I'm an audio engineer - not an electrical one, so what do I know?
 

Refyref

Member
Well it is tuned to run with very little memory. I have tons of shit running on my phone, and it currently uses 720MB RAM.
I don't think starting with a more generic all-purpose Linux codebase would get you there.

System kernels in general take up small amounts of system memory. AFAIK, the Android kernel is not notably less memory-intensive than the Linux or BSD kernels. This part doesn't matter much, because with the system reserving hundreds of MBs for the OS, kernel size is not a problem. (And obviously, they'll delete all the parts that aren't relevant to their system.)
 

Hermii

Member
Fuck, this is a repeat of the Wii U "same technology as Watson" thing where people thought that meant that the Wii U was going to use a Goddamn quad-core POWER7 CPU.

EDIT: To be fair, very few people had seen how small the Wii U was until E3 2012.
Not really the same thing. It would have made perfect sense for the Switch to use Pascal.
 

Vena

Member
Not really the same thing. It would have made perfect sense for the switch to use pascal.

Pascal and a Pascal Tegra (P1) are different things; the Switch can still use the minor changes that Pascal saw over Maxwell in the GPU without ever actually becoming Pascal on 16nm under a P1 Tegra designation.

In fact, the lack of a P1 in the Shield and the, at one point, expectation of a Pascal P1 in the Switch, or "Nintendo was looking into Pascal" from Nate, could well explain the fall-through we had a month ago now. It's looking like the Tegra P1 just never manifested outside of the PX2 (Denver cores, and a rather large chip to boot) drive platform for cars and could/would not be made into a smaller form factor. It's become very clear that Pascal was nothing but a stop-gap due to Volta's delays. There may well not even be a reason for either nVidia or Nintendo to have pursued making an official P1, and the V1 will be the next actual Tegra branch.

If the N1 (Nintendo Tegra) is "Maxwell Gen 3", a derivative/amalgamation of Maxwell Gen 2 plus Pascal features, with or without a 16nm die shrink, we may well have wrapped up most if not all of the mysteries around this device's rather weird history of architecture talk, barring the fans.

Of course, we also have no idea which dev kit's clocks Eurogamer reported, there's the "mysteriously stronger" dev kits in October... and we haven't heard a word on changes for the November dev kits that were reported near the end of last year. So there are still a lot of details floating in the air with no real information on them, or at least a lack of clarity about where the information we have has come from, as no timetable has been ascribed to Eurogamer's information relative to the at least two different dev kit "powers" and three different dev kit release windows we know of: July, October (boost in performance), and November (no information).
 

Hermii

Member
Pascal and a Pascal Tegra (P1) are different things; the Switch can still use the minor changes that Pascal saw over Maxwell in the GPU without ever actually becoming Pascal on 16nm under a P1 Tegra designation.

In fact, the lack of a P1 in the Shield and the, at one point, expectation of a Pascal P1 in the Switch, or "Nintendo was looking into Pascal" from Nate, could well explain the fall-through we had a month ago now. It's looking like the Tegra P1 just never manifested outside of the PX2 (Denver cores, and a rather large chip to boot) drive platform for cars and could/would not be made into a smaller form factor. It's become very clear that Pascal was nothing but a stop-gap due to Volta's delays. There may well not even be a reason for either nVidia or Nintendo to have pursued making an official P1, and the V1 will be the next actual Tegra branch.

If the N1 (Nintendo Tegra) is "Maxwell Gen 3", a derivative/amalgamation of Maxwell Gen 2 plus Pascal features, with or without a 16nm die shrink, we may well have wrapped up most if not all of the mysteries around this device's rather weird history of architecture talk, barring the fans.

Of course, we also have no idea which dev kit's clocks Eurogamer reported, there's the "mysteriously stronger" dev kits in October... and we haven't heard a word on changes for the November dev kits that were reported near the end of last year. So there are still a lot of details floating in the air with no real information on them, or at least a lack of clarity about where the information we have has come from, as no timetable has been ascribed to Eurogamer's information relative to the at least two different dev kit "powers" and three different dev kit release windows we know of: July, October (boost in performance), and November (no information).
The Eurogamer article specified these will be the clocks at launch, so I'm pretty sure it was the final one.
 
Anything remotely touched by Google I do not want to touch myself. I do not trust it, nor do I want to be near it. I know what AOSP is, and with Google being a part of it I simply do NOT trust it. PERIOD!

Tinfoil hat much? Sure... but I'd rather stay away from things like Facebook, Trumper.. erm, I mean Twitter, and anything that farms my personal information. Like GMail, which you are FORCED to deal with as people get GMail accounts and use them, so if you send an email to them, you are still known to those feckers.

It thus wouldn't be that surprising that in the "Android Open Source Project" there are some remnants that collect data or phone home... and I want NOTHING of it.

Are you posting from an Amiga or something?
 

atbigelow

Member
I think it's fair to say that whatever process is being used for the new Shield is the same process being used for the Switch.

The previous Shield was on 20nm. That may or may not be a safe assumption.

Honestly, god knows what to expect. I'm to the point of just gritting my teeth for a week and not trying to make any more guesses or read anything.
 

AlStrong

Member
Considering the backlash Nintendo took over battery life for both the launch 3DS and the Wii U (both of which I still have), I'd say that, given the opportunity, I'd want as much battery life out of the thing as I could get.

But, then again, I'm an audio engineer - not an electrical one, so what do I know?

Gotta balance with cost, the bane of all engineers. ;)
 

Hermii

Member
I know marketing speak and creative accounting are a thing, but even if the 500 man-years figure is highly exaggerated, there has to be some part of this chip that's heavily customized.
 
I know marketing speak and creative accounting are a thing, but even if the 500 man-years figure is highly exaggerated, there has to be some part of this chip that's heavily customized.

What if all that was customized was the clock speeds, lol, and the rest of the man-hours went into the API?
 

EDarkness

Member
I know marketing speak and creative accounting are a thing, but even if the 500 man-years figure is highly exaggerated, there has to be some part of this chip that's heavily customized.

My feeling is that the chip Nintendo is using isn't Maxwell... it's something else, but has Maxwell roots. It may end up being the Frankenstein's monster of X1 chips. A mix of Maxwell things, Pascal things, and GeForce things. Which may be why it needs a fan even when undocked. I guess we'll see next week.
 

Hermii

Member
My feeling is that the chip Nintendo is using isn't Maxwell... it's something else, but has Maxwell roots. It may end up being the Frankenstein's monster of X1 chips. A mix of Maxwell things, Pascal things, and GeForce things. Which may be why it needs a fan even when undocked. I guess we'll see next week.
We won't see until someone does a teardown.
 

Mr Swine

Banned
My feeling is that the chip Nintendo is using isn't Maxwell... it's something else, but has Maxwell roots. It may end up being the Frankenstein's monster of X1 chips. A mix of Maxwell things, Pascal things, and GeForce things. Which may be why it needs a fan even when undocked. I guess we'll see next week.

My feeling is that this is Kepler, on a 20/28nm node with Maxwell and Pascal features


[spoiler]I'm obviously joking here but yeah, could be closer to Kepler performance-wise than X1[/spoiler]
 

EDarkness

Member
My feeling is that this is Kepler, on a 20/28nm node with maxwell and pascal features


I'm obviously joking here but yeah, could be closer to Kepler performance-wise than X1

I don't think it's Kepler. If it was, then we would be hearing about that. The base was a Jetson X1 board, so it's some variation of that.
 

Bert

Member
Anything remotely touched by Google I do not want to touch myself. I do not trust it, nor do I want to be near it. I know what AOSP is, and with Google being a part of it I simply do NOT trust it. PERIOD!

Tinfoil hat much? Sure... but I'd rather stay away from things like Facebook, Trumper.. erm, I mean Twitter, and anything that farms my personal information. Like GMail, which you are FORCED to deal with as people get GMail accounts and use them, so if you send an email to them, you are still known to those feckers.

It thus wouldn't be that surprising that in the "Android Open Source Project" there are some remnants that collect data or phone home... and I want NOTHING of it.

Better get off GAF dude: https://en.wikipedia.org/wiki/VBull...gle_AdSense_integration_through_vBulletin.com
 
Anything remotely touched by Google I do not want to touch myself. I do not trust it, nor do I want to be near it. I know what AOSP is, and with Google being a part of it I simply do NOT trust it. PERIOD!

Tinfoil hat much? Sure... but I'd rather stay away from things like Facebook, Trumper.. erm, I mean Twitter, and anything that farms my personal information. Like GMail, which you are FORCED to deal with as people get GMail accounts and use them, so if you send an email to them, you are still known to those feckers.

It thus wouldn't be that surprising that in the "Android Open Source Project" there are some remnants that collect data or phone home... and I want NOTHING of it.

Bitch, the US government already has everything they could ever want on you. Your eating habits, fetishes, film preferences, everything.
 

Vic

Please help me with my bad english
I know marketing speak and creative accounting are a thing, but even if the 500 man-years figure is highly exaggerated, there has to be some part of this chip that's heavily customized.
Why should we even believe that the chip is strongly based on the X1? Tegra chips since the K1 aren't really special other than the particularity of Nvidia fitting their GPU technology (Kepler for the K1, then Maxwell for the X1) into SoCs for mobile devices. Other than that, they are pretty much typical ARM-based SoCs for the most part. The main benefit for Nvidia of having their GPU micro-architecture on SoCs is that they work very nicely with their CUDA framework for parallel computing, which is heavily used for automation tasks in embedded systems. The Switch has different priorities, therefore I wouldn't be surprised if we see certain aspects of the system configuration being more similar to GeForce chips for laptops than to the Tegra X1 (more than 2 SMs, wider memory bus, etc.).
 
Anything remotely touched by Google I do not want to touch myself. I do not trust it, nor do I want to be near it. I know what AOSP is, and with Google being a part of it I simply do NOT trust it. PERIOD!

Tinfoil hat much? Sure... but I'd rather stay away from things like Facebook, Trumper.. erm, I mean Twitter, and anything that farms my personal information. Like GMail, which you are FORCED to deal with as people get GMail accounts and use them, so if you send an email to them, you are still known to those feckers.

It thus wouldn't be that surprising that in the "Android Open Source Project" there are some remnants that collect data or phone home... and I want NOTHING of it.

Never go full Stallman.
 
Why should we even believe that the chip is strongly based on the X1? Tegra chips since the K1 aren't really special other than the particularity of Nvidia fitting their GPU technology (Kepler for the K1, then Maxwell for the X1) into SoCs for mobile devices. Other than that, they are pretty much typical ARM-based SoCs for the most part. The main benefit for Nvidia of having their GPU micro-architecture on SoCs is that they work very nicely with their CUDA framework for parallel computing, which is heavily used for automation tasks in embedded systems. The Switch has different priorities, therefore I wouldn't be surprised if we see certain aspects of the system configuration being more similar to GeForce chips for laptops than to the Tegra X1 (more than 2 SMs, wider memory bus, etc.).

Well, the dev kits were reported to be vanilla TX1s, so it wouldn't be helpful if the final hardware deviated too far from that. The final kits are also reported to be not much different from the former, but more powerful. This implies that whatever changes happened are not that visible to developers.
 

Net

Member
Occam's razor.

What do you think of Thraktor's suggestion that 28nm is a very realistic possibility for the Switch Tegra?

Here's a list of companies still making 20nm SoCs:

- MediaTek

That's it, and it's only a single die, which they introduced in late 2015 with their X20/X25 and are now using for the X23/X27. Nvidia is just about still selling 20nm TX1 based devices, but I don't imagine the chip is still being fabricated, given the Shield TV is about to be replaced. In fact, I'm not aware of a single new 20nm chip going into production over the whole of 2016, let alone 2017. With the exception of a handful of high-end mobile SoCs over 2014-2015, everyone's either moving straight to finfet or sticking on 28nm.

There doesn't seem to be much of a reason to use 20nm. The price is too close to 16nm and the performance is too close to 28nm. If it weren't for the fact that TX1 is fabbed on 20nm we probably wouldn't even be considering it. That's not to say it's impossible, though. Perhaps TSMC is offering an exceptionally good deal to use up their remaining 20nm capacity.

The clock speeds we are aware of are perfectly doable on 28nm, and if Nintendo's goal is to make Switch as affordable as possible from day one then 28nm would seem like the sensible choice.
 

10k

Banned
Anything remotely touched by Google I do not want to touch myself. I do not trust it, nor do I want to be near it. I know what AOSP is, and with Google being a part of it I simply do NOT trust it. PERIOD!

Tinfoil hat much? Sure... but I'd rather stay away from things like Facebook, Trumper.. erm, I mean Twitter, and anything that farms my personal information. Like GMail, which you are FORCED to deal with as people get GMail accounts and use them, so if you send an email to them, you are still known to those feckers.

It thus wouldn't be that surprising that in the "Android Open Source Project" there are some remnants that collect data or phone home... and I want NOTHING of it.
Jesus....
 

jdstorm

Banned
Cost and yields; unless TSMC apparently gave a "good deal", I don't see why it'd be better to use 20nm when it's barely used anymore.

You don't have to worry about cache in relation to the node. Plus, we've never speculated that the GPU would use eDRAM; it could just have more cache or eSRAM.

Nodes aren't really preventing a 128-bit bus from happening, but it's not even known if Nintendo even needs one relative to whatever the final specs will be.

The rumors concerning the Nvidia deal were that Nvidia gave Nintendo a great deal because they were about to default on a large production agreement for 20nm chips.

Most of that has held up so far.
 

saskuatch

Member
The rumors concerning the Nvidia deal were that Nvidia gave Nintendo a great deal because they were about to default on a large production agreement for 20nm chips.

Most of that has held up so far.

So essentially, Nintendo went to the garbage dump and got what was available for the Switch.
 