
Nintendo Switch Dev Kit Stats Leaked? Cortex A57, 4GB RAM, 32GB Storage, Multi-Touch.


Thraktor

Member
Reading up on Huawei's new Kirin 960, I'm starting to hope that Nintendo & Nvidia have managed to squeeze in the new A73 core (and TSMC's new 16FFC process) for Switch's SoC. Performance per clock isn't that much higher than the A72 (perhaps ~15% according to early Geekbench results), but the A73's improved power efficiency combined with 16FFC's reduced power draw could actually give them a quite sizeable performance boost over A72/16FF+ at the power draw Nintendo's likely to be targeting.

I've put together a graph of estimated power curves to illustrate the improvement. The A72 and A53 curves on 16FF+ are from AnandTech's Kirin 950 analysis, so should be pretty accurate. The dashed lines are my estimates based on simple extrapolation from those results. I've used ARM's claim of about a 25% reduction in power draw from A72 to A73, but the 16FFC savings are a little trickier, as I haven't been able to find any decent figures for what to expect for this kind of chip. TSMC themselves have claimed as much as 50% power savings for 16FFC over 16FF+, but this is likely for ultra-low-power IoT chips, which can take advantage of 16FFC's lower minimum voltage. I've ended up going with a 20% power saving, as it lines up with Huawei's claim that the Kirin 960 is 40% more power efficient than its predecessor.
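The 20% figure can be sanity-checked with some quick arithmetic, reading Huawei's "40% more power efficient" as 40% less power at the same performance (one possible reading of a vendor claim; these are estimates, not measurements):

```python
# Back-of-envelope check of the assumed 16FFC power saving.
# All figures are vendor claims / estimates from the post, not measurements.

a73_core_saving = 0.25   # ARM: A73 draws ~25% less power than A72 at the same clock
kirin_960_saving = 0.40  # Huawei's "40% more power efficient", read as 40% less
                         # power at the same performance

# Kirin 960 moves both core (A72 -> A73) and process (16FF+ -> 16FFC) at once,
# so the process saving is whatever remains after the core saving:
process_factor = (1 - kirin_960_saving) / (1 - a73_core_saving)
process_saving = 1 - process_factor

print(f"Implied 16FFC saving over 16FF+: {process_saving:.0%}")
```

Under that reading the two claims line up almost exactly with a 20% process saving, which is why it seems a reasonable middle ground between TSMC's headline 50% and no saving at all.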

a72_a73_16ffc_comparison.png


As you can see, the total benefit both from the move to 16FFC and the jump to the A73 on top of that is quite substantial. To take a single data point, with a 1W power draw, you could either get a cluster of A72s on 16FF+ at 1.2GHz, or a cluster of A73s on 16FFC at 1.6GHz. Taking A73's improved performance per clock into account, you could be looking at as much as a 50% improvement in performance at the same power draw.

The A53s in a big.LITTLE config would also get a boost from the move to 16FFC. With a 1.5W power draw on 16FF+, you could get 4x A72 @ 1.35GHz + 4x A53 @ 1.3GHz, whereas on 16FFC with A73s you could manage 4x A73 @ 1.75GHz + 4x A53 @ 1.5GHz. Even at a much lower power limit you could still get pretty decent performance; for example, at 750mW a 2x A73 @ 1.6GHz + 4x A53 @ 1.35GHz setup would outperform PS4's Jaguar at most single-threaded tasks, and would provide surprisingly respectable performance in heavily multi-threaded scenarios.
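For the curious, the dashed-line extrapolation amounts to something like the following: take a measured power/clock point, assume dynamic power grows roughly as f·V² with voltage rising with frequency (so P ≈ k·f³ as a crude model near the top of the curve), and apply the assumed core and process savings. A rough sketch with illustrative numbers, not AnandTech's actual data:

```python
def a73_16ffc_power(p_ref_w, f_ref_ghz, f_ghz, exponent=3.0,
                    core_factor=0.75, process_factor=0.80):
    """Estimate A73/16FFC cluster power at clock f_ghz from an A72/16FF+
    reference point, assuming P ~ f**exponent plus the two assumed savings."""
    return p_ref_w * (f_ghz / f_ref_ghz) ** exponent * core_factor * process_factor

# Invert for the iso-power clock: starting from 1.0 W @ 1.2 GHz on A72/16FF+,
# what A73/16FFC clock fits the same 1.0 W budget?
f_iso = 1.2 * (1.0 / (0.75 * 0.80)) ** (1 / 3.0)
print(f"{f_iso:.2f} GHz")  # ~1.42 GHz with a cube law
```

A cube law is pessimistic; a gentler exponent (power curves flatten at lower voltages) lands nearer the 1.6GHz figure quoted above, which is why the real curves have to be read off the graph rather than a single formula.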

It's also worth noting that what would keep Nintendo from adopting either of these new technologies isn't cost. In fact, both 16FFC and A73 are substantially cheaper than what they replace (16FFC is 10-20% cheaper per die than 16FF+, and A73 is ~25% smaller [i.e. cheaper] than A72). The question here is really the timescale. The first chips using 16FFC and containing A73 cores are only just starting to trickle out now, 4-5 months before Switch's launch. Whether Nintendo would have chosen either would depend on (a) what the console's release target was during SoC design and (b) how confident they were of either or both of these being ready for that target.

On part (a), it does seem likely that Nintendo's internal target for Switch's release was around now, and that was probably only changed early this year. From that point of view, A73 cores would seem a tight squeeze, and Nintendo may not even have had the option of using them during initial design, and may not have been willing to delay the tape-out by swapping them in once they became available. I don't think 16FFC would have been too tight a squeeze for a November 2016 launch, though. Nintendo would have known about TSMC's 16FFC plans from the beginning, and Nvidia have a very good relationship with TSMC, so they would have known exactly where yields were and how likely it was to be feasible for a late 2016 launch. It's also something Nintendo may have been more willing to risk, given the substantial cost and power savings for the entire SoC from moving to 16FFC. If Switch was always targeted for early 2017, though, or if the delay was in part to allow them to move to 16FFC and/or A73 cores, then all bets are off.

At this point I think 16FFC seems reasonably likely, but A73s may be a bit much to ask. Nonetheless it's interesting to consider the options open to Nintendo on the CPU front, given the need to keep performance high and power consumption low to make the system in any way competitive in handheld mode.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
I too think A73 could be a lucrative though improbable scenario, for two reasons - A73 the IP was announced only this summer. While partners must have been aware well in advance, the official announcement indicates readiness for mass licensing, i.e. both fab process and design tools availability for any interested parties. While that makes it just in time for early-2017 short-term products (read: 2017's crop of phones and tablets, to be superseded around the same time next year), it would be an unrealistically close call for a long/mid-term embedded device. We have the example of the Acer Chromebook R13, which is a fall '16/spring '17 product with a realistic market lifetime of at least 2 years, and a semi-custom sw ecosystem. That one is hosting the MT8173C, not because Acer/Mediatek couldn't have possibly implemented an A73 chip, but because A72 is the better choice for something you might have to support for the next few years.
 
Are they? I know there are some benchmarks, but how will that work in real gaming situations? And the PS4 has 8 cores. I fear 4× A57 will never get near the Jaguars.

The PS4/XB1 don't use all of the cores for gaming. The CPU in the Switch will also likely have some usage of shadow cores.

In either case, the chipset would probably have to include at least A72s if it's to match or surpass those Jaguars, unless there's a significant difference in clockspeed.
 

Thraktor

Member

The Chromebook R13 doesn't use a custom SoC, though; Acer were buying off the shelf, and there weren't any A73-based SoCs available to them. The Mediatek chip they went with is actually a rebadge of a 28nm chip announced last year (although it's nice to see some dual-A72 + dual-A53 SoCs out in the wild; they make far more sense for affordable mid-range devices than an octo-A53 design).

I should clarify that, if Nintendo had planned to release Switch in late 2016, and the delay was entirely about software and nothing else, then I absolutely wouldn't see A73 as being an option. Huawei just about managed to get an A73 powered device out by the end of the year, but given the margins in flagship phones (and the importance of benchmarks in the sales of those phones) it would seem a worthwhile risk to them, if not to Nintendo in the same timescale. It would also seem Huawei worked relatively closely with ARM to get this done, as they were in a similar situation with A72 devices a few months before competitors.

If, however, Nintendo had always planned to release Switch in early 2017, or it had been delayed specifically to allow them to delay tape-out of the SoC, then A73 may become an option. It would be a roughly similar lead-time as Shield TV was from the first A57 device, and ARM's first (and afaik only) POP implementation of the A73 is on TSMC's 16FFC process. ARM and TSMC also worked quite closely on the A73, as the first A73 test chip was fabbed on 16FFC. With Nvidia as one of, if not TSMC's largest customer, it's well within reason that TSMC would have been pushing the A73 quite early on based on this.

I've also noticed that ARM have actually commented on A73's voltages on 16FFC, with it apparently hitting 3GHz at 1V, but more pertinently for our discussion, 1.8GHz at 720mV. This compares to about 1.35GHz at 720mV for the A72 on 16FF+, so seems roughly in line with my estimates above.
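As a quick cross-check, ARM's iso-voltage figures imply essentially the same clock uplift as the 1W data point from the estimated curves above (figures as quoted in the post):

```python
# Both nodes at 720 mV: A73/16FFC reportedly hits 1.8 GHz where A72/16FF+
# manages ~1.35 GHz.
arm_uplift = 1.8 / 1.35
# The 1 W data point from the estimated power curves: 1.6 GHz vs 1.2 GHz.
graph_uplift = 1.6 / 1.2
print(f"ARM: {arm_uplift:.2f}x, graph estimate: {graph_uplift:.2f}x")  # both ~1.33x
```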
 
I guess you could also say that Nintendo contributed a fair amount to that growth. And over the next – as you know, the Nintendo architecture and the company tends to stick with an architecture for a very long time. And so we've worked with them now for almost two years. Several hundred engineering years have gone into the development of this incredible game console. I really believe when everybody sees it and enjoys it, they're going to be amazed by it. It's really like nothing they've ever played with before. And of course, the brand, their franchise and their game content is incredible. And so I think this is a relationship that will likely last two decades and I'm super excited about it. - Jen-Hsun Huang, Nvidia CEO


So 2 years' worth of dev time on the custom SoC.. not sure if that reveals anything in terms of which CPU / APU this thing could be using?
 

Clessidor

Member
Question: In the Q&A which Thraktor posted in the other thread Jen-Hsun Huang also made the following quote:
Jen-Hsun Huang said:
(...)The quality of games has grown significantly. And one of the factors of production value of games that has been possible is because the PC and the two game consoles, Xbox and PlayStation, and in the future – in the near-future, the Nintendo Switch, all of these architectures are common in the sense that they all use modern GPUs, they all use programmable shading and they all have basically similar features.

They have very different design points, they have different capabilities, but they have very similar architectural features. As a result of that, game developers can target a much larger installed base with one common code base and, as a result, they can increase the production quality, production value of the games.(...)
Does this tell us anything new or worthy to talk about? I just saw it, and I'm not sure what he is actually talking about, to be honest ^^'
 

Thraktor

Member

Not really, all he's saying is that all graphics hardware, between PCs and consoles (whether AMD or Nvidia) has very similar high-level architecture and feature-sets nowadays, so developing cross-platform games is much easier than it would have been in the past (particularly pre-unified shaders). All it really tells us is that Switch won't use a graphics architecture that's more than ten years old, which I think was a safe enough bet already.
 

I guess the only real clue or hint on the specs for the system is that they have been working with Nintendo for over two years.

Considering that Tegra Shields are no longer going to be released, does it make sense for Nintendo to have the "next-generation" Tegra chips in Switch that would have been used for the successor of Shield TV?
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
Well, yes and no. Both MT8173C and RK3288C (in the Asus Flip C100) are variations of stock SoCs exclusively for chromebook purposes - those chips never get used in anything else, and the customisations are never disclosed by the vendors (one guess would be AV codec IP). Anyhow, I did not mean to say that Mediatek had an available A73 part (they clearly didn't), but that if that was a priority for them, they would've been able to prepare an A73 SoC before 2017 - instead they decided to wait for the next fab stepping for their next flagship SoC and focus on the support side of things for their chromebook - something perfectly natural for that kind of product. But to emphasise - Mediatek are a major Cortex customer, just not an early adopter of new design/fab-node combos. In contrast, Huawei are the early adopter type.

But to try and translate some of that to the Switch - while I have absolutely no doubt that nintendo and nv would have been briefed on the A73 roadmap well in advance, I'm estimating the chances of nintendo actually choosing to go A73 as low, just because I deem nintendo to be more on the Mediatek side of the risk-taking spectrum than on the Huawei side. I hope I don't sound too much 'lol, nintendo' here - that's not my intention, just expressing my view that the Switch is logistically closer to chromebooks than to phablets. And the new crop of chromebooks (both Acer's and the expected Samsung's) are all A72-based, and that's reasonable for the given timespan and nature of the product, despite A73 being the clear specs winner.

No debate here - if Switch was originally a holiday '16 product then A73 would've been non-material to nintendo. I'm just sceptical of how much A73 would've been on the map even for the March '17 Switch. Yes, I'm recognizing the fact nintendo have a strong case for considering the A73.

That's quite interesting, actually. 3GHz on 16FFC is quite impressive, given 16FF+ continues to be the 'performance' node, while 16FFC is meant to be the power-performance tradeoff node. Cool find.
 

ozfunghi

Member
I always feel good about myself when i understand what Thraktor and Blu are talking about.

Also, we've always seen arguments (from posters) about using 16FF+. Are you saying, Thraktor, that it would make more sense for Nintendo to go with 16FFC? And only if they were to opt for the A73, or in general?
 

ultrazilla

Gold Member

It's all tech stuff that I haven't studied so don't know much about it. That said, I love it and try to always make sure I read their stuff and learn from it. :)
 
I don't know shit about DBZ lately but apparently Vegeta is the strongest SSJ
No, Goku handled a ki blast struggle against the bad guy by himself in the last episode, while Vegeta teamed up with Trunks earlier in the same situation.

Back on topic: it's hard to predict what Nintendo will choose for hardware, so I'm keeping my expectations low on the specs. We got burned pretty bad in the Wii U days, when the dev leaks came in with a bad CPU and we were thinking the Wii U would be significantly more powerful than 360/PS3, but we got awfully slow RAM and CPU.

Something as modern as an A73 seems like a pipe dream this late in the game. A combo of A57 and A72 is more likely. I'd like more GFLOPs, but those Mario Kart 8 and Breath of the Wild ports worry me.
 

Rodin

Member
A72 is basically a newer version of the A57; there's going to be only one of them in the console (A57), possibly (hopefully) with some A53 or A35 cores for the OS. Don't expect anything visibly better than what's in the OP anyway. That being said, I'm not sure what's worrying about the BotW and MK8 ports. There's simply no way we can judge them from the Switch video.

Also in case you're still wondering after all these years, bandwidth wasn't an issue at all on Wii U. The RAM subsystem was pretty good if you took advantage of it.
 

AzaK

Member

It did require the eDRAM to make it usable, though.
 

Schnozberry

Member
That's the thing, though; the system's memory setup was thoroughly built around the eDRAM. It will be interesting to see what Nintendo will do now, since eDRAM is likely not an option.

Probably spend money on a 128-bit bus for their DDR4, or a custom cache layout. Maybe even both. Nintendo hates memory latency with a passion.
 

Thraktor

Member

Whatever's in Switch will have been designed specifically for Nintendo. Nvidia's other "next-generation" Tegras like Parker and Xavier are now entirely automotive-focussed.


I agree that Nintendo are going to be more in the Mediatek camp than the Huawei camp when it comes to adopting new ARM cores, which is why I wouldn't even consider it a possibility for a late 2016 launch, but in the (perhaps unlikely) scenario that a March 2017 launch was always intended, then you move out of early adopter territory, and it starts to become a possibility. Not likely, but a possibility.


The 16FFC process and the A73 are independent of each other (although A73 would seem more likely on 16FFC than 16FF+, given ARM's POP implementation of the core is on 16FFC). I'd consider 16FFC reasonably likely, as a 10-20% reduction in the cost of the die, combined with perhaps a 20% power saving for the entire SoC, would make the process very attractive to Nintendo, and they would have known about it and been able to plan for it from day 1 of the design process. A73 I'd consider a lot less likely, as the timing would have been much tighter.


I don't see any reason for them to go with A57s. We know that design work started end-2014/start-2015, so the A72 would have been available to them from the start. It's cheaper, higher performance, and draws less power, so there's no reason to believe they would have decided against it.
 
I'm really hoping they go with 6GB-8GB of RAM at the last minute. I really don't want this thing bottlenecked - particularly for ports - but I also don't want a slow OS, and would like more RAM for that and multitasking.

Wii U's OS was awfully slow.
 

Not sure if more RAM is the only answer to that. The Wii U had a gigabyte of RAM dedicated to the OS, and it ran worse than the Xbox 360 and PS3 - which had 32MB and 96MB dedicated to their OSes respectively.
 

FyreWulff

Member

It got sped up after an update significantly. Apparently they compiled the original OS release with debug stuff enabled :lol
 
Looks pretty fake to me.

Though personally I'm more concerned about that Corrin poster in the background and why he has that up there....
 
NOO Etika what are you doing!? Even if it's just the stand in shell used in the commercial or a dev kit, Nintendo's going to go HAM on you for even showing that off.
 

I see. Are we assuming that the dev kits with the A57s are running at max power (2GHz CPU and 1GHz GPU), but the final product will likely use A72s at a lower frequency to match close to that power level?
 

Switching between channels/games on the Wii home menu was god-awfully slow. I hope Nintendo just improves their OS in general. We should be able to chat with friends without having to wait 30 seconds to a minute switching channels just to message them. It was such an awful idea to show people on the friends list just to see who was online, without being able to contact them there at all, forcing us to go to the Miiverse channel to send messages. Like, wtf.
 

ultrazilla

Gold Member
I'm really hoping they go with 6GB-8GB of RAM at the last minute. I really don't want this thing bottlenecked, particularly for ports, but I also don't want a slow OS, and I'd like more RAM for that and for multitasking.

Wii U's OS was awfully slow.

Gotta agree with this. I think Nintendo should have stepped up and included 8 gigs myself.
 
Gotta agree with this. I think Nintendo should have stepped up and included 8 gigs myself.
In the end, if we are talking about 3.2GB available for games vs the PS4's 5.5GB, perhaps it's not too big of an issue. The Wii U was able to pull off some big worlds at a fraction of that amount, and the PS4 Pro barely adds any more RAM over the base console despite pushing for 4K, for example.

How fast would these cards/cartridges be over the HDD and Blu-Ray player of the other consoles?
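For a rough sense of scale, here's a back-of-envelope sketch of load times at different sustained read speeds. Every throughput figure below is an assumption for illustration (ballpark numbers for optical drives and HDDs, and a pure guess for the game card), not a measured spec:

```python
# Hypothetical load-time comparison; all throughputs are hedged assumptions.
# Solid-state game cards should also win big on seek time, which this ignores.
ASSUMED_MB_PER_S = {
    "Blu-ray drive (ballpark avg)": 27.0,   # assumed
    "5400rpm HDD (ballpark)": 80.0,         # assumed
    "Switch game card (pure guess)": 50.0,  # assumed
}

def seconds_to_load(size_mb: float, throughput_mb_s: float) -> float:
    """Time to stream `size_mb` at a sustained throughput (ignores seeks)."""
    return size_mb / throughput_mb_s

for name, rate in ASSUMED_MB_PER_S.items():
    print(f"{name}: {seconds_to_load(1024, rate):.1f}s per GiB")
```

Even under these guessed figures, the point is that sequential throughput alone doesn't settle it; random access is where solid-state storage pulls far ahead of optical media.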
 

ECC

Member
Sure, more RAM is always nice, and without any other context you would always prefer the largest possible amount.

All that being said, discussing RAM in isolation is only helpful if we make assumptions about the rest of the memory subsystem (including caches and buses), as well as the amount of RAM allocated to gaming applications. Direct comparisons of RAM amounts are only somewhat helpful if the systems being compared do not differ too much in memory system architecture.

So what do I think about the Switch and 4 gigs of RAM? I have no idea what to think. But going by Nintendo's history, they have spent serious resources on the design of the memory subsystem/cache layout of their last three consoles.
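To illustrate why the surrounding memory system matters as much as raw capacity, a quick peak-bandwidth sketch. The PS4 and Wii U figures are widely published ballpark specs; the LPDDR4 configuration is purely hypothetical for the Switch:

```python
def peak_bandwidth_gb_s(bus_width_bits: int, transfer_rate_mt_s: int) -> float:
    """Peak theoretical bandwidth: (bus width in bytes) x (transfers/s)."""
    return bus_width_bits / 8 * transfer_rate_mt_s / 1000.0

# Public ballpark specs for PS4/Wii U; the LPDDR4 line is purely a guess.
configs = {
    "PS4 GDDR5 (256-bit @ 5500 MT/s)": (256, 5500),
    "Wii U DDR3 (64-bit @ 1600 MT/s)": (64, 1600),
    "Hypothetical LPDDR4 (64-bit @ 3200 MT/s)": (64, 3200),
}
for name, (bits, rate) in configs.items():
    print(f"{name}: {peak_bandwidth_gb_s(bits, rate):.1f} GB/s")
```

That order-of-magnitude gap between configurations is exactly why caches and bus design can matter more than whether the pool is 4GB or 8GB.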
 
Does Laura Dale's latest rumour about retail price give any broad hint as to where the final specs may end up?

Specs regarding GPU/CPU don't really impact price nearly as much as everyone thinks. The most expensive parts of typical consoles are the optical drives and the hard drives*, so it's not surprising (to me anyway) that the Switch will have a relatively low launch price. Solid state storage is a big pricing factor though, as is (I believe) RAM. So a low launch price might suggest less storage (like the rumored 32GB) and less RAM (like the rumored 4GB).

But I don't think we can glean any GPU/CPU info from pricing.

*Edit: I'm not 100% sure about the above, though that's the conclusion based on things I've read regarding BoM on most consoles.
 
If they had two SKUs, one with 32 gigs of storage and one with 64 gigs, would you be happy?


Are we saying that in order for this to be perfect, it needs 6 gigs of RAM instead of 4 and 128 gigs of storage instead of 32/64?
 

FoxSpirit

Junior Member
My god...what even IS that twitter account? It's hundreds of posts a day about absolutely nothing... How would they even have time for games? Let alone be selected by Nintendo to get a switch to try above media outlets?
You saw that wrong. The tweet is about a deleted tweet from an account that posts 1-2 times a day max.
 

Thraktor

Member
I see. Are we assuming that the dev kits with the A57s are running at max power (2GHz CPU and 1GHz GPU), but the final product will likely use A72s at a lower frequency to match close to that power level?

I honestly have no idea. I would generally assume that they'd be somewhat conservative with clock speeds in early dev kits, as it's easier for developers to accommodate a surprise increase in performance than a surprise decrease. This is what they did with the Wii U, as they increased clock speeds by about 25% shortly before launch, once they had final systems to test thermal performance, etc. and they could be confident about their ability to clock that high.

Switch is a little different, though, as clock speeds aren't going to be held back by thermal limits so much as battery life constraints. There's also the consideration of having different clock speeds for docked and portable mode and how they relate to each other (for example Nintendo may want to keep CPU clocks close, or even identical, between the two modes, but may target a 2x, or even higher, difference between GPU clocks). The claims that TX1-based dev kits required noisy cooling would suggest that they were clocked quite high, but it's difficult to infer too much from that.
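The docked/portable split above can be put into rough numbers. Assuming a TX1-like Maxwell GPU (256 CUDA cores and double-rate FP16 are public TX1 figures) and a purely speculative 2x clock gap between modes, a quick sketch:

```python
CUDA_CORES = 256  # Tegra X1's Maxwell GPU (public spec)

def gflops_fp32(clock_ghz: float, cores: int = CUDA_CORES) -> float:
    """Peak FP32 throughput: cores x 2 FLOPs per cycle (FMA) x clock."""
    return cores * 2 * clock_ghz

# Speculative example: 1.0 GHz docked vs a 2x lower portable clock.
docked_ghz, portable_ghz = 1.0, 0.5
print(f"Docked:   {gflops_fp32(docked_ghz):.0f} GFLOPS FP32")
print(f"Portable: {gflops_fp32(portable_ghz):.0f} GFLOPS FP32")
# TX1 supports double-rate FP16, so FP16 peaks are simply 2x these figures.
```

The clocks here are illustrative, not leaked figures; the point is just that peak throughput scales linearly with GPU clock, so a 2x clock gap means a clean 2x performance gap between modes.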
 
I honestly have no idea. I would generally assume that they'd be somewhat conservative with clock speeds in early dev kits, as it's easier for developers to accommodate a surprise increase in performance than a surprise decrease. This is what they did with the Wii U, as they increased clock speeds by about 25% shortly before launch, once they had final systems to test thermal performance, etc. and they could be confident about their ability to clock that high.

Switch is a little different, though, as clock speeds aren't going to be held back by thermal limits so much as battery life constraints. There's also the consideration of having different clock speeds for docked and portable mode and how they relate to each other (for example Nintendo may want to keep CPU clocks close, or even identical, between the two modes, but may target a 2x, or even higher, difference between GPU clocks). The claims that TX1-based dev kits required noisy cooling would suggest that they were clocked quite high, but it's difficult to infer too much from that.
I see. Thanks for your feedback. I'm very curious how different the final product will be compared to the dev kits. I originally figured those TX1 chips were overclocked, but we still don't have much to go on except the notably loud fans.

Going by what you were saying, is it possible that just the GPU is overclocked?
 

20cent

Banned
This is not an Apple device nor even a Sony one; it comes with a slot for external memory, and that's a MicroSD one. You will be fine with built-in 32GB storage. Jeez..
 
If they had two SKUs, one with 32 gigs of storage and one with 64 gigs, would you be happy?


Are we saying that in order for this to be perfect, it needs 6 gigs of RAM instead of 4 and 128 gigs of storage instead of 32/64?

6-8GB of RAM with 0.75-1.0 TFLOPS FP32 and 1.5-2.0 TFLOPS FP16. The more storage the better. Hopefully it's 32-128GB, but ideally more. I'm not so worried about the lack of storage for myself, but it could turn other people away by forcing them to buy storage space.

That's my perfection
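On the storage side, a hedged back-of-envelope for how far each rumored tier stretches. Both the average game size and the OS reserve below are pure guesses for illustration:

```python
# Hedged back-of-envelope: whole games per rumored storage tier.
# The average game size and the ~15% OS reserve are assumptions, not specs.
AVG_GAME_GIB = 8
OS_RESERVE = 0.15  # assumed fraction of storage kept back for the system

def games_that_fit(total_gib: int) -> int:
    """Whole games that fit after subtracting the assumed OS reserve."""
    usable = total_gib * (1 - OS_RESERVE)
    return int(usable // AVG_GAME_GIB)

for tier in (32, 64, 128):
    print(f"{tier} GiB: ~{games_that_fit(tier)} games")
```

Under these assumptions, the 32GB tier only holds a handful of full downloads, which is why the MicroSD slot does so much of the heavy lifting in this discussion.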
 