
Nintendo Switch Dev Kit Stats Leaked? Cortex A57, 4GB RAM, 32GB Storage, Multi-Touch.

I'm trying to guess how Nintendo ended up switching partners from AMD to Nvidia that easily, considering the Dolphin architecture was co-developed with AMD.
They dragged it out over ~15 years and hurt the performance of two successive generations with the forced requirement of hardware-based backwards compatibility. I don't know that I'd consider that "that easily".
 

Mokujin

Member
I'm trying to guess how Nintendo ended up switching partners from AMD to Nvidia that easily, considering the Dolphin architecture was co-developed with AMD.

The Dolphin architecture was a custom design done by ArtX for Nintendo, so I'm pretty sure Nintendo owns the rights to Dolphin.

In fact, in all these years AMD has not done any notable work for Nintendo aside from rehashing the Dolphin/Gekko architecture (and I'm not blaming them; it has been Nintendo's fault, with their "withered technology" philosophy).

That, coupled with Nvidia having a notable advantage in mobile SoCs, made switching partners an easy and logical choice.

Miyamoto's recent comment about younger people being behind the Switch's design may also have helped a lot in moving away from the withered technology philosophy.
 
Hmm, I really doubt there's more than one fan in the handheld. I don't think that would make any sense at all. The article does seem to say that there is, by repeatedly referring to the handheld as "the system" and then talking about "the system's inbuilt fans".

But I still think it's just a mistake in the way it's been written.

The exact wording for the fan(s) is:

Plugging the system into the dock will also activate a small additional fan to help with cooling when run at that higher clock speed. This fan is in the rear of the dock, and there is a gap in the back of the dock to allow the system’s inbuilt fans to vent when docked.

Which seems pretty clear to me, though perhaps she didn't mean it that way. Maybe someone can ask LKD on Twitter?

That fan comment is ambiguous. The fan could be in addition to what's in the dock already. I would be very surprised (and annoyed) to find a fan in a handheld. Inevitably they'd get loose and start to make noise.

I agree that it's a very strange decision, but the wording in the article seems pretty clear to me. I'd rather they avoid any moving parts in the tablet if possible.

This is what I think as well; people are reading too much into the later quote.

Just looking at the Pixel C running a Tegra X1 @ 850MHz on 20nm passively, why would Switch need to be actively cooled? I'm expecting a lower clock speed undocked, and it may even be on 16nm.

I was asking about this a few pages ago. Now that we have (seemingly) a report stating that there are fans in the tablet (or at least one) in addition to the one in the dock, what do we think that means for target performance or target power draw when portable?

I guess we don't know for sure what Laura meant in the article though, so maybe we need some clarification first.
 

Hermii

Member
They dragged it out over ~15 years and hurt the performance of two successive generations with the forced requirement of hardware-based backwards compatibility. I don't know that I'd consider that "that easily".

This. They pretty much waited until they had no other choice than to leave it behind.
 

Luigiv

Member
Just looking at the Pixel C running a Tegra X1 @ 850MHz on 20nm passively, why would Switch need to be actively cooled? I'm expecting a lower clock speed undocked, and it may even be on 16nm.

Because the Pixel C can thermal throttle if need be; the Switch can't. It's possible that the Switch's internal fan only kicks in intermittently when needed (of course, it could be that it runs all the time, who knows).
 

Donnie

Member
To be clear, I'm not doubting at all that the LKD article is claiming two fans, as in one in the dock and one in the handheld. I think it's 100% clear that she's saying the dock has a fan AND the handheld also has a fan (at least one, anyway).

I was just doubting the suggestion that she thinks the handheld has two fans (as in one in the dock and two in the handheld, for a total of three fans). The way the article is written suggests that, but I'm convinced it's just a mistake. It would make zero sense AFAICS to have two fans in the handheld itself; I can't see a single reason for doing that.
 

Thraktor

Member
Thanks for that, very informative! At this point, now that we have more than one internal fan being rumored along with another fan in the dock, I'm starting to think this might have a bit of a bigger power envelope in portable mode than we previously thought. I wonder if HBM would essentially solve all of their present and future RAM issues while explaining some of the increased power draw.

Price seems like less of an issue than I previously thought, though as you said, those are very rough estimates. It'll certainly be interesting to see if they discuss any specs at the January event. I'm guessing if they did opt to use HBM then they'd want to publicly state such, as that could be an interesting selling point.

Of course I doubt they'd run into issues with 4GB of LPDDR4 like people have been saying here, but Nintendo has historically splurged on excessive RAM solutions. Who knows.

Well, the issue is that I don't see any reason they'd need HBM. They could get ~60GB/s from LPDDR4 at a lower cost and power draw, and from what we know about the expected performance level, and Pascal's power efficiency, that should be plenty. The only factor that would hint towards HBM (other than the TSMC report) is that Nintendo has such a history of using exotic, expensive memory. If they go with LPDDR4 it would be the first time since perhaps the SNES that Nintendo released a console without any kind of special memory. Of course, both management and hardware design staff have changed over the years, and Switch is their first clean break in terms of hardware design for a long time, so we shouldn't take it as some kind of iron-clad rule that they're always going to use expensive, fast memory.
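For what it's worth, that ~60GB/s figure is just a function of bus width and transfer rate; here's the arithmetic as a minimal sketch (the 128-bit bus and LPDDR4-3733 speed are illustrative assumptions, not leaked specs):

    # Peak LPDDR4 bandwidth = bus width (bytes) x transfer rate (MT/s)
    bus_bits = 128           # assumed: two 64-bit LPDDR4 channels
    transfer_mts = 3733      # assumed: LPDDR4-3733
    bandwidth_gbs = (bus_bits / 8) * transfer_mts / 1000
    print(f"{bandwidth_gbs:.1f} GB/s")   # ~59.7 GB/s

A single 64-bit channel at LPDDR4-3200 would give less than half that (~25.6GB/s), which is why the bus-width assumption matters as much as the memory type.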
 
Well, the issue is that I don't see any reason they'd need HBM. They could get ~60GB/s from LPDDR4 at a lower cost and power draw, and from what we know about the expected performance level, and Pascal's power efficiency, that should be plenty. The only factor that would hint towards HBM (other than the TSMC report) is that Nintendo has such a history of using exotic, expensive memory. If they go with LPDDR4 it would be the first time since perhaps the SNES that Nintendo released a console without any kind of special memory. Of course, both management and hardware design staff have changed over the years, and Switch is their first clean break in terms of hardware design for a long time, so we shouldn't take it as some kind of iron-clad rule that they're always going to use expensive, fast memory.

Well, if they are using 4GB of RAM, doesn't using much faster RAM sort of alleviate any problems that might occur due to the lower RAM amount (compared to their competitors, anyway)? I assume that would be the only reason they'd want to go with HBM, and I still agree that they very likely won't.

But wouldn't, say, 4GB of 256GB/s RAM pretty consistently perform better than, say, 6GB of 60GB/s RAM? Or is that just going to be case by case?
 

Vic

Please help me with my bad english
Well, the issue is that I don't see any reason they'd need HBM. They could get ~60GB/s from LPDDR4 at a lower cost and power draw, and from what we know about the expected performance level, and Pascal's power efficiency, that should be plenty. The only factor that would hint towards HBM (other than the TSMC report) is that Nintendo has such a history of using exotic, expensive memory. If they go with LPDDR4 it would be the first time since perhaps the SNES that Nintendo released a console without any kind of special memory. Of course, both management and hardware design staff have changed over the years, and Switch is their first clean break in terms of hardware design for a long time, so we shouldn't take it as some kind of iron-clad rule that they're always going to use expensive, fast memory.
A huge L3 cache shared by both the CPU & GPU like Apple's ARMv8-A based SoC architectures is also a possibility.

Edit: I'm wrong, only the CPU uses the 4MB of L3 cache in Apple's SoCs.
 

Mokujin

Member
I was asking about this a few pages ago. Now that we have (seemingly) a report stating that there are fans in the tablet (or at least one) in addition to the one in the dock, what do we think that means for target performance or target power draw when portable?

I guess we don't know for sure what Laura meant in the article though, so maybe we need some clarification first.

That would mean high clocks undocked, and I may be stubborn, but until I get final confirmation I won't believe there are fans on both the unit and the dock.

Because the Pixel C can thermal throttle if need be; the Switch can't. It's possible that the Switch's internal fan only kicks in intermittently when needed (of course, it could be that it runs all the time, who knows).

That is a good point, but I'm expecting quite low clocks undocked for better battery life, which wouldn't raise thermals too much, so at least from my point of view I remain sceptical about more than one fan.

I may be wrong about this, but even if I am, I'll be puzzled by that decision.
 
That would mean high clocks undocked, and I may be stubborn, but until I get final confirmation I won't believe there are fans on both the unit and the dock.

That is a good point, but I'm expecting quite low clocks undocked for better battery life, which wouldn't raise thermals too much, so at least from my point of view I remain sceptical about more than one fan.

I may be wrong about this, but even if I am, I'll be puzzled by that decision.

Yeah, it sure doesn't seem to square well with the rumor about 5-8 hours of battery life. Unless Nintendo has managed to fit in a truly massive battery, with the overall package still priced at $250.

All of these rumors (fans/power, battery life, price) taken together seem far too good to be true. Which means some likely aren't.
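As a quick sanity check on what 5-8 hours would imply, assuming a tablet-class battery (the 16Wh figure is a pure guess on my part):

    # Implied average system power draw for a given battery life
    battery_wh = 16.0        # assumed tablet-class battery capacity
    for hours in (5, 8):
        print(f"{hours} h -> {battery_wh / hours:.1f} W average draw")
    # 5 h -> 3.2 W, 8 h -> 2.0 W: little headroom for a chip hot enough to need fans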
 

vern

Member
Yeah, it sure doesn't seem to square well with the rumor about 5-8 hours of battery life. Unless Nintendo has managed to fit in a truly massive battery, with the overall package still priced at $250.

All of these rumors (fans/power, battery life, price) taken together seem far too good to be true. Which means some likely aren't.

8 hours is def too good to be true. I mentioned 4-5 under normal conditions at best, as that's what I've been told. Maybe 6 at the dimmest screen brightness with WiFi off and all that.
 
I think HBM is pretty much wishful thinking, TBH. It isn't that prevalent in the GPU market right now. The Fury line of video cards that take advantage of it were priced absurdly high, and Vega is approaching release sometime next year. This would literally be the first time an Nvidia device used HBM (outside of GP100, their research/compute product). I find it extremely unlikely for them to debut HBM in a mobile device like this.
 
8 hours is def too good to be true. I mentioned 4-5 under normal conditions at best, as that's what I've been told. Maybe 6 at the dimmest screen brightness with WiFi off and all that.

Have you heard anything about the GPU in terms of Gflop power? Even something vague like "50% of an Xbox One GPU" would give us some idea of where to keep expectations. Thanks.

I would be happy with 5 hours of battery life in terms of gaming at the highest brightness.
 

NateDrake

Member
Yeah, it sure doesn't seem to square well with the rumor about 5-8 hours of battery life. Unless Nintendo has managed to fit in a truly massive battery, with the overall package still priced at $250.

All of these rumors (fans/power, battery life, price) taken together seem far too good to be true. Which means some likely aren't.

Remember that the battery range was a target. Could fall short of that target.
 
Remember that the battery range was a target. Could fall short of that target.

Right, of course. I don't mean to suggest the rumors are false or fabricated, just that they might not necessarily pan out.

I guess it won't be too long before we find out now.
 
Right, of course. I don't mean to suggest the rumors are false or fabricated, just that they might not necessarily pan out.

I guess it won't be too long before we find out now.
Well, the battery life info from NateDrake implies that the architecture went from Maxwell to Pascal. It's unlikely that Nintendo placed a much bigger battery in the system.
 

AzaK

Member
They need a settings slider for "How long do you need your battery to last?", then I can slide it down to the "<1hr" setting and give that juice to the GPU... whilst I cook dinner on it.
 
Well, the battery life info from NateDrake implies that the architecture went from Maxwell to Pascal. It's unlikely that Nintendo placed a much bigger battery in the system.

The reason I'm bringing up that battery rumor is that it's seemingly at odds with the LKD rumor about multiple fans in the Switch tablet itself, which would only be needed if it draws a lot of power, which in turn should mean less battery life.

Unless it's a huge battery, which would then make $250 hard to believe.

We're clearly missing something in this equation.
 
The reason I'm bringing up that battery rumor is that it's seemingly at odds with the LKD rumor about multiple fans in the Switch tablet itself, which would only be needed if it draws a lot of power, which in turn should mean less battery life.

Unless it's a huge battery, which would then make $250 hard to believe.

We're clearly missing something in this equation.

But those rumors weren't about final hardware.

Those LKD reports were specifically about the early overclocked dev kits, where it makes sense for fans to be around, since dev kits are monstrosities anyway.
 
The reason I'm bringing up that battery rumor is that it's seemingly at odds with the LKD rumor about multiple fans in the Switch tablet itself, which would only be needed if it draws a lot of power, which in turn should mean less battery life.

Unless it's a huge battery, which would then make $250 hard to believe.

We're clearly missing something in this equation.
Actually, that would explain Vern's info. Even if they are using the much more energy-efficient Pascal architecture, those fans would still consume power. How much power does "one" of those fans drain?
 
But those rumors weren't about final hardware.

Those LKD reports were specifically about the early overclocked dev kits, where it makes sense for fans to be around, since dev kits are monstrosities anyway.

Huh? That article says nothing about devkits. I doubt the devkits even have docks, based on that leaked photo. And that report is from last week, right? Everything else she's reporting is about the final hardware, not devkits.

Actually, that would explain Vern's info. Even if they are using the much more energy-efficient Pascal architecture, those fans would still consume power. How much power does "one" of those fans drain?

Which info? The battery life? Even 5 hours seems unreasonably high if the device needs active cooling in portable mode. It's not even the power needed to drive the fans that I'm concerned about; it's the fact that it apparently gets hot enough to need fans, which would normally suggest a very low battery life.
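To put rough numbers on that (every figure below is an assumption for scale, not a spec):

    # Fan draw vs. SoC draw against a hypothetical battery
    battery_wh = 16.0    # assumed tablet-class battery
    fan_w = 0.3          # ballpark draw for a small blower fan
    soc_w = 8.0          # hypothetical SoC load hot enough to warrant a fan
    print(f"fan alone: {fan_w / battery_wh * 100:.1f}% of battery per hour")
    print(f"SoC alone: {battery_wh / soc_w:.1f} h of battery life")
    # the fan's own draw is negligible; the heat it implies is the battery problem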

Maybe someone can get clarification from LKD on Twitter? Like whether there are actually fans in the portable, and whether there's more than one.
 

rekameohs

Banned
The Switch is a thin, thin device. Is there enough physical space for a fan to really be of much help? Asking honestly, because I haven't seen a fan in something with that form factor.

I suppose some of those super-thin laptops might be a similar case.
 

Lonely1

Unconfirmed Member
The Switch is a thin, thin device. Is there enough physical space for a fan to really be of much help? Asking honestly, because I haven't seen a fan in something with that form factor.

I suppose some of those super-thin laptops might be a similar case.

The Surface Pros have fans.
 

antonz

Member
The Shield TV, dimensions-wise, is pretty similar to the Switch, and around 45% of the Shield TV's internal space is empty as well, reserved for a potential hard drive. That should give you an idea of how compact the actual components are.
 

Thraktor

Member
Well, if they are using 4GB of RAM, doesn't using much faster RAM sort of alleviate any problems that might occur due to the lower RAM amount (compared to their competitors, anyway)? I assume that would be the only reason they'd want to go with HBM, and I still agree that they very likely won't.

But wouldn't, say, 4GB of 256GB/s RAM pretty consistently perform better than, say, 6GB of 60GB/s RAM? Or is that just going to be case by case?

Additional bandwidth, to my knowledge at least, shouldn't "counteract" low capacity in any way. Bandwidth usage in a console is predominantly composed of buffer accesses, which increase both with resolution (as the buffers become larger) and by using more bandwidth-intensive rendering techniques like deferred rendering (which has extra intermediate buffers). Techniques like buffer compression and tile-based rendering can help, by making buffers smaller and keeping them in cache respectively, but this is all largely independent of the actual memory capacity.

Larger memory capacity would generally be taken up by higher-quality assets (i.e. textures, typically). This will actually increase the required bandwidth a bit, as the larger textures take more bandwidth to read, although it's relatively small in comparison to buffer accesses, and texture caches tend to do a pretty good job of minimising the effect.

Sufficiently fast data storage (i.e. HDD, flash, game card, etc.) could reduce memory requirements for games which make heavy use of data streaming (open-world games, obviously, but also many linear games). These games typically have to load quite a bit more data into memory than is strictly needed, in order to account for mechanical hard drives' low read speeds and high latency. So, for example, the game might keep high-quality textures in memory further away than the player can actually see them, because if the player runs in that direction he would get to the point where they're needed before they could be pulled from the hard drive. On a system with high-speed, low-latency storage, the game could be a lot more conservative about only keeping assets that are strictly necessary in memory, as it could pull up new assets quickly enough to keep up with the player's movements.

In a more extreme example, in theory it would be possible for a game to simply drop any assets which are behind the player from memory, and re-load them if and when he turns around. Of course this would require far higher speed storage than could be expected from Switch, but it illustrates the extent to which improved storage speed and latency can reduce memory requirements in games like this.
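To put a back-of-the-envelope number on the buffer-bandwidth point (the G-buffer layout and the one-write-plus-one-read per target are simplifying assumptions, purely for illustration):

    # Per-frame buffer traffic for a hypothetical 1080p60 deferred renderer
    width, height, fps = 1920, 1080, 60
    bytes_per_target = 4      # one RGBA8 render target
    gbuffer_targets = 4       # assumed layout: albedo, normals, etc.
    depth_bytes = 4
    per_pixel = gbuffer_targets * bytes_per_target * 2 + depth_bytes * 2
    print(f"{width * height * per_pixel * fps / 1e9:.1f} GB/s")   # ~5.0 GB/s
    # this scales with resolution and render-target count, not with RAM capacity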

A huge L3 cache shared by both the CPU & GPU like Apple's ARMv8-A based SoC architectures is also a possibility.

Edit: I'm wrong, only the CPU uses the 4MB of L3 cache in Apple's SoCs.

Yeah, as I've mentioned previously, I think that a combo of LPDDR4 and a large L3 victim cache (or perhaps simply an increased GPU L2 cache) is most likely. I'm curious to hear that Apple's L3 caches are only used for the CPU, though; do you have a source for that? (Not that I don't believe you, I'd just be interested to read the reasoning behind it.)

I think HBM is pretty much wishful thinking, TBH. It isn't that prevalent in the GPU market right now. The Fury line of video cards that take advantage of it were priced absurdly high, and Vega is approaching release sometime next year. This would literally be the first time an Nvidia device used HBM (outside of GP100, their research/compute product). I find it extremely unlikely for them to debut HBM in a mobile device like this.

They're using HBM in some kind of mobile device, as a single 4GB HBM stack doesn't make any sense for a desktop GPU. It only makes sense in a scenario where LPDDR4 memory doesn't provide enough bandwidth and GDDR5 consumes too much power and/or takes up too much motherboard space. The only two chips Nvidia has in development that could match those criteria are Switch's SoC and Xavier. Xavier would seem the most likely candidate, but we can't completely rule out Switch.

Edit: Also, aside from this being almost two years on from AMD's Fury cards, and using HBM2 rather than HBM1 (plus just one stack and InFO packaging), I don't think the pricing of Fury was "absurd", given the enormous die size. AMD released two cards in mid-2015 based on the ~600mm² Fiji die, with the full card costing $649 and the cut-down card costing $549. Nvidia, at about the same time, launched two cards based on the ~600mm² GM200 die, with the full card costing $999 and the cut-down card costing $649. Both chips were made on TSMC's 28nm process, so would have cost about the same, and yet AMD, even with HBM, managed to significantly undercut Nvidia for both binned and non-binned cards. Granted, Nvidia would have been taking higher margins (particularly on the Titan X), but there's no evidence to suggest that HBM forced AMD's pricing up.
 
Additional bandwidth, to my knowledge at least, shouldn't "counteract" low capacity in any way. Bandwidth usage in a console is predominantly composed of buffer accesses, which increase both with resolution (as the buffers become larger) and by using more bandwidth-intensive rendering techniques like deferred rendering (which has extra intermediate buffers). Techniques like buffer compression and tile-based rendering can help, by making buffers smaller and keeping them in cache respectively, but this is all largely independent of the actual memory capacity.

Larger memory capacity would generally be taken up by higher-quality assets (i.e. textures, typically). This will actually increase the required bandwidth a bit, as the larger textures take more bandwidth to read, although it's relatively small in comparison to buffer accesses, and texture caches tend to do a pretty good job of minimising the effect.

Sufficiently fast data storage (i.e. HDD, flash, game card, etc.) could reduce memory requirements for games which make heavy use of data streaming (open-world games, obviously, but also many linear games). These games typically have to load quite a bit more data into memory than is strictly needed, in order to account for mechanical hard drives' low read speeds and high latency. So, for example, the game might keep high-quality textures in memory further away than the player can actually see them, because if the player runs in that direction he would get to the point where they're needed before they could be pulled from the hard drive. On a system with high-speed, low-latency storage, the game could be a lot more conservative about only keeping assets that are strictly necessary in memory, as it could pull up new assets quickly enough to keep up with the player's movements.

In a more extreme example, in theory it would be possible for a game to simply drop any assets which are behind the player from memory, and re-load them if and when he turns around. Of course this would require far higher speed storage than could be expected from Switch, but it illustrates the extent to which improved storage speed and latency can reduce memory requirements in games like this.

Interesting. So, as usual, it's a bit more complex than I was thinking. It seems that if both the RAM bandwidth and the game-read speed were sufficiently high, that could potentially alleviate some RAM size concerns, since you'd need to keep fewer files in RAM for less time, as they could be streamed from the game media much more quickly. But that requires higher speeds for both to make a sizable difference, I suppose?


I sure hope they do actually give us some indication of specs at the January event, as Kimishima seemed to suggest, because otherwise this circular speculation will go on for quite a while.
 

Dacvak

No one shall be brought before our LORD David Bowie without the true and secret knowledge of the Photoshop. For in that time, so shall He appear.
(Apologies in advance as I'm sure posts like these are super annoying for regulars of this thread.)

So for someone not following the leaks and rumors, could anyone here give a quick summary of what we're expecting and how likely it is to be true?
 

Thraktor

Member
Interesting. So, as usual, it's a bit more complex than I was thinking. It seems that if both the RAM bandwidth and the game-read speed were sufficiently high, that could potentially alleviate some RAM size concerns, since you'd need to keep fewer files in RAM for less time, as they could be streamed from the game media much more quickly. But that requires higher speeds for both to make a sizable difference, I suppose?

RAM bandwidth doesn't come into it too much when it comes to data streaming. Yes, if you've got a storage pool with (on an upper limit) 500MB/s of bandwidth and you're streaming constantly from it then that will take up 500MB/s of RAM bandwidth as well, but with RAM bandwidth in the tens, or possibly even hundreds of GB/s, that 500MB/s is a drop in the ocean. RAM bandwidth will always be a couple of orders of magnitude higher than storage bandwidth, so the latter is always going to be the bottleneck in these scenarios.

It becomes a little more interesting when you look at upcoming tech like 3D X-Point, which sort of straddles between RAM and SSDs, with latency of around 10 microseconds (compared to perhaps 300 microseconds for a good SSD or 50 nanoseconds for RAM), much higher bandwidth than SSDs, and pricing somewhere between the two. For a future games console it's a potentially interesting piece of tech, as you could combine a relatively small pool of very fast RAM (say HBM3 or something along those lines) with a big pool of 3D X-Point. The technology is bit-addressable, which means it could be accessed directly like RAM. Therefore, you could keep assets in 3D X-Point (potentially the entire game) and keep the HBM3 for buffers and other data which requires the higher bandwidth. 3D X-Point should provide good enough bandwidth for asset reads while providing higher capacity at a lower cost than RAM would. Effectively you'd eliminate the need to stream from an SSD as all your data would already be in a sufficiently fast storage pool.
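A quick scale sketch of why the storage side is always the bottleneck, using the round numbers above:

    # Fraction of the RAM bus a constant storage stream occupies
    stream_bs = 0.5e9        # 500 MB/s sustained from storage
    ram_bs = 60e9            # ~60 GB/s LPDDR4-class RAM
    print(f"{stream_bs / ram_bs:.1%} of RAM bandwidth")       # 0.8%
    # latency ladder from the post: RAM ~50 ns, 3D X-Point ~10 us, SSD ~300 us
    print(f"X-Point vs RAM: {10e-6 / 50e-9:.0f}x slower")     # 200x
    print(f"SSD vs X-Point: {300e-6 / 10e-6:.0f}x slower")    # 30x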

I sure hope they do actually give us some indication of specs at the January event, as Kimishima seemed to suggest, because otherwise this circular speculation will go on for quite a while.

Well, if it is HBM, they'll certainly tell us, and if it isn't then all we need to do is open one up in March and read the RAM module. I have a feeling they'll be a bit more open about specs than they have in the past. They seem to be approaching Switch quite differently from their previous hardware, and there are a few hints (like publicly commenting on the console's API) which would suggest a change in their approach to technical matters, too.

(Apologies in advance as I'm sure posts like these are super annoying for regulars of this thread.)

So for someone not following the leaks and rumors, could anyone here give a quick summary of what we're expecting and how likely it is to be true?

What seems most likely at the moment:

CPU: 2-4 A72 cores plus perhaps 4 A53 cores (We haven't had any specific rumours on CPU configuration, but several general comments that CPU performance is good relative to PS4/XBO)
GPU: Nvidia Pascal, between 500-750 Gflops FP32 when docked, probably under 500 Gflops when portable (Numerous reliable people have given a roughly similar range, although no specific numbers)
RAM: 4GB LPDDR4, with 3.2GB for games, 800MB for OS (Total RAM quantity seems very likely, but OS split has only been mentioned by one person, so take with a grain of salt. No specific rumours on the RAM type, although LPDDR4 is by far the most logical, with HBM2 a distant second).
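For context on how those Gflops figures fall out of core counts and clocks (256 CUDA cores is the stock Tegra X1 configuration; the clock speeds here are illustrative guesses, not rumours):

    # FP32 throughput = CUDA cores x 2 ops/cycle (FMA) x clock
    cores = 256                          # stock Tegra X1 CUDA core count
    for clock_ghz in (0.768, 1.0, 1.47):
        print(f"{clock_ghz:.3f} GHz -> {cores * 2 * clock_ghz:.0f} GFLOPS FP32")
    # ~393 at a plausible portable clock, 512 at ~stock, ~750 at the top of the range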
 
If 1080p was an issue because of RAM bandwidth, I wonder what people would make of all the Switch games being 900p native when docked.

I know we all make a huge deal out of this on forums, but I doubt the average consumer would care, especially as a lot of them are sitting 5+ feet away from their HDTVs anyway.

If 1080p native isn't possible, then I think 900p native would be a good compromise. 720p native when docked is the worst-case scenario, imo.
 
RAM bandwidth doesn't come into it too much when it comes to data streaming. Yes, if you've got a storage pool with (on an upper limit) 500MB/s of bandwidth and you're streaming constantly from it then that will take up 500MB/s of RAM bandwidth as well, but with RAM bandwidth in the tens, or possibly even hundreds of GB/s, that 500MB/s is a drop in the ocean. RAM bandwidth will always be a couple of orders of magnitude higher than storage bandwidth, so the latter is always going to be the bottleneck in these scenarios.

It becomes a little more interesting when you look at upcoming tech like 3D X-Point, which sort of straddles between RAM and SSDs, with latency of around 10 microseconds (compared to perhaps 300 microseconds for a good SSD or 50 nanoseconds for RAM), much higher bandwidth than SSDs, and pricing somewhere between the two. For a future games console it's a potentially interesting piece of tech, as you could combine a relatively small pool of very fast RAM (say HBM3 or something along those lines) with a big pool of 3D X-Point. The technology is bit-addressable, which means it could be accessed directly like RAM. Therefore, you could keep assets in 3D X-Point (potentially the entire game) and keep the HBM3 for buffers and other data which requires the higher bandwidth. 3D X-Point should provide good enough bandwidth for asset reads while providing higher capacity at a lower cost than RAM would. Effectively you'd eliminate the need to stream from an SSD as all your data would already be in a sufficiently fast storage pool.

Yeah, that's sort of what I was thinking when I said that both storage speed and RAM speed are the factors there, and storage speed will always be the weak link. That's interesting about 3D X-Point, though; I'll have to look that up!

(Apologies in advance as I'm sure posts like these are super annoying for regulars of this thread.)

So for someone not following the leaks and rumors, could anyone here give a quick summary of what we're expecting and how likely it is to be true?

Thraktor's breakdown was very good, but it's worth mentioning that a few pages back LCGeek, who has leaked CPU info in the past, predicted that the GPU might get up to 6x that of the Wii U (3-6x, or 5-6x accounting for customizations), which in pure flops would get to about 1 TFlop. I say "predicted" because that's apparently not inside info at all, just something like an educated guess.

But yeah, based on the rumors we have, Thraktor's list is spot on.
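For reference, taking the commonly cited (unofficial) ~176 Gflops estimate for the Wii U GPU, those multipliers work out as follows:

    # Wii U GPU multiples -> rough GFLOPS targets (176 is a community estimate)
    wii_u_gflops = 176
    for mult in (3, 5, 6):
        print(f"{mult}x -> {wii_u_gflops * mult} GFLOPS")
    # 3x -> 528, 5x -> 880, 6x -> 1056 (~1 TFlop)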
 

AzaK

Member
If 1080p was an issue because of RAM bandwidth, I wonder what people would make of all the Switch games being 900p native when docked.

I know we all make a huge deal out of this on forums, but I doubt the average consumer would care, especially as a lot of them are sitting 5+ feet away from their HDTVs anyway.

If 1080p native isn't possible, then I think 900p native would be a good compromise. 720p native when docked is the worst-case scenario, imo.

I think anyone being reasonable should expect sub-1080p for most AAA games. Of course, Nintendo's style means that they are more likely to hit 1080p, but even XB1 and PS4 don't hit 1080p in some games. In the end, I think resolution will be sacrificed for graphical niceties.
 
http://venturebeat.com/2016/12/14/nintendo-switch-specs-less-powerful-than-playstation-4/

This could be new-thread-worthy, but I dunno. Dean Takahashi doesn't say much, but he does say he has two independent sources saying it is a very modified Tegra X1 Maxwell design.
The article was poorly written. "Powerful enough to run cartoony graphics" makes no sense.

That being said, it could be true, maybe. Wonder what Nate thinks. Maybe we could confirm with Laura and Emily. Ultimately it comes down to Jan 12.
 
Wth is the 10%?

If it's really true, then likely:

- 500 GFLOPS docked (maybe)
- 4GB RAM (as Emily said in Oct)

If it's custom though, it could be more powerful. I'm guessing it will be 1.5-2x as powerful as the Wii U in handheld mode, and up to 3x when docked.

I hope it's the CPU. And good customizations. Well, Maxwell doesn't sound too good for bandwidth and battery life in mobile mode.

Didn't Matt say Pascal too?

Emily says her devkit info came in August; Switch went into full production in September. How likely is it that Nintendo goes for Pascal in the end product when it wasn't in the devkits?

It does sound like lherre was right when he hinted it's not Pascal.
 

antonz

Member
Really no reason to expect less than 512 Gflops from it. The X1 at normal clocks barely even needs a fan. A small underclock, such as in the Pixel C, completely removes the need for a fan, yet we know the Switch has a fan, and then it has an additional fan in the dock for docked mode.

If anything, it suggests Switch pushes more than a standard X1.
 
I just find it hard to believe it is the X1. It had heat issues in almost all the devices it was in. The Pixel C had to downclock at points. 20nm was awful for mobile... I hope it's just a rumor, or battery life will blow.
 
What exactly are our options for bandwidth if it's Maxwell now?

Here's hoping Emily Rogers is wrong... She was wrong about the Wii U RAM in August 2012, three months before the Wii U's release, when she said she heard 1 to 1.5GB of RAM would be available on the Wii U.
 
What exactly are our options for bandwidth if it's Maxwell now?

Here's hoping Emily Rogers is wrong... She was wrong about the Wii U RAM in August 2012, three months before the Wii U's release, when she said she heard 1 to 1.5GB of RAM would be available on the Wii U.

Wii U had 2GB of RAM, but only 1GB available for games.
 
What exactly are our options for bandwidth if it's Maxwell now?

Here's hoping Emily Rogers is wrong... She was wrong about the Wii U RAM in August 2012, three months before the Wii U's release, when she said she heard 1 to 1.5GB of RAM would be available on the Wii U.

I think Emily is talking about the devkit. How would anyone know what architecture the final retail unit uses, outside of top Nintendo brass or someone from the manufacturing plant (if Switch production has even begun yet)?

I still think Switch is 16nm Pascal, if only for better battery life rather than increased specs.
 