So the switch isn't even stronger than a Vita?
It's basically a Cathode-Ray Tube Amusement Device, but with a downclocked cathode-ray tube.
Those 3 SM numbers are fantasy until confirmed. The 2 SM numbers are more likely at this point.
This means good battery life???
I mean he's trolling but PS3 at 1080p is exactly what the games will look like with these specs.
Also (to those asking about it) the Switch still has a fan which is used when portable, so battery life might not be too great. ;_;
Looks like this is true for the third gen in a row now:
Lowest estimates put the raw power of it slightly above the iPhone 6S GPU when in handheld mode. Keep in mind this would be unhampered by iOS and typical phone-game development constraints.
lol what in the world is that?
Well...because many of us don't give a single fuck about mobile gaming. The real question is why do so many seem to believe they can deflect console expectations, criticisms and disappointments with mobile stats, as if mobile and consoles are the same thing, bought by the same audiences for the same purpose.
I'm sure this won't surprise you, but many of us are grown ass adults that drive to work and don't have enough interest (or opportunity) to game while mobile anymore. We work at work and game at home where we have full access to a TV/monitor which can give us a superior experience to the often hand-cramping, poor resolution experience that typifies mobile gaming (at least on Nintendo and Sony handhelds). Mobile was good shit K-12 and through part of college, but that was the end of that for me. Maybe if I lived in a subway city, I'd still have use for mobile gaming, but I don't.
So yea, for those of us who only play at home on a proper console, Nintendo hasn't offered a platform with good 3rd party support since the SNES. It is what it is. No need to attempt to spin that reality; I've been playing on Nintendo platforms longer than half this forum has been alive. I know what time it is.
Modern shaders and RAM alone go a long way.
Can't wait for the 12th. Everyone will be mad hype the second Reggie is on stage to announce Super Mario Switch, and everyone will be drooling over how beautiful it will look, and everyone will forget this console is not even close to X1 specs, and everyone will be happy, and Trump will resign, and Danny DeVito will be president of the US. Mark my words.
This was much, much worse:
I haven't had time to read through every response here, so I'm probably repeating what others have already said, but here are my thoughts on the matter, anyway:
CPU Clock
This isn't really surprising, given that (as predicted) CPU clocks stay the same between portable and docked mode to make sure games don't suddenly become CPU-limited when running in portable mode.
The overall performance really depends on the core configuration. An octo-core A72 setup at 1GHz would be pretty damn close to PS4's 1.6GHz 8-core Jaguar CPU. I don't necessarily expect that, but a 4x A72 + 4x A53 @ 1GHz should certainly be able to provide "good enough" performance for ports, and wouldn't be at all unreasonable to expect.
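A quick back-of-envelope way to frame that comparison (a sketch only; the per-clock advantage A72 would need is the unknown here, not something the leak confirms):

```python
# Rough framing of the A72-vs-Jaguar comparison above. Aggregate
# core-cycles are simple arithmetic; the per-clock (IPC) advantage
# A72 would need over Jaguar is the real unknown.

def aggregate_ghz(cores, clock_ghz):
    """Total core-cycles per second across the cluster, in GHz."""
    return cores * clock_ghz

hypothetical_switch = aggregate_ghz(8, 1.0)  # 8x A72 @ 1GHz (speculative)
ps4 = aggregate_ghz(8, 1.6)                  # 8x Jaguar @ 1.6GHz

# A72 would need roughly this per-clock advantage over Jaguar
# for the two clusters to come out even:
print(ps4 / hypothetical_switch)  # 1.6
```

Whether A72 actually has a ~1.6x per-clock edge over Jaguar is exactly the kind of thing benchmarks would need to settle.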
Memory Clock
This is also pretty much as expected: 1.6GHz is the standard LPDDR4 clock speed (which I guess confirms LPDDR4, not that there was a huge amount of doubt). Clocking down in portable mode is sensible; lower resolution means smaller framebuffers, which means less bandwidth needed, so they can squeeze out a bit of extra battery life by cutting it down.
Again, though, the clock speed is only one factor. There are two other things that can come into play here. The second factor, obviously enough, is the bus width of the memory. Basically, you're either looking at a 64 bit bus, for 25.6GB/s, or a 128 bit bus, for 51.2GB/s of bandwidth. The third is any embedded memory pools or cache that are on-die with the CPU and GPU. Nintendo hasn't shied away from large embedded memory pools or cache before (just look at the Wii U's CPU, its GPU, the 3DS SoC, the n3DS SoC, etc., etc.), so it would be quite out of character for them to avoid such customisations this time around. Nvidia's GPU architectures from Maxwell onwards use tile-based rendering, which allows them to use on-die caches to reduce main memory bandwidth consumption, which ties in quite well with Nintendo's habits in this regard. Something like a 4MB L3 victim cache (similar to what Apple uses on their A-series SoCs) could potentially reduce bandwidth requirements by quite a lot, although it's extremely difficult to quantify the precise benefit.
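The two bandwidth figures fall straight out of the clock and the candidate bus widths (a sketch; the bus widths are the two candidates named above, not confirmed specs):

```python
# LPDDR4 is double data rate: a 1.6GHz clock means 3200 MT/s.
# Bandwidth is transfers/s times bus width in bytes.

def bandwidth_gb_s(clock_mhz, bus_bits, transfers_per_clock=2):
    transfers_per_s = clock_mhz * 1e6 * transfers_per_clock
    return transfers_per_s * (bus_bits // 8) / 1e9

print(bandwidth_gb_s(1600, 64))   # 25.6 GB/s (64-bit bus)
print(bandwidth_gb_s(1600, 128))  # 51.2 GB/s (128-bit bus)
```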
GPU Clock
This is where things get a lot more interesting. To start off, the relationship between the two clock speeds is pretty much as expected. With a target of 1080p in docked mode and 720p in undocked mode, there's a 2.25x difference in pixels to be rendered, so a 2.5x difference in clock speeds would give developers a roughly equivalent amount of GPU performance per pixel in both modes.
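The two ratios quoted above check out directly (using the leaked docked/portable GPU clocks under discussion):

```python
# Pixel-count ratio between the two target resolutions:
pixel_ratio = (1920 * 1080) / (1280 * 720)  # 2.25x

# Ratio between the leaked docked and portable GPU clocks:
clock_ratio = 768 / 307.2                   # 2.5x

# Docked mode actually gets slightly more GPU time per pixel
# (~11% extra), leaving a little headroom for docked-only effects.
print(pixel_ratio, clock_ratio, clock_ratio / pixel_ratio)
```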
Once more, though, and perhaps most importantly in this case, any interpretation of the clock speeds themselves is entirely dependent on the configuration of the GPU, namely the number of SMs (also ROPs, front-end blocks, etc, but we'll assume that they're kept in sensible ratios).
Case 1: 2 SMs - Docked: 384 GF FP32 / 768 GF FP16 - Portable: 153.6 GF FP32 / 307.2 GF FP16
I had generally been assuming that 2 SMs was the most likely configuration (as, I believe, had most people), simply on the basis of allowing for the smallest possible SoC which could meet Nintendo's performance goals. I'm not quite so sure now, for a number of reasons.
Firstly, if Nintendo were to use these clocks with a 2 SM configuration (assuming 20nm), then why bother with active cooling? The Pixel C runs a passively cooled TX1, and although people will be quick to point out that Pixel C throttles its GPU clocks while running for a prolonged time due to heat output, there are a few things to be aware of with Pixel C. Firstly, there's a quad-core A57 CPU cluster at 1.9GHz running alongside it, which on 20nm will consume a whopping 7.39W when fully clocked. Switch's CPU might be expected to only consume around 1.5W, by comparison. Secondly, although I haven't been able to find any decent analysis of Pixel C's GPU throttling, the mentions of it I have found indicate that, although it does throttle, the drop in performance is relatively small, and as it's clocked about 100MHz above Switch to begin with it may only be throttling down to a 750MHz clock or so even under prolonged workloads. There is of course the fact that Pixel C has an aluminium body to allow for easier thermal dissipation, but it likely would have been cheaper (and mechanically much simpler) for Nintendo to adopt the same approach, rather than active cooling.
Alternatively, we can think of it a different way. If Switch has active cooling, then why clock so low? Again assuming 20nm, we know that a full 1GHz clock shouldn't be a problem for active cooling, even with a very small quiet fan, given the Shield TV (which, again, uses a much more power-hungry CPU than Switch). Furthermore, if they wanted a 2.5x ratio between the two clock speeds, that would give a 400MHz clock in portable mode. We know that the TX1, with 2 SMs on 20nm, consumes 1.51W (GPU only) when clocked at about 500MHz. Even assuming that that's a favourable demo for the TX1, at 20% lower clock speed I would be surprised if a 400MHz 2 SM GPU would consume any more than 1.5W. That's obviously well within the bounds for passive cooling, but even being very conservative with battery consumption it shouldn't be an issue. The savings from going from 400MHz to 300MHz would perhaps only increase battery life by about 5-10% tops, which makes it puzzling why they'd turn down the extra performance.
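The "5-10% tops" estimate can be sanity-checked with round numbers. The battery capacity and non-GPU system draw below are assumptions for illustration, not leaked specs:

```python
# Rough battery-life arithmetic for the 400MHz-vs-300MHz question.
# ASSUMED figures: 16Wh battery, ~4W of non-GPU draw (screen, CPU,
# RAM, wifi). GPU power is taken as roughly linear in clock, which
# if anything overstates the savings from downclocking alone.

BATTERY_WH = 16.0
OTHER_DRAW_W = 4.0

def battery_hours(gpu_w):
    return BATTERY_WH / (OTHER_DRAW_W + gpu_w)

gpu_400 = 1.5                  # ~1.5W at 400MHz, per the TX1 figure above
gpu_300 = gpu_400 * 300 / 400  # linear-in-clock assumption

gain = battery_hours(gpu_300) / battery_hours(gpu_400) - 1
print(f"{gain:.1%}")  # lands within the 5-10% range claimed above
```

The point generalises: when the GPU is a small slice of total system draw, cutting its clock by 25% buys only a single-digit percentage of battery life.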
Finally, the recently published Switch patent application actually explicitly talks about running the fan at a lower RPM while in portable mode, and doesn't even mention the possibility of turning it off while running in portable mode. A 2 SM 20nm Maxwell GPU at ~300MHz shouldn't require a fan at all, and although it's possible that they've changed their mind since filing the patent in June, it raises the question of why they would even consider running the fan in portable mode if their target performance was anywhere near this.
Case 2: 3 SMs - Docked: 576 GF FP32 / 1,152 GF FP16 - Portable: 230.4 GF FP32 / 460.8 GF FP16
This is a bit closer to the performance level we've been led to expect, and it does make a little bit of sense from the perspective of giving a little bit over TX1 performance at lower power consumption. (It also matches reports of overclocked TX1s in early dev kits, as you'd need to clock a bit over the standard 1GHz to reach docked performance here.) Active cooling while docked makes sense for a 3 SM GPU at 768MHz, although it wouldn't be needed in portable mode. It still leaves the question of why not use 1GHz/400MHz clocks, as even with 3 SMs they should be able to get by with passive cooling at 400MHz, and battery consumption shouldn't be that much of an issue.
Case 3: 4 SMs - Docked: 768 GF FP32 / 1,536 GF FP16 - Portable: 307.2 GF FP32 / 614.4 GF FP16
This would be at the upper limit of what's been expected, performance-wise, and the clock speeds start to make more sense at this point, as portable power consumption for the GPU would be around the 2W mark, so further clock increases may start to affect battery life a bit too much (not that 400-500MHz would be impossible from that point of view, though). Active cooling would be necessary in docked mode, but still shouldn't be needed in portable mode (except perhaps if they go with a beefier CPU config than expected).
Case 4: More than 4 SMs
I'd consider this pretty unlikely, but just from the point of view of "what would you have to do to actually need active cooling in portable mode at these clocks", something like 6 SMs would probably do it (1.15 TF FP32/2.3 TF FP16 docked, 460 GF FP32/920 GF FP16 portable), but I wouldn't count on that. For one, it's well beyond the performance levels that reliable-so-far journalists have told us to expect, but it would also require a much larger die than would be typical for a portable device like this (still much smaller than PS4/XBO SoCs, but that's a very different situation).
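For reference, every case above comes from one formula: each Maxwell SM has 128 FP32 cores doing 2 ops (one FMA) per clock, with FP16 at double rate. The case figures correspond to rounded 750/300 MHz clocks; with the literal 768/307.2 MHz clocks everything comes out ~2.4% higher.

```python
# Reproducing the FLOPS table from the cases above.
# GFLOPS = SMs * 128 cores * 2 ops/clock * clock (doubled for FP16).

def gflops(sms, clock_mhz, fp16=False):
    ops_per_clock = sms * 128 * 2 * (2 if fp16 else 1)
    return ops_per_clock * clock_mhz / 1000

for sms in (2, 3, 4, 6):
    print(f"{sms} SMs | docked {gflops(sms, 750):g} GF FP32 / "
          f"{gflops(sms, 750, fp16=True):g} GF FP16 | portable "
          f"{gflops(sms, 300):g} GF FP32 / "
          f"{gflops(sms, 300, fp16=True):g} GF FP16")
```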
TL;DR
Each of these numbers is only a single variable in the equation, and we need to know things like CPU configuration, memory bus width, embedded memory pools, number of GPU SMs, etc. to actually fill out the rest of those equations and get the relevant info. Even at the worst end of the spectrum, we're still getting by far the most ambitious portable Nintendo's ever released, one which also doubles as a home console that's noticeably higher-performing than the Wii U, which is fine by me.
It doesn't mean anything though. Nintendo Switch and Nvidia Shield are two very different machines, and I'm pretty sure Switch will be a much better gaming system.
The Switch is not competing in 2013, it's competing in 2017. It needs to be attractive vs. the XB1 and PS4 at their 2017 prices.
I said it before and I'll say it again. I strongly believe the Switch is DOA at $299 and above. I believe it's potentially very successful at $199, and a toss-up at $249. The lineup of launch games will be a huge factor as well.
I think that deep down we all know that this thing needs to be successful from day one to have a chance long-term. That's why the price point will be critical.
Every time I read something like this, I want to punch a dead rhino. No matter what Nintendo does, they're always competing with Sony/MS for the same resource (consumers' money) in the same global field: consumer entertainment electronics.
Mom: "Little Josh, what do you want for birthday? A PS4 or a Switch?" Josh: "Why not both?" Mom: "Shut the fuck up and get a job!"
So people are telling me Nintendo is smart because they avoid the shark-infested waters and try to settle down in orca territory?! You guys are amusing... and Pachter is still a wise man compared to some in here.
Thanks for the write-up. Let's see how many people read it.
I predict 3.
At launch, those consoles were more expensive. The Switch isn't in its third year here.
There's no way it's less than $300.
Not really. They added some indies and small devs that supported them anyway, plus middleware/engine companies whose support they of course already had.
Even if you don't like portable gaming, I think most people would agree that a library that consisted of the Wii U's library + the 3DS's library + the Vita's library would be a rather solid library of games. And there's a good chance that with the Switch, that's basically what we're getting (plus some PC games that scale well to lower power configurations).
Nintendo's absolutely competing with Sony and Microsoft. They're just not competing the same way.
To put it in DBZ terms, Sony and Microsoft are Goku and Vegeta, and Nintendo is Zamasu. There's no way he can keep up by doing the same things Goku and Vegeta do, so he steals Goku's body ... or wishes himself immortal: different strategies that allow him to compete.
You should know, however, that there are many "grown ass adults" interested in handhelds who don't give two fucks about high console specs.
You should also know that PC, PS4 and XB1 are already fighting each other in the market to appeal to your audience, and they're all pretty great options. So I don't know why Nintendo NEEDS to fight for the same share.
All their western websites call it a "home gaming system"; the French one calls it a "console de salon".
https://www.nintendo.co.uk/Nintendo-Switch/Nintendo-Switch-1148779.html
https://www.nintendo.de/Nintendo-Switch/Nintendo-Switch-1148779.html
https://www.nintendo.fr/Nintendo-Switch/Nintendo-Switch-1148779.html
https://www.nintendo.es/Nintendo-Switch/Nintendo-Switch-1148779.html
https://www.nintendo.it/Nintendo-Switch/Nintendo-Switch-1148779.html
Only the Japanese site, though, calls it their new gaming console.
https://www.nintendo.co.jp/switch/
At the very least 6 times faster, undocked.
Seeing how the PS2 was the shittiest spec-wise and now Sony is always on the cutting edge, who knows.
Do you have a Wii U? Try playing with the GamePad. It's the closest thing available and should help you get used to it.
If this is the absolute floor I am still very happy/excited. A ~150 GFLOP handheld from Nintendo sounds delicious! Here's hoping this low clock rate comes with solid battery life... and maybe a $200 price tag.
This will feel huge coming from the 3DS.
I just think it's continually astounding that Nintendo can come in under the most conservative estimates of what can be done on modern, cheap, off the shelf hardware.
Can't wait for the 12th. Everyone will be mad hype the second Reggie is on stage to announce Super Mario Switch, everyone will be drooling over how beautiful it will look, everyone will forget this console is not even close to X1 specs, everyone will be happy, Trump will resign, and Danny DeVito will be president of the US. Mark my words.
Quoting this again because, as I've said in this thread and other threads, the narrative that's being spun all of a sudden that this is a 3DS successor and portable console is revisionist to the max.
This is not a dedicated handheld.
My favourite posts on GAF are the ones from simple people who likely know nothing tech-wise, and especially nothing on the marketing side, yet claim that Nintendo never learns and is infinitely more stupid than they are. Pretending that something so obvious a child could understand it has escaped Nintendo's entire R&D division.
Pretentious as fuck.
The GC and N64 failing and the Wii winning had almost nothing to do with specs.
Why do people think it does?
The Switch is not competing in 2013, it's competing in 2017. It needs to be attractive vs. the XB1 and PS4 at their 2017 prices.
I said it before and I'll say it again. I strongly believe the Switch is DOA at $299 and above. I believe it's potentially very successful at $199, and a toss-up at $249. The lineup of launch games will be a huge factor as well.
I think that deep down we all know that this thing needs to be successful from day one to have a chance long-term. That's why the price point will be critical.
Thanks for the write-up. Let's see how many people read it.
I predict 3.
I read that and found it informative.