
Nintendo Switch Dev Kit Stats Leaked? Cortex A57, 4GB RAM, 32GB Storage, Multi-Touch.

Switch won't use a standard memory card for its games; it'll use a game card with serial transmission, which should offer transfer speeds around 3 times that of a standard HDD and random access times on another planet compared to a standard HDD. Lower latency is always an advantage. Of course the advantage is far more significant with lots of small files being transferred, but even for larger files it's still an advantage when you need to load something into RAM quickly.

This may be true but assuming games will be available digitally, many people will still load them from standard memory cards.
 

ggx2ac

Member
Wasn't Tegra X2 Pascal and also 16nm? I'm still not convinced the Switch uses Tegra X2, though, but I would love to see that.

I don't know why people keep saying Tegra X2; there is no Tegra X2. The only Tegra that uses Pascal is called Parker.

Of course the Switch is not going to use Parker, and it's not going to use the TX1 either, because barely any new consumer products use 20nm nodes. 20nm wasn't worth it; that's why FinFET was developed, to make even smaller processes workable.

That's still speculation, just from an analyst.

And 14/16nm is a bigger deal than Pascal/Maxwell.

A smaller die size is one big deal, but designing a new chip at that smaller size, rather than just shrinking its predecessor, is a big deal too.

The last page is a perfect example of how the ARM A73 is better than the A72 due to design rather than a mere die shrink. And the A73 and A72 aren't even from the same designers: the A72 is from ARM's Texas branch and the A73 is from ARM's Sophia branch in France.

Hence, with regard to the Switch, it's not only important to have the chip at 16nm but also to have a new, better design than the predecessor used in the dev kits, the Jetson TX1.
 

Donnie

Member
This may be true but assuming games will be available digitally, many people will still load them from standard memory cards.

We'll have to see what the policy is on digital downloads. Obviously, if a game is going to be available digitally, then that's going to limit how they can use the superior data transfer of Switch's game cards. I'm just saying that the cards will be far superior to a standard HDD. Even a MicroSD will be; not nearly as good as a game card, but its far lower latency will still be an advantage over an HDD.
 

Thraktor

Member
We'll have to see what the policy is on digital downloads. Obviously, if a game is going to be available digitally, then that's going to limit how they can use the superior data transfer of Switch's game cards. I'm just saying that the cards will be far superior to a standard HDD. Even a MicroSD will be; not nearly as good as a game card, but its far lower latency will still be an advantage over an HDD.

One approach they could take to this would be to drop SD card support and instead use UFS cards for storage expansion, as they could guarantee a much higher minimum bandwidth. Not something I'd necessarily expect from Nintendo, though, as they'd be perhaps the first device to actually support the new cards, and with only one manufacturer you'd be looking at relatively high prices and limited availability, at least at first.

Edit: Actually it seems there is one aspect of UFS cards which may seem amenable to Nintendo; they draw quite a lot less power than MicroSD cards. MicroSD cards have a maximum power draw of 2.88W, versus 1.54W for UFS cards. Additionally, the increased speed of UFS cards would give them even more of an advantage, as it means the card would spend more of its time idle. In an extreme case, a MicroSD could consume the full 2.88W during sustained 100MB/s reads, whereas a 500MB/s UFS card performing the same reads would spend 80% of its time idle, potentially bringing power consumption well below a Watt (depending on the idle power draw). I don't know if real-world savings would be that high, but 1-2W is a huge amount for a battery powered device like this.
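For anyone who wants to sanity-check that duty-cycle argument, here's a rough back-of-the-envelope sketch. The active wattages are the spec maximums quoted above, while the idle draw and card throughputs are assumed placeholders, not real figures:

```python
# Rough duty-cycle estimate for the microSD vs. UFS power argument above.
# Active power figures are the spec maximums quoted in the post;
# idle power and card throughputs are assumptions for illustration only.

def avg_power_w(active_w, idle_w, required_mb_s, card_mb_s):
    """Average draw when the card only needs to be active part of the time."""
    duty = min(required_mb_s / card_mb_s, 1.0)   # fraction of time spent reading
    return duty * active_w + (1.0 - duty) * idle_w

IDLE_W = 0.1  # assumed idle draw

# Game streaming 100 MB/s of data off the card:
microsd = avg_power_w(2.88, IDLE_W, required_mb_s=100, card_mb_s=100)  # ~2.88 W
ufs     = avg_power_w(1.54, IDLE_W, required_mb_s=100, card_mb_s=500)  # ~0.39 W

print(f"microSD ~{microsd:.2f} W, UFS ~{ufs:.2f} W")
```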
 

Roo

Member
Did Nate need redeeming?
No, he didn't. That's the joke :p
That's still speculation, just from an analyst.

And 14/16nm is a bigger deal than Pascal/Maxwell.
Yeah, I know it's speculation from an analyst.
My question was more about what made him go with 14/16nm FinFET instead of... 20nm or something like that (which KingSnake kindly explained):
Tegra Parker (Pascal based) is running on TSMC 16nm FinFET process. That's like the strongest argument that me and others were using to support that having a Pascal based Tegra is the most likely options, as the 20nm process (that X1/Maxwell uses) had issues and was quickly abandoned by almost everybody, especially in the mobile world.
Thanks. This is what I wanted to know.

So I assume this may be a little more expensive for Nintendo since 20nm is pretty much obsolete, but it will be worth it in the long run.
I also read it's more power efficient (or at least I think I did lol), so it may be in Nintendo's best interests to secure it for Switch.
 

Cartho

Member
Looking at the above specs, people need to lower their expectations. Also remember Nintendo usually have crap batteries that are not as big. Nintendo can't afford to have a large-capacity battery as well as decent specs, as they don't want to be priced too high.

I reckon Switch will be severely underclocked when undocked and allow a 3 to 4 hour battery life.

I'm not so sure. I just find it hard to believe that Nintendo won't have thought about stuff like this. I mean they won't want to release a console which has a massive emphasis on being both a handheld and a home console, only for it to have dreadful battery life and awful frame rates when in handheld mode. That would be utterly mental.

I'm no techie so I'm not sure how they could have fixed issues like this but I'm willing to bet they've come up with a way.
 

EDarkness

Member
I'm not so sure. I just find it hard to believe that Nintendo won't have thought about stuff like this. I mean they won't want to release a console which has a massive emphasis on being both a handheld and a home console, only for it to have dreadful battery life and awful frame rates when in handheld mode. That would be utterly mental.

I'm no techie so I'm not sure how they could have fixed issues like this but I'm willing to bet they've come up with a way.

I thought about this, and if they're serious about it being mainly a console, then that would make sense. it would be a case of you CAN take it with you and game on the go, but there's a trade off there. I agree that it should be good in either mode, but I won't lie...if I was making an NS game, I really wouldn't care that much about what the performance was when not docked. I wouldn't want it to be 2fps or anything like that, but I wouldn't put a lot of time in making sure that it was super awesome. I'm sure Nintendo would probably make sure it worked basically well in both situations, though. I wouldn't expect 3rd parties to do it as well.
 

KAL2006

Banned
I'm not so sure. I just find it hard to believe that Nintendo won't have thought about stuff like this. I mean they won't want to release a console which has a massive emphasis on being both a handheld and a home console, only for it to have dreadful battery life and awful frame rates when in handheld mode. That would be utterly mental.

I'm no techie so I'm not sure how they could have fixed issues like this but I'm willing to bet they've come up with a way.

What they will come up with is Wii U level graphics undocked. This is why every game that was shown in the reveal trailer was Wii U level. People saying 2x Wii U power undocked, I don't think that will happen at all. Due to severe underclocking, I'm expecting they will just about match Wii U graphics with a 3 to 4 hour battery life, but with USB-C fast charging.

I reckon all the fans and cooling system don't even turn on undocked to conserve battery life, and due to the chip being severely underclocked the heat wouldn't be that bad.

I don't know of any portable system on the market that has a fan, apart from laptops or high-end tablets, which have crap battery life if you game on them. Also, the batteries on those devices are massive.
 

Rolf NB

Member
One approach they could take to this would be to drop SD card support and instead use UFS cards for storage expansion, as they could guarantee a much higher minimum bandwidth. Not something I'd necessarily expect from Nintendo, though, as they'd be perhaps the first device to actually support the new cards, and with only one manufacturer you'd be looking at relatively high prices and limited availability, at least at first.
Are we now finally figuring out why the Vita used custom memory cards?

Will the Switch support Vita memory cards?

Stay tuned!
 

Refyref

Member
Are we now finally figuring out why the Vita used custom memory cards?

Will the Switch support Vita memory cards?

Stay tuned!

Unfortunately, the Vita's memory cards were actually slower than SD cards. They really had absolutely no point other than price gouging. That being said, I hope for Nintendo's sake they don't go with anything other than SD or microSD.
 

Daedardus

Member
Is there any reason why a serial interface for the game cards wouldn't be able to write save data onto the card? It seems that with an appropriate interface and controller this shouldn't be a problem. And it will very likely be a serial interface, since the game cards feature only 5 connections.

I also wouldn't worry too much about the difference between SD card and game card performance, since I still believe that games will be able to be downloaded to SD storage. They might enforce a certain class requirement, and loading times might change a bit, but it wouldn't go so far that games aren't playable anymore. This stuff can happen with different SD cards for 3DS and different HDDs/SSDs in PC/Wii U/PS4/XBO applications too, and it doesn't prevent a lot of games from being played.
 

Rolf NB

Member
Is there any reason why a serial interface for the game cards wouldn't be able to write save data onto the card? It seems that with an appropriate interface and controller this shouldn't be a problem. And it will very likely be a serial interface, since the game cards feature only 5 connections.
SATA uses 4 pins for data (2 in, 2 out) and 3 for ground. You could get away with a single pin for ground if you don't need to support longer cables ... or any cable at all really.
 

Refyref

Member
Is there any reason why a serial interface for the game cards wouldn't be able to write save data onto the card? It seems that with an appropriate interface and controller this shouldn't be a problem. And it will very likely be a serial interface, since the game cards feature only 5 connections.

I also wouldn't worry too much about the difference between SD card and game card performance, since I still believe that games will be able to be downloaded to SD storage. They might enforce a certain class requirement, and loading times might change a bit, but it wouldn't go so far that games aren't playable anymore. This stuff can happen with different SD cards for 3DS and different HDDs/SSDs in PC/Wii U/PS4/XBO applications too, and it doesn't prevent a lot of games from being played.

There shouldn't be any reason for it not to be possible; the question is whether the cost of a controller that supports it and a small NAND flash chip on each game card will be worth it. And if game cards will actually be much faster than SD cards (which I kind of doubt, if only to keep every storage format close enough), then games on SD cards will have longer loading and streaming times. The former probably won't mean much more than longer loading screens, while the latter could hurt graphics, as it could lead to (more) cases of pop-in if the game doesn't take them into account.
 

ozfunghi

Member
So I assume this may be a little more expensive for Nintendo since 20nm is pretty much obsolete, but it will be worth it in the long run.
I also read it's more power efficient (or at least I think I did lol), so it may be in Nintendo's best interests to secure it for Switch.

The "same" chip, will always be more efficient when it's shrunk. When shrunk, it uses less energy, and thus it will run less hot. Basically that means you can then clock it higher to get better performance (but it will again get hotter as well and use more power) or you can keep it at the same performance and have a cooler, less powerhungry chip. I believe it would make sense for Nintendo to go for the first option in "docked" mode (better performance, hotter, more power hungry, doesn't matter when it's docked) and "switch" to the second option when portable (same performance as the 20nm chip, but cooler (less need for active cooling = better battery life) and less power hungry = again better battery life).

According to the factory that produces the chips, you either gain 40% performance (at the same power draw) or you gain 60% power efficiency (at the same performance).
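Very roughly, and taking those foundry marketing numbers at face value, the two options look like this; the 20nm baseline clock and power below are made-up placeholders, not real Switch figures:

```python
# Illustrative only: applies the quoted 16nm figures ("40% more performance at
# the same power, or 60% less power at the same performance") to a made-up
# 20nm baseline. None of these numbers are real Switch specs.

BASE_CLOCK_MHZ = 1000   # hypothetical 20nm clock
BASE_POWER_W   = 3.0    # hypothetical 20nm power at that clock

# Option 1 (docked): spend the shrink on performance.
docked_clock_mhz = BASE_CLOCK_MHZ * 1.40   # 1400 MHz
docked_power_w   = BASE_POWER_W            # unchanged

# Option 2 (portable): spend the shrink on battery life.
portable_clock_mhz = BASE_CLOCK_MHZ        # unchanged
portable_power_w   = BASE_POWER_W * 0.40   # 1.2 W, i.e. 60% less

print(docked_clock_mhz, docked_power_w, portable_clock_mhz, portable_power_w)
```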
 

Retrobox

Member
One important thing I'm currently wondering about: Now that Nintendo isn't working with Power PC anymore, what does that mean for their programmers? They've spent years and years gathering experience with that architecture after all. So how much of that experience can they carry over to the Switch? Will they have to essentially relearn everything all over or can they quickly pick up the pace where they left off?
 

ozfunghi

Member
One important thing I'm currently wondering about: Now that Nintendo isn't working with Power PC anymore, what does that mean for their programmers? They've spent years and years gathering experience with that architecture after all. So how much of that experience can they carry over to the Switch? Will they have to essentially relearn everything all over or can they quickly pick up the pace where they left off?

It's no different from developers getting to know the PS2 and then having to get to know the PS3 after that, or the PS4 after that. Also, as I understand it, it's basically a non-issue, since the compilers take care of that, and the ARM architecture is easier to port to than from (from what I've gathered).
 

Vic

Please help me with my bad english
One important thing I'm currently wondering about: Now that Nintendo isn't working with Power PC anymore, what does that mean for their programmers? They've spent years and years gathering experience with that architecture after all. So how much of that experience can they carry over to the Switch? Will they have to essentially relearn everything all over or can they quickly pick up the pace where they left off?
They've also been working with ARM chipsets since the GBA, so yeah, that's probably a non-issue.
 
It's no different from developers getting to know the PS2 and then having to get to know the PS3 after that, or the PS4 after that. Also, as I understand it, it's basically a non-issue, since the compilers take care of that, and the ARM architecture is easier to port to than from (from what I've gathered).

Not just that, but their experience with ARM architectures goes back further than PPC did.
Dammit Vic
 
ARM is also not terribly difficult to understand and use. There are TONS of resources available, it's well documented and any programmer/tester worth their salt will get good at it in a reasonable amount of time.

ARM is not some esoteric solution.
 

Daedardus

Member
SATA uses 4 pins for data (2 in, 2 out) and 3 for ground. You could get away with a single pin for ground if you don't need to support longer cables ... or any cable at all really.

Yeah, SATA uses cables, which are prone to picking up noise in an electromagnetic environment, and noise can influence the potential of the ground. But with an almost direct connection from the Switch's bus to the game card, this should be less of a problem. There also seems to be a much wider pin, which I assume to be the ground. I do wonder how they manage the power supply, because there seem to be so few pins.

There shouldn't be any reason for it not to be possible; the question is whether the cost of a controller that supports it and a small NAND flash chip on each game card will be worth it. And if game cards will actually be much faster than SD cards (which I kind of doubt, if only to keep every storage format close enough), then games on SD cards will have longer loading and streaming times. The former probably won't mean much more than longer loading screens, while the latter could hurt graphics, as it could lead to (more) cases of pop-in if the game doesn't take them into account.

I doubt the cost will be so high that it will be decided on cost alone. The added benefit of on-card saves is exchangeability of save games if your Switch breaks down. Although, on the other hand, save games on the Switch itself could be compatible with the download version. They could also provide some sort of account tie-in for save files and allow them to be backed up personally or via the cloud. Some games could suffer from save game manipulation, though. The system already in place for the 3DS is restrictive, since it locks the backup to the SD card and system, so an account sign-in check could help alleviate that.

The "same" chip, will always be more efficient when it's shrunk. When shrunk, it uses less energy, and thus it will run less hot. Basically that means you can then clock it higher to get better performance (but it will again get hotter as well and use more power) or you can keep it at the same performance and have a cooler, less powerhungry chip. I believe it would make sense for Nintendo to go for the first option in "docked" mode (better performance, hotter, more power hungry, doesn't matter when it's docked) and "switch" to the second option when portable (same performance as the 20nm chip, but cooler (less need for active cooling = better battery life) and less power hungry = again better battery life).

According to the factory that produces the chips, you either gain 40% performance (at the same power draw) or you gain 60% power efficiency (at the same performance).

Just shrinking a chip won't always make it (much) more efficient. That's because closely packed transistors suffer from increased capacitance, which is responsible for power loss that scales with a certain power of the frequency. If the type of transistor or the chip design doesn't accommodate that, power draw may stay roughly the same at the same frequency. Sometimes they'll lower the frequency so it draws less power but still performs faster than at the larger node. It all depends on how small the node already was before you shrink it, but it's one of the primary reasons why the newest Intel chips don't reach the clock speeds they used to when they have to be power efficient. But yes, shrinking is usually the best way to boost performance and efficiency; the chip design just has to account for the potential problems too.
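The usual textbook relation behind that is dynamic power ≈ activity × capacitance × voltage² × frequency, so most of the real savings come from being able to drop voltage along with frequency rather than from the shrink by itself. A tiny sketch with made-up numbers:

```python
# Textbook dynamic-power relation, P ≈ alpha * C * V^2 * f, used to illustrate
# the point above. All values are arbitrary placeholders, not real chip figures.

def dynamic_power_w(alpha, c_farads, v_volts, f_hz):
    """Switching power of CMOS logic: activity * capacitance * V^2 * frequency."""
    return alpha * c_farads * v_volts ** 2 * f_hz

# Same design, same effective switched capacitance:
p_high = dynamic_power_w(0.2, 1e-9, 1.10, 1.0e9)   # ~0.24 W at 1.0 GHz, 1.10 V
p_low  = dynamic_power_w(0.2, 1e-9, 0.90, 0.7e9)   # ~0.11 W at 0.7 GHz, 0.90 V

# Roughly halving power here comes mostly from dropping voltage along with
# frequency, which is why a shrink alone (same C, same V, same f) buys little.
print(p_high, p_low)
```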
 

ggx2ac

Member
Not just that, but their experience with ARM architectures goes back further than PPC did.
Dammit Vic

ARM is also not terribly difficult to understand and use. There are TONS of resources available, it's well documented and any programmer/tester worth their salt will get good at it in a reasonable amount of time.

ARM is not some esoteric solution.

Oh I don't know. I've definitely seen some people that seem convinced that ARM is a foreign architecture to Nintendo's developers and that Nintendo should have used x86 all because Wii U didn't get ports and "x86 has super charged PC architecture"

Did people forget that PS3(to some extent) and Xbox 360 were using PPC as well? Their CPUs were designed by IBM.

What they will come up with is Wii U level graphics undocked. This is why every game that was shown in the reveal trailer was Wii U level. People saying 2x Wii U power undocked, I don't think that will happen at all. Due to severe underclocking, I'm expecting they will just about match Wii U graphics with a 3 to 4 hour battery life, but with USB-C fast charging.

So that explains this.

Looking at the above specs people need to lower their expectations.

Yeah... No.

I don't know why you were basing your expectations off of dev-kit rumours and some handheld from a kickstarter that is using old ARM CPU and GPU chips. They use 28nm nodes.

Switch is likely to be using a custom Nvidia GPU which is likely to be designed well aside from the node being a 16nmFF for power efficiency. We already know how powerful a TX1 is and it's not unrealistic to see something within the performance of Parker or even better for power efficiency because it all comes down to design.

It's not unrealistic to expect 2x the performance of the Wii U in handheld mode when not only is there the hardware provided by Nvidia but also their software which is most likely to include their Tile-based deferred rasterizer.

The Nintendo Switch’s gaming experience is also supported by fully custom software, including a revamped physics engine, new libraries, advanced game tools and libraries. NVIDIA additionally created new gaming APIs to fully harness this performance. The newest API, NVN, was built specifically to bring lightweight, fast gaming to the masses.

Gameplay is further enhanced by hardware-accelerated video playback and custom software for audio effects and rendering.

https://blogs.nvidia.com/blog/2016/10/20/nintendo-switch/

And let's not forget that the Switch is possibly going to be utilising Vulkan in that NVN API which has shown to improve performance per watt.

Together with Samsung and the Vulkan API, Super Evil Megacorp was able to create games with 30 percent faster performance, increased rendering of images on the screen and improved battery life.

https://news.samsung.com/global/see...y-s7-create-more-immersive-gaming-experiences

____

No, it is not unrealistic to expect 2x the performance of the Wii U while it is not docked. Thinking that the Switch will be only as powerful as the Wii U is the extreme opposite to those people that thought the Switch could be more powerful than an Xbox One or PS4.

Your arguments aren't convincing enough for the idea that everyone should lower their expectations when we've had expectations of the hardware being between the TX1 and Parker with regards to performance.
 

Rodin

Member
What they will come up with is Wii U level graphics undocked. This is why every game that was shown in the reveal trailer was Wii U level. People saying 2x Wii U power undocked, I don't think that will happen at all. Due to severe underclocking, I'm expecting they will just about match Wii U graphics with a 3 to 4 hour battery life, but with USB-C fast charging.

I reckon all the fans and cooling system don't even turn on undocked to conserve battery life, and due to the chip being severely underclocked the heat wouldn't be that bad.

I don't know of any portable system on the market that has a fan, apart from laptops or high-end tablets, which have crap battery life if you game on them. Also, the batteries on those devices are massive.
I don't think your reasoning is necessarily wrong here but Wii U is nowhere near that footage of NBA, Skyrim Remastered or the new Mario game. Now whether or not they were actually running on NS is a different question (for Skyrim and NBA at least), but still, calling that footage "Wii U level" is wrong. The reason why they showed those games had nothing to do with graphics and horsepower.
 
That's not how it works. That's not how any of this works.

Yes, that's how it works. My numbers are wishful thinking, but it does work.

As many have pointed out already, screen resolution is low in the list of things draining this console's battery.

Lower resolution could run on a downclocked CPU/GPU. The thing might even be able to undervolt while running at lower speeds to reduce the amount of power it uses.
 

ggx2ac

Member
Yes, that's how it works. My numbers are wishful thinking, but it does work.

The N3DS XL and 3DS XL run off a 6.5Wh, 1750mAh battery. (3.5 to 6.5 hours.)

The Nvidia Shield K1 tablet runs off a 19.75Wh, 5200mAh battery. (Apparently 5 hours of battery life for gaming, 10 hours for non-gaming tasks.)

Let's see where Switch falls. (It's too difficult figuring out battery specs from speculation.)

Edit: Meh, I still couldn't calculate battery life well enough to make estimations.

The N3DS XL (not the battery) is listed as using 4.1 W at 4.6 V, hence approximately 0.89 A, but calculating the battery life from that isn't working out.
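For what it's worth, the usual rough estimate is just hours ≈ battery capacity (Wh) ÷ average system draw (W); the reason the 4.1 W figure doesn't work is presumably that it's a maximum/input rating rather than the typical in-game draw. A quick sketch using only the numbers above:

```python
# Rough battery-life estimate: hours ≈ capacity (Wh) / average draw (W).
# The 6.5 Wh and 4.1 W values are from the post; 4.1 W is presumably a
# maximum/input rating, not the typical in-game draw, which is why the
# naive division undershoots the official 3.5-6.5 h figure.

def battery_life_h(capacity_wh, avg_draw_w):
    return capacity_wh / avg_draw_w

print(battery_life_h(6.5, 4.1))           # ~1.6 h using the peak rating

# Working backwards from the official rating instead:
for hours in (3.5, 6.5):
    print(f"{hours} h implies ~{6.5 / hours:.2f} W average draw")
```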
 

ozfunghi

Member
Yes, that's how it works. My numbers are wishful thinking, but it does work.

Lower resolution could run on a downclocked CPU/GPU. The thing might even be able to undervolt while running at lower speeds to reduce the amount of power it uses.

Not CPU. You can reduce the strain on the GPU (and thus consume less power) by reducing resolution. But AI, logic, physics... still have to be calculated the same if you don't want to change the core game experience. I mean, you could play Pikmin on a reduced resolution, but you couldn't cut the amount of Pikmin you control in portable mode because the CPU is downclocked, for instance.
 
Not CPU. You can reduce the strain on the GPU (and thus consume less power) by reducing resolution. But AI, logic, physics... still have to be calculated the same if you don't want to change the core game experience. I mean, you could play Pikmin on a reduced resolution, but you couldn't cut the amount of Pikmin you control in portable mode because the CPU is downclocked, for instance.

It depends on how processor and gpu intensive a game is.
 
Not CPU. You can reduce the strain on the GPU (and thus consume less power) by reducing resolution. But AI, logic, physics... still have to be calculated the same if you don't want to change the core game experience. I mean, you could play Pikmin on a reduced resolution, but you couldn't cut the amount of Pikmin you control in portable mode because the CPU is downclocked, for instance.

Yeah, I don't know why I wrote CPU in there. I was trying to be all-encompassing, but you're right. The GPU can still be clocked lower and even undervolted when rendering at a lower resolution.
 

TunaLover

Member
I really hope they come up with a balanced architecture this time. The Wii U was severely bottlenecked by its ancient CPU; it allowed Wii BC, but at the cost of not keeping up with the GPU and RAM setup (which sadly left the GPU starving and unable to perform better; the Wii U GPU is great imo). I really don't think they'll have problems surpassing the Wii U's raw power by a wide margin, if reports are true about the CPU being highly efficient.
 

Mr Swine

Banned
I really hope they come up with a balanced architecture this time. The Wii U was severely bottlenecked by its ancient CPU; it allowed Wii BC, but at the cost of not keeping up with the GPU and RAM setup (which sadly left the GPU starving and unable to perform better; the Wii U GPU is great imo). I really don't think they'll have problems surpassing the Wii U's raw power by a wide margin, if reports are true about the CPU being highly efficient.

I think the biggest bottleneck with the Switch is the memory bandwidth: 25GB/s with a 64-bit bus, or 50GB/s with a 128-bit bus, is really low. How fast was the bandwidth on the Wii U with its DDR3 RAM? Think it's possible for them to use both a 128-bit bus and some form of cache to speed things up?
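Those figures line up with the standard peak-bandwidth formula, bandwidth ≈ (bus width in bits ÷ 8) × transfer rate. The sketch below assumes LPDDR4-3200 for the Switch numbers and the commonly cited DDR3-1600 on a 64-bit bus for the Wii U's main memory:

```python
# Peak theoretical memory bandwidth = (bus width in bits / 8) * transfer rate.
# LPDDR4-3200 is an assumption that happens to match the ~25/50 GB/s figures
# in the post; the Wii U line assumes its commonly cited DDR3-1600, 64-bit bus.

def peak_bandwidth_gb_s(bus_bits, mt_per_s):
    return bus_bits / 8 * mt_per_s / 1000   # GB/s

print(peak_bandwidth_gb_s(64, 3200))    # 25.6 GB/s  (64-bit LPDDR4-3200)
print(peak_bandwidth_gb_s(128, 3200))   # 51.2 GB/s  (128-bit LPDDR4-3200)
print(peak_bandwidth_gb_s(64, 1600))    # 12.8 GB/s  (Wii U main DDR3, 64-bit)
```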
 

Hermii

Member
Oh I don't know. I've definitely seen some people that seem convinced that ARM is a foreign architecture to Nintendo's developers and that Nintendo should have used x86 all because Wii U didn't get ports and "x86 has super charged PC architecture"

Did people forget that PS3(to some extent) and Xbox 360 were using PPC as well? Their CPUs were designed by IBM.



So that explains this.



Yeah... No.

I don't know why you were basing your expectations off of dev-kit rumours and some handheld from a kickstarter that is using old ARM CPU and GPU chips. They use 28nm nodes.

Switch is likely to be using a custom Nvidia GPU which is likely to be designed well aside from the node being a 16nmFF for power efficiency. We already know how powerful a TX1 is and it's not unrealistic to see something within the performance of Parker or even better for power efficiency because it all comes down to design.

It's not unrealistic to expect 2x the performance of the Wii U in handheld mode when not only is there the hardware provided by Nvidia but also their software which is most likely to include their Tile-based deferred rasterizer.



https://blogs.nvidia.com/blog/2016/10/20/nintendo-switch/

And let's not forget that the Switch is possibly going to be utilising Vulkan in that NVN API which has shown to improve performance per watt.



https://news.samsung.com/global/see...y-s7-create-more-immersive-gaming-experiences

____

No, it is not unrealistic to expect 2x the performance of the Wii U while it is not docked. Thinking that the Switch will be only as powerful as the Wii U is the extreme opposite to those people that thought the Switch could be more powerful than an Xbox One or PS4.

Your arguments aren't convincing enough for the idea that everyone should lower their expectations when we've had expectations of the hardware being between the TX1 and Parker with regards to performance.

I believe that the "30 percent faster" Vulkan benchmark was compared to Android, not compared to a gaming console.


I really hope they come up with a balanced architecture this time. The Wii U was severely bottlenecked by its ancient CPU; it allowed Wii BC, but at the cost of not keeping up with the GPU and RAM setup (which sadly left the GPU starving and unable to perform better; the Wii U GPU is great imo). I really don't think they'll have problems surpassing the Wii U's raw power by a wide margin, if reports are true about the CPU being highly efficient.

I don't think the Wii U is unbalanced, it's just an overall weak console. The issue with the CPU was that it had very different strengths from the 360's CPU, which made it hard to port to. For games programmed for it, it's not as bad as its reputation. The GPU is not great; it's 176 GFLOPS.
 

ggx2ac

Member
I believe that the "30 percent faster" Vulkan benchmark was compared to Android, not compared to a gaming console.

I'm pretty sure it was running on Android on both sides of that comparison.

The difference is that it would have been running on OpenGL ES vs Vulkan for the comparison.
 

Hermii

Member
I'm pretty sure it was running on Android on both sides of that comparison.

The difference is that it would have been running on OpenGL ES vs Vulkan for the comparison.
The point still stands: I doubt Vulkan is anywhere near the same improvement on a gaming console, because consoles already have APIs that are close to the metal.
 

ggx2ac

Member
The point still stands: I doubt Vulkan is anywhere near the same improvement on a gaming console, because consoles already have APIs that are close to the metal.

Point taken.

The only benefit I see then is that Vulkan should make things easier for multiplatform games because the low-level coding can be optimised across any platform that supports Vulkan.
 

Thraktor

Member
Is there any reason why a serial interface for the game cards wouldn't be able to write save data onto the card? It seems that with an appropriate interface and controller this shouldn't be a problem. And it will very likely be a serial interface, since the game cards feature only 5 connections.

I also wouldn't worry too much about the difference between SD card and game card performance, since I still believe that games will be able to be downloaded to SD storage. They might enforce a certain class requirement, and loading times might change a bit, but it wouldn't go so far that games aren't playable anymore. This stuff can happen with different SD cards for 3DS and different HDDs/SSDs in PC/Wii U/PS4/XBO applications too, and it doesn't prevent a lot of games from being played.

A serial interface would be perfectly capable of writing data, even with few pins (from a theoretical point of view there's nothing stopping a two-pin interface from handling full serial read/writes). The M-PHY interface used for UFS (and a few other standards) uses four data pins for full-duplex read/write communication, although many implementations are dual-lane (i.e. 8 pins) for increased bandwidth.

The issue isn't really that implementing a read/write interface would be challenging, after all they could just license M-PHY and use that. The reason I would expect them to move to a read-only interface is because implementing on-card saves is both expensive and effectively redundant nowadays, and if you aren't going to be using on-card saves, then there's no point licensing an interface like M-PHY when you can just design a simpler and cheaper read-only interface yourself. On the Switch hardware the price difference is likely to be trivial, but if you can implement simpler control logic on the game cards themselves, then over the course of hundreds of millions of them the savings start to add up.

There's also just the simple issue of using more pins for reads. Many serial interfaces like M-PHY have separate send/receive pins in order to support full-bandwidth transfers in both directions, but on a system like Switch, where read requirements might be a couple of orders of magnitude higher than writes, those two write pins will mostly sit idle. Change them over to reads and you've got double the maximum bandwidth without any increase in the number of pins. Nintendo can also design a custom interface which specifically minimises the hardware requirements of the game-card side of the interface (i.e. things like allowing the interface to operate with Switch providing an explicit clock, removing the need for a PLL in the card logic).
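To put some entirely made-up numbers on that pin-reuse point: if each lane sustains the same rate, turning the write lane into a second read lane doubles read bandwidth without adding a pin.

```python
# Entirely illustrative: per-lane throughput is a placeholder, not a real
# M-PHY or game-card figure. The point is just that a read-only card can use
# all of its data lanes for reads.

LANE_MB_S = 150   # assumed sustained per-lane throughput

full_duplex_reads = 1 * LANE_MB_S   # 1 read lane + 1 (mostly idle) write lane
read_only_reads   = 2 * LANE_MB_S   # both lanes carrying reads, same pin count

print(full_duplex_reads, read_only_reads)   # 150 MB/s vs. 300 MB/s
```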

I doubt the cost will be so high that it will be decided on cost alone. The added benefit of on-card saves is exchangeability of save games if your Switch breaks down. Although, on the other hand, save games on the Switch itself could be compatible with the download version. They could also provide some sort of account tie-in for save files and allow them to be backed up personally or via the cloud. Some games could suffer from save game manipulation, though. The system already in place for the 3DS is restrictive, since it locks the backup to the SD card and system, so an account sign-in check could help alleviate that.

I'm expecting cloud saves to eliminate the need for card-based saves on this basis. Nintendo management have been talking very explicitly about moving towards cloud-saves for several years now, and as far as I recall they talked about it as one of the main factors behind their collaboration with DENA. At this point I'd be surprised if Switch didn't support cloud saves by default for anyone with a NNID.
 
If we are believing that rumor that the Switch uses 800MB of RAM for the OS, what does that mean for the potential share button applications?

I don't know exactly how it works, but the PS4 share button letting you share/view the last 15 minutes of gameplay is a very nice feature, and I'm assuming that requires a decent amount of system RAM, though I could very well be mistaken. But if I'm not, can we expect the Switch share button to not be able to share prior gameplay footage like that? Or will it just be dedicated to taking screenshots?

It would be a real shame if there is a whole button dedicated to taking/posting screenshots and nothing else...
 

Thraktor

Member
If we are believing that rumor that the Switch uses 800MB of RAM for the OS, what does that mean for the potential share button applications?

I don't know exactly how it works, but the PS4 share button letting you share/view the last 15 minutes of gameplay is a very nice feature, and I'm assuming that requires a decent amount of system RAM, though I could very well be mistaken. But if I'm not, can we expect the Switch share button to not be able to share prior gameplay footage like that? Or will it just be dedicated to taking screenshots?

It would be a real shame if there is a whole button dedicated to taking/posting screenshots and nothing else...

PS4's continuous video recording should have much more of an impact on the hard-drive than RAM. I can't imagine they're keeping the entire 15 minute loop in memory at all times.
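A quick size check shows why the 15-minute loop makes sense on storage but not in RAM; the bitrate below is just an assumed figure for hardware-encoded gameplay video, not a documented PS4 or Switch spec:

```python
# Why a rolling 15-minute recording is a storage problem, not a RAM problem.
# The encode bitrate is an assumption, not a documented console figure.

BITRATE_MBPS = 10          # assumed hardware-encoder bitrate (megabits/s)
MINUTES      = 15

size_mb = BITRATE_MBPS * 60 * MINUTES / 8   # megabytes for the full loop
print(f"~{size_mb:.0f} MB")                 # ~1125 MB: fine on disk/flash,
                                            # far too much to pin in system RAM
```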
 
PS4's continuous video recording should have much more of an impact on the hard-drive than RAM. I can't imagine they're keeping the entire 15 minute loop in memory at all times.

I understand that the video file itself would not be kept in the RAM, but doesn't the OS function of continuously recording/deleting/preparing the video depend on system RAM?
 

AzaK

Member
I understand that the video file itself would not be kept in the RAM, but doesn't the OS function of continuously recording/deleting/preparing the video depend on system RAM?

Not necessarily. I'm not sure how PS4 does it, but it could just read from the framebuffer.
 

LCGeek

formerly sane
The point still stands: I doubt Vulkan is anywhere near the same improvement on a gaming console, because consoles already have APIs that are close to the metal.

We are talking about Nintendo, who has literally been stuck ass-backwards for quite some time. It's a leap. Someone already said it: they use OpenGL ES, and that alone should clue you in as to how old it is.

Your comment is true; the problem is Nintendo doesn't use them.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
Not necessarily. I'm not sure how PS4 does it, but it could just read from the framebuffer.
An encoder reading the framebuffer directly would either impose a latency hit (read: fps hit), or require multi-ported memory (read: much more expensive). An encoder would be best piggybacked to something like the display interface fb scan-out, so the fb scans that are sent out to the DI would be also sent to the encoder at the same time.
 

AzaK

Member
An encoder reading the framebuffer directly would either impose a latency hit (read: fps hit), or require multi-ported memory (read: much more expensive). An encoder would be best piggybacked to something like the display interface fb scan-out, so the fb scans that are sent out to the DI would be also sent to the encoder at the same time.

Which would be cheaper, money wise between the two? How much hit would it be reading from the FB?

Anyway the point was memory use. Neither should really use any extra memory.
 
An encoder reading the framebuffer directly would either impose a latency hit (read: fps hit), or require multi-ported memory (read: much more expensive). An encoder would be best piggybacked to something like the display interface fb scan-out, so the fb scans that are sent out to the DI would be also sent to the encoder at the same time.

Which would be cheaper, money wise between the two? How much hit would it be reading from the FB?

Anyway the point was memory use. Neither should really use any extra memory.

That indeed was the question, thanks for answering. So regardless of the method used it shouldn't really impact system memory. Which makes me much more excited about that share button, because it really is a great little idea that Nintendo is smart to steal/implement.


Nothing, remember the Wii U uses 1GB for its OS, and it has no such feature....

We have seen (and insiders have confirmed that it is) a share button on the Switch, so there will be some sort of sharing features, regardless of OS RAM. Apparently recording video doesn't really impact RAM like I thought it did, so my original point is moot.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
Which would be cheaper, money wise between the two? How much hit would it be reading from the FB?
Piggybacking the Display Interface (DI) scan-out is essentially free.

Anyway the point was memory use. Neither should really use any extra memory.
Well, the encoder would need some buffers for its transformations, but that should not amount to a complete frame's worth of RAM.
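For scale, here's a rough comparison; the 1080p resolution and the 32-line working strip below are assumptions, just to show the orders of magnitude involved:

```python
# A full uncompressed frame vs. the small working strip an encoder might hold
# while processing the scan-out stream. 1080p and the 32-line strip are
# assumed figures for illustration only.

WIDTH, HEIGHT, BYTES_PER_PIXEL = 1920, 1080, 4

full_frame_mb = WIDTH * HEIGHT * BYTES_PER_PIXEL / 1e6   # ~8.3 MB per frame
strip_mb      = WIDTH * 32 * BYTES_PER_PIXEL / 1e6       # ~0.25 MB working strip

print(full_frame_mb, strip_mb)
```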
 
The question really has to be: what OS is Nintendo going to use? Surely they aren't going to build in the featureless Wii U OS or the dated 3DS OS.

Do we think there is any possibility Nintendo uses a form of Android, since it already works on Tegra? This would be similar to what the Kindle tablets do... they run a custom version of Android, I believe. Full-blown Android Nougat uses roughly 1.5 gigs of RAM on the new Pixel phone... I know that Nintendo wouldn't have as many processes to keep going in the background, so it could slim that down.
 