
Nintendo files patent application for cloud gaming devices

Extremetech's article says Nintendo has said they'll announce an NX release date at E3. That's just speculation though, right?

Unless they know something that Nintendo haven't shared publicly, it's speculation. All they've said officially is that they will talk more about NX in 2016.
 

ChaosXVI

Member
Yeah it's definitely speculation that they'll announce a release date at E3. I definitely think they'll say "Coming 2016" or "Coming 2017", but even with Wii U they didn't pin down the exact release date until September 2012, 2 months before release.
 

jblank83

Member
Ignoring cloud computing, this falls in line with Nintendo's focus on local multi.

Supplementing computation with the controllers means local multiplayer performance doesn't degrade no matter how many players you add (theoretically), as they all would be increasing processing capability to compensate for increased load. You can also accomplish multi without split screening if the other topic is correct and the controllers have screens on them.
 
Yeah it's definitely speculation that they'll announce a release date at E3. I definitely think they'll say "Coming 2016" or "Coming 2017", but even with Wii U they didn't pin down the exact release date until September 2012, 2 months before release.

The way they handled Wii U's release information, at least in the U.S., was horrendous, though. When did they announce the release? September? It was the latest date possible without missing the holiday season and the price was the upper limit of what everyone was predicting. It generated no hype whatsoever. Just a clusterfuck of a launch from start to finish.
 

AzaK

Member
Ah yes, that might limit things for yah. My professor was from New Zealand. I'd love to visit some time myself. Anyway, in such instances, you may be stuck using your own SCD and game console (whatever it may be) locally. The way they describe the SCD, however, it might just have one cable for power (and it may even be a relatively low power device). You could take it wherever you want to play.

Remember some other tidbits from Iwata (R.I.P.). I don't feel like digging up the quotes, as my mind is spinning from that controller patent, but there was one where he mentioned "new ways of payment for customers." This might allude to the option of buying an SCD or using point/cash/whatever to leech off someone else's.

There's also Iwata talking about "redefining a platform" and how anyone with an NNID would basically be a part of the Nintendo "platform." Well, wouldn't this make sense if users w/ an account could use the technology in this patent to play Nintendo games using others' SCD along w/ their own tablet, PC, "nintendocast", Wii U or whatever as the "game console/terminal"? As blu said, imagine if the Gamepad had even the power of a 2DS. Most smart device SoCs have enough juice that they could process control inputs, A/V out, and basic game instruction no sweat while the SCD takes care of the heavy lifting.

Totally, I think it's a bloody great idea and if they get close to realising what our imaginations are thinking of this could be amazing. I'm convinced enough that if there was someone that could offer a low latency connection to you that it would be amazing to be able to leverage their processors. Just not sure that practically it'd actually work out given the number of people etc. I'm also a little unconvinced that Nintendo has the chops/dedication to really realise its potential. They always seem to do things 1/2 way, especially with networking related things. Maybe DeNA can make it work?

And I'm all for physically extensible local console with supplemental same-subnet sharing; bring that shit on. Buy one console and then add shit over the years. One in my TV room, one in the bedroom. Wife and I can play different games if we want and if only one of us is playing we get extra juice to sexy up the game. What's not to like?
 

Kimawolf

Member
So while playing Xenoblade X, I had a thought: could these "SCDs" be used in future games to, say, add collision detection and physics? Have a baseline game like it is now, where you run through things and enemies, and by adding an SCD over WiFi you get collision and physics stuff, and with a physically connected device you could get higher textures along with the collision/physics?

Would that kind of set up be possible?
 

LordOfChaos

Member
So while playing Xenoblade X, I had a thought: could these "SCDs" be used in future games to, say, add collision detection and physics? Have a baseline game like it is now, where you run through things and enemies, and by adding an SCD over WiFi you get collision and physics stuff, and with a physically connected device you could get higher textures along with the collision/physics?

Would that kind of set up be possible?


If the reason the developer left those out was in fact performance-based, then sure, adding additional compute hardware could help. Heck, it would probably be the first thing they'd add.

The developer would still have to target the additional hardware and add the collision detection of course.
 

Somnid

Member
Nintendo announces fall dates in September. Anything else is a stupid lie from someone who clearly does not pay attention.
 

Kimawolf

Member
If the reason the developer left those out was in fact performance-based, then sure, adding additional compute hardware could help. Heck, it would probably be the first thing they'd add.

The developer would still have to target the additional hardware and add the collision detection of course.

This makes me much more excited than I was. It would definitely be a way to make games far more immersive.
 

AmyS

Member
[images]


^__^
 

Jezbollah

Member
Looks like free online gaming for Nintendo is a thing of the past. This kind of infrastructure doesn't pay for itself..
 

Pokemaniac

Member
Looks like free online gaming for Nintendo is a thing of the past. This kind of infrastructure doesn't pay for itself..

That's one of the interesting things about this setup, though. Consumers are the ones who are going to be paying for the hardware directly. There may be something of a fee for users to use them over the cloud, but, if you buy one for yourself, there's no need to pay for any online services since you have the hardware in your own home.
 

LordOfChaos

Member
Looks like free online gaming for Nintendo is a thing of the past. This kind of infrastructure doesn't pay for itself..

I'm not so sure? It's user-owned hardware, on a system that seems peer to peer. If anything this lessens their online costs? Or at least doesn't increase them, if they still have their servers as always for MP, with this for cloud-like assistance.

If anything this seems like passing the cloud cost to consumers.

I wonder how this would play out over time. Ideally you'd want enough users to keep the assist system plugged in and running to support people who hop on at the end of its life or plug it in whenever. I wonder if support would be better or worse than Nintendo's history (like the Wii online shutdown).
 

AzaK

Member
Looks like free online gaming for Nintendo is a thing of the past. This kind of infrastructure doesn't pay for itself..

Maybe but we have to be realistic; as you say it costs to run servers and all that shit. If it ends up being like PS+ I'm OK with that. I get to play so many games that I wouldn't purchase otherwise and get a tonne of value from it.
 

heidern

Junior Member
I'm not so sure? It's user-owned hardware, on a system that seems peer to peer. If anything this lessens their online costs? Or at least doesn't increase them, if they still have their servers as always for MP, with this for cloud-like assistance.

If anything this seems like passing the cloud cost to consumers.

I wonder how this would play out over time. Ideally you'd want enough users to keep the assist system plugged in and running to support people who hop on at the end of its life or plug it in whenever. I wonder if support would be better or worse than Nintendo's history (like the Wii online shutdown).

The whole point of this is that P2P gives low latency and thus better performance than the cloud. The cloud would pretty much be redundant and they wouldn't bother making their own servers. If you want extra power but live in a remote area with no one to P2P with you can just buy an SCD. The passing of the cloud cost to consumers is also why the patent has a system to reward users with things like free games for allowing their SCD to be part of the cloud.

Also as I see it there wouldn't be an end of life. Nintendo could just release more powerful SCDs every year and all the different devices would be compatible.
 

LordOfChaos

Member
The whole point of this is that P2P gives low latency and thus better performance than the cloud. The cloud would pretty much be redundant and they wouldn't bother making their own servers. If you want extra power but live in a remote area with no one to P2P with you can just buy an SCD. The passing of the cloud cost to consumers is also why the patent has a system to reward users with things like free games for allowing their SCD to be part of the cloud.

Also as I see it there wouldn't be an end of life. Nintendo could just release more powerful SCDs every year and all the different devices would be compatible.

Yeah, so there you go if our guesses are right. I don't see why this would be the end of free Nintendo online, this would be the beginning of longer free support if anything.
 
Looks like free online gaming for Nintendo is a thing of the past. This kind of infrastructure doesn't pay for itself..

Lol, P2P means it literally does pay for itself. At most, Nintendo's servers will facilitate the matchmaking. Even with the Wii U, their multiplayer games are all P2P.
 

AmyS

Member
Don't forget the PS3 demonstrated something not entirely unlike the potential of the Supplemental Computing Devices.

GT5 running at 240 FPS *or* GT5 rendered at 4K (3840 x 2160) resolution.

[screenshots of the GT5 multi-PS3 tech demo]


http://gizmodo.com/5094334/ps3-tech-demo-runs-gran-turismo-5-in-2160p-or-240-fps
http://kotaku.com/5093592/gran-turismo-5-prologue-running-at-240-fps--3840x2160-resolution

This was done locally with 4 PS3s, though, not in the cloud. But there's no reason Nintendo NX consoles couldn't potentially have one or more Supplemental Computing Devices linked up, physically right next to the console.
 

LordOfChaos

Member
Don't forget the PS3 demonstrated something not entirely unlike the potential of the Supplemental Computing Devices.

GT5 running at 240 FPS *or* GT5 rendered at 4K (3840 x 2160) resolution.

[screenshots of the GT5 multi-PS3 tech demo]


http://gizmodo.com/5094334/ps3-tech-demo-runs-gran-turismo-5-in-2160p-or-240-fps
http://kotaku.com/5093592/gran-turismo-5-prologue-running-at-240-fps--3840x2160-resolution

This was done locally with 4 PS3s, though, not in the cloud. But there's no reason Nintendo NX consoles couldn't potentially have one or more Supplemental Computing Devices linked up, physically right next to the console.


From the picture there it looks like they were doing simple split frame rendering in this concept? Each PS3 got a quadrant.

That works, but it's not the most efficient, since as we discussed above any part of the screen could be rendering all the cool crap while another has a wall or floor or sky etc.

Hitting a target framerate with that solution would take more developer work, as any one PS3 in the chain could go over budget causing no frame to be output.

If they were all at a fraction of their peak power after the division and had no issues rendering a quarter of the screen every 16ms then sure, it would be easier. But there are more modern technologies for splitting the work in SLI/Crossfire already that the NX may borrow from.

Also: Lol, they had each PS3 doing 1080p there, how cute.
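For reference, the quadrant split being described would look something like this (a toy sketch of the idea, not how the demo was actually coded; all names are mine):

```python
# Four nodes, each responsible for one fixed quadrant of a 4K frame,
# so each renders a 1920x1080 tile - as in the PS3 demo pictured above.
WIDTH, HEIGHT = 3840, 2160

def quadrant(node: int) -> tuple:
    """(x0, y0, x1, y1) of the tile assigned to node 0..3."""
    col, row = node % 2, node // 2
    w, h = WIDTH // 2, HEIGHT // 2
    return (col * w, row * h, (col + 1) * w, (row + 1) * h)

for n in range(4):
    print(n, quadrant(n))
```

The static assignment is exactly why the load imbalance problem above exists: the tile boundaries never move, no matter where the expensive geometry ends up.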
 

AmyS

Member
From the picture there it looks like they were doing simple split frame rendering in this concept? Each PS3 got a quadrant.

That works, but it's not the most efficient, since as we discussed above any part of the screen could be rendering all the cool crap while another has a wall or floor or sky etc.

Hitting a target framerate with that solution would take more developer work, as any one PS3 in the chain could go over budget causing no frame to be output.

If they were all at a fraction of their peak power after the division and had no issues rendering a quarter of the screen every 16ms then sure, it would be easier. But there are more modern technologies for splitting the work in SLI/Crossfire already that the NX may borrow from.

Also: Lol, they had each PS3 doing 1080p there, how cute.

I don't really know how this works technically, but the fact that it was done locally in both cases (240fps and 4K) and didn't involve the internet / cloud makes me hopeful Nintendo could pull off a hardcore gaming option for NX consoles, even though games would have to work with just the basic console hardware.
 

Thraktor

Member
I hear ya but......I'm in New Zealand. Us and I imagine MANY other countries won't have the density you suggest.

Whether you're in New Zealand or Ireland or Norway or wherever doesn't hugely matter. If you're using an ADSL-derived fixed connection technology (which 69% of New Zealanders are), then you're operating on basically the same network topology as anyone using ADSL anywhere else in the world, because they're all using the infrastructure of the public switched telephone networks that were there before them. And back in the old days of PSTNs, there was a practical minimum limit on the size of a telephone exchange, as it wouldn't have made economic sense to staff and equip an office for a low number of connections, not when a range-extended line running at 100V+ could reach in excess of 10km. On the other hand, there were also practical limits on the maximum number of connections to an exchange, both in terms of cable routing and the physical switching machinery, so every exchange, whether rural or urban, would pretty much have to fall within the same range of connections. Exactly where that range lies I can't say without more data*, but it's going to be pretty much the same throughout the developed world.

If you are on a rural exchange you will likely have a longer local loop to the exchange, which won't help your latency, but the 20ms round-trip between two connections on the same exchange is basically worst-case rural ADSL connection anyway, so it's not going to be any different to how I described above. (The increased distances from one exchange to another are relatively trivial, as they'll all be connected by high-quality fibre lines, and even with exchanges located 100km apart you'd still be looking at an added round-trip propagation delay of just 1ms.)

*This page claims there are about 5600 exchanges in the UK, and with 59% of the UK on ADSL, you'd be looking at a little over 6,700 people per exchange, or over 3 times as many as I assumed in my estimate. This is only one country, though, and doesn't give me any info on the range (obviously some exchanges will be larger, and some smaller), so I don't want to infer too much from it.
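The 1ms round-trip figure for exchanges 100km apart checks out with a quick back-of-envelope calculation, assuming signal speed of roughly 2/3 c in fibre (the function name is mine):

```python
# Pure propagation delay over fibre, ignoring switching/queueing overhead.
SPEED_IN_FIBRE_M_PER_S = 2.0e8  # roughly 2/3 the speed of light

def round_trip_propagation_ms(distance_km: float) -> float:
    """Round-trip propagation delay in milliseconds."""
    one_way_s = (distance_km * 1000) / SPEED_IN_FIBRE_M_PER_S
    return 2 * one_way_s * 1000

print(round_trip_propagation_ms(100))  # 1.0 ms for exchanges 100km apart
```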
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
From the picture there it looks like they were doing simple split frame rendering in this concept? Each PS3 got a quadrant.

That works, but it's not the most efficient, since as we discussed above any part of the screen could be rendering all the cool crap while another has a wall or floor or sky etc.

Hitting a target framerate with that solution would take more developer work, as any one PS3 in the chain could go over budget causing no frame to be output.
Your observation is correct (and is a fundamental problem in distributed workloads), but you miss one thing: if every tile has a guaranteed throughput of N fps, then the final combination of tiles also has a guaranteed N fps. It doesn't matter that one of the tiles (read: nodes) would have done 99% of the actual work, and the rest of the tiles would have twiddled their thumbs most of the frame time. That's ok because you're not looking to optimise your combined throughput ad infinitum, but just to a certain point, i.e. N fps.
 

AzaK

Member
Whether you're in New Zealand or Ireland or Norway or wherever doesn't hugely matter. If you're using an ADSL-derived fixed connection technology (which 69% of New Zealanders are), then you're operating on basically the same network topology as anyone using ADSL anywhere else in the world, because they're all using the infrastructure of the public switched telephone networks that were there before them. And back in the old days of PSTNs, there was a practical minimum limit on the size of a telephone exchange, as it wouldn't have made economic sense to staff and equip an office for a low number of connections, not when a range-extended line running at 100V+ could reach in excess of 10km. On the other hand, there were also practical limits on the maximum number of connections to an exchange, both in terms of cable routing and the physical switching machinery, so every exchange, whether rural or urban, would pretty much have to fall within the same range of connections. Exactly where that range lies I can't say without more data*, but it's going to be pretty much the same throughout the developed world.

If you are on a rural exchange you will likely have a longer local loop to the exchange, which won't help your latency, but the 20ms round-trip between two connections on the same exchange is basically worst-case rural ADSL connection anyway, so it's not going to be any different to how I described above. (The increased distances from one exchange to another are relatively trivial, as they'll all be connected by high-quality fibre lines, and even with exchanges located 100km apart you'd still be looking at an added round-trip propagation delay of just 1ms.)

*This page claims there are about 5600 exchanges in the UK, and with 59% of the UK on ADSL, you'd be looking at a little over 6,700 people per exchange, or over 3 times as many as I assumed in my estimate. This is only one country, though, and doesn't give me any info on the range (obviously some exchanges will be larger, and some smaller), so I don't want to infer too much from it.

I wasn't talking about connection quality per se. I was talking about the number of people close to me that could offer me a decent connection to an SCD. Basically there will be one other Nintendo owner in the country, on Stewart Island that gets their internet by fishing boat.
 

Thraktor

Member
I wasn't talking about connection quality per se. I was talking about the number of people close to me that could offer me a decent connection to an SCD. Basically there will be one other Nintendo owner in the country, on Stewart Island that gets their internet by fishing boat.

My point isn't about the quality of connection; it's that relative population density doesn't really make all that much difference to the feasibility of the scheme. The important metric is the network distance between you and the NX+, not the physical distance, and that network distance won't vary all that much between high-density and low-density countries.

But back to my original point, they really don't need to sell that many NX+'s for the service to be widely available. By my above calculations, you would only need to sell to 0.15% of the population to reach a 90% probability that a player would be able to find their game on an NX+ on their exchange or a neighbouring one. In New Zealand that's under 7,000 units. I haven't been able to find any sales numbers for New Zealand, but I think it's safe to say if you can't sell 7,000 units it's barely worth shipping them over. The Xbox One in Japan is the only console in any territory I've been able to find data on that has (as of yet) failed to sell through to at least 0.15% of the population, and even the Xbox360 exceeded the threshold by a factor of almost 10, while being universally considered a huge failure in the country.
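The shape of that 0.15% calculation is easy to sanity-check; the ~2,000 reachable connections below is my own placeholder for "your exchange plus a neighbouring one", not the exact figure used above:

```python
def p_at_least_one_scd(penetration: float, households: int) -> float:
    """Chance at least one of `households` connections owns an NX+."""
    return 1 - (1 - penetration) ** households

# 0.15% penetration over ~2,000 reachable connections:
print(round(p_at_least_one_scd(0.0015, 2000), 2))  # ~0.95
```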
 

AzaK

Member
My point isn't about the quality of connection; it's that relative population density doesn't really make all that much difference to the feasibility of the scheme. The important metric is the network distance between you and the NX+, not the physical distance, and that network distance won't vary all that much between high-density and low-density countries.

But back to my original point, they really don't need to sell that many NX+'s for the service to be widely available. By my above calculations, you would only need to sell to 0.15% of the population to reach a 90% probability that a player would be able to find their game on an NX+ on their exchange or a neighbouring one. In New Zealand that's under 7,000 units. I haven't been able to find any sales numbers for New Zealand, but I think it's safe to say if you can't sell 7,000 units it's barely worth shipping them over. The Xbox One in Japan is the only console in any territory I've been able to find data on that has (as of yet) failed to sell through to at least 0.15% of the population, and even the Xbox360 exceeded the threshold by a factor of almost 10, while being universally considered a huge failure in the country.

Did your calculations take availability into account? That is, if most people play after work, they're going to want to use their NX+ themselves and not hand it over to others. The calculation would have to account for that as well.

I'm not saying it's not possible, I'm just not convinced that it would be worth the cost of building all the complicated management software infrastructure of a system like that into a home console. Nintendo doesn't have a good track record in that department.

I can however see them supporting physically connected SCDs to enhance the power of your console.
 

LordOfChaos

Member
Your observation is correct (and is a fundamental problem in distributed workloads), but you miss one thing: if every tile has a guaranteed throughput of N fps, then the final combination of tiles also has a guaranteed N fps. It doesn't matter that one of the tiles (read: nodes) would have done 99% of the actual work, and the rest of the tiles would have twiddled their thumbs most of the frame time. That's ok because you're not looking to optimise your combined throughput ad infinitum, but just to a certain point, i.e. N fps.

For frame times, yeah that's true enough, but in your example you'd then basically be limited by the performance of one box if it did 99% of the work. So the buddy boxes wouldn't be adding much to the capability of the entire system.

It doesn't *have* to be split frame rendering though, of course, so these theoreticals may not end up meaning much. It could use alternate frame rendering, which would see less variance in how much work each GPU has, or even more sophisticated methods.

SFR was only popular in the very early SLI days, now it's all AFR or newer stuff, I only brought it up because of the PS3 example in the picture being quaint :p
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
For frame times, yeah that's true enough, but in your example you'd then basically be limited by the performance of one box if it did 99% of the work. So the buddy boxes wouldn't be adding much to the capability of the entire system.

It doesn't *have* to be split frame rendering though, of course, so these theoreticals may not end up meaning much. It could use alternate frame rendering, which would see less variance in how much work each GPU has, or even more sophisticated methods.

SFR was only popular in the very early SLI days, now it's all AFR or newer stuff, I only brought it up because of the PS3 example in the picture being quaint :p
I'm not sure I was understood, so let me elaborate.

Let's say you have two nodes in the system, of equal performance, handling one tile of screen space each. If each of the nodes can sustain e.g. 30fps per its own tile under all circumstances, then the combination of two tiles (i.e. twice the resolution) can sustain 30fps. It doesn't matter that in some occasions all the content might be in one of the tiles, whereas the other would be absolutely barren - the combined output is still 30fps per the two tiles, together.

That's not the case when we are not targeting fixed fps, but instead we want to achieve as high fps as possible. Then under situations of workload disparity between the two tiles, the backing nodes would be used inefficiently - one node would finish its tile early, and would idle while waiting for the other node to finish all the hard work*. Ergo the fps would not be as high as otherwise possible. But that matters only when we aim for as high fps as possible - that's not the case in game scenarios.

In the case of the GT tech demonstrator, their setup allowed them to bump the effective game output resolution up N times by connecting N nodes, exactly because each single node had a guaranteed fps for its portion of the screen, no matter what.

* Yes, there are various techniques to address that in situations where uncapped, maximum throughput is sought at all times.
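The guarantee argument in miniature: the combined frame is ready when the slowest tile is, so as long as every node stays inside its own budget, the whole frame does too, however lopsided the split (numbers below are made up):

```python
BUDGET_MS = 33.3  # 30 fps target

def combined_frame_time_ms(tile_times_ms):
    """The frame completes when the slowest tile does."""
    return max(tile_times_ms)

# One node does nearly all the work, the other mostly idles - still on time:
print(combined_frame_time_ms([33.0, 1.0]) <= BUDGET_MS)  # True
```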
 

LordOfChaos

Member
I understood you, say,

System 1, able to poop out a frame in under 16.66ms, workload 1N
System 2, 16.66ms, workload 1N
System 3, 16.66ms, workload 10N

In this case, you'd still get your 60FPS, but it's inelegant, as you're not adding much to the totality of what could be drawn on screen, which would be what such an additive system is for, right? It wouldn't be much different from system 3 just drawing/computing all 12N of the workload on its own, if it could handle that within budget.

AFR is better in this regard, as then you'd effectively have each system in my weird 3-tier example working with ~50ms between each frame being expected of it. Or doing something more asynchronous, like system 2 handling all the GPGPU compute or something. Late life for the Cell was sometimes spent with an SPU (or at least one) doing culling for the GPU, for example.
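The inelegance in the three-box example is easy to quantify (arbitrary work units, framing is mine):

```python
# If one box shoulders 10N of a 12N frame, the helper boxes barely matter:
workloads = [1, 1, 10]          # tiles handed to systems 1-3
big_box_alone = max(workloads)  # what system 3 draws by itself

speedup = sum(workloads) / big_box_alone
print(speedup)  # 1.2: only 20% more on screen than system 3 alone
```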
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
More like only #3 would spend 16ms, whereas the rest would all spend 1/10th of that. But yes, you got the general issue right.

Unfortunately, there's no silver-bullet solution to that problem. Particularly not when the total workload is rigidly pre-distributed across the workers. For instance, AFR has its own set of problems, just because temporal locality is not so different from spatial locality. Actually, increasing the number of nodes presents AFR with a similar issue to distributed SFR - as you have many nodes working on some long-ish frame sequence, the chances increase that one of them is doing a frame which is much heavier than what the other nodes are working on. Keep in mind AFR delivers its output to what is essentially a ring-buffer (e.g. a triple-buffer for dual-unit AFR) - then one node 'stalling' can eventually stall the entire ring-buffer, not because there are no other available nodes to get more work done, but because the ring-buffer would de-sync* - frame order would not be guaranteed monotonic.

Of course there are more advanced attempts at solving that issue, where work is not rigidly pre-distributed but dynamic load-balancing decisions are applied, e.g. nodes can 'steal' from each other's work-piles (AKA 'work-stealing'), but that also comes at a price - e.g. increased maintenance traffic across the nodes.

At the end of the day, you need to know what viable goals you can distribute across your workers. If that is 'we have a game that runs stably at res N on one node, let's bump that res up to 2N across the available 2 nodes', then so be it!

* yes, that could be addressed by 'timestamps' and a re-sync phase, which again, might not be free.
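The 'work-stealing' idea mentioned above, in toy form: each node drains its own deque of jobs first, then steals from the back of the busiest neighbour's pile. (The job names and two-node setup are invented for illustration.)

```python
from collections import deque

# Node 0 gets 1 job, node 1 gets 6 - a deliberately lopsided split.
queues = [deque(f"tile{i}.{j}" for j in range(n)) for i, n in enumerate([1, 6])]

def next_job(node: int):
    if queues[node]:
        return queues[node].popleft()  # own work first
    victim = max(range(len(queues)), key=lambda q: len(queues[q]))
    return queues[victim].pop() if queues[victim] else None  # steal from the back

print(next_job(0))  # node 0's own job: tile0.0
print(next_job(0))  # node 0 is now empty, so it steals tile1.5 from node 1
```

The maintenance cost mentioned above shows up here as the victim scan; in a real distributed setup that scan is network traffic, not a local `max`.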
 

Diffense

Member
Not finished skimming through this thread yet, but it seems as if this could be related to another patent posted on NeoGAF: "stationary game console without optical disk".

Here are a few points describing the console in the other patent:
1. A stationary game apparatus, comprising: an internal hard disk drive storing a program and/or data; a communication unit transmitting/receiving a program and/or data via a network; and a processor executing a program stored in the hard disk drive to perform game processing, wherein the game apparatus is not provided with an optical disk drive.

13. The stationary game apparatus according to claim 1, wherein the game apparatus is compatible with another game apparatus comprising an optical disk drive for reading out a program and/or data from an optical disk, and a processor for executing the program read out from the optical disk to perform game processing, and an interface for the hard disk drive is same as an interface for the optical disk drive in said another game apparatus.

17. A non-transitory recording medium recording a computer program executed by the processor of the stationary game apparatus according to claim 1, wherein the computer program includes a processing routine for the stationary game apparatus, and one or more processing routines for a game apparatus having a hardware configuration different from the stationary game apparatus, and the computer program causes the processor to operate as an identification information obtaining unit obtaining identification information of a game apparatus, and a selection unit selecting a processing routine in accordance with the identification information obtained by the identification information obtaining unit.

26. A non-transitory recording medium recording a computer program executed by processors of a plurality of types of game apparatuses with different hardware configurations, wherein the computer program includes a plurality of processing routines for each of the plurality of types of game apparatuses, and the computer program causes the processor to operate as an identification information obtaining unit obtaining identification information of a game apparatus; and a selection unit selecting a processing routine in accordance with the identification information obtained by the identification information obtaining unit.

[0003] In recent years, high-speed communication such as ADSL (Asymmetric Digital Subscriber Line) or optical communication has widely been spread. Such high-speed communication is utilized to allow a server apparatus or the like to distribute a game program to a game apparatus. When a game program is obtained through communication, a user can enjoy playing a game using a game apparatus without purchasing a recording medium such as an optical disk.

[0004] According to an aspect of the embodiment, a stationary game apparatus includes an internal hard disk drive storing a program and/or data, a communication unit transmitting/receiving a program and/or data via a network, and a processor executing the program stored in the hard disk drive to perform game processing, while not including an optical disk drive reading out a program and/or data from an optical disk.

More here: http://www.neogaf.com/forum/showthread.php?t=1099932

Basically, that "stationary console without optical disk drive" seems like it could fit nicely in the role of a "supplementary computing device". The patent claims it has a hard drive and a processor and is compatible with another game apparatus which has an optical disc drive and a processor [13]. [17] and [26] also seem to hint at functions that the SCD of this thread's patent should be able to perform. Basically, it needs to determine what code to run on behalf of the devices that are connected to it, i.e, those using it as a "supplementary computing device". The final points simply describe a game console that obtains all the game content digitally.

It seems possible that there will be multiple SKUs, all of them capable of acting as SCDs. The box without an optical disk drive would be a console in its own right, but likely cheaper and more compact; people willing to go digital-only could buy this. When you're not using your console, you could leave it connected to the internet in sleep mode, during which it can be used as an SCD if you enable that permission. If it is used, you get My Nintendo points or some other reward. If you happen to have multiple consoles, for example during a local multiplayer session where everyone brought their own, you could probably connect them together and get separate full-resolution outputs instead of split screen. That might be a use case for the local connection (a Splatoon 2 tournament setup that doesn't require internet).
 

Neoxon

Junior Member
I apologize for the bump, but I'd like to offer my thoughts on the patent. I do recognize the potential of such a patent (Nintendo could release a supplementary device for the NX console to boost its power), but my issue is the input lag that comes with cloud computing. This could be a major problem for games that require pin-point timing yet may want that extra power, particularly fighting games. There's also the issue of requiring this for the more graphically intensive games, more so when you consider those without a decent internet connection or with data caps. It's a cool idea, but until America steps up its internet game, I don't think this will work as well as I'd hope. And the input lag problem would limit the use of a potential supplementary device that could realize this patent.
 

Pokemaniac

Member

While lag is inherent to network communication, it will not necessarily be an issue with these devices. Some work, like online multiplayer, needs to go over the network anyway, so no additional latency is introduced there. Certain tasks are also just less latency-sensitive than others. And there is the configuration where the SCD is connected to the console via a direct wired connection, where there will be no appreciable latency. Even in the worst case, SCDs should still have less latency than traditional cloud servers, since they would sit closer to users on the network.

The important thing to remember is that cloud computing, and especially these boxes, can be very versatile. There are many potential use cases where you aren't simply running the entire game on a remote server.
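One way to picture that "not all or nothing" split: a game could sort its tasks by latency deadline and only offload the tolerant ones. A minimal sketch, with the round-trip figure and task deadlines entirely made up for illustration:

```python
# Hypothetical sketch: place each task on the console or a remote SCD
# depending on whether its deadline can absorb a network round trip.
# The RTT and deadlines below are invented numbers, not measurements.

REMOTE_RTT_MS = 40   # assumed round trip to a nearby remote SCD
FRAME_MS = 16        # one frame at 60 fps

def place_task(name, deadline_ms):
    """Run a task remotely only if its deadline can absorb the round trip."""
    if deadline_ms < REMOTE_RTT_MS + FRAME_MS:
        return "console"      # too latency-sensitive for the network
    return "remote_scd"       # background work tolerates the RTT

tasks = {
    "player_input": 16,       # must land in the current frame
    "crowd_ai": 500,          # background simulation, updated lazily
    "world_streaming": 2000,  # prefetch, very tolerant
}

placement = {name: place_task(name, ms) for name, ms in tasks.items()}
print(placement)
```

Under these assumed numbers, input handling stays on the console while crowd AI and world streaming could go to an SCD, which is roughly the "background details" split described above.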
 

Neoxon

Junior Member
My concerns are primarily with offline multiplayer games that take advantage of such a device, where the added input lag would interfere with your gameplay.
 

Pokemaniac

Member

If there is a local device connected to a console through a sufficiently high-bandwidth port, then the difference is negligible. It would be exactly the same as the coprocessors that Nintendo used to put in cartridges.

For a handheld, you could get the same benefits tethered, or have about the same latency as a Wii U GamePad over wireless.

If you're talking about using a remote SCD, latency sensitive games (read: Smash Bros) would be designed to minimize utilization of remote SCDs for latency sensitive tasks. Cloud utilization isn't all or nothing. There are various shades of gray where work can be put into background details that don't impact the main gameplay.

If there's ever a case where your local hardware isn't good enough to run the game without lag, the solution will likely be to buy your own SCD.
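The preference order implied here (a directly attached SCD beats a remote one, which beats having none) could be sketched like this; the SCD descriptors and latency figures are invented for illustration:

```python
# Hypothetical sketch of SCD selection: prefer a directly attached (wired)
# SCD, fall back to a remote one, otherwise the console runs everything.

def pick_scd(available):
    """available: list of dicts like {"kind": "wired", "latency_ms": 1}."""
    by_kind = {scd["kind"]: scd for scd in available}
    for kind in ("wired", "remote"):  # wired beats remote; remote beats none
        if kind in by_kind:
            return by_kind[kind]
    return None  # no SCD at all: the console shoulders the full workload

scds = [
    {"kind": "remote", "latency_ms": 40},  # assumed nearby network SCD
    {"kind": "wired", "latency_ms": 1},    # assumed box plugged into the console
]
print(pick_scd(scds))  # the wired box wins
```

In other words, buying your own SCD just moves you up this ladder; games that can't tolerate remote latency would only ever use the wired rung.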
 

BuggyMike

Member
Decided to go back and read this again after reading the rumors, and this makes much more sense in the context of a portable console with a dock. BYOP suggested the dock could be the SCD, which would come with every device. That patent about the games console that uses cartridges, as well as a screen controller, was spot on if the rumors are true. Wonder if this one will be too.
 

AmyS

Member
Between the SCD patent and what is possible with GeForce Now, Nintendo has plenty of options for bringing games to NX that are beyond the capabilities of the hardware built into the system alone (Tegra X1, X2 etc).

Edit: Digital Foundry impressions on GeForce Now cloud streaming as of October 2015 https://www.youtube.com/watch?v=Cq_LLPZfwrs

No doubt GeForce Now will be even better in 2017. As for the SCD, hopefully there can be different SCDs, some with a single Tegra and some with two, plus storage, assuming the SCD is implemented in actual products.
 

ksamedi

Member
I came across this thread and it's an interesting read. With the info we got from the Foxconn leak, I think Nintendo has some pretty concrete plans to make this a reality. Exciting times ahead.
 

I think it's possible that they may well have taken steps behind the scenes to future-proof their modular solution and address the part of the market that wants higher specs sooner rather than later, if it indeed comes to that.
 