
Nintendo's Supplemental Computing Devices patent allowed by USPTO, rejection cleared*

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
If they used the color buffer compression from GCN 1.2 (which should be invisible to software, so they should be able to include it without any BC issues), then, according to AMD, they should be able to get away with 40% less bandwidth. And 176*2*0.6 = 211, so they'd theoretically be almost exactly doubling the effective bandwidth.
That's rather optimistic.

For instance, in compute kernels (i.e. sans ROP resolves), that's 40% less BW for read-only color buffers alone, not for all color-buffer accesses. So a compute kernel that uses as few as two RT textures with bilinear filtering will use 8 uncompressible texel reads for every output.

ed: sorry, that example came out rather convoluted and also wrong. The output from the compute kernel would be uncompressed. For the inputs it would depend whether they came from compute or fragment kernels. Point being, any data that is both (1) dynamic and (2) does not originate from ROP resolves will be uncompressed.
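To put rough numbers on the compression argument, here's a quick back-of-envelope in Python. The 40% saving and the doubled-bus figure come from the quote above; the compressible fraction is a made-up knob to illustrate the caveat, not a measured figure:

```python
# Back-of-envelope check of the claim above: with delta color compression
# saving 40% of color-buffer bandwidth, a bus moving 176*2*0.6 ~= 211 GB/s
# of physical traffic would carry ~352 GB/s of logical traffic, i.e.
# roughly double PS4's 176 GB/s. The caveat: only the compressible
# fraction of traffic (ROP-written color data) gets the saving.

def effective_bandwidth(physical_gbs, dcc_saving=0.40, compressible_fraction=1.0):
    # Average physical bytes actually moved per logical byte requested.
    cost = compressible_fraction * (1.0 - dcc_saving) + (1.0 - compressible_fraction)
    return physical_gbs / cost

physical = 176 * 2 * 0.6  # 211.2 GB/s, the figure in the quote

print(round(effective_bandwidth(physical, compressible_fraction=1.0), 1))  # 352.0: the optimistic case
print(round(effective_bandwidth(physical, compressible_fraction=0.5), 1))  # 264.0: only half the traffic compressible
```

As the second line shows, the "doubling" only holds if essentially all traffic is compressible; with a mixed workload the gain shrinks fast.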

ed': no 14nm no Polaris, Vena - I'm sure I've mentioned that before ; )

/sorry Rosti, won't be hijacking any further
 

Vena

Member
That's rather optimistic.

For instance, in compute kernels (i.e. sans ROP resolves), that's 40% less BW for read-only color buffers alone, not for all color-buffer accesses. So a compute kernel that uses as few as two RT textures with bilinear filtering will use 8 uncompressible texel reads for every output.

/sorry Rosti, won't be hijacking any further

But I like reading these! :p What's your opinion on the Polaris rumor? Or do you think it's GCN 1.1/1.2?
 

Kimawolf

Member
So, for the more tech-savvy of you: could they use a PCIe connection to allow external GPUs/CPUs, or is there a cable fast enough to work with it?
 

MrBigBoy

Member
How much bandwidth can the X1's Kinect port handle? Maybe we can use that as a reference, but I suppose that port isn't fast enough for something like an SCD.
 

bomblord1

Banned
How much bandwidth can the X1's Kinect port handle? Maybe we can use that as a reference, but I suppose that port isn't fast enough for something like an SCD.

We're probably looking at something along the lines of a PCI-E slot or some proprietary alternative.
 

Manoko

Member
I think the SCD will indeed provide an increase in processing power in the form of hardware, but most of its use and the "future-proofing" will come from Nintendo's cloud-based service described in the patent.

Tell me if I'm wrong, but it seemed that the SCD basically connects to other SCDs based on proximity (for ping reasons), allowing them to help through cloud computing with various demanding tasks.
An SCD closer to yours, from a different user, would take charge of things like AI or physics, and its owner would be rewarded through the My Nintendo program for the hours spent "helping" other users while not playing, just as described in the patent, if I remember correctly.

In this way, you'd only need to buy one SCD, and the hardware in the NX and SCD could be cheap, drastically reducing the cost of entry and hopefully allowing the install base to be significant from the get-go, so everyone starts benefiting from this computing solution.

In fact, the computing power of your own console would not be limited; a cloud-based service with the servers all around you sounds extremely exciting to me, and potent.

I really think it's a genius idea if implemented like that.
If you're not connected to the internet, you'd have PS4 levels of computing power; but if connected to other NX users through the SCD, a much more powerful console at no additional cost (well, aside from the SCD itself, which could be very cheap).
 

bachikarn

Member
I think the SCD will indeed provide an increase in processing power in the form of hardware, but most of its use and the "future-proofing" will come from Nintendo's cloud-based service described in the patent.

Tell me if I'm wrong, but it seemed that the SCD basically connects to other SCDs based on proximity (for ping reasons), allowing them to help through cloud computing with various demanding tasks.
An SCD closer to yours, from a different user, would take charge of things like AI or physics, and its owner would be rewarded through the My Nintendo program for the hours spent "helping" other users while not playing, just as described in the patent, if I remember correctly.

In this way, you'd only need to buy one SCD, and the hardware in the NX and SCD could be cheap, drastically reducing the cost of entry and hopefully allowing the install base to be significant from the get-go, so everyone starts benefiting from this computing solution.

In fact, the computing power of your own console would not be limited; a cloud-based service with the servers all around you sounds extremely exciting to me, and potent.

I really think it's a genius idea if implemented like that.
If you're not connected to the internet, you'd have PS4 levels of computing power; but if connected to other NX users through the SCD, a much more powerful console at no additional cost (well, aside from the SCD itself, which could be very cheap).

Considering MS hasn't even been able to get cloud computing working yet, I'm really skeptical that Nintendo will be able to.
 

Instro

Member
USB 3.1? No. PCI-E 3.0 can do 32 Gbps; USB 3.1 can only do about 10.

Maybe Thunderbolt 3? That can do 40.

Wouldn't Thunderbolt 3 make the most sense? It's already being used in the PC market for external GPUs. AMD has the whole XConnect based around it.
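For a sense of scale, here's a quick Python sketch of what those link speeds mean per rendered frame at 60 fps (nominal signalling rates, not real-world usable throughput, so treat these as upper bounds):

```python
# Rough per-frame data budgets at 60 fps for the link options discussed
# above. Rates are the nominal figures from the thread.

def mb_per_frame(link_gbps, fps=60):
    """Megabytes deliverable per frame over a link of the given gigabits/s."""
    return link_gbps * 1e9 / 8 / fps / 1e6

for name, gbps in [("USB 3.1 Gen 2", 10), ("PCIe 3.0 x4", 32), ("Thunderbolt 3", 40)]:
    print(f"{name} ({gbps} Gbps): ~{mb_per_frame(gbps):.1f} MB per frame")
```

For comparison, an uncompressed 1080p RGBA frame is about 8.3 MB, so even the slowest option could in principle move a couple of full frames per frame time; protocol overhead eats into that considerably in practice.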
 

Manoko

Member
Considering MS hasn't even been able to get cloud computing working yet, I'm really skeptical that Nintendo will be able to.

Because it's done differently.
Here Nintendo doesn't maintain "servers" where all the extra computing is done and then sent to Xbox One users.

It would simply be multiple SCDs, very close to each other, forming a local "cloud computing" server tailored to the user.
 

Russ T

Banned
Because it's done differently.
Here Nintendo doesn't maintain "servers" where all the extra computing is done and then sent to Xbox One users.

It would simply be multiple SCDs, very close to each other, forming a local "cloud computing" server tailored to the user.

You're forgetting the part of this patent where inactive users can enable their devices for computing power over the internet. That is very much cloud computing.
 

bomblord1

Banned
Wouldn't Thunderbolt 3 make the most sense? It's already being used in the PC market for external GPUs. AMD has the whole XConnect based around it.

It could definitely work as far as that goes. However, I think Apple owns it, which would make it difficult for Nintendo to use. They would probably either go with a more open standard or proprietary tech from one of their partners. Nintendo's never been in the business of paying royalties. They even went with a proprietary disc format on the Wii U in order to avoid paying Blu-ray fees.

If they use an existing tech, USB-C is my guess: it's fast, and it's capable of providing up to 100 watts of power, which would be essential if the SCD had its own CPU/GPU.
 

Instro

Member
It could definitely work as far as that goes. However, I think Apple owns it, which would make it difficult for Nintendo to use. They would probably either go with a more open standard or proprietary tech from one of their partners. Nintendo's never been in the business of paying royalties. They even went with a proprietary disc format on the Wii U in order to avoid paying Blu-ray fees.

If they use an existing tech, USB-C is my guess: it's fast, and it's capable of providing up to 100 watts of power, which would be essential if the SCD had its own CPU/GPU.

Thunderbolt is owned by Intel as far as I am aware.
 
Because it's done differently.
Here Nintendo doesn't maintain "servers" where all the extra computing is done and then sent to Xbox One users.

It would simply be multiple SCDs, very close to each other, forming a local "cloud computing" server tailored to the user.

I guess I'm not that knowledgeable about cloud computing, but would Nintendo be able to guarantee that 100% of the time every active console is connected to this SCD cloud?

Because if they can't guarantee that 100%, then it seems unlikely to me that they would go forward with this idea, since you'd need to account for some consoles dropping their connection to the SCD cloud, and then the performance would drop immensely. Could they put in a workaround for that, like automatically lowering the resolution or dropping certain effects? Maybe, I don't know.

I just know that, historically speaking, Nintendo's innovations have had the user-friendly guarantee of actually working pretty much 100% of the time. This seems to depend too much on factors outside of Nintendo's control, and therefore I don't see them setting up a cloud system like this anytime soon. Maybe in the future, though.
 

bomblord1

Banned
Can someone explain how cloud-based computing over Wi-Fi would be fast enough, as opposed to, say, over USB 3?

Well, we are talking about the system literally recognizing there is another CPU/GPU connected to it, as far as the port discussion goes.

Cloud computing, on the other hand, would offload certain processes to another system and then receive the results and process them accordingly.
 

Manoko

Member
You're forgetting the part of this patent where inactive users can enable their devices for computing power over the internet. That is very much cloud computing.

Well that's what I was talking about in my previous post.
You could enable your SCD to be part of the cloud only when you're not actively using it to improve your experience for example.
 

MuchoMalo

Banned
That's rather optimistic.

For instance, in compute kernels (i.e. sans ROP resolves), that's 40% less BW for read-only color buffers alone, not for all color-buffer accesses. So a compute kernel that uses as few as two RT textures with bilinear filtering will use 8 uncompressible texel reads for every output.

ed: sorry, that example came out rather convoluted and also wrong. The output from the compute kernel would be uncompressed. For the inputs it would depend whether they came from compute or fragment kernels. Point being, any data that is both (1) dynamic and (2) does not originate from ROP resolves will be uncompressed.

ed': no 14nm no Polaris, Vena - I'm sure I've mentioned that before ; )

/sorry Rosti, won't be hijacking any further

It has to be 14nmFF though. There's no way Sony is using some enormous 250W APU.


More on-topic, I really don't understand how this SCD idea will work out. Even in the best case it can't possibly increase performance by more than 10-20%, can it?
 

Manoko

Member
I guess I'm not that knowledgeable about cloud computing, but would Nintendo be able to guarantee that 100% of the time every active console is connected to this SCD cloud?

Because if they can't guarantee that 100%, then it seems unlikely to me that they would go forward with this idea, since you'd need to account for some consoles dropping their connection to the SCD cloud, and then the performance would drop immensely. Could they put in a workaround for that, like automatically lowering the resolution or dropping certain effects? Maybe, I don't know.

I just know that, historically speaking, Nintendo's innovations have had the user-friendly guarantee of actually working pretty much 100% of the time. This seems to depend too much on factors outside of Nintendo's control, and therefore I don't see them setting up a cloud system like this anytime soon. Maybe in the future, though.

With many NXs with SCDs around you, you wouldn't need every one of them connected 100% of the time.
If one drops, redistribute the computing load among the others, or, as you said, maybe lower the quality of certain effects a bit (like physics) until another SCD on your local cloud can take over the task.
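That redistribute-or-degrade idea can be sketched in a few lines of Python. Everything here (task names, costs, peer capacities) is made up purely for illustration; nothing of the sort appears in the patent:

```python
# Toy sketch of the redistribution idea: tasks go to whichever peer SCDs
# are still reachable, and work that no longer fits is shed (i.e. run in
# a degraded form locally). All names and numbers are hypothetical.

def assign(tasks, peers):
    """tasks: list of (name, cost, priority); peers: {peer: capacity}.
    Returns ({peer: [task names]}, [shed task names])."""
    remaining = dict(peers)
    placed = {p: [] for p in peers}
    shed = []
    # Place the highest-priority work first; shed whatever no peer can take.
    for name, cost, _prio in sorted(tasks, key=lambda t: t[2]):
        target = next((p for p, cap in remaining.items() if cap >= cost), None)
        if target is None:
            shed.append(name)
        else:
            placed[target].append(name)
            remaining[target] -= cost
    return placed, shed

tasks = [("physics", 3, 0), ("crowd AI", 2, 1), ("cloth", 2, 2)]

# Two peers online: physics -> scd_A, crowd AI -> scd_B, cloth is shed.
print(assign(tasks, {"scd_A": 4, "scd_B": 2}))

# scd_B drops: re-run with the survivors; more effects degrade.
print(assign(tasks, {"scd_A": 4}))
```

The hard part Nintendo would face isn't this assignment step, of course; it's doing it mid-frame without the player noticing.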
 

Russ T

Banned
Yeah that's what I was talking about in my previous post.

I understand that, but you act like a local user-hosted cluster of clouds is somehow easier than server farms. That's not necessarily the case. In fact, it's arguably a far more complicated problem to solve. With a dedicated server farm, it's either working or it's not, so you flip the switch locally to turn the high-fidelity [feature] on and off. With this dynamic cloud thing, you've got to handle people dropping in and out, jumping from one cloud to another, and features scaling with a pool of nodes that's constantly growing and shrinking. Shit's hard...
 

Pokemaniac

Member
Wouldn't Thunderbolt 3 make the most sense? It's already being used in the PC market for external GPUs. AMD has the whole XConnect based around it.

Whatever Nintendo uses, it will likely be custom. I'd be surprised if they went for a standard port for something like this.
 
With many NXs with SCDs around you, you wouldn't need every one of them connected 100% of the time.
If one drops, redistribute the computing load among the others, or, as you said, maybe lower the quality of certain effects a bit (like physics) until another SCD on your local cloud can take over the task.

Well, what I meant by active is that every console currently being used would theoretically be offloading some processing to this cloud, and I'm worried that Nintendo wouldn't be able to ensure that each of these consoles could consistently connect to the cloud in order to gain the benefits of that cloud processing.

There are too many factors outside of Nintendo's control, mainly every user's specific Wi-Fi connection and their particular ISP. If the experience constantly shifts between the base and the cloud-enhanced experience, that's not a very good proposition.

Then again, like I said, I don't know much about how all of this works, but if it would work as you said, consistently enough for Nintendo to be happy, then I'm all for it.
 

bomblord1

Banned
Literally the only things we have from Nintendo regarding the cloud, aside from the patents, are old Iwata Asks interviews talking about the possibility of Nintendo using cloud tech.

Iwata_asks said:
which are opened up by the combination of cloud technologies and new software paradigms like general purpose GPU programming.

I think we are now in exactly the right place to create the best new ideas, but we need to be very bold and ambitious to do so. That’s why I’m so glad I’m at NERD and not anywhere else.
https://www.nintendo.co.uk/Iwata-As...-Dreams/9-NERD-s-Goals-and-Dreams-759323.html

Edit: Now that I'm looking into it, I'm finding some more info from throughout the years

Iwata2013 said:
I believe that there are games that have an affinity with cloud gaming and games that do not. Of course, we constantly pay attention to the advances and changes in cloud gaming technology and internet infrastructure. On the other hand, I don’t think that our games, particularly the types that have strict requirements in terms of real-time responsiveness, can offer high-quality services using cloud gaming technology because of unavoidable network latency, which I mentioned last time.

We will of course continue to see how this technology develops, but in order to decide whether cloud gaming is something that we should be interested in, we will need to closely follow the changes in technology and also the business environment. However, at this point in time, I do not think that acquiring a cloud gaming company will in any way improve our performance, so we are not moving in that direction.
http://www.nintendolife.com/news/2013/04/satoru_iwata_nintendo_has_no_plans_to_adopt_cloud_gaming

Nintendo is currently using, or in partnership with, these cloud-based companies. They heavily used cloud-based technologies for Miiverse and StreetPass.

http://www.gamasutra.com/blogs/Brya...nologies_Power_Nintendos_StreetPass_Relay.php
Hbase

To manage the constant flow of data for StreetPass Exchange, Nintendo turned to Hbase, a data management tool also used by Facebook and other social networks.

Puppet

Nintendo has three years of experience using Puppet, and reuses prior settings with each new project. By adopting prior settings, Nintendo was able to launch StreetPass Exchange within three months of starting the project.

fluentd

But with millions of 3DS users using the service, and 100,000 different access points, Nintendo needed responsive analytics to quickly identify any user experience challenges. The company turned to Fluentd, an open source big data solution that aggregates access logs in real time.

Amazon

But none of this would be possible without Amazon. Nintendo utilized AWS, a dynamic cloud service offered by Amazon, to host StreetPass Relay.

“It’s impossible to imagine starting this kind of service in such a short time without a cloud,” said Yamakazi.

Iwata also confirmed that Nintendo heavily leveraged cloud technology in the past for Miiverse, the social network built into Wii U, the company’s latest home console.
 

Thraktor

Member
I'm starting to suspect that the most likely application for the SCD patent would be one where the handheld is the "games console" in the patent, and the home NX acts as the SCD. The motivation for this line of thinking is that Nintendo may want to work on continuing the development of gamepad-style functionality by connecting their new handheld and home console. The issue with using video streaming, though, is that to implement it to the tolerances Nintendo desires (i.e. extremely low latency) would require the same kind of custom streaming hardware Wii U used, and would have the same limitation in terms of only being usable in very close proximity to the home console.

So, the solution would be to switch things around, from the console being the master and the handheld the slave (as is the case with the Wii U and gamepad, as the Wii U performs all computation and rendering and simply sends the resulting frames to the gamepad for display), to a situation where the handheld is the master and the home console the slave (i.e. the handheld renders the final image and makes use of whatever resources the home console provides). Once the handheld is the one polling inputs and rendering the frames you can start to guarantee low latency for the user.

This solves the business case problem in the sense that Nintendo wouldn't have to actually sell people an extra device just for use as an SCD, they'd just be giving handheld owners a bonus if they also owned the home console. It sort of solves the third party problem in that there wouldn't be any onus to make use of it (if you spend a few hundred dollars on a "super powerful GPU" SCD you're going to be pretty disappointed if a lot of games don't even use it, whereas this is extra functionality for already useful products), but it doesn't really solve the development challenge. Granted there would be, initially anyway, only three different development cases:

A. Game runs on handheld
B. Game runs on home console
C. Game runs on handheld with home console acting as SCD

The problem with C, though, is that it's not a single highly optimisable development environment, but rather one where you have to account for the fact that bandwidth and latency between the two devices are highly variable, not just each time you run the game, but even during a single instance. There are certainly elements which have very low bandwidth and latency requirements and hence could be accommodated even in the worst case, such as AI, but making full and efficient use of the resources available would be challenging to say the least.
 
Well, what I meant by active is that every console currently being used would theoretically be offloading some processing to this cloud, and I'm worried that Nintendo wouldn't be able to ensure that each of these consoles could consistently connect to the cloud in order to gain the benefits of that cloud processing.

There are too many factors outside of Nintendo's control, mainly every user's specific Wi-Fi connection and their particular ISP. If the experience constantly shifts between the base and the cloud-enhanced experience, that's not a very good proposition.

Then again, like I said, I don't know much about how all of this works, but if it would work as you said, consistently enough for Nintendo to be happy, then I'm all for it.

Never mind that. Even if they could guarantee optimal performance, the real problem is figuring out what to do with it. Microsoft have a much more robust and powerful back-end available to them in Azure, yet they've struggled to find any meaningful use for it in games. And the limited uses they have found for it - creating personality-based AI in Forza, improved large-scale physics simulations in Crackdown - don't seem like they'd be much use in the style of game Nintendo likes to make.
 

Kimawolf

Member
I'm starting to suspect that the most likely application for the SCD patent would be one where the handheld is the "games console" in the patent, and the home NX acts as the SCD. The motivation for this line of thinking is that Nintendo may want to work on continuing the development of gamepad-style functionality by connecting their new handheld and home console. The issue with using video streaming, though, is that to implement it to the tolerances Nintendo desires (i.e. extremely low latency) would require the same kind of custom streaming hardware Wii U used, and would have the same limitation in terms of only being usable in very close proximity to the home console.

So, the solution would be to switch things around, from the console being the master and the handheld the slave (as is the case with the Wii U and gamepad, as the Wii U performs all computation and rendering and simply sends the resulting frames to the gamepad for display), to a situation where the handheld is the master and the home console the slave (i.e. the handheld renders the final image and makes use of whatever resources the home console provides). Once the handheld is the one polling inputs and rendering the frames you can start to guarantee low latency for the user.

This solves the business case problem in the sense that Nintendo wouldn't have to actually sell people an extra device just for use as an SCD, they'd just be giving handheld owners a bonus if they also owned the home console. It sort of solves the third party problem in that there wouldn't be any onus to make use of it (if you spend a few hundred dollars on a "super powerful GPU" SCD you're going to be pretty disappointed if a lot of games don't even use it, whereas this is extra functionality for already useful products), but it doesn't really solve the development challenge. Granted there would be, initially anyway, only three different development cases:

A. Game runs on handheld
B. Game runs on home console
C. Game runs on handheld with home console acting as SCD

The problem with C, though, is that it's not a single highly optimisable development environment, but rather one where you have to account for the fact that bandwidth and latency between the two devices are highly variable, not just each time you run the game, but even during a single instance. There are certainly elements which have very low bandwidth and latency requirements and hence could be accommodated even in the worst case, such as AI, but making full and efficient use of the resources available would be challenging to say the least.


So could Nintendo set up a system which could detect how many SCDs you have and base what quality you get on that? Kind of like what a PC does when it recommends settings based on your rig config?

And realistically, how many could be daisy-chained together?
 

Thraktor

Member
So could Nintendo set up a system which could detect how many SCDs you have and base what quality you get on that? Kind of like what a PC does when it recommends settings based on your rig config?

And realistically, how many could be daisy-chained together?

What I'm describing is a one-to-one link. One handheld linking up to a single home console.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
I'm starting to suspect that the most likely application for the SCD patent would be one where the handheld is the "games console" in the patent, and the home NX acts as the SCD. The motivation for this line of thinking is that Nintendo may want to work on continuing the development of gamepad-style functionality by connecting their new handheld and home console. The issue with using video streaming, though, is that to implement it to the tolerances Nintendo desires (i.e. extremely low latency) would require the same kind of custom streaming hardware Wii U used, and would have the same limitation in terms of only being usable in very close proximity to the home console.

So, the solution would be to switch things around, from the console being the master and the handheld the slave (as is the case with the Wii U and gamepad, as the Wii U performs all computation and rendering and simply sends the resulting frames to the gamepad for display), to a situation where the handheld is the master and the home console the slave (i.e. the handheld renders the final image and makes use of whatever resources the home console provides). Once the handheld is the one polling inputs and rendering the frames you can start to guarantee low latency for the user.

This solves the business case problem in the sense that Nintendo wouldn't have to actually sell people an extra device just for use as an SCD, they'd just be giving handheld owners a bonus if they also owned the home console. It sort of solves the third party problem in that there wouldn't be any onus to make use of it (if you spend a few hundred dollars on a "super powerful GPU" SCD you're going to be pretty disappointed if a lot of games don't even use it, whereas this is extra functionality for already useful products), but it doesn't really solve the development challenge. Granted there would be, initially anyway, only three different development cases:

A. Game runs on handheld
B. Game runs on home console
C. Game runs on handheld with home console acting as SCD

The problem with C, though, is that it's not a single highly optimisable development environment, but rather one where you have to account for the fact that bandwidth and latency between the two devices are highly variable, not just each time you run the game, but even during a single instance. There are certainly elements which have very low bandwidth and latency requirements and hence could be accommodated even in the worst case, such as AI, but making full and efficient use of the resources available would be challenging to say the least.
That's more or less the macro way I've been imagining things going with the supplement patent, if implemented in practice.

Apropos, speaking of connection technologies, we can differentiate between two major scenarios: local (namely core unit and supplement over a PtP connection in the same room or apartment - what Thraktor talks about above) and not-so-local/remote (core unit and supplement over a reasonably close tcp/ip network - same neighbourhood or at least same city). In the first scenario, BW-wise, anything that does 1Gb/s (aka 120-ish MB/s) is already perfectly usable for very substantial offloading of work to the supplement, and the latency is, well, for lack of a better term, ideal. Remember we're talking of compact buffers, mostly render targets, and sync messages flying back and forth, respectively, during the frame duration - not raw assets. In the second scenario, the things that can be offloaded drop substantially, thanks to both BW reduction and, most importantly, latency hikes. There BW goes down to a few (perhaps ten) MB/s (which is still not bad), but latency starts requiring some sort of heuristics and robust contingency planning - think netcode in fighting/racing multiplayer games. Now, before you start wondering what good that second scenario could be, here's a rudimentary example: the supplement on a 'remote' connection can compute a fully-realtime texture for the skybox in a game, send it over in motion-frame-compressed format, not unlike how you watch YouTube, and let the core unit just paint it over the skybox.
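To attach rough numbers to those two scenarios, here's a quick Python sketch of the per-frame budget at 60 fps (the payload sizes are illustrative, not measurements):

```python
# Per-frame data budget over the two links described above, versus some
# illustrative payload sizes. 1 Gb/s local link, ~10 MB/s remote link.

def frame_budget_mb(bytes_per_s, fps=60):
    """Megabytes deliverable in one frame's worth of time."""
    return bytes_per_s / fps / 1e6

local_bps = 1e9 / 8   # 1 Gb/s -> 125 MB/s
remote_bps = 10e6     # ~10 MB/s, the optimistic remote case

rt_1080p_mb = 1920 * 1080 * 4 / 1e6  # uncompressed 32-bit 1080p target, ~8.3 MB
skybox_mb = 512 * 512 * 4 / 1e6      # 512x512 RGBA skybox tile, ~1 MB raw

print(f"local:  {frame_budget_mb(local_bps):.2f} MB/frame")   # ~2.08
print(f"remote: {frame_budget_mb(remote_bps):.2f} MB/frame")  # ~0.17
```

Note that even locally, a full uncompressed 1080p render target (~8.3 MB) doesn't fit in one frame's budget, which is why the traffic has to be compact buffers and sync messages rather than raw frames; remotely, only something like a video-compressed skybox tile squeezes through, exactly as the example above suggests.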
 

Thraktor

Member
That's more or less the macro way I've been imagining things going with the supplement patent, if implemented in practice.

Apropos, speaking of connection technologies, we can differentiate between two major scenarios: local (namely core unit and supplement over a PtP connection in the same room or apartment - what Thraktor talks about above) and not-so-local/remote (core unit and supplement over a reasonably close tcp/ip network - same neighbourhood or at least same city). In the first scenario, BW-wise, anything that does 1Gb/s (aka 120-ish MB/s) is already perfectly usable for very substantial offloading of work to the supplement, and the latency is, well, for lack of a better term, ideal. Remember we're talking of compact buffers, mostly render targets, and sync messages flying back and forth, respectively, during the frame duration - not raw assets. In the second scenario, the things that can be offloaded drop substantially, thanks to both BW reduction and, most importantly, latency hikes. There BW goes down to a few (perhaps ten) MB/s (which is still not bad), but latency starts requiring some sort of heuristics and robust contingency planning - think netcode in fighting/racing multiplayer games. Now, before you start wondering what good that second scenario could be, here's a rudimentary example: the supplement on a 'remote' connection can compute a fully-realtime texture for the skybox in a game, send it over in motion-frame-compressed format, not unlike how you watch YouTube, and let the core unit just paint it over the skybox.

I was actually thinking about a system which does both (although reading back over my post it seems I never actually said this), and would see it as a form of competition to Vita's ability to stream from the PS4 over local wifi or further afield. Of course, I'm imagining Nintendo would have higher standards of usability than that.

The local scenario presents quite a lot of possibilities. Even over a standard home wifi network you should be able to squeeze sufficient bandwidth and sufficiently low (and relatively stable) latency to do some pretty interesting things. The problem is that, of the plausible functionality which I can think of in that scenario, almost none of it translates at all to the low-bandwidth, high-latency scenario. For that, you'd need to look at elements of a game which satisfy all of the following:

  1. Is computationally non-trivial
    (otherwise you could just do it locally)
  2. Reacts to the player's actions
    (otherwise you could pre-compute)
  3. But doesn't need to react too quickly
    (needs to accommodate added latency)
  4. Involves small data transfers each way
    (needs to accommodate low bandwidth)
  5. Should be able to be implemented in a simplified form on the handheld
    (or otherwise needs to accommodate the case where the signal drops)
The reason I gave the example of AI is not just that it satisfies the above, but that implementing it over a variable-latency connection is a relatively solved problem: just take the netcode you mention, but instead of a player at the other end, there's an NX home console running some AI code. The problem with improved AI as the main use case, though, is that in general the improvement just isn't going to be all that noticeable, even going from a handheld to a home console (unless you're simulating large crowds where a handheld CPU would simply choke, but then you run into a problem with point 5 above), so it's hard to sell the whole technology on that basis.
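Criterion 5 (graceful fallback) is essentially a timeout pattern. A toy Python sketch, with a thread and an artificial sleep standing in for the remote console and its network latency (all names here are hypothetical illustration, not anything from the patent):

```python
# Sketch of "ask the remote unit, fall back to a simplified local result
# if the reply misses the deadline". A thread + sleep stands in for the
# remote console and network latency.

import queue
import threading
import time

def remote_ai(world_state, out_q, delay_s):
    time.sleep(delay_s)  # stand-in for network round-trip + compute
    out_q.put(("remote", f"detailed plan for {world_state}"))

def plan(world_state, budget_s, simulated_latency_s):
    out_q = queue.Queue()
    threading.Thread(target=remote_ai,
                     args=(world_state, out_q, simulated_latency_s),
                     daemon=True).start()
    try:
        return out_q.get(timeout=budget_s)  # remote answer arrived in time
    except queue.Empty:
        # Degrade gracefully: run the cheap local version instead.
        return ("local", f"simple plan for {world_state}")

print(plan("crowd", budget_s=0.05, simulated_latency_s=0.01))  # remote wins
print(plan("crowd", budget_s=0.05, simulated_latency_s=0.30))  # local fallback
```

The game never blocks past its frame budget; it just gets a worse answer when the link is slow, which is exactly the property criterion 5 asks for.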

There are much more visible, and hence more sellable, options once you restrict yourself to a home network, I'll admit. For instance, you could have the home NX dynamically calculate lightmaps for the game's environment, providing high-quality real time environmental lighting while leaving the handheld to only have to handle lighting and shadowing for the player character and other dynamic elements. It wouldn't be quite trivial to implement, but it would be doable for talented engineers, and would be impressive enough in use to be a selling point for the technology.
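The lightmap idea lends itself to the same non-blocking shape. A minimal sketch (names and the string stand-ins for lightmaps are illustrative assumptions): the handheld renders every frame with whatever lightmap it currently has, while fresher ones stream in from the home console in the background. Stale lighting for a few frames is acceptable, which is what makes this latency-tolerant, and the shipped baked lightmap doubles as the fallback if the connection drops.

```python
# Sketch of background lightmap streaming: render never blocks on the
# network; new lightmaps are swapped in whenever they arrive.

import queue

incoming = queue.Queue()          # lightmaps arriving from the home box
current_lightmap = "baked_v0"     # shipped-on-cart fallback (signal drop)

def receive(lightmap):
    """Called by the networking layer when a new lightmap arrives."""
    incoming.put(lightmap)

def render_frame():
    global current_lightmap
    try:
        # Drain to the newest lightmap; never block the frame on it.
        while True:
            current_lightmap = incoming.get_nowait()
    except queue.Empty:
        pass
    return f"frame lit with {current_lightmap}"

print(render_frame())             # no update yet: uses baked fallback
receive("dynamic_v1")
receive("dynamic_v2")
print(render_frame())             # picks up the newest, skipping v1
```

Draining to the newest arrival also means a burst of late deliveries doesn't queue up stale lighting; the handheld always jumps straight to the freshest data it has.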

Trying to think up use cases for this beyond the home network becomes a lot more challenging. The best I can think of is fluid simulation. It's very computationally expensive (and can be pretty impressive when you dedicate an entire GPU to the purpose), but requires relatively little data flow back and forth (basically you just need to send the surface mesh and forces on physics objects it interacts with). The problem is that I can't think of any scenario where you want fluid simulation but you don't need it to react instantaneously to the player's actions. If you're playing Wave Race NX, for example, you'll want the water to be displaced immediately by your jet-ski, not after you've already sped over it.
 

Snakeyes

Member
The problem is that I can't think of any scenario where you want fluid simulation but you don't need it to react instantaneously to the player's actions. If you're playing Wave Race NX, for example, you'll want the water to be displaced immediately by your jet-ski, not after you've already sped over it.
Wouldn't anything that is not in direct contact with the player character be suitable for cloud computing? For example, you don't need instantaneous feedback if you throw a rock into a waterfall and watch how it affects the current. Cloth simulations could also get away with a generous amount of latency without the game feeling sluggish.
 

Thraktor

Member
Wouldn't anything that is not in direct contact with the player character be suitable for cloud computing? For example, you don't need instantaneous feedback if you throw a rock into a waterfall and watch how it affects the current. Cloth simulations could also get away with a generous amount of latency without the game feeling sluggish.

The difficulty is in finding those situations where the player indirectly interacts with systems such as liquid or cloth simulation. Most of the times you see cloth simulation in games the developers make a point of the player directly interacting with it to show it off (e.g. by modelling the player character's clothes, or by having the character walk through curtains, etc.) Having an NPC interact with some kind of cloth in the background would work, but may not be all that visible on a 540p screen, and you may be able to get just as good an effect with pre-computation (e.g. the famous Metro gif of pulling a cloth cover off a car).

I suppose in a god game or RTS you might be able to make interesting use of fluid simulation without latency being an issue (for example blowing up a dam and watching the water flood your enemy's base), but those aren't exactly common genres on handhelds, and you again have the issue of needing a local fallback if the connection drops.
 

AmyS

Member
rejection cleared

Rösti;201254198 said:
Congratulations to inventor Joe Bentdahl and Nintendo.

Excellent.

 

AmyS

Member
I still don't understand what this patent means.

It should mean that you'd be able to add one or more extra boxes that link to your NX home console to get better performance / graphics, higher resolution and more storage space.

There's more to it, but that's the gist of it.
 

AzaK

Member
It should mean that users will be able to add one or more extra boxes that link to your NX home console to get better performance / graphics, higher resolution and more storage space.

Ahhh it all sounds so good, but also so un-nintendo.
 

AzaK

Member
That's more or less the macro way I've been imagining things going with the supplement patent, if implemented in practice.

Apropos, speaking of connection technologies, we can differentiate between two major scenarios: local (namely core unit and supplement over a PtP connection in the same room or apartment - what Thraktor talks about above) and not-so-local/remote (core unit and supplement over a reasonably close tcp/ip network - same neighbourhood or at least same city). In the first scenario, BW-wise, anything that does 1Gb/s (aka 120-ish MB/s) is already perfectly usable for very substantial offloading of work to the supplement, and the latency is, for lack of a better term, ideal. Remember we're talking of compact buffers, mostly render targets, and sync messages flying back and forth during the frame duration, not raw assets. In the second scenario, the things that can be offloaded drop substantially thanks to both the BW reduction and, most importantly, the latency hike. BW goes down to a few (perhaps ten) MB/s (which is still not bad), but the latency starts requiring some sort of heuristics and robust contingency planning - think netcode in fighting/racing multiplayer. Now, before you start wondering what good that second scenario could be, here's a rudimentary example: the supplement on a 'remote' connection can compute a fully-realtime texture for the skybox in a game, send it over in motion-frame-compressed format, not unlike how you watch YouTube, and let the core unit just paint it over the skybox.

It's all good in theory but there's a few questions that this brings up.

1) Who really has 1Gb/s? I would hazard a guess that most people are on wireless, which doesn't really reach 1Gb/s at present, and latency is a little mad.

2) If we're thinking of rendering skyboxes is that real value for money?

Don't get me wrong, I've been asking for expandable consoles for years and years I am just not really convinced with the remote, or LAN scenario. Physically connected though, it could be awesome.

This statement is rather funny considering it's their patent.

Not really. Companies patent shit all the time that they have no real intention, nor ability, to implement. That said, I'm sure Nintendo have thought about it (obviously), but there are a number of hurdles that seem so un-Nintendo to me that I can imagine this thing just being a curiosity to them.
 
I prefer supplemental computing devices over having to buy a whole new console.

I'm checking GAF everyday hoping to see a thread announcing a NX direct.
 
I prefer supplemental computing devices over having to buy a whole new console.
It comes down to the pricing of the base console and the prices and span of time between release of the SCDs, imo.

If I have to pay, say, 350€ for the console and 200€ for an SCD 1.5 to 2 years in, then I'd not really be into this. And they hopefully won't go the PS4K way of paying the full launch price of the vanilla console again.
 

KingBroly

Banned
It should mean that you'd be able to add one or more extra boxes that link to your NX home console to get better performance / graphics, higher resolution and more storage space.

There's more to it, but that's the gist of it.

So... that Gran Turismo 5 BS they talked about years before its release, where it was like '4 PS3s = 120fps in 4K' or something like that?
 