
Nvidia CEO: Relationship with Nintendo "will likely last two decades"

DarkestHour

Member
Oct 27, 2015
1,304
534
475
Two decades?

That's optimistic.

Honestly, and I say this as someone who loves game consoles, the era of the dedicated console isn't gonna last even one more decade.
Two decades could simply mean into 2021 and not necessarily through two full decades.

I highly doubt they have a 20-year agreement.
 

Eolz

Member
Jun 1, 2014
10,193
20
535
Two decades could simply mean into 2021 and not necessarily through two full decades.

I highly doubt they have a 20-year agreement.
That's not what it would mean, no. They don't have a 20-year agreement either, obviously. Just a really promising relationship.
 

Negotiator

Member
Jun 28, 2011
4,676
1,480
885
Twenty years from now, everything will be streamed. Dedicated hardware (consoles/PCs) will be obsolete in most households.
 

Kyzer

Banned
Jan 7, 2009
11,985
1
0
I'll take this to mean Nvidia will eventually be demanding obscene amounts of money and Nintendo will drop them for AMD and lose BC.
 

Datschge

Member
Sep 23, 2006
5,149
0
1,150
That's not what it would mean, no. They don't have a 20-year agreement either, obviously. Just a really promising relationship.
The 20-year number likely isn't a coincidence either; the previous console GPU partnership lasted nearly that long (Silicon Graphics for the N64, ArtX for the GC, ATi for the Wii and AMD for the Wii U were the continuity of one team, first splitting off and then being bought multiple times).
 

LordRaptor

Member
Aug 20, 2015
9,769
579
520
I'll take this to mean Nvidia will eventually be demanding obscene amounts of money and Nintendo will drop them for AMD and lose BC.
The GPU is pretty much the least likely part of the console design to break compatibility if the manufacturer changes.
 

KingJ2002

Member
Jun 8, 2004
5,791
5
1,495
36
Los Angeles, CA
www.createdbyking.com
Nvidia had an earnings call yesterday following their Q3 results, and in the Q&A Jen-Hsun Huang (Nvidia CEO) commented briefly about the Nintendo Switch, and I figured the comments were interesting enough to warrant a new thread. Here's what he said (taken from SeekingAlpha's transcript):



A few notes:

- The question asked was about the increased revenues of Nvidia's gaming division, so when he says "Nintendo contributed a fair amount to that growth", that's what he's referring to. Gaming revenue was up by $463 million over the last quarter, although what constitutes "a fair amount" of that is anyone's guess. This could be R&D/consulting payments on completion of tape-out, or it could be an initial payment for the first batch of chips.

- The fact that Nvidia are including Nintendo's business in their Gaming division, rather than their IP licensing division, would indicate that Nvidia are handling manufacturing and selling the final chips to Nintendo, rather than Nintendo licensing the design and handling manufacturing themselves (which was their arrangement with AMD for Wii U's GPU).

- Working with Nintendo for "almost two years" means design work on Switch's SoC likely started at the end of 2014/start of 2015. That would seem like a pretty typical timescale for a custom chip like this.

- The "several hundred engineering years" re-states what was in Nvidia's initial press release (which afaik claimed 500 engineering man-years).

This last bit is speculation on my part, but when he says "over the next" and then cuts off to talk about Nintendo using the same architecture for a long time, I get the impression that Nvidia already has contracts for future Nintendo hardware that he's trying not to talk about. Claiming that the relationship will last "two decades" is also pretty confident even for Huang, and I have a feeling that this is further evidence that Switch is the start of the "family of systems" which Iwata talked about, all revolving around ARM/Nvidia SoCs and a common software platform.
Thanks for the breakdown. I'm glad that Nintendo has a partner in Nvidia when it comes to hardware design. The past two generations showed that Nintendo could have used a partner like Nvidia to drive improvement while also staying true to their goals. Here's hoping that the Nintendo Switch finds its success so we can see what comes next.
 

Negotiator

Member
Jun 28, 2011
4,676
1,480
885
The GPU is pretty much the least likely part of the console design to break compatibility if the manufacturer changes.
Not really.

Consoles give low-level access to the GPU and this complicates BC... PCs are different, because they utilize high-level API abstractions.

There's a reason OG XBOX emulation is still not possible (even on XB1) and Sony had to emulate GCN 1.1 shaders with base mode/downclocking.
 

LordRaptor

Member
Aug 20, 2015
9,769
579
520
Not really.

Consoles give low-level access to the GPU and this complicates BC... PCs are different, because they utilize high-level API abstractions.

There's a reason OG XBOX emulation is still not possible (even on XB1) and Sony had to emulate GCN 1.1 shaders with base mode/downclocking.
It's not trivial, but the idea that people are writing assembly and coding to the metal to maximise hardware-specific fixed functions is less true today than it ever has been.
 

MacTag

Banned
Jan 5, 2016
2,315
0
0
Not really.

Consoles give low-level access to the GPU and this complicates BC... PCs are different, because they utilize high-level API abstractions.

There's a reason OG XBOX emulation is still not possible (even on XB1) and Sony had to emulate GCN 1.1 shaders with base mode/downclocking.
OG Xbox emulation was possible on 360, presumably it would be on One as well.
 

Negotiator

Member
Jun 28, 2011
4,676
1,480
885
It's not trivial, but the idea that people are writing assembly and coding to the metal to maximise hardware-specific fixed functions is less true today than it ever has been.
It's even possible on PCs:

http://gpuopen.com/amdgcn-assembly/
http://gpuopen.com/amd-gcn-assembly-cross-lane-operations/
https://github.com/RadeonOpenCompute/LLVM-AMDGPU-Assembler-Extra

Hand-written assembly maximizes performance, but it makes BC harder. It's always been that way; a trade-off.

If ease of use were of utmost importance, games would have ditched Asm/C/C++ in favor of Java/C#.

OG Xbox emulation was possible on 360, presumably it would be on One as well.
IIRC, it was very buggy and they ditched it quickly.
 

Thraktor

Member
Dec 29, 2004
3,889
2
0
Dublin, Ireland
It's not trivial, but the idea that people are writing assembly and coding to the metal to maximise hardware-specific fixed functions is less true today than it ever has been.
People don't have to be writing GPU assembly for it to be an issue. Unlike on PCs (which use runtime shader compilation), console shaders are typically pre-compiled for the console's specific shader ISA, so for a game to run on any future console it needs to either use the same ISA or be able to emulate it. AMD could easily provide an ARMv8 CPU that would be 100% binary compatible with Switch's CPU code, but providing a GPU that's compatible with shader code compiled for an Nvidia GPU is a different matter altogether (and that's aside from any vendor-specific optimisations that developers may have used for Switch builds).
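The pre-compiled vs. runtime shader distinction above can be sketched as a toy model. Everything here is invented for illustration (the "ISA tag" scheme, the names `compile_shader` and `gpu_execute`); real shader toolchains are vastly more complex, but the shape of the BC problem is the same:

```python
# Toy model: why precompiled console shaders break BC across GPU vendors,
# while PC-style runtime compilation does not. All names are hypothetical.

def compile_shader(source: str, isa: str) -> bytes:
    """Stand-in for a vendor's offline shader compiler: emits an ISA-tagged blob."""
    return f"{isa}:".encode() + source.encode()

def gpu_execute(blob: bytes, gpu_isa: str) -> bool:
    """A GPU can only run blobs compiled for its own shader ISA."""
    return blob.startswith(f"{gpu_isa}:".encode())

source = "color = texture(albedo, uv);"

# Console model: the blob on the game card was compiled once, at build time.
shipped_blob = compile_shader(source, "nvidia_maxwell")
print(gpu_execute(shipped_blob, "nvidia_maxwell"))  # True: original hardware
print(gpu_execute(shipped_blob, "amd_gcn"))         # False: vendor change breaks BC

# PC model: ship the source and compile at runtime for whatever GPU is present.
print(gpu_execute(compile_shader(source, "amd_gcn"), "amd_gcn"))  # True
```

In other words, the console ships only the first blob, so a vendor change means either re-shipping games or translating/emulating the old shader ISA.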
 

LordRaptor

Member
Aug 20, 2015
9,769
579
520
Hand-written assembly maximizes performance, but it makes BC harder. It's always been that way; a trade-off.
People don't have to be writing GPU assembly for it to be an issue. Unlike on PCs (which use runtime shader compilation), console shaders are typically pre-compiled for the console's specific shader ISA, so for a game to run on any future console it needs to either use the same ISA or be able to emulate it.
Sure, but modern game development is already expensive and specialised enough that you just don't see studios putting in that kind of work - I mean, yeah, okay, a first/second party whose remit is pretty much "make something as pretty as possible to show off our console's power" probably would, a la Quantum Break or The Order, but modern AAA software pipelines aren't really built around that level of per-platform optimisation.

I think it's far more likely that, at worst, you would have glitchy graphical effects - broken water textures being a probable common custom-shader example - in any hypothetical vendor switch, which is a lot easier to work around for BC.

e:
Although, having said that, it depends on whether things like DX12/Vulkan take off, which are a return to very low-level, platform-specific optimisations.
 

Hermii

Member
Sep 17, 2012
6,982
0
0
Two decades could simply mean into 2021 and not necessarily through two full decades.

I highly doubt they have a 20-year agreement.
I don't think they have a hard agreement; they're probably basing it on how long they kept IBM PowerPC around.
 

MacTag

Banned
Jan 5, 2016
2,315
0
0
IIRC, it was very buggy and they ditched it quickly.
The program lasted around 2 years and over 450 games (51% of the OG Xbox library) were made fully compatible. Titles released as part of the Xbox Originals lineup were even improved with better framerates and higher native resolution.

For comparison, under 300 Xbox 360 games are currently playable on Xbox One.
 

Negotiator

Member
Jun 28, 2011
4,676
1,480
885
People don't have to be writing GPU assembly for it to be an issue. Unlike on PCs (which use runtime shader compilation), console shaders are typically pre-compiled for the console's specific shader ISA, so for a game to run on any future console it needs to either use the same ISA or be able to emulate it. AMD could easily provide an ARMv8 CPU that would be 100% binary compatible with Switch's CPU code, but providing a GPU that's compatible with shader code compiled for an Nvidia GPU is a different matter altogether (and that's aside from any vendor-specific optimisations that developers may have used for Switch builds).
This is also true.

Console code is tailored to a certain uArch. That's why Wii games on Wii U utilize something like the PS4 Pro's base mode (shutting down some units & downclocking) and run at 480p, unlike Dolphin's method.

Sure, but modern game development is already expensive and specialised enough that you just don't see studios putting in that kind of work - I mean, yeah, okay, a first/second party whose remit is pretty much "make something as pretty as possible to show off our console's power" probably would, a la Quantum Break or The Order, but modern AAA software pipelines aren't really built around that level of per-platform optimisation.
DICE isn't a 1st party studio and they write certain code sections in assembly. It's a tradition since the Amiga days for some devs. ;)

I think it's far more likely that, at worst, you would have glitchy graphical effects - broken water textures being a probable common custom-shader example - in any hypothetical vendor switch, which is a lot easier to work around for BC.
That's exactly what happens in emulators. If we're talking about hobbyists and an open platform (Dolphin & PC), that's fine, but on a closed platform where only certain companies have access to fix things, it's a huge problem. There's a reason PS4 Pro's base mode works the way it works. There's no financial incentive for all companies to patch their games.

Although, having said that, it depends on whether things like DX12/Vulkan take off, which are a return to very low-level, platform-specific optimisations.
DX12/Vulkan/Metal are Mantle derivatives... and do you know why we're going back to our low-level roots?

It's because Moore's Law is basically defunct these days (we were stuck at 28nm for 5 years!), so there's no "free lunch" upgrade every year and a half like in the 90s/2000s. Devs will need to extract every bit of extra juice they can.
 

Negotiator

Member
Jun 28, 2011
4,676
1,480
885
The program lasted around 2 years and over 450 games (51% of the OG Xbox library) were made fully compatible. Titles released as part of the Xbox Originals lineup were even improved with better framerates and higher native resolution.

For comparison under 300 Xbox 360 games are currently playable on Xbox One.
Fair enough. I'd love it if they brought OG XBOX emulation to XB1, but unfortunately it's harder than it sounds:

https://forum.beyond3d.com/threads/why-has-the-xbox-not-been-emulated-up-to-now.57558/#post-1892781

PCs are a lot stronger than XB1 and there's still no working OG XBOX emulator last time I checked... the PS2 is more exotic arch-wise and yet we have PCSX2.
 

Ogodei

Member
Apr 13, 2015
6,731
0
0
Nintendo will go 3rd party in two decades.
Two decades is definitely the timeframe in which dedicated consoles will cease to exist, imo. The next 15 years should bring us to that long-sought singularity where form factor is only about what you want to do on the device, but anything can be done more or less anywhere.
 

Negotiator

Member
Jun 28, 2011
4,676
1,480
885
Two decades is definitely the timeframe in which dedicated consoles will cease to exist, imo. The next 15 years should bring us to that long-sought singularity where form factor is only about what you want to do on the device, but anything can be done more or less anywhere.
What makes you think that PCs will exist, but consoles won't? Both of them will vanish for the majority, because streaming from cloud infrastructures will be viable & cheap.

Nobody could imagine YouTube/Netflix in 1996, and yet here we are... everyone streams videos and only a minority bothers with discs anymore.
 

acm2000

Member
Jun 18, 2007
6,065
487
1,395
England, UK
Two decades is definitely the timeframe in which dedicated consoles will cease to exist, imo. The next 15 years should bring us to that long-sought singularity where form factor is only about what you want to do on the device, but anything can be done more or less anywhere.
This will never happen while PC hardware is the mess it is; the same goes for operating systems and drivers.

The gradual upgrade system for consoles is here to stay for the long term.
 

AmyS

Member
Aug 22, 2012
14,320
29
455
Two decades is a long time, but I have no doubt there will be at least two generations of Nintendo/Nvidia hardware beyond the early-2017 Nintendo Switch. Nvidia has the architectural roadmap to help assure that, especially as far as long-term stability goes. Nvidia has the Volta and Einstein architectures, on which future generations of Tegra SoCs will be based, with a Volta-based SoC (Xavier) already announced for late 2017/2018.

Moving forward, as long as Nintendo is successful I'd expect more Nintendo/Nvidia hardware by 2020 at the latest.

Edit:

This last bit is speculation on my part, but when he says "over the next" and then cuts off to talk about Nintendo using the same architecture for a long time, I get the impression that Nvidia already has contracts for future Nintendo hardware that he's trying not to talk about. Claiming that the relationship will last "two decades" is also pretty confident even for Huang, and I have a feeling that this is further evidence that Switch is the start of the "family of systems" which Iwata talked about, all revolving around ARM/Nvidia SoCs and a common software platform.
Yes, indeed, I agree.
 

MacTag

Banned
Jan 5, 2016
2,315
0
0
Fair enough. I'd love it if they brought OG XBOX emulation to XB1, but unfortunately it's harder than it sounds:

https://forum.beyond3d.com/threads/why-has-the-xbox-not-been-emulated-up-to-now.57558/#post-1892781

PCs are a lot stronger than XB1 and there's still no working OG XBOX emulator last time I checked... the PS2 is more exotic arch-wise and yet we have PCSX2.
Homebrew's a different beast and relies entirely on the community. That we don't have an OG Xbox emulator on par with PCSX2, Dolphin or even Reicast has nothing to do with the architecture and everything to do with community interest and drive.

Xbox isn't some magically impossible-to-emulate machine; in fact, it's more off-the-shelf than its competitors were and should be simpler to document, if people were willing to put in the work.
 

LordRaptor

Member
Aug 20, 2015
9,769
579
520
It's because Moore's Law is basically defunct these days (we were stuck at 28nm for 5 years!), so there's no "free lunch" upgrade every year and a half like in the 90s/2000s. Devs will need to extract every bit of extra juice they can.
This is hugely off-topic, but there are two sides to this argument - it's not just a question of 'power' budget (as in the technical specifications required to achieve the desired result), it's also a question of manpower budget.
It's arguable that we're already at the point where fully utilising existing tech is cost-prohibitive in terms of things like asset creation. There may be technological advances that reduce this cost (like fully physically-based asset creation, where things behave exactly as expected because, say, how a bullet behaves when it hits a given material is pre-defined from a library, and stone chips, wood splinters and flesh bleeds), but to what extent this propagates is still something of an 'if' rather than a 'when'.

What makes you think that PCs will exist, but consoles won't?
PCs aren't fundamentally consumption devices in the same way that a console is; there will likely always be production reasons to own a computer that performs its tasks locally, and as long as PCs are general purpose enough to have a production necessity, they will also have a consumption benefit because people are always going to goof off.
 

dracula_x

Member
Jun 4, 2013
4,321
3
0
Nintendu 64
twitter.com
...


PCs aren't fundamentally consumption devices in the same way that a console is; there will likely always be production reasons to own a computer that performs its tasks locally, and as long as PCs are general purpose enough to have a production necessity, they will also have a consumption benefit because people are always going to goof off.
Google's Andromeda project will change that. Basically, like the Switch, you'll be able to attach a smartphone to a dock/display and voila, the smartphone becomes a PC (in terms of performance).
 

LordRaptor

Member
Aug 20, 2015
9,769
579
520
Google's Andromeda project will change that. Basically, like the Switch, you'll be able to attach a smartphone to a dock/display and voila, the smartphone becomes a PC (in terms of performance).
Sure, I can easily envisage a future where "your PC" is effectively a USB thumbdrive that you dock, I just can't envisage a future where there is no need for local computational power and nobody has anything except thin clients connecting to a mainframe cloud.

The thing that consoles do well and sell because of - namely a 'single serve' purpose as dedicated hardware - is also their own biggest weakness, because a 'general purpose' device of equivalent utility makes them obsolete for anything but the purest enthusiast.
 

Negotiator

Member
Jun 28, 2011
4,676
1,480
885
This is hugely off-topic, but there are two sides to this argument - it's not just a question of 'power' budget (as in the technical specifications required to achieve the desired result), it's also a question of manpower budget.
It's arguable that we're already at the point where fully utilising existing tech is cost-prohibitive in terms of things like asset creation. There may be technological advances that reduce this cost (like fully physically-based asset creation, where things behave exactly as expected because, say, how a bullet behaves when it hits a given material is pre-defined from a library, and stone chips, wood splinters and flesh bleeds), but to what extent this propagates is still something of an 'if' rather than a 'when'.
Costs keep skyrocketing because of ever-increasing 1) production values (poly count, high-res textures, mo-cap, storytelling, etc.) and 2) PR/marketing budgets.

Coding is a small part of it.

Do you think it's a coincidence that we have all this low-level API craze all of a sudden? Why didn't this happen earlier?

We had Glide when 3dfx reigned supreme, but Direct3D made it irrelevant later on. Moore's Law was alive and kicking for decades.

PCs aren't fundamentally consumption devices in the same way that a console is; there will likely always be production reasons to own a computer that performs its tasks locally, and as long as PCs are general purpose enough to have a production necessity, they will also have a consumption benefit because people are always going to goof off.
You're confusing business usage with consumer usage. Consumers already use tablets a lot. How many people use a home PC religiously like in the 90s/2000s? People who do it are a minority. Mobile domination (smartphones/tablets) is a reality.

Of course that doesn't mean that cloud infrastructures won't cater to businesses too. Apps like Adobe ones or Office 365 already do that. We're heading back to the dumb terminal 70s vision (which ended abruptly because of the CLI -> GUI paradigm shift). Everything will run off a server. Companies love subscription-based models (SaaS).

Nvidia has also invested in cloud technology. Twenty years from now, they won't manufacture dedicated hardware for consumers, unless we're talking about limited production runs (like modern-era vinyl records).

I just can't envisage a future where there is no need for local computational power and nobody has anything except thin clients connecting to a mainframe cloud.
As I've said before, I couldn't imagine YouTube & Netflix in 1996 either.

20 years is a lot of time for technological paradigm shifts...
 

chaosblade

Unconfirmed Member
Jul 14, 2009
21,129
1
0
Isn't that what fucked up OG XBOX?

Remember when Nvidia didn't want to pass the cost savings (die shrunk chips) to Microsoft? Rumors say that Sony wasn't exactly happy with RSX (PS3 GPU) either...

So, why is this supposed to be "good news"? I hope I'm wrong, but Nvidia has set a bad precedent in console hardware.
You would think Nintendo would have made sure their contract covered the problems Sony and MS had with Nvidia. No idea if it was ever debunked but the rumors indicated that the deal was going to favor Nintendo more than Nvidia.

Plus Nintendo at least attempted to work with them on 3DS and ultimately walked, so they probably knew what they were getting into in that regard too. And Nvidia also knew that Nintendo wouldn't hesitate to walk away from a deal they weren't happy with since they had already done it to them.
 

Durante

Member
Oct 1, 2006
48,836
1
0
peter.metaclassofnil.com
If ease of use were of utmost importance, games would have ditched Asm/C/C++ in favor of Java/C#.
This is an interesting point to discuss, given that I'm pretty sure that the majority of games currently in development are being developed in C#.

Google's Andromeda project will change that. Basically, like the Switch, you'll be able to attach a smartphone to a dock/display and voila, the smartphone becomes a PC (in terms of performance).
Well, at that point PCs still exist. They just look different.

The fundamental difference is the one LordRaptor correctly identified. There will always be a device to actually create and nothing stops that device from also being used to consume.
 

Negotiator

Member
Jun 28, 2011
4,676
1,480
885
This is an interesting point to discuss, given that I'm pretty sure that the majority of games currently in development are being developed in C#.
Are you talking about indies (games of low complexity, with relatively simple graphics)?

Sure, but what about complex and demanding AAA games? Java/C# ain't gonna cut it, especially on consoles with their limited resources.

Low-level APIs are a necessity for the 2nd category. I don't expect to see Limbo written in DX12/Vulkan/GNM/Metal, because there's no benefit performance-wise. Battlefield needs it though.
 

LordRaptor

Member
Aug 20, 2015
9,769
579
520
Costs keep skyrocketing because of ever-increasing 1) production values (poly count, high-res textures, mo-cap, storytelling, etc.),

...

Coding is a small part of it.
Well, that's the point I was making.
If you create a new photo-realistic rendering technique, you need new photo-realistic assets to utilise it, which exponentially increases asset costs.
 

Negotiator

Member
Jun 28, 2011
4,676
1,480
885
Well, that's the point I was making.
If you create a new photo-realistic rendering technique, you need new photo-realistic assets to utilise it, which exponentially increases asset costs.
Low-level APIs were not just made for photorealism. There are some features that reduce CPU overhead, like draw call bundles, which can help games with lots of NPCs (like AC Unity or RTS games). Apart from that, you can also offload some demanding tasks to GPGPU compute shaders.

I don't expect every game to be as photorealistic as The Order or Uncharted, but I expect it to run decently at least. Games like AC Unity or Just Cause 3 should have never been released like that.

Low-level APIs are even more important in portable consoles, but here's the catch: there's no easy way to transfer Jaguar/GCN optimizations to an entirely different uArch, not to mention the RAM difference (4 vs 8GB).

People who think that high-level PC/DX11 code will be ported to Switch easily are delusional IMHO.
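The draw-call-bundle point above can be shown with a toy cost model. The numbers and function names are made up; only the shape matters: a naive path pays full driver overhead per draw, while a pre-recorded bundle pays it roughly once:

```python
# Toy cost model of why draw-call batching/bundles cut CPU overhead.
# Costs are arbitrary units, invented for illustration.

DRIVER_COST_PER_SUBMIT = 10   # validation/state setup per API submission
COST_PER_DRAW_IN_BUNDLE = 1   # pre-validated draws replay almost for free

def naive_submit(num_draws: int) -> int:
    # One full driver round-trip per draw call.
    return num_draws * DRIVER_COST_PER_SUBMIT

def bundled_submit(num_draws: int) -> int:
    # Record the draws once into a bundle, then replay with one submission.
    return DRIVER_COST_PER_SUBMIT + num_draws * COST_PER_DRAW_IN_BUNDLE

crowd = 10_000  # e.g. an AC Unity-style NPC crowd
print(naive_submit(crowd))    # 100000
print(bundled_submit(crowd))  # 10010
```

That roughly 10x gap in CPU-side work is the kind of headroom the low-level APIs are chasing, which matters even more on a power-constrained portable.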
 

Cuburt

Member
Nov 14, 2012
6,298
0
0
This seems like pretty big news, and quite telling in certain aspects that don't line up with the norms of each company. This may be a real partnership, in line with what Nintendo were going to do with Sony on the Nintendo PlayStation.

With Nvidia's knowledge of ARM architecture, and how it could grow with them, this really makes me excited for what they have planned for the future.

Perhaps Nintendo will be able to build better infrastructure now, with better online. Who knows, it may even help make them more flexible than the Xbox and PlayStation ecosystems are right now.

I really like that Nintendo seems to be making a reset when it comes to backwards compatibility. Making a hard stop and starting over with future-proofing in mind works for me.
Assuming that is the trade-off they are making. People assumed we'd get that with PlayStation and Xbox, but it isn't necessarily how it has played out so far. x86 made PC portability easier, but I'm not convinced it makes cross-platform scaling completely painless.
 

Mpl90

Two copies sold? That's not a bomb guys, stop trolling!!!
Mar 10, 2011
22,885
0
955
29
theflyingthoughtsblog.wordpress.com
I suppose this is the right thread for this

http://venturebeat.com/2016/11/13/nvidias-ceo-on-everything-from-ais-dangers-to-donald-trump-and-the-nintendo-switch/

VB: There was a day when it seemed like you were happy with just serving the PC gaming market. The console was a less attractive market. I wonder why you guys went after the Nintendo Switch, and how you accomplished that.

Huang: We’re dedicated to the gaming market and always have been. Some parts of the market, we just weren’t prepared to serve them. I was fairly open about how, when this current generation of consoles was being considered, we didn’t have x86 CPUs. We weren’t in contention for any of those. However, the other factor is whether we could really make a contribution or not. If a particular game console doesn’t require our special skills, what we can uniquely bring, then it’s a commodity business that may not be suited for us.

In the case of Switch, it was such a ground-breaking design. Performance matters, because games are built on great performance, but form factor and energy efficiency matter incredibly, because they want to build something that’s portable and transformable. The type of gameplay they want to enable is like nothing the world has so far. It’s a scenario where two great engineering teams, working with their creative teams, needed to hunker down. Several hundred engineering years went into building this new console. It’s the type of project that really inspires us, gets us excited. It’s a classic win-win.
 

Snakeyes

Member
Sep 20, 2009
7,560
2
865
The type of gameplay they want to enable is like nothing the world has so far.
Interesting, assuming this isn't just PR speak. What they showed in the reveal trailer definitely looked more flexible and convenient than what's currently on the market, but there wasn't anything truly revolutionary. Makes you wonder how much of the Switch is still under wraps. Genyo Takeda also suggested something to that effect last year:
You have already announced your plan to integrate the architectures of your dedicated game systems. Now that you will release your software on smart devices as well, I would like to hear what kind of challenges Nintendo will need to cope with in this regard. May I ask this question to Mr. Takahashi and Mr. Miyamoto from the software perspective and to Mr. Takeda from the hardware perspective?

Genyo Takeda (Senior Managing Director, Technology Fellow):

 I understand that, thanks to the evolution of computer technology, aiming to realize a virtualized software development environment that does not depend on specific hardware is becoming the technological norm today. Simultaneously, regarding input and output technologies, I believe that it is also in line with the current technological trend that Nintendo should challenge itself with the creation of a unique user interface.
 

Hermii

Member
Sep 17, 2012
6,982
0
0
Interesting, assuming this isn't just PR speak. What they showed in the reveal trailer definitely looked more flexible and convenient than what's currently on the market, but there wasn't anything truly revolutionary. Wonder how much of the Switch is still under wraps.
I'm assuming it's PR speak.

If it's what they showed in the reveal trailer, it's still kind of true. The way you can use the Joy-Cons in a Wiimote/Nunchuk configuration, connect them to a grip, connect them to the tablet, take it on the go, use them on the TV and use them for local multiplayer is unprecedented. If there were something truly revolutionary in the hardware, it would probably have leaked by now.
 
Jun 27, 2015
12,541
2
345
This seems like pretty big news, and quite telling in certain aspects that don't line up with the norms of each company. This may be a real partnership, in line with what Nintendo were going to do with Sony on the Nintendo PlayStation.
This is pretty much what Nintendo did with IBM for the GameCube.
 

LordOfChaos

Member
Mar 31, 2014
11,323
6,381
915
as you know, the Nintendo architecture and the company tends to stick with an architecture for a very long time.


Sounds like plans are... PPC750-ish. Nintendo had a large deal with IBM to make the GameCube processor, then stuck with a higher-clocked one for the Wii, and then a multicore version with larger caches for the Wii U (resynthesized, and at the second-highest clock a 750 has ever run, which is not insubstantial work, to be fair).

I'm enthused by the modernness of the architecture for today, but I also don't want a situation like the above where a decade later a new console is held back by a desire for BC.


But, I'm getting literally a decade ahead of myself.
 

Eolz

Member
Jun 1, 2014
10,193
20
535
Sounds like plans are... PPC750-ish. Nintendo had a large deal with IBM to make the GameCube processor, then stuck with a higher-clocked one for the Wii, and then a multicore version with larger caches for the Wii U (resynthesized, and at the second-highest clock a 750 has ever run, which is not insubstantial work, to be fair).

I'm enthused by the modernness of the architecture for today, but I also don't want a situation like the above where a decade later a new console is held back by a desire for BC.


But, I'm getting literally a decade ahead of myself.
Hey, at least ARM should be a lot more future-proof than PowerPC if they ever need to change HW partners while keeping BC...
 

Thraktor

Member
Dec 29, 2004
3,889
2
0
Dublin, Ireland
Sounds like plans are... PPC750-ish. Nintendo had a large deal with IBM to make the GameCube processor, then stuck with a higher-clocked one for the Wii, and then a multicore version with larger caches for the Wii U (resynthesized, and at the second-highest clock a 750 has ever run, which is not insubstantial work, to be fair).

I'm enthused by the modernness of the architecture for today, but I also don't want a situation like the above where a decade later a new console is held back by a desire for BC.


But, I'm getting literally a decade ahead of myself.
Nintendo's decision to stick with PPC750 for so long wasn't just a desire to use precisely the same tech for as long as possible, it was because there literally weren't any other options for them while retaining full backwards-compatibility. IBM's CPU research went firmly in the direction of large, power-hungry server chips just after the Gamecube, so unless Nintendo wanted to fully fund the design of a new CPU from scratch they were pretty much stuck with the PPC750 so long as they wanted to retain BC.

By switching to ARMv8, they've guaranteed this isn't going to happen. ARM is the most common ISA around, and even aside from ARM's stock cores (which are updated at quite a frequent pace), there are a wide variety of other companies all designing new ARM cores which are fully binary compatible with what Nintendo will be using in Switch. What's more, almost all of these cores are designed for the cost/power consumption brackets that Nintendo will be interested in, so Nintendo is guaranteed to have an upgrade path with full BC for quite a long time.
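"Binary compatible" here has a concrete meaning: any vendor's ARMv8 core executes the same AArch64 machine code, which is why the CPU side of BC survives a vendor change. A minimal sketch, assuming little-endian ELF executables (the 20-byte header stubs below are synthetic, and real console binary formats differ; the `EM_*` values themselves are fixed by the ELF specification):

```python
import struct

EM_AARCH64 = 183  # e_machine value for 64-bit ARM, fixed by the ELF spec
EM_PPC = 20       # e_machine value for 32-bit PowerPC

def elf_machine(header: bytes) -> int:
    """Read e_machine from a little-endian ELF header (u16 at offset 18)."""
    return struct.unpack_from("<H", header, 18)[0]

def runs_on_aarch64(header: bytes) -> bool:
    # Any ARMv8 core reports the same machine ID, so the check doesn't
    # care which vendor designed or fabbed the CPU.
    return elf_machine(header) == EM_AARCH64

# Synthetic header stubs: 16-byte e_ident, then e_type and e_machine.
fake_switch_binary = b"\x7fELF" + bytes(12) + struct.pack("<HH", 2, EM_AARCH64)
fake_ppc_binary = b"\x7fELF" + bytes(12) + struct.pack("<HH", 2, EM_PPC)

print(runs_on_aarch64(fake_switch_binary))  # True
print(runs_on_aarch64(fake_ppc_binary))     # False
```

The GPU side has no equivalent shared machine ID across vendors, which is exactly why the shader ISA, not the CPU, is the hard part of any future vendor change.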
 

LordOfChaos

Member
Mar 31, 2014
11,323
6,381
915
Nintendo's decision to stick with PPC750 for so long wasn't just a desire to use precisely the same tech for as long as possible, it was because there literally weren't any other options for them while retaining full backwards-compatibility. IBM's CPU research went firmly in the direction of large, power-hungry server chips just after the Gamecube, so unless Nintendo wanted to fully fund the design of a new CPU from scratch they were pretty much stuck with the PPC750 so long as they wanted to retain BC.

By switching to ARMv8, they've guaranteed this isn't going to happen. ARM is the most common ISA around, and even aside from ARM's stock cores (which are updated at quite a frequent pace), there are a wide variety of other companies all designing new ARM cores which are fully binary compatible with what Nintendo will be using in Switch. What's more, almost all of these cores are designed for the cost/power consumption brackets that Nintendo will be interested in, so Nintendo is guaranteed to have an upgrade path with full BC for quite a long time.
That's true, it may not be quite the same.

I also wonder how low-level or interoperable the new NVN API will be. I seem to remember a dev saying that, between ease of use and low-level programming flexibility, the Wii U's API had neither, while the Switch's has at least one.

Sony, for instance, is sticking with Jaguar cores and clocking them the same as the base PS4 in unpatched games for perfect compatibility, while even the XBO S uses higher clocks for a bit better performance, so perhaps there are differences in how low-level vs. portable the XBO's version of DX and the PS4's GNM are.