
Nintendo Switch: Powered by Custom Nvidia Tegra Chip (Official)

LordOfChaos

Member
There seem to be vents on top showing a heatsink inside which I'm also encouraged by. Maybe in docked mode it can power up more, even if the dock doesn't add anything itself.
 

vareon

Member
Are you all positive this is sub Xbox1 performance when zero specs have been released? We may be surprised to find this can be very comparable. I'm excited to know. Either way though, I'm pretty damn happy.

For it to be better than XB1, either

1) Nintendo has some VERY advanced technology to achieve XB1-level power in a tablet
2) It's priced ridiculously high

Both seem unlikely.
 

Leatherface

Member
For it to be better than XB1, either

1) Nintendo has some VERY advanced technology to achieve XB1-level power in a tablet
2) It's priced ridiculously high

Both seem unlikely.


Keep in mind the portable side will probably be rendering at 720p, and we know nothing of the docking unit yet.
 

thefro

Member
For it to be better than XB1, either

1) Nintendo has some VERY advanced technology to achieve XB1-level power in a tablet
2) It's priced ridiculously high

Both seem unlikely.

It's a more modern chipset, and all the ARM processors that have been rumored are more powerful per core than Jaguar.
 

Costia

Member
Definitely. I just hope it's not running CUDA..

Yes. The wrong one.

You know NV hw runs OpenCL as well, right? And that both APIs are largely interchangeable? But sure, 'the API designed specifically for NV GPUs' - this is what happens when you actively downplay OpenCL on your hw for generations on end, just to have your proprietary API lock-in in a couple of industries. Now they have a CUDA trojan in the console industry as well. Just because they're determined to fight OpenCL to the end.
Your posts don't make much sense to me.
You are saying that it is a bad thing that NVIDIA is giving you the option to use the NVIDIA CUDA API on NVIDIA hardware? What? o_O
You can still use OpenCL if you dislike NVIDIA so much.
But personally, from what I have heard, CUDA is easier and nicer to use than OpenCL. (I have some experience with OpenCL but none with CUDA.)
 

low-G

Member
Are you all positive this is sub Xbox1 performance when zero specs have been released? We may be surprised to find this can be very comparable. I'm excited to know. Either way though, I'm pretty damn happy.

Apple, which has commissioned the most advanced mobile microarchitectures, has not surpassed the XB1 in gaming performance... There is zero chance it'll top an XB1.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
Your posts don't make much sense to me.
You are saying that it is a bad thing that NVIDIA is giving you the option to use the NVIDIA CUDA API on NVIDIA hardware? What? o_O
No, I'm not saying that. NV desktop hw runs OCL. Switch will most likely not have that, erm, luxury.

You can still use opencl if you dislike NVIDIA so much.
Mind quoting me where I say I dislike NV?

But personally, from what i have heard, CUDA is easier and nicer to use than OpenCL. (I had some experience with opencl but no cuda)
From what 'I've heard' (my dev screen in front of me), both APIs are doing the same. There have been times when OCL has been catching up and times when CUDA's been catching up. But NV are artificially keeping an old OCL version on their hw (1.2 when the rest of the world is at 2.0). Can you guess why?
 
Apple, which has commissioned the most advanced mobile microarchitectures, has not surpassed the XB1 in gaming performance... There is zero chance it'll top an XB1.

The problem with Apple, though, is that people are not breaking their backs to make graphically intensive games on the platform. Unfortunately, fewer and fewer people are willing to buy a game outright, so developers have to make their games as cheaply as possible and load them with F2P incentives.

You also have to throw in legacy hardware as well - no one is going to make a game only for the latest and greatest.
 

Ryde3

Member
So with these inferior specs, how are they going to convince third parties to invest time optimizing a port for this platform, with its wonky controllers and unproven online infrastructure, when their money and time are better spent developing PS4/XB1/PC versions with larger, proven install bases and more flexible hardware?

Again, Nintendo is caught in the awkward position of being unattractive to the development of AAA third party games.

I get they announced a list of supporting third parties. They do that every gen to compensate for their insecurities. But last time it just ended up being mostly inferior ports until support died out.

Unless development costs on the platform are record low and accessible, I find it hard to imagine RDR2 or the next Battlefield coming to Switch.

I don't think the controllers require any 'special' development; they seem fairly standard as far as inputs go.
 

bomblord1

Banned
A 1TF (according to that pastebin) Pascal-based Nvidia chip will outperform or perform on par with the Xbox One.

I would even expect games to run at a higher resolution.
 
I believe it's unconfirmed, but Emily Rogers reported 720p.

Boo. Was hoping for 1080p. I'll probably have this hooked up to a TV 99% of the time.

Though I suppose most games would be upscaled even with the higher rez screen, so it'd really only make a difference for the menus.
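The gap behind that hope is easy to quantify: 1080p is 2.25x the pixels of 720p, which is roughly the extra shading work docked output would demand. A quick check of the arithmetic:

```python
# Pixel counts for the two resolutions under discussion.
hd = 1280 * 720        # 921,600 pixels (the rumored Switch panel)
full_hd = 1920 * 1080  # 2,073,600 pixels (TV output)

# Docked mode would have to shade 2.25x the pixels to render natively at 1080p.
ratio = full_hd / hd
print(f"1080p is {ratio:.2f}x the pixels of 720p")
```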
 

Peru

Member
I'm surprised the Skyrim remaster was shown, but that kind of stuff is a bonus and likely centered around the launch window. Japanese third parties and a united Nintendo dev force in one place will make its lineup much, much stronger than recent Nintendo consoles.

This thing is powerful enough to get the gorgeous out of Nintendo's team and bring some Japanese stalwarts up to modern standards (yes, I'm thinking about Monster Hunter).

And first impressions of the dock do seem to at least hint at the possibility that you will get 1080p for your home gaming with 720p on the go.
 
I'm pretty sure I've read the A9X is a better CPU than the mobile Jaguar used in the XBO and PS4.

A9X's CPU cores are leagues ahead of the XB1's, but the GPU is either on par or just behind.

The A10 in the iPhone 7 offers similar performance to A9X, so it's really just a matter of time until an A10X or A11 (with new GPU architecture) arrives.
 
Yeah, let's be honest, it won't be as powerful as the Xbox One. But look at the library of Wii U games... compare that to the Xbox One and PS4 and you will see that Nintendo consoles are all about exclusive games not available elsewhere (Bayonetta 2, anyone?)
 
That is interesting, but at a guess (given the author of the tweet) we're looking at Japanese developers targeting the Japanese market, and given Switch's likely domination of what's left of Japan's dedicated gaming industry, they may be willing to move games over even if it requires a significant downgrade in graphics.

Given that a lot of the games devs have in development aren't of the AAA variety, and many games on PS4 are up-resed PS3 games (Tales of Berseria, Persona 5), I think it'll be fine.

The big AAA games from Japan aim at a global market and will be on PS4/Xbox One by default.
 

Costia

Member
No, I'm not saying that. NV desktop hw runs OCL. Switch will most likely not have that, erm, luxury.
You think they will ship an OpenGL driver but will dump OpenCL support? That sounds extremely unlikely.
Mind quoting me where I say I dislike NV?
You keep saying the APIs are comparable, but you obviously don't like the NVIDIA one...
From what 'I've heard' (my dev screen in front of me), both APIs are doing the same. There have been times when OCL has been catching up and times when CUDA's been catching up. But NV are artificially keeping an old OCL version on their hw (1.2 when the rest of the world is at 2.0). Can you guess why?
My personal experience with OpenCL was quite bad. I am not talking about performance, but ease of use.

To me it sounds like you are here to bash NVIDIA over their proprietary practices (in some cases rightfully so) and not to discuss the Switch's HW... Maybe a separate thread would be a good idea.
 

Datschge

Member
Regarding GPGPU and how to access it, Nintendo talked about pushing GPGPU since the beginning of Wii U and was essentially saying their future systems need to be able to carry forward the technology used on Wii U. If Nvidia is only offering an outdated version of OCL (or not even that), Nintendo would essentially have to redo all their efforts since then. That's not something I expect Nintendo to allow, especially as Nvidia was likely eager for a high-profile design win for Tegra for a change.
 

Sony

Nintendo
This is the Nvidia Shield we deserve. I wonder what OS this device runs. I think Nintendo have a huge opportunity if they make the device run Android.
 

KingSnake

The Birthday Skeleton
What seems pretty certain right now is that it's Pascal-based and it has active cooling (given the vents). Probably only when docked.

We know that the screen is 720p from the leaks (that were proven right with everything else), so now the question is at what resolution does it render when docked.
 
For it to be better than XB1, either

1) Nintendo has some VERY advanced technology to achieve XB1-level power in a tablet
2) It's priced ridiculously high

Both seem unlikely.

Xbox One was weak even by 2013 standards, and the Nvidia Shield was at almost half its TF with Maxwell. If it uses Pascal, it's entirely possible that it would have more TF in FP16 than the Xbox One and less in FP32.
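To put rough numbers on the FP16/FP32 point: a GPU's theoretical peak is shader cores x clock x 2 (a fused multiply-add counts as two ops), and Pascal-class Tegra parts can issue packed FP16 at twice the FP32 rate. A minimal sketch of that arithmetic, using illustrative figures (256 cores at 1.0 GHz is an assumption, not a confirmed Switch spec):

```python
# Peak throughput estimate: cores * clock_GHz * ops_per_cycle / 1000 -> TFLOPS.
# A fused multiply-add counts as 2 floating-point ops per cycle.
def peak_tflops(cores, clock_ghz, ops_per_cycle=2):
    """Theoretical peak in TFLOPS for a given core count and clock."""
    return cores * clock_ghz * ops_per_cycle / 1000.0

# Illustrative figures only -- 256 shader cores at 1.0 GHz is an assumption,
# not a confirmed Switch spec.
fp32 = peak_tflops(256, 1.0)      # FP32 peak
fp16 = fp32 * 2                   # Pascal-class packed FP16 runs at 2x rate

# Xbox One's GPU for comparison: 768 cores at 0.853 GHz (~1.31 TFLOPS FP32).
xbox_one = peak_tflops(768, 0.853)

print(f"FP32: {fp32:.3f} TF, FP16: {fp16:.3f} TF, XB1: {xbox_one:.2f} TF")
```

On those (assumed) numbers, the hypothetical chip lands below the Xbox One in FP32 but close to it in FP16, which is exactly the "more TF in FP16, less in FP32" scenario described above.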
 

jroc74

Phone reception is more important to me than human rights
Wait a min, what did I miss? I thought we were all on board for it being a Tegra chip?

Someone claimed otherwise and I missed it?
 

Lonely1

Unconfirmed Member
That is interesting, but at a guess (given the author of the tweet) we're looking at Japanese developers targeting the Japanese market, and given Switch's likely domination of what's left of Japan's dedicated gaming industry, they may be willing to move games over even if it requires a significant downgrade in graphics.

Aren't many of those JP games PS4/Vita releases as well?
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
You think they will ship an OpenGL driver but will dump OpenCL support? That sounds extremely unlikely.
I could offer you an avatar bet, but that'd be like taking candy from a baby. Also, where did you hear Switch runs OGL?

You keep saying the APIs are comparable but you obviously dont like the NVIDIA one...
I don't dislike it - I think it's redundant. An API whose sole purpose is to keep the developer locked in.

My personal experience with OpenCL was quite bad. I am not talking about performance, but ease of use.
I've had bad experiences in more APIs than I'd care to remember. I (normally) don't attribute that to those APIs.

To me it sounds like you are here to bash NVIDIA over their proprietary practices (in some cases rightfully so) and not to discuss the Switch's HW... Maybe a separate thread would be a good idea.
I just voiced an opinion, you asked questions, I responded. I'm done with this line anyway.
 

Mokujin

Member
SuperMetalDave64 and HappyNintendoFan.

Stay strong, you two. 'Cause it's gonna be a rough week for you.

No hard feelings, but that's a nice bonus after the reveal; I really wanted that baseless, stubborn speculation against strong sources to stop.

Let the crow eating begin (but I'm sure we'll get a ton of excuses instead).
 

Costia

Member
I could offer you an avatar bet, but that'd be like taking candy from a baby. Also, where did you hear Switch runs OGL?
You think that the ports in the trailer are DX? Or a new Nintendo-exclusive API? Again, sounds very improbable to me. Too much work.
 

wildfire

Banned
Nintendo Switch is powered by the performance of the custom Tegra processor. The high-efficiency scalable processor includes an NVIDIA GPU based on the same architecture as the world’s top-performing GeForce gaming graphics cards.

Knowing how Nvidia does press releases, I strongly suspect Nintendo went with the Kepler version of Tegra instead of Maxwell. It's equally powerful, but the big difference is power efficiency. This, combined with the large screen and the fact that the unit has a visible exhaust port, speaks ominously to how long it lasts in portable mode.
 

The Lamp

Member
I don't think the controllers require any 'special' development; they seem fairly standard as far as inputs go.

Local multiplayer has to be designed around accommodating multiple control options or choosing one or the other: tiny detachable two-button controllers, or full dual-analog input, or involving all four unit screens at once.

It's a lot more complex than what the PS4 and XB1 are doing.
 

kIdMuScLe

Member
I'm guessing that Nvidia is gonna be handling documentation and support instead of developers waiting for Nintendo to send their engineers over, right?
 

lenovox1

Member
Local multiplayer has to be designed around accommodating multiple control options or choosing one or the other: tiny detachable two-button controllers, or full dual-analog input, or involving all four unit screens at once.

It's a lot more complex than what the PS4 and XB1 are doing.


No, it doesn't. The only games shown in the clip using one half were Splatoon, Mario, and Mario Kart, which are games that need fewer inputs. The only game showing asymmetric play was the EA NBA title (though I'm sure Splatoon would support it).

Like with the Wii U, developers will be given options as to what they will and won't support regarding control scheme. There's no need for any multiplayer game to support any methods of control beyond the new "classic" controller and the standard two halves scheme.

And if that's the worst you can bring up about the control scheme, you've got nothing.

Focus on your system architecture and audience reach arguments.
 
So now that active cooling is confirmed, and Tegra Pascal is highly suggested, it's probably safe to say that the clocks will run faster when docked to a power supply.

If that's so, what do we think the docked performance will look like? Just a resolution/framerate bump? Possibly additional effects?
 