
Nintendo Switch: Powered by Custom Nvidia Tegra Chip (Official)

z0m3le

Banned
Who says it is? Even the rumours for the dev kit don't specify the maximum battery life.

Sure, if all the assumptions you make turn out to be accurate then the picture you paint is convincing. If just one of them is off then things will be different. Just like with Wii U.

You aren't necessarily wrong, just way too sure of your assumptions -- because that is what they are at this point.
http://m.neogaf.com/showthread.php?t=1297753

It's in the OP. Laura Dale has been a solid source for Switch for months and leaked some of the information we now know is right. Emily Rogers also confirmed Laura has legit sources.
 

Zedark

Member

I was wrong about it being Emily Rogers or Laura Kate Dale; it was first reported by Eurogamer (who said their sources have confirmed it to them) and backed up a few days later by Direct-Feed Games, also known as NateDrake on NeoGAF, who is seen as a reliable source.

Also, the Nvidia blog post about the Switch states the following:
Nvidia said:
Nintendo Switch is powered by the performance of the custom Tegra processor. The high-efficiency scalable processor includes an NVIDIA GPU based on the same architecture as the world’s top-performing GeForce gaming graphics cards.

It seems really likely that this refers to the new Pascal Architecture, since the new GTX 10** generation runs on Pascal architecture.

That's all I can provide for you.
 

Narroo

Member
I'm just saying that all logic seemingly goes out the window when discussing the specs of upcoming Nintendo hardware. Which then leads to disappointment, meltdowns and misguided expectations.

And yes. Do look at Nintendo's past trajectories, look at their current place in the market. Look at the state of the market itself. Look at the potential audiences and pricing considerations. Look at the goals they set for their new product.
Realize that hardware power seriously isn't that high up on the priority ladder for Nintendo. Meanwhile, these threads always assume it's the most important thing.

And it's worth noting that cutting edge doesn't always equate to raw power. Since the Wii, Nintendo has always chosen to prioritize low energy consumption in their hardware, and this'll be more important than ever.

That's not even considering the additional complexity of the device due to being a hybrid. The thing has a large screen, detachable controllers, a TV dock, and potentially motion controls, judging from the preview. The Switch may very well be an expensive, cutting edge piece of tech, but that doesn't mean it's going to be powerful.
 
Rumours say Pascal, and Nvidia said it's the same architecture as their top GPUs (which isn't Maxwell), but there's no explicit confirmation.

No, at least not for any definition of "confirmed" I'd use.

One of the two leakers (Emily Rogers or Laura Dale, can't remember which) said that their sources confirmed this to her.
Yeah I thought it was not confirmed, I was surprised to read that in this thread.
Hrmmm. Given how NV loves just flaunting any and everything, the fact that this is not confirmed via NV makes me wonder....
 

Mihos

Gold Member
Wonder if this is one of the reasons Shield 2 wasn't released.

Give me GameStream on Switch and I will forgive you
 

Zedark

Member
Yeah I thought it was not confirmed, I was surprised to read that in this thread.
Hrmmm. Given how NV loves just flaunting any and everything, the fact that this is not confirmed via NV makes me wonder....

Could be they are contractually obliged to keep the specifics quiet until Nintendo officially announces them (or they can't ever say anything about it, although they kinda did by saying the chip runs the same architecture as the GTX 10** series).
 

MacTag

Banned
Oh right. Now that you mention that it sounds familiar. I had it in my head that they were two components for some reason.

Edit: Anyways, I still think a cell phone processor is pretty weak for a home console.
X1 isn't a phone processor, it's only been used in tablets and set-tops. X2 hasn't been used in any consumer devices yet, but it's targeted at automotive.

Btw the Jaguar CPU in Xbox One, PS4 and PS4 Pro was designed for laptops. All the consoles use mobile chips essentially now so your continued rhetoric on that being "weak" in only Nintendo's case seems misplaced.
 

Buggy Loop

Member
So... 3 hours on Tegra X1 =/= rumors that the final version would use a Pascal-updated Tegra chip.

The two cannot go hand in hand; any updated SoC by Nvidia would move from 20nm to 16nm at minimum, without even mentioning upgrades to the architecture.

You cannot make this rumor mill spin any faster, I think.
 
I was wrong about it being Emily Rogers or Laura Kate Dale; it was first reported by Eurogamer (who said their sources have confirmed it to them) and backed up a few days later by Direct-Feed Games, also known as NateDrake on NeoGAF, who is seen as a reliable source.

Also, the Nvidia blog post about the Switch states the following:


It seems really likely that this refers to the new Pascal Architecture, since the new GTX 10** generation runs on Pascal architecture.

That's all I can provide for you.

Well, I doubt Nvidia would call a generation-older Nvidia tech "old and shit", for example. It could just be PR speak.
 
Except his info is quite obviously nonsense (assuming we're talking about the same info, where he talks about Parker having 1/8th the performance of Xbox One due to bandwidth constraints).

I believe he talked about the Tegra X1 being 1/8 the performance of the Xbone SoC. It does seem like he is focusing on bandwidth (an important factor, but not the whole story, as we know), but the folks over there confirm that he is a dev who would be in a position to have info. Nothing he says is outlandish. Take it all with a grain of salt, but I'm giving his info a chance. Looking at the LPDDR4 available, his hints would peg Switch's RAM bandwidth at around ~60 GB/s.
 

TLZ

Banned



Wakerlink.jpg
 

thefro

Member
Well, I doubt Nvidia would call a generation-older Nvidia tech "old and shit", for example. It could just be PR speak.

It could be, but "the world's top-performing GeForce gaming graphics cards" run on the Pascal architecture, not Maxwell. It strongly implies it's based on their newest tech.
 

IvorB

Member
X1 isn't a phone processor, it's only been used in tablets and set-tops. X2 hasn't been used in any consumer devices yet, but it's targeted at automotive.

Btw the Jaguar CPU in Xbox One, PS4 and PS4 Pro was designed for laptops. All the consoles use mobile chips essentially now so your continued rhetoric on that being "weak" in only Nintendo's case seems misplaced.

I'm aware that the Jaguar is a laptop processor. Are the GPU components of the Xbone and PS4 mobile components also? Are you saying this Tegra processor is on par with what's in the other two consoles? Also, I may be wrong here, but I'm pretty sure there is a power gap between laptop components and something that powers a tablet or set-top box.
 
I wonder if the TV mode will support 4K. I own the Nvidia Shield TV and you could get reasonable framerates on The Talos Principle downscaled from 4K, and the Switch should be stronger than that, so indies should have a decent chance of running at 4K.
 

cheezcake

Member
X1 isn't a phone processor, it's only been used in tablets and set-tops. X2 hasn't been used in any consumer devices yet, but it's targeted at automotive.

Btw the Jaguar CPU in Xbox One, PS4 and PS4 Pro was designed for laptops. All the consoles use mobile chips essentially now so your continued rhetoric on that being "weak" in only Nintendo's case seems misplaced.

At their time of release, the GPUs in the Xbox One and PS4 were certainly not mobile chips.
 

koam

Member
I call BS on the 3 hour battery life. Dev kits are tethered, they don't run on batteries. Battery life will probably be short but there's no way the dev kit can be an indication of battery life.
 

ekurisona

Member
I'm still interested to know what Iwata meant by "fully absorb the Wii U architecture."


context, for those who may not be familiar with this quote...


Question 5:

You have explained your concern about users being divided by hardware. Currently, you have both a handheld device business and a home console business. I would like to know whether the organizational changes that took place last year are going to lead to, for example, the integration of handheld devices and home consoles into one system over the medium term, or a focus on cost saving and the improvement of resource efficiency in the medium run. Please also explain if you still have room to reduce research and development expenses.


Answer 5 Iwata:

Last year Nintendo reorganized its R&D divisions and integrated the handheld device and home console development teams into one division under Mr. Takeda. Previously, our handheld video game devices and home video game consoles had to be developed separately as the technological requirements of each system, whether it was battery-powered or connected to a power supply, differed greatly, leading to completely different architectures and, hence, divergent methods of software development. However, because of vast technological advances, it became possible to achieve a fair degree of architectural integration. We discussed this point, and we ultimately concluded that it was the right time to integrate the two teams.

For example, currently it requires a huge amount of effort to port Wii software to Nintendo 3DS because not only their resolutions but also the methods of software development are entirely different. The same thing happens when we try to port Nintendo 3DS software to Wii U. If the transition of software from platform to platform can be made simpler, this will help solve the problem of game shortages in the launch periods of new platforms. Also, as technological advances took place at such a dramatic rate, and we were forced to choose the best technologies for video games under cost restrictions, each time we developed a new platform, we always ended up developing a system that was completely different from its predecessor. The only exception was when we went from Nintendo GameCube to Wii. Though the controller changed completely, the actual computer and graphics chips were developed very smoothly as they were very similar to those of Nintendo GameCube, but all the other systems required ground-up effort. However, I think that we no longer need this kind of effort under the current circumstances. In this perspective, while we are only going to be able to start this with the next system, it will become important for us to accurately take advantage of what we have done with the Wii U architecture. It of course does not mean that we are going to use exactly the same architecture as Wii U, but we are going to create a system that can absorb the Wii U architecture adequately. When this happens, home consoles and handheld devices will no longer be completely different, and they will become like brothers in a family of systems.

Still, I am not sure if the form factor (the size and configuration of the hardware) will be integrated. In contrast, the number of form factors might increase. Currently, we can only provide two form factors because if we had three or four different architectures, we would face serious shortages of software on every platform. To cite a specific case, Apple is able to release smart devices with various form factors one after another because there is one way of programming adopted by all platforms. Apple has a common platform called iOS. Another example is Android. Though there are various models, Android does not face software shortages because there is one common way of programming on the Android platform that works with various models. The point is, Nintendo platforms should be like those two examples. Whether we will ultimately need just one device will be determined by what consumers demand in the future, and that is not something we know at the moment. However, we are hoping to change and correct the situation in which we develop games for different platforms individually and sometimes disappoint consumers with game shortages as we attempt to move from one platform to another, and we believe that we will be able to deliver tangible results in the future.

from:

Corporate Management Policy Briefing / Third Quarter Financial Results Briefing
for the 74th Fiscal Term Ending March 2014
Q & A

https://www.nintendo.co.jp/ir/en/library/events/140130qa/02.html
 

MacTag

Banned
I'm aware that the Jaguar is a laptop processor. Are the GPU components of the Xbone and PS4 mobile components also? Are you saying this Tegra processor is on par with what's in the other two consoles?
Tegra hasn't been for phones in going on 4 gens now (Logan, Erista, Parker, Xavier) and the primary target for Tegra now is automotive rather than mobile devices. We don't know exactly what's in the NS but Nvidia pointed to the GPU architecture being found in [desktop] GeForce gaming cards in their press release (meaning Pascal or something newer).

System spec likely isn't going to pass PS4 and Xbox One (although aspects like the CPU very well could). That doesn't change the fact you're way off base calling one console weak versus the others for using phone parts (none use phone chips, all use low-power mobile chips).
 

IvorB

Member
Tegra hasn't been for phones in going on 4 gens now (Logan, Erista, Parker, Xavier) and the primary target for Tegra now is automotive rather than mobile devices. We don't know exactly what's in the NS but Nvidia pointed to the GPU architecture being found in [desktop] GeForce gaming cards in their press release (meaning Pascal or something newer).

System spec likely isn't going to pass PS4 and Xbox One (although aspects like the CPU very well could). That doesn't change the fact you're way off base calling one console weak versus the others for using phone parts (none use phone chips, all use low-power mobile chips).

Okay, well when I see "Tegra" I think cell phones and Ouya but if you're saying that things have advanced beyond that recently then fair enough.
 

MacTag

Banned
I wonder if the TV mode will support 4K. I own the Nvidia Shield TV and you could get reasonable framerates on The Talos Principle downscaled from 4K, and the Switch should be stronger than that, so indies should have a decent chance of running at 4K.
I'm expecting 4K video streaming, given it's part of Pascal's base feature set. Probably not for games, though, even if the system could handle them.
 

KingSnake

The Birthday Skeleton
Zombie, no offense dude. But this was you in 2012 about the Wii-U Gflop count: (I actually remembered your posting style from back then.)

Do you realise that your only contribution to this thread is bashing others? You haven't had even one post contributing in any way to the discussion. Not even a wrong theory or something.
 

99Luffy

Banned
X1 isn't a phone processor, it's only been used in tablets and set-tops. X2 hasn't been used in any consumer devices yet, but it's targeted at automotive.

Btw the Jaguar CPU in Xbox One, PS4 and PS4 Pro was designed for laptops. All the consoles use mobile chips essentially now so your continued rhetoric on that being "weak" in only Nintendo's case seems misplaced.
They were never designed for laptops, because there isn't a laptop out there with an 8-core Jaguar CPU.
 

Hoo-doo

Banned
Do you realise that your only contribution to this thread is bashing others? You haven't had even one post contributing in any way to the discussion. Not even a wrong theory or something.

I have posted many times what I expect out of this thing. And it's in many ways exciting, disappointing in some others.

Why can't I also comment on the general atmosphere in here, which is in my opinion a gross overestimation of the capabilities of the device? Is it only 'contributing to the discussion' if I agree with it being a powerhouse?
I know you prefer an echo chamber, you made that clear in countless NX threads even before the reveal, but this is still a discussion board. You don't get to decide what does and what doesn't contribute to the discussion simply because you have other thoughts on the matter.

And I wasn't bashing anyone, I was merely reminding everyone that at the same place in time before the Wii-U launch, a lot of people had very similar postings of the device's power with plenty of sourcing and insiders.
None of it panned out. I'm merely drawing some parallels here. Maybe this time will be totally different, but i'm thinking it won't. Feel free to disagree.
 

MacTag

Banned
They were never designed for laptops, because there isn't a laptop out there with an 8-core Jaguar CPU.
Jaguar's a low power draw microarchitecture designed for low power draw devices. Like laptops, tablets, microservers and embedded solutions. Not that different from Tegra in target applications at all.
 

Trago

Member
The other exciting thing is that there are better dev tools for them to use. Nvidia must be a huge blessing for this thing.
 

Thraktor

Member
I still don't understand why you (and other people) associate any particular performance level with "Pascal-based". Something Pascal-based could be just as fast as X1. Hell, it could be slower!

The microarchitecture is just one facet determining the final performance, and given the relatively small changes between Maxwellv2 and Pascal, not even a particularly important one.

You're quite right, but there is a lower bound to a Pascal GPU (1 SM and let's say 750MHz passively cooled), and if we're talking about Switch as a handheld, then even that lower bound is still far higher than most people would have expected for a new Nintendo handheld.
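That lower bound can be put in rough numbers. A Pascal SM has 128 CUDA cores, and peak FP32 throughput is 2 ops per core per cycle (one fused multiply-add); the SM count and clock below are just the hypothetical floor described above, not a known spec:

```python
def fp32_gflops(sm_count, cores_per_sm, clock_mhz):
    # Peak FP32 throughput: 2 ops (fused multiply-add) per core per cycle
    return 2 * sm_count * cores_per_sm * clock_mhz / 1000.0

# 1 SM of 128 Pascal CUDA cores at a passively-cooled 750 MHz
print(fp32_gflops(1, 128, 750))  # 192.0 GFLOPS
```

Even that worst case lands around 192 GFLOPS FP32, which is indeed well beyond any previous Nintendo handheld.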

I posted this in the other Switch thread but it got quickly buried among battery life argument posts. And it seems more appropriate to post it here in a tech focused thread.

This might not mean anything, but it's something that caught my eye:

One thing I caught in the preview trailer, during the car ride multiplayer part, is when they slide the Switch into the car seat holder arm: there are holes in the back and, I believe, at the top, which you can just barely make out. Do the holes match the vents of the Switch?

Would this mean that it could be actively cooled in portable mode too? Same performance and power in docked and non docked modes?

This part is what I mean:

That's an interesting find. It's also possible that the vents are left open to facilitate passive cooling (even without a fan, convection through the heatsink will provide some cooling in a situation like that).

I believe he talked about the Tegra X1 being 1/8 the performance of the Xbone SoC. It does seem like he is focusing on bandwidth (important factor, but not the whole story as we know), but the folks over there confirm that he is a dev who would be in a position to have info. Nothing he says is outlandish. Take all with a grain of salt, but I'm giving his info a chance. Looking at lpDDR4 available, his hints would peg Switch's RAM bandwidth at around ~60 GB/s.

While I don't dispute that he may have some insider info, the notion of comparing bandwidth between GCN 1.0/1.1 and Pascal as if it's a like-for-like comparison is just silly, given how completely differently they operate. Xbox One has about 150 GB/s of bandwidth per TF of floating point performance, whereas Pascal graphics cards tend to have about 30 GB/s per TF. By his logic Nvidia's current line of graphics cards would be horrifically, cripplingly bandwidth constrained, which they're quite obviously not.
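The ratio argument is easy to check with back-of-envelope arithmetic; the spec figures below are approximate public numbers used purely for illustration (the Xbox One figure roughly combines DDR3 and ESRAM bandwidth):

```python
def gb_per_tflop(bandwidth_gbs, tflops):
    # Memory bandwidth available per TFLOP of FP32 compute
    return bandwidth_gbs / tflops

# Xbox One: ~1.31 TF GPU, ~68 GB/s DDR3 plus ~136 GB/s ESRAM
print(round(gb_per_tflop(68 + 136, 1.31)))  # ~156 GB/s per TF
# GTX 1080 (Pascal): ~8.9 TF, 320 GB/s GDDR5X
print(round(gb_per_tflop(320, 8.9)))        # ~36 GB/s per TF
```

The roughly 4-5x gap between the two ratios is the point: Pascal parts are built to do far more compute per unit of bandwidth than GCN 1.x consoles.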
 

Principate

Saint Titanfall
I have posted many times what I expect out of this thing. And it's in many ways exciting, disappointing in some others.

Why can't I also comment on the general atmosphere in here, which is in my opinion a gross overestimation of the capabilities of the device? Is it only 'contributing to the discussion' if I agree with it being a powerhouse?
I know you prefer an echo chamber, you made that clear in countless NX threads even before the reveal, but this is still a discussion board. You don't get to decide what does and what doesn't contribute to the discussion simply because you have other thoughts on the matter.

And I wasn't bashing anyone, I was merely reminding everyone that at the same place in time before the Wii-U launch, a lot of people had very similar postings of the device's power with plenty of sourcing and insiders.
None of it panned out. I'm merely drawing some parallels here. Maybe this time will be totally different, but i'm thinking it won't. Feel free to disagree.
TBF, we're talking about at absolute most less than a 250 GFLOPS difference, with the vast majority talking about less than 100 GFLOPS. There isn't a dramatic overestimation based on the specs going on here; it's a bit idealistic, but that's about it.
 
This is the part we're all arguing about right? I really don't see it as a lower fps than the gameplay before it but it doesn't help the video is only 23fps to begin with.

ezgif.com-video-to-git5b7q.gif


The game footage in the video is a lower FPS than the 23.9 FPS video. I checked it frame by frame and there are a lot of repeat frames in the game.

But as mentioned, it likely does not mean anything anyway.

Yup, let's burn down that bridge when we come to it.
 
I have posted many times what I expect out of this thing. And it's in many ways exciting, disappointing in some others.

Why can't I also comment on the general atmosphere in here, which is in my opinion a gross overestimation of the capabilities of the device? Is it only 'contributing to the discussion' if I agree with it being a powerhouse?
I know you prefer an echo chamber, you made that clear in countless NX threads even before the reveal, but this is still a discussion board. You don't get to decide what does and what doesn't contribute to the discussion simply because you have other thoughts on the matter.

And I wasn't bashing anyone, I was merely reminding everyone that at the same place in time before the Wii-U launch, a lot of people had very similar postings of the device's power with plenty of sourcing and insiders.
None of it panned out. I'm merely drawing some parallels here. Maybe this time will be totally different, but i'm thinking it won't. Feel free to disagree.
Why are you framing this as if it's a phenomenon unique to Nintendo threads? I remember the PS4 and XB1 speculation threads and people's expectations ran wild in those too. I can't be the only one who remembers the Team Real vs Team CG debacle. Even the PS4 Pro isn't going to live up to those expectations.
 
Why are you framing this as if it's a phenomenon unique to Nintendo threads? I remember the PS4 and XB1 speculation threads and people's expectations ran wild in those too. I can't be the only one who remembers the Team Real vs Team CG debacle. Even the PS4 Pro isn't going to live up to those expectations.

I read through the entire Neo speculation threads. Reminded me of WUST in more ways than one, that's for sure.
 

KingSnake

The Birthday Skeleton
I know you prefer an echo chamber, you made that clear in countless NX threads even before the reveal, but this is still a discussion board. You don't get to decide what does and what doesn't contribute to the discussion simply because you have other thoughts on the matter.

I don't prefer an echo chamber. I actually enjoy contradictory discussions backed by arguments more. But nice generalisation, I appreciate it.

And sure, if you think that posting "remember the WUST threads" again and again adds something valuable to a discussion that's mostly grounded in reality, with more optimistic and more pessimistic scenarios, please continue.
 

Astral Dog

Member
I believe he talked about the Tegra X1 being 1/8 the performance of the Xbone SoC. It does seem like he is focusing on bandwidth (an important factor, but not the whole story, as we know), but the folks over there confirm that he is a dev who would be in a position to have info. Nothing he says is outlandish. Take it all with a grain of salt, but I'm giving his info a chance. Looking at the LPDDR4 available, his hints would peg Switch's RAM bandwidth at around ~60 GB/s.
Double Wii U :O
Quite impressive for a handheld system
 

99Luffy

Banned
Jaguar's a low power draw microarchitecture designed for low power draw devices. Like laptops, tablets, microservers and embedded solutions. Not that different from Tegra in target applications at all.
They aren't low power though. Lower than the PS3, sure, but not enough to be portable. The Switch is. That alone tells you it isn't an apt comparison.
 
They aren't low power though. Lower than the PS3, sure, but not enough to be portable. The Switch is. That alone tells you it isn't an apt comparison.

There are most certainly ARM processors that outperform the Jaguars clock for clock. You may be looking at them here. Mobile has advanced much quicker than desktop.
 
While I don't dispute that he may have some insider info, the notion of comparing bandwidth between GCN 1.0/1.1 and Pascal as if it's a like-for-like comparison is just silly, given how completely differently they operate. Xbox One has about 150 GB/s of bandwidth per TF of floating point performance, whereas Pascal graphics cards tend to have about 30 GB/s per TF. By his logic Nvidia's current line of graphics cards would be horrifically, cripplingly bandwidth constrained, which they're quite obviously not.

I do agree with you here. Reading his posts (if we humor his legitimacy), he would seem to come from a console dev background and be somewhat more familiar with AMD products. Nvidia has ways of compensating for lower bandwidth, including a completely different rasterization method. I also wouldn't be surprised if Nintendo souped up the L2 cache a bit, as Latte appeared to do relative to Xenos. If it is in the 60 GB/s range, which is around where we were speculating based on 128-bit LPDDR4, I would be quite pleased. For a handheld console that's nice. Compared to Wii U, it's a decent jump as well, especially if we are talking a unified pool this time.
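For reference, that ~60 GB/s ballpark falls straight out of a 128-bit LPDDR4 interface: peak bandwidth is just bus width times transfer rate. A sketch using standard LPDDR4 speed grades (the 128-bit bus is the speculated figure, not a confirmed spec):

```python
def peak_bandwidth_gbs(bus_width_bits, mega_transfers_per_s):
    # bytes per transfer * transfers per second, expressed in GB/s
    return (bus_width_bits / 8) * mega_transfers_per_s / 1000.0

print(peak_bandwidth_gbs(128, 3200))  # LPDDR4-3200 -> 51.2 GB/s
print(peak_bandwidth_gbs(128, 4266))  # LPDDR4-4266 -> ~68.3 GB/s
```

So the rumored range sits neatly between the two common LPDDR4 speed grades.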
 

KingSnake

The Birthday Skeleton
That's assuming that Nintendo want to spend as much on the SoC and on batteries as Google does on the Pixel C, which retails for over 500€.
I find that incredibly unlikely. Not to mention that the Pixel C also has far more space for batteries -- these larger tablets are basically all battery.

I don't expect Nintendo to match the Pixel C's battery, and I don't expect the Switch to have more than average battery life. As for the price of the SoC, why would a higher-clocked Tegra be much more expensive than a lower-clocked one? We're talking about a custom design done by Nvidia based on Nintendo's requirements and then outsourced by Nintendo to TSMC or whoever. Wouldn't the price rather depend on the number of chips produced?

In the end there isn't that much difference between 512 GFLOPS and 600 GFLOPS, but it would be quite strange to have active cooling for it.
 
So how does this compare to XB1 and PS4?

Current speculation for the GPU is between 512 and 750 GFLOPS*, depending on architecture and whether or not there's any upclocking/downclocking going on. The CPU will be more powerful than those in the PS4/XB1 core for core, though we have no information on the number of cores. LCGeek has rumored that overall the CPU will be a good deal improved over the PS4/XB1.

*Those are Nvidia flops, which supposedly perform a bit better than AMD/GCN flops, and Tegra apparently gets twice the performance from FP16 code compared to FP32 code (which is generally what XB1/PS4 games use), meaning a fully optimized game could get up to 1-1.5 TF of effective performance. Apparently UE4 makes heavy use of FP16 code.

No idea how accurate any of this is but it's currently the best guesses based on the rumors we have and images of the device.
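As a worked example of that FP16 footnote (the core count and clock here are hypothetical, chosen only to land in the rumored range):

```python
def peak_gflops(cuda_cores, clock_ghz, fp16=False):
    # 2 FMA ops per core per cycle; double-rate FP16 packs two
    # half-precision ops into each FP32 lane, doubling throughput
    fp32_rate = 2 * cuda_cores * clock_ghz
    return fp32_rate * 2 if fp16 else fp32_rate

print(peak_gflops(256, 1.0))             # 512.0 GFLOPS FP32
print(peak_gflops(256, 1.0, fp16=True))  # 1024.0 GFLOPS FP16
```

This is why the same chip can be quoted as both ~512 GFLOPS and ~1 TF: the second figure only applies to code paths written in half precision.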
 