
Nintendo Switch: Powered by Custom Nvidia Tegra Chip (Official)

Not sarcasm. Now you know.

This superhero stuff is getting more pathetic by the day. Time to get over it.

A shame I now know 'Parker' comes from Peter Parker. Wish I didn't know that. Childish, sad and pathetic.

So what would be good codenames in your eyes?

Childish... get it together, man. If you want to be a full-on adult, enjoy yourself, but I think a little bit of "childishness" isn't that bad altogether...
 

Schnozberry

Member
I have a feeling they're gonna let Jen Hsun Huang talk about it for a while, since he can hype anything. I don't think the old rules apply to Nintendo anymore- I can definitely see them talking about the specs of this since they'll be basically unprecedented for the form factor.

Seems likely. He can bullshit with the best of them.
 

thefro

Member
Can you theorize the performance of those specs in comparison to X1? Also, do you think it will run overclocked when docked?

It'll be up-clocked (in comparison to undocked) when docked and down-clocked when undocked. That's what laptops do these days.

Overclocking the chip would lead to more hardware failures.
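For context on why down-clocking on battery is worth it: to first order, dynamic power scales as P ∝ C·V²·f, and lowering the clock usually lets you lower voltage with it. A quick sketch with made-up illustrative numbers (nothing here is an actual Switch figure):

```python
# First-order dynamic power model: P = C * V^2 * f.
# All numbers below are illustrative, not real Switch clocks or voltages.

def dynamic_power(c, voltage, freq_mhz):
    """Relative dynamic power of a chip at the given voltage and clock."""
    return c * voltage ** 2 * freq_mhz

docked = dynamic_power(c=1.0, voltage=1.0, freq_mhz=1000)
# Undocked: DVFS drops frequency 40% and voltage 15% together (assumed).
undocked = dynamic_power(c=1.0, voltage=0.85, freq_mhz=600)

print(f"undocked draws {undocked / docked:.0%} of docked power")
```

The point is that a 40% clock cut buys noticeably more than 40% power savings once voltage scales down too.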
 

nikatapi

Member
I have a feeling they're gonna let Jen Hsun Huang talk about it for a while, since he can hype anything. I don't think the old rules apply to Nintendo anymore- I can definitely see them talking about the specs of this since they'll be basically unprecedented for the form factor.

That would be very cool actually. Give us a good indication of what the system is capable of, maybe even some tech demos. I feel like it would be good for both Nvidia and Nintendo.
 

Donnie

Member
Well the full clocked version is still memory bandwidth limited compared to last gen consoles. It does have colour compression and other rendering optimizations that improve bandwidth efficiency, but even the PS3 had double the GPU bandwidth.

The full clocked version of X1 has less total memory bandwidth than PS3 (same amount of GPU bandwidth), but that doesn't make it bandwidth limited by comparison. As you mention the chip has quite significant optimisations that improve memory utilization.

So depending on where you fall on the spectrum of how much emphasis you put on the leaked dev kit, it could be a worrying sign that dev kits were running on TX1 hardware. If you're one of those who thinks that Nvidia's custom SoC statement implies a Parker like SoC, then the good news is the memory bus is 128bit for the chips inside the Drive PX2. I still have my doubts that NS is going to resemble Parker in any meaningful way though.

Considering it's going to be a custom SoC, they can use whatever bus width they like. Personally I think it's likely they'll have some kind of dedicated video memory.
 
Not sarcasm. Now you know.

This superhero stuff is getting more pathetic by the day. Time to get over it.

A shame I now know 'Parker' comes from Peter Parker. Wish I didn't know that. Childish, sad and pathetic.
They need some manly names like KitKat and Nougat.
 

BDGAME

Member
The big question I have is: what kind of hardware does the NS need to have to be able to run this generation's games at 720p?

Last generation, the PS Vita was what? A 24 GFLOPS machine that could run games at 544p or less and, sometimes, achieve visuals close to the PS3 and its 240 GFLOPS.

I believe the NS will be closer to the Xbox One than the Vita was to the PS3 (maybe a 500 or 750 GFLOPS machine in portable mode).

Another thing to consider: when the PS4 Pro runs games at 4K, will those games have more detail than on the normal PS4, or will all the extra juice be used only to increase resolution?
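The Vita/PS3 comparison above can be put in rough numbers as GFLOPS per rendered megapixel. These are the thread's ballpark figures, not official specs, and the 500 GFLOPS handheld is purely hypothetical:

```python
# Back-of-envelope: GFLOPS available per megapixel of render target.
# GFLOPS values are the rough numbers quoted in the thread, not official specs.

def gflops_per_mpixel(gflops, width, height):
    return gflops / (width * height / 1e6)

vita = gflops_per_mpixel(24, 960, 544)     # Vita rendering at 544p
ps3  = gflops_per_mpixel(240, 1280, 720)   # PS3 rendering at 720p

print(f"Vita: {vita:.0f} GFLOPS per megapixel")
print(f"PS3:  {ps3:.0f} GFLOPS per megapixel")

# A hypothetical 500 GFLOPS handheld targeting 720p:
ns = gflops_per_mpixel(500, 1280, 720)
print(f"Hypothetical NS: {ns:.0f} GFLOPS per megapixel")
```

By this crude metric, a ~500 GFLOPS portable at 720p would actually have far more shading power per pixel than the Vita had at 544p.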
 

Gamepad

Member
The manufacturer and the product give you a very good idea of the range of graphical capabilities. According to Nvidia:

Nintendo Switch is powered by the performance of the custom Tegra processor. The high-efficiency scalable processor includes an NVIDIA GPU based on the same architecture as the world’s top-performing GeForce gaming graphics cards.

Depending on how you want to interpret the "top-performing" part, there are only two Tegra products that fit the description: Tegra X1 and Tegra Parker. So Switch is powered by a custom SoC built on one of these. The GPU in X1 is approximately 3 times more powerful than the Wii U's, and Parker's GPU is 4.25 times the Wii U's, considering FP32 only. The CPU in both is also more capable than the Wii U's, and the architecture is more modern.

This is based on the official information from Nvidia.

On top of that you have all the rumours and additional educated guesses that combine into a pretty clear picture of the range of graphical capabilities of Switch.

So this is where we stand.
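For what it's worth, the 3x / 4.25x figures check out against commonly cited numbers, assuming the usual 2 FLOPs per CUDA core per clock and the unofficial ~176 GFLOPS estimate for the Wii U GPU:

```python
# Sanity-checking the "3x / 4.25x Wii U" FP32 claims.
# The Wii U figure is an unofficial community estimate (160 ALUs @ 550 MHz);
# the Tegra clocks are Nvidia's advertised peaks.

def fp32_gflops(cuda_cores, clock_ghz):
    return cuda_cores * clock_ghz * 2  # 2 FLOPs per core per clock (FMA)

wiiu   = 176                       # unofficial estimate
x1     = fp32_gflops(256, 1.0)     # Tegra X1: 256 Maxwell cores @ ~1 GHz
parker = fp32_gflops(256, 1.465)   # Parker: 256 Pascal cores @ ~1.465 GHz

print(f"X1:     {x1:.0f} GFLOPS, {x1 / wiiu:.2f}x Wii U")
print(f"Parker: {parker:.0f} GFLOPS, {parker / wiiu:.2f}x Wii U")
```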

We know nothing. Just PR-talk like "Tegra", "based on" etc.

Remember when reputable sites reported that "Wii U packs the same processor technology found in Watson, the supercomputer."

Rumors that the Wii U CPU was derived from IBM's high end POWER7 server processor proved false. Espresso shares some technology with POWER7, such as eDRAM and Power Architecture, but those are superficial similarities.

Some people just want to believe anything they read.
 

guek

Banned
Some people just want to believe anything they read.

You know, it is possible to read rumors and take part in speculation without automatically believing everything with absolute certainty. We "knew nothing" for sure before the reveal but a substantial number of rumors turned out to be true as is often the case in the video game industry.
 

mrklaw

MrArseFace
Do the Nvidia 'X' chips have GameStream built in? Wondering if it might be possible to stream from a Switch to the dock (or to a future accessory) - might be a good solution for games that benefit from touch but you still want to play on the TV.
 
We know nothing. Just PR-talk like "Tegra", "based on" etc.

Remember when reputable sites reported that "Wii U packs the same processor technology found in Watson, the supercomputer."



Some people just want to believe anything they read.

I think the major difference here is that one source is from the actual supplier, and the other was speculation from reputable sites. Pretty big difference.

Believing the chip is based on either X1 or Parker isn't exactly wild speculation considering Nvidia's statement.
 
Not sarcasm. Now you know.

This superhero stuff is getting more pathetic by the day. Time to get over it.

A shame I now know 'Parker' comes from Peter Parker. Wish I didn't know that. Childish, sad and pathetic.

Internal company codenames for computer chips used for video games: too serious and mature to employ a superhero reference.

How do you come to have enough burning hatred for superheroes to think that this makes sense?
 

Schnozberry

Member
Do the Nvidia 'X' chips have GameStream built in? Wondering if it might be possible to stream from a Switch to the dock (or to a future accessory) - might be a good solution for games that benefit from touch but you still want to play on the TV.

Are you talking about mirroring the display, or Wii U like functionality where the tablet displays a different overlay than the TV?
 

mrklaw

MrArseFace
Are you talking about mirroring the display, or Wii U like functionality where the tablet displays a different overlay than the TV?

Mirroring. Like a reverse Wii U. So e.g. you're playing Mario Maker on the TV but want to use touch to create levels. Not great if you're docked, so streaming to the TV would be good.
 

Gamepad

Member
I think the major difference here is that one source is from the actual supplier, and the other was speculation from reputable sites. Pretty big difference.

Believing the chip is based on either X1 or Parker isn't exactly wild speculation considering Nvidia's statement.

No it's not because the information came from IBM. Here's a tweet from the official IBM Watson account.

"#WiiU uses same #power7 chips."

That turned out to be false just like so many other things.

So for the last time. It's all rumors until we have the retail version in our hands.

I have nothing more to add.
 

guek

Banned
No it's not because the information came from IBM. Here's a tweet from the official IBM Watson account.



That turned out to be false just like so many other things.

So for the last time. It's all rumors until we have the retail version in our hands.

I have nothing more to add.

You had nothing to add to begin with. All you're doing is trying to stifle discussion.
 

tkscz

Member
No it's not because the information came from IBM. Here's a tweet from the official IBM Watson account.



That turned out to be false just like so many other things.

So for the last time. It's all rumors until we have the retail version in our hands.

I have nothing more to add.

While it wasn't a Watson, it did use components that Watson used, so they could technically make that claim. That's the difference here. Nvidia isn't claiming to use a specific chip, only that it's custom and based on their "top GeForce" tech, and right now Pascal is their top. If rumors are to be believed and history is any guide, then the Tegra X2 (Parker) would be the chip Nintendo goes with. If it saves battery and heat, Nintendo will go for it.
 

Schnozberry

Member
Mirroring. Like a reverse Wii U. So e.g. you're playing Mario Maker on the TV but want to use touch to create levels. Not great if you're docked, so streaming to the TV would be good.

I don't think anything would technologically prevent them from doing it, but wouldn't an IR pointer and gyro function essentially the same way?
 

Thraktor

Member
Considering its going to be a custom SoC they can use whatever bus width they like. Personally I think its likely they'll have some kind of dedicated video memory.

What form of video memory would you expect them to use, though? eDRAM isn't available on any of the plausible manufacturing processes, SRAM would be very expensive if they need 32MB+, and the only off-die memory available which would fit their needs would be HBM2, which would also be extremely expensive, even in a minimal 4GB single-stack configuration (although arguably may be a better choice than SRAM, given the orders of magnitude higher capacity).

It's kind of redundant with tile-based rendering, though, which should give them most of the benefits of dedicated VRAM with just a few MBs of cache, and also make developers' lives a little easier with a single logical memory pool.
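Rough numbers on why tiling makes a big dedicated VRAM pool redundant: a full 1080p color+depth framebuffer is ~16 MB, but one screen tile's working set fits comfortably in ordinary cache. The tile size and pixel formats below are illustrative assumptions, not the actual hardware's:

```python
# Compare a whole 1080p framebuffer (what eDRAM-style designs had to hold)
# with the working set of a single screen tile. Formats/sizes are assumed.

WIDTH, HEIGHT = 1920, 1080
BYTES_PER_PIXEL = 4 + 4           # 32-bit color + 32-bit depth/stencil

full_fb_mb = WIDTH * HEIGHT * BYTES_PER_PIXEL / 2**20
print(f"Full framebuffer: {full_fb_mb:.1f} MB")

TILE = 256                        # hypothetical 256x256-pixel tile
tile_kb = TILE * TILE * BYTES_PER_PIXEL / 2**10
print(f"One tile's working set: {tile_kb:.0f} KB")
```

So instead of a Wii U-style 32MB embedded pool, a few MB of L2 covers the tile in flight.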
 

dr_rus

Member
Well, it has been rumored and it would make perfect sense, given the fact that Nvidia claims the Pascal chip to be either 40% faster than the Maxwell chip at the same power draw (which would be very interesting for Nintendo in docked mode), or 60% more power efficient than the Maxwell chip while delivering the same performance (which would be very interesting for Nintendo in portable mode).
X1 isn't a typical Maxwell chip though, as it uses the 20SoC process and has different SMs. Energy improvements between desktop Maxwell and Pascal parts do not necessarily transfer to an X1 and Parker comparison.

The main difference that could concern Switch is the memory controller. So it matters a bit.
The memory controller is abstracted behind the crossbar and can be any width the product requires. X1 could easily have a 128 or even 256 bit memory bus if necessary, and Parker could just as easily have the same 64 bit bus as X1. That's why "based on Parker" isn't saying much without unit numbers / bus widths / clocks.
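For reference, peak bandwidth follows directly from bus width and transfer rate, which is why the bus width matters more than the chip's marketing name. Using the standard LPDDR4-3200 speed grade:

```python
# Peak memory bandwidth: bytes/s = (bus bits / 8) * transfer rate.
# LPDDR4-3200 is a real JEDEC speed grade; the widths are the ones
# being debated in this thread.

def bandwidth_gbs(bus_bits, mtps):
    """Peak bandwidth in GB/s for a bus of `bus_bits` at `mtps` MT/s."""
    return bus_bits / 8 * mtps / 1000

for bits in (64, 128, 256):
    print(f"{bits:>3}-bit LPDDR4-3200: {bandwidth_gbs(bits, 3200):.1f} GB/s")
```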
 

mrklaw

MrArseFace
I don't think anything would technologically prevent them from doing it, but wouldn't an IR pointer and gyro function essentially the same way?

For some types of games - like Mario Maker exactly - you'd want absolute pointer controls (touch immediately in a specific location) rather than relative pointer control (like a mouse). You could maybe get away with it, but I think some things would be a lot more usable with direct touch.

Although then you'd have to undock it, which might be something Nintendo don't want to make you do..
 

Thraktor

Member
Is a 256-bit wide bus completely out of the question for a custom Tegra?

Probably. LPDDR4 chips come in a maximum I/O width of 64 bits, meaning you'd need four chips for such a configuration, increasing costs, power consumption and physical size. Phones pretty much universally use a single 32 or 64 bit memory chip, and the only consumer product I can think of which uses more than one LPDDR module is the iPad Pro, which uses two 64 bit chips for a 128 bit bus (to allow it to drive an extremely high resolution display).
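The package-count arithmetic behind that:

```python
# LPDDR4 packages top out at 64-bit I/O, so a wider bus means
# more physical packages on the board (cost, power, area).
import math

CHIP_IO_BITS = 64  # max I/O width of a single LPDDR4 package

for bus in (64, 128, 256):
    chips = math.ceil(bus / CHIP_IO_BITS)
    print(f"{bus}-bit bus -> {chips} LPDDR4 package(s)")
```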
 
Probably. LPDDR4 chips come in a maximum I/O width of 64 bits, meaning you'd need four chips for such a configuration, increasing costs, power consumption and physical size. Phones pretty much universally use a single 32 or 64 bit memory chip, and the only consumer product I can think of which uses more than one LPDDR module is the iPad Pro, which uses two 64 bit chips for a 128 bit bus (to allow it to drive an extremely high resolution display).



SARCASM
 

Donnie

Member
What form of video memory would you expect them to use, though? eDRAM isn't available on any of the plausible manufacturing processes, SRAM would be very expensive if they need 32MB+, and the only off-die memory available which would fit their needs would be HBM2, which would also be extremely expensive, even in a minimal 4GB single-stack configuration (although arguably may be a better choice than SRAM, given the orders of magnitude higher capacity).

It's kind of redundant with tile-based rendering, though, which should give them most of the benefits of dedicated VRAM with just a few MBs of cache, and also make developers' lives a little easier with a single logical memory pool.

I'm not talking about something as large as 32MB+ as it wouldn't be used for rendering anyway (with a tile buffer already being present). Maybe just a very large texture cache. I mean they may not have any separate pool. I'm only basing this on Nintendo's love of fast video memory, they seem to have it on every system they produce. But yeah considering the mode of rendering used maybe this will be the end of that.
 

Lonely1

Unconfirmed Member
Probably. LPDDR4 chips come in a maximum I/O width of 64 bits, meaning you'd need four chips for such a configuration, increasing costs, power consumption and physical size. Phones pretty much universally use a single 32 or 64 bit memory chip, and the only consumer product I can think of which uses more than one LPDDR module is the iPad Pro, which uses two 64 bit chips for a 128 bit bus (to allow it to drive an extremely high resolution display).

And this is where Fujitsu's secret 128-bit bus high density FCRAM will save the day! Or they could include 50+ 512Mbit modules for ungodly wide I/O! :p
 

AntMurda

Member
While it wasn't a Watson, it did use components that the watson used, so they could technically make that claim. That's the difference here. Nvidia isn't claiming to be using a specific chip, only that it's custom and based on their "top Geforce" tech, and right now, Pascal is their top. If rumors are to be believed and history is to be followed, then the Tegra X2 (parker) would be the chip Nintendo would go with. If it saves battery and heat, Nintendo will go for it.

Has there been any rumor from dev kit leaks? Is it also possible the first alpha kits were X1 but the final dev kits / specs will be Parker?
 

Thraktor

Member
I'm not talking about something as large as 32MB+ as it wouldn't be used for rendering anyway (with a tile buffer already being present). Maybe just a very large texture cache. I mean they may not have any separate pool. I'm only basing this on Nintendo's love of fast video memory, they seem to have it on every system they produce. But yeah considering the mode of rendering used maybe this will be the end of that.

OK, I thought you meant an actively managed VRAM pool, but I'd definitely be in agreement with you in expecting an increased GPU L2 cache, or maybe a big shared L3.

And this is where Fujitsu's secret 128-bit bus high density FCRAM will save the day! Or they could include 50+ 512Mbit modules for ungodly wide I/O! :p

The issue is that the Wide I/O memory format, which is essentially the successor to the FCRAM used in 3DS, has never really made it into production. Every time they announced a new faster version of the standard it would just be eclipsed by LPDDR, which could provide the same bandwidth at a lower cost, with lower power consumption and on a narrower bus. There don't seem to be nearly as many exotic memory standards around for Nintendo to choose from compared to the good old days of 1T-SRAM and FCRAM.
 

Doctre81

Member
Probably. LPDDR4 chips come in a maximum I/O width of 64 bits, meaning you'd need four chips for such a configuration, increasing costs, power consumption and physical size. Phones pretty much universally use a single 32 or 64 bit memory chip, and the only consumer product I can think of which uses more than one LPDDR module is the iPad Pro, which uses two 64 bit chips for a 128 bit bus (to allow it to drive an extremely high resolution display).

Oh, I thought Parker uses a 128 bit bus.

 

Schnozberry

Member
For some types of games - like Mario Maker exactly - you'd want absolute pointer controls (touch immediately in a specific location) rather than relative pointer control (like a mouse). You could maybe get away with it, but I think some things would be a lot more usable with direct touch.

Although then you'd have to undock it, which might be something Nintendo don't want to make you do..

Yeah, I was more thinking your second point. I certainly agree that touch is far more ideal in this scenario, but with a capacitive screen you'd need some kind of accessory pen or be forced to deal with finger level accuracy. IR might actually offer greater precision, especially since Mario Maker would need to be reimagined for pinch to zoom to get the kind of pixel precision that was offered with the original.
 
Probably. LPDDR4 chips come in a maximum I/O width of 64 bits, meaning you'd need four chips for such a configuration, increasing costs, power consumption and physical size. Phones pretty much universally use a single 32 or 64 bit memory chip, and the only consumer product I can think of which uses more than one LPDDR module is the iPad Pro, which uses two 64 bit chips for a 128 bit bus (to allow it to drive an extremely high resolution display).

How would that impact performance, or are there ways you expect Nintendo to make up any deficiencies?
 

Thraktor

Member
Oh, I thought Parker uses a 128 bit bus.

It does, but I haven't seen it in any consumer products yet.

I'm certainly not discounting the possibility of a 128 bit bus, but I was just trying to highlight how unlikely 256 bit wide memory would be by showing how rare even 128 bit LPDDR interfaces are.

How would that impact performance, or are there ways you expect Nintendo to make up any deficiencies?

I don't think they really need that kind of bandwidth. With tile-based rendering and a large cache they should be able to get by with a 128 bit bus, or even 64 bit depending on the system's performance.
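A rough illustration of why modest buses can suffice with tiling: even naive external framebuffer traffic at 1080p60 is only a few GB/s, and a tiler keeps most of that on-chip. The overdraw factor below is an assumption, and texture traffic is ignored entirely:

```python
# Ballpark framebuffer DRAM traffic at 1080p60.
# Overdraw factor is an assumption; textures are not counted.

WIDTH, HEIGHT, FPS = 1920, 1080, 60
BYTES_RW = 8 * 2          # read + write of 32-bit color + 32-bit depth
OVERDRAW = 3              # assumed average overdraw per pixel

naive_gbs = WIDTH * HEIGHT * FPS * BYTES_RW * OVERDRAW / 1e9
print(f"Immediate-mode framebuffer traffic: {naive_gbs:.1f} GB/s")

# A tiler resolves color/depth in cache; only the finished frame is written:
resolved_gbs = WIDTH * HEIGHT * FPS * 4 / 1e9
print(f"Resolved-frame writes only: {resolved_gbs:.2f} GB/s")
```

Either way the framebuffer is a small fraction of a 25-50 GB/s budget; it's texture and buffer fetches that eat the rest.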
 

Schnozberry

Member
Oh, I thought Parker uses a 128 bit bus.


It does. He was just referring to how many memory chips would be required. For a 128-bit interface, they'd need two 64-bit chips. It's pretty common in the mobile space to use a single chip because of space constraints, so a single 64-bit interface is common. LPDDR4 chips with a 128-bit interface don't exist as of right now. It wouldn't be completely out of the ordinary for Nvidia to use two chips though, because the Shield Android TV has a configuration with two 32-bit chips, as does the Shield Tablet shown below.

[image: Shield Tablet board with two 32-bit memory chips]
 

Doctre81

Member
It does. He was just referring to how many memory chips would be required. For a 128-bit interface, they'd need two 64bit chips. It's pretty common in the mobile space to use a single chip because of space constraints, so a single 64-bit interface is common. LPDDR4 chips with a 128bit interface don't exist as of right now. It wouldn't be completely out of the ordinary for Nvidia to use 2 chips, though, because the Shield Android TV has a configuration with 2 32-bit chips, as does the Shield Tablet shown below.

[image: Shield Tablet board with two 32-bit memory chips]

Oh gotcha. Sounds like we can at least expect (or hope for) a 128-bit bus, which would be quite an improvement over the Wii U.

It does, but I haven't seen it in any consumer products yet.

I'm certainly not discounting the possibility of a 128 bit bus, but I was just trying to highlight how unlikely 256 bit wide memory would be by showing how rare even 128 bit LPDDR interfaces are.

Ahh I see. Thanks for replying.
 

tkscz

Member
And this is where Fujitsu's secret 128-bit bus high density FCRAM will save the day! Or they could include 50+ 512Mbit modules for ungodly wide I/O! :p

That's something I've been wondering. Do we know they are using LPDDR4? I mean, FCRAM like in the 3DS is probably a much faster RAM to go with. Not sure if it's possible or not to get the Tegra to work with it though.
 

Zil33184

Member
Oh gotcha. Sounds like we can at least expect (or hope for) a 128-bit bus, which would be quite an improvement over the Wii U.

We'll see, but if they do go with 128 bit lpddr4 then even an underclocked X1 could hit PS4 quality at 960x540. That would also easily upscale to 1080p and rumours are that the target resolution for NS is 540p. The only bottleneck at that point would be CPU performance.
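The appeal of 960x540 as an internal resolution is that it maps onto 1080p by an exact integer factor, so the upscale stays clean:

```python
# 960x540 -> 1920x1080 is an exact 2x-per-axis integer scale.

w540, h540 = 960, 540
w1080, h1080 = 1920, 1080

scale = w1080 // w540                       # per-axis scale factor
pixel_ratio = (w1080 * h1080) // (w540 * h540)

print(f"Scale factor: {scale}x per axis")
print(f"1080p has {pixel_ratio}x the pixels of 540p")
```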
 

ozfunghi

Member
X1 isn't a typical Maxwell chip though as it's using 20SoC process and have different SMs. Energy improvements between desktop Maxwell and Pascal parts do not necessarily transfer to X1 and Parker comparison.

It was coming straight out of Nvidia documentation concerning these two Tegra chips.
 

Schnozberry

Member
Oh gotcha. Sounds like we can at least expect (or hope for) a 128-bit bus, which would be quite an improvement over the Wii U.

Yeah, it really depends on if they plan to use large cache pools or embedded memory. If that's the case, then they wouldn't necessarily need the extra bandwidth. Not that it would hurt.

I should correct myself from my earlier post. 128-bit LPDDR4 chips do exist, but it doesn't appear that they are made in sizes that make sense for the Switch. The largest one I can find is 2GB.
 
We'll see, but if they do go with 128 bit lpddr4 then even an underclocked X1 could hit PS4 quality at 960x540. That would also easily upscale to 1080p and rumours are that the target resolution for NS is 540p. The only bottleneck at that point would be CPU performance.

What rumor is that? The Zlatan rumor stating 504p? Hasn't that been thoroughly debunked?
 

Thraktor

Member
That's something I've been wondering. Do we know they are using LPDDR4? I mean, FCRAM like in the 3DS is probably a much faster RAM to go with. Not sure if it's possible or not to get the Tegra to work with it though.

LPDDR4 is far, far faster than FCRAM. On a bandwidth per chip basis it's competitive with GDDR5, and you'd need to jump to HBM or HMC for something "much faster" than it.
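To put the "competitive with GDDR5" claim in numbers, here's per-package peak bandwidth at common speed grades (typical package widths assumed):

```python
# Per-package peak bandwidth: GB/s = io_bits * Gbps_per_pin / 8.
# Speed grades are standard ones; widths are typical single packages.

def gbs(io_bits, gbps_per_pin):
    return io_bits * gbps_per_pin / 8

lpddr4 = gbs(64, 3.2)   # 64-bit LPDDR4 package at 3200 MT/s
gddr5  = gbs(32, 7.0)   # 32-bit GDDR5 chip at 7 Gbps

print(f"LPDDR4 package: {lpddr4:.1f} GB/s")
print(f"GDDR5 chip:     {gddr5:.1f} GB/s")
```

A single 64-bit LPDDR4 package lands in the same ballpark as a GDDR5 chip, at a fraction of the power.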

I should correct myself from my earlier post. 128-bit LPDDR4 chips do exist, but it doesn't appear that they are made in sizes that make sense for the Switch. The largest one I can find is 2GB.

That's interesting, especially with it only being available at 2GB or below (wider I/O LPDDR modules are typically reserved for larger capacities). Who's manufacturing them?
 

Thraktor

Member
Micron. I should confess that I may be reading this incorrectly.

https://www.micron.com/products/dram/lpdram

[image: Micron LPDDR product table]

Hmm, that's interesting, although I suspect the 128 bit I/O parts may be LPDDR2 or LPDDR3. I know some of Intel's laptop CPUs have supported LPDDR3, so perhaps these are for cheap netbooks/chromebooks?

Although, yeah, it's possible we're reading this wrong, as they don't specifically say "I/O width", so it could be relating to the internal organisation of the DRAM cells. (This is definitely stretching past my knowledge of how RAM operates.)
 

Schnozberry

Member
Hmm, that's interesting, although I suspect the 128 bit I/O parts may be LPDDR2 or LPDDR3. I know some of Intel's laptop CPUs have supported LPDDR3, so perhaps these are for cheap netbooks/chromebooks?

Although, yeah, it's possible we're reading this wrong, as they don't specifically say "I/O width", so it could be relating to the internal organisation of the DRAM cells. (This is definitely stretching past my knowledge of how RAM operates.)

Looking at the Hynix and Samsung LPDDR4 product lists, I think we are reading it correctly. Although it very well could be 128-bit LPDDR3 for notebook or x86 tablet configurations.

Samsung

Hynix

All of Samsung's 4GB packages are 64-bit, which are organized as 4x16Gb stacked chips to make one module. The Cell organization seems to be directly related to that. I believe that is referred to as x64.
 