
Nintendo Switch: Powered by Custom Nvidia Tegra Chip (Official)

Pretty much all GPUs these days are bandwidth-constrained relative to compute, especially ones that have over 5x the compute power of a console and less than twice the bandwidth. Compute scales faster than bandwidth, and it's a problem. See 4K and VR if you think this is merely academic as opposed to something people in the industry take seriously.
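A quick back-of-envelope sketch of that compute-vs-bandwidth gap, using approximate public spec-sheet numbers (ballpark figures, not measurements):

```python
# Bytes of bandwidth available per FLOP of compute, from rough
# public specs (peak fp32 GFLOPS, memory bandwidth in GB/s).
gpus = {
    "Xbox One (GCN)": (1310, 68.3),   # DDR3 only; ESRAM not counted
    "Tegra X1":       (512,  25.6),
    "GTX 1080":       (8873, 320.0),
}

for name, (gflops, bw_gbs) in gpus.items():
    print(f"{name:15s} {bw_gbs / gflops:.3f} bytes/FLOP")
# The bigger the GPU, the less bandwidth it has per unit of compute,
# which is why compression and tiling keep getting more important.
```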

As for the XBO vs TX1 situation, Nvidia's bandwidth optimisations aren't going to net significant enough performance increases to offset only having 25.6GB/s of memory bandwidth.

I do see a lot of people engaging in insane amounts of fudge factoring without a single piece of evidence though. Apparently fp16, color compression, "NV FLOPS", and tiled rasterization somehow mean the TX1 level hardware likely to be in the NS is somehow a portable XBO or PS4.

You think fp16, NV FLOPS, tiled rasterization, etc. will do nothing to bring it a little bit nearer to Xbox One performance than the raw specs would suggest? And we don't know if it is only 25GB/s; it could be 50 (like the X2), and Nintendo could implement some sort of fast RAM too. It wouldn't be the first time.
 

Speely

Banned
I think this video is relevant again for another watch.

Tegra X1 Unreal 4 Elemental Demo

Pretty impressive for only 25GB/s. Here's hoping we see that level of performance at least in undocked mode if it's a custom Pascal X2.

It's exciting to see Nintendo using accessible modern tech. I think the SoC in the Switch is going to yield some great results. I would love some concrete info on Parker vs Maxwell, though.
 

Zil33184

Member
You think fp16, NV FLOPS, tiled rasterization, etc. will do nothing to bring it a little bit nearer to Xbox One performance than the raw specs would suggest? And we don't know if it is only 25GB/s; it could be 50 (like the X2), and Nintendo could implement some sort of fast RAM too. It wouldn't be the first time.

There was a dev kit leak and rumours from a couple months back alluding to TX1, whereas advocates of 128-bit buses and edram have nothing to base their conjecture on besides their own desire for better specs and extremely selective reading of PR statements. If Nintendo intended on a 128-bit bus in the final hardware then it would have been in the prototype dev board.

Also, NVFLOPS aren't real, and Nvidia's bandwidth optimizations might net you a decent percentage gain in bandwidth efficiency, but will that compare to a "lowly" GCN 1.0 part with 6x the RAM bandwidth? No, it won't.
 

Rodin

Member
I'm sorry, is there a specific reason why we're treating this 25.6GB/s stuff as fact, while also ignoring that Nintendo and Nvidia most definitely used some SRAM to increase bandwidth? Takeda specifically said that having a small pool of high-speed memory on die is "in their DNA".
 
Something I could see happen if the Dock really does boost the system is Nintendo asking devs to focus on resolution rather than performance in order to make sure the portable version is playable.
You don't really need high resolutions on the portable (especially since it's 720p) but being 900p or 1080p on the TV would be nice.
Laptops have that option where you can choose best performance or battery-saving performance, and the difference is noticeable, though likely not big enough for 720p->1080p.
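For scale, a rough look at the pixel counts involved (simple arithmetic, nothing assumed beyond the standard resolutions):

```python
# Pixel counts for the resolutions under discussion.
res = {"720p": 1280 * 720, "900p": 1600 * 900, "1080p": 1920 * 1080}

base = res["720p"]
for name, px in res.items():
    print(f"{name}: {px:,} pixels ({px / base:.2f}x 720p)")
# 1080p is 2.25x the pixels of 720p, which is roughly the extra
# fill-rate/bandwidth a docked boost would need for resolution alone.
```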
 

Principate

Saint Titanfall
There was a dev kit leak and rumours from a couple months back alluding to TX1, whereas advocates of 128-bit buses and edram have nothing to base their conjecture on besides their own desire for better specs and extremely selective reading of PR statements. If Nintendo intended on a 128-bit bus in the final hardware then it would have been in the prototype dev board.

Also, NVFLOPS aren't real, and Nvidia's bandwidth optimizations might net you a decent percentage gain in bandwidth efficiency, but will that compare to a "lowly" GCN 1.0 part with 6x the RAM bandwidth? No, it won't.

Pretty sure there were also rumours implying Pascal from a reliable leaker, which is why anyone is bothering to talk about it, so it wasn't entirely wishful thinking.
 
There was a dev kit leak and rumours from a couple months back alluding to TX1, whereas advocates of 128-bit buses and edram have nothing to base their conjecture on besides their own desire for better specs and extremely selective reading of PR statements. If Nintendo intended on a 128-bit bus in the final hardware then it would have been in the prototype dev board.

Also, NVFLOPS aren't real, and Nvidia's bandwidth optimizations might net you a decent percentage gain in bandwidth efficiency, but will that compare to a "lowly" GCN 1.0 part with 6x the RAM bandwidth? No, it won't.

The devkit leak came from a modder who is no insider and who, a week before, believed the Switch would be an AMD home console. She even told me I shouldn't take the "spec sheet" too seriously.

And if they used the X1 as a placeholder (they probably did at one point), the spec sheet would be near that, while saying nothing about final hardware. We simply don't know, but I believe Nate, who is still saying it's Pascal. And Nvidia indicated it. At least it is not impossible. Or do you have official specs?

NV FLOPS are real in the sense that my 4.5-teraflop GTX 1060 (Pascal) is faster than the 5.8-teraflop Polaris RX 480 ;) And the Xbox One doesn't even have 6x the bandwidth of 25GB/s; I'm reading something like 68GB/s + ESRAM.

I am not claiming that fp16 (etc.) "magic" will boost a 600-700 GFLOPS Pascal device to 1.3 (GCN 1.0) TFLOPS; that would probably be wrong. But it would be nearing the performance ballpark, and with a lower resolution....
 

AfroDust

Member
I think this video is relevant again for another watch.

Tegra X1 Unreal 4 Elemental Demo

Pretty impressive for only 25GB/s. Here's hoping we see that level of performance at least in undocked mode if it's a custom Pascal X2.


I've been watching the Unreal Protostar demo to get an idea of what the Switch's SoC is capable of, but yeah, it's pretty mind-blowing that a mobile chip comes anywhere close to replicating a demo running on a full-fat console.
 

Schnozberry

Member
If Nintendo intended on a 128-bit bus in the final hardware then it would have been in the prototype dev board.

Also, NVFLOPS aren't real, and Nvidia's bandwidth optimizations might net you a decent percentage gain in bandwidth efficiency, but will that compare to a "lowly" GCN 1.0 part with 6x the RAM bandwidth? No, it won't.

You're arguing with straw men. No one has insisted on the 128-bit bus, just that it was a possibility based on the design of Parker that was originally shown almost a year ago now. We also don't know how early the dev board was that Eurogamer described as running the Tegra X1, as the description of it sounded a lot like a Jetson TX1 development board, with its noisy cooler, running hot as the sun. It's very possible that the Switch is running a customized version of the X1, but there would be literally no benefit to Nintendo in doing so, as Nvidia has known since 2012 that 20nm was a failure and a dead end, and would have pushed them towards 16nm when attempting to make this sale. They had internally abandoned 20nm prior to when Iwata first announced that the NX was in development, and really only continued forward with the X1 because they had made contractual commitments to TSMC, and high-power chips on the node were a total failure. It wouldn't make sense for them to try to push the X1 on Nintendo at 20nm long term, especially if they had limited quantities that would quickly deplete if the Switch is a success.

We also had rumors from people here on the site (NateDrake), who asserted he had a source he trusted telling him that the final Switch hardware would run a newer chip than what was in early devkits. The speculation about ESRAM or an additional cache setup is conjecture based on Nintendo's history of spending a lot of money on fast memory, even when seemingly going cheap on other components.

As Thraktor posted earlier, there is advanced memory compression and tile-based rendering in place on the Nvidia chip that isn't in place on the AMD chips in the other consoles. That doesn't magically make up the chasm of a difference between the power envelopes in play here, but it means the Nvidia chip will punch above its expected weight. There is also proprietary Nvidia tech like Multi-Res Shading, plus their tools and APIs for Vulkan (including speeding up the importation of DirectX assets), that should make the Switch an attractive platform to port to for software that started on PC.
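For anyone wondering what "memory compression" buys in practice, here's a toy sketch of the delta-compression idea (purely illustrative; Nvidia's actual scheme is more sophisticated and not public in detail):

```python
# Toy delta color compression: within a tile, neighboring pixels are
# often similar, so store one anchor value plus small deltas instead
# of every pixel at full width.
tile = [200, 201, 199, 200, 202, 201, 200, 198]  # one 8-pixel channel

anchor = tile[0]
deltas = [p - anchor for p in tile[1:]]          # all fit in 4 bits here

raw_bits = len(tile) * 8                         # 8 bpp, uncompressed
compressed_bits = 8 + len(deltas) * 4            # anchor + narrow deltas

print(f"raw {raw_bits} bits -> compressed {compressed_bits} bits "
      f"({1 - compressed_bits / raw_bits:.0%} bandwidth saved)")
```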

Lots of reasons to be optimistic, at least for ports. I don't think anyone sincerely believes the Switch will be as powerful as the Xbox One or PS4, but it should be able to run competent ports if all the technology is leveraged.
 

MacTag

Banned
There was a dev kit leak and rumours from a couple months back alluding to TX1, whereas advocates of 128-bit buses and edram have nothing to base their conjecture on besides their own desire for better specs and extremely selective reading of PR statements. If Nintendo intended on a 128-bit bus in the final hardware then it would have been in the prototype dev board.

Also, NVFLOPS aren't real, and Nvidia's bandwidth optimizations might net you a decent percentage gain in bandwidth efficiency, but will that compare to a "lowly" GCN 1.0 part with 6x the RAM bandwidth? No, it won't.
Rumors are also stating Pascal for final hardware; alpha kits using what was available last summer have no bearing on whether that ends up true or not. Even the 3DS had a 128-bit bus and embedded RAM, FYI; this isn't exactly new ground for Nintendo.
 

Plinko

Wildcard berths that can't beat teams without a winning record should have homefield advantage
There was a dev kit leak and rumours from a couple months back alluding to TX1, whereas advocates of 128-bit buses and edram have nothing to base their conjecture on besides their own desire for better specs and extremely selective reading of PR statements. If Nintendo intended on a 128-bit bus in the final hardware then it would have been in the prototype dev board.

Also, NVFLOPS aren't real, and Nvidia's bandwidth optimizations might net you a decent percentage gain in bandwidth efficiency, but will that compare to a "lowly" GCN 1.0 part with 6x the RAM bandwidth? No, it won't.

NateDrake has been adamant since the summer that this thing is using Pascal, which would explain the need for noisy active cooling in the devkits.
 
I think this video is relevant again for another watch.

[URL=" X1 Unreal 4 Elemental Demo[/URL]

Pretty impressive for only 25GB/s. Here's hoping we see that level of performance at least in undocked mode if it's a custom Pascal X2.

Interesting.. Thanks for the video!

How likely is it that Nintendo will use fp16 on the Tegra Parker? fp32 is 750 GFLOPS, while fp16 is 1.5 TFLOPS. Or better yet, why wouldn't it be able to use fp16?

https://en.wikipedia.org/wiki/Tegra#Tegra_X2
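Those two figures line up with simple peak-rate arithmetic, assuming Parker's published 256 CUDA cores and a roughly 1.46GHz GPU clock (the exact shipping clock is an assumption here):

```python
# Peak-throughput arithmetic behind the quoted Parker numbers.
cuda_cores = 256        # published Parker configuration
clock_ghz = 1.465       # approximate; actual clock is an assumption

fp32_gflops = cuda_cores * 2 * clock_ghz   # 2 ops/core/cycle via FMA
fp16_gflops = fp32_gflops * 2              # fp16 packs two values per lane

print(f"fp32: {fp32_gflops:.0f} GFLOPS")   # ~750
print(f"fp16: {fp16_gflops:.0f} GFLOPS")   # ~1500 = 1.5 TFLOPS
```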
 
Interesting.. Thanks for the video!

How likely is it that Nintendo will use fp16 on the Tegra Parker? fp32 is 750 GFLOPS, while fp16 is 1.5 TFLOPS. Or better yet, why wouldn't it be able to use fp16?

https://en.wikipedia.org/wiki/Tegra#Tegra_X2

It's been discussed pretty heavily in the past 2-3 pages. Bottom line: the Switch will likely make some use of its increased fp16 performance, but there's a cap on the gains in practice, so that full 2x likely won't be reachable.
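A minimal sketch of why the gain is capped, assuming only a fraction of shader work can tolerate half precision (the fractions below are made up for illustration):

```python
# Amdahl-style estimate: fp16 work runs 2x faster, the rest doesn't.
def overall_speedup(fp16_fraction):
    return 1 / ((1 - fp16_fraction) + fp16_fraction / 2)

for f in (0.25, 0.50, 0.75, 1.00):
    print(f"{f:.0%} of work in fp16 -> {overall_speedup(f):.2f}x overall")
# Only the unrealistic 100% case hits the full 2x; everything short
# of that lands well below it.
```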
 
I wonder if the NS custom chip will be based on the X1 or the new X2, though. I can see Nintendo wanting to use old proven tech and Nvidia wanting them to use the new thing.
 
I wonder if the NS custom chip will be based on the X1 or the new X2, though. I can see Nintendo wanting to use old proven tech and Nvidia wanting them to use the new thing.
This is a showcase for NVidia, I think they have a huge incentive to have the latest and greatest in this machine, even if it means not charging Nintendo the premium that they'd want for their latest and greatest.
 

antonz

Member
There was a dev kit leak and rumours from a couple months back alluding to TX1, whereas advocates of 128-bit buses and edram have nothing to base their conjecture on besides their own desire for better specs and extremely selective reading of PR statements. If Nintendo intended on a 128-bit bus in the final hardware then it would have been in the prototype dev board.

Also, NVFLOPS aren't real, and Nvidia's bandwidth optimizations might net you a decent percentage gain in bandwidth efficiency, but will that compare to a "lowly" GCN 1.0 part with 6x the RAM bandwidth? No, it won't.

The dev kit leak came from a person who, before wiping their Twitter account clean and "starting over", admitted they basically knew nothing about the NX and were just making stuff up on the fly.

Just because someone can open up zip files on a game disc does not make them an expert.

As for alpha kits: no, what's in an alpha kit does not 100% match what's in final kits. The specs you are hanging onto so dearly are from a devkit any single person in the world can buy off Amazon.com right now. It's not the NX devkit.
 
I wonder if the NS custom chip will be based on the X1 or the new X2, though. I can see Nintendo wanting to use old proven tech and Nvidia wanting them to use the new thing.

I think Nintendo also likes power-efficient hardware, and Pascal is a modernized version of Maxwell, shrunk to 16nm as I understand it. Also, 20nm isn't such a good process. I also see Nvidia pushing for Pascal, seeing that they are probably heavily involved in the Switch hardware and software. And they say it's using the architecture that powers the world's strongest GeForce GPUs, which would be Pascal at the moment. Yeah, it could just be PR speak ;)
 
I wonder if the NS custom chip will be based on the X1 or the new X2, though. I can see Nintendo wanting to use old proven tech and Nvidia wanting them to use the new thing.

People really need to elaborate whenever they say things like this, because it's a gross mischaracterization of Nintendo. Nothing in their history shows they use old tech simply because it's "proven" or anything like that.

Before my "Switch is the GC successor thread" was closed for reasons, I explained Nintendo intentionally used evolutions of the chip in the GC, mainly for BC and, presumably, familiarity for their devs. The Switch is a RESET, and will use a brand new chipset family.
 

Vena

Member
The dev kit leak came from a person who, before wiping their Twitter account clean and "starting over", admitted they basically knew nothing about the NX and were just making stuff up on the fly.

Just because someone can open up zip files on a game disc does not make them an expert.

As for alpha kits: no, what's in an alpha kit does not 100% match what's in final kits. The specs you are hanging onto so dearly are from a devkit any single person in the world can buy off Amazon.com right now. It's not the NX devkit.

Huh? Who? What did I miss?
 

wsippel

Banned
Of course there are X1 chips in the devkits. Kits went out months ago, Parker has only recently been revealed, and there simply is no X2. As far as I'm aware, it hasn't even been announced yet; only the PX2 for cars has been. I don't know what exactly the Switch will end up using, but it probably won't be an off-the-shelf X1. I mean, it's not impossible, but I think Parker or some sort of half-step between the two is more likely.

Also, using funky "old and proven" tech already bit Nintendo in the ass, with fabs shutting down and what not, and the current NTD team has a lot of mobile expertise and probably won't be fazed by adopting state-of-the-art technology and manufacturing processes.
 

Zil33184

Member
You're arguing with straw men. No one has insisted on the 128-bit bus, just that it was a possibility based on the design of Parker that was originally shown almost a year ago now. We also don't know how early the dev board was that Eurogamer described as running the Tegra X1, as the description of it sounded a lot like a Jetson TX1 development board, with its noisy cooler, running hot as the sun. It's very possible that the Switch is running a customized version of the X1, but there would be literally no benefit to Nintendo in doing so, as Nvidia has known since 2012 that 20nm was a failure and a dead end, and would have pushed them towards 16nm when attempting to make this sale. They had internally abandoned 20nm prior to when Iwata first announced that the NX was in development, and really only continued forward with the X1 because they had made contractual commitments to TSMC, and high-power chips on the node were a total failure. It wouldn't make sense for them to try to push the X1 on Nintendo at 20nm long term, especially if they had limited quantities that would quickly deplete if the Switch is a success.

We also had rumors from people here on the site (NateDrake), who asserted he had a source he trusted telling him that the final Switch hardware would run a newer chip than what was in early devkits. The speculation about ESRAM or an additional cache setup is conjecture based on Nintendo's history of spending a lot of money on fast memory, even when seemingly going cheap on other components.

As Thraktor posted earlier, there is advanced memory compression and tile-based rendering in place on the Nvidia chip that isn't in place on the AMD chips in the other consoles. That doesn't magically make up the chasm of a difference between the power envelopes in play here, but it means the Nvidia chip will punch above its expected weight. There is also proprietary Nvidia tech like Multi-Res Shading, plus their tools and APIs for Vulkan (including speeding up the importation of DirectX assets), that should make the Switch an attractive platform to port to for software that started on PC.

Lots of reasons to be optimistic, at least for ports. I don't think anyone sincerely believes the Switch will be as powerful as the Xbox One or PS4, but it should be able to run competent ports if all the technology is leveraged.

The only hard specs leaked point to TX1-level hardware, with no mention of a 128-bit bus on the early hardware. I just don't buy the hand-wavy explanation of the huge variance between the leaked dev kit and what some are conjecturing about the final spec.

Also, there's no straw man I've constructed. Just a lot of people grasping at straws. In your own post there's conjecture about EDRAM, while others are talking about Denver cores or how half a teraflop is somehow a lowball estimate of what the NS is capable of.

I'm kinda dismayed by the reaction to the possibility that NS is comparable to TX1. The TX1 is actually a great mobile part compared to the shit Nintendo have released in the last decade. It doesn't achieve the impossible, but no hardware vendor does. There are always compromises.
 
Of course there are X1 chips in the devkits. Kits went out months ago, Parker has only recently been revealed, and there simply is no X2. As far as I'm aware, it hasn't even been announced yet; only the PX2 for cars has been. I don't know what exactly the Switch will end up using, but it probably won't be an off-the-shelf X1. I mean, it's not impossible, but I think Parker or some sort of half-step between the two is more likely.

Also, using funky "old and proven" tech already bit Nintendo in the ass, with fabs shutting down and what not, and the current NTD team has a lot of mobile expertise and probably won't be fazed by adopting state-of-the-art technology and manufacturing processes.

To the bolded, the blog post in the OP has nVidia stating that the Switch uses a custom Tegra SoC, so we already know for a fact it isn't an off-the-shelf TX1. No speculation needed when nVidia confirms it!
 
It's been discussed pretty heavily in the past 2-3 pages. Bottom line: the Switch will likely make some use of its increased fp16 performance, but there's a cap on the gains in practice, so that full 2x likely won't be reachable.

But how do you know it's using FP16 instead of 32?
 

Deadbeat

Banned
People really need to elaborate whenever they say things like this, because it's a gross mischaracterization of Nintendo. Nothing in their history shows they use old tech simply because it's "proven" or anything like that.

Before my "Switch is the GC successor thread" was closed for reasons, I explained Nintendo intentionally used evolutions of the chip in the GC, mainly for BC and, presumably, familiarity for their devs. The Switch is a RESET, and will use a brand new chipset family.
For the sake of argument, let's assume they are using the X2 (and the thing is docked, so juice isn't an issue). What are we looking at in terms of raw performance? And what are we looking at if it's the X1?
 
But how do you know it's using FP16 instead of 32?

I'm not sure I understand the question... Floating point precision has to do with shader programming, so it's completely up to the programmers to choose the precision they want to work with. So it's a development choice.
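A quick demo of the tradeoff being chosen (numpy's float16 as a stand-in for GPU half precision, which behaves the same way for these values):

```python
import numpy as np

# Half precision keeps ~3 decimal digits, so it's fine for things
# like color channels but falls apart for large-range values.
color = np.float16(0.7216)       # survives almost intact
print(color)                     # 0.7217

world_pos = np.float16(4097.3)   # large world-space coordinate
print(world_pos)                 # 4096.0 -- off by more than a unit
```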

I meant architecturally, not physically.

I was just responding to you saying the Switch probably won't use an off the shelf TX1, and how we know 100% now that it won't. That's essentially all we know for sure based on the OP.
 

Astral Dog

Member
Something I could see happen if the Dock really does boost the system is Nintendo asking devs to focus on resolution rather than performance in order to make sure the portable version is playable.
You don't really need high resolutions on the portable (especially since it's 720p) but being 900p or 1080p on the TV would be nice.
Laptops have that option where you can choose best performance or battery-saving performance, and the difference is noticeable, though likely not big enough for 720p->1080p.

What if devs need to lower the resolution (and framerate) to bring Xbox One ports over without much trouble? Are you sure Nintendo will impose those restrictions on third-party developers to make up for their low-power hardware?

I think graphically intensive NS games will be 720p, and there will be a good mix of 720p and 1080p games depending on how pretty they look.

NS is an HD system but still a portable
 

AmyS

Member
I think Nintendo also likes power-efficient hardware, and Pascal is a modernized version of Maxwell, shrunk to 16nm as I understand it. Also, 20nm isn't such a good process. I also see Nvidia pushing for Pascal, seeing that they are probably heavily involved in the Switch hardware and software. And they say it's using the architecture that powers the world's strongest GeForce GPUs, which would be Pascal at the moment. Yeah, it could just be PR speak ;)

Good arguments for Pascal.
 

Schnozberry

Member
The only hard specs leaked point to TX1-level hardware, with no mention of a 128-bit bus on the early hardware. I just don't buy the hand-wavy explanation of the huge variance between the leaked dev kit and what some are conjecturing about the final spec.

Also, there's no straw man I've constructed. Just a lot of people grasping at straws. In your own post there's conjecture about EDRAM, while others are talking about Denver cores or how half a teraflop is somehow a lowball estimate of what the NS is capable of.

I'm kinda dismayed by the reaction to the possibility that NS is comparable to TX1. The TX1 is actually a great mobile part compared to the shit Nintendo have released in the last decade. It doesn't achieve the impossible, but no hardware vendor does. There are always compromises.

Even if the Switch is running Pascal, it will be a ~25% improvement over the TX1 at similar thermal boundaries, but that's huge when we're talking about the difference between 500 and 700 GFLOPS. The people talking about Denver cores are out of their minds. EDRAM is quite literally impossible at 16nm. Even ESRAM would be too large in any kind of quantity in a mobile SoC. Larger L2 and L3 caches are not out of the question, and would pay dividends.
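The 500-vs-700 range falls out of clock scaling alone, assuming the same 256-core layout (the Pascal clock below is a hypothetical 16nm uplift, not a leaked spec):

```python
# GFLOPS = cores * 2 (FMA) * clock; same core count, different nodes.
cores = 256
configs = {"Maxwell @ 20nm": 1.00, "Pascal @ 16nm (hypothetical)": 1.37}

for name, clock_ghz in configs.items():
    print(f"{name}: {cores * 2 * clock_ghz:.0f} GFLOPS fp32")
# ~512 vs ~700: the node shrink covers the gap without adding cores.
```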
 
Had a late day at work. The wait from 15:00 until now to read about Switch has been Hell.

But I fucking love it! I think Sony and Microsoft need to be a little worried. This is a core gaming home machine that can be picked up and taken with you.

Battery life will obviously be key. A screen that big would be great in 1080p but I'd settle for 720p for a longer battery life.

Fucking bring it on!

I beg to differ, since Sony and Microsoft are guaranteed third party support.

Considering you can't play with a PS4 on a park bench, I'd say better.
If I'm sitting on a park bench I'm enjoying the sun and fresh air, not playing video games. And that's what most people do in the park. I'm also sure you know that isn't what he/she was asking.

I'm just loling at some of the expectations in this thread, 64 player bf1? Where do you plan on playing that? On the train? Bus? You know, the two places most people who use handhelds actually use them. You know what, I'm out, won't post in this thread again. I'll leave this mess alone.
 
Of course there are X1 chips in the devkits. Kits went out months ago, Parker has only recently been revealed, and there simply is no X2. As far as I'm aware, it hasn't even been announced yet; only the PX2 for cars has been. I don't know what exactly the Switch will end up using, but it probably won't be an off-the-shelf X1. I mean, it's not impossible, but I think Parker or some sort of half-step between the two is more likely.

Also, using funky "old and proven" tech already bit Nintendo in the ass, with fabs shutting down and what not, and the current NTD team has a lot of mobile expertise and probably won't be fazed by adopting state-of-the-art technology and manufacturing processes.

Haven't Nvidia referenced an X2 once but mysteriously avoided it ever since? I've seen some speculation that they're withholding info and also cancelled the Shield 2 because of their deal with Nintendo.
 

nordique

Member
I've been watching the Unreal Protostar demo to get an idea of what the Switch's SoC is capable of, but yeah, it's pretty mind-blowing that a mobile chip comes anywhere close to replicating a demo running on a full-fat console.

My thoughts exactly. Pretty incredible what could be in such a small form factor.
 

wsippel

Banned
Haven't Nvidia referenced an X2 once but mysteriously avoided it ever since? I've seen some speculation that they're withholding info and also cancelled the Shield 2 because of their deal with Nintendo.
Something like this was brought up as a possibility in the latest Linus Tech Tips podcast. I wouldn't rule it out. Nvidia were kinda desperate, they really, really wanted a high profile customer for Tegra - and it doesn't get much more high profile than this.
 

Speely

Banned
I really REALLY hope it's a custom Parker chip. All signs point to yes, but until it's confirmed for certain, all my hype is sort of compromised by the possibility that, as perfect as everything about the Switch seems (to me), I will later learn that Nintendo and Nvidia worked out a deal for super-cheap Maxwell chips.

I don't think that's the case, per se, but that it's a possibility irks me, not so much because of raw power (though that is part of it) but mostly because of power efficiency.
 

AmyS

Member
I really REALLY hope it's a custom Parker chip. All signs point to yes, but until it's confirmed for certain, all my hype is sort of compromised by the possibility that, as perfect as everything about the Switch seems (to me), I will later learn that Nintendo and Nvidia worked out a deal for super-cheap Maxwell chips.

I don't think that's the case, per se, but that it's a possibility irks me, not so much because of raw power (though that is part of it) but mostly because of power efficiency.

I feel like, yeah, I'm in the same boat.

Really hope it's a custom Tegra Parker SoC w/ Pascal architecture.
 

Zil33184

Member
Even if the Switch is running Pascal, it will be a ~25% improvement over the TX1 at similar thermal boundaries, but that's huge when we're talking about the difference between 500 and 700 GFLOPS. The people talking about Denver cores are out of their minds. EDRAM is quite literally impossible at 16nm. Even ESRAM would be too large in any kind of quantity in a mobile SoC. Larger L2 and L3 caches are not out of the question, and would pay dividends.

Or you could get TX1-level performance in a small portable tablet with decent battery life. Remember that the TX1 was used in a home console and a downclocked 10-inch tablet less than a year ago. Neither of those resembles the Switch, which looks like it has less space for its battery.
 

Schnozberry

Member
Or you could get TX1-level performance in a small portable tablet with decent battery life. Remember that the TX1 was used in a home console and a downclocked 10-inch tablet less than a year ago. Neither of those resembles the Switch, which looks like it has less space for its battery.

I doubt it will even reach Shield TV levels of performance untethered as a tablet. That machine consumed up to 20W of power at peak. The Pixel C tablet the X1 lived in started out clocked 30% lower than Nvidia's peak performance numbers, and it rarely stayed there when running on battery. Even with Pascal, you'd need to pull an unreasonable amount of wattage for a battery-powered device to maintain high clock rates. It's when the unit is docked and not constrained by battery power that the performance becomes a bit more interesting.
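A crude model of why downclocking buys so much on battery: dynamic power scales roughly with frequency times voltage squared, and voltage tends to track frequency, so power falls off close to the cube of clock (a rule of thumb that ignores leakage and fixed costs, not a measurement):

```python
# Rough dynamic-power rule of thumb: P ~ f * V^2, with V ~ f,
# so relative power ~ (clock fraction)^3.
for clock_frac in (1.0, 0.7, 0.5):
    power_frac = clock_frac ** 3
    print(f"{clock_frac:.0%} clock -> ~{power_frac:.0%} dynamic power")
# A Pixel C-style 30% downclock cuts dynamic power to roughly a third,
# which is how a ~20W chip becomes survivable on a tablet battery.
```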
 
I wonder how much of a steal this chip is for Nintendo. I mean, Nvidia was hungry for a mass-market buyer. It seems like they did a good amount of heavy lifting for Nintendo. I am more interested in hearing more about this deal.

How much power will this consume? I am assuming it's like 20 watts of power max playing Skyrim.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
On the subject of what part of fragment shaders could be done in fp16, it's an often forgotten fact that up until SM2, all fragment shaders were done in fx16 on ATI (sign, 3 int bits, 12 fractional bits). Then for a very long while they were done in fp24 (again, on ATI), and there were no practical reasons to go fp32 aside from unifying the shaders.
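To make that concrete, here's the s3.12 format blu describes, as a quick quantizer (range [-8, 8) in steps of 1/4096):

```python
# Quantize to signed 3.12 fixed point: 1 sign bit, 3 integer bits,
# 12 fractional bits -> 16 bits total, step size 1/4096.
def to_fx16(x):
    scaled = round(x * 4096)
    scaled = max(-32768, min(32767, scaled))  # clamp to the 16-bit range
    return scaled / 4096

for v in (0.5, 1 / 3, 7.9999, 100.0):
    print(f"{v} -> {to_fx16(v)}")
# 1/3 quantizes to 0.333251953125; 100.0 clamps to just under 8.0 --
# enough precision for color math, useless for big coordinate ranges.
```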

The only hard specs leaked point to TX1-level hardware, with no mention of a 128-bit bus on the early hardware. I just don't buy the hand-wavy explanation of the huge variance between the leaked dev kit and what some are conjecturing about the final spec.

Also, there's no straw man I've constructed. Just a lot of people grasping at straws. In your own post there's conjecture about EDRAM, while others are talking about Denver cores or how half a teraflop is somehow a lowball estimate of what the NS is capable of.

I'm kinda dismayed by the reaction to the possibility that NS is comparable to TX1. The TX1 is actually a great mobile part compared to the shit Nintendo have released in the last decade. It doesn't achieve the impossible, but no hardware vendor does. There are always compromises.
I get your POV, but let's try not to get ahead of ourselves. While yes, the devkits apparently contained a TX1, assuming the NS also doesn't change that total BW figure implies the NS would be a step back from the Wii U, which had 12.8GB/s + [70.4GB/s .. 35.2GB/s] of eDRAM, depending on what you believe, for essentially the same or higher target resolutions. That would pose some serious issues for porting Wii U titles to the NS.
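Adding up blu's figures under both eDRAM readings (sketch arithmetic only, using the numbers as he states them):

```python
# Wii U aggregate bandwidth under the two eDRAM readings blu cites,
# vs. the 25.6 GB/s being assumed for the NS.
ddr3 = 12.8
for edram in (70.4, 35.2):
    total = ddr3 + edram
    print(f"Wii U with {edram} GB/s eDRAM: {total:.1f} GB/s "
          f"({total / 25.6:.1f}x the assumed NS figure)")
```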
 

Schnozberry

Member
I wonder how much of a steal this chip is for Nintendo. I mean, Nvidia was hungry for a mass-market buyer. It seems like they did a good amount of heavy lifting for Nintendo. I am more interested in hearing more about this deal.

How much power will this consume? I am assuming it's like 20 watts of power max playing Skyrim.

The original semiaccurate.com leak that pointed us in the direction of Nvidia stated that they were pretty distraught over losing the console bids for the PS4/Xbox One and were willing to give Nintendo a really strong deal in terms of both software support and hardware to make it work. It sounded kind of unbelievable at the time but given that the rest of the rumor came true I'm not sure what to believe.

Shield TV was 5-20W with a Maxwell chip. Pascal will hopefully be lower on both the low and high ends. We don't really know until we have a better idea of the internals.
 