
Nintendo Switch: Powered by Custom Nvidia Tegra Chip (Official)

Really weird how Mario seemed to be running completely fine, but not so much Zelda. I guess in the end some dumb stuff behind the scenes happened. I can't believe choppy performance will actually be an issue for BoTW on Switch, no matter which mode. It would be absolutely disappointing to say the least.

I would guess that Mario is still fairly early in the development stages and running on a development PC, whereas that Zelda footage was likely right from the Wii U build.
 

kami_sama

Member
When rumours suggest performance below Xbox One but above Wii U, does that apply to handheld mode or docked mode?

Both, I think. If we assume it's a TX1, it does 512 GFLOPS at 1GHz on FP32.
That's when docked, pushing 1080p. If it pushes 720p, that's roughly half the pixels, and if we assume the frequency halves as well, that gives us 256 GFLOPS.
Considering the Wii U has 176 GFLOPS, that's already better. The Xbox One has 1.3 TFLOPS.

That's all speculation, though.
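
For anyone who wants to check the maths, here's the back-of-the-envelope version in Python. Every input is an assumption pulled from the speculation above (256 Maxwell cores, 2 FP32 ops per core per clock, a purely hypothetical halved handheld clock), not a confirmed spec:

```python
# Back-of-envelope FP32 throughput for a hypothetical stock TX1 in the Switch.
# All numbers below are assumptions from the speculation above, not confirmed specs.
CUDA_CORES = 256          # Tegra X1's Maxwell GPU
OPS_PER_CORE_CLOCK = 2    # one FMA = 2 FP32 operations per clock

def gflops(clock_ghz):
    return CUDA_CORES * OPS_PER_CORE_CLOCK * clock_ghz

docked = gflops(1.0)      # ~512 GFLOPS at a speculative 1 GHz
handheld = gflops(0.5)    # ~256 GFLOPS if the clock really does halve

print(f"Docked:   {docked:.0f} GFLOPS")
print(f"Handheld: {handheld:.0f} GFLOPS")
print("Wii U: 176 GFLOPS, Xbox One: ~1310 GFLOPS (for comparison)")
```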
 

Thraktor

Member
What? They quite obviously are.

Care to provide any evidence that Pascal cards are horrifically bandwidth constrained? (At the typical resolutions people would be expected to use them, at least).

Also there's a huge difference between having 25GB/s total memory bandwidth and 150GB/s in absolute terms. Bandwidth per pixel is a much more important metric than bandwidth per compute, since the latter is destined to decrease faster and image quality is about work done per pixel. At 900p30 an XBO has to render 43,200,000 pixels. That's less than twice as much as the 27,648,000 pixels NS would have to render at 720p30, and the XBO has 6x memory bandwidth.

Yes, bandwidth per compute is a silly metric, that was my point (as that's effectively the way the "insider" on the Anandtech forums was calculating things). Bandwidth per pixel is a more fair comparison, if the devices you're comparing use the same graphics architecture. The bandwidth consumption of a GCN 1.0 era GPU accessing an uncompressed buffer using immediate-mode rasterisation will be very different from a Pascal GPU accessing a compressed buffer using tile-based rasterisation (especially if we don't know the cache configuration of the latter).
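
To put rough numbers on the per-pixel comparison, here's a quick sketch using the same assumed figures as above (~150GB/s for XBO at 900p30, 25.6GB/s for a TX1-like NS at 720p30); neither bandwidth figure is an official spec:

```python
# Rough bandwidth-per-rendered-pixel comparison, using the figures quoted above.
# Both bandwidth numbers are the assumed ones from this discussion, not official specs.
XBO_BW = 150e9            # bytes/s, rough combined DDR3 + ESRAM figure
NS_BW = 25.6e9            # bytes/s, assumed LPDDR4 figure for a TX1-class chip

xbo_pixels = 1600 * 900 * 30    # 900p30 -> 43,200,000 pixels/s
ns_pixels = 1280 * 720 * 30     # 720p30 -> 27,648,000 pixels/s

print(f"XBO: {XBO_BW / xbo_pixels:,.0f} bytes per rendered pixel")
print(f"NS:  {NS_BW / ns_pixels:,.0f} bytes per rendered pixel")
# ~3,472 vs ~926 bytes/pixel: a ~3.7x gap rather than 6x, before any of the
# architectural differences (compression, tiled rasterisation) are considered.
```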

The mobile code paths of UE4 do give up features present in the main branch in order to accommodate FP16. For mobile games it is a good trade-off, but fully featured games probably wouldn't consider it.

Do you have any sources on features being dropped from mobile specifically due to lack of precision in FP16? (I'm not doubting that there are such features, I've just had a difficult time finding reliable info on this kind of stuff online.)

Would a custom X1 really even be a realistic option for the retail unit, though?

Who will produce these chips? We've had a lot of reports in the past few years about how 20nm was quickly skipped in favour of 16nm by most of TSMC's customers.

We have this article from January:

http://www.extremetech.com/computin...-10nm-production-this-year-claims-5nm-by-2020

Plus:

http://www.tsmc.com/uploadfile/ir/quarterly/2016/3tBfm/E/TSMC 3Q16 transcript.pdf

Going through TSMC's earnings reports and calls, they barely even mention the 20nm segment, and when they do, it's bundled with 16nm.

With mid-range smartphone SoCs jumping straight from 28nm to 16/14nm, I think it's safe to say that 20nm is just straight up uneconomical at this point, and if there's any capacity left it's being used for legacy products (Apple still sells some devices that use the A8, for example). The only possible reason Switch would use a 20nm chip would be if Nvidia had entered into a large wafer commitment with TSMC for 20nm and offered Nintendo an obscenely good deal in order to use it up, but I very much doubt that that's the case.

To preempt Hoo-doo: yes, I know that Nintendo didn't follow the logical path with Wii U and went for the old fab process, but that created a lot of trouble for them later in Wii U's life, so one would hope they won't ignore the fact that 20nm will be all but dead in the near future.

It's worth keeping in mind that Nintendo's decision to use eDRAM on Wii U's GPU prevented them from using 28nm, even if they had wanted to.

Preempting the counterpoint doesn't invalidate it. AFAIK they also produced Wiis at 90nm for the entirety of the console's life, never shrinking to 65nm or 45nm. When there is a history of illogical hardware decisions, "potentially creating trouble" isn't a sufficient argument against it.

I don't think this is true at all. While nobody (to my knowledge) has actually decapped Wii CPUs & GPUs from across the production timeline, later models definitely had smaller package sizes, required smaller heatsinks and consumed less power, all of which would definitely indicate that they performed die shrinks just like MS and Sony.

An audience of people who, thanks to Apple's brilliant marketing of the Retina display many years ago, care more about resolution and clarity of the display than ever before.

A 6" 720p display has a PPI greater than any of Apple's Retina Macs and only slightly lower than their Retina iPads. Let's not pretend that this is going to be a pixelated mess in comparison.
 
NippleViking said:
This is more than an 800% increase in pixels over the 3DS isn't it?
GhostTrick said:
If you're comparing just one eye's view of the top screen, then the Switch screen would be 960% of the pixels, or an 860% increase. If you consider both left/right views of top screen plus the bottom screen, Switch would be about 343% of the pixels or a 243% increase.
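
Checking that arithmetic (assuming the 3DS top screen is 400x240 per eye and the bottom screen 320x240):

```python
# Switch screen pixels vs 3DS screen pixels, per the comparison above.
SWITCH = 1280 * 720          # 921,600
TOP_EYE = 400 * 240          # one eye's view of the 3DS top screen
BOTTOM = 320 * 240           # 3DS bottom screen

one_eye = SWITCH / TOP_EYE                     # 9.6x   -> 960% of / 860% more
everything = SWITCH / (2 * TOP_EYE + BOTTOM)   # ~3.43x -> 343% of / 243% more

print(f"vs one eye of the top screen: {one_eye * 100:.0f}% of the pixels")
print(f"vs both views plus bottom:    {everything * 100:.0f}% of the pixels")
```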
 

SuperHans

Member
Is it possible for this to have some sort of GSync implementation in handheld mode? Laptops with gsync don't require the gsync module that monitors do right? Could it use something like that.
 

Thraktor

Member
Is it possible for this to have some sort of GSync implementation in handheld mode? Laptops with gsync don't require the gsync module that monitors do right? Could it use something like that.

It's entirely technically possible, even without Nvidia's involvement (Apple's iPads now have adaptive refresh rate displays). It'd definitely be a nice feature if it's cheap and easy to implement, although it could make TV play feel a bit stuttery in comparison.
 

Schnozberry

Member
It's entirely technically possible, even without Nvidia's involvement (Apple's iPads now have adaptive refresh rate displays). It'd definitely be a nice feature if it's cheap and easy to implement, although it could make TV play feel a bit stuttery in comparison.

Hitting a locked 30 or 60 might be easier docked. Having adaptive refresh would be more beneficial on the go.
 

G.ZZZ

Member
You're assuming that the games will push the resources to the max when in handheld mode. That will probably be the case with 3rd party games, but 1st party games should be able to render 720p in handheld mode without too many issues. Take Zelda, for example: it already runs at 720p on Wii U. Even if we assume a very moderate 2x Wii U power in handheld mode (which would mean a Tegra X1 running at 688MHz), that should run Zelda at 720p without breaking a sweat. So docked, actively cooled and pushed to the max, it's pretty realistic for it to run at 1080p.

Again, that's a developer's choice, and also Nintendo's, to a point. It's possible, but it's not certain. Nothing is, unless Nintendo come out and say that every game needs to be programmed for 720p in mobile mode and 1080p in docked mode, which I don't think they'll do, tbh. As I see it, it's possible, but not probable.
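
For what it's worth, the "2x Wii U ≈ 688MHz" figure in the quote checks out under the same speculative TX1 assumptions used earlier in the thread (256 cores, 2 FP32 ops per core per clock):

```python
# Clock a hypothetical TX1 would need to hit 2x the Wii U's FP32 throughput.
# Same speculative assumptions as before: 256 cores, 2 FP32 ops per core per clock.
WII_U_GFLOPS = 176
target_gflops = 2 * WII_U_GFLOPS                  # 352 GFLOPS
clock_mhz = target_gflops * 1000 / (256 * 2)      # GFLOPS -> required MHz
print(f"{clock_mhz:.0f} MHz")                     # ~688 MHz
```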
 
I've seen this thrown around a bit but never gotten a real answer on it:

Would it be possible for the Switch to utilize the Nvidia PC streaming that the Shield products have? Most people say it's unlikely but I'm curious why.

Is it an additional hardware expenditure? Does it open up the Switch much more easily to hacking/homebrew?
 

MacTag

Banned
I'm thinking 32GB is too small. We need at least 100GB. And patches could be created on the cart itself.

Nintendo should subsidize the cost of the carts to help devs not feel gunshy.
Patches on the cards just add expense for Nintendo and 3rd parties. No need for that; all patches, DLC and saves should be stored on system/expanded memory.
 

G.ZZZ

Member
Just saw this video of Metal Gear Rising comparison between X360 and Shield TV

https://www.youtube.com/watch?v=fwz65Bvxrv8

Is the Shield TV performance a good indicator for level of performance we can expect from Switch?

The Shield TV is running Android, so no; while it may be the same architecture, it's much, much slower than an equivalent console. The Shield TV is something like 3 times as fast as the X360, but the game runs worse on it.
 
Both, I think. If we assume it's a TX1, it does 512 GFLOPS at 1GHz on FP32.
That's when docked, pushing 1080p. If it pushes 720p, that's roughly half the pixels, and if we assume the frequency halves as well, that gives us 256 GFLOPS.
Considering the Wii U has 176 GFLOPS, that's already better. The Xbox One has 1.3 TFLOPS.

That's all speculation, though.

Thanks. If this speculation is close to reality, then that would be really great.
 

MacTag

Banned
The Shield TV is running Android, so no; while it may be the same architecture, it's much, much slower than an equivalent console. The Shield TV is something like 3 times as fast as the X360, but the game runs worse on it.
MGR was also ported by Nvidia rather than Konami/Platinum. Outsourced ports generally aren't the best way to judge system capability even without a resource hogging mobile OS running on top of them.
 

atbigelow

Member
I've seen this thrown around a bit but never gotten a real answer on it:

Would it be possible for the Switch to utilize the Nvidia PC streaming that the Shield products have? Most people say it's unlikely but I'm curious why.

Is it an additional hardware expenditure? Does it open up the Switch much more easily to hacking/homebrew?
Mainly because this is a Nintendo product, not an NVidia product. It probably has nothing to do with their goals.
 
Mainly because this is a Nintendo product, not an NVidia product. It probably has nothing to do with their goals.

What kind of downside is there to providing PC streaming functionality? Nintendo's goal should be to make the product as appealing as possible to as large a market as possible, so if there is no hardware/hacking reason why Nvidia gamestream could be a bad thing then I don't see why they wouldn't include it.
 

Oregano

Member
What kind of downside is there to providing PC streaming functionality? Nintendo's goal should be to make the product as appealing as possible to as large a market as possible, so if there is no hardware/hacking reason why Nvidia gamestream could be a bad thing then I don't see why they wouldn't include it.

Nintendo probably won't want people using their hardware to buy other vendors' software.
 
What kind of downside is there to providing PC streaming functionality? Nintendo's goal should be to make the product as appealing as possible to as large a market as possible, so if there is no hardware/hacking reason why Nvidia gamestream could be a bad thing then I don't see why they wouldn't include it.
Because they are Nintendo.
 

Branduil

Member
Really weird how Mario seemed to be running completely fine, but not so much Zelda. I guess in the end some dumb stuff behind the scenes happened. I can't believe choppy performance will actually be an issue for BoTW on Switch, no matter which mode. It would be absolutely disappointing to say the least.

Did you even read the post you're quoting? The implication is that the choppiness is due to the editor of the commercial not having enough footage to last through the shot, so they just stretched out the material they were given.
 
Nintendo probably won't want people using their hardware to buy other vendors' software.

I guess I can see both pros and cons in that area.

Pros:

-more people buy the hardware, higher install base means more people have the option to buy Nintendo software
-Switch is seen playing PC games like The Witcher 3, which gives people the impression that you can play all multiplats on it

Cons:

-Nintendo misses out on the revenue from PC games
-Third parties see less of an incentive to put their games on Nintendo's hardware when PC ports will work fine
--Nintendo makes less revenue from third parties overall

So while it would likely increase the install base and maybe increase the amount of software sold, it could very badly damage their licensing revenue from third party games. That makes sense then, thank you for helping me work it out.
 

numble

Member
~240dpi (the screen is 6.2") is within spitting distance of the iPad Pro/Air's retina 264ppi and you keep both at roughly the same distance so 720p is perfect for a device this size. Phones have to render webpages and whatsapp, not Zelda.

A 6" 720p display has a PPI greater than any of Apple's Retina Macs and only slightly lower than their Retina iPads. Let's not pretend that this is going to be a pixelated mess in comparison.

For the 7.9" iPad mini that is more comparable to the 6" Switch, the PPI on the Retina iPad is 326 PPI. That is a 30% difference. Not spitting difference or slightly lower.
 
For the 7.9" iPad mini that is more comparable to the 6" Switch, the PPI on the Retina iPad is 326 PPI. That is a 30% difference. Not spitting difference or slightly lower.

Yeah, but you have to power those extra pixels... It's much better to aim for slightly lower IQ to maintain the power/framerate/battery balance.
 

Hermii

Member
Did you even read the post you're quoting? The implication is that the choppiness is due to the editor of the commercial not having enough footage to last through the shot, so they just stretched out the material they were given.
It's incredible how much confusion and worry can come from a small error like that. The funny thing is that it was probably Wii U footage as well.
 

KingSnake

The Birthday Skeleton
With mid-range smartphone SoCs jumping straight from 28nm to 16/14nm, I think it's safe to say that 20nm is just straight up uneconomical at this point, and if there's any capacity left it's being used for legacy products (Apple still sells some devices that use the A8, for example). The only possible reason Switch would use a 20nm chip would be if Nvidia had entered into a large wafer commitment with TSMC for 20nm and offered Nintendo an obscenely good deal in order to use it up, but I very much doubt that that's the case.

Unless the chips were already manufactured (which doesn't make sense in this context, because Nvidia states that Switch uses a custom chip), I don't see why even such a commitment couldn't be renegotiated into a 16nm one, especially since TSMC has claimed several times in its earnings calls that it gets better margins on that fabrication node.

We are really in a situation in which Pascal architecture is the best solution for everybody involved.
 
720p is absolutely the right compromise for the portable system. It's a quantum leap over previous handhelds and will keep performance and battery life in check while maximising the number of games running at native resolution.

Combined with modern temporal AA it should produce a very smooth image on the 6" screen.
 

tuxfool

Banned
Do you have any sources on features being dropped from mobile specifically due to lack of precision in FP16? (I'm not doubting that there are such features, I've just had a difficult time finding reliable info on this kind of stuff online).

Nothing that I can find easily; it was something I heard asked of an engineer on a stream (and I don't remember where, as it was a while ago). I don't know how significant the differences are, though; I got the impression that it's mostly all there. I also got the impression that it was more a case of features lagging behind rather than being outright impossible.
 

Oregano

Member
I seem to recall someone saying that Mobile UE4 defaults/maxes out at 720p. That won't be too much of an issue with a device designed around that resolution though.
 
Very few, if any, games you play on it render at that resolution. Any actually intensive games are far below native.

Yeah, I suppose that makes sense overall, though I suspect that the Nvidia Shield has come close or actually been able to hit that on some games... Doom 3, I believe, but I would have to check.

Edit: Doom 3 would be part of the "very few" you mentioned.
 

SuperHans

Member
It's entirely technically possible, even without Nvidia's involvement (Apple's iPads now have adaptive refresh rate displays). It'd definitely be a nice feature if it's cheap and easy to implement, although it could make TV play feel a bit stuttery in comparison.

Hitting a locked 30 or 60 might be easier docked. Having adaptive refresh would be more beneficial on the go.

It would be good on the go, alright; dips under 30 wouldn't feel as bad. Really happy Nintendo have gone with this platform. It'll give them a good path to upgrade for the next few gens, especially with Nvidia going all-in on Tegra for automotive. They really milked the IBM GameCube tech, so this will be much better if they go the same route.
 

AzaK

Member
I was more than happy to buy my own storage. I got an awesome deal on a 2TB external HDD on Japanese Amazon so it only drove the cost up by a little over 6,000 yen which is the price of a new game. The advantage is that I could get whatever size I wanted. Unlike my PS4, which has a 500GB drive in it, but the damn thing fills up so fraggin' fast. If I want to try demos, or something, I have to constantly remove things and add things. If people like managing the fridge, then that's fine, but I'd rather not worry about it. Still, most Wii U games are on disc, so if someone went all physical, I think the basic internal memory should be fine. If someone was going to go digital, then I would imagine they'd be smart enough to buy/borrow an external storage device.

Don't take what I said to mean I don't want choice. For it to be OK to me, two things need to happen.

1) The price of the console has to be reduced. Wii U's wasn't; it was expensive for what it was, and then you pretty much had to have an external HDD.

2) It needs an internal slot for the drive where you don't have to have a stupid externally connected device.

My real issue is that for people who want to go digital, or if AAAs need patches, additional storage is almost certain to be required, and I think it's a little misleading for a company to sell a machine as a complete system when that extra storage is all but mandatory.
 

Lonely1

Unconfirmed Member
For the 7.9" iPad mini that is more comparable to the 6" Switch, the PPI on the Retina iPad is 326 PPI. That is a 30% difference. Not spitting difference or slightly lower.

The iPad mini's main function is to display text, pre-recorded video and mostly static GUIs. The main function of the Switch is to display graphically intensive games, per the trailer. So no, it is not "more comparable".
 

Lonely1

Unconfirmed Member
Yeah, I suppose that makes sense overall, though I suspect that the Nvidia Shield has come close or actually been able to hit that on some games... Doom 3, I believe, but I would have to check.

Edit: Doom 3 would be part of the "very few" you mentioned.

I'm certain that Doom 3 doesn't run at 1440x2560 on any Shield device.
 

LCGeek

formerly sane
Is it possible for this to have some sort of GSync implementation in handheld mode? Laptops with gsync don't require the gsync module that monitors do right? Could it use something like that.

Gsync and lightboost are hardware hacks to make up for the fact that Microsoft won't do its job. If Nintendo were smart enough and paid for the tech, such features handled by the OS would bring immense benefits to consumers.

Whether they will is another matter. While I love Nintendo's engineers, they often have to deal with limits and have poor awareness; all three of the big companies do. Leveraging some of the big new tech we've had in the last decade, applied to gaming properly, would have huge payoffs.
 
Gsync and lightboost are hardware hacks to make up for the fact that Microsoft won't do its job. If Nintendo were smart enough and paid for the tech, such features handled by the OS would bring immense benefits to consumers.

Whether they will is another matter. While I love Nintendo's engineers, they often have to deal with limits and have poor awareness; all three of the big companies do. Leveraging some of the big new tech we've had in the last decade, applied to gaming properly, would have huge payoffs.

Based on Nvidia's blog post it seems like Nvidia is supporting the Switch quite heavily with tools on their end, so there might be more of an equal partnership in designing the OS/features than we've seen with Nintendo's previous hardware.


Also, earlier this year you were talking about how the NX CPU performed decently ahead of that of the XB1, and that you'd know more as the next devkits were released in August. Do you know if any of that has changed, or if there's any other info you can share? Definitely understandable if you can't :)
 

Pif

Banned
Both, I think. If we assume it's a TX1, it does 512 GFLOPS at 1GHz on FP32.
That's when docked, pushing 1080p. If it pushes 720p, that's roughly half the pixels, and if we assume the frequency halves as well, that gives us 256 GFLOPS.
Considering the Wii U has 176 GFLOPS, that's already better. The Xbox One has 1.3 TFLOPS.

That's all speculation, though.

You can't compare AMD/IBM FLOPS with Nvidia FLOPS, they say.
 

numble

Member
The iPad mini's main function is to display text, pre-recorded video and mostly static GUIs. The main function of the Switch is to display graphically intensive games, per the trailer. So no, it is not "more comparable".
You actually think that a 6" device is more comparable to 10", 12", 15" and 24" devices than a 7" device?
 

dogen

Member
Just saw this video of Metal Gear Rising comparison between X360 and Shield TV



Is the Shield TV performance a good indicator for level of performance we can expect from Switch?

Look at the Doom 3 BFG port. More than 2x res and still slightly better performance. And it was already a well optimized game on 360.
 

Nerrel

Member
x-quoting myself from the BotW thread



#TeamReasonablySmooth

That sounds convincing to me, but god, Nintendo should have known better than to put footage in their reveal trailer that actively makes the system look like it's struggling to run a Wii U game.
 

Zil33184

Member
Care to provide any evidence that Pascal cards are horrifically bandwidth constrained? (At the typical resolutions people would be expected to use them, at least).



Yes, bandwidth per compute is a silly metric, that was my point (as that's effectively the way the "insider" on the Anandtech forums was calculating things). Bandwidth per pixel is a more fair comparison, if the devices you're comparing use the same graphics architecture. The bandwidth consumption of a GCN 1.0 era GPU accessing an uncompressed buffer using immediate-mode rasterisation will be very different from a Pascal GPU accessing a compressed buffer using tile-based rasterisation (especially if we don't know the cache configuration of the latter.)

Pretty much all GPUs these days are bandwidth constrained relative to compute, especially ones that have over 5x the compute power of a console and less than twice the bandwidth. Compute scales faster than bandwidth, and it's a problem. See 4K and VR if you think this is merely academic as opposed to something people in the industry take seriously.

As for the XBO vs TX1 situation, Nvidia's bandwidth optimisations aren't going to net significant enough performance increases to offset only having 25.6GB/s of memory bandwidth.

I do see a lot of people engaging in insane amounts of fudge factoring without a single piece of evidence, though. Apparently fp16, color compression, "NV FLOPS", and tiled rasterization somehow mean that the TX1-level hardware likely to be in the NS is a portable XBO or PS4.
 
Gsync and lightboost are hardware hacks to make up for the fact that Microsoft won't do its job. If Nintendo were smart enough and paid for the tech, such features handled by the OS would bring immense benefits to consumers.

Whether they will is another matter. While I love Nintendo's engineers, they often have to deal with limits and have poor awareness; all three of the big companies do. Leveraging some of the big new tech we've had in the last decade, applied to gaming properly, would have huge payoffs.
Any updates on the hardware?
 