
Confirmed: The Nintendo Switch is powered by an Nvidia Tegra X1

I rotated the image in another post to match Nvidia's graphic better. Thraktor used one or the other, and I must have quoted the other in this thread.


Speaking of, here's the Switch rotated to match the TX1:

[Image: Switch die shot, rotated to match]

[Image: Tegra X1 die shot]



If that ain't damn near identical...

Yup, I got that eventually, thanks though. It's just confusing since they all appear to be tall rectangles, so I figured a 90-degree turn would look like a wide rectangle!

But yeah, these look identical. Maybe doing an overlay in Photoshop would clue us in to any minor differences.
 

ethomaz

Banned
Or, you know, some more optimization time could be enough to stabilize Zelda's performance, but no, let's make definitive statements about a system's power based on a launch port!
You can do the same optimization on the X2 and it would look even better.

I don't even know why you guys are trying to say the performance boost was not good for the Switch lol

A $599 Switch probably would've sold just a bit worse.
Do you think the X2 chip costs $200 more than the X1 chip lol

C'mon.

Both chips may cost below $100... way below, judging by their size... the XB1/PS4's APUs cost below $100 at launch.
 

guek

Banned
Still ridiculously overpriced for a hunk of plastic with nothing more than HDMI and USB plugs in it. Should be $10, $20 tops.

No it is not.

USB-C docks are not cheap. Granted, some of these docks also have Ethernet or card-reading capabilities, but that's not what's driving up the price. $60 for the dock alone is in line with market prices and is only marginally overpriced, if at all.
 

TI82

Banned
Sorry if someone said this already, but is this the first time Nvidia has had a card in a system? It seems it's always been custom or AMD.

Edit: Seems the OG Xbox had Nvidia as well.
 

Nabbis

Member
Switch's screen may be low res, but it's not low quality. It has nice contrast and the colors pop. It's really great.

I get it if you like the thing, but the materials and 720p are not a mark of high quality in 2017. Well, it makes sense given this confirmation of the X1, but that only speaks ill of the whole package instead of complimenting Nintendo on using the appropriate resolution for the current power.
 
You can do the same optimization on the X2 and it would look even better.

I don't even know why you guys are trying to say the performance boost was not good for the Switch lol

You aren't getting a TX2 in the Switch for $300.

I get it if you like the thing, but the materials and 720p are not a mark of high quality in 2017. Well, it makes sense given this confirmation of the X1, but that only speaks ill of the whole package instead of complimenting Nintendo on using the appropriate resolution for the current power.

A 1080p screen would destroy the battery on this device. It's literally a pipe dream. All you would get is sub-native-res games.
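For a sense of scale, a quick back-of-the-envelope sketch in Python of the raw pixel counts involved (simple arithmetic only; actual power draw depends on the panel and GPU behavior):

```python
# Raw pixel counts: fill rate (and roughly per-frame GPU work)
# scales with the number of pixels pushed.
pixels_720p = 1280 * 720    # 921,600
pixels_1080p = 1920 * 1080  # 2,073,600

print(f"1080p is {pixels_1080p / pixels_720p:.2f}x the pixels of 720p")  # 2.25x
```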
 
It's low quality in the sense that it's lower resolution and isn't glass, compared to tablets that are cheaper. I get that 720p is kind of a sweet spot, but it isn't as if scaling doesn't exist, not to mention 720p is a cheaper part than 1080p or higher.
Are you really suggesting upscaling? That would murder image quality. Just look at half of the 3D Vita games out there.
 
You can do the same optimization on the X2 and it would look even better.

I don't even know why you guys are trying to say the performance boost was not good for the Switch lol

The X2 is:
A. Too expensive
B. Brand new
C. Not even being used by Nvidia in their newest Shield device

The X2 was not an option for the Switch.
 

AmyS

Member
I think a future Tegra V1 (Volta GPU architecture, 512 CUDA cores) with 8 ARM CPU cores, shrunk down to 7nm (from Xavier's 16nm), would make a nice SoC for a Switch II in 2021.

Especially if they "switch" the RAM from LPDDR4 to the forthcoming low-cost HBM.
 

AmFreak

Member
If the way more efficient X1 thermally throttles down to below Switch clocks, what on earth makes you believe that the Shield tablet doesn't?

The GPU stayed at 750MHz for over 100 runs (more than an hour) of T-Rex. It went down after that, probably because of the battery.
http://www.anandtech.com/show/8329/revisiting-shield-tablet-gaming-ux-and-battery-life

Throttling is normally an issue for small, fan-less Android devices with strong SoCs. Most can only maintain the maximum performance for a short period, only to then lower the clock rates of both the graphics card and processor due to temperature limits. We can give the all-clear signal for the Shield tablet. Even after running 3DMark Unlimited 10 times consecutively, the Tegra K1 still reached a similar rate as the first time. The CPU also stably clocked at 2.2 GHz after 15 minutes in the Stability Test.

http://www.notebookcheck.net/NVIDIA-Shield-Tablet-with-Tegra-K1-Review.125892.0.html
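A minimal sketch of that kind of sustained-load methodology (Python), assuming a rooted Android device or a regular Linux shell with the standard cpufreq sysfs node; the sample interval and duration are placeholders, not what AnandTech or Notebookcheck actually ran:

```python
import time

# Standard Linux cpufreq sysfs node (readable on Android with shell/root access).
FREQ_NODE = "/sys/devices/system/cpu/cpu0/cpufreq/scaling_cur_freq"

def cpu_clock_khz() -> int:
    """Read the current CPU clock for core 0, in kHz."""
    with open(FREQ_NODE) as f:
        return int(f.read().strip())

# Sample once per second while a benchmark (T-Rex, 3DMark, etc.) runs in the
# foreground; a sustained decline over many minutes indicates throttling.
samples = []
for _ in range(15 * 60):  # ~15 minutes, like the stability test quoted above
    samples.append(cpu_clock_khz())
    time.sleep(1)

print(f"start {samples[0]} kHz, end {samples[-1]} kHz, min {min(samples)} kHz")
```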
 

ZOONAMI

Junior Member
I hear you. I bought a Shield back in November 2015 that used the Tegra X1. It's not an apples-to-apples comparison, but the fact that I had access to this tech almost two years ago isn't very sexy in the technology realm.

Yeah, it's just that another SM would have been cool, especially since they've been calling it custom all this time.
 

OryoN

Member
Is there certainty that customizations weren't/couldn't have been made to have the A53s run simultaneously? Wouldn't they be a good fit for OS/background features? I'm aware that there are details floating around suggesting devs only have access to 3 of the 4 A57s, but it has not been confirmed what that fourth core is reserved for, or if it's possible to free it up down the line.

Anyway, as I stated weeks ago, whether it's a standard Tegra X1 or not makes no difference to me in the end, but I was leaning on/hoping that some easily recognizable customization (more cache/eSRAM, etc.) would help explain why the device performs as well as it does. Even at this early stage, I'm seeing stuff on the Switch that is well beyond my initial expectations for what a device so small could output. Perhaps I was looking for a more complicated answer, but it seems that low driver overhead finally unlocked the Tegra beast.
 

kIdMuScLe

Member
My LG quick charger that came with an old phone charges it just fine while gaming off the dock, but yeah, there aren't really any 5-amp USB-C chargers on the market other than Nintendo's.

That's not what I'm asking for... I'm asking for one that can display to the TV, power 3 USB ports, and charge the system at the same time.
 

senj

Member
You can do the same optimization on the X2 and it would look even better.

I don't even know why you guys are trying to say the performance boost was not good for the Switch lol

Because the 20% performance boost can only be obtained with a chip that costs $299 by itself in a bare-bones board configuration.

Even with large volume discounts, the Switch would cost more than 20% more than it does today if they'd gone with the X2. The cost doesn't justify the speed bump -- which is exactly why Nvidia isn't bothering to use it in their Shield, either.
 

ethomaz

Banned
The X2 is:
A. Too expensive
B. Brand new
C. Not even being used by Nvidia in their newest Shield device

The X2 was not an option for the Switch.
I'm sure both the X1 and X2 cost way below $100... you guys are making it look like these small chips cost $200.

Because the 20% performance boost can only be obtained with a chip that costs $299 by itself in a bare-bones board configuration.

Even with large volume discounts, the Switch would cost more than 20% more than it does today if they'd gone with the X2. The cost doesn't justify the speed bump -- which is exactly why Nvidia isn't bothering to use it in their Shield, either.
What am I reading lol

X2 may cost $50.

Where are you guys getting $299???
 
Important to note that the TX1 can only use one cluster at a time, not a mix of cores from each, so the OS would have to run on an A57, not an A53, while games run on the A57s.

My question about that is: what is causing that limitation? Is it a hardware limitation which cannot be overcome even with any customization? Is it a firmware limitation?

Could it be possible that they overcame this with a customization so minor that we can't seem to see it when comparing the die shots?
 

EDarkness

Member
Yeah, it's just that another SM would have been cool, especially since they've been calling it custom all this time.

Yeah. This is my biggest takeaway from all of this. Nvidia and Nintendo could have saved everyone some time by just saying it was an X1 from the beginning. Going with this "custom" business just dragged this whole thing on for no reason. Why not just tell us straight?
 

Oregano

Member
I get it if you like the thing, but the materials and 720p are not a mark of high quality in 2017. Well, it makes sense given this confirmation of the X1, but that only speaks ill of the whole package instead of complimenting Nintendo on using the appropriate resolution for the current power.

The Xbox One struggles with 1080p and even some PS4 games are lower than that. You'd be getting more simplistic visuals at 1080p, or everything would be far below native res anyway.
 

ultrazilla

Member
Am I the only one who isn't really concerned about this?

It's hands down more powerful than a Wii U/360/PS3. You get portable and home gaming.

And I honestly believe that going forward, Nintendo will keep this design and iterate on just the tablet (basically the system) as Nvidia brings out more advanced chips. Since many people will have the Joy-Cons, docks, Pro Controllers, etc., Nintendo could just launch Switch 2.0 tablets with beefier specs and hopefully keep costs down, since they won't come packed with the above-mentioned hardware.

That, or they'll make use of their additional add-on computational device patent that seemed like it was meant for the Switch (IMO), which would basically be add-on devices that plug into the dock to increase performance.

So Nintendo fans won't have to buy entire new systems like the PS4 Pro, Xbox One S, or Scorpio to get better-spec'd hardware. They'll just release tablet iterations or the add-on docks/devices.

I'm really happy with my Switch, and I'm sure once developers and Nintendo themselves get used to developing for it, we'll naturally see better-looking and better-performing games. They're off to a good start with over a million units sold already.
 

NOLA_Gaffer

Banned
The Xbox One struggles with 1080p and even some PS4 games are lower than that. You'd be getting more simplistic visuals at 1080p, or everything would be far below native res anyway.

In addition, pushing the pixels of a 1080p screen would be even more of a battery drain.
 

saskuatch

Member
I'm shocked they are completely identical; the X1 is quite old. Surely there would have been some, even minor, improvement or revision by now.
 
By the way, I'm happy to see so much Italian technology there.
I met a guy from STMicroelectronics once; he said that working for Nintendo was a real challenge.
 

ckaneo

Member
As I already said, I think the screen and the battery make up more of the difference.
Yeah, I know. There is more than enough present in the Switch to make it cost $300. Throw in Nintendo's profit and there you go.

Not sure why people think just the chip matters; it's obviously weak compared to other consoles.
 

LordOfChaos

Member
My question about that is: what is causing that limitation? Is it a hardware limitation which cannot be overcome even with any customization? Is it a firmware limitation?

Could it be possible that they overcame this with a customization so minor that we can't seem to see it when comparing the die shots?

Yeah, the way the caches are wired (the CCI).

The TX1 uses the oldest form of big.LITTLE:

[Diagram: big.LITTLE migration models]



To run the OS on an A53 while running the game on 4 A57s, you'd need the newest form, HMP, which allows any and all cores to be active at once. The oldest form is full cluster switching; in the middle form, one core of each big.LITTLE pair can be active at once (not useful for this scenario, as you'd still lose an A57).
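A toy Python sketch of which core combinations each migration model permits; the mode names are the generic ARM big.LITTLE terms and the 4+4 core counts match the TX1, but none of this is Nvidia-specific code:

```python
# Three big.LITTLE migration models, least to most flexible:
#   cluster switching - all-big or all-little, never both (what the TX1 supports)
#   in-kernel switcher (IKS) - one core of each big/little pair at a time
#   HMP (global task scheduling) - any mix of all eight cores
def allowed(mode: str, big: int, little: int) -> bool:
    if not (0 <= big <= 4 and 0 <= little <= 4):
        return False
    if mode == "cluster_switching":
        return big == 0 or little == 0
    if mode == "iks":
        return big + little <= 4  # at most one core per big/little pair
    if mode == "hmp":
        return True
    raise ValueError(f"unknown mode: {mode}")

# The scenario in question: game on all 4 A57s, OS on 1 A53.
print(allowed("cluster_switching", big=4, little=1))  # False -- impossible on TX1
print(allowed("hmp", big=4, little=1))                # True  -- needs HMP
```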


Best I can tell, the wiring situation also looks damn near identical. A gif switching between the two would help.
 

ZOONAMI

Junior Member
Are you really suggesting upscaling? That would murder image quality. Just look at half of the 3D Vita games out there.

720p content looks fine on a 1080p screen.

Most tablet games are not running at native resolution.

But yes, I would have preferred a 1080p screen and 3 SMs to reliably offer 1080p gaming in at least more titles. Dynamic 900p docked is disappointing, imo, for Zelda.

The Switch is already running games below 720p in portable mode. Not sure if it's upscaling Zelda to 720p or not, but there was a thread about dynamic resolution running below 720p in portable mode and at 900p docked.

720p UI docked is inexcusable.
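For reference, dynamic resolution generally works by measuring GPU frame time and nudging the internal render scale before upscaling to the fixed output. A purely illustrative Python sketch; the frame budget, step size, and scale bounds are made-up numbers, not Zelda's actual controller:

```python
# Generic dynamic-resolution controller: render internally at a variable
# scale, then upscale to the fixed output (e.g. 1280x720 portable,
# 1600x900 docked). All constants here are illustrative.
FRAME_BUDGET_MS = 33.3          # 30fps target
MIN_SCALE, MAX_SCALE = 0.66, 1.0
STEP = 0.05

def next_scale(scale: float, gpu_frame_ms: float) -> float:
    if gpu_frame_ms > FRAME_BUDGET_MS:           # over budget: drop resolution
        scale -= STEP
    elif gpu_frame_ms < FRAME_BUDGET_MS * 0.85:  # headroom: raise it back up
        scale += STEP
    return max(MIN_SCALE, min(MAX_SCALE, scale))

# A heavy frame while docked drops the internal resolution below 900p:
scale = next_scale(1.0, gpu_frame_ms=38.0)
print(int(1600 * scale), int(900 * scale))  # 1520 855
```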
 
Yeah, the way the caches are wired (the CCI).

The TX1 uses the oldest form of big.LITTLE.

To run the OS on an A53 while running the game on 4 A57s, you'd need the newest form, HMP, which allows any and all cores to be active at once. The oldest form is full cluster switching; in the middle form, one core of each big.LITTLE pair can be active at once (not useful for this scenario, as you'd still lose an A57).

Yeah, I remember seeing that before. I think the main point of the question, though, is: is HMP something that could have been added with minimal physical changes to the die? Or would it require some kind of wiring or interface that would immediately be noticeable on a die shot?
 

LordOfChaos

Member
Yeah, interesting. Only those two, and now this one, and it's not a custom chip either.

Semi-custom is part of AMD's strategy; meanwhile, Nvidia has caused issues with partners over how much control they want over their IP. It burned Microsoft and then Sony in turn.
 

LordOfChaos

Member
Yeah, I remember seeing that before. I think the main point of the question, though, is: is HMP something that could have been added with minimal physical changes to the die? Or would it require some kind of wiring or interface that would immediately be noticeable on a die shot?

The wiring between the core pairs would likely have to change, and it looks pretty identical to me. If you look at, say, Ryzen, they pay a wiring overhead for each core having equally timed access to all of the L3. Not exactly the same thing, but I don't think an identical chip has ever gone from non-HMP to HMP.


Actually, thinking about it, it's bigger than that: "core pairs" is a misnomer on my part, as they aren't paired; they're still in two clusters.
 

ethomaz

Banned
I found the size of the X1.

121mm² on 20nm.

121mm² dies on a 300mm wafer give you about 500 chips.

At ~$10,000 for a 300mm wafer on 20nm... that gives ~$20 per chip for Nvidia.

Add Nvidia's profit on top of that.

The X1 costs Nintendo something like $30-40... or, with a big margin of error, $30-50.
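Those numbers roughly check out against the standard first-order die-per-wafer estimate; a quick Python sketch, where the $10,000 wafer price and ~90% yield are the post's/assumed figures, not published ones:

```python
import math

WAFER_DIAMETER_MM = 300
DIE_AREA_MM2 = 121        # TX1 on 20nm
WAFER_COST_USD = 10_000   # assumed price per wafer, per the post
YIELD = 0.9               # assumed fraction of good dies

# First-order dies-per-wafer formula: raw area ratio minus an edge-loss term.
wafer_area = math.pi * (WAFER_DIAMETER_MM / 2) ** 2
gross = wafer_area / DIE_AREA_MM2 - (math.pi * WAFER_DIAMETER_MM) / math.sqrt(2 * DIE_AREA_MM2)
good = gross * YIELD

print(f"~{good:.0f} good dies/wafer -> ~${WAFER_COST_USD / good:.0f} per chip")
# ~471 good dies/wafer -> ~$21 per chip
```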

How much do you guys think Nvidia will sell the X2 to Nintendo for???
 
If it's just off-the-shelf parts, then at least the price will come down quicker.

Or it means they are burning through old Nvidia X1 stock for the first year or two and will switch to the X2 later at the same price.

Compatibility should be 100%, and it should even be possible to make old software run better on the new machine.
 

We're actually supposed to believe that the old tablet chip is better at sustaining performance at higher clocks than the TX1, and that this tablet does not throttle at all? I don't. Run a game on it and see if it doesn't throttle.
 

LordOfChaos

Member
We're actually supposed to believe that the old tablet chip is better at sustaining performance at higher clocks than the TX1, and that this tablet does not throttle at all? I don't. Run a game on it and see if it doesn't throttle.

750MHz was the throttled clock; iirc full GPU speed is like 1050MHz or something.

Our own MDave showed the TX1 Shield dropped to about 1700MHz CPU, 700MHz GPU under sustained load. Still above where the Switch landed on CPU, and about right for GPU, but obviously Nintendo chose clocks that would absolutely 100% never be dipped below. And in a smaller form factor than the Shield, of course.
 