
Confirmed: The Nintendo Switch is powered by an Nvidia Tegra X1

Status
Not open for further replies.

koss424

Member
Now I'm curious where Nintendo's enormous R&D spend has gone the past several years. Did they just trash-bin multi-year, multi-billion-dollar concepts to ship what basically amounts to an off-the-shelf Chinese Android tablet with some slot-on Wiimotes?

This is a silly thread. Is this even new information?
 

KHlover

Banned
They give it a high initial memory footprint so they can add features in later on without it breaking older titles, it's pretty standard. Sony and MS ended up giving some memory back to the games towards the end of the last gen life-cycle once they were done adding features to the OS.

Hopefully the high memory footprint indicates they are going to consider adding memory-hungry features like game recording and in-game voice chat to the system at some point in its lifecycle.

I mean that's not even an assumption, Nintendo have flat out stated that.
 

Afrikan

Member
edit- ok ignore this post, I've been informed.

Since it is stock... isn't this more powerful than what previous X1 believers expected?

Or is Nintendo underclocking the stock settings? But isn't that then customizing?

Or am I totally off with the discussion here.
 

Mercador

Member
Yep, it might be the next Vita. I just hope, for early owners' sake, that the third parties will be there. Zelda might be the best game ever, but if there aren't many games beyond the first-party ones, there will be a gaming drought.

Personally, I'll wait for the next revision before jumping in. I want to see third party support and how it runs on Switch.
 

ZOONAMI

Junior Member
The last time Nintendo had a completely new, built-from-the-ground-up console GPU (not merely semi-custom work based on a PC GPU) was the Flipper for Project Dolphin (GameCube).

Even the Wii U's Latte GPU was pretty much a semi-custom lower-end RV7xx with 32 MB of EDRAM, and the Wii's GPU was just an overclocked Flipper.

Anyway, Flipper was designed by ArtX from about 1998 to 2000. It wasn't based on any Radeon design because there were no Radeon GPUs back in 1998-1999. ArtX had made a PC GPU for Acer and showed it at Comdex 1999 but this didn't go anywhere.
Everything ArtX did went into Flipper for Nintendo. There wasn't anything like it before or after, not counting the Wii's overclocked version (named "Hollywood").

Which is why GC was deceptively powerful. Thing really was a mini beast. Amazing launch price too. I think if it had DVD support it would have had a shot at winning the gen.

Ah sry mobile fucked up the quote.
 

Rodin

Member
Since it is stock... isn't this more powerful than what previous X1 believers expected?

Or is Nintendo underclocking its stock settings? But isn't that then customizing?

Or am I totally off with the discussion here.

They're talking about the die configuration, which is identical to a TX1. They don't know anything about clocks.

The reason why we wanted to see these die shots was to understand if some changes were made (e.g. A53 removed in favor of SRAM for higher peak bandwidth), but it doesn't seem to be the case.
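On the bandwidth point: the peak figure for a memory interface is just bus width times transfer rate. A quick sketch, assuming the 64-bit LPDDR4 interface and 3200 MT/s data rate commonly reported for the Switch (those numbers come from press reporting, not from these die shots):

```python
# Peak theoretical bandwidth = (bus width in bytes) x (transfers per second).
# Assumes the commonly reported Switch memory config: 64-bit LPDDR4 at 3200 MT/s.
def peak_bandwidth_gbs(bus_width_bits: int, transfer_rate_mts: float) -> float:
    bytes_per_transfer = bus_width_bits / 8
    return bytes_per_transfer * transfer_rate_mts * 1e6 / 1e9

print(peak_bandwidth_gbs(64, 3200))  # 25.6 GB/s peak
```

That 25.6 GB/s ceiling is why people hoped for extra on-die SRAM in place of the A53 cluster.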
 
Since it is stock... isn't this more powerful than what previous X1 believers expected?

Or is Nintendo underclocking its stock settings? But isn't that then customizing?

Or am I totally off with the discussion here.

It's underclocked. Devices with the stock TX1 throttle quite a bit at stock clocks, down to around the same values that Nintendo chose to lock in as profiles for consistent performance.
 

Xdrive05

Member
This thing should be able to do some real nice work in portable mode, and it will get a res bump when docked. I hope we see some output from the wider dev community, including western devs who already have engines and previous experience targeting the Maxwell bag of tricks. I'm hoping the existing Vulkan work will translate nicely to this thing too!
 

LordOfChaos

Member
But porting games should be fine, right?

Porting as in to run on the OS and APIs of the Shield? Then yeah, sure, should be possible. I thought the discussion was about replicating the Switch OS on the Shield (which would be near impossible with it being closed off anyway; CEMU doesn't run the Wii U OS, it tries to emulate it as a black box, which is why it's legal)
 
They're talking about the die configuration, which is identical to a TX1. They don't know anything about clocks.

The reason why we wanted to see these die shots was to understand if some changes were made (e.g. A53 removed in favor of SRAM for higher peak bandwidth), but it doesn't seem to be the case.

We have clocks, Eurogamer reported it TWICE and we even have an old SDK leak before they reported an additional change to it.
 

ggx2ac

Member

Reading the Anandtech article, the GPU only lasted 116 minutes on battery power before it throttled below 750 MHz, and that was with ambient temperatures around 15°C to 18°C, cooler than room temperature.

[Chart: GPU clock over time during the GFXBench T-Rex battery rundown on the Shield Tablet]


Also:

It's also notable that relatively little time is spent at the full 852 MHz that the graphics processor is capable of. The vast majority of the time is spent at around 750 MHz, which suggests that this test isn't pushing the GPU to the limit, although looking at the FPS graph would also confirm this as it's sitting quite close to 60 FPS throughout the run.

Plus, it runs very hot internally:

Internally, it seems that the temperatures are much higher than the 45C battery temperature might suggest. We see max temperatures of around 85C, which is edging quite close to the maximum safe temperature for most CMOS logic. The RAM is also quite close to maximum safe temperatures. It definitely seems that NVIDIA is pushing their SoC to the limit here, and such temperatures would be seriously concerning in a desktop PC, although not out of line for a laptop.

Sure, the GPU does 290 GFLOPS at 750 MHz, but it can't do double-rate FP16 like the X1, its A15 CPU is only 32-bit rather than 64-bit, and it uses 64-bit LPDDR3 RAM, which is slower than the Switch's LPDDR4 RAM.

The difference between the Tegra Shield Tablet and the Nintendo Switch in tech is practically a generational leap.
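The GFLOPS numbers being compared here fall out of a simple formula: FP32 throughput = CUDA cores × 2 FLOPs per FMA × clock. A sketch using the commonly cited core counts (192 Kepler cores in the Shield Tablet's K1, 256 Maxwell cores in the X1); the clocks are the ~750 MHz quoted above and the Eurogamer-reported docked figure, not official specs:

```python
def fp32_gflops(cuda_cores: int, clock_mhz: float) -> float:
    # Each CUDA core can retire one fused multiply-add (2 FLOPs) per cycle.
    return cuda_cores * 2 * clock_mhz / 1000

print(fp32_gflops(192, 750))   # 288.0 -> the ~290 GFLOPS quoted for the K1
print(fp32_gflops(256, 1000))  # 512.0 for a TX1 at its nominal 1 GHz
print(fp32_gflops(256, 768))   # ~393 at the Eurogamer-reported docked clock
```

On top of the raw numbers, the X1 can double its rate again in FP16 mode, which the K1 can't do at all.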
 
We have clocks, Eurogamer reported it TWICE and we even have an old SDK leak before they reported an additional change to it.

And the first time they reported the clocks they were described as "final". Yet they came back with a new boost mode.

That's not to say that I think a docked boost mode can happen, especially if the Switch SoC is just a normal 20nm TX1, but the fact that they've essentially come back to say the "final" clocks weren't actually final should show us that nothing is certain until it's been officially confirmed.

Then again Nvidia officially confirmed a custom Tegra chip, so who the hell knows anymore.
 

Donnie

Member
So basically this year's iPhone will be stronger than the Switch.

No, not for gaming. It will still have to stick to around 2 W of power draw in-game, which is less than a third of what the Switch uses. The SoC could theoretically be more powerful if it's using a much smaller process node, but RAM bandwidth will be the big issue; apparently 4GB of 64-bit LPDDR4 at full speed consumes over 2 W on its own.
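A back-of-the-envelope version of that power argument, using only the figures from the post above (the ~2 W phone budget and the "less than a third" relation are the poster's claims, not measurements):

```python
phone_game_draw_w = 2.0  # claimed sustained in-game draw a phone can afford
# "less than a third of what Switch uses" implies the Switch budget exceeds:
switch_min_draw_w = phone_game_draw_w * 3
print(switch_min_draw_w)  # 6.0 W

# And per the post, 64-bit LPDDR4 at full speed draws over 2 W on its own,
# i.e. full-speed RAM alone would eat the entire claimed phone power budget.
ram_full_speed_w = 2.0
print(ram_full_speed_w >= phone_game_draw_w)  # True
```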
 

Rodin

Member
We have clocks, Eurogamer reported it TWICE and we even have an old SDK leak before they reported an additional change to it.

That has nothing to do with what I said. He asked if this was better than what we expected because it's stock, but since this analysis shows no clocks different from the ones we've heard from EG (in fact there are no clocks at all, for obvious reasons), nothing has actually changed. We wanted these shots for other reasons.

The only new thing we know is that there are A53 cores in there, so hopefully they're used to run the OS and the console has full access to the 4 A57 cores to run games.
 
So basically this year's iPhone will be stronger than the Switch.
In synthetic benchmarks, for 10 minutes, before the clock speeds are strangled by the OS to preserve battery life, and with no "complex/traditional" games to use all that horsepower: yes.
 
My question is: if Nintendo had waited until the end of the year to give the Switch a proper launch, stacked with games and a fully functional OS, would they have been able to take advantage of any significant performance/wattage gains from some new upcoming chip?

Or is this the best they could have hoped for at that point anyway?
 

Rodin

Member
My question is: if Nintendo had waited until the end of the year to give the Switch a proper launch, stacked with games and a fully functional OS, would they have been able to take advantage of any significant performance/wattage gains from some new upcoming chip?

Or is this the best they could have hoped for at that point anyway?

Depending on when the design was locked, they could maybe have had the chance to go with the TX2, but the difference wasn't significant enough to justify sticking with a dead platform for 8 more months.
 

Bowl0l

Member
Porting as in to run on the OS and APIs of the Shield? Then yeah, sure, should be possible. I thought the discussion was about replicating the Switch OS on the Shield (which would be near impossible with it being closed off anyways - CEMU doesnt run the Wii U OS, it tries to emulate it as a black box, which is why its legal)
Imagine a quiet launch of Nvidia Shield 2018 with 4GB RAM and the announcement of all Switch games ported to it.
 

tsumineko

Member
After seeing Breath of the Wild running on this thing I really couldn't care less about what the hardware is. Thing is a beast.
 

Spladam

Member
The last time Nintendo had a completely new, built-from-the-ground-up console GPU (not merely semi-custom work based on a PC GPU) was the Flipper for Project Dolphin (GameCube).

Even the Wii U's Latte GPU was pretty much a semi-custom lower-end RV7xx with 32 MB of EDRAM, and the Wii's GPU was just an overclocked Flipper.

Anyway, Flipper was designed by ArtX from about 1998 to 2000. It wasn't based on any Radeon design because there were no Radeon GPUs back in 1998-1999. ArtX had made a PC GPU for Acer and showed it at Comdex 1999 but this didn't go anywhere.
Everything ArtX did went into Flipper for Nintendo. There wasn't anything like it before or after, not counting the Wii's overclocked version (named "Hollywood").

http://www.eetimes.com/document.asp?doc_id=1145794
Very cool post, Amy S, another piece of the very interesting story that is the rise and fall of SGI and its effect on the computer-graphics titans of today.

On Topic: Do we actually have confirmation that there is NO customization on the X1 inside the Switch?
 

badb0y

Member
In synthetic benchmarks for 10 minutes before the clock speeds are strangled by the OS to preserve battery life and with no "complex/traditional" games to use all that horsepower, yes.

It seems like some of you guys haven't read an iPhone review in like 6 years.

http://www.anandtech.com/show/10685/the-iphone-7-and-iphone-7-plus-review/5

Throttling occurs, but not nearly as badly as you claim. It also seems to last 2.5-3 hours while looping benchmarks. All of this is moot anyway, since the moment you detach the Switch from the dock it drops its GPU clocks to something like 35% of the docked speed.
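For what it's worth, the exact ratio from the Eurogamer-reported GPU clocks (307.2 MHz portable, 768 MHz docked, assuming those leaks are accurate) is 40%, in the same ballpark as the "like 35%" above:

```python
docked_mhz = 768.0    # Eurogamer-reported docked GPU clock
portable_mhz = 307.2  # Eurogamer-reported portable GPU clock
ratio = portable_mhz / docked_mhz
print(round(ratio * 100))  # 40 (percent of the docked clock kept in portable mode)
```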
 
After seeing Breath of the Wild running on this thing I really couldn't care less about what the hardware is. Thing is a beast.
After seeing BOTW running on it I wish it were more powerful. It chugs along at 20fps with horrible aliasing/resolution on my TV far too often.
 
So the fact that the SoC is an off-the-shelf part should mean an X2-powered New Switch won't be too difficult for Nintendo to make happen.
 
After seeing BOTW running on it I wish it were more powerful. It chugs along at 20fps with horrible aliasing/resolution on my TV far too often.

There's actually a software bug with the OS at the moment which causes Wi-Fi scanning to use up GPU resources, for some reason. Disabling it stabilizes the FPS quite a bit in a number of games.
 

John Harker

Definitely doesn't make things up as he goes along.
More like 1/4 to 1/3 XBO

This news is hilarious if true. Nintendo charging a premium for mediocre tech whilst adding shit no one wants (joycons)

C'mon, keep that in context.
The Switch is an amazing piece of tech and the Joy-Cons are a game changer.

There has never been an experience like Zelda on day 1 of a system launch, ever. I've played that game on buses, on airplanes, in bed, and on my 60-inch, with all sorts of different access points depending on my mood.

I randomly played Snipperclips with a guy next to me on a flight, kickstand down. It gives you so much flexibility in how you experience your gaming.


The bias makes my head hurt.
 

MisterR

Member
After seeing Breath of the Wild running on this thing I really couldn't care less about what the hardware is. Thing is a beast.

It's a Wii U game that chugs along with frequent frame rate drops. Great game, but certainly doesn't paint the Switch as a beast.
 
There's actually a software bug with the OS at the moment which causes Wi-Fi scanning to use up GPU resources, for some reason. Disabling it stabilizes the FPS quite a bit in a number of games.

I thought Digital Foundry tested this with BOTW and found that it was not true, at least in the case of Zelda. The framerate is downright horrible at times, and sometimes for a long stretch.
 

KingSnake

The Birthday Skeleton
There's actually a software bug with the OS at the moment which causes Wi-Fi scanning to use up GPU resources, for some reason. Disabling it stabilizes the FPS quite a bit in a number of games.

That's fake news. Or placebo. The people who actually tested it properly, like Digital Foundry, found no difference in framerates.

Edit: You can actually test it yourself. Disable Wi-Fi and go to Faron Woods, even better if it rains. There's no difference. You can count the frames.
 

Oregano

Member
Expect it 2-3 years after nvidia introduce it into a new shield.

Nvidia won't give a shit about the Shield line now that they actually have a client to continually supply Tegra chips to. The Switch will probably make more money for them in the next year than they've made on Shield altogether.
 

SuperSah

Banned
I couldn't care less what it's running.

It's impressed me so far. Fast RMX looks incredible (and demonstrably surpasses the Wii U title in fidelity), and the world of Zelda has never looked or felt so good. So sure, it might be "underpowered" this and that, but for a portable console that fits in a ridiculously small body which is largely battery and cooling, I'm so fucking impressed.

Keep it up Nintendo. It's gonna be insane when the games finally reach full stride and they're playable in your damned hands.
 

n0razi

Member
As someone who owns 2 Shields ($200 ea.) for video streaming and home automation... it pains me to buy an almost identical product for $300 that does less.
 
As someone who owns 2 Shields ($200 ea.) for video streaming and home automation... it pains me to buy an almost identical product for $300 that does less.

1) There isn't a Shield Tablet X1 and 2) Shield/Switch are different products that do different things.
 
It definitely doesn't "chug along". Framerate drops are pretty rare.

and...

It's running on a damn handheld.
Dude, 20fps is chugging by anyone's standards. And it does it often: any time there's a lot of grass, and even more so with weather happening.

This thread is discussing the hardware. Yes it's a handheld, and yes BOTW runs like crap (and looks it) in docked mode. They aren't mutually exclusive.
 

tsumineko

Member
Dude, 20fps is chugging by anyone's standards. And it does it often: any time there's a lot of grass, and even more so with weather happening.

This thread is discussing the hardware. Yes it's a handheld, and yes BOTW runs like crap in docked mode. They aren't mutually exclusive.

I have so rarely seen what you're describing. I have no idea why you think it's so frequent. It's mostly only happened in busy towns, never outside. It's pretty consistently 30fps.

The way BOTW runs on a handheld console is incredible, and I just don't know what you expect out of the hardware...
 