
Nintendo Switch: Powered by Custom Nvidia Tegra Chip (Official)

Zedark

Member
Current speculation for the GPU is between 512 and 750 GFlops*, depending on architecture and whether or not there's any upclocking/downclocking going on. The CPU will be more powerful core for core than those in the PS4/XB1, though we have no information on the number of cores. LCGeek has rumored that overall the CPU will be a good deal improved over PS4/XB1.

*Those are Nvidia Flops which supposedly perform a bit better than AMD/GCN Flops, and Tegra apparently gets twice the performance from FP16 code compared to FP32 code (which is generally what XB1/PS4 games use), meaning a fully optimized game could get up to 1-1.5 TF of performance. Apparently UE4 makes heavy use of FP16 code.

No idea how accurate any of this is, but these are currently the best guesses based on the rumors we have and the images of the device.

That's quite a nice amount of power in its own right. For handheld Nintendo games especially, this represents a huge paradigm shift in terms of graphical options.
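
To put rough numbers on that FP16 footnote, here's a minimal back-of-the-envelope sketch of how the doubling works, assuming the rumored 512-750 GFLOPS FP32 range and ideal double-rate FP16 on all shader math (neither of which is confirmed):

```python
# Sketch: the rumored FP32 range doubled under ideal FP16 use.
# Both the GFLOPS figures and "everything runs at FP16" are speculation, not specs.
for fp32_gflops in (512, 750):
    fp16_gflops = fp32_gflops * 2  # double-rate FP16 packing
    print(f"{fp32_gflops} GFLOPS FP32 -> up to {fp16_gflops / 1000:.2f} TFLOPS FP16")

# 512 GFLOPS FP32 -> up to 1.02 TFLOPS FP16
# 750 GFLOPS FP32 -> up to 1.50 TFLOPS FP16
```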
 

dr_rus

Member
Wait, it was confirmed Pascal?!

I mean, why wouldn't it be Pascal if it's launching in March 2017? It also doesn't matter much if it's Pascal or Maxwell if we're talking about the same production process.

*Those are Nvidia Flops which supposedly perform a bit better than AMD/GCN Flops, and Tegra apparently gets twice the performance from FP16 code compared to FP32 code (which is generally what XB1/PS4 games use), meaning a fully optimized game could get up to 1-1.5 TF of performance. Apparently UE4 makes heavy use of FP16 code.

That's not how it works. Some shaders may be OK with using FP16 precision, and for them the throughput will be twice that of FP32, but the share of such shaders compared to those which actually need FP32 precision isn't big - less than 25% of all shaders for sure. So in the end, even a fully FP16-optimized game is looking at twice the math performance on only about a quarter of its code. It's a nice thing to have, especially on a fixed-hardware console platform, but it won't result in games magically running twice as fast.
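
For what it's worth, here's an Amdahl's-law style sketch of that point, treating the 25% figure as an illustrative assumption rather than a measured number:

```python
# If only a fraction of the shader math can run at double-rate FP16, the overall ALU
# speedup is much smaller than 2x. The fractions below are illustrative, not measured.
def overall_speedup(fp16_fraction, fp16_rate=2.0):
    return 1.0 / ((1.0 - fp16_fraction) + fp16_fraction / fp16_rate)

for frac in (0.25, 0.5, 1.0):
    print(f"{frac:.0%} of shader math in FP16 -> {overall_speedup(frac):.2f}x overall")

# 25% of shader math in FP16 -> 1.14x overall
# 50% of shader math in FP16 -> 1.33x overall
# 100% of shader math in FP16 -> 2.00x overall
```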
 

Retrobox

Member
Was it our dear Emily who said XB1/PS4 ports should be feasible? In that case, there is no reason to not believe that now is there?
Of course, some won't give a damn if they have to optimise too much, but things are not sounding that bad right now.
 

Zil33184

Member
Xbox One has about 150 GB/s of bandwidth per TF of floating point performance, whereas Pascal graphics cards tend to have about 30 GB/s per TF. By his logic Nvidia's current line of graphics cards would be horrifically, cripplingly bandwidth constrained, which they're quite obviously not.

What? They quite obviously are.

Also there's a huge difference between having 25GB/s total memory bandwidth and 150GB/s in absolute terms. Bandwidth per pixel is a much more important metric than bandwidth per compute, since the latter is destined to decrease faster and image quality is about work done per pixel. At 900p30 an XBO has to render 43,200,000 pixels. That's less than twice as much as the 27,648,000 pixels NS would have to render at 720p30, and the XBO has 6x memory bandwidth.
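
As a rough illustration of the bandwidth-per-pixel point, here's a quick sketch using the figures from the post above (25 GB/s for the NS in portable mode and ~150 GB/s for the XBO are the numbers being argued over in the thread, not confirmed specs):

```python
# Bandwidth available per rendered pixel, using the bandwidth and resolution figures
# quoted above (both sides of the comparison are thread assumptions, not hardware specs).
def bytes_per_pixel(bandwidth_gbs, width, height, fps):
    pixels_per_second = width * height * fps
    return bandwidth_gbs * 1e9 / pixels_per_second

xbo = bytes_per_pixel(150, 1600, 900, 30)  # ~3470 bytes of bandwidth per rendered pixel
ns  = bytes_per_pixel(25, 1280, 720, 30)   # ~900 bytes per rendered pixel
print(f"XBO: {xbo:.0f} B/px, NS: {ns:.0f} B/px, ratio: {xbo / ns:.1f}x")

# XBO: 3472 B/px, NS: 904 B/px, ratio: 3.8x
```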
 
That's not how it works. Some shaders may be OK with using FP16 precision, and for them the throughput will be twice that of FP32, but the share of such shaders compared to those which actually need FP32 precision isn't big - less than 25% of all shaders for sure. So in the end, even a fully FP16-optimized game is looking at twice the math performance on only about a quarter of its code. It's a nice thing to have, especially on a fixed-hardware console platform, but it won't result in games magically running twice as fast.

Ah is that the case? I recall Thraktor talking about UE4 supporting FP16 code for essentially all pixel shaders, but I admit I'm not too knowledgeable about it.

Either way my main point is that direct flop comparisons don't work that perfectly. Switch potentially will perform better for certain games/tasks than XB1/PS4, while still being below those in raw power.
 

Fredrik

Member
What? They quite obviously are.

Also there's a huge difference between having 25GB/s total memory bandwidth and 150GB/s in absolute terms. Bandwidth per pixel is a much more important metric than bandwidth per compute, since the latter is destined to decrease faster and image quality is about work done per pixel. At 900p30 an XBO has to render 43,200,000 pixels. That's less than twice as much as the 27,648,000 pixels NS would have to render at 720p30, and the XBO has 6x memory bandwidth.
So Switch won't run XB1 ports unless heavily optimized?
Sounds like WiiU all over again. :/
Maybe we should see Switch as a 3DS successor rather than WiiU successor, it'll definitely be an awesome portable console but not quite as awesome as a home console.
 

G.ZZZ

Member
So Switch won't run XB1 ports unless heavily optimized?
Sounds like WiiU all over again. :/
Maybe we should see Switch as a 3DS successor rather than WiiU successor, it'll definitely be an awesome portable console but not quite as awesome as a home console.

Laws of physics what are those.
 

KingBroly

Banned
Was it our dear Emily who said XB1/PS4 ports should be feasible? In that case, there is no reason to not believe that now is there?
Of course, some won't give a damn if they have to optimise too much, but things are not sounding that bad right now.

The big thing about why Switch can handle those ports and why Wii U couldn't is because Wii U's architecture was an archaic, outdated mess. Wii U couldn't handle Frostbite, UE4, Unity 5, DX11, etc. etc. etc.
 

MacTag

Banned
They arent low power though. Lower than ps3 sure but not enough to be portable. The switch is. That alone tells you it isnt an apt comparison.
Jaguar was designed for notebooks and tablets. You could also do a souped-up Tegra for desktop (in fact they exist for servers), but that doesn't mean the architecture wasn't designed for low power draw. All the consoles use essentially mobile chips in them for core components, even if only one actually is mobile, which is why that's a dumb argument to use against NS with no context. The last console that didn't use mobile chips for its CPU or GPU was Wii U (lol), although they still stuck an ARM9 chip in there for security and OS/background ops.
 
I have posted many times what I expect out of this thing. And it's in many ways exciting, disappointing in some others.

Why can't I also comment on the general atmosphere in here, which is in my opinion a gross overestimation of the capabilities of the device? Is it only 'contributing to the discussion' if I agree with it being a powerhouse?
I know you prefer an echo chamber, you made that clear in countless NX threads even before the reveal, but this is still a discussion board. You don't get to decide what does and what doesn't contribute to the discussion simply because you have other thoughts on the matter.

And I wasn't bashing anyone, I was merely reminding everyone that at the same point in time before the Wii U launch, a lot of people made very similar posts about the device's power, with plenty of sourcing and insiders.
None of it panned out. I'm merely drawing some parallels here. Maybe this time will be totally different, but I'm thinking it won't. Feel free to disagree.

I agree that just throwing around the idea that the Switch is going to be a powerhouse doesn't add to the discussion without proper evidence. However, trying to "keep it real because of what happened in the WUST threads" is quite erroneous as well. The situations surrounding the Switch and the Wii U are very different, for a few reasons.

1. The Wii U chip was an unknown entity at the time. There was not very much known about it, aside from a few hints like "using technology found in IBM's POWER-series of CPUs" and "being powered by a custom Radeon GPU." Not very much information could legitimately be deduced from there, and everyone had to go off of approximate real-world performance compared to the Xbox 360. This was further complicated by the change in dev-kit hardware, where very early on, the kits were closer to an Xbox 360 than what actually came of it. And because of this, we had users such as bgassassin concluding from the smallest shreds of evidence that the hardware was going to be more powerful than it actually was. Some users ran with it, some didn't.

2. The Switch has a baseline. This is due to knowledge leaked by proven sources. We know that the dev-kits are running Tegra X1s, and that a custom (possibly Pascal-based) Tegra is going to power the final retail unit. We also know that there are products on the market that use the Tegra X1, most notably the Google Pixel C tablet. It's not that dangerous to deduce that we can expect at a minimum the performance of the Switch SoC being similar to that of the Pixel C when in portable mode. Where it does get a little complicated is trying to figure out the power of the system when docked. We know from sources that have proven themselves to be reliable that the system does produce more power when docked. However, we don't know much more.

3. We don't know how much involvement Nintendo has had in the development of the Switch SoC. It's readily apparent that they had significant involvement in the Wii U MCM, as they wanted to minimize power draw whilst trying to maintain a balanced power envelope and keeping Wii backwards compatibility. The Switch is a different story, as it's a derivation of an architecture they didn't help devise themselves. It's also by a company that they have not produced a retail product with before, so we can't deduce the working relationship between Nintendo and Nvidia. Another thing we know is that since the Wii U architecture was devised, Nintendo's hardware division has been significantly restructured. How that's affected the development pipeline, we don't know. But we can't assume that it's the same as it was before the restructure.

We can also observe a very subtle change in their philosophy, mainly through their marketing and branding (bye, blue and grey Nintendo). But I'm not going to assert that this is a key point in differentiating the situations.

But the point still stands. The Wii U and the Switch situations are not very similar at all. So coming into the thread and asserting that the Wii U is good prior evidence for what can happen isn't conducive to the discussion. Sure, we should make sure not to overhype the power of the system, but at the same time, we can't just take the most negative-Nancy approach either.
 

Astral Dog

Member
So what amount of memory are we expecting for the games? With the rumors that Switch will have 4GB in portable mode and 6 in the dock, what amount could be reserved for the OS?
 

Zil33184

Member
It's not that dangerous to deduce that we can expect at a minimum the performance of the Switch SoC being similar to that of the Pixel C when in portable mode. Where it does get a little complicated is trying to figure out the power of the system when docked. We know from sources that have proven themselves to be reliable that the system does produce more power when docked. However, we don't know much more.

I think the NS/Pixel C comparison is probably spot on. Tegra X1's GPU on the SHIELD Android TV has a peak clock of 1GHz, but in the Pixel C it's downclocked to 850MHz. The Pixel C has a 34.2 Wh battery, so depending on the battery specs for the NS we can guess at any similar downclocks. The dock should increase performance by allowing the NS to reach 1GHz GPU clock.
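
For reference, a quick sketch of what those two clocks mean in FP32 throughput, using the publicly known 256-core Maxwell configuration of the Tegra X1; applying these clocks to the NS docked/portable split is still an assumption:

```python
# FP32 throughput of a 256-core Tegra X1 GPU at the two clocks mentioned above.
# Core count is the public X1 spec; mapping the clocks onto the NS is speculation.
CUDA_CORES = 256
FLOPS_PER_CORE_PER_CLOCK = 2  # one fused multiply-add counts as two FP32 ops

for label, mhz in (("1000 MHz (SHIELD TV / docked?)", 1000),
                   ("850 MHz (Pixel C / portable?)", 850)):
    gflops = CUDA_CORES * FLOPS_PER_CORE_PER_CLOCK * mhz / 1000
    print(f"{label}: {gflops:.0f} GFLOPS FP32")

# 1000 MHz (SHIELD TV / docked?): 512 GFLOPS FP32
# 850 MHz (Pixel C / portable?): 435 GFLOPS FP32
```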
 

cheezcake

Member
Its good for the size of the screen, and remember, cant be too Power Hungry!

Yeh true actually, for a 6-7 inch screen that's 209-244 ppi, which isn't high by modern standards but it isn't too low either. Given the resolution of most modern phones is >720p with smaller form factors, I don't really buy the power hungry argument; it's probably more of a cost issue.
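
Those ppi figures check out; here's a tiny sketch of the math for a 720p panel at the screen sizes being discussed (the 6.2" size is a rumor, not confirmed):

```python
# Pixel density of a 1280x720 panel at a few plausible diagonal sizes.
import math

def ppi(width_px, height_px, diagonal_inches):
    return math.hypot(width_px, height_px) / diagonal_inches

for size in (6.0, 6.2, 7.0):
    print(f'{size}" 720p panel: {ppi(1280, 720, size):.0f} ppi')

# 6.0" 720p panel: 245 ppi
# 6.2" 720p panel: 237 ppi
# 7.0" 720p panel: 210 ppi
```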
 

Rodin

Member
So what amount of memory are we expecting for the games? With the rumors that Switch will have 4GB in portable mode and 6 in the dock, what amount could be reserved for the OS?

I think it will be 6GB LPDDR4 total, with 1.5-2GB reserved for the OS (so 4-4.5GB for games), plus 4MB SRAM on die. It won't have bandwidth issues, that's for sure.
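
For context on the bandwidth side, here's a rough LPDDR4 sketch; the 64-bit bus width and 3200 MT/s data rate are illustrative assumptions, not known Switch specs, but the 64-bit case lines up with the ~25 GB/s figure discussed earlier in the thread:

```python
# Peak theoretical LPDDR4 bandwidth: bus width (in bytes) times data rate.
# Bus width and data rate below are assumed values for illustration only.
def lpddr4_bandwidth_gbs(bus_width_bits, data_rate_mts):
    return (bus_width_bits / 8) * data_rate_mts * 1e6 / 1e9

print(f"64-bit @ 3200 MT/s:  {lpddr4_bandwidth_gbs(64, 3200):.1f} GB/s")   # 25.6 GB/s
print(f"128-bit @ 3200 MT/s: {lpddr4_bandwidth_gbs(128, 3200):.1f} GB/s")  # 51.2 GB/s
```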

Yeh true actually, for a 6-7 inch screen that's 209-244 ppi, which isn't high by modern standards but it isn't too low either. Given the resolution of most modern phones is >720p with smaller form factors, I don't really buy the power hungry argument; it's probably more of a cost issue.

~240dpi (the screen is 6.2") is within spitting distance of the iPad Pro/Air's retina 264ppi and you keep both at roughly the same distance so 720p is perfect for a device this size. Phones have to render webpages and whatsapp, not Zelda.
 
Yeh true actually, for a 6-7 inch screen that's 209-244 ppi, which isn't high by modern standards but it isn't too low either. Given the resolution of most modern phones is >720p with smaller form factors, I don't really buy the power hungry argument; it's probably more of a cost issue.

Most phones aren't rendering complex 3D games at 720p though, so it's most certainly a power issue. It's not the power requirement for the screen itself, it's the requirement of the GPU to render at the screen's native resolution.
 

cheezcake

Member
~240dpi (the screen is 6.2") is within spitting distance of the iPad Pro/Air's retina 264ppi and you keep both at roughly the same distance so 720p is perfect for a device this size. Phones have to render webpages and whatsapp, not Zelda.

The distance you'll be using it at would generally be more akin to an iPad mini though, which is 326 PPI. 240 is certainly not bad though.

Most phones aren't rendering complex 3D games at 720p though, so it's most certainly a power issue. It's not the power requirement for the screen itself, it's the requirement of the GPU to render at the screen's native resolution.

You don't have to render at the screen native resolution though
 

z0m3le

Banned
What? They quite obviously are.

Also there's a huge difference between having 25GB/s total memory bandwidth and 150GB/s in absolute terms. Bandwidth per pixel is a much more important metric than bandwidth per compute, since the latter is destined to decrease faster and image quality is about work done per pixel. At 900p30 an XBO has to render 43,200,000 pixels. That's less than twice as much as the 27,648,000 pixels NS would have to render at 720p30, and the XBO has 6x memory bandwidth.

Your assumption is X1, but everything about the final hardware points to Pascal. There actually isn't much of a reason for Nintendo to go with Maxwell at all: they would save more money with Pascal since it is mass produced, and it has lower power requirements (smaller battery needed and less cooling), not to mention that 20nm Maxwell was a one-off and is quite dead today, so they would have had to pay to shrink Maxwell to 16nm or lower.

There is a very particular set of reasons why Wii U turned out like it did. NEC did the fabbing of the GPU and eDRAM; the eDRAM pool was extremely large compared to what we thought it would be, because NEC could only do it at 45nm at that time, which forced the GPU to exist at 45nm. They were using an old and dead CPU architecture which they updated themselves, costing whatever that did. Wii U was also created with BC in mind, whereas this is foregoing BC. And hell, even X1 has about half the graphical processing power that XB1 has; it mainly lacks bandwidth, which is where Nintendo would historically spend money - both the 3DS and Wii U have great memory bandwidth, with custom solutions "designed" by Nintendo to get there.

There are some clear differences between Wii U's design goals and NS's which will, at the end of the day, make a very large difference to what we can expect.
 

Astral Dog

Member
Yeh true actually, for a 6-7 inch screen that's 209-244 ppi, which isn't high by modern standards but it isn't too low either. Given the resolution of most modern phones is >720p with smaller form factors, I don't really buy the power hungry argument; it's probably more of a cost issue.

You have to consider that resolution doesn't exist in a bubble. When the Vita appeared with its amazing OLED screen at 544p, it struggled to run games at high framerates and native resolution. Switch is a system designed for playing complex games, so it can't be compared directly to top-of-the-line smartphones or mobile devices on every feature.

Plus, look at Switch like this: the resolution and graphics jump from the 3DS is tremendous :)
 

Reallink

Member
It could be, but "the world's top-performing Geforce graphics cards" run on the Pascal architecture, not Maxwell. It strongly implies it's based on their newest tech.

The 980ti (Maxwell) is still the third fastest GPU in the world, which certainly qualifies as "top-performing GeForce gaming graphics cardS" (emphasis on the plural S). This quote means and says absolutely nothing. If anything it suggests it's NOT Pascal, else they would have used the more boisterous singular - top or fastest "card" in the world. Come to think of it, the plural vagueness is actually very out of character for Nvidia; they speak in absolutes (best, fastest, most advanced, etc.). In limiting the subset to GeForce cards and using the plural, I think this may very well be confirmation it's Maxwell. It makes no sense not to have used the singular if it were Pascal; Volta has no bearing on what they say or do in October of 2016. It doesn't even exist.
 

z0m3le

Banned
The 980ti (Maxwell) is still the third fastest GPU in the world, which certainly qualifies as "top-performing GeForce gaming graphics cardS" (emphasis on the plural S). This quote means and says absolutely nothing.

That is why you don't hang your logic on their quote. The reason it doesn't say Pascal is because Volta will be coming out right around that time, and they don't want the device to sound outdated right after it is released, right?

There are obvious choices being made. Anyways, you guys enjoy digging around WUST and translating it to Switch. Hell, even Maxwell would port any XB1 game over fine as long as the memory bandwidth is there; PS4 Pro isn't the only device capable of checkerboard rendering.
 

nordique

Member
I think you guys need to be careful about associating that press release and GeForce to the switch. Anyone remember the Wii U sharing the same cpu as "Watson?" How did that turn out? Just saying that was a mess! You have to understand a press release is designed to get you excited while also being vague at the same time... at least when it comes to Nintendo hardware it seems.

The thing is, with the Wii U, people who knew the specs did say it would be absolutely nowhere near the Watson chip, and they tempered expectations by saying it was weaker than the last-gen 360 CPU.

This is different, NateDrake might end up being wrong sure but according to his sources it is Pascal based. Or at least the final product will be.

In the end it matters little, as performance will be enough to get ports (it supports UE 4.0) and it will still be below XBO/PS4 in power regardless of Maxwell or Pascal architecture.

The best thing about it being Pascal would be related to longer battery life, so hopefully that is in the final console vs the dev kits.


It being based on an X1 even means it will be a pretty solid jump from the Wii U GPU (more than what we got from GameCube to Wii, which was a negligible difference for the most part).
 

Kurt

Member
Sure, it would be nice. But not at the expense of what little battery life this thing already has.

Agreed, but many people already think that the battery life is official. Also, I don't care if it's 720 or 1080; it doesn't add that much to a game. I'd prefer something unique, which I still expect in the final review.
 

Lonely1

Unconfirmed Member
Let's put it this way: even on the lower end of predictions, speculation and rumors, the Switch will be closer to the XBOne than the Vita was to the PS3 in all aspects but RAM amount. And a lot of people say that the Vita has PS3-like graphics, and it even got a number of PS3/360 downports!

Plz, stop with the WUST thing.

Probably like a bit less than half the power of Xbox One, a bit less than third of PS4. Roughly speaking.

EDIT: I'm probably way off lol.

That's my take as well.
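
Treating those fractions as a quick sanity check against the rumored 512-750 GFLOPS range from earlier in the thread (the XB1 and PS4 figures below are the commonly cited FP32 numbers; the Switch figures remain speculation):

```python
# How the rumored Switch FP32 range compares to the commonly cited XB1/PS4 figures.
# The Switch numbers are thread speculation, not confirmed specs.
XB1_TFLOPS, PS4_TFLOPS = 1.31, 1.84

for switch_gflops in (512, 750):
    tf = switch_gflops / 1000
    print(f"{switch_gflops} GFLOPS: {tf / XB1_TFLOPS:.0%} of XB1, {tf / PS4_TFLOPS:.0%} of PS4")

# 512 GFLOPS: 39% of XB1, 28% of PS4
# 750 GFLOPS: 57% of XB1, 41% of PS4
```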
 
The reason I believe it will be Pascal-based is that Nintendo would be interested in the more power-efficient chip.

Even if it doesn't end up happening, I totally expect a revision down the line that has a Pascal chip.
 

Schnozberry

Member
Then what's the point of having a 1080p screen, if you're just rendering games at 720p? Slightly better PPI for Netflix/web browsing? It doesn't really make sense to do.

Multi-res shading would allow them to render the middle portion of the screen at full resolution and the edges of the screen at a lower resolution to improve performance. It's an Nvidia feature that is exclusive to Maxwell and Pascal GPUs. It was implemented on PC for Shadow Warrior 2. The Digital Foundry video on SW2 goes into some depth about it.

It would be useful for 1080p TV's for sure.
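
To give a feel for the kind of savings multi-res shading can offer, here's a toy estimate; the centre viewport covering 60% of each axis and the half-rate borders are illustrative assumptions, not Nvidia's actual settings:

```python
# Toy multi-res shading estimate: full shading rate in a centre viewport, one quarter of
# the pixels (half rate per axis) in the border regions of a 1080p frame.
WIDTH, HEIGHT = 1920, 1080
CENTRE_FRACTION = 0.6      # centre viewport spans 60% of each axis (assumption)
BORDER_PIXEL_RATE = 0.25   # borders shaded at half resolution per axis (assumption)

centre = (WIDTH * CENTRE_FRACTION) * (HEIGHT * CENTRE_FRACTION)
border = WIDTH * HEIGHT - centre
shaded = centre + border * BORDER_PIXEL_RATE
print(f"Shaded pixels: {shaded / (WIDTH * HEIGHT):.0%} of a full-resolution 1080p pass")

# Shaded pixels: 52% of a full-resolution 1080p pass
```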
 

dcx4610

Member
I'd be shocked if it's even on par with the Xbox One. There's only so much power you can fit into a tablet. As long as it's improved over the Wii U and is in the realm of power of the XB1, I think that's fine. It'll at least get ports at that point.

It won't be the best version but the upside is that it's portable. I see a lot of double dipping on games if it gets decent 3rd party support.
 

tuxfool

Banned
Ah is that the case? I recall Thraktor talking about UE4 supporting FP16 code for essentially all pixel shaders, but I admit I'm not too knowledgeable about it.

Either way my main point is that direct flop comparisons don't work that perfectly. Switch potentially will perform better for certain games/tasks than XB1/PS4, while still being below those in raw power.

The mobile code paths of UE4 do give up features present in the main branch in order to accommodate FP16. For mobile games it is a good trade-off, but fully featured games probably wouldn't consider it.

Multi-res shading would allow them to render the middle portion of the screen at full resolution and the edges of the screen at a lower resolution to improve performance. It's an Nvidia feature that is exclusive to Maxwell and Pascal GPUs. It was implemented on PC for Shadow Warrior 2. The Digital Foundry video on SW2 goes into some depth about it.

It would be useful for 1080p TV's for sure.

You'll have to note that they recommend multi-res shading if you're running at resolutions higher than 1080p. Multi-res shading also works better if the screen you use is larger in your FOV. A TV or a small mobile screen fits much more readily into the center of your field of view, making multi-res shading work less well.
 

KingSnake

The Birthday Skeleton
Would a custom X1 really even be a realistic option for the retail unit, though?

Who will produce these chips? We had a lot of reports in the past years about how the 20nm chips are quickly skipped in favour of 16nm by most of the TSMC customers.

We have this article from January:

http://www.extremetech.com/computin...-10nm-production-this-year-claims-5nm-by-2020

According to TSMC’s recent conference call, it expects demand for 20nm to drop sharply as companies transition to 16nm and FinFETs. The foundry believes its total market share of the 16nm node will rise from roughly 40% in 2015 (much of this is likely Apple), to above 70% in 2016. Since we know Samsung and GlobalFoundries will be fabbing chips for AMD, Apple, and Qualcomm, TSMC’s prediction implies its customers have grabbed the design wins they’re going to win.

Plus:

http://www.tsmc.com/uploadfile/ir/quarterly/2016/3tBfm/E/TSMC 3Q16 transcript.pdf

As to our 16-nanometer FinFET, the defect density and cycle time continue to improve and are very competitive. In addition to mobile application processor, other applications such as cellular baseband, graphic processor for video game, AR/VR, deep learning and AI have strongly adopted our 16-nanometer solutions. As a result, our 16-nanometer business this year is expected to become more than five-fold of this level compared to last year.

Going through TSMC's earnings reports and calls, they're barely even mentioning the 20nm segment and when it is, it's bundled with 16nm.

To preempt Hoo-doo, yes, I know that Nintendo didn't follow the logical path with Wii U and went for the old fab process, but that did create them a lot of trouble later in Wii U's life, so one would hope they won't ignore the fact that 20nm is almost dead in the near future.
 

99Luffy

Banned
Then what's the point of having a 1080p screen, if you're just rendering games at 720p? Slightly better PPI for Netflix/web browsing? It doesn't really make sense to do.
Low pixel density can be distracting. PPI doesn't change when you're playing lower resolution content.

That said, I think 720p at 7 inches will be fine. I can browse on my 720p 12" chromebook all day.
 

Reallink

Member
Would a custom X1 really even be a realistic option for the retail unit, though?

Who will produce these chips? We had a lot of reports in the past years about how the 20nm chips are quickly skipped in favour of 16nm by most of the TSMC customers.

We have this article from January:

http://www.extremetech.com/computin...-10nm-production-this-year-claims-5nm-by-2020



Plus:

http://www.tsmc.com/uploadfile/ir/quarterly/2016/3tBfm/E/TSMC 3Q16 transcript.pdf



Going through TSMC's earnings reports and calls, they're barely even mentioning the 20nm segment and when it is, it's bundled with 16nm.

To preempt Hoo-doo, yes, I know that Nintendo didn't follow the logical path with Wii U and went for the old fab process, but that did create them a lot of trouble later in Wii U's life, so one would hope they won't ignore the fact that 20nm is almost dead in the near future.

Preempting the counterpoint doesn't invalidate it. AFAIK they also produced the Wii at 90nm for the entirety of its life, never shrinking to 65nm or 45nm. When there is a history of illogical HW decisions, "potentially creating trouble" isn't a sufficient argument against it.
 

XINTSUAI

Neo Member
I mean, why wouldn't it be Pascal if it's launching in March 2017? It also doesn't matter much if it's Pascal or Maxwell if we're talking about the same production process.



That's not how it works. Some shaders may be OK with using FP16 precision, and for them the throughput will be twice that of FP32, but the share of such shaders compared to those which actually need FP32 precision isn't big - less than 25% of all shaders for sure. So in the end, even a fully FP16-optimized game is looking at twice the math performance on only about a quarter of its code. It's a nice thing to have, especially on a fixed-hardware console platform, but it won't result in games magically running twice as fast.

And PS4 also can work with FP16.
 

KingSnake

The Birthday Skeleton
Preempting the counterpoint doesn't invalidate it. AFAIK they also produced the Wii at 90nm for the entirety of its life, never shrinking to 65nm or 45nm. When there is a history of illogical HW decisions, "potentially creating trouble" isn't a sufficient argument against it.

Weren't the first 65nm GPUs only produced in 2007?
 

pulsemyne

Member
[attached image: Untitled.png]

That's not the only interesting part from that site. If you go into detail on the test, it says the display used was a 24-inch 5-point multitouch display. Now that's not going to be for a car, as their screens are smaller, BUT that does sound like the kind of thing put into a display used to promo a games system at a store. Basically, a big Switch setup designed to show people its games.
 
Preempting the counterpoint doesn't invalidate it. AFAIK they also produced the Wii at 90nm for the entirety of its life, never shrinking to 65nm or 45nm. When there is a history of illogical HW decisions, "potentially creating trouble" isn't a sufficient argument against it.

There's actually not much evidence that many Wii Us were manufactured beyond the original launch allotment. Nintendo's forecast for the 1st year of Wii U sales is still less than the actual life to date sales if I remember correctly. Not much incentive to do a die shrink on a dead product.

Wii U was also stuck with ancient architecture due to Nintendo's insistence on hardware based BC. The Switch isn't going to be held back by any such requirements.
 

XINTSUAI

Neo Member
People are forgetting the size of the cartridges; how many gigabytes would they hold? 8? 12 GB? GTA6 will easily go to 50 GB.

This takes us to RAM: PS4 and XOne both have 5 GB (just for the game). If Switch has something like 4 GB of RAM, probably only 3 will be available to the game itself...

GameCube and its minidiscs all over again.
 

linkboy

Member
People are forgetting the size of the cartridges; how many gigabytes would they hold? 8? 12 GB? GTA6 will easily go to 50 GB.

This takes us to RAM: PS4 and XOne both have 5 GB (just for the game). If Switch has something like 4 GB of RAM, probably only 3 will be available to the game itself...

GameCube and its minidiscs all over again.

The size of the card (these aren't cartridges) has no bearing on the amount of storage it has. MicroSD cards are small as can be and can hold 200GB.

 

Lonely1

Unconfirmed Member
People are forgetting the size of the cartridges; how many gigabytes would they hold? 8? 12 GB? GTA6 will easily go to 50 GB.

This takes us to RAM: PS4 and XOne both have 5 GB (just for the game). If Switch has something like 4 GB of RAM, probably only 3 will be available to the game itself...

GameCube and its minidiscs all over again.

32GB for the cartridges is the rumour.
 

Reallink

Member
There's actually not much evidence that many Wii Us were manufactured beyond the original launch allotment. Nintendo's forecast for the 1st year of Wii U sales is still less than the actual life to date sales if I remember correctly. Not much incentive to do a die shrink on a dead product.

Wii U was also stuck with ancient architecture due to Nintendo's insistence on hardware based BC. The Switch isn't going to be held back by any such requirements.

I said Wii (as in original Wii), not Wii U. While PS3 and Xbox 360 did 65nm and 45nm revisions, if I'm not mistaken Nintendo was still doing 90nm all the way through to the end of production in '12 or '13.
 

Astral Dog

Member
People are forgetting the size of the cartridges; how many gigabytes would they hold? 8? 12 GB? GTA6 will easily go to 50 GB.

This takes us to RAM: PS4 and XOne both have 5 GB (just for the game). If Switch has something like 4 GB of RAM, probably only 3 will be available to the game itself...

GameCube and its minidiscs all over again.

Well, nobody is actually expecting the Switch to compete with the specs or hardware of the two home HD systems. It still has a lot of handheld limitations, which is why I expect many 720p games that will help with compression.

As for the game carts, I think the lowest was supposed to be 32GB; no way they will be that limited.
 

Hermii

Member
hmm that's a little disappointing. 1080p would be nice.
It shouldn't be higher than what the hardware can actually render at native res. 720p on a 1080p screen looks much worse and draws more battery than 720p on a native 720p panel. A dedicated game system is not a phone or a tablet.
 
I said Wii (as in original Wii), not Wii U. While PS3 and Xbox 360 did 65nm and 45nm revisions, if I'm not mistaken Nintendo was still doing 90nm all the way through to the end of production in '12 or '13.

Oh, I totally misread that. The 90nm should have given it away I guess.

Anyway it was never confirmed but the Wii chipset supposedly received die-shrinks to 55 and then 40nm over time according to Eurogamer.

http://www.eurogamer.net/articles/df-hardware-nintendo-wii-mini-review
 

XINTSUAI

Neo Member
Well, nobody is actually expecting the Switch to compete with the specs or hardware of the two home HD systems. It still has a lot of handheld limitations, which is why I expect many 720p games that will help with compression.

As for the game carts, I think the lowest was supposed to be 32GB; no way they will be that limited.

32GB for the cartridges is the rumour.

Well, 32 GB sounds like enough. My doubt is about games of that size on a mobile device with the same 32GB of internal storage, where obviously some of those gigabytes will be dedicated to the OS... so those games couldn't be sold on the online store. OK, I could buy an SD card with 200 GB for extra storage, but that fragments the install base for digital purchases. Look at the Vita: its games have a maximum of 3.5 GB so that they fit on the SD card with the least available memory, meaning everyone who has a card can buy the game.

These are subjects related to the business model.

I think Switch will be much more like a mobile device (tablet, Android, smartphone) than a home console in terms of game library.
 
I wonder if the TV mode will support 4K. I own the Nvidia Shield TV and you could get reasonable framerates on The Talos Principle downscaled from 4K, and the Switch should be stronger than that, so indies should have a decent chance of running at 4K.

I personally don't think it will, and even as a 4K TV owner, I don't care all that much, especially since it's a portable device. What I DO hope Switch has, though, is hardware upscaling like the Xbox One S. That will really help with input lag compared to relying on the television's upscaler.
 