
Nintendo Switch: Powered by Custom Nvidia Tegra Chip (Official)

I think you're right. I would be OK with this...

If they don't get big third-party games (things like Overwatch, CoD, and EA games), I don't see their strategy, unless they're going for a full evergreen strategy across all games.

So assuming we get things like Dragon Quest Builders, World of Final Fantasy, Digimon, maybe Rise of the Tomb Raider (360 version) and Modern Warfare Remastered, what else does Nintendo do going forward? They can't survive on random, small Japanese games in the west. It would be like the Vita.

The only other option I see is trying to have most publishers create evergreen titles: fewer games overall, but games that can sell over the entire generation of the Switch. Rainbow Six Siege would be an example. CS:GO is another.

Interesting strategy if so, and one I think the market eventually adopts. But for now, I'm not sure it will work.
 
Also, if this was so powerful, with Xbox One-level graphics, Nintendo would have shown off The Witcher 3 and Infinite Warfare over Splatoon and Skyrim, both of which don't look much different from what we can have on Wii U, by the way.

I'm pretty sure the choice of Skyrim over a bunch of other games is more complicated than simply showing off graphical power. Witcher 3 is unlikely to get a late port (same with Fallout 4), and the Switch getting your bog-standard CoD is not remotely surprising (even the Wii U got a couple of CoD games). Skyrim Remastered, on the other hand, isn't even released yet and is likely getting a launch port, making it pretty much the immediate go-to title for Nintendo to say "hey, we're actually getting Bethesda titles, this is a big deal!" Also, Splatoon is more palatable for general audiences, was a pretty fucking big success on a near-dead console, and is pretty much the best-selling shooter in Japan by a massive margin; of course Nintendo would put it up front and centre.

Seriously, you're reading too much into it.
 

Oregano

Member
Tablet share seems to be over 10% and growing, though. And of course it would make them accessible on most home PCs.

Which part do you mean? The Shields I've seen look more like a DS without the bottom screen, or GBASP.

The Shield Tablet.

shield-tablet-controller-header-image.png
 

Reki

Member
So I was wondering something. It has been said that the new architecture and Nvidia's involvement are good signs for third party developers. But I was thinking about the learning curve for the internal studios, as they were developing for pretty different machines not so long ago.

I'm not an expert, so just ignore the question if it's silly, but can't the learning time negatively impact first-party projects (as the transition to HD development did on Wii U)?
 

z0m3le

Banned
Setting aside uncertainty about RAM, and assuming the chips run at their standard clock (which is perhaps possible when docked), the power question boils down to three uncertainties:

1. What Tegra chip will be used (Tegra X1 or Tegra Parker)?
2. How does the power measurement (known as FLOPS) translate between AMD and NVIDIA cards? NVIDIA's FLOPS tend to be 'stronger', i.e. the architecture allows the FLOPS to be realised more efficiently; a ratio of approximately 4:3 in favour of NVIDIA is a figure I found in another NeoGAF thread. We want AMD-equivalent FLOPS, because the Xbox One and PS4 figures we compare against (1.31(?) TFLOPS and 1.84 TFLOPS respectively) are AMD numbers.
3. There are two ways (actually, there are more, but without loss of much accuracy we can say there are two) to do computations: FP32 and FP16. The first is slower but more accurate and the second is faster (roughly twice, in fact) but less accurate. Finding a balance between accuracy and speed in this method can increase the FLOPS rate (using only FP16 would give twice the number as compared to using only FP32, for example).

Tegra X1 has 512 GFLOPS for FP32, and Tegra Parker has roughly 750 GFLOPS FP32. If we could, for example, use FP16 for 1/3 of all computations, then we gain a 20% increase in FLOPS.* So you see there can be a significant increase. Xbox One and PS4 cannot use this so-called mixed-precision computation (PS4 Pro can, that's where the rumours about PS4 Pro doing 8.4 TFLOPS come from), so the Switch has a potential advantage in power in this regard.
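The 20% figure can be checked with a simple throughput model (a sketch only, assuming FP16 ops run at exactly twice the FP32 rate, as with Tegra X1's packed FP16; the function name is mine):

```python
def mixed_precision_multiplier(fp16_fraction: float) -> float:
    """Effective FLOPS multiplier when a fraction of the work runs at FP16.

    Model: FP16 ops finish at twice the FP32 rate, so one unit of work
    takes (1 - f) + f/2 units of time; throughput is the reciprocal.
    """
    return 1.0 / ((1.0 - fp16_fraction) + fp16_fraction / 2.0)

print(mixed_precision_multiplier(1 / 3))  # 1/(2/3 + 1/6) = 1.2, i.e. a 20% gain
print(mixed_precision_multiplier(1.0))    # all-FP16 gives the full 2x
```

The assumed 1/3 FP16 share is the only knob here; real games would land at different fractions.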

Let's do a calculation: assuming the Tegra X1, we have 512 GFLOPS of NVIDIA FP32. Assuming also that 1/3 of the computations can be done in FP16 (no particular reason for this number, though some more technically-schooled GAFfers called it plausible; it differs on a game-by-game basis), which as mentioned yields a 20% gain, we can compute the comparison between the two as follows:
Switch power = 512 * 1.20 * 4/3 = 819 GFLOPS (the 4/3 is the NVIDIA-to-AMD ratio I mentioned before). This resulting number is how the Switch actually compares in power to the PS4 and Xbox One (which are, respectively, 1.84 TFLOPS and 1.31(?) TFLOPS; remember 1 TFLOPS = 1000 GFLOPS).

If, on the other hand, the Switch uses a Tegra Parker, then the power will be (with the usual caveats that we are guessing a lot of numbers):
Switch power = 750 * 1.2 * 4/3 = 1200 GFLOPS = 1.20 TFLOPS. So, you see that using the latter setup, the Switch could be very close in power to the Xbox One. Remember, though, that the gain from FP16 computations is just a guess, as well as the ratio between NVIDIA and AMD FLOPS (the effect, though, is very real, just not numerically determined).

About the clock I mentioned: there is a standard clock value (the Tegra X1 has it at 1 GHz), and the FLOPS rate scales linearly with it (halving the clock halves the FLOPS rate). In handheld mode, the clock will go down, as this reduces heat and therefore extends battery life. In docked mode, however, active cooling could possibly allow the chip to run at full clock speed and therefore reach the power values I determined above.
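Putting the whole back-of-the-envelope estimate in one place (a sketch only; the 4:3 NVIDIA-to-AMD ratio, the 20% FP16 gain, and the clock fractions are all guesses from this thread, not known specs):

```python
NVIDIA_TO_AMD = 4 / 3  # guessed efficiency ratio of NVIDIA vs AMD FLOPS
FP16_GAIN = 1.2        # guessed gain from doing ~1/3 of the work in FP16

def amd_equivalent_gflops(fp32_gflops: float, clock_fraction: float = 1.0) -> float:
    """AMD-equivalent GFLOPS; the FLOPS rate scales linearly with clock."""
    return fp32_gflops * clock_fraction * FP16_GAIN * NVIDIA_TO_AMD

for name, gflops in [("Tegra X1", 512.0), ("Tegra Parker", 750.0)]:
    docked = amd_equivalent_gflops(gflops)         # full clock in the dock
    handheld = amd_equivalent_gflops(gflops, 0.5)  # e.g. half clock on battery
    print(f"{name}: ~{docked:.0f} GFLOPS docked, ~{handheld:.0f} at half clock")
# Compare against the Xbox One (~1310 GFLOPS) and PS4 (1840 GFLOPS), both AMD numbers.
```

The half-clock handheld figure is purely illustrative; we have no idea what the real portable clock will be.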

Disclaimer: everything here comes from someone who is not a computer engineer (yet), so there might be something wrong in my explanation. If someone spots an error (remember, though, that this is purely a FLOPS determination: we simply don't know how RAM and other factors will play into the equation), please correct me.

TL;DR/Conclusion: Depending on many factors, the Switch can possibly be very close to the Xbox One for at least a number of games, but that assumes a lot of things we simply do not know, things that often vary on a game-by-game basis. On the other end of the spectrum, the power could be roughly half that of the Xbox One, so there is a lot we do not know and a large margin for error.

*: See Thraktor's post (#1551) to see how this gain can be calculated.

This also lines up with Emily's blog from April, where she wrote: "In terms of raw power, numerous sources tell me that NX is much closer to Xbox One than PlayStation 4. Even that might be stretching it a tiny bit." With her being 100% right on NS so far, I'm going to put a lot of weight behind this.

We don't know the final clocks, so it might not be quite that high, but we are talking a smaller gap from NS to XB1 than from XB1 to PS4, which is a very interesting place to be if it pans out. When docked, there isn't much of a price reason it couldn't hit those numbers; undocked, it could reach as high as 600 GFLOPS, and punching that into your formula above gives between 900 GFLOPS and 1 TFLOPS on the go. With the lower resolution and some graphical drawbacks, that could certainly handle current-gen gaming on the go.
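Punching that guessed 600 GFLOPS portable figure through the same formula (the 1.2 FP16 gain and the 4:3 ratio are the thread's assumptions, not specs):

```python
# Guessed undocked FP32 figure run through the formula from the post above.
portable_fp32 = 600  # GFLOPS, a guess for undocked clocks
amd_equivalent = portable_fp32 * 1.2 * 4 / 3
print(f"~{amd_equivalent:.0f} GFLOPS AMD-equivalent on the go")  # lands around 960
```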

If it's the X1, well... the Shield TV was passively cooled and this is not. It also doesn't mean they didn't shrink the process node; I can't imagine tens of millions of units being produced at 20nm if they have a choice. 16nm is cheaper and creates less heat, so even the X1 might end up with 600+ GFLOPS before punching in your formula.
 
So I was wondering something. It has been said that the new architecture and Nvidia's involvement are good signs for third party developers. But I was thinking about the learning curve for the internal studios, as they were developing for pretty different machines not so long ago.

I'm not an expert, so just ignore the question if silly, but can't the learning time impact negatively the first party projects (as did the transition to HD development when working on Wii U)?

This is something I was thinking about in September when people were wondering why the reveal took so long. Nintendo really seemed to know the ins and outs of the PowerPC architecture, and while most third parties have had lots of experience with Nvidia hardware, it's very unlikely that Nintendo's internal developers had.

This could explain that tweet from Tom Phillips about how Nintendo put off the reveal until Mario was more stable. SPD likely has very little experience on Nvidia hardware.
 

Zedark

Member
This also lines up with Emily's blog from April, where she wrote: "In terms of raw power, numerous sources tell me that NX is much closer to Xbox One than PlayStation 4. Even that might be stretching it a tiny bit." With her being 100% right on NS so far, I'm going to put a lot of weight behind this.

We don't know the final clocks, so it might not be quite that high, but we are talking a smaller gap from NS to XB1 than from XB1 to PS4, which is a very interesting place to be if it pans out. When docked, there isn't much of a price reason it couldn't hit those numbers; undocked, it could reach as high as 600 GFLOPS, and punching that into your formula above gives between 900 GFLOPS and 1 TFLOPS on the go. With the lower resolution and some graphical drawbacks, that could certainly handle current-gen gaming on the go.

If it's the X1, well... the Shield TV was passively cooled and this is not. It also doesn't mean they didn't shrink the process node; I can't imagine tens of millions of units being produced at 20nm if they have a choice. 16nm is cheaper and creates less heat, so even the X1 might end up with 600+ GFLOPS before punching in your formula.

Yeah, my determination rests on many assumptions; I even assumed they were using non-customised chips, which we know isn't the case. It will definitely be a formidable system, though, at the very least for what it proposes to do (being a handheld and a console at the same time).
 
Do you remember when the Wii U was revealed, and naysayers were claiming that it wouldn't be more powerful than the Xbox 360? I do, because I was one of the people arguing "of course it'll be better, don't be stupid." Didn't exactly work out that way. It was unthinkable that any component of the Wii U would struggle to outperform or even fall behind 7 year old hardware, yet Nintendo found a way. And the Wii U wasn't a portable system.

I have hope that this will turn out to be fairly powerful given that Nintendo is letting Nvidia have a large role in the hardware and there are hints that this is a newer version of the Tegra than we've seen, but I don't think you should rule anything out when it comes to the hardware being underwhelming. In no way shape or form has Nintendo earned enough trust for anyone to say "of course it'll be more powerful than Wii U." Until we know for sure what the hardware is and what it's capable of, I still consider "Wii U level performance" a possibility.

Especially when all the games they've shown in the reveal are actually little more than Wii U ports. I'm fairly convinced the Switch is just on a Wii U level in portable mode and slightly better when docked. People really should not expect much more from Nintendo.
 

Thraktor

Member
This is something I was thinking about in September when people were wondering why the reveal took so long. Nintendo really seemed to know the ins and outs of the PowerPC architecture, and while most third parties have had lots of experience with Nvidia hardware, it's very unlikely that Nintendo's internal developers had.

This could explain that tweet from Tom Phillips about how Nintendo put off the reveal until Mario was more stable. SPD likely has very little experience on Nvidia hardware.

To be honest I don't think it would be that big of an issue. Nintendo's internal teams would have had Nvidia hardware to work on from pretty much the moment the deal was done (perhaps two years before launch), and they dealt with arguably a bigger architectural jump from Wii to Wii U (moving from a fixed-function to fully programmable graphics architecture).

Edit: And CPU ISA really wouldn't be that big of a deal outside of whoever's working on the compiler (if they're not using a third party compiler). Also keep in mind that Nintendo have continuously been developing for ARM-based handhelds since the GBA.
 
I'm fairly convinced the Switch is just on a Wii U level in portable mode and slightly better when docked. People really should not expect much more from Nintendo.

This assertion completely ignores the fact that the Switch will be powered by a custom Tegra that is at least as powerful as an existing Tegra GPU that surpasses previous-gen hardware by a considerable margin and probably has a CPU that easily outdoes current-gen console CPUs (because said current-gen console CPUs are outright garbage and don't take much to outdo).

It's one thing to be pessimistic, but this is too ridiculous to take seriously. We're not getting a Wii U level device or lower, not by a long shot. The actual power relative to current-gen consoles is the big question here.
 

lenovox1

Member
Especially when all the games they've shown in the reveal are actually little more than Wii U ports. I'm fairly convinced the Switch is just on a Wii U level in portable mode and slightly better when docked. People really should not expect much more from Nintendo.

If you're making presumptions based on hardware that already exists, then the base level of power you can expect is Nvidia's own Shield Tablet, if anything.

You can't base expectations on a CPU and GPU platform that they are no longer using.

That's the worst case.

There's nothing to contribute to the discussion of the Switch's fabled "power" if all you have to say is, "Nintendo always uses weak and underpowered hardware."
 

Mokujin

Member
Emily Rogers (who got essentially everything about the Switch right so far) reported that the final unit will include active cooling.

I swear I have followed everything Emily has said, but I don't remember her saying that the final unit had active cooling. In the fact sheet she wrote before the reveal she said "Cooling is still a little noisy", but she doesn't specify whether that's final hardware or development kits.

How would those vents work for passive cooling? Do you blow into them? Shake the tablet to cool it? There's no reason to make them so thick without active cooling; you dissipate heat through metal surfaces, not vents like those.

Hot air rises and escapes, lowering the device's temperature, and being thick favours spreading heat even without active cooling. I remember my old VCRs having vents but no active cooling.

I'm not going to keep pressing my doubts about this, but right now I can't see active cooling in the final unit as more than an educated guess. And, even more speculatively, if active cooling is there, we don't even know whether the device needs it running in portable mode, like the Nvidia Shield.

All in all, I don't like people being so sure about 500-700 GFLOPS figures based on rumours and educated guesses. I would be more than pumped if those were true, but I prefer to keep my expectations in check. Hope we get some juicy leaks before next year so we know a bit more about the final hardware.
 

orioto

Good Art™
This is something I was thinking about in September when people were wondering why the reveal took so long. Nintendo really seemed to know the ins and outs of the PowerPC architecture, and while most third parties have had lots of experience with Nvidia hardware, it's very unlikely that Nintendo's internal developers had.

This could explain that tweet from Tom Phillips about how Nintendo put off the reveal until Mario was more stable. SPD likely has very little experience on Nvidia hardware.

Really, with what they showed in the end? 2 seconds...
 

KAL2006

Banned
This assertion completely ignores the fact that the Switch will be powered by a custom Tegra that is at least as powerful as an existing Tegra GPU that greatly surpasses previous-gen hardware and probably has a CPU that easily outdoes current-gen console CPUs (because said current-gen console CPUs are outright garbage and don't take much to outdo).

It's one thing to be pessimistic, but this is too ridiculous to take seriously.

Are you comparing it to the existing Tegra X1 underclocked? I have a feeling it will be underclocked in portable mode, while in docked mode it can be compared to a full Tegra X1 GPU.
 
Is swear I have followed everything Emily has said but I don't remember her saying that the final unit had active cooling, in the fact sheet she wrote before the reveal she said "Cooling is still a little noise" but she doesn't specify if its final hardware or development kits.

Here:

https://twitter.com/ArcadeGirl64/status/788389544257163264

@ArcadeGirl64 Update: Everything in this September 2nd blog post remains *at least* 90% accurate. Including the active cooling.

And the aforementioned blog post:

https://arcadegirl64.wordpress.com/2016/09/02/rumor-what-should-we-expect-from-the-final-nx-product/


Really, with what they showed in the end? 2 seconds...

Yeah it seemed strange but that's what he said. Take it with a grain of salt (as with all rumors).

Edit:

To be honest I don't think it would be that big of an issue. Nintendo's internal teams would have had Nvidia hardware to work on from pretty much the moment the deal was done (perhaps two years before launch), and they dealt with arguably a bigger architectural jump from Wii to Wii U (moving from a fixed-function to fully programmable graphics architecture).

Edit: And CPU ISA really wouldn't be that big of a deal outside of whoever's working on the compiler (if they're not using a third party compiler). Also keep in mind that Nintendo have continuously been developing for ARM-based handhelds since the GBA.

Ah, I didn't see this before, that's good to hear. It's just that Nintendo's console teams have been working with essentially the same architecture for 15 years, so I wouldn't really be surprised to see SOME growing pains. Hopefully nothing at all like the Wii U HD development issues.
 

AzaK

Member
So are we all assuming that games running while docked will run better than on tablet screen?

It's more than an assumption. Laura Kate Dale has proven she has some solid insiders feeding her info, and she mentioned there was some improvement when docked. What level of improvement, who knows, but I think it's safe to say at this point there's something.

I think it speaks more about its architecture rather than power.
Or more about how Nintendo paid for the support.
 

Lonely1

Unconfirmed Member
It's more than an assumption. Laura Kate Dale has proven she has some solid insiders feeding her info, and she mentioned there was some improvement when docked. What level of improvement, who knows, but I think it's safe to say at this point there's something.

Team dedicated scaler.
 

Panajev2001a

GAF's Pleasant Genius
People afraid this won't be powerful: lads, it is Nvidia doing the god damn chip. Nvidia is known for absolutely INSANE graphics cards and was the company that blamed the PS4 and called it a low-end PC. The Switch will be powerful, more than most think; Nvidia isn't the type of company that does underpowered stuff, let alone graphics cards :D

http://www.zdnet.com/article/nvidia-calls-ps4-hardware-low-end/

I mean, let's be serious here for a moment: the Switch will be the most powerful handheld device on the market by HUGE HUGE margins. 720p 60fps on the go, 1080p at home!

I cannot wait for the GPU they will deliver for the PS3, insane chip... no bugs nor bottlenecks and surely the latest bleeding edge tech they have ;).

Seriously, will Switch be more powerful than PS Vita? Yes. Will it be more powerful than iPhone 7 or the next iPad revision... maybe neck and neck, thanks to the fact they target a much lower resolution than the iPhone 7 when in tablet mode and they can run at full clock in docked mode. Still, next year Android tablets and iPad revision will pull ahead by a nice margin quite likely.
 

7roject28

Member
I think all Nintendo needs to do is aim for the most power they can get out of a $275 Switch with decent battery life, and be able to have slightly more horsepower when docked.

If I had to guess it will be
Undocked - slightly better than Wii U power
Docked (fans and cooling system turn on) - moderately more powerful than Wii U

People need to stop thinking this will have AAA multiplatform games. Has no one learnt anything from Wii and Wii U? Also, if this was so powerful, with Xbox One-level graphics, Nintendo would have shown off The Witcher 3 and Infinite Warfare over Splatoon and Skyrim, both of which don't look much different from what we can have on Wii U, by the way.

This isn't a computer where you can just install those games on it to test it.
 

Xhaner5

Neo Member
To be honest I don't think it would be that big of an issue. Nintendo's internal teams would have had Nvidia hardware to work on from pretty much the moment the deal was done (perhaps two years before launch), and they dealt with arguably a bigger architectural jump from Wii to Wii U (moving from a fixed-function to fully programmable graphics architecture).

Edit: And CPU ISA really wouldn't be that big of a deal outside of whoever's working on the compiler (if they're not using a third party compiler). Also keep in mind that Nintendo have continuously been developing for ARM-based handhelds since the GBA.

Also, mid-level tech employees don't stay put for that long; they keep jumping around.

But with all the partners so far, as well as UE4, there has to be something behind it for Epic to even consider being on there; unless they are only there to provide the engine for the lower-tier market, which hopefully is not the case.

I'm not sure where the old controllers come in. The sensor bar theory and the IR sensors on the Joy-Cons are interesting, but will it support Wiimotes? Motion controls did well and were never fully realised, as shown in the Wii U reveal trailer with the FPS gun attachment; with gyro, accelerometer, magnetometer and IR you would be more accurate and probably wouldn't have to deal with desync and drift. As for going back to the old control style, I'm not sure how that would work for people getting used to it, but for old-timers maybe it won't be a problem. Since I got my PC, I've never wanted to play FPS games with an analog stick again, so that's probably not going to change; even though Metroid Prime was a slow-paced game and analogs didn't make you suck at it, for other FPS games I'm not sure. I may be open to it with Metroid Prime, since that's how I started: I had barely played FPS games before I got the Metroid Prime Pak on day one. So I was playing FPS (or, to be totally correct, FPA) games with analogs for about four years before CoD2 exploded on PC.
 

Pif

Banned
I cannot wait for the GPU they will deliver for the PS3, insane chip... no bugs nor bottlenecks and surely the latest bleeding edge tech they have ;).

Seriously, will Switch be more powerful than PS Vita? Yes. Will it be more powerful than iPhone 7 or the next iPad revision... maybe neck and neck, thanks to the fact they target a much lower resolution than the iPhone 7 when in tablet mode and they can run at full clock in docked mode. Still, next year Android tablets and iPad revision will pull ahead by a nice margin quite likely.

Quite likely, only for those to get overshadowed in turn by a Volta-based Switch revision not much later.

Never-ending cycle. I bet Nintendo is going to upgrade and revise the shit out of its new shiny cash cow.
 

lenovox1

Member
Seriously, will Switch be more powerful than PS Vita? Yes. Will it be more powerful than iPhone 7 or the next iPad revision... maybe neck and neck, thanks to the fact they target a much lower resolution than the iPhone 7 when in tablet mode and they can run at full clock in docked mode. Still, next year Android tablets and iPad revision will pull ahead by a nice margin quite likely.

Not very likely in graphics compute performance, if we're speaking strictly about ARM-based systems. Apple's A9 processor may already be faster, but we don't know what's inside the Switch.

www.pcworld.com/article/3006268/tab...st-a-laptop.amp.html?client=ms-android-att-us

(Referencing the "3DMark and Graphic Performance" header and following section)

In terms of graphics compute, Nvidia's implementation is very much ahead of its time.

http://www.laptopmag.com/reviews/tablets/google-pixel-c

(Referencing the 3DMark benchmark on page 5 of the 7 benchmarks listed underneath "Verdict.")

Separately, with regard to actual video game performance, it shouldn't ever come close unless Apple eventually starts taking video game performance seriously in terms of development tools, APIs and developer support.
 

Xhaner5

Neo Member
Just in an LPVG article:

Unknown "Nintendo" Source:
There are no plans for the Switch handheld to be able to connect to your TV without the use of the dock.

Holy Carp!!! Are you thinking what I'm thinking!

That means the handheld may not have an HDMI port, with the HDMI chips and/or circuitry (and possibly other things) split away to live only in the dock.

So this is encouraging; it does point in the direction of some hardware being in the dock. It may not be a true SCD, though; to count as a true SCD it would need an SoC: ARM cores, GPU cores, and RAM.

Let's hope it's not the lowest option, with only HDMI hardware to negotiate output.

  • No SCD: only HDMI circuitry
  • Faux SCD: EXTOUT (HDMI), upscaler (enough to make technically unsavvy sources and media talk about it as an SCD)
  • mini-SCD: EXTOUT (HDMI), RAM
  • mid-SCD: EXTOUT (HDMI), GPU and RAM
  • full-SCD: EXTOUT (HDMI), CPU, GPU, RAM
 

Zedark

Member
Just in an LPVG article:

Unknown "Nintendo" Source:


Holy Carp!!! Are you thinking what I'm thinking!

That means the handheld may not have an HDMI port, with the HDMI chips and/or circuitry (and possibly other things) split away to live only in the dock.

So this is encouraging; it does point in the direction of some hardware being in the dock. It may not be a true SCD, though; to count as a true SCD it would need an SoC: ARM cores, GPU cores, and RAM.

Let's hope it's not the lowest option, with only HDMI hardware to negotiate output.

Fake SCD: only HDMI circuitry and upscaler
  • mini-SCD: + RAM
  • mid-SCD: + GPU and RAM
  • full-SCD: + CPU, GPU, RAM

I mean, yes it could mean that, but the article mentions that they only confirmed it for launch, and received the explanation that they didn't want to confuse people. So I don't think this evidence tells us anything new, it just says they don't want bundles without the dock at launch. Still, I would love it if they added some beef in the dock, as I am not looking for an exclusively portable system, but rather a hybrid that packs as much power as possible when docked and is still portable (meaning no heat problems amongst other things).

All kneel before the holy Carp!
 

MrGerbils

Member
People have probably already brought up every possible comparison to the Nvidia Shield Tablet, but just in case this was missed, I thought it was interesting.

From Nvidia's website:
Estimates based on playing Half-Life 2: Episode One on:

Optimized Setting (Default): Almost 2 hours battery life.
Battery Saver Setting: Almost 4 hours of battery life.

This sounds like it's probably similar to the battery life we can expect for the Switch based on the rumors.
 

Lonely1

Unconfirmed Member
Quite likely, only for those to get overshadowed in turn by a Volta-based Switch revision not much later.

Never-ending cycle. I bet Nintendo is going to upgrade and revise the shit out of its new shiny cash cow.

It's not like current mobile devices are doing much more than the Vita did, despite having orders of magnitude more power, though.
 

Zedark

Member
People have probably already brought up every possible comparison to the Nvidia Shield Tablet, but just in case this was missed, I thought it was interesting.

From Nvidia's website:


This sounds like it's probably similar to the battery life we can expect for the Switch based on the rumors.

Hmm, the trade-off for that battery life is rough, though. Half the CPU power gone can't be a workable solution for playing modern console games, can it?
 

Locuza

Member
[...]
Again, i'm talking about an average, not linear. And i'm talking "ballpark" not "exact". Just to get an idea. 5%? 25%? 60%?
[...]
I got your point, but that's already something I would describe as a relatively precise estimate.
You could say 5-40% over the next 10 years; does that help? Very unlikely.
Every game will use a different ratio, and the average will vary strongly as the number of titles grows and time passes.

Even as a rough ballpark, you won't get a better answer from today's speculation than a generous range.
 
Yeah, I highly doubt Nintendo opted for 4GB HBM.
... But then again, with only one GPU using HBM1 and everyone opting for the higher capacity HBM2 in their new cards, could an HBM1 contract be cost effective?

How power efficient would HBM be compared to mobile RAM? How cost effective compared to redesigning Tegra with eDRAM?

Still, though... Not happening...
 

Doczu

Member
Just in an LPVG article:

Unknown "Nintendo" Source:


Holy Carp!!! Are you thinking what I'm thinking!

That means the handheld may not have an HDMI port, with the HDMI chips and/or circuitry (and possibly other things) split away to live only in the dock.

I think this could make the Switch somewhat futureproof for when the SCD comes into play.
Why upgrade the tablet when you can add some sweet TFLOPS to the dock?
 

Hermii

Member
How much ram is likely? Is 6gb too optimistic?

I cannot wait for the GPU they will deliver for the PS3, insane chip... no bugs nor bottlenecks and surely the latest bleeding edge tech they have ;).

Seriously, will Switch be more powerful than PS Vita? Yes. Will it be more powerful than iPhone 7 or the next iPad revision... maybe neck and neck, thanks to the fact they target a much lower resolution than the iPhone 7 when in tablet mode and they can run at full clock in docked mode. Still, next year Android tablets and iPad revision will pull ahead by a nice margin quite likely.

It doesn't matter how it compares to the iPhone 7 on paper, because no one is going to make games that really push the iPhone 7. The Switch is going to blow it away when it comes to the technical and artistic quality of its games.
 
I think this could make the Switch somewhat futureproof for when the SCD comes into play.
Why upgrade the tablet when you can add some sweet TFLOPS to the dock?

I think my only issue with that idea is the "Switch" concept. If I have a dock that adds that much power, what happens when I un-dock the console? It also wouldn't improve the gaming-on-the-go side of things. Unless... they release a "new" Nintendo Switch down the line, with the SCD as a cheaper option for those who don't want to fully upgrade.
 
How much ram is likely? Is 6gb too optimistic?

6GB at the bare minimum is my hope. Anything less and you can kiss the hope of getting most current-gen third-party ports goodbye, regardless of the Switch's userbase. 8GB is obviously ideal, but 6GB is also workable if that lets them keep costs down and they can keep the underlying OS lightweight.
 

Doctre81

Member
Just in an LPVG article:

Unknown "Nintendo" Source:


Holy Carp!!! Are you thinking what I'm thinking!

That means the handheld may not have an HDMI port, nor any HDMI chips or supporting circuitry; those parts, among other things, may be split off so they exist only in the dock.

So this is encouraging; it does point in the direction that some hardware could be in the dock. It may not be a true SCD, though: to count as a true SCD it would need its own SoC, with ARM cores, GPU cores, and RAM.

Let's hope it's not the lowest choice, with only HDMI hardware to negotiate output, but rather one with potential additional processing.

  • No SCD: only HDMI circuitry
  • Faux SCD: EXTOUT (HDMI), upscaler (enough to make technically unsavvy sources and media talk as if it were an SCD)
  • mini-SCD: EXTOUT (HDMI), RAM
  • mid-SCD: EXTOUT (HDMI), GPU and RAM
  • full-SCD: EXTOUT (HDMI), CPU, GPU and RAM


That is pretty much the purpose of the dock. Video outs, possible usb ports etc.
 

MrGerbils

Member
Hmm, the trade-off for that battery life is rough, though. Half the CPU power gone can't be a workable solution for playing modern console games, can it?

Yeah, they even say that the power saver mode also straight up limits framerate to 20fps... and then on top of that they still say "we only saw a slight performance decrease." Doesn't sound great.

So maybe expect 2 hours of battery life out of the Switch.
 
What's the % in power efficiency improvement on Pascal over Maxwell?

Nvidia claims that it's 40% I believe. EDIT: 60% better power efficiency at the same performance, and 40% better performance at the same power, is what Nvidia claims; sorry about the false info. Very confusing stuff.

Yeah, they even say that the power saver mode also straight up limits framerate to 20fps... and then on top of that they still say "we only saw a slight performance decrease." Doesn't sound great.

So maybe expect 2 hours of battery life out of the Switch.

The Shield Tablet doesn't even use a TX1 (a TK1, I believe), so the power efficiency gained by a Parker-based chip is fairly high. Also consider that the Shield is running Android OS overhead, which will likely eat up a good amount of battery, and we don't know what size battery the Switch will use compared to the Shield Tablet.

But I would expect the reported 3 hours of battery life to be fairly accurate for intensive gaming sessions.
 
Nvidia claims that it's 40% I believe.



The Shield Tablet doesn't even use a TX1 (a TK1, I believe), so the power efficiency gained by a Parker-based chip is fairly high. Also consider that the Shield is running Android OS overhead, which will likely eat up a good amount of battery, and we don't know what size battery the Switch will use compared to the Shield Tablet.

But I would expect the reported 3 hours of battery life to be fairly accurate for intensive gaming sessions.
OK, thanks.
 

ozfunghi

Member
Pascal will allow higher clocks at lower power draw than Maxwell, so you can get some of the best of both worlds, especially in docked mode. It would be even better for efficiency if they went with ARM A72 cores, but that seems pretty pie in the sky. Nvidia stuck with A57 for their Parker Design, choosing to focus on adding their Denver Cores instead.

Well, they could go for 60% more power efficient at the same performance in portable mode. And go for 40% more powerful at the same power draw in docked mode.

There is also that Nvidia Flops vs AMD Flops advantage (on PCs at least) which could wind up making an exact Flop comparison kind of cloudy. I think ~500 GFlops is a good estimate at this point considering we know the July devkits were using TX1s with active cooling, and the final device has multiple vents and reports of active cooling.
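As a sanity check on that estimate, the ~500 GFlops ballpark falls straight out of the standard peak-throughput formula (cores × 2 FMA FLOPs × clock), assuming the TX1's 256 CUDA cores at roughly 1 GHz; a quick sketch, not a measured figure:

```python
# Peak FP32 throughput of a stock TX1 (assumed figures, not measurements)
cuda_cores = 256     # Maxwell GPU in the Tegra X1
flops_per_cycle = 2  # one fused multiply-add counts as 2 FLOPs
clock_ghz = 1.0      # nominal GPU clock, ~1 GHz

peak_gflops = cuda_cores * flops_per_cycle * clock_ghz
print(peak_gflops)  # 512.0 GFLOPS FP32; FP16 doubles this on TX1
```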

We actually have a pretty clear picture of what's going to be in this thing. Obviously the final SoC will be custom (as per the OP) but having the devkit use an off the shelf TX1 does tell us quite a bit.

A gaffer with better knowledge than me (or so he had me believe haha) told me that there is no magical difference that has Nvidia flops perform better than AMD flops. It comes down to the fact that AMD favors compute more than Nvidia does, meaning some of those "flops" aren't used to boost graphics or framerate; hence Nvidia getting better graphics results in benchmarks for the same amount of flops.

What's the % in power efficiency improvement on Pascal over Maxwell?
Nvidia claims that it's 40% I believe.

No, Nvidia claims 60% more power efficient at the same performance or 40% more performant at the same power draw.
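To spell out why those two numbers aren't contradictory: they are the same perf-per-watt claim read along two different axes. A quick sketch, using Nvidia's marketing figures as given (not measurements):

```python
# Normalize Maxwell to 1.0 on both axes (hypothetical units)
maxwell_perf, maxwell_power = 1.0, 1.0
efficiency_gain = 1.6   # "60% more power efficient"
perf_gain = 1.4         # "40% more performant"

# Same performance: power drops to 1/1.6, i.e. ~62.5% of Maxwell's draw
pascal_power_at_same_perf = maxwell_power / efficiency_gain

# Same power draw: performance rises by 40%
pascal_perf_at_same_power = maxwell_perf * perf_gain

print(round(pascal_power_at_same_perf, 3))  # 0.625
print(pascal_perf_at_same_power)            # 1.4
```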
 
Well, they could go for 60% more power efficient at the same performance in portable mode. And go for 40% more powerful at the same power draw in docked mode.

A gaffer with better knowledge than me (or so he had me believe haha) told me that there is no magical difference that has Nvidia flops perform better than AMD flops. It comes down to the fact that AMD favors compute more than Nvidia does, meaning some of those "flops" aren't used to boost graphics or framerate; hence Nvidia getting better graphics results in benchmarks for the same amount of flops.

I've heard that the difference is seen generally in PCs, but it's a bit harder to compare over consoles due to their large amount of other differences beyond GPU architecture. But I admit I don't really know either. It would be nice to get some clarification on the matter.

No, Nvidia claims 60% more power efficient at the same performance or 40% more performant at the same power draw.

That's right, my bad. I'll edit my original post.
 
I think my only issue with that idea would be the "Switch" concept. If I have a dock that adds that much power, what happens when I un-dock the console? That also wouldn't improve the "gaming on the go" side of things either. Unless... They release a "new" Nintendo Switch down the line with the SCD being a cheaper option for those who don't want to fully upgrade.

If they're the same family of chips, then theoretically it shouldn't matter. You'd be able to keep using your old Switch until it reaches a point where new games ONLY support a minimum version. Kind of like how, while you can install iOS on old iPhones, you can only go so far back. Switch software would be the same.
 

Xhaner5

Neo Member
I think this could be somewhat futureproof for the Switch when SCD comes into play.
Why upgrade the tablet, when you can add some sweet TFLOPs to the dock?

I mean, yes it could mean that, but the article mentions that they only confirmed it for launch, and received the explanation that they didn't want to confuse people. So I don't think this evidence tells us anything new, it just says they don't want bundles without the dock at launch. Still, I would love it if they added some beef in the dock, as I am not looking for an exclusively portable system, but rather a hybrid that packs as much power as possible when docked and is still portable (meaning no heat problems amongst other things).

I think my only issue with that idea would be the "Switch" concept. If I have a dock that adds that much power, what happens when I un-dock the console? That also wouldn't improve the "gaming on the go" side of things either. Unless... They release a "new" Nintendo Switch down the line with the SCD being a cheaper option for those who don't want to fully upgrade.

Let's begin:

1. A gaming system, comprising: a first game console comprising one or more processors configured to locally execute a first game and provide video output of the first game to a display and audio output of the first game to a speaker, the game console including a first physical communication interface and a first wireless communication interface; and a first supplemental computing device configured to detachably couple to the first game console via the first physical communication interface, the first supplemental computing device comprising: a second physical communication interface; a second wireless communication interface; one or more processors configured to provide, via the first and second physical communication interfaces, processing resources to the first game console to assist the first game console in locally executing the first game; and memory for receiving data associated with the first game from the first game console and storing the data for later access by the first game console; wherein the first game console is further configured to couple, via the first wireless communication interface, to a second supplemental computing device, the second supplemental computing device including one or more processors to provide processing resources to the first game console and memory for providing storage resources to the first game console; and wherein the one or more processors of the first supplemental computing device are configured to provide, via the second wireless communication interface, processing resources to a second game console for assisting the second game console in locally executing a second game on the second game console, the second game being different from the first game and the second game console being located remotely from the first game console.

Lawyer Bureaucrat Translator:
If this patent is really being used for NX, then:

The Switch handheld is the MAIN unit, as Nintendo has already stated; in the patent it's the "first game console".
SO:

  • The main console can connect to one SCD physically (Dock)
  • The main console can connect to another SCD wirelessly (cloud assisted computing)

Which means, even if you don't have the dock you may be able to stream additional processing of your game according to your wireless speeds and connection stability.

With Cloud SCD, you could possibly choose from 2 options:
  • Assist For Power Saving - Enhances battery life with same visuals
  • Assist For Enhanced Visuals - Enhances visuals at no extra battery cost

The Cloud SCDs would be a global (well, regional; ping is an issue) network of NintenDocks hooked up to the internet at home (wired ethernet recommended). So yes, you could assist your Switch handheld from your own Dock at home; if you're far away, though, the Cloud would choose the closest available Dock to lower the connection latency, and the user sharing his Dock's SCD to the Cloud gets "incentives" and "rewards".

Who knows what Nintendo will pick. They could pick a fixed option and only allow you to assist processing for battery-saving purposes, so the swaying battery usage wouldn't be so noticeable (the handheld takes over if the cloud doesn't return results fast enough and times out). With improved-visuals assistance you'd just see weird things if the connection slows down: either some details, lighting effects or shadows would cut out in the middle, or the game itself would pause until the connection is stable, or worse, just stutter like a slow-loading video.
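The timeout fallback described there could look something like this; a hypothetical sketch, where the function names and the one-frame budget are invented and nothing is taken from the patent:

```python
import concurrent.futures

def assisted_frame(cloud_render, local_render, timeout_s=0.033):
    """Try the cloud-assisted path first; if results don't come back
    within one frame budget, let the handheld take over locally."""
    pool = concurrent.futures.ThreadPoolExecutor(max_workers=1)
    try:
        future = pool.submit(cloud_render)
        return future.result(timeout=timeout_s)
    except concurrent.futures.TimeoutError:
        return local_render()   # handheld takes over on timeout
    finally:
        pool.shutdown(wait=False)
```

With a responsive connection the cloud result wins; a slow one silently falls back to the local render, which is why a battery-saving-only mode would hide the swaying quality better than a visuals-boosting mode would.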


That is pretty much the purpose of the dock. Video outs, possible usb ports etc.

Actually, technically speaking, there is a big difference between "not having a HDMI port" and "not supporting external output", because the HDMI standard takes additional hardware and firmware support: chips, circuitry, etc. You could have all the HDMI support inside, but just cut out the HDMI port, route the signal wires through another (proprietary) connector, pass them through the Dock, and have an HDMI output on the Dock. An average joe would call that "only the dock supports HDMI", but no it doesn't; the dock, in this example, is only an HDMI extension cable enclosed in a box. So the two sentences are not interchangeable, but this is how the media and technically unsavvy people talk: they interchange a lot of stuff without even knowing it.

>>> Which means we cannot be sure of anything, and that's why many rumors are off: the information is degraded before it gets out. Even the original source may be wrong in his communication; he may understand it better, but when he says it he leaves out details he considers "granted knowledge" or simply "enough". Everybody does this unintentionally in daily life, or simply doesn't take the time to explain things in detail. When these sources talk to the media they take 20 minutes; then whoever reads the story assumes things in a way their base knowledge can understand, so someone with less technical experience will interpret the same words differently. Then it goes down the line to the public, and the public adds another layer of reinterpretation on top. Half of the people who end up upset, enraged, mixed or uninterested may actually be reacting to their own interpretation of the leak. So there are many factors.

Everybody here and on other forums is questioning this because the dock is actually quite big at the back. For USB and HDMI out alone it seems way too big, and Nintendo is always neurotic about making consoles small, so it just doesn't make sense. I'm predicting there is either cooling, or an SCD plus cooling, inside there.


Continuing:

2. The gaming system as recited in claim 1, wherein the one the one or more processors of the first game console are further configured to: identify additional supplemental computing devices within a threshold network distance of the first game console, the additional supplemental computing devices including the second supplemental computing device; determine a network distance between the first game console and each of the additional supplemental computing devices; and select, for use in assisting the first game console in locally executing the first game, the second supplemental computing device based at least in part on the determined network distances.

Boom haha, I didn't even read this part and I already mentioned it. They call it "within a threshold network distance", and the wording could mean more than one SCD is involved.

I was about to mention that in practice the SCDs in the cloud would basically work as one big SCD, all providing something to the handhelds, or at least the SCDs near each other would. So the handheld could have multiple SCDs assisting it over the Cloud; that may not mean bigger and better, it may only mean the workload is split.

This whole cloud-assisted SCD is kind of elaborate, but I can't figure out if it applies only to the handheld undocked, or to the whole dock, because it keeps mentioning a "second supplemental device". If the first one is the Dock and the second is on the Cloud, that would mean it's possible to add computing assistance ON TOP of an already physically assisted docked console.

But it does sound a bit too good to be true. US internet infrastructure isn't that good compared to Europe's, and it remains to be seen whether they even use this cloud assisting at launch. Even then it depends on there being an available SCD near your position; the connection would have to be quite stable and the speed moderate. What can be assisted is also in question: anything that needs to happen quickly in a game can't be. You definitely won't be offloading the processing of a button press; it will be those areas that have no immediate effect on gameplay or graphics. And even that isn't everything: you're not going to render half a model and have Mario running around without legs until Doe's SCD from Arizona finishes supplementing, right?
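The device-selection language in claim 2 ("identify... within a threshold network distance... select based at least in part on the determined network distances") boils down to a latency-bounded nearest pick. A hypothetical sketch, with all names and latency values invented:

```python
def pick_scd(candidates, threshold_ms):
    """Pick the supplemental computing device with the lowest measured
    network distance, ignoring any beyond the latency threshold."""
    in_range = [(name, ms) for name, ms in candidates if ms <= threshold_ms]
    if not in_range:
        return None  # no usable SCD: run everything on the handheld
    return min(in_range, key=lambda c: c[1])[0]

# Measured round-trip latencies in ms (illustrative values only)
docks = [("dock_home", 4), ("dock_neighbour", 35), ("doe_arizona", 180)]
print(pick_scd(docks, threshold_ms=100))  # dock_home
```

A real implementation would also weigh hop counts and stability, as the patent's "network distance" wording suggests, but the skeleton is just this filter-then-minimize step.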


3. The gaming system as recited in claim 1, wherein the one the one or more processors of the first supplemental computing device are further configured to: identify additional supplemental computing devices within a threshold network distance of the first supplemental computing device, the additional supplemental computing devices including the second supplemental computing device; determine a network distance between the first supplemental computing device and each of the additional supplemental computing devices; and select, for use in assisting the first game console in locally executing the first game, the second supplemental computing device based at least in part on the determined network distances.


The way I read points 2 and 3 is that point 2 covers undocked mode, and point 3 covers docked mode.

If we believe the SCD is in the dock... hey, the SCD may just be sold as a standalone brick, but I'm going over this as if it were in the Dock.


4. The gaming system as recited in claim 1, wherein neither the first supplemental computing device nor the second supplemental computing device include a display driver, an audio driver, and a user-control interface.

So what do we have left: CPU, GPU and RAM. This one is also big; I'm going to read everything for myself, but I can't comment on all of it, as some things aren't as straightforward.

The patent answers it here actually better than I could:
Thus in terms of network distance, a network computing device that is "close" has relatively low latency or hops, and one that is "far away" has relatively large latency or numbers of hops. Relatively close supplemental computing devices may be able to provide services at a nearly real-time speed (e.g. processing real-time graphics and sound effects), while relatively far away devices may only be able to provide asynchronous or supplementary support to the events occurring on the console (e.g. providing for weather effects in games, artificial intelligence (AI), etc.

There is also terminology about a "second game, second memory". There's a chance the Dock could indeed have more STORAGE memory, not just RAM, which means you could save games on there. But it also mentions "partly", i.e. data being processed while the game runs, therefore it's just RAM. Still, the sound of it makes me think you could keep an active game on it like a ramdisk without having to reload it, and just SWITCH between two games, putting one into pause/suspend. That's probably a bit too optimistic; the "part" part comes in with big games, and I don't see a 32GB dedicated Switch memory on top of an SCD (RAM, GPU, CPU).

More likely it could mean there's another GameCard port on the Dock and the "memory" points to the storage memory the card uses. Or it simply means internal storage, the mediocre flash storage space, or maybe it'll be fast this time around.

However, this two-game-switch idea may not be far-fetched if they don't provide a lot of storage for installing, or no storage at all. Since these GameCards supposedly go up to 400MB/s, it's new tech and probably not that cheap; they may only have some 4-8GB of internal flash for savegames, OS, configs, profiles and social content, without any game installing at all. So with two game cards you could have the switch support, but for bigger games you could have the limit increased. As I said earlier, don't assume the maximum game size is 32GB: if there's another GameCard port on the Dock, it's plausible some "assisting" may happen again, and you'd be able to play games designed with double the space in mind, using both GameCards, without a "please insert Disk 2" in the middle of the game.



-------
Of course, this is all my late explanation of it; it's probably nothing new to most of the gurus around the internet.
 
These two cards give a good idea of the difference in performance per flop in real-world game benchmarking:

AMD R9 390: 5,914 GFLOPS
Nvidia GTX 970: 3,494 GFLOPS

https://youtu.be/vtry-A-t7Qk

That's a pretty big difference for similar performance

That would give Nvidia flops a 1.7x advantage over AMD flops, which sounds way too high to be honest. I'd love it if someone could clarify what causes these differences, if they even truly exist.

(I didn't watch all of the video but I'm pretty sure neither of the chips were overclocked, right?)
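For what it's worth, the ~1.7x figure does check out arithmetically from the quoted paper specs; it's the interpretation, not the arithmetic, that's debatable:

```python
amd_gflops = 5914     # R9 390 peak FP32, as quoted above
nvidia_gflops = 3494  # GTX 970 peak FP32, as quoted above

ratio = amd_gflops / nvidia_gflops
print(round(ratio, 2))  # 1.69: ~1.7x the paper flops for similar fps
```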
 
That would give Nvidia flops a 1.7x advantage over AMD flops which sounds way too high to be honest. I'd love if someone could clarify what causes these differences if they even truly exist.

(I didn't watch all of the video but I'm pretty sure neither of the chips were overclocked, right?)

It's pretty much that Nvidia's DX11 driver is superior to AMD's, which is why you see so many games with equal or better fps.

If you want to consider overclocking, 970s overclock waaaaay better than 390s.
 
These two cards give a good idea of the difference in performance per flop in real-world game benchmarking:

AMD R9 390: 5,914 GFLOPS
Nvidia GTX 970: 3,494 GFLOPS

https://youtu.be/vtry-A-t7Qk

This is only true if you compare them on DX11. Both Xbox One and PS4 use low-level APIs.

@Skittzo0413 The 970 uses a newer architecture than the 390; 390 vs 780 (Ti) would be a much better comparison. But usually Nvidia needs fewer GFLOPs for equal performance.
 