
Nintendo Switch Dev Kit Stats Leaked? Cortex A57, 4GB RAM, 32GB Storage, Multi-Touch.

Status
Not open for further replies.

ggx2ac

Member
Whatever apology or shit you want from these 2 you are not entitled to. At no point should anyone have taken their word as gospel. They have insider sources but that doesn't imply things don't change and they never guaranteed anything. They simply stated what their sources had relayed to them. I really dunno what the fuck your problem is frankly.

People are melting down over expectations of the Switch and are looking for something to blame.

Had nothing been leaked, people would have melted down when the Switch was revealed in October because there would have been expectations of it being a home console more powerful than a PS4.
 

z0m3le

Banned
I think what Hoo-doo meant is that some people are too optimistic thinking that the Switch will end up being better than what is currently reported.

It may help to look at how bad the specs could be and not just how good they could be. For example, I disagree that Nintendo would go for 16nm and instead think they'd go for 28nm, considering the inconsistencies with the clock speeds.

That's not to say it's true; it's just that I'm finding some blind optimism here and there, like some people now thinking that the SCD is a guarantee, that it will be 4 TFLOPS, cost $200, and provide 4K gaming. What makes me question that, for example, is that Nintendo are in the game of making things affordable, and peripherals do not sell that well to a console userbase. If an SCD did ever happen, I'd expect it to use the same Tegra tech, possibly at stock clock speeds, so that it could be cheaper than $100, along with using features like Wi-Fi to make the visuals in portable mode better, as stated in the patent.

First off, the sky isn't falling. The numbers I was talking about are 2 SM @ 307MHz and 768MHz; there is no optimism there, just running the numbers and comparing architecture performance. You can ignore it, you can come up with your own numbers, but don't think those are pie-in-the-sky optimism. It's just the cold numbers.

As for the person who posted about that dock, it was me; you are literally replying to me and saying "some people" when you mean "you". I am speculating on the real possibility that Nintendo is using USB-C as an expansion port, as they have done many times before.

The 4 TFLOPS number comes from current Nvidia GPU offerings; you can buy a 4 TFLOP GPU for your PC for $199. As for "same Tegra tech": Tegra is just a package, and there is no difference between the GPUs in desktops and the ones inside Tegra chips.

I was expecting anything from 435 GFLOPS to 750 GFLOPS, and for the portable to have the same performance at a lower resolution (of 720p). Oh no, 393 GFLOPS! What will we do? That is 10% less than what could possibly be acceptable. Optimism is fine to have if you keep in mind that that is the highest we could expect. Now that we know the clocks, it is best to accept 2 SM as the most likely possibility and move forward with that speculation.

If people want to lower expectations further, they are free to do so, but it isn't based on any numbers we have. We know the GPU isn't R700 anymore; if people want to treat it like it is, they should also remember that the 360 was 240 GFLOPS.
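The arithmetic behind those figures is easy to check. Here's a back-of-envelope sketch, assuming (as the thread does) a Maxwell-style GPU with 128 CUDA cores per SM and 2 FLOPs per core per clock (one fused multiply-add); the clocks are the leaked 307.2/768 MHz numbers:

```python
# Peak-FLOPS back-of-envelope for the rumored Switch GPU.
# Assumptions: Maxwell-style SMs with 128 CUDA cores each,
# 2 fp32 FLOPs per core per clock (one FMA = two floating-point ops).

CORES_PER_SM = 128
FLOPS_PER_CORE_PER_CLOCK = 2

def peak_gflops(sm_count, clock_mhz):
    """Theoretical peak fp32 GFLOPS for a given SM count and core clock."""
    return sm_count * CORES_PER_SM * FLOPS_PER_CORE_PER_CLOCK * clock_mhz / 1000.0

print(peak_gflops(2, 768.0))   # 2 SM, docked clock   -> 393.216 GFLOPS
print(peak_gflops(2, 307.2))   # 2 SM, portable clock -> ~157.3 GFLOPS
print(peak_gflops(3, 768.0))   # 3 SM, docked clock   -> 589.824 GFLOPS
```

The 2 SM docked case lands on the ~393 GFLOPS figure quoted above; these are theoretical peaks, so real-world throughput would be lower.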
 

Hermii

Member
Because 540p for handheld mode would be horrible? I don't think Nintendo would like many sub-native games on their new HD portable system.

Though I agree that 1080p is not confirmed yet; we can't be sure how all games will use that extra processing power, or how much of it there is if they want to push the graphics more (900p is a real possibility for some games).
If it focused on 540p for handheld mode it would have had a 540p screen and devs would have no excuse not to hit native.
 
If it focused on 540p for handheld mode it would have had a 540p screen and devs would have no excuse not to hit native.

Then Nintendo would be imposing unnecessary resolution limitations on their own games. As well as any other titles that would actually be targeting 720p/1080p.
 

Hermii

Member
Then Nintendo would be imposing unnecessary resolution limitations on their own games. As well as any other titles that would actually be targeting 720p/1080p.
I was just replying to the poster that said focusing on 540p would mean sub native games, I wasn't arguing they should have done it.
 

OryoN

Member
The problem with Eurogamer's article is that they didn't stress the bolded part enough. They wrote the article as if Nintendo is going to use a non-customized SoC. So now everybody is thinking, and basing their opinion on the idea, that Nintendo is using an off-the-shelf underclocked Tegra X1, when we know that can't be the case.

This. I noted at least one instance where the editor was guilty of doing just that. They began reporting the new development and ultimately admitted, toward the end, that they don't know the configuration of the custom chip. Yet, in the midst of all that, they handled the report as if those initial dev kit specs were now downgraded... and voilà! The Nintendo Switch!

If that were the case, then why would Nintendo even need the fan? Eurogamer's article is nothing but bad journalism.

Not only that, but if developers were indeed using kits based on the Tegra X1 (even requiring a fan) for game development, and then Nintendo suddenly downgraded (switch! haha) the clocks so drastically, including the CPU (by half!), while keeping the same chipset, that would have the most adverse effect on games currently in development. Less complex ones would still run, while others might not, or the time and effort required to optimize them might not be worth the trouble. That would be a disaster, especially for any publishers hoping to have something on launch day!

The logical conclusion would be that the final silicon is roughly able to match or even exceed what was in the dev kits, even at those low clocks. I would also imagine that securing enough final hardware for a worldwide launch was more important, so developers got by with the TX1 kits in the meantime. 'Final' kits must finally be trickling out.
 

Vic

Please help me with my bad english
The logical conclusion would be that the final silicon is roughly able to match or even exceed what was in the dev kits, even at those low clocks. [...]
Most likely this.
 
Not only that, but if developers were indeed using kits based on Tegra X1 (even requiring a fan) for game development, then suddenly Nintendo downgrades the clocks so drastically [...] That would be a disaster, especially for any publishers hoping to have something on launch day!
I'm of a similar mindset on this topic. The other way this could happen, without any dev complaining about a downgrade, is if the dev kits were downclocked to begin with. The weird thing about that idea is that no one has commented on it; in fact, the dev kits were implied to be overclocked because they had a loud fan. Hmm. That is an enigma.
 

NateDrake

Member
The logical conclusion would be that the final silicon is roughly able to match or even exceed what was in the dev kits, even at those low clocks. [...] 'Final' kits must be finally trickling out.

Final kits began to go out about a month ago.
 

ggx2ac

Member
As for the person who posted about that dock, it was me; you are literally replying to me and saying "some people" when you mean "you". I am speculating on the real possibility that Nintendo is using USB-C as an expansion port, as they have done many times before.

[...]

Except it's not really a "real possibility" when you're talking about selling a $200 peripheral to people with 4K TVs. What percentage of the userbase can you expect to buy that if it's not restricting the games they can play now?
 

Vic

Please help me with my bad english
As brought up by many other people, the problem with Eurogamer's article is the vague assumption that those leaked clocks apply to the early dev kit hardware.

My guess is that once the parties involved, mainly Nintendo and Nvidia, agreed on a specific raw-performance target for the Switch SoC, probably aimed around what the Tegra X1 was capable of, they started to design an SoC better fitted for a mass-produced dedicated portable gaming machine. The early overclocked X1 dev kits are probably reaching around the level of performance that the custom SoC can deliver, but with far less efficiency.

Basically what OryoN and many others have alluded to already.
 

z0m3le

Banned
Except it's not really a "real possibility" when you're talking about selling a $200 peripheral to people with 4K TVs. What percentage of the userbase can you expect to buy that if it's not restricting the games they can play now?

It's completely optional and doesn't require any real extra coding; it's much like how you can buy a faster GPU for your PC. If 1M people buy it, it could entirely be worth selling if they make a small profit. That is how the GPU industry works.
 
Has the possibility of Pascal been abandoned? If the dev kits were Maxwell with a fan, maybe moving to Pascal with lower clocks solves the fan issue, in addition to the expanded battery life target that was discussed?
 

Hermii

Member
Has the possibility of Pascal been abandoned? If the dev kits were Maxwell with a fan, maybe moving to Pascal with lower clocks solves the fan issue, in addition to the expanded battery life target that was discussed?

Most likely it's some weird hybrid at 20nm, IMO. Emily, NateDrake, and Eurogamer all said it's Maxwell. Digital Foundry said it's Maxwell with some Pascal customisations.
 

z0m3le

Banned
Most likely it's some weird hybrid at 20nm, IMO. Emily, NateDrake, and Eurogamer all said it's Maxwell. Digital Foundry said it's Maxwell with some Pascal customisations.

The secret is that Pascal and Maxwell are the same architecture; Pascal is just Maxwell borrowing some of Volta's features. If you look at an Nvidia roadmap from 2014, Pascal doesn't exist, just Maxwell then Volta.

The Maxwell-based Tegra X1 borrowed from Volta and moved from 28nm to 20nm. Nvidia had to delay Volta out of 2016, so they released Maxwell iteration #3 on 16nm, because TSMC was ready, and called it Pascal. It's not even a real secret: Pascal and Maxwell have the same per-clock FLOP performance. The power envelope changed with the smaller node, which just offered higher clocks; but we know the clocks, so the Maxwell-vs-Pascal discussion boils down to the process node used, and the only consequence is power consumption.

This is why I find Digital Foundry's fixation on Pascal and 16nm dense: it doesn't matter. The clocks are low enough that battery life should be around 5 hours flat out, maybe 3 hours if 28nm (I'd really be surprised by that node) and maybe 7 hours for 16nm. (These are all just guesstimates; I haven't looked into it too much, but the GTX 980 is 165 watts and the GTX 1060, with the same 4.6 TFLOPS (small overclock), is 123 watts.)
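The 980-vs-1060 comparison at the end can be turned into a quick perf-per-watt check; the TFLOPS and wattage figures below are the ones quoted in the post, so treat the result as the same rough estimate:

```python
# Rough perf-per-watt comparison behind the process-node argument above.
# Figures as quoted in the post: GTX 980 (28nm Maxwell) ~4.6 TFLOPS at 165 W,
# GTX 1060 (16nm Pascal, small overclock) ~4.6 TFLOPS at 123 W.

cards = {
    "GTX 980 (28nm Maxwell)": (4.6, 165.0),
    "GTX 1060 (16nm Pascal)": (4.6, 123.0),
}

for name, (tflops, watts) in cards.items():
    # GFLOPS per watt = (TFLOPS * 1000) / watts
    print(f"{name}: {tflops / watts * 1000:.1f} GFLOPS/W")
```

Same throughput at roughly 75% of the power, i.e. about a third better efficiency per watt, which is exactly the "only consequence is power consumption" point.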
 

Speely

Banned
Maxwell and Pascal are not very different. Moreover, customized versions of either could look more and more like one another, especially if one considers a fully-custom solution for the Switch.

We know very little in regard to this device's actual performance. January is almost here, buds.
 
Maxwell and Pascal are not very different. Moreover, customized versions of either could look more and more like one another, especially if one considers a fully-custom solution for the Switch.

We know very little in regard to this device's actual performance. January is almost here, buds.


January is gunna be like the show Lost. We are gunna get more questions and not many answers lol.
 

z0m3le

Banned
We know very little in regard to this device's actual performance. January is almost here, buds.

No, we know pretty much the exact performance given different SM counts. If the configuration hasn't changed from the X1 in the dev kits, it's about 75% of a Tegra X1 at these clocks, or about 10% less powerful than the Pixel C (but Android is a bad platform). It is also about 9% of the performance of a GTX 1060. (The XB1 is about 20% of a GTX 1060, to give more reference points.)

If it does have 3 SM, it's 50% faster, so 115% of the X1, and about 14% of a GTX 1060. These are all unoptimized numbers and without mixed precision. As for what this means based on what we thought? Not much, really; we moved from around ~70% of the XB1 (fp32) at the high end to ~45% of the XB1 (fp32) with the same core config as the X1.

Most people I saw were looking at the X1 itself for the performance, which is somewhere under 55% of the XB1.

I am not sure how much those power envelopes would change anything; it's all roughly similar. I love Nintendo and will buy their stuff, but I'm also a PC gamer with a GTX 1060, so I don't really mind. I'll buy 3rd parties on the Switch; for the first time it sounds like a real positive to be able to take my game with me, and I'm actually excited to pick up Skyrim again.
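A rough check on those percentages, using assumed fp32 peak figures (my own ballpark numbers, not from the post): ~512 GFLOPS for the Tegra X1 (256 cores @ 1 GHz), ~1310 GFLOPS for the Xbox One, ~4372 GFLOPS for the GTX 1060, and ~393 GFLOPS for a 2 SM Switch at 768 MHz:

```python
# Sanity-checking the relative-performance figures in the post.
# All peak numbers are approximate fp32 figures and my own assumptions.

PEAKS_GFLOPS = {
    "Switch (2SM, docked)": 393.2,
    "Tegra X1": 512.0,
    "Xbox One": 1310.0,
    "GTX 1060": 4372.0,
}

def relative(a, b):
    """a's peak as a percentage of b's peak."""
    return 100.0 * PEAKS_GFLOPS[a] / PEAKS_GFLOPS[b]

print(f"Switch vs Tegra X1:   {relative('Switch (2SM, docked)', 'Tegra X1'):.0f}%")  # -> 77%
print(f"Switch vs GTX 1060:   {relative('Switch (2SM, docked)', 'GTX 1060'):.0f}%")  # -> 9%
print(f"Xbox One vs GTX 1060: {relative('Xbox One', 'GTX 1060'):.0f}%")              # -> 30%
```

The first two line up with the "about 75% of Tegra X1" and "9% of GTX 1060" figures above; the Xbox One ratio comes out nearer 30% than the 20% quoted, so all of these should be read as rough peak-rate comparisons, not measured performance.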
 

Kevin

Member
Looks like the Switch supports Vulkan & OpenGL:

http://phoronix.com/scan.php?page=news_item&px=Nintendo-Switch-Vulkan-Conform

I know a lot of you are disappointed that the Switch isn't pushing PS4 Pro or high-end gaming PC level graphics on a portable tablet-based gaming system, but I am definitely excited for this console, if for nothing else than the vast amount of awesome Nintendo titles that I'll be able to play at home and on the go on a console that actually looks sleek.

I have honestly not been this excited for a Nintendo console in more than a decade.
 

Speely

Banned
No, we know pretty much exact performance given different SM counts. [...] If it does have 3 SM, it's 50% faster [...]

You always make good points, and educated ones at that. But in this case I have to point out that although you may be correct in a vacuum, you also admit to a range of variance in the final Switch specs that pretty much agrees with my assertion: we don't know what performance will be upon release. At the end of the day, we know neither the specifics of the hardware nor the customization Nintendo and Nvidia worked toward.

It's too soon to settle on givens.
 

J@hranimo

Banned
It seems that the Switch will use DisplayPort over USB-C: https://twitter.com/mochi_wsj/status/811850362785148928

Yup! As well as a few other tidbits:



https://twitter.com/mochi_wsj/status/811851248106283008
https://twitter.com/mochi_wsj/status/811851139553456128
https://twitter.com/mochi_wsj/status/811850362785148928
 