
Nintendo Switch Dev Kit Stats Leaked? Cortex A57, 4GB RAM, 32GB Storage, Multi-Touch.


z0m3le

Banned
This is just a really silly day. A × B × 2 = C (A = CUDA cores, B = clock speed, C = FLOPS; the 2 is for the two FMA ops each core does per clock).

What we know is that this is actively cooled, so it should consume 6 watts or more.
We also know the X1's power consumption, and this configuration comes to around 3 watts for the SoC.
We know that the Pixel C running the Manhattan benchmark on an X1 consumes ~8 watts with a bigger, brighter screen, with the CPU alone consuming 4 watts, and it was passively cooled.

Yesterday we assumed 256 CUDA cores; this configuration's power consumption is simply too low.

Thraktor, you did a post a few months ago with estimated power draws for different CUDA core counts. Considering we should expect a minimum of 2.5 watts IMO from the GPU, with 20nm Maxwell, what configuration would that point to? I do think 3 SMs is the safe bet, and it fits with what we've been hearing, but I am not sure how much power a 4 SM configuration would draw. From what I remember of that post, you were saying that more cores at lower clocks is more power efficient than fewer cores at higher clocks.

Thraktor said:
The bolded isn't true. In fact, pretty much the opposite is true, as in a tightly thermally constrained environment (i.e. a handheld) the marginal benefit to increased parallelism (i.e. more SMs) can be quite large.

To demonstrate, let's look at a power curve for Pascal I've put together. Unlike my previous power curves for A72 and A53 CPU clusters (which are based on solid real-world data from Anandtech and should be considered reasonably accurate), this is a much more rough approximation based on just four data points:

- TSMC's claims of "40% higher speed" and "60% power saving" over 20nm, each applied separately to the TX1's GPU drawing 1.5W at 500MHz (divided by 2 for 750mW per SM).
- Power draw readings from the GTX1080 before and after overclocking (full board power readings, minus GDDR5X, divided by number of SMs).
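
Spelled out as a quick sketch (my own restatement of the first bullet, nothing more; the GTX 1080 readings aren't reproduced because the raw board-power numbers aren't given here):

```python
# Two of the four data points come from applying TSMC's 16FF-vs-20nm claims,
# each on its own, to the TX1 GPU figure of 1.5 W at 500 MHz (~750 mW per SM).
tx1_mw_per_sm, tx1_mhz = 750.0, 500.0

speed_gain = 0.40   # "40% higher speed" at the same power
power_save = 0.60   # "60% power saving" at the same clock

point_a = (tx1_mhz * (1 + speed_gain), tx1_mw_per_sm)   # (700.0 MHz, 750.0 mW)
point_b = (tx1_mhz, tx1_mw_per_sm * (1 - power_save))   # (500.0 MHz, 300.0 mW)
print(point_a, point_b)
# The other two points come from the GTX 1080 board-power readings before and
# after overclocking (minus GDDR5X, divided by SM count); the raw readings
# aren't quoted in the post, so they aren't reproduced here.
```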

Obviously I'm extrapolating a lot from fairly poor data, but hopefully it should be in the right ballpark, and enough for our discussion in any case. (I should also note that this isn't strictly a measure of power draw for the SMs themselves, but rather a measure of the draw of an entire Pascal GPU "per SM", so including other components like ROPs, TMUs, etc., assuming they're always in roughly the same proportion to SMs). In any case, here's the power curve:

[Image: pascal_powercurve.png — estimated power draw per Pascal SM vs. clock speed]


The important thing to note is that, like virtually all IC power curves, it's not linear, and for a given increase in clock speed you require a much larger increase in power consumption to get you there. What this means is that you'll get better performance by using more SMs at a lower clock speed than fewer SMs at a higher clock speed.

Let's look at the clock speed (and raw floating point performance) that could be achieved with different numbers of SMs within the power constraints we might expect for a handheld GPU:

1x SM:

1000 mW - 780 MHz - 200 Gflops FP32 - 400 Gflops FP16
1500 mW - 915 MHz - 234 Gflops FP32 - 468 Gflops FP16
2000 mW - 1025 MHz - 262 Gflops FP32 - 525 Gflops FP16

2x SM:

1000 mW - 595 MHz - 305 Gflops FP32 - 609 Gflops FP16
1500 mW - 700 MHz - 358 Gflops FP32 - 717 Gflops FP16
2000 mW - 780 MHz - 400 Gflops FP32 - 800 Gflops FP16

3x SM:

1000 mW - 510 MHz - 392 Gflops FP32 - 783 Gflops FP16
1500 mW - 600 MHz - 461 Gflops FP32 - 922 Gflops FP16
2000 mW - 670 MHz - 515 Gflops FP32 - 1030 Gflops FP16

As you can see, a 3x SM configuration can achieve nearly the same performance with 1000mW that a 2x SM configuration can with twice that, and a full 50% more than a 1x SM config can manage with 2000mW at hand.

This isn't to say that I expect a 3x SM GPU in the NX, but there would certainly be a sizeable performance jump over 2x SMs if they decided to do so.
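
To make the arithmetic explicit, here's a rough sketch (my own illustration, not Thraktor's actual model) that reads the per-SM power→clock points back off the table above and recomputes the GFLOPS figures, assuming 128 CUDA cores per SM and 2 FP32 ops per core per clock. The 4x SM rows are an extrapolation from the same points, in the spirit of the question about a 4 SM configuration:

```python
import numpy as np

# Per-SM power (mW) -> clock (MHz) points read off the table above
# (e.g. the 3x SM rows give 1000/3 ≈ 333 mW per SM at 510 MHz; where two rows
# land on the same per-SM power, the clocks are averaged). The curve itself is
# a rough approximation, so this is illustrative only.
PER_SM_MW = np.array([333, 500, 667, 750, 1000, 1500, 2000], dtype=float)
CLOCK_MHZ = np.array([510, 597, 670, 700, 780, 915, 1025], dtype=float)

CORES_PER_SM = 128            # Maxwell/Pascal SM width
FLOPS_PER_CORE_PER_CLOCK = 2  # one FMA = 2 FP32 ops

def estimate(num_sms, budget_mw):
    """Estimated clock and FP32 GFLOPS for num_sms SMs sharing budget_mw."""
    per_sm = budget_mw / num_sms
    clock = float(np.interp(per_sm, PER_SM_MW, CLOCK_MHZ))
    fp32 = num_sms * CORES_PER_SM * FLOPS_PER_CORE_PER_CLOCK * clock / 1000.0
    return clock, fp32

for sms in (1, 2, 3, 4):          # 4x SM is my own extrapolation, not Thraktor's
    for budget in (1000, 1500, 2000):
        if not PER_SM_MW[0] <= budget / sms <= PER_SM_MW[-1]:
            continue              # per-SM budget falls outside the data points
        clock, fp32 = estimate(sms, budget)
        print(f"{sms}x SM @ {budget} mW: ~{clock:.0f} MHz, "
              f"~{fp32:.0f} GFLOPS FP32, ~{2 * fp32:.0f} GFLOPS FP16")
```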
 

bomblord1

Banned
This is just a really silly day. A × B × 2 = C (A = CUDA cores, B = clock speed, C = FLOPS; the 2 is for the two FMA ops each core does per clock).

What we know is that this is actively cooled, so it should consume 6 watts or more.
We also know the X1's power consumption, and this configuration comes to around 3 watts for the SoC.
We know that the Pixel C running the Manhattan benchmark on an X1 consumes ~8 watts with a bigger, brighter screen, with the CPU alone consuming 4 watts, and it was passively cooled.

Yesterday we assumed 256 CUDA cores; this configuration's power consumption is simply too low.

Thraktor, you did a post a few months ago with estimated power draws for different CUDA core counts. Considering we should expect a minimum of 2.5 watts IMO from the GPU, with 20nm Maxwell, what configuration would that point to? I do think 3 SMs is the safe bet, and it fits with what we've been hearing, but I am not sure how much power a 4 SM configuration would draw. From what I remember of that post, you were saying that more cores at lower clocks is more power efficient than fewer cores at higher clocks.

The Eurogamer article quite literally says Nintendo briefed devs on 256 CUDA cores, and adding more is highly unlikely.
 

ggx2ac

Member
I don't remember reading this info.

Yeah, I never saw that either. Just that the clockspeeds are what the retail units will run at.

What the clockspeeds also tell me is that it feels unlikely that this would be 16nmFF for the SoC.

So all that leaves is the SM setup which is unknown.
 

EloquentM

aka Mannny
It's there

specs
This leaked spec actually appeared on Twitter before Nintendo's official reveal. Thought by many to be out of date or fake, we can confirm that Nintendo has briefed developers recently with the same information. One source tells us that the 4K30 aspect of the spec was not part of the developer presentation, but everything else was. We can assume that the clock-speeds are theoretical maximums, and not the 768/307.2MHz combo we've confirmed as locked in retail hardware.

CPU: Four ARM Cortex A57 cores, max 2GHz
GPU: 256 CUDA cores, maximum 1GHz
Architecture: Nvidia second generation Maxwell
Texture: 16 pixels/cycle
Fill: 14.4 pixels/cycle
Memory: 4GB
Memory Bandwidth: 25.6GB/s
VRAM: shared
System memory: 32GB, max transfer rate: 400MB/s
USB: USB 2.0/3.0
Video output: 1080p60/4K30
Display: 6.2-inch IPS LCD, 1280x720 pixels, 10-point multi-touch support
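
For reference, the raw throughput those numbers imply is simple arithmetic; a quick sketch (assuming the usual 2 FP32 ops per CUDA core per clock, and the 768/307.2MHz retail clocks mentioned above):

```python
CUDA_CORES = 256              # per the leaked spec above
FLOPS_PER_CORE_PER_CLOCK = 2  # one FMA = 2 FP32 ops

def fp32_gflops(clock_mhz):
    return CUDA_CORES * FLOPS_PER_CORE_PER_CLOCK * clock_mhz / 1000.0

for label, clock_mhz in [("dev-kit max (1 GHz)", 1000.0),
                         ("docked (768 MHz)", 768.0),
                         ("portable (307.2 MHz)", 307.2)]:
    fp32 = fp32_gflops(clock_mhz)
    print(f"{label:>22}: {fp32:6.1f} GFLOPS FP32 / {2 * fp32:6.1f} GFLOPS FP16")
# -> 512.0 / 1024.0, 393.2 / 786.4, 157.3 / 314.6 respectively
```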
 

Vic

Please help me with my bad english
Yeah, I never saw that either. Just that the clockspeeds are what the retail units will run at.

What the clockspeeds also tell me is that it feels unlikely that this would be 16nmFF for the SoC.

So all that leaves is the SM setup which is unknown.
The only info we have is what the microarchitecture is based on, plus the CPU, GPU and RAM clock speeds. That's it. A very limited set of info which isn't enough to gauge the level of performance of the device.
 

ggx2ac

Member
It's there

This leaked spec actually appeared on Twitter before Nintendo's official reveal.

They definitely have something wrong there. This was leaked after Switch was revealed and confirmed to be powered by Nvidia.

The person that "leaked" this did so to protect their arse after arrogantly dismissing claims it would be powered by Nvidia and claimed that NX would be powered by AMD and DMP.
 
Honestly... what's to stop Nintendo from upclocking the CPU and GPU right before launch? They did that with the Wii U, apparently, right?

More than likely they had the X1 from the get-go and just downclocked it to see how much they could improve the battery life.
They could take it back to X1 specs, which is an increase of around ~25% in power, which could go from 150→200 GFLOPS in portable and 400→512 GFLOPS docked.

God, this makes me sound like I'm in the denial stage of grief.


If the specs are really that mediocre, Nathan's battery life rumor of 5+ hours might end up being legit after all (he said the source is completely separate from the Pascal rumor he's known about since July, btw).

Here's how I see the next few years panning out:
- March 2017: launch/launch-window Switch specs get a CPU+GPU clock speed boost via firmware update
- 2019: a Switch with 16nm Pascal comes out with 750 GFLOPS docked and the same battery life (5-8 hours)
- 2019-2020: an SCD or a separate home console entirely is released with specs in between PS4 and PS4 Pro, advertised around its VR (if VR takes off over the next two years)
 

Hermii

Member
Honestly... what's to stop Nintendo from upclocking the CPU and GPU right before launch? They did that with the Wii U, apparently, right?

More than likely they had the X1 from the get-go and just downclocked it to see how much they could improve the battery life.
They could take it back to X1 specs, which is an increase of around ~25% in power, which could go from 150→200 GFLOPS in portable and 400→512 GFLOPS docked.

God, this makes me sound like I'm in the denial stage of grief.


If the specs are really that mediocre, Nathan's battery life rumor of 5+ hours might end up being legit after all (he said the source is completely separate from the Pascal rumor he's known about since July, btw).

Here's how I see the next few years panning out:
- March 2017: launch/launch-window Switch specs get a CPU+GPU clock speed boost via firmware update
- 2019: a Switch with 16nm Pascal comes out with 750 GFLOPS docked and the same battery life (5-8 hours)
- 2019-2020: an SCD or a separate home console entirely is released with specs in between PS4 and PS4 Pro, advertised around its VR (if VR takes off over the next two years)
Sounds more like a mixture of denial and bargaining.
 

Nuu

Banned
I'm amazed that people are disappointed by the specs. What were people expecting from Nintendo? They've been taking the low-spec route for twelve years now.
 
Well, pre-release, they would probably tell devs to aim for the lowest possible outcome, just in case there are yield or thermal issues with the production units...

It's safe to say that the clocks DF are talking about are the worst case scenario "aim for this and you can't overshoot" targets for launch window devs since they were prepping games without final hardware.

That said, this is Nintendo, they may stick with their "doomsday" GPU clocks for launch and slowly increase through firmware updates later, or never increase throughout the lifespan of the system in order to play it safe...

Or, maybe these clocks are their thermal threshold because they had designed the Switch with Pascal in mind but Nvidia missed the window so they had to go with Maxwell Tegra X1 as suggested in the DF video...

If this is the case, it wouldn't be the first time Nvidia missed their target when working with Nintendo. The 3DS was originally going to be powered by Tegra, but Nvidia failed to deliver. Nintendo had to throw together a back-up plan, and that's how we ended up with such an underpowered handheld.
 

antonz

Member
They definitely have something wrong there. This was leaked after Switch was revealed and confirmed to be powered by Nvidia.

The person that "leaked" this did so to protect their arse after arrogantly dismissing claims it would be powered by Nvidia and claimed that NX would be powered by AMD and DMP.

To be fair, the origin of the "devkit" spec was not Nikki. It was posted October 20th on Pastebin by a French self-proclaimed "hacker" who seems to do some stuff in the video game scene.
 
Sounds more like a mixture of denial and bargaining.

I'm not exactly betting that they're going to upclock, but it will be interesting to see what Nintendo does near launch. I know they are listening.

More console revisions/upgrades are inevitable in 2-3 years. Nintendo said the Switch is part of a family of systems, so likely we'll get a Switch with a 16nm Pascal chip with a performance boost and the same battery life (or better), and either an SCD or just a separate dedicated home console at around PS4 to PS4 Pro level.
 

random25

Member
When it comes to CPU/GPU specs, everything right now is speculation unless there's official confirmation and bench testing of the final retail hardware. I'm not saying what DF said is entirely false, but I wouldn't put much into by-the-numbers discussion, or speculate further from those numbers, until everything is 100% confirmed.

For all we know, the Switch is probably running much worse clocks on a much worse CPU lol.
 
I'm amazed that people are disappointed by the specs. What were people expecting from Nintendo? They've been taking the low-spec route for twelve years now.

Definitely not 0.5x to 1.0x Xbone in power. We were expecting Xbone specs for the handheld, and PS4 specs when docked, with 5+ hours of battery life, for $200.
 
Well, pre-release, they would probably tell devs to aim for the lowest possible outcome, just in case there are yield or thermal issues with the production units...

It's safe to say that the clocks DF are talking about are the worst case scenario "aim for this and you can't overshoot" targets for launch window devs since they were prepping games without final hardware.

That said, this is Nintendo, they may stick with their "doomsday" GPU clocks for launch and slowly increase through firmware updates later, or never increase throughout the lifespan of the system in order to play it safe...

Or, maybe these clocks are their thermal threshold because they had designed the Switch with Pascal in mind but Nvidia missed the window so they had to go with Maxwell Tegra X1 as suggested in the DF video...

If this is the case, it wouldn't be the first time Nvidia missed their target when working with Nintendo. The 3DS was originally going to be powered by Tegra, but Nvidia failed to deliver. Nintendo had to throw together a back-up plan, and that's how we ended up with such an underpowered handheld.
Reminds me of what the PSP did. It would be bizarre for Nintendo to do something like that, but the wording in the doc implied that it's possible.

Definitely not 0.5x to 1.0x Xbone in power. We were expecting Xbone specs for the handheld, and PS4 specs when docked, with 5+ hours of battery life, for $200.
Who was this? No one I know.
 

z0m3le

Banned
The Eurogamer article quite literally says Nintendo briefed devs on 256 CUDA cores, and adding more is highly unlikely.
It also briefed devs, in the same picture, on 1 TFLOPS of performance (likely FP16), and in the second-to-last paragraph of the Eurogamer article it speculates that it could have more CUDA cores. Even if it then says that's unlikely, it shows that they can't confirm 256 CUDA cores, so you probably shouldn't use them as a source to do so.
 

z0m3le

Banned
It's not that I'd actually mind these speculated specs; it's that I would be disappointed that they used a fan if it didn't need one, and this device, with those clock speeds and the X1 configuration, just doesn't need a fan with a vent on top. I live in Seattle and this device would simply die outside.
 
It's not that I'd actually mind these speculated specs; it's that I would be disappointed that they used a fan if it didn't need one, and this device, with those clock speeds and the X1 configuration, just doesn't need a fan with a vent on top. I live in Seattle and this device would simply die outside.

Assuming DF's specs are true (of course, they still don't know the number of cores), who's to say everything in the patent will be included in the retail version, like the fans? We won't know until then, I guess.
 

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
That's absurdly unreasonable unless you were hoping for an hour of battery life.
So. That's what they were expecting. That's what the rumors pointed to. Switch threads were full of it.
 
Switch threads were full of people claiming others expected this... Not actual people expecting it... There may have been 2?

By the time Nvidia and the hybrid form factor were confirmed in the first reveal, nobody was really expecting PS4 specs. We were hoping for Pascal and somewhere between 500 GFLOPS and 1 TFLOPS when docked, based off the rumors about the Switch using Pascal.
 

Zedark

Member
That's absurdly unreasonable unless you were hoping for an hour of battery life.

This is just a NeoGAF meme, you know? People weren't actually expecting Xbox One performance (they were expecting Tegra X1 performance), yet everyone seems adamant in insisting that people in the speculation threads did expect XB1 performance.
 

ggx2ac

Member
Switch threads were full of people claiming others expected this... Not actual people expecting it... There may have been 2?

Oh yeah, there were at least 2 people I remember that were expecting a PS4 handheld. That was very early on in speculation though after the first Eurogamer report.

The GPU FLOPS estimates were constrained to a Pascal Tegra, especially since NateDrake told us he heard it was using Pascal, and Parker was the only reference, with 768 GFLOPS max.

So, funny enough, there are random people who think we had insane expectations for what has now been revealed, when, going by insiders, we thought 2x Wii U would be the minimum for power.

I wanted at least 2x Wii U minimum since it'd be cool to have a handheld twice as powerful as Nintendo's last console. Oh well, it's still a huge jump from the 3DS.

Until things get cleared up, though, it's as though we're going through exactly what happened with the 3DS, where people were comparing it to the 360 because of its capabilities and references to things like how Capcom got Resident Evil 5 running on the 3DS when it couldn't run on the Wii.
 

z0m3le

Banned
Oh yeah, there were at least 2 people I remember that were expecting a PS4 handheld. That was very early on in speculation though after the first Eurogamer report.

The GPU FLOPS estimates were constrained to a Pascal Tegra, especially since NateDrake told us he heard it was using Pascal, and Parker was the only reference, with 768 GFLOPS max.

So, funny enough, there are random people who think we had insane expectations for what has now been revealed, when, going by insiders, we thought 2x Wii U would be the minimum for power.

I wanted at least 2x Wii U minimum since it'd be cool to have a handheld twice as powerful as Nintendo's last console. Oh well, it's still a huge jump from the 3DS.

Until things get cleared up, though, it's as though we're going through exactly what happened with the 3DS, where people were comparing it to the 360 because of its capabilities and references to things like how Capcom got Resident Evil 5 running on the 3DS when it couldn't run on the Wii.

For a handheld with 2 SMs, it is still over 2x Wii U.
 

ggx2ac

Member
For a handheld with 2 SMs, it is still over 2x Wii U.

I don't know the exact details but I get that the Maxwell microarchitecture isn't 1:1 with what the Wii U's GPU was.

And of course we still have to wait for the actual components to be revealed to compare.

So it could technically still be 2x Wii U, with better memory bandwidth, deferred rasterizer tech, 64-bit ARM CPUs (whereas the Wii U still used 32-bit), etc.

All while possibly running at 5W, seeing as it's downclocked by a lot, while the Wii U was around 35W IIRC.
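
As a very rough sanity check on the "2x Wii U" framing (my own numbers: this assumes the commonly cited ~176 GFLOPS FP32 estimate for the Wii U's GPU, which is not an official figure, plus the 256-core and 768/307.2MHz numbers from the leak, and it ignores FP16 and any per-flop architectural advantages):

```python
WIIU_GFLOPS_FP32 = 176.0       # commonly cited estimate, not an official figure
SWITCH_CUDA_CORES = 256        # per the leaked spec quoted earlier in the thread
FLOPS_PER_CORE_PER_CLOCK = 2   # one FMA = 2 FP32 ops

def switch_fp32_gflops(clock_mhz):
    return SWITCH_CUDA_CORES * FLOPS_PER_CORE_PER_CLOCK * clock_mhz / 1000.0

for mode, clock_mhz in [("portable", 307.2), ("docked", 768.0)]:
    ratio = switch_fp32_gflops(clock_mhz) / WIIU_GFLOPS_FP32
    print(f"{mode}: ~{ratio:.1f}x Wii U in raw FP32")
# portable: ~0.9x, docked: ~2.2x -- so in raw FP32 alone the "2x Wii U" claim
# leans on docked clocks; FP16 and per-flop efficiency would shift it further.
```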
 
Oh yeah, there were at least 2 people I remember that were expecting a PS4 handheld. That was very early on in speculation though after the first Eurogamer report.

The GPU FLOPS estimates were constrained to a Pascal Tegra, especially since NateDrake told us he heard it was using Pascal, and Parker was the only reference, with 768 GFLOPS max.

So, funny enough, there are random people who think we had insane expectations for what has now been revealed, when, going by insiders, we thought 2x Wii U would be the minimum for power.

I wanted at least 2x Wii U minimum since it'd be cool to have a handheld twice as powerful as Nintendo's last console. Oh well, it's still a huge jump from the 3DS.

Until things get cleared up, though, it's as though we're going through exactly what happened with the 3DS, where people were comparing it to the 360 because of its capabilities and references to things like how Capcom got Resident Evil 5 running on the 3DS when it couldn't run on the Wii.

Counting architectural differences, supposedly it will be around 1.5x as powerful undocked, and with dock mode it could be up to 4x. A shame dock mode is just going to increase resolution only, though.

I wonder if there will be games that will take advantage of dock mode power only.
 

Vic

Please help me with my bad english
Counting architectural differences, supposedly it will be around 1.5x as powerful undocked, and with dock mode it could be up to 4x. A shame dock mode is just going to increase resolution only, though.

I wonder if there will be games that will take advantage of dock mode power only.
Forcing the resolution to 1080p when in console mode might not be a strict requirement. 900p + better image filtering techniques could happen.
 
I'm amazed that people are disappointed by the specs. What were people expecting from Nintendo? They've been taking the low-spec route for twelve years now.
It's one thing for us to expect them to use weak hardware to save money. It's another thing for us to expect them to design a custom SoC just to downclock said hardware. I'm not sure how we could have seen that coming haha.
 

Speely

Banned
Forcing the resolution to 1080p when in console mode might not be a strict requirement.

It's absolutely not, and likely never will be for this iteration of the Switch. The EG piece even details how devs can intentionally "throttle" (my paraphrasing, not theirs, AFAIK) docked performance if they just want to target the portable mode and call it a day.

Edit: Just saw your edit... I see that you were remarking upon devs utilizing other benefits as opposed to increased resolution, not just commenting on a forced rez increase. Apologies for the misunderstanding.
 

ggx2ac

Member
Forcing the resolution to 1080p when in console mode might not be a strict requirement. 900p + better image filtering techniques could happen.

Plus, they made it easier for smaller devs by not requiring them to upclock the Switch's GPU and RAM while docked, so there'll be games with no change at all while docked. That might anger some people. lol

I'm thinking Pokémon Stars will be the first to do this.
 
Plus, they made it easier for smaller devs by not requiring them to upclock the Switch's GPU and RAM while docked, so there'll be games with no change at all while docked. That might anger some people. lol

I'm thinking Pokémon Stars will be the first to do this.

Watch Stars be GCN-level 3DS graphics at 720p to 1080p resolution.

The salt mountain will be seen from space by astronauts...
 