That's why you need a fan.
lol
probably could've used an internal fan
of course I'm late
Rather than customising the SoC to remove them, couldn't they customise it to let them run alongside the higher-powered cores and handle background tasks like the OS? Not that you'd need four A53s to run an OS, I suppose, but a couple of them would do it with a tiny power draw.
That's because, from all rumours and circumstantial evidence, it seems like 20nm is a bad node. The only advantage over 28nm is a smaller footprint. Its disadvantages are high cost, mediocre perf/watt, and subpar thermals (it doesn't run much cooler than 28nm, like it should). It's the reason Qualcomm retained an improved 28nm process for midrange chips like the 650/652 while moving to 14nm for the 820/821. They could have picked 20nm, but they stuck a fan in the Switch, which shouldn't be necessary at clocks this low.
TSMC apparently cut the price of its 28nm and 20nm nodes by 10% a few months ago, but I assume 28nm is still the cheaper of the two.
I was following your post on this topic, and it led me to the same question: why was that 1 TFLOPS (fp16) figure in a recent briefing? And why was there a clock-speed doc leak without a listing for the CPU type, core count, CUDA core count, etc.? I don't think it's far-fetched to assume that Nintendo is aiming for the performance of that TX1 kit via customizations.

OK, I can tell you're just going to repeat the same thing over and over and ignore anything I say, so I won't bother anymore; life is too short to keep repeating myself.
I'll just say it's going to be interesting to see the whole picture, because like I said, Nintendo don't do off-the-shelf GPUs, and downclocks alone don't make a custom GPU.
Also, I take the fact that the Tegra X1's A53 cores haven't been mentioned in any dev leaks as a sign they're reserved for the OS and not dev-accessible.
Actual switch developers have been confirmed by digital foundry to have been briefed with those same specs. It doesn't matter if they were pulled from another development board. Devs working on the system have confirmed that's what they are working with.
The lesson from this is never trust in "insiders".
We don't know whether the actual dev kits have a small form factor similar to a retail Switch's, in which case having a fan makes sense.
Not that I recall. Nate said it was a TX1 Maxwell; Emily said it was close to the XBONE, which was confirmed by Laura, though that doesn't seem to be the case anymore; OsirisBlack said it would be able to handle XBONE/PS4 ports without much problem. But yeah, we had been missing this downclock gift from Nintendo all this time.
It was emphasized that the briefing happened recently.

DF said developers have been briefed on those TX1 devkit specs. We don't know when, though. It could very well have been when those TX1 devkits were sent out before July.
The fact that DF is explicitly saying there are likely customizations they don't know about clearly says they're not presenting those specs as any sort of confirmation.
This thing will be marketed as a home console on the go. If multi-platform games are going to be severely gimped, or in some cases not even possible to get running, it's going to hurt it a lot, like it did the Vita.
It was emphasized that the briefing happened recently.
They said as much in their Youtube Video. They only have a source for the clocks, and that the Dev Kits line up with the leaked specs.
It'll basically just be another Nintendo handheld and perform as such in sales if they are lucky.
I do think it had the added benefit of Nintendo's entire output, as well as console-only features like local multiplayer on one unit right out of the box.
It makes no sense that developers would know the clock speeds but not the core counts; the people who leaked these clocks (not Eurogamer, but their source) are causing a lot of needless speculation here. Clock speeds without CPU and GPU core counts aren't going to give developers performance numbers to target. I hope someone comes forward with core counts. As for the whole 28nm vs. 20nm debate, 16nm is more likely than 28nm; 28nm is just really out there. Not to say it's impossible, but it's wacky tinfoil-hat crazy.
The pie-in-the-sky scenario is that Nintendo was approximating the performance of the final lower-clocked custom chip, which has more SMs/cores, by using a fully clocked TX1 in the dev kits that was readily available for people to work on.
Is it possible that the dev kits were stock X1s overclocked as speculated (because of the loud fan) and that Nintendo was just approximating a performance target that they would later downclock and use more SMs to hit?
I am asking because early rumors mentioned a 3-hour gaming battery life, and according to NateDrake they are now targeting 5-8 (don't give me any NateDrake guff... sources can be mistaken and plans can change).
If the original dev kits were running at higher clocks with standard X1s, 3 hours sounds about right since dev kits don't need good battery life. They just need to hit performance targets on the same architecture, right? High clock speeds eat up power quickly.
So cheap X1s running at higher clocks emulating the performance of lower-clocked final chips with more SMs (which would be more power-efficient than higher clocks) at least seems like a possibility to me as a cheaper and faster way to get devs up and running while still optimizing for a less power-hungry final retail model.
If it doesn't end up happening, I will still be excited, but I am tentatively considering a 3 SM setup right now. It's doable and makes more sense to me than Nintendo considerably downgrading the device after dev kits went out (this is assuming that the X1s were indeed overclocked and not just so inefficient that they needed fans.)
I said the same on the previous page. It's unlikely but not entirely implausible.
If X1 dev kits were clocked at 1.152GHz, that would give you exactly the same performance as 3 SMs at 768MHz (590 GFLOPS fp32, or 1.18 TFLOPS fp16), and the 3 SM setup would draw much less power. That fits with everything we heard before; it just isn't something we know, so assuming it makes us more optimistic than maybe we should be. 393 GFLOPS might sound like a big downgrade, but it is still just outside the pessimistic view of the Pixel C. The handheld performance actually doesn't matter much, since it is worked out to perform identically to the docked performance at a lower resolution. My big "hope" is that they went with six A57 cores @ 1GHz rather than four. Four is enough to port games, but it will be a hassle, and when we've been hearing the opposite for months, it is difficult to figure out. Six A57 cores @ 1GHz are similar to six Jaguar cores around 1.5GHz, but that is a pretty rough estimation.
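For anyone wanting to check the arithmetic behind those figures, here's a quick sketch. It assumes the usual Maxwell layout (128 CUDA cores per SM, 2 fp32 FLOPs per core per cycle via FMA, fp16 at double the fp32 rate on TX1); none of this is confirmed for the final chip.

```python
def gflops_fp32(sms, clock_mhz, cores_per_sm=128, flops_per_core=2):
    """Peak fp32 GFLOPS for a Maxwell-style GPU (assumed layout)."""
    return sms * cores_per_sm * flops_per_core * clock_mhz / 1000.0

# 3 SMs at the leaked 768 MHz docked clock:
three_sm = gflops_fp32(3, 768)    # ~589.8 GFLOPS fp32 (~1.18 TFLOPS fp16)

# A stock 2 SM TX1 would need ~1152 MHz to match that:
two_sm_oc = gflops_fp32(2, 1152)  # ~589.8 GFLOPS

# 2 SMs at 768 MHz (the pessimistic reading of the leak):
two_sm = gflops_fp32(2, 768)      # ~393.2 GFLOPS
```

So the 1.152GHz and 393 GFLOPS numbers in the post both fall straight out of the same cores × 2 × clock formula.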
Just wanted to ask, as I have read it a time or two: can lower-clocked chips with more SMs improve battery life compared to higher-clocked chips with fewer SMs?
^ Make that two.
I wonder if anyone here can try to get in touch with the author of the DF article on Twitter or something to see if he can clarify what exactly they mean when they say "developers have recently been briefed" on those stock TX1 specs. Clearly, based on their speculation about CUDA cores, they don't consider those specs confirmed, but I'm curious whether it just refers to recent devkits.
There's also another scenario that we can consider, and it's Mario Switch's performance. We heard from LKD or Emily (I think) that performance wasn't good until recently (September-November), and then it got fixed. Now, I know that simple optimization and new builds can fix these issues, and it's most likely just that... but what if the older build was on older kits with a stock TX1 (512 GFLOPS) and the new one is on the final hardware with 3 SMs? At the clock speed revealed by DF, that would be 590 GFLOPS.

I said the same on the previous page. It's unlikely but not entirely implausible.
Yeah, the CPU is the most underwhelming part for sure. I really didn't expect that clock (my worst-case scenario was 1.4GHz).

Yeah, the more I consider the GPU part of this rumor, the less important it feels. It was never going to be that high in portable mode; around 200 GFLOPS would seem appropriate if the TX1 was at its max clock when docked.
But the CPU situation sounds like a real problem, especially in light of everything we've been hearing. Hopefully they customized it with 2-4 additional A57s at the very least.
Just wanted to ask, as I have read it a time or two: do lower-clocked chips with more SMs have better battery life than higher-clocked chips with fewer SMs?
Based on a chart posted by Thraktor a few months ago, yes, this is accurate. I think this is because power draw increases superlinearly as clocks increase (voltage has to rise along with frequency) but only linearly as SMs increase.
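A back-of-the-envelope sketch of why that chart looks the way it does. It uses the classic CMOS dynamic-power model, P ∝ units × V² × f, with the simplifying (and rough) assumption that voltage scales linearly with clock speed; the constants are arbitrary, so only the ratio between the two configurations means anything.

```python
def relative_power(sms, clock_ghz, v_per_ghz=1.0):
    """Relative dynamic power under P ~ units * V^2 * f.

    Assumes voltage scales linearly with frequency; absolute
    units are arbitrary, only ratios are meaningful.
    """
    voltage = v_per_ghz * clock_ghz
    return sms * voltage**2 * clock_ghz

narrow_fast = relative_power(2, 1.152)  # 2 SMs, overclocked
wide_slow = relative_power(3, 0.768)    # 3 SMs at the leaked clock

# Same peak throughput (2 * 1.152 == 3 * 0.768 SM-GHz), but the wide
# config draws less than half the power under this model:
print(wide_slow / narrow_fast)  # ~0.44
```

That 0.44 ratio is just (768/1152)³ × 3/2: clocks enter roughly cubed, SM count only once, which is the whole wide-and-slow argument in miniature.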
It would also be easier to cool, since you have the same heat spread over more surface area. On the other hand, it'll be more costly to manufacture.
28nm is so asinine I'm having a hard time formulating a response to it. I know Nintendo is seen as using old tech, but you'd have to believe they went to Nvidia, saw the finished design of the 20nm Tegra X1, and said, "Hey, can we pay you a bunch of unnecessary money to make that run even shittier on an older process node?"
Given what we know, I don't even think more than 2 SMs is unlikely. Otherwise I'm somewhat perplexed by the inclusion of the fan.
Even though I'm confident in 2 SMs, I just thought I would throw this out: what about the possibility of a 16nm chip with 4 SMs? Would that account for the fan and the clocks? It would also explain why the system could have originally been running on an overclocked 2 SM setup.
The lesson from this is never trust in "insiders".
Since this is the first time they've built a new console from scratch since the GameCube, looking at their decisions there might be a good way to determine their thinking here. Was the GameCube built on the most modern process node of its day?

Although, since this is more like a portable, that might not be the most comparable situation. But with it being a portable, you'd expect them to go with the smaller, more efficient process node to begin with.
Who the hell knows...
My lesson was "never trust anybody but LKD, not even the Eurogamer & Digital Foundry guys". Eurogamer has been wrong a couple of times before (where is my Mother 3?), and I don't know who the Digital Foundry people are. LKD's track record still looks like the best one; she has yet to be wrong. And if she says the Switch is much more powerful than the Wii U in portable mode and Dark Souls 3 is running on the Switch at a satisfactory level, I believe her. Ignore everything else.