I can't limit the GPU clocks, unfortunately. The kernel ignores the request, and the same goes for memory clocks. I think I'd need to modify and rebuild the kernel, but that's beyond my abilities hah. Beyond that, I guess the only option would be getting the chip hot enough that the GPU thermally throttles down to my desired speed, like taking a blow dryer to the Shield TV haha! I might be crazy enough to do that.
I'm not sure it's worth going to such lengths (particularly as it would be hard to infer too much from a GPU clock that's not perfectly stable).
Thraktor, you might need to turn on triple buffering when comparing with my results, because Android forces it on (another thing I can't turn off, it seems!), which explains the lack of tearing on the Shield TV. Not sure how much that will affect the tests.
I suspect the demos are triple-buffered anyway (no tearing on my end), but I'll check when I'm back on my PC. I should also note that they run windowed on PC, so when I say 1080p I mean a maximised window, and the actual rendered resolution is a little lower (maybe by 5-10%).
What's the die size of the TX1? Some folks have floated a figure of 121mm², but I can't find a solid source for it.
I'm fairly certain the chip measures 11mm x 11mm, so 121mm². The numbers in your post are the correct measurements for the package, AFAIK. When I first bought my Shield TV, some forum posters at SemiAccurate and Beyond3D were tearing theirs apart and taking measurements too.
There's an iFixit teardown of the Pixel C, and if you go down to step 10 you can see the logic board, including the TX1. Comparing to the DRAM chips above it, it would seem that the 11mm x 11mm measurement is accurate, so 121mm² it is. (I had misremembered this as the measurement for the package, but that certainly looks like an exposed die).
I think we're safe from TN. Nintendo experimented with IPS panels in the New 3DS, and I didn't see too many complaints. Nintendo's newest releases indicate the company is shifting to IPS for its screens.
I'd be a bit surprised if it's not IPS. Sourcing of 3DS screens was constrained by the 3D tech, whereas this is a straightforward, standard-resolution panel, so they shouldn't have any issues finding suppliers. The other thing is that TN screens seem to have pretty much disappeared at this screen size, with even budget phones almost all using IPS these days. IPS is effectively the "cheap" phone display technology now, with OLED replacing it as the more expensive option.
The one thing we really should be looking out for on the display front, though, is adaptive sync (i.e. G-Sync, though probably not called G-Sync). For a fully integrated device like the Switch, where Nintendo gets to choose and/or design everything from the OS to the APIs to the GPU to the display controller to the panel itself, supporting adaptive sync isn't actually all that expensive. There's an up-front R&D cost, to be sure, but Nvidia has obviously already done most of the R&D necessary. From that point it's a matter of using a display controller that properly supports adaptive sync, and then getting all your software ducks in a row.
My only worry is that because Nintendo's internal software teams do an exceptionally good job of locking frame rates to either 30fps or 60fps, they might not see the value of adaptive sync. For third-party games it could be a huge benefit, though. Those inevitable not-quite-30fps ports would feel a hell of a lot smoother on an adaptive sync display than on a fixed refresh display, even if the actual frame rate is identical. Compared to using a bigger GPU (which costs money) or a higher portable clock speed (which costs battery life), adaptive sync is an exceptionally cheap way to give players a smoother experience in portable mode, so let's hope they go with it.
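To make the frame-pacing argument concrete, here's a toy simulation (entirely my own illustration, not based on any Nvidia or Nintendo implementation): a game whose frames each take ~37ms to render, presented on a fixed 60Hz display versus an adaptive-sync one. On the fixed display a finished frame has to wait for the next vblank, so on-screen intervals alternate between ~33ms and ~50ms; with adaptive sync every interval is a uniform 37ms, which is why the same frame rate feels smoother.

```python
import math

def present_times(frame_ms, n_frames, refresh_ms=None):
    """Return the timestamps (ms) at which frames appear on screen.

    With a fixed refresh (refresh_ms set), a finished frame waits for the
    next vblank; with adaptive sync (refresh_ms=None), it is scanned out
    as soon as it's ready. Assumes rendering is never blocked (i.e. the
    triple-buffered case discussed above).
    """
    times = []
    t = 0.0
    for _ in range(n_frames):
        t += frame_ms  # frame finishes rendering
        if refresh_ms is None:
            times.append(t)  # adaptive: display immediately
        else:
            # fixed: snap to the next refresh boundary
            times.append(math.ceil(t / refresh_ms) * refresh_ms)
    return times

def intervals(ts):
    return [b - a for a, b in zip(ts, ts[1:])]

fixed = intervals(present_times(37.0, 10, refresh_ms=1000.0 / 60))
adaptive = intervals(present_times(37.0, 10))
print("fixed 60 Hz intervals:   ", [round(x, 1) for x in fixed])
print("adaptive sync intervals: ", [round(x, 1) for x in adaptive])
```

The fixed-refresh run shows the characteristic judder pattern (a run of two-refresh intervals with a three-refresh hiccup thrown in), while the adaptive run is perfectly even.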
Creating a bespoke API is no small undertaking.
I don't know how much of a bespoke API they've actually created, to be honest. Compared to past hardware, where they really did need to create APIs from scratch (or nearly so) to accommodate the new silicon, they could end up in a far better software and tools position than they've ever been in before, with far less work involved.
From a graphics API perspective, Vulkan already fits pretty much all their needs, and given that they've joined the Vulkan working group and certified Switch as conformant to the Vulkan spec, it certainly seems they're using it as their main graphics API. On the tools side, Nvidia already has a full (and reportedly excellent) set of development and debug tools which should be usable for Switch with minimal customisation. GameWorks also includes plenty of graphics and physics libraries which should be usable on Switch with, again, relatively minimal work.
I suspect the 500 man-years figure does include quite a lot of software work (possibly even work Nvidia was already doing anyway, such as bringing GameWorks to Vulkan), but if we compare the amount of work Nintendo and Nvidia need to put into software for Switch against what Nintendo would have needed to do to get previous consoles to the same level, Switch is a bit of a cakewalk for them.