
Nintendo Switch Dev Kit Stats Leaked? Cortex A57, 4GB RAM, 32GB Storage, Multi-Touch.


antonz

Member
I didn't see those tests earlier. Those are very interesting results. I admit I'm a bit surprised at how often we're seeing 768MHz. It does seem like they were really focused on balancing the clocks to the maximum stabilized speed. Is the Switch also a smaller form factor compared to the Shield TV?

Switch is roughly the same size as the new smaller basic Shield TV model, but it is only half an inch thick versus the Shield TV being an inch thick.
 

Donnie

Member
The new Shield TV is quite a bit shorter and narrower than Switch. It's thicker though, which makes it around the same size as Switch overall.*

*Based on generally accepted estimates of Switch's dimensions.
 
Roughly half the depth.

It appears that it is a smaller form factor, but that is only based on appearances.

Will go with Vena on this one.

Switch is roughly the same size as the new smaller basic Shield TV model, but it is only half an inch thick versus the Shield TV being an inch thick.
Thanks. And the Switch has a screen and runs on batteries. Considering the little space they had, can we be sure that they didn't use 16nm to reach the base performance that we're looking at?
 

Donnie

Member
In one of my tests, if the CPU was actively used when clocked at 1GHz, then the GPU would throttle depending on how high the CPU usage was.

It's because of these tests, the way the GPU throttles quite easily when the CPU uses a lot of power, and the Eurogamer clock speed leaks that I've come to the conclusion that Nintendo / Nvidia have used a TX1 as a base and done light modifications (HDMI spec difference, USB-C stuff that isn't present in the TX1 but that we know is in the patents), though hopefully they did increase the memory bus width. They don't really need to do anything else to the GPU. If so, it could be viewed as a sort of Parker design but on the 20nm node.

R&D costs are cheaper because the TX1 is a well-designed chip in the first place, meaning there's less to change and modify, I believe.

Well I agree your tests suggest a similar node and power draw, and certainly it's going to be based on that SoC. But I still expect it to be custom, more so than just input/output connectors. I think if it was simply a TX1 they'd have called it a TX1. Even if it's just a few small changes within the GPU and a cache redesign, it's going to be a custom SoC.
 

antonz

Member
Just to clarify on size too: the Joy-Cons certainly make it larger, but all the hardware is within the tablet, so I feel it's more comparable.

Basic Shield TV 2017
Height: 3.858 inches
Width: 6.26 inches
Depth: 1.02 inches

Estimated tablet size of Switch
Height: 4.19 inches
Width: 5.83 inches
Depth: 0.55 inches
 

Donnie

Member
Shield TV 2017:-

Height: 98 mm (3.85 inches)
Width: 159 mm (6.259 inches)
Depth: 25 mm (1.01 inches)

Switch*

Height: 106 - 115 mm (4.17 - 4.53 inches)
Width - 184 - 196 mm (7.24 - 7.71 inches)
Depth - 15 mm (0.59 inches)

*Size based on estimates:

http://imgur.com/a/u6Vhp
http://arstechnica.co.uk/gaming/2016/10/how-big-is-the-nintendo-switch-screen/

EDIT: Hadn't seen your post before posting, Antonz, so not trying to correct you. Though I think the number you have for Switch width is due to a mistake in the Ars Technica article.
 

AmyS

Member
In one of my tests, if the CPU was actively used when clocked at 1GHz, then the GPU would throttle depending on how high the CPU usage was.

It's because of these tests, the way the GPU throttles quite easily when the CPU uses a lot of power, and the Eurogamer clock speed leaks that I've come to the conclusion that Nintendo / Nvidia have used a TX1 as a base and done light modifications (HDMI spec difference, USB-C stuff that isn't present in the TX1 but that we know is in the patents), though hopefully they did increase the memory bus width. They don't really need to do anything else to the GPU. If so, it could be viewed as a sort of Parker design but on the 20nm node.

R&D costs are cheaper because the TX1 is a well-designed chip in the first place, meaning there's less to change and modify, I believe.

I really hope they increased the memory bus width. If the Switch uses the standard 64-bit bus that Tegra X1 has, that means Nintendo has stuck with a 64-bit memory bus for 5 console generations in a row (Nintendo 64, GameCube, Wii, Wii U and now very likely the Switch, unless they took elements of Parker like the 128-bit bus).

Although this has absolutely no bearing on Switch (just a bit of trivia, no pun intended) Nvidia introduced a 128-bit memory bus in 1997 with the RIVA 128 card.
 

Donnie

Member
It's better to have a narrower bus at a higher frequency than a wider bus at a lower frequency (given the same overall bandwidth). So an increased clock frequency is the better way to go compared to widening the memory bus.

Of course, considering we already know Switch's memory speed, obviously the wider the bus the better. Still think we'll get some kind of new cache design in the SoC that will alleviate main memory bandwidth pressure, though.
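To put rough numbers on that equivalence (the figures below are illustrative, not confirmed Switch specs):

```python
# Peak bandwidth (GB/s) = bus width (bits) / 8 * transfer rate (MT/s) / 1000.
# Illustrative numbers only, not confirmed Switch memory specs.

def peak_bandwidth_gbs(bus_bits: int, mts: float) -> float:
    """Theoretical peak bandwidth in GB/s for a given bus width and data rate."""
    return bus_bits / 8 * mts / 1000

print(peak_bandwidth_gbs(64, 3200))   # 25.6 GB/s - narrow bus, fast RAM
print(peak_bandwidth_gbs(128, 1600))  # 25.6 GB/s - wide bus, slower RAM: same total
print(peak_bandwidth_gbs(128, 3200))  # 51.2 GB/s - wide AND fast
```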
 

antonz

Member
Shield TV 2017:-

Height: 98 mm (3.85 inches)
Width: 159 mm (6.259 inches)
Depth: 25 mm (1.01 inches)

Switch*

Height: 106 - 115 mm (4.17 - 4.53 inches)
Width - 184 - 196 mm (7.24 - 7.71 inches)
Depth - 15 mm (0.59 inches)

*Size based on estimates:

http://imgur.com/a/u6Vhp
http://arstechnica.co.uk/gaming/2016/10/how-big-is-the-nintendo-switch-screen/

EDIT: Hadn't seen your post before posting, Antonz, so not trying to correct you. Though I think the number you have for Switch width is due to a mistake in the Ars Technica article.

I do actually agree with you looking at the width. A quick look at 6.2" LCDs shows an average width of around 155mm. So yeah, I would say we're looking at a minimum of 7 inches of width, because they are completely missing not only the bezel but the extra space around the bezel.
 

bumpkin

Member
I wonder when Nintendo is going to add Switch to the Developer portal. It's still just Wii U and 3DS, no mention of the Switch anywhere. :(
 

Vash63

Member
I really hope they increased the memory bus width. If the Switch uses the standard 64-bit bus that Tegra X1 has, that means Nintendo has stuck with a 64-bit memory bus for 5 console generations in a row (Nintendo 64, GameCube, Wii, Wii U and now very likely the Switch, unless they took elements of Parker like the 128-bit bus).

Although this has absolutely no bearing on Switch (just a bit of trivia, no pun intended) Nvidia introduced a 128-bit memory bus in 1997 with the RIVA 128 card.

Memory bus width is based on the actual bus width of the RAM multiplied by the number of parallel chips. It is not a spec that improves with technology, as a wider bus also means more board traces and memory chips - something that doesn't come for free with newer process nodes. Bus width correlates better with cost than with technical generation.
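A tiny sketch of that arithmetic (32-bit channels are typical for LPDDR4; the chip counts are illustrative, not confirmed Switch details):

```python
# Total bus width = per-chip (or per-channel) width * number of chips in parallel.
def total_bus_width(per_chip_bits: int, num_chips: int) -> int:
    return per_chip_bits * num_chips

print(total_bus_width(32, 2))  # 64-bit:  e.g. a TX1-style two-chip configuration
print(total_bus_width(32, 4))  # 128-bit: twice the chips, traces and board cost
```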
 

M3d10n

Member
I wonder when Nintendo is going to add Switch to the Developer portal. It's still just Wii U and 3DS, no mention of the Switch anywhere. :(

Definitely not before the presentation since anyone can create a developer account and read all the dev documents, which would reveal all the specs right away.
 

Mr Swine

Banned
Hmm, isn't it possible that they have gone with a 128-bit bus if they plan to release a New Switch down the road with more memory, like they did with the DSi and New 3DS? They can't do that with a 64-bit bus.
 

LordOfChaos

Member
In one of my tests, if the CPU was actively used when clocked at 1GHz, then the GPU would throttle depending on how high the CPU usage was.

It's because of these tests, the way the GPU throttles quite easily when the CPU uses a lot of power, and the Eurogamer clock speed leaks that I've come to the conclusion that Nintendo / Nvidia have used a TX1 as a base and done light modifications (HDMI spec difference, USB-C stuff that isn't present in the TX1 but that we know is in the patents), though hopefully they did increase the memory bus width. They don't really need to do anything else to the GPU. If so, it could be viewed as a sort of Parker design but on the 20nm node.

R&D costs are cheaper because the TX1 is a well-designed chip in the first place, meaning there's less to change and modify, I believe.



I have a feeling you're right about the lighter modifications. The self-admitted 500 people-years (250 people x 2 years) really only sounds impressive until you compare it to practically any modern chip, and if you include making a whole new API in there (even with Vulkan/Mantle as a base to work off), it's really not much. On a scale of TX1 to fully custom I'd expect it to be much closer to the TX1 end, then.
If there's a lot more to it, I'd happily be wrong!

And agreed on memory too; it's been Nintendo's concern since the N64 and it may be one of the things they touch on. This is a good thing: bandwidth is where mobile has been strangled, even while the chips themselves surpassed the last generation of consoles in raw power.
 

Narroo

Member
I have a feeling you're right about the lighter modifications. The self-admitted 500 people-years (250 people x 2 years) really only sounds impressive until you compare it to practically any modern chip, and if you include making a whole new API in there (even with Vulkan/Mantle as a base to work off), it's really not much. On a scale of TX1 to fully custom I'd expect it to be much closer to the TX1 end, then.
If there's a lot more to it, I'd happily be wrong!

And agreed on memory too; it's been Nintendo's concern since the N64 and it may be one of the things they touch on. This is a good thing: bandwidth is where mobile has been strangled, even while the chips themselves surpassed the last generation of consoles in raw power.
Maybe they meant 500 man-years in terms of time charged? If that's the case, it's more like 8.5 years for 250 people. Usually man-hours and man-years are quoted in time worked, not time employed, though I could be wrong.
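For what it's worth, here is roughly where a figure like 8.5 years could come from, assuming a standard 2080-hour work year (an assumption about what's meant, not anything Nvidia has stated):

```python
# Two readings of "500 man-years", assuming a work year of 52 weeks * 40 h = 2080 h.
people = 250
work_year_hours = 52 * 40        # 2080 h: the usual man-year definition
calendar_year_hours = 365 * 24   # 8760 h

# Reading 1: 500 man-years of work time -> calendar duration for 250 people
print(500 * work_year_hours / (people * work_year_hours))      # 2.0 years

# Reading 2: 500 calendar-years of effort, delivered as 40-hour weeks by 250 people
print(500 * calendar_year_hours / (people * work_year_hours))  # ~8.4 years
```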
 

Hermii

Member
Maybe they meant 500 man years in terms of time charged? If that's the case, it's more like 8.5 years for 250 people. Usually man-hours and man-years are quoted in time worked, not time employed, though I could be wrong.

That sounds like a hell of a lot, I doubt it. I think a 250-person team working for 2 years sounds like a lot too :p

I'm on team lightly-modified TX1.
 

Donnie

Member
I'm sure it's not going to be nearly as custom as Wii U's SoC. But I'm still going for significant modifications. I mean, the Pica200 had a whole set of instructions and eDRAM added and Nintendo still didn't consider it to be a custom GPU. I see no reason for them to claim this SoC is custom if that's not the case.

Only 12 hours left until the reveal, I suppose. I mean, I don't expect specs to be announced, but surely there'll be a lot of developer interviews and we may find something out...
 

usmanusb

Member
I'm sure it's not going to be nearly as custom as Wii U's GPU. But I'm still going for significant modifications. I mean, the Pica200 had a whole set of instructions and eDRAM added and Nintendo still didn't consider it to be a custom GPU. I see no reason for them to claim this GPU is custom if that's not the case.

Only 12 hours left until the reveal, I suppose. I mean, I don't expect specs to be announced, but surely there'll be a lot of developer interviews and we may find something out...

Mostly, Nintendo describes these things modestly, but in reality it's more than modest.
 

usmanusb

Member
What is your opinion on using the Switch for VR, in a similar manner to Google Cardboard but with some modifications? It would be an affordable headset.
 

z0m3le

Banned
Well, Switch is almost exactly the same internal area as the new Shield TV, which will have a 2GHz CPU (using 4x as much power to run as an A57 at 1GHz). Didn't MDave's tests show that with the CPU at 1GHz no GPU throttling occurred (full 1GHz)? So cooling a similarly performing GPU locked at 768MHz should be no issue at all with that CPU speed in Switch's casing.

I think MDave's tests help explain why Nintendo chose a 1GHz CPU (as well as the massive increase in power draw at higher speeds). But I don't think they do anything to suggest Switch's GPU is simply a Tegra X1 GPU. I still expect a custom GPU and see no reason to change that view at all.

Shield TV is 26mm thick (actually 1mm thicker than the Shield TV from 2015); Switch is 15mm thick, so I don't know where you got that info from. Switch has a much smaller area in which to cool a similar power consumption, plus the need to charge a battery and drive the screen... which I didn't factor into the estimation of 13 watts, so there is that...

PS: the battery takes up space, it won't be available for airflow, and it's going to be something like twice as big as the 3DS batteries. I'm also still thinking the idea that Nintendo and Nvidia would order tens of millions of Switch chips at 20nm is pretty crazy, but if it can work, then that is what it is. I'm just skeptical that the smaller Switch can cool as well as the larger Shield TV.
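On the quoted "4x the power at 2GHz" figure: dynamic power scales roughly with V^2 * f, so doubling the clock plus the extra voltage it needs lands around 4x. The voltages below are assumptions for illustration, not measured A57 numbers:

```python
# Back-of-envelope check of the "2 GHz needs ~4x the power of 1 GHz" claim.
# Dynamic power ~ C * V^2 * f; voltages here are illustrative assumptions.

def relative_power(freq_ghz: float, volts: float, base_freq=1.0, base_volts=0.8) -> float:
    return (freq_ghz / base_freq) * (volts / base_volts) ** 2

print(relative_power(1.0, 0.8))  # 1.0x  baseline: 1 GHz at an assumed 0.8 V
print(relative_power(2.0, 1.1))  # ~3.8x: 2 GHz at an assumed 1.1 V
```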
 

Donnie

Member
Shield TV is 26mm thick (actually 1mm thicker than the Shield TV from 2015); Switch is 15mm thick, so I don't know where you got that info from. Switch has a much smaller area in which to cool a similar power consumption, plus the need to charge a battery and drive the screen... which I didn't factor into the estimation of 13 watts, so there is that...

I didn't say it was as thick; I'm talking about the overall size. Switch is slimmer, but it's also taller and wider than the new Shield TV. Most estimates have it somewhere between 80-90% of the overall area of the Shield TV.
 

LordOfChaos

Member
Maybe they meant 500 man-years in terms of time charged? If that's the case, it's more like 8.5 years for 250 people. Usually man-hours and man-years are quoted in time worked, not time employed, though I could be wrong.


Which would scale comparisons to any other chip-making effort accordingly, in which case I wouldn't have to change my point. I don't think it's that, though, as it would scale chips that took 600 engineers 5 years to crazy proportions:

http://www.investopedia.com/terms/m/manyear.asp

A method of describing the amount of work done by an individual throughout the entire year. The man-year takes the amount of hours worked by an individual during the week and multiplies it by 52 (or the number of weeks worked in a year).

Anyway, we don't really know if that's all engineers, or a mix of marketing and management, or if all of those 500 man-years went to Switch or some of them include progress on general GPU technology, etc.
 

MuchoMalo

Banned
I have a feeling you're right about the lighter modifications. The self-admitted 500 people-years (250 people x 2 years) really only sounds impressive until you compare it to practically any modern chip, and if you include making a whole new API in there (even with Vulkan/Mantle as a base to work off), it's really not much. On a scale of TX1 to fully custom I'd expect it to be much closer to the TX1 end, then.
If there's a lot more to it, I'd happily be wrong!

And agreed on memory too; it's been Nintendo's concern since the N64 and it may be one of the things they touch on. This is a good thing: bandwidth is where mobile has been strangled, even while the chips themselves surpassed the last generation of consoles in raw power.

Okay, I'm surprised that, for how many times this has been mentioned, nobody has said anything, but... I don't think that 250 people were working 24/7 for two years straight. Also, everyone keeps comparing this to making a new GPU core from the ground up...


Edit: Never mind on the first part, but still, does it really take 2 years just to decide on clock speeds? I don't think it does, or else skipping 20nm would have been a much bigger setback for AMD and Nvidia.
 

z0m3le

Banned
Shield TV 2017:-

Height: 98 mm (3.85 inches)
Width: 159 mm (6.259 inches)
Depth: 25 mm (1.01 inches)

Switch*

Height: 106 - 115 mm (4.17 - 4.53 inches)
Width - 184 - 196 mm (7.24 - 7.71 inches)
Depth - 15 mm (0.59 inches)

*Size based on estimates:

http://imgur.com/a/u6Vhp
http://arstechnica.co.uk/gaming/2016/10/how-big-is-the-nintendo-switch-screen/

The 3DS XL battery is 1750mAh; I expect Switch's battery to have a larger capacity and be physically larger as well.

Height: 68mm
Width: 38mm
Depth: 6.7mm

This is a 3rd-party battery, because I couldn't find dimensions for the first-party battery (they should be the same size, since they fit in a compartment on the device and have the same capacity) and I'm not at home to measure them myself. There's a lot of dead space inside the device that we aren't taking into account. The depth is already an issue, as you effectively cut air space in half, so yes, if the X1 is throttling, there are some hard questions about whether the Switch can actually cool the 20nm version of the X1.

https://www.amazon.com/dp/B0121SAK5I/?tag=neogaf0e-20
 

LucidFlux

Member
I have a feeling you're right about the lighter modifications. The self-admitted 500 people-years (250 people x 2 years) really only sounds impressive until you compare it to practically any modern chip, and if you include making a whole new API in there (even with Vulkan/Mantle as a base to work off), it's really not much. On a scale of TX1 to fully custom I'd expect it to be much closer to the TX1 end, then.
If there's a lot more to it, I'd happily be wrong!


You're right. If you look at Nvidia's blog post, it seems the 500 man-years covers the entirety of their involvement in Switch's development, not solely the chip. Who knows how much of that 500 went towards modifying the SoC. I guess we'll see soon enough. Here is the quote from Nvidia:

"But creating a device so fun required some serious engineering. The development encompassed 500 man-years of effort across every facet of creating a new gaming platform: algorithms, computer architecture, system design, system software, APIs, game engines and peripherals. They all had to be rethought and redesigned for Nintendo to deliver the best experience for gamers, whether they’re in the living room or on the move."
 

Donnie

Member
The 3DS XL battery is 1750mAh; I expect Switch's battery to have a larger capacity and be physically larger as well.

Height: 68mm
Width: 38mm
Depth: 6.7mm

This is a 3rd-party battery, because I couldn't find dimensions for the first-party battery and I'm not at home to measure them myself. There's a lot of dead space inside the device that we aren't taking into account. The depth is already an issue, as you effectively cut air space in half, so yes, if the X1 is throttling, there are some hard questions about whether the Switch can actually cool the 20nm version of the X1.

https://www.amazon.com/dp/B0121SAK5I/?tag=neogaf0e-20

Yeah, there's no doubt that there'll be less space inside to some degree. I just think the fact that Nvidia have made the new Shield so much smaller shows that internal space was never an issue in the original Shield TV as far as throttling goes, which kind of makes sense when you look at the cooling solution they used. Will a bit less space again necessarily cause an issue? I don't know with any certainty, I suppose.

I mean, I'm not saying I think the SoC is definitely 20nm; it could be 16nm, who knows.
 

z0m3le

Banned
You're right. If you look at Nvidia's blog post, it seems the 500 man-years covers the entirety of their involvement in switch's development, not solely the chip. Who knows how much of that 500 went towards modifying the SoC. I guess we'll see soon enough. Here is the quote from Nvidia:

"But creating a device so fun required some serious engineering. The development encompassed 500 man-years of effort across every facet of creating a new gaming platform: algorithms, computer architecture, system design, system software, APIs, game engines and peripherals. They all had to be rethought and redesigned for Nintendo to deliver the best experience for gamers, whether they’re in the living room or on the move."

They also say it's a custom chip. The 500 man-years figure is pointless to speculate on, because you could lump all of Tegra's own development into it as well. It's just not a useful number.
 

Hermii

Member
Edit: Never mind on the first part, but still, does it really take 2 years just to decide on clock speeds? I don't think it does, or else skipping 20nm would have been a much bigger setback for AMD and Nvidia.
No, MDave figured out that these clock speeds are a good sweet spot for a TX1 in a few hours, lol. They obviously did more than that.
 

MuchoMalo

Banned
Creating a bespoke API isn't insubstantial.

I know, but the thing is that what you guys are talking about is an off-the-shelf X1 with lower clocks, and I don't think anybody would call that custom at all. How much work does it take to change the number of cores, really? I also highly doubt that the API is built from the ground up, but if it is, it would indeed take up pretty much all of that work, I admit.

And it still doesn't change the fact that it's silly to compare it to making a new GPU architecture.
 

Hermii

Member
I know, but the thing is that what you guys are talking about is an off-the-shelf X1 with lower clocks, and I don't think anybody would call that custom at all. How much work does it take to change the number of cores, really? I also highly doubt that the API is from the ground up, but if it is it would indeed take up pretty much all of that work, I admit.

And it still doesn't change the fact that it's silly to compare it to making a new GPU architecture.
Most expect an enhanced memory subsystem. That takes some work.
 

LordOfChaos

Member
I know, but the thing is that what you guys are talking about is an off-the-shelf X1 with lower clocks, and I don't think anybody would call that custom at all. How much work does it take to change the number of cores, really? I also highly doubt that the API is from the ground up, but if it is it would indeed take up pretty much all of that work, I admit.

And it still doesn't change the fact that it's silly to compare it to making a new GPU architecture.


A lot of the things you're saying we're talking about aren't what we're talking about. A new memory hierarchy (as mentioned) for instance wouldn't be insubstantial either. I'm just saying I think if TX1 was a 1 on an arbitrary scale and fully custom was a 10, I'd expect this to be a 3, kinda thing.


Comparing it to the work involved in a fully new GPU is a fair point just to say it's not going to be a substantially different uArch.
 

MuchoMalo

Banned
Most expect an enhanced memory subsystem. That takes some work.

Yeah, but not too much. It would either be the addition of eSRAM or doubling the bus. It would technically count as custom though, I guess.

A lot of the things you're saying we're talking about aren't what we're talking about. A new memory hierarchy (as mentioned) for instance wouldn't be insubstantial either. I'm just saying I think if TX1 was a 1 on an arbitrary scale and fully custom was a 10, I'd expect this to be a 3, kinda thing.

Comparing it to the work involved in a fully new GPU is a fair point just to say it's not going to be a substantially different uArch.

Well, nobody was expecting it to be fully custom, and anyone expecting a new architecture is, frankly, stupid. Do we know for sure that the PS4 and Xbone APUs took longer to develop? I wouldn't call those "fully custom" either btw.

For the record, I'm not really expecting more customization than that; I'm just pointing out that the man-hours thing is being downplayed too much and that comparing it to making a new uArch is unfair because nobody in their right mind expects anything like that.
 

z0m3le

Banned
Rumor is that Switch can only read up to a 128GB microSD card. Will there be a way for it to read my 200GB?

The 3DS has an official limit of 32GB; my 64GB works fine.

There is no hardware limitation at 128GB that I know of; it should work with a theoretical 2TB microSD card.
 

AlStrong

Member
Hmm, isn't it possible that they have gone with a 128-bit bus if they plan to release a New Switch down the road with more memory, like they did with the DSi and New 3DS? They can't do that with a 64-bit bus.

The amount of RAM per chip can increase down the road too. We're at 16Gbit density for LPDDR4, and 32Gbit already exists for LPDDR3. Just a matter of time.
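Putting rough numbers on that (densities illustrative): a later revision could double the RAM just by using denser chips on the same 64-bit bus.

```python
# Capacity = number of chips * density per chip; bus width stays the same.
def capacity_gb(num_chips: int, gbit_per_chip: int) -> float:
    return num_chips * gbit_per_chip / 8

print(capacity_gb(2, 16))  # 4.0 GB: e.g. two 16 Gbit chips on a 64-bit bus
print(capacity_gb(2, 32))  # 8.0 GB: same bus width, double the density
```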
 

tsab

Member
What is your opinion on using the Switch for VR, in a similar manner to Google Cardboard but with some modifications? It would be an affordable headset.

A 720p screen is very bad for VR. I had a Nexus 4 with a 720p screen and Google Cardboard; although I liked the novelty, it is not a very good solution because the screen is not dense enough for good (note I didn't say perfect, just good-ok) immersion.
Also, I think the screen with the bezel may be too big to wear comfortably as a VR helmet.

Rumor is that Switch can only read up to a 128GB microSD card. Will there be a way for it to read my 200GB?

The Wii U and 3DS could read bigger cards than the ones that were "officially" supported (and access all the storage space). But we'll see...
 
A 720p screen is very bad for VR. I had a Nexus 4 with a 720p screen and Google Cardboard; although I liked the novelty, it is not a very good solution because the screen is not dense enough for good (note I didn't say perfect, just good-ok) immersion.
Also, I think the screen size may be too big to wear as a VR helmet.
Screen size is fine, don't forget dat bezel
 

prag16

Banned
is not a very good solution because the screen is not dense enough for good (note I didn't say perfect, just good-ok) immersion.

Yeah. 720p would be terrible for that. I have a cardboard with an LG G4 (1440p) and the individual pixels are still noticeable even in that case.
 
Yeah. 720p would be terrible for that. I have a cardboard with an LG G4 (1440p) and the individual pixels are still noticeable even in that case.

1920x2160 per eye is probably going to be the resolution at which people stop complaining about the screen-dooring.
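Pixels per degree is the number that matters here. A rough comparison (the ~90 degree per-eye field of view is an assumption in the ballpark of phone-holder VR; real optics vary):

```python
# Angular resolution (pixels per degree) for a screen split between two eyes.
def pixels_per_degree(px_per_eye_horizontal: int, fov_degrees: float = 90.0) -> float:
    return px_per_eye_horizontal / fov_degrees

print(pixels_per_degree(1280 // 2))  # ~7 ppd:  a 720p Switch screen split in two
print(pixels_per_degree(2560 // 2))  # ~14 ppd: a 1440p phone like the LG G4
print(pixels_per_degree(1920))       # ~21 ppd: the 1920x2160-per-eye figure above
```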
 

OryoN

Member
You're right. If you look at Nvidia's blog post, it seems the 500 man-years covers the entirety of their involvement in switch's development, not solely the chip. Who knows how much of that 500 went towards modifying the SoC. I guess we'll see soon enough. Here is the quote from Nvidia:

"But creating a device so fun required some serious engineering. The development encompassed 500 man-years of effort across every facet of creating a new gaming platform: algorithms, computer architecture, system design, system software, APIs, game engines and peripherals. They all had to be rethought and redesigned for Nintendo to deliver the best experience for gamers, whether they’re in the living room or on the move."

While it's true that the time invested is in reference to the entirety of their involvement with the Switch, the statement - if taken at face value - also suggests a redesign of the architecture itself. If we are to take those comments very literally, it all sounds well beyond simply tinkering with clocks. They clearly went out of their way to make this thing as efficient as possible, so I can't imagine them and Nintendo looking at that X1 and not seeing several ways to improve it. Heck, lowering the clocks may have well given them even more incentive to seek efficiency/gains in other areas of the chip.

I'm on team moderately-customized
(supposing: Wii U GPU = highly customized, 3DS Pica GPU = lightly customized).

But yes, hopefully devs/engineers will be a bit more forthcoming with details in the coming days/weeks. Either way...hyped!
 

prag16

Banned
1920x2160 per eye is probably going to be the resolution at which people stop complaining about the screen-dooring.

That should be "good enough" for most I'd think, but I'm sure some (nitpickers or the eagle eyed) will complain until it gets to 3840x4320.
 

LordOfChaos

Member
Yeah. 720p would be terrible for that. I have a cardboard with an LG G4 (1440p) and the individual pixels are still noticeable even in that case.

A 720p screen is very bad for VR. I had a Nexus 4 with a 720p screen and Google Cardboard; although I liked the novelty, it is not a very good solution because the screen is not dense enough for good (note I didn't say perfect, just good-ok) immersion.
Also, I think the screen with the bezel may be too big to wear comfortably as a VR helmet.

.



I tried VR on my iPhone 6S (750p at 4.7"), and it was godawful. Since resolution is halved per eye and it's so close to your face, it was very much like sitting right in front of a 90s CRT SDTV.

That said, I think the Switch could be ok for cutesy VR demos, rather than anything you keep on for a while.
 

Thraktor

Member
I don't have access to limiting the GPU speeds unfortunately. The kernel ignores the request, and the same goes for memory speeds. I think I would need to modify and build the kernel, but that is beyond my abilities, hah. The only way beyond that, I guess, would be making the chip hot enough that the GPU thermally throttles down to my desired speed, such as taking a blow dryer to the Shield TV, haha! I might be crazy enough to do that.

I'm not sure it's worth going to such lengths (particularly as it would be hard to infer too much from a GPU clock that's not perfectly stable).
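(For reference, this is roughly the sort of sysfs poking involved; the GPU devfreq node name below is a guess and varies by kernel, cpufreq values are in kHz while devfreq values are in Hz, and writes generally need root.)

```python
# Minimal sketch of reading/capping clocks through sysfs on a rooted Shield TV.
# The GPU devfreq node name is an assumption; the cpufreq path is the standard one.
from pathlib import Path

CPU_MAX = Path("/sys/devices/system/cpu/cpu0/cpufreq/scaling_max_freq")  # kHz
GPU_DIR = Path("/sys/class/devfreq/57000000.gpu")                        # assumed name, Hz

def read_freq(path: Path) -> int:
    return int(path.read_text().strip())

def try_limit(path: Path, value: int) -> bool:
    """Write a cap and read it back; some kernels silently ignore the request."""
    try:
        path.write_text(str(value))
    except OSError:
        return False
    return read_freq(path) == value

print("CPU max:", read_freq(CPU_MAX), "kHz")
if GPU_DIR.exists():
    print("GPU cur:", read_freq(GPU_DIR / "cur_freq"), "Hz")
    print("GPU cap accepted:", try_limit(GPU_DIR / "max_freq", 460_800_000))
```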

Thraktor, you might need to turn on triple buffering when comparing with my results, because Android forces it on (another thing I can't turn off, it seems!), which explains the lack of tearing on the Shield TV. Not sure how much that will affect tests.

I suspect the demos are triple-buffered anyway (no tearing on my end), but I'll check when I'm back at my PC. I should probably also note that they run windowed on PC, so when I say 1080p it's a maximised window, meaning the actual resolution being rendered is a little less (maybe by 5-10%).

What's the die size of TX1? Some folks floated the 121mm^2 figure, but I can't find a solid source on that.

I'm fairly certain the chip measures 11mm x 11mm, so 121mm^2. The numbers in your post are the correct measurements for the package AFAIK. When I first bought my Shield TV, some forum posters at SemiAccurate and Beyond3D were tearing it apart and measuring too.

There's an iFixit teardown of the Pixel C, and if you go down to step 10 you can see the logic board, including the TX1. Comparing to the DRAM chips above it, it would seem that the 11mm x 11mm measurement is accurate, so 121mm² it is. (I had misremembered this as the measurement for the package, but that certainly looks like an exposed die).

I think we are safe from TN. Nintendo had been experimenting with IPS panels in the new 3DS, and I didn't see too many complaints. The newest releases by Nintendo indicate that the company is shifting to IPS for their screens.

I'd be a bit surprised if it's not IPS. Sourcing of 3DS screens was constrained by the 3D tech, whereas now it's a straightforward, standard-resolution panel, so they shouldn't have any issues finding suppliers. The other thing is that TN screens seem to have pretty much disappeared at this kind of screen size, with even budget phones almost all using IPS these days. IPS is effectively the "cheap" phone display technology now, with OLED replacing it as the more expensive option.

The one thing we really should be looking out for on the display front, though, is adaptive sync (i.e. G-sync, but probably without being called G-sync). For a fully integrated device like Switch, where Nintendo gets to choose and/or design everything from the OS to the APIs to the GPU to the display controller to the panel itself, supporting adaptive sync isn't actually all that expensive. There's up front R&D cost, to be sure, but Nvidia have obviously already done most of the R&D necessary. From that point it's a matter of using a display controller which can properly support adaptive-sync, and then getting all your software ducks in a row.

My only worry about this is that because Nintendo's internal software teams do an exceptionally good job of locking frame rates to either 30 or 60fps they might not see the value of adaptive sync. For third-party games it could be a huge benefit, though. Those inevitable not-quite-30fps ports would feel a hell of a lot smoother on an adaptive sync display than a fixed refresh display even if the actual frame-rate is identical. Compared to using a bigger GPU (which costs money) or using a higher portable clock speed (which costs battery life), adaptive sync is an exceptionally cheap way to give players a smoother gaming experience in portable mode, so let's hope they go with it.
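To illustrate the not-quite-30fps point with a toy example: a game producing a frame every 40ms (25fps) on a fixed 60Hz panel versus an adaptive-sync one (numbers are illustrative, not a claim about Switch's actual display):

```python
# Toy frame pacing comparison: steady 25 fps on a fixed 60 Hz panel vs. adaptive sync.
REFRESH = 1000 / 60   # ms per scanout on a fixed 60 Hz display
RENDER_TIME = 40.0    # the game finishes a new frame every 40 ms (25 fps)

def fixed_refresh_intervals(n_frames: int):
    """On a fixed display, a frame can only appear at the next vsync boundary."""
    shown = []
    for i in range(n_frames):
        ready = i * RENDER_TIME
        next_vsync = -(-ready // REFRESH)   # ceil: wait for the next refresh
        shown.append(next_vsync * REFRESH)
    return [round(b - a, 1) for a, b in zip(shown, shown[1:])]

def adaptive_sync_intervals(n_frames: int):
    """With adaptive sync, the panel refreshes whenever the frame is ready."""
    return [RENDER_TIME] * (n_frames - 1)

print(fixed_refresh_intervals(6))   # [50.0, 33.3, 50.0, 33.3, 33.3] - uneven, feels juddery
print(adaptive_sync_intervals(6))   # [40.0, 40.0, 40.0, 40.0, 40.0] - even pacing
```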

Creating a bespoke API isn't insubstantial.

I don't know how much of a bespoke API they've created, to be honest. In fact, compared to past hardware, where they actually have needed to create APIs from scratch (or nearly from scratch) to accommodate the new hardware, they could get themselves into a far better software and tools position than they've ever been in before with far less work involved.

From a graphics API perspective Vulkan already fits pretty much all their needs, and given they've joined the Vulkan working group and certified Switch as conformant to the Vulkan spec, it would certainly seem like they're using it as their main graphics API. Regarding tools and development, Nvidia already has a full (and reportedly excellent) set of development and debug tools which should be usable for Switch with minimal customisation. Then Gameworks includes plenty of graphics and physics libraries which should be usable on Switch with, again, relatively minimal work.

I suspect the 500 man-years probably does include quite a lot of software work (possibly even including work Nvidia was already doing anyway, such as bringing Gameworks to Vulkan), but if we're comparing the amount of work Nintendo and Nvidia need to put into software for Switch versus what Nintendo would have needed to do to get previous consoles to the same level, then Switch would be a bit of a cakewalk for them.
 