
Laura Dale: NX battery 3 hours max on dev kit, dock improves performance, touchscreen


OCD Guy

Member
This stuff is probably NDA'd to hell and back, and for good reason. I don't think the consumer product should be judged by the dev kit. Isn't Laura Dale trashing these NDAs, or is she doing this by proxy?

She's proudly stated on Twitter that NDAs don't apply to her, using a term along the lines of "journalistic protection" (I can't recall the exact term she used).

So while she will do what she can to protect her source, she's got zero fear of any repercussions to herself.
 

Durante

Member
Just out of curiosity, do you put any stock into the notion that a different version of the Tegra chipset might yield a significant boost? I don't know enough about mobile architecture myself to make heads or tails of this stuff. So I'm just anticipating the worst but hoping for the best.
The most important part is that Pascal is built on 16/14 nm. It should get roughly 40% more performance in the same power envelope, or ~50% less power consumption with the same performance as TX1 at 20nm. Of course, these numbers also heavily depend on how much you want to spend -- you can increase the die area of the GPU to go more parallel at lower clock rates with better efficiency, or make the GPU smaller and go with higher clock rates and worse efficiency.

What is basically certain to me is that Nintendo will need to reduce the TDP from the TX1 dev kit (/ shield TV) for a handheld. With Maxwell at 20nm, they could only do that by reducing its performance.

With Pascal at 16/14nm, there are a few options:
  1) Keep the same number of cores, keep a similar clock, achieve similar performance, and get less power consumption due to the architectural and fabrication improvements.
  2) Increase the number of cores, slightly downclock them, achieve better performance and lower power.
  3) Decrease the number of cores, clock them the same or slightly higher, get somewhat less performance and only slightly better power consumption.
And lots of variations on those. Of them, 2) is the most expensive and 3) is the cheapest. All of them are possible; personally, I consider 1) the most likely.
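To put rough numbers on that, here is a minimal back-of-the-envelope sketch in Python. The only figures taken from the post are the ~40%-more-performance / ~50%-less-power Pascal estimates; the multipliers for options 2) and 3) are invented purely for illustration and are not real chip specs.

# Back-of-the-envelope sketch of the three options above. All numbers are
# illustrative assumptions, not real chip specs: a TX1-class Maxwell baseline
# is normalized to 1.0 performance / 1.0 power, and the ~40% perf / ~50% power
# node-plus-architecture gains quoted above are folded into rough multipliers.

BASELINE = {"perf": 1.0, "power": 1.0}  # TX1 @ 20nm, normalized

OPTIONS = {
    "1) same cores, similar clock": {"perf": 1.0, "power": 0.5},  # assumed
    "2) more cores, downclocked":   {"perf": 1.3, "power": 0.8},  # assumed
    "3) fewer cores, same/higher":  {"perf": 0.8, "power": 0.9},  # assumed
}

def compare(options, baseline):
    """Print perf, power, and perf-per-watt relative to the baseline."""
    base_eff = baseline["perf"] / baseline["power"]
    for name, o in options.items():
        eff = (o["perf"] / o["power"]) / base_eff
        print(f"{name}: perf x{o['perf']:.1f}, power x{o['power']:.1f}, "
              f"perf/W x{eff:.2f}")

compare(OPTIONS, BASELINE)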
 

Palmer27

Member
EDIT: Sounds like the honeymoon period is already over lol. 24 hours at NeoGAF!

This is very entertaining.

So far the only game I'm potentially missing out on is the new 3D World/Land thing, which already looks a little sterile by design for my 3D Mario itch.

Please don't impress me any more, Nintendo; I think of your consoles as utilities for your titles. I don't want to have to put my money down before a price drop and some decent games!
 

Orca

Member
Do people really think there's processing power in the dock or just that it goes into low power mode when disconnected? Sorry if it's been covered but I'm on the road to the heritage classic and can't keep up.
 

tuxfool

Banned
This is true...I wonder why though...something somewhere holding things down.

It is completely analogous to the energy problems facing the world today. Research into the chemistry and physics of batteries just hasn't yielded many favorable results. Additionally, a lot of the gains have been easier to achieve on the consumption side, via process node shrinks etc.
 
3 hours?! Christ... How the fuck do you even call that portable?

Hopefully 3rd party batteries will be able to save us

See, this ish right here is why Nintendo and Apple get away with including cheap batteries. I mean, do you own a Wii U? Don't nobody dare utter those words publicly again; we shouldn't have to depend on third parties for something as crucial as battery life.

Let me be very clear: if you're gonna advertise something as "portable", it better damn well get at least 4-5 hours of battery life out of the box, before any battery extender.
 
The most important part is that Pascal is built on 16/14 nm. It should get roughly 40% more performance in the same power envelope, or ~50% less power consumption with the same performance as TX1 at 20nm. [...] All of them are possible; personally, I consider 1) the most likely.

Thanks for the response!
 
The most important part is that Pascal is built on 16/14 nm. It should get roughly 40% more performance in the same power envelope, or ~50% less power consumption with the same performance as TX1 at 20nm. [...] All of them are possible; personally, I consider 1) the most likely.

I had no clue that we were still up in the air over Pascal vs. Maxwell architecture behind the custom nVidia Tegra powering Switch. With all of the information released so far, I could have sworn I read that the Tegra in Switch was Maxwell based. A Pascal solution would be very impressive! Thanks for the great post.
 

OCD Guy

Member
Do people really think there's processing power in the dock or just that it goes into low power mode when disconnected? Sorry if it's been covered but I'm on the road to the heritage classic and can't keep up.

Low power mode when disconnected makes much more sense. The Nvidia SoC reduces its clock frequency when portable to improve battery life and reduce heat (so the device doesn't need a fan on the go), and goes back to "normal" clocks when docked.
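As a purely hypothetical sketch of that idea (not based on any knowledge of Nintendo's or Nvidia's actual firmware, and with invented clock numbers), the logic would be something like:

# Hypothetical illustration of the "different power states" idea above.
# The clock values and fan flag are invented; they are not real Switch specs.

from dataclasses import dataclass

@dataclass
class ClockProfile:
    gpu_mhz: int        # assumed GPU clock for this mode
    fan_enabled: bool   # whether active cooling is allowed

PROFILES = {
    "portable": ClockProfile(gpu_mhz=500, fan_enabled=False),
    "docked":   ClockProfile(gpu_mhz=1000, fan_enabled=True),
}

def select_profile(is_docked: bool) -> ClockProfile:
    """Pick lower clocks (and passive cooling) on battery, full clocks when docked."""
    return PROFILES["docked" if is_docked else "portable"]

print(select_profile(is_docked=False))   # portable: lower clock, no fan
print(select_profile(is_docked=True))    # docked: higher clock, fan allowed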

The only things the dock adds, for me personally, are active cooling and potentially additional storage, as well as any connections, USB ports etc. of course.

There's no way in hell I can see Nvidia hardware, be it a CPU, GPU or additional SoC, in the dock.

I actually hope there's active cooling in the dock that passes air through the handheld's vent holes, as opposed to a fan in the handheld itself. Can you imagine how small and shitty the fan would sound lol.
 

Peterthumpa

Member
How...? Without the second screen?

Like this?
[image: bJy0HF3.jpg]
 
OK. Yeah. Thanks. So three generations of the hardware didn't play games at all. I think I probably need to be more clear. I get that there's demand for boxes or HDMI sticks that have media capabilities. I've not seen any evidence of a demand for cheap Android/iOS TV boxes that have a focus on gaming.

Don't want to derail the thread, but I'd say that comes down equally to Apple's apathy about embracing the Apple TV as a gaming console. 1) They required support for their very limited two-button TV remote in all games until this year, which seriously hurt game-dev opportunities for third parties. 2) The lack of HD space means most content has to stream in. 3) There were no social streaming capabilities until this year, and there's still no Twitch support.

I think it has the potential to attract a gaming audience, but until they ship an Apple TV SKU with a bundled game controller and add more HD space, it will remain a media-focused device rather than a gaming one.
 

AmyS

Member
Straight up, what are the chances the docking unit contains its own Tegra SoC, and that this is why it takes a few seconds to sync up with the tablet?
 

Shpeshal Nick

aka Collingwood
Honestly I don't know what I expected.

3 hours, if it's the final figure, isn't great. I think 5 would have been perfect if they could have hit that number.
 

OCD Guy

Member
Straight up, what are the chances the docking unit contains its own Tegra SoC, and that this is why it takes a few seconds to sync up with the tablet?

Chances are slim. It would be extremely complex to incorporate, and expensive for very little benefit. I just can't see the Switch having two SoCs.

I'm not reading any sign of it in this statement from Nintendo either:

"The dock is not the main console unit of Nintendo Switch. The main unit of Nintendo Switch is the unit that has the LCD screen, which the two Joy-Con controllers can be attached to and detached from. The main function of the Nintendo Switch Dock is to provide an output to the TV, as well as charging and providing power to the system."

The dock to me is nothing more than a breakout box. The most plausible scenario is simply different power states for the SoC depending on what the Switch is doing, as opposed to an additional SoC that provides some sort of boost.
 
I fully expect retail units to have a better battery life. It won't be a huge gain, but a 5 hour battery life is definitely possible if they use the Pascal GPU and manufacture the chip at 14nm. Plus, the battery in the devkits may not be as good as the one that will go in the final units.
 
Something had to be sacrificed; I'm glad it wasn't something else like the screen quality. I was able to deal with roughly three hours of battery life, with little issue, for the entire 4 1/2 years I owned my OG launch 3DS until I finally upgraded to a regular-size N3DS in 2015, so I'm more than used to managing my portable battery time until a better solution is available.
 

Peterc

Member
pls take this switch on a plane*

*only plane rides shorter than 3 hours plz



I really hope this info is wrong. Way to kneecap your own selling point.

What device did you use for playing games on a plane before?
A dev kit running at full brightness?

How long did you play for during the whole trip?
 

OCD Guy

Member
Something had to be sacrificed, I'm glad it wasn't something else like the screen quality.

Still waiting to hear about that.

Please don't be shitty, as Nintendo seem to like garbage-tier screens in their portable devices.

The Wii U GamePad screen was absolutely terrible: awful black levels, bad viewing angles, low resolution, and those fuzzy reds. Did anyone not notice Mario's hat, for example?

If it's an IPS panel we'll certainly have good viewing angles, but expect crappy blacks and IPS glow. I'll likely just keep exchanging my unit until I get one with minimal backlight bleed lol
 

Xhaner5

Neo Member
Straight up, what are the chances the docking unit contains its own Tegra SoC, and that this is why it takes a few seconds to sync up with the tablet?

Somewhat possible, for these reasons:

- The space inside the mobile part is not enough to compete anywhere near PS4/Xbox One, so on its own it's not really a home gaming console at all, and Nintendo said it's meant to be mainly a home console.
- The delay when docking may give it away as well. That would make sense, as the OS and the main hardware would pause whatever you're doing and sync up with the supplemental processing: it has to init the hardware, fill the RAM, and read more data off the GameCard.
- There are more reliable sources saying there may be truth to this, but you know it's up in the air.

But that resync could also just be the main console reclocking its CPU and switching into max-power mode, with nothing in the dock; the delay would be needed to spin things back up a bit and sync with the HDTV. Detection of the HDTV also happens, to see what maximum resolution and refresh rate it supports (plug and play).

At most, there could be some kind of separate upconversion when passing through the HDMI output, who knows.
 

ozfunghi

Member
I had no clue that we were still up in the air over Pascal vs. Maxwell architecture behind the custom nVidia Tegra powering Switch. With all of the information released so far, I could have sworn I read that the Tegra in Switch was Maxwell based. A Pascal solution would be very impressive! Thanks for the great post.

The Tegra inside the DEVKIT is Maxwell (as reported by Eurogamer). The custom Tegra that is rumored to end up in the final hardware would be Pascal based.

The numbers Nvidia is releasing are 40% more performance at the same power draw, or 60% better power efficiency at the same performance (Durante says 50%, but that's not what Nvidia claims).
 

Proelite

Member
Straight up, what are the chances the docking unit contains its own Tegra SoC, and that this is why it takes a few seconds to sync up with the tablet?

0%. Why would Nintendo say otherwise?

I believe the dock would allow the Tegra SoC to run at max clock. I'm thinking 512-768 gigaflops (32-bit) max. Considering the highest end of the newest Tegra line would be a fraction of the TDP of the latest Xbox One S SoC, I don't see why they'd need to be conservative with heat and power.

Portable mode is probably clocked at half speed, as 720p is almost half the resolution of 1080p.
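For what it's worth, the arithmetic behind a range like that is just cores × 2 FLOPs (one fused multiply-add) × clock for a TX1-class GPU with 256 CUDA cores. The clock values below are guesses for illustration, not confirmed Switch specs:

# Peak FP32 throughput for a TX1-class GPU: 256 CUDA cores, and one fused
# multiply-add (2 FLOPs) per core per clock. Clock values are illustrative
# guesses, not confirmed Switch specs.

CUDA_CORES = 256
FLOPS_PER_CORE_PER_CLOCK = 2  # one FMA = 2 FLOPs

def peak_gflops(clock_mhz: float) -> float:
    """Peak FP32 GFLOPS at a given GPU clock (MHz)."""
    return CUDA_CORES * FLOPS_PER_CORE_PER_CLOCK * clock_mhz / 1000.0

print(peak_gflops(1000))  # 512.0 -> low end of the docked guess above
print(peak_gflops(1500))  # 768.0 -> high end of the docked guess
print(peak_gflops(500))   # 256.0 -> half clock for portable mode

# Pixel-count ratio behind the "half speed for 720p" reasoning:
print((1280 * 720) / (1920 * 1080))  # ~0.44, i.e. roughly half of 1080p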
 

EVH

Member
I really hope that we end up getting some clear specs for this device. It doesn't matter whether they come from Nvidia, Nintendo or a confirmed leak, but it would be best to stop just making guesses.
 

ZOONAMI

Junior Member
Is there any reason to believe the capacity of the battery won't be expanded once we get a definitive product (aka not a dev kit)?

None. That's why it's dumb that this is the main part of the thread title and OT, not to mention that the 3 hours isn't qualified at all (screen brightness, whether that's under a gaming load or just screen-on time, etc.).

Not to mention that, even unqualified, 3 hours of actual gaming time isn't bad versus virtually any modern tablet or cell phone.
 

Kimawolf

Member
So Nintendo says they still have some kind of surprise or two left for the Switch. Do you think this could be some kind of separate dock? the base one may be...well basic, but perhaps there is another one which adds some oomph to the system?
 

OCD Guy

Member
So Nintendo says they still have some kind of surprise or two left for the Switch. Do you think this could be some kind of separate dock? the base one may be...well basic, but perhaps there is another one which adds some oomph to the system?

Using additional power as a bullet point is unlikely given how Nintendo operate nowadays.

The days of expansion paks like the N64's are long gone, I think.

I think one of the surprises is going to revolve around an input method for controlling your games. Nintendo love to throw in new ways of controlling games, and detachable controllers aren't Nintendo enough for me lol.
 

LoveCake

Member
Regarding the battery life.

From what I have read about other devices' battery lifespans, the more intense the calculations/processes the CPU and/or GPU perform, the more power they use/drain, so a more graphically intense game is going to use more battery power than a less graphically intense one.

Also, if the N/S is docked and charging and/or constantly being powered, that has now been found to cause damage to batteries.

"Once your smartphone has reached 100% charge, it gets “trickle charges” to keep it at 100% while plugged in. It keeps the battery in a high-stress, high-tension state, which wears down the chemistry within.

When fully charged, remove the battery” from its charging device. “This is like relaxing the muscles after strenuous exercise.” You too would be pretty miserable if you worked out nonstop for hours and hours
."

My original day-one 3DS lasts pretty well. I only play AC: NL and the StreetPass games, and with the original battery I still get about 3 hrs with the 3D fully on, but when it gets to the last bar it really drops away fast. It's the same with my Kindle Fire HDX 7 (which is about the same size as the N/S, maybe?): once it gets to about 30% the drop-off is pretty fast, and it's connected to my Wi-Fi constantly, which is a drain.

What I want to know is how much storage is on the N/S. I hope it's not going to be like the Wii U, with say 32GB onboard and then you have to get an external HDD to plug into the dock. Yes, that lets people choose, but it adds to the cost of the console: I got a 320GB HDD when I got my Wii U and it was about £40, so on top of the console that means £340, not £300. And people are not going to want to skimp by getting a small-capacity HDD, so this pushes the cost up even more, then more again if going the SSHD or SSD route.
 
The most important part is that Pascal is built on 16/14 nm. It should get roughly 40% more performance in the same power envelope, or ~50% less power consumption with the same performance as TX1 at 20nm. [...] All of them are possible; personally, I consider 1) the most likely.

Thanks for your input.

Now it should be obvious why Nintendo would aim at getting a Pascal chipset instead of a Maxwell one. Power consumption for a portable console is a major challenge.

Personally, I'm more interested in finding out more about the power difference when it is in docked mode.
 

ZOONAMI

Junior Member
It would need a fucking massive battery to go beyond 5 hours.

It would probably cost 499 usd if they tried.

Yeah, people don't seem to understand that a modern device with a large 720p+ screen is gonna need something like a 9000 mAh battery to push much more than 3 hours under a gaming load. Devices like the iPad Pro with that big a battery are expensive. They're probably working with something like a 4000-5000 mAh battery for the retail device. A huge battery also needs a larger device, something with a 10-inch footprint. This isn't a 10-inch-footprint device, and the controllers don't count as part of the footprint, as you can't use them for battery space.

If the retail unit is less than 3000 mAh then we can start complaining imo. If it's easily swappable, though, I still wouldn't complain even if it's somewhat less than 3000 mAh.
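As a rough sanity check on those figures, battery life is roughly energy (capacity × nominal voltage) divided by average draw. The 3.7 V cell voltage and the ~6 W gaming draw below are assumptions for illustration, not leaked numbers:

# Rough battery-life estimate: energy in watt-hours divided by average draw.
# The 3.7 V nominal cell voltage and the 6 W average gaming draw are
# assumptions for illustration, not actual Switch figures.

NOMINAL_VOLTAGE_V = 3.7

def runtime_hours(capacity_mah: float, avg_draw_watts: float) -> float:
    """Estimated hours of runtime for a given capacity and average power draw."""
    energy_wh = capacity_mah / 1000.0 * NOMINAL_VOLTAGE_V
    return energy_wh / avg_draw_watts

for mah in (3000, 4000, 5000, 9000):
    print(f"{mah} mAh -> {runtime_hours(mah, 6.0):.1f} h at an assumed 6 W draw")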
 
Throughout the thread I stated multiple times that the battery life could be worse. But both possibilities, better or worse battery life, are currently irrelevant.


The entire thread premise (or, well, the main topic) is based on one secondhand estimate from one developer kit.

We are lacking context, details and the final consumer device.

When all is said and done, the battery life of the Nintendo Switch could be 5 minutes.

But as it is, her tweets and article are not going to tell us. It is incredibly premature and goes against journalistic standards.

PS: I expect the battery life to be around 2 hours lol

The blame lies squarely on Nintendo. Their usual obsession with secrecy may continue to bite them in the ass as an increasingly negative narrative begins to take hold due to rumors and speculation. Prepare for the "news" of bad online infrastructure, a high price tag, 2-hour battery life, too little RAM, weak Maxwell architecture, etc. This is why Sony and Microsoft got ahead of all that by giving people actual information instead of misleading trailers about millennial adults crowding around Nintendo tablets. The Switch looks interesting, but you can't just drop a bomb like this after total radio silence and expect people not to let their imaginations run wild when there's no information with which to control the narrative.
 

Oersted

Member
The blame lies squarely on Nintendo. Their usual obsession with secrecy may continue to bite them in the ass as an increasingly negative narrative begins to take hold due to rumors and speculation.

People hoped for the same when the blatantly, obviously faked NX controllers hit (which she fell for).

I doubt it will happen now.

Wii U gets a lot of praise as having the best games this gen. The reception was bad but then it got a lot of praise.

Moving goalposts is such an embarrassing-looking sport. Try Quidditch.
 