
The Curious Case of the Switch Foxconn Leak (Now a hardware fanfiction thread)

Osiris

I permanently banned my 6 year old daughter from using the PS4 for mistakenly sending grief reports as it's too hard to watch or talk to her
This is not true. See page 105 onwards. And they wouldn't encode 8-bit components anyway, but pad with zeros in a 10-bit value. But that would be incredibly wasteful.

But again this doesn't matter because the HDMI in the dock is NOT using the USB bandwidth but an actual HDMI signal on the same cable running alongside full bandwidth USB.

Getting way off-topic here, but for the sake of clarity: you would be right if the equipment worked internally in the YCbCr 4:4:4 colorspace, but most if not nearly all internal digital processing of images is done in YCbCr 4:2:2 instead, as the extra 2 bits allow "headroom" to prevent truncation and loss of data when performing operations on that data. The data is not simply packed or zero-padded; it is encoded from YCbCr 4:4:4 to YCbCr 4:2:2 and sent as YCbCr 4:2:2.

YCbCr 4:2:2 is 10bit.

There is very little gain in one device working internally in 10-bit, encoding to 8-bit for transmission, then re-encoding back to 10-bit at the other end for more processing, so it's just not done; they just transmit the 10-bit YCbCr 4:2:2 instead. Thankfully for gamers, too, since that encode/transmit/decode cycle would only add processing latency :p
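If anyone wants to sanity-check the bandwidth side of this, here's a quick back-of-envelope script (my own illustration, assuming a raw 1080p60 signal with no blanking or protocol overhead; the resolution and rates are just example figures):

```python
# Raw video bandwidth for two of the pixel formats discussed above.
# YCbCr 4:4:4 carries 3 samples per pixel; 4:2:2 halves the chroma
# horizontally, so it averages 2 samples per pixel.

WIDTH, HEIGHT, FPS = 1920, 1080, 60

def video_gbps(samples_per_pixel: float, bits_per_sample: int) -> float:
    """Raw payload in gigabits per second for a WIDTHxHEIGHT@FPS signal."""
    bits_per_frame = WIDTH * HEIGHT * samples_per_pixel * bits_per_sample
    return bits_per_frame * FPS / 1e9

print(f"8-bit 4:4:4 : {video_gbps(3, 8):.2f} Gb/s")   # full-resolution chroma
print(f"10-bit 4:2:2: {video_gbps(2, 10):.2f} Gb/s")  # subsampled, deeper samples
```

Note the 10-bit 4:2:2 stream actually comes out a bit smaller than 8-bit 4:4:4, which is part of why transmitting it directly costs nothing extra.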

If you can get your hands on a protocol analyzer that can handle the necessary speeds, you can see for yourself with a simple probe. (Not that tapping the line is ever "easy"; if ever there was a great demonstration of the "observer effect" from physics, realtime protocol analysis comes the closest :p)
 
And I still want to know if I get the docked clock speeds when having the handheld just charging at its USB-C port without being in the dock.

edit: if so - this would explain why the undocked mode at the demos looked and ran so good because most units were probably being charged.

No they weren't. People undocked them and kept playing, without charging.
 

JB2448

Member
There is no denying the allure of Nintendo snake oil lubing up the magic dust generator stuffed inside each console.

The best is when reality sets in people will go back to the "gameplay not graphics" until the cycle starts over again.
The problem with this line of thought is that you're saying they're the same people that hoped for these pie in the sky graphics capabilities despite the reality being right in front of them. That just isn't true.

For the record, I don't particularly care about graphical fidelity in a Nintendo console (the reason I came into this thread is because of following the leakmill for this hardware prerelease period because of how unorthodox everything has been), but I just had to post something when I saw the slight against people that really don't deserve it. This forum should be better than that.
 
521.gif
 

Mokujin

Member
And I still want to know if I get the docked clock speeds when having the handheld just charging at its USB-C port without being in the dock.

edit: if so - this would explain why the undocked mode at the demos looked and ran so good because most units were probably being charged.

The cord on the unit was to avoid them being stolen, and it's attached to the back; plus, people couldn't have docked/undocked the unit with a cord connected to the USB port.

nintendo-switch-console-hands-on-0031-1500x1000.jpg
 

Inuhanyou

Believes Dragon Quest is a franchise managed by Sony
Memory bandwidth was always the speculated problem, but if bandwidth was that much of an issue, why would Mario Kart 8 be 1080p and the other games 720p with no AA? Surely at least they would have AA or be 900p or something
 

Thraktor

Member
Memory bandwidth was always the speculated problem, but if bandwidth was that much of an issue, why would Mario Kart 8 be 1080p and the other games 720p with no AA? Surely at least they would have AA or be 900p or something

Bandwidth requirements can vary quite a lot from one game to the next. In this case it will particularly be a matter of the extent to which the rendering pipeline can be tiled.
 

z0m3le

Banned
I don't realize that. Maybe because I do realize that e.g. memory bandwidth is a factor.

We still have no idea what the memory bandwidth is. He was commenting on the GPU and CPU power not being able to drive the resolution and I explained to him that those specs would allow for the resolution.
 

Theonik

Member
We still have no idea what the memory bandwidth is. He was commenting on the GPU and CPU power not being able to drive the resolution and I explained to him that those specs would allow for the resolution.
Your leaks indicate LPDDR4 is used, two chips' worth, which might mean either a 64-bit or a 128-bit bus. That gives us a very good idea of the ballpark for bandwidth.
 
Didn't Todd Howard from Bethesda say he saw the most impressive demo he's ever seen on the Switch? The SCD patent is as old as the collaboration with Nvidia, per SemiAccurate's original leak; what if Nintendo has been shopping this power dock around with the Switch from the beginning?

I'm actually personally convinced he was referring to the HD rumble. People came away from the event talking about that being absolutely mind-blowing in context of the marble game. I'm 80% sure now that Skyrim is coming out so late because they're adding HD rumble functionality so that each different ability equipped to your two hands feels substantially different.

Might even see motion controls if Bethesda is working on Skyrim VR.

You guys are delusional. This SCD fantasy is, again, completely counter to common sense.
Power is really not Nintendo's concern. They are not going to bend over backwards and put in years of R&D just so that a small subset of edge cases can buy a SCD with a 1060 in it and enjoy Mario in 4K. Whoever pitches that idea at Nintendo would get laughed out of the room.

I have to agree with you a bit here. It's far too early to actually think this may be happening. We have a patent, yes. We have a leak discussing prototype hardware, yes. But I don't know how any of that can mean people who saw the machine months to years ago somehow were blown away by the visuals. That doesn't track logically.

Could we see a 4k SCD dock one day? Maybe. I'd say 2018 at the absolute earliest. But the fact that we have absolutely no corroboration on it from insiders should tell us that, even if it exists, it's nowhere close to ready.
 

z0m3le

Banned
Your leaks indicate LPDDR4 is used 2 chips worth which might mean either a 64bit or 128bit bus. That gives us a very good idea of the ballpark for bandwidth.

We don't know if there is any memory on the SoC. Also, a 128-bit bus leads to 52GB/s, which is much faster than the X1's 26GB/s that we assumed before this leak. Thraktor's post about it being odd to have 2 chips if it is 64-bit is already pointing to something interesting to discuss from this leak.
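For what it's worth, those ballpark figures fall straight out of the bus width; here's a sketch assuming LPDDR4-3200 (the transfer rate is my assumption for illustration, not something from the leak):

```python
def lpddr4_bandwidth_gbs(bus_width_bits: int, transfer_rate_mts: int = 3200) -> float:
    """Peak theoretical bandwidth in GB/s: (transfers per second) x (bytes per transfer)."""
    return transfer_rate_mts * 1e6 * (bus_width_bits / 8) / 1e9

print(lpddr4_bandwidth_gbs(64))   # 25.6 GB/s, the stock TX1 figure people round to 26
print(lpddr4_bandwidth_gbs(128))  # 51.2 GB/s, the two-chip wide-bus case (~52)
```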
 

Zedark

Member
I'm actually personally convinced he was referring to the HD rumble. People came away from the event talking about that being absolutely mind-blowing in context of the marble game. I'm 80% sure now that Skyrim is coming out so late because they're adding HD rumble functionality so that each different ability equipped to your two hands feels substantially different.

Might even see motion controls if Bethesda is working on Skyrim VR.

Hmm, if any game would be able to sell me on motion controls, it would be Skyrim. It would be interesting if they decided to implement it, as the battle system, due to its simplicity, lends itself quite well to motion control. If true, it would be a valid reason to postpone the release to fall of this year, but if not, then they seem to just be taking it leisurely with the Switch, which would be a bit disappointing.
 

z0m3le

Banned
Hmm, if any game would be able to sell me on motion controls, it would be Skyrim. It would be interesting if they decided to implement it, as the battle system, due to its simplicity, lends itself quite well to motion control. If true, it would be a valid reason to postpone the release to fall of this year, but if not, then they seem to just be taking it leisurely with the Switch, which would be a bit disappointing.

In relation to this thread's leak, it could also be because the October devkits were powerful enough to implement more graphical effects (more mods), but we will know more once the Switch is hacked a bit.
 

Theonik

Member
We don't know if there is any memory on the SoC. Also, a 128-bit bus leads to 52GB/s, which is much faster than the X1's 26GB/s that we assumed before this leak. Thraktor's post about it being odd to have 2 chips if it is 64-bit is already pointing to something interesting to discuss from this leak.
Extra memory on the SoC seems unlikely. The leaker states a 10x10mm die area, which is already smaller than the X1, and having eDRAM or eSRAM takes a lot of die area, which reduces the space available for the actual GPU on the die.
 
I'll post one last point on this

The October devkits being more powerful than the July devkits means that either the SoC changed or the clocks changed; that is literally what it means. This leak is saying that the clocks changed, and if these are final clocks, as it would seem they are from both the devkit getting more powerful and the stress testing over 8 solid days, it would also point to the A57 in the X1 being swapped out for the A72, because even at 16nm the A57 would draw too much power at 1.78GHz.

I hope those interested in this topic understand the paragraph above.

I appreciate this post and the speculation from this Foxconn leak, but it did get a bit sidetracked by SCD speculation.

There are still two problems I have with this leak though:

1) The timing of DF's clock speed leak. The article clearly suggests that the clock speed leak happened recently, as in December. In the video, they apparently suggest the leaked specs (i.e. the Jetson TX1) info was from 4-5 months prior, but I don't think they meant the clock speeds were that old too. If so, that would suggest Nintendo has chosen lower clock speeds than those that Foxconn was testing at, which isn't that surprising or crazy.

2) The speculation on the CPU cores. Basically the leaker thinks that the CPU cores are A72 or A73 specifically because the clock speed is too high for it to be A57 in portable mode. Well, if the above situation is accurate (leaked speeds came after the Foxconn leak) then the power consumption issue is no longer present. I don't think there's anything preventing A57s from running at 1.78GHz in a stress test environment.

Now, the main thing supporting this Foxconn leak, as you said, is LKD indicating that the October devkits are more powerful. That's still a pretty vague claim though, so we have no idea in what way. So perhaps we can get some clarification from Eurogamer or DF about when exactly those clock speed leaks happened, as that would likely settle this matter.

Hmm, if any game would be able to sell me on motion controls, it would be Skyrim. It would be interesting if they decided to implement it, as the battle system, due to its simplicity, lends itself quite well to motion control. If true, it would be a valid reason to postpone the release to fall of this year, but if not, then they seem to just be taking it leisurely with the Switch, which would be a bit disappointing.

I agree that it would be fantastic, but I'm not holding my breath on motion controls specifically. I really doubt Bethesda would be implementing them just for the Switch, so the only way I see them coming is if Skyrim VR is happening alongside.

But I'm absolutely expecting HD rumble because I don't think that requires all that much work, and is very interesting tech, seemingly according to Todd Howard. Imagine feeling ice cracking in one hand, shock crackling in your other hand, the tension of a bow string, or the clang of a mace hitting someone's head... It's one of the best possible showcases of HD rumble for the "core" audience, so I think it's something Nintendo has pushed and Todd Howard is very interested in.
 

Hermii

Member
I'm pretty sure there are Digital Foundry members on GAF, so why doesn't one of us PM them to settle the matter of how recent the clock speed leak is?
 

Zedark

Member
I agree that it would be fantastic, but I'm not holding my breath on motion controls specifically. I really doubt Bethesda would be implementing them just for the Switch, so the only way I see them coming is if Skyrim VR is happening alongside.

But I'm absolutely expecting HD rumble because I don't think that requires all that much work, and is very interesting tech, seemingly according to Todd Howard. Imagine feeling ice cracking in one hand, shock crackling in your other hand, the tension of a bow string, or the clang of a mace hitting someone's head... It's one of the best possible showcases of HD rumble for the "core" audience, so I think it's something Nintendo has pushed and Todd Howard is very interested in.

Stop it, you're getting me all excited now! That does seem like a thing Todd Howard would see potential in, as it helps make the environments come to life and roots you in the world, so I could see them doing that indeed.
 

Inuhanyou

Believes Dragon Quest is a franchise managed by Sony
I'm pretty sure there are Digital Foundry members on GAF, so why doesn't one of us PM them to settle the matter of how recent the clock speed leak is?

John is the only one we have a direct connection to, and who knows if he would know anything, or whether he would say anything even if he knew.

Richard is the one to ask, and we can't get in contact with him
 
I'm pretty sure there are Digital Foundry members on GAF, so why doesn't one of us PM them to settle the matter of how recent the clock speed leak is?

Yeah that should clear this up rather easily. It wouldn't really indicate that the Foxconn leaker is wrong but it would indicate that those clock speeds aren't intended to be used in the final hardware, at least at launch.

Stop it, you're getting me all excited now! That does seem like a thing Todd Howard would see potential in, as it helps make the environments come to life and roots you in the world, so I could see them doing that indeed.

It's also one of the only reasons I can think of for why the game is coming so late (Fall 2017); it's not like it should be all that hard to port a ~5-year-old game with some graphical enhancements to fairly standard Nvidia hardware. It also explains why Skyrim on the Switch doesn't have a subtitle yet... I'm still at 80% sure HD rumble is happening. Don't get mad at me if it doesn't happen though!
 

Hermii

Member
John is the only one we have a direct connection to, and who knows if he would know anything, or whether he would say anything even if he knew.

Richard is the one to ask, and we can't get in contact with him

I don't think there would be any reason to be secretive about how old these leaks are. Doesn't hurt to ask.
 

z0m3le

Banned
Extra memory on the SoC seems unlikely. The leaker states a 10x10mm die area, which is already smaller than the X1, and having eDRAM or eSRAM takes a lot of die area, which reduces the space available for the actual GPU on the die.

If we are talking about 4MB of RAM on 16nm, this might not be the case; there was 8MB of VRAM on the 3DS and it didn't take up the entire surface area of the device, btw.

I appreciate this post and the speculation from this Foxconn leak, but it did get a bit sidetracked by SCD speculation.

There are still two problems I have with this leak though:

1) The timing of DF's clock speed leak. The article clearly suggests that the clock speed leak happened recently, as in December. In the video, they apparently suggest the leaked specs (i.e. the Jetson TX1) info was from 4-5 months prior, but I don't think they meant the clock speeds were that old too. If so, that would suggest Nintendo has chosen lower clock speeds than those that Foxconn was testing at, which isn't that surprising or crazy.

2) The speculation on the CPU cores. Basically the leaker thinks that the CPU cores are A72 or A73 specifically because the clock speed is too high for it to be A57 in portable mode. Well, if the above situation is accurate (leaked speeds came after the Foxconn leak) then the power consumption issue is no longer present. I don't think there's anything preventing A57s from running at 1.78GHz in a stress test environment.

Now, the main thing supporting this Foxconn leak, as you said, is LKD indicating that the October devkits are more powerful. That's still a pretty vague claim though, so we have no idea in what way. So perhaps we can get some clarification from Eurogamer or DF about when exactly those clock speed leaks happened, as that would likely settle this matter.

Honestly, I didn't want to create this thread because it goes against the Eurogamer rumor. The problem I have with the Eurogamer rumor is that you would think they would be more specific somewhere about the timeline. Sure, they say those are final target specs, but such specs often change with these designs, so whether the information is from July devkits or from October devkits is important. Some developers were using July devkits as recently as 3 weeks ago, and still might be, so I'd like for them to really put it to rest with "the final devkits from October run at these clocks".

Personally I think not using the A72 with the 1.7GHz clock would be the biggest mistake of this hardware design; it just doesn't make sense to me, and after seeing all the tech in the device, it makes even less sense. I lived through the Wii U, so I know Nintendo often makes completely dumb decisions, but those are based on some odd focus they have; with the Wii U it was Wii BC and embedded memory, but here? What could possibly lead them to keep the A57 @ 1GHz when they could get about twice that performance at the same power consumption by moving to 16nm, especially when the XB1 S and PS4 Pro SoCs are both harder and more expensive to manufacture and have already moved to this node?

The video was worded weirdly, but for the article they could have confirmed with an old devkit as well; we just don't know what is going on, and that's assuming they even checked recently. If devkits in July got final target specs, and they are 100% sure the clocks can't be changed, then why would they even go back and check with their sources in December? Their trigger was the VentureBeat article.
 

TLZ

Banned
John is the only one we have a direct connection to, and who knows if he would know anything, or whether he would say anything even if he knew.

Richard is the one to ask, and we can't get in contact with him
Someone could try contacting John, and he could get in touch with Richard.

But why and how would they know any different now though? Maybe they've not been updated on anything yet. I'd say we're better off waiting til release then we'd have better info.
 

z0m3le

Banned
Someone could try contacting John, and he could get in touch with Richard.

But why and how would they know any different now though? Maybe they've not been updated on anything yet. I'd say we're better off waiting til release then we'd have better info.

It is 6 weeks away; the real problem is that, other than this, we only have the super devkits to talk about, and you saw what happened to the thread when we did.
 

Theonik

Member
If we are talking about 4MB of RAM on 16nm, this might not be the case; there was 8MB of VRAM on the 3DS and it didn't take up the entire surface area of the device, btw.
The reason is that it's a very small pool. The issue is that you need it to be fairly sizeable to be useful in the context you are thinking about.
 

-shadow-

Member
The cord on the unit was to avoid them being stolen, and it's attached to the back; plus, people couldn't have docked/undocked the unit with a cord connected to the USB port.

nintendo-switch-console-hands-on-0031-1500x1000.jpg

Did they... Did they really just cut a piece of plastic from the dock to make it fit? :'D
 
Honestly, I didn't want to create this thread because it goes against the Eurogamer rumor. The problem I have with the Eurogamer rumor is that you would think they would be more specific somewhere about the timeline. Sure, they say those are final target specs, but such specs often change with these designs, so whether the information is from July devkits or from October devkits is important. Some developers were using July devkits as recently as 3 weeks ago, and still might be, so I'd like for them to really put it to rest with "the final devkits from October run at these clocks".

Personally I think not using the A72 with the 1.7GHz clock would be the biggest mistake of this hardware design; it just doesn't make sense to me, and after seeing all the tech in the device, it makes even less sense. I lived through the Wii U, so I know Nintendo often makes completely dumb decisions, but those are based on some odd focus they have; with the Wii U it was Wii BC and embedded memory, but here? What could possibly lead them to keep the A57 @ 1GHz when they could get about twice that performance at the same power consumption by moving to 16nm, especially when the XB1 S and PS4 Pro SoCs are both harder and more expensive to manufacture and have already moved to this node?

Honestly, the only reasons I can think of are cost and timing, and it seems like timing is less of an issue than cost would be. I agree it would be very disappointing if the CPU winds up being 4 A57s at 1GHz... That would probably become a bottleneck within the lifetime of the Switch (or the first Switch). I'm hopeful for A72 cores with or without this leak though, as nobody (including Eurogamer) has claimed what the final core configuration would be.

Someone could try contacting John, and he could get in touch with Richard.

But why and how would they know any different now though? Maybe they've not been updated on anything yet. I'd say we're better off waiting til release then we'd have better info.

It's not about whether or not they know anything new, it's about whether their clock speed info published in December actually came from December, or if it came from July like the Jetson TX1 specs did.
 

LordOfChaos

Member
Edit: TL;DR: If the leak is accurate (which is a big if), then it looks like Nintendo is making dev kits using Nvidia's GTX 1060 GPU. It seems likely that the final product would be a special dock for the Switch allowing it to play games with this more powerful GPU. Although it would be clocked lower than the GTX 1060, it would still be very powerful, potentially competitive with PS4 Pro.

My mind can't roll with this from a messaging standpoint.

Reggie goes on talk shows and says the entire console is in the tablet, there's nothing in the dock.

They reveal the thing and set everyone's minds to it having X level of performance. The keynote happened, advertising started, the site is up, gameplay video is up.

And then...What? Maybe a year or two down the line they advertise the new hardware addon? It just feels messy to me somehow, even more than the Pro/Scorpio.


There's also the technical side: with everything they've said, it's a USB-C port on the Switch, 5Gb/s if Gen 1, 10Gb/s if Gen 2... Thunderbolt 3 at 40Gb/s limits a card like the 1060 to ~85% of its performance.
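Just to line up the raw link rates being compared here (nominal line rates only; usable throughput is lower after encoding and protocol overhead, and the ~85% figure comes from eGPU benchmarks rather than this arithmetic):

```python
# Nominal line rates, in Gb/s, for the interfaces mentioned above.
links_gbps = {
    "USB 3.1 Gen 1": 5,
    "USB 3.1 Gen 2": 10,
    "Thunderbolt 3": 40,
    "PCIe 3.0 x16": 128,  # 8 GT/s per lane x 16 lanes; what a desktop 1060 gets
}

for name, rate in links_gbps.items():
    share = rate / links_gbps["PCIe 3.0 x16"]
    print(f"{name:14s} {rate:4d} Gb/s ({share:.0%} of a desktop slot)")
```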

There's also that it's Nintendo's fanbase: how many would spend maybe 550 dollars for the more powerful version of their console? And if a small number of people buy it, who would develop for it? You could say first-party games matter most on Nintendo platforms, but do those need a dock with roughly a 1060? They could use some AA and AF lipstick, sure, but...


I guess I just have a lot of questions :p
 

Mokujin

Member
Did they... Did they really just cut a piece of plastic from the dock to make it fit? :'D

Seems so; it would have been more awkward if you couldn't dock the unit...

"I kid you not you can dock the Switch here sir, but you see we need to secure our devices..."
 

Mokujin

Member
The way people pick and choose the "rumors" they want to believe is hilarious. A rumor is a rumor, in my opinion, with no facts to back it up... unless you have multiple things from the rumor actually confirmed by Nintendo. Sources don't mean shit in my book. They get stuff wrong all the time... for different reasons.

So you put more trust in the speculation bits that you like but not in people with solid sources? Ok.
 

LordOfChaos

Member
The way people pick and choose the "rumors" they want to believe is hilarious. A rumor is a rumor, in my opinion, with no facts to back it up... unless you have multiple things from the rumor actually confirmed by Nintendo. Sources don't mean shit in my book. They get stuff wrong all the time... for different reasons.


Rumours can be wrong, but some places have more of a reputation to uphold, and rely on that reputation for their income, than "my uncle from China". Especially when several of those sources converge and seem to agree, i.e. Eurogamer, Laura Kate Dale, Emily Rogers, etc.
 

Formless

Member
People keep acting like Splatoon at 720p is all the Switch can do. Even if it is bandwidth-starved, it's still going to go beyond the Wii U. Zelda on Wii U is obviously not 900p.

I posted in another thread about this, but it seems to be overlooked: how are devs handling the docked/undocked modes? This was raised by DF in their Zelda vid, iirc.

If they finish most of the game and then look at adjusting the resolution for docked mode, it may be easier to handle than adjusting the game settings between docked mode and undocked mode, assuming they want good performance in both.

I suspect that Splatoon 2 has not gone through that mastering process.

I'm not sure how bumping the resolution worked for Zelda, but DF had previously assumed that the 720p -> 1080p bump would be more or less free because of the GPU difference; obviously that's not the case. I imagine Zelda is much more memory/CPU dependent than MK8.
 

Mokujin

Member
I'm not sure how bumping the resolution worked for Zelda, but DF had previously assumed that the 720p -> 1080p bump would be more or less free because of the GPU difference; obviously that's not the case. I imagine Zelda is much more memory/CPU dependent than MK8.

My guess is that the Switch version may have greater LOD distances, so it's effectively not only upping the resolution but that as well. I don't want to check many BotW comparisons though, because I don't want to spoil the game further for myself, so I'm not sure if there is proof of this.
 

bomblord1

Banned
Nah. Even Juan is much faster. But it's not a tablet. Don't look at FP16 numbers; a lot of game computation needs at least FP32 precision to avoid visible artefacts.

The Xbox One is faster, but the Switch should be well within "spitting distance", "the same ballpark", whatever you want to call it. Basically I believe it should be able to handle 900p Xbox One games at, say, 720p with some cut-down effects.
 

Donnie

Member
Your leaks indicate LPDDR4 is used, two chips' worth, which might mean either a 64-bit or a 128-bit bus. That gives us a very good idea of the ballpark for bandwidth.

Still, that ballpark amount is either 2 or 4 times the Wii U's bandwidth (depending on whether it's 64- or 128-bit). Now of course the Wii U also had embedded memory, but if it's 128-bit the Switch would still have more bandwidth overall (and that's before counting the bandwidth advantages of tile-based rendering).
 

Donnie

Member
Nah. Even Juan is much faster. But it's not a tablet. Don't look at FP16 numbers; a lot of game computation needs at least FP32 precision to avoid visible artefacts.

Certainly there will always be a need for FP32, and how much you can run in FP16 will depend on the game. The only example we've heard so far was (AFAIR) a Ubisoft dev running 70% of his code in FP16.
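A rough Amdahl's-law-style estimate of what that 70% figure would buy, assuming double-rate FP16 (which is what X1-class hardware offers); this is just my sketch, not anything from the dev's comments:

```python
def fp16_speedup(fp16_fraction: float, fp16_rate: float = 2.0) -> float:
    """Overall shader throughput gain when a fraction of the work runs at the faster FP16 rate."""
    fp32_fraction = 1.0 - fp16_fraction
    return 1.0 / (fp32_fraction + fp16_fraction / fp16_rate)

print(round(fp16_speedup(0.7), 2))  # the ~70% case quoted above
print(round(fp16_speedup(1.0), 2))  # upper bound if everything ran in FP16
```

So even an aggressive 70% conversion buys roughly a 1.5x shader throughput gain, not a doubling, which is why FP16 helps but isn't a magic bullet.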
 

LordOfChaos

Member
The A72s/A73s have been pointed to as the most significant thing in this rumour, but I would say it's actually the possibility of a 128-bit bus. Memory bandwidth is definitely what has held back mobile chips to date; they can on paper be more powerful than PS3/360 level, but they get pretty severely constrained there.

128 bit could land us around 50GB/s, which if there's no eSRAM or eDRAM or large SRAM pool on-die, could be a saving grace.

I don't think the Tegra X1 by itself supports 128-bit interfaces, but it's maybe possible. The 12.9-inch iPad Pro has a 128-bit interface, albeit on a larger die.
 

Lord Error

Insane For Sony
You do realize that Eurogamer's specs would easily allow 1080p/60fps versions of 720p/30fps Wii U games, right? It's not
Isn't the only fully comparable game running at basically identical resolution and performance on Switch as it does on Wii U? MK8 is 1080/60 in 1-2 player mode and 1080/30 in 3-4 player mode on both, no? Even looking at paper performance from Eurogamer's specs, it really doesn't seem like something more powerful than the Wii U, much less 4x more powerful like you're suggesting it should be (unless I'm reading it wrong).
 

prag16

Banned
Isn't the only fully comparable game running at basically identical resolution and performance on Switch as it does on Wii U? MK8 is 1080/60 in 1-2 player mode and 1080/30 in 3-4 player mode on both, no? Even looking at paper performance from Eurogamer's specs, it really doesn't seem like something more powerful than the Wii U, much less 4x more powerful like you're suggesting it should be (unless I'm reading it wrong).

No, that is incorrect.

Zelda Wii U: 720p30 with significant drops
Zelda Switch docked: 900p30 with minimal drops
Zelda Switch mobile: 720p30 with minimal drops

MK8 Wii U: 720p60 (720p30 for 3/4 players)
MK8 Switch docked: 1080p60 (1080p30 for 3/4 players)
MK8 Switch mobile: 720p60 (720p30 for 3/4 players)

Handheld mode seems only marginally more powerful than Wii U. Docked mode is significantly more powerful.
 
Isn't the only fully comparable game running at basically identical resolution and performance on Switch as it does on Wii U? MK8 is 1080/60 in 1-2 player mode and 1080/30 in 3-4 player mode on both, no? Even looking at paper performance from Eurogamer's specs, it really doesn't seem like something more powerful than the Wii U, much less 4x more powerful like you're suggesting it should be (unless I'm reading it wrong).

Mario Kart 8 is 720p on Wii U, regardless of the number of players.
 
Beware Nintendo fans who talk about specs and relative performance.

https://youtu.be/giqDQrfdkFA

You have to wonder if they know they are full of shit in how they mislead people or if they really are that high on the Kool Aid. Without a trace of irony he places performance of the Switch ahead of the One in virtually every category.
 

Hermii

Member
No, that is incorrect.

Zelda Wii U: 720p30 with significant drops
Zelda Switch docked: 900p30 with minimal drops
Zelda Switch mobile: 720p30 with minimal drops

MK8 Wii U: 720p60 (720p30 for 3/4 players)
MK8 Switch docked: 1080p60 (1080p30 for 3/4 players)
MK8 Switch mobile: 720p60 (720p30 for 3/4 players)

Handheld mode seems only marginally more powerful than Wii U. Docked mode is significantly more powerful.

As one would expect from a TX1 clocked at Eurogamer speeds.
 

Odrion

Banned
No, that is incorrect.

Zelda Wii U: 720p30 with significant drops
Zelda Switch docked: 900p30 with minimal drops
Zelda Switch mobile: 720p30 with minimal drops

MK8 Wii U: 720p60 (720p30 for 3/4 players)
MK8 Switch docked: 1080p60 (1080p30 for 3/4 players)
MK8 Switch mobile: 720p60 (720p30 for 3/4 players)

Handheld mode seems only marginally more powerful than Wii U. Docked mode is significantly more powerful.
that difference seems pretty similar to XB1 vs PS4, I wouldn't call the power gap between them "significant"
As one would expect from a TX1 clocked at Eurogamer speeds.
exactly

a P1 GPU would mean that we'd see graphics SOMEWHAT comparable to the Xbox One. What Nintendo showed off at their event wasn't even close to that
 

Donnie

Member
that difference seems pretty similar to XB1 vs PS4, I wouldn't call the power gap between them "significant"

exactly

a P1 GPU would mean that we'd see graphics SOMEWHAT comparable to the Xbox One. What Nintendo showed off at their event wasn't even close to that

A P1? Are you referring to Pascal? Because that would still be the same performance as X1 clock for clock.

Of course you won't see the full performance difference between the Switch and Wii U with basic ports. I mean, with Mario Kart you're just seeing the raw processing difference between the two (720p to 1080p, just over twice the performance needed), but you won't get optimisations to push it any further. For instance, you aren't going to see a port like that converting FP32 to FP16 where possible, because it's just a quick port.
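The "just over twice" bit is plain pixel arithmetic (assuming rendering cost scales roughly linearly with pixel count, which is only an approximation):

```python
def pixels(width: int, height: int) -> int:
    return width * height

# How much extra fill rate each resolution bump needs relative to 720p.
print(pixels(1920, 1080) / pixels(1280, 720))  # 2.25x for the MK8-style 1080p bump
print(pixels(1600, 900) / pixels(1280, 720))   # 1.5625x for the Zelda 900p bump
```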
 
We're likely going to end up seeing the portable mode outperform the Wii U, even if slightly, while the console mode can greatly increase the resolution.
Games built from the ground up for Wii U like MK8 and FAST Racing NEO targeted 720p 60fps, but MK8 had some slight performance issues (though it might've been a coding error?) and FAST Racing was sub-720p at times.
Both are 1080p 60fps locked on Switch in TV mode and 720p 60fps locked on the handheld.

An issue would be looking at games that are farther off, as they might not have had time to optimize the docked mode. Splatoon 2, for example, despite the massive difference in clock speeds between the two modes, runs the same docked and undocked. I think this is a good indication that docked optimization is likely something implemented later on, much like PS4 Pro patches adding higher resolution outputs while building off the base PS4 code.
Zelda, as well, runs at 900p with more dips than the handheld mode at 720p, likely indicating that they're still working on it, though I wouldn't feel comfortable guaranteeing 1080p.

Overall, I think this is a pretty good compromise between power and visuals. The image quality is going to be MUCH greater than Wii U, an issue that held back a lot of Nintendo's prettier games. If they continue to focus on stylized games with solid performance (outside of streaming issues, Zelda seems a pretty steady 30fps and most of their other games are 60fps locked) I'll be pretty happy with the performance especially if I see it as Nintendo's next handheld with the option to play at higher resolutions. Sounds pretty great to me.

I'm not sure if Eurogamer's report is wrong, but if MK8 and FAST Racing are any indication, I should be pretty happy with it.
 

Hermii

Member
A P1? Are you referring to Pascal? Because that would still be the same performance as X1 clock for clock.

Of course you won't see the full performance difference between the Switch and Wii U with basic ports. I mean, with Mario Kart you're just seeing the raw processing difference between the two (720p to 1080p, just over twice the performance needed), but you won't get optimisations to push it any further. For instance, you aren't going to see a port like that converting FP32 to FP16 where possible, because it's just a quick port.

I don't know how complicated that process is, but Sony expects that to be done with pro patches.
 