
What is the actual power of the Nintendo Switch?

Costia

Member
...
Blu and I did some math with some benchmarks. The Switch's CPU came out at roughly 80% of the PS4's performance per core. When we consider the diminishing returns of splitting tasks among more than 3 cores, and not knowing how much of the 4th core is available to Switch devs, we may be looking at something like 50% of the CPU performance of the PS4 at full utilization.
Switch's ARM CPU @1GHz is 80% of the Jaguar at 1.6GHz?
So the ARM is ~25% better per clock?
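For what it's worth, that per-clock figure can be sanity-checked with quick arithmetic (a rough sketch; the 1.020 GHz CPU clock is Eurogamer's reported figure, and the 80% ratio is the benchmark result quoted above):

```python
# Back-of-the-envelope IPC comparison: if a Switch A57 core at ~1.02 GHz
# delivers ~80% of a PS4 Jaguar core at 1.6 GHz, what's the per-clock ratio?
switch_clock_ghz = 1.020  # Eurogamer's reported Switch CPU clock
jaguar_clock_ghz = 1.6    # PS4 CPU clock
relative_perf = 0.80      # Switch core vs PS4 core, per the benchmarks above

per_clock_ratio = relative_perf * jaguar_clock_ghz / switch_clock_ghz
print(round(per_clock_ratio, 2))  # 1.25, i.e. roughly 25% better per clock
```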

Maybe my wording wasn't perfect, but I found two posts from people more tech-savvy than me.
Hope these help.
Looks like 240 is still the theoretical max.
From what I see, each vector processor has a vec4 ALU + a scalar ALU, which would mean it can do 5 MADD operations per cycle, so that's still 10 FLOPs per cycle.
But since it's split 4+1, it would be very hard to take advantage of 100% of the HW's processing power.
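As a sanity check on that 240 number: with the commonly cited Xenos specs (48 unified shader units at 500 MHz, each doing a vec4 + scalar MADD per cycle, counted as 2 FLOPs per MADD), the arithmetic does land on 240 GFLOPS:

```python
# Rough check of the "240 GFLOPS theoretical max" figure for Xenos
# (the Xbox 360 GPU), using its commonly cited specs.
shader_units = 48        # unified shader ALUs
madds_per_cycle = 5      # vec4 ALU (4 lanes) + scalar ALU (1)
flops_per_madd = 2       # multiply + add
clock_hz = 500e6         # 500 MHz

gflops = shader_units * madds_per_cycle * flops_per_madd * clock_hz / 1e9
print(gflops)  # 240.0
```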
 

UltimaKilo

Gold Member
Compared to the Wii U
In handheld mode the GPU is like a Wii U plus (think 3DS > New 3DS), but it has the benefit of 3x the usable RAM, a significantly stronger CPU, and an architecture about 5-10 years newer (yes, I realize that's a wide gap, but the Wii U's architecture was a weird hodgepodge of older and newer).

In docked mode it's about 4x the Wii U's GPU, with the previous benefits.

Compared to the Xbox One
On paper its GPU (docked) is a bit under 1/3rd of the Xbox One's (which actually isn't a huge gap), but it once again benefits from a newer architecture (about 5 years newer), a CPU that is weaker but not significantly so, and about 64% of the usable RAM (3.2GB vs 5GB).

Compared to the PS360
10+ year advantage on GPU architecture
About 4-5x the GPU power (docked)
A significantly better CPU than the 360 but not the Cell (the Cell can still outperform the PS4 CPU in some tasks)
6.4x the usable RAM

In practice what you'll probably see is PS360-era graphics running at 1080p 60fps with a few improved effects.
Or 1080p 60fps PS4/Xbone games running at 720p 30fps.
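The RAM multiples in the comparisons above fall straight out of the commonly cited usable-memory figures (approximate, publicly reported numbers, not official specs):

```python
# Where the "64% of Xbox One's RAM" and "6.4x the PS360's RAM" figures
# come from, using commonly reported usable-memory numbers.
switch_usable_ram_gb = 3.2    # of 4 GB total; the rest is OS-reserved
ps360_ram_gb = 0.5            # 512 MB total on both PS3 and Xbox 360
xbox_one_usable_ram_gb = 5.0  # of 8 GB total

print(switch_usable_ram_gb / ps360_ram_gb)            # 6.4  -> "6.4x"
print(switch_usable_ram_gb / xbox_one_usable_ram_gb)  # 0.64 -> "about 64%"
```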

Examples we can point to are:
Snake Pass
PS4: sub-900p, 30fps
Switch: sub-720p, 30fps, with a few effects removed

Lego City Undercover
PS4: 1080p, 30fps/60fps
Switch: 1080p, 30fps

The Switch version also gained a full physically based lighting system over the Wii U version, higher-resolution shadows, a longer draw distance, increased texture resolution, 1080p docked, and a more consistent framerate in this game.

Thanks for this. My biggest disappointment with the Switch was that Nintendo didn't wait for the new Tegra X2, which would have put it on par with the Xbone. The lack of RAM sucks even more, but thank goodness it's not as originally planned.

I know the Switch is a smash hit, but I'm afraid that the decision not to wait for the X2, along with the still-limited RAM, will hurt the Switch 2-3 years down the road.
 
Is the clock of the Switch's CPU and GPU still the same as Eurogamer reported? Has anyone checked whether Nintendo changed it for launch or after launch? I need to know if it's under 400 for a bet I probably lost.
 

matthewuk

Member
There's something you're all forgetting here.

Look at your phone, and now imagine it a bit thicker. Now imagine that phone could potentially run Black Ops 2 at 720p as opposed to 600p (à la 360/PS3). Now imagine that as soon as you dock it, it runs at 1080p.

So portable = 360 + RAM Pak
Docked = 360 Pro

Also remember the Switch is working with mostly lower clocks and getting better results.

Enhanced 7th gen in your pocket.
 
To have a 4x increase in resolution, the GPU would have to be around 4x stronger than the original 3DS's. The Wii's GPU is not 4x stronger than the 3DS's, and it is missing the fixed-function shaders that were used to make current games on the 3DS look as good as they do. Besides that, it would probably run too hot for a portable form factor, and it would have BC issues without additional hardware.

Oh, that's right: 480/240 = 2, and 2x2 pixels = 4x. My bad. How much stronger is the Wii's GPU than the 3DS's?
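To spell out that arithmetic (using the 3DS top screen's 400x240 as the baseline): doubling each axis quadruples the pixel count.

```python
# Doubling both width and height multiplies the total pixel count by 4.
low_w, low_h = 400, 240    # 3DS top screen resolution
high_w, high_h = 800, 480  # both axes doubled

scale = (high_w * high_h) / (low_w * low_h)
print(scale)  # 4.0
```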


Thanks for this. My biggest disappointment with the Switch was that Nintendo didn't wait for the new Tegra X2, which would have put it on par with the Xbone. The lack of RAM sucks even more, but thank goodness it's not as originally planned.

I know the Switch is a smash hit, but I'm afraid that the decision not to wait for the X2, along with the still-limited RAM, will hurt the Switch 2-3 years down the road.

I think it was a good call by Nintendo to release the Switch in March. The Switch is selling like hotcakes with a first-party lineup that they are spreading out well, and they can build that momentum all the way to the holidays while building up trust from third-party devs. If it had been released in the holidays, not only would it have had to compete with Scorpio, but they'd have had to drag the Wii U, which is already on life support, along to the holidays. Nintendo couldn't wait until this fall to release Zelda, and if it had only been released for the Wii U before the Switch's launch, it would have lost a lot of potential sales.

In terms of power, Pascal would still be below the Xbone. Instead of the Xbone being 2x as powerful in GPU terms, it would be more like 50% more powerful in pure GFLOPS (from the X1's 600 to Pascal's 800 after you count mixed-precision optimization, and this is just a guesstimate). Not saying that isn't significant, by the way. Pascal would likely come along with double the RAM bandwidth, and with A73 CPUs that would make it at least as powerful as the PS4's CPU, all while keeping it as energy efficient as the X1 (or 60% more energy efficient, which they could use in portable mode). It would scale up so much more nicely against the PS4/Xbone, but I still think it was a good call that they didn't wait until November to release the Switch.

Anyway, there's a really good chance a Pascal variant of the Switch will be released eventually. I'm guessing next year, though. I'm just curious whether they will focus more on the 60% better energy efficiency, or on the 40% more power Pascal has (while keeping the same energy consumption as the X1).
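For context on the GFLOPS guesstimates above, the Switch's own docked throughput can be ballparked from the commonly reported Tegra X1 figures (256 CUDA cores, 768 MHz docked clock, double-rate FP16 on this part; treat these as approximations, not official Nintendo specs):

```python
# Ballpark FP32/FP16 throughput for the Switch's Maxwell-based Tegra X1
# at the reported docked GPU clock.
cuda_cores = 256
docked_clock_hz = 768e6
flops_per_core_cycle = 2  # one fused multiply-add = 2 floating-point ops

fp32_gflops = cuda_cores * flops_per_core_cycle * docked_clock_hz / 1e9
fp16_gflops = fp32_gflops * 2  # X1's Maxwell variant supports 2x-rate FP16
print(fp32_gflops, fp16_gflops)  # ~393 GFLOPS FP32, ~786 GFLOPS FP16
```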
 

Dakhil

Member
To have a 4x increase in resolution, the GPU would have to be around 4x stronger than the original 3DS's. The Wii's GPU is not 4x stronger than the 3DS's, and it is missing the fixed-function shaders that were used to make current games on the 3DS look as good as they do. Besides that, it would probably run too hot for a portable form factor, and it would have BC issues without additional hardware.

Oh, that's right: 480/240 = 2, and 2x2 pixels = 4x. My bad. How much stronger is the Wii's GPU than the 3DS's?




I think it was a good call by Nintendo to release the Switch in March. The Switch is selling like hotcakes with a first-party lineup that they are spreading out well, and they can build that momentum all the way to the holidays while building up trust from third-party devs. If it had been released in the holidays, not only would it have had to compete with Scorpio, but they'd have had to drag the Wii U, which is already on life support, along to the holidays. Nintendo couldn't wait until this fall to release Zelda, and if it had only been released for the Wii U before the Switch's launch, it would have lost a lot of potential sales.

In terms of power, Pascal would still be below the Xbone. Instead of the Xbone being 2x as powerful in GPU terms, it would be more like 50% more powerful in pure GFLOPS (from the X1's 600 to Pascal's 800 after you count mixed-precision optimization, and this is just a guesstimate). Not saying that isn't significant, by the way. Pascal would likely come along with double the RAM bandwidth, and with A73 CPUs that would make it at least as powerful as the PS4's CPU, all while keeping it as energy efficient as the X1 (or 60% more energy efficient, which they could use in portable mode). It would scale up so much more nicely against the PS4/Xbone, but I still think it was a good call that they didn't wait until November to release the Switch.

Anyway, there's a really good chance a Pascal variant of the Switch will be released eventually. I'm guessing next year, though. I'm just curious whether they will focus more on the 60% better energy efficiency, or on the 40% more power Pascal has (while keeping the same energy consumption as the X1).

Actually, the Tegra X2 (if the Jetson TX2 spec sheet is any indicator) doesn't use A73 CPUs. Instead, the Tegra X2 uses A57 and Denver 2 CPUs.

[Image: Jetson TX2 spec-sheet slide]
 
Switch's ARM CPU @1GHz is 80% of the Jaguar at 1.6GHz?
So the ARM is ~25% better per clock?


Looks like 240 is still the theoretical max.
From what I see, each vector processor has a vec4 ALU + a scalar ALU, which would mean it can do 5 MADD operations per cycle, so that's still 10 FLOPs per cycle.
But since it's split 4+1, it would be very hard to take advantage of 100% of the HW's processing power.

Thanks for this. My biggest disappointment with the Switch was that Nintendo didn't wait for the new Tegra X2, which would have put it on par with the Xbone. The lack of RAM sucks even more, but thank goodness it's not as originally planned.

I know the Switch is a smash hit, but I'm afraid that the decision not to wait for the X2, along with the still-limited RAM, will hurt the Switch 2-3 years down the road.
Battery tech, form factor, and cost would still have prevented the system from being on par with the XB1. I wouldn't be surprised if Nintendo launches a "new Switch" in a few years with newer tech and enhanced/added portable/console modes.
 

Pasedo

Member
Everyone is going to jump into the new hybrid gaming category. Once again, Nintendo is leading the way. It's great for the industry, as it resets graphical expectations and will keep the industry excited for many years to come about portable GPU advancements for true hybrid form factors. At some point the Nvidia Tegra mobile line may take over the traditional GPU line. The Oracle has spoken.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
Switch's ARM CPU @1GHz is 80% of the Jaguar at 1.6GHz?
So the ARM is ~25% better per clock?
That's an aggregated figure - the A57 is generally better per clock, but by how much depends on each individual task.

Looks like 240 is still the theoretical max.
From what I see, each vector processor has a vec4 ALU + a scalar ALU, which would mean it can do 5 MADD operations per cycle, so that's still 10 FLOPs per cycle.
But since it's split 4+1, it would be very hard to take advantage of 100% of the HW's processing power.
Xenos is a VLIW2 design with a vec4 part, a scalar part, and 4-element vector GPRs. The ISA does MADDs and dot products in the vec4 part alone - the scalar part is a special unit. Here are the instructions that a 2nd-generation Xenos (AMD Z4xx / Qualcomm Adreno 2xx) does: https://github.com/freedreno/freedreno/blob/master/fdre-a2xx/asm/ir.c#L346
 

phanphare

Banned
Thanks for this. My biggest disappointment with the Switch was that Nintendo didn't wait for the new Tegra X2, which would have put it on par with the Xbone. The lack of RAM sucks even more, but thank goodness it's not as originally planned.

I know the Switch is a smash hit, but I'm afraid that the decision not to wait for the X2, along with the still-limited RAM, will hurt the Switch 2-3 years down the road.

You think the current version of the Switch will be the only SKU on the market 2-3 years down the road?

 
To have a 4x increase in resolution, the GPU would have to be around 4x stronger than the original 3DS's. The Wii's GPU is not 4x stronger than the 3DS's, and it is missing the fixed-function shaders that were used to make current games on the 3DS look as good as they do. Besides that, it would probably run too hot for a portable form factor, and it would have BC issues without additional hardware.

Oh, that's right: 480/240 = 2, and 2x2 pixels = 4x. My bad. How much stronger is the Wii's GPU than the 3DS's?
Actually, the 3DS's Pica200 may be stronger than the Wii's GPU, but half of its resources are usually reserved for rendering a separate image for each eye to produce the 3D effect. The Wii's GPU also doesn't have the modern features that would be extremely helpful for utilizing it properly. Replacing the 3DS's GPU with the Wii's would be a step backwards.

It would definitely be more attractive to use a downclocked version of the Wii U's GPU, but that would be very impractical to put in a portable system.
 