
Confirmed: The Nintendo Switch is powered by an Nvidia Tegra X1

Status: Not open for further replies.

atbigelow

Member
OK, 4 years, not 5. Just because it was originally a Wii U game doesn't mean it shouldn't run and look significantly better if Switch is really 4-5x more powerful, even with a year of porting (which isn't a rush job by any means).

BotW would be 1080p when docked, with better shadows, better texture filtering, better AA and a solid 30fps, were Switch as powerful as you say. Just because they're getting 1080p in Fast Racing and Mario Kart doesn't mean they will hit that improvement in every game, because as you know different genres have different hardware needs.

If Zelda was rushed for launch, where is the framerate optimisation patch almost a month later? I don't think it's a matter of a rushed port at all, but rather Switch being a modest hardware leap over Wii U, one that struggles to run anything other than a closed-circuit racing game at 1080p.

As I said earlier, Splatoon should run at 1080p in docked mode. If it doesn't, it will add further fuel to the fire that Switch isn't much more powerful than Wii U.

We know the Switch is definitely not powerful enough to bruteforce BOTW at 1080p, locked 30 FPS, and so forth. That doesn't mean there weren't ways to get there; it just means they didn't do them for reasons we don't know. There could easily be a bottleneck in their engine that they just didn't have the engineering time to deal with. That's the reality of software development. And if it's Good Enough©, they might not have any reason to go back in and fix it.

I think it'll be more telling to see if they make any improvements when the expansion pack comes out later this year. That'll require a client patch and we may see some additional improvements.
 

lutheran

Member
Cool, this puts it more in the realm of getting AAA ports, but it probably still won't, because Nintendo. This is going to be a machine for Nintendo games, indies, and a lot of Japanese portable titles.

So Vita 2.0 + Nintendo titles.

Actually, most likely, Vita 2.0 + 3DS 2.0 + Nintendo handheld and console titles that will also look great on a TV screen as well as go on the road with you. For me, this is nirvana.
 
If Zelda is a straight Wii U port, then that means that it uses 2 CPU cores and 1GB of RAM on Switch too... Switch-only tailored games should be able to use 3 CPU cores and 3.2GB of RAM.
 

z0m3le

Banned
We know the Switch is definitely not powerful enough to bruteforce BOTW at 1080p, locked 30 FPS, and so forth. That doesn't mean there weren't ways to get there; it just means they didn't do them for reasons we don't know. There could easily be a bottleneck in their engine that they just didn't have the engineering time to deal with. That's the reality of software development. And if it's Good Enough©, they might not have any reason to go back in and fix it.

I think it'll be more telling to see if they make any improvements when the expansion pack comes out later this year. That'll require a client patch and we may see some additional improvements.

Pretty much. As for the problem, it's likely an LoD issue; that's why it happens mostly on the Great Plateau and can be avoided by changing the camera angle to reduce the draw distance. It could also be a bandwidth problem we're seeing. What we do know is that handheld mode runs the game much better than the Wii U version, and the docked GPU in Switch is twice as powerful again, so even in this port it's clearly above Wii U.

I'm not really going to touch on this again. As I said before, comparing Wii U to Switch with a Wii U game is silly to say the least, much the same as comparing ZoE HD to the original game to try and gauge how much more powerful the PS3 is than the PS2.

If Zelda is a straight Wii U port, then that means that it uses 2 CPU cores and 1GB of RAM on Switch too... Switch-only tailored games should be able to use 3 CPU cores and 3.2GB of RAM.

Wii U has 3 cores and all are available to games; Wii U even closes the background operations, which is why the Home screen actually has to load when you hit it rather than just switching over to the background process.
 

Thoraxes

Member
If Zelda is a straight Wii U port, then that means that it uses 2 CPU cores and 1GB of RAM on Switch too... Switch-only tailored games should be able to use 3 CPU cores and 3.2GB of RAM.

https://www.youtube.com/watch?v=QyMsF31NdNc

 

Pasedo

Member
So what? The Switch is roughly 3x more powerful than the PS3, and the PS3 had amazing-looking games like The Last of Us, UC3 and Killzone 3. If we can get games that look 3x as good as these I'd be very happy.
 

Hermii

Member
So what? The Switch is roughly 3x more powerful than the PS3, and the PS3 had amazing-looking games like The Last of Us, UC3 and Killzone 3. If we can get games that look 3x as good as these I'd be very happy.

3x as good is of course very subjective. Do PS4 games look 10x as good? Do Pro-patched games look 20x as good? Debatable.
 

Durante

Member
It hasn't. The Pascals are their first desktop cards with native FP16 support;
No. The first desktop cards which supported higher-speed FP16 calculations by Nvidia were the Geforce FX series. In 2003.

Developers collectively decided that it wasn't worth bothering with and everyone settled on FP32 everything for a decade and a half.
 
No. The first desktop cards which supported higher-speed FP16 calculations by Nvidia were the Geforce FX series. In 2003.

Developers collectively decided that it wasn't worth bothering with and everyone settled on FP32 everything for a decade and a half.

Desktop Pascals support FP16 at 1/128 or 1/64 speed. Only the P100 supports higher-speed FP16.
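
To put those rate claims into rough numbers, here's a back-of-envelope sketch of theoretical peaks. The core counts and clocks are approximate public specs (and the Switch docked clock is the commonly reported 768MHz), so take it as an illustration of the ratios rather than measured performance:

Code:
def peak_gflops(cores, clock_ghz):
    # one FMA per core per clock = 2 floating point ops
    return 2 * cores * clock_ghz

# GTX 1080 (GP104, consumer Pascal): FP16 runs at roughly 1/64 of the FP32 rate
gtx1080_fp32 = peak_gflops(2560, 1.733)   # ~8873 GFLOPS
gtx1080_fp16 = gtx1080_fp32 / 64          # ~139 GFLOPS, useless for games

# Tesla P100 (GP100): FP16 at 2x the FP32 rate
p100_fp32 = peak_gflops(3584, 1.480)      # ~10609 GFLOPS
p100_fp16 = p100_fp32 * 2                 # ~21218 GFLOPS

# Switch's Maxwell-based GPU (Tegra X1 class), docked: also 2x-rate FP16
switch_fp32 = peak_gflops(256, 0.768)     # ~393 GFLOPS
switch_fp16 = switch_fp32 * 2             # ~786 GFLOPS

for name, fp32, fp16 in [("GTX 1080", gtx1080_fp32, gtx1080_fp16),
                         ("P100", p100_fp32, p100_fp16),
                         ("Switch docked", switch_fp32, switch_fp16)]:
    print(f"{name:14s} FP32 {fp32:7.0f} / FP16 {fp16:7.0f} GFLOPS")

So on consumer Pascal the fast FP16 path is effectively a compatibility feature, while GP100 and the Switch's Maxwell-class GPU can, on paper, double their throughput with it.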

So what? The Switch is roughly 3x more powerful than the PS3, and the PS3 had amazing-looking games like The Last of Us, UC3 and Killzone 3. If we can get games that look 3x as good as these I'd be very happy.

There's really no way to put an accurate number on how much faster it is. The best we can do is just see what kind of visuals we get out of it compared to PS3/360.
 

Mokujin

Member
But the games have to run on the original PS4 and Xbox One. So, no, I don't see third parties going through all that trouble. Maybe when next gen starts.

Also, if it's been on Nvidia cards for 12 years, why haven't we seen games use this before, at least on PC?

A very recent documented example of FP16 work is Frostbite using it for the checkerboard resolve shader in Battlefield 1 and claiming up to a 30% speed-up thanks to it (shader speed, not game speed).

So even in your specific scenario FP16 is already being used; they are not waiting for next gen.

Checkerboard in Battlefield 1 Slide 82
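
The headline win from dropping to FP16 is simply that each value is half the size, which is where the bandwidth and register savings come from. A toy illustration with plain NumPy (nothing to do with Frostbite's actual shader code, just the storage math):

Code:
import numpy as np

# A 1920x1080 RGBA buffer, once at FP32 and once at FP16
rgba32 = np.zeros((1080, 1920, 4), dtype=np.float32)
rgba16 = rgba32.astype(np.float16)

print(rgba32.nbytes / 2**20, "MiB at FP32")  # ~31.6 MiB
print(rgba16.nbytes / 2**20, "MiB at FP16")  # ~15.8 MiB

Whether that turns into an actual speed-up depends on whether the shader was bandwidth- or register-bound in the first place, which is why the slides say "up to" 30% for that one shader.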
 

Panajev2001a

GAF's Pleasant Genius
The 668 GFLOPS has no Nvidia advantage tied into the number; if we are looking to add that, it's another 40%, or a 935 GFLOPS equivalent, or half of a PS4. Thing is, that's a best-case scenario in favor of the Switch. It's easy enough to just look at what AMD APUs do with certain GFLOPS and compare them. I think the A8 7600 with 550 GFLOPS is a pretty easy low bar for Switch when it comes to ports; the main problem with the comparison, though, is that the APU has a much faster CPU.

Is the NVIDIA advantage a generic secret sauce now?
 

mario_O

Member
A very recent documented example of FP16 work is Frostbite using it for the checkerboard resolve shader in Battlefield 1 and claiming up to a 30% speed-up thanks to it (shader speed, not game speed).

So even in your specific scenario FP16 is already being used; they are not waiting for next gen.

Checkerboard in Battlefield 1 Slide 82

Yes, I also read that they used FP16 on the PS3, to save bandwidth. The PS4 and Xbox One don't support it. But for checkerboard it might make sense.
 

z0m3le

Banned
Is the NVIDIA advantage a generic secret sauce now?

Updated number for mixed precision is ~600 GFLOPS as a best case. Take what you want from the technical talk, but if it's just to ignore the post and give your own drivel, you might as well stop quoting people actually working through the numbers and showing real-world examples. But please, continue to push your console war on all of us.

Kinda crazy that it's 2017 and we're making comparisons with consoles that came out in 2005-2006.

Well, it is a handheld. At least we aren't asking how many GameCubes duct-taped together it is.
 
Kinda crazy that it's 2017 and we're making comparisons with consoles that came out in 2005-2006.

True, and while technology has improved, especially in the mobile space, Moore's Law is also slowing down.

I still use my 2009 MacBook and it works great for web browsing and light workloads. I couldn't imagine using a 2002 iBook in 2009.

Unless we find some new method or material for creating microprocessors, we're hitting a ceiling.

I would never say that the PS4 is only marginally faster than the PS3, but I can't help but see diminishing returns for all but the most keen videophile.
 

Panajev2001a

GAF's Pleasant Genius
Updated number for mixed precision is ~600 GFLOPS as a best case. Take what you want from the technical talk, but if it's just to ignore the post and give your own drivel, you might as well stop quoting people actually working through the numbers and showing real-world examples. But please, continue to push your console war on all of us.

You talked about the advantage of FP16, so far so good; then you move on to adding another 40% for the NVIDIA advantage (NVIDIA flops redux). I am not entirely convinced of that part of the argument, and 40% is not a small increase...

Anyways, I will play ball. What am I fighting for?
 

z0m3le

Banned
You talked about the advantage of FP16, so far so good; then you move on to adding another 40% for the NVIDIA advantage (NVIDIA flops redux). I am not entirely convinced of that part of the argument, and 40% is not a small increase...

Anyways, I will play ball. What am I fighting for?

That 40% increase was Maxwell vs GCN, shown in the chart I brought up with the 4.5 TFLOP overclocked GTX 970 vs the 6 TFLOP overclocked AMD RX 480 in DX11, and it was about comparing Switch's 393 GFLOPS Maxwell+ GPU under Nvidia's custom API to AMD's 550 GFLOPS A8 7600 APU under DX11. This was just pointing to an extreme case, and I said as much, but the performance advantage existed; you can go back to the early GTX 1060 (4.3 TFLOPS) vs RX 480 (~5.8 TFLOPS) days and see people recommending the GTX 1060 at the time. Of course Doom was an early outlier, and I recommended the RX 480 to my friend as more future-proof, but the reality of the performance at the time was enough for most people to recommend the lower theoretical-performance card over the AMD one. Of course AMD has since caught up and future games are up in the air, but Nvidia still holds a per-flop advantage in most cases.

So yes, in some cases, a 40% advantage could exist over the current gen AMD consoles. This isn't saying that it would be the general case, or that people should expect it; it would be an outlier, and I've even pointed out cases where the AMD A8 7600 should outperform Switch, but those are also outliers. Thanks to a newer feature set, a closed system and better bandwidth, there should be a case for Switch to meet this particular APU in general, with things going in Switch's favor more often than the reverse.
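
For what it's worth, the paper-flops arithmetic behind that 40% figure is easy to reproduce. The card specs below are the usual published numbers; whether the cross-vendor comparison is meaningful at all is exactly what's being argued here:

Code:
def tflops(cores, clock_ghz):
    # one FMA per core per clock = 2 floating point ops
    return 2 * cores * clock_ghz / 1000.0

gtx970 = tflops(1664, 1.35)      # ~4.5 TFLOPS overclocked
rx480  = tflops(2304, 1.30)      # ~6.0 TFLOPS overclocked
print(rx480 / gtx970)            # ~1.33: the RX 480's paper advantage in that chart

switch_docked = 0.393            # TFLOPS FP32, Maxwell-based GPU
a8_7600       = 0.550            # TFLOPS FP32, GCN-based APU
print(a8_7600 / switch_docked)   # ~1.40: the gap being argued over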
 
Erm, yes, I know. What does this have to do with my post?

Should have quoted the guy you responded to. He's mistaken about desktop Pascals and FP16.

That 40% increase was Maxwell vs GCN, shown in the chart I brought up with the 4.5 TFLOP overclocked GTX 970 vs the 6 TFLOP overclocked AMD RX 480 in DX11, and it was about comparing Switch's 393 GFLOPS Maxwell+ GPU under Nvidia's custom API to AMD's 550 GFLOPS A8 7600 APU under DX11. This was just pointing to an extreme case, and I said as much, but the performance advantage existed; you can go back to the early GTX 1060 (4.3 TFLOPS) vs RX 480 (~5.8 TFLOPS) days and see people recommending the GTX 1060 at the time. Of course Doom was an early outlier, and I recommended the RX 480 to my friend as more future-proof, but the reality of the performance at the time was enough for most people to recommend the lower theoretical-performance card over the AMD one. Of course AMD has since caught up and future games are up in the air, but Nvidia still holds a per-flop advantage in most cases.

So yes, in some cases, a 40% advantage could exist over the current gen AMD consoles. This isn't saying that it would be the general case, or that people should expect it; it would be an outlier, and I've even pointed out cases where the AMD A8 7600 should outperform Switch, but those are also outliers. Thanks to a newer feature set, a closed system and better bandwidth, there should be a case for Switch to meet this particular APU in general, with things going in Switch's favor more often than the reverse.

this type of logic is just wrong
 
I still use my 2009 MacBook and it works great for web browsing and light workloads. I couldn't imagine using a 2002 iBook in 2009.

This. With an SSD and 8GB of RAM, it (a 2009 MacBook Pro) still works like a charm for me; coding and even audio processing and sequencing in Cubase are no problem yet.

Some webpages that are heavy on videos and animations are starting to cause slowdowns, though.


I would never say that the PS4 is only marginally faster than the PS3, but I can't help but see diminishing returns for all but the most keen videophile.
Same here. I like my PS4, but it doesn't give me the same "wow" effect I used to get going to a new console gen.
 
Want to explain why?

The performance variance between a 480 and a 1060 has nothing to do with Nvidia having "better flops". You can't apply some 40% blanket number. When a 1060 outperforms a 480, it's likely because compute performance (flops) isn't the limiting factor.
 

mario_O

Member
Same here. I like my PS4, but it doesn't give me the same "wow" effect I used to get going to a new console gen.

Try going back to a PS3. You'll notice the difference. The other day I played God of War 3 for a while, and boy does it look rough. 720p games in general look pretty bad on my 4K Sammy.
 

Ninja Dom

Member
True, and while technology has improved, especially in the mobile space, Moore's Law is also slowing down.

I still use my 2009 MacBook and it works great for web browsing and light workloads. I couldn't imagine using a 2002 iBook in 2009.

Unless we find some new method or material for creating microprocessors, we're hitting a ceiling.

I would never say that the PS4 is only marginally faster than the PS3, but I can't help but see diminishing returns for all but the most keen videophile.

This. With an SSD and 8GB of RAM, it (a 2009 MacBook Pro) still works like a charm for me; coding and even audio processing and sequencing in Cubase are no problem yet.

Some webpages that are heavy on videos and animations are starting to cause slowdowns, though.



Same here. I like my PS4, but it doesn't give me the same "wow" effect I used to get going to a new console gen.

Yup, and I'm still using a late 2009 27" iMac. With 12GB RAM and 1TB SSD. No slowdowns with anything I personally do on it.

And like the above, I certainly wouldn't have been using a 2002 iMac in 2009.

Gonna have to read up on this Moore's Law now.
 

z0m3le

Banned
The performance variance between a 480 and a 1060 has nothing to do with Nvidia having "better flops". You can't apply some 40% blanket number. When a 1060 outperforms a 480, it's likely because compute performance (flops) isn't the limiting factor.

It was the GTX 970 (Maxwell) outperforming the RX 480 despite having less than half the RAM, lower memory bandwidth, and fewer shaders and texture mapping units (though more ROPs), at a similar clock (both cards are just over 1.3GHz in this test). The RX 480 is 2 years newer on a new node (14nm with 3D transistors) and both carry roughly the same TDP.

The system is the same (same CPU/memory/motherboard); it's a benchmark comparison, after all.

Now, the X1 vs A8 7600 APU comparison actually has more advantages for the X1's GPU, but it does lack in CPU processing, which I already said could be a problem for performance. This isn't a 1:1 comparison, though, just what the GPU should be capable of if it isn't bottlenecked by the CPU (memory bandwidth and size should be higher on the Switch, so that won't be a bottleneck on this system).
 
It was the GTX 970 (Maxwell) outperforming the RX 480 despite having less than half the RAM, lower memory bandwidth, and fewer shaders and texture mapping units (though more ROPs), at a similar clock (both cards are just over 1.3GHz in this test). The RX 480 is 2 years newer on a new node (14nm with 3D transistors) and both carry roughly the same TDP.

The system is the same (same CPU/memory/motherboard); it's a benchmark comparison, after all.

Now, the X1 vs A8 7600 APU comparison actually has more advantages for the X1's GPU, but it does lack in CPU processing, which I already said could be a problem for performance. This isn't a 1:1 comparison, though, just what the GPU should be capable of if it isn't bottlenecked by the CPU (memory bandwidth and size should be higher on the Switch, so that won't be a bottleneck on this system).

Close enough. You are looking at 1 single data point, from a game not even running on AMD's optimized driver for the title.
 

z0m3le

Banned
Close enough. You are looking at 1 single data point, from a game not even running on AMD's optimized driver

That was the point of the YouTube videos I posted... They were DX11: BF1, Overwatch, The Division, Titanfall 2 and For Honor. I mean, did you read my posts and what I said, or did you just see me comparing AMD's 550 GFLOPS to ~400 Nvidia GFLOPS and ignore the post? I was very clear in the post http://www.neogaf.com/forum/showpost.php?p=232544825&postcount=2051 that it's DX11 you'd want to compare, but that was ignoring mixed precision, which would allow DX12 comparisons, and my point was that these GPUs are similar enough in performance to give a better understanding of what the Switch is capable of from a GPU perspective.
 

Pasedo

Member
And that's on a PORTABLE system.

People are never happy... :(

I mean, what graphics techniques have come about that really make a difference in visuals above and beyond a game like Killzone 3? I can't really pick a lot out personally, besides things looking a little sharper and running a little smoother. Is that all the extra power really provides at the end of the day?
 
That was the point of the YouTube videos I posted... They were DX11: BF1, Overwatch, The Division, Titanfall 2 and For Honor. I mean, did you read my posts and what I said, or did you just see me comparing AMD's 550 GFLOPS to ~400 Nvidia GFLOPS and ignore the post? I was very clear in the post http://www.neogaf.com/forum/showpost.php?p=232544825&postcount=2051 that it's DX11 you'd want to compare, but that was ignoring mixed precision, which would allow DX12 comparisons, and my point was that these GPUs are similar enough in performance to give a better understanding of what the Switch is capable of from a GPU perspective.

Ugghh, I give up.
 
And that's on a PORTABLE system.

People are never happy... :(

Exactly.

During the Wii generation I kept saying, "Can you imagine what Nintendo could do with the power of the PS3 or 360?" That's one of the reasons I bought a Wii U, and the Wii U did have beautiful games but was lacking support. Now Skyrim is releasing this holiday along with Super Mario Odyssey... on a tablet you can take with you on the go.
 
That was the point of the YouTube videos I posted... They were DX11: BF1, Overwatch, The Division, Titanfall 2 and For Honor. I mean, did you read my posts and what I said, or did you just see me comparing AMD's 550 GFLOPS to ~400 Nvidia GFLOPS and ignore the post? I was very clear in the post http://www.neogaf.com/forum/showpost.php?p=232544825&postcount=2051 that it's DX11 you'd want to compare, but that was ignoring mixed precision, which would allow DX12 comparisons, and my point was that these GPUs are similar enough in performance to give a better understanding of what the Switch is capable of from a GPU perspective.

That's under DX11, where Nvidia can use their superior driver teams, and it also includes a bonus for lower CPU overhead.

Consoles can use GCN much more efficiently than PC software, so the Nvidia advantage will be smaller there.
 

Turrican3

Member
Would it be possible to *completely* avoid throttling on a chip like the Tegra X1 pushed to the max supported frequency with a properly sized fan+heatsink solution?
 

z0m3le

Banned
That's under DX11, where Nvidia can use their superior driver teams, and it also includes a bonus for lower CPU overhead.

Consoles can use GCN much more efficiently than PC software, so the Nvidia advantage will be smaller there.

I believe I did say that 40% was the best possible case scenario, not that it would be expected or an average.

Anyways, here is the 920MX, which is a 256 CUDA core Maxwell part at 1GHz, so a direct comparison would need to drop about 20% for an unoptimized look at what Switch could do. I don't know if I like this comparison as much: even though the 920MX seems to outperform the A8 7600, it's paired with various CPU configs and doesn't share memory bandwidth with the CPU the way AMD's APUs have to, which might make the performance you see on the AMD APUs closer to realistic Switch performance (they have less memory bandwidth than Switch does). But for a more direct comparison without the whole AMD vs Nvidia angle, here you go; remember to reduce average fps by about 20%. VIDEOS <just found this, as Ghosttrick pointed the 920MX out to me.
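
The "about 20%" is just the ratio of the theoretical peaks, using the numbers above (the 768MHz docked clock is the commonly reported figure, so treat this as a back-of-envelope sketch):

Code:
def gflops(cores, clock_ghz):
    # one FMA per core per clock = 2 floating point ops
    return 2 * cores * clock_ghz

gt920mx       = gflops(256, 1.000)  # ~512 GFLOPS, per the specs quoted above
switch_docked = gflops(256, 0.768)  # ~393 GFLOPS at the reported docked clock

# ~0.23, i.e. knock roughly 20% off the 920MX results
print(1 - switch_docked / gt920mx)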
 
I mean, what graphics techniques have come about that really make a difference in visuals above and beyond a game like Killzone 3? I can't really pick a lot out personally, besides things looking a little sharper and running a little smoother. Is that all the extra power really provides at the end of the day?

More hardware power could be very useful in adding more complex world interactions and physics systems like those in BotW, which is why I'm a bit disappointed by the CPU situation for the Switch. But as far as visuals go, more and more GPU power will only really bloat graphics budgets with less returns, so the Switch is more than good enough in that area for me.

Would it be possible to *completely* avoid throttling on a chip like the Tegra X1 pushed to the max supported frequency with a properly sized fan+heatsink solution?

Probably. The chips might be a bit more expensive as there would likely be a larger number of chips binned for not being able to sustain those clocks, but it should be theoretically possible.

But not really at all relevant to the Switch.
 
I mean, what graphics techniques have come about that really make a difference in visuals above and beyond a game like Killzone 3? I can't really pick a lot out personally, besides things looking a little sharper and running a little smoother. Is that all the extra power really provides at the end of the day?

The difference between Killzone 3 and Horizon is large. I can pick it out easily.
 

Pasedo

Member
More hardware power could be very useful in adding more complex world interactions and physics systems like those in BotW, which is why I'm a bit disappointed by the CPU situation for the Switch. But as far as visuals go, more and more GPU power will only really bloat graphics budgets with less returns, so the Switch is more than good enough in that area for me.

I thought the GPU these days could do most of what the CPU used to do. I think more physics and interactions definitely improve the gameplay experience more than graphics that already looked good several years ago, and I think this is what devs should be focusing on today. It is kind of dumb, then, that there's so much focus on GPU power and GFLOPS. Let's get that CPU power going.
 

Fafalada

Fafracer forever
z0m3le said:
So yes, in some cases, a 40% advantage could exist over the current gen AMD consoles
There are also situations where that advantage is exactly reversed on the same generation of GPUs (Nvidia ending up running the same code 30-40% slower), but that says nothing about console software on the market. Consoles will target specific hardware advantages where it makes sense, and likewise avoid the pitfalls.
Cross-GPU comparisons are only really relevant to PC software, where there are little to no hardware-targeted (as opposed to API- or hardware-family-targeted) optimizations of any kind.
 

BDGAME

Member
One question: we know that Switch is 3 or 4 times more powerful than the last-gen machines in GPU power and has 8 times more RAM.

But how does it compare to the PS3 and X360 in CPU capacity? How much stronger is the Switch CPU?
 

Turrican3

Member
Probably. The chips might be a bit more expensive as there would likely be a larger number of chips binned for not being able to sustain those clocks, but it should be theoretically possible.

But not really at all relevant to the Switch.
Thanks.
As for being relevant... well, I was just trying to understand whether it would make sense for Nintendo to release a more powerful, home-only (no dock, no screen, etc.) Switch by leveraging the very same hardware, just with higher clocks.
 

KingSnake

The Birthday Skeleton
AMD drivers always suck at optimising the performance of their cards on PC. Stop using PC benchmarks for that "NVIDIA FLOPS vs. AMD FLOPS" meme. There are differences in how the GPUs work that go beyond the drivers, like tile-based rendering and color compression, but that doesn't change the definition of FLOPS.

You also can't use the gap for higher-end GPUs to deduce some theoretical gap for lower-end GPUs. It's a logical fallacy unless you clearly know what the bottlenecks are for both sets of cards and whether that scales linearly.

I'm practically an Nvidia fan, but reading these arguments is still cringeworthy.
 
Sharper graphics with nicer-looking textures? I think HDR makes a nice difference, but I don't think that has anything to do with GPU power.

Um, a massive open world filled with giant robots that animate beautifully without frame drops doesn't count? Killzone 3 could show maybe 2 giant robots on screen simultaneously; Horizon can show a dozen in an open world.
 