
Nintendo Switch: Powered by Custom Nvidia Tegra Chip (Official)

Thanks for the responses. Yeah, I'm keeping my expectations in check, even 900p would be a great surprise for me honestly. Only time will tell though, so I can't wait until folks like Digital Foundry get their hands on this thing and post some comparisons.
 

thefro

Member
Most likely, if our current speculation is correct, then no, as the Xbox One struggles to deliver modern games at 1080p, so something weaker would do worse. Now, the difference is that the Switch apparently won't be as held back by its CPU, so it might have an advantage in a few games.

There should also be a pretty significant speed advantage over a traditional hard drive for the Switch carts depending on what tech Nintendo uses.
 
Thanks for the responses. Yeah, I'm keeping my expectations in check, even 900p would be a great surprise for me honestly. Only time will tell though, so I can't wait until folks like Digital Foundry get their hands on this thing and post some comparisons.

I'm expecting 900p to be a common resolution in TV mode, at least for third parties. That being said, Nintendo will most likely deliver 1080p/60fps games that look amazing.

There should also be a pretty significant speed advantage over a traditional hard drive for the Switch carts depending on what tech Nintendo uses.

Yeah, the loading times will be ace on the Switch.
 
Does anyone know of good resources to learn more about this highly technical stuff? Not necessarily limited to Switch, but technical gaming discussion in general? All I can think of are these kinds of GAF threads and Digital Foundry.
 

Scrawnton

Member
I'm confident that NX will use new chips, like Parker. This console is just as important to Nvidia, to get back into the console race and win more contracts, as it is for Nintendo to get back in the race. They aren't going to put out a lackluster system with their names all over it as their last-ditch effort to impress hardware manufacturers. Nvidia isn't going to be this desperate for business and then provide X1s...
 

Bl@de

Member
I would expect that when portable, on battery only, it would down-clock the chip to play at 720p and save power; once docked and using mains power it would use the full resources of the chip and display at 1080p, since saving power would not be required.

This is what I expect as well. Games run at different resolutions with different clock speeds: ~720p and lower clocks in portable mode, ~1080p and full use of the hardware in docked mode. It's the cheapest and easiest way for Nintendo and a good solution for customers. But developers would need to test two settings (similar to downsampling on PS4 Pro) when developing a game; a game can't turn into a slideshow just because of the mode switch.
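
To make the "two settings" idea concrete, here's a minimal Python sketch of how a game might pick between a docked and a portable performance profile. The resolutions and clocks below are illustrative guesses on my part, not confirmed Switch figures.

# Hypothetical sketch: one content set, two performance profiles selected by mode.
# All numbers are placeholders, not real Switch specs.
from dataclasses import dataclass

@dataclass
class PerfProfile:
    render_width: int
    render_height: int
    gpu_clock_mhz: int  # assumed clock, for illustration only

PROFILES = {
    "docked":   PerfProfile(1920, 1080, 1000),
    "portable": PerfProfile(1280, 720, 500),
}

def select_profile(is_docked: bool) -> PerfProfile:
    # Both profiles have to be tested during development so neither turns into a slideshow.
    return PROFILES["docked" if is_docked else "portable"]

print(select_profile(is_docked=False))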
 
I'm inclined to say that the Switch will play the same third party games as the Xbone/PS4, just not at the same graphics levels/resolution. And frankly, I think that's reasonable enough for a hybrid system, and a realistic assumption to make. The fact that a Nintendo handheld will likely do that so soon is fucking astounding in its own right.

I'm more interested in seeing what kind of visual showcases Nintendo and its devs are gonna create with it, honestly.
 

Ninferno

Member
On one hand you have people complaining about battery life, on the other you have people complaining about the lack of graphical power. Guys, just calm down a bit and be realistic, would you? Nvidia is pretty much as good as you can get in terms of power efficiency in the field of serious gaming, so just let the experts do their job. I'm not saying the Switch will be a great product; I'm only suggesting "wait and see".
 

VanWinkle

Member
On one hand you have people complaining about battery life, on the other you have people complaining about the lack of graphical power. Guys, just calm down a bit and be realistic, would you? Nvidia is pretty much as good as you can get in terms of power efficiency in the field of serious gaming, so just let the experts do their job. I'm not saying the Switch will be a great product; I'm only suggesting "wait and see".

I would be MORE than okay with Wii U+ level hardware power in the handheld with like 5 hours battery life, as long as it was a good deal more powerful in the dock.
 

Rodin

Member
Probably more equivalent to Xbox 360, so slightly better than Wii U.

 

Jing_Ke

Member

Yeah, pretty much.

I think that's it for me in regards to Switch threads until the new year. My patience for idiotic speculation was exhausted in the lead up to the reveal trailer.

Not as or just as powerful as the Wii U? Fucking seriously guys?
 
Yeah, pretty much.

I think that's it for me in regards to Switch threads until the new year. My patience for idiotic speculation was exhausted in the lead up to the reveal trailer.

Not as or just as powerful as the Wii U? Fucking seriously guys?

Idiot optimists and idiot pessimists exist in equal measure, really, and both are obnoxious, but the latter is especially evident with anything to do with Nintendo. I sincerely doubt the Switch won't be able to play current-gen games in some form.
 

Thraktor

Member
Hitting a locked 30 or 60 might be easier docked. Having adaptive refresh would be more beneficial on the go.

If there's no resolution difference, then I'd agree with you. If games are expected to run at higher resolutions while docked, though, it could potentially be the other way around if the performance jump isn't enough to accommodate.

It'd be nice if Nintendo also supported adaptive-sync over HDMI for those of us who have our consoles hooked up to monitors, but I somehow doubt they'll be supporting AMD's open standard with Nvidia on board.

For the 7.9" iPad mini that is more comparable to the 6" Switch, the PPI on the Retina iPad is 326 PPI. That is a 30% difference. Not spitting difference or slightly lower.

The iPad Mini doesn't need to be 326 PPI, though. They only arrived at that density because of the way the OS requires 2x resolution jumps for high-PPI scaling (and the previous iPad Mini was 163 PPI). Had iOS handled scaling differently I'm sure they would have stuck with the same 220-264 PPI range that all of their tablets and laptops lie in.
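
As a quick sanity check on these figures, pixel density is just the diagonal pixel count divided by the diagonal size in inches. A small Python sketch; the 6.2", 1280x720 Switch screen used here is the commonly rumoured spec, not a confirmed one.

# PPI = sqrt(width^2 + height^2) / diagonal in inches
import math

def ppi(width_px, height_px, diagonal_in):
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(2048, 1536, 7.9)))  # Retina iPad mini: ~324 PPI (marketed as 326)
print(round(ppi(1280, 720, 6.2)))   # rumoured Switch screen: ~237 PPI (assumption)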

Nothing that I can find easily; it was something I heard asked of an engineer on a stream (and I don't remember where, as it was a while ago). I don't know how significant the differences are, though; I got the impression that it's mostly all there, and that it's more a case of features lagging rather than outright impossibility.

I'm not sure how much lagging features would have to do with FP16, though. In general, an algorithm is going to fall into one of three categories when you run it at a lower precision:

1 - Is completely unaffected by the drop in precision, and always produces precisely the same output as it would at a higher precision.

2 - The drop in precision can cause small- to medium-sized errors in the output compared to higher precisions. This can range from occasionally flipping the lowest bit (which wouldn't be noticeable in a real-time rendering environment) to frequent errors in higher bits, which can cause visible rendering errors, such as banding.

3 - Is fundamentally unstable at the lower precision, producing results which are totally unrelated to what would be produced at a higher precision.

Any given graphical technique performed in shaders on a GPU is going to fall into one of those categories*. Effectively, it's either going to work at FP16 or it's not, and in most cases this is a characteristic of the algorithm itself, rather than the implementation of the algorithm in code (i.e. it's not simply a matter of tweaking the code to get it working on FP16). In theory you may be able to use another algorithm that produces the same result but remains stable at FP16, typically at the expense of some performance. This might not always be the case, however (for example, graphical techniques which attempt to emulate some physical phenomenon, such as PBR, may be constrained by the workings of that phenomenon itself).

To be honest, though, I have no idea to what extent any of this affects engine programmers attempting to optimise for a platform with good FP16 performance. I have no doubt there are quite a few people out there who have worked hard to figure out which techniques work well in FP16 and which don't, but unfortunately none of them seem interested in giving a GDC talk on it. At this point I'm half tempted to write an FP16 emulation library and start testing stuff myself.

* Although I very much doubt any fall into the third category. It's more of an issue for scientific simulations (which, incidentally, is one of the reasons why FP64 is so important for HPC cards).
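
For anyone curious what that kind of FP16 emulation testing might look like, here's a minimal sketch using NumPy's float16 type to round every intermediate value and compare against the FP32 result. The "shader" below is just a toy Lambert-style lighting calculation made up for illustration, not code from any real engine.

# Compare a toy shading calculation at FP32 vs with every intermediate rounded to FP16.
import numpy as np

def lighting_fp32(n_dot_l, albedo, intensity):
    return albedo * intensity * np.clip(n_dot_l, 0.0, 1.0)

def lighting_fp16(n_dot_l, albedo, intensity):
    f16 = np.float16
    return f16(f16(albedo) * f16(intensity) * f16(np.clip(f16(n_dot_l), 0.0, 1.0)))

a = lighting_fp32(0.37, 0.82, 1.9)
b = float(lighting_fp16(0.37, 0.82, 1.9))
print(a, b, abs(a - b))  # identical -> category 1; small error -> category 2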

Pretty much all GPUs these days are bandwidth constrained relative to compute, especially ones that have over 5x the compute power of consoles and less than twice the bandwidth. Compute scales faster than bandwidth, and it's a problem. See 4K and VR if you think this is merely academic as opposed to something people in the industry take seriously.

I don't doubt that there are certain situations where Pascal GPUs can be bandwidth constrained, and 4K and VR can certainly represent issues when running bandwidth-intensive engines.

But my point wasn't about Pascal GPUs sometimes being bandwidth constrained in specific situations. My point arose because an "insider" on Anandtech forums claimed (and I'm paraphrasing here) that "because Switch has 1/6th the bandwidth of XBO, therefore it's limited to 1/6th the performance". It's a ludicrous statement, and I was pointing out how ludicrous it is by showing that, by his own logic, all Pascal GPUs would be "horrifically, cripplingly bandwidth constrained". They're not. If they manage to get the performance they do while 80% of their ALU logic is sitting idle (which is what he's implying with the bandwidth comparisons), then Nvidia are capable of straight-up witchcraft.
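
To put some rough numbers behind that, the comparison worth making is compute per unit of bandwidth, not bandwidth in isolation. A quick sketch using approximate public spec-sheet figures; the TX1 bandwidth here is just the 25.6GB/s scenario being argued about, not a confirmed Switch spec.

# GFLOPS available per GB/s of main memory bandwidth, using rough headline figures.
specs = {
    "Xbox One (DDR3 only)": (1310, 68),   # ~1.31 TFLOPS, 68 GB/s (ignoring ESRAM)
    "GTX 1080":             (8900, 320),  # ~8.9 TFLOPS, 320 GB/s
    "TX1 @ 25.6 GB/s":      (256, 25.6),  # 256 GFLOPS FP32 in the argued scenario
}

for name, (gflops, bw) in specs.items():
    print(f"{name}: {gflops / bw:.1f} GFLOPS per GB/s")

A desktop Pascal card already runs at a higher compute-to-bandwidth ratio than the XBO without leaving most of its ALUs idle, which is the point.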

Pretty much all GPUs these days are bandwidth constrained. As for the XBO vs TX1 situation, Nvidia's bandwidth optimisations aren't going to net significant enough performance increases to offset only having 25.6GB/s of memory bandwidth.

I do see a lot of people engaging in insane amounts of fudge factoring without a single piece of evidence, though. Apparently fp16, color compression, "NV FLOPS", and tiled rasterization somehow mean the TX1-level hardware likely to be in the NS is a portable XBO or PS4.

But we don't know if it's TX1-level hardware. It could be around that, or it could be above, or it could be below. We also have no idea whether it has 25GB/s of bandwidth; it could well be 50GB/s, or it could even theoretically be 256GB/s. Yeah, that last one is extremely unlikely, but my point is that, aside from using a TX1 in dev kits when that was literally the only off-the-shelf chip available to them, we don't really have any reliable, specific information on either the performance level or memory system of Switch, so any claims that it's going to be bandwidth constrained are just being plucked from thin air.

The original semiaccurate.com leak that pointed us in the direction of Nvidia stated that they were pretty distraught over losing the console bids for the PS4/Xbox One and were willing to give Nintendo a really strong deal in terms of both software support and hardware to make it work. It sounded kind of unbelievable at the time but given that the rest of the rumor came true I'm not sure what to believe.

Shield TV was 5-20w with a Maxwell Chip. Pascal will hopefully be less on both the low and high ends. We don't really know until we have a better idea of the internals.

Charlie from Semiaccurate isn't known for being particularly favourable to Nvidia, though, so it's possible that his "taking a loss on the sale" was just an exaggeration of them accepting much lower margins, closer to the ~15% AMD makes from their console business.

Does anyone know of good resources to learn more about this highly technical stuff? Not necessarily limited to Switch, but technical gaming discussion in general? All I can think of are these kinds of GAF threads and Digital Foundry.

Anandtech and Ars Technica frequently have good articles on hardware. For games-specific stuff, GDC talks are also a great resource. They're all up for free online a few months after the conference, and lots of the talks about engine optimisation will give you a good insight into how hardware decisions affect game developers.
 

nynt9

Member
Idiot optimists and idiot pessimists exist in equal measure, really, and both are obnoxious, but the latter is especially evident with anything to do with Nintendo. I sincerely doubt the Switch won't be able to play current-gen games in some form.

I mean, if the system has 4GB RAM like some of the rumors, then it might have a hard time running a lot of current gen games.
 

AdanVC

Member
I'm inclined to say that the Switch will play the same third party games as the Xbone/PS4, just not at the same graphics levels/resolution. And frankly, I think that's reasonable enough for a hybrid system, and a realistic assumption to make. The fact that a Nintendo handheld will likely do that so soon is fucking astounding in its own right.

I'm more interested in seeing what kind of visual showcases Nintendo and its devs are gonna create with it, honestly.

This! If Nintendo managed to develop beautiful looking games on Wii U (Mario 3D World, Mario Kart 8, Pikmin 3, Yoshi's Woolly World, Paper Mario Color Splash, Smash, etc.), just imagine what they will do on Switch with better hardware and all.
 

Deadbeat

Banned
I mean, if the system has 4GB RAM like some of the rumors, then it might have a hard time running a lot of current gen games.
Games will have to be made to work around portable mode as well. That's not going to have anywhere near the raw power of the possible chips the system could have, especially when it comes to cost and battery life.
 
Games will have to be made to work around portable mode as well. That's not going to have anywhere near the raw power of the possible chips the system could have, especially when it comes to cost and battery life.

That's not necessarily true. Earlier in this thread we have people talking about the Pixel C tablet which runs a TX1 (which is what's in the Switch July devkit at least) at max clocks with a battery life of 3 hours, and if the TX1 was indeed used to simulate the custom Pascal chip which sees power efficiency gains around 40%, then it's not unreasonable to expect at least the same 512GFlops as the Pixel C with potentially the same or a longer battery life.

I doubt we'll get up to 750GFlops in portable mode (Parker at full clocks), but you never know, considering that the picture of the car headrest dock includes holes to leave the Switch vents unobstructed, which suggests the fan could be running on battery (though it could also be a convection vent, or only used if the Switch is plugged into a power outlet in the car). If the fan runs on battery power, it could perform a lot better in portable mode than we'd first think.
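
For context on where numbers like 512 and 750 GFLOPS come from: they're just shader core count times two FLOPs per cycle times clock speed. A small sketch; the portable clock at the end is purely hypothetical, not a leak.

# GFLOPS (FP32) = CUDA cores * 2 FLOPs/cycle * clock (GHz)
def gflops_fp32(cuda_cores, clock_ghz):
    return cuda_cores * 2 * clock_ghz

print(gflops_fp32(256, 1.0))    # TX1 at Pixel C max clocks -> 512 GFLOPS
print(gflops_fp32(256, 1.465))  # Parker at full clocks -> ~750 GFLOPS
print(gflops_fp32(256, 0.5))    # hypothetical down-clocked portable mode -> 256 GFLOPS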
 

Zil33184

Member
But we don't know if it's TX1-level hardware. It could be around that, or it could be above, or it could be below. We also have no idea whether it has 25GB/s of bandwidth; it could well be 50GB/s, or it could even theoretically be 256GB/s. Yeah, that last one is extremely unlikely, but my point is that, aside from using a TX1 in dev kits when that was literally the only off-the-shelf chip available to them, we don't really have any reliable, specific information on either the performance level or memory system of Switch, so any claims that it's going to be bandwidth constrained are just being plucked from thin air.

My issue with that argument is that the dev kit doesn't have the power and thermal constraints of the final hardware, so Nintendo weren't limited to using off-the-shelf mobile parts when building approximate hardware. Any Pascal part with disabled cores running underclocked could have been placed on a bespoke board to better simulate the final hardware.

I encourage everyone in the thread to read this Eurogamer article about the lead up to the Wii U launch, especially the part detailing the timeline for dev kits. Even for hardware as esoteric as the Wii U, Nintendo did a pretty good job of matching the final spec in their dev kits.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
My issue with that argument is that the dev kit doesn't have the power and thermal constraints of the final hardware, so Nintendo weren't limited to using off-the-shelf mobile parts when building approximate hardware. Any Pascal part with disabled cores running underclocked could have been placed on a bespoke board to better simulate the final hardware.
That's a really bold assumption to make, and that's even before considering the Pascal undersupply situation. That scenario would not simulate the component interconnects/buses found on a SoC (e.g. cpu and gpu across pci-e vs whatever bus and coherency protocols Maxwell sits at in the TX1) which appears to be the crux of your 'NS will have 25GB/s cumulative BW to the GPU' argument. We don't know how much BW there will be available and how that will be split in the system.

I encourage everyone in the thread to read this Eurogamer article about the lead up to the Wii U launch, especially the part detailing the timeline for dev kits. Even for hardware as esoteric as the Wii U, Nintendo did a pretty good job of matching the final spec in their dev kits.
I can give you the counter-example of MS sending out G5 Macs as early devkits - those had little to do with the performance characteristics of the 360. Which is actually quite normal - early devkits aim at getting devs familiar with the dominant specifics of the future hw (and apparently MS thought PPC64 was the most specific thing about their upcoming console), not so much the exact performance characteristics, which could be tweaked at the 11th hour (e.g. the cpu and gpu clocks of the Cube). Same with early PS4 devkits, which were PCs exhibiting nothing like the bus configuration of the end product.
 

KingSnake

The Birthday Skeleton
Starting to develop a game or a port on a devkit with the same architecture but lower power and bandwidth, then getting a better devkit when the time comes to actually optimise the game/port for decent performance, has no real downsides. I don't think any developer would complain about that.

I don't understand how somebody can make an issue out of this. X1 is still a very good match for Parker in terms of making the game run on it.
 

Vena

Member
Starting to develop a game or a port on a devkit with the same architecture but lower power and bandwidth, then getting a better devkit when the time comes to actually optimise the game/port for decent performance, has no real downsides. I don't think any developer would complain about that.

I don't understand how somebody can make an issue out of this. X1 is still a very good match for Parker in terms of making the game run on it.

Switch is driving some people mad for a very different reason than excitement, or perhaps a very different type of excitement.
 

jett

D-Member
I'm confident that NX will use new chips, like Parker. This console is just as important to Nvidia, to get back into the console race and win more contracts, as it is for Nintendo to get back in the race. They aren't going to put out a lackluster system with their names all over it as their last-ditch effort to impress hardware manufacturers. Nvidia isn't going to be this desperate for business and then provide X1s...

I really doubt Nintendo is using a cutting-edge Pascal-based Tegra, if only because the machine must remain cost-effective and sell at a reasonable price. X1 should be good enough for Nintendo's purposes, although not enough to be graphically competitive with the other two consoles.
 

KingSnake

The Birthday Skeleton
I really doubt Nintendo is using a cutting-edge Pascal-based Tegra, if only because the machine must remain cost-effective and sell at a reasonable price. X1 should be good enough for Nintendo's purposes, although not enough to be graphically competitive with the other two consoles.

So, what do you think costs more in a Pascal-based Tegra than in an X1-based one, considering that the chip is a custom one anyhow?
 

Zil33184

Member
That's a really bold assumption to make, and that's even before considering the Pascal undersupply situation. That scenario would not simulate the component interconnects/buses found on a SoC (e.g. cpu and gpu across pci-e vs whatever bus and coherency protocols Maxwell sits at in the TX1) which appears to be the crux of your 'NS will have 25GB/s cumulative BW to the GPU' argument. We don't know how much BW there will be available and how that will be split in the system.


I can give you the counter-example of MS sending out G5 Macs as early devkits - those had little to do with the performance characteristics of the 360. Which is actually quite normal - early devkits aim at getting devs familiar with the dominant specifics of the future hw (and apparently MS thought PPC64 was the most specific thing about their upcoming console), not so much the exact performance characteristics, which could be tweaked at the 11th hour (e.g. the cpu and gpu clocks of the Cube). Same with early PS4 devkits, which were PCs exhibiting nothing like the bus configuration of the end product.

My conjecture, and I readily admit that it's just conjecture, is that the TX1 is a close match for the performance of the final hardware, and also happens to be architecturally similar to the new custom Tegra in the NS. I'm assuming that if the more aggressive speculation about NS were correct, earlier dev kits would have used a Pascal-based board to better simulate the performance of the final hardware.

It's worth noting that your example of early Orbis dev kits is more similar to my hypothetical pascal based board. Early Orbis dev kits were octo core bulldozers connected to a discrete GCN GPU, which gave a good approximation of PS4's GPU performance over other specifics like bus level architecture or even CPU architecture.

Also the timeline for PS4 dev kits had units with near final SoCs delivered in January 2013, ~10 months before launch. NS dev kits based on TX1 hardware were being used as recently as July. Given a March 2017 release date for NS, that means you would have to believe that Nintendo, or certain third parties at the very least, were still relying on very early, crude approximations of final hardware only 8 months before launch.

Now maybe I'm just a cynic when it comes to Nintendo speculation threads, but absent any well publicized and heavily vetted leaks that would give me reason to reconsider, I think I'll continue to be suspicious of any and all claims of near home console performance in a handheld.
 

viHuGi

Banned
Most likely, if our current speculation is correct, then no, as the Xbox One struggles to deliver modern games at 1080p, so something weaker would do worse. Now, the difference is that the Switch apparently won't be as held back by its CPU, so it might have an advantage in a few games.

I still think it's silly that the Switch apparently won't beat my computer's ATI Radeon 5770 from late 2009. I was hoping the Wii U would, and it didn't, and the Switch seems to be the same, though I guess that's to be expected with a handheld. Hell, this old card beats the Xbox One in some games.

Do we know the CPU? Is Nvidia making it as well?
 

Schnozberry

Member
So, what do you think it costs more in a Pascal based Tegra than in a X1 based one, considering that the chip is anyhow a custom one?

It wouldn't make any sense for a custom chip to be Maxwell at this point, because the Tegra X1 chips that had already been produced were on 20nm. Since Pascal is essentially a shrunken-down Maxwell, whatever they are doing with this isn't a TX1. It seems unlikely to me that Nvidia would submit another 20nm design based on how much they hated it, the only caveat being if they had massive unfulfilled fab commitments. Given the unsuitability of 20nm for anything other than their existing Tegra X1 design wins, my money is on 16nm and a Pascal-based design. The timetable of the dev kits and the trouble Nvidia has had producing Pascal-based chips in significant quantities line up with early kits being on an overclocked Tegra X1.
 

Nerrel

Member
Not as or just as powerful as the Wii U? Fucking seriously guys?

Do you remember when the Wii U was revealed, and naysayers were claiming that it wouldn't be more powerful than the Xbox 360? I do, because I was one of the people arguing "of course it'll be better, don't be stupid." Didn't exactly work out that way. It was unthinkable that any component of the Wii U would struggle to outperform or even fall behind 7 year old hardware, yet Nintendo found a way. And the Wii U wasn't a portable system.

I have hope that this will turn out to be fairly powerful given that Nintendo is letting Nvidia have a large role in the hardware and there are hints that this is a newer version of the Tegra than we've seen, but I don't think you should rule anything out when it comes to the hardware being underwhelming. In no way shape or form has Nintendo earned enough trust for anyone to say "of course it'll be more powerful than Wii U." Until we know for sure what the hardware is and what it's capable of, I still consider "Wii U level performance" a possibility.
 

KingSnake

The Birthday Skeleton
Do you remember when the Wii U was revealed, and naysayers were claiming that it wouldn't be more powerful than the Xbox 360? I do, because I was one of the people arguing "of course it'll be better, don't be stupid." Didn't exactly work out that way. It was unthinkable that any component of the Wii U would struggle to outperform or even fall behind 7 year old hardware, yet Nintendo found a way. And the Wii U wasn't a portable system.

I have hope that this will turn out to be fairly powerful given that Nintendo is letting Nvidia have a large role in the hardware and there are hints that this is a newer version of the Tegra than we've seen, but I don't think you should rule anything out when it comes to the hardware being underwhelming. In no way shape or form has Nintendo earned enough trust for anyone to say "of course it'll be more powerful than Wii U." Until we know for sure what the hardware is and what it's capable of, I still consider "Wii U level performance" a possibility.

Even if it's a Tegra X1, it would need to be cut to a third of its power to match the Wii U. And that's ignoring the more modern architecture and the differences between AMD and Nvidia. And the vents would have to be just for show, like people who put a fake exhaust pipe on their car to make it look like there's something more under the hood. So it's pretty far-fetched.

With Wii U there was only blind guessing, but here we pretty much know the minimum of what's in the box.
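
A quick back-of-envelope check on that "cut to a third" point, using the figures that usually get quoted in these threads (~176 GFLOPS estimated for the Wii U GPU, 512 GFLOPS FP32 for a TX1 at full clocks; both are community estimates rather than official specs):

# Rough ratio between a full-clock TX1 and the commonly estimated Wii U GPU figure.
wii_u_gflops = 176
tx1_gflops = 512
print(tx1_gflops / wii_u_gflops)  # ~2.9x, i.e. the TX1 would need roughly a third of its power to match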
 

PSGames

Junior Member
For those expecting a 1080p mode when docked: you realize all textures and effects will more than likely be optimized for 720p and below?

We already know the screen is 720p, and when the guy removes the Switch from the dock the game immediately shows up on the Switch screen, which means it is likely using the same assets in both modes. And with cartridge space and RAM space (if 4GB is true) coming at a premium, it makes sense to only include one set of assets.

Now, if this is the case, wouldn't it make more sense for the dock to just upscale the image, since the assets will be designed for 720p and below anyway?
 
My conjecture, and I readily admit that it's just conjecture, is that the TX1 is a close match for the performance of the final hardware, and also happens to be architecturally similar to the new custom Tegra in the NS. I'm assuming that if the more aggressive speculation about NS were correct, earlier dev kits would have used a Pascal-based board to better simulate the performance of the final hardware.

It's worth noting that your example of early Orbis dev kits is more similar to my hypothetical pascal based board. Early Orbis dev kits were octo core bulldozers connected to a discrete GCN GPU, which gave a good approximation of PS4's GPU performance over other specifics like bus level architecture or even CPU architecture.

Also the timeline for PS4 dev kits had units with near final SoCs delivered in January 2013, ~10 months before launch. NS dev kits based on TX1 hardware were being used as recently as July. Given a March 2017 release date for NS, that means you would have to believe that Nintendo, or certain third parties at the very least, were still relying on very early, crude approximations of final hardware only 8 months before launch.

Now maybe I'm just a cynic when it comes to Nintendo speculation threads, but absent any well publicized and heavily vetted leaks that would give me reason to reconsider, I think I'll continue to be suspicious of any and all claims of near home console performance in a handheld.

What's cheaper: having Nvidia modify X2 chips (which had previously been adapted for cars) for early dev kits, or using X1 development kits that can be purchased on Amazon, while Nvidia focuses on creating the custom chip and the development software that accompanies the final dev kits?

It also helped to use what was available, because it allowed them to get dev kits into developers' hands pretty quickly.
 

KingSnake

The Birthday Skeleton
For those expecting a 1080p mode when docked: you realize all textures and effects will more than likely be optimized for 720p and below?

We already know the screen is 720p, and when the guy removes the Switch from the dock the game immediately shows up on the Switch screen, which means it is likely using the same assets in both modes. And with cartridge space and RAM space (if 4GB is true) coming at a premium, it makes sense to only include one set of assets.

Now, if this is the case, wouldn't it make more sense for the dock to just upscale the image, since the assets will be designed for 720p and below anyway?

The games shown in the video don't actually run on the Switch.
 

Philippo

Member
I'm not very good with tech, but I understand that the Switch sits slightly behind the XB1 in terms of raw power, and that it supports modern engines like UE4.

The question is: how many ports of AAA games can we expect?
Like, are we going to get games like Watch Dogs 2, For Honor, Battlefield 1 or Titanfall 2?
Of course they'd be at a lower res and lower details (I guess), but still, is there a good chance?
 

Cerium

Member
I'm not very good with tech, but I understand that the Switch sits slightly behind the XB1 in terms of raw power, and that it supports modern engines like UE4.

The question is: how many ports of AAA games can we expect?
Like, are we going to get games like Watch Dogs 2, For Honor, Battlefield 1 or Titanfall 2?
Of course they'd be at a lower res and lower details (I guess), but still, is there a good chance?
We know Dragon Quest XI is coming. We know the engine support is there. Publishers may see value in releasing portable versions of AAA console games. The conditions are ripe for it, but we'll have to wait and see for confirmation.
 

KingSnake

The Birthday Skeleton
I'm not very good with tech, but I understand that the Switch sits slightly behind the XB1 in terms of raw power, and that it supports modern engines like UE4.

The question is: how many ports of AAA games can we expect?
Like, are we going to get games like Watch Dogs 2, For Honor, Battlefield 1 or Titanfall 2?
Of course they'd be at a lower res and lower details (I guess), but still, is there a good chance?

It all depends on the deals that Nintendo made with the third parties and how much the third parties believe there will be a market for those games on Switch. Of course, if it's cheap to port them, that might help a bit.
 