
Nintendo Switch Dev Kit Stats Leaked? Cortex A57, 4GB RAM, 32GB Storage, Multi-Touch.


EDarkness

Member
That's my point. Not only do we not know if he was even vetted (as far as we know), but we don't even know what that means.

Yet people are taking his info as gospel.

I leave it up to the Mods to make that call. He's not banned and I'm sure they're reading this thread, so they must figure he's in a position to know things. We've seen folks banned for less and in short order. Now, whether what he knows is right is something else, but Emily has backed up information he's mentioned as well.
 

Schnozberry

Member
Not with sustained clock speeds.

[Image: A57 power curve]

This is also a 28nm chip. Nvidia wouldn't be using that in the Switch. We're looking at 16nm most likely. The difference in performance per watt is a chasm.
 

MDave

Member
I think there would be no significant problems for 3rd party developers to get Xbox One / PS4 games on the Switch. Why? Based on the information we know:

GPU Performance:

In handheld mode, this is about the same as a Tegra X1 at full speed, or Pascal at 60%:

The GPU gap is about the same as the gap between the PS4's 1.84 TF and the Xbox One's 1.31 TF GPUs, i.e. roughly 530 GFLOPS. Suppose the Switch has a Tegra that performs at 500 GFLOPS FP32 or 1 TFLOP FP16; if games use mixed precision, call that roughly 750 GFLOPS on average. That's about the same gap in FLOPS between the Switch and Xbox One as between the Xbox One and PS4.

With Pascal at 100% when docked, the gap shrinks to about half the difference between the Xbox One and PS4: FP32 750 GFLOPS and FP16 1.25 TFLOPS, averaging to about 1 TFLOP in mixed precision. That's now only about 310 GFLOPS away from the Xbox One, smaller than the difference between the Xbox One and PS4.

The more a developer can take advantage of FP16 the easier it will be to close that gap.
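Here's that arithmetic as a quick Python sketch (the FLOPS figures are the rumoured numbers above, and the 50/50 FP16 share is just my assumption):

```python
# Back-of-the-envelope version of the gaps described above.
# The FLOPS figures are the rumoured numbers from this post; the 50/50
# FP16 share is an assumption, not a known spec.

def mixed_precision_gflops(fp32, fp16, fp16_share=0.5):
    """Weighted average of FP32 and FP16 throughput (GFLOPS)."""
    return (1 - fp16_share) * fp32 + fp16_share * fp16

ps4, xb1 = 1840, 1310                                 # GFLOPS, FP32
switch_handheld = mixed_precision_gflops(500, 1000)   # ~750
switch_docked   = mixed_precision_gflops(750, 1250)   # ~1000

print(ps4 - xb1)               # ~530 GFLOPS: PS4 vs Xbox One
print(xb1 - switch_handheld)   # ~560 GFLOPS: Xbox One vs Switch handheld
print(xb1 - switch_docked)     # ~310 GFLOPS: Xbox One vs Switch docked
```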

So as an idea of what to expect: take the difference in resolution or frame rate you see between the Xbox One and PS4, apply that same step down again between the Switch and Xbox One, and you're painting a pretty good picture of what the Switch can do. For some games the difference is very small; in others it's more apparent. I suspect the same between the Switch and Xbox One.

One game with very little difference between the Xbox One and PS4 versions is Doom (2016); check Digital Foundry's comparison video.

RAM:

Assuming that Xbox One and PS4 developers take advantage of all the uncompressed audio they can fit on a Blu-ray disc, and a game keeps that uncompressed audio in RAM, then there is less concern about RAM than I first thought. Switch developers would presumably use compressed audio, saving a significant amount of game card storage and RAM. And if Xbox One / PS4 developers keep data in RAM only for the sake of faster access times (because of slow disc and HDD access speeds), there's even less concern if the Switch game cards allow much faster access speeds.
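To put a rough number on the audio savings (the bitrates here are generic assumptions, not anything known about Switch games):

```python
# Rough illustration of the storage/RAM saved by compressed audio.
# Bitrates are generic assumptions for the example, not Switch figures.

def audio_size_mb(minutes, kbit_per_s):
    return minutes * 60 * kbit_per_s / 8 / 1024   # kbit -> MB

minutes = 60
pcm_mb = audio_size_mb(minutes, 1536)   # 16-bit stereo 48 kHz uncompressed PCM
ogg_mb = audio_size_mb(minutes, 160)    # a typical lossy bitrate

print(round(pcm_mb), round(ogg_mb))     # ~675 MB vs ~70 MB for an hour of audio
```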

CPU:

We don't know anything officially, but leaks suggest a faster CPU than the Xbox One's, which is itself slightly faster than the PS4's.
 
You have valid points re: architecture, although the real-world results have yet to be seen. If RAM bandwidth is really only 25.6 GB/s, I hate to use strong words, but that's pathetic. Worse than the Wii U with its eDRAM pool used efficiently. We'll see what they can do with tiling and perhaps a small pool of on-die SRAM. Bandwidth may very well be the bottleneck the poster on AnandTech was speaking of. This is all rumor, of course.
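For what it's worth, 25.6 GB/s is exactly what a single 64-bit LPDDR4 channel at 3200 MT/s works out to, which is presumably where the rumoured figure comes from (the configuration is my assumption):

```python
# Where a 25.6 GB/s figure would come from: a 64-bit LPDDR4 interface at
# 3200 MT/s. The configuration is an assumption consistent with the rumour.

bus_width_bits = 64
transfers_per_s = 3200e6            # 3200 MT/s
bandwidth_gb_s = bus_width_bits / 8 * transfers_per_s / 1e9
print(bandwidth_gb_s)               # 25.6 GB/s
```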

Considering RAM is the one thing Nintendo has never been shy about overdoing, I really don't think we should worry about it.

What has he been right about before? What proof do we have he was actually "vetted?" And what's the extent of the "vetting" process?

10k's still around

He has posted fairly frequently over the past few days and his posts have been sourced a lot, so if he is lying about going to the mods before posting this info we would know by now (a mod would tell us/ban him).

Now, it's true that we have no way to know if his info is legit, but mods typically require insiders to prove that they are in a position to know this info. Not that the info itself has to be accurate. So obviously all insider posts should be taken as such, with a grain of salt.
 

Schnozberry

Member
Much like the X1, I'm sure Nintendo's custom Tegra will have an extra core for processing audio. The X1 had a separate Cortex A9 core for this purpose that wasn't exposed to the OS as a processor. They could replace that with a DSP of some kind, or just inherit that part of the architecture. Decompressing audio on the fly won't require use of the primary cores.
 
I leave it up to the Mods to make that call. He's not banned and I'm sure they're reading this thread, so they must figure he's in a position to know things. We've seen folks banned for less and in short order. Now, whether what he knows is right is something else, but Emily has backed up information he's mentioned as well.

If we're taking him at his word simply based on him not being banned...

Forget this--I'm going shopping to pick up some Analytical hats for everyone to wear, because gosh darn do some people here need them

He has posted fairly frequently over the past few days and his posts have been sourced a lot, so if he is lying about going to the mods before posting this info we would know by now (a mod would tell us/ban him).

Now, it's true that we have no way to know if his info is legit, but mods typically require insiders to prove that they are in a position to know this info. Not that the info itself has to be accurate. So obviously all insider posts should be taken as such, with a grain of salt.

10k was also vetted or approved or whatever it was to some degree as well, as I recall...and whatever degree that was was enough for people to take him seriously too
 
If we're taking him at his word simply based on him not being banned...

Forget this--I'm going shopping to pick up some Analytical hats for everyone to wear, because gosh darn do some people here need them

Well correct me if I'm wrong but mods do not generally take kindly to people claiming to be insiders who offer no proof whatsoever.
 
That's my point. Not only do we not know if he was even vetted (as far as we know), but we don't even know what that means.

Yet people are taking his info as gospel.

Actually, I have been wondering for a while now why this thread was even still open, considering the source of the information in the OP. Yet, the more that I think about it, we have numerous sources saying that Tegra X1 powered a version of the dev kits. Why should we expect a drastic change in the final product? Pascal is basically a shrunken down Maxwell, so that's fine. But to expect double the bandwidth? A different CPU architecture? Seems unrealistic this close to launch. Now, we have a couple of insiders here and Emily Rogers saying that this sounds accurate. I didn't want it to be true, but I'm willing to concede it just might be.
 
Considering RAM is the one thing Nintendo has never been shy about overdoing, I really don't think we should worry about it.



He has posted fairly frequently over the past few days and his posts have been sourced a lot, so if he is lying about going to the mods before posting this info we would know by now (a mod would tell us/ban him).

Now, it's true that we have no way to know if his info is legit, but mods typically require insiders to prove that they are in a position to know this info. Not that the info itself has to be accurate. So obviously all insider posts should be taken as such, with a grain of salt.

10k has had previous info on here after having it approved by mods, only for it later to be confirmed fake.

The guy's absolutely crazy.
 
If we're taking him at his word simply based on him not being banned...

Forget this--I'm going shopping to pick up some Analytical hats for everyone to wear, because gosh darn do some people here need them



10k was also vetted or approved or whatever it was to some degree as well, as I recall...and whatever degree that was was enough for people to take him seriously too

Like I said, all insider rumors/reports are just rumors. We shouldn't take any of them as fact, vern's included. But why shouldn't we be able to discuss these rumors?

10k has had previous info on here after having it approved by mods, only for it later to be confirmed fake.

The guy's absolutely crazy.

If I recall correctly 10k proved that he was in discussion with people claiming to be insiders, no more, no less. Mods verified that this was the case, not that the info was accurate or even that those "insiders" he was talking to were legitimate insiders.

Edit:
Because 10k literally gets DM'd by random Twitter people pretending to be insiders and then posts it as insider news.

So now we can't discuss any potential rumors because one guy was drastically misled?
 
Like I said, all insider rumors/reports are just rumors. We shouldn't take any of them as fact, vern's included. But why shouldn't we be able to discuss these rumors?

Because 10k literally gets DM'd by random Twitter people pretending to be insiders and then posts it as insider news.
 

ElTorro

I wanted to dominate the living room. Then I took an ESRAM in the knee.
Like I said, there's no reason to believe Nintendo will be using A57 on the old 20nm process, if the final retail unit uses A57 at all.

5W just for the CPU is already a lot for a handheld. A smart phone usually has a TDP of 3-4 watts for the entire SoC.
 
Like I said, all insider rumors/reports are just rumors. We shouldn't take any of them as fact, vern's included. But why shouldn't we be able to discuss these rumors?

I never said they can't be discussed; they certainly can be, in the sense that any number thrown out could be. What I warned against was taking them as gospel, which some seemed to be doing based on their replies.
 
I wonder if Nintendo will apply some of their handheld philosophy to the Switch? By that I mean frame rate. The DS was locked at 60fps. Games that ran at 30fps displayed polygonal graphics on both screens by splitting the frames in half: 30 for the top screen and 30 for the bottom.

Imagine something like that applied to the Switch. A mandatory lock on the frame rate to 60fps. That would be glorious. But probably a pipe dream as well.
 

Donnie

Member
Actually, I have been wondering for a while now why this thread was even still open, considering the source of the information in the OP. Yet, the more that I think about it, we have numerous sources saying that Tegra X1 powered a version of the dev kits. Why should we expect a drastic change in the final product? Pascal is basically a shrunken down Maxwell, so that's fine. But to expect double the bandwidth? A different CPU architecture? Seems unrealistic this close to launch. Now, we have a couple of insiders here and Emily Rogers saying that this sounds accurate. I didn't want it to be true, but I'm willing to concede it just might be.

Correct me if I'm wrong, but what's been said is that this is close to what we should expect; not an exact spec, and not even a claim that these were the exact specs of any development kit at any point in time. Also, going from A57 in a development kit to, say, A72 in retail certainly wouldn't break anything development-wise, so it's possible if not expected.
 

Donnie

Member
5W just for the CPU is already a lot for a handheld. A smart phone usually has a TDP of 3-4 watts for the entire SoC.

That is an entire SoC with four of its CPU cores maxed out at 2.1GHz. The Switch will also be much bigger than your standard smartphone.

Also, as I mentioned earlier, it could even have any OS-related cores down-clocked or disabled in mobile mode, and of course when not gaming all CPU cores could down-clock to save battery life. That doesn't mean it's not going to be possible to hit 2GHz for a sustained period during a particularly demanding game.
 

Schnozberry

Member
Actually, I have been wondering for a while now why this thread was even still open, considering the source of the information in the OP. Yet, the more that I think about it, we have numerous sources saying that Tegra X1 powered a version of the dev kits. Why should we expect a drastic change in the final product? Pascal is basically a shrunken down Maxwell, so that's fine. But to expect double the bandwidth? A different CPU architecture? Seems unrealistic this close to launch. Now, we have a couple of insiders here and Emily Rogers saying that this sounds accurate. I didn't want it to be true, but I'm willing to concede it just might be.

There's no reason the product would be radically different. It would just be a minor performance boost. It would still be an Nvidia GPU with 2 SMs and the same shader architecture. It would still be four ARM cores with in-kernel switching for low-power mode. It would still have the same RAM amount with the same latency, just potentially slightly higher bandwidth due to clock increases.

A72 was an incremental improvement over A57. The core itself would be much smaller at 16nm, offer better performance per watt, and reach higher clocks. There would be no code changes required. For that matter, A57 on 16nm would offer an improvement over 20nm, just not quite as much. Pascal and Maxwell would be nearly identical at 16nm in a consumer configuration, the only difference being that Pascal has a few updates to color compression that would be advantageous to have in a mobile scenario.

The RAM bandwidth wouldn't necessarily need to change either. There is enough bandwidth with a 64-bit bus if they tweak the cache layout, perhaps just adding an L3 cache or a small pool of ESRAM on die to avoid idle clock cycles.
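A minimal sketch of why a small on-die pool helps, assuming a purely hypothetical hit rate and SRAM bandwidth:

```python
# Hypothetical illustration of how an on-die SRAM/L3 pool stretches a narrow
# bus. The hit rate and SRAM bandwidth are made-up numbers for the example.

dram_bw = 25.6     # GB/s, the rumoured main memory bandwidth
sram_bw = 100.0    # GB/s, assumed on-die pool bandwidth
hit_rate = 0.4     # assumed fraction of traffic served from the on-die pool

# Crude weighted average of where the traffic lands.
effective_bw = hit_rate * sram_bw + (1 - hit_rate) * dram_bw
print(f"{effective_bw:.1f} GB/s effective")   # ~55.4 GB/s in this made-up case
```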

When Emily Rogers says that the custom Tegra chip resembles the Tegra X1, it's because the transition from Maxwell to Pascal was largely to do with the die shrink, and not a major architectural shift. To someone who isn't combing through the very fine sand of this, they would appear to be very similar. My hope is that they went with A72 and Pascal, but if they simply die-shrunk Maxwell and A57 to 16nm, the differences would be modest at best.
 

EDarkness

Member
If we're taking him at his word simply based on him not being banned...

Forget this--I'm going shopping to pick up some Analytical hats for everyone to wear, because gosh darn do some people here need them



10k was also vetted or approved or whatever it was to some degree as well, as I recall...and whatever degree that was was enough for people to take him seriously too

Calm down, man. You can believe what he said or not and if you think he's full of shit, report him and let the Mods take care of it. I trust that the Mods on this forum are serious about this sort of stuff, and if he's still around, then they have reason to believe he's in a position to know something. The fact that Emily posted about the 4GB RAM and that the specs in the first post are what we should expect, leads me to believe he knows at least a little bit.
 

Doctre81

Member
This is a translation of part of that Japanese article:

"If provisionally 768 assuming GFLOPS or 1 TFLOPS the theoretical performance values of Nintendo Switch, in turn 2.18 times the Wii U, it comes to theoretical operation performance of 2.84 times.
And in that case, Nintendo Switch is also a high-performance than the Wii U, 1.84 of PS4 TFLOPS, PS4 Pro of 4.2 TFLOPS, but it comes to unable to compete with the 1.31 TFLOPS of Xbox One, Nintendo is competing in the Wii era because you are down from the performance competition with, there is no surprise now here.
 But if the wishful thinking of "you were more high-performance!", Nintendo Switch is so "portable game machine mode and the non-portable game machine mode, the game machine has two modes of operation", and when the AC adapter is driving Toka more GPU core is driven in stationary mode at the time of the power supply connection, to demonstrate the performance of the operation will be Toka to 1 TFLOPS over at a higher clock, when the battery drive is suppressing the core and operation clock running, the corresponding performance extend the game play time to bear in, it is possible that have adopted the specification Nante.
 Of course, "in the 768 GFLOPS when the AC adapter is connected, when the battery drive will drop the resolution down to 384 GFLOPS," but I mean the reverse pattern is also conceivable that (laughs).

 At least, Nintendo Switch is, I think so nice to say that almost no possibility to come beyond the 1.5 TFLOPS single-precision floating-point performance at the time of the battery drive."


I'm not sure if he's guessing or not, but he's implying 768 GFLOPS single precision in docked mode and 384 GFLOPS in handheld mode.
 
Calm down, man.

Come again? I'm merely reminding some--such as yourself apparently--to be skeptical. Or more specifically, don't assume everything you read is true. If you took that as anything other than that, or somehow took offense to it, you might be a little too deeply invested yourself
 

ElTorro

I wanted to dominate the living room. Then I took an ESRAM in the knee.
That is an entire SoC with four of its CPU cores maxed out at 2.1GHz. The Switch will also be much bigger than your standard smartphone.

It's not the entire SoC. Obviously, the GPU is not contributing during the benchmark.

As another reference point, the Nvidia Shield Android TV with a Tegra X1 (20nm) consumes 20W during gaming. The process node is different, but it also doesn't have a display.

Assuming that the target for Switch games will be a chip at its maximum clock speeds is just setting yourself up for disappointment.
 

EDarkness

Member
Come again? I'm merely reminding some--such as yourself apparently--to be skeptical. If you took that as anything other than that, you might be in this a little too deep yourself

Heh, heh. I'm always skeptical. But that doesn't mean we can't talk about it anyway. I won't fully believe anything until Nintendo says something.
 

ozfunghi

Member
Yes and I knew these people were deluding themselves.

These people... such as actual developers here on Neogaf (Matt) and actual proven insiders (OsirisBlack)? Who both claimed the hardware would not be the reason why 3rd party ports won't happen.

Have you been reading these threads? Yes, some people absolutely expect this thing to be in spitting distance of the X1/PS4 power-wise.

Which still "could" be possible. CPU is already said to be on par or better by insiders. GPU in the devkit can reach 500GF, once on a smaller fab node, that can amount to 700GF. Then, due to fp16 in roughly 1/3rd of calculations, that could get it close to 1TF and on more modern architecture.. sure, that'd be within spitting distance of the XB1. Memory/bandwidth though, probably not.

Is this a joke? Volta isn't gonna be out for a while. Surprised this isn't using graphene 3-dimensional neural array Kappa.

Yeah, my thought exactly. Hopefully he didn't misinterpret "latest architecture" or another description.

Did you know that the Switch is NOT a phone....

Indeed. Most phones have dozens of apps running simultaneously, which is completely unnecessary for the Switch.

Seriously guys, calm the fuck down.
You know the small round icon with the arrow inside, that's next to the person you quoted, links back to the original quote, so there is no reason to make a link out of the entire quote? ;-)

If they do go with Pascal for the retail units, I wonder if hitting around 700 GFLOPS and upping the bandwidth could be possible, considering Parker is somewhere around 700 with higher bandwidth while reducing power consumption.

Going from 20nm to 16nm already puts the X1 at 700 GFLOPS and basically means it's Pascal (since that's the main difference between Maxwell and Pascal).


Here's even more gloom for you: That 1 TFLOP figure is almost surely fp16. Typical fp32 performance will likely be half that.

Why? The TX1 does 0.5 TFLOPS. Put that on a 16nm node and you get 0.7 TFLOPS. That means you don't have to use FP16 for all your code, which wouldn't be possible anyway. As blu already speculated, between 25% and 50% of calculations could benefit from FP16. So if on average a third can use FP16, you're already closing in on that 1 TFLOP.
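Same math in a quick sketch (the ~40% shrink uplift and the one-third FP16 share are the assumptions stated above, not confirmed figures):

```python
# Sketch of the reasoning above: TX1 FP32 rate, ~40% uplift from the
# 20nm -> 16nm shrink, then credit FP16 (2x rate) for a third of the work.
# The uplift and FP16 share are assumptions, not confirmed specs.

tx1_fp32 = 500          # GFLOPS
shrink_uplift = 1.4     # ~40% more performance at the same power
fp16_share = 1 / 3      # speculated share of math that tolerates FP16

fp32_16nm = tx1_fp32 * shrink_uplift                               # ~700 GFLOPS
effective = fp32_16nm * (1 - fp16_share) + fp32_16nm * 2 * fp16_share
print(round(fp32_16nm), round(effective))                          # 700, 933
```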

vern, who has apparently been vetted by mods, claimed so:

He has not been vetted. He was in contact, but they did not vet him or his info, from what I understood.

Is it really necessary to quote Matt's post on every page?

Yes. Because some people still don't get it. Read the first quote in my post, and this one:

The real question is, will it be easy enough to port a game to Switch?

That was my interpretation as well up until Vern stepped in and lent credence to the specs in the OP of this thread. It's all rumor, as I keep saying. 50 GB/s would make much more sense to me. It would still be a potential bottleneck, but it would fit more reasonably within the context of the other specs.

Vern has repeatedly said he knows nothing of specs. Him lending credence to the rumor, most likely means he saw the RAM amount, the CPU etc... and heard those could fit. Memory bandwidth is a tad more complex to wrap your head around as a non-tech guy (believe me).

I think he said he contacted the mods, they checked him out, but he wanted them to leak the info, but they didn't want to do it. If that is the case, then that would explain why they haven't banned him.

From what I understood, he contacted them, but they did not vet him or his info.

Assuming that the target for Switch games will be a chip at its maximum clock speeds is just setting yourself up for disappointment.

And yet this is where the "hybrid" nature of the thing comes into play. No sane person is expecting CPU/GPU to be running full clock in portable mode... but yes, they just might in docked mode.
 

K.Jack

Knowledge is power, guard it well
It's actually based on two well known insiders. There's no insider claiming it's a pain in the ass, like the specs in the OP would suggest.

They've literally provided no technical explanation of how these specs will pull it off. The bandwidth alone is an issue.

Forgive me for not placing my faith in GAF insiders, who provide no substance.
 

KingSnake

The Birthday Skeleton
They've literally provided no technical explanation of how these specs will pull it off. The bandwidth alone is an issue.

Forgive me for not placing my faith in GAF insiders, who provide no substance.

What bandwidth are you talking about and what's your reliable source for it? (hint: OP isn't)
 

ozfunghi

Member
They've literally provided no technical explanation of how these specs will pull it off. The bandwidth alone is an issue.

Forgive me for not placing my faith in GAF insiders, who provide no substance.

I'll believe an actual dev over a shady tweet. Funny you claim they don't provide substance, yet there is nothing confirmed about that memory bandwidth, and even if there were, the source is much less reliable than these insiders you're blowing off.
 
Correct me if I'm wrong, but what's been said is that this is close to what we should expect; not an exact spec, and not even a claim that these were the exact specs of any development kit at any point in time. Also, going from A57 in a development kit to, say, A72 in retail certainly wouldn't break anything development-wise, so it's possible if not expected.

There's no reason the product would be radically different. It would just be a minor performance boost. It would still be an Nvidia GPU with 2 SMs and the same shader architecture. It would still be four ARM cores with in-kernel switching for low-power mode. It would still have the same RAM amount with the same latency, just potentially slightly higher bandwidth due to clock increases.

A72 was an incremental improvement over A57. The core itself would be much smaller at 16nm, offer better performance per watt, and reach higher clocks. There would be no code changes required. For that matter, A57 on 16nm would offer an improvement over 20nm, just not quite as much. Pascal and Maxwell would be nearly identical at 16nm in a consumer configuration, the only difference being that Pascal has a few updates to color compression that would be advantageous to have in a mobile scenario.

The RAM bandwidth wouldn't necessarily need to change either. There is enough bandwidth with a 64-bit bus if they tweak the cache layout, perhaps just adding an L3 cache or a small pool of ESRAM on die to avoid idle clock cycles.

When Emily Rogers says that the custom Tegra chip resembles the Tegra X1, it's because the transition from Maxwell to Pascal was largely to do with the die shrink, and not a major architectural shift. To someone who isn't combing through the very fine sand of this, they would appear to be very similar. My hope is that they went with A72 and Pascal, but if they simply die-shrunk Maxwell and A57 to 16nm, the differences would be modest at best.

I'm expecting Pascal but not necessarily Cortex A72. Switching from one ARM uArch to the next doesn't seem like it would cause major problems, but it's still a different core and we don't know if Nintendo were willing to pay the licensing costs for the newer architecture. Nvidia are also still using A57 in Parker, somewhat bewilderingly. I hope someone leaks the final specs, so that we don't have to spend months arguing over die photos again.

RAM is something that can change late in the game, but at this point, we are really late in the game and this thing is going to be hitting the assembly lines soon. I believe Emily when she says 4 GB in the final unit, although it would be great if Nintendo upgraded to one of these faster 6 GB modules from Samsung.
 
Is there any credible information on the power of Pascal Tegra chips? I Googled around but didn't find much of anything; lots of noise in my search terms isn't helping.
 

KingSnake

The Birthday Skeleton
Is there any credible information on the power of Pascal Tegra chips? I Googled around but didn't find much of anything; lots of noise in my search terms isn't helping.

All we know are the specs of Parker:

[Image: NVIDIA Tegra Parker SoC specs]


Edit: it doesn't mean that this is going to be in Switch, even if it's Pascal based.
 

ElTorro

I wanted to dominate the living room. Then I took an ESRAM in the knee.
And yet this is where the "hybrid" nature of the thing comes into play. No sane person is expecting CPU/GPU to be running full clock in portable mode... but yes, they just might in docked mode.

As I've already said, CPU and memory performance (normalized to GPU bandwidth / resolution) have to be the same in mobile mode and docked mode, because you cannot scale down the things that run on the CPU as easily as you can scale down resolution, which scales roughly linearly with GPU performance.

They will have to find a sweet spot for CPU clock speeds that allows for usable battery life, and it will certainly not be the max rated clock speed.

Most speculation in this thread is ignoring that fact and is based on maximum clock speeds, which is unrealistic.
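As a concrete sketch of that point: if resolution is the only knob between modes, the GPU clock can drop roughly with the pixel count while the CPU clock has to stay put (the clock numbers below are placeholders, not leaks):

```python
# Sketch of the docked/portable argument: GPU load scales roughly with pixel
# count, CPU load does not. Clock values are placeholders, not leaked specs.

docked_px   = 1920 * 1080
portable_px = 1280 * 720
gpu_scale = portable_px / docked_px             # = 4/9, about 0.44

docked_gpu_mhz   = 1000                         # placeholder docked GPU clock
portable_gpu_mhz = docked_gpu_mhz * gpu_scale   # ~444 MHz for the same per-pixel budget

cpu_mhz = 1500                                  # placeholder; must be the same in both modes
print(round(portable_gpu_mhz), cpu_mhz)
```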
 

EDarkness

Member
Is there any credible information on the power of Pascal Tegra chips? I Googled around but didn't find much of anything; lots of noise in my search terms isn't helping.

I've been looking this up, but I haven't found any videos showing it off. I wasn’t able to find many for the X1, either.
 

Schnozberry

Member
I'm expecting Pascal but not necessarily Cortex A72. Switching from one ARM uArch to the next doesn't seem like it would cause major problems, but it's still a different core and we don't know if Nintendo were willing to pay the licensing costs for the newer architecture. I hope someone leaks the final specs, so that we don't have to spend months arguing over die photos again.

RAM is something that can change late in the game, but at this point, we are really late in the game and this thing is going to be hitting the assembly lines soon. I believe Emily when she says 4 GB in the final unit, although it would be great if Nintendo upgraded to one of these faster 6 GB modules from Samsung.

Let's hope for both. I think A72 makes a lot of sense considering there would be very little cost difference for a nice boost in performance/watt.

The part of the Eurogamer rumor that has stuck with me is that the dev kits had a noisy cooler. The Shield TV has an active cooler (I own one) and it is completely silent. The Jetson TX1 dev board also has an active cooler, and it is silent. I have a hard time believing Nintendo is incompetent enough to stick a cooler that can't dissipate 10W inside a dev kit, so that tells us they were pushing the X1 outside its normal parameters to target the final hardware. That's good news any way you look at it.
 

ozfunghi

Member
As I've already said, CPU and memory performance (normalized to GPU bandwidth / resolution) have to be the same in mobile mode and docked mode, because you cannot scale down the things that run on the CPU as easily as you can scale down resolution, which scales roughly linearly with GPU performance.

They will have to find a sweet spot for CPU clock speeds that allows for usable battery life, and it will certainly not be the max rated clock speed.

Most speculation in this thread is ignoring that fact and is based on maximum clock speeds, which is unrealistic.

Right. I should have said GPU, which is what most discussion is about. We also have devs/insiders saying hardware performance won't be the reason for 3rd parties to not develop/port to the system. And we have a credible insider who said the CPU outperforms the XBO/PS4. So that shouldn't be a problem.
 

Zedark

Member
Is there any credible information on the power of Pascal Tegra chips? I Googled around but didn't find much of anything; lots of noise in my search terms isn't helping.

Here:
NVIDIA said:
Built around NVIDIA’s highest performing and most power-efficient Pascal GPU architecture and the next generation of NVIDIA’s revolutionary Denver CPU architecture, Parker delivers up to 1.5 teraflops(1) of performance for deep learning-based self-driving AI cockpit systems.

1.5 TFLOPS is for FP16, so 750 GFLOPS FP32.
 

Terrell

Member
Seriously, it's like none of you lived through the discussions on GameCube's spec sheets. So far, we're hitting all the same high notes...

We have Matt and OsirisBlack both indicating that PS4/XB1 ports shouldn't be a technical problem...

Insiders who don't reference the spec sheet numbers telling us there's no need for outright panic, but are disregarded because they don't validate their statements with spec sheets...

Seriously there is no quantity of posts from random jagoffs on the net that will amount to someone with actual knowledge of the situation. Having an idea of the RAM needed for the average multiplat game on switch is going to need more than the ability to know 8>4 or list your phone's specs. People are tripping on their way to jump to conclusions. Open insight from devs is a few months away at any rate. People can have fun with the last stretch of speculation I guess.

People appealing to reason being summarily ignored...

I am a computer scientist, so I appreciate rational arguments and evidence over arguments from authority.

People with vocational experience believing themselves to be a voice of authority on what is and isn't possible...

All capped off with a bunch of reactionary fear-mongering, people who think they understand tech positioning themselves as "realists".

It's a really unfortunate replay of that era and no one seems to have caught on to that yet.
 

ozfunghi

Member
Is there any credible information on the power of Pascal Tegra chips? I Googled around but didn't find much of anything; lots of noise in my search terms isn't helping.

There was an article by the foundry that produces the chips claiming that going from a 20nm to a 16nm fab node (or Maxwell to Pascal) means 60% more power efficiency (at the same performance), or 40% more performance (at the same power draw). So if you have the figures for the Maxwell Tegra, you have a good idea of what to expect from Pascal.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
It all depends on bandwidth and resolution. There's no way Switch could run console quality ports at 720p, but if it had 50GB/s memory bandwidth it should be able to run console titles at qHD resolution. The CPU would also be weaker than what's in the current consoles, so other cuts would have to be made there as well.

Problem is the dev kit spec was half of what you'd need for that level of qHD performance.
What would that be?
 

EDarkness

Member
I don't necessarily care about the numbers, just what it can do in the real world. I just feel like Nintendo should get ahead of this and release the specs and what the console can do. Hell, let Nvidia do it; they'd do a better job than Nintendo at giving us the details.

We'll find out real world info in January when we can see games running on the system.
 

Donnie

Member
It's not the entire SoC. Obviously, the GPU is not contributing during the benchmark.

As another reference point, the Nvidia Shield Android TV with a Tegra X1 (20nm) consumes 20W during gaming. The process node is different, but it also doesn't have a display.

Assuming that the target for Switch games will be a chip at its maximum clock speeds is just setting yourself up for disappointment.

The GPU and memory are not turned off either. Anyway, it's nothing to do with assuming it'll run at the maximum. 2GHz is listed as the maximum for a rumoured development kit from god knows when.

My opinion is merely that on a modern process there is no reason at all why a device of the Switch's size would struggle to contain an A57 that can run at 2GHz during gameplay. We're going to have to agree to disagree.
 

ElTorro

I wanted to dominate the living room. Then I took an ESRAM in the knee.
People with vocational experience believing themselves to be a voice of authority on what is and isn't possible...

Where have I claimed to be a voice of authority? I didn't even bring that point up and asked for quite the contrary. Thanks for your vacuous meta-commentary, though.

My opinion is merely that on a modern process there is no reason at all why a device of the Switch's size would struggle to contain an A57 that can run at 2GHz during gameplay. We're going to have to agree to disagree.

I do not see how you can argue against the very data that you yourself posted.
 

ozfunghi

Member
ACHTUNG: Does anybody remember the "feature" that was supposedly included in the (false) rumored Polaris-based GPU, the one that made the fake dev insider who fed 10k his bullshit "recognize" the chip as Polaris because it had this "capping" ability to only render what was in the player's view (and not render what was to the left/right/above/below the screen)?

What was this called again? It was something the PS4 and XBO did not have. Does the Tegra X1 have something similar?
 
ACHTUNG: Does anybody remember the "feature" that was supposedly included in the (false) rumored Polaris-based GPU, the one that made the fake dev insider who fed 10k his bullshit "recognize" the chip as Polaris because it had this "capping" ability to only render what was in the player's view (and not render what was to the left/right/above/below the screen)?

What was this called again? It was something the PS4 and XBO did not have. Does the Tegra X1 have something similar?
I thought the Dreamcast did it back then... man I'm out of touch.
 