Schnozberry
Member
But it's also a stationary console, not just a handheld.
Also beholden to the laws of thermodynamics.
I couldn't ask for much more from a handheld.
Nice, so we could be looking at something quite close to XB1, which is 1.3TF, right?
Third party games won't be using FP16, not if they're multi-platform. First party games will make use of the advantage though I'd expect.
Unreal engine 4 can apparently use fp16 so we may see some third party games using it
That depends on whether third parties porting the game would be willing to go through the effort to make the switch. Just because the engine supports something doesn't make it trivial to change your entire arithmetic architecture. It's not as simple as flipping a switch.
For UE4 it should be trivial. I obviously can't speak for Switch development, but on mobile UE4 will use FP16 for all pixel shader arithmetic by default, and you have to go out of your way to get it to run in FP32.
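Half precision really is enough for the value ranges pixel shaders usually touch (colors, normals in [0, 1]), which is why a mobile-first renderer can default to it. Here's a minimal sketch of where FP16 holds up and where it breaks down, using Python's stdlib `struct` half-float packing; the sample values are made-up illustrations, not anything from a real shader:

```python
import struct

def to_fp16(x: float) -> float:
    """Round-trip a value through IEEE 754 half precision."""
    return struct.unpack('<e', struct.pack('<e', x))[0]

# Color-channel-style values in [0, 1]: FP16's ~3 decimal digits are fine.
channel = to_fp16(184 / 255)             # an 8-bit color channel
print(abs(channel - 184 / 255) < 1e-3)   # True: error well under one step

# Large-range values (world positions, depth) are another story. FP16 has
# a 10-bit mantissa, so above 2048 consecutive integers can't be told apart.
print(to_fp16(2048.0 + 1.0))             # 2048.0 -- the +1 is lost
```

That lost `+1` is exactly the kind of thing that forces engines to keep position and depth math in FP32 even when color math runs in FP16.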
I am still a bit taken aback by and really happy about this partnership. They just seem, on paper, to fit together. Their strengths and weaknesses sort of balance out when considering them together.
Sort of like yin and yang.
Ninception.
If it becomes a doable thing for PC hardware, I imagine multiplatform games would start to use it or fall behind in the Bulletpoint Wars.
So is this FP16 the new thing for gaming that HDR is for TVs?
So if all devs were somehow forced to use FP16 on all machines, then we'd get a significant doubling of the flops, right? That's a breakthrough. This would also put the PS4 Pro at 8.4TF and Scorpio at 12TF. Shit!
Sorry for the slight OT and yes I somehow hyped myself up a little bit.
It's true. Getting a company that knows what they're doing with hardware, possibly better than anyone else, to do all the technical work for them was the best possible choice Nintendo could have made. Not only in terms of getting a competent and developer-friendly system, but also in inspiring confidence in their system among fans. And Nintendo does seem to have a much better grasp on what makes a portable system practical and appealing than Nvidia did.
I really hope Nintendo goes aggressive on bang-for-your-buck power with Switch.
Gamecube is still one of my favorite hardware designs.
Nvidia seems like a company filled with, and I use this term with endearment, nerds. They are wonderful at things that land in their wheelhouse, and designing SoCs is definitely in their wheelhouse, but their effort at building devices has always been something of a mixed bag. I think their problem with the Shield line of products is that they clearly staked out a niche from the beginning, one they had to know would be difficult to satisfy, and released a series of half-baked ideas. Gaming on Android is something of a chore with its various software quirks, and Nvidia's game streaming always had frame timing and hitching problems that made it something of a difficult sell to the hardcore niche they were aiming for. It also doesn't help that they had issues with industrial design and faulty batteries, but nobody ever questioned whether or not the guts of the product were solid. They just weren't set up to succeed.
Nintendo, despite their flaws, is a good partner for Nvidia. They generally have great hardware QA and industrial design. They were willing to work closely with Nvidia to avoid the OS-level problems with Android that rob mobile hardware of its performance potential, with low-level APIs and custom software for system integration. Nvidia was able to take a step back from the thing they struggled with, which is finding games that could actually drive people to the platform. The more general audience Nintendo aims for won't have the same psychotic OCD about hardware specs and graphics dick-swinging contests, so the Tegra line offers enough punch to do the job. The timing worked out great. Nintendo was looking to resurrect its dedicated hardware business at the same time Nvidia was realizing they had serious issues penetrating the tablet market. The deal allowed both companies to do what they do best and fill in the gaps for the other. It really seems like it worked out for both sides, considering the response outside the NeoGAF bubble has been great for the Switch, and Nvidia was able to secure their biggest Tegra customer ever and cancel their doomed Shield line to pursue other big fish like the automotive industry.
The design (no camera, mic, cheap controllers, single regular screen, nothing super fancy) would indicate most of the budget went into the internals.
I'm in the same boat, really.
I wouldn't mind paying some premium money for Nintendo consoles if the tech behind them justified the price.
Not saying they should charge $500. They could still go $250-300 but I would like to be sure they got the best specs and components said price could afford.
What is this X2 I've been hearing about that's supposed to be in the NX now? I heard the old kit was using the overclocked X1 as a placeholder for another chip, but I'm not sure what the X2 could be.
In pure flops? No, that would be unlikely. It's possible in this type of form factor but it's likely too expensive for what Nintendo wants to charge.
But Nvidia typically has better tools and software that end up getting better performance out of their hardware, at least on PC (which is apparently where the whole "Nvidia flops are better than AMD flops" idea comes from). We don't know if that will be the case on a console, though.
So we really don't know what the real world performance will be, but it's likely much closer to XB1 than Wii U.
If anything the Switch is Gotenks. It's a fusion, powerful, has a funny attitude, and it only lasts 30 minutes.
Something tells me in power:
Switch = Perfect Cell
XBone1 = Super Perfect Cell
PS4 = SSJ2 Gohan.
Yeah, I went there. ._.
"Based on" doesn't mean much really. Pascal isn't that much different from Maxwell so unless we're talking about more units / wider bus / higher clocks - there's not much difference between a SoC based on X1 or a SoC based on Parker. Parker itself should be faster than X1 by some 50% but there's no indication that Switch uses Parker itself and not some custom Parker SoC which may be faster or slower than Parker.
So that's 750 GFLOPS FP32 and 1500 GFLOPS FP16...
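Those two figures are consistent with straightforward FLOPS arithmetic. As a hedge, the core count and clock below are assumptions for illustration (a 256-core, Tegra X1-class GPU at roughly 1.46 GHz), not confirmed Switch specs:

```python
# FLOPS = cores x clock x FLOPs-per-core-per-clock (an FMA counts as 2),
# and double-rate FP16 packs two half-precision ops into each FP32 lane.
cores = 256            # assumed shader core count (Tegra X1-class)
clock_ghz = 1.465      # assumed clock, chosen to land on the quoted figure
fma_flops = 2          # one fused multiply-add = 2 FLOPs

fp32_gflops = cores * clock_ghz * fma_flops
fp16_gflops = fp32_gflops * 2   # assumes the hardware supports 2x-rate FP16

print(round(fp32_gflops))   # 750
print(round(fp16_gflops))   # 1500
```

The same formula with a lower clock is how you get the X1's stock ~512 GFLOPS FP32, which is why the quoted 750 implies either a higher clock or a wider GPU.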
You can't use FP16 for everything. In fact, if you can use it for half the computations, that'd be a success. So you will never "double the flops" like that.
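The "half the computations" point is just Amdahl's law: if only a fraction of the shader work runs at double rate, the overall speedup falls well short of 2x. A quick sketch, with illustrative (not measured) fractions:

```python
def fp16_speedup(f: float) -> float:
    """Overall speedup when fraction f of the work runs twice as fast
    (Amdahl's law with a 2x-faster portion)."""
    return 1.0 / ((1.0 - f) + f / 2.0)

print(round(fp16_speedup(0.5), 2))   # 1.33 -- the "success" case: +33%, not 2x
print(round(fp16_speedup(1.0), 2))   # 2.0  -- only if literally everything converts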
It's actually named Parker after Peter Parker. The X1 was Erista (Wolverine) and the next one will be called Xavier.
Nvidia have been using superhero codenames since Tegra 3/Kal-El.
I almost replied to this seriously.
Wow, that's... pretty freaking pathetic.
I'm very interested in the prospect of Nintendo revealing the specs of the Switch, something they didn't do for Wii or Wii U. They probably want to impress people who give a shit about that kind of thing.
A reaction to some inoffensive codenames? Sarcasm? One never knows.
Not sarcasm. Now you know.
This superhero stuff is getting more pathetic by the day. Time to get over it.
A shame I now know 'Parker' comes from Peter Parker. Wish I didn't know that. Childish, sad and pathetic.
I feel like a lot of you guys are doing some heavy mental gymnastics to make FP16 do more than it actually can. I'm expecting 60-75% of XB1 graphical fidelity. It's probably a 16nm chip that is a customized version of a newer Maxwell architecture. Maybe a wider memory bus, but I'm not even expecting that. UE4 seems to run decently on mobile devices with limited bandwidth.
Educate us.
Why did you feel the need to make it personal ("you guys" part)? I noticed this trend. How does it help with anything?
It's going to be 16nm, Pascal based and with a 128-bit bus. All signs are pointing this way.
Well, apparently our minds produce completely different associations, 'cause when I hear 'Tegra Erista' I envision this:
Ummmm... I didn't mean any harm by it, homie. :/ How else would I address the people talking in this thread?
Not much I can say, other than the likelihood of the NX matching (or exceeding) XB1's 1.3TF performance cuz of FP16 is kinda... idk, out there.
It is time for the internet to take a break from you.
Lol. I'm going to use this line some day.