blu
Wants the largest console games publisher to avoid Nintendo's platforms.
In fact I think the other machines will be very efficient measured as flops/watts

Flops, watts, price - you can pick any two of the three, but not all three at once.
Ok, it's Traktor, not Tracktor XD
What the hell is that????????
Final Wii U dev kit.
There is a general sense of condescension and pity among people who prefer graphical showcases towards people who don't, and frankly it's pathetic and insulting.
The Wii U currently is the only console to render to two different screens simultaneously, one of which is wireless, with a more or less lag-free experience. That is impressive.
In terms of power draw it's only impressive to me if it's from a mobile device, since using less energy means longer battery life. Less wattage does mean less heat, resulting in a cooler device, but I'd rather have a system that uses up to 200 watts.
Are you talking about me?
What's so zealous about saying "It's impressive... for its power draw"?
Overall, not just that post. But I hear ya, I'm also trying to look at things in as positive a light as possible. And you're not coming off as a douche bag at all, so you've got that going for you.
I know there will be some great games for me to play in the years to come so I'm good. But the whole technical deal is interesting to me as a former compsci major, even though I'm years removed from my real hardware/architecture experience.
Is Thraktor.

XD Ok, I will never ever again talk about graphics, I must be blind...
OK, so here's a noob question. In this image below, I'm assuming the darker parts are the ALUs, and the lighter brown parts are the SRAM for each ALU. Is that correct?

Can we use this to count the ALUs in each SIMD? Surely that's possible, right? Is each block a pair of ALUs? If so, I count 32, which doesn't jibe with the 40 ALUs per SIMD we've been thinking.

No, if you look closely, this has to be the memory connectors (part of the SRAM, because they are on all of the SRAM blocks you can see). The ALUs are in between the blocks, on the orange square at the middle.
No, if you look closely, this has to be the memory connectors (part of the SRAM, because they are on all of the SRAM blocks you can see). The ALUs are in between the blocks, on the orange square at the middle.
You really can't see the ALUs or the other circuitry at all, but based on the SRAM and the components that are visible, and comparing to what can be seen on other die shots, you have to guess what each part of the GPU does...
This is why it's so hard to guess what this GPU is capable of: it has been severely customized, which makes it much more difficult to compare to other known GPUs on the market.
Thanks. Is there a known ratio of SRAM blocks per ALU? That's probably a stupid question, but surely there has to be a way to figure this out.

Well, this is how we got to the 350 GFLOPS number, through the amount of SRAM blocks and comparing to "similar" AMD designs...
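For what it's worth, the back-of-envelope math behind ballpark numbers like the 350 GFLOPS figure is just ALU count × 2 FLOPs per cycle (a multiply-add) × clock speed. A quick sketch (the ALU counts and the 550 MHz clock below are illustrative assumptions, not confirmed specs):

```python
def peak_gflops(alus, clock_mhz, ops_per_alu_per_cycle=2):
    """Peak throughput assuming each ALU retires one multiply-add
    (2 FLOPs) per cycle -- the usual back-of-envelope convention."""
    return alus * ops_per_alu_per_cycle * clock_mhz / 1000.0

# Hypothetical configurations, not confirmed Wii U specs:
print(peak_gflops(320, 550))  # -> 352.0 (the ~350 GFLOPS ballpark)
print(peak_gflops(160, 550))  # -> 176.0 (an R6xx 160-ALU part at that clock)
```

So the whole argument really does reduce to counting ALUs correctly; everything else is multiplication.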
As it always happens in these threads the interesting posts get lost, but someone a few pages ago mentioned that it almost seems like Nintendo wants to eventually cram the hardware into the controller, which is a pretty interesting proposition.

Having the entire system in the controller would be very interesting. How would that work in relation to the TV? Have a shell that's purely for receiving a signal from the tablet and displaying it on the screen? Would you be able to play the system anywhere you want with just the tablet controller?
The only sure thing is that Nintendo is more concerned about other things than raw performance, but in no way does this mean they are technologically impaired. If the chip was custom designed it was for a good reason, not just because Nintendo hates flops and wants to piss off the graphics whores.
I noticed I wasn't the first person to suggest that Nintendo might have a goal of putting the whole thing in the controller.
The low power draw is technically impressive, but to act like gamers should be patting Nintendo on the back for that achievement is just silly. Who's realistically going to care?

What's the power draw compared to a laptop with a mid-level GPU?
LOL at Nintendo being secretive about the raw specs even to developers. It's like they were trying to pull a "fast one" not only on consumers but also their development partners. Coming out clean would have ended all speculation in an instant.
Coming out clean would have ended all speculation in an instant.
You assume that they omitted information relevant to developers as opposed to developers being bound by NDAs. Or are you parroting an off-the-cuff remark from DF?
It's information from DF article. They obviously have close sources within the dev community.
Yeah, that part of the DF article was total nonsense. Like Nintendo Ninjas broke into developer offices around the world and set up devkits without anyone's knowledge and just left them there without any documentation whatsoever.
With how non-standard the GPU apparently is, there probably wasn't an easy way to "come clean." With PS4/720, it seems to be mostly standard PC parts with a little extra here or there, so there's a built-in familiarity with how to interpret the power. Not so, apparently, with Wii U.
There's also the current philosophical stance of Nintendo completely against going on about specs. That position can be harangued, but then it still goes back to the earlier point. :/
It's information from DF article. They obviously have close sources within the dev community.
Nope. DF have been very reliable with all their insider info thus far.
It's information from DF article. They obviously have close sources within the dev community.

Come on!!! If you're basing what you said on that article, it was clearly an assumption they made up to explain why no info has been leaked to the internet! It's not even presented as a fact!!!
Question about tessellation: is it a fixed-function feature, and if so, is it also a big memory or bandwidth hog?

The reason I ask is Shin'en's comment about using tessellation for their next game. With Nintendo's focus on memory latency, and the high bandwidth of the eDRAM, could it also make tessellation even more efficient?
With how non-standard the GPU apparently is, there probably wasn't an easy way to "come clean." With PS4/720, it seems to be mostly standard PC parts with a little extra here or there, so there's a built-in familiarity with how to interpret the power. Not so, apparently, with Wii U.

Traditionally consoles have always used non-standard parts, particularly GPUs. And it's never really caused too many problems when specs were made public. Nintendo for example published a good amount of the GCN specs.
A memory controller performs a wide range of memory control related functions including arbitrating between various competing resources seeking access to main memory, handling memory latency and bandwidth requirements of the resources requesting memory access, buffering writes to reduce bus turn around, refreshing main memory, and protecting main memory using programmable registers. The memory controller minimizes memory read/write switching using a "global" write queue which queues write requests from various diverse competing resources. In this fashion, multiple competing resources for memory writes are combined into one resource from which write requests are obtained. Memory coherency issues are addressed both within a single resource that has both read and write capabilities and among different resources by efficiently flushing write buffers associated with a resource.
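As a toy illustration of the "global write queue" idea in that description (a sketch of the concept, not the actual controller logic): writes from multiple competing resources funnel into one queue, and a read checks for pending writes to the same address and flushes first, which keeps memory coherent while minimizing read/write turnaround.

```python
from collections import deque

class GlobalWriteQueue:
    """Toy model of a memory controller write queue: buffers writes
    from multiple resources into one FIFO, drains them in order, and
    keeps reads coherent by flushing pending writes before a read that
    targets a buffered address."""
    def __init__(self):
        self.memory = {}
        self.queue = deque()          # entries: (resource, addr, value)

    def write(self, resource, addr, value):
        # Competing resources all append to the same global queue.
        self.queue.append((resource, addr, value))

    def flush(self):
        # Drain buffered writes to memory in arrival order.
        while self.queue:
            _, addr, value = self.queue.popleft()
            self.memory[addr] = value

    def read(self, addr):
        # Coherency: if any buffered write targets this address,
        # drain the queue before servicing the read.
        if any(a == addr for _, a, _ in self.queue):
            self.flush()
        return self.memory.get(addr, 0)

mc = GlobalWriteQueue()
mc.write("CPU", 0x10, 1)
mc.write("GPU", 0x10, 2)
print(mc.read(0x10))   # -> 2, both writes drained in order before the read
```

The real controller also handles refresh, arbitration priorities and programmable protection registers, but the buffer-then-flush pattern above is the core of why writes can be batched without reads seeing stale data.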
Yup. The lack of any possible counter-argument (like NDAs) being presented by them as an alternative to their own claim is a classic case of amateur editorializing.
So you accept as fact the contention that Nintendo purposely denied developers access to information about the hardware so they could "discover it themselves"? On what do you base this other than the conjecture in the poorly written DF article?
A low power draw on the other hand, well, it might save you 70 cents a month! Score!
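The actual savings depend on usage, but the order of magnitude is easy to check (the hours per day, the $0.12/kWh rate, and the ~33 W vs ~200 W figures are assumptions for illustration):

```python
def monthly_cost_usd(watts, hours_per_day=3, rate_per_kwh=0.12, days=30):
    """Electricity cost for a device: watts -> kWh over a month, times the rate."""
    kwh = watts * hours_per_day * days / 1000.0
    return kwh * rate_per_kwh

# Assumed figures: a ~33 W console vs a ~200 W one, 3 h/day at $0.12/kWh.
saving = monthly_cost_usd(200) - monthly_cost_usd(33)
print(round(saving, 2))  # roughly 1.8 dollars per month at these assumptions
```

So the gap comes out closer to a couple of dollars a month than 70 cents at these assumptions, but either way it's pocket change, which is the point.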
Well I doubt it's as bad as DF is reporting, but having read Nintendo's previous developer docs (available in the SDKs), I'd say it's possible that Nintendo doesn't mention some things that other makers do. For example with the Wii, Nintendo did specify the clock speeds, but for performance, it literally told developers in the docs that they could expect 2x the performance of the GameCube, and that's about it.
If the "developer" DF is referencing has a Wii U developer kit, then they'd also have the SDK, which would have the documentation that is available to all developers. Having said that, it's often the case that early developer SDKs won't have most of the information (especially specifics) that is in the finalized 1.0 SDK documentation. (Perhaps these "developers" told DF what they said prior to the finalized 1.0 SDK documentation?)
bgassassin was saying the same thing, clock speed and other performance metrics were not provided to devs.
So the die shots/released info are looking to confirm R6xx 160ALU then.
Let me look back through my old posts......
what the fuck are you talking about?
Marcan has another series of tweets just now. I find this interesting:
Héctor Martín (@marcan42) tweeted at 11:33 PM on Tue, Feb 05, 2013:
Oh, and for those who claim it's not a Radeon-like design: http://t.co/69ErDYjB . R6xx. Register names match AMD ones too.
(https://twitter.com/marcan42/status/298922364740190208)
It's true.