
Durante
I'm taking it FROM here, so says Mr. Stewart
(11-19-2012, 04:43 PM)
Durante's Avatar
I don't want to play backseat moderator, but I think this thread wasn't intended to be the place to speculate about the Wii U's fate on the market.
xyla
Member
(11-19-2012, 04:43 PM)
xyla's Avatar

Originally Posted by blu

Ok, I think it's about time we tried to put all known Wii U specification things into its own thread and try to have a civil discussion.

Hard facts (either publicly disclosed, or a non-public leak which can be vouched by somebody trustworthy on this very forum):

  • MCM design: GPU+eDRAM die and CPU die on the same substrate.
  • 2 GB of gDDR3 memory @800MHz (DDR3-1600), organized in 4x 4Gb (256Mx16) modules, sitting on a 64bit bus (@800MHz). That gives a net BW of 12800MB/s (12.5GB/s). We can conveniently refer to this pool as 'MEM2'. Currently 1GB of that pool is reserved for the OS. (See the quick bandwidth check after this list.)
  • 32 MB of unknown organisation, unknown specs eDRAM, sitting with the GPU. We can conveniently refer to this pool as 'MEM1'
  • Tri-core CPU, binary compatible with Gekko/Broadway, featuring 3MB of cache in asymmetric config: 2x 512KB, 1x 2048KB; so far several things indicate CPU cache is implemented via eDRAM itself. Unknown clock, unknown architecture enhancements (e.g. SIMD, etc).
  • AMD R700-originating GPU (R700 is AMD architecture codename 'Wekiva'), evolved into its own architecture (AMD architecture codename 'Mario'), relying on MEM1 for framebuffer purposes, but also for local render targets and scratch-pad purposes.
  • Memory access specifics: both MEM1 and MEM2 are read/write accessible by the CPU, both subject to caching. GPU in its turn also has access to both pools, and is likely serving as the north bridge in the system (an educated guess, subject to calling out).
  • System is equipped with extra co-processors in the shape of a dual-core ARM (unknown architecture) and a DSP core (again of unknown architecture) primarily for sound workloads.
  • Blu-ray-based optical drive, 22.5MB/s, 25GB media.
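
For a quick sanity check of that MEM2 figure, the back-of-the-envelope maths (nothing beyond the stated 64-bit bus and DDR3-1600 data rate goes into it):

# MEM2 bandwidth from the figures above: 64-bit bus, DDR3-1600 signalling.
bus_width_bytes = 64 // 8             # 8 bytes per transfer
transfers_per_second = 1600e6         # DDR3-1600 = 1600 MT/s (800MHz, double data rate)
bandwidth = bus_width_bytes * transfers_per_second
print(f"{bandwidth / 1e6:.0f} MB/s")            # 12800 MB/s
print(f"{bandwidth / (1e6 * 1024):.1f} GB/s")   # 12.5 GB/s (dividing by 1024)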

Immediate logical implications from the above (i.e. implications not requiring large leaps of logic):
  • Not all WiiU CPU cores are equal - one of them is meant to do things the other two are not. Whether that is related to BC, OS tasks, or both, is unclear.
  • When it comes to non-local GPU assets (read: mainly textures), WiiU's main RAM BW increase over Nintendo's most advanced SD platform (i.e. 5.48GB/s -> 12.5GB/s) hints that WiiU is mainly targeted at a 2x-3x resolution increase over the Wii, or IOW, 480p -> 720p (see the quick comparison after this list).
  • The shared access to the MEM1 pool by the GPU and CPU alike indicates the two units are meant to interact at low latency, not normally seen in previous console generations. Definitely a subject for interesting debates this one is.
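
And a rough check on the resolution reasoning above - pixel counts against the bandwidth jump, using only the figures already listed:

# Wii -> WiiU: pixel count increase vs. main RAM bandwidth increase.
wii_pixels  = 640 * 480       # 480p
wiiu_pixels = 1280 * 720      # 720p
wii_bw, wiiu_bw = 5.48, 12.5  # GB/s, from the hard facts list
print(f"pixels:    {wiiu_pixels / wii_pixels:.1f}x")   # 3.0x
print(f"bandwidth: {wiiu_bw / wii_bw:.1f}x")           # ~2.3x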

So what are the other 2 for? Do we have 1 main CPU like the PS3 but only with 2 SPUs?
Do PS3 and 360 have a separate audio-chip like the Wii-U or could some of the CPU trouble come from forcing the audio processing through the CPU too?

I'm not technically adept at all, so bear with me if these questions are stupid :P
DonMigs85
Member
(11-19-2012, 04:46 PM)
DonMigs85's Avatar
What happens if a GameCube disc is inserted in Wii mode? Does it reject it or does it actually recognize it?
Gravijah
Member
(11-19-2012, 04:48 PM)
Gravijah's Avatar

Originally Posted by DonMigs85

What happens if a GameCube disc is inserted in Wii mode? Does it reject it or does it actually recognize it?

what happens when you insert one into the revised Wii without BC? i'd assume it works the same way.
DonMigs85
Member
(11-19-2012, 04:56 PM)
DonMigs85's Avatar

Originally Posted by Gravijah

what happens when you insert one into the revised Wii without BC? i'd assume it works the same way.

Good question. I don't know anyone with that awful new model though, and can't find vids on Youtube.
efyu_lemonardo
May I have a cookie?
(11-19-2012, 05:00 PM)
efyu_lemonardo's Avatar

Originally Posted by xyla

So what are the other 2 for? Do we have 1 main CPU like the PS3 but only with 2 SPUs?
Do PS3 and 360 have a separate audio-chip like the Wii-U or could some of the CPU trouble come from forcing the audio processing through the CPU too?

AFAIK the Wii U's setup is unique in that it has a dedicated audio DSP and an ARM co-processor.

Regarding the triple core CPU, I assume he meant we can deduce the cores are asymmetrical due to one having significantly more cache than the other two. (2048 KB vs. 512 KB x2)

edit: If I had to guess I'd say the dual core ARM performs a function similar to that of the Wii's ARM co-processor, nicknamed Starlet by the Homebrew community. By this I mean software encryption and authentication, some wireless functionality and other I/O (USB, optical drive, Wii-specific elements needed for bc). It also possibly handles other background tasks such as downloading updates, managing friend requests, miiverse data, incoming video-calls, etc. since these things have been shown to occur while playing games.
Last edited by efyu_lemonardo; 11-19-2012 at 05:36 PM.
boiled goose
good with gravy
(11-19-2012, 05:01 PM)
boiled goose's Avatar
Whatever happened to bgassasin? His optimism now seems to be mostly unfounded.
Quentyn
Member
(11-19-2012, 05:03 PM)
Quentyn's Avatar
iFixit now has a teardown as well. Anything new in there? I noticed they have RAM chips from Micron in their WiiU.
codhand
Member
(11-19-2012, 05:07 PM)
codhand's Avatar

Panasonic MN864718 HDMI Controller

Samsung KLM8G2FE3B eMMC 8 GB NAND Flash/Memory Controller

Micron 2LEI2 D9PXV [part number MT41K256M16HA-125] 4 Gb DDR3L SDRAM (4 x 4 Gb for a total of 16 Gb or 2 GB RAM)
http://www.micron.com/~/media/Docume..._35V_DDR3L.pdf

DRH-WUP 811309G31

Fairchild DC4AY

SMC 1224EE402

Samsung K9K8G08U1D 4 Gb (512 MB) NAND Flash

Van Owen
Member
(11-19-2012, 05:09 PM)
Van Owen's Avatar

Originally Posted by amtentori

Whatever happened to bgassasin? His optimism now seems to be mostly unfounded.

He conveniently bailed on forums when more news of Wii U's capabilities started to surface.
xyla
Member
(11-19-2012, 05:16 PM)
xyla's Avatar

Originally Posted by gumby_trucker

AFAIK the Wii U's setup is unique in that it has a dedicated audio DSP and an ARM co-processor.

Regarding the triple core CPU, I assume he meant we can deduce the cores are asymmetrical due to one having significantly more cache than the other two. (2048 KB vs. 512 KB x2)

I see. So it could be possible that later ports will address the sound part. Maybe it already is and it doesn't matter at all, performance wise.
ozfunghi
Member
(11-19-2012, 05:22 PM)
ozfunghi's Avatar

Originally Posted by xyla

I see. So it could be possible that later ports will address the sound part. Maybe it already is and it doesn't matter at all, performance wise.

There were talks that a couple of ports didn't use the DSP for sound, but ran it over the CPU. IIRC, this is a noticeable burden; sound takes up 1/6th of the XB360 CPU.
Mlatador
Member
(11-19-2012, 05:22 PM)
Mlatador's Avatar
Good to see some civilised posters in here. Hopefully this thread stays this way. It's definitely a breath of fresh air among the pollution the launch day madness has caused.
ozfunghi
Member
(11-19-2012, 05:23 PM)
ozfunghi's Avatar

Originally Posted by codhand

Panasonic MN864718 HDMI Controller

Samsung KLM8G2FE3B eMMC 8 GB NAND Flash/Memory Controller

Micron 2LEI2 D9PXV [part number MT41K256M16HA-125] 4 Gb DDR3L SDRAM (4 x 4 Gb for a total of 16 Gb or 2 GB RAM)
http://www.micron.com/~/media/Docume..._35V_DDR3L.pdf

DRH-WUP 811309G31

Fairchild DC4AY

SMC 1224EE402

Samsung K9K8G08U1D 4 Gb (512 MB) NAND Flash

What is the 512MB Flash for? Solely Wii compatibility or is this for the OS or what?
cyberheater
PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 XBOX PS4 PS4
(11-19-2012, 05:24 PM)
cyberheater's Avatar
I'll ask again. Has it been confirmed that the CPU has direct access to the EDRAM?
MadeInBeats
Banned
(11-19-2012, 05:25 PM)

Originally Posted by Mlatador

Good to see some civilised posters in here. Hopefully this thread stays this way. It's definitely a breath of fresh air among the pollution the launch day madness has caused.

.

Would be good if EC could safeguard it too.
PopcornMegaphone
Member
(11-19-2012, 05:26 PM)
PopcornMegaphone's Avatar
Judging from the 720 rumors (more, but slower, RAM), would it be fair to say WiiU is designed more like their next gen model than the 360? I understand this is impossible to definitively answer, but right now Nintendo's design decisions seem curious. Perhaps, as others have mentioned, they're hoping for down-ports of next gen games. Seems unlikely, but I don't know.

edit - I understand WiiU is technically "next gen", but y'all know what I'm trying to say.
Last edited by PopcornMegaphone; 11-19-2012 at 05:29 PM.
Disorientator
Member
(11-19-2012, 05:34 PM)
Disorientator's Avatar

Originally Posted by Quentyn

iFixit now has a teardown as well. Anything new in there? I noticed they have RAM chips from Micron in their WiiU.

Nice one.

Edit: Broadcom module @ 5GHz (today's Broadcom Press Release)
Last edited by Disorientator; 11-19-2012 at 05:44 PM.
Ryoku
Member
(11-19-2012, 05:38 PM)
Ryoku's Avatar

Originally Posted by cyberheater

I'll ask again. Has it been confirmed that the CPU has direct access to the EDRAM?

CPU has its own EDRAM (3MB). 2MB of it to one of its cores, and 512KB to each of the other two cores.
Last edited by Ryoku; 11-19-2012 at 05:55 PM.
elty
Member
(11-19-2012, 05:42 PM)
Just wondering, is that single channel DDR3?
mrklaw
MrArseFace
(11-19-2012, 05:45 PM)
mrklaw's Avatar
does anyone have some specific examples of modern engines, and what the likely texture/vertex size would be per frame, so we can get some context around the lack (or not) of ram bandwidth?
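
Even a crude per-frame budget would help frame it - something like this (assuming the ~12.8GB/s main RAM figure from the OP and ignoring eDRAM, CPU traffic and every overhead):

# Theoretical ceiling on main-RAM traffic per rendered frame (MEM2 only).
mem2_mb_per_second = 12800
for fps in (30, 60):
    print(f"{fps} fps -> {mem2_mb_per_second / fps:.0f} MB of traffic per frame, best case")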
cyberheater
PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 XBOX PS4 PS4
(11-19-2012, 05:47 PM)
cyberheater's Avatar

Originally Posted by Ryoku

CPU has its own EDRAM (3MB). 2MB of it to one of its cores, and 512KB to each of the other two cores.

Thanks. So the CPU does not have any access to the 32MB eDRAM pool?
Raist
(11-19-2012, 05:47 PM)

Originally Posted by Gravijah

what happens when you insert one into the revised Wii without BC? i'd assume it works the same way.

Wait, what? They dropped BC on the Wii at some point?
xyla
Member
(11-19-2012, 05:54 PM)
xyla's Avatar

Originally Posted by ozfunghi

There were talks that a couple of ports didn't use the DSP for sound, but ran it over the CPU. IIRC, this is a noticeable burden; sound takes up 1/6th of the XB360 CPU.

So maybe some of the choppier games have their low FPS from brute forcing the 360 architecture onto the WiiU which obviously doesn't work.
Sounds promising for the future, when dev teams are more familiar with the hardware and are able to use it the way intended.
Ryoku
Member
(11-19-2012, 05:57 PM)
Ryoku's Avatar

Originally Posted by cyberheater

Thanks. So the CPU does not have any access to the 32MB eDRAM pool?

I'm not sure about that. Maybe. Maybe not.
Ken Masters
Member
(11-19-2012, 05:59 PM)
Ken Masters's Avatar
If 1gb of ram is reserved for OS then why does it take so damn long to open up menus?
The Abominable Snowman
Pure Life tonsil tickle
(11-19-2012, 05:59 PM)
The Abominable Snowman's Avatar

Originally Posted by PopcornMegaphone

Judging from the 720 rumors (more, but slower, RAM), would it be fair to say WiiU is designed more like their next gen model than the 360? I understand this is impossible to definitively answer, but right now Nintendo's design decisions seem curious. Perhaps, as others have mentioned, they're hoping for down-ports of next gen games. Seems unlikely, but I don't know.

edit - I understand WiiU is technically "next gen", but y'all know what I'm trying to say.

You know what, the "Wii HD" rumor first surfaced in 2006. There was a buzz about a Wii HD a little after the PS3 launched that was in the same power range as what we're seeing now.

I wonder if this is the Wii HD that we would've seen in 2008 or 2009 had those rumors panned out.
Ryoku
Member
(11-19-2012, 06:01 PM)
Ryoku's Avatar

Originally Posted by Ken Masters

If 1gb of ram is reserved for OS then why does it take so damn long to open up menus?

Unoptimized OS. It's bloated at this point. It'll get better over time (and probably see a reduction of the memory footprint, as well).
Twenty7KVN
Member
(11-19-2012, 06:02 PM)
Twenty7KVN's Avatar

Originally Posted by Ken Masters

If 1gb of ram is reserved for OS then why does it take so damn long to open up menus?

Software bugs.
AzaK
Member
(11-19-2012, 06:03 PM)
AzaK's Avatar

Originally Posted by Durante

I don't want to play backseat moderator, but I think this thread wasn't intended to be the place to speculate about the Wii U's fate on the market.

Agreed.
SlickShoesRUCrazy
Member
(11-19-2012, 06:09 PM)
SlickShoesRUCrazy's Avatar
Seems to me from a technical standpoint that Nintendo is going to continue with its 5-6 year cycles for consoles. We will probably see the next Nintendo console in 2016-2017.
Fourth Storm
Member
(11-19-2012, 06:13 PM)
Fourth Storm's Avatar
Nice OP, blu.

I was toying around with some of the system features of Wii U last night, and revisited an idea that was surely kicked around in one of the WUSTs. Regarding the asymmetric cache split on the CPU, I tend to agree with Shifty Geezer(?) on Beyond3D, who said that, basically, looking at the size of the CPU, it wouldn't make sense for the cores themselves to differ beyond cache. It would be an insult for any of the cores to be anything less than fully functional!

So I'm wondering this. We know with a fair amount of certainty that there are ARM cores on the GPU for OS, I/O, and security. So when I am using Netflix (works pretty damn good btw) or the browser, alone, does that all run on the ARM cores? No, I think not. I may have fallen into the trap of thinking that before, but the actual "OS" involves none of those applications. So now I am thinking perhaps "Espresso" has an energy saver mode for BC and non-game applications, and in this mode only Core 1 would be active. Perhaps that amount of cache just for applications is overkill, but since they are using a whole gig of RAM for the same purpose, maybe that is what's going on there with the asymmetric cache.
Last edited by Fourth Storm; 11-19-2012 at 06:17 PM.
Doc Holliday
Member
(11-19-2012, 06:16 PM)
Doc Holliday's Avatar
Honestly I really think Nintendo designed the Wii U around the Japanese market. The Ps3 and 360 never took off there so in some ways maybe the Wii U tech is still fresh in japan.

I wish the Iwata ask was more in depth about the actual specs and why they went that way.
G-Unit
Member
(11-19-2012, 06:19 PM)
G-Unit's Avatar
I like this topic: less doom and gloom, more diving into specs, but waiting for more facts before passing real judgement.
wsippel
(11-19-2012, 06:20 PM)

Originally Posted by Ken Masters

If 1gb of ram is reserved for OS then why does it take so damn long to open up menus?

It doesn't actually use the RAM. That would be my guess, at least. That would explain the horrible load times. And the insane amount of RAM they reserved - they simply don't really know how much they'll actually need, so they reserved a ton.
Kai Dracon
Writing a dinosaur space opera symphony
(11-19-2012, 06:21 PM)
Kai Dracon's Avatar

Originally Posted by Doc Holliday

Honestly I really think Nintendo designed the Wii U around the Japanese market. The Ps3 and 360 never took off there so in some ways maybe the Wii U tech is still fresh in japan.

I wish the Iwata ask was more in depth about the actual specs and why they went that way.

Could it be telling that, compared to the west, Nintendo's Japanese "lifestyle" videos for Wii U have focused on low-key single-person scenarios, in smaller apartments, etc.?

Aside from power consumption and size, I wonder if a super quiet device is also a big concern for appealing to Japanese customers.
Van Owen
Member
(11-19-2012, 06:22 PM)
Van Owen's Avatar

Originally Posted by Disorientator

Nice one.

Edit: Broadcom module @ 5GHz (today's Broadcom Press Release)

So Wii U can't connect to my 5GHz WiFi since it's used for the GamePad?
Fourth Storm
Member
(11-19-2012, 06:29 PM)
Fourth Storm's Avatar

Originally Posted by Kaijima

Could it be telling that, compared to the west, Nintendo's Japanese "lifestyle" videos for Wii U have focused on low-key single-person scenarios, in smaller apartments, etc.?

Aside from power consumption and size, I wonder if a super quiet device is also a big concern for appealing to Japanese customers.

Less noisy console means less tv volume necessary to hear over the whoosh which leads to happier neighbors? Could be...
Tallshortman
Member
(11-19-2012, 06:34 PM)
Tallshortman's Avatar

Originally Posted by Doc Holliday

Honestly I really think Nintendo designed the Wii U around the Japanese market. The Ps3 and 360 never took off there so in some ways maybe the Wii U tech is still fresh in japan.

I wish the Iwata ask was more in depth about the actual specs and why they went that way.

Why would you go mainly for the smallest of the 3 markets though? It's not like the US doesn't buy Nintendo products. You do have a point, I'm just wondering why. It has impressive power consumption which is a non-issue with cheap electricity in the US but more of a selling point in Japan.
KageMaru
Member
(11-19-2012, 06:35 PM)
KageMaru's Avatar

Originally Posted by Durante

Talking about the eDRAM, can we agree that its use is basically required just to achieve parity with PS3/360? Because if so, then I can't imagine that being a very good thing for developer support. In the end it boils down to manual caching / memory management, which was very unpopular on PS3. Clearly it's easier to deal with a single 32 MB buffer than six 256 kB buffers, but the central idea is still that of a user-managed scratchpad memory.

Now, personally, I love that idea and the associated programming challenges and opportunities (and loved it back in Cell, and when doing GPGPU), but I wonder if the general game developer mindset has changed enough to make it viable as a central pillar of a system's design.

I wish we had the exact bandwidth/latency of the EDRAM to GPU and CPU.

Well IMO devs are going to have to get used to it if the next xbox also has a small pool of fast memory for a buffer.
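
To put the 32MB in perspective, here's a rough 720p frame layout (the formats and render target count are just illustrative guesses, not anything from a devkit):

# Hypothetical 720p frame setup vs. the 32MB eDRAM pool.
width, height = 1280, 720
mb = 1024 * 1024
color_buffer   = width * height * 4      # RGBA8 back buffer
depth_buffer   = width * height * 4      # 32-bit depth/stencil
render_targets = 2 * width * height * 4  # two extra full-res targets
total = color_buffer + depth_buffer + render_targets
print(f"{total / mb:.1f} MB of 32 MB used")  # ~14.1 MB, leaving room for scratch data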

Originally Posted by Durante

I think his point was that games targeting 1080p on the big consoles could be ported to run at 720p on Wii U.

I think it's not a bad idea in general, and I made posts to that effect a while ago. The issues are that
- it assumes developers will be going for 1080p on those consoles
- it only helps in scaling graphics, which are pretty easy to scale in the first place. General purpose code is much harder to scale, and I doubt the Wii U CPU (and its bandwidth) will help there vis-a-vis PS4/720 (Even though the latter will probably also feature disappointing CPUs, at least IMHO)

If you're right, I apologize for the misunderstanding and agree with you entirely. Thing is, I imagine the most graphically intensive games next gen won't be 1080p, but 720p and that leaves less room for scalability for the Wii-U on top of the issue with general purpose code.

I'm taking a wait and see on the CPUs in the other two systems. I find it hard to believe that they will just stick stock jaguar cores in their system.

Originally Posted by randomengine

I think the Wii U will be more successful with this level of power than most people think. I think it is ingenious actually.

Think about it. The next generation is going to come out. Costs will explode, but profits will lag for quite a long period of time. Wii U will be the safe bet. It extends the life of the current generation, so everyone who wants to make games at the current level of power can do so, much more cheaply and at a higher profit, while at the same time not making games for an old, dead system. I can see a lot of smaller Japanese developers who absolutely don't want to absorb next-gen costs to use Wii U as a safe haven.

The way I see it playing out is other than the higher end next gen games, most games will be cross generational scaling per platform (similar to what we saw with BF3). After a few years, when engines are optimized and the install base for the two bigger platforms have grown, last gen will be slowly left behind. I don't see budgets exploding like they did this gen for multiple reasons.

This really is no different than the last few generations. A developer told me in the beginning of this generation that the PSone was the most profitable platform a few years into the DC/ps2/xbox/GC generation and he expected the same with the PS2 this gen. Only now we don't have one dominant platform, so support will be spread out across multiple current gen systems and the Wii-U will benefit because of this IMO.

Originally Posted by PopcornMegaphone

Judging from the 720 rumors (more, but slower, RAM), would it be fair to say WiiU is designed more like their next gen model than the 360? I understand this is impossible to definitively answer, but right now Nintendo's design decisions seem curious. Perhaps, as others have mentioned, they're hoping for down-ports of next gen games. Seems unlikely, but I don't know.

edit - I understand WiiU is technically "next gen", but y'all know what I'm trying to say.

Even if the rumors are true regarding the memory in the next gen xbox, it still won't be nearly as slow as the memory in the Wii-U. Plus it's supposed to have a pool of DRAM (sDRAM?) to help compensate for the slower memory.

I'm not entirely sure what Nintendo was thinking with this memory. They probably wanted to get the most amount possible at the cheapest price possible.

Originally Posted by Ken Masters

If 1gb of ram is reserved for OS then why does it take so damn long to open up menus?

I think we're seeing the effect of the low bandwidth here. It needs to access the amount reserved for the OS while the game is also trying to access the other 1GB. Unless it's also slow while you're in the dashboard, then that's pretty sad.
Van Owen
Member
(11-19-2012, 06:35 PM)
Van Owen's Avatar
I feel like Nintendo needs an American division to design hardware and the OS and let Japan handle game software...
v1oz
Member
(11-19-2012, 06:36 PM)
v1oz's Avatar

Originally Posted by Ken Masters

If 1gb of ram is reserved for OS then why does it take so damn long to open up menus?

Nintendo are not known for writing OSes.
Disorientator
Member
(11-19-2012, 06:38 PM)
Disorientator's Avatar

Originally Posted by Van Owen

So Wii U can't connect to my 5GHz WiFi since it's used for the GamePad?

Not an expert but:

The greatest strength of the 5 GHz band is the availability of 23* non-overlapping channels; 20* more channels than what is available in the 2.4 GHz band. Since there is no other wireless technology that “fights” for the radio space, the 23* available non-overlapping channels can provide a possibility for easier planning of an interference-free and stable wireless communication. Another advantage of the 5 GHz band is that the greater number of available channels provides for increased density, which means that more wireless devices can be connected in the same radio environment.

source
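
In short (simplified, ignoring regional channel rules): 2.4GHz channels sit only 5MHz apart while each transmission is roughly 22MHz wide, which is why so few of them avoid overlapping:

# Why 2.4 GHz only yields ~3 clean channels: centres 5 MHz apart, ~22 MHz wide.
centres = [2412 + 5 * n for n in range(13)]  # channels 1-13, in MHz
width = 22
clean = []
for c in centres:
    if all(abs(c - other) >= width for other in clean):
        clean.append(c)
print(len(clean), "non-overlapping 2.4 GHz channels")  # 3 (channels 1, 6 and 11)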
Van Owen
Member
(11-19-2012, 06:41 PM)
Van Owen's Avatar

Originally Posted by Disorientator

Not an expert but:



source

Ok. Should be able to test it out tomorrow if I decide to keep my Wii U...
pulsemyne
Member
(11-19-2012, 06:41 PM)
Seems like the Hynix memory used is gDDR like the Samsung memory. This is DDR3 that is specifically tweaked for graphics/desktop.
Also, if the x720 rumours are true then you will likely see a similar set-up to the WiiU's. Considering console makers are desperately trying to keep costs down, using DDR3 is the way to go. It's still far cheaper than GDDR3 and especially GDDR5.
NBtoaster
Member
(11-19-2012, 06:42 PM)
NBtoaster's Avatar

Originally Posted by wsippel

It doesn't actually use the RAM. That would be my guess, at least. That would explain the horrible load times. And the insane amount of RAM they reserved - they simply don't really know how much they'll actually need, so they reserved a ton.

It also has a separate processor for the OS too, right? Maybe that's not so hot.
v1oz
Member
(11-19-2012, 06:47 PM)
v1oz's Avatar

Originally Posted by Van Owen

I feel like Nintendo needs an American division to design hardware and the OS and let Japan handle game software...

Agreed. In fact I think they should let their partners like ATI and IBM design their hardware instead. Basically give them a certain price point, like 199 full retail and tell them to engineer the best possible hardware for that price point.

I have to quote the Eurogamer review because I agree with them 100%. Because Nintendo mass produce and buy in huge quantities they can source parts far cheaper than most manufacturers.

Originally Posted by Eurogamer

Bearing in mind the general level of performance we've seen from the £300 Digital Foundry PC - built using off-the-shelf parts - it's a touch disappointing that graphical quality in Wii U shows no generational leap at all over the Xbox 360 or PlayStation 3. The DFPC is in the same ballpark price-point as Wii U, it features superior CPU/GPU power, twice as much RAM and much more storage. The notion that Nintendo could not match or better it in an integrated design bearing in mind the vast buying power it has at its disposal - even factoring in the additional cost of touch-screen GamePad - is disappointing.

In a world where Chinese manufacturers can sell complete Android tablets with capacitive touch-screens for £50, it's safe to say that the Wii U GamePad won't be costing Nintendo too much to construct. That being the case, factoring in the modest processing power on offer, we were firmly of the belief that the platform holder would be targeting a £199/$299 price-point for Wii U. Sadly, it was not to be. From what we've experienced of the hardware and games thus far, the new console definitely feels a bit pricey, bearing in mind the gaming proposition on offer.

The Abominable Snowman
Pure Life tonsil tickle
(11-19-2012, 06:50 PM)
The Abominable Snowman's Avatar

Originally Posted by pulsemyne

Seems like the Hynix memory used is gDDR like the Samsung memory. This is DDR3 that is specifically tweaked for graphics/desktop.
Also, if the x720 rumours are true then you will likely see a similar set-up to the WiiU's. Considering console makers are desperately trying to keep costs down, using DDR3 is the way to go. It's still far cheaper than GDDR3 and especially GDDR5.

No.

The cost difference between DDR3 and GDDR3 is not so insurmountable as to fuck the performance of the systems to that degree. Chill.
The Boat
Member
(11-19-2012, 06:50 PM)
The Boat's Avatar
But... they are targeting the $299 price-point. That's what the base model costs.
Eteric Rice
Junior Member
(11-19-2012, 06:51 PM)
Eteric Rice's Avatar

Originally Posted by mrklaw

smaller/indie devs have been producing great looking games on PS3/360 already this gen, and that will continue next gen. You don't have to spend a fortune to develop a game.

Maybe Nintendo is hoping to get those smaller, indie games? Might be evidenced by free patching and what not.
