
Wii U Speculation Thread of Brains Beware: Wii U Re-Unveiling At E3 2012

Status
Not open for further replies.

ReyVGM

Member
If the Power7 only comes with 4, 6 or 8 cores, and the Power7 has now been confirmed (really confirmed?) to be what the Wii U will use, then doesn't that make the unconfirmed info about using 3 cores fake, or at least obsolete?

Are people using the old 3-core news to deny the confirmed Power7 chip news on the Wii U?
 

EloquentM

aka Mannny
ReyVGM said:
If the Power7 only comes with 4, 6 or 8 cores, and the Power7 has now been confirmed (really confirmed?) to be what the Wii U will use, then doesn't that make the unconfirmed info about using 3 cores fake, or at least obsolete?

Are people using the old 3-core news to deny the confirmed Power7 chip news on the Wii U?
read all the posts literally right above yours
 

wsippel

Banned
MDX said:
Well I guess it's really not an assumption, but a fact:


http://twitpic.com/594zsv/full

Wii will be using a customized Power 7 chip
It certainly would need to be customized, considering all the weird features the chip has. It's a bit weird that lherre says the CPU was two-way SMT when Power7 is four-way SMT. Unless the number of threads was reduced after a couple of execution units were stripped - the DFPU wouldn't be needed, and one VMX unit and two dual-pipelined VSX units per core are overkill, anyway.
 

EloquentM

aka Mannny
Maxrpg said:
4th core is probably designated to tablet functions
that's another possible explanation. I'm betting more on the OS speculation though, since the 3DS has one of its cores strictly reserved for its OS.
 

Deguello

Member
I wonder about that IBM tweet myself. It's from June of this year - in fact, it was pretty much during E3.

Are they truly confirming that the Power7 is in the Wii U or were they lost in the confusion of the trade event?
 
wsippel said:
It certainly would need to be customized, considering all the weird features the chip has. It's a bit weird that lherre says the CPU was two-way SMT when Power7 is four-way SMT. Unless the number of threads was reduced after a couple of execution units were stripped - the DFPU wouldn't be needed, and one VMX unit and two dual-pipelined VSX units per core are overkill, anyway.

Isn't it possible that the chip that's in the dev kits now isn't a power7 based one? Meaning they've got something in there that's similar enough, but are still waiting on final parts?
 
Deguello said:
I wonder about that IBM tweet myself. It's from June of this year - in fact, it was pretty much during E3.

Are they truly confirming that the Power7 is in the Wii U or were they lost in the confusion of the trade event?

Why would IBM be lost in the confusion of E3? What else would they have invested in at E3 to get confused about? Are you suggesting maybe the Xbox 3's CPU is a Power7 and they somehow got confused?


Shin Johnpv said:
Isn't it possible that the chip that's in the dev kits now isn't a power7 based one? Meaning they've got something in there that's similar enough, but are still waiting on final parts?

I think this is fairly likely. This is how they did the earlier PS3 devkits and used a lot of RAM in lieu of the finished Cell.
 

Deguello

Member
Lupin the Wolf said:
Why would IBM be lost in the confusion of E3? What else would they have invested in at E3 to get confused about? Are you suggesting maybe the Xbox 3's CPU is a Power7 and they somehow got confused?




I think this is fairly likely. This is how they did the earlier PS3 devkits and used a lot of RAM in lieu of the finished Cell.

I don't know. E3 is a time of rapid question-asking and response-giving. I certainly believe it myself, but it wouldn't be the first time somebody misspoke on an official page.
 
Naked Prime said:
If the WiiU can run Battlefield 3 (or Witcher 2) at 1080P, 60FPS and texture detail almost on par with the PC version at launch, imagine how it will perform later on in its life-cycle. Additionally, I hope that Nintendo signs with Valve to power their online & digital distribution. EA would be better for Nintendo, but Valve better for gamers/consumers IMO.

While there may be something to these next Xbox rumors, MS's true appeal lies in Xbox Live and its related services, which would likely do just as well in a machine with a quad core and 1 GPU, much less a hex-core and a dual-GPU setup. But we'll see.


You'll get 1080p or 60fps, but not both.
 

AmFreak

Member
Lupin the Wolf said:
No one said it was a "stock" Power7 chip, but they just confirmed it was a Power7. It probably uses the same type of PowerPC architecture with some customization options (Out-of-order?).

That was an answer to this:

"if power7 comes in 4, 6, and 8 core variants why do rumors suggest wiiu is 3 cores?"

That sounded to me like the poster was thinking of a stock Power7, since a customized chip could have 3 cores. Power7 is already OOE, btw.
 

BurntPork

Banned
Lupin the Wolf said:
No one said it was a "stock" Power7 chip, but they just confirmed it was a Power7. It probably uses the same type of PowerPC architecture with some customization options (Out-of-order?).




There you go.
Power7 is already out-of-order.

Edit: fucking beaten
 
AmFreak said:
That was an answer to this:

"if power7 comes in 4, 6, and 8 core variants why do rumors suggest wiiu is 3 cores?"

That sounded to me like the poster was thinking of a stock Power7, since a customized chip could have 3 cores. Power7 is already OOE, btw.

True enough. Also, I put a question mark because I wasn't positive if it was OOE by default or not.
 

Ormberg

Member
The POWER7 architecture is a very interesting approach, and I think I understand why Nintendo might have chosen it.

As we know, Moore's Law isn't really up to snuff anymore; the progress lies with multicore. However, writing code for multiple cores is hard, and whilst engines have improved, there are still many problems with writing such engines. Even today, as has been discussed already in this thread (I think), most engines only rely on two cores.

I won't dive into why writing multicore game engines is a nightmare, so I'll simply cut and paste from the Wikipedia article on POWER7 on why I think Nintendo went with it:
One feature that IBM and DARPA collaborated on is modifying the addressing and page table hardware to support global shared memory space for POWER7 clusters. This enables research scientists to program a cluster as if it were a single system, without using message passing. From a productivity standpoint, this is essential since some scientists are not conversant with MPI or other parallel programming techniques used in clusters.
http://en.wikipedia.org/wiki/Power7

Having worked with this, I know that any help with reducing the complexity of concurrency problems, race conditions etc. is very welcome.

This choice of hardware would build upon the Wii, where the aim was to help developers reduce the work on the engine, focusing more on gameplay rather than spending time on implementing message flow in the engine.

Thoughts?
 

disap.ed

Member
wsippel said:
It certainly would need to be customized, considering all the weird features the chip has. It's a bit weird that lherre says the CPU was two-way SMT when Power7 is four-way SMT. Unless the number of threads was reduced after a couple of execution units were stripped - the DFPU wouldn't be needed, and one VMX unit and two dual-pipelined VSX units per core are overkill, anyway.

I thought about lherre's words too; maybe it is in fact only a 2-way SMT variant (6 threads are probably sufficient for most tasks) and they wanted to lower the transistor count. Do Intel's i5 and i7 have separate dies, or is hyperthreading just deactivated on i5s?
 

Vinci

Danish
DeaconKnowledge said:
I really wish you guys would talk about something else.

The Wii by design was a generation behind PS360, so in that relative sense it was underpowered. There is no way this would apply to Wii U, even if it is technologically inferior to the next XBOX or PlayStation. All Nintendo has to do on the tech side is guarantee Wii U is still in the conversation with the other two consoles and the battle is won.

Nintendo doesn't, and never had to, match Sony and Microsoft shoulder to shoulder. Their mission is to make it financially feasible for third parties to leverage their existing tools and engines over to the Wii U with little legwork or financial investment. Re-configuring already established engines and hiring new teams for one platform didn't make fiscal sense for Wii. The end.

Thank you. It's been pretty obvious that Nintendo reaching anything remotely close to parity with the other two in terms of 3rd party support would be very destructive to the company's competitors in two fundamental ways:

1) Nintendo's 1st party games outsell everyone else's. This is a fact, and it's been the case since the NES. No matter the outcome of Nintendo's consoles, the software has always sold.

2) It would strike a massive thorn into the paw of the '3rd party games don't sell on Nintendo systems' myth the industry has promoted back, forward, and sideways for over a decade.

Neither of these is good news for Sony or Microsoft.
 

gaheris

Member
Ormberg said:
The POWER7 architecture is a very interesting approach, and I think I understand why Nintendo might have chosen it.

As we know, Moore's Law isn't really up to snuff anymore; the progress lies with multicore. However, writing code for multiple cores is hard, and whilst engines have improved, there are still many problems with writing such engines. Even today, as has been discussed already in this thread (I think), most engines only rely on two cores.

I won't dive into why writing multicore game engines is a nightmare, so I'll simply cut and paste from the Wikipedia article on POWER7 on why I think Nintendo went with it:

http://en.wikipedia.org/wiki/Power7

Having worked with this, I know that any help with reducing the complexity of concurrency problems, race conditions etc. is very welcome.

This choice of hardware would build upon the Wii, where the aim was to help developers reduce the work on the engine, focusing more on gameplay rather than spending time on implementing message flow in the engine.

Thoughts?

WOW that makes a lot of sense now that I think about it. I hope that it proves to be true.
 

McHuj

Member
Ormberg said:
The POWER7 architecture is a very interesting approach, and I think I understand why Nintendo might have chosen it.

As we know, Moore's Law isn't really up to snuff anymore; the progress lies with multicore. However, writing code for multiple cores is hard, and whilst engines have improved, there are still many problems with writing such engines. Even today, as has been discussed already in this thread (I think), most engines only rely on two cores.

I won't dive into why writing multicore game engines is a nightmare, so I'll simply cut and paste from the Wikipedia article on POWER7 on why I think Nintendo went with it:

http://en.wikipedia.org/wiki/Power7

Having worked with this, I know that any help with reducing the complexity of concurrency problems, race conditions etc. is very welcome.

This choice of hardware would build upon the Wii, where the aim was to help developers reduce the work on the engine, focusing more on gameplay rather than spending time on implementing message flow in the engine.

Thoughts?


I don't believe that's applicable to the Wii U. The CPU (whatever it is) already sees a shared memory. Yes, you have concurrency issues that you have to deal with as a programmer, but they are different from the concurrency issues and data partitioning that a computer cluster would see.

Explicitly having to transfer data between computing nodes is a design challenge in itself (through an API like MPI); creating a shared memory (gigabytes if not terabytes) like this quote suggests would be a tremendous help to someone programming a supercomputer. You're talking about a scale of probably hundreds of CPUs.
 
EloquentM said:
if power7 comes in 4, 6, and 8 core variants why do rumors suggest wiiu is 3 cores?

1 core is probably reserved for the OS.

EloquentM said:
that's another possible explanation. I'm betting more on the OS speculation though, since the 3DS has one of its cores strictly reserved for its OS.

Not true. It was reserved solely for the OS, but since the last firmware update developers can now access the 2nd core for their games.
 
Nuclear Muffin said:
1 core is probably reserved for the OS.
this is the likely answer. We know the 3DS has done something similar with regard to dedicating resources to the OS. Granted, Nintendo is apparently about to let devs access said dedicated resource soon.
 

Log4Girlz

Member
EloquentM said:
what would be the point of that?

Do you know what manufacturers do with high-end GPUs that are somewhat defective but still usable? Like if some shader units aren't functioning? They sell them as mid-range or low-range parts... tada.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
Rolf NB said:
No they're not. ARM CPUs are beastly clock-for-clock ..
The dual-issue, in-order A8 (and its non-pipelined fpu) is beastly now? Well then, I guess the 750 with its dual-plus-a-branch, shallow out-of-order issue (and a fully-pipelined fpu) is an ungodly monstrosity of a cpu, eh?

.. and it's clocked higher than the Wii CPU to boot.
That it surely is. It also has some abnormally large (for an A8) L2 cache of 512KB, while Broadway has "only" 256KB L2 (also DMA-able to/from the GPU for ultra fast CPU-to-GPU feeds [ed: it's the L1 cache which had the DMA engine used for the direct CPU-GPU transfers; false memories FTL]). Unfortunately for the iPad, the Wii also has 24MB of ultra low-latency 1T-SRAM with 2GB/s of BW to the CPU, 4GB/s to the GPU, and both units getting *in addition* hefty BW from the GDDR3 pool (both CPU and GPU have access to either pool, and games decide how to best utilise that). That's even before considering that Hollywood has some 15.5GB/s when operating from its own 1MB of texcache. In comparison, the iPad's UMA memory comprises mDDR that peaks at 2.66GB/s [ed: thanks Lonely1 for the memory type correction], which is all that is shared between the CPU and the GPU. No wonder Apple had to go with the bigger L2 cache - they had to lower that pressure on the memory bus somehow.

Without looking at memory paths, the Wii's and iPad's CPUs are not very different MIPS-wise. At some synthetic int tests the A8 might even edge out Broadway due to the former's clock advantage, but at normal int workloads they'd be neck-and-neck. And of course, Broadway would always slaughter the A8 at common scalar fp work. Once we consider the datapaths picture though, the Wii's RAM can serve the system much better for most practical game-related scenarios. The only scenario where the iPad might have a chance at an advantage is excessive use of RTT ping-ponging.

The GPU is also as wide as Wii's, clocked comparably, supports a lot more fancy shading features, and is much more efficient because it's a TBDR.
..And it peaks at 400MPix/s of theoretical physical fillrate (200MHz * 2 ROPs) for the case when its unified shader architecture does nothing but the lightest imaginable pixel shading (basically nothing more than a single texture). Unfortunately, in more realistic scenarios the same shader units also have vertex work to do, so there goes your fillrate, before you'd even start doing any fancy shading work (at a 25/75 split of vs/ps your max theoretical fillrate has already dropped to 300MPix/s). So the 535 has to rely on its HSR to save the day. Unfortunately, TBDR's HSR advantage vanishes when fillrate is needed the most, e.g. in situations with heavy translucency overdraw like particles, etc. In contrast, Hollywood has some 1GPix/s of absolutely sustainable physical fillrate (with one texture and vertex color), and a separate TnL unit to boot. And Hollywood has an embedded fb with hefty BW plus Early-Z HSR (not exactly as efficient as TBDR at opaque overdraw, but still a very snappy HSR tool when used right), which does counter TBDR's traditional advantages over IMRs.

As I said, the Wii is decisively better than the ipad in most game-related scenarios.
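The fillrate numbers in the post reduce to simple arithmetic; here is a quick Python restatement (all figures are the post's own claims, not independently verified):

```python
# SGX535 side: theoretical fillrate and what's left once vertex work bites.
clock_hz = 200e6                      # 200MHz, per the post
rops = 2
peak_fill = clock_hz * rops           # 400 MPix/s theoretical peak
ps_share = 0.75                       # the post's 25/75 vs/ps split
fill_with_vs = peak_fill * ps_share   # 300 MPix/s before any fancy shading

# Hollywood side: sustained physical fillrate claimed in the post.
hollywood_fill = 1e9                  # ~1 GPix/s
advantage = hollywood_fill / fill_with_vs   # rough raw-fillrate ratio
```

On these numbers Hollywood's sustained fillrate is more than 3x the 535's already-reduced theoretical peak, which is why the post says the 535 has to lean on HSR.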
 
The lowest iteration of the POWER7 CPU claims 6 cores as its max... what's the likelihood of Nintendo only selecting, say, 3 or 4 and not going for 6? Pretty high, I'd imagine...
 

DCKing

Member
The lowest iteration of the POWER7 CPU claims 6 cores as its max... what's the likelihood of Nintendo only selecting, say, 3 or 4 and not going for 6? Pretty high, I'd imagine...
100%.

Chip size is a huge consideration for console manufacturers. It's what drives silicon costs up, destroys yields and increases motherboard and cooling complexity. POWER7 chips are 576mm^2 in size. Nintendo will not want to go beyond 200mm^2 (360's was 176, GC's was 43, so probably even smaller) and therefore will have it customized beyond recognition. The first things to go are at least four cores, which will effectively halve the original design's size. Next up is core simplification: throwing out supercomputer and enterprise stuff (who needs decimal floating-point units?), decreasing multithreading capabilities (POWER7 cores are 4-way, Wii U could be 2-way), cache reduction and so on. Their goal is not only to come up with a smaller core, but a much cooler one as well.

That's also why everybody saying Nintendo will use a chip with one disabled core is 100% wrong. When you have a chip tailor-made for your console, you are not going to ship it with defective components. An entire defective core is something that still needs to be made, put on a motherboard and shipped. Sony did it with Cell, but Cell has been a tragedy Nintendo doesn't want to repeat.
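The die-area argument above can be sketched as back-of-the-envelope arithmetic. The 80/20 split between core-scalable area and fixed area below is my assumption for illustration; only the 576mm^2 starting figure comes from the post:

```python
# Rough area bookkeeping behind the "halve the design" argument.
full_die_mm2 = 576.0    # 8-core POWER7 figure used in the post
cores = 8
scalable_share = 0.8    # assumed: cores + their cache slices dominate the die
fixed_mm2 = full_die_mm2 * (1 - scalable_share)
per_core_mm2 = full_die_mm2 * scalable_share / cores

def est_area(n_cores):
    """Estimated area of a cut-down chip keeping n_cores cores."""
    return fixed_mm2 + n_cores * per_core_mm2
```

Under this (assumed) split, halving the core count only gets you to roughly 346mm^2 - still well above a ~200mm^2 console target, which is the post's point about also needing the deeper cuts (SMT width, DFPU, cache) on top of dropping cores.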
 

Lonely1

Unconfirmed Member
The dual-issue, in-order A8 (and its non-pipelined fpu) is beastly now? Well then, I guess the 750 with its dual-plus-a-branch, shallow out-of-order issue (and a fully-pipelined fpu) is an ungodly monstrosity of a cpu, eh?


That it surely is. It also has some abnormally large (for an A8) L2 cache of 512KB, while Broadway has "only" 256KB L2 (also accessible from the GPU for ultra fast CPU-to-GPU feeds). Unfortunately for the iPad, the Wii also has 24MB of ultra low-latency 1T-SRAM with 2GB/s of BW to the CPU, 4GB/s to the GPU, and both units getting *in addition* hefty BW from the GDDR3 pool (both CPU and GPU have access to either pool, and games decide how to best utilise that). That's even before considering that Hollywood has some 15.5GB/s when operating from its own 1MB of texcache. In comparison, the iPad's UMA memory comprises LPDDR2 that peaks at 3.2GB/s, which is all that is shared between the CPU and the GPU. No wonder Apple had to go with the bigger L2 cache - they had to lower that pressure on the memory bus somehow.

Without looking at memory paths, the Wii's and iPad's CPUs are not very different MIPS-wise. At some synthetic int tests the A8 might even edge out Broadway due to the former's clock advantage, but at normal int workloads they'd be neck-and-neck. And of course, Broadway would always slaughter the A8 at common scalar fp work. Once we consider the datapaths picture though, the Wii's RAM can serve the system much better for most practical game-related scenarios. The only scenario where the iPad might have a chance at an advantage is excessive use of RTT ping-ponging.


..And it peaks at 400MPix/s of theoretical physical fillrate (200MHz * 2 ROPs) for the case when its unified shader architecture does nothing but the lightest imaginable pixel shading (basically nothing more than a single texture). Unfortunately, in more realistic scenarios the same shader units also have vertex work to do, so there goes your fillrate, before you'd even start doing any fancy shading work (at a 25/75 split of vs/ps your max theoretical fillrate has already dropped to 300MPix/s). So the 535 has to rely on its HSR to save the day. Unfortunately, TBDR's HSR advantage vanishes when fillrate is needed the most, e.g. in situations with heavy translucency overdraw like particles, etc. In contrast, Hollywood has some 1GPix/s of absolutely sustainable physical fillrate (with one texture and vertex color), and a separate TnL unit to boot. And Hollywood has an embedded fb with hefty BW plus Early-Z HSR (not exactly as efficient as TBDR at opaque overdraw, but still a very snappy HSR tool when used right), which does counter TBDR's traditional advantages over IMRs.

As I said, the Wii is decisively better than the ipad in most game-related scenarios.

I was going to ask "beastly compared to what?", since it gets beaten by even Atom, but this is much better. Thanks. But what the iPad GPU does show is the importance of a modern feature set over pure raw power. And isn't the A4 still LPDDR1?
 

Lonely1

Unconfirmed Member


30 FPS with top-notch parts. I don't think the Wii U will be able to match these results (at least not in the beginning).

And it is 1080p with 4xAA, to be fair.

The 6850 (which will be the maximum performance level we can expect of the next gen, IMO) does 20fps under these settings; maybe with the lower overhead of consoles (2x according to Carmack) and without AA we will see 1080p @ 60fps, but definitely not on Wii U (or only with other technical limitations).

I specifically said UE3 games...
 

wsippel

Banned
lherre said current devkits had three dual threaded cores. What if the final hardware will in fact have two Power7 cores, four-way multithreaded, with two threads reserved?
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
But what the iPad GPU does show is the importance of a modern feature set over pure raw power.
Well, it has been repeated ad nauseam by some respectable middleware devs ; )

And isn't the A4 LPDDR1 still?
Yep, my bad entirely. The iPad has 2 x 1Gb x 333Mb/s = 2.66GB/s (64-bit bus).
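The corrected figure is just bus-width arithmetic; a quick sketch (the unit conversion framing is mine):

```python
# A 64-bit bus at 333 MT/s moves 8 bytes per transfer.
bus_bits = 64
transfers_per_s = 333e6
bw_bytes_per_s = (bus_bits // 8) * transfers_per_s   # bytes per second
bw_gb_per_s = bw_bytes_per_s / 1e9                   # ~2.66 GB/s
```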
 

Deguello

Member
Does anybody know how much an actual standalone Power7 CPU costs?

I've been in customer service chats with IBM and I'm still not able to glean anything about how much one would cost, only that it's "100 PVUs per core." They keep asking me to refer to my server reseller (lawl) and I get that they mostly serve businesses (Hence the B in IBM) but it's still confusing as heck trying to figure out how much the part actually costs.
 

DCKing

Member
lherre said current devkits had three dual threaded cores. What if the final hardware will in fact have two Power7 cores, four-way multithreaded, with two threads reserved?
Two POWER7 cores seems like a proper way to reduce die size.
But what would they reserve two threads for, though?
 

wsippel

Banned
Does anybody know how much an actual standalone Power7 CPU costs?

I've been in customer service chats with IBM and I'm still not able to glean anything about how much one would cost, only that it's "100 PVUs per core." They keep asking me to refer to my server reseller (lawl) and I get that they mostly serve businesses (Hence the B in IBM) but it's still confusing as heck trying to figure out how much the part actually costs.
Completely irrelevant. It's that expensive because it's a high end, low volume component. Order 50 million pieces, not to mention stripped down parts, and you'll get a very different price.


Two POWER7 cores seems to be a proper way to reduce die size.
But what would they reserve two threads for though?
OS maybe.
 

Deguello

Member
Completely irrelevant. It's that expensive because it's a high end, low volume component. Order 50 million pieces, not to mention stripped down parts, and you'll get a very different price.

I just wanna know the price. I'm not saying Nintendo is going to use one of the 32-core ones reserved for industrial servers.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
Apropos, I just recalled something I read once that could be of interest to this discussion: Power7 has a dynamically configurable number of threads per core. For instance, in 'Power6 BC' mode the cores are 2-wide SMT.

Also, the 567mm^2 area is for the 8-core, 32MB edram die.
 

BurntPork

Banned
I'm bored, so this is random.

My new spec guess, now with more sanity!

-1.5GHz POWER7 based quad-core with 2-way SMT @45nm. One core is dedicated to the system and won't be used for games. Unnecessary features are removed. Customizations allow perfect hardware emulation of Broadway. 5-10MB eDRAM

-600MHz customized Turks GPU @40nm. Customizations allow perfect hardware emulation of Hollywood. 16MB eDRAM

-1GB GDDR5 unified RAM

- Proprietary Wii U Optical Disc Drive, capable of reading dual-layer Wii U Optical Discs. Each layer holds up to 25GB, allowing 50GB total.

- 8GB internal flash storage. Game saves can also be transferred to a cloud service which will be revealed by Nintendo early next year as part of the next major 3DS update. Still no internal HDD, but games can be saved on USB drives.

- USB 3.0 ports. Still no built-in Ethernet or optical audio out.

- 802.11b/g/n. No dual band support.

Other predictions.

- GCN games will be added to the Virtual Console. Standard price will be $14.99 in the US.

- There will be a unified friend code-based friends list, but third-parties will be allowed to make games or online systems with separate, name-based friends lists. All lists will have some method of transfer.

- It will not be possible to buy most third-party DLC via the eShop. Each publisher will have their own shop interface. (Unless EA gets what they want...)

- Only one person needs to enter a friend code; the second person receives an invite.

- Party text, voice, and video chat.

- Everyone will still hate it.

- Nintendo will place a lot more focus on DD.

- Several new IPs will be shown at E3 2012. None will release until Holiday 2013 at the earliest, at least one will be canceled, and only the worst non-canceled one will make it to the US.

- Reggie will be assassinated. It will be swift. When he suddenly collapses, the first question asked will be "What's wrong with you?" before the bullet is noticed. The assassin will then check Reggie's box on his list.

- The funk is real.
 

Gravijah

Member
- It will not be possible to buy most third-party DLC via the eShop. Each publisher will have their own shop interface. (Unless EA gets what they want...)

i don't see the point of this. i think, even if there's an in game shop, that you'll still be able to buy dlc from the eShop.
 

BurntPork

Banned
i don't see the point of this. i think, even if there's an in game shop, that you'll still be able to buy dlc from the eShop.

Maybe, and hope so, but third-party publishers would LOVE complete control of their content, and they might convince Nintendo to give it to them.
 

Man God

Non-Canon Member
RAM is going to be a bit higher. I'm thinking 1.5GB.

Also it is going to be exactly like everything we love from Apple and will drive Burntpork mad(der)
 

BurntPork

Banned
RAM is going to be a bit higher. I'm thinking 1.5GB.

Also it is going to be exactly like everything we love from Apple and will drive Burntpork mad(der)

Apple already drives me mad. Well, Apple fans and media, anyway.

I have doubts about it having 1.5GB, since that would mean the dev kit should have at least 3GB.

Also, who killed the avatars? :(
 
^ Be careful not to bring up the avatars too much BP.

Well I guess it's really not an assumption, but a fact:


http://twitpic.com/594zsv/full

Wii will be using a customized Power 7 chip

How old is this exactly? The fact that it links to Engadget suggests it's from back around E3. That seems like a pretty big waste of die space. For those mentioning the 4, 6, and 8-core variations, everything I've seen has indicated that IBM only makes the 8-core version and disables cores to achieve the others. In other words, Wii U's CPU would be a 3-core processor with 5 cores disabled. I have a tough time believing it's a full POWER7 chip.

And I'm glad this is the discussion. Stepping away for a moment from my (weak) belief that we may not be able to properly classify the CPU's architecture, I found a picture of the POWER7 core last night and was waiting for the boards to come back up to discuss it. I wanted input from wsippel, blu, and anyone else about what they felt could be modified from it.

ibmpowerchip7pic2.jpg


I'm still of the opinion that it may resemble Xenon in that it puts together modified cores on their own die with a pool of eDRAM for L2 cache. So I could see the L2 cache being "trimmed" from the core. Considering IBM's process achieves a density of around 1MB per 2mm^2, and looking at how large Xenon's 1MB of L2 cache is compared to its cores, I would think they could place a pretty large amount in a shared pool like Xenon's.
 
I'm bored, so this is random.

My new spec guess, now with more sanity!

-1.5GHz POWER7 based quad-core with 2-way SMT @45nm. One core is dedicated to the system and won't be used for games. Unnecessary features are removed. Customizations allow perfect hardware emulation of Broadway. 5-10MB eDRAM

Xenon is a 3.2 GHz tri-core. Also, I believe as of the last leak, the Wii U CPU was tri-core. It will not be weaker than the 360 if it's as easy to port to as Vigil is claiming. In fact, IGN's pre-E3 leaks claimed it would be clocked faster than Xenon.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
How old is this exactly? The fact that it links to Engadget suggests it's from back around E3. That seems like a pretty big waste of die space. For those mentioning the 4, 6, and 8-core variations, everything I've seen has indicated that IBM only makes the 8-core version and disables cores to achieve the others. In other words, Wii U's CPU would be a 3-core processor with 5 cores disabled. I have a tough time believing it's a full POWER7 chip.
Something's telling me we might be able to see, in the not-too-distant future, a new off-the-shelf Power7 version with a smaller die than the 6/8-core variants... Not unlike how one can get an off-the-shelf "gekko" 750CL nowadays.

For one, you can immediately scratch out the DFU (decimal floating-point unit).
 

Deguello

Member
Something's telling me we might be able to see, in the not-too-distant future, a new off-the-shelf Power7 version with a smaller die than the 6/8-core variants... Not unlike how one can get an off-the-shelf "gekko" 750CL nowadays.


For one, you can immediately scratch out the DFU (decimal floating-point unit).

You should also take off those blue things, they look like razor blades and the manufacturers might cut themselves.

And CRU BRU sounds like a frat-boy targeted beer.
 

BurntPork

Banned
Xenon is a 3.2 GHz TriCore. Also, I believe at last leak, the Wii U cpu was tricore. It will not be weaker than the 360 if it's as easy to port as Vigil is claiming. In Fact, IGN's pre-e3 leaks claimed it would be clocked faster than Xenon.

First of all, lolIGN.

Second, Xenon is really old and inefficient now. Also, Power7 is OOE. This means that a Power7-based CPU can outperform the in-order Xenon even at a much lower clock. Combine that with the heat issue, and there's zero chance of this being clocked that high, as well as zero need.

That said, 1.5GHz may have been a bit too low. 2.0GHz should be fine.
 
Something's telling me we might be able to see, in the not-too-distant future, a new off-the-shelf Power7 version with a smaller die than the 6/8-core variants... Not unlike how one can get an off-the-shelf "gekko" 750CL nowadays.

So which came first? The chi.. Gekko or the 750CL?


For one, you can immediately scratch out the DFU (decimal floating-point unit).

Yeah I remember wsippel saying that back at the beginning of the thread. Oh and that was an interesting find about configurable threads. But if two threads per core is true, why drop to that?
 