
Wii U Speculation Thread The Third: Casting Dreams in The Castle of Miyamoto

EDarkness

Member
I know enough about game development to know that uprezzing in general is not a hard thing. At all. If the power is available, it's a matter of changing some constants, and then it should be running. The PC version doesn't need some magic stuff to pull this off.

We don't know anything about the Wii U. We don't know what the dev environment is or how easy it is to do this sort of thing with what they have set up. Maybe they upped it to 1080p, had some strange texture issues, and brought it back down. Who knows. Either way, we don't know how "easy" or "hard" it is. We do know it can have higher res textures even at 720p, but they're not doing that either.

You claim a lot of "we don't know this, we don't know that", but we don't know this either. That statement is based on nothing, really. I'm not saying it's not true, but the statements made by Vigil and Ideaman are in fact pointing in exactly the opposite direction. We shouldn't ignore that, especially considering they give us the most concrete performance 'measures' available.

You're right. There's a lot we don't know. However, I'm going with the fact that we know the Wii U has more RAM, which means they can at the very least up the textures, but they aren't doing that. If they're not doing that, then what makes any of us think that they'll do anything else? Considering they didn't do much with the PC version of the first game, it's not out of the realm of possibility that they'll skimp on the visuals. One of the developers of the game said specifically that the Wii U can do a lot more and can use more resources, but whether or not they do that is a totally different issue.
 

Bagu

Member
i've been pretty busy in the last few days, did anything worth knowing happen?

if the answer is "no", what did you guys talk about for 20 pages? XD

Vigil has been Tsundere
Toki Tori 2 is doing the most horrendous of teasing
Wavebirds need to return
Wavebirds!
 
But how could they sell a new Animal Crossing after implementing this?

:'D

Works both ways. Could be called "Mii Crossing", and when you play/install "Animal Crossing" they just become another neighborhood.

Ironically I think this idea could catch on with the HUGE # of people who play FB games night and day.
 

DCKing

Member
We don't know anything about the Wii U. We don't know what the dev environment is or how easy it is to do this sort of thing with what they have set up. Maybe they upped it to 1080p, had some strange texture issues, and brought it back down. Who knows. Either way, we don't know how "easy" or "hard" it is. We do know it can have higher res textures even at 720p, but they're not doing that either.

You're right. There's a lot we don't know. However, I'm going with the fact that we know the Wii U has more RAM, which means they can at the very least up the textures, but they aren't doing that. If they're not doing that, then what makes any of us think that they'll do anything else? Considering they didn't do much with the PC version of the first game, it's not out of the realm of possibility that they'll skimp on the visuals. One of the developers of the game said specifically that the Wii U can do a lot more and can use more resources, but whether or not they do that is a totally different issue.
I think you don't quite understand the technology involved here. Nintendo is not going to build a console in which it's going to be a hurdle to change the output picture resolution to 1080p from 720p. This is a matter of basic configuration. Also, the fact that the Wii U has more RAM by itself definitely does not mean they can up the texture resolution, as the GPU also needs to be able to cope with the added texture fillrate.

I don't think this is a done deal, mind. These statements are not something you can dismiss like that however.
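The fillrate point is easy to illustrate with back-of-envelope pixel arithmetic (nothing Wii U-specific, just resolution math):

```python
# Pixel counts per frame at each output resolution. Rendering at 1080p
# means shading and filling 2.25x the pixels of 720p every frame, so
# the GPU (not just the RAM) has to keep up.
def pixels(width, height):
    return width * height

ratio = pixels(1920, 1080) / pixels(1280, 720)
print(f"1080p is {ratio:.2f}x the pixel load of 720p")  # 2.25x
```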
 

wsippel

Banned
It seems Nintendo is trying to cover their tracks on the TLS database. A few entries simply vanished in recent days - or more precisely: they were updated. The devices are still in the database, but Nintendo isn't mentioned anymore. Paranoia at its best.
 

Bagu

Member
It seems Nintendo is trying to cover their tracks on the TLS database. A few entries simply vanished in recent days - or more precisely: they were updated. The devices are still in the database, but Nintendo isn't mentioned anymore. Paranoia at its best.

Should we be worried about this thread then?
 

darthdago

Member
ok, besides all the GPU talk.

I have found something that is from January, but I don't really remember it being talked about here.
It's about IBM Power 7 wafer production on a 28/32nm process at Fab 8 from GLOBALFOUNDRIES, 100 miles north of East Fishkill.
Mass production should start mid-year there.

If it's old news/information, then I'm sorry.

That's the link:
http://www-03.ibm.com/press/us/en/pressrelease/36465.wss

Quote:

IBM and GLOBALFOUNDRIES Begin First Production At New York’s Latest Semiconductor Fab

First products from Fab 8 developed and manufactured in New York’s ‘Tech Valley’


Saratoga County, N.Y. - 09 Jan 2012: GLOBALFOUNDRIES and IBM (NYSE: IBM) today announced an agreement to jointly manufacture advanced computer chips at the companies’ semiconductor fabs in New York’s “Tech Valley.” The chips are the first silicon produced at GLOBALFOUNDRIES’ newest and most advanced manufacturing facility, "Fab 8" in Saratoga County, and are planned to ramp to volume production in the second half of 2012. The new products recently began initial production at IBM’s 300mm fab in East Fishkill.

GF Fab 8 in Saratoga

Workers prep Global Foundries' newest semiconductor factory, "Fab 8" in Saratoga County, New York State. The fab comes on line for the first time with a maiden production run of microprocessors based on IBM's latest, 32nm, silicon-on-insulator chip technology. The chips will be used by manufacturers in networking, gaming and graphics.

The chips are based on IBM’s 32nm, Silicon-on-Insulator (SOI) technology, which was jointly developed with GLOBALFOUNDRIES and other members of IBM’s Process Development Alliance, with early research at the University at Albany’s College of Nanoscale Science and Engineering. The technology vastly improves microprocessor performance in multi-core designs and speeds the movement of graphics in gaming, networking, and other image intensive, multi-media applications. The SOI process was used to build the microprocessor that powered IBM Watson, the question-answering computer that won the Jeopardy! quiz show in early 2011.

“IBM has helped make New York State one of the world’s premier locations for semiconductor design and manufacturing,” said Michael Cadigan, general manager, IBM Microelectronics. “Recently, we announced that we would spend $3.6 billion researching and developing new silicon technology in New York. We bring the skills, investments and partnerships that keep New York at the forefront of advanced silicon development and manufacturing.”

“Today’s announcement is a natural extension of our longstanding partnership with IBM that includes production of 65nm and 45nm chips at our fabs in Singapore and Germany,” said GLOBALFOUNDRIES CEO Ajit Manocha. “With the addition of our newest factory in New York, we will now be jointly producing chips with IBM at four fabs on three continents.”

New York’s “homegrown” HKMG technology offers cost-savings, better performance

GLOBALFOUNDRIES’ new Fab 8 campus, located in the Luther Forest Technology Campus about 100 miles north of the IBM campus in East Fishkill, stands as one of the most technologically advanced wafer fabs in the world and the largest leading-edge semiconductor foundry in the United States. When fully ramped, the total clean-room space will be approximately 300,000 square feet and will be capable of a total output of approximately 60,000 wafers per month. Fab 8 will focus on leading-edge manufacturing at 32/28nm and below.

The companies’ 32/28nm technology uses the same “Gate First” approach to High-k Metal Gate (HKMG) that has reached volume production in GLOBALFOUNDRIES’ Fab 1 in Dresden, Germany. This approach to HKMG offers higher performance with a 10-20% cost saving over HKMG solutions offered by other foundries, while still providing the full entitlement of scaling from the 45/40nm node.

The new chips also will feature IBM’s eDRAM (embedded dynamic random access memory) technology, which dramatically improves on-processor memory performance in about one-third the space with one-fifth the standby power of conventional SRAM (static random access memory). IBM chips are at the heart of the company's server and storage systems, the world's fastest supercomputers and many of the best-known and widely used communications and consumer electronics brands.

ABOUT GLOBALFOUNDRIES

GLOBALFOUNDRIES is the world’s first full-service semiconductor foundry with a truly global manufacturing and technology footprint. Launched in March 2009 through a partnership between AMD [NYSE: AMD] and the Advanced Technology Investment Company (ATIC), GLOBALFOUNDRIES provides a unique combination of advanced technology, manufacturing excellence and global operations. With the integration of Chartered Semiconductor in January 2010, GLOBALFOUNDRIES significantly expanded its capacity and ability to provide best-in-class foundry services from mainstream to the leading edge. GLOBALFOUNDRIES is headquartered in Silicon Valley with manufacturing operations in Singapore, Germany, and Saratoga County, New York. These sites are supported by a global network of R&D, design enablement, and customer support in Singapore, China, Taiwan, Japan, the United States, Germany, and the United Kingdom.

For more information on GLOBALFOUNDRIES, visit http://www.globalfoundries.com.
 

tkscz

Member
IGN's article is nonsense. I know how they got that #. When I read that, I thought, "hmmm, I think I know what they're doing to get these bogus #'s".

http://en.wikipedia.org/wiki/Comparison_of_AMD_graphics_processing_units#Radeon_R400_series
http://en.wikipedia.org/wiki/FLOPS
We'll be using this AMD GPU wiki for reference for the #'s I'm about to dish out. Also linked is a wiki on what a FLOP is and what it means, if you need it.

(1 GFLOPS = 1 billion FLoating-point OPerations per Second, 1000 GFLOPS = 1 TFLOP)

Xbox 360 GPU = 240 GFLOPS
AMD HD 6670 (rumored to be in a dual setup for the Xbox 720) = 768 GFLOPS x 2 = 1536 GFLOPS
AMD HD 4870 (the rumored RV770 in Wii U) = 1200 GFLOPS

1536/240 = 6.4, which is where they get the whole "6x more powerful than 360" thing
1200/240 = 5, which is where they get the whole "5x more powerful than 360" thing
(1536-1200)/1536 = 0.21875, which is where they get the whole "Xbox 720 is 20% stronger than Wii U" thing
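The three divisions can be sanity-checked in a couple of lines; a quick sketch using the same rumored figures quoted above (none of them confirmed specs):

```python
# All numbers are the rumored/Wikipedia figures from the post above,
# not confirmed hardware specs.
xbox360 = 240          # GFLOPS, Xenos
xbox720 = 768 * 2      # GFLOPS, rumored dual HD 6670 setup
wii_u   = 1200         # GFLOPS, rumored HD 4870 basis

print(xbox720 / xbox360)            # 6.4     -> "6x the 360"
print(wii_u / xbox360)              # 5.0     -> "5x the 360"
print((xbox720 - wii_u) / xbox720)  # 0.21875 -> "~20% stronger"
```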


IT'S ALL BS PEOPLE. It's all nonsense speculation that anyone can do as I proved above.

Oh, and the Wii U will not be limited to a single thread. It's rumored to be a tri-core IBM CPU based on "Watson", aka the POWER7 CPU from IBM, which does out-of-order execution and does 4..YES FOUR simultaneous threads per core...so that tri-core would do 12 threads.

Math Conquers all!
 
Oh, and the Wii U will not be limited to a single thread. It's rumored to be a tri-core IBM CPU based on "Watson", aka the POWER7 CPU from IBM, which does out-of-order execution and does 4..YES FOUR simultaneous threads per core...so that tri-core would do 12 threads.

Wii U's CPU is targeted to have two threads per core. POWER7 is capable of 1-way, 2-way, and 4-way SMT. Good calcs on the other stuff though.
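As a quick sanity check on those thread counts, here is the arithmetic for a hypothetical tri-core POWER7-derived chip under each SMT mode POWER7 supports (none of this is confirmed hardware):

```python
# Hardware threads for a rumored tri-core, POWER7-derived CPU under
# each SMT mode POWER7 supports. Purely illustrative, not a spec.
cores = 3
for smt in (1, 2, 4):
    print(f"SMT{smt}: {cores * smt} hardware threads")
# SMT1: 3, SMT2: 6, SMT4: 12
```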

ok, besides all the GPU talk.

That's not necessarily about POWER7. And hypothetically not limited to Wii U's CPU. :)

I bailed as soon as I clicked submit so I won't be giving any immediate responses.
 

MDX

Member
Maybe someone was just guessing based on the fact that Nintendo dropped the 1T-SRAM. Then again, I wouldn't be all that surprised if they dropped it. I think Wii hardware compatibility would be a waste of transistors, anyway - the die space could be used to add more eDRAM or shader units instead. They should do HD remasters on eShop.


Problem is, I was looking forward to playing my Wii games on the controller.
I was even planning on buying more Wii games cheaply to do this. It would put a dent in my plans if they do not include it.
 
The never-ending search for Wii U related info often leads to interesting, even if unnecessary, observations.

Speaking of which, have you guys noticed that - in some instances - the image on the Upad is not matching the image on the TV screen precisely?

I'm not sure if it's something related to the specific TV set, specific game or even due to unfinished hardware, but take a look here:

(google image search - not sure if it's the same setup)





The U pad screen is clearly (highlighted) displaying "a more complete" image of the game.

Any ideas?
 

MDX

Member
ok, besides all the GPU talk.

I have found something that is from January, but I don't really remember it being talked about here.
It's about IBM Power 7 wafer production on a 28/32nm process at Fab 8 from GLOBALFOUNDRIES, 100 miles north of East Fishkill.
Mass production should start mid-year there.

This was brought up before, but many people swear that it's regarding Xbox 3... dev kits.
 

wsippel

Banned
Super useless information alert! This is a picture of one of the wireless adapters used in the modular V4 devkits:

XP8T7.jpg



This was brought up before, but many people swear that it's regarding Xbox 3... dev kits.
The chips in question have been in low-volume production since late last year/early this year. That's way too soon for a system not expected for another 18 months. The very first Xbox3 chips for devkits will probably be produced at the end of the year, maybe in early 2013.
 

guek

Banned
Super useless information alert! This is a picture of one of the wireless adapters used in the modular V4 devkits:

http://i.imgur.com/XP8T7.jpg

DAT CIRCUIT BOARD


Where are you diggin this stuff up? And how did you get a hold of that middleware document that talked about the cpu? Who are you??
 

EDarkness

Member
I think you don't quite understand the technology involved here. Nintendo is not going to build a console in which it's going to be a hurdle to change the output picture resolution to 1080p from 720p. This is a matter of basic configuration. Also, the fact that the Wii U has more RAM by itself definitely does not mean they can up the texture resolution, as the GPU also needs to be able to cope with the added texture fillrate.

I don't think this is a done deal, mind. These statements are not something you can dismiss like that however.

Oh, I know all about that sort of thing, since I do a bit of programming myself (working on a little project now, as a matter of fact). That said, I am open to the possibility that there may be real issues with getting this stuff to work, but when one of the guys says the system is powerful yet they aren't pushing it, I can't help but think there are two possibilities here: 1) they're simply paying us lip service and the system isn't all that powerful, or 2) the system is plenty powerful, but they don't want to invest the time needed to take advantage of that and instead want to focus on the controller as the real draw.

I'm willing to accept that it's actually a "weak" system, but having worked in the game biz and knowing people who work in the game biz, I'll just say that I wouldn't put it past them to treat this as simply not an important thing to do, regardless of how easy it should be. The resources are limited, so perhaps they're going to put those resources to work on controller functions instead of graphical fidelity. No one should take their comments (good or bad) as any definition of how powerful the system is.
 

Nibel

Member
The never-ending search for Wii U related info often leads to interesting, even if unnecessary, observations.

Speaking of which, have you guys noticed that - in some instances - the image on the Upad is not matching the image on the TV screen precisely?

I'm not sure if it's something related to the specific TV set, specific game or even due to unfinished hardware, but take a look here:

(google image search - not sure if it's the same setup)





The U pad screen is clearly (highlighted) displaying "a more complete" image of the game.

Any ideas?

Uhm.. two completely different screens = two different sets of colors?
 
R

Rösti

Unconfirmed Member
DAT CIRCUIT BOARD


Where are you diggin this stuff up? And how did you get a hold of that middleware document that talked about the cpu? Who are you??
I just purchased Skyrim today and have been busy with that and thus haven't been paying attention to this thread. Does the document concern Wii U or Orbis/Durango?

Nice information anyway, wsippel.
 

DrWong

Member
The never-ending search for Wii U related info often leads to interesting, even if unnecessary, observations.

Speaking of which, have you guys noticed that - in some instances - the image on the Upad is not matching the image on the TV screen precisely?

I'm not sure if it's something related to the specific TV set, specific game or even due to unfinished hardware, but take a look here:

(google image search - not sure if it's the same setup)





The U pad screen is clearly (highlighted) displaying "a more complete" image of the game.

Any ideas?

It's on par (sorry).
 
I couldn't reply earlier, so sorry for bringing it back up, but that TwoTribes Steam-esque tweet is really promising.

That said, if they're teasing bullshit, I'm going to cut them.
 

BD1

Banned
Works both ways. Could be called "Mii Crossing", and when you play/install "Animal Crossing" they just become another neighborhood.

Ironically I think this idea could catch on with the HUGE # of people who play FB games night and day.

I definitely think they could have a gold mine if they created some sort of Mii Plaza/PS Home/Animal Crossing/social games Frankenstein and had it preinstalled on the Wii U OS day one.

On a similar note, do you think Nintendo will rebrand the "Wii" series of games as the "Mii" series? Mii Sports, Mii Play, etc. The "Wii ____" series is one of the most successful in gaming history, but you can't carry the brand to different consoles. If they switched it to "Mii ____", they could keep continuity and brand recognition.
 

royalan

Member
Ugh, I've been avoiding this 3rd OT because I knew in my fragile heart that I didn't have the emotional fortitude to survive another one.


But...anything? Anything at all? 11,000 posts, 2 months (and some change) until E3, and still no peep?
 
What if they award points like Microsoft does with achievements, but give Mii adornments as graduated rewards... hats, tanooki suits, Mario moustaches, Waluigi legs. Say a reward every 500 points, and a big one every 2k?

I still maintain that points are a terrible idea. People will just buy shittier games that happen to give out points like candy in order to get more e-bling.

Rewards need to be linked to specific actions. You beat NMH3? It unlocks the beam katana in SSBU. You win the Thwomp Cup in "Mario Kart U Dash" in reversed controls mode? Captain Falcon now has the option of riding a hoverbike in "F Negative One". Complete 500 dives in "PilotWings: Endless Warfare"? Bam, your Mii gets to fly around the background of the Wii U Menu in a jetpack. Crazy shit like that. But not tied to points.
 

Nibel

Member
Ah, now I see it.

It's probably the TV resolution. If I connect my PC to my 52 inch screen, then the image gets cut a bit. :)
 

wsippel

Banned
DAT CIRCUIT BOARD

Where are you diggin this stuff up? And how did you get a hold of that middleware document that talked about the cpu? Who are you??
I'm just pretty experienced at finding stuff, that's all. The circuit board was easy to find, for example. The description on TLS Singapore mentions the wireless adapters (the one I posted is just one of several adapters Nintendo used) and the product code; that, plus the educated guess that Hon Hai Precision (Foxconn) would manufacture the adapters, led me to likely FCC IDs (MCLMICA2 for the part I posted), and lo and behold: all the parts are in the FCC database, with photos and all.
 
Yeah, I'm with you on this. Maybe I have too much respect for developers' skills and views.

The comments on the system are a confused mess. An HD 7770 would easily perform better than "touched up 360 game in 720p + Upad" like IdeaMan indicated, which either means the Wii U is not powerful or that developers are incapable or unwilling to do much with it right now.

Not necessarily...

When you consider that launch titles for new platforms rarely ever look great and are sometimes downright horrible (Gun, for instance), let alone previous generation ports, anything is possible. It could simply be that those complaining about "slightly better" are working on launch window ports (which will likely be bottom of the barrel more often than not), as opposed to someone who is maybe early on in development on a title that isn't expected to release till late 2013, which will actually begin to show true system capabilities. There are a number of variables, and that is why we get a "confused mess" worth of comments.
 

Azure J

Member
I still maintain that points are a terrible idea. People will just buy shittier games that happen to give out points like candy in order to get more e-bling.

Rewards need to be linked to specific actions. You beat NMH3? It unlocks the beam katana in SSBU. You win the Thwomp Cup in "Mario Kart U Dash" in reversed controls mode? Captain Falcon now has the option of riding a hoverbike in "F Negative One". Complete 500 dives in "PilotWings: Endless Warfare"? Bam, your Mii gets to fly around the background of the Wii U Menu in a jetpack. Crazy shit like that. But not tied to points.

This is seriously one of the coolest ideas for achievements I've seen yet. Thing is, it'd require a ton of micromanagement to make sure cross-game things like this held throughout all games on the system.
 

DCKing

Member
Not necessarily...

When you consider that launch titles for new platforms rarely ever look great and are sometimes downright horrible (Gun, for instance), let alone previous generation ports, anything is possible. It could simply be that those complaining about "slightly better" are working on launch window ports (which will likely be bottom of the barrel more often than not), as opposed to someone who is maybe early on in development on a title that isn't expected to release till late 2013, which will actually begin to show true system capabilities. There are a number of variables, and that is why we get a "confused mess" worth of comments.
No, I don't believe that. This stuff used to be the case back when every new generation involved new, never-seen-before hardware that did things developers never even dreamt of before. That era has gone away now that graphics chips are fully programmable beasts. The Wii U GPU will be customized to the max, but it will still be an evolution of what was in the Xbox 360 and what has been in PCs for the last six years. In many cases it will do things almost exactly the same as the Xbox 360 did (or at least as DX10+ Radeon cards do), just quicker and more of it. The devkit hardware may have had reliability issues, performance swings, Nintendo quirks and whatnot, but at least it would be somewhat familiar to any developer who has made 360 or PC games.

I'm not an expert in these architectures, so I hope blu can clarify whether this is the case or not.
 
This is seriously one of the coolest ideas for achievements I've seen yet. Thing is, it'd require a ton of micromanagement to make sure cross-game things like this held throughout all games on the system.

Always seemed pretty logical to me.

The "Do stuff in game" → "Get stuff in system menu" part of it is the easiest thing to do. I'm told that you get some of that on the Xbox 360 dashboard even now.

But, yeah, I'd really like to see a situation where each player has a dramatically different set of Rewards because they happen to have different game playing habits. Then I'd have a reason to go visit my friends' homes more often. :)
 

darthdago

Member

Azure J

Member
so, here is a test of the 7770 vs. the 7750.

as the 7770 is considered to be the basis around which the Wii U GPU could be built...

Cos it's in German, I have translated it via Google...

Link (original, German):
http://www.tomshardware.de/radeon-hd-7770-7750-benchmark-Cape-Verde,testberichte-240960.html

Link (translation):
http://translate.google.com/transla...k-Cape-Verde,testberichte-240960.html&act=url

So all in all, the 7750 will be the right basis (theoretically 819 GFLOPS, 55 watts)

The only reason why I still opt for speculation based on the 7770 over the 7750, as an inkling of what the final product might look like, is really the number of shader units. Dev kits starting with 640 SPU GPUs going to a modern 640 SPU part just seems like the right idea. There's also the fact that the card's performance hits that "over 1 TFLOP" mark AMD noted for it. Of course, as always, this is just speculation. I'm inclined to believe nothing has happened until we get confirmation, but some coincidences here and there do help to put together an idea of what to expect.
 
I'll be happy to see the day when all speculation ends and we have a techie thread that I hardly understand, but where all posts are based on facts about the retail hardware.

I believe when it's all said and done, by January 2014, we may all look back and see the gap was not as big a deal as we are making it out to be... by gap I mean PS4/Xbox3 > WiiU
 

z0m3le

Banned
-paraphrased.
Higher rez textures aren't a done deal.
Remember: E3 2011

“Right now we’re still finding out what kind of final tech specs the Wii U is going to have,” said Martel.

“But we like the system a lot; we think it’s going to be a really cool stop-gap in between this generation and the next generation. We think it’s really smart of Nintendo, and the fact that as a platform it’s a lot more capable for hardcore first-person shooter-style gaming – for us that’s fantastic.”

“We’ve got the [Aliens: Colonial Marines] engine running on the Wii U, and as far as the console goes, you’re going to see textures at a resolution that you haven’t seen on [the current] generation,” said Martel.


Darksiders 2 could have higher resolutions; that much is possible with the underclocked devkits they had last year. They still haven't received devkit V5, which is a bump in the direction we want them to go.

Martel's most interesting quote is the stop-gap comment; that is a pretty good indicator that the machine will be 2-4 times more powerful than PS360.

I know it's hip to have low expectations to save yourself from disappointment, but where is the fun in that? The hardware will have parts that are 1x current gen, and I am sure there will be a few parts that are 4 or 5 times what is in the 360; that doesn't seem outrageous to me.
 

nordique

Member
so, here is a test of the 7770 vs. the 7750.

as the 7770 is considered to be the basis around which the Wii U GPU could be built...

Cos it's in German, I have translated it via Google...

Link (original, German):
http://www.tomshardware.de/radeon-hd-7770-7750-benchmark-Cape-Verde,testberichte-240960.html

Link (translation):
http://translate.google.com/transla...k-Cape-Verde,testberichte-240960.html&act=url

So all in all, the 7750 will be the right basis (theoretically 819 GFLOPS, 55 watts)



This is fine and all, but we have to keep in mind that it was all speculation by one poster; not "considered to be the basis around which it could be built".

Thraktor had excellent posts, which were - thankfully, I might add - grounded in educated reasoning and well-thought-out speculation based on credible rumour information and his/her own deductions.

This does not mean, however, that everyone should go overboard and start assuming the 7770 (or any customized equivalent) is what the Wii U GPU revolves around now.

That may very well turn out to be the case, but for now, it is not at all a given. Every time that card is mentioned, I fear people are subconsciously setting expectations which may or may not be met.

So as long as everyone keeps this in mind, it is safe to talk about.
 

nordique

Member
The only reason why I still opt for speculation based on the 7770 over the 7750, as an inkling of what the final product might look like, is really the number of shader units. Dev kits starting with 640 SPU GPUs going to a modern 640 SPU part just seems like the right idea. There's also the fact that the card's performance hits that "over 1 TFLOP" mark AMD noted for it. Of course, as always, this is just speculation. I'm inclined to believe nothing has happened until we get confirmation, but some coincidences here and there do help to put together an idea of what to expect.

Agreed


But we don't know the theoretical FLOP levels; it may not end up being over 1 TFLOP, or it may end up surpassing those expectations.

That said, 640 SPUs is reasonable when considering all the information we can accurately suppose to be true, especially given what was revealed about the first dev kits' GPUs.
 
No, I don't believe that. This stuff used to be the case back when every new generation involved new, never-seen-before hardware that did things developers never even dreamt of before. That era has gone away now that graphics chips are fully programmable beasts. The Wii U GPU will be customized to the max, but it will still be an evolution of what was in the Xbox 360 and what has been in PCs for the last six years. In many cases it will do things almost exactly the same as the Xbox 360 did (or at least as DX10+ Radeon cards do), just quicker and more of it. The devkit hardware may have had reliability issues, performance swings, Nintendo quirks and whatnot, but at least it would be somewhat familiar to any developer who has made 360 or PC games.

I'm not an expert in these architectures, so I hope blu can clarify whether this is the case or not.

Agree to disagree, I guess. Correct me if I'm wrong, but fully programmable GPUs have been in consoles for the past 3 generations, and yet we were still plagued by sloppy launches. What makes the Wii U any different?
 

11redder

Member
I'm not talking about colors.

untitled7rklh.jpg


Could be the TV settings though.

I don't know if these were taken from E3 2011, but Nintendo used Panasonic sets there, which generally tend to default to showing 95% of the image. It's probably just a case of Nintendo not bothering to change the settings in the aspect adjustment menu.
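If a set really does default to showing about 95% of the picture, the overscan crop on a 720p feed is easy to estimate; a rough sketch (the 95% figure is just the assumption from the post above):

```python
# Pixels cropped by overscan on a 1280x720 feed if the TV shows only
# ~95% of the image in each dimension (purely illustrative).
width, height = 1280, 720
shown = 0.95
lost_w = round(width * (1 - shown))   # pixels lost horizontally
lost_h = round(height * (1 - shown))  # pixels lost vertically
print(f"~{lost_w}px lost horizontally, ~{lost_h}px vertically")
```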
 

wsippel

Banned
so, here is a test of the 7770 vs. the 7750.

as the 7770 is considered to be the basis around which the Wii U GPU could be built...

Cos it's in German, I have translated it via Google...

Link (original, German):
http://www.tomshardware.de/radeon-hd-7770-7750-benchmark-Cape-Verde,testberichte-240960.html

Link (translation):
http://translate.google.com/transla...k-Cape-Verde,testberichte-240960.html&act=url

So all in all, the 7750 will be the right basis (theoretically 819 GFLOPS, 55 watts)
The Wii U GPU won't be built around or based on Southern Islands. From everything we know, it's based on R700 but customized beyond recognition. It's unlike any off-the-shelf AMD GPU.

I talked to bgassassin a few days ago, and I believe we concluded that the chip is probably pretty slow on paper, maybe 300-400 GFLOPS or something, but extended with a couple of shortcuts to accelerate certain common, taxing operations.
 