
Wii U Speculation Thread The Third: Casting Dreams in The Castle of Miyamoto

IdeaMan

My source is my ass!
I know enough about game development to know that uprezzing in general is not a hard thing. At all. If the power is available, it's a matter of changing some constants and then it should be running. The PC version doesn't need any magic to pull this off.
You claim a lot of "we don't know this, we don't know that", but we don't know this either. This statement is based on nothing really. I'm not saying it's not true, but the statements made by Vigil and Ideaman are in fact pointing in exactly the opposite direction. We shouldn't ignore that, especially considering they give us the most concrete performance 'measures' available.

Just to correct something, my information doesn't point in the same direction as the Vigil director's statement at all. I spent a lot of time trying to explain how this declaration could be viewed and how it fits the overall image that we have of the Wii U's power, but I think I made myself perfectly clear by repeating for the 100th time that no, the Big N's new system is not "on par".
 

StevieP

Banned
The chips in question have been in low-volume production since late last year/early this year. That's way too soon for a system not expected for another 18 months. The very first Xbox3 chips for devkits will probably be produced at the end of the year, maybe in early 2013.

We do know that the Wii U CPU is being made at East Fishkill; however, IBM's press release last year stated that it would be a 45nm SOI part.
 

Azure J

Member
The Wii U GPU won't be built around or based on Southern Islands. From everything we know, it's based on R700 but customized beyond recognition. It's unlike any off-the-shelf AMD GPU.

I talked to bgassassin a few days ago, and I believe we concluded that the chip is probably pretty slow on paper, maybe 300, 400GFLOPS or something, but extended with a couple of shortcuts to accelerate certain common, taxing operations.

Hmm, interesting. I know about the customization that would go into something like this, but what you're suggesting is a weird little thing. It'd be weird, yet efficient enough to be a Nintendo endeavor. At the same time, though, something like this going up against a thoroughly power-obsessed Microsoft (as per ShockingAlberto) and a middle-of-the-road Sony (as per Brain_stew on Sony using a complete AMD solution with an SI GPU base) over a 5-6 year lifespan? I can only hope for better, to be quite honest. :p
 

luffeN

Member
Ah, now I see it.

It's probably the TV resolution. If I connect my PC to my 52-inch screen, the image gets cut off a bit. :)
That should not happen. Are you connecting it via HDMI? If so, go to your graphics settings and search for Overscan; you can adjust a slider there so that the picture fits perfectly.
 

z0m3le

Banned
The Wii U GPU won't be built around a Southern Islands GPU. From everything we know, it's based on R700 but customized beyond recognition. It's unlike any off-the-shelf AMD GPU.

I talked to bgassassin a few days ago, and I believe we concluded that the chip is probably pretty slow on paper, maybe 300, 400GFLOPS or something, but extended with a couple of shortcuts to accelerate certain common, taxing operations.

That is a pretty low estimate; glad it's just guesswork. What is the point in having a custom part if it isn't built for efficiency? They could grab a mobile GPU at that point and blow those numbers right up.

Seriously, GAF, as knowledgeable as you both are when it comes to hardware, your pessimism actually bewilders me. Considering the tablet controller, that number is as outrageous as 2TFLOPS.

It is 600GFLOPS at bare minimum, just based on the assumption that they built this thing knowing they were releasing it in 2012, and that they could use an off-the-shelf 40nm chip that hits those numbers for $20.

Just to correct something, my information doesn't point in the same direction as the Vigil director's statement at all. I spent a lot of time trying to explain how this declaration could be viewed and how it fits the overall image that we have of the Wii U's power, but I think I made myself perfectly clear by repeating for the 100th time that no, the Big N's new system is not "on par".

IdeaMan, I think this thread is going through a new low right now; this is about as bad as before you first posted here. I mean, with statements like the one I just quoted, it seems like they just lack belief.
 

IdeaMan

My source is my ass!
It seems Nintendo is trying to cover their tracks in the TLS database. A few entries simply vanished in recent days - or more precisely, they were updated. The devices are still in the database, but Nintendo isn't mentioned anymore. Paranoia at its best.

Maybe a sign that the v5 dev kit story that I mentioned earlier is true :p (I think we even talked about the possibility that Nintendo would use other channels to distribute dev kits, or hide their shipments, etc.)
Well, at least, that's what Nintendo told big non-Japanese studios a few weeks ago.
 

wsippel

Banned
We do know that the Wii U CPU is being made at East Fishkill; however, IBM's press release last year stated that it would be a 45nm SOI part.
Right. But what about the GPU? The IBM press release specifically mentions "graphics in gaming" (=> GPU) and confirms that the chips they produce use eDRAM. PC GPUs don't use 32nm SOI or eDRAM. Also, Fab 8 is a joint IBM/ Globalfoundries project, and Globalfoundries manufactures for AMD.
 
The Wii U GPU won't be built around or based on Southern Islands. From everything we know, it's a based on R700 but customized beyond recognition. It's unlike any off-the-shelf AMD GPU.

I talked to bgassassin a few days ago, and I believe we concluded that the chip is probably pretty slow on paper, maybe 300, 400GFLOPS or something, but extended with a couple of shortcuts to accelerate certain common, taxing operations.

Isn't this drastically lower than your previous expectations? I thought we were assuming 640 SPUs based on the dev kits.
 

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
Just to correct something, my information doesn't point in the same direction as the Vigil director's statement at all. I spent a lot of time trying to explain how this declaration could be viewed and how it fits the overall image that we have of the Wii U's power, but I think I made myself perfectly clear by repeating for the 100th time that no, the Big N's new system is not "on par".

It's on par until I see something pretty concrete other than these rumours, hints and speculation.
 
Has anyone seriously questioned why Nintendo needs to be this secretive?

They are never, ever going to be the cutting-edge top dog spending as much as the others, and no matter what, after launch they can still be copied by the other guys. So why make it this hard to get info about the upcoming product?
 
Has anyone seriously questioned why Nintendo needs to be this secretive?

They are never, ever going to be the cutting-edge top dog spending as much as the others, and no matter what, after launch they can still be copied by the other guys. So why make it this hard to get info about the upcoming product?
We discussed it many times. The conclusion seemed to be that they don't want to take the focus off the 3DS's rebound to the top, and also probably because they want to get everything in order so they don't fuck up 2 E3s in a row.

People need to remember that their upcoming E3 is probably their most important one for the next few years. Better to make sure everything is in order than not.

As to why certain 3rd parties can't talk about their shit, that much I can't tell.
 
Seriously, I wish the entire Wii U block diagram and specs would leak, like what happened around mid-2004 when the Xenon (Xbox 360) diagram and specs leaked, more than a year before the 360 was revealed on MTV in 2005.
 

Azure J

Member
Has anyone seriously questioned why Nintendo needs to be this secretive?

They are never, ever going to be the cutting-edge top dog spending as much as the others, and no matter what, after launch they can still be copied by the other guys. So why make it this hard to get info about the upcoming product?

I'm guessing it's precisely because of the effort MS & Sony put into advertising that their systems are "bigger, better & more badass" that Nintendo opts to say little about what's going on. They don't want to preemptively dampen any hype for their own offerings even if we know some expectations on the hardware side should be dampened.

640 shader units is what I was told, but as I also mentioned back then, the GPU was supposedly running at a very low clock speed.

Oh yeah, I can't believe I forgot that detail. 500MHz 4830s because full blast was causing weird lockups, correct?

[z0m3le's post below makes me wonder even more why the 4770 wasn't chosen over the 4830. :p]

So Kid Icarus development started on PC & Wii as Sora didn't have 3DS dev kits...

So they have better-quality and maybe HD assets for the game...

I told you! Kid Icarus for Wii U is more and more probable!

Would love to see some of the assets on both. (Also Kid Icarus with pointer controls? YES PLEASE)

The franchise is exploding; I don't doubt a big entry happening on Wii U in the future, but nothing's going to happen for a while on that front.
 

Rösti

Unconfirmed Member
Seriously, I wish the entire Wii U block diagram and specs would leak, like what happened around mid-2004 when the Xenon (Xbox 360) diagram and specs leaked, more than a year before the 360 was revealed on MTV in 2005.
Yeah, I feel the same. I hope that Nintendo at least issues a specification sheet to the public, so people don't have to resort to excessive reverse engineering to find out what the Wii U incorporates. Of course, various tech sites are going to open up the box and perhaps run some tests on the hardware if possible, so clock frequencies and such may be gathered from there, but an official white paper would be very convenient.
 

StevieP

Banned
Right. But what about the GPU? The IBM press release specifically mentions "graphics in gaming" (=> GPU) and confirms that the chips they produce use eDRAM. PC GPUs don't use 32nm SOI or eDRAM. Also, Fab 8 is a joint IBM/ Globalfoundries project, and Globalfoundries manufactures for AMD.

So you're inferring that the 32nm part everyone (including Charlie) speculates is being produced at East Fishkill with GF is actually the Wii U's GPU?
 

Azure J

Member
So you're inferring that the 32nm part everyone (including Charlie) speculates is being produced at East Fishkill with GF is actually the Wii U's GPU?

Link?

*Is actually pretty hungry for some new detail or old overlooked detail that can put together an idea of what to expect here*
 

z0m3le

Banned
640 shader units is what I was told, but as I also mentioned back then, the GPU was supposedly running at a very low clock speed.

And why would they keep the clock speed so low? I'm sure that was before it was a Nintendo chip, when it was simply an AMD/ATI GPU: the HD4830 (640 SPUs hitting 736GFLOPS @ stock 575MHz) and the HD4770 (another 640 SPUs hitting 960GFLOPS @ stock 750MHz).

Even bringing those clocks down to 300MHz shouldn't drop it to the 300GFLOPS in your estimates.

http://en.wikipedia.org/wiki/Compar...essing_units#Radeon_R700_.28HD_4xxx.29_series
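For anyone who wants to sanity-check those figures, here's a minimal sketch using the usual R700-class throughput formula (shaders × clock × 2 FLOPs per shader per clock); the Wii U figures themselves are of course still pure speculation.

```python
# Quick sanity check of the R700-class GFLOPS figures quoted above.
# Throughput = shader count * clock (GHz) * 2 (one multiply-add per shader per clock).

def gflops(shaders: int, clock_mhz: float) -> float:
    return shaders * (clock_mhz / 1000.0) * 2

print(gflops(640, 575))  # HD4830 at stock clocks -> 736.0 GFLOPS
print(gflops(640, 750))  # HD4770 at stock clocks -> 960.0 GFLOPS
print(gflops(640, 300))  # hypothetical 300MHz downclock -> 384.0 GFLOPS, still above 300
```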
 
Rösti said:
Yeah, I feel the same. I hope that Nintendo at least issues a specification sheet to the public, so people don't have to resort to excessive reverse engineering to find out what the Wii U incorporates. Of course, various tech sites are going to open up the box and perhaps run some tests on the hardware if possible, so clock frequencies and such may be gathered from there, but an official white paper would be very convenient.

Exactly what I was thinking.
 

wsippel

Banned
So you're inferring that the 32nm part everyone (including Charlie) speculates is being produced at East Fishkill with GF is actually the Wii U's GPU?
I consider that a possibility, yes. Cu45-HP CPU manufactured at the East Fishkill fab, Cu32 GPU manufactured at the new Fab 8 (initial production for devkits in East Fishkill).



And why would they keep the clock speed so low? I'm sure that was before it was a Nintendo chip, when it was simply an AMD/ATI GPU: the HD4830 (640 SPUs hitting 736GFLOPS @ stock 575MHz) and the HD4770 (another 640 SPUs hitting 960GFLOPS @ stock 750MHz).

Even bringing those clocks down to 300MHz shouldn't drop it to the 300GFLOPS in your estimates.

http://en.wikipedia.org/wiki/Compar...essing_units#Radeon_R700_.28HD_4xxx.29_series
I believe it was 400 or 450MHz (it's been a while), and it was still overheating when actually pushed. And yes, that was the AMD GPU. The Nintendo GPU wasn't done until many months later.
 

VAPitts

Member
Has anyone seriously questioned why Nintendo needs to be this secretive?

They are never, ever going to be the cutting-edge top dog spending as much as the others, and no matter what, after launch they can still be copied by the other guys. So why make it this hard to get info about the upcoming product?

May I ask where you have been for the last 10 years, 2 gens, and 2 consoles?
 

z0m3le

Banned
Oh yeah, I can't believe I forgot that detail. 600MHz 4830s because full blast was causing weird lockups, correct?

The 4830 ran at 575MHz stock (95W TDP), so that's actually a 25MHz overclock... I'm sure an underclocked 4830 in the devkits (if they were 4830s) would be something like ~450MHz, which would be more like 600+ GFLOPS.
 

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
Right. I'm off to play some snooker and sink a few pints. I'll check this thread later.
 

Nibel

Member
That should not happen. Are you connecting it via HDMI? If so, go to your graphics settings and search for Overscan; you can adjust a slider there so that the picture fits perfectly.

Thanks, will try this out :)

Yeah, I use an Nvidia GTX 570 and an HDMI cable.
 

Azure J

Member
The 4830 ran at 575MHz stock (95W TDP), so that's actually a 25MHz overclock... I'm sure an underclocked 4830 in the devkits (if they were 4830s) would be something like ~450MHz, which would be more like 600+ GFLOPS.

Yeah, I made a mistake and fixed that. :p
 

AzaK

Member
Who's making a Wii U-only game with the budget to make something "jaw-dropping"? Maybe Nintendo? And even then, would they talk about it? Most of the games we know about for the Wii U are ports. I seriously doubt that any of those games are using the Wii U as the lead platform. So we won't be getting much of that until at least the next batch of games comes out.

Developer comments are hard to gauge because no one is really supposed to be talking, anything that does get said is usually stupidly vague, and then you are left relying on the integrity of the site, mag, etc. reporting on the incredibly vague stuff.

Even if the Wii U ends up with nothing more than a modernized 4850-type GPU, it would offer considerable advantages over Xenos etc. that studios would have to want to take advantage of. THQ is not a publisher that's sitting on a boatload of cash, tossing it in every direction for improvements, for instance.

To be fair, not many third-party titles aimed at launch will use the full power of the console. Just look at the 360 launch. Not saying it's definitely going to be jaw-dropping, but we haven't heard much from developers working on non-multiplat games.
Sure, but I would have expected higher praise from Epic and whatnot if it was leaps and bounds ahead of P60. Even something cryptic like "No stop-gaps here" as opposed to more detailed specs and information.
EDIT: To clarify

The Wii U GPU won't be built around or based on Southern Islands. From everything we know, it's based on R700 but customized beyond recognition. It's unlike any off-the-shelf AMD GPU.

I talked to bgassassin a few days ago, and I believe we concluded that the chip is probably pretty slow on paper, maybe 300, 400GFLOPS or something, but extended with a couple of shortcuts to accelerate certain common, taxing operations.
Why do you think 300-400 GFLOPS? How is it going to get its performance, and why go that slow? Isn't a chip faster than that less than the cost of a bag of chips?
 

Anth0ny

Member
An HD version of Kid Icarus with Wiimote controls would be fucking incredible. I sincerely hope they release console ports of some of the more beautiful games on 3DS (KI, OOT, Star Fox). Poor little buggers are confined to that tiny screen =(
 

IdeaMan

My source is my ass!
It's on par until I see something pretty concrete other than these rumours, hints and speculation.

Ok, let's do this.

Ignore everything written in these speculation threads over the last 10 months.

Now check a video of the Japanese garden demo on Wii U from E3 2011, with, AT THE VERY LEAST, Xbox 360+ content on the main screen + 480p content on the padlet.

You know as a FACT that this demo was quickly built, running on the first iterations of the dev kit (we are 4 dev kits further along at the moment), on the first SDKs, etc etc etc.

Check a video of Ghost Recon Online with the UAV drone. You have, at the very least, a current-gen HD game + different content on the padlet. Again, that was one year ago, on the first dev kits, etc etc etc.

So you have, at the very least:

An Xbox 360 on the main screen
A 480p Xbox 360 on the second screen. Let's say 480p + calculating a second view/scene/content from the same hardware is demanding, so it's roughly 0.5x Xbox 360

1x Xbox 360 + 0.5x Xbox 360 = 1.5x Xbox 360

This was the situation one year ago, with all the elements derived from the very important context that I've described I don't know how many times. Add to that, in the worst-case scenarios, just tweaking here and there, small boosts, optimizations, advancements in title development, graphical polishing for these games, devs becoming more and more accustomed to the asymmetrical setup (again, a small improvement here), etc etc, all that over one year, and you have, let's say, roughly another 0.5x Xbox 360 (through slightly better framerates, better/greater AA/texture filtering, and some things that the current-gen GPUs can't handle).

So 1x Xbox 360 + 0.5x Xbox 360 + 0.5x Xbox 360 = 2x Xbox 360, at the very least, considering the two screens. The Wii U isn't "on par" technologically, period, and this demonstration has always assumed the worst that could have happened.
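Spelled out as a toy calculation (the 1x/0.5x/0.5x factors are rough worst-case guesses from the reasoning above, not measurements):

```python
# IdeaMan's worst-case estimate restated; every factor is a rough guess, not a measured value.
main_screen = 1.0   # at least Xbox 360-level content on the TV
pad_screen  = 0.5   # a second 480p scene rendered by the same hardware
later_gains = 0.5   # a year of dev kit revisions, SDK updates and optimization
print(f"{main_screen + pad_screen + later_gains:.1f}x Xbox 360, at the very least")  # 2.0x
```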
 

z0m3le

Banned
Yeah, I made a mistake and fixed that. :p

The only real reasons to use the 4830 in the devkits were that they were disabled RV770 chips (so they were probably cheap to come by) and the 256-bit memory bus. Other than those two things, I would say that the 4770 has it beat in every aspect, though one last thing to consider is that they plan on running the GPU at ~500-600MHz. Remember the devkits don't really matter much, only that the final hardware should end up more powerful.

A custom GPU with the performance of an HD4830 or better and a feature set closer to the HD7770: that is as close as we can guess for the GPU right now. Somewhere around 600GFLOPS or more is the only thing that makes sense, unless the HD4830s in the original dev kits were clocked under 300MHz (which seems pointless imo).

So 1x Xbox 360 + 0.5x Xbox 360 + 0.5x Xbox 360 = 2x Xbox 360, at the very least, considering the two screens. The Wii U isn't "on par" technologically, period, and this demonstration has always assumed the worst that could have happened.
Thanks IdeaMan for the completely rational logic used to get to that number. This thread has its ups and downs, but 300-400GFLOPS is simply ridiculous in this day and age. They aren't building up from the Wii, and Nintendo has no reason to limit themselves to 2006 consoles, so why would they spend money and resources to simply match the competition?
 

DCKing

Member
the Big N's new system is not "on par".
And this is not what I'm saying either. There are a lot of conflicting stories, with some early reports saying that it is a stop-gap until next gen. Recently, however, we've heard about the Wii U actually running stuff from you and Vigil, and that was seriously unimpressive. That doesn't indicate that it's "on par" or anything, but it also doesn't indicate that the earlier hardware that was rumoured is actually available to developers. The Wii U GPU should've been capable of more than running a 360 game at 720p + 480p on the Upad. There are a lot of explanations that could fit, but your and Vigil's descriptions are simply unimpressive, and that's worrisome.

Whatever the case, of course the Wii U will be more powerful than the Xbox 360; nobody is saying otherwise. The question is how much. The argument you posted up there is really flawed, however, and even 2x the Xbox 360 (let's assume GPU performance figures here) is really not super impressive either.

@wsippel: Why the R700 again? They apparently designed the GPU in 2009 at the earliest, which is when AMD already had Evergreen on the market and at least Northern Islands (and almost certainly Southern Islands) designs in the works.
 

wsippel

Banned
Why do you think 300-400 GFLOPS? How is it going to get its performance, and why go that slow? Isn't a chip faster than that less than the cost of a bag of chips?
That's where the hypothetical "shortcut" (fixed function) part would come into play. It's probably kinda slow when running standard shaders, but could achieve visible results typically requiring a lot more raw horsepower if developers use Nintendo's proprietary extensions.
 
That's where the hypothetical "shortcut" (fixed function) part would come into play. It's probably kinda slow when running standard shaders, but could achieve visible results typically requiring a lot more raw horsepower if developers use Nintendo's proprietary extensions.
So... TEV 2.0? Seems a bit odd and I can't see third parties pushing for it.
 

z0m3le

Banned
That's where the hypothetical "shortcut" (fixed function) part would come into play. It's probably kinda slow when running standard shaders, but could achieve visible results typically requiring a lot more raw horsepower if developers use Nintendo's proprietary extensions.

@wsippel: At what clock speed does the HD4830 hit 300GFLOPS? Hell, even 400GFLOPS? Considering that SPUs have increased in efficiency, why would Nintendo's final hardware hit anywhere near those numbers with a box bigger than the Wii?
 
Ok, let's do this.

Ignore everything written in these speculation threads over the last 10 months.

Now check a video of the Japanese garden demo on Wii U from E3 2011, with, AT THE VERY LEAST, Xbox 360+ content on the main screen + 480p content on the padlet.

You know as a FACT that this demo was quickly built, running on the first iterations of the dev kit (we are 4 dev kits further along at the moment), on the first SDKs, etc etc etc.

Check a video of Ghost Recon Online with the UAV drone. You have, at the very least, a current-gen HD game + different content on the padlet. Again, that was one year ago, on the first dev kits, etc etc etc.

So you have, at the very least:

An Xbox 360 on the main screen
A 480p Xbox 360 on the second screen. Let's say 480p + calculating a second view/scene/content from the same hardware is demanding, so it's roughly 0.5x Xbox 360

1x Xbox 360 + 0.5x Xbox 360 = 1.5x Xbox 360

This was the situation one year ago, with all the elements derived from the very important context that I've described I don't know how many times. Add to that, in the worst-case scenarios, just tweaking here and there, small boosts, optimizations, advancements in title development, graphical polishing for these games, devs becoming more and more accustomed to the asymmetrical setup (again, a small improvement here), etc etc, all that over one year, and you have, let's say, roughly another 0.5x Xbox 360 (through slightly better framerates, better/greater AA/texture filtering, and some things that the current-gen GPUs can't handle).

So 1x Xbox 360 + 0.5x Xbox 360 + 0.5x Xbox 360 = 2x Xbox 360, at the very least, considering the two screens. The Wii U isn't "on par" technologically, period, and this demonstration has always assumed the worst that could have happened.

This is what I've been trying to say for the past 3000 years! But thanks for collating it all; hopefully this logic gets through to some people, because it makes perfect sense.

What we saw a year ago, with the very first dev kits at E3, was beyond the 360 anyway - both on screen and on the UPAD, if you get me.

This power + further tweaks + the large increase reported in the v4 dev kits, and then a little more in the v5 = a console that is not on par. It's better.
 

IdeaMan

My source is my ass!
And this is not what I'm saying either. There are a lot of conflicting stories, with some early reports saying that it is a stop-gap until next gen. Recently, however, we've heard about the Wii U actually running stuff from you and Vigil, and that was seriously unimpressive. That doesn't indicate that it's "on par" or anything, but it also doesn't indicate that the earlier hardware that was rumoured is actually available to developers. The Wii U GPU should've been capable of more than running a 360 game at 720p + 480p on the Upad. There are a lot of explanations that could fit, but your and Vigil's descriptions are simply unimpressive, and that's worrisome.

This was for 3rd-party ports, with tons of other parameters to consider. This can't be used to estimate the Wii U's power precisely, or to rule out that it will be, rather than a stop-gap, a noticeable leap over current-gen systems in the right situation.
 

StevieP

Banned
I consider that a possibility, yes. Cu45-HP CPU manufactured at the East Fishkill fab, Cu32 GPU manufactured at the new Fab 8 (initial production for devkits in East Fishkill).

[image]


Hmm...

That's where the hypothetical "shortcut" (fixed function) part would come into play. It's probably kinda slow when running standard shaders, but could achieve visible results typically requiring a lot more raw horsepower if developers use Nintendo's proprietary extensions.

Judging by the GC and the Wii, this isn't going to happen lol
 

wsippel

Banned
So... TEV 2.0? Seems a bit odd and I can't see third parties pushing for it.
Depends, really. It might be possible to do many optimizations at the compiler level for high-level shader languages, but to achieve the best possible performance, one would certainly need hand-crafted shaders designed specifically for the system. Anyway, even with 300GFLOPS of raw performance and no compiler-level optimizations, it would still outperform the 360 running the exact same shaders.
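As a purely illustrative sketch of that idea (all numbers hypothetical, nothing here reflects actual Wii U hardware): if some fraction of the typical shader workload can be offloaded to dedicated units, the effective throughput looks much higher than the raw GFLOPS figure suggests.

```python
# Toy Amdahl-style model of fixed-function "shortcuts"; every value here is hypothetical.
def effective_gflops(raw_gflops: float, offloaded_fraction: float, speedup: float) -> float:
    """Work that hits a dedicated unit costs 1/speedup of its normal shader FLOPs."""
    return raw_gflops / ((1 - offloaded_fraction) + offloaded_fraction / speedup)

# A 350 GFLOPS part where 40% of a typical workload runs 8x faster on dedicated hardware
# behaves roughly like a ~540 GFLOPS part running generic shaders.
print(round(effective_gflops(350, 0.4, 8.0)))  # -> 538
```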
 

IdeaMan

My source is my ass!
This is what I've been trying to say for the past 3000 years! But thanks for collating it all; hopefully this logic gets through to some people, because it makes perfect sense.

What we saw a year ago, with the very first dev kits at E3, was beyond the 360 anyway - both on screen and on the UPAD, if you get me.

This power + further tweaks + the large increase reported in the v4 dev kits, and then a little more in the v5 = a console that is not on par. It's better.

Well, you can shift some 0.Xx between the measure of the 480p content rendered and the "bonus" from more modern technology, but the idea is there. And it's really a worst case.
 

McHuj

Member
@wsippel: At what clock speed does the HD4830 hit 300GFLOPS? Hell, even 400GFLOPS? Considering that SPUs have increased in efficiency, why would Nintendo's final hardware hit anywhere near those numbers with a box bigger than the Wii?

The formula for FLOPS is Shader Clock * Number of Shaders * 2 (one multiply-add per shader per clock).

4830 = 640 shaders * 0.575 GHz * 2 = 736 GFLOPS

For 300 GFLOPS, the clock needs to be about 234 MHz.
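The same formula can be inverted to get the clock needed for any target figure on a 640-shader part (just a restatement of the arithmetic above):

```python
# Invert the formula: clock (MHz) needed to reach a target GFLOPS figure with 640 shaders.
def clock_for_target(target_gflops: float, shaders: int = 640) -> float:
    return target_gflops * 1000.0 / (shaders * 2)

print(clock_for_target(300))  # ~234 MHz
print(clock_for_target(400))  # ~313 MHz
print(clock_for_target(600))  # ~469 MHz
```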
 
Depends, really. It might be possible to do many optimizations at the compiler level for high-level shader languages, but to achieve the best possible performance, one would certainly need hand-crafted shaders designed specifically for the system. Anyway, even with 300GFLOPS of raw performance, it would still outperform the 360 running the exact same shaders.
The problem is that Nintendo is obviously bending over backwards to please developers, and added headaches like that would be a huge red flag.
I just can't really see it being a reality.
 

DCKing

Member
This was for 3rd-party ports, with tons of other parameters to consider. This can't be used to estimate the Wii U's power precisely, or to rule out that it will be, rather than a stop-gap, a noticeable leap over current-gen systems in the right situation.
Well, then we should stop speculating about any news we get... If we disregard the Wii U not performing well in real gaming scenarios*, all we have left is some vague statements saying "oh, it's not so powerful" or "oh, it's quite powerful", and some hardware parameters from some time ago.

I'm just trying to piece the rumours together, and I'm not going to disregard these recent statements because of 'vagueness'. They're the most concrete rumours we have!

* The E3 demos you posted really did not show off the power you claim they did. There's absolutely no way you can even quantify "how many times an Xbox 360" those demos were, but even if you could, mirroring the graphics from the main screen on the Upad or drawing a picture on the Upad definitely doesn't qualify as "0.5x a 360".
 
Well, you can shift some 0.Xx between the measure of the 480p content rendered and the "bonus" from more modern technology, but the idea is there. And it's really a worst case.

Exactly. This is looking at old hardware. Plus, it varies depending on whether or not developers choose to run a second game camera on the screen.
 

z0m3le

Banned
The formula for FLOPS is Shader Clock * Number of Shaders * 2 (one multiply-add per shader per clock).

4830 = 640 shaders * 0.575 GHz * 2 = 736 GFLOPS

For 300 GFLOPS, the clock needs to be about 234 MHz.

LOL thanks. Yeah, if that was what the dev kits ran at, I can't see how they ever overheated.
 

Pocks

Member
Depends, really. It might be possible to do many optimizations at the compiler level for high-level shader languages, but to achieve the best possible performance, one would certainly need hand-crafted shaders designed specifically for the system. Anyway, even with 300GFLOPS of raw performance and no compiler-level optimizations, it would still outperform the 360 running the exact same shaders.


That seems to fly in the face of developers' comments on how easy it is to port to the system.

But at the same time, perhaps it explains why Vigil has no plans to add any extra sparkle to the Wii U version.
 

wsippel

Banned
The problem is that Nintendo is obviously bending over backwards to please developers, and added headaches like that would be a huge red flag.
I just can't really see it being a reality.
I'd say these days, as long as Epic and maybe Crytek are cool with Nintendo's custom extensions, they'd be fine. It's even possible companies like Epic had input there as well - instead of designing an engine around a GPU, they'd design a GPU around an engine. The audio algorithms embedded in the GameCube GPU were in part developed by Factor 5, for example, and they also did the MusyX audio middleware for the system.
 