
Wii U Speculation Thread 2: Can't take anymore of this!!!

Status
Not open for further replies.
I'm going to take a stab in the dark and say that the RAM amount is the biggest thing that will see the 4x/5x multiplier.

CPU: IBM POWER7 (Derivative) Tri-Core/2-Way SMT OoO Processor @ 3.0GHz
RAM: 2GB GDDR3/5 Unified + 32MB eDRAM Secondary Pool
GPU: Efficient 640 SPU part that performs closer to an AMD Radeon HD 7770 than a 4830 (on a scale of 4830 - 5770/6770 - 7750 - Wii U[?] - 7770) on a 32nm process; 128-bit bus.

I won't rule out a higher clocked CPU, but I won't go higher than this for this guesstimate simply because I don't understand the console's innards and airflow strategy yet. On the GPU side, I still think that the performance of the aforementioned AMD 7770 sets a good performance/wattage precedent for a console designed to be a good bump up from current gen while not breaking the bank or nuking itself. In actuality, the card I'd reference would sit somewhere between the 7750 & 7770 in performance, but as no such card currently exists, I made a small scale above. GDDR3 vs 5 is still all about the power envelope vs. Nintendo's stance on latency, with price potentially playing into the consideration as well, but with the eDRAM basically acting as a Super Saiyan Pikmin fetching from the lesser pool of Pikmins, GDDR3 might be the actual choice.

IdeaMan's post has made me believe more in this idea I have in my mind. The numbers all look weird and match the strange 2x lower bound of general multipliers, but the performance puts it in the higher bound of speculated multipliers.

I can actually agree with these specs, except the issue of whether they use GDDR3 or GDDR5 is huge, I think. I've read that GDDR5 will make more sense pricewise in the coming years, but that would require more chips at present (2GB would be a whopping 8 chips), so it may be out of the realm of possibility. But who knows? It's not impossible, and Nintendo did use GDDR3 in the Wii, which was modern for the time, after apparently learning from the Gamecube mistake of including the so-called ARAM, which was so slow it was practically useless for most tasks. The extra bandwidth of GDDR5 would be enticing in the eyes of Nintendo, as this is something that directly affects gameplay. 800 vs 640 SPUs, on the other hand, is more like splitting hairs.
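The 8-chip figure follows from simple density arithmetic, assuming 2 Gbit (256 MB) parts, which as far as I know were the largest commonly available GDDR5 density at the time (that density is my assumption, not a leak):

```python
def chips_needed(capacity_gb, chip_density_gbit):
    """DRAM chips required to reach a total capacity (per-chip density in gigabits)."""
    total_gbit = capacity_gb * 8   # gigabytes -> gigabits
    return total_gbit / chip_density_gbit

# 2 GB built from 2 Gbit GDDR5 chips -> 8 chips
print(chips_needed(2, 2))   # 8.0
```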

Actually, if I were to take a guess, at 32nm they could get a 640 SPU part running at 729 MHz. That's a 3x multiplier from the Wii. Similarly, a 5x multiplier for the Wii's CPU would be 3.645 GHz, something a POWER7 derivative (slimmed down to 2-way SMT) should be able to hit easily and at a moderate wattage.
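For what it's worth, that multiplier math checks out against the commonly cited Wii clocks (Broadway CPU at 729 MHz, Hollywood GPU at 243 MHz); those baselines are my own assumption, but a quick sketch:

```python
# Commonly cited Wii clocks in MHz (assumed baselines, not confirmed specs)
WII_CPU_MHZ = 729.0   # Broadway
WII_GPU_MHZ = 243.0   # Hollywood

def clock_multiplier(target_mhz, base_mhz):
    """Ratio of a speculated Wii U clock to its Wii counterpart."""
    return target_mhz / base_mhz

# A 640 SPU part at 729 MHz is exactly 3x Hollywood's clock
print(clock_multiplier(729.0, WII_GPU_MHZ))   # 3.0

# A clean 5x Broadway multiplier lands on 3645 MHz, i.e. 3.645 GHz
print(WII_CPU_MHZ * 5 / 1000)                 # 3.645
```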
 
I might be a bit out of the loop, but is there a reason some of you seem to be expecting news out of GDC? Or just wishful thinking?

I know Nintendo occasionally lets slip new details at GDC, but it's usually during keynotes, of which this year Iwata/Miyamoto/Reggie/Nintendo have none. Outside of this their GDC presence is usually minimal at best.

There's always a chance engine technology will show up, but I'd expect that stuff to be shown behind closed doors, as Nintendo appears dead set on keeping cards close to their chest until E3. We all know what the Nintendo ninjas are like, so I don't really expect many, if any leaks, and honestly nothing at all official.

Unless I missed something?

cuz it's NeoGaf *smh*
 

Anth0ny

Member
Also, Nintendo really needs a graphical vehicle game to showcase at E3. Something that people can look at and immediately say "Wow, this isn't possible on current systems at all".

For that, I'm hoping they utilize one of their best developers, Retro Studio. Retro has the knowledge and know how to truly wow us. Either something sci-fi or fantasy based, Retro can truly pull off something amazing.

The Zelda demo plays from last year. Then, the camera pans behind Link, and the player is given control.

Retro Zelda. Believe.
 

Donnie

Member
I can actually agree with these specs, except the issue of whether they use GDDR3 or GDDR5 is huge, I think. I've read that GDDR5 will make more sense pricewise in the coming years, but that would require more chips at present (2GB would be a whopping 8 chips), so it may be out of the realm of possibility. But who knows? It's not impossible, and Nintendo did use GDDR3 in the Wii, which was modern for the time, after apparently learning from the Gamecube mistake of including the so-called ARAM, which was so slow it was practically useless for most tasks. The extra bandwidth of GDDR5 would be enticing in the eyes of Nintendo, as this is something that directly affects gameplay. 800 vs 640 SPUs, on the other hand, is more like splitting hairs.

Actually, if I were to take a guess, at 32nm they could get a 640 SPU part running at 729 MHz. That's a 3x multiplier from the Wii. Similarly, a 5x multiplier for the Wii's CPU would be 3.645 GHz, something a POWER7 derivative (slimmed down to 2-way SMT) should be able to hit easily and at a moderate wattage.

Yeah, that would also give 4 times the shader performance of Xenos and would be preferable to the 800 SPUs at 600MHz, since you'd also get more out of other parts of the GPU such as ROPs etc.
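The "4 times the shader performance" claim can be sanity-checked with the usual peak-throughput formula (shader units × 2 flops per MADD × clock), treating Xenos as 240 units at 500 MHz the way this thread has been:

```python
def peak_gflops(shader_units, clock_mhz, flops_per_unit=2):
    """Theoretical peak shader throughput in GFLOPS (MADD = 2 flops per clock)."""
    return shader_units * flops_per_unit * clock_mhz / 1000.0

xenos = peak_gflops(240, 500)   # 240.0 GFLOPS
cfg_a = peak_gflops(640, 729)   # ~933 GFLOPS
cfg_b = peak_gflops(800, 600)   # 960.0 GFLOPS

print(round(cfg_a / xenos, 2))  # ~3.89x Xenos
print(cfg_b / xenos)            # 4.0x Xenos
```

The two configs have near-identical shader peaks, which is exactly the point being made: with shader throughput a wash, the higher-clocked 640 SPU part wins because ROPs and the rest of the pipeline scale with clock, not with shader count.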
 
That makes no sense at all. So as long as someone doesn't claim that they are working with it, you believe everything they have to say about the WiiU?

Your response makes no sense because it has nothing to do with what I said. I said that Arkam claimed to work for a developer and IdeaMan didn't. And even then, Arkam wasn't obligated to confirm that, though when a person makes that claim they are normally expected to back it up. It had nothing to do with believability, since I've said multiple times that I believed Arkam.
 

Donnie

Member
What else can he compare? There is no retail WiiU, so the only thing he can compare is the devkit.

He can compare the WiiU dev kit to a 360 dev kit, obviously. In which case the extra memory for debugging is there in both instances (the 360 dev kit uses 1GB RAM), so 4-5x would mean 4-5GB of RAM in the WiiU dev kit, not 2-2.5GB.
 

Mr Swine

Banned
Good point, lherre has hinted at "more than 2GB" in the devkit, which would be double the retail unit. 640 or 800 SPUs are not really 5 times the 240 SPUs of Xenos, although they may perform as such eventually.

You are forgetting that the Xenos inside the Xbox 360 is old, even if a 640 SPU part is "only" roughly 2.5 times more powerful. The newer cards from 2007 have an architecture that is better than what was made in 2006. I would guess that a 640 SPU part would be roughly 4x as powerful as the one in the 360.
 

DCKing

Member
Actually, if I were to take a guess, at 32nm, they could get a 640 spu part running at 729 MHz. That's a 3x multiplier from the Wii. Similarly a 5x multiplier for Wii's CPU would be 3.645 GHz, something a POWER7 derivative (slimmed down for on 2 Way SMT) should be able to hit easily and at a moderate wattage.
Why is that figure so special, and why should Nintendo even consider using that? Wii clocks are absolutely useless when talking about clocks for the Wii U.

Furthermore, I think 700+ MHz with 640 SPUs is at the edge of what's possible, and I think a chip designed for ~600 MHz is much more likely. Also, all the rumours we have heard up till now rule out that the Wii U has 2 GB of memory. It'll have 1.5 GB at the very most.
You are forgetting that the Xenos inside the Xbox 360 is old, even if a 640 SPU part is "only" roughly 2.5 times more powerful. The newer cards from 2007 have an architecture that is better than what was made in 2006. I would guess that a 640 SPU part would be roughly 4x as powerful as the one in the 360.
Although I'm sure modern shaders have their performance advantages (blu could comment on that), it is very much untrue that "modern" or "efficient" shaders (tell me what's so efficient?) are magically much faster, when both are actually quite simple calculation units. Even if that were the case, they definitely do not _superficially_ appear faster like IdeaMan suggested.
 

radcliff

Member
Furthermore, and take this with a grain of salt, many graphical effects are applied near the “end” of the visual development of a game. I guess some parameters that determine how clean or complex what is rendered on screen looks, for example the type of shadow, the AA applied, and new effects that the Wii U GPU is probably capable of, fall into this category (the shiny stuff that developers add at the end, once the engine is running well). Therefore the final result will feel more like 3x or 4x, or even the famous 5x, than 2x the Xbox 360 to the eyes of my sources. I keep in touch with them to find out if that will be the case.

Thanks for the info!

I think the above is also pretty important. Graphics on the WiiU look impressive compared to the X360 now, but the gap could widen even further once all the other graphical bells and whistles are applied. Again, thanks for sharing.
 

botty

Banned
Hmmm... For me, gaming on the Wii U is going to look great. I absolutely do not care about the specs, past what I've already read. I've read enough to understand that it will comfortably fall into "yup, that looks good" territory.

I love great graphics, but once I'm playing a game, I'm like... playing the game, man. I'm not critically eyeballing every detail looking for nitpicking little things to put into some supposed "cons" bullet point list.

Maybe it is generational? Playing on the Atari you had to use your imagination. Indiana Jones did not look a fucking thing like Indiana Jones. The point wasn't the graphics, the point was to have fun playing the game. Now that graphics are great I think it may be easier to point to slightly better graphics and claim a game is better, but c'mon man. I guess I feel like rating a game based on graphics is like rating a novel based on its typeset. Like, Ender's Game is good, but I would've liked it a lot more in Helvetica.

So basically, do you all think there's a real difference in the Wii U's future depending on what the final specs are? Given what we "know", we can guess a range of capability for the Wii U.

How different will it be if it comes out at the low end of that range rather than the high end? Will it affect its success?

If Iwata announced the Wii U could cure cancer, people in this thread would still only care about the specs.
And if Retro were making Zelda
 

Jarsonot

Member
If Iwata announced the Wii U could cure cancer, people in this thread would still only care about the specs.
And if Retro were making Zelda

Eh, I don't think people ONLY care about the specs, and this is a speculation thread and all, I get it.

I actually LIKE the specs speculation =), I just think, for me anyway, that it's already reached "good enough" and so I don't necessarily care as much anymore about the final specs.

I mean, I don't care as in I'll like 'em whatever they are. I still care enough to follow this thread religiously. =)
 

Fredrik

Member
Based on what IdeaMan is saying, I'm expecting a jump similar to the PS3 version of Battlefield 3 compared to the maxed out version on PC.

Things improved include hi-res textures, tons of geometry, better lighting, effects, and AA. If the Wii U can produce that on average, then I think we're all going to be very happy.



The thing is, playing just the PS3 or Xbox version, most gamers will just get used to the way it looks. But once you play the PC version on max, you can't go back to playing the PS3 and Xbox versions.
I see no difference worth mentioning in that pic :/ If that's the difference we'll get visually next gen then I'll be very disappointed unless framerates get much much higher.
 
You are forgetting that the Xenos inside the Xbox360 is old. Even if a 640 SPU is "only" roughly 2.5 times more powerful. The newer cards from 2007 have architect that is better than what was made in 2006. I would guess that a 640 SPU would roughly be around 4x as powerful as the one in the 360

Don't forget that Xenos' design dates back to late 2004, I think that's when it taped out or whatever. Xenos was based on the canceled R400 project.
 

AmFreak

Member
Eh? He can compare the WiiU dev kit to a 360 dev kit, obviously.

He only mentions the 360 and never says anything about the 360 devkit. In theory he could, but that would be really odd.

But if you think about it, he gave us nothing new; all he said was that components are between 2x and 5x Wii. It's not really hard to be right with such a vague statement (and we also had rumours that said that more specifically). I could tell you the 720 components will be 1x-10x more powerful than their 360 counterparts. But he even made an error if we believe the 3-core OoOE CPU rumors (as long as the tri-core isn't clocked at only 1.2 GHz).
 
If Iwata announced the Wii U could cure cancer, people in this thread would still only care about the specs.
And if Retro were making Zelda

lol or, if Sony announced the same thing, people would bash Nintendo for being outdated and still put Sony on a pedestal for having the best cancer curing console.
 
Yeah that would also give 4 times the shader performance of Xenos and would be preferable to the 800 SPU's at 600Mhz since you'd also get more out of other parts of the GPU such as ROPs ect.

That's exactly what I'm thinking, especially since I believe 2-controller support is a lock. As we've learned, clock rate is one of the easiest ways to increase performance evenly across the board. And I don't think there are going to be lots of pretty effects on the controllers in local 2-player; we'll probably be talking bare-bones menus and maps unless the graphic style is very simple. The GPU just needs to be strong enough to support one 720p display and two 480p displays.
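As a rough fill-rate sketch of that "one 720p plus two 480p" budget, assuming a widescreen 854x480 panel on the controller (that resolution is my guess, not a confirmed spec):

```python
def pixels(width, height):
    """Pixel count of a single display surface."""
    return width * height

tv = pixels(1280, 720)   # one 720p TV image: 921,600 pixels
pad = pixels(854, 480)   # one controller image (854x480 assumed): 409,920 pixels

budget = tv + 2 * pad    # one TV plus two controller screens per frame
print(budget)                  # 1,741,440 pixels
print(round(budget / tv, 2))   # ~1.89x the fill of a single 720p display
```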

Why is that figure so special, and why should Nintendo even consider using that? Wii clocks are absolutely useless when talking about clocks for the Wii U.

Furthermore, I think 700+ MHz with 640 SPUs is at the edge of what's possible, and I think a chip designed for ~600 MHz is much more likely. Also, all the rumours we have heard up till now rule out that the Wii U has 2 GB of memory. It'll have 1.5 GB at the very most.
Although I'm sure modern shaders have their performance advantages (blu could comment on that), it is very much untrue that "modern" or "efficient" shaders (tell me what's so efficient?) are magically much faster, when both are actually quite simple calculation units. Even if that were the case, they definitely do not _superficially_ appear faster like IdeaMan suggested.

We actually don't know how much of a leap the last dev kit revision was; we can only infer that past configurations used theoretical amounts of 1GB and 1.5GB. I'm entertaining a rumor of 4x/5x some quantity in the 360, and that Nintendo have increased certain components enough to make some devs raise an eyebrow. I said that 1.5GB is most likely, especially if they use GDDR5, which would make more sense (if they use the eDRAM as a frame buffer to get AA at 720p, the rest of the system is still going to need some decent main RAM). However, if they are serious about supporting 2 tablets at once, and I think they need to be, then 2GB of GDDR3 doesn't seem that ridiculous.

As for the Wii multipliers, I speculate Nintendo might have the Wii U downclock into some low power Wii mode. The architectures won't be exactly the same ala Wii and GCN, but the system was designed from the ground up with backwards compatibility in mind, so it is bound to have certain similarities.
 

botty

Banned
Eh, I don't think people ONLY care about the specs, and this is a speculation thread and all, I get it.

I actually LIKE the specs speculation =), I just think, for me anyway, that it's already reached "good enough" and so I don't necessarily care as much anymore about the final specs.

I mean, I don't care as in I'll like 'em whatever they are. I still care enough to follow this thread religiously. =)

If you were around for the first thread, the [spec]ulation does go in circles, and without BurntPork the entertainment value here is diminishing. E3 cannot come soon enough.
 
Yeah that would also give 4 times the shader performance of Xenos and would be preferable to the 800 SPU's at 600Mhz since you'd also get more out of other parts of the GPU such as ROPs ect.

That's about what I see actually: an 800 ALU GPU clocked at ~600MHz. That seems more like a path that Nintendo would take, so I agree with DC.
 

Somnia

Member
I really love this time of year: GDC is here, E3 is quickly approaching, and the rumor mills are abuzz. The rumors on the WiiU all seem to be close together, though, so there's some truth in them somewhere for sure.
 

Donnie

Member
I see no difference worth mentioning in that pic :/ If that's the difference we'll get visually next gen then I'll be very disappointed unless framerates get much much higher.

Diminishing returns means that differences just won't be as drastic as some people expect. Having said that there are quite noticeable differences between those images if you look closer.

Look at the number of branches on the trees and the amount of plants on the ground; far fewer of both on the PS3 version. Look at the textures on the rocks (those are night and day, especially the one to the right). Look at the textures on the guy's arm/hand, very blurry on the PS3 in comparison to the PC. Also look into the distance at the floor and the crates and wall; look how clear the PC version is compared to the PS3 version. Finally, take notice of the shadows: look at the trees and how the branches cast shadows onto the trunk. The PS3 shadow is far less defined.
 

guek

Banned
Tekken producer Katsuhiro Harada said:
“We’re thinking about a different use for the second screen. I’ve often heard about or seen fighting-game players playing with a strategy guide open at their feet. So it would be useful if we could, for example, distribute an enhanced digital version of the guide that the player could see while playing, and even touch to have a live preview on the main screen.”

“Another thing I would like to do is use the capacity to write on the screen,” he says. “I’d like to implement a feature such as being able to customise the character by drawing and painting on the screen.”
Harada also talks about just how quickly the Wii U can transmit visuals to the controller.

“The speed of the image transmission feature from the screen to the controller is impressive. The first time I saw Wii U, I thought, ‘isn’t the delay longer than one frame? If it is, it’s going to be difficult for fighting games.’ But when I heard that the latency actually isn’t more than 1/60th second I was really happy.”

Not unexpected at all but good to hear nonetheless
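Harada's "1/60th of a second" is simply the duration of one frame at 60 Hz, i.e. the transmission stays under a single frame of added latency:

```python
def frame_time_ms(fps):
    """Duration of a single frame in milliseconds at a given refresh rate."""
    return 1000.0 / fps

print(round(frame_time_ms(60), 2))   # 16.67 ms - the latency ceiling Harada mentions
```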
 

DCKing

Member
However, if they are serious about supporting 2 tablets at once, and I think they need to be, then 2 GB of GDDR3 doesn't seem that ridiculous.
Why do you think multiple tablets are important here? What memory use do you have in mind? 1.5GB is a huge amount already.
As for the Wii multipliers, I speculate Nintendo might have the Wii U downclock into some low power Wii mode. The architectures won't be exactly the same ala Wii and GCN, but the system was designed from the ground up with backwards compatibility in mind, so it is bound to have certain similarities.
This downclocking is only relevant in the case where they use identical hardware for direct BC, which they won't. Even if it were relevant, Nintendo could downclock the new CPU whether it was running at 5x729MHz or at 4.78531x729MHz. Using Wii clockspeeds makes no sense at all.
 

Jarsonot

Member
If you were around for the first thread, the [spec]ulation does go in circles, and without BurntPork the entertainment value here is diminishing. e3 can not come soon enough.

Yeah, I was, and you're right. I do like reading about different memory types, and SPUs, and whatnot. A little while ago I kind of reached the stage where I know I'll be happy with it, and now I'm just jonesing for more info (like everybody else) but especially more info on the UI, online play, and GAMEZ!!!

Does anyone know if BurntPork was permabanned? It's been a good while.
 

Donnie

Member
He only mentions the 360 and never says anything about the 360 devkit. In theory he could, but that would be really odd.

But if you think about it, he gave us nothing new; all he said was that components are between 2x and 5x Wii. It's not really hard to be right with such a vague statement (and we also had rumours that said that more specifically). I could tell you the 720 components will be 1x-10x more powerful than their 360 counterparts. But he even made an error if we believe the 3-core OoOE CPU rumors (as long as the tri-core isn't clocked at only 1.2 GHz).

Comparing a WiiU dev kit directly to a retail Xbox 360 would be really odd IMO. Like I said, there's no reason to assume he'd be doing that, IF he's even talking about RAM to begin with :) Nothing he says is very specific, no, but he's being nowhere near as vague as "1x-10x".

Also, how did he make an error on the CPU? He said it may look only slightly faster than the CPU in the 360, but is more efficient, which makes it more powerful than it may look at first. That fits the bill for a triple-core out-of-order CPU at a similar clock rate to Xenon.
 

Bit-Bit

Member
I see no difference worth mentioning in that pic :/ If that's the difference we'll get visually next gen then I'll be very disappointed unless framerates get much much higher.

View the full picture and try to say that again. http://img97.imageshack.us/img97/5766/ps3vspc3.jpg

It's like I said, once you play the PC version on max settings, you can't go back to the PS3 or Xbox version.

Think of it like this: when you first watch a Blu-ray movie, you know there's a difference between it and a DVD, but you might not think there's that much of one. But once you've been exposed to that quality for an extended amount of time, go back and watch a DVD movie. You'll see a very noticeable and considerable downgrade.
 

Jarsonot

Member
View the full picture and try to say that again. http://img97.imageshack.us/img97/5766/ps3vspc3.jpg

It's like I said, once you play the PC version on max settings, you can't go back to the PS3 or Xbox version. Think of it like this, when you first watch a Blu-Ray movie, you know there's a difference between it and a DVD movie. But you might not think there was that much of a difference. But once you're exposed to that quality for an extended amount of time, go back to watch a DVD movie. You'll see a very noticeable and considerable downgrade.

I agree that the PC is better, but when I'm playing the game, I'm looking for d00dz to kill, and not noticing the texture on the rock. Maybe it all adds up when playing in motion, but looking at those two pictures I honestly don't think I'd notice that the PC is any better, when I'm busy running around and playing.
 

Bit-Bit

Member
I agree that the PC is better, but when I'm playing the game, I'm looking for d00dz to kill, and not noticing the texture on the rock. Maybe it all adds up when playing in motion, but looking at those two pictures I honestly don't think I'd notice that the PC is any better, when I'm busy running around and playing.

The point isn't that you'll be admiring it or distracted by it while you play the PC version, since you'll get used to it the moment you see it. The point is that once you've played it on PC for any period of time, you'll find it really hard to play the PS3 and Xbox versions, because then you'll be distracted by the very noticeable dip in quality.
 

Donnie

Member
I agree that the PC is better, but when I'm playing the game, I'm looking for d00dz to kill, and not noticing the texture on the rock. Maybe it all adds up when playing in motion, but looking at those two pictures I honestly don't think I'd notice that the PC is any better, when I'm busy running around and playing.

Some people might not notice a big difference closer to the camera. But what you would definitely notice is the clarity difference in the distance between the two versions (as you'll be looking in the distance for "doodz to kill" :) ). On the PS3, objects and characters would be quite jagged and low quality, while they would appear much clearer and more defined in the PC version.
 

bachikarn

Member
It's like I said, once you play the PC version on max settings, you can't go back to the PS3 or Xbox version.

I did. All my friends got the x360 version so I decided to too. It was a little annoying at first, but didn't bother me that much. But I guess I'm not a "graphics whore."
 

Bit-Bit

Member
I did. All my friends got the x360 version so I decided to too. It was a little annoying at first, but didn't bother me that much. But I guess I'm not a "graphics whore."

You don't have to be a graphics whore to be distracted by blurry textures and low geometry.
 
If we're all honest with ourselves, deep down we know the system will have 360 level graphics with added spit and polish.

I think the graphics difference between the 360 and Wii U will be the same as the difference between the Gamecube and Wii. There's a little difference but only a few games will showcase it and they'll be made by Nintendo.
 
Why do you think multiple tablets are important here? What memory use do you have in mind? 1.5GB is a huge amount already.
This downclocking is only relevant in the case where they use identical hardware for direct BC, which they won't. Even if it were relevant, Nintendo could downclock the new CPU whether it was running at 5x729MHz or at 4.78531x729MHz. Using Wii clockspeeds makes no sense at all.

As for your first point, shouldn't it be obvious? It's one more 480p image to render independently. And all my own spec forecasts in previous posts state 1.5 GB. I'm open to alternate scenarios that make sense, though. 2 GB of GDDR3 on a 128-bit bus would probably be pretty cheap.
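For a rough sense of why the memory type matters on that bus, peak bandwidth is just bus width times per-pin data rate. The per-pin rates below are purely illustrative picks for GDDR3-class and GDDR5-class parts, not leaked figures:

```python
def peak_bandwidth_gb_s(bus_width_bits, data_rate_gbps_per_pin):
    """Peak memory bandwidth in GB/s: bus width x per-pin rate, bits -> bytes."""
    return bus_width_bits * data_rate_gbps_per_pin / 8.0

# 128-bit bus with illustrative per-pin data rates
print(peak_bandwidth_gb_s(128, 1.6))   # 25.6 GB/s (GDDR3-class, 1.6 Gbps/pin assumed)
print(peak_bandwidth_gb_s(128, 4.0))   # 64.0 GB/s (GDDR5-class, 4.0 Gbps/pin assumed)
```

On those assumptions, GDDR5 roughly triples the bandwidth on the same 128-bit bus, which is the trade-off against chip count and the eDRAM pool discussed above.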

And with all due respect, please explain to me exactly how Nintendo are going to achieve Wii emulation before you shoot down anyone else's speculation. I'm no expert, so if you can lay it all out, I'd be fascinated. As far as I know, the 3DS uses a combination of hardware and software emulation to achieve DS emulation, but I haven't even gotten solid details on that besides the ARM code being compatible. In this scenario, the configurations don't have to be exactly the same, but it sure helps when they are similar. If setting a clock speed is as easy as inputting a few variables, then Nintendo could have set the Wii's clocks to more rounded numbers and not bothered doing a clean 1.5x. But they chose to have two set modes with a clean multiplier. I don't know if they're going to do it the same way this time. They'll probably have built-in OS access a la DS emulation on the 3DS, but I'm sure they'll be turning off one or more cores and downclocking as well, since we've been guaranteed that Wii games will display exactly the same. I mean, if they weren't heavily reliant on emulation at the hardware level, wouldn't they at least have had an upscaling feature?
 

Plinko

Wildcard berths that can't beat teams without a winning record should have homefield advantage
If we're all honest with ourselves, deep down we know the system will have 360 level graphics with added spit and polish.

I think the graphics difference between the 360 and Wii U will be the same as the difference between the Gamecube and Wii. There's a little difference but only a few games will showcase it and they'll be made by Nintendo.

Well-crafted.

Edit: I assumed you were joking.
 
If we're all honest with ourselves, deep down we know the system will have 360 level graphics with added spit and polish.

I think the graphics difference between the 360 and Wii U will be the same as the difference between the Gamecube and Wii. There's a little difference but only a few games will showcase it and they'll be made by Nintendo.

Yeah, no.

On the contrary - pessimists know deep down that this is going to be a considerable step, and that it will more than likely be comparable to Microsoft's next system.

Think Gamecube to Xbox.
 

EloquentM

aka Mannny
Yeah, no.

On the contrary - pessimists know deep down that this is going to be a considerable step, and that it will more than likely be comparable to Microsoft's next system.

Think Gamecube to Xbox.

As long as games don't put extensive amounts of horsepower into the u pad at the same time, then yeah.
 
Yeah, no.

On the contrary - pessimists know deep down that this is going to be a considerable step, and that it will more than likely be comparable to Microsoft's next system.

Think Gamecube to Xbox.


It's not going to happen.

Maybe as in 360 (Gamecube) to Wii U (XBOX).

No way the difference between the Wii U and next gen XBOX will be that close.

I know a lot of the insider info says it'll be a good step up, but the direct developer comments pretty much all say they're begging Nintendo to make it more powerful and that it's more comparable to 360 and PS3 than anything coming in the future.

I hope I'm wrong. I hope third party ports like Arkham City look as good as the PC version etc. I just think that if it was truly a step up in power then more devs would be coming out saying 'the next gen has arrived!' etc
 

antonz

Member
It's not going to happen.

Maybe as in 360 (Gamecube) to Wii U (XBOX).

No way the difference between the Wii U and next gen XBOX will be that close.

I know a lot of the insider info says it'll be a good step up, but the direct developer comments pretty much all say they're begging Nintendo to make it more powerful and that it's more comparable to 360 and PS3 than anything coming in the future.

I hope I'm wrong. I hope third party ports like Arkham City look as good as the PC version etc. I just think that if it was truly a step up in power then more devs would be coming out saying 'the next gen has arrived!' etc

There really hasn't been a single Wii U interview in almost a year. A lot of sites for some reason are dragging out E3 interviews and claiming they're new, though.
 
If we're all honest with ourselves, deep down we know the system will have 360 level graphics with added spit and polish.

I think the graphics difference between the 360 and Wii U will be the same as the difference between the Gamecube and Wii. There's a little difference but only a few games will showcase it and they'll be made by Nintendo.
I agree, with the caveat that the console will be able to pump decent graphics to the subscreen while doing the above. That doesn't mean third parties won't have a decent stab at creating impressive games, though. At least the tools and techniques for creating Wii U games will be closer to those for other next-gen consoles, unlike the gulf between the Wii and PS360, which meant down-porting wasn't feasible for 90% of titles.
 

Wolfie5

Member
Thanks for the info Ideaman, got this thread going :)

Though 2x this and 5x that doesn't mean anything to me, other than that it will be more powerful than the X360, which I had already assumed. The question is how that will translate to the TV screen, and I'm assuming we won't know until E3.
 