If they are "reunveiling" at E3, what stage of hardware will they have to show at CES? I was expecting a bit of a redesign.
Unless by reunveil, they just meant changing the marketing strategy.
If they are "reunveiling" at E3, what stage of hardware will they have to show at CES? I was expecting a bit of a redesign.
Unless by reunveil, they just meant changing the marketing strategy.
Nintendo will be at CES, according to CNet: http://ces.cnet.com/8301-33363_1-57344520/what-will-ces-2012-have-in-store-for-gaming/?tag=mncol;txt
So, yes, we will definitely see Wii U before E3.
Careful guys, I remember the same confirmation going around last year. Nintendo didn't end up showing at CES.
Ninja-edit: VOOK has the article above.
We won't get anything from CES. They'll be there, showing it off to developers and maybe a limited number of press.
And that'll be it.
No showings, no floor demos. Just a little "OK, this is where we are right now" to important people.
Dreamwriter said: "Spoken like someone who's never played around with settings in an advanced PC game. You know what happens if you take FFXIV, for example, and turn on Ambient Occlusion, Depth of Field, and Texture Filtering, and render the whole thing at twice the resolution you are displaying? The end result isn't really that much different than not doing it at all. These kinds of effects can make things look subtly better."

Dreamwriter, you can't take one example and turn it into an ironclad rule. Just look at Metro 2033: 4A is a small studio that created higher-quality assets than what the console can handle. And to counter your FFXIV example: that's a heavily stylized game, so it's easier to hide technical shortcomings. In Metro 2033's case it's easy to see the difference between the game running on a decently specced PC and the Xbox 360 version; you could consider it a generation ahead. Other good examples are Crysis 1/2 and PhysX titles like the Batman games or Mafia.
Dreamwriter said: "I dunno what you mean by 'true HD'; my argument was based on the Wii U being able to do the best of what the Xbox 360 and PS3 currently can do, but at 1080p at 60fps. And stereoscopic 3D isn't going to become something that sways the mainstream until it can be done well without 3D glasses or $1000 TVs. As for sandbox games like GTA, what limitations? The only limitation there is how much money they want to spend on those games; those games have yet to be limited by technology."

Sorry, I should have been more specific: by "trueHD" I meant 1080p with regard to television sets.
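For a rough sense of what "1080p at 60fps" demands next to a typical 720p/30fps current-gen target, here's a quick back-of-the-envelope sketch (the resolutions and frame rates are illustrative assumptions, not confirmed Wii U specs):

```python
# Raw pixel throughput per second at a given resolution and frame rate.
# Figures are illustrative: many 360/PS3 games target 720p at 30fps, while
# the claim above is "the same games, but at 1080p at 60fps".

def pixels_per_second(width: int, height: int, fps: int) -> int:
    """Pixels the GPU must produce each second."""
    return width * height * fps

current_gen = pixels_per_second(1280, 720, 30)   # 27,648,000 px/s
claimed     = pixels_per_second(1920, 1080, 60)  # 124,416,000 px/s

print(f"720p30:  {current_gen:,} px/s")
print(f"1080p60: {claimed:,} px/s")
print(f"Ratio:   {claimed / current_gen:.1f}x")  # 4.5x the pixel throughput
```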
Quote: "Dreamwriter, you can't take one example and turn it into an ironclad rule. Just look at Metro 2033: 4A is a small studio that created higher-quality assets than what the console can handle."

Exactly my point - higher-quality assets cost more to make, so 95% of games won't push a system because of that. Otherwise most games on Xbox would look like Gears of War and most games on PS3 would look like Uncharted.
Quote: "To counter your FFXIV example: that's a heavily stylized game, so it's easier to hide technical shortcomings. In Metro 2033's case it's easy to see the difference between the game running on a decently specced PC and the Xbox 360 version; you could consider it a generation ahead. Other good examples are Crysis 1/2 and PhysX titles like the Batman games or Mafia."

I was countering the argument that the ONLY thing that mattered was effects, since my argument was that being able to push more detail means nothing unless developers are willing to spend the money to create that detail. The games you are bringing up aren't looking better based on effects alone.
Quote: "Stereoscopic 3D was just one example among others I gave; I don't know why you specifically centered on it. It's funny, because you are using the same argument the Wii advocates used to justify the Wii's lack of processing power in relation to the adoption rate of HDTVs. Sony is heavily investing in it, at least."

The difference is, HDTV was a government standard that was already in 20% of households when the Wii came out, with a schedule to end all non-digital programming, and all major networks were broadcasting in HD at the time. 3DTVs aren't even close to that point - the manufacturers don't even have a 3D standard yet, though they are working on one, and I think there are only about two 3D channels, one on satellite and one on cable, neither of them a major network.
Quote: "Regarding sandbox games: I was talking about limitations like slowdowns, reduced draw distances and texture quality, and environment and object pop-in. Just look at what modders have done with the game on PC - the same game with the added effects looks miles ahead of the console versions."

Read what you said there - MODDERS creating better assets like texture packs. These are games that could have looked better, but the developer chose not to spend the time and money to do it. I've not read a single interview with any sandbox game developer complaining about a lack of system power and blaming that for any limitations. And I already talked about slowdown; my point of view is based on the Wii U being able to do the best the Xbox 360 and PS3 can do, but at 1080p at 60fps.
The "only thing that matter are effects". Who made that statement? Because it certainly wasn't me.The games you are bringing up aren't looking better based on effects alone.
Quote: "The difference is, HDTV was a government standard that was already in 20% of households when the Wii came out... 3DTVs aren't even close to that point."

I don't understand why you keep gravitating toward the 3D adoption matter. I only mentioned stereoscopic 3D gaming as an example of how developers, if they chose to, could employ the extra processing power while maintaining a similar asset throughput. That's clear and logical, so there's no need for you to keep arguing it.
Quote: "Read what you said there - MODDERS creating better assets like texture packs. These are games that could have looked better, but the developer chose not to spend the time and money to do it..."

It's not good discussion practice to completely dismiss a reasonable statement over what could be considered minor inconsistencies in it.
My guess for the clocks is 3.6GHz CPU, 1.8GHz RAM, 450MHz GPU. Clean multipliers, the way Nintendo does things. And I think it'll either be a SoC or a SiP. Most likely the latter, with a 45nm CPU and a 32nm GPU and logic part. I'm also under the impression that AMD contributed very little except for providing the GPU design base, with most of the actual customization done by NTD and IBM.
Really? All of it? Let's not forget, I'm not speculating on what the dev kits contain; I'm speculating on what Nintendo will possibly launch with. The rumors might be correct about the early dev kits and the target specs Nintendo wanted to go for, but all of that could have changed based on E3 feedback and technological developments.
But here is what Caramello speculated, and it's not that much different. The price is the same. Caramello clocked the CPU higher; I clocked the GPU higher, but then again, we don't know anything about the GPU. About the same amount of total memory, with differences in how much of it would be allocated as embedded. Yes, I gave the CPU an extra core, but that's because I simply don't see how Nintendo can sell this system as being more powerful than current gen when:
- the CPU "appears" the same as the 360's,
- smaller devices are coming out with quad cores,
- the smallest multi-core POWER7 configuration is a quad core.
And if you want to go by rumors, we have one stating triple core and one stating quad.
While traversing and discussing around the Internet, someone was kind enough to let me take a peek at the dev kit, which confirms lherre's description. Obviously the info is very sensitive, so I won't say much about it or about who showed me (and just in case: don't assume it was lherre; I wouldn't put him in that kind of position after what he's already kindly shared with us). But based on how compact it is, I have to believe the GPU will see some kind of die shrink for Nintendo to place a hot chip in that case. So, with what I've researched, I'm still leaning towards a GPU at 28nm.

:O
Quote: "You wouldn't see at least 600 for the GPU? I'm still expecting at least 600, but I see why you are saying that."

600MHz would work as well in that scenario (3.6GHz : 6 and 1.8GHz : 3 - a clean multiplier). Also, as I already wrote some time ago, the only hint I've seen so far points at a 32nm GPU. And that would actually make sense, as both IBM and Renesas have 32nm fabs and have been producing 32nm chips for about a year now.
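For what it's worth, the "clean multiplier" arithmetic is easy to check. Here's a small sketch testing the 450MHz figure from earlier and the 600MHz one against the speculated 3.6GHz CPU and 1.8GHz RAM clocks (all of these numbers are forum speculation, not confirmed hardware):

```python
# Which candidate GPU clocks divide evenly into the speculated CPU and RAM
# clocks? (Speculated figures only - nothing here is a confirmed Wii U spec.)

CPU_MHZ = 3600
RAM_MHZ = 1800

for gpu_mhz in (450, 500, 550, 600):
    cpu_clean = CPU_MHZ % gpu_mhz == 0
    ram_clean = RAM_MHZ % gpu_mhz == 0
    print(f"GPU {gpu_mhz}MHz: CPU x{CPU_MHZ / gpu_mhz:g} "
          f"({'clean' if cpu_clean else 'not clean'}), "
          f"RAM x{RAM_MHZ / gpu_mhz:g} "
          f"({'clean' if ram_clean else 'not clean'})")

# 450MHz -> CPU x8, RAM x4; 600MHz -> CPU x6, RAM x3: both keep clean ratios.
```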
^ Ok. I definitely don't expect it to be 40nm. That's for sure.
Well, that's my belief at least, EloquentM.
But y'all know I can't do that. I made sure to get approval before even saying that much. I actually saw it about a week or so ago.
bgassassin said: "While traversing and discussing around the Internet, someone was kind enough to let me take a peek at the dev kit... So, with what I've researched, I'm still leaning towards a GPU at 28nm."
@bgassassin: is it really fair to make inferences about the manufacturing process based on seeing the dev-kit casing? At the very least you'd have to combine that with the expected chip complexity (e.g., number of ALUs), something that really is very much in the air right now, to make any reasonable speculation about the manufacturing process. You took a new fact (the size of the dev kit) and speculated on that... I'm just afraid a lot of readers will neglect to notice the required chip-complexity assumptions and jump to conclusions, as they like to do.
Why are people hanging on to 28nm so much? The only way I think it would make sense is if the smaller process were employed to make more complex chips while limiting chip cost. But equally, the same chip complexity could be kept and the smaller manufacturing process used only to make the chip less expensive. Considering that there aren't any 28nm parts available yet, I doubt Nintendo would set a technological target that critically depended on good results from a young process. I mean, they're going to be using 45nm for their CPU. As I see it, 28nm or not, the process is not going to have an impact on what the specs of the machine are going to be, because I believe their target won't depend on it. If anything, it would increase their profit margin.
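To illustrate the trade-off being described - more logic in the same die area versus the same logic at lower cost - here's a first-order sketch of ideal area scaling across nodes (a simplification that assumes area scales with the square of the feature-size ratio; real-world scaling is messier):

```python
# Ideal die-area scaling across process nodes, assuming area shrinks with the
# square of the feature-size ratio. A rough illustration, not foundry data.

def relative_area(old_nm: int, new_nm: int) -> float:
    """Area of the same design after a shrink, relative to the old node."""
    return (new_nm / old_nm) ** 2

for new_nm in (40, 32, 28):
    print(f"45nm -> {new_nm}nm: same chip in ~{relative_area(45, new_nm):.0%} of the area")

# 45nm -> 28nm comes out to roughly 39% of the area: either a much cheaper
# die, or ~2.5x the logic budget in the same footprint - the two options above.
```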
Sarusama, I've felt for a while that they would target a smaller process to make whatever they are planning cooler rather than more powerful; that's another reason to target a smaller process. So considering the GPU was overheating in that case, then yes, I believe it's fair to make that assumption, especially since GPUs normally run hotter than CPUs.
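The "cooler, not more powerful" idea can be put in rough numbers with the usual first-order dynamic-power relation, P ~ C * V^2 * f; the capacitance and voltage figures below are made-up illustrations, not actual chip parameters:

```python
# First-order dynamic power: P ~ C * V^2 * f. A die shrink lowers switched
# capacitance and usually allows a lower voltage, cutting heat at the same
# clock. All numbers are hypothetical, purely to show the shape of the math.

def dynamic_power(cap_rel: float, volts: float, freq_mhz: float) -> float:
    """Relative dynamic power for a given capacitance, voltage, and clock."""
    return cap_rel * volts**2 * freq_mhz

old = dynamic_power(1.0, 1.1, 600)  # hypothetical chip on the older node
new = dynamic_power(0.7, 1.0, 600)  # same design shrunk: lower C and V

print(f"Shrunk part at the same 600MHz: ~{new / old:.0%} of the original power")
# ~58% - one way a smaller process buys "cooler" instead of "faster".
```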
Sorry guys. No additional specs were given to me that we don't already have an idea on. Just what it looked like. But if you want an example, imagine the early Wii dev kit with about half the height and what seems like poorer ventilation than the retail case we saw at E3.
What about the tablet controller? Did you get to see it work? Was it running wired? Was there only one?
It was probably being emulated with a DS lol
We got an official response from NoA. http://www.nintendoworldreport.com/news/28735
"Nintendo of America, working with the 2012 International CES management, will offer demos of the upcoming Wii U console to members of the media who did not see the system at the 2011 E3 Expo. However, Nintendo will not have a booth at CES, nor does it plan to include any games, experiences or information beyond what was available at the 2011 E3 Expo. Production and development efforts remain on track for the Wii U launch, which will take place between the start of the 2012 E3 Expo in June, and the end of 2012."
So yea, nothing to see here.
I really do hope the final unit is smaller than the E3 variant. I know I keep going on about it, but that model was too big. I like my small systems.
^ Also, I know it wasn't huge, but it looked bigger than it had to be. Plus it was curvy. I don't like curves.
nron10 said: "We got an official response from NoA. http://www.nintendoworldreport.com/news/28735"