
Wii U Speculation thread IV: Photoshop rumors and image memes

Status
Not open for further replies.
I'm disappointed by the latest devkit news. One of Nintendo's little demons is the tendency to think up new concepts then be unable to implement them all in time for launch. It would be a bit of a downer if some of what we saw at last E3 wasn't available as a feature until some future, post-release "upcoming system update".
 
A lot of what was shown in the reveal video looked like OS functionality or dedicated apps to me. I can believe the camera stuff wasn't finalised or ready for the v4 SDK, but I don't think that'll have any bearing on what the machine does at launch. Why do devs need access to, or knowledge of, everything the OS is going to do? They don't really. And there are other things they could have been working on while Nintendo finalised the camera stuff... Nintendo clearly plan for at least a video chat app, and this hasn't been rushed like the 3DS -- whereas the 3DS had to wait for its shop and certain apps, I'm 100% certain they'll avoid that with the Wii U.
 

Kjellson

Member
Wii U related?

[image: I5V1bt]
Looks like the map of a European microstate.
 

AlStrong

Member
High latency at low speeds?

The latency and bandwidth of GDDR3 and DDR3 are very similar. DDR3's main advantages are that it's still in high-volume production (GDDR3 was EOL'd a while back), lower power consumption, and higher chip densities. There is absolutely no advantage to using GDDR3 over DDR3, and one should note that DDR3 is often the choice these days for lower-end GPUs (cost factor).
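The "very similar bandwidth" point follows directly from the standard peak-bandwidth formula (transfer rate × bus width): at the same effective data rate and bus width, the two memory types land on the same number. A minimal sketch, using illustrative figures (DDR3-1600 on a 64-bit bus) rather than any confirmed Wii U spec:

```python
def peak_bandwidth_gbs(transfer_rate_mts: float, bus_width_bits: int) -> float:
    """Peak theoretical bandwidth in GB/s.

    transfer_rate_mts: effective data rate in megatransfers per second.
    bus_width_bits: memory bus width in bits (divided by 8 to get bytes/transfer).
    """
    return transfer_rate_mts * (bus_width_bits / 8) / 1000

# Illustrative assumption: both memory types running at 1600 MT/s on a 64-bit bus.
ddr3 = peak_bandwidth_gbs(1600, 64)   # DDR3-1600
gddr3 = peak_bandwidth_gbs(1600, 64)  # GDDR3 at the same effective rate
print(ddr3, gddr3)  # both 12.8 GB/s: identical rate and width, identical bandwidth
```

The formula makes the cost argument concrete: since bandwidth depends only on rate and width, the cheaper, still-in-production DDR3 gives up nothing at a matched configuration.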

By the way, there's something I've been meaning to ask you. Whenever the subject of the GPU fab comes up, you never mention the possibility of Nintendo choosing NEC, as they have for the past two generations, and you seem pretty adamant that they'll use TSMC's 40nm. Is there a particular reason for that?

I had the impression that NEC had higher prices - which was the reason MS switched to TSMC (larger fab capacity) - even for an older node (360 slim die sizes indicated 55/65nm eDRAM).

edit: Besides, they may be making use of IBM technology design for the eDRAM design anyway.
 

Trevelyan

Banned
What other games would have dialogue similar to "Rue your fate, comrade"? I'm trying to think of Nintendo IPs that would fit the bill, but I'm having a hard time.
 

IdeaMan

My source is my ass!
I'm disappointed by the latest devkit news. One of Nintendo's little demons is the tendency to think up new concepts then be unable to implement them all in time for launch. It would be a bit of a downer if some of what we saw at last E3 wasn't available as a feature until some future, post-release "upcoming system update".

It was the situation in January/February. They have plenty of time to implement all the features, but it's an interesting peek into the Wii U design process: it seems there was a heavy focus at first on the hardware guts, the capability to handle two screens (and a tactile + motion controller), and the DRC streaming/wireless technology (that focus shows through the main changes to the dev kits and SDK until recently). It's rather logical. And as briefly noted in my previous big post, it could also be one explanation for the huge memory space reserved for the OS/background layer on v4 dev kits: the OS was still heavily in development and they didn't know its footprint, with more and more features being activated, new ones being added, etc.

A lot of what was shown in the reveal video looked like OS functionality or dedicated apps to me. I can believe the camera stuff wasn't finalised or ready for the v4 SDK, but I don't think that'll have any bearing on what the machine does at launch. Why do devs need access to, or knowledge of, everything the OS is going to do? They don't really. And there are other things they could have been working on while Nintendo finalised the camera stuff... Nintendo clearly plan for at least a video chat app, and this hasn't been rushed like the 3DS -- whereas the 3DS had to wait for its shop and certain apps, I'm 100% certain they'll avoid that with the Wii U.

I agree. But with this info, you can be nearly sure that some of what we saw in the E3 2011 presentation video wasn't "real", just mockups/concept art of what they plan to have for the Wii U. Especially the video-call part.
 

IdeaMan

My source is my ass!
For the scan, it looks like a stylised map of Russia; you can spot Kamchatka, etc.

If you have any questions about my post, I'm here; don't hesitate, and I'll answer if I can. A pity I didn't see the scan before posting my message; it will be overlooked :p
 

BD1

Banned
Still seems weird that they would do two Bond games in a year.

I'm pretty sure Shiggy said in the 007 Collection thread that the Skyfall Wii U game he leaked and 007 Collection are the same. So no Skyfall-only Wii U game. (Assuming that's the second Bond game you're talking about.)
 

EvilMario

Will QA for food.
You guys should stop quoting Discomurf's image, or he's going to get the ax. Or are these scans okay in Wii U Spec land?

Also, I hope it's the outline of some Monster Hunter with a weapon. ;_;
 

schuelma

Wastes hours checking old Famitsu software data, but that's why we love him.
You guys should stop quoting Discomurf's image, or he's going to get the ax.

Also, I hope it's the outline of some Monster Hunter with a weapon. ;_;

I believe those types of pages have always been considered fine by the powers that be.
 
For the scan, it looks like a stylised map of Russia; you can spot Kamchatka, etc.

If you have any questions about my post, I'm here; don't hesitate, and I'll answer if I can. A pity I didn't see the scan before posting my message; it will be overlooked :p


Let's talk about RAM. If "TF1" means 1 and "France 2" means 2... how much RAM is there? ;p
 

BurntPork

Banned
The latency and bandwidth of GDDR3 and DDR3 are very similar. DDR3's main advantages are that it's still in high-volume production (GDDR3 was EOL'd a while back), lower power consumption, and higher chip densities. There is absolutely no advantage to using GDDR3 over DDR3, and one should note that DDR3 is often the choice these days for lower-end GPUs (cost factor).



I had the impression that NEC had higher prices - which was the reason MS switched to TSMC (larger fab capacity) - even for an older node (360 slim die sizes indicated 55/65nm eDRAM).

Hm. I thought GDDR3 was a bit faster.

One last thing. Based on what you've said here and on B3D forums, 240-320 SPUs is the most reasonable range and anything over 400 is pretty much impossible in a case that size on 40nm, right?
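To put those SPU counts in perspective: AMD's VLIW-era GPUs do two FLOPs (one fused multiply-add) per stream processor per clock, so peak throughput is simply 2 × SPs × clock. A minimal sketch; the clock speed here is a placeholder assumption for illustration, not a leaked figure:

```python
def peak_gflops(stream_processors: int, clock_mhz: float) -> float:
    """Peak single-precision GFLOPS for an AMD-style GPU:
    2 FLOPs (multiply-add) per stream processor per clock cycle."""
    return 2 * stream_processors * clock_mhz / 1000

# Hypothetical 500 MHz clock, applied across the SPU range under discussion.
for sps in (240, 320, 400):
    print(f"{sps} SPs -> {peak_gflops(sps, 500):.0f} GFLOPS")
# 240 SPs -> 240 GFLOPS, 320 -> 320, 400 -> 400 at this assumed clock
```

At an assumed 500 MHz, the 240-320 SPU range works out to roughly 240-320 GFLOPS peak, which is why the exact count (and clock) matters so much for these comparisons.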
 

Plinko

Wildcard berths that can't beat teams without a winning record should have homefield advantage
For the scan, it looks like a stylised map of Russia; you can spot Kamchatka, etc.

If you have any questions about my post, I'm here; don't hesitate, and I'll answer if I can. A pity I didn't see the scan before posting my message; it will be overlooked :p

Do you honestly think they're going for a facial recognition sign-in like on Kinect?
 