
WiiU technical discussion (serious discussions welcome)

Not sure how related this is, but since Wii U's CPU is described as "enhanced broadway" cores, I guess it's relevant...

Despite how "weak" the Wii was compared to other consoles, I was surprised that it did have games that focused mainly on physics - which struck me as a bit odd. Right out of the gate there was Elebits, then Boom Blox (and its sequel) came along. Some will write those off as "not impressive" simply based on their simplistic visuals, but they did a really good job of showcasing what Broadway could do for physics.

Below is a video - of Elebits' editor mode - actually boasting about this very thing.
http://www.youtube.com/watch?v=3xJXvFqhCk0


This makes me anxious to see what would be possible (if devs pushed it) now that Wii U has a higher-clocked multi-core CPU, with extra-large caches, low-latency access to the GPU's eDRAM, and who knows what other "enhancements."

What's strange is that, for all the power the PS360 packed into their CPUs, we didn't see physics & AI taken to "the next level" as developers promise each generation. Of course, it could be argued that this generation was more focused on pushing GPUs than anything else. Console CPUs this gen were in some ways over-powered, but they were somewhat inefficient and spent a lot of time doing tasks that dedicated chips could have handled. It does seem like this gen was a failed experiment hoping that a ridiculous amount of CPU FLOPs would bring significant advances in physics and AI.

That didn't quite pan out, so now it appears that next-gen console CPUs will be trading a lot of those FLOPs in favor of efficiency, large caches, plus dedicated silicon and/or enough cores to handle various tasks. That, combined with a strong focus on GPGPU capabilities... maybe this is the better formula for the results we are promised each generation?

Going back to this point, what was the initial purpose of designing CPUs the way they were this gen, besides going with the idea that "faster frequency is better"?

I despise the NaturalMotion shit in most games. It introduces a massive disconnect between the player's actions and the actions performed by the character. Controls (and by proxy animations) need to be tight, precise and 100% predictable. Realism is a bonus. Uncharted was a great example of an animation system getting in the way.

Yeah... though that's not a new concept. Reminds me of my issue with the original Prince of Persia for the PC decades ago. It was beautifully animated, but the controls suffered for the sake of that animation realism.
 

relaxor

what?
The comment about the enhanced physics capabilities is great, and I think it highlights the very toy-like nature of Nintendo's thinking. I can definitely see that emphasis at work in Pikmin 3 and Wonderful 101, which have this very physical play with many discrete objects, like toy soldiers.
 

MDX

Member
Not sure how related this is, but since Wii U's CPU is described as "enhanced broadway" cores, I guess it's relevant...

Despite how "weak" the Wii was compared to other consoles, I was surprised that it did have games that focused mainly on physics - which struck me as a bit odd. Right out of the gate there was Elebits, then Boom Blox (and its sequel) came along. Some will write those off as "not impressive" simply based on their simplistic visuals, but they did a really good job of showcasing what Broadway could do for physics.

Below is a video - of Elebits' editor mode - actually boasting about this very thing.
http://www.youtube.com/watch?v=3xJXvFqhCk0

I recall there were rumors that the Wii had a PPU (physics processing unit), and Elebits was one of the first titles showing this off.
 

MDX

Member
Speaking of Physics, does Nintendo want developers to use the GPGPU to handle it, or the CPU?

AMD:
"right now our gaming strategy at AMD on GPGPU is based on the Bullet Physics engine."

http://www.bit-tech.net/hardware/graphics/2011/02/17/amd-manju-hegde-gaming-physics/3

AMD may materialize its plans to bring a GPU-accelerated version of Havok, which has till now been CPU-accelerated.
http://www.guru3d.com/news_story/amd_gpu_havok_physics_acceleration_at_gdc.html

Havok on the WiiU
The demo shown used CPU-processed physics (as opposed to GPU), which, Gargan said, would be the case when the engine runs on Wii U.

Edit to add:
If they meant for the GPU to handle physics, then I wonder, in particular for the ports, if the developers didn't bother optimizing their code for it. I assume most, if not all, physics this gen was handled by the CPU.
 
Speaking of Physics, does Nintendo want developers to use the GPGPU to handle it, or the CPU?

[...]

Edit to add:
If they meant for the GPU to handle physics, then I wonder, in particular for the ports, if the developers didn't bother optimizing their code for it. I assume most, if not all, physics this gen was handled by the CPU.
That is probably the case. Devs probably won't try hard at implementing GPGPU tasks until there is a higher focus on next-gen development. Playing around with GPGPU methods now will make games harder to port down to current-gen systems, and there isn't enough of a next-gen userbase for most publishers to risk that.

I remember the talk about the Wii having a physics-focused coprocessor. From the way things went this gen, that would probably have been wasted.

I'm very interested in the modifications AMD/Nintendo did to the r700 base, but it doesn't look like we will find out what was done anytime soon.
 

CoolS

Member
I guess I'll ask this here as well, since it kind of fits:

Guys, I got a quick question. I'm wondering if there's something wrong with my WiiU/my gamepad.
When playing New Super Mario Bros. U, Mario's hat next to the lives always looks really pixelated on the gamepad screen. The same when you enter a level and it shows Mario and your lives: really, really pixelated. Other HUD elements are nice and smooth, but Mario's head is always a bit pixelated.

Am I the only one with that problem? And if so, what might it be, some interference? It doesn't change if I sit right next to the WiiU.
 
I guess I'll ask this here as well, since it kind of fits:

Guys, I got a quick question. I'm wondering if there's something wrong with my WiiU/my gamepad.
When playing New Super Mario Bros. U, Mario's hat next to the lives always looks really pixelated on the gamepad screen. The same when you enter a level and it shows Mario and your lives: really, really pixelated. Other HUD elements are nice and smooth, but Mario's head is always a bit pixelated.

Am I the only one with that problem? And if so, what might it be, some interference? It doesn't change if I sit right next to the WiiU.

Could it be scaling artifacts? The game is rendered at 1280x720, but it has to display at 854x480 on the gamepad. Depending on how the image is scaled, you could possibly get some poor looking results.
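For what it's worth, here's the quick arithmetic (a rough Python sketch, assuming the commonly cited 854x480 GamePad panel). The downscale factor works out to basically a clean 1.5x, not some ugly uneven ratio:

# Downscale factor from the 720p render target to the GamePad panel.
# 854x480 is the commonly cited GamePad resolution - an assumption here.
src_w, src_h = 1280, 720
pad_w, pad_h = 854, 480

print(src_w / pad_w)  # ~1.499 horizontally
print(src_h / pad_h)  # exactly 1.5 vertically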
 

CoolS

Member
Could it be scaling artifacts? The game is rendered at 1280x720, but it has to display at 854x480 on the gamepad. Depending on how the image is scaled, you could possibly get some poor looking results.

The weird thing is that it only seems to affect the one element of the HUD, Mario's head. If other people have the same problem it's okay I guess, but right now I'm a bit scared something might be wrong with my WiiU.
 
The weird thing is that it only seems to affect the one element of the HUD, Mario's head. If other people have the same problem it's okay I guess, but right now I'm a bit scared something might be wrong with my WiiU.

I've noticed it when it zooms in on Mario at the end of a stage too, so I think it's normal. I doubt you have anything to worry about.

You can definitely notice it if you look closely at Mario when you first select your save file.
 

Panajev2001a

GAF's Pleasant Genius
Could it be scaling artifacts? The game is rendered at 1280x720, but it has to display at 854x480 on the gamepad. Depending on how the image is scaled, you could possibly get some poor looking results.

Downscaling would improve the image... it is essentially FSAA applied to the image.
The problem with the video stream sent to the Wii U GamePad is that they are using a somewhat too aggressive video compression scheme. I hope they improve it; they should have more than enough bandwidth to do so with modern 802.11n/5GHz and similar solutions. See the WiFi HDMI streaming products, which stream extremely low-lag video and audio (1080p video and surround sound, far more data than the 480p stereo stream the Wii U has to deal with).
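Rough numbers on the data volumes involved (a back-of-envelope Python sketch, assuming 24-bit colour at 60fps and ignoring audio and protocol overhead):

# Raw, uncompressed video bitrates, just to compare the amounts of data involved.
def raw_mbps(width, height, fps=60, bytes_per_pixel=3):
    return width * height * bytes_per_pixel * fps * 8 / 1e6

print(f"854x480 (GamePad-class) stream: {raw_mbps(854, 480):.0f} Mbit/s")    # ~590 Mbit/s
print(f"1920x1080 stream:               {raw_mbps(1920, 1080):.0f} Mbit/s")  # ~2986 Mbit/s

So the raw GamePad stream is only about a fifth of the data of a raw 1080p stream, though still more than typical real-world 802.11n throughput, so some compression is unavoidable; the question is just how aggressive it needs to be.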
 
Downscaling would improve the image... it is essentially FSAA applied to the image.
The problem with the video stream sent to the Wii U GamePad is that they are using a somewhat too aggressive video compression scheme. I hope they improve it; they should have more than enough bandwidth to do so with modern 802.11n/5GHz and similar solutions. See the WiFi HDMI streaming products, which stream extremely low-lag video and audio (1080p video and surround sound, far more data than the 480p stereo stream the Wii U has to deal with).

Derp. I keep thinking 720p scales by a weird uneven factor to 480p, but it doesn't.

So it's compression artifacts, not scaling. :/
 
You guys have to keep in mind that they compress the image, which causes the frame to artifact. They even said it in the Iwata Asks where they talked about how the gamepad tech works. They had to minimize the artifacting as much as they could, but they can never get rid of it completely.

EDIT: Derp, answered above.
 
I wonder if they have various options for various purposes. For instance, lossless compression should give much higher quality without breaking the bandwidth bank for any pre-N64 emulated games.

EDIT: As long as they're scaled properly.
 

Zornica

Banned
I just played the CoD Wii U port at a friend's house, and I wondered if framerate/resolution is sacrificed when playing with two people (one on the pad, the other on the TV).
I'm generally bad at noticing those things, so I couldn't tell if the resolution was changed or the framerate lowered. I found that very interesting, the game being a cheap launch port and the Wii U basically running the game twice. At least I was kinda impressed by that.

So, does anyone know what's being sacrificed? If anything at all?
 
I just played the CoD Wii U port at a friend's house, and I wondered if framerate/resolution is sacrificed when playing with two people (one on the pad, the other on the TV).
I'm generally bad at noticing those things, so I couldn't tell if the resolution was changed or the framerate lowered. I found that very interesting, the game being a cheap launch port and the Wii U basically running the game twice. At least I was kinda impressed by that.

So, does anyone know what's being sacrificed? If anything at all?

Dynamic shadows. That's about it. The framerate already fluctuates due to V-sync being enabled and the fact that it's a launch title.
 

guek

Banned
Framerate seemed halved when I did separate pad+tv play. It was stable at 30 though unless there were 50+ zombies on the screen, in which case it chugged.
 

Tmdean

Banned
The weird thing is that it only seems to affect the one element of the HUD, Mario's head. If other people have the same problem it's okay I guess, but right now I'm a bit scared something might be wrong with my WiiU.

I've noticed that saturated reds always compress the worst in many compression formats. Keep an eye out for it next time you're watching a DVD or something. I remember reading once that this was caused by a bug in a commonly used encoder, but I've never been able to find that article again.
 

ozfunghi

Member
So the console has been out for two months now, and still no updates? Ugh. I guess the guy who revealed the CPU clockspeed hasn't found any new revelations either.
 
I just played the CoD Wii U port at a friend's house, and I wondered if framerate/resolution is sacrificed when playing with two people (one on the pad, the other on the TV).
I'm generally bad at noticing those things, so I couldn't tell if the resolution was changed or the framerate lowered. I found that very interesting, the game being a cheap launch port and the Wii U basically running the game twice. At least I was kinda impressed by that.

So, does anyone know what's being sacrificed? If anything at all?

The second screen is much lower res, so it's not rendering everything twice at the same overhead... it's still really impressive though (IMO).
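Quick pixel-count comparison (a rough Python sketch, assuming the TV image is 1280x720 and the GamePad view is rendered at 854x480):

# Pixels rendered per frame: TV alone, TV + GamePad view, and two full TV views.
tv_px  = 1280 * 720   # 921,600
pad_px =  854 * 480   # 409,920

print(f"TV only:      {tv_px:>9,}")
print(f"TV + GamePad: {tv_px + pad_px:>9,}  ({(tv_px + pad_px) / tv_px:.2f}x the TV alone)")
print(f"Two TV views: {2 * tv_px:>9,}  (2.00x)")

So in fill-rate terms the second view adds roughly 45% rather than doubling the work, though geometry and draw calls still have to be submitted twice regardless of resolution.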
 

Thraktor

Member
The second screen is much lower res, so it's not rendering everything twice at the same overhead... it's still really impressive though (IMO).

Nintendo Land is arguably the more impressive game in that regard. In some of the games (Metroid, Zelda, Mario & Animal Crossing) it's rendering five different viewpoints while maintaining a rock-solid 60fps.
 

eternalb

Member
Nintendo Land is arguably the more impressive game in that regard. In some of the games (Metroid, Zelda, Mario & Animal Crossing) it's rendering five different viewpoints while maintaining a rock-solid 60fps.

In Zelda Battle Quest, the TV's framerate drops to 30 with three or more players, while the Gamepad remains at 60 (as well as its TV display). Haven't checked the other games for comparison.
 

AzaK

Member
Nintendo Land is arguably the more impressive game in that regard. In some of the games (Metroid, Zelda, Mario & Animal Crossing) it's rendering five different viewpoints while maintaining a rock-solid 60fps.

True. Although those viewpoints are smaller, so the overall real estate will, on average, amount to about a full screen.
 

IdeaMan

My source is my ass!
I'm back, guys. I had problems with my PC (only half resolved), plus I've been busy IRL.

Best wishes for 2013 to the regulars :)

So, technically speaking, what's new since the beginning of December?
 

JohnB

Member
I'm back, guys. I had problems with my PC (only half resolved), plus I've been busy IRL.

Best wishes for 2013 to the regulars :)

So, technically speaking, what's new since the beginning of December?

Suppose there's this: http://www.edge-online.com/news/nin...nsists-console-is-definitely-next-generation/

Short quote:

"“The Wii U is an infant that’s just been born,” Hayashi tells us. “It’s a little unfair to compare it to mature platforms that people have been working on for over five years. I’m sure people will find ways to bring out even more power as the platform matures.

“To be completely blunt and honest, there’s no way that the Wii U processor is ‘horrible and slow’ compared to other platforms. I think that comment was just 4A trying to find a scapegoat for a simple business decision on their part.”

However, Hayashi does not dispute that Wii U’s spec sheet isn’t much of a leap over current-generation consoles, if it is at all – but argues that the console’s functionality does more than enough for it to be classed as the start of a new generation.

“If you’re basing this simply on processor speed, then it’s not next generation,” he says. “If you’re basing this on Wii U being a new idea that challenges existing platforms, then it definitely is next generation. It is a console videogame platform that is now independent of the TV. Nobody has done that before."
 
I'm back, guys. I had problems with my PC (only half resolved), plus I've been busy IRL.

Best wishes for 2013 to the regulars :)

So, technically speaking, what's new since the beginning of December?

Welcome back, Ideaman! Unfortunately, no specifics have leaked since the clockspeeds. Very frustrating, and I've been losing interest. Wii U is what it is, and I'm enjoying ZombiU and Nintendo Land (having beaten Mario) while my gf is hooked on Scribblenauts.

But by God I wish the GPU core config and eDRAM bandwidth were known!
 

Thraktor

Member
Welcome back, Ideaman! Unfortunately, no specifics have leaked since the clockspeeds. Very frustrating, and I've been losing interest. Wii U is what it is, and I'm enjoying ZombiU and Nintendo Land (having beaten Mario) while my gf is hooked on Scribblenauts.

But by God I wish the GPU core config and eDRAM bandwidth were known!

What I'm really interested in is the way in which the eDRAM is integrated with the GPU, and the extent to which the GPU is customised over the regular R700 line. That said, I don't expect either of those to become public knowledge for a long time, if ever, so I suppose the GPU core config will have to do when it inevitably leaks out.
 

AzaK

Member
What I'm really interested in is the way in which the eDRAM is integrated with the GPU, and the extent to which the GPU is customised over the regular R700 line. That said, I don't expect either of those to become public knowledge for a long time, if ever, so I suppose the GPU core config will have to do when it inevitably leaks out.

Yeah, I would have thought some geek site would have been able to get us some fancy pictures by now at least.
 
Why has news been so slow? Where are the CPU and GPU X-rays? WHO IS THE MILKMAN? WHAT ARE THE GOGGLES FOR? WHERE DID YOU GET THE RED SIGN?
 

tipoo

Banned
I'm pretty frustrated with how little we still know too. Every new Apple product gets its chip looked at under a microscope, and for a gaming console the details are perhaps even more relevant (for some, at least). It would be easy for someone like Chipworks to at least tell us how many shader units and how much eDRAM are in the GPU, for instance, and a bit about the CPU. All we know right now is clock speed and memory bandwidth, with a decent guess at core count (3 all but confirmed at this point).

About the eDRAM: is the consensus here that both the CPU and GPU have their own dedicated pools? 3MB of eDRAM as cache for the CPU and 32MB for the GPU is what I keep hearing, but I can't find official mention of both having some.

I wonder if they can talk to the other chip's eDRAM pool quickly; that would make for some interesting programming. I.e., if the GPU didn't need all of its 32MB, the CPU could use it as an extended L3 cache or at the least a scratchpad. I think the GPU will be using most of that anyway, though; it's probably just big enough to fit one 1080p buffer or one 720p buffer with 4x AA.


May I suggest a series of polite emails from a few of us to Chipworks to check the chips?
 

wsippel

Banned
I'm pretty frustrated with how little we still know too. Every new Apple product gets its chip looked at under a microscope, and for a gaming console the details are perhaps even more relevant (for some, at least). It would be easy for someone like Chipworks to at least tell us how many shader units and how much eDRAM are in the GPU, for instance, and a bit about the CPU. All we know right now is clock speed and memory bandwidth, with a decent guess at core count (3 all but confirmed at this point).

About the eDRAM: is the consensus here that both the CPU and GPU have their own dedicated pools? 3MB of eDRAM as cache for the CPU and 32MB for the GPU is what I keep hearing, but I can't find official mention of both having some.

I wonder if they can talk to the other chip's eDRAM pool quickly; that would make for some interesting programming. I.e., if the GPU didn't need all of its 32MB, the CPU could use it as an extended L3 cache or at the least a scratchpad. I think the GPU will be using most of that anyway, though; it's probably just big enough to fit one 1080p buffer or one 720p buffer with 4x AA.
Yes, both chips have dedicated eDRAM. That's official. The CPU eDRAM was confirmed in the old IBM press release, the GPU eDRAM was confirmed by Nintendo in a recent Iwata Asks. The GPU eDRAM is neither a framebuffer nor any form of L3 cache - it's the system's primary memory pool (MEM1).
 
GIMME MICROGRAPHS.

Fuck, the console is officially released; does no one really have the tech to take a micrograph of a taken-apart Wii U!?
 

tipoo

Banned
I have a feeling the GPU on the Wii U isn't powerful enough to produce the visual spectacle and work on the physics at the same time.

It's possible that they worked out their own solution for that ahead of desktop graphics cards, allowing the GPU to do both without the performance hit that even high-end cards take, but I haven't heard anything like that from devs, so it seems unlikely. Yeah, an older, mid-range-at-best chip can't do lots of GPGPU work while producing sufficiently advanced visuals at the same time.
 

AzaK

Member
Yes, both chips have dedicated eDRAM. That's official. The CPU eDRAM was confirmed in the old IBM press release, the GPU eDRAM was confirmed by Nintendo in a recent Iwata Asks. The GPU eDRAM is neither a framebuffer nor any form of L3 cache - it's the system's primary memory pool (MEM1).

Wsippel, what's the difference? If it's the main memory pool, it will be used as a framebuffer anyway, right?
 

tipoo

Banned
Yes, both chips have dedicated eDRAM. That's official. The CPU eDRAM was confirmed in the old IBM press release, the GPU eDRAM was confirmed by Nintendo in a recent Iwata Asks. The GPU eDRAM is neither a framebuffer nor any form of L3 cache - it's the system's primary memory pool (MEM1).



Can I bother you for sources and quotes? Because I was talking to someone on another forum a while ago and he challenged me on this, and I remember I could not find a reference to both components having eDRAM, only one. I think the one was the CPU.

And yeah, as mentioned above, since the main memory is so slow, it wouldn't make much sense for that 32MB (if it is that much) pool to be used for anything but a framebuffer if it can barely fit one 1080p frame, would it?
 

Gahiggidy

My aunt & uncle run a Mom & Pop store, "The Gamecube Hut", and sold 80k WiiU within minutes of opening.
Do we have any idea how much this part costs...?
[image of the part in question]
 

Gahiggidy

My aunt & uncle run a Mom & Pop store, "The Gamecube Hut", and sold 80k WiiU within minutes of opening.
Isn't that the Gamepad's video transmitter? I'm asking because I wonder whether it's a major cost in the console's bill of materials.
 

AzaK

Member
Can I bother you for sources and quotes? Because I was talking to someone on another forum a while ago and he challenged me on this, and I remember I could not find a reference to both components having eDRAM, only one. I think the one was the CPU.

And yeah, as mentioned above, since the main memory is so slow, it wouldn't make much sense for that 32MB (if it is that much) pool to be used for anything but a framebuffer if it can barely fit one 1080p frame, would it?

A 1080p 32-bit framebuffer is just about 8MB. You have to add in Z etc. too of course, but 32MB is enough, and Nintendo are only targeting 720p anyway, which is only about 3.5MB: 1280 * 720 * 4 / (1024 * 1024).
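Spelling that arithmetic out (a quick Python sketch, assuming a 32-bit colour buffer plus a 32-bit Z buffer of the same size per sample; real buffer layouts will differ):

# Framebuffer footprints in MB, assuming 32-bit colour + 32-bit depth per sample.
def buffer_mb(width, height, bytes_per_pixel=4, samples=1):
    return width * height * bytes_per_pixel * samples / (1024 * 1024)

for w, h, aa in [(1280, 720, 1), (1280, 720, 4), (1920, 1080, 1)]:
    colour = buffer_mb(w, h, samples=aa)
    depth  = buffer_mb(w, h, samples=aa)  # assume the Z buffer has the same footprint
    print(f"{w}x{h} {aa}xAA: {colour:.1f} MB colour + {depth:.1f} MB depth = {colour + depth:.1f} MB")

That gives roughly 7MB for plain 720p, 28MB for 720p with 4xAA, and 16MB for plain 1080p, which lines up with both the "720p with 4xAA or 1080p without" figure mentioned earlier and the point that a 1080p frame plus Z still leaves about half of a 32MB pool free.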
 

tipoo

Banned
A 1080p 32-bit framebuffer is just about 8MB. You have to add in Z etc. too of course, but 32MB is enough, and Nintendo are only targeting 720p anyway, which is only about 3.5MB: 1280 * 720 * 4 / (1024 * 1024).



Hmm, I may be wrong there; I read somewhere that the 32MB framebuffer was enough for 720p with 4xAA or 1080p with none, and assumed that was right.

But still, the Wii U has half the memory bandwidth of the 360; I'd imagine the 32MB gets filled trying to make up for that by storing the most-needed textures etc.
 

Gahiggidy

My aunt & uncle run a Mom & Pop store, "The Gamecube Hut", and sold 80k WiiU within minutes of opening.
I only see 12 pages to this thread.
 

Donnie

Member
Hmm, I may be wrong there; I read somewhere that the 32MB framebuffer was enough for 720p with 4xAA or 1080p with none, and assumed that was right.

But still, the Wii U has half the memory bandwidth of the 360; I'd imagine the 32MB gets filled trying to make up for that by storing the most-needed textures etc.

Those are just two examples from Nintendo's documentation. 32MB is more than enough to hold a 1080p frame and z buffer with space to spare (you'd still have about half the eDRAM free).

Also, storing certain frequently used textures in eDRAM isn't the only option to optimise WiiU's main memory bandwidth. WiiU's main memory has 12.8GB/s that can be used for reads and/or writes, while the 360 has 11.2GB/s for reads only and 11.2GB/s for writes only. If you use WiiU's eDRAM to limit writes to main memory to a very small amount, you could technically have more read bandwidth from WiiU's main memory than from the 360's.
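Taking those figures at face value (just the numbers in this post, nothing official), here's a quick Python sketch of how much read bandwidth would be left over depending on how many writes you manage to keep in the eDRAM:

# WiiU main memory: 12.8 GB/s shared between reads and writes (per the figures above).
# 360 comparison figure: 11.2 GB/s for reads (again, per the figures above).
WIIU_TOTAL_GBPS = 12.8
X360_READ_GBPS  = 11.2

for writes in [0.5, 1.0, 2.0, 4.0]:
    reads = WIIU_TOTAL_GBPS - writes
    verdict = "more" if reads > X360_READ_GBPS else "less"
    print(f"{writes:.1f} GB/s of writes -> {reads:.1f} GB/s left for reads ({verdict} than the quoted 360 read figure)")

In other words, you'd need to keep main-memory writes under about 1.6 GB/s for the read figure to come out ahead, which is why it would take very tight optimisation.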
 
Those are just two examples from Nintendo's documentation. 32MB is more than enough to hold a 1080p frame and z buffer with space to spare.

Also, storing certain frequently used textures in eDRAM isn't the only option to optimise WiiU's main memory bandwidth. WiiU's main memory has 12.8GB/s that can be used for reads and/or writes, while the 360 has 11.2GB/s for reads only and 11.2GB/s for writes only. If you use WiiU's eDRAM to limit writes to main memory to a very small amount, you could technically have more read bandwidth from WiiU's main memory than from the 360's.

Ah, so WiiU memory bandwidth isn't quite as terrible as it's been made out to be.
 

Donnie

Member
On freeze-framed images and start menus, yes, that applies.

If you have a large amount of embedded memory then there are plenty of ways to limit writes to main memory. I really don't know why you're trying to trivialize that fact. Or are you referring specifically to my example of being able to get more read bandwidth out of WiiU's memory than the 360's? Obviously that would take some very tight optimisations and probably wouldn't be possible in most cases; I did say technically.
 