pottuvoi
Member
(12-28-2012, 08:01 AM)
We certainly saw a decent jump in physics compared to the previous generation.
saunderez
Member
(12-28-2012, 08:04 AM)

Originally Posted by pottuvoi

We certainly saw a decent jump in physics compared to the previous generation.

Watching Euphoria do its stuff in Backbreaker is like poetry in motion. So good.
phosphor112
Banned
(12-28-2012, 08:12 AM)

Originally Posted by saunderez

Watching Euphoria do its stuff in Backbreaker is like poetry in motion. So good.

The NFL Blitz remake should have been Blitz 64 with new graphics and Euphoria for tackles. That's it. Too bad it's not, though. =[
wsippel
(12-28-2012, 08:48 AM)

Originally Posted by saunderez

Watching Euphoria do its stuff in Backbreaker is like poetry in motion. So good.

I despise the NaturalMotion shit in most games. It introduces a massive disconnect between the player's actions and the actions performed by the character. Controls (and by proxy animations) need to be tight, precise and 100% predictable. Realism is a bonus. Uncharted was a great example of an animation system getting in the way.
lwilliams3
Member
(12-28-2012, 09:03 AM)

Originally Posted by OryoN

Not sure how related this is, but since Wii U's CPU is described as "enhanced broadway" cores, I guess it's relevant...

Despite how "weak" the Wii was compared to other consoles, I was surprised that it did have games that focused mainly on physics - which struck me as a bit odd. Right out of the gate there was Elebits, then Boom Blox (and its sequel) came along. Some will write those off as "not impressive" simply based on simplistic visuals, but they did a really good job of showcasing what Broadway could do for physics.

Below is a video - of Elebits' editor mode - actually boasting about this very thing.
http://www.youtube.com/watch?v=3xJXvFqhCk0


This makes me anxious to see what would be possible (if devs pushed it) now that the Wii U has a higher-clocked multi-core CPU, with extra-large caches, low-latency access to the GPU's eDRAM, and who knows what other "enhancements."

What's strange is that for all the power the PS3 and 360 packed into their CPUs, we didn't see physics & AI taken to "the next level" as developers promise each generation. Of course, it could be argued that this generation was more focused on pushing GPUs than anything else. Console CPUs this gen were in some ways over-powered, but were somewhat inefficient and spent a lot of time doing tasks that dedicated chips could have handled. It does seem like this gen was a failed experiment hoping that a ridiculous amount of CPU FLOPs would bring significant advances in physics and AI.

That didn't quite pan out, so now it appears that next-gen console CPUs will trade a lot of those FLOPs for efficiency, large caches, plus dedicated silicon and/or enough cores to handle various tasks. Combine that with a strong focus on GPGPU capabilities... maybe this is the better formula for the results we are promised each generation?

Going back to this point: what was the initial purpose of having CPUs the way they were this current gen, besides going with the concept that "faster frequency is better"?

Originally Posted by wsippel

I despise the NaturalMotion shit in most games. It introduces a massive disconnect between the player's actions and the actions performed by the character. Controls (and by proxy animations) need to be tight, precise and 100% predictable. Realism is a bonus. Uncharted was a great example of an animation system getting in the way.

Yeah... though that is not a new concept. It reminds me of my issue with the original Prince of Persia for the PC decades ago. It was beautifully animated, but the controls suffered for the sake of that animated realism.
relaxor
what?
(12-28-2012, 09:18 AM)
The comment about the enhanced physics capabilities is great, and I think highlights the very toy-like nature of Nintendo's thinking. I can definitely see that emphasis at work in Pikmin 3 and Wonderful 101 which have this very physical play upon many discrete objects, like toy soldiers.
MDX
Member
(12-28-2012, 03:57 PM)

Originally Posted by OryoN

Not sure how related this is, but since Wii U's CPU is described as "enhanced broadway" cores, I guess it's relevant...

Despite how "weak" the Wii was compared to other consoles, I was surprised that it did have games that focused mainly on physics - which struck me as a bit odd. Right out of the gate there was Elebits, then Boom Blox (and its sequel) came along. Some will write those off as "not impressive" simply based on simplistic visuals, but they did a really good job of showcasing what Broadway could do for physics.

Below is a video - of Elebits' editor mode - actually boasting about this very thing.
http://www.youtube.com/watch?v=3xJXvFqhCk0

I recall there were rumors that the Wii had a PPU. And Elebits was one of the first titles showing this off.
MDX
Member
(12-28-2012, 04:39 PM)
Speaking of Physics, does Nintendo want developers to use the GPGPU to handle it, or the CPU?

AMD:

"Right now our gaming strategy at AMD on GPGPU is based on the Bullet Physics engine."

http://www.bit-tech.net/hardware/gra...ming-physics/3

AMD may materialize its plans to bring a GPU-accelerated version of Havok, which has till now been CPU-accelerated.

http://www.guru3d.com/news_story/amd...on_at_gdc.html

Havok on the WiiU

The demo shown used CPU-processed physics (as opposed to GPU), which, Gargan said, would be the case when the engine runs on Wii U.

Edit to add:
If they meant for the GPU to handle physics, then I wonder, particularly for the ports, if the developers didn't bother optimizing their code for it. I assume most, if not all, physics this gen was handled by the CPU.
Last edited by MDX; 12-28-2012 at 05:34 PM.
lwilliams3
Member
(12-29-2012, 04:09 AM)

Originally Posted by MDX

Speaking of Physics, does Nintendo want developers to use the GPGPU to handle it, or the CPU?

AMD:


http://www.bit-tech.net/hardware/gra...ming-physics/3


http://www.guru3d.com/news_story/amd...on_at_gdc.html

Havok on the WiiU


Edit to add:
If they meant for the GPU to handle physics, then I wonder, particularly for the ports, if the developers didn't bother optimizing their code for it. I assume most, if not all, physics this gen was handled by the CPU.

That is probably the case. Devs will probably not try hard at implementing GPGPU tasks until there is a higher focus on next-gen development. Playing around with GPGPU methods now would make games harder to port down to current-gen systems, and there is not enough of a next-gen userbase for most publishers to risk that.

I remember the talk about the Wii having a physics-focused coprocessor. From the way things went this gen, that would probably have been wasted.

I'm very interested in the modifications AMD/Nintendo made to the R700 base, but it doesn't look like we will find out what was done anytime soon.
CoolS
Member
(12-29-2012, 02:00 PM)
I guess I'll ask this here as well, since it kind of fits:

Guys, I've got a quick question. I'm wondering if there's something wrong with my Wii U/my GamePad.
When playing New Super Mario Bros. U, Mario's hat next to the lives always looks really pixelated on the GamePad screen. The same when you enter a level and it shows Mario and your lives: really, really pixelated. Other HUD elements are nice and smooth, but Mario's head is always a bit pixelated.

Am I the only one with that problem? And if so, what might it be, some interference? It doesn't change if I sit right next to the Wii U.
UltimateIke
Member
(12-29-2012, 02:09 PM)

Originally Posted by CoolS

I guess I'll ask this here as well, since it kind of fits:

Guys, I've got a quick question. I'm wondering if there's something wrong with my Wii U/my GamePad.
When playing New Super Mario Bros. U, Mario's hat next to the lives always looks really pixelated on the GamePad screen. The same when you enter a level and it shows Mario and your lives: really, really pixelated. Other HUD elements are nice and smooth, but Mario's head is always a bit pixelated.

Am I the only one with that problem? And if so, what might it be, some interference? It doesn't change if I sit right next to the Wii U.

Could it be scaling artifacts? The game is rendered at 1280x720, but it has to be displayed at 854x480 on the GamePad. Depending on how the image is scaled, you could possibly get some poor-looking results.
CoolS
Member
(12-29-2012, 02:11 PM)

Originally Posted by UltimateIke

Could it be scaling artifacts? The game is rendered at 1280x720, but it has to be displayed at 854x480 on the GamePad. Depending on how the image is scaled, you could possibly get some poor-looking results.

The weird thing is that it only seems to affect the one element of the HUD, Mario's head. If other people have the same problem it's okay, I guess, but right now I'm a bit scared something might be wrong with my Wii U.
UltimateIke
Member
(12-29-2012, 02:12 PM)

Originally Posted by CoolS

The weird thing is that it only seems to affect the one element of the HUD, Mario's head. If other people have the same problem it's okay, I guess, but right now I'm a bit scared something might be wrong with my Wii U.

I've noticed it when it zooms in on Mario at the end of a stage too, so I think it's normal. I doubt you have anything to worry about.

You can definitely notice it if you look closely at Mario when you first select your save file.
Last edited by UltimateIke; 12-29-2012 at 02:17 PM.
Panajev2001a
GAF's Pleasant Genius
(12-29-2012, 02:14 PM)

Originally Posted by UltimateIke

Could it be scaling artifacts? The game is rendered at 1280x720, but it has to be displayed at 854x480 on the GamePad. Depending on how the image is scaled, you could possibly get some poor-looking results.

Downscaling would improve the image... it is essentially FSAA applied to the image.
The problem with the video stream sent to the Wii U GamePad is that they are using a somewhat too aggressive video compression scheme. I hope they improve it; they should have more than enough bandwidth to do so with modern 802.11n/5GHz and similar solutions: see WiFi HDMI streaming products, which stream extremely low-lag video and audio (1080p video and surround sound, far more data than the 480p video and stereo audio the Wii U has to deal with).
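As a rough sanity check on the bandwidth argument above, here is a back-of-envelope sketch. The 854x480 panel resolution comes from the thread; the 60fps rate, 24 bits per pixel, and the 600 Mbit/s theoretical 802.11n peak are illustrative assumptions, not confirmed Wii U figures.

```python
# Back-of-envelope: raw bandwidth of an uncompressed 854x480@60 stream
# versus a theoretical 802.11n link. All rates are illustrative
# assumptions, not measured Wii U numbers.

WIDTH, HEIGHT, FPS = 854, 480, 60
BITS_PER_PIXEL = 24  # uncompressed RGB

raw_mbps = WIDTH * HEIGHT * BITS_PER_PIXEL * FPS / 1e6
wifi_n_peak_mbps = 600  # theoretical 4-stream 802.11n maximum

print(f"raw stream:  {raw_mbps:.0f} Mbit/s")   # ~590 Mbit/s
print(f"802.11n max: {wifi_n_peak_mbps} Mbit/s")
# Real-world throughput is far below the theoretical peak, so some
# compression is unavoidable; the question is only how aggressive.
print(f"ratio needed at ~100 Mbit/s usable: {raw_mbps / 100:.1f}:1")
```

So even the theoretical link barely fits the raw stream, which supports the point that compression is mandatory but need not be heavy.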
UltimateIke
Member
(12-29-2012, 02:55 PM)

Originally Posted by Panajev2001a

Downscaling would improve the image... it is essentially FSAA applied to the image.
The problem with the video stream sent to the Wii U GamePad is that they are using a somewhat too aggressive video compression scheme. I hope they improve it; they should have more than enough bandwidth to do so with modern 802.11n/5GHz and similar solutions: see WiFi HDMI streaming products, which stream extremely low-lag video and audio (1080p video and surround sound, far more data than the 480p video and stereo audio the Wii U has to deal with).

Derp. I keep thinking 720p scales by a weird uneven factor to 480p, but it doesn't.

So it's compression artifacts, not scaling. :/
Last edited by UltimateIke; 12-29-2012 at 03:06 PM.
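The scale factors being discussed can be checked in two lines; the resolutions are the ones quoted in the posts above.

```python
# Scale factors from the 720p render target down to the GamePad's
# 854x480 panel, as discussed above.

render_w, render_h = 1280, 720
pad_w, pad_h = 854, 480

print(render_h / pad_h)  # exactly 1.5 -- a clean 3:2 vertical ratio
print(render_w / pad_w)  # ~1.4988 -- horizontally just shy of 3:2
```

Vertically it is an exact 3:2 downscale, so scaling alone should look reasonably clean, consistent with the conclusion that the blockiness comes from compression instead.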
phosphor112
Banned
(12-29-2012, 10:47 PM)
You guys have to keep in mind that they compress the image, which causes the frame to artifact. They even said it in the Iwata Asks where they talked about how the GamePad tech works. They had to minimize the artifacting as much as they could, but they can never get rid of it completely.

EDIT: Derp, answered above.
JoshuaJSlone
Member
(12-30-2012, 12:32 PM)
I wonder if they have various options for various purposes. For instance, lossless compression should give much higher quality without breaking the bandwidth bank for any pre-N64 emulated games.

EDIT: As long as they're scaled properly.
Zornica
Member
(12-31-2012, 03:03 AM)
I just played the CoD Wii U port at a friend's house, and I wondered if framerate/resolution is sacrificed when playing with two people (one on the pad, the other one on the TV).
I am generally bad at noticing those things, so I couldn't tell if the resolution was changed or the framerate was lowered. I found that very interesting, the game being a cheap launch port and the Wii U basically running the game twice. At least I was kind of impressed by that.

So, does anyone know what's being sacrificed, if anything at all?
Last edited by Zornica; 12-31-2012 at 03:17 AM.
Smurfman256
Member
(12-31-2012, 06:46 AM)

Originally Posted by Zornica

I just played the CoD Wii U port at a friend's house, and I wondered if framerate/resolution is sacrificed when playing with two people (one on the pad, the other one on the TV).
I am generally bad at noticing those things, so I couldn't tell if the resolution was changed or the framerate was lowered. I found that very interesting, the game being a cheap launch port and the Wii U basically running the game twice. At least I was kind of impressed by that.

So, does anyone know what's being sacrificed, if anything at all?

Dynamic shadows. That's about it. The framerate already fluctuates due to V-sync being enabled and the fact that it's a launch title.
guek
Member
(12-31-2012, 06:51 AM)
Framerate seemed halved when I did separate pad+tv play. It was stable at 30 though unless there were 50+ zombies on the screen, in which case it chugged.
Tmdean
Banned
(12-31-2012, 07:30 AM)

Originally Posted by CoolS

The weird thing is that it only seems to affect the one element of the HUD, Mario's head. If other people have the same problem it's okay, I guess, but right now I'm a bit scared something might be wrong with my Wii U.

I've noticed that saturated reds always compress the worst in many compression formats. Keep an eye out for it next time you're watching a DVD or something. I remember reading once that this was caused by a bug in a commonly used encoder, but I've never been able to find that article again.
ozfunghi
Member
(01-03-2013, 02:55 PM)
So the console has been out for two months now, and still no updates? Ugh. I guess the guy who revealed the CPU clockspeed hasn't found anything new either.
Ninja Moomin
Banned
(01-03-2013, 02:59 PM)

Originally Posted by Zornica

I just played the CoD Wii U port at a friend's house, and I wondered if framerate/resolution is sacrificed when playing with two people (one on the pad, the other one on the TV).
I am generally bad at noticing those things, so I couldn't tell if the resolution was changed or the framerate was lowered. I found that very interesting, the game being a cheap launch port and the Wii U basically running the game twice. At least I was kind of impressed by that.

So, does anyone know what's being sacrificed, if anything at all?

The second screen is much lower res, so it's not rendering everything twice at the same overhead... it's really impressive, though (IMO).
pottuvoi
Member
(01-03-2013, 03:40 PM)

Originally Posted by Tmdean

I've noticed that saturated reds always compress the worst in many compression formats. Keep an eye out for it next time you're watching a DVD or something. I remember reading once that this was caused by a bug in a commonly used encoder, but I've never been able to find that article again.

Color information is stored at a smaller resolution than the luma; this causes most of the problems.
http://en.wikipedia.org/wiki/YUV_4:2:2
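To illustrate the point about chroma subsampling, here is a minimal pure-Python sketch. The BT.601 conversion coefficients are standard; the scanline data is made up for the example. A saturated red pixel carries most of its signal in the V (chroma) plane, which 4:2:2 stores at half horizontal resolution, so sharp red edges smear while luma detail survives.

```python
# Minimal sketch of 4:2:2-style chroma subsampling.
# Saturated red lives almost entirely in the V plane, which is
# stored at half horizontal resolution.

def rgb_to_yuv(r, g, b):
    # Approximate BT.601 full-range conversion
    y = 0.299 * r + 0.587 * g + 0.114 * b
    u = -0.169 * r - 0.331 * g + 0.500 * b
    v = 0.500 * r - 0.419 * g - 0.081 * b
    return y, u, v

# One scanline: a sharp edge from black to saturated red
line = [(0, 0, 0)] * 3 + [(255, 0, 0)] * 5
yuv = [rgb_to_yuv(*px) for px in line]

# 4:2:2 -- keep every Y sample, but average each horizontal pair of V
subsampled_v = []
for i in range(0, len(yuv), 2):
    avg_v = (yuv[i][2] + yuv[i + 1][2]) / 2
    subsampled_v += [avg_v, avg_v]

print([round(px[2], 2) for px in yuv])      # original V plane
print([round(v, 2) for v in subsampled_v])  # V after subsampling
```

The sharp 0 → 127.5 step in V becomes 0 → 63.75 → 127.5 after subsampling: the red edge has been smeared over two pixels, while the full-resolution Y plane would keep its hard edge.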
Zornica
Member
(01-03-2013, 06:41 PM)

Originally Posted by NinjaFusion

The second screen is much lower res, so it's not rendering everything twice at the same overhead... it's really impressive, though (IMO).

The GamePad screen resolution is 854x480, and the internal render resolution for CoD seems to be 880x720 with 2xAA (like the 360 version). I'm not sure you can call that "much lower res".
RedSwirl
Junior Member
(01-03-2013, 06:42 PM)
Do we know anything specific about the GPU now?
Thraktor
Member
(01-03-2013, 11:16 PM)

Originally Posted by NinjaFusion

The second screen is much lower res, so it's not rendering everything twice at the same overhead... it's really impressive, though (IMO).

Nintendo Land is arguably the more impressive game in that regard. In some of the games (Metroid, Zelda, Mario & Animal Crossing) it's rendering five different viewpoints while maintaining a rock-solid 60fps.
eternalb
Member
(01-03-2013, 11:22 PM)

Originally Posted by Thraktor

Nintendo Land is arguably the more impressive game in that regard. In some of the games (Metroid, Zelda, Mario & Animal Crossing) it's rendering five different viewpoints while maintaining a rock-solid 60fps.

In Zelda Battle Quest, the TV's framerate drops to 30 with 3 players or more, while the GamePad remains at 60 (as well as its TV display). Haven't checked the other games for comparison.
AzaK
Member
(01-03-2013, 11:22 PM)

Originally Posted by Thraktor

Nintendo Land is arguably the more impressive game in that regard. In some of the games (Metroid, Zelda, Mario & Animal Crossing) it's rendering five different viewpoints while maintaining a rock-solid 60fps.

True, although those viewpoints are smaller, so the overall real estate averages out to about one full screen.
IdeaMan
My source is my ass!
(01-07-2013, 05:58 PM)
I'm back, guys. I had problems with my PC (only half resolved), plus I've been busy IRL.

Best wishes for 2013 to the regulars :)

So, technically speaking, what's new since the beginning of December?
lightchris
Member
(01-07-2013, 07:02 PM)
Nothing really new since the clock speeds became known, iirc.
JohnB
Member
(01-08-2013, 12:59 AM)

Originally Posted by IdeaMan

I'm back, guys. I had problems with my PC (only half resolved), plus I've been busy IRL.

Best wishes for 2013 to the regulars :)

So, technically speaking, what's new since the beginning of December?

Suppose there's this: http://www.edge-online.com/news/ninj...xt-generation/

Short quote:

“The Wii U is an infant that’s just been born,” Hayashi tells us. “It’s a little unfair to compare it to mature platforms that people have been working on for over five years. I’m sure people will find ways to bring out even more power as the platform matures.

“To be completely blunt and honest, there’s no way that the Wii U processor is ‘horrible and slow’ compared to other platforms. I think that comment was just 4A trying to find a scapegoat for a simple business decision on their part.”

However, Hayashi does not dispute that Wii U’s spec sheet isn’t much of a leap over current-generation consoles, if it is at all – but argues that the console’s functionality does more than enough for it to be classed as the start of a new generation.

“If you’re basing this simply on processor speed, then it’s not next generation,” he says. “If you’re basing this on Wii U being a new idea that challenges existing platforms, then it definitely is next generation. It is a console videogame platform that is now independent of the TV. Nobody has done that before.”
Fourth Storm
Member
(01-08-2013, 01:42 AM)

Originally Posted by IdeaMan

I'm back, guys. I had problems with my PC (only half resolved), plus I've been busy IRL.

Best wishes for 2013 to the regulars :)

So, technically speaking, what's new since the beginning of December?

Welcome back, Ideaman! Unfortunately, no specifics have leaked since the clockspeeds. Very frustrating, and I have been losing interest. The Wii U is what it is, and I am enjoying ZombiU and Nintendo Land (having beaten Mario) while my gf is hooked on Scribblenauts.

But by God I wish the GPU core config and eDRAM bandwidth were known!
Thraktor
Member
(01-08-2013, 03:07 AM)

Originally Posted by Fourth Storm

Welcome back, Ideaman! Unfortunately, no specifics have leaked since the clockspeeds. Very frustrating, and I have been losing interest. The Wii U is what it is, and I am enjoying ZombiU and Nintendo Land (having beaten Mario) while my gf is hooked on Scribblenauts.

But by God I wish the GPU core config and eDRAM bandwidth were known!

What I'm really interested in is the way in which the eDRAM is integrated with the GPU, and the extent to which the GPU is customised over the regular R700 line. That said, I don't expect either of those to become public knowledge for a long time, if ever, so I suppose the GPU core config will have to do when it inevitably leaks out.
AzaK
Member
(01-08-2013, 03:13 AM)

Originally Posted by Thraktor

What I'm really interested in is the way in which the eDRAM is integrated with the GPU, and the extent to which the GPU is customised over the regular R700 line. That said, I don't expect either of those to become public knowledge for a long time, if ever, so I suppose the GPU core config will have to do when it inevitably leaks out.

Yeah, I would have thought some geek site would have been able to get us some fancy pictures by now at least.
Smurfman256
Member
(01-08-2013, 06:44 AM)
Why has news been so slow? Where are the CPU and GPU X-rays? WHO IS THE MILKMAN? WHAT ARE THE GOGGLES FOR? WHERE DID YOU GET THE RED SIGN?
tipoo
Banned
(01-08-2013, 04:12 PM)
I'm pretty frustrated with how little we still know, too. Every new Apple product has had its chip looked at under a microscope; for a gaming console the details are perhaps even more relevant (for some, at least). It would be easy for someone like Chipworks to at least tell us how many shader units and how much eDRAM is in the GPU, for instance, and a bit about the CPU. All we know right now is clock speed and memory bandwidth, with a decent guess at core count (3 is all but confirmed at this point).

About the eDRAM: is the consensus here that both the CPU and GPU have their own dedicated pools? 3MB of eDRAM cache for the CPU and 32MB for the GPU is what I keep hearing, but I can't find official mention of both having some.

I wonder if they can talk to the other chip's eDRAM pool quickly; that would make for some interesting programming. I.e., if the GPU didn't need all of its 32MB, the CPU could use it as an extended L3 cache, or at the least a scratchpad. I think the GPU will be using most of it anyway, though; I think it's just big enough to fit one 1080p buffer, or one 720p buffer with 4xAA.


May I suggest a series of polite emails from a few of us to Chipworks to check the chips?
wsippel
(01-08-2013, 04:16 PM)

Originally Posted by tipoo

I'm pretty frustrated with how little we still know, too. Every new Apple product has had its chip looked at under a microscope; for a gaming console the details are perhaps even more relevant (for some, at least). It would be easy for someone like Chipworks to at least tell us how many shader units and how much eDRAM is in the GPU, for instance, and a bit about the CPU. All we know right now is clock speed and memory bandwidth, with a decent guess at core count (3 is all but confirmed at this point).

About the eDRAM: is the consensus here that both the CPU and GPU have their own dedicated pools? 3MB of eDRAM cache for the CPU and 32MB for the GPU is what I keep hearing, but I can't find official mention of both having some.

I wonder if they can talk to the other chip's eDRAM pool quickly; that would make for some interesting programming. I.e., if the GPU didn't need all of its 32MB, the CPU could use it as an extended L3 cache, or at the least a scratchpad. I think the GPU will be using most of it anyway, though; I think it's just big enough to fit one 1080p buffer, or one 720p buffer with 4xAA.

Yes, both chips have dedicated eDRAM. That's official. The CPU eDRAM was confirmed in the old IBM press release, the GPU eDRAM was confirmed by Nintendo in a recent Iwata Asks. The GPU eDRAM is neither a framebuffer nor any form of L3 cache - it's the system's primary memory pool (MEM1).
Can Crusher
Member
(01-08-2013, 04:18 PM)
I have a feeling the GPU on the Wii U isn't powerful enough to produce the visual spectacle and work on the physics at the same time.
phosphor112
Banned
(01-08-2013, 05:23 PM)
GIMME MICROGRAPHS.

Fuck, the console is officially released; does no one really have the tech to take a micrograph of a torn-down Wii U!?
tipoo
Banned
(01-08-2013, 06:34 PM)

Originally Posted by Can Crusher

I have a feeling the GPU on the Wii U isn't powerful enough to produce the visual spectacle and work on the physics at the same time.

It's possible that they worked out their own solution, ahead of desktop graphics cards, that lets the GPU do both without the performance hit that even high-end cards take, but I have not heard anything like that from devs, so it seems unlikely. Yeah, an older mid-range (at best) chip can't do lots of GPGPU work while producing sufficiently advanced visuals at the same time.
Last edited by tipoo; 01-08-2013 at 06:37 PM.
AzaK
Member
(01-08-2013, 08:49 PM)

Originally Posted by wsippel

Yes, both chips have dedicated eDRAM. That's official. The CPU eDRAM was confirmed in the old IBM press release, the GPU eDRAM was confirmed by Nintendo in a recent Iwata Asks. The GPU eDRAM is neither a framebuffer nor any form of L3 cache - it's the system's primary memory pool (MEM1).

Wsippel, what's the difference? If it's the main memory pool, it will be used as a framebuffer anyway, right?
tipoo
Banned
(01-08-2013, 08:56 PM)

Originally Posted by wsippel

Yes, both chips have dedicated eDRAM. That's official. The CPU eDRAM was confirmed in the old IBM press release, the GPU eDRAM was confirmed by Nintendo in a recent Iwata Asks. The GPU eDRAM is neither a framebuffer nor any form of L3 cache - it's the system's primary memory pool (MEM1).



Can I bother you for sources and quotes? Because I was talking to someone on another forum a while ago and he challenged me on this, and I remember I could not find a reference to both components having eDRAM, only one. I think the one was the CPU.

And yeah, as mentioned above, since the main memory is so slow, it wouldn't make much sense for that 32MB (if it is that much) pool to be used for anything but a framebuffer if it can barely fit one 1080p frame, would it?
Gahiggidy
My aunt & uncle run a Mom & Pop store, "The Gamecube Hut", and sold 80k WiiU within minutes of opening.
(01-08-2013, 08:58 PM)
Do we have any idea how much this part costs...?
The Abominable Snowman
Pure Life tonsil tickle
(01-08-2013, 09:00 PM)

Originally Posted by phosphor112

GIMME MICROGRAPHS.

Fuck, the console is officially released; does no one really have the tech to take a micrograph of a torn-down Wii U!?

I think it's more so a lack of interest. It'll come in time.

@Gahig: I think that's just a slightly modified one-way 802.11a/g (or ad) WiFi chip.
Gahiggidy
My aunt & uncle run a Mom & Pop store, "The Gamecube Hut", and sold 80k WiiU within minutes of opening.
(01-08-2013, 09:05 PM)
That's not the Gamepad's video transmitter? I'm asking because I wonder if that is not a major cost of the console's bill of materials.
AzaK
Member
(01-08-2013, 09:15 PM)

Originally Posted by tipoo

Can I bother you for sources and quotes? Because I was talking to someone on another forum a while ago and he challenged me on this, and I remember I could not find a reference to both components having eDRAM, only one. I think the one was the CPU.

And yeah, as mentioned above, since the main memory is so slow, it wouldn't make much sense for that 32MB (if it is that much) pool to be used for anything but a framebuffer if it can barely fit one 1080p frame, would it?

A 1080p 32-bit framebuffer is just about 8MB. You have to add in Z etc. too, of course, but 32MB is enough, and Nintendo are only targeting 720p anyway, which is only about 3.5MB: 1280 * 720 * 4 / (1024 * 1024).
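The framebuffer arithmetic in this exchange, worked out. The 8-bytes-per-pixel figure (4 bytes colour plus 4 bytes depth/stencil) and the MSAA multiplier are simplifying assumptions for illustration; real allocations vary.

```python
# Framebuffer size estimates: colour + Z/stencil, times MSAA sample
# count. Purely illustrative -- real allocations vary.

MIB = 1024 * 1024

def fb_mib(w, h, bytes_per_px=8, msaa=1):
    # bytes_per_px: 4B colour + 4B depth/stencil by default
    return w * h * bytes_per_px * msaa / MIB

print(f"720p,  colour only:    {1280*720*4/MIB:.2f} MiB")      # ~3.52
print(f"720p,  colour+Z:       {fb_mib(1280, 720):.2f} MiB")   # ~7.03
print(f"720p,  colour+Z 4xAA:  {fb_mib(1280, 720, msaa=4):.2f} MiB")
print(f"1080p, colour+Z:       {fb_mib(1920, 1080):.2f} MiB")  # ~15.82
```

Under these assumptions, even a 720p colour+Z target with 4xAA (~28 MiB) squeezes into a 32MB pool, which matches the post's conclusion that 32MB is enough.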
tipoo
Banned
(01-08-2013, 09:57 PM)

Originally Posted by AzaK

A 1080p 32-bit framebuffer is just about 8MB. You have to add in Z etc. too, of course, but 32MB is enough, and Nintendo are only targeting 720p anyway, which is only about 3.5MB: 1280 * 720 * 4 / (1024 * 1024).



Hmm, I may be wrong there; I read somewhere that the 32MB framebuffer was enough for 720p with 4xAA or 1080p with none, and assumed that was right.

But still, the Wii U has half the main memory bandwidth of the 360; I'd imagine the 32MB is filled trying to make up for that by storing the most-needed textures etc.
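The bandwidth comparison can be sketched from the commonly cited figures (these memory configurations are assumptions, not confirmed specs): Wii U with DDR3-1600 on a 64-bit bus, Xbox 360 with GDDR3 at 1400 MT/s on a 128-bit bus.

```python
# Peak main-memory bandwidth from bus width and transfer rate.
# The memory configurations below are commonly cited figures, used
# here as assumptions.

def bandwidth_gbs(bus_bits, transfers_per_sec):
    return bus_bits / 8 * transfers_per_sec / 1e9

wiiu = bandwidth_gbs(64, 1600e6)    # DDR3-1600, 64-bit  -> 12.8 GB/s
x360 = bandwidth_gbs(128, 1400e6)   # GDDR3 1400 MT/s, 128-bit -> 22.4 GB/s

print(f"Wii U: {wiiu:.1f} GB/s, 360: {x360:.1f} GB/s, "
      f"ratio: {wiiu / x360:.2f}")
```

That works out to a ratio of roughly 0.57, so "half the bandwidth" is a slight exaggeration under these figures, but it is in the right ballpark.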
The Abominable Snowman
Pure Life tonsil tickle
(01-08-2013, 10:00 PM)

Originally Posted by Gahiggidy

That's not the Gamepad's video transmitter? I'm asking because I wonder if that is not a major cost of the console's bill of materials.

It is, but it's over regular (or modified) WiFi. The cost isn't exorbitant.

Look a couple dozen pages back.
Gahiggidy
My aunt & uncle run a Mom & Pop store, "The Gamecube Hut", and sold 80k WiiU within minutes of opening.
(01-08-2013, 10:06 PM)
I only see 12 pages to this thread.
