
Once and for all: Does Wii support bump/normal mapping?

SpokkX

Member
None of the released games (or pictures of upcoming games) seem to have advanced texture effects...

Is the graphics card in the Wii identical to the GC's card (at a higher clock speed, that is), or are there additional features?

The weird thing is that, for example, Star Fox Adventures used bump mapping in some caves...
 
The short answer is "yes" on bump mapping at least. It just takes some effort from developers to learn the TEV. Most are used to working in a PC environment, which the GC/Wii is quite different from. Rare pulled it off. Factor 5 pulled it off. I can't think of any others...but hopefully with the increased RAM in Wii it won't be quite as rare as it was on GC.
 

Oblivion

Fetishing muscular manly men in skintight hosery
Starfox Adventures and F5's games had bump mapping, so it's definitely possible on the Wii. Just needs a little more effort to program the TEV from what I understand.

edit: oh well, beaten. :/
 
Mario Sunshine had a few bump-mapped glasses. Also, some of the trophies in SSBM had bump maps.

It's possible, but it seems very hard to accomplish on the Cube/Wii hardware.
 

BlueTsunami

there is joy in sucking dick
The texture effect for the grass in Super Mario Galaxy actually looks really nice (in the videos that have been released).
 

Xavien

Member
http://www.gametrailers.com/player.php?id=10457&type=wmv&pl=game

Go to 00:25 on the player and observe yummy bump-mapping goodness. So yes, the Wii can do bump mapping; hell, even the GameCube could do bump mapping. Matt Casamassina should be shot for starting this ridiculous rumour.

He stated that the GameCube can't do shaders. This is true only in the sense that the term 'shaders' is reserved for hardware that runs programmable DirectX-style pipelines; since with DirectX shaders are required for bump mapping, the extension of this rumour is that it can't do bump mapping either. The GameCube (and subsequently the Wii) uses the TEV. Its results are virtually the same as shaders', but it requires different code, and unlike DirectX the SDK is not really fleshed out in comparison.

Factor 5, during the development of its GameCube games, really got to know the TEV system, which allowed them to release games using technology never thought possible on the Cube (normal mapping and such). However, since the SDK was insufficient for this stuff, they did it by writing code that interfaced directly with the hardware, getting the best they could out of the system.

So to conclude: the Wii can do bump and normal mapping; it's just that most developers are too inexperienced with the TEV to actually make it happen.
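For the curious, the TEV is basically a chain of configurable blend stages rather than a programmable shader unit. Here is a toy Python sketch of what a single stage computes, assuming the commonly documented combiner equation (the function name and the input values are illustrative, not real SDK code):

```python
def tev_stage(a, b, c, d, bias=0.0, scale=1.0):
    """One TEV colour-combiner stage, per channel.

    The commonly documented combiner equation is roughly:
        out = clamp((d + (1 - c) * a + c * b + bias) * scale)
    where a, b, c, d are selected per stage from textures,
    rasterised colours, constants, or previous stage outputs.
    """
    lerp = (1.0 - c) * a + c * b
    return max(0.0, min(1.0, (d + lerp + bias) * scale))

# Stages chain: one stage's output register feeds the next, which is
# how effects normally written as shader code get built on the Cube/Wii.
stage1 = tev_stage(a=0.8, b=0.2, c=0.5, d=0.0)      # blend two inputs
stage2 = tev_stage(a=stage1, b=1.0, c=0.25, d=0.1)  # modulate the result
```

The point is only the shape of the model: you get a fixed equation per stage and configure its inputs, instead of writing arbitrary per-pixel programs.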

P.S. For reference: the PS3 uses OpenGL, the Xbox 360 uses DirectX, and the Wii uses the TEV.
 
From what some techies have said... yes, it can, but it's not done the same way as on the Xbox. It was also very limited on the GameCube because of the GCN's 24MB of main RAM. The Wii, with an additional 64MB of RAM, should make that type of effect more workable.

Mario Galaxy has a lot of bump mapping.
 

LCGeek

formerly sane
Your question is really two questions.

Bump mapping - This is a technique that has many forms, like normal mapping, displacement mapping, parallax mapping, and environment-mapped bump mapping (EMBM). There are more, but covering the differences would require a lot of pics and time. Most devs are more familiar with normal and parallax mapping on console because those are what the DX architecture makes easiest, due to how it works.

Displacement > parallax > EMBM> normal

The Cube has done displacement and EMBM in its games; the easiest examples of both are Rebel Strike and Rogue Squadron. The Death Star has displacement mapping on it, and examples of EMBM can be found in a ton of levels in both games. The simple fact is the Cube sports the ability to do bump-mapping techniques; it's the developers/publishers, and Nintendo in part, that are to blame. Most blame lies with publishers for not giving developers the time or money to develop their own effects with the TEV. Nintendo gets some blame because an HLSL should now be a required part of their development package.

Normal mapping - The Cube can do it, but it lacks the RAM to see it done in real time. The Wii can do it, but why bother when EMBM takes fewer resources, looks better, and can be used for more effects. Hate to tell the Nintendo haters, but this method is one of the lowest bump-mapping techniques. Tell the Xbox to do EMBM instead and it will choke on it; think Halo 2 or any other game could still keep its shaky 30fps? Think again. The Cube can use the superior technique, EMBM, in greater amounts and faster than the Xbox. Until devs actually bother to learn how to achieve this with the TEV, don't expect games that look even half as good as RE, Zelda, F-Zero, and the Rogue Squadron games.
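To make the difference between the two techniques concrete, here is a toy Python sketch of their core math (purely illustrative; the names and numbers are made up, and real implementations run per pixel in hardware):

```python
def normal_map_diffuse(normal, light):
    """Normal mapping: the per-pixel normal comes from a texture, and
    lighting is computed against it (Lambert term: max(0, N . L))."""
    n_dot_l = sum(n * l for n, l in zip(normal, light))
    return max(0.0, n_dot_l)

def embm_lookup(uv, dudv, strength=0.05):
    """EMBM: a du/dv texture perturbs the coordinates used to sample
    an environment map, so the reflection wobbles over the bumps."""
    (u, v), (du, dv) = uv, dudv
    return (u + du * strength, v + dv * strength)

# An unperturbed normal facing the light gives full brightness:
flat = normal_map_diffuse((0.0, 0.0, 1.0), (0.0, 0.0, 1.0))
# A tilted normal fetched from the normal map darkens the pixel:
tilted = normal_map_diffuse((0.6, 0.0, 0.8), (0.0, 0.0, 1.0))
# A du/dv texel shifts where the environment map is sampled:
shifted = embm_lookup((0.5, 0.5), (0.2, -0.4))
```

Normal mapping needs a full lighting computation (and normal-map storage) per pixel, while EMBM only needs a texture-coordinate offset before a lookup, which is why it is cheaper on fixed-function hardware.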

Want more reading? Start with these:

How bump mapping works - Basics on BM, with emboss and EMBM comparisons.
The TEV on the Gamecube - If you want pixel shading or bump mapping on the Cube, it's through this.
Why RE4's lighting may be the GCN's best - Go to post 93 for EMBM vs. normal mapping and on from there.
 

Ravidrath

Member
Programmer friend told me the following a while ago, so it's a bit vague...

But he said the GPU was "3X as powerful" as the GC's. He said that it didn't support programmable shaders per se, but had texture effects hardwired into it that could do "85% of the things people use shaders for", and that they were faster because they were built into the hardware. I suspect that's the TEV the other posters are talking about, but it sounds like it's been expanded.
 
I think the next batch of games will look a lot better, and it'll spark a renewed interest in the Wii right around the time they are readily available to buy in stores.

A deliberate strategy on Nintendo's part??
 

lexi

Banned
Yeah, people seem to expect PSOne-ish graphics from the Wii; its graphical potential has in no way been reached or seen yet (except Mario Galaxy). Mario Galaxy is a good example of the caliber of graphics to expect within the next 18 months, IMO.

My only gripe with the Wii for what it is, is that the frame buffer is absolute shit for this day and age. WiiLite perhaps?
 

aeolist

Banned
For God's sake, Zelda has bump-mapping (or displacement or parallax or whatever) in spades. What kind of ****ing retarded question is this?
 

Oblivion

Fetishing muscular manly men in skintight hosery
Ravidrath said:
Programmer friend told me the following a while ago, so it's a bit vague...

But he said the GPU was "3X as powerful" as the GC's.

Lol, what? Where does this guy work, and are you sure he's not talking about Wii2?



One interesting thing I'd like to mention: if you just go by clock rates, the Wii GPU isn't that bad compared to the Xbox 360 and PS3 (about 1/2). Of course, by "isn't that bad" I mean that at least it doesn't seem to be orders of magnitude worse, as it is in other aspects like CPU (10 times, at least) and RAM (6-7 times). Course, we don't know the pixel pipelines, transistor counts, and all that other shit.
 

SpokkX

Member
thanks for the answers; that explains a lot about why there was such a big difference in graphics quality between companies on the GCN, too...
 

borghe

Loves the Greater Toronto Area
SpokkX said:
thanks for the answers; that explains a lot about why there was such a big difference in graphics quality between companies on the GCN, too...
yeah, if only Nintendo would introduce a section into the API that would give devs a better interface to program to... we might see a lot more games start using such effects. :(
 
I'm late, but I was going to say that if the DC supported it in the past, then I'm sure a lot of consoles do nowadays. (The DC never really used it at all, though, because of the performance hits back then.)
 

omaremad

Member
Ok, let's fix this mess once and for all.

PLEASE, NEOGAF, DON'T USE THE WORD "SHADERS" EVER AGAIN.

Ok, the DS has "shader support": it has a built-in per-pixel toon shader. So you might ask, why can't it do bump mapping?

Well, if you wire up special circuitry you can do anything really fast; that's why iPods don't need powerful CPUs, they have specialised decoding chips (same case with the DS toon shader). The downside is that hardwired circuits can only do a limited range of things.

However, if you have what is called "shader hardware" (the stuff in PCs, Xbox, PS3, etc.), it means you have the ability to write programs that utilise smaller, less specialised circuits on the GPU. These programs run independently of the CPU; they use opcodes or commands that are wired into the hardware, but the whole program is not wired in, it is executed. So it's not as fast as a pre-built circuit, but at the same time it is faster than computing everything in software mode without utilising any specialised hardware.

What the GameCube and Wii have is the TEV. The GameCube and Wii DO NOT have programmable GPUs, hence the comments about the lack of shaders. So YES, the GameCube and Wii don't have "shaders", because "shaders" instantly implies programmable GPUs. What they do have is the ability to do per-fragment (per-pixel) operations, and thus they can look like they have "shaders", even though the fragments are modified by specialised circuits rather than by circuits like the ones in a PC or Xbox; the sequence of opcodes does not run on the GPU (thus it is unprogrammable). The per-pixel commands are issued from the CPU, as is the whole "shading program", which explains the comments about the Wii/GC GPU being completely fixed-function. So Wii/GC "shading power" is governed by two things: how fast the CPU drives the "shaders", and the fact that most shaders take a lot of RAM since they use several textures. The Wii/GC CPU has specialised math circuits (dot, multiply, divide, etc.), so its math performance can match that of a completely programmable GPU; it's the GC's RAM that sucked.

So, in essence, the Wii and GC are shaded by the CPU. This means the shaders are very flexible and easily interfaced with the rest of the game, since both the "shader" and the game are written in the same language and run on the same CPU.

However, the Wii/GC will never have newer shaders, since the magic happens on the CPU (CPU code is as flexible as you want, but slower than GPU code on PC, Xbox, etc., even if you take advantage of the GC's math hardware on the CPU), which eventually calls a handful of fixed-function operations that modify the fragments on the GPU.
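The fixed-function vs. programmable distinction can be sketched like this (a toy Python illustration, not actual GX or shader code; the operation names and values are made up):

```python
# Fixed-function: the hardware offers a fixed menu of wired-in
# operations; the programmer can only *select* and chain them.
FIXED_OPS = {
    "modulate": lambda tex, col: tex * col,
    "add":      lambda tex, col: min(1.0, tex + col),
    "replace":  lambda tex, col: tex,
}

def fixed_function_pixel(op_name, tex, col):
    """Pick an operation from the fixed menu; the arithmetic itself
    is baked into silicon, so it is fast but limited."""
    return FIXED_OPS[op_name](tex, col)

def programmable_pixel(shader, tex, col):
    """Programmable hardware runs an arbitrary per-pixel program,
    trading some raw speed for flexibility."""
    return shader(tex, col)

fixed = fixed_function_pixel("modulate", 0.5, 0.8)
custom = programmable_pixel(lambda t, c: (t * c) ** 0.5, 0.5, 0.8)
```

The point is only the shape of the two APIs: the fixed pipeline exposes switches to flip, while the programmable one accepts code.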

I haven't proofread this, so I hope it makes sense.
 
Unlike the hardware created today, the Wii's capabilities are more than likely hardwired. It's going to take a dev that's dedicated to learning the strengths and weaknesses of Hollywood. It's harder to determine how much of a performance hit you'll get when implementing features like BM or normal mapping. F5 has an engine built for the Wii; I don't know if they have plans to use it. Maybe they'll put it on the market for other devs.
 
Fourth Storm said:
The short answer is "yes" on bump mapping at least. It just takes some effort from developers to learn the TEV. Most are used to working in a PC environment, which the GC/Wii is quite different from. Rare pulled it off. Factor 5 pulled it off. I can't think of any others...but hopefully with the increased RAM in Wii it won't be quite as rare as it was on GC.
Square did with FF:CC, IIRC... the crab boss's shell looked rather nice.
 

theBishop

Banned
Oblivion said:
One interesting thing I'd like to mention: if you just go by clock rates, the Wii GPU isn't that bad compared to the Xbox 360 and PS3 (about 1/2). Of course, by "isn't that bad" I mean that at least it doesn't seem to be orders of magnitude worse, as it is in other aspects like CPU (10 times, at least) and RAM (6-7 times). Course, we don't know the pixel pipelines, transistor counts, and all that other shit.

...which is exactly why you don't go by clockrates.

The Wii's GPU is clocked at 243 MHz. The PS3's GPU is 500 MHz (or 550, depending on who you believe).

Compared to the PC side, Nvidia's GeForce4 Ti 4200 (released Q3 2002) was 250 MHz.

Nvidia's just-released GeForce 8800 GTS is clocked at 500 MHz.

This is a screenshot from the demo Nvidia was touting on the Geforce4:
screenshot3.jpg


...and from the Geforce8:
screenshot1.jpg
 

Oblivion

Fetishing muscular manly men in skintight hosery
theBishop said:
...which is exactly why you don't go by clockrates.

Er, yes, I know that. I was just making a statement. (And I even mentioned in my original post that clock rates leave out other shit as well.)
 

Ravidrath

Member
Originally Posted by Ravidrath:
Programmer friend told me the following a while ago, so it's a bit vague...

But he said the GPU was "3X as powerful" as the GC's.

Lol, what? Where does this guy work, and are you sure he's not talking about Wii2?

Activision central tech. This was a while ago (9 months, I think?), but I'm pretty sure that's the multiplier he gave, and the guy knows his stuff, too. Based on how the conversation went and how he talked about the lack of shaders, I'd say this multiplier refers solely to polygon performance, but it could be some kind of approximated meta-number for the GPU as a whole. Still, not sure why that's so surprising - that still doesn't exactly put it in the range of PS3 or 360.

And please don't make the usual stupid comments about Activision, CoD3, etc. - not terribly productive, especially since he works on PS3 tech, his department didn't have anything to do with it, it was rushed, etc.


Nintendo can help with this stuff, somewhat. While they have extensive developer support now, it sounds like they need things like on-site training seminars and improved documentation to help developers learn to get more out of TEV, otherwise Japanese games are going to look far superior to Western games for quite some time.
 

PkunkFury

Member
omaremad said:
Ok, let's fix this mess once and for all.

PLEASE, NEOGAF, DON'T USE THE WORD "SHADERS" EVER AGAIN.

Ok, the DS has "shader support": it has a built-in per-pixel toon shader. So you might ask, why can't it do bump mapping?

Well, if you wire up special circuitry you can do anything really fast; that's why iPods don't need powerful CPUs, they have specialised decoding chips (same case with the DS toon shader). The downside is that hardwired circuits can only do a limited range of things.

However, if you have what is called "shader hardware" (the stuff in PCs, Xbox, PS3, etc.), it means you have the ability to write programs that utilise smaller, less specialised circuits on the GPU. These programs run independently of the CPU; they use opcodes or commands that are wired into the hardware, but the whole program is not wired in, it is executed. So it's not as fast as a pre-built circuit, but at the same time it is faster than computing everything in software mode without utilising any specialised hardware.

What the GameCube and Wii have is the TEV. The GameCube and Wii DO NOT have programmable GPUs, hence the comments about the lack of shaders. So YES, the GameCube and Wii don't have "shaders", because "shaders" instantly implies programmable GPUs. What they do have is the ability to do per-fragment (per-pixel) operations, and thus they can look like they have "shaders", even though the fragments are modified by specialised circuits rather than by circuits like the ones in a PC or Xbox; the sequence of opcodes does not run on the GPU (thus it is unprogrammable). The per-pixel commands are issued from the CPU, as is the whole "shading program", which explains the comments about the Wii/GC GPU being completely fixed-function. So Wii/GC "shading power" is governed by two things: how fast the CPU drives the "shaders", and the fact that most shaders take a lot of RAM since they use several textures. The Wii/GC CPU has specialised math circuits (dot, multiply, divide, etc.), so its math performance can match that of a completely programmable GPU; it's the GC's RAM that sucked.

So, in essence, the Wii and GC are shaded by the CPU. This means the shaders are very flexible and easily interfaced with the rest of the game, since both the "shader" and the game are written in the same language and run on the same CPU.

However, the Wii/GC will never have newer shaders, since the magic happens on the CPU (CPU code is as flexible as you want, but slower than GPU code on PC, Xbox, etc., even if you take advantage of the GC's math hardware on the CPU), which eventually calls a handful of fixed-function operations that modify the fragments on the GPU.

I haven't proofread this, so I hope it makes sense.

Thank you


Read this post people
 

Oblivion

Fetishing muscular manly men in skintight hosery
Ravidrath said:
Activision central tech. This was a while ago (9 months, I think?), but I'm pretty sure that's the multiplier he gave, and the guy knows his stuff, too. Based on how the conversation went and how he talked about the lack of shaders, I'd say this multiplier refers solely to polygon performance, but it could be some kind of approximated meta-number for the GPU as a whole. Still, not sure why that's so surprising - that still doesn't exactly put it in the range of PS3 or 360.

Ah, okay, thanks. And that number seems surprising because, even though it's not PS3/360 level, it's still more than what most people think (seriously, the common impression is that it's only a 50% overclock).

And please don't make the usual stupid comments about Activision, CoD3, etc. - not terribly productive, especially since he works on PS3 tech, his department didn't have anything to do with it, it was rushed, etc.

Oh, I wasn't going to. ;)

I know that has more to do with developer talent than anything.
 

Branduil

Member
NecroNesia and Rampage use bumpmapping.

That should tell you all you need to know about the relation between bumpmapping and how good a game looks.
 