
What is wrong with the IQ on Wii?

BenT

Member
Warm Machine / Bebpo: Thanks. I have noticed the pixelation a bit in the past and wondered. How about when, say, the Xbox 360 does 480p widescreen? Does that surpass 640x480? Does it vary by game?
 

dark10x

Digital Foundry pixel pusher
One thing this doesn't explain though is why games look so jaggy. I keep my games in 4:3 as, like you said, the 16:9 is fake and only hurts the image quality by pulling the same amount of pixels over a wider area. Yet games like Rayman are as jaggy as the jaggiest PS2 games :(
I haven't seen anything other than Excite Truck and Zelda, so it's difficult to say.

Both were in 16:9 mode and suffered from typically poor image quality. I figured they'd be a lot sharper in 4:3, though. I'd say the Gamecube-like dithering is the most disappointing aspect here. The games are jaggy, but having your colors ruined by 16-bit color really blows. Zelda is particularly bad due to its deeper color scheme.

At this point, I'm still not convinced that the graphics chip is even up to the same level as what was used in the XBOX. What exactly is driving developers to use 16-bit color anyways? It can't be a RAM issue and I see no reason why the Wii wouldn't be able to render a Gamecube-class title in 24-bit color.
 
JoshuaJSlone said:
That is anamorphic.
True, although I'm disappointed (but not surprised) that it's not rendered in 852x480 (WVGA).

As for the IQ, from playing Excite Truck, Zelda, and Wii Sports, so far I'd put it a small notch below how the Xbox looks on my HDTV.
 
You can really tell the colors are bleeding if you play Bonk's Adventure through the VC thing. The flashing background in one of the levels looked weird, and I guess this is why?

Is it the Screen Burn-in Reduction at work?
 
Bebpo said:
AFAIK none of the 480i/p games on any system show >640x480 pixels in 16:9 mode. The games just reshape the aspect ratio so when you stretch out the 640x480 image to fit a 16:9 screen everything is in the correct proportion.

But that means you have the same amount of pixels spread over a larger space. IMO it always looks worse than the 4:3 image with black bars on the side.

I played almost all the "16:9" PS2 games like Yakuza, Valkyrie Profile 2, etc... in 4:3 and I'm playing Zelda in 4:3 as well. I think the only game that I've gone with 16:9 was SoTC because the game was more about atmosphere than looking good resolution/pixel-wise.

I'll be playing Wii on a 4:3 CRT TV that does 16:9 as an "anamorphic squeeze", so the image actually takes up a smaller area of the screen in widescreen than in 4:3... The image is squeezed, not stretched, so the pixels are closer to each other, which gives a crisper image.

So in this case 16:9 looks better than 4:3, the only thing sacrificed is screen size.
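
A rough Python sketch of the pixel-density arithmetic behind this point (purely idealised geometry; the 640-column frame is the only number taken from the thread, everything else is an assumption for illustration):

FRAME_W = 640                       # horizontal pixels actually rendered
SCREEN_W_169 = 16 / 9               # 16:9 screen width, in units of screen height
PILLARBOX_W = 4 / 3                 # width used by a 4:3 picture on that screen

stretched = FRAME_W / SCREEN_W_169      # columns per unit of width, 16:9 stretch
pillarboxed = FRAME_W / PILLARBOX_W     # columns per unit of width, 4:3 with bars

print(f"16:9 stretched : {stretched:.0f} columns per screen-height")    # 360
print(f"4:3 pillarboxed: {pillarboxed:.0f} columns per screen-height")  # 480
print(f"density gain   : {pillarboxed / stretched:.2f}x")               # 1.33x

The same 640 columns are packed about a third more densely in the pillarboxed 4:3 presentation than when stretched across a 16:9 screen, which is the "spread over a larger space" effect described above.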
 
Bartman3010 said:
You can really tell the colors are bleeding if you play Bonk's Adventure through the VC thing. The flashing background in one of the levels looked weird, and I guess this is why?
I don't think so. According to old tech specs I've looked at, the resolution of most TG16 games should be the same as most NES and SNES games. However, the NES emulation on VC appears very clean and sharp, while the TG16 emulation doesn't seem nearly so. They must be resizing them differently.

dark10x said:
What exactly is driving developers to use 16-bit color anyways? It can't be a RAM issue and I see no reason why the Wii wouldn't be able to render a Gamecube-class title in 24-bit color.
Wii still uses the same frame buffer as GameCube, so though the system as a whole has had a RAM increase, they still must fit the final screen into the same amount of memory. Thus, resolution and color depth are limited in exactly the same way as GameCube. To be 24-bit color rather than 16- would use 50% more memory. Total bummer, yes.
 

Buggy Loop

Member
JoshuaJSlone said:
Wii still uses the same frame buffer as GameCube, so though the system as a whole has had a RAM increase, they still must fit the final screen into the same amount of memory. Thus, resolution and color depth are limited in exactly the same way as GameCube. To be 24-bit color rather than 16- would use 50% more memory. Total bummer, yes.

Wtf are you talking about? The majority, if not all, GameCube games had 24-bit color.

Frame buffer = 640 x 480 x 24 = 7,372,800 bits = 921,600 bytes = 0.921 MB
Z-buffer = 640 x 480 x 24 = 7,372,800 bits = 921,600 bytes = 0.921 MB

Total rendering memory required is 1,843,200 bytes (1.843 MB). Add the 1MB texture cache and you're left with 0.157 MB for... probably nothing.

32-bit though, that's not possible.
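
For reference, a small Python sketch of the same arithmetic, assuming the commonly quoted 3 MB of embedded memory (2 MB framebuffer plus 1 MB texture cache, as mentioned later in the thread); the exact split is an assumption, not a spec sheet:

WIDTH, HEIGHT = 640, 480
EMBEDDED_MB = 3.0          # assumed total embedded memory
TEXTURE_CACHE_MB = 1.0     # assumed texture cache share

def buffer_mb(bits_per_pixel: int) -> float:
    """Size of one full-screen buffer in (decimal) megabytes, to match the numbers above."""
    return WIDTH * HEIGHT * bits_per_pixel / 8 / 1_000_000

for depth in (16, 24):
    color_plus_z = 2 * buffer_mb(depth)            # color buffer + Z-buffer
    leftover = EMBEDDED_MB - TEXTURE_CACHE_MB - color_plus_z
    print(f"{depth}-bit: color+Z = {color_plus_z:.2f} MB, leftover = {leftover:.2f} MB")
# 16-bit: color+Z = 1.23 MB, leftover = 0.77 MB
# 24-bit: color+Z = 1.84 MB, leftover = 0.16 MB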
 

JRW

Member
I'm glad I'm not the only one that noticed... Zelda: Twilight Princess has heavy dithering via 480p, so I actually play it at 480i on purpose. Luckily I have a CRT HDTV, so it still looks good at 480i, especially with the dithering gone (using component cables).
 
If that's what dithering is then I guess Red Steel has some serious, serious dithering because some textures look like they're running in safe mode.
 

mrklaw

MrArseFace
If it was 858 wide then it wouldn't be 480p, it'd be some variant of VGA. Any resolution under 720p is sent to the TV as a 4:3 frame (even 16:9 games or DVDs) which you then stretch out horizontally to fit the screen and restore the correct aspect ratio.


As for the dithering, if it's along the edges between areas of strong contrast then that's most likely due to using composite. If I'm sitting at a normal distance from my TV (e.g. 10 ft) then it's not really noticeable, but if I stand closer it's *very* obvious.

I'm not rushing to get component cables, though, because my component switcher is full. I might leave it on composite or perhaps RGB SCART.
 
I got mine the other day, and already had the component cables. I'd agree that the IQ is pretty disappointing, though I haven't run the Cube through component on the big TV in a while. I remember F-Zero GX and SC2 looking great in widescreen VS widescreen PS2 games like Burnout 3 and SOTC. So I don't remember GC games being THIS jaggy; Wii Sports looks like a more colorful PS2 title :(

Damned cheap-ass Nintendo.

I haven't even tried Zelda yet, I've been too busy to give it enough time... Hopefully it won't be too bad, this talk of dithering is distressing me though.
 

dark10x

Digital Foundry pixel pusher
Wii still uses the same frame buffer as GameCube, so though the system as a whole has had a RAM increase, they still must fit the final screen into the same amount of memory. Thus, resolution and color depth are limited in exactly the same way as GameCube. To be 24-bit color rather than 16- would use 50% more memory. Total bummer, yes.
Thanks for the reply. I did not realize that the Wii was limited in that area. If I remember right, the GC had a total of ~3 MB (a 2 MB framebuffer plus a 1 MB texture cache). Is that correct? I wonder why they took this route with the Wii?

I remember F-Zero GX and SC2 looking great in widescreen
Those games were rendered in 24-bit color, I believe.
 
huzkee said:
Reading these threads makes me so happy to know that I'm not a graphix whore.
Lucky you, but it goes beyond graphix whoredom. I would have been much happier to have GC quality in 720p or with good antialiasing.
Aliasing for me can ruin great art and even mess up gameplay. Trying to see down the road in Burnout on PS2 was very difficult through the sea of shimmering pixels, and dodging traffic in Burnout is essential.
 

_bla_

Member
I think the problem is caused by developers implementing shader-like effects via multiple passes over the same pixel. That way dithering errors add up and become visible. Newer graphics chips use more than 8 bits per color channel internally, can very likely implement many of these effects in a single pass, and also have smarter dithering algorithms that do a pretty good job of preventing errors from adding up. The Wii has more power than the GC, so developers are using more multipass stuff where errors add up and dithering becomes visible.
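
A minimal Python sketch of that multipass idea (the 5-bit channel, the +3.3 "blend" per pass, and the pass count are made-up illustration values, not anything from a real title):

STEP = 255 / 31                       # size of one 5-bit level, in 8-bit units

def quantize(v: float) -> float:
    """Snap a value to the nearest representable 5-bit level."""
    return round(v / STEP) * STEP

true_value = 100.0 + 4 * 3.3          # what four +3.3 blending passes should give

single_pass = quantize(true_value)    # quantise only the final result
multi_pass = 100.0
for _ in range(4):                    # intermediate result written back each pass
    multi_pass = quantize(multi_pass + 3.3)

print("error after one pass   :", round(abs(single_pass - true_value), 2))   # ~1.96
print("error after four passes:", round(abs(multi_pass - true_value), 2))    # ~6.26
# After the first pass the small per-pass increment keeps getting rounded away,
# so the stored value never catches up: the quantisation error accumulates.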
 

Fatghost

Gas Guzzler
dark10x said:
I haven't seen anything other than Excite Truck and Zelda, so it's difficult to say.

Both were in 16:9 mode and suffered from typically poor image quality. I figured they'd be a lot sharper in 4:3, though. I'd say the Gamecube-like dithering is the most disappointing aspect here. The games are jaggy, but having your colors ruined by 16-bit color really blows. Zelda is particularly bad due to its deeper color scheme.

At this point, I'm still not convinced that the graphics chip is even up to the same level as what was used in the XBOX. What exactly is driving developers to use 16-bit color anyways? It can't be a RAM issue and I see no reason why the Wii wouldn't be able to render a Gamecube-class title in 24-bit color.


I thought it was already confirmed that the Wii's GPU isn't Xbox1 class. It can't handle the same level of effects and shaders, right?
 

painey

Member
I've been playing Wii on my HDTV and it looks awful (with component), and I tried it on my CRT downstairs today and it looked like a different console... colours were vibrant, text was sharp... I'm so gutted it's not an HD machine.
 

Durante

Member
Lots of people in this thread are confusing dithering and banding. Current Wii games have copious amounts of both, but they're still different artifacts and should be named correctly. Dithering mostly appears in connection with transparencies, while banding appears in big color gradients, especially dark ones. I made a picture to illustrate the difference:
[attached image: side-by-side example of dithering vs. banding]
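
For anyone who wants to see the difference in numbers rather than a picture, here's a rough Python sketch (5-bit quantisation standing in for a 16-bit framebuffer; the noise-based dither is just the simplest possible stand-in, not how any particular game does it):

import random

random.seed(1)
STEP = 255 / 31                                   # one 5-bit level in 8-bit units

def quantize(v: float) -> int:
    """Clamp to 0..255 and snap to the nearest 5-bit level."""
    return int(round(min(max(v, 0.0), 255.0) / STEP) * STEP)

gradient = [i * 255 / 63 for i in range(64)]      # smooth dark-to-light ramp

banded = [quantize(v) for v in gradient]          # plain quantisation -> hard steps
dithered = [quantize(v + random.uniform(-STEP / 2, STEP / 2)) for v in gradient]

print("banded  :", banded[:12])    # values sit in runs -> visible bands
print("dithered:", dithered[:12])  # same palette, but jittered -> no hard edges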
 

Johnny

Member
Someone's been spending too much time on their 360/PS3; you should really keep an SDTV around for all the other systems, including Wii.
 

dark10x

Digital Foundry pixel pusher
Johnny said:
Someone's been spending too much time on their 360/PS3; you should really keep an SDTV around for all the other systems, including Wii.
Do you really think that is a good solution? If someone has a main room for gaming/movie watching etc, you'd think they would want to use the Wii in that same room. I mean, the system is set to take over the market, so relegating it to some random TV in another room isn't really the best solution.

Thankfully, when selecting my TV, I made certain that 480i/480p output was of the highest possible quality. I feel bad for those who are using displays that handle those resolutions poorly.
 

Durante

Member
Johnny said:
Someone's been spending too much time on their 360/PS3
For me, it's even worse: I spend a lot of time on my PC, playing at 1920*1200 with 2xAA and 16xAF :lol

Luckily I'm used to Wii-level graphics from my PS2, but that's a bit disappointing considering the specs. I hope developers put a bit more effort into 2nd gen Wii titles.
 

Johnny

Member
Didn't Dreamcast render games at a higher resolution than 640x480 before actually outputting at that resolution?
 

Pellham

Banned
screw keeping an SDTV around for the Wii (although I do have a projector for that). I'll live with the jaggies and dithering.

I get copious amounts of that crap when I hook my PS2 up to my HD LCD anyway.
 

borghe

Loves the Greater Toronto Area
Johnny said:
Didn't Dreamcast render games at a higher resolution than 640x480 before actually outputting at that resolution?
No, they output at 640x480 and just did 4x (IIRC) supersampling antialiasing. All that means is that the frame is rendered at 4 times the resolution and then filtered back down to 640x480; that downsample is the antialiasing.
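
As a rough sketch of what supersampling amounts to in general (a generic 2x2 box-filter example, not a claim about the Dreamcast's exact pipeline):

def downsample_2x(image: list[list[float]]) -> list[list[float]]:
    """Average each 2x2 block of a high-res render into one output pixel."""
    out = []
    for y in range(0, len(image), 2):
        row = []
        for x in range(0, len(image[0]), 2):
            block = (image[y][x] + image[y][x + 1] +
                     image[y + 1][x] + image[y + 1][x + 1])
            row.append(block / 4)
        out.append(row)
    return out

# A hard black/white edge rendered at 2x the output resolution: after the
# downsample, the pixel straddling the edge becomes grey, which is exactly
# the softening that hides jaggies.
hi_res = [[0, 0, 255, 255],
          [0, 0, 255, 255],
          [0, 255, 255, 255],
          [0, 255, 255, 255]]
print(downsample_2x(hi_res))    # [[0.0, 255.0], [127.5, 255.0]]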

As for the image quality, IMHO people are making a mountain out of a molehill. In Zelda the only places where dithering was in any way noticeable were fogged or bloomed areas. For the vast majority of the game the graphics looked perfectly fine.

I would also be curious to see the memory specs for Hollywood from something official, because I can't believe that they would upgrade the video processor in any way, shape, or form and still leave the same amount of memory on it. Not saying they didn't, but I would definitely like to see something official before really believing it. Zelda we know is a GCN game, and Excite Truck we know was pretty rushed...
 

RuGalz

Member
LanceStern said:
I still don't get dithering. All those pictures showed me was a white circle on a black background

Any other explanations?
See those dots in the circles in the dithering example? It's a way to trick your eyes into seeing a smoother gradient without introducing more colors. Say you have only two colors, black and white: to make something appear kind of grayish without introducing a new color, you dither it (i.e. introduce some pattern of black on white to "dirty" it up).
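
A tiny Python sketch of that black-and-white case, using a simple 2x2 threshold matrix (the matrix values are just an illustrative choice):

# A 50% grey area drawn with only black and white: the ordered threshold
# pattern turns it into a checkerboard of dots that the eye averages back
# into grey from a distance.
THRESHOLDS = [[0.25, 0.75],
              [1.00, 0.50]]          # per-pixel thresholds, as fractions of white

def dither_pixel(x: int, y: int, brightness: float) -> int:
    """Return 1 (white) or 0 (black) for a pixel of the given brightness (0..1)."""
    return 1 if brightness >= THRESHOLDS[y % 2][x % 2] else 0

for y in range(4):
    print("".join("#" if dither_pixel(x, y, 0.5) else "." for x in range(8)))
# #.#.#.#.
# .#.#.#.#
# #.#.#.#.
# .#.#.#.#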

...because I can't believe that they would upgrade the video processor in any way, shape, or form and still leave the same amount of memory on it. Not saying they didn't...

I think you answered your own question. What does it take for people to believe Wii is GC 1.5? :lol Certainly, Nintendo isn't going to release a PR saying so... Or are you still believing what Nintendo said way back during GDC, before any real details of Wii were announced - something to the effect of "We won't be cutting edge, but we'll be comparable."?
 
RuGalz said:
See those dots in the circles in the dithering example? It's a way to trick your eyes into seeing a smoother gradient without introducing more colors. Say you have only two colors, black and white: to make something appear kind of grayish without introducing a new color, you dither it (i.e. introduce some pattern of black on white to "dirty" it up).

OK, that helps almost fully... So the topic creator is saying that Zelda is using too much dithering? Like instead of a solid orange sky, he sees a lot of dithering of, say, red and white?

Any Zelda pics, guys?
 

borghe

Loves the Greater Toronto Area
RuGalz said:
I think you answered your own question. What does it take for people to believe Wii is GC 1.5? :lol Certainly, Nintendo isn't going to release a PR saying so... Or are you still believing what Nintendo said way back during GDC, before any real details of Wii were announced - something to the effect of "We won't be cutting edge, but we'll be comparable."?
But we know for a fact that they DID upgrade the video processor; at the very least a die shrink, given that the chip itself is smaller. My point being, all GCN 1.5 aside, is that it IS a different chip than Flipper... so if they went through the effort of modifying the chip from Flipper (as opposed to just sticking Flipper right back in there, which is what some of you seem to be implying but we KNOW isn't true), why should one believe that they didn't even just double the 3MB framebuffer on board to allow for 24-bit depth?
 

dark10x

Digital Foundry pixel pusher
borghe said:
But we know for a fact that they DID upgrade the video processor; at the very least a die shrink, given that the chip itself is smaller. My point being, all GCN 1.5 aside, is that it IS a different chip than Flipper... so if they went through the effort of modifying the chip from Flipper (as opposed to just sticking Flipper right back in there, which is what some of you seem to be implying but we KNOW isn't true), why should one believe that they didn't even just double the 3MB framebuffer on board to allow for 24-bit depth?
They don't really seem to care about that, if you ask me.

There were some incredible looking games running in 24-bit color on GC (Rebel Strike, Metroid Prime 1 & 2, Star Fox Adventures, and F-Zero GX among others - games which also ran at 60 fps BTW). The majority of Nintendo's games ran in 16-bit color, however. Not exactly proof of anything, I know, but it does suggest that color depth isn't really all that important to Nintendo (visuals in general do not seem to be).

The XBOX360 only offers 10 MB of framebuffer memory, by the way (there are ways around it, though). :p
 

dark10x

Digital Foundry pixel pusher
Adagio said:
I love how Star Fox Adventures runs in 24-bit color...and Wind Waker and Twilight Princess do not...
SFA also put the TEV to serious use with some nice shader effects (the grass shader would have been AWESOME for TP's open fields), made heavy use of data streaming (no load points), featured higher-resolution textures than both Zelda games, made use of some impressive shadow effects (soft environment shadows cast over characters), and ran at 60 fps. This was Rare's first effort on the GC back in 2002, and they did an incredible job.

The art direction certainly isn't up to Nintendo's standards (as usual), but their engine was fantastic.
 

borghe

Loves the Greater Toronto Area
dark10x said:
The XBOX360 only offers 10 MB of framebuffer memory, by the way (there are ways around it, though). :p
Well, you only need a framebuffer as big as the frame you can render :p

Like I (and you) said... there is no clear evidence that the Wii is only capable of 16-bit color for most games, just like we have no proof (that I know of) that Hollywood only has 3MB of VRAM. Hopefully we will see an increase in 24-bit games. In the meantime I really do feel like this is being made into a bigger deal than it actually is. I really feel that anyone who objectively calls Zelda "ugly" is talking through their ass, and all of the comments about the system looking better through composite than component are misguided, to say the least. But I digress. This is clearly an area where people won't agree: you will either be outraged by the color depth and aliasing or you won't care. I just more or less wanted to voice my opinion on the subject and bring to light that we don't really know if the Wii is limited in the same way the GCN was in this regard. Only when we have a larger selection of games will we be able to make a better assessment.
 

Shapermc

Member
Matt said:
And I haven't ever used another system on this TV, but I don't get them through cable, antenna, or DVD (via HDMI).
I recommend using your warranty to have a tech come out and fix this. I had some weird issues with the component input on my TV; a tech came out, adjusted the index color, and it cleared right up.

Also, you guys like to complain. I understand this topic is old, but really, did you all forget that most of TP was designed with the GC in mind?
 

sprocket

Banned
It's because most of those games are really just GC games, using GC APIs.

Go look at RED STEEL; it's the only launch game I know of that doesn't use dithered blending and has full-screen AA.
 

dark10x

Digital Foundry pixel pusher
sprocket said:
It's because most of those games are really just GC games, using GC APIs.

Go look at RED STEEL; it's the only launch game I know of that doesn't use dithered blending and has full-screen AA.
Red Steel looks terrible, though. As I pointed out, there were games on GC (which look much better than Red Steel) which ran in 24-bit color.
 

sprocket

Banned
dark10x said:
Red Steel looks terrible, though. As I pointed out, there were games on GC (which look much better than Red Steel) which ran in 24-bit color.


You may not agree with the art style, but technically RS is far more advanced than any of the other launch games. Like I said, it didn't use the old GC dither blending and it had full-screen AA. It also has the best bloom shader I have seen yet.
 

dark10x

Digital Foundry pixel pusher
sprocket said:
You may not agree with the art style, but technically RS is far more advanced than any of the other launch games. Like I said, it didn't use the old GC dither blending and it had full-screen AA. It also has the best bloom shader I have seen yet.
I dunno, there are portions which looked nice, but the majority of the game is filled with incredibly bland, flat surfaces, poor textures, framerate issues, and typical rough image quality (are you sure about AA? It sure doesn't look like it uses any to me)...

I've only played it for a bit at a friend's house (on a Panasonic plasma), however, so perhaps there are more impressive pieces that I haven't seen.
 
Red Steel has Full Screen AA??? I'm getting some of the worst jaggies I've ever seen while playing that game!

And the banding is horrendous!

And the framerate/slowdown is atrocious!

I was chalking most of the bashing on this title up to people just being haters, but after actually playing it, I'm blown away at the incompetence of the developers. And I'm not even going to go into the control flaws and absolutely laughable AI.
 

RuGalz

Member
borghe said:
But we know for a fact that they DID upgrade the video processor; at the very least a die shrink, given that the chip itself is smaller. My point being, all GCN 1.5 aside, is that it IS a different chip than Flipper... so if they went through the effort of modifying the chip from Flipper (as opposed to just sticking Flipper right back in there, which is what some of you seem to be implying but we KNOW isn't true), ...

They did need to make some adjustments to the chip because of other minor changes to the overall system besides the clock speed. Slapping on a different name doesn't automatically make it a big upgrade. :p

why should one believe that they didn't even just double the 3MB framebuffer on board to allow for 24-bit depth?
Because we know the system has 88MB of total memory. 64MB was added for Wii, so you are left with 24MB, which is the same as the GC. Of course you could argue that they might have shuffled that 24MB around, but that would break backward compatibility, so no, they didn't shuffle it around.

Edit: I'm sure some games will stop using this dithering crap at some point, but it all depends on what they can or are willing to trade off.
 
The 88MB does not include the 3MB of embedded memory on Hollywood - and we don't know for sure if it is really 3MB or if they actually added more. But the size of Hollywood in comparison to Flipper is a giveaway that something extra is going on there, whether it be beefier TEVs, more eDRAM, or whatever.
 

RuGalz

Member
Fourth Storm said:
The 88MB does not include the 3MB of embedded memory on Hollywood - and we don't know for sure if it is really 3MB or if they actually added more. But the size of Hollywood in comparison to Flipper is a giveaway that something extra is going on there, whether it be beefier TEVs, more eDRAM, or whatever.
You win. I'm bailing out before losing my job. :)
 

BenT

Member
mrklaw said:
If it was 858 wide then it wouldn't be 480p, it'd be some variant of VGA. Any resolution under 720p is sent to the TV as a 4:3 frame (even 16:9 games or DVDs) which you then stretch out horizontally to fit the screen and restore the correct aspect ratio.
Is that so? Thanks for clearing this up.
 