Pachinko said:
so what, the much touted 16:9 mode for Wii zelda isn't even anamorphic? It's just skinny 4:3 stretched? ugh.

That is anamorphic.
dark10x said:
I haven't seen anything other than Excite Truck and Zelda, so it's difficult to say.

One thing this doesn't explain though is why games look so jaggy. I keep my games in 4:3 as, like you said, the 16:9 is fake and only hurts the image quality by pulling the same amount of pixels over a wider area. Yet games like Rayman are as jaggy as the jaggiest PS2 games.
DSN2K said:
just gonna use an RGB scart for the time being until I can get a VGA option.
JoshuaJSlone said:
That is anamorphic.

True, although I'm disappointed (but not surprised) that it's not rendered in 852x480 (WVGA).
Bebpo said:
AFAIK none of the 480i/p games on any system show >640x480 pixels in 16:9 mode. The games just reshape the aspect ratio so when you stretch out the 640x480 image to fit a 16:9 screen everything is in the correct proportion.
But that means you have the same amount of pixels spread over a larger space. IMO it always looks worse than the 4:3 image with black bars on the side.
I played almost all the "16:9" PS2 games like Yakuza, Valkyrie Profile 2, etc... in 4:3 and I'm playing Zelda in 4:3 as well. I think the only game that I've gone with 16:9 was SoTC because the game was more about atmosphere than looking good resolution/pixel-wise.
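In other words, anamorphic widescreen changes the shape of the displayed pixels, not their count. A quick illustrative check in plain Python (the function and the ratio arithmetic are mine, just a sketch of the idea):

```python
def pixel_aspect_ratio(display_w, display_h, buffer_w, buffer_h):
    """How much each buffer pixel must be stretched horizontally
    so a buffer_w x buffer_h image fills a display_w:display_h screen."""
    return (display_w / display_h) / (buffer_w / buffer_h)

# The same 640x480 buffer shown on a 4:3 vs a 16:9 screen:
print(pixel_aspect_ratio(4, 3, 640, 480))   # 1.0 -> square pixels, normal 4:3
print(pixel_aspect_ratio(16, 9, 640, 480))  # ~1.333 -> each pixel drawn ~33% wider
```

Either way the console only produces 640x480 worth of detail; the 16:9 mode just tells the game to squeeze its geometry so the TV's stretch restores correct proportions.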
Bartman3010 said:
You can really tell the colors are bleeding if you play Bonk's Adventure through the VC thing. The flashing background in one of the levels looked weird, and I guess this is why?

I don't think so. According to old tech specs I've looked at, the resolution of most TG16 games should be the same as most NES and SNES games. However, the NES emulation on VC appears very clean and sharp, while the TG16 emulation doesn't seem nearly so. They must be resizing them differently.
dark10x said:
What exactly is driving developers to use 16-bit color anyways? It can't be a ram issue and I see no reason why the Wii wouldn't be able to render a Gamecube class title in 24-bit color.

Wii still uses the same frame buffer as GameCube, so though the system as a whole has had a RAM increase, they still must fit the final screen into the same amount of memory. Thus, resolution and color depth are limited in exactly the same way as GameCube. To be 24-bit color rather than 16- would use 50% more memory. Total bummer, yes.
JoshuaJSlone said:
Wii still uses the same frame buffer as GameCube, so though the system as a whole has had a RAM increase, they still must fit the final screen into the same amount of memory. Thus, resolution and color depth are limited in exactly the same way as GameCube. To be 24-bit color rather than 16- would use 50% more memory. Total bummer, yes.

Thanks for the reply. I did not realize that the Wii was limited in that area. If I remember right, the GC had a total of ~3MB (2MB frame buffer cache plus 1MB texture cache). Is that correct? I wonder why they took this route with the Wii?
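The 50% figure checks out with simple arithmetic. A rough sketch in Python (640x480 and tightly packed byte sizes are assumptions; the real embedded framebuffer also holds a Z-buffer and uses its own tile formats):

```python
def framebuffer_bytes(width, height, bits_per_pixel):
    """Bytes needed for a single color buffer at the given depth."""
    return width * height * bits_per_pixel // 8

fb16 = framebuffer_bytes(640, 480, 16)  # 16-bit color
fb24 = framebuffer_bytes(640, 480, 24)  # 24-bit color

print(fb16)         # 614400 bytes, ~0.59 MiB
print(fb24)         # 921600 bytes, ~0.88 MiB
print(fb24 / fb16)  # 1.5 -> 24-bit color needs 50% more memory
```

With only a couple of MB of embedded framebuffer to hold color plus Z, that extra ~0.3 MiB per frame is exactly the kind of budget pressure that pushes developers back to 16-bit.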
I remember F-Zero GX and SC2 looking great in widescreen

Those games were rendered in 24-bit color, I believe.
dark10x said:
Those games were rendered in 24-bit color, I believe.
huzkee said:
Reading these threads make me so happy to know that I'm not a graphix whore.

Lucky you, but it goes beyond graphix whoredom. I would have been much happier to have GC quality in 720p or with good antialiasing.
BrodiemanTTR said:
So if those games could, why can't more or all do the same?

Obviously, it depends on the memory requirements per game.
dark10x said:
I haven't seen anything other than Excite Truck and Zelda, so it's difficult to say.
Both were in 16:9 mode and suffered from typically poor image quality. I figured they'd be a lot sharper in 4:3, though. I'd say the Gamecube-like dithering is the most disappointing aspect here. The games are jaggy, but having your colors ruined by 16-bit color really blows. Zelda is particularly bad due to its deeper color scheme.
At this point, I'm still not convinced that the graphics chip is even up to the same level as what was used in the XBOX. What exactly is driving developers to use 16-bit color anyways? It can't be a ram issue and I see no reason why the Wii wouldn't be able to render a Gamecube class title in 24-bit color.
Johnny said:
Someone's been spending too much time on their 360/PS3, you should really keep an SDTV around for all the other systems, including Wii.

Do you really think that is a good solution? If someone has a main room for gaming/movie watching etc, you'd think they would want to use the Wii in that same room. I mean, the system is set to take over the market, so relegating it to some random TV in another room isn't really the best solution.
Johnny said:
Someone's been spending too much time on their 360/PS3

For me, it's even worse: I spend a lot of time on my PC, playing at 1920*1200 with 2xAA and 16xAF :lol
Johnny said:
Didn't Dreamcast render games at a higher resolution than 640x480 before actually outputting at that resolution?

no, they output at 640x480 and just did 4x (iirc) supersampling antialiasing. all that means is that the frame is rendered internally at 4 times the resolution and then averaged back down to 640x480 before it's sent to the screen.
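As a toy sketch of what 4x supersampling means: render at twice the width and height (4 samples per final pixel), then box-filter down. This is purely illustrative Python, not actual Dreamcast hardware behavior:

```python
def downsample_4x(hi_res):
    """Box-filter a 2x-by-2x supersampled grayscale image
    (4 samples per output pixel) down to the final resolution.
    `hi_res` is a list of rows of 0-255 values."""
    h, w = len(hi_res), len(hi_res[0])
    out = []
    for y in range(0, h, 2):
        row = []
        for x in range(0, w, 2):
            total = (hi_res[y][x] + hi_res[y][x + 1] +
                     hi_res[y + 1][x] + hi_res[y + 1][x + 1])
            row.append(total // 4)  # average the 4 samples
        out.append(row)
    return out

# A hard black/white edge rendered at 2x resolution...
hi = [[0, 0, 255, 255],
      [0, 0, 255, 255],
      [0, 255, 255, 255],
      [0, 255, 255, 255]]
# ...comes out with an intermediate shade along the jagged edge.
print(downsample_4x(hi))  # [[0, 255], [127, 255]]
```

The intermediate 127 value on the stair-step is the whole trick: the final buffer is still 640x480, but edges get blended shades instead of hard jaggies.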
LanceStern said:
I still don't get dithering. All those pictures showed me was a white circle on a black background

See those dots in the circles? That's the dithering. It's a way to trick your eyes into seeing a smoother gradient without introducing more colors. Say you have only 2 colors, black and white: without introducing a new color, you can make something appear kind of grayish by dithering it (i.e. introducing some pattern of black on white to "dirty" it up).
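One standard way to do this is ordered (Bayer) dithering. A minimal black-and-white sketch in Python; the 2x2 threshold matrix and value ranges here are illustrative, not what GC/Wii hardware actually uses:

```python
# 2x2 Bayer threshold matrix, scaled to the 0-255 range.
BAYER2 = [[0, 128],
          [192, 64]]

def dither_bw(gray):
    """Ordered-dither a grayscale image (rows of 0-255 values)
    down to pure black (0) and white (255)."""
    out = []
    for y, row in enumerate(gray):
        out.append([255 if v > BAYER2[y % 2][x % 2] else 0
                    for x, v in enumerate(row)])
    return out

# A flat mid-gray region: no gray pixel survives, but half the
# pixels flip to white, so from a distance it still reads as gray.
flat = [[127] * 4 for _ in range(4)]
for row in dither_bw(flat):
    print(row)  # checkerboard of 0s and 255s
```

That checkerboard of dots is exactly the speckle pattern visible in 16-bit GC/Wii games: each pixel is a legal color, and your eye does the averaging.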
Any other explanations?
...because I can't believe that they would upgrade the video processor in any way shape or form at all and still leave the same amount of memory on it. not saying they didn't...
RuGalz said:
See those dots in the circles in dithering? It's a way to trick your eyes to make gradient smoother without introducing more colors. Say you have only 2 colors black and white, without introducing a new color to make something appear to be kind of grayish you dither it (i.e introduce some pattern of black on white to "dirty" it up).
RuGalz said:
I think you answered your own question. What does it take for people to believe Wii is GC 1.5? :lol Certainly, Nintendo isn't going to release a PR saying so... Or are you still believing what Nintendo said way back during GDC before any real details of Wii was announced - something to the effect of "We won't be cutting edge but we'll be comparable.".

but we know for a fact that they DID upgrade the video processor. at the very least a die shrink given that the chip itself is smaller. my point being, all GCN 1.5 aside, is that it IS a different chip than flipper.. so if they go through the effort to modify the chip from flipper (as opposed to just sticking flipper right back in there, which is what some of you seem to be implying but we KNOW isn't true), why is one to believe that they didn't even just double the 3MB framebuffer on board to allow for 24-bit depth?
borghe said:
but we know for a fact that they DID upgrade the video processor. at the very least a die shrink given that the chip itself is smaller. my point being, all GCN 1.5 aside, is that it IS a different chip than flipper.. so if they go through the effort to modify the chip from flipper (as opposed to just sticking flipper right back in there, which is what some of you seem to be implying but we KNOW isn't true), why is one to believe that they didn't even just double the 3MB framebuffer on board to allow for 24-bit depth?

They don't really seem to care about that, if you ask me.
Adagio said:
I love how Star Fox Adventures runs in 24-bit color...and Wind Waker and Twilight Princess do not...

SFA also puts the TEV to serious use with some nice shader effects (the grass shader would have been AWESOME for TP's open fields), made heavy usage of data streaming (no load points), featured higher resolution textures than both Zelda games, made use of some impressive shadow effects (soft environment shadows casting over characters), and ran at 60 fps. This was Rare's first effort on the GC back in 2002 and they did an incredible job.
dark10x said:
The XBOX360 only offers 10mb of framebuffer memory, by the way (there are ways around it, though).

well, you only need a framebuffer as big as the frames you can render
Matt said:
And I haven't ever used another system on this TV, but I don't get them through cable, antenna, or DVD (via HDMI).

I recommend using your warranty to have a tech come out and fix this. I had some weird issues with component input for my TV, had a tech come out and he adjusted the index color and it cleared right up.
sprocket said:
Its because most of those games are really just GC games, using GC APIs. Go look at RED STEEL, it is the only launch game I know of that doesn't use dithered blending and has full screen AA.

Red Steel looks terrible, though. As I pointed out, there were games on GC (which look much better than Red Steel) which ran in 24-bit color.
dark10x said:
Red Steel looks terrible, though. As I pointed out, there were games on GC (which look much better than Red Steel) which ran in 24-bit color.
sprocket said:
You may not agree with the art style but technically RS is far more advanced than any of the other launch games. Like I said it didn't use the old GC dither blending and it had full screen AA. It also has the best bloom shader I have seen yet.

I dunno, there are portions which looked nice, but the majority of the game is filled with incredibly bland, flat surfaces, poor textures, framerate issues, and typical rough image quality (are you sure about AA? It sure doesn't look like it uses any to me)...
borghe said:
but we know for a fact that they DID upgrade the video processor. at the very least a die shrink given that the chip itself is smaller. my point being, all GCN 1.5 aside, is that it IS a different chip than flipper.. so if they go through the effort to modify the chip from flipper (as opposed to just sticking flipper right back in there, which is what some of you seem to be implying but we KNOW isn't true), why is one to believe that they didn't even just double the 3MB framebuffer on board to allow for 24-bit depth?

Because we know the system has 88MB of total memory. 64MB was added for Wii, so you are left with 24MB of memory, which is the same as GC. Of course you can argue that they might have shuffled that 24MB around, but that'd break backward compatibility, so no, they didn't shuffle it around.
Fourth Storm said:
The 88 does not include the 3 MBs of embedded memory on Hollywood - and we don't know for sure if it is really 3MB or if they did actually add more. But the size of Hollywood in comparison to flipper is a giveaway that something extra is going on there, whether it be beefier TEVs, more EDRAM or whatever.

You win. I'm bailing out before losing my job.
mrklaw said:
If it was 858 wide then it wouldn't be 480p, it'd be some variant of VGA. Any resolution under 720p is sent to the TV as a 4:3 frame (even 16:9 games or DVDs) which you then stretch out horizontally to fit the screen and restore the correct aspect ratio.

Is that so? Thanks for clearing this up.