
Which Ubisoft branch is developing Far Cry Wii? They stink...

I thought shouting "downgrade!" or saying "X console sucks" was not allowed. A mod even opened a thread to warn us.
 
he didn't say "downgrade," and that's the only thing the thread the mod opened said was disallowed.

If he said something sucked that didn't actually suck, then we'd totally ban Drinky. Totally.
 
92089020060209screen001qs1.jpg
 
LanceStern said:
Why are some of the Xbox specs higher than Wii's? Come on, Nintendo!! I hope the GPU is more important than the CPU...

What are 360's specs if you don't mind me asking...

The Gamecube running at 485MHz is comparable with the Xbox at 733MHz because of architectural differences.

So the Xbox specs are not higher than Wii's; a PowerPC CPU running at 729MHz is MUCH better than a Wintel running at 733MHz.

If someone more technically skilled wants to, he/she could explain why and how.
 
Mithos said:
a PowerPC CPU running at 729MHz is slightly better than a Wintel running at 733MHz.
Fixed.

Not sure why you called it a "Wintel," though; that term traditionally describes an Intel machine running a full desktop Windows operating system (i.e., not the Xbox).
 
loosus said:
Fixed.

Not sure why you called it a "Wintel," though; that term traditionally describes an Intel machine running a full desktop Windows operating system (i.e., not the Xbox).

I'd say you broke it.

As for why I said Wintel: I wrote it in two places and edited, and it seems I missed changing one of them to Xbox.
 
Red Steel is being made from the ground up on the Wii, isn't it?

I'm guessing that they are porting Far Cry over first, and then tweaking it.

Secondly, don't compare it to Xbox...very few Wii games will compare favourably against the Xbox. The focus is on the gameplay, not the graphics.
 
loosus said:
Not when the Intel in question's a budget Celeron used for gaming-specific apps (AI, physics, etc). GameCube's Gekko overall was a bit more effective, and Wii's Super Gekko totally destroys it... again, Xbox 1's sole advantage here seems to be in NV2A's more programmable/documented shaders. Overall though, the Wii architecture spanks Xbox (as it should, coming 5 years later).
 
jarrod said:
Not when the Intel in question's a budget Celeron used for gaming-specific apps (AI, physics, etc). GameCube's Gekko overall was a bit more effective, and Wii's Super Gekko totally destroys it... again, Xbox 1's sole advantage here seems to be in NV2A's more programmable/documented shaders. Overall though, the Wii architecture spanks Xbox (as it should, coming 5 years later).
Spanks? Destroys? :lol Ah ha, you're so transparent. They're so ****ing in the exact same league, with Wii being insignificantly better.

Then again, you think everything Nintendo does is God's gift to Earth while everyone else is complete shit, so I'm not exactly surprised that you're spreading "opinions" like that.
 
Let's give it up for current-gen tech with a gimmick controller. I give Wii 3 months before older people are tired of the controller and the graphics and move on to 360/PS3. Bad thing is I'll prob get one for Smash Brothers :(.
 
JDSN said:
It looks better in this mag.
57193_ccf2909200600004xr7_122_363lo.jpg

57200_ccf2909200600002ua9_122_364lo.jpg

Because magazine images are small. All Wii images look good at a small size. I'm afraid it will look horrible on an HDTV. Good thing I didn't throw away my old TV.
 
loosus said:
Spanks? Destroys? :lol Ah ha, you're so transparent. They're so ****ing in the exact same league, with Wii being insignificantly better.

Then again, you think everything Nintendo does is God's gift to Earth while everyone else is complete shit, so I'm not exactly surprised that you're spreading "opinions" like that.
I dunno... throw a highly customized 64-bit PPC up against an off-the-shelf mobile Celeron, and I'd wager you're going to see an appreciable difference here.

And if we're now using "bias" to judge credibility, I'd say yours may be just slightly better than mine... really they're so ****ing in the exact same league, with yours being insignificantly better. ;)
 
jarrod said:
I dunno... throw a highly customized 64-bit PPC up against an off-the-shelf mobile Celeron, and I'd wager you're going to see an appreciable difference here.

And if we're now using "bias" to judge credibility, I'd say yours may be just slightly better than mine... really they're so ****ing in the exact same league, with yours being insignificantly better. ;)
I'll buy that for a dollar.
 
jarrod said:
Not when the Intel in question's a budget Celeron used for gaming-specific apps (AI, physics, etc). GameCube's Gekko overall was a bit more effective, and Wii's Super Gekko totally destroys it... again, Xbox 1's sole advantage here seems to be in NV2A's more programmable/documented shaders. Overall though, the Wii architecture spanks Xbox (as it should, coming 5 years later).

Yes, that's such a great accomplishment, 5 years later -_-
 
Maybe, just MAYBE, the gameplay in Far Cry is really good.

Then again, most at this board put that on the low end of the totem pole when it comes to gaming.

I mean if it doesn't use shaders up the butt or doesn't have anti-aliasing up the wazoo (never noticed jaggies much when playing games myself), it's a horrible effort.

I love when a thread with screens and GAMEPLAY info is made and no one mentions the latter at all. Sums up a lot at this board.
 

r_fullbright 1

Maybe, just MAYBE, the gameplay in Far Cry is really good.

Then again, most at this board put that on the low end of the totem pole when it comes to gaming.

I mean if it doesn't use shaders up the butt or doesn't have anti-aliasing up the wazoo (never noticed jaggies much when playing games myself), it's a horrible effort.

I love when a thread with screens and GAMEPLAY info is made and no one mentions the latter at all. Sums up a lot at this board.

Do you not read threads about released games or something?
 
TheGreatMightyPoo said:
Maybe, just MAYBE, the gameplay in Far Cry is really good.

Then again, most at this board put that on the low end of the totem pole when it comes to gaming.

I mean if it doesn't use shaders up the butt or doesn't have anti-aliasing up the wazoo (never noticed jaggies much when playing games myself), it's a horrible effort.

I love when a thread with screens and GAMEPLAY info is made and no one mentions the latter at all. Sums up a lot at this board.

We all know it's going to suck.
 
Aeon712 said:
Let's give it up for current-gen tech with a gimmick controller. I give Wii 3 months before older people are tired of the controller and the graphics and move on to 360/PS3. Bad thing is I'll prob get one for Smash Brothers :(.

Funny that I only hear that trendy word "gimmick" thrown at the Wii from people that haven't played it.

Hell, even those that have played Madden (which many thought the controller wouldn't work for) say it is definitely not "gimmicky".
 
PC FarCry: Good

Xbox FarCry: Great - One of the best FPS games on the console

360 FarCry: Good - Even though they updated the graphics on the vehicles and water and put in a longer draw distance no one seemed to notice. Too bad they messed up the sensitivity on the controls.

Wii FarCry: ???

When I only owned a Gamecube I would have loved to have been able to play the game.
 
Mithos said:
The Gamecube running at 485MHz is comparable with the Xbox at 733MHz because of architectural differences.

So the Xbox specs are not higher than Wii's; a PowerPC CPU running at 729MHz is MUCH better than a Wintel running at 733MHz.

If someone more technically skilled wants to, he/she could explain why and how.
Pretty much. I mean, people are forgetting that some Cube titles were able to compete with Xbox's best efforts. That is with a clock speed difference of over 200MHz, plus who knows what other instructional improvements. It's only logical to assume that when that 485MHz number is increased to 729, you're going to be able to surpass the Xbox efforts on paper. And this, of course, isn't even taking into account the superior memory architecture/speeds and GPU speeds/improvements that Wii possesses.

Some developers will, of course, do better than others. That is to be expected. I guarantee that if Rare, Factor 5, Capcom, Nintendo themselves, or Square-Enix let loose on the system, it would blow away any Xbox efforts. Xbox gets bailed out because it uses the DirectX API, though..... GC/Wii does not have that luxury.
 
PhoenixDark said:
Yes, that's such a great accomplishment, 5 years later -_-
Well no, not really. Nintendo just polished up their GC design to its most capable... the Wii hardware is likely along the lines of what we'd have gotten in 2001 if Nintendo were crafting a $299 loss-taker like their rivals. As is though, the GC architecture is a wonderfully clean, insanely efficient design. If you're going to take a last-gen architecture to spruce up, it's pretty easily the best candidate... and given the R&D focus on extremely low emission and energy drain, I sort of think the Wii chipset likely started out as a handheld design. It's more similar to PSP's chipset in focus, and likely cost around the same to develop.

The advances between Wii and GameCube in terms of chipset are about equal to the advances from GB to GBC actually.
 
Amir0x said:
he didn't say "downgrade," and that's the only thing the thread the mod opened said was disallowed.

If he said something sucked that didn't actually suck, then we'd totally ban Drinky. Totally.

That's low...
 
Fight for Freeform said:
The focus is on the gameplay, not the graphics.
Entering next-gen gives us the right to bitch about the graphics on the new platforms.
I'm curious to see how long people will stand for this if devs continue to do this for the next 2-3 years on the Wii.
 
LanceStern said:
Why are some of the Xbox specs higher than Wii's? Come on, Nintendo!! I hope the GPU is more important than the CPU...

Totally different architectures. In real world terms Wii is at least twice as powerful as Xbox. But the architecture isn't PC-centric like the Xbox, which is why developers need to put in special effort to develop nice looking games for it. Same thing happened with the Gamecube.
 
gutter_trash said:
I'm curious to see how long people will stand for this if devs continue to do this for the next 2-3 years on the Wii.
Shouldn't GC/Xbox ports be running dry by then? Of course then we'll probably just get direct PS2/PSP ports from EA/Ubi/Activision/THQ/2K/etc. :lol

The only place we're ever gonna see high end, cutting edge Wii visuals that actually push the hardware from 3rd parties is in Japan. Just like GBA/DS.
 
Oh man, at least have the physics engine intact. It looks pretty bad as it is; the last thing I need is pre-animated death sequences.
 
Warm Machine said:
PC FarCry: Good

Xbox FarCry: Great - One of the best FPS games on the console

360 FarCry: Good - Even though they updated the graphics on the vehicles and water and put in a longer draw distance no one seemed to notice. Too bad they messed up the sensitivity on the controls.
Wii FarCry: ???

When I only owned a Gamecube I would have loved to have been able to play the game.

Just making sure... you do know there's a patch that fixes the controls in multiplayer mode?
 
Mr. TV Goggles said:
crystal20shockedlx5.jpg

SOUND THE TROLL ALARM!!!!
I'm sorry, I didn't know it was trolling to compare similar games, my mistake! :lol

Look, if guys like you don't want to see people posting about these games looking like shit, close your eyes or keep out. I don't really care about your delicate sensibilities. Hell, this game would look bad if it were on the Dreamcast let alone the Wii or Xbox.
 
loosus said:
Fixed.

Not sure why you called it a "Wintel," though; that term traditionally describes an Intel machine running a full desktop Windows operating system (i.e., not the Xbox).

RISC > CISC
 
I wish Nintendo/ATI hadn't half-assed Wii's shader abilities. If Wii games looked slightly better than Riddick, I don't think there would be a whole lot to complain about.
 
big_z said:
I wish Nintendo/ATI hadn't half-assed Wii's shader abilities. If Wii games looked slightly better than Riddick, I don't think there would be a whole lot to complain about.
DX7+ shaders aren't needed for normal mapping... even Dreamcast and PSP would be capable of in-game normal maps with more RAM available. GameCube could probably have run Riddick as is if it had as much RAM as Xbox... it wouldn't be a problem at all on Wii, really, given it has as much RAM available for game apps as GC and Xbox combined.
 
jarrod said:
DX7+ shaders aren't needed for normal mapping... even Dreamcast and PSP would be capable of in-game normal maps with more RAM available. GameCube could probably have run Riddick as is if it had as much RAM as Xbox... it wouldn't be a problem at all on Wii, really, given it has as much RAM available for game apps as GC and Xbox combined.

So Wii CAN run Riddick?
 
Oblivion said:
So Wii CAN run Riddick?
If GameCube had more RAM, it could run Riddick. If memory wasn't an issue, every current gen system but (possibly) PS2 would be capable of in game normal maps from what I understand.
 
segasonic said:
don't expect next-gen graphics from Gamecube Turbo and you won't be disappointed as much!

Don't expect next-gen graphics from a souped-up G71 (RSX) and you won't be disappointed as much. Really, some of the peeps in this forum need to STFU when it comes to Wii and its abilities. Current-gen platforms can't touch Wii, Wii can't touch PS3 or 360, and none of the consoles will touch the PC once DX10 cards come out and the Core Duos start seeing improvements.

BTW, normal mapping sucks on the GC architecture; it takes 3 passes to do dot3 operations IIRC, whereas EMBM is faster and works on a variety of surfaces. The same could be said for its ability to do displacement mapping, as demonstrated in RS. Developers taking the PC route (like Ubisoft usually does) with the GC architecture will never stack up to what Sega, Rare, Nintendo, Capcom, or Factor 5 have done.
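For anyone wondering what the dot3 pass mentioned above actually computes: it's just a per-texel dot product between a normal-map texel and a light vector, both encoded as biased bytes (0..255 mapped to -1..1). A minimal sketch in Python of the math the hardware combiner does (the `decode`/`dot3_intensity` names are mine, purely illustrative, not any real SDK API):

```python
def decode(c):
    # Map an 8-bit color channel (0..255) back to a vector component in [-1, 1].
    return c / 127.5 - 1.0

def dot3_intensity(normal_rgb, light_rgb):
    """Per-texel dot3 lighting: decode both vectors from their RGB
    encoding, take the dot product, and clamp negatives to 0
    (texels facing away from the light get no diffuse light)."""
    n = [decode(c) for c in normal_rgb]
    l = [decode(c) for c in light_rgb]
    d = sum(a * b for a, b in zip(n, l))
    return max(0.0, d)

# A "flat" normal-map texel (the familiar light blue, ~(128, 128, 255))
# lit head-on gives nearly full intensity; lit from behind gives 0.
```

EMBM, by contrast, perturbs texture coordinates with the bump map rather than doing this per-texel dot product, which is why it maps onto different hardware paths.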
 